Software Engineering is a field of such high socio-technical complexity that the properties (let alone the usefulness) of proposed methods and tools are not at all obvious. They therefore need to be evaluated empirically.

This course introduces two different ways of thinking about this situation and approaching such evaluations:

  1. A quantitative perspective. It aims at quantified statements about tools and methods and is based on a positivist epistemological stance and the corresponding research culture.
  2. A qualitative perspective. It aims at making sense of what is going on to create the phenomena that give rise to the quantitative outcomes. It is based on an interpretivist epistemological stance and a research culture that values different things.

The two perspectives have different strengths and weaknesses and suit different types of research interest. In this course, we will learn to think in both perspectives and to appreciate the different benefits they provide. We will learn what it means for a study to have high quality: high credibility and high relevance. And we will practice diagnosing the various quality problems that often reduce credibility or relevance.

We will work through the most common research methods and discuss real examples (interesting published studies) of each, along with their strengths and weaknesses.

Participants will understand how and when to apply each method and, for one of them, will develop some practical skills by applying it.


Literature

  • Jacob Cohen: The Earth Is Round (p < .05). American Psychologist 49(12):997-1003, 1994.
  • Darrell Huff: How to Lie with Statistics. Penguin, 1991.
  • John C. Knight, Nancy G. Leveson: An Experimental Evaluation of the Assumption of Independence in Multi-Version Programming. IEEE Transactions on Software Engineering 12(1):96-109, January 1986.
  • John C. Knight, Nancy G. Leveson: A Reply to the Criticisms of the Knight and Leveson Experiment. Software Engineering Notes 15(1):24-35, January 1990.
  • Audris Mockus, Roy T. Fielding, James D. Herbsleb: Two Case Studies of Open Source Software Development: Apache and Mozilla. ACM Transactions on Software Engineering and Methodology 11(3):309-346, July 2002.
  • Timothy Lethbridge: What Knowledge Is Important to a Software Professional? IEEE Computer 33(5):44-50, May 2000.
  • David A. Scanlan: Structured Flowcharts Outperform Pseudocode: An Experimental Comparison. IEEE Software 6(5):28-36, September 1989.
  • Ben Shneiderman, Richard Mayer, Don McKay, Peter Heller: Experimental investigations of the utility of detailed flowcharts in programming. Commun. ACM 20(6):373-381, 1977.
  • Lutz Prechelt, Barbara Unger-Lamprecht, Michael Philippsen, Walter F. Tichy: Two Controlled Experiments Assessing the Usefulness of Design Pattern Documentation in Program Maintenance. IEEE Transactions on Software Engineering 28(6):595-606, 2002.
  • Lutz Prechelt: An Empirical Comparison of Seven Programming Languages. Computer 33(10):23-29, October 2000.
  • Lutz Prechelt: An empirical comparison of C, C++, Java, Perl, Python, Rexx, and Tcl for a search/string-processing program. Technical Report 2000-5, March 2000.
  • Tom DeMarco, Tim Lister: Programmer Performance and the Effects of the Workplace. Proceedings of the 8th International Conference on Software Engineering, IEEE Computer Society Press, 268-272, 1985.
  • John L. Henning: SPEC CPU2000: Measuring CPU Performance in the New Millennium. Computer 33(7):28-35, July 2000.
  • Susan Elliot Sim, Steve Easterbrook, Richard C. Holt: Using Benchmarking to Advance Research: A Challenge to Software Engineering. Proceedings of the 25th International Conference on Software Engineering (ICSE'03). 2003.
  • Ellen M. Voorhees, Donna Harman: Overview of the Eighth Text REtrieval Conference (TREC-8).
  • Susan Elliott Sim, Richard C. Holt: The Ramp-Up Problem in Software Projects: A Case Study of How Software Immigrants Naturalize. Proceedings of the 20th International Conference on Software Engineering, April 19-25, 1998, Kyoto, Japan: 361-370.
  • Oliver Laitenberger, Thomas Beil, Thilo Schwinn: An Industrial Case Study to Examine a Non-Traditional Inspection Implementation for Requirements Specifications. Empirical Software Engineering 7(4): 345-374, 2002.
  • Yatin Chawathe, Sylvia Ratnasamy, Lee Breslau, Nick Lanham, Scott Shenker: Making Gnutella-like P2P Systems Scalable. Proceedings of ACM SIGCOMM 2003, August 2003.
  • Stephen G. Eick, Todd L. Graves, Alan F. Karr, J.S. Marron, Audris Mockus: Does Code Decay? Assessing the Evidence from Change Management Data. IEEE Transactions on Software Engineering 27(1):1-12, 2001.
  • Chris Sauer, D. Ross Jeffery, Lesley Land, Philip Yetton: The Effectiveness of Software Development Technical Reviews: A Behaviorally Motivated Program of Research. IEEE Transactions on Software Engineering 26(1):1-14, January 2000.

Additional information

The course language is German, but the slides and exercise sheets are in English.

The exam will be formulated in German, but answers may also be given in English.

Homepage: http://www.inf.fu-berlin.de/w/SE/VorlesungEmpirie