Friday, July 18, 2008

Design and Analysis in Software Engineering

This entry is mostly a sticky note for me and the grad students working with me. In general, computer science and software engineering folks don't know much about experimental design, surveys, or how to compare two things. Fortunately, way back in 1994, Sheri Pfleeger started a column, only a couple of pages long, giving the essentials of what we need to make valid comparisons of tools, techniques, or processes. So if you are looking for help designing a study, scan the titles below for a column that could help.

Fresno State students: click on the link you are interested in. If you're on campus, you'll be taken straight to the paper. If you're off campus, you'll be taken to a page showing the abstract and a DOI; you can still get to the paper by going here, logging in, and then pasting the numeric part of the DOI into the box.

University of Hawaii students: go here and click on Software Engineering Notes to log in and reach the archive.


Experimental Design in Software Engineering

Pfleeger, S. (October 1994) Experimental Design and Analysis in Software Engineering, Part 1: The Language of Case Studies and Formal Experiments.

Pfleeger, S. (January 1995) Experimental Design and Analysis in Software Engineering, Part 2: How to Set Up an Experiment.

Pfleeger, S. (April 1995) Experimental Design and Analysis in Software Engineering, Part 3: Types of Experimental Design.

Pfleeger, S. (July 1995) Experimental Design and Analysis in Software Engineering, Part 4: Choosing an Experimental Design.

Pfleeger, S. (December 1995) Experimental Design and Analysis in Software Engineering, Part 5: Analyzing the Data.

Evaluating Software Engineering Methods and Tools

Kitchenham, B. (January 1996) Evaluating software engineering methods and tools, part 1: the evaluation context and evaluation methods.

Kitchenham, B. (March 1996) Evaluating software engineering methods and tools, part 2: selecting an appropriate evaluation method—technical criteria.

Kitchenham, B. (July 1996) Evaluating software engineering methods and tools, part 3: selecting an appropriate evaluation method—practical issues.

Sadler, C. & Kitchenham, B. (September 1996) Evaluating software engineering methods and tools, part 4: the influence of human factors.

Jones, L. & Kitchenham, B. (January 1997) Evaluating software engineering methods and tools, part 5: the influence of human factors.

Kitchenham, B. & Jones, L. (March 1997) Evaluating software engineering methods and tools, part 6: identifying and scoring features.

Kitchenham, B. (July 1997) Evaluating software engineering methods and tools, part 7: planning feature analysis evaluation.

Kitchenham, B. & Jones, L. (September 1997) Evaluating software engineering methods and tools, part 8: analysing a feature analysis evaluation.

Kitchenham, B. & Pickard, L.M. (January 1998) Evaluating software engineering methods and tools, part 9: quantitative case study methodology.

Kitchenham, B. & Pickard, L.M. (May 1998) Evaluating software engineering methods and tools, part 10: designing and running a quantitative case study.

Kitchenham, B. & Pickard, L.M. (July 1998) Evaluating software engineering methods and tools, part 11: analysing quantitative case studies.

Kitchenham, B. (September 1998) Evaluating software engineering methods and tools, part 12: evaluating DESMET.


Principles of Survey Research

Pfleeger, S. & Kitchenham, B. (November 2001) Principles of survey research, part 1: turning lemons into lemonade.

Pfleeger, S. & Kitchenham, B. (January 2002) Principles of survey research, part 2: designing a survey.

Pfleeger, S. & Kitchenham, B. (March 2002) Principles of survey research, part 3: constructing a survey instrument.

Pfleeger, S. & Kitchenham, B. (May 2002) Principles of survey research, part 4: questionnaire evaluation.

Pfleeger, S. & Kitchenham, B. (September 2002) Principles of survey research, part 5: populations and samples.

Pfleeger, S. & Kitchenham, B. (March 2003) Principles of survey research, part 6: data analysis.