
Practicum: Quality Assurance and Testing

Quality Assurance comprises a number of practices oriented toward early, economic detection of defects in the product.

Testing is that portion of QA devoted to running the software to see if it performs according to its specifications.

Goals of the practicum

Students should gain experience with the fundamental practices of quality assurance and with various testing methods.
They will meet these goals by practicing QA techniques during each stage of the development cycle.

Practicum parts

During this practicum, students will gain familiarity with various quality assurance practices, including:
  • Formal design inspection
  • Prototyping (especially interfaces)
  • Defect tracking
  • Code reading/inspections
  • Execution testing
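Defect tracking, for example, can begin as nothing more than a structured record per defect in a shared database. A minimal sketch in Python — the field names and status values here are illustrative, not a prescribed schema:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DefectRecord:
    """One entry in a simple defect database (illustrative fields only)."""
    defect_id: int
    summary: str
    module: str           # where the defect was found
    severity: str         # e.g. "minor", "major", "critical"
    status: str = "open"  # open -> fixed -> verified -> closed
    reported: date = field(default_factory=date.today)

    def close(self) -> None:
        """Mark the defect resolved and verified."""
        self.status = "closed"

# Record a defect, then close it after the fix is verified.
d = DefectRecord(1, "Crash on empty input", module="parser", severity="major")
d.close()
print(d.status)  # closed
```

Even this small amount of structure supports the record-keeping and analysis (defects per module, time open) that later parts of the practicum rely on.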

Students will also practice various testing techniques. For unit testing, they'll gain experience with practices including:
  • Testing for each requirement
  • Data-flow tests
  • Building test cases along with product
  • Regression testing
  • Keeping records
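The first and fourth practices above can be illustrated together. The sketch below uses Python's unittest module against a hypothetical product function; the requirement ("a discount must never make a price negative") and the past defect pinned by the regression test are both invented for illustration:

```python
import unittest

def apply_discount(price: float, discount: float) -> float:
    """Hypothetical product function: discount is a fraction in [0, 1]."""
    return max(0.0, price * (1.0 - discount))

class TestApplyDiscount(unittest.TestCase):
    # One test per requirement: each test names the requirement it checks.
    def test_req_discount_reduces_price(self):
        self.assertEqual(apply_discount(100.0, 0.25), 75.0)

    def test_req_price_never_negative(self):
        self.assertGreaterEqual(apply_discount(10.0, 1.5), 0.0)

    # Regression test: pins down a previously reported (hypothetical)
    # defect, so the fix cannot silently be undone later.
    def test_regression_zero_discount_keeps_price(self):
        self.assertEqual(apply_discount(50.0, 0.0), 50.0)
```

Run with `python -m unittest` in the test file's directory. Building such cases alongside the product — rather than all at the end — is exactly the practice the list above names.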

They'll also do integration testing and use practices like the "daily build and smoke test".
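A smoke test is a short script, run after each daily build, that exercises the system just enough to show it isn't fundamentally broken. A minimal sketch — the build step and the end-to-end check here are stand-ins that a real project would replace with its own commands:

```python
import subprocess
import sys

def smoke_test() -> bool:
    """Run a placeholder build, then one quick end-to-end check.

    Both subprocess commands are illustrative stand-ins for a real
    project's build command and a representative product run.
    """
    build = subprocess.run(
        [sys.executable, "-c", "print('build ok')"],
        capture_output=True, text=True,
    )
    if build.returncode != 0:
        return False  # the build itself is broken; stop here

    # End-to-end check: does the "product" start and give a sane answer?
    run = subprocess.run(
        [sys.executable, "-c", "print(2 + 2)"],
        capture_output=True, text=True,
    )
    return run.returncode == 0 and run.stdout.strip() == "4"

print("smoke test", "passed" if smoke_test() else "FAILED")
```

The point is not thorough testing but a fast daily signal: if the smoke test fails, the day's integration work stops until the build is repaired.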

Since this practicum involves experience with all the different stages of the development process, it is recommended that students take it during their final year of study, when they have already seen most of those stages. Student exercises will mix somewhat "canned" examples for practice with real experience collaborating with teams working in various stages of the software lifecycle. There is no prescribed order for covering the material; rather, the schedule hinges on which other practicums are running concurrently and can create mutually beneficial collaborations between their students and students in this practicum.


Chapter 9 of Steve McConnell's Software Project Survival Guide talks about Quality Assurance, mostly from a management perspective.

Section 4.3 of McConnell's Rapid Development describes fundamentals of QA, and suggests other readings for different aspects of quality (testing, code reviews, etc.).


The tools students will need to help them in this practicum include:

  • Defect database

  • Integrated Debugger (for code walkthroughs)

Costs associated with this practicum

  • Software: Tools needed include a defect database, integrated debugger, and tools for automation of testing (especially regression tests).

  • Space: There should be meeting rooms for design/code reviews, testing areas, and a maintenance area, as outlined in the space resources for the software factory.

  • People: The practicum needs someone with experience using the tools (could be a student or faculty member), as well as someone to give guidance about the QA and testing processes (most likely faculty).


Assessment for this practicum is tailored to its individual parts. Based on the goals of each part, appropriate questions or artefacts for "testing" that part are decided on. Metrics for judging the quality of student questions and artefacts have yet to be decided.
Each goal is discussed in turn below. Most of the parts admit some assessment based on observation of students in the practicum participating in projects in the Software Factory; emphasis is given below to what other methods can be used.

Design inspections

To assess this goal, an essay-test would be appropriate. Give students a (canned, imperfect) design, and ask them to critique it.

Code reviews/inspections

This goal could also be assessed in the essay-test style (give some code and ask how the review would be conducted, what it ought to uncover, etc.). Some multiple-choice questions about code reviews might be appropriate too (e.g. "who attends a code review?", "how often should code reviews be conducted?").

Prototyping and Defect Tracking

General multiple-choice or short-answer questions might work here. These are probably best assessed as part of a project, though. Prototyping and getting feedback to revise interfaces will be a highly project-dependent exercise. Records of defect tracking should be a kind of check-off assessment.

Testing (for each requirement; dataflow)

Multiple-choice questions about testing checklists might work here. Short-answer questions about dataflow and requirements can work too, given a sample requirements document and/or piece of code.
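A data-flow test targets definition-use pairs: for each place a variable is assigned, some test's execution path must carry that assignment to a use. A minimal sketch, using a hypothetical function in which the variable `rate` is defined on two paths and used once:

```python
def shipping_cost(weight: float, express: bool) -> float:
    """Hypothetical function: `rate` has two definitions, one use."""
    if express:
        rate = 2.5   # definition 1
    else:
        rate = 1.0   # definition 2
    return weight * rate  # use of `rate`

# One test per def-use pair, so each definition of `rate` reaches the use:
assert shipping_cost(4.0, express=True) == 10.0   # covers def 1 -> use
assert shipping_cost(4.0, express=False) == 4.0   # covers def 2 -> use
```

A single test would satisfy statement coverage of the `return`, but only the pair of tests exercises both def-use paths, which is the distinction a short-answer question here could probe.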

Building test cases with the product; regression testing; recordkeeping

These are probably best assessed as part of the project. We want to ensure that test cases are not being built ad-hoc at the end, that regression testing happens, and that records are being kept about defects. There are a few possible exam-type questions from here too (e.g. knowing that modules that have had defects in the past are more likely to have more bugs).
