r27 - 13 Jul 2007 - 15:27:13 - MimiYin

OSAF QA Process Handbook


Introduction

This document captures the OSAF QA process for the Chandler Desktop and Chandler Server (Cosmo) products. It documents the different stages a release goes through during the quality cycle and defines the goal of each stage. It is complementary to the individual test specifications and clarifies the terminology and steps used during the different quality assurance activities. We have identified two groups of users who will potentially be testing our products: people within OSAF, and members of the open source community outside OSAF who may be willing to contribute to finding bugs and improving the quality of our products. The following QA process is designed so that it can be followed by either group.

Quality Goals

The goals of the Quality Assurance team are the following:

  • Ensure the release meets the requirements in the design documents provided by the Design/UI team.
  • Validate the design and confirm that the implementation conforms to the development standards and coding guidelines.
  • Ensure the product is usable and acceptable to the target user, covering feature set, functionality, UI, performance, operability and deployability.
  • Ensure the documentation is up to date and clear enough for any open source developer to extend the application.
  • Ensure we can deliver a high-quality OSAF release to the open source community.

Acceptance Criteria

QA will start the "official" test cycle for each release when the following criteria are met:
  • The handoff build is feature complete (i.e., all the features expected to be delivered in that milestone or release are fully complete).
  • Unit tests for the different functional components are developed, checked in and run successfully.
  • Automated functional tests have been executed and have passed.

Release Criteria

The following is a general guideline for the release criteria. All the conditions in the release criteria must be satisfied and completed before QA complete can be declared.
  • Features -- All features scheduled for the release are code complete and all feature development is frozen.
  • QA -- Functional and integration tests have run successfully.
  • Certification -- Testing building Chandler fully from the sources on all approved platforms, including software and hardware; also testing the debug and end-user distributions on all the approved platforms.
  • Bugs -- No blocker, critical or major bugs outstanding in Bugzilla for the target milestone set for this release.
  • Documentation -- A full set of documentation for all the features delivered in the release, plus documentation of features that were originally in the design but did not make it into the release.
  • Performance certification -- The product's performance conforms to the performance criteria decided for the release.
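The "no blocker, critical or major bugs" criterion above can be checked mechanically against a list of open bugs. A minimal sketch in Python; the `open_bugs` records and their field names are hypothetical illustrations, not a real Bugzilla export format:

```python
# Sketch: verify the "no blocker, critical or major bugs outstanding"
# release criterion against a list of open bugs for a target milestone.
# The dict fields below are illustrative, not a real Bugzilla export.

BLOCKING_SEVERITIES = {"blocker", "critical", "major"}

def release_blockers(open_bugs, target_milestone):
    """Return the open bugs that violate the release criterion."""
    return [
        bug for bug in open_bugs
        if bug["target_milestone"] == target_milestone
        and bug["severity"] in BLOCKING_SEVERITIES
    ]

open_bugs = [
    {"id": 101, "severity": "major",    "target_milestone": "0.7"},
    {"id": 102, "severity": "normal",   "target_milestone": "0.7"},
    {"id": 103, "severity": "critical", "target_milestone": "0.8"},
]

blockers = release_blockers(open_bugs, "0.7")
print([b["id"] for b in blockers])  # bug 101 blocks the 0.7 release
```

In practice this check would run against a live Bugzilla query for the milestone rather than a hand-built list.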

Certification Matrix

The certification matrix defines the exact environment in which a Chandler release will be certified. For both software and hardware, the exact version numbers will be recorded in the certification matrix.

Platform

  • Windows XP
  • Mac OSX (PPC) 10.4.x
  • Mac OSX (Intel)
  • Linux Ubuntu

Third Party Software

See External Libraries wiki page for a detailed breakdown of what third party code is used in Chandler.

One of the important checks is to make sure that all the third-party software we use is secure, i.e., that there are no published security vulnerabilities in any of it that have not been fixed in the versions we ship.

The version numbers of all the dependent software have not yet been decided. We plan to build an installer for Chandler which will automatically install any dependent third-party software that is not already on the user's machine.
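Once the certified versions are recorded in the matrix, verifying an installation against them is a simple comparison. A sketch, where the package names and pinned versions are hypothetical examples rather than Chandler's actual certification matrix:

```python
# Sketch: compare installed third-party package versions against the
# versions certified for a release. Names and pins are hypothetical
# examples, not Chandler's actual certification matrix.

CERTIFIED_VERSIONS = {"twisted": "2.5.0", "zanshin": "0.6", "wx": "2.8.4"}

def uncertified(installed):
    """Return packages whose installed version differs from the certified pin."""
    return {
        name: (version, CERTIFIED_VERSIONS.get(name))
        for name, version in installed.items()
        if CERTIFIED_VERSIONS.get(name) != version
    }

installed = {"twisted": "2.5.0", "zanshin": "0.5", "wx": "2.8.4"}
print(uncertified(installed))  # zanshin 0.5 installed, 0.6 certified
```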

Test Process

  • The design team writes the release document and the design specification for each of the features
  • QA Engineers develop the test specifications based on the design spec
  • Developers develop/implement features
  • Developers unit test the features
  • QA comes up with the acceptance criteria and starts testing
  • QA develops functional tests (20-30% of the test cases will be automated in this cycle) and files bugs
  • Developers resolve bugs as they are found
  • QA performs integration tests
  • Performance tests are conducted during each test cycle. Performance tests will also be integrated with Tinderbox nightly builds.
  • Bug councils review the list of outstanding bugs continuously during the release cycle and assign a target milestone and priority to each open bug.
  • QA verifies resolved bugs and reviews the release criteria to make sure everything has been met.
  • Release the product to the user community
  • Process feedback from the user community and make sure it goes back to the Design group.
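The performance step in the process above amounts to timing key operations against an agreed budget. A minimal sketch of such a check; the `fake_startup` operation and the 2-second budget are hypothetical stand-ins, since the real criteria come from the Product/Design team for each release:

```python
# Sketch: a minimal timing check of the kind a nightly performance
# test might run. The operation and the 2-second budget are
# hypothetical; real criteria are set per release.
import time

def measure(operation, budget_seconds):
    """Run `operation` once; return (elapsed seconds, met-budget flag)."""
    start = time.perf_counter()
    operation()
    elapsed = time.perf_counter() - start
    return elapsed, elapsed <= budget_seconds

def fake_startup():
    # stand-in for, e.g., launching the application and loading data
    time.sleep(0.01)

elapsed, ok = measure(fake_startup, budget_seconds=2.0)
print("PASS" if ok else "FAIL", round(elapsed, 3))
```

A nightly run would repeat the measurement several times and report the median, so one slow run does not fail the build.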

Types of Test Definitions

  • PreCheckInTests -- tests that developers should run before checking in changes
  • UnitTests -- automated tests that are run by the build system. Developers are responsible for adding unit tests for the components they develop. For every component in the system there should be corresponding unit tests that can be run just to validate the functionality of that component.
  • FunctionalTests -- complete set of manual and automated tests to test the functionality and performance of each feature in the release. These test cases map one-to-one to the test cases in the test specification. Currently we have a very limited number of automated tests for Chandler, and any help from the user community will be greatly appreciated. If you have any expertise in writing automated tests for desktop UI applications and would like to contribute to Chandler test development, please contact us.
  • Performance tests -- performance testing may be conducted as part of functional tests to test the application startup time, response times, memory leak, CPU utilization, etc. The performance criteria will be developed by the Product/Design team for each release.
  • Integration tests -- test cases that test the application completely from end to end, after all functional components are code complete. This includes test cases with more complex scenarios than functional tests.
  • Regression tests -- subset of automated functional tests that will be run nightly during the development cycle to ensure no existing functionality was broken because of new feature development.
  • AcceptanceTests -- tests that anyone can run in order to "bless" a milestone/release. This is a more extensive list of manual tests that are conducted at the end of each milestone/release.
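A per-component unit test of the kind described above can be sketched with Python's standard unittest module. The `ItemCollection` class here is a toy stand-in invented for illustration, not a real Chandler component:

```python
# Sketch of a component-level unit test using Python's standard
# unittest module. ItemCollection is a toy stand-in for illustration,
# not a real Chandler component.
import unittest

class ItemCollection:
    """Toy component: an ordered collection of unique items."""
    def __init__(self):
        self._items = []

    def add(self, item):
        if item not in self._items:
            self._items.append(item)

    def __len__(self):
        return len(self._items)

class ItemCollectionTest(unittest.TestCase):
    def test_add_is_idempotent(self):
        c = ItemCollection()
        c.add("meeting")
        c.add("meeting")
        self.assertEqual(len(c), 1)

# run the test directly (a build system would use its own runner)
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ItemCollectionTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("OK" if result.wasSuccessful() else "FAILED")
```

Tests written this way can be collected automatically by the build system, which is what makes the nightly regression runs described above practical.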

Bug Councils

The Bug Council will meet once a week during the release cycle, and possibly twice a week once we hit feature freeze. Representatives from the Development, QA, Design and Build teams may be present. The council reviews the outstanding bugs and decides the priority and target milestone for each outstanding bug.

IRC QA Sessions

For every integration point release, we will conduct a collaborative QA session with the user community during the weekly IRC office hour. We believe this is the best way to get a wide range of users to test the product and find some creative bugs.

Bug filing guidelines

When a bug is filed by a QA engineer or an external user, it is very helpful if the following fields are set to aid in prioritizing the bug fix:
  • Product - e.g., Chandler or Cosmo
  • Component - e.g., Calendar, Parcel Framework, etc.
  • Severity - e.g., critical, major, etc.
  • Priority - Priority should be left at the default P3. The bug council will decide the priority based on a number of other factors.
  • Summary - The summary line should be very descriptive, e.g., "Summary view shows a blank screen when the calendar tab is accessed."
  • Description - A detailed description including the build or svn revision number, a detailed set of steps, and whether the bug is consistently reproducible. Data and log files should also be attached to corroborate the bug.
If you are testing an already released version of Chandler, set the version number of that release in the bug. If you are testing a development build, set the version as of the time of your source update. The Target Milestone field will be set by the development/project leads as part of task management of the different deliverables. For more information on bug writing guidelines, refer to http://bugzilla.osafoundation.org/bugwritinghelp.html.

Open Source Applications Foundation
Except where otherwise noted, this site and its content are licensed by OSAF under a Creative Commons License, Attribution Only 3.0.