QA & Release, building bridges

Registered by Gema Gomez

Discuss what went well and what did not go so well in QA - Release team communication during the P cycle, and try to put actions in place for smoother milestones during Q.

Blueprint information

Status:
Complete
Approver:
Pete Graner
Priority:
Essential
Drafter:
Gema Gomez
Direction:
Approved
Assignee:
Canonical Platform QA Team
Definition:
Approved
Series goal:
Accepted for quantal
Implementation:
Implemented
Milestone target:
ubuntu-12.10
Started by:
Gema Gomez
Completed by:
Gema Gomez

Whiteboard

Discuss what we can do better going forward to facilitate communication between the Release Team and the QA Team.
Objectives for this meeting:
- Figure out who is responsible for what, and when.
- Figure out where to keep each bit of information and who needs to keep it updated.
QA Questions:
- Testing priorities are unclear. We seem to be aiming at testing everything on every respin.
Mandatory tests need to be reviewed.
Right now we only have a set of test cases that are just an undifferentiated pool of tests.
- Test cases should have the mandatory label removed; they are basically all manual.
- Manual testing effort: go through our test cases and figure out which ones are needed.
- What are the release criteria? They need to be based around automated tests, with escape analysis when the manual validation occurs. Use community testing to refine them.
Push manual testing daily, as a safety net for the end of the cycle.
- What has gone into each respin?
Do better at communication; maybe remove the release sprint.
Maybe analyse all the respins we did to figure out which ones could have been avoided.
Bring all the information into one place.
Release tracker: the release tracker to be about tracking results, and LAVA to be about reporting all the information, consolidated, so we know where we sit at any moment.
- The weather report is a source of information that needs to be integrated.
- Packages being updated, updating the image set.
We could use the ISO tracker to keep notes of what is going on.
Automated testing plus one manual validation of each rebuilt image.
- Coverage: who is keeping track? The tracker should be able to give us a view. Is there a way to tell how much code churn each change has introduced, and where? I.e., can we do risk-based testing instead of trying to cover everything every time? (A rough churn sketch follows this list.)
A test automation run plus one run for manual verification should be sufficient for a respin.
- Who decides what is risky and what is not, and based on what? The QA team should have a say when a respin is about to be done during release week.
The release team decides, but it may need to be wider. We want to do it on an IRC channel so others can see it.
- How are the fixes prioritized? Can we group the problems instead of doing more than one respin?
We try to do that, but it is not always easy; it is a trade-off.
Targets of opportunity go on the pad way too easily.
- How is the risk assessed?
I think this one is answered already.
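As a rough illustration of the churn question above: this is a hypothetical sketch, not existing QA tooling, and the revision names are placeholders. It sums lines added and deleted per file between two git revisions, so the most-churned areas could be tested first instead of covering everything every time.

    #!/usr/bin/env python3
    """Rank files by code churn between two revisions (hypothetical sketch)."""
    import subprocess
    from collections import Counter

    def churn_between(repo_path, old_rev, new_rev):
        """Return a Counter mapping file path -> lines added + deleted."""
        out = subprocess.check_output(
            ["git", "-C", repo_path, "log", "--numstat", "--format=",
             f"{old_rev}..{new_rev}"],
            text=True,
        )
        churn = Counter()
        for line in out.splitlines():
            parts = line.split("\t")
            if len(parts) != 3:
                continue
            added, deleted, path = parts
            if added == "-":  # binary files report no line counts
                continue
            churn[path] += int(added) + int(deleted)
        return churn

    if __name__ == "__main__":
        # Placeholder revision names standing in for two image builds.
        for path, lines in churn_between(".", "beta1", "beta2").most_common(20):
            print(f"{lines:6d}  {path}")

Run against a package's source branch, the top entries would suggest where manual testing effort is best spent after a respin.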
Release Team Questions:
- Why was there no review and scrubbing of manual test cases done by QA, to make sure the manual mandatory tests targeted for the release made sense and were accurate?
  The community rewrite didn't work.
 [nskaggs] Goal is to have all test cases for the release, ready and reviewed by feature freeze.
  [gema] QA Team could have team members assigned to the different dev teams and keep an eye on what is going on.
  [kate.stewart] update team reporting template to make it explicit what has changed, for testing purposes.
- Where is test case/package execution path coverage information kept? (We need to know which packages are exercised by each individual test, so we can be smarter about retesting.)
This information does not exist; it should be added to the test case management functionality in the tracker.
- How are minimum system hardware installations being tested? Recommended install configurations?
[cjwatson] review CD sleeves and how the information is published - blythe
[gema] minimum and recommended configurations to be included.
[cjwatson] worst-case install path definition to be set up.
[kate.stewart] places that need to be updated with this information - release.ubuntu.com, help.ubuntu.com,
https://help.ubuntu.com/community/Installation/SystemRequirements
https://help.ubuntu.com/12.04/installation-guide/i386/minimum-hardware-reqts.html, release notes, any information included on www.ubuntu.com, and the CD sleeves (Blythe)
- How are default application configurations being tested? Standard application migrations? (ie. Evolution -> Thunderbird default on LTS migration 10.04 -> 12.04)
[gema] Make sure that app migrations warrant a test case addition. The desktop team needs to be involved.
- Why is there no, or very limited, manual testing of mandatory test cases prior to freeze windows, when fixes are easier to add and respins can be minimized? What can be done to improve this in quantal?
Timing will be changed.
- Where is the prioritized order of image testing (manual and automated) to be kept? We need to identify the highest-risk images first, so that the earliest possible notification of a needed respin or testing abort maximizes time for development. Priorities should be set by the QA and Release teams prior to freeze.
Prioritized per respin; coordinated in the release meeting prior to kicking off the spin.
Use the ISO tracker as the prioritized list. The session installer right at the end was an example.
[gema] talk about prioritization of isos with stgraber
Scheduling in automated testing: on-demand testing is sufficient.
- Automated test report summaries: why can't these be summarized on the ISO tracker, so manual and automated results are all in one location?
Tell the QA team if on demand runs are required
- Manual test report summaries: why can't we get testing coverage over a week period, as well as daily, so we can use the aggregation of manual results over a time period to determine focus areas? (A sketch of such an aggregation follows this list.)
[stgraber] to add a view to the tracker for this
- Why was the poor response to the request to the platform team for weekly manual testing not proactively communicated after the first week?
More folks testing the image; Steve/Pete will talk to Rick.
[nskaggs] more proactive communication.
- Why was there late communication of platforms not available for testing during betas (beta 1 amd64+mac, beta 2 arm)? If the hardware availability problem was known at the start of the cycle, we could have asked for more community help earlier.
We bought the hardware but someone forgot to send it, so we got it over there as soon as we realised.
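The weekly view requested above could look something like the following. This is a minimal sketch, assuming manual results can be exported as CSV rows with date, testcase and result columns; the export format, column names and file name are assumptions, not the tracker's actual interface.

    #!/usr/bin/env python3
    """Aggregate daily manual test results into a weekly view (sketch only)."""
    import csv
    from collections import defaultdict
    from datetime import date

    def weekly_coverage(csv_path):
        """Return {iso_week: {testcase: set of results seen that week}}."""
        weeks = defaultdict(lambda: defaultdict(set))
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):
                # Assumes dates exported as YYYY-MM-DD.
                y, m, d = (int(x) for x in row["date"].split("-"))
                week = date(y, m, d).isocalendar()[1]
                weeks[week][row["testcase"]].add(row["result"])
        return weeks

    if __name__ == "__main__":
        # "manual_results.csv" is a hypothetical export file.
        for week, cases in sorted(weekly_coverage("manual_results.csv").items()):
            untested = [t for t, r in cases.items() if r == {"untested"}]
            print(f"week {week}: {len(cases)} cases reported, "
                  f"{len(untested)} never run; focus: {untested[:5]}")

Aggregated this way, test cases that went a whole week without a run stand out immediately as focus areas.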

2012/05/17 KES: updated wording in the Weekly Template: https://wiki.ubuntu.com/ReleaseTeam/Meeting/Agenda/TeamTemplate to encompass the release as a whole, for better awareness of testing.
2012/06/21: Gema: looked at the weather report; some metrics there are quite old, some others seem to be still useful. We are not in a place where it makes sense to review and update that report just yet. We'll improve the testing reporting during this cycle, and I think we should have a session at the next UDS to re-think the weather report and then move it to the new reporting. Adding a BP for that: https://blueprints.launchpad.net/ubuntu/+spec/qa-weather-report-reloaded

2012/09/12: gema: after discussion with skaet, jibel and balloons, it was clarified that there is no single person responsible for deciding which test cases are mandatory and which are not; test cases are kept as before (the default option) unless there is a very clear need to add one. This will probably change in the future as we improve automation, but at the moment I see no reason to start cutting down on the already slim set of test cases we have.

Work Items

Work items:
[nskaggs] Have all test cases for the release, ready and reviewed by feature freeze: DONE
[ubuntu-release] Based on Nick's test cases, decide which ones have to be mandatory and which ones are not: DONE
[gema] Find out who decides which test cases are mandatory: DONE
[gema] Make sure only mandatory test cases are marked mandatory on the tracker: POSTPONED
[nskaggs] Make sure community testing happens more evenly during the cycle: DONE
[cjwatson] Explain what the weather report is currently used for by the release team to help QA understand how to convey/integrate that info with the new reporting: POSTPONED
[gema] Figure out what information is in the weather report and if it needs to be integrated with the new reporting: DONE
[gema] Think about value added of having QA Team keep an eye on what is going on on the different dev teams (there would be value, we have no bandwidth): DONE
[kate.stewart] update team reporting template to make it explicit what has changed for testing purposes: DONE
[pwlars] Minimum configuration to be included on ISO testing: DONE
[cjwatson] Worst case install path definition set up: POSTPONED
[kate.stewart] Update process pages, with places that need to be updated with minimum and recommended configuration information - (see websites on the whiteboard) release notes, any information included on www.ubuntu.com, and the CD sleeves (get info from blythe where this is pulled from): DONE
[gema] Find out who in the desktop team needs to notify us of app migrations and make them aware. seb128 to send updates to the qa-team: DONE
[gema] Make sure that app migrations warrant a test case addition. The desktop team needs to be involved (we don't have the bandwidth this cycle, hopefully next): POSTPONED
[kate.stewart] Talk about prioritization of isos with stgraber: DONE
[nskaggs] More proactive communication: DONE
