Identify build breaks on daily ISOs
Builds: start tracking the quality of the builds, define what a broken build is, and track how long it takes engineering teams to fix the problems they introduce that impact testing. The aim is to raise awareness of the impact of untested submissions and to determine how often we are held up by bad submissions.
Add a dashboard to provide a consolidated view of unit testing status. Collect initiatives from the different engineering teams and either add them to Jenkins or consolidate them so they are easy to read and to triage in case of failures.
Blueprint information
- Status: Complete
- Approver: Pete Graner
- Priority: Essential
- Drafter: Gema Gomez
- Direction: Approved
- Assignee: Canonical Platform QA Team
- Definition: Approved
- Series goal: Accepted for precise
- Implementation: Implemented
- Milestone target: None
- Started by: Pete Graner
- Completed by: Gema Gomez
Whiteboard
Decide what testing we are going to do to determine whether a product build is broken or worth further testing.
Create a website to keep track of product build sanity.
Keep a history of builds that we can refer to in the future to verify defects and for other testing purposes (identifying regressions).
Set up a mailing list so people can subscribe to the build status (structure it so people do not have to read the whole email, just the bits they are interested in).
Objective: Determine if a build is worth using for further testing.
Ideas
* Reporting tools: http://
* ISOs should be validated taking into account the md5 signature, whether the packages in the ISO match the manifest, the ISO tree, etc. This test case should be executed before the default installation test. Check that runtime-generated files are not present.
* After ISO static validation, we run the default installation.
* After the system is installed, we run basic test cases on all the main components.
* Parse the installation log and check for warnings and problems during the install (add remarks in case something has gone wrong during the installation).
* Figure out where all the logs are for the unit tests (builder, Launchpad).
* Write a script to extract the unit test results and dump them into a report/database.
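A minimal sketch of such an extraction script, assuming the unit test results appear in plain text build logs as "PASS: <name>" / "FAIL: <name>" lines (the log location and format below are assumptions, not the actual builder/Launchpad layout):
{{{
#!/usr/bin/env python
# Sketch: scan build logs for unit test results and store them in SQLite.
# The "PASS:/FAIL:" log format and the build-logs/ directory are hypothetical;
# adapt to the real builder/Launchpad log layout.
import glob
import re
import sqlite3

RESULT_RE = re.compile(r'^(PASS|FAIL): (\S+)', re.MULTILINE)

def collect(log_dir, db_path='unit-tests.db'):
    conn = sqlite3.connect(db_path)
    conn.execute('CREATE TABLE IF NOT EXISTS results '
                 '(log TEXT, test TEXT, outcome TEXT)')
    for log in glob.glob('%s/*.log' % log_dir):
        with open(log) as f:
            for outcome, test in RESULT_RE.findall(f.read()):
                conn.execute('INSERT INTO results VALUES (?, ?, ?)',
                             (log, test, outcome))
    conn.commit()
    conn.close()

if __name__ == '__main__':
    collect('build-logs')
}}}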
Actions
* Decide what testing we are going to do to determine whether a product build is broken or worth further testing.
* Create a website to keep track of product build sanity.
* Keep a history of builds that we can refer to in the future to verify defects and for other testing purposes (identifying regressions).
* Set up a mailing list so people can subscribe to the build status (structure it so people do not have to read the whole email, just the bits they are interested in).
Additional information: unit testing results gathering was removed from this blueprint after discussion during the QA Sprint; unit tests run at build time are not representative of the quality of the ISO post-integration, so those results are obsolete by the time the build becomes available. The tasks to integrate those results with the other ISO smoke tests have been removed.
Note 1:
= ISO Static Validation =
To be run after download and before the installation test as part of the jobs <release>
This task must output results in JUnit XML format parsable by Jenkins.
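For reference, a minimal sketch of producing that format from a list of check results (the check names and output path below are placeholders):
{{{
# Sketch: write ISO static validation results as JUnit XML for Jenkins.
# Check names and the output path are placeholders.
import xml.etree.ElementTree as ET

def write_junit(results, path='iso-static-validation.xml'):
    """results: list of (check_name, passed, message) tuples."""
    failures = sum(1 for _, passed, _ in results if not passed)
    suite = ET.Element('testsuite', name='iso-static-validation',
                       tests=str(len(results)), failures=str(failures))
    for name, passed, message in results:
        case = ET.SubElement(suite, 'testcase',
                             classname='iso.static', name=name)
        if not passed:
            ET.SubElement(case, 'failure', message=message)
    ET.ElementTree(suite).write(path, encoding='utf-8', xml_declaration=True)

write_junit([('valid-bootable-iso', True, ''),
             ('content-matches-list-file', False, 'extra files found')])
}}}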
== Ubiquity based (Desktop/DVD) ==
File is a valid bootable ISO
Content of the ISO matches the .list file
/casper/
filechecksum is already done as part of the downloader
Filesystem contains the following files/directories:
./autorun.inf
./boot
./casper
./.disk
./dists
./install
./isolinux
./md5sum.txt
./pics
./pool
./preseed
./README.
./ubuntu
./wubi.exe
/casper
/casper/
/casper/vmlinuz
/casper/initrd.lz
/casper/
/casper/
/casper/
(I think we could check the whole content of the ISO, excluding the versions of the deb files, as it is not supposed to change)
./.disk/info exists and matches the expected release/
wubi.exe is a valid MS Windows Executable
if report.html exists in the image directory, check that there are no errors (e.g. no conflicts or uninstallable packages)
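A possible sketch of the filesystem check, assuming the isoinfo tool (from genisoimage) is available to list the image contents; the REQUIRED list here is abbreviated and would in practice be the full list above:
{{{
# Sketch: verify that an ISO contains a set of required files/directories.
# Assumes isoinfo (genisoimage) is installed; REQUIRED is abbreviated here.
import subprocess
import sys

REQUIRED = ['/casper/vmlinuz', '/casper/initrd.lz', '/.disk/info',
            '/md5sum.txt', '/wubi.exe']

def check_iso(iso_path):
    listing = subprocess.check_output(
        ['isoinfo', '-R', '-f', '-i', iso_path]).decode().splitlines()
    present = set(listing)
    missing = [path for path in REQUIRED if path not in present]
    for path in missing:
        print('MISSING: %s' % path)
    return not missing

if __name__ == '__main__':
    sys.exit(0 if check_iso(sys.argv[1]) else 1)
}}}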
== d-i based (Server/Alternate) ==
File is a valid bootable ISO
Content of the ISO matches the .list file
filechecksum is already done as part of the downloader
Filesystem contains the following files/directories:
./boot
./cdromupgrade
./.disk
./dists
./doc
./install
./install/initrd.gz
./install/mt86plus
./install/netboot
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/
./install/sbm.bin
./install/vmlinuz
./isolinux
./md5sum.txt
./pics
./pool
./preseed
./README.
./ubuntu
./.disk/info exists and matches the expected release/
if report.html exists in the image directory, check that there are no errors (e.g. no conflicts or uninstallable packages)
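A short sketch of the .disk/info check, run against a mounted or extracted image (the mount point and expected release string are assumptions supplied by the caller):
{{{
# Sketch: check that .disk/info on the mounted/extracted ISO matches the
# expected release. Mount point and expected string are caller-supplied
# assumptions.
import os

def check_disk_info(mount_point, expected_release):
    info_path = os.path.join(mount_point, '.disk', 'info')
    if not os.path.exists(info_path):
        print('MISSING: .disk/info')
        return False
    with open(info_path) as f:
        info = f.read().strip()
    if expected_release not in info:
        print('MISMATCH: %r does not contain %r' % (info, expected_release))
        return False
    return True

check_disk_info('/mnt/iso', 'Ubuntu 12.04')
}}}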
[21/03/2012] The tasks related to adding new test cases have been postponed due to the new test harness that is being created. It makes no sense to create these new test cases now only to migrate them to the new harness at the beginning of Q. We are relying on manual testing in the interim.
A wiki (https:/
Work Items
Work items for precise-alpha-1:
[gema] Set a rotation schedule for the builds, with the first week done together to learn about the process: DONE
[jibel] Set up the infrastructure to keep a history of ISOs: DONE
[hggdh2] Create a bug report to list the bugs that have been found during the daily ISO testing (using the current ISO bug report, after discussion with jibel): DONE
[gema] Decide on a tagging scheme that makes sense and start using it to track ISO testing bugs and other kinds of bugs going forward: DONE
Work items for precise-alpha-2:
[patrickmwright] Set up Jenkins to report ISO testing results: DONE
[patrickmwright] Assess how long it would take to improve the Jenkins plugin to be fit for our purpose: DONE
[patrickmwright] Pilot ISO job restructuring: DONE
[patrickmwright] Create custom Dashboard Requirements: DONE
[patrickmwright] Tie jobs to the ISO under test: DONE
[hggdh2] Document how to interpret Jenkins results https:/
[nuclearbob] Generate a report to track the bugs based on the new tagging: DONE
[jibel] Document how to troubleshoot upgrade testing: https:/
[hggdh2] Document how to debug ISO installations: https:/
[hggdh2] Schedule and give a training session for the community on how to use Jenkins: DONE
[hggdh2] Document how to debug post installation problems: https:/
Review the Documentation for troubleshooting with Jenkins and update accordingly: DONE
[cjwatson] Show QA where to get CD Image logs and LiveFS logs (build logs), so that they can report on unit tests run at build time (mailed to gema.gomez 2012-01-10): DONE
[gema] Investigate where the smoke testing holes are: DONE
[gema] Review ISO testing trunk and improve it: DONE
[canonical-
[patrickmwright] Set up Lucid daily testing install tests: DONE
[jibel] Discuss with mvo to add software-center tests to daily testing: DONE
[jibel] Add software-center tests to daily testing: DONE
Work items for ubuntu-
[jibel] ISO static validation, write requirement (See note [1] below): DONE
[gema] Gema helping jibel triage ISO testing results; this has to be done in the morning, so a rota with people in the US doesn't solve the problem: DONE
[hggdh2] Give training to the Platform QA Team about Jenkins maintenance: DONE
[jibel] Document how to debug ISO installations when doing manual testing: https:/
[gema] Improve the usage of the new tags defined in https:/
[albrigha-
Work items for ubuntu-
[gema] Define on a wiki what a broken build is: DONE
[albrigha-
[albrigha-
[jibel] Add a test scenario to the automated testing in Jenkins to address full updated installs with Ubiquity (problems to be detected with this testing: bug 743359 and bug 897680): POSTPONED