Identify build breaks on daily ISOs

Registered by Gema Gomez on 2011-10-17

Builds: start tracking the quality of the builds, define what a broken build is, and track how long it takes engineering teams to fix the problems they introduce that impact testing. By doing this, the aim is to raise awareness of the impact of untested submissions and to determine how often we are held up by bad submissions.

Add a dashboard to provide a consolidated view of unit testing status. Collect initiatives from the different engineering teams and either add them to Jenkins or consolidate them so that failures are easy to read and triage.

Blueprint information

Status:
Complete
Approver:
Pete Graner
Priority:
Essential
Drafter:
Gema Gomez
Direction:
Approved
Assignee:
Canonical Platform QA Team
Definition:
Approved
Series goal:
Accepted for precise
Implementation:
Implemented
Milestone target:
None
Started by
Pete Graner on 2011-11-18
Completed by
Gema Gomez on 2012-03-21

Related branches

Sprints

Whiteboard

Decide which testing we are going to do to determine whether a product build is broken or worth using for further testing.
Create a website to keep track of product build sanity.
Keep a history of builds that we can refer to in the future to verify defects and for any testing reason (identify regressions).
Set up a mailing list so people can subscribe to the build status (structured so that people do not have to read the whole email, just the bits they are interested in).

Objective: Determine if a build is worth using for further testing.

   Ideas
   * Reporting tools: http://jasperforge.org/projects/jasperreports, http://www.reportlab.com/software/opensource/, jenkins
   * ISOs should be validated taking into account the md5 checksum, that the packages on the ISO match the manifest, the ISO tree, etc. This test case should be executed before the default installation test. Check that runtime-generated files are not there.
   * After ISO static validation, we run the default installation.
   * After the system is installed, we run basic test cases on all the main components.
   * Parse the installation log and check for warnings and problems during the install (add remarks in case something has gone wrong during the installation).
   * Figure out where all the logs are for the unit tests (builder, Launchpad)
   * Write a script to extract the unit test results and dump them into a report/database
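The log-parsing idea above could be sketched as follows. The patterns here are assumptions about what typically signals trouble in an installer log, not a definitive list:

```python
import re

# Patterns that often indicate trouble in installer/syslog output;
# the exact markers vary by installer, so treat this as a starting point.
SUSPECT_PATTERNS = [
    re.compile(r"\bwarning\b", re.IGNORECASE),
    re.compile(r"\berror\b", re.IGNORECASE),
    re.compile(r"\bfailed\b", re.IGNORECASE),
    re.compile(r"segfault"),
]

def scan_install_log(lines):
    """Return (line_number, line) pairs that look suspicious."""
    remarks = []
    for number, line in enumerate(lines, start=1):
        if any(pattern.search(line) for pattern in SUSPECT_PATTERNS):
            remarks.append((number, line.rstrip()))
    return remarks
```

The returned remarks could then be attached to the build report so a triager sees at a glance why an install was flagged.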

   Actions
   * Decide which testing we are going to do to determine whether a product build is broken or worth using for further testing.
   * Create a website to keep track of product build sanity.
   * Keep a history of builds that we can refer to in the future to verify defects and for any testing reason (identify regressions).
   * Set up a mailing list so people can subscribe to the build status (structured so that people do not have to read the whole email, just the bits they are interested in).

Additional information: unit testing results gathering was removed from this blueprint after discussion during the QA Sprint. Unit tests run at build time are not representative of the quality of the ISO post-integration, so those results are obsolete by the time the build becomes available. The tasks to integrate them with the other ISO smoke tests have been removed.

Note 1:
= ISO Static Validation =
To be run after download and before the installation test, as part of the jobs <release>-<variant>-<arch>_default (e.g. precise-desktop-amd64_default)

This task must output results in JUnit XML format parsable by Jenkins.
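One possible shape for that output, using only the standard library; the suite and test names below are illustrative, not taken from the actual jobs:

```python
import xml.etree.ElementTree as ET

def results_to_junit_xml(suite_name, results):
    """Build a JUnit-style XML string from a list of
    (test_name, error_message_or_None) pairs, in the shape
    the Jenkins JUnit parser accepts."""
    suite = ET.Element(
        "testsuite",
        name=suite_name,
        tests=str(len(results)),
        failures=str(sum(1 for _, err in results if err)),
    )
    for name, err in results:
        case = ET.SubElement(suite, "testcase", classname=suite_name, name=name)
        if err:
            failure = ET.SubElement(case, "failure", message=err)
            failure.text = err
    return ET.tostring(suite, encoding="unicode")
```

Passing checks become plain `<testcase>` elements; failing ones carry a `<failure>` child, which is what Jenkins uses to mark the build unstable.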

== Ubiquity based (Desktop/DVD) ==
File is a valid bootable ISO
Content of the ISO matches the .list file
/casper/filesystem.manifest matches the .manifest on the server
File checksum is already done as part of the downloader

Filesystem contains the following files/directories:
./autorun.inf
./boot
./casper
./.disk
./dists
./install
./isolinux
./md5sum.txt
./pics
./pool
./preseed
./README.diskdefines
./ubuntu
./wubi.exe
/casper
/casper/filesystem.squashfs
/casper/vmlinuz
/casper/initrd.lz
/casper/filesystem.manifest
/casper/filesystem.manifest-remove
/casper/filesystem.size

(I think we could check the whole content of the ISO, minus the versions of the deb files, as the rest is not supposed to change)
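A version-insensitive comparison of /casper/filesystem.manifest against the server copy could look like this sketch; the one-entry-per-line `package version` format is assumed:

```python
def manifest_package_names(manifest_text):
    """Extract package names from a filesystem.manifest-style text,
    one 'package version' entry per line (versions ignored)."""
    names = set()
    for line in manifest_text.splitlines():
        fields = line.split()
        if fields:
            names.add(fields[0])
    return names

def compare_manifests(iso_manifest, server_manifest):
    """Return (missing_from_iso, unexpected_on_iso) package-name sets."""
    iso_names = manifest_package_names(iso_manifest)
    server_names = manifest_package_names(server_manifest)
    return server_names - iso_names, iso_names - server_names
```

Comparing only package names keeps the check stable across daily rebuilds, where versions churn but the package set should not.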

./.disk/info exists and matches the expected release/arch/buildid
wubi.exe is a valid MS Windows executable
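A loose check of ./.disk/info might look like the following; the sample info line is only an assumption about the format, which varies per image:

```python
def check_disk_info(info_text, release, arch, build_id):
    """Loosely verify that a .disk/info line such as
    'Ubuntu 12.04 "Precise Pangolin" - Alpha amd64 (20120201)'
    mentions the expected release name, architecture and build id.
    The format is assumed, so this is substring matching, not parsing."""
    info = info_text.strip()
    return release in info and arch in info and "(%s)" % build_id in info
```

Wrapping the build id in parentheses avoids a version number accidentally matching part of a date elsewhere in the line.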

if report.html exists in the image directory, check that there are no errors (e.g. no conflicting or uninstallable packages)

== d-i based (Server/Alternate) ==
File is a valid bootable ISO
Content of the ISO matches the .list file
File checksum is already done as part of the downloader

Filesystem contains the following files/directories:
./boot
./cdromupgrade
./.disk
./dists
./doc
./install
./install/initrd.gz
./install/mt86plus
./install/netboot
./install/netboot/pxelinux.0
./install/netboot/pxelinux.cfg
./install/netboot/ubuntu-installer
./install/netboot/ubuntu-installer/i386
./install/netboot/ubuntu-installer/i386/boot-screens
./install/netboot/ubuntu-installer/i386/boot-screens/adtxt.cfg
./install/netboot/ubuntu-installer/i386/boot-screens/exithelp.cfg
./install/netboot/ubuntu-installer/i386/boot-screens/f10.txt
./install/netboot/ubuntu-installer/i386/boot-screens/f1.txt
./install/netboot/ubuntu-installer/i386/boot-screens/f2.txt
./install/netboot/ubuntu-installer/i386/boot-screens/f3.txt
./install/netboot/ubuntu-installer/i386/boot-screens/f4.txt
./install/netboot/ubuntu-installer/i386/boot-screens/f5.txt
./install/netboot/ubuntu-installer/i386/boot-screens/f6.txt
./install/netboot/ubuntu-installer/i386/boot-screens/f7.txt
./install/netboot/ubuntu-installer/i386/boot-screens/f8.txt
./install/netboot/ubuntu-installer/i386/boot-screens/f9.txt
./install/netboot/ubuntu-installer/i386/boot-screens/menu.cfg
./install/netboot/ubuntu-installer/i386/boot-screens/prompt.cfg
./install/netboot/ubuntu-installer/i386/boot-screens/rqtxt.cfg
./install/netboot/ubuntu-installer/i386/boot-screens/splash.png
./install/netboot/ubuntu-installer/i386/boot-screens/stdmenu.cfg
./install/netboot/ubuntu-installer/i386/boot-screens/syslinux.cfg
./install/netboot/ubuntu-installer/i386/boot-screens/txt.cfg
./install/netboot/ubuntu-installer/i386/boot-screens/vesamenu.c32
./install/netboot/ubuntu-installer/i386/initrd.gz
./install/netboot/ubuntu-installer/i386/linux
./install/netboot/ubuntu-installer/i386/pxelinux.0
./install/netboot/ubuntu-installer/i386/pxelinux.cfg
./install/netboot/ubuntu-installer/i386/pxelinux.cfg/default
./install/netboot/version.info
./install/README.sbm
./install/sbm.bin
./install/vmlinuz
./isolinux
./md5sum.txt
./pics
./pool
./preseed
./README.diskdefines
./ubuntu

./.disk/info exists and matches the expected release/arch/buildid

if report.html exists in the image directory, check that there are no errors (e.g. no conflicting or uninstallable packages)

[21/03/2012] The tasks related to adding new test cases have been postponed due to the new test harness that is being created. It makes no sense to create these new test cases now and then migrate them to the new harness at the beginning of Q. We are relying on manual testing in the interim.
A wiki page (https://wiki.ubuntu.com/QATeam/AutomatedTesting/BrokenBuild) has been added as a starting point on what a broken image/build is. We will continue to add content to it as the smoke testing develops further during Q.


Work Items

Work items for precise-alpha-1:
[gema] Set rotation schedule for the builds, first week to do it together to learn about the process: DONE
[jibel] Set up the infrastructure to keep a historic of ISOs: DONE
[hggdh2] Create a bug report to list the bugs that have been found during the daily iso testing (using current ISO bug report, after discussion with jibel): DONE
[gema] Decide on a tagging that makes sense and start using it to track ISO testing bugs and other kind of bugs going forward: DONE

Work items for precise-alpha-2:
[patrickmwright] Set up Jenkins to report ISO testing results: DONE
[patrickmwright] Assess how long it would take to improve the Jenkins plugin to be fit for our purpose: DONE
[patrickmwright] Pilot ISO job restructuring: DONE
[patrickmwright] Create custom Dashboard Requirements: DONE
[patrickmwright] Tie jobs to iso under test: DONE
[hggdh2] Document how to interpret Jenkins results https://wiki.ubuntu.com/QATeam/AutomatedTesting/UnderstandingJenkinsResults: DONE
[nuclearbob] Generate a report to track the bugs based on the new tagging: DONE
[jibel] Document how to troubleshoot upgrade testing: https://wiki.ubuntu.com/QATeam/AutomatedTesting/DebuggingJenkinsUpgradeTesting: DONE
[hggdh2] Document how to debug ISO installations: https://wiki.ubuntu.com/QATeam/AutomatedTesting/DebuggingJenkinsISOInstallations: DONE
[hggdh2] Schedule and give a training session for the community on how to use Jenkins: DONE
[hggdh2] Document how to debug post installation problems: https://wiki.ubuntu.com/QATeam/AutomatedTesting/DebuggingJenkinsPostInstallProblems: DONE
Review the Documentation for troubleshooting with Jenkins and update accordingly: DONE
[cjwatson] Show QA where to get CD Image logs and LiveFS logs (build logs), so that they can report on unit tests run at build time (mailed to gema.gomez 2012-01-10): DONE
[gema] Investigate where the smoke testing holes are: DONE
[gema] Review ISO testing trunk and improve it: DONE
[canonical-platform-qa] Start applying the new tags to bugs https://wiki.ubuntu.com/QATeam/AutomatedTesting/TestingTypeAndBugTracking: DONE
[patrickmwright] Set up Lucid daily testing install tests: DONE
[jibel] Discuss with mvo to add software-center tests to daily testing: DONE
[jibel] Add software-center tests to daily testing: DONE

Work items for ubuntu-12.04-beta-1:
[jibel] ISO static validation, write requirement (See note [1] below): DONE
[gema] Gema helping jibel triage ISO testing results; this has to be done in the morning, so a rota with people in the US doesn't solve the problem: DONE
[hggdh2] Give training to the Platform QA Team about Jenkins maintenance: DONE
[jibel] Document how to debug ISO installations when doing manual testing: https://wiki.ubuntu.com/DebuggingISOInstallation: DONE
[gema] Improve the usage of the new tags defined in https://wiki.ubuntu.com/QATeam/AutomatedTesting/TestingTypeAndBugTracking: DONE
[albrigha-deactivatedaccount] Come up with a list of test cases that need to be added to the daily ISO smoke testing: DONE

Work items for ubuntu-12.04-beta-2:
[gema] Define on a wiki what a broken build is: DONE
[albrigha-deactivatedaccount] ISO static validation, write a script that validates the ISO and integrate it into Jenkins: POSTPONED
[albrigha-deactivatedaccount] Compile a list of applications that are installed by default by the ISO installers for Desktop, and propose two or three basic test cases for each that could be run post-install, giving us basic confidence that the ISO is good for further testing: POSTPONED
[jibel] Add a test scenario to the automated testing in Jenkins to address full updated installs with Ubiquity (problems to be detected with this testing: bug 743359 and bug 897680): POSTPONED