Improving automated certification testing of Kernel SRUs

Registered by Ara Pulido

The Hardware Certification team routinely performs certification testing of Kernel SRUs for recent releases. The objective is to run a small test suite on every system that has been certified with stock Ubuntu and ensure that no regressions have occurred. This complements the testing done by QA through the sheer range of hardware components and drivers exercised in the testing. The goal is to protect Ubuntu users with a certified system from suddenly finding that something important doesn't work.

Currently the test suite consists of just over 20 tests, as defined at https://wiki.ubuntu.com/Kernel/kernel-sru-workflow/CertificationTestSuite. These tests are unlikely to fail unless there has been a major regression, and because the testing must be automated (essential given the volume of systems that need to be covered in a short period of time) there are no manual tests. As a result, even though regressions have been encountered in SRU updates, the certification testing process has not picked them up: some were bugs that could only be discovered through manual testing, and some were fairly obscure corner cases that the test suite simply didn't address.

The goal of this blueprint is to significantly increase the coverage of the SRU test suite within two constraints: all tests must be automated, and there must be a sensible limit on the execution time of the test suite (to be discussed, but a guideline would be 30 minutes to 1 hour) so that SRUs can still be released in a timely manner. Each test should have as high a value as possible, so the suite must focus on the most important hardware functionality first. Wireless testing is non-existent, and testing of graphics and audio is not as thorough as it should be for such important pieces of functionality, so we intend to increase coverage in these areas at least (see the wireless test sketch below). When considering new tests to add, we will look to keep the SRU test suite synchronized with the test suite used to certify systems. This means that new SRU tests should in the first instance be sourced from the certification test suite; if no suitable test is found there, any new test created should also be included in the certification test suite.
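
To illustrate what a fully automated wireless test might look like, here is a minimal sketch. It assumes NetworkManager's nmcli is available on the system under test; the CERT_AP_SSID environment variable naming the lab access point is a hypothetical convention, not part of the existing tooling.

{{{
#!/usr/bin/env python3
"""Sketch of an automated wireless scan test (illustrative only).

Assumptions: nmcli is installed on the system under test, and the lab
access point's SSID is exported in the hypothetical CERT_AP_SSID
environment variable.
"""
import os
import subprocess
import sys


def visible_ssids():
    # Terse, machine-readable list of SSIDs the wireless card can see.
    out = subprocess.check_output(
        ["nmcli", "-t", "-f", "SSID", "dev", "wifi", "list"],
        universal_newlines=True)
    return {line.strip() for line in out.splitlines() if line.strip()}


def main():
    expected = os.environ.get("CERT_AP_SSID")
    if not expected:
        print("CERT_AP_SSID not set; cannot run test", file=sys.stderr)
        return 1
    if expected in visible_ssids():
        print("PASS: access point %s is visible" % expected)
        return 0
    print("FAIL: access point %s not found in scan" % expected,
          file=sys.stderr)
    return 1


if __name__ == "__main__":
    sys.exit(main())
}}}

A real test would also need to associate with the access point and pass traffic, but even a scan check like this is fully automated and runs in seconds, which fits the execution-time budget above.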

Initial thoughts on what needs to be tested are here: https://docs.google.com/document/d/1fHfOnnnVCXsSayz3XlaL9hgrl-uTBXHx7Ijbkc6aKTo/edit?hl=en_US

== Agenda ==

* Introduction and overview
* Wireless testing
* Video testing
* Audio testing
* Q&A/Comments

Blueprint information

Status: Not started
Approver: Ara Pulido
Priority: Essential
Drafter: Brendan Donegan
Direction: Approved
Assignee: Brendan Donegan
Definition: Approved
Series goal: None
Implementation: Not started
Milestone target: None

Whiteboard

== Definition of done ==

• Tests identified with an X in the document above are implemented and included in the SRU test suite for all releases.
• Enhancements are made to the lab to enable the tests which require extra equipment (USB devices, media cards, KVMs and webcams for screenshots; access points for wireless testing)
• Suitable tests from the certification test suite are included in the SRU test suite.
• Tests from the certification test suite are updated to make them suitable for the SRU test suite (i.e. fully automated, reasonable runtime, reliable; see the runtime sketch after this list)
• Functional areas where coverage is lacking are identified (e.g. graphics, wireless, audio)
• New tests which increase coverage in these areas are included in the SRU test suite (and the certification test suite if they don't already exist there).
• All new tests are run as part of a 'trial' SRU test run (which uses the SRU test suite, but not proposed packages).
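
To make the "reasonable runtime" requirement concrete, one possible approach (a sketch only, not an agreed design) is to wrap each test command in a per-test timeout so that a single hung test cannot consume the suite's overall time budget. The run_test helper and its 300-second default below are arbitrary placeholders.

{{{
#!/usr/bin/env python3
"""Sketch of a per-test timeout guard (illustrative only).

The run_test helper and its 300-second default are placeholders,
not part of the existing certification tooling.
"""
import subprocess
import sys


def run_test(command, timeout=300):
    """Run one test command; treat exceeding the timeout as a failure."""
    try:
        result = subprocess.run(command, timeout=timeout)
    except subprocess.TimeoutExpired:
        print("TIMEOUT: %s" % " ".join(command), file=sys.stderr)
        return False
    return result.returncode == 0


if __name__ == "__main__":
    # Usage sketch: timeout_guard.py <test-command> [args...]
    ok = run_test(sys.argv[1:] or ["true"])
    sys.exit(0 if ok else 1)
}}}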

Work Items