Remaining work for a fully functional toolchain CI loop at Ubuntu LEB

Registered by Ricardo Salveti

This session will discuss the current issues we're facing with the toolchain CI builds, look at different solutions, and discuss with the TWG how frequent the builds should be in order to be valid.

This will also cover the validation plans for both native and cross Linaro GCC.

More info at https://linaro-public.papyrs.com/public/4120/LINUX2011-TOOLCHAIN-CI

Blueprint information

Status:
Not started
Approver:
Ricardo Salveti
Priority:
Undefined
Drafter:
None
Direction:
Needs approval
Assignee:
None
Definition:
Discussion
Series goal:
None
Implementation:
Unknown
Milestone target:
None

Related branches

Sprints

Whiteboard

==========

Notes from the session:

 Marcin created a gcc-linaro package and cross-toolchain packages from gcc-ubuntu and gcc-linaro
 cross-toolchain packages need to be tested on x86
 Michael normally runs the gcc testsuite in qemu for his testing
 for build testing, he builds packages natively with their testsuites included
 the problem with testing packages is that it's easy to cross-build them but hard to test them - core packages have some test wrapper script
 some packages have support for pushing binaries to a remote machine for testing - this needs a bit of setup with a board file that tells it where to push the packages for testing (a sketch follows this list)
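
A minimal sketch of what that board-file-driven push could look like, assuming a small JSON board file with hypothetical host/user/dir fields and a test wrapper already present on the board (all names here are illustrative, not the actual wrapper scripts mentioned above):

    #!/usr/bin/env python3
    """Sketch: push cross-built .debs to a board and run its test wrapper."""
    import json
    import subprocess
    import sys

    def push_and_test(board_file, debs):
        # Hypothetical board file layout: {"host": ..., "user": ..., "dir": ...}
        with open(board_file) as f:
            board = json.load(f)
        target = "{user}@{host}".format(**board)
        # Copy the freshly built packages over to the board.
        subprocess.check_call(["scp"] + list(debs) + [target + ":" + board["dir"]])
        # Install them and invoke whatever test wrapper the package set ships
        # (run-tests.sh is a placeholder name).
        subprocess.check_call([
            "ssh", target,
            "sudo dpkg -i {dir}/*.deb && sh {dir}/run-tests.sh".format(**board),
        ])

    if __name__ == "__main__":
        push_and_test(sys.argv[1], sys.argv[2:])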

one approach would be to build the cross compiler in the cloud, build the ARM packages with that compiler, and then submit the built packages to be tested in LAVA (the cross-build half is sketched below)
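
As a rough illustration of the cross-build half, the sketch below fetches a source package and cross-builds it on an x86 instance; the package name, the architecture, and the assumption that a Linaro cross toolchain is already installed are all placeholders:

    #!/usr/bin/env python3
    """Sketch: fetch a source package and cross-build it for ARM on x86."""
    import glob
    import subprocess

    def cross_build(source_package, arch="armel"):
        # Fetch and unpack the source from the configured archive/PPA.
        subprocess.check_call(["apt-get", "source", source_package])
        # apt-get source unpacks into <name>-<version>/; take the first match.
        srcdir = sorted(glob.glob(source_package + "-*/"))[0]
        # -a<arch> selects the host architecture; -us -uc skips signing.
        subprocess.check_call(
            ["dpkg-buildpackage", "-b", "-us", "-uc", "-a" + arch], cwd=srcdir)

    cross_build("zlib")  # example package; the real set is the one Michael provides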

Test support:
 * Native package build
 * Test execution on the package set that was natively built (the current package set is 'build-essential')
 * Cross package build in the cloud, probably via Jenkins
 * Test execution with the packages built in the cloud, after handing the packages and the test request to LAVA
  * Packages and test descriptions get queued in LAVA, results land somewhere (a job sketch follows this list)
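
The queueing step could look roughly like the sketch below, assuming the LAVA scheduler's XML-RPC submit_job() call (as early LAVA exposed it); the job layout, board name, and URLs are placeholders only:

    #!/usr/bin/env python3
    """Sketch: queue a test job in LAVA over XML-RPC."""
    import json
    import xmlrpc.client

    # Indicative job layout only; the real action names and parameters
    # need to match whatever LAVA dispatcher we end up targeting.
    job = {
        "job_name": "toolchain-ci-native-test",
        "target": "panda01",  # placeholder board name
        "timeout": 18000,
        "actions": [
            {"command": "deploy_linaro_image",
             "parameters": {"image": "http://example.org/images/leb.img.gz"}},
            {"command": "lava_test_run",
             "parameters": {"test_name": "toolchain-package-tests"}},
        ],
    }

    server = xmlrpc.client.ServerProxy("https://validation.example.org/RPC2")
    print(server.scheduler.submit_job(json.dumps(job)))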

Stage one:
 1) Native build, native test; cross build
 2) Lower priority - native test of the cross-built package

We would like to have some sort of event notifying us when something fails

1. Currently planning to do daily builds of both the native and cross packages (dailydeb recipe via Launchpad), with the results available via a staging PPA (a sketch of triggering these recipe builds follows this list)
2. Another PPA would have a source package set that we define, for building with the above and running tests - just so that we can get the source
3. Pull the above source packages into a cloud (Jenkins) instance and build them using the binaries from step 1
This feeds a build result into LAVA
4. A job is submitted to LAVA to fetch the unit-test package produced by step 3 and run it natively
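
For step 1, something like the following launchpadlib sketch could kick a recipe build by hand (Launchpad's own daily scheduling normally makes this unnecessary); the team, recipe, PPA, and series names are placeholders, and the exact requestBuild() parameters should be checked against the current API:

    #!/usr/bin/env python3
    """Sketch: request a build of the daily dailydeb recipe via launchpadlib."""
    from launchpadlib.launchpad import Launchpad

    lp = Launchpad.login_with("toolchain-ci", "production")
    owner = lp.people["linaro-toolchain-dev"]          # placeholder team
    recipe = owner.getRecipe(name="gcc-linaro-daily")  # placeholder recipe
    recipe.requestBuild(
        archive=owner.getPPAByName(name="staging"),    # the staging PPA
        distroseries=lp.distributions["ubuntu"].getSeries(name_or_version="oneiric"),
        pocket="Release")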

This could be used to cross/native build the benchmarks and run the tests

Michael Hope would like to archive the old packages somewhere - we need to dump them to a directory on a web server to preserve them (could probably use the NAS box in the lab)
ACTION: Marcin to check if the bzr recipe is still broken after Launchpad updated the bzr package to a newer version
ACTION: Danilo to check if there is a way to disable package purging once a new package is pushed to the PPA
ACTION: Michael to provide the package list that we'd use to test the toolchain
Marcin:
 - Create the test definition in LAVA for the native test
 - Create the test definition in Jenkins to run in the cloud
    - Also push the results to LAVA's dashboard (see the sketch below)
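
Pushing the results into the dashboard could then look like the sketch below, assuming the dashboard's XML-RPC put() call as exposed by early LAVA; the bundle content, stream path, and server URL are placeholders:

    #!/usr/bin/env python3
    """Sketch: push a result bundle from the cloud build into LAVA's dashboard."""
    import json
    import xmlrpc.client

    bundle = {
        "format": "Dashboard Bundle Format 1.3",  # format string is indicative
        "test_runs": [],                          # filled in by the build job
    }

    server = xmlrpc.client.ServerProxy("https://validation.example.org/RPC2")
    # put(content, content_filename, stream pathname), as in early LAVA tools.
    server.dashboard.put(json.dumps(bundle), "gcc-cross-build.json",
                         "/anonymous/toolchain-ci/")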


Work Items

Work items:
Check if the bzr recipe is still broken after Launchpad updated the bzr package to a newer version: DONE
[danilo] check if there is a way to disable package purging once a new package is pushed to the PPA: TODO
[michaelh1] provide the package list that we'd use to test the toolchain: TODO
Create the test definition in LAVA for the native test: TODO
Create the test definition in Jenkins to run in the cloud: TODO
Push the results to LAVA's dashboard: TODO

Dependency tree

