Linaro Dashboard Improvements

Registered by Paul Larson

Find out if there are any use cases not met by recent improvements in Launch Control, and what should be done to improve it further.

Blueprint information

Paul Larson
Zygmunt Krynicki
Needs approval
Andy Doan
Series goal: Accepted for trunk
Milestone target:
Completed by Michael Hudson-Doyle

User stories:

As a LAVA user I'd like to learn about the database schema so that I can write new data views.

As a LAVA user I'd like to learn about existing data views so that I may reuse them in my reports
 - done in 2011.06

As a LAVA user I'd like to learn about the bundle format (schema) so that I may better understand what information can be stored in the system.

As a LAVA user I'd like to see attachments associated with any test run so that I may benefit from custom data stored in this way. (see = download + inline viewer for selected MIME types)
 - Done via template update, but needs to be integrated into the release

As a LAVA user I'd like to sort some tabular data sets by any visible column to quickly locate the most interesting data.

As a LAVA user I'd like to quickly see all the test failures without having to browse through successful tests.
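
The two stories above can be illustrated with a small sketch. Everything here is hypothetical (column names, rows, and helper names are made up, and the real dashboard would sort via the Django ORM rather than in Python); it only shows the sort-by-column and failures-only ideas:

```python
from operator import itemgetter

# Made-up column layout for a tabular test-result view.
COLUMNS = ["test_case", "result", "duration"]

rows = [
    ("boot", "pass", 12.4),
    ("stream", "fail", 98.1),
    ("ltp", "pass", 640.0),
]

def sort_rows(rows, column, descending=False):
    """Return rows ordered by the named column."""
    index = COLUMNS.index(column)
    return sorted(rows, key=itemgetter(index), reverse=descending)

def only_failures(rows):
    """Hide successful tests so failures are easy to spot."""
    result_index = COLUMNS.index("result")
    return [row for row in rows if row[result_index] == "fail"]
```

In a Django view the equivalent would be something like `queryset.order_by(column)` plus a `filter(result="fail")`, which pushes both operations into the database.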

As a tester, I want to be notified when a test fails

As a LAVA user I'd like to see uploaded bundles in their original format (JSON) with some level of pretty-printing (syntax + colors + structure) so that I may understand the format better.
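
A minimal sketch of the pretty-printing part of this story using only the standard library (the payload is illustrative, not the real bundle schema; syntax colours would be layered on top in the web UI):

```python
import json

# Illustrative bundle-like payload; the real format is defined by
# linaro-dashboard-bundle and is richer than this.
raw = '{"format": "example", "test_runs": []}'

# Re-serialise with indentation to expose the structure.
pretty = json.dumps(json.loads(raw), indent=2, sort_keys=True)
print(pretty)
```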

As a LAVA user with slow/flaky Internet connection I would like to pull tests from production system quickly. I would benefit if bundle format was compressed by default.
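
As a rough sketch of the compression idea (the payload is illustrative, and any wire-protocol details are out of scope), standard gzip already shrinks repetitive JSON considerably and round-trips losslessly:

```python
import gzip
import json

# Illustrative bundle text; repetitive JSON like test results compresses well.
bundle_text = json.dumps(
    {"format": "example", "test_runs": [{"test_id": "stream"}] * 50})

compressed = gzip.compress(bundle_text.encode("utf-8"))
restored = gzip.decompress(compressed).decode("utf-8")
```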

As a LAVA server admin I would benefit if the LAVA dashboard could process uploaded test results faster, as this currently blocks other parts of the system. (This user story is a stretch, but it is still worth investigating why put() is so slow and improving it.)
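
The first step for this story is measurement. The sketch below is a generic micro-benchmark helper; `fake_put` is a made-up stand-in, not the real DashboardAPI.put():

```python
import time

def benchmark(func, *args, repeats=5):
    """Call func repeatedly and return the best wall-clock time in seconds.
    Taking the minimum of several runs reduces noise from other processes."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        func(*args)
        best = min(best, time.perf_counter() - start)
    return best

# Hypothetical stand-in for the slow code path; the real put() would be
# timed the same way, then examined with cProfile on the worst offender.
def fake_put(bundle_text):
    return sum(len(line) for line in bundle_text.splitlines())

elapsed = benchmark(fake_put, "{}\n" * 10000)
```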

As a LAVA evaluator I'd like to discover how the LAVA software stack is organized so that I understand how to use/deploy LAVA in my organization or group. (Provide a visual overview of the various LAVA layers.)

As a non-core LAVA developer I'd like to understand what extension interfaces are available at each LAVA stack layer so that I can integrate my custom solution.

As a LAVA user submitting results from a new component, I would like to see if my bundle failed to deserialize and understand where the error is. (This could be coupled with the raw bundle viewer, assuming the JSON itself can be parsed, which would highlight the relevant part of the object and the relevant part of the bundle schema.)
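
The raw-JSON half of this story can be sketched with the standard library alone (the function name and message format are invented here; schema-level validation would need the actual bundle schema):

```python
import json

def describe_bundle_error(text):
    """Return None if text is valid JSON, otherwise a human-readable
    location (line and column) of the first syntax error."""
    try:
        json.loads(text)
        return None
    except json.JSONDecodeError as err:
        return "line %d, column %d: %s" % (err.lineno, err.colno, err.msg)

# A deliberately broken bundle: ']' is missing before '}'.
broken = '{"format": "example", "test_runs": [}'
```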

Etherpad copy:

Welcome to Ubuntu Developer Summit!

- list the topics we'd like to see implemented in the next 6 months
- roughly agree on the scope of each change
Proposed topics in order of relative priority:
- integration with lava-server (lava application container) and renaming
- transition to linaro-django-xmlrpc and auth token support
- online developer documentation for report authors
  - tutorial + examples + restructured text online
- online documentation for the bundle format
 - online schema documentation based on schema in linaro-dashboard-bundle
 - online bundle viewer (pure json + syntax highlighting)
 - do something sensible with attachments (click to download)
- reevaluating views that may become useless when we get lots of streams/bundles/results
 - sort by column
  - look to see if there is a good existing django solution for this
 - choose how many items to see on one page
 - click on test failures and show just the failures
- discoverability of data views
- performance analysis and improvement of DashboardAPI.put()
 - benchmark, then optimize
 - maybe do background processing of bundles, but only if required
- storage efficiency for bundles/attachments
 - compress original bundle on disk
- reconsideration of privacy requirements
 - baseline "security"
  - ACTION: asac to write a disclaimer/banner; zyga to integrate that being displayed on all report/bundlestream pages; once we have a good set of benchmarks being visualized, asac to give a heads up to the TSC
  - for the case of benchmarks whose results cannot be published by license: private instance
- how to support baselines, how to notify interested parties
 + notification doesn't really need to be in the dashboard or LAVA at all; this feature can be implemented on the client/report side by applying its own logic to data-view results.
 + baseline can be extracted through data views, for example by using a specific test result (from the beginning of the cycle, or from running plain upstream gcc)
 + ACTION: we need to go back to WGs that implement benchmarks to explain what kind of baseline they really need
ACTION: spring to submit a bug about dashboard lc-tool not being able to use proxies (reference
ACTION: Need https certificate
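
The client-side notification idea from the notes above could look roughly like this; the row format, baseline shape, and function names are all assumptions, not a real LAVA interface:

```python
def find_regressions(rows, baseline):
    """Return test IDs that pass in the baseline but fail in the new rows."""
    return [test_id for test_id, result in rows
            if result == "fail" and baseline.get(test_id) == "pass"]

def notify(regressions, send=print):
    """Emit one message per regression; 'send' could post email, IRC, etc."""
    for test_id in regressions:
        send("Test regressed against baseline: %s" % test_id)

# Imagined data: a baseline captured at the start of the cycle, and the
# rows a data view would return for the latest run.
baseline = {"boot": "pass", "ltp": "pass", "stream": "fail"}
current = [("boot", "pass"), ("ltp", "fail"), ("stream", "fail")]
notify(find_regressions(current, baseline))
```

Note that "stream" is not reported: it fails now but also failed in the baseline, so only genuine regressions trigger a message.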
As a results submitter, I would like to see if my bundle didn't deserialize and better understand where the error is.


Work Items

