How could we automate certification testing
Current certification testing is largely manual. What approach or method could we follow to make it entirely, or at least 90%, automated?
A possible approach could be to base 90% of certification testing on component testing (both software and hardware), verified via API drivers (i.e. not through application UIs), and the remaining 10% on system use cases. The aim would be for the 90% of the test suite to provide a solid confidence level that the system performs well on a given piece of hardware (and so be cheap enough to run frequently), and for the 10% to test the integration of components and confirm that the overall user experience remains at the desired level.
An example would be automated component tests (a few of these are sketched in code after the list) for:
* establishing and verifying a network connection
* HTTP/IP stack testing
* data streams
* graphics rendering (displaying a specific pattern that can be captured and easily validated programmatically)
* audio playback (checking for exceptions or panics while performing this task)
* launching Firefox and verifying that no exceptions or panics are reported
* file system
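A minimal sketch of three of these checks, driven purely through APIs (no UI). This assumes a Linux target with Python 3; the host name, URL, and the use of dmesg are illustrative assumptions, not requirements of this blueprint:

    import socket
    import subprocess
    import urllib.request

    def test_network_connection(host="archive.ubuntu.com", port=80, timeout=5):
        """Establish and verify a TCP connection with no UI involved."""
        with socket.create_connection((host, port), timeout=timeout):
            pass  # reaching this line means the connection came up

    def test_http_ip_stack(url="http://archive.ubuntu.com/", timeout=10):
        """Exercise the HTTP/IP stack by fetching a page over the network."""
        with urllib.request.urlopen(url, timeout=timeout) as response:
            assert response.status == 200

    def test_no_panics_in_kernel_log():
        """Scan the kernel ring buffer for panics or oopses after a run
        (reading dmesg may need elevated privileges on newer kernels)."""
        log = subprocess.run(["dmesg"], capture_output=True,
                             text=True, check=True).stdout.lower()
        assert "panic" not in log
        assert "oops" not in log

These are written as pytest-style test functions so they could slot straight into an automated suite and run on every image.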
Manual test case:
* launch Firefox and play a YouTube video
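To make the 90/10 split operational, component and system tests could be tagged so the automated 90% runs frequently while the system use cases run before release. A sketch using pytest markers; the marker names, the framebuffer geometry, and the assumption that the rendering component has already painted the screen solid red are all hypothetical:

    import pytest

    # Hypothetical markers; they would be registered under "markers ="
    # in pytest.ini to avoid warnings.
    component = pytest.mark.component  # the automated ~90%, run frequently
    system = pytest.mark.system        # integration/user-experience ~10%

    @component
    def test_filesystem_roundtrip(tmp_path):
        """Component check: a file written to disk reads back intact."""
        probe = tmp_path / "probe.bin"
        payload = bytes(range(256)) * 16
        probe.write_bytes(payload)
        assert probe.read_bytes() == payload

    @component
    def test_rendered_pattern_is_solid_red():
        """Graphics check: sample the Linux framebuffer and validate the
        pixels, assuming /dev/fb0 is 1024x768 in 32-bit BGRA and the test
        pattern (solid red) has already been rendered."""
        width, height, bpp = 1024, 768, 4
        with open("/dev/fb0", "rb") as fb:
            data = fb.read(width * height * bpp)
        for i in range(0, width * height, 997):  # sample, don't scan all
            b, g, r = data[i * bpp:i * bpp + 3]
            assert (r, g, b) == (255, 0, 0)

    # Run the frequent suite:          pytest -m component
    # Run everything before release:   pytest -m "component or system"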
Blueprint information
- Status: Not started
- Approver: None
- Priority: Undefined
- Drafter: Victor Tuson Palau
- Direction: Needs approval
- Assignee: None
- Definition: New
- Series goal: None
- Implementation: Unknown
- Milestone target: None
- Started by: None
- Completed by: None
Whiteboard
----------------
* Talk to Kevin (krafty) and determine where the manual test requirements came from
* Spec out a set of tests to be automated:
- Follow up on the list generated during the COP
- Review consumer test recommendations
- Determine which tests are most expensive to run manually
Examples are:
- Application tests
- Network
- Sound and video
Action Items:
- Move application testing to QA (Ameet to follow up with Carlos and Marjo)
- For the remaining tests (Network, Sound and video):
- Get feedback from other teams for implementation ideas in the 3 test areas from the generated list
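For the Sound and video area, one implementation idea is a component-style audio check that drives the player through its command line and treats any error output as a failure. A sketch assuming aplay from alsa-utils and the sample WAV it ships with; both are assumptions about the image under test:

    import subprocess

    def test_audio_playback_reports_no_errors():
        """Play a short sample through ALSA and fail on any error output."""
        sample = "/usr/share/sounds/alsa/Front_Center.wav"  # assumed path
        result = subprocess.run(["aplay", sample], capture_output=True,
                                text=True, timeout=30)
        assert result.returncode == 0, result.stderr
        assert "error" not in result.stderr.lower()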