HW Accelerated Video Decode and Rendering support

Registered by Ricardo Salveti

Currently the Ubuntu Touch-based image delivers full hardware-accelerated video decode and rendering support by reusing the Android Media Player via libhybris.

While this solution works, it creates issues for moving forward with desktop convergence, where GStreamer is generally used instead.

Another goal we need to keep in mind is reducing the number of running Android services by isolating just the Android Hardware Abstraction Layer, so that our common, generic stack can make use of it directly (see https://blueprints.launchpad.net/ubuntu/+spec/client-1303-sound-support-pulse-audioflinger for a similar discussion related to audio support).

As the GStreamer SDK now delivers Android support, one possible solution would be to isolate the media bits responsible for video decode/rendering and use GStreamer on top of them. That would allow GStreamer to be used in our convergence story.

Goals with this session:
 * Describe the current solution and cover what is not desirable from the convergence point of view
 * Investigate the work done for the GStreamer Android SDK and evaluate whether it would indeed be a feasible solution
 * Understand how DRM fits into the picture (it might be required by hardware vendors)
 * Propose a future convergence plan

Blueprint information

Status:
Complete
Approver:
Bill Filler
Priority:
Undefined
Drafter:
Ricardo Salveti
Direction:
Approved
Assignee:
Jim Hodapp
Definition:
New
Series goal:
Proposed for saucy
Implementation:
Implemented
Milestone target:
ubuntu-13.10
Started by
Jim Hodapp
Completed by
Jim Hodapp

Whiteboard

* Discuss the current media stack implementation
    * libstagefright, hybris, qtubuntu-media, apps (mediaplayer-app, camera-app, etc.)
    * Some downsides: libstagefright is not as fully featured as GStreamer:
        * limited sources, limited filtering and post-processing
    * An upside: fully integrated into many mobile hardware platforms for accelerated
      encoding/decoding.

* Use of the GStreamer SDK on Android:
    * Research what works today: how complete is this solution?
    * What does the architecture look like and how exactly will it best fit into the Ubuntu Touch
      platform?
    * Convergence goal: allow for a platform converged solution that just works on the range of
      devices: phone, tablet, desktop, tv

* DRM
    * Research what currently exists in Android
        * How much of this code is in the core C/C++ code?
        * How much is part of the Java layer, and thus unusable by Ubuntu Touch?
        * How does Android handle DRM keys from an application perspective and is this reusable by
          Ubuntu Touch?

== Current Plan/Issues ==

(?)

Work Items

Work items for ubuntu-13.04-month-5:
[jhodapp] Discuss with community team and create an Ubuntu Touch media-specific freenode channel: DONE
[jhodapp] Experiment with GStreamer, eglglessink and openslessink on our target platforms (see the sketch after this list): DONE
[jhodapp] Talk to pat-mcgowan about codec licensing requirements; also look at the Smartphone PRD spreadsheet: DONE
[pat-mcgowan] Codecs and licensing requirements: DONE
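
A minimal sanity-check sketch in C for the eglglessink/openslessink experiment above. It is illustrative only and assumes both elements (from gst-plugins-bad/opensles) are built and registered with GStreamer on the target image; it simply pushes test video and audio through the hardware-backed sinks.

    /* Illustrative sketch: assumes eglglessink and openslessink are available. */
    #include <gst/gst.h>

    int main(int argc, char *argv[])
    {
        gst_init(&argc, &argv);

        /* Two independent branches: test pattern into the EGL/GLES video sink,
         * test tone into the OpenSL ES audio sink. */
        GstElement *pipeline = gst_parse_launch(
            "videotestsrc num-buffers=300 ! eglglessink "
            "audiotestsrc num-buffers=300 ! openslessink", NULL);
        if (pipeline == NULL)
            return 1;

        gst_element_set_state(pipeline, GST_STATE_PLAYING);

        /* Block until EOS or an error is posted on the pipeline bus. */
        GstBus *bus = gst_element_get_bus(pipeline);
        GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
                GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
        if (msg != NULL)
            gst_message_unref(msg);
        gst_object_unref(bus);

        gst_element_set_state(pipeline, GST_STATE_NULL);
        gst_object_unref(pipeline);
        return 0;
    }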

Work items for ubuntu-13.04-month-6:
[jhodapp] Flesh out our use cases with thomas.voss and lool: DONE
[jhodapp] Enable network (web) playback of media with Android layer: POSTPONED
[thomas-voss] Decide which API we want to expose (GStreamer, platform-api, QtMultimedia, or a combination) and whether people can implement their own sinks, sources and filters or only use pre-canned use cases; decision: GStreamer + QtMultimedia: DONE
[thomas-voss] Find out about DRM requirements: DONE
[pat-mcgowan] Research on DRM requirements and schemes we want to support: DONE

Work items for ubuntu-13.07:
[jhodapp] Create a C wrapper for hybris around the libstagefright MediaCodec and MediaCodecList classes (see the API sketch after this list): DONE
[jhodapp] Convert the gstamc GStreamer plugin from calling into the Dalvik VM for MediaCodec access to using the new MediaCodec/MediaCodecList hybris wrapper: DONE
[jhodapp] Create an efficient GStreamer video rendering sink for raw decoded video in the GStreamer/MediaCodec hybrid pipeline: DONE
[jhodapp] Merge libhybris media codec layer with upstream: DONE
[jhodapp] Test Internet streaming (e.g. HTTP) of h.264 video: DONE
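
Purely for illustration, the kind of C API such a hybris wrapper could expose might look like the header below. All type and function names here are hypothetical and are not the actual libhybris interface; the point is only to show how the C++ MediaCodec/MediaCodecList classes can be hidden behind opaque handles and plain C calls that the gstamc plugin can consume without JNI.

    /* Hypothetical header for illustration only; the real libhybris media
     * compatibility layer may use different names and signatures. */
    #ifndef HYBRIS_MEDIA_CODEC_LAYER_H
    #define HYBRIS_MEDIA_CODEC_LAYER_H

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>

    #ifdef __cplusplus
    extern "C" {
    #endif

    /* Opaque handle hiding the underlying android::MediaCodec instance. */
    typedef struct MediaCodecDelegate MediaCodecDelegate;

    /* Enumeration of the codecs advertised by libstagefright's MediaCodecList. */
    size_t media_codec_list_count_codecs(void);
    const char *media_codec_list_get_codec_name(size_t index);
    bool media_codec_list_is_encoder(size_t index);

    /* Lifecycle: create and configure a decoder for a MIME type such as "video/avc". */
    MediaCodecDelegate *media_codec_create_decoder_by_type(const char *mime);
    int media_codec_configure(MediaCodecDelegate *codec, int width, int height,
                              void *native_window);
    int media_codec_start(MediaCodecDelegate *codec);

    /* Data flow: feed encoded input, pull decoded output, hand buffers back. */
    int media_codec_queue_input_buffer(MediaCodecDelegate *codec,
                                       const uint8_t *data, size_t size,
                                       int64_t presentation_time_us);
    int media_codec_dequeue_output_buffer(MediaCodecDelegate *codec,
                                          size_t *index, int64_t timeout_us);
    int media_codec_release_output_buffer(MediaCodecDelegate *codec,
                                          size_t index, bool render);

    int media_codec_stop(MediaCodecDelegate *codec);
    void media_codec_release(MediaCodecDelegate *codec);

    #ifdef __cplusplus
    }
    #endif

    #endif /* HYBRIS_MEDIA_CODEC_LAYER_H */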

Work items for ubuntu-13.08:
[jhodapp] Test audio playback with new video plugins: DONE
[jhodapp] Port gstamcaudiodec.c from JNI calls to libhybris media wrapper: POSTPONED
[jhodapp] Make gstamc and mirsink work with playbin (see the sketch after this list): DONE
[jhodapp] Get the playbin pipeline working with the QtMultimedia backend on Ubuntu Touch: DONE
[jhodapp] Test playbin playback with video and audio on the PulseAudio image: DONE
[rsalveti] Test the QtMultimedia GStreamer 1.0 backend on Ubuntu Touch: DONE
[laney] Package GStreamer 1.1.3 with new decoder and sink plugins and place in a PPA: DONE
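
A rough sketch of the playbin integration above, illustrative only: it assumes the hybris-backed decoder plugin is installed, that the video sink is registered under the element name mirsink, and the media URI is a placeholder. QtMultimedia's GStreamer backend builds a comparable playbin pipeline internally, which is what the QtMultimedia work items above exercise.

    /* Illustrative sketch: playbin auto-plugs the hybris-backed decoder and
     * renders through mirsink (assumed element name); the URI is a placeholder. */
    #include <gst/gst.h>

    int main(int argc, char *argv[])
    {
        gst_init(&argc, &argv);

        GstElement *playbin = gst_element_factory_make("playbin", "player");
        GstElement *videosink = gst_element_factory_make("mirsink", "videosink");
        if (playbin == NULL || videosink == NULL)
            return 1;

        /* Route decoded frames to the Mir-backed sink instead of the default one. */
        g_object_set(playbin,
                     "uri", "file:///home/phablet/Videos/sample-h264.mp4",
                     "video-sink", videosink,
                     NULL);

        gst_element_set_state(playbin, GST_STATE_PLAYING);

        /* Wait for end of stream or an error. */
        GstBus *bus = gst_element_get_bus(playbin);
        GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
                GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
        if (msg != NULL)
            gst_message_unref(msg);
        gst_object_unref(bus);

        gst_element_set_state(playbin, GST_STATE_NULL);
        gst_object_unref(playbin);
        return 0;
    }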

Work items for ubuntu-13.09:
[jhodapp] Test play/pause/seek in the mediaplayer-app with the new GStreamer backend: DONE
[jhodapp] Update the mediaplayer-app debian package to reflect new dependencies: DONE
[jhodapp] Port the thumbnail generator to GStreamer 1.0 (see the sketch after this list): TODO
[rsalveti] Create FFe for the gst-plugins-bad1.0 changes: DONE
[rsalveti] Work with laney to get Jim's work into the archive: DONE
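
For the thumbnail generator port, a rough GStreamer 1.0 sketch (illustrative only; the URI, seek position and output handling are placeholders, not the thumbnailer's actual interface) could preroll playbin and grab a frame via its "convert-sample" action signal:

    /* Illustrative sketch: preroll, seek, then pull one RGB frame for a thumbnail. */
    #include <gst/gst.h>

    int main(int argc, char *argv[])
    {
        gst_init(&argc, &argv);

        GstElement *playbin = gst_element_factory_make("playbin", "thumbnailer");
        if (playbin == NULL)
            return 1;
        g_object_set(playbin, "uri", "file:///home/phablet/Videos/sample-h264.mp4", NULL);

        /* Prerolling in PAUSED decodes a first frame without starting playback. */
        gst_element_set_state(playbin, GST_STATE_PAUSED);
        gst_element_get_state(playbin, NULL, NULL, 5 * GST_SECOND);

        /* Seek a little into the clip so the thumbnail is not a black lead-in frame. */
        gst_element_seek_simple(playbin, GST_FORMAT_TIME,
                GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT, 5 * GST_SECOND);
        gst_element_get_state(playbin, NULL, NULL, 5 * GST_SECOND);

        /* Ask playbin to convert the current frame to raw RGB. */
        GstCaps *caps = gst_caps_new_simple("video/x-raw",
                "format", G_TYPE_STRING, "RGB", NULL);
        GstSample *sample = NULL;
        g_signal_emit_by_name(playbin, "convert-sample", caps, &sample);
        gst_caps_unref(caps);

        if (sample != NULL) {
            /* The mapped buffer holds the RGB pixels to scale/encode as a thumbnail. */
            GstBuffer *buffer = gst_sample_get_buffer(sample);
            GstMapInfo info;
            if (gst_buffer_map(buffer, &info, GST_MAP_READ)) {
                g_print("Got %" G_GSIZE_FORMAT " bytes of RGB data\n", info.size);
                gst_buffer_unmap(buffer, &info);
            }
            gst_sample_unref(sample);
        }

        gst_element_set_state(playbin, GST_STATE_NULL);
        gst_object_unref(playbin);
        return 0;
    }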

Work items for ubuntu-13.11:
[jhodapp] Clean up and optimize the new GStreamer plugins to get them ready for upstream submission: INPROGRESS
