Sensor service

Registered by Thomas Voß on 2013-04-17

Tracks the implementation status of the sensor service that enables applications to consume sensor events, both as raw sensor readings and as fused results.

Blueprint information

Thomas Voß
Thomas Voß
Michael Frey
Series goal:
Milestone target:
Started by Thomas Voß on 2013-05-23

Related branches



Current mobile devices offer a multitude of different sensors, e.g., accelerometers, magnetometers, compasses and ambient light sensors. These sensors provide information about a device’s position, acceleration and orientation in three-dimensional space, and they offer a wealth of raw data about the device’s surroundings. Applications want to access a device’s sensor readings to integrate deeply with the platform, to present users with information distilled from the raw sensor data, or to provide new and natural ways of interacting with the device and its applications, e.g., shaking a phone to skip a track in an audio player, stabilizing camera images in low-light conditions that require long exposure times, or augmented reality applications that require a deep understanding of the “world”.

The Ubuntu platform should provide access to raw and interpreted sensor readings to developers, with a central service that satisfies the following requirements:

  * Ease of use: Client applications should not need to care about where sensor data originates or about the low-level system interfaces for accessing the actual HW. We want applications to reason in terms of data sources and the measurements obtained from them. On top of this, we want to provide applications with the results of sensor fusion calculations that are carried out in a central place. That is, in line with the diagram presented before, we want to provide a certain set of interpreted data to applications, in addition to the raw sensor readings.
  * Power efficiency and security: We want to funnel all incoming sensor data from the actual HW through a central service that aggregates and processes the data, making sure that calculations that are expensive in terms of CPU cycles and power consumption are carried out only once. Moreover, sensor readings can pose a risk to a user’s privacy, and we want to make sure that only authorized applications can obtain the respective data.
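The “data sources and measurements” abstraction described above could look roughly like the following C++ sketch. All type and function names here are illustrative assumptions, not the actual Ubuntu platform API:

```cpp
#include <functional>
#include <string>
#include <utility>
#include <vector>

// Hypothetical client-facing abstraction: applications reason about
// "data sources" and subscribe to measurements; the central service
// owns the low-level HW access, power management and access control.
struct Measurement {
    double values[3];   // e.g., x/y/z for an accelerometer, in m/s^2
    long   timestamp;   // monotonic timestamp, e.g., nanoseconds
};

class DataSource {
public:
    explicit DataSource(std::string name) : name_(std::move(name)) {}

    // A client registers interest; the central service decides how the
    // underlying sensor is powered and rate-limited.
    void subscribe(std::function<void(const Measurement&)> observer) {
        observers_.push_back(std::move(observer));
    }

    // Invoked by the central service when new data arrives; every
    // subscribed observer sees the same aggregated reading.
    void deliver(const Measurement& m) {
        for (const auto& o : observers_) o(m);
    }

    const std::string& name() const { return name_; }

private:
    std::string name_;
    std::vector<std::function<void(const Measurement&)>> observers_;
};
```

A client would then subscribe a callback to, say, an “accelerometer” source and receive readings without ever opening a device node itself; the one service behind the source is what makes the single, shared fusion computation and the privacy gate possible.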

Finally, a list of the sensors that should be exposed to developers ([R] refers to raw sensor readings, [F] refers to interpreted readings):

[R] Accelerometer No
[R] Magnetometer No
[R] Gyroscope No
[R] Altimeter No
[R] Temperature No
[R] Proximity No
[R] Light No
[R] Gravity No
[R] Pressure No
[R] Compass No

[F] Linear Acceleration Yes
[F] Rotation Vector Yes
[F] Orientation Yes
[F] Rotation Matrix Yes
[F] Azimuth, Pitch and Roll Yes
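As an illustration of how an interpreted [F] reading relates to a raw [R] one, the following sketch derives pitch and roll from a single accelerometer sample of a device at rest, where the sensor measures only gravity. This is a textbook formula under an assumed axis convention, not the service’s actual fusion algorithm:

```cpp
#include <cmath>

// Illustrative only: with the device at rest, the accelerometer reading
// (ax, ay, az) is the gravity vector, and one common convention gives
//   roll  = atan2(ay, az)
//   pitch = atan2(-ax, sqrt(ay^2 + az^2))
// A full azimuth additionally needs the magnetometer, which is why
// "Azimuth, Pitch and Roll" is a fused [F] result, not a raw reading.
struct Attitude {
    double pitch;  // radians
    double roll;   // radians
};

Attitude attitude_from_gravity(double ax, double ay, double az) {
    Attitude a;
    a.roll  = std::atan2(ay, az);
    a.pitch = std::atan2(-ax, std::sqrt(ay * ay + az * az));
    return a;
}
```

For a device lying flat (reading roughly (0, 0, 9.81)), both angles come out as 0; tilting the device 90° about its x-axis moves the gravity vector into y and the roll to π/2.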

Output of the Hangout on Air about Sensors:


Work Items

Pull over functionality from the former AAL+: DONE
Iterate sensor-access bits in the platform API: DONE
Provide an iteration 0 of the service, implemented with the help of Android's SensorService: DONE
Expose sensor-functionality to SDKs (QML/JS): TODO
Transition the Android SensorService to Ubuntu: TODO
Expose dummy sensors for testing purposes: TODO
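The “dummy sensors” work item could, for instance, amount to something like the following sketch: a fake sensor that replays scripted readings so client code can be exercised without real HW. All names here are hypothetical, not the service’s actual test interface:

```cpp
#include <functional>
#include <queue>

// Hypothetical dummy sensor for testing: instead of reading from real
// hardware, it replays a scripted sequence of values through the same
// observer interface a real sensor would use.
class DummySensor {
public:
    // Append a reading to the script, in the order it should be delivered.
    void enqueue(double value) { script_.push(value); }

    // Drain the scripted readings through the observer, as if the HW
    // had produced them.
    void run(const std::function<void(double)>& observer) {
        while (!script_.empty()) {
            observer(script_.front());
            script_.pop();
        }
    }

private:
    std::queue<double> script_;
};
```

A test would enqueue a known sequence, point the application’s observer at the dummy, and assert on the application’s reaction, which keeps CI runs independent of physical devices.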

Dependency tree

