OSK current state & thoughts on future

Registered by kevin gunn

This blueprint is meant to be a "mind blurping" session on the OSK topic. It is meant to capture:
- the current state of OSK in the Touch architecture
- what changes are needed prior to 13.10
- what potential ideas we may wish to capture for efforts beyond 13.10

See whiteboard for post-vUDS session update.

Blueprint information

Status:
Complete
Approver:
None
Priority:
Undefined
Drafter:
None
Direction:
Needs approval
Assignee:
Unity UI Team
Definition:
Approved
Series goal:
Accepted for saucy
Implementation:
Informational
Milestone target:
ubuntu-13.10
Started by
Michał Sawicz
Completed by
Michał Sawicz

Whiteboard

From May 2013 vUDS: kgunn
Notes with highlight/authorship can be found here: http://pad.ubuntu.com/uds-1305-client-1305-unity-ui-unity8-osk
Otherwise the following is the raw text:
Purpose of this meeting: a brainstorming session to see how to architect the OSK on the Ubuntu Phone with respect to the shell

Overview of OSK today:
- the maliit keyboard being used right now is client-server; the client is written in QML
- there is a server (maliit-server) that loads the keyboard as a plugin
- this server supplies surfaces for the keyboards to be drawn on
- a separate process looks after controlling this service
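
For illustration only, a minimal sketch of that client-server split, assuming a hypothetical KeyboardPlugin interface (setSurface/show/hide and loadKeyboard are made-up names, not the real maliit plugin API; QPluginLoader itself is standard Qt):

    // Hypothetical, simplified view of the client-server split described above.
    // KeyboardPlugin and its methods are illustrative names only.
    #include <QPluginLoader>
    #include <QtPlugin>
    #include <QWindow>

    class KeyboardPlugin
    {
    public:
        virtual ~KeyboardPlugin() {}
        // the server hands the plugin a surface to render the QML keyboard into
        virtual void setSurface(QWindow *surface) = 0;
        virtual void show() = 0;
        virtual void hide() = 0;
    };
    Q_DECLARE_INTERFACE(KeyboardPlugin, "local.example.KeyboardPlugin/1.0")

    // server side: load the keyboard as a plugin and give it a surface to draw on
    KeyboardPlugin *loadKeyboard(const QString &path, QWindow *surface)
    {
        QPluginLoader loader(path);
        KeyboardPlugin *keyboard = qobject_cast<KeyboardPlugin *>(loader.instance());
        if (keyboard)
            keyboard->setSurface(surface);
        return keyboard;
    }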

To be done right now
- design changes
- CJK language support
- rotation problem: the C++ part controls the QML scene layout, which isn't correctly updating the shape on rotation.
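
On the rotation problem, one possible shape of a fix, purely as a sketch assuming the C++ side owns a QQuickView hosting the keyboard scene (the actual maliit code may be organised differently): listen for QScreen::orientationChanged and re-apply the view geometry to the QML root item.

    // Sketch: resize the QML keyboard scene when the screen rotates.
    // Assumes a QQuickView-based setup; not the actual maliit code.
    #include <QGuiApplication>
    #include <QScreen>
    #include <QQuickView>
    #include <QQuickItem>

    void trackOrientation(QQuickView *view)
    {
        QScreen *screen = view->screen();
        // in Qt 5, orientationChanged() is only emitted for orientations in this mask
        screen->setOrientationUpdateMask(Qt::PortraitOrientation |
                                         Qt::LandscapeOrientation |
                                         Qt::InvertedPortraitOrientation |
                                         Qt::InvertedLandscapeOrientation);
        QObject::connect(screen, &QScreen::orientationChanged,
                         [view](Qt::ScreenOrientation) {
            // re-apply the view geometry so the keyboard shape is recomputed
            if (QQuickItem *root = view->rootObject()) {
                root->setWidth(view->width());
                root->setHeight(view->height());
            }
        });
    }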

If we got surface control integrated into Mir/shell, it would simplify control of the surface; Mir can control it best, as the shell does need to know where the OSK is and whether it is open.

From the application point of view, the toolkit will mostly look after invocation of the OSK.

Integrating the OSK into Mir also simplifies event management, as opposed to having two separate processes with IPC between them.

Concerns to be aware of:
- propagating key events from shell down to app should not happen
- keyboard generates key events, need to ensure no key-event-loops

It means we can drop code too:
- we should be able to remove much of the custom code in maliit-server that can be integrated into Mir itself, which will help fix the security and focus concerns.

[pmcgowan] - how about 3rd-party keyboards?
- the event injection and delivery is abstracted away, so only the view component would be replaceable. We could easily replace the QML file, but other toolkits would be difficult.
- putting the keyboard in the shell limits the ways it could be replaced. Is that limit good for security? Not necessarily.

Instead we could supply a surface ID and define an interface a keyboard should implement. We could enforce: the application talks to Mir, Mir talks to the keyboard. Then anyone could make a keyboard in whatever way they want, as long as they implement the interface.
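
A rough sketch of what such an interface could look like; every name here is hypothetical and none of it is an existing Mir or maliit API, it only illustrates the "application talks to Mir, Mir talks to keyboard" contract:

    // Hypothetical interface a third-party keyboard would implement.
    #include <string>
    #include <cstdint>

    struct TextField
    {
        // narrow channel back to Mir; Mir forwards to the focused application
        virtual void commitText(const std::string &utf8) = 0;
        virtual void deleteSurroundingText(int before, int after) = 0;
        virtual ~TextField() = default;
    };

    struct OnScreenKeyboard
    {
        // Mir supplies the surface the keyboard may draw on
        virtual void attachToSurface(uint32_t surfaceId) = 0;
        // Mir tells the keyboard when an application requests text input
        virtual void activate(TextField &field) = 0;
        virtual void deactivate() = 0;
        virtual ~OnScreenKeyboard() = default;
    };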

Let's address some concerns by Pete:

[pete-woods] From playing around with the notes app:

Missing features:
Corrective text [captured in a blueprint already]
Accelerating delete (when you hold the delete key and it accelerates from deleting letters to deleting whole words, etc.)

1. Does maliit have support for simple key acceleration? Do we want it?
2. The key starts by deleting characters, then switches to deleting words, then sentences.

In Mir, the repeat policy can be adjusted. Or should it be at the keyboard level? The keyboard level feels more elegant, but it would pass messages to Mir like "delete word", "delete sentence"...

First: Is it a requirement? How should it behave?
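
To make the behaviour question concrete, one possible staging, sketched in plain C++ with made-up thresholds (not tied to maliit or Mir): the delete unit grows with the repeat count, and the keyboard would then send the corresponding "delete word" / "delete sentence" message rather than raw repeats.

    // Sketch of staged delete acceleration; the thresholds are invented.
    enum class DeleteUnit { Character, Word, Sentence };

    DeleteUnit unitForRepeatCount(int repeats)
    {
        if (repeats < 10)
            return DeleteUnit::Character;   // first ~10 repeats delete characters
        if (repeats < 25)
            return DeleteUnit::Word;        // then whole words
        return DeleteUnit::Sentence;        // then whole sentences
    }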

Tap outside the OSK / text area does not dismiss it

Should the application do it? Yes: if the application has several text input boxes and the user taps on another textbox, the keyboard should stay open. It is an SDK issue, not a keyboard responsibility.

Worst case, swipe down hides the keyboard.
The shell will handle focus between apps, but the app itself should dismiss the keyboard when it loses focus.
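
If the toolkit/SDK ends up owning dismissal, Qt's input-method API already provides the hook; a minimal sketch, where onAppFocusLost() is a hypothetical place the SDK would call when the app loses focus (QGuiApplication::inputMethod() and QInputMethod::hide() are real Qt APIs):

    // Sketch: the app hides the OSK when it loses focus.
    #include <QGuiApplication>
    #include <QInputMethod>

    void onAppFocusLost()   // hypothetical hook, called when the app loses focus
    {
        QGuiApplication::inputMethod()->hide();
    }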

Hover to select is limited
Can't select an individual word
SDK issue, not keyboard specific.

Predictive text [planned per BP]
Different modes for different input types, e.g. when entering (see the sketch after this list):
- URL in web browser (.com button)
- Email address (@ button)
- Numbers (number keys only?) [planned per BP]
- Localised keyboards [planned per BP]
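
These per-field modes map onto Qt's existing input-method hints, which the keyboard could use to choose a layout; a small widget-based sketch (on the phone, the QML TextInput/TextField inputMethodHints property plays the same role):

    // Sketch: request URL / email / numeric layouts via Qt input-method hints.
    #include <QLineEdit>

    void configureFields(QLineEdit *url, QLineEdit *email, QLineEdit *number)
    {
        url->setInputMethodHints(Qt::ImhUrlCharactersOnly);      // ".com" style layout
        email->setInputMethodHints(Qt::ImhEmailCharactersOnly);  // "@" key layout
        number->setInputMethodHints(Qt::ImhDigitsOnly);          // numeric keypad
    }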

Planned:
Complete word correction/prediction
Autocorrect on space
Add to user dictionary
Highlight spelling errors in pre-edit
Auto-capitalization, implementation and on/off setting
Auto-correction, on/off setting
Show correction suggestion setting (always show, in portrait, always hide)
Check spelling on/off setting
Press previously entered mis-spelled words to correct

General issues:
Contrast between keys and background too low for easy visual distinction
Keys too flat on the Nexus 10 OSK (i.e. not tall enough)

Work items:
- look into adjustments to maliit-server & keyboard and see what we need to expose to/from the shell (4-5 functions on the platform API), and derive work items from that for an iteration 0.

Goal: get it working with Mir as it is now, then can re-evaluate it to fix the security and focus model.

Jun 25, kgunn:
Decisions made wrt near-term OSK & unity8-mir integration
From notes:
decision: initially we just need to take what we have today, that is the OSK out of the shell, and make it work with unity8/Mir; then in the future we can revisit.
For now we need:
#1) an input method interface from Mir
#2) maliit to have a way to announce to the shell that it's been requested by the application, so the shell can then allow the OSK surface to be visible
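
Purely as a sketch of what #1 and #2 might amount to on the shell side (all names hypothetical, nothing here exists in Mir today): Mir notifies an observer when the focused application requests input, and the shell toggles the visibility of the OSK surface.

    // Hypothetical shell-side glue for #1 and #2 above; not an existing Mir API.
    struct InputMethodObserver
    {
        // called when the focused application requests / dismisses text input
        virtual void inputMethodRequested() = 0;
        virtual void inputMethodDismissed() = 0;
        virtual ~InputMethodObserver() = default;
    };

    class Shell : public InputMethodObserver
    {
    public:
        void inputMethodRequested() override { setOskSurfaceVisible(true); }
        void inputMethodDismissed() override { setOskSurfaceVisible(false); }
    private:
        void setOskSurfaceVisible(bool visible); // shell shows/hides the maliit surface
    };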

(?)

Work Items

Work items:
look into adjustments made in the maliit-server & keyboard, determine what's needed to/from the shell: DONE
