
Dynamically generate response-points on big-size hand-held device's border

IP.com Disclosure Number: IPCOM000239507D
Publication Date: 2014-Nov-13
Document File: 5 page(s) / 90K

Publishing Venue

The IP.com Prior Art Database

Abstract

Touch-screen devices, such as mobile phones and tablets, are trending toward larger screen sizes. However, usability when the device is held in a single hand is greatly impacted, because the fingers cannot reach every UI element (app, button, etc.). To address this problem, this disclosure describes a system that dynamically generates response-points on the device border and pushes them to the fingers to trigger pre-customized operations. The system supports arbitrary holding gestures, including switching between the two hands, as well as associating one finger with multiple operations in different operation contexts.



Problem Statement

When holding a big-size hand-held device with only a single hand, the fingers cannot reach all apps on the desktop or click all UI elements (app, button, etc.).

Core idea

When the touch screen is extended to the device border, the system dynamically generates response-points on the border, which associate the operations of UI elements with the fingers. In this way, a finger's action, such as pushing or sliding, can directly trigger UI events (launching an app, pushing a button, etc.). The association between UI events and the fingers' actions on the border can be pre-customized and is activated under border-sensing mode.
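As a rough illustration of the core idea, the sketch below (in Kotlin) models a response-point generated at a finger's border position and bound to a pre-customized operation; the type and function names are hypothetical and not taken from the disclosure.

    // Hypothetical model: a response-point sits at a finger's position on the
    // device border and is bound to one pre-customized operation.

    // A pre-customized operation, e.g. "launch App1" or "push Button2".
    fun interface Operation {
        fun trigger()
    }

    // A response-point generated on the border for one detected finger.
    data class ResponsePoint(
        val fingerId: Int,        // sequence number of the finger on the border
        val borderY: Float,       // position along the side border
        val operation: Operation  // the UI event a push on this point activates
    )

    // A push on the border directly activates the associated UI event.
    fun onBorderPush(point: ResponsePoint) = point.operation.trigger()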

The invention details are described below:


1. Physical pre-condition
The touch sensor needs to be extended to the device's side borders, where it can sense pushing and sliding.


2. Activate border-sensing mode

The border-sensing mode is the on/off switch for dynamic response-point generation and is turned off by default. A UI can be offered to enable and configure the border-sensing mode.
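A minimal sketch of how this switch might gate response-point generation; the flag and function names below are assumptions for illustration, not part of the disclosure.

    // Hypothetical settings flag: border-sensing mode is off by default and
    // response-points are only generated while the user has enabled it.
    object BorderSensingSettings {
        var enabled: Boolean = false  // default: off
    }

    fun maybeGenerateResponsePoints(borderFingerPositions: List<Float>) {
        if (!BorderSensingSettings.enabled) return  // mode off: do nothing
        // ... generate one response-point per detected finger position ...
    }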


3. The association between fingers and UI elements to be activated
The association defines a) under *which foreground* and b) with *which finger* to activate c) *the event list* of UI elements or operations.

- By default, the finger activates the event at the top of the list.
- The finger can skip the current event through an operation such as sliding, and thus switch to the next event in order.
- The association can be pre-defined manually, with the collected user usage habits used as default values.

An example association is shown in the table below.

finger  | foreground       | event list
FINGER1 | home screen      | App1, App2, App3
FINGER1 | "App1 Activity1" | "App1 Activity2"
FINGER2 | home screen      | App4
XXX     | XXX              | XXX
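A minimal sketch of how the association table above could be held in memory, assuming a simple map keyed by foreground and finger; the names are illustrative and the "XXX" placeholder row is omitted.

    // The association maps (foreground, finger) -> ordered event list,
    // mirroring the example table above.
    val associations: Map<Pair<String, String>, List<String>> = mapOf(
        ("home screen" to "FINGER1")    to listOf("App1", "App2", "App3"),
        ("App1 Activity1" to "FINGER1") to listOf("App1 Activity2"),
        ("home screen" to "FINGER2")    to listOf("App4")
    )

    // By default the finger activates the event at the top of its list
    // (sliding to skip events is sketched further below).
    fun topEvent(foreground: String, finger: String): String? =
        associations[foreground to finger]?.firstOrNull()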

4. Typical scenario 1
1) Enable *border-sensing mode*.
2) Detect the fingers' positions on the border. Supposing the device is held by the right hand, there will be one finger on the right border (called FINGER1) and three or four on the left (called FINGER2, FINGER3, etc.).


- Here there is no need to distinguish each exact finger; it is enough to identify the finger sequence.

- Switching between the two hands is supported by comparing the number of fingers on the two side borders (see the sketch after this list).
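The sketch below illustrates one way the holding hand could be detected and the fingers numbered in sequence, assuming the border sensor reports a list of contact positions per side; all names here are hypothetical.

    // Hypothetical border-sensor reading: contact positions (y coordinates)
    // on the left and right side borders.
    data class BorderContacts(val left: List<Float>, val right: List<Float>)

    enum class Hand { LEFT, RIGHT }

    // The side with fewer contacts is the thumb side, so the device is held by
    // that hand, e.g. one contact on the right and three or four on the left
    // means the device is held in the right hand.
    fun detectHoldingHand(c: BorderContacts): Hand =
        if (c.right.size <= c.left.size) Hand.RIGHT else Hand.LEFT

    // No need to distinguish the exact fingers; just number the contacts in
    // sequence, thumb side first, then the opposite side from top to bottom.
    fun labelFingers(c: BorderContacts): Map<String, Float> {
        val thumbFirst = if (detectHoldingHand(c) == Hand.RIGHT)
            c.right.sorted() + c.left.sorted()
        else
            c.left.sorted() + c.right.sorted()
        return thumbFirst.withIndex().associate { (i, y) -> "FINGER${i + 1}" to y }
    }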



3) Identify the current foreground. Here the home screen is used as an example.

4) According to the predefined association between fingers and UI elements under different foregrounds (see the table in section 3), automatically associate the apps with the fingers, e.g. FINGER1 with App1, App2 and App3, FINGER2 with App4, etc.

5) On top of the GUI, draw an icon of App1 beside the FINGER1 position and the App4 icon beside FINGER2, with the others drawn in the same manner.
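A rough sketch of steps 3 to 5, assuming a hypothetical drawing hook for placing an icon beside a border position; the association map has the same shape as the sketch after the table in section 3.

    // For each detected finger, look up its event list under the current
    // foreground and draw the icon of the top event beside the finger.
    fun showBorderIcons(
        foreground: String,
        fingerPositions: Map<String, Float>,                   // e.g. FINGER1 -> border y
        associations: Map<Pair<String, String>, List<String>>, // (foreground, finger) -> events
        drawIconAt: (event: String, borderY: Float) -> Unit    // hypothetical drawing hook
    ) {
        for ((finger, y) in fingerPositions) {
            val top = associations[foreground to finger]?.firstOrNull() ?: continue
            drawIconAt(top, y)  // e.g. App1 icon beside FINGER1, App4 beside FINGER2
        }
    }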


6) Pressing with FINGER1 will directly open App1.

7) Associate the buttons in the app with the fingers, e.g. 'Button1' with FINGER1 and 'Button2' with FINGER2, in the same way as step 4.
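A rough sketch of steps 6 and 7 together with the slide-to-skip behaviour from section 3; actual activation (launching an app or pushing a button) is left as a callback because the disclosure does not specify the mechanism, and all names are hypothetical.

    // Per-finger dispatcher: a push activates the event currently at the head
    // of the finger's event list; a slide skips to the next event in order.
    class FingerDispatcher(
        events: List<String>,                   // e.g. listOf("App1", "App2", "App3")
        private val activate: (String) -> Unit  // launch app, push button, etc.
    ) {
        private val queue = ArrayDeque(events)

        fun onPush() {
            queue.firstOrNull()?.let(activate)  // e.g. a FINGER1 push opens App1
        }

        fun onSlide() {
            if (queue.size > 1) queue.addLast(queue.removeFirst())  // rotate to next event
        }
    }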


8) On top of the GUI, there will be...