
Word-based Command Gestures

IP.com Disclosure Number: IPCOM000181832D
Publication Date: 2009-Apr-14
Document File: 5 page(s) / 237K

Publishing Venue

The IP.com Prior Art Database

Related People

Ethan Cheng: INVENTOR

Abstract

Gesture commands on touch sensors based on the shapes of “words” are easier to define, remember, and use; thus they can be used to control devices and applications.

"Word"-based Command Gestures

1. Inventor(s): Ethan Cheng and Kipling Inscore
2. Synaptics Incorporated, Santa Clara, CA, USA

3. Short Summary

Gesture commands on touch sensors based on the shapes of "words" are easier to define, remember, and use; thus they can be used to control devices and applications.

[Figure 1 images suppressed]

Figure 1. Entry of "WEB" as overlapped characters to trigger a "WEB"-related command.

4. Some Problems Solved

Examples of some of the problems addressed by the invention include:

Gesture-based control of electronic systems has grown in prominence in recent years. The proliferation of handheld devices, which have smaller viewing and input areas than desktop units, means that users often need to interact with devices that have reduced or non-existent keyboards and pointing devices. As a result, menu-based and key-based (e.g. hot-key combination) navigation and command approaches have become increasingly impractical.

Some devices seek to predefine and teach their users certain gesture commands. Some of these gestures can be intuitive, such as a dial-turning motion for rotating an image. However, many predefined gestures strike users as obtuse and can be difficult to learn, remember, and use (e.g. a triangle to close a file, or a square to open a file).

[Figure images suppressed: step-by-step entry of the overlapped "W", "E", "B" strokes forming the "WEB" gesture]

Some devices seek to enable users to define and use customized gestures that are individual to the user. These typically require the user to think of each gesture and then train the system to recognize it by providing multiple samples, accepting correct recognitions, and rejecting false positives and false negatives. This can be a frustrating and cumbersome experience for end users. In addition, the number of such gestures available in practice is limited: end users may run out of ideas, or may be unable to recall which gesture they defined for a particular command if there are too many customized gesture commands.

Gesture commands based on the shapes of "words" address these problems.
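
For illustration only, the following is a minimal sketch of how characters recognized on a touch sensor might be accumulated into a word and mapped to a command. It assumes a character recognizer that reports one letter per stroke; the WordGestureDecoder class, the command table, and the timeout value are hypothetical and are not taken from this disclosure.

import time

# Hypothetical table mapping word gestures to commands; actual bindings
# would be device- or application-specific.
WORD_COMMANDS = {
    "WEB": "launch_browser",
    "MAIL": "launch_email",
}

class WordGestureDecoder:
    """Accumulates characters recognized from overlapped strokes and fires
    a command when the accumulated letters spell a known word."""

    def __init__(self, commands, timeout_s=1.0):
        self.commands = commands
        self.timeout_s = timeout_s      # pause that ends a word
        self._buffer = ""
        self._last_stroke = 0.0

    def on_character(self, ch):
        """Called with each recognized character; returns a command name
        when a complete word gesture has been entered, else None."""
        now = time.monotonic()
        # A long pause between strokes starts a new word.
        if now - self._last_stroke > self.timeout_s:
            self._buffer = ""
        self._last_stroke = now
        self._buffer += ch.upper()

        command = self.commands.get(self._buffer)
        if command is not None:
            self._buffer = ""   # word recognized; reset for the next gesture
        return command

# Usage: feeding the strokes "W", "E", "B" yields the "WEB" command.
decoder = WordGestureDecoder(WORD_COMMANDS)
results = [decoder.on_character(ch) for ch in ("W", "E", "B")]
print(results)   # [None, None, 'launch_browser']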

5. G...