Triggering tooltip on touch based display using eye tracking Disclosure Number: IPCOM000229373D
Publication Date: 2013-Jul-25
Document File: 4 page(s) / 128K

Publishing Venue

The Prior Art Database


The solution proposed herein describes how context-sensitive tooltips can be displayed on touchscreens without using a pointer. The technique focuses on how user gaze can be used as an alternative to hover for showing contextual tooltips. Tooltips can be shown for actionable items like buttons, as well as for other items like input fields, or indeed for any visual element.

This text was extracted from a PDF file.
This is the abbreviated version, containing approximately 51% of the total text.

Page 01 of 4

Triggering tooltip on touch based display using eye tracking

A touchscreen is an electronic visual display that can detect the presence and location of a touch within the display area. Touchscreens are fast replacing conventional (non-touch) video display systems everywhere. These touch panels play a prominent role in the design of interactive displays and digital appliances such as personal digital assistants (PDAs), satellite navigation devices, mobile phones, video games, tablets, ATMs, etc.

But with the advent of touchscreens, input devices like the mouse and keyboard are going away. Earlier, user assistance could easily be provided in user interfaces by showing small contextual help popups, called tooltips, when a pointer (controlled by a pointing device such as a mouse, touchpad, or navigation keys) hovered over an actionable button or area. But with the ever-increasing popularity of touchscreens, showing tooltips on hover is no longer possible. This invention addresses this problem and describes how context-sensitive tooltips can be displayed on touchscreens without using a pointer.
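The gaze-as-hover idea can be made concrete with a dwell-time trigger: the tooltip appears once the user's gaze has rested on a single element for some threshold. The following is a minimal sketch under that assumption; the class, its names, and the 800 ms threshold are all illustrative, not taken from the disclosure, and a real implementation would receive gaze samples from an eye-tracking device driver.

```python
# Hypothetical sketch of a gaze-dwell tooltip trigger. An eye tracker is
# assumed to deliver timestamped gaze samples already resolved to the UI
# element under the gaze point; the dwell timer stands in for pointer hover.

DWELL_MS = 800  # dwell threshold before the tooltip appears (assumption)

class GazeTooltipTrigger:
    def __init__(self, dwell_ms=DWELL_MS):
        self.dwell_ms = dwell_ms
        self.current = None   # element the gaze is currently resting on
        self.since = None     # timestamp (ms) when the gaze entered it
        self.shown = False    # whether this element's tooltip was shown

    def on_gaze_sample(self, element, t_ms):
        """Feed one gaze sample; return the element whose tooltip should
        be shown now, or None."""
        if element is not self.current:
            # Gaze moved to a different element (or off all elements):
            # restart the dwell timer and reset the shown flag.
            self.current, self.since, self.shown = element, t_ms, False
            return None
        if (element is not None and not self.shown
                and t_ms - self.since >= self.dwell_ms):
            self.shown = True
            return element  # dwell threshold reached: show its tooltip
        return None
```

Feeding samples for a button, the trigger stays silent until the dwell threshold passes, fires once, and resets when the gaze moves away, mirroring how a hover tooltip appears and disappears with the pointer.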

The following table summarizes known solutions and their drawbacks:

Known Solution: Design UI to avoid the need for tooltips
Drawbacks: Touchscreens are generally kept comparatively small for portability, cost reduction, etc. Screen real estate is a scarce resource, and there are limits on how descriptive and self-explanatory a user interface can be. Also, the target users of many touchscreen devices are generally very non-technical.

Known Solution: Additional buttons or actionable areas for triggering tooltips
Drawbacks: Additional buttons take up more space and increase the complexity of the UI, making it very confusing for end users. Actionable areas around a button also call for more space, and the end user would need additional indications that such an actionable area exists.

Known Solution: Long press (press and hold)
Drawbacks: A long press can be used for displaying a tooltip, but it calls for additional user education that such an action is possible. Long press is also often used for other actions, such as displaying a context menu or switching to edit mode. There is also a risk that a faulty long press could accidentally be interpreted as a press/selection, triggering the corresponding action. This could be quite unacceptable for critical transactions.

Known Solution: Tooltip split button
Drawbacks: Instead of a normal button, a tooltip split button would be used to implement the user interface. This button has two actionable areas: the button label (main action) and the tooltip icon (secondary action to invoke the tooltip). This still requires additional screen space, and the tooltip icon is placed very close to the action button, leading to possible accidental clicks.

Known Solution: Proximity sensor
Drawbacks: Proximity sensors could be costly, require additional hardware and al...
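Whatever triggers the tooltip, gaze or pointer, the system ultimately has to resolve a screen coordinate to the UI element under it, the same hit-test a hover handler performs. A minimal sketch follows; the function, the rectangle format, and the jitter margin are illustrative assumptions (real toolkits expose their own hit-testing APIs), and the margin reflects the fact that gaze estimates are noisier than pointer positions.

```python
# Hypothetical sketch: map a raw gaze point to the UI element under it.
# elements is a list of (element_id, x, y, width, height) rectangles;
# later entries are treated as drawn on top of earlier ones.

def hit_test(gaze_x, gaze_y, elements, margin=0):
    """Return the id of the topmost element containing the gaze point,
    or None. margin expands each rectangle by a few pixels to absorb
    eye-tracker jitter (an assumption, not part of the disclosure)."""
    hit = None
    for elem_id, x, y, w, h in elements:
        if (x - margin <= gaze_x < x + w + margin
                and y - margin <= gaze_y < y + h + margin):
            hit = elem_id  # later (topmost) elements override earlier ones
    return hit
```

A gaze sample landing just outside a small button misses it with margin=0 but is still attributed to it with a modest margin, which is one simple way to keep noisy gaze data from flickering the tooltip on and off.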