
Self Evolving UI : Automatic Evolution & Personalisation of User Interface based on System & Application Event Tracing

IP.com Disclosure Number: IPCOM000236051D
Publication Date: 2014-Apr-03
Document File: 7 page(s) / 92K

Publishing Venue

The IP.com Prior Art Database


Disclosed is a self-evolving user-interface system which automatically implements a response for a user action or a sequence of user actions based on usage patterns detected from the prior interaction history of the user and the device or application context for which the user interface is implemented.




Users who interact with certain devices or applications for a significant amount of time tend to develop expectations about how the system will behave in response to certain kinds of input actions (e.g. touch-based gestures or keyboard shortcuts). These regular interaction patterns become intuitive, and the user comes to expect the same behaviour from other devices and applications as well.

However, if other devices or applications do not provide the expected response for the input actions the user is accustomed to, the interaction experience can be frustrating. Typically, the user first attempts the familiar action; on realizing that it does not produce the expected response, the user then searches for the action, specific to the particular device or application, that invokes the required response. The result is a recurring pattern: a user input action that invokes no response, followed by the correct input action that does invoke the required response, where the first action is the more intuitive one for the user.

For example, a user may interact with an application that offers a zoom feature but does not support the "pinch to zoom" gesture the user has experienced on other devices. As a fallback, the user locates and presses the +ZOOM_IN button to zoom in manually.



Similarly, the user experiences frustration when the device or platform provides a layered user interface in which a common task requires a sequence of actions (such as multiple mouse clicks or key presses) to perform. While many applications let users define their own shortcuts (such as a set of keys which, pressed together, performs a user-defined action), existing platforms do not automatically detect the need for such a shortcut from the user's observed frustration.
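One way to detect the need for such a shortcut, sketched below under illustrative assumptions, is to mine the user's action stream for frequently repeated fixed-length sequences; the sequence length, the repetition threshold, and the action names are hypothetical, not part of the disclosure.

```python
from collections import Counter

# Assumed parameters for this sketch: how long a candidate sequence is,
# and how many repetitions suggest the user would benefit from a shortcut.
SEQUENCE_LENGTH = 3
REPEAT_THRESHOLD = 5

def frequent_sequences(actions):
    """Return action sequences repeated often enough to deserve a shortcut.

    actions: ordered list of primitive action names (one user's trace).
    """
    # Count every contiguous run of SEQUENCE_LENGTH actions (an n-gram).
    ngrams = Counter(
        tuple(actions[i:i + SEQUENCE_LENGTH])
        for i in range(len(actions) - SEQUENCE_LENGTH + 1)
    )
    # Keep only the sequences the user has repeated past the threshold.
    return [seq for seq, n in ngrams.items() if n >= REPEAT_THRESHOLD]
```

A sequence returned here (e.g. a menu-open, tool-click, export-click run the user performs many times a day) would be the candidate for an automatically created shortcut.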


The disclosed invention automatically detects a user's frustration by observing the user's inputs over time and detecting such patterns, then rectifies them by automatically creating new associations between input actions and responses:

Aspect 1: Look out for commonly used but 'unhandled' user interaction events, and identify likely handling implementations for them, based on the handled events that commonly follow them in sequence.
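Aspect 1 can be sketched roughly as follows: for each unhandled event type, count which handled event most often follows it within a short time window, and propose a mapping once a frequency threshold is crossed. The window, threshold, trace format, and event names here are illustrative assumptions.

```python
from collections import Counter, defaultdict

# Assumed parameters for this sketch of Aspect 1.
FOLLOW_WINDOW_SECONDS = 2.0   # follow-up must occur soon after the failure
PROPOSAL_THRESHOLD = 3        # occurrences needed before proposing a mapping

def propose_mappings(trace):
    """Map unhandled events to the handled event that usually follows them.

    trace: chronologically ordered list of (timestamp, event_name,
    was_handled) tuples recorded from the user's interaction history.
    """
    followups = defaultdict(Counter)
    for i, (t, event, handled) in enumerate(trace):
        if handled:
            continue
        # An unhandled event: look for the next handled event in the window.
        for t2, event2, handled2 in trace[i + 1:]:
            if t2 - t > FOLLOW_WINDOW_SECONDS:
                break
            if handled2:
                followups[event][event2] += 1
                break
    # Propose the most common follow-up once it has occurred often enough.
    proposals = {}
    for unhandled, counter in followups.items():
        candidate, count = counter.most_common(1)[0]
        if count >= PROPOSAL_THRESHOLD:
            proposals[unhandled] = candidate
    return proposals
```

In the zoom example above, repeated failed pinch gestures each followed by a press of the zoom-in button would yield a proposal to wire the pinch gesture to that button's response.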



Monitor user interaction events at the operating-environment and application levels, and also monitor the application's response to each event, if any. Collate this data per user as well as across users.
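The collation step above might look roughly like the following; the class, its field names, and the event/response representation are assumptions made for illustration.

```python
from collections import Counter, defaultdict

class EventCollator:
    """Collate (event, response) observations per user and across users."""

    def __init__(self):
        self.per_user = defaultdict(Counter)  # user_id -> (event, response) counts
        self.global_counts = Counter()        # same counts pooled across users

    def record(self, user_id, event_name, response=None):
        # Pair each event with the response it produced (None if unhandled),
        # and tally it both for this user and globally.
        key = (event_name, response)
        self.per_user[user_id][key] += 1
        self.global_counts[key] += 1
```

Keeping both per-user and cross-user tallies lets the system personalise mappings for one user while also spotting gaps that frustrate many users of the same application.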

For c...