
Methodology for Audible Announcement of Distinguishing between Interface Objects of the Same Type and Class to Visually Impaired Users

IP.com Disclosure Number: IPCOM000106072D
Original Publication Date: 1993-Sep-01
Included in the Prior Art Database: 2005-Mar-20
Document File: 2 page(s) / 98K

Publishing Venue

IBM

Related People

Johnson, WJ: AUTHOR [+3]

Abstract

A methodology is described that recognizes mouse input for invoking a mouse cursor that audibly announces interface objects it encounters (hereinafter to be known as the talking mouse cursor). Hereinafter, the term "objects" refers to windows, icons, panels, etc.; anything visually presentable on a desktop.

This text was extracted from an ASCII text file.
This is the abbreviated version, containing approximately 52% of the total text.

      Currently, mouse cursors are controlled by moving the cursor
around within a Cartesian coordinate system (i.e., an X/Y plane).  In
a window-based graphical user interface for an operating system, the
mouse cursor must encounter an object in order to act on it.  For
navigating around the desktop, this method is usually adequate for
people who are not visually impaired.  However, even those users are
frequently faced with a cluttered desktop, which can make navigating
around the desktop quite tedious.  This burdens the user because time
is spent minimizing windows to icons and/or manually rearranging
their positions on the screen in order to find an application that is
not clearly visible.  Furthermore, this method is not adequate for
visually impaired people.  A method that responds to user interaction
(i.e., mouse movement), enriches desktop object manipulation, and
makes interfaces more usable is desirable.

      As the cursor is navigated through the desktop, the talking
mouse cursor will audibly announce the title text of each object as
the mouse cursor first encounters that object.  Furthermore, whenever
more than one object of the same type and class is present on the
desktop, the methodology will detect this and append a suffix to the
audible announcement that would normally occur.
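
      The disclosure contains no code; the following C sketch shows
one way the first-encounter announcement might work.  The
DesktopObject structure, hit_test(), speak(), and on_mouse_move()
are all names assumed here for illustration, and speak() merely
stands in for a speech synthesizer.

    #include <stdio.h>

    #define MAX_OBJECTS 64

    /* Hypothetical desktop object: type/class, title-bar text,
     * and a bounding rectangle. */
    typedef struct {
        char class_name[32];   /* e.g., "frame window" */
        char title[64];        /* e.g., "OS/2 Window"  */
        int  x, y, cx, cy;     /* origin and extent    */
    } DesktopObject;

    static DesktopObject desktop[MAX_OBJECTS];
    static int n_objects;
    static const DesktopObject *last_hit;  /* last announced */

    /* Stand-in for the speech synthesizer output routine. */
    static void speak(const char *text)
    {
        printf("[speech] %s\n", text);
    }

    /* Return the first listed object containing (x, y), or NULL. */
    static const DesktopObject *hit_test(int x, int y)
    {
        int i;
        for (i = 0; i < n_objects; i++) {
            const DesktopObject *o = &desktop[i];
            if (x >= o->x && x < o->x + o->cx &&
                y >= o->y && y < o->y + o->cy)
                return o;
        }
        return NULL;
    }

    /* Called on every mouse-move event: speak only on the first
     * encounter, i.e., when the cursor crosses onto a new object. */
    void on_mouse_move(int x, int y)
    {
        const DesktopObject *hit = hit_test(x, y);
        if (hit != NULL && hit != last_hit)
            speak(hit->title);
        last_hit = hit;
    }

A window system would invoke on_mouse_move() from its pointer-motion
event handler; the last_hit comparison is what limits speech to the
first encounter with each object.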

      For example, suppose a user has four OS/2* icons and windows on
his desktop.  The title bar text of each icon is "OS/2 Window."  As
the user repositions the talking mouse cursor from outside the OS/2
frame window onto the frame window of the leftmost OS/2 session, the
talking mouse cursor's audible announcement would be

              "OS/2 window one of four...leftmost window".

      Suffixes which may be appended to the audible announcement of
the title text may indicate:

1.  the number of related/identical objects on the desktop:

    o   window...one of four
    o   icon...one of two

2.  the location of the object under the mouse cursor, relative to
    the other objects, giving not only position but direction
    information as well (a sketch of one possible computation
    follows this list):

    o   leftmost
    o   rightmost
    o   topmost
    o   bot...
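
      Although this abbreviated extraction cuts the list off, a
positional suffix of this kind could be chosen by comparing object
coordinates.  The sketch below reuses the hypothetical DesktopObject
structure from the earlier sketch, assumes screen coordinates in
which y grows downward, and guesses "bottommost" for the truncated
item:

    /* Pick a directional suffix for object o among count objects
     * of the same type and class (sibs includes o itself). */
    static const char *position_word(const DesktopObject *o,
                                     const DesktopObject *sibs[],
                                     int count)
    {
        int i;
        int leftmost = 1, rightmost = 1;
        int topmost = 1, bottommost = 1;
        for (i = 0; i < count; i++) {
            if (sibs[i] == o)
                continue;
            if (sibs[i]->x < o->x) leftmost   = 0;
            if (sibs[i]->x > o->x) rightmost  = 0;
            if (sibs[i]->y < o->y) topmost    = 0;
            if (sibs[i]->y > o->y) bottommost = 0;
        }
        if (leftmost)   return "leftmost";
        if (rightmost)  return "rightmost";
        if (topmost)    return "topmost";
        if (bottommost) return "bottommost";
        return "middle";  /* no unambiguous extreme */
    }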