
A METHOD FOR USING ELECTRONIC MESSAGING SYSTEMS TO REDUCE PERPLEXITY

IP.com Disclosure Number: IPCOM000014189D
Original Publication Date: 2001-May-01
Included in the Prior Art Database: 2003-Jun-19
Document File: 2 page(s) / 83K

Publishing Venue

IBM

Abstract

A Method for Using Electronic Messaging Systems to Reduce Perplexity in Speech Recognition (MI892001)

Inventors: Paul Cohen, Elton Sherwin

To reduce perplexity in speech (or other language) recognition and to enhance recognition rates, information from prior correspondence (e.g., a note, letter, or memo) and fields in the correspondence, addressee and nickname directories, categorized logs, a note being replied to, time and date stamp history, and other collections of data is used to form language models that can be dynamically selected for use in recognition. The process is dynamic in that the sources and the data in the sources may vary over time, and the selection of language models may vary depending on sundry factors (the field in which a word is located, the user, the subject of the correspondence, the addressee, the task, or other context). For example, in determining the correct addressee in the addressee field of a note, a model built from addressee information in prior notes, especially recent notes or notes on similar subject matter, would help recognize the proper word.

This is the abbreviated version, containing approximately 60% of the total text.


A Method for Using Electronic Messaging Systems to Reduce Perplexity in Speech Recognition (MI892001)

Inventors: Paul Cohen, Elton Sherwin

    To reduce perplexity in speech (or other language) recognition and to enhance recognition rates, information from prior correspondence (e.g., a note, letter, or memo) and fields in the correspondence, addressee and nickname directories, categorized logs, a note being replied to, time and date stamp history, and other collections of data is used to form language models that can be dynamically selected for use in recognition.

    The process is dynamic in that the sources and the data in the sources may vary over time, and the selection of language models may vary depending on sundry factors (the field in which a word is located, the user, the subject of the correspondence, the addressee, the task, or other context). For example, in determining the correct addressee in the addressee field of a note, a model built from addressee information in prior notes, especially recent notes or notes on similar subject matter, would help recognize the proper word.
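    As a rough illustration of the idea (not the disclosed implementation), the Python sketch below builds a simple addressee model from the "to" fields of prior notes and uses it to re-rank candidate words proposed by a recognizer. The note structure, the names, and the interpolation weight are hypothetical.

from collections import Counter

def build_addressee_model(prior_notes):
    # Count how often each addressee appears in prior correspondence.
    # Recent notes or notes on a similar subject could be weighted more
    # heavily; here every note counts equally for simplicity.
    counts = Counter()
    for note in prior_notes:
        for addressee in note.get("to", []):
            counts[addressee.lower()] += 1
    total = sum(counts.values())
    return {name: n / total for name, n in counts.items()}

def rerank_candidates(candidates, field_model, weight=0.5):
    # Combine each candidate's acoustic score (assumed to lie in [0, 1])
    # with the field-specific model probability and return the best candidate.
    def combined(word, acoustic_score):
        return (1 - weight) * acoustic_score + weight * field_model.get(word.lower(), 0.0)
    return max(candidates, key=lambda c: combined(*c))

# Hypothetical usage: the recognizer is unsure between two similar-sounding names.
prior_notes = [
    {"to": ["Dale"], "subject": "budget"},
    {"to": ["Dale", "Gail"], "subject": "budget"},
    {"to": ["Dale"], "subject": "status"},
]
model = build_addressee_model(prior_notes)
print(rerank_candidates([("Dale", 0.48), ("Gail", 0.52)], model))
# ('Dale', 0.48): the addressee model outweighs the slightly better acoustic score for "Gail".

    The same pattern extends to other fields and contexts; only the source of the counts and the weighting change.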

    The invention may be used to narrow choices derived by other processes. Other such techniques include Markov (probabilistic) modelling, in which uttered words are compared against baseforms; frequencies of word sequences (called n-grams); and word clustering. N-grams involve looking at a body of text and determining how often a pair of words (in sequence), a triplet of words, a 4-gram of words, or, in general, an n-gram of words occurs in that body of text. The invention improves over prior-art "triggered" models, in which certain words invoke appropriate models, as well as over most-recently-used text models and user-selected vocabulary techniques.
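    The n-gram counting described above can be sketched in a few lines of Python; the sample text is hypothetical.

from collections import Counter

def count_ngrams(words, n):
    # Count how often each sequence of n consecutive words occurs.
    return Counter(tuple(words[i:i + n]) for i in range(len(words) - n + 1))

words = "please send the budget note to dale please send the status note to gail".split()

bigrams = count_ngrams(words, 2)   # pairs of words in sequence
trigrams = count_ngrams(words, 3)  # triplets of words

print(bigrams[("please", "send")])          # 2
print(trigrams[("send", "the", "budget")])  # 1

    Relative n-gram frequencies of this kind give the probability of a word given its predecessors, which is what allows a language model to narrow the recognizer's choices.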

    On...