
Improving Question Answering Systems with Predefined Question Answer Stores

IP.com Disclosure Number: IPCOM000240815D
Publication Date: 2015-Mar-04
Document File: 4 page(s) / 63K

Publishing Venue

The IP.com Prior Art Database

Abstract

Disclosed is a simple, elegant, and general approach for improving any off-the-shelf question answering (QA) system by building a predefined question-answer store on top of the QA system, such that an input question can be answered without going through the actual QA system if an exact or semantically similar question is found in the store. The disclosed approach provides several benefits, including efficiency improvements, support for answer overrides and customization, type-ahead support, and the incorporation of user feedback.

This text was extracted from a PDF file.
This is the abbreviated version, containing approximately 51% of the total text.


1 Background


Modern question answering (QA) systems often suffer from two challenges: (a) it is not straightforward to override the system's responses for specific questions when required, and (b) it is difficult to enrich the system by incorporating user feedback on its responses. To address these challenges, existing approaches, such as dependency relation matching [1], relation extraction [2], or machine learning [3, 4], often require hard-coded answers or retraining the whole system. Other approaches, such as [5, 6], handle these challenges naturally by directly treating user feedback and customized answers as frequently asked questions. However, such approaches fail when neither the input question nor a semantically similar one exists in the knowledge base.


2 Approach


The main idea of the proposed approach is to build a predefined question answer store on top of the underlying QA system. When a question is asked, the predefined question answer store is queried first: the input question is analyzed against the predefined questions and answers in the store. If appropriate predefined answers are found, they are returned without going through the underlying QA system; otherwise, the question is answered by the underlying QA system.
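The lookup-first dispatch described above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: `qa_store` stands in for the predefined question answer store (here a plain dictionary keyed by question), and `underlying_qa` stands in for any off-the-shelf QA system.

```python
def answer(question, qa_store, underlying_qa):
    """Return a predefined answer if one exists, else fall back to the QA system."""
    if question in qa_store:            # exact match against the predefined store
        return qa_store[question]
    return underlying_qa(question)      # fall through to the underlying QA system

# Illustrative data: one predefined pair and a placeholder QA system.
qa_store = {"What is the capital of France?": "Paris"}
fallback = lambda q: "(answer from underlying QA system)"

print(answer("What is the capital of France?", qa_store, fallback))  # Paris
print(answer("Who wrote Hamlet?", qa_store, fallback))
```

Exact string matching is used here only for brevity; the disclosure deliberately leaves the matching strategy open, as the next paragraph notes.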

Note that we make no assumption about how an input question is analyzed against the predefined question answer store. For example, one can measure the semantic similarity between the input question and the predefined ones in the store [6], or score the input question against both the predefined questions and their answers.
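As one concrete instance of such an analysis, the sketch below scores an input question against the stored questions with token-overlap (Jaccard) similarity, a crude stand-in for the semantic similarity of [6]. The threshold value 0.6 and the function names are illustrative assumptions.

```python
def jaccard(a, b):
    """Token-overlap similarity between two questions, in [0, 1]."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb)

def best_predefined_answer(question, qa_store, threshold=0.6):
    """Return the answer for the most similar predefined question, or None."""
    if not qa_store:
        return None
    best_q = max(qa_store, key=lambda q: jaccard(question, q))
    return qa_store[best_q] if jaccard(question, best_q) >= threshold else None
```

Returning `None` when no stored question clears the threshold is what triggers the fall-through to the underlying QA system.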

This predefined QA store is constructed, maintained, and updated on an ongoing basis, either manually or automatically. For example, it can be initialized by extracting questions and their predefined answers from a corpus [7], or created manually by domain experts. As the system runs, the predefined question answer store can be updated based on user feedback or domain requirements without changing the underlying QA system.
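Ongoing maintenance reduces to operations on the store alone; the underlying QA system is never retrained. A sketch, assuming the same dictionary-backed store as above and an illustrative helper name:

```python
def add_or_override(qa_store, question, answer):
    """Add a new predefined pair, or override an existing answer in place."""
    qa_store[question] = answer
    return qa_store

store = {}
# Initialization, e.g. by a domain expert:
add_or_override(store, "Who wrote Hamlet?", "William Shakespeare")
# Later, the system owner overrides the answer for a domain requirement,
# without touching the underlying QA system:
add_or_override(store, "Who wrote Hamlet?", "William Shakespeare (1564-1616)")
```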

The proposed approach has advantages as follows.

Customizing/overriding answers. Answers can be customized or overridden when necessary, for example, when the system owner needs a different answer from the one the underlying QA system provides.

Incorporating user feedback. Questions answered correctly by the underlying QA system can easily be added to the predefined question answer store to ensure that they will be answered correctly in the future.

Efficient and effective. If a question semantically matches ones in the predefined question answer store, an accurate predefin...