
Explaining why question-and-answer system answers change over time by listing the most relevant changes to the question-and-answer system

Disclosure Number: IPCOM000234018D
Publication Date: 2014-Jan-07
Document File: 4 page(s) / 68K

Publishing Venue

The Prior Art Database


A method and system for determining the most likely reasons a question-and-answer system gives different answers over time.

This text was extracted from a PDF file.
This is the abbreviated version, containing approximately 38% of the total text.


Tracking changes in the answers given by a question-and-answer system (Q/A system) over time is difficult. The issue is faced both by analysts developing these systems and by end users who depend on the systems' recommendations. Currently, a user or analyst must rely on intuition to work out why an answer has changed.

The primary reasons an answer from a Q/A system may change are:

Changes to the Q/A system's model, for example from continued training.

Updates to the Q/A system's algorithms as part of regular development.

Changes to the corpus (the input documents to the Q/A system).

A naive solution is to produce a simple listing of all possible differences: every new document added to the corpus, plus every code change set and work request added to the system during the development cycle, and then let the user sort out what may have caused the change in the answers. Instead, the proposed system pores through the overall list of changes to find the most likely, relevant, and impactful ones.

Solution overview

This article describes an automated approach to analyze all three avenues of change (model, algorithms, corpus/input data) and determine which avenue(s) led to differing recommendations from the Q/A system. This approach will consider the overall list of changes and determine the most likely, relevant, and impactful changes.

The solution starts by asking the same question of an old version and a new version of a given Q/A system and examining the scoring differences between the two answers. When the Q/A system recommends a new answer, either the new answer has been scored higher or the old answer has been scored lower. In either case, there will be differences in the scoring vector across one or more features for the answer(s). The system determines the features whose scores have changed the most and uses those as input to the two detailed search algorithms below. After applying the two algorithms, the system produces a summary of analysis results pointing to specific changes in the corpus or code. In short, the solution works backwards from the feature changes to the inputs that caused them.
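The comparison of scoring vectors described above can be sketched in a few lines of Python. This is a minimal illustration, assuming each answer's scoring vector is available as a feature-to-score mapping; the feature names and score values below are invented for the example and do not come from any real Q/A system.

```python
def top_changed_features(scores_v, scores_v1, n=3):
    """Return the n features with the largest absolute score change.

    scores_v, scores_v1: dicts mapping feature name -> score for the
    same candidate answer in versions V and V+1. A feature missing
    from one version is treated as scoring 0.0 there.
    """
    features = set(scores_v) | set(scores_v1)
    deltas = {f: scores_v1.get(f, 0.0) - scores_v.get(f, 0.0)
              for f in features}
    # Rank by magnitude of change, regardless of direction.
    return sorted(deltas.items(), key=lambda kv: abs(kv[1]), reverse=True)[:n]

# Example: passage-support score rose sharply, term-match fell slightly.
old = {"term_match": 0.62, "passage_support": 0.40, "type_coercion": 0.55}
new = {"term_match": 0.58, "passage_support": 0.81, "type_coercion": 0.55}
print(top_changed_features(old, new, n=2))
```

The features returned here (the largest movers) are exactly what the two search algorithms below take as input.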

Solution details

The following is a step-by-step walkthrough of the system:

1) Ask the same question of Q/A system versions V and V+1.

2) Find the differences between the answers and evidence from Q/A systems V and V+1.

a) One key difference is the candidate answer/evidence text. (Is there a new candidate answer, new evidence text, etc.?)

b) Determine the differences in the feature scoring vectors for each candidate answer that appears in both the V and V+1 results.

3) ALGORITHM 1 - Analyze the corpus differences. (Input to this algorithm is the new/delta text from the question/candidate answers/evidence text, and the feature values that have changed the most.)

a) Look at the V and V+1 ingestions and note the deltas. The deltas can include, but...
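Step 3a, as far as it is described, amounts to diffing the document sets ingested by V and V+1 and keeping only the deltas that overlap the changed answer/evidence text. The sketch below shows one plausible reading of that step; the document ids, the corpus contents, and the word-overlap relevance test are all illustrative assumptions, not the disclosed implementation.

```python
def corpus_deltas(docs_v, docs_v1):
    """docs_v, docs_v1: dicts mapping document id -> document text.
    Returns (added, removed, modified) between the two ingestions."""
    added = {d: docs_v1[d] for d in docs_v1.keys() - docs_v.keys()}
    removed = {d: docs_v[d] for d in docs_v.keys() - docs_v1.keys()}
    modified = {d: docs_v1[d] for d in docs_v.keys() & docs_v1.keys()
                if docs_v[d] != docs_v1[d]}
    return added, removed, modified

def relevant_deltas(deltas, delta_text):
    """Keep only delta documents sharing words with the new/changed
    answer or evidence text (a crude stand-in for a relevance test)."""
    words = set(delta_text.lower().split())
    return {d: t for d, t in deltas.items()
            if words & set(t.lower().split())}

old_corpus = {"doc1": "aspirin treats headaches"}
new_corpus = {"doc1": "aspirin treats headaches",
              "doc2": "ibuprofen treats headaches effectively"}
added, removed, modified = corpus_deltas(old_corpus, new_corpus)
print(relevant_deltas(added, "new evidence: ibuprofen effectively"))
```

Here the one added document shares words with the changed evidence text, so it would be surfaced to the user as a likely cause of the new answer.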