
Probabilistic Flash Cards

IP.com Disclosure Number: IPCOM000109384D
Original Publication Date: 1992-Aug-01
Included in the Prior Art Database: 2005-Mar-24
Document File: 4 page(s) / 127K

Publishing Venue

IBM

Related People

Brown, PF: AUTHOR (and 4 others)

Abstract

Disclosed is a scheme for selecting flash cards to show to a student so that he may learn words from a foreign language. The scheme attempts to select the order in which the cards are shown to the student so as to maximize the probability that at some later date the student will know the definition of a word selected at random from a text in the foreign language. The scheme uses a probabilistic model of the student's prior knowledge of the foreign language and of the rate at which he learns and forgets words.

This text was extracted from an ASCII text file.
This is the abbreviated version, containing approximately 52% of the total text.

Probabilistic Flash Cards

      The scheme works as follows.  At the $n$th time, $t_n$, the
student is presented with a flash card containing the word $w_n$ and
asked for a response $r_n$.  The student responds $r_n = \text{yes}$
if he knows the word $w_n$ and $r_n = \text{no}$ if he does not.  The
selection of the word $w_n$ for the $n$th query is made so as to
maximize the probability that the student will know a randomly
selected word from text at some later time $t_{n+1} = t_n + t$.  That
is, $w_n$ is selected to achieve the maximum possible conditional
probability

$$\Pr(R_{n+1} = \text{yes} \mid R_1^{n-1} = r_1^{n-1},\ W_1^{n-1} = w_1^{n-1},\ W_n = w_n,\ T_1^n = t_1^n,\ T_{n+1} = t_n + t)$$

of the $(n+1)$st response at $t$ units of time in the future.  Here
$R_i$, $W_i$, and $T_i$ are random variables for the $i$th response,
word, and time.
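
      Since the student is scored against a word drawn at random from
text, the conditional probability above marginalizes the $(n+1)$st
word over the text model Q introduced below:
$\Pr(R_{n+1} = \text{yes} \mid \cdot) = \sum_v \Pr(W_{n+1} = v)\,\Pr(R_{n+1} = \text{yes} \mid W_{n+1} = v, \cdot)$.
The greedy selection step might then be sketched as follows.  This is
a minimal illustration, not the disclosure's implementation: the name
select_next_card, the mapping q_probs (model Q), the callable
m_prob_yes (model M), and the (word, time) history representation are
all assumptions.

    def select_next_card(q_probs, m_prob_yes, history, t_now, t_delta):
        """Pick w_n maximizing the probability that a word drawn at
        random from text (probabilities q_probs, model Q) is known at
        time t_now + t_delta (model M).

        history is a list of (word, time) pairs for cards already
        shown; the response r_n to the candidate card has not yet
        been observed, so the candidate is appended without one.
        """
        best_word, best_score = None, -1.0
        for w in q_probs:                    # candidate words to show
            trial = history + [(w, t_now)]   # tentatively show w at t_now
            # Law of total probability over the (n+1)st word, drawn via Q.
            score = sum(p * m_prob_yes(v, trial, t_now + t_delta)
                        for v, p in q_probs.items())
            if score > best_score:
                best_word, best_score = w, score
        return best_word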

      The scheme requires a probabilistic model Q for randomly
selecting a word from text, and a probabilistic model M for computing
the probability of the student's response; a toy sketch of both
models follows the assumptions below.
Assume:
(Q)  The model Q computes the probability $\Pr(W_n = w_n)$ of
selecting the $n$th word according to its frequency in some large
corpus of text.
(M1) The model M involves a hidden parameter vector $R$ which
reflects the student's initial state of knowledge about the foreign
language and his ability to learn and remember what he learns.
(M2) The model M computes:
           1.   The conditional probability $\Pr(R_i = r_i \mid R_1^k = r_1^k \text{ except } R_i,\ W_1^k = w_1^k,\ T_1^k = t_1^k,\ R = r)$
of the $i$th response given the parameter vector and the previous and
future responses, words, and times.
           2.   The probability $\Pr(R = r \mid W_1^k = w_1^k,\ T_1^k = t_1^k)$
of the parameter vector given a word and time sequence.
(M3) The model M is 'causal' in the sense that:
           1.   The conditional probability in M2(1) of the $i$th
response given $R$ and the history does not depend on the words or
times occurring after the $i$th time.
           2.   The probability in M2(2) of the parameter vector $R$
is independent of the word and time sequences.
(M4) As a function of $w_n$, for fixed $r_1^{n+1}$, $w_{n+1}$,
$w_1^{n-1}$, and $r$, the conditional probability

$$M(R_{n+1} = r_{n+1} \mid r_1^n,\ W_{n+1} = w_{n+1},\ W_n = w_n,\ W_1^{n-1} = w_1^{n-1},\ R = r) \quad (1)$$

has the same value for all $w_n \neq w_{n+1}$.
(M5) The conditional probabilities appearing in M4 are ind...
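
      To make the assumptions concrete, here is a toy instance of the
two models.  It is a sketch only: since the text of model M is
abbreviated above (M5 is cut off), the exponential-forgetting form
below, the names build_q and m_prob_yes, and the parameters prior and
decay (standing in for the parameter vector $R = r$ of M1) are
assumptions, not the disclosure's model.

    import math
    from collections import Counter

    def build_q(corpus_tokens):
        """Assumption (Q): unigram relative frequencies in a large corpus."""
        counts = Counter(corpus_tokens)
        total = sum(counts.values())
        return {w: c / total for w, c in counts.items()}

    def m_prob_yes(word, history, t_query, prior=0.05, decay=0.1):
        """Toy stand-in for model M: Pr(response = yes for word at t_query).

        Each earlier showing of word at time t is assumed to 'stick',
        independently, with probability exp(-decay * (t_query - t)),
        on top of a small prior chance of already knowing the word.
        """
        p_unknown = 1.0 - prior
        for w, t in history:
            if w == word:
                p_unknown *= 1.0 - math.exp(-decay * (t_query - t))
        return 1.0 - p_unknown

This toy M satisfies M4 trivially: showing any word other than the
word being queried leaves m_prob_yes unchanged, so (1) takes the same
value for all $w_n \neq w_{n+1}$.  A dictionary from build_q and this
m_prob_yes can be passed directly to the select_next_card sketch
above.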