Extraction of Handwritten Responses from Forms

IP.com Disclosure Number: IPCOM000111508D
Original Publication Date: 1994-Feb-01
Included in the Prior Art Database: 2005-Mar-26
Document File: 2 page(s) / 44K

Publishing Venue

IBM

Related People

Camp, WO: AUTHOR

Abstract

Disclosed is a method for extracting the image of a blank form from the image of the form plus the customer responses written on it.


Extraction of Handwritten Responses from Forms

      Disclosed is a method for extracting the image of a blank form
from the image of the form plus the customer responses written on
it.

      The common way to process form images is to subtract the image
of the blank form from the image of the form plus customer responses.
This requires perfect registration, no distortion anywhere, no
changes in the form, etc., and it does not always work.  Another way
is to print the response areas in drop-out colors, but this does not
help when the customer writes outside the allowed areas.  A more
reliable and robust method of separating the customer responses from
the form is therefore desirable.
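The fragility of the subtraction approach can be seen in a minimal sketch.  The function below (an illustration, not part of the disclosure) performs the naive drop-out on two binarized images, and succeeds only when the blank form and the filled form are pixel-for-pixel registered:

```python
import numpy as np

def subtract_form(filled, blank):
    """Naive form drop-out: keep only ink present in the filled image
    but absent from the blank form.  Both inputs are 2-D binary
    arrays (1 = ink).  As the text notes, this works only under
    perfect registration; a one-pixel shift between the two scans
    leaves form residue in the output."""
    diff = filled.astype(int) - blank.astype(int)
    return np.clip(diff, 0, 1).astype(np.uint8)
```

Any misregistration, distortion, or form revision leaves stroke fragments of the form itself in the result, which is why the disclosure pursues a method that needs no blank reference image at all.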

      This invention uses the differences in spatial frequencies
between machine print and hand print to find the areas of hand print.
The difference is amplified by printing the form in one or more
specially designed fonts.  These fonts have very sharp vertical
strokes, so machine print can be readily detected by forming a
vertical histogram for each line of text and testing the contour of
that histogram across the line for very abrupt transitions between
bins with pixels and bins without pixels.  Areas where these
transitions are less abrupt are assumed to be handwritten responses
to the form.  A method to find this distinction quickly and
accurately using a neural network wherein the calculatio...
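The histogram transition test can be sketched as follows.  This is an illustrative reconstruction, not the disclosure's exact procedure: the scoring rule and the `cutoff` parameter are assumptions.  Each column of a binarized text-line image is a histogram bin; sharp-stroke machine print jumps between empty and full bins in a single column, while handwriting ramps up gradually:

```python
import numpy as np

def transition_abruptness(line_img):
    """Score how abruptly ink appears and disappears across the
    columns of a binarized text-line image (2-D array, 1 = ink).
    Full-height vertical strokes score near 1.0; gradual ramps,
    as in handwriting, score much lower."""
    hist = line_img.sum(axis=0).astype(float)     # vertical histogram
    occupied = hist > 0
    # Columns where an empty bin borders a non-empty bin.
    edges = np.flatnonzero(occupied[1:] != occupied[:-1])
    if edges.size == 0:
        return 0.0
    # Step height at each edge, relative to the tallest stroke.
    steps = np.abs(hist[edges + 1] - hist[edges]) / hist.max()
    return float(steps.mean())

def is_machine_print(line_img, cutoff=0.8):
    """cutoff is an assumed tuning parameter, not from the source."""
    return transition_abruptness(line_img) >= cutoff
```

A region of the line whose score falls below the cutoff would be flagged as a handwritten response; the truncated text suggests the disclosure goes on to replace this hand-tuned test with a neural network.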