
Control System for Self Calibrating Digital Converter Offsets in an Analog System

IP.com Disclosure Number: IPCOM000117796D
Original Publication Date: 1996-Jun-01
Included in the Prior Art Database: 2005-Mar-31
Document File: 4 page(s) / 145K

Publishing Venue

IBM

Related People

Chapman, DB: AUTHOR [+2]

Abstract

Analog circuits have offset voltages that can vary with usage conditions and subtract from the usable dynamic range of the circuits. This disclosure describes a system that calibrates out the offsets by means of a 1-bit Analog-to-Digital Converter (ADC), a logic control system, and Digital-to-Analog Converters (DACs).

This text was extracted from an ASCII text file.
This is the abbreviated version, containing approximately 44% of the total text.

Control System for Self Calibrating Digital Converter Offsets in an Analog System

      Analog circuits have offset voltages that can vary with usage
conditions and subtract from the usable dynamic range of the circuits.
This disclosure describes a system that calibrates out the offsets by
means of a 1-bit Analog-to-Digital Converter (ADC), a logic control
system, and Digital-to-Analog Converters (DACs).

      As shown in the Figure, the analog part of the system is composed
of three analog circuit functions (1-3) and an ADC (5).  Each function
requires one or two offset calibrations, which are adjusted with DACs
26-31.  For purposes of description, Input Amp 1 is considered the
front of the analog circuits and Filter 3 the back.
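
      The Figure itself is not reproduced in this extract.  Purely as an
illustrative sketch in C, with every identifier hypothetical, the chain
the text describes can be recorded as follows; the extract does not say
which of DACs 26-31 trims which function.

    /* Sketch of the analog chain described above: three circuit
     * functions ahead of the ADC, trimmed by offset DACs 26-31.
     * All names and groupings here are assumptions. */
    enum analog_stage { INPUT_AMP_1, VGA_2, FILTER_3 };  /* front to back */
    enum { ADC_5 = 5, OFFSET_DAC_FIRST = 26, OFFSET_DAC_LAST = 31 };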

      The preferred sequence of calibration is from back to front,
followed by a second calibration pass from front to back.  In an
alternate embodiment, calibration proceeds from front to back.  Time
delays are provided to allow the analog circuitry to settle before
offset measurements are made.  Only the Most Significant Bit (MSB) of
the ADC, or a comparator, is used.  A digital filter makes the
measurements more reliable by filtering out noise.  The residual offset
error is reduced by imposing one of two criteria on the DAC adjustment:
either the filtered reading from the ADC is zero, or the ADC output
must change polarity from positive offset to negative offset.
Provision is made for a short calibration sequence that allows just
one circuit to be calibrated.
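
      The disclosure does not spell out the digital filter.  One simple
possibility, shown here only as a hedged C sketch, is to take a fixed
number of 1-bit ADC (MSB) samples after a settling delay and report the
majority polarity.  The primitive names read_adc_msb() and
settle_delay(), and the sample count, are assumptions and do not come
from the text.

    #include <stdbool.h>

    /* Hypothetical hardware primitives (not named in the disclosure). */
    extern bool read_adc_msb(void);   /* true = positive offset polarity */
    extern void settle_delay(void);   /* allow the analog path to settle */

    #define FILTER_SAMPLES 16         /* assumed filter length */

    /* Digitally filter the 1-bit ADC (or comparator) output to reject
     * noise.  Returns +1 for a net positive offset, -1 for a net
     * negative offset, and 0 when the samples split evenly, which is
     * the "filtered reading is zero" criterion described above. */
    int filtered_offset_polarity(void)
    {
        settle_delay();                        /* delay before measuring */
        int count = 0;
        for (int i = 0; i < FILTER_SAMPLES; i++)
            count += read_adc_msb() ? 1 : -1;  /* only the MSB is used   */
        return (count > 0) - (count < 0);
    }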

      The use of only the MSB of the ADC (or a simple comparator in a
system that does not have an ADC) reduces the amount of logic needed.
The system measures only the polarity of the offset, not its magnitude;
hence, corrections to the offset-correction DACs are made one Least
Significant Bit (LSB) at a time.
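
      Because only the polarity is known, the correction loop for one
offset can be as simple as the following hedged C sketch, which steps
the DAC one LSB at a time until the filtered reading is zero or the
measured polarity flips.  set_offset_dac(), the 8-bit DAC range, the
mid-scale starting code and the step direction convention are all
assumptions; filtered_offset_polarity() is the filter sketched above.

    #include <stdint.h>

    extern int  filtered_offset_polarity(void);           /* sketched above */
    extern void set_offset_dac(int dac_id, uint8_t code); /* hypothetical   */

    #define DAC_MAX 255                                    /* assumed 8-bit  */

    /* Null one offset: step its correction DAC one LSB at a time against
     * the measured polarity, stopping when the filtered reading is zero,
     * the polarity changes, or the DAC reaches the end of its range. */
    void null_offset(int dac_id)
    {
        uint8_t code = DAC_MAX / 2;             /* assumed mid-scale start */
        set_offset_dac(dac_id, code);

        int pol = filtered_offset_polarity();
        while (pol != 0) {
            if (pol > 0 && code > 0)            code--;  /* step direction  */
            else if (pol < 0 && code < DAC_MAX) code++;  /* is assumed      */
            else break;                          /* end of DAC range */

            set_offset_dac(dac_id, code);
            int next = filtered_offset_polarity();
            if (next != 0 && next != pol)
                break;                           /* polarity changed: done */
            pol = next;
        }
    }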

      The sequence of events for the preferred embodiment begins with
all of the switches (6 through 9 and 13) open.  Conductor 14 and
switches 10, 11 and 12 are not present in the preferred embodiment.
The ADC is autocalibrated.  Switch 13 is then closed to provide a DC
signal path into the ADC from Filter 3, and the Filter is given a
preliminary offset calibration.  Switch 8 is closed to connect VGA 2 to
the filter, and the offsets in the VGA are nulled out with the gain of
the VGA set to a nominal value.  Then switch 7 is closed and Input Amp
1 is adjusted with its Low-Pass Filter (LPF) enabled.  The LPF is then
disabled and the first stage of the Input Amp is nulled out.  Note that
in this example the two offset adjustments for the Input Amp are
independent.  The switches are now left as they are (all closed except
for 6 and 9) and the VGA is set to a zero-gain condition.  The VGA
low-gain DAC is adjusted.  The VGA is then digitally set to the maximum
gain and the high-gain offset DAC is adjusted.  If the VGA is designed
in a linear manner, then the Input Amp offsets will have been nulled
out for all gains b...
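
      As a final illustrative sketch (again in C, with every identifier
hypothetical, and covering only the steps that survive in this
abbreviated extract), the preferred-embodiment sequence above might be
organized as follows; null_offset() is the per-DAC routine sketched
earlier.

    #include <stdbool.h>

    /* Hypothetical primitives standing in for switch, gain and ADC
     * control; none of these names appear in the disclosure. */
    extern void set_switch(int id, bool closed);
    extern void autocalibrate_adc(void);
    extern void set_vga_gain(int gain_code);
    extern void enable_input_amp_lpf(bool on);
    extern void null_offset(int dac_id);

    enum { SW6 = 6, SW7 = 7, SW8 = 8, SW9 = 9, SW13 = 13 };
    enum { DAC_FILTER, DAC_VGA_NOMINAL, DAC_AMP_LPF, DAC_AMP_STAGE1,
           DAC_VGA_LOW_GAIN, DAC_VGA_HIGH_GAIN };          /* assumed IDs   */
    enum { VGA_GAIN_ZERO = 0, VGA_GAIN_NOMINAL = 128,
           VGA_GAIN_MAX = 255 };                           /* assumed codes */

    /* Back-to-front calibration pass of the preferred embodiment, as
     * far as it is described in the extract above. */
    void calibrate_back_to_front(void)
    {
        /* Start with switches 6 through 9 and 13 open. */
        set_switch(SW6, false);  set_switch(SW7, false);
        set_switch(SW8, false);  set_switch(SW9, false);
        set_switch(SW13, false);

        autocalibrate_adc();                /* ADC autocalibration         */

        set_switch(SW13, true);             /* DC path: Filter 3 -> ADC    */
        null_offset(DAC_FILTER);            /* preliminary filter offset   */

        set_switch(SW8, true);              /* connect VGA 2 to the filter */
        set_vga_gain(VGA_GAIN_NOMINAL);
        null_offset(DAC_VGA_NOMINAL);       /* VGA offsets at nominal gain */

        set_switch(SW7, true);              /* connect Input Amp 1         */
        enable_input_amp_lpf(true);
        null_offset(DAC_AMP_LPF);           /* Input Amp, LPF enabled      */
        enable_input_amp_lpf(false);
        null_offset(DAC_AMP_STAGE1);        /* Input Amp first stage       */

        /* Switches stay as they are (all closed except 6 and 9). */
        set_vga_gain(VGA_GAIN_ZERO);        /* zero-gain condition         */
        null_offset(DAC_VGA_LOW_GAIN);      /* adjust VGA low-gain DAC     */

        set_vga_gain(VGA_GAIN_MAX);         /* maximum gain                */
        null_offset(DAC_VGA_HIGH_GAIN);     /* adjust VGA high-gain DAC    */
    }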