
Calibration of Memory Driver with Offset in a Memory Controller and Memory Device Interface in a Communication Bus Disclosure Number: IPCOM000195276D
Publication Date: 2010-Apr-27
Document File: 3 page(s) / 142K

Publishing Venue

The Prior Art Database


Described is a method for improving the AC performance characteristics of an interface by coupling bus components, such as a DRAM and a memory controller, during driver training and then reducing mismatches by controlling the DRAM drive impedance to yield improvements in timing margin. In generic terms, coupling the components on a shared electrical bus via DRAM drive-impedance adjustment during system operation substantially reduces known offset issues. The method shown can be applied to any number of system or sub-system electrical communication buses.

This text was extracted from a PDF file.
At least one non-text object (such as an image or picture) has been suppressed.
This is the abbreviated version, containing approximately 55% of the total text.



GDDR memory DRAMs are designed to train drive impedance and termination values against a reference resistor. Process variation and limited resolution can cause variation in the final DRAM training values. The same variation can occur within the memory controller if it trains in a similar way, causing a mismatch between DRAM and controller impedances. This mismatch can introduce timing offsets because the reference voltage is no longer properly aligned to the resulting data eye. The problem exists generically on many other interface types, and the resulting loss of timing margin is present in those situations as well.

    For example, when a device drives a "0", the impedances of the device driver and the controller termination determine the voltage at which a "0" appears on the net. To get the most timing margin on this data interface, any variation in these impedances must be compensated. By adjusting the device drive impedance so that the vertical center of the read eye at the controller is optimized, the timing margin is maximized.
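The resulting low level can be illustrated with a simple resistive-divider model. This is a sketch only; the function name, resistor values, and the assumption that the controller termination is tied to Vdd are illustrative, not taken from the disclosure.

```python
def low_level(r_drive, r_term, v_dd=1.0):
    """Voltage seen by the controller when the device drives a '0'.

    Hypothetical model: the net forms a resistive divider between the
    device pull-down (r_drive, to ground) and the controller
    termination (r_term, assumed tied to v_dd).
    """
    return v_dd * r_drive / (r_drive + r_term)

# A 40-ohm pull-down against a 60-ohm termination places the low
# level at 0.4 * Vdd; a mismatched (larger) drive impedance shifts it up.
print(low_level(40, 60))   # 0.4
print(low_level(48, 60))   # above 0.4: drive impedance too large
```

Any drift of either resistance away from its trained value moves the "0" voltage, which is exactly the mismatch the method corrects.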

    Figure 1 shows an implementation for a bus structure in which the low level needs adjustment because the device drive impedance and the controller termination impedance together determine the low-level voltage. The functional data path is compared against the ideal low level in the test path. The test-path output indicates whether the functional level is above or below the expected value, which helps find the best device drive impedance setting.


    This implementation can increment or decrement the drive impedance of the device, which changes the "0" value seen by the controller. The controller modifies the device drive impedance by adjusting the voltage at the reference resistor of the device. The device drive impedance is then adjusted until there is a change in the test path, giving a "Read 0" low voltage equal to a Vlow of 0.4 Vdd, which is the optimum for a reference voltage of 0.7 Vdd. The device drive impedance is tuned independently for each device.
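The increment-until-the-test-path-flips procedure can be sketched as a simple sweep. The resistive-divider model, the step size, and the impedance bounds below are all assumptions made for illustration; real hardware would step a register setting rather than an ohm value.

```python
def calibrate(r_term, v_dd=1.0, v_target=0.4, r_step=1.0,
              r_min=20.0, r_max=80.0):
    """Sweep the device drive impedance until the '0' level seen by
    the controller reaches the target low voltage (0.4 * Vdd, the
    optimum for a 0.7 * Vdd reference).

    Hypothetical model: the low level is a resistive divider between
    the device pull-down and the controller termination to Vdd.
    """
    r_drive = r_min
    while r_drive <= r_max:
        v_low = v_dd * r_drive / (r_drive + r_term)
        if v_low >= v_target * v_dd:   # test path flips: target reached
            return r_drive
        r_drive += r_step              # increment the drive impedance
    return r_max

# Each device is tuned independently against its own termination.
print(calibrate(r_term=60.0))   # 40.0
```

Note that only the test-path comparison result is visible to the controller; the sweep stops at the first setting where that result changes.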


[This page contains 16 pictures or other non-text objects]


    After the training is done, the optimal setting is locked. However, changes in temperature and voltage cause the device drive impedance and the controller termination to vary, which leads to a less-than-optimal low voltage level. By monitoring the test path, this non-optimal condition can be detected and a re-calibration can be initiated if needed.
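The monitoring step reduces to checking whether the observed low level has drifted too far from the trained target. The tolerance band below is an invented parameter for illustration; the disclosure does not specify a threshold.

```python
def needs_recalibration(v_low, v_dd=1.0, v_target=0.4, tol=0.02):
    """Monitor the test path during operation: if voltage/temperature
    drift has pushed the '0' level more than tol * Vdd away from the
    trained 0.4 * Vdd target, flag a re-calibration (tol is a
    hypothetical design parameter)."""
    return abs(v_low - v_target * v_dd) > tol * v_dd

print(needs_recalibration(0.40))  # False
print(needs_recalibration(0.45))  # True
```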

    Figure 2 provides an example of determining the appropriate driver training levels when comparing the test path to the functional path. In general terms, a Vlow that is too low would show the test path as always reading a '1' value, and a Vlow that is too high would show the test path as always reading the correct functional value (always reading a '0' as a '0' and always...