
CAMERA LINEARITY TEST STRUCTURE

IP.com Disclosure Number: IPCOM000006566D
Original Publication Date: 1992-Aug-01
Included in the Prior Art Database: 2002-Jan-15
Document File: 2 page(s) / 131K

Publishing Venue

Motorola

Related People

George Granger: AUTHOR

Abstract

Some defect/particle detection systems designed for semiconductor applications are based on a combination of a vision sub-system (video camera with lens), an accurate x-y translation sub-system (stage) that steps the semiconductor wafer beneath the camera, and a computer system which controls the equipment and stores measurement information. Particle detection is accomplished by measuring optical scattering of laser light from particles that rest on the surface of the wafer. Digitization of the video camera image offers the ability to obtain locational information on the position of the defects within the field of view of the camera. Proper calibration of the vision system, combined with accurate locational data from the x-y stage, allows the determination of x-y coordinates for each particle detected on the wafer being examined, with reference to a coordinate system arbitrarily defined on the wafer, typically near the wafer center. This particle coordinate locational data, along with the wafer, is then transferable to other analysis equipment for further characterization of the physical and chemical nature of the defects.

This text was extracted from a PDF file.
At least one non-text object (such as an image or picture) has been suppressed.
This is the abbreviated version, containing approximately 43% of the total text.


MOTOROLA INC. Technical Developments Volume 16 August 1992

CAMERA LINEARITY TEST STRUCTURE

by George Granger and Tom Remmel


  However, all camera/lens systems have some degree of inherent distortion, or non-linearity, in both x and y directions. This non-linearity, known as either pincushion or barrel distortion, can be severe in video cameras, approaching 10% non-linearity near the edge of the field of view. Such distortion produces erroneous coordinate data for the defects detected by the video patterned wafer inspection system, making it unfeasible to locate these same defects in other characterization or analysis systems.

  The camera linearity test structure described below was designed to precisely measure the extent of both x and y distortion in such video camera/lens systems. By quantifying the exact degree of non-linearity of the system, one can correct for it mathematically.
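The disclosure does not state the mathematical form of the correction. A minimal sketch of one common approach, assuming a single-term radial distortion model (the `k1` coefficient, `fit_k1`, and `undistort` names are illustrative, not from the article) in Python with NumPy:

```python
import numpy as np

# Hypothetical sketch -- the disclosure does not specify the correction model.
# A common assumption is a single-term radial model about the optical center:
#     r_measured = r_true * (1 + k1 * r_true**2)
# where pincushion and barrel distortion differ in the sign of k1.

def fit_k1(true_xy, meas_xy):
    """Least-squares fit of k1 from matched feature positions.

    Both inputs are N x 2 arrays of coordinates relative to the optical
    center: ideal positions and their measured (distorted) counterparts.
    """
    r_t = np.hypot(true_xy[:, 0], true_xy[:, 1])
    r_m = np.hypot(meas_xy[:, 0], meas_xy[:, 1])
    mask = r_t > 0                     # the center feature carries no information
    a = r_t[mask] ** 3                 # model: r_m - r_t = k1 * r_t**3
    b = r_m[mask] - r_t[mask]
    return float(a @ b / (a @ a))

def undistort(meas_xy, k1, iters=10):
    """Invert the radial model by fixed-point iteration."""
    xy = meas_xy.copy()
    for _ in range(iters):
        r2 = np.sum(xy**2, axis=1, keepdims=True)
        xy = meas_xy / (1.0 + k1 * r2)
    return xy
```

Fitting `k1` from the measured centroids of a known grid and then applying `undistort` to every detected particle coordinate is one way the "correct for it mathematically" step could be realized.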

© Motorola, Inc. 1992

  The test structure consists of a substrate material which has multiple repeating patterns on it. One such pattern is depicted in the drawing of Figure 1. The key aspects of the pattern are: (1) two 21x21 arrays of features, (2) the features in these arrays are small squares, and (3) three rectangular arrays of lines arranged as a pair in one corner of the pattern and a single in the other. The square arrays consist of an equal number of features that are equally spaced in both the x and y directions; the overall size of the array is dependent upon the field of view of the video camera/lens system (related to lens magnification). The spacing of the features within the array is determined by the desired accuracy in measurement and subsequent modeling of the distortion correction; the greater the number of these features, the more accurate the correction will be. We used an array of 21x21 features, each s...
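As an illustration of how such an array of equally spaced features can quantify non-linearity, the sketch below builds an ideal 21x21 grid and reports the worst-case feature displacement as a percentage of the half field of view. The grid pitch and the exact definition of the metric are assumptions for illustration, not taken from the disclosure:

```python
import numpy as np

def ideal_grid(n=21, pitch=1.0):
    """Ideal n x n array of equally spaced feature centers, origin at the
    array center (mirroring the 21x21 test-structure arrays).  The pitch
    value here is an arbitrary placeholder."""
    c = (np.arange(n) - (n - 1) / 2.0) * pitch
    xx, yy = np.meshgrid(c, c)
    return np.column_stack([xx.ravel(), yy.ravel()])

def nonlinearity_pct(true_xy, meas_xy):
    """Worst-case feature displacement as a percentage of the distance from
    the center to the farthest feature (one possible definition)."""
    disp = np.linalg.norm(meas_xy - true_xy, axis=1)
    half_fov = np.max(np.linalg.norm(true_xy, axis=1))
    return 100.0 * disp.max() / half_fov
```

Comparing the measured centroids of the squares against this ideal grid, feature by feature, yields the displacement field from which a distortion correction can be modeled; as the article notes, a larger number of features gives more sample points and a more accurate correction.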