Controlling Gain of the PMT Coupled to a Scintillator Crystal for Use in Oil Well Total Gamma Ray...
Publication Date: 2004-Oct-08
The IP.com Prior Art Database
The total gamma ray log is one of the most important LWD or wireline measurements. Loss of accuracy can occur in high-temperature wells due to a loss of gain in the detector, such as a scintillator crystal coupled to a photomultiplier tube (PMT). Increased thermal noise at high temperatures is another cause of lost accuracy in the gamma ray measurement.

Prior art uses the plateau method to choose the PMT voltage so as to minimize the variance of the gamma count rate with temperature. However, minimum variance does not necessarily mean optimal accuracy. Namely, with the plateau method, a relatively stable count rate is achieved through mutually compensating effects at high temperatures: the increase in thermal noise is compensated by a loss of useful signal. The problem becomes more serious as the detector ages, which is accompanied by a dramatic increase in thermal noise and a loss of resolution.

In the present invention we propose a calibration procedure and a gain control procedure using three measurements: the count rates in three energy windows of the gamma ray detector, a temperature measurement, and a voltage measurement. An increase in temperature causes a loss of gain in the gamma ray detector. The gain of the detector is inversely correlated with the RATIO of the OFFSET to the SLOPE in a straight-line fit of the count rates in the three energy windows covering the right sidelobe of the Compton spectrum. An optimally filtered RATIO is used as a feedback signal to control the voltage so that the RATIO is kept constant during the entire run. The temperature sensor measurement is used for a crude initialization of the PMT voltage upon tool start-up. Empirical linear relationships between voltage, gain, ratio, and temperature are used in the gain control algorithm as parameters that facilitate convergence of the voltage to the value required to keep the gain constant.

-----------

Here is a sketch of what is in mind for high-temperature gamma.
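As a rough illustration of the RATIO described above, the offset and slope can come from an ordinary least-squares straight-line fit of count rate versus window energy. The function name, window energies, and count rates below are hypothetical, not calibrated values from the disclosure:

```python
def offset_slope_ratio(energies, counts):
    """Fit counts = slope * energy + offset by least squares and
    return RATIO = offset / slope (inversely correlated with gain)."""
    n = len(energies)
    mean_e = sum(energies) / n
    mean_c = sum(counts) / n
    # Covariance and variance terms of the least-squares line fit.
    cov = sum((e - mean_e) * (c - mean_c) for e, c in zip(energies, counts))
    var = sum((e - mean_e) ** 2 for e in energies)
    slope = cov / var
    offset = mean_c - slope * mean_e
    return offset / slope

# Hypothetical window center energies (keV) on the right sidelobe of the
# Compton spectrum, and measured count rates in those windows (cps):
windows = [200.0, 300.0, 400.0]
rates = [500.0, 350.0, 200.0]
ratio = offset_slope_ratio(windows, rates)
```

A gain shift compresses or stretches the measured spectrum, which changes the fitted offset and slope differently; this is why their ratio serves as a gain indicator while being insensitive to an overall count-rate scale factor.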
It seems to be very easy to implement and easy to calibrate. The temperature correction monitors three measurements:

1) T – temperature
2) V – voltage
3) R – ratio of offset to slope for the three energy windows (the shape of the Compton spectrum), heavily filtered.

Calibration is done at room temperature To to determine Vo and Ro. Vo is determined by measuring the plateau curve at room temperature and requiring that Vo be such that the count rate is 10 lower than that at the plateau point. The plateau test is run using the assembled tool and an Amersham gamma ray calibrator.

Once the starting values of the parameters To, Vo, Ro have been measured and the tool is sent downhole, these parameters are sampled every minute. As a first step, the voltage is adjusted using the known dV/dT: set the voltage to V = Vo + (dV/dT)(T - To). This is a first crude approximation. From this point, adjust the voltage as follows: V = V(one sample ago) + 0.1(dV/dR)(R - Ro). The coefficient 0.1 is used here for a purpose: (dV/dR) is known only approximately, and we do not want the voltage control to overshoot, i.e. correct the voltage too much. Instead we want the algorithm to converge slowly towards the correct voltage and the correct ratio R. The ratio is a function of gain, so converging to the correct ratio means converging to the correct gain.

Regards, Sergey
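The start-up correction and the per-sample voltage update above can be sketched as follows. The function names, the exponential moving average standing in for the "heavily filtered" ratio, and all numerical values (calibration constants, empirical slopes, sampled ratios) are assumptions for illustration only:

```python
def initial_voltage(V0, dV_dT, T, T0):
    """Crude start-up voltage from the temperature sensor: V = Vo + (dV/dT)(T - To)."""
    return V0 + dV_dT * (T - T0)

def voltage_step(V_prev, dV_dR, R_filt, R0, damping=0.1):
    """One per-sample update: V = V(one sample ago) + 0.1 (dV/dR)(R - Ro).
    The damping factor 0.1 keeps the loop from overshooting, since
    dV/dR is known only approximately."""
    return V_prev + damping * dV_dR * (R_filt - R0)

def filter_ratio(R_filt_prev, R_raw, alpha=0.05):
    """Hypothetical heavy filter: exponential moving average of the raw ratio."""
    return (1.0 - alpha) * R_filt_prev + alpha * R_raw

# Example with assumed values: room-temperature calibration constants,
# then minute-by-minute sampling downhole.
T0, V0, R0 = 25.0, 900.0, -500.0   # calibration at room temperature (assumed)
dV_dT, dV_dR = 0.5, -2.0           # empirical linear slopes (assumed)

V = initial_voltage(V0, dV_dT, T=125.0, T0=T0)   # crude first approximation
R_filt = R0
for R_raw in [-520.0, -515.0, -510.0]:           # sampled raw ratios (assumed)
    R_filt = filter_ratio(R_filt, R_raw)
    V = voltage_step(V, dV_dR, R_filt, R0)
```

The heavy filtering matters because the window count rates are Poisson-noisy; updating the voltage from a smoothed ratio, with the 0.1 damping factor, trades response speed for stability of the loop.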