Abstract In this study, we evaluated the spectroscopic performance of a commercial pixelated cadmium zinc telluride (CZT) detector and assessed its feasibility as a Compton camera for radiation monitoring in a nuclear power plant. The detection system consisted of a 20 mm × 20 mm × 5 mm CZT crystal with 8 × 8 pixelated anodes and a common cathode, together with an application-specific integrated circuit. The spectroscopic performance was evaluated with various radioisotopes: 57Co, 133Ba, 22Na, and 137Cs. In general, the amplitude of the signal induced in a CZT crystal depends on the interaction position and on material non-uniformity. To minimize this dependence, a drift time correction was applied: the depth of each interaction was calculated from the drift time, and the positional dependence of the signal amplitude was corrected based on this depth information. After the correction, the Compton regions of each spectrum were reduced, and the energy resolutions of the 122 keV, 356 keV, 511 keV, and 662 keV peaks improved from 13.59%, 9.56%, 6.08%, and 5% to 4.61%, 2.94%, 2.08%, and 2.2%, respectively. For Compton imaging, simulations and experiments were performed using a single 137Cs source at various angular positions and using two 137Cs sources. Individual and multiple sources of 133Ba, 22Na, and 137Cs were also measured. The images were successfully reconstructed with the weighted list-mode maximum likelihood expectation maximization (LM-MLEM) method. The angular resolutions and intrinsic efficiency of the 137Cs experiments were approximately 7°–9° and 5 × 10⁻⁴–7 × 10⁻⁴, respectively. The distortion of the source distribution increased in proportion to the offset angle.
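The drift time correction described above maps each event's electron drift time to a depth of interaction and rescales the anode amplitude accordingly. The sketch below illustrates the general idea only; the drift velocity, crystal gain curve, and function names are assumptions for illustration, not values or code from the paper.

```python
import numpy as np

# Hypothetical illustration of a drift-time-based depth correction for a
# pixelated CZT detector. All names and numbers are placeholders.

CRYSTAL_THICKNESS_MM = 5.0        # cathode-to-anode distance of the CZT crystal
ELECTRON_DRIFT_MM_PER_US = 100.0  # assumed electron drift speed (placeholder)

def interaction_depth(drift_time_us: float) -> float:
    """Estimate the interaction depth (distance from the anode, in mm)
    from the measured electron drift time."""
    depth = drift_time_us * ELECTRON_DRIFT_MM_PER_US
    return min(depth, CRYSTAL_THICKNESS_MM)

def depth_corrected_amplitude(raw_amplitude: float,
                              drift_time_us: float,
                              gain_vs_depth) -> float:
    """Rescale the raw anode amplitude with a per-pixel gain curve measured
    beforehand (e.g. with a calibration source) as a function of depth."""
    depth = interaction_depth(drift_time_us)
    return raw_amplitude / gain_vs_depth(depth)

# Toy gain curve: amplitude deficit grows for interactions nearer the cathode.
toy_gain = lambda d: 1.0 - 0.05 * (d / CRYSTAL_THICKNESS_MM)
print(depth_corrected_amplitude(662.0, 0.04, toy_gain))
```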
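The images were reconstructed with a weighted list-mode MLEM method. A minimal sketch of a generic weighted LM-MLEM update is given below, assuming a precomputed system matrix T (the Compton-cone back-projection of each event onto the image grid), per-event weights w, and a sensitivity map s; these inputs and the iteration count are assumptions, not the authors' implementation.

```python
import numpy as np

def lm_mlem(T: np.ndarray,
            w: np.ndarray,
            s: np.ndarray,
            n_iter: int = 20) -> np.ndarray:
    """Weighted list-mode MLEM update.

    T : (n_events, n_pixels) cone back-projection probabilities per event
    w : (n_events,) per-event weights
    s : (n_pixels,) sensitivity (detection probability per image pixel)
    """
    n_events, n_pixels = T.shape
    lam = np.ones(n_pixels)                   # uniform initial image
    for _ in range(n_iter):
        forward = T @ lam                     # expected value for each event
        forward = np.maximum(forward, 1e-12)  # guard against division by zero
        back = T.T @ (w / forward)            # weighted back-projection
        lam *= back / np.maximum(s, 1e-12)    # multiplicative image update
    return lam

# Toy usage with random data, only to show the call pattern.
rng = np.random.default_rng(0)
T = rng.random((100, 64 * 32))                # e.g. a 64 x 32 angular grid
w = np.ones(100)
s = T.sum(axis=0)
image = lm_mlem(T, w, s)
```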