We present a novel application of machine learning techniques to optimize the design of a radiation detection system. We describe a decision-tree-based algorithm that greedily optimizes the partitioning of energy depositions based on a minimum-detectable-concentration metric appropriate for radiation measurement. We apply this method to the task of optimizing sensitivity to radioxenon decays in the presence of a high rate of radon-progeny backgrounds (i.e., assuming no physical radon removal by traditional gas-separation techniques). Assuming other backgrounds are negligible, and considering sensitivity to each xenon isotope separately (neglecting interference between isotopes), we find that, in general, high-resolution readout and high spatial segmentation yield little additional capability to discriminate against radon backgrounds compared with simpler detector designs.
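The abstract does not give the algorithm's details, but the idea of greedily partitioning an energy spectrum to optimize a sensitivity metric can be sketched as follows. This is a minimal illustration, not the paper's implementation: the actual minimum-detectable-concentration metric is not specified here, so a common Currie-style proxy (signal over the square root of background, summed in quadrature across partitions) stands in for it, and all spectra, bin counts, and parameter names are invented for the example.

```python
# Hedged sketch of decision-tree-style greedy partitioning of a 1-D energy
# spectrum. The MDC metric from the paper is replaced by an illustrative
# S/sqrt(B) figure of merit combined in quadrature over partitions; all
# data and names below are hypothetical.
import numpy as np

def figure_of_merit(signal, background, cuts):
    """Quadrature sum of per-partition S/sqrt(B) for the given cut positions."""
    edges = [0, *sorted(cuts), len(signal)]
    fom_sq = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        s = signal[lo:hi].sum()
        b = background[lo:hi].sum()
        if b > 0:
            fom_sq += s * s / b
    return np.sqrt(fom_sq)

def greedy_partition(signal, background, max_cuts=4):
    """Greedily add the energy cut that most improves the figure of merit."""
    cuts = []
    best = figure_of_merit(signal, background, cuts)
    for _ in range(max_cuts):
        candidates = [
            (figure_of_merit(signal, background, cuts + [c]), c)
            for c in range(1, len(signal)) if c not in cuts
        ]
        new_best, new_cut = max(candidates)
        if new_best <= best:
            break  # no further cut improves sensitivity
        best, cuts = new_best, cuts + [new_cut]
    return sorted(cuts), best

# Toy spectra: a Gaussian signal peak over a smooth radon-like background.
energy = np.arange(100)
signal = 50 * np.exp(-0.5 * ((energy - 40) / 3.0) ** 2)
background = 100 * np.exp(-energy / 60.0)
cuts, fom = greedy_partition(signal, background)
```

Each iteration scans every remaining bin boundary, keeps the single cut that most improves the metric, and stops when no cut helps, mirroring how a greedy decision tree grows one split at a time.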