High-frame-rate ultrasound imaging uses unfocused transmissions to insonify an entire imaging view for each transmit event, thereby enabling frame rates over 1000 frames per second (fps). At these high frame rates, it is naturally challenging to realize real-time transfer of channel-domain raw data from the transducer to the system back end. Our work seeks to halve the total data transfer rate by uniformly decimating the receive channel count by 50% and, in turn, doubling the array pitch. We show that despite the reduced channel count and the inevitable use of a sparse array aperture, the resulting beamformed image quality can be maintained by designing a custom convolutional encoder–decoder neural network to infer the radio frequency (RF) data of the nullified channels. This deep learning framework was trained with in vivo human carotid data (5-MHz plane wave imaging, 128 channels, 31 steering angles over a 30° span, and 62 799 frames in total). After training, the network was tested on an in vitro point target scenario that was dissimilar to the training data, in addition to in vivo carotid validation datasets. In the point target phantom image beamformed from inferred channel data, spatial aliasing artifacts attributed to array pitch doubling were reduced by up to 10 dB. For carotid imaging, our proposed approach yielded a lumen-to-tissue contrast that was on average within 3 dB of that of the full-aperture image, whereas without channel data inference, the carotid lumen was obscured. When implemented on an RTX-2080 GPU, the inference time to apply the trained network was 4 ms, which favors real-time imaging. Overall, our technique shows that with the help of deep learning, channel data transfer rates can be effectively halved with limited impact on the resulting image quality.
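
To make the channel-inference idea concrete, the sketch below shows one way a convolutional encoder–decoder could map a 50%-decimated receive aperture back to the full 128-channel RF data, as the abstract describes. This is a minimal illustration only: the class name ChannelInferenceNet, the layer counts, kernel sizes, and the assumption that each input frame is arranged as (kept channels × fast-time samples) are assumptions for this sketch and not the authors' published architecture.

# Minimal sketch (assumed architecture, not the authors' network): infer the
# RF data of the nullified channels from the 64 kept channels of a single
# plane-wave transmit event.
import torch
import torch.nn as nn

class ChannelInferenceNet(nn.Module):
    def __init__(self, in_channels: int = 64, out_channels: int = 128):
        super().__init__()
        # Encoder: 1-D convolutions along the fast-time (depth) axis,
        # progressively compressing the temporal resolution.
        self.encoder = nn.Sequential(
            nn.Conv1d(in_channels, 128, kernel_size=9, stride=2, padding=4),
            nn.ReLU(inplace=True),
            nn.Conv1d(128, 256, kernel_size=9, stride=2, padding=4),
            nn.ReLU(inplace=True),
        )
        # Decoder: transposed convolutions restore the fast-time resolution
        # and expand the output to the full 128-channel aperture.
        self.decoder = nn.Sequential(
            nn.ConvTranspose1d(256, 128, kernel_size=9, stride=2,
                               padding=4, output_padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose1d(128, out_channels, kernel_size=9, stride=2,
                               padding=4, output_padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 64 kept receive channels, fast-time samples)
        return self.decoder(self.encoder(x))

if __name__ == "__main__":
    net = ChannelInferenceNet()
    decimated_rf = torch.randn(1, 64, 2048)   # one decimated transmit event (illustrative size)
    full_rf = net(decimated_rf)               # inferred full aperture: (1, 128, 2048)
    print(full_rf.shape)

In such a setup, the network would typically be trained by nullifying every other channel of full-aperture recordings and regressing the missing RF signals against the originals; the paper's actual training procedure, loss, and layer configuration are not specified in the abstract.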