With advances in robotic technology in the medical field, robotic palpation has become a research hotspot and attracted wide attention. Tumor recognition is considered a significant task in robotic palpation. In particular, determining tumor depth is a key prerequisite for tumor resection in robot-assisted minimally invasive surgery (RMIS). However, the lack of tactile feedback in RMIS prevents the robot from accurately perceiving information about lesions, such as tumor depth. To address this problem, we investigate the recognition of the depths of hard inclusions in soft tissue phantoms. In this article, tactile array data acquired by autonomous robotic palpation are used to recognize different depths of hard inclusions. Framing tumor depth recognition as a classification problem, we evaluate various deep learning methods and present ordinal classification methods based on the 2-D convolutional neural network–long short-term memory (2-D-CNN-LSTM) architecture for high-dimensional spatial–temporal tactile array data. The methods exploit the ordinal information of the labels by replacing conventional one-hot encoding with unimodal probability distribution encodings, including Gaussian, Cauchy, and Laplace distribution encodings. Finally, the experimental results demonstrate the feasibility of the proposed methods. The average recognition accuracy of the methods reaches 97%, and rises above 99% when tactile data from all the robotic fingers are used.
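The core idea of the unimodal encodings can be sketched in a few lines: instead of a one-hot target, the label for depth class k is a discrete probability distribution peaked at k, so neighboring depth classes receive nonzero mass and larger ordinal errors are penalized more. The following is a minimal illustrative sketch, not the paper's implementation; the function name, the `sigma` scale parameter, and the normalization scheme are assumptions for demonstration.

```python
import numpy as np

def unimodal_labels(true_class, num_classes, sigma=1.0, dist="gaussian"):
    """Soft ordinal target: a unimodal distribution over the depth classes,
    centered on the true class (illustrative sketch, not the paper's code)."""
    k = np.arange(num_classes)
    d = np.abs(k - true_class)          # ordinal distance to the true class
    if dist == "gaussian":
        w = np.exp(-(d ** 2) / (2 * sigma ** 2))
    elif dist == "laplace":
        w = np.exp(-d / sigma)
    elif dist == "cauchy":
        w = 1.0 / (1.0 + (d / sigma) ** 2)
    else:
        raise ValueError(f"unknown distribution: {dist}")
    return w / w.sum()                  # normalize to sum to 1

# For 5 depth classes with true class 2, the Gaussian target peaks at
# index 2 and decays symmetrically toward classes 0 and 4.
target = unimodal_labels(true_class=2, num_classes=5, dist="gaussian")
```

Such soft targets can be trained against with a standard cross-entropy (or KL-divergence) loss, so the network architecture itself needs no change.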