Human-Machine Cooperative Echolocation Using Ultrasound

Echolocation has been shown to improve the independence of visually impaired people, and using ultrasound for echolocation offers additional advantages, such as higher-resolution object sensing and easier separation from background sounds. However, humans cannot innately produce or hear ultrasound. Wearable devices that enable ultrasonic echolocation, i.e., that transmit ultrasound through an ultrasonic speaker and convert the reflected ultrasound into audible sound, have therefore been attracting interest. Such a system can be combined with machine learning (ML) to help visually impaired users recognize objects. We have therefore been developing a cooperative echolocation system that combines human recognition with ML recognition. As a first step toward cooperative echolocation, this paper examines the effectiveness of ML in echolocation. We implemented a prototype device and evaluated object-detection performance with and without ML, finding that ML significantly reduced the mental workload on the user. Based on these findings, we discuss the design of cooperative echolocation.
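The core operation the abstract describes is shifting a reflected ultrasonic echo down into the audible band. The paper's own implementation is not given here, so the following is only a minimal sketch of one standard way to do this (heterodyning): multiply the echo by a local oscillator and low-pass filter the product so the ultrasonic carrier lands at an audible frequency. All parameter values (192 kHz sampling, a 40 kHz carrier mapped to 2 kHz) are illustrative assumptions, not details from the paper.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

# Assumed parameters -- illustrative only, not from the paper.
FS = 192_000          # sample rate, high enough to capture a ~40 kHz echo
F_CARRIER = 40_000    # ultrasonic carrier frequency (Hz)
F_TARGET = 2_000      # audible frequency the carrier is shifted down to (Hz)

def ultrasound_to_audible(echo: np.ndarray, fs: int = FS) -> np.ndarray:
    """Shift an ultrasonic echo into the audible band by heterodyning:
    multiply by a local oscillator at (F_CARRIER - F_TARGET), producing
    sum and difference frequencies, then low-pass filter to keep only
    the audible difference component."""
    t = np.arange(len(echo)) / fs
    lo = np.cos(2 * np.pi * (F_CARRIER - F_TARGET) * t)  # local oscillator
    mixed = echo * lo                                     # sum + difference bands
    sos = butter(6, 8_000, btype="low", fs=fs, output="sos")
    return sosfiltfilt(sos, mixed)                        # keep difference band

# Example: a synthetic 40 kHz echo burst becomes a 2 kHz audible tone.
t = np.arange(int(0.01 * FS)) / FS
burst = np.sin(2 * np.pi * F_CARRIER * t) * np.hanning(len(t))
audible = ultrasound_to_audible(burst)
```

Under these assumptions, the echo's timing and amplitude envelope are preserved while its pitch drops into the hearing range, which is what lets a human listener (or an ML model operating in parallel) interpret the reflections.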

Keywords: human-machine cooperation; echolocation; cooperative echolocation; ultrasound; machine learning

Journal Title: IEEE Access
Year Published: 2022
