
Energy-Efficient IoT Sensor Calibration With Deep Reinforcement Learning


The modern development of ultra-durable, energy-efficient IoT communication sensors has many applications in the telecommunication and networking sectors. Calibrating sensors to reduce power usage minimizes energy consumption and improves device efficiency. Reinforcement learning (RL) has received much attention from researchers and is now widely applied across many fields to achieve intelligent automation. Although various types of sensors are widely used in IoT, little research has been conducted on resource optimization. In this research, a new approach to power conservation is explored with the help of RL to create a new generation of IoT devices with calibrated power sources that maximize resource utilization. A closed-grid, multiple-power-source control scheme for sensor resource utilization is introduced. The proposed model, using Deep Q-learning (DQN), enables IoT sensors to maximize their resource utilization. This research focuses solely on energy-efficient sensor calibration, and simulation results show promising performance of the proposed method.
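The abstract does not give the paper's state, action, or reward definitions, so the following is only an illustrative sketch of the general idea: an RL agent learns which transmit-power level to assign a sensor given its (discretized) battery state. The paper uses a Deep Q-Network; this sketch substitutes a tabular Q-learning update (the same Bellman target, without a neural approximator) so it stays self-contained. The power levels, battery states, environment dynamics, and reward trade-off are all hypothetical.

```python
import random

POWER_LEVELS = [0, 1, 2]       # low / medium / high transmit power (assumed)
BATTERY_STATES = [0, 1, 2, 3]  # discretized battery charge; 0 = depleted (assumed)

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2  # learning rate, discount, exploration

# Q-table: Q[state][action] (the paper replaces this with a neural network)
Q = {s: {a: 0.0 for a in POWER_LEVELS} for s in BATTERY_STATES}

def step(state, action):
    """Toy environment: higher power drains the battery faster but yields
    better link quality (reward). Entirely illustrative dynamics."""
    drain = action + (1 if random.random() < 0.3 else 0)
    next_state = max(0, state - drain)
    # Hypothetical reward: trade link quality against remaining energy.
    reward = action * 1.0 + next_state * 0.5
    return next_state, reward

def train(episodes=500):
    random.seed(0)
    for _ in range(episodes):
        state = max(BATTERY_STATES)  # each episode starts fully charged
        while state > 0:             # episode ends when the battery is depleted
            # Epsilon-greedy action selection
            if random.random() < EPSILON:
                action = random.choice(POWER_LEVELS)
            else:
                action = max(Q[state], key=Q[state].get)
            next_state, reward = step(state, action)
            # Q-learning (Bellman) update toward reward + gamma * max_a' Q(s', a')
            best_next = max(Q[next_state].values())
            Q[state][action] += ALPHA * (reward + GAMMA * best_next - Q[state][action])
            state = next_state
    return Q
```

In a DQN version, the Q-table lookup becomes a forward pass through a network and the update becomes a gradient step on the same temporal-difference error, which is what lets the method scale to the continuous sensor readings a real deployment would see.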

Keywords: iot; sensor calibration; energy efficient; energy

Journal Title: IEEE Access
Year Published: 2020

