An Enhanced Floating Gate Memory for the Online Training of Analog Neural Networks

Floating gate (FG) memory has a long erasing time, which limits its application as an electronic synapse in online training. This paper proposes a novel enhanced floating gate memory (EFM), studied by TCAD simulation; three other structures are simulated for comparison. The simulation results show that the erasing time of the EFM is about 34 ns, whereas the other three structures need more than 1.8 ms, which makes the operation speed of long-term potentiation (LTP) more symmetrical with that of long-term depression (LTD). In addition, both LTP and LTD are approximately linear in the simulation results. The speed, linearity, and symmetry of the weight update are key to the online training of analog neural networks. These results indicate a potential application of the EFM in analog neuro-inspired computing.
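
To illustrate why the abstract emphasizes linearity and LTP/LTD symmetry, below is a minimal sketch (not taken from the paper) using a commonly cited phenomenological model of analog-synapse conductance updates. All parameter values, the normalized conductance range, and the pulse counts are assumptions chosen only for illustration.

```python
# Minimal sketch (not from the paper): a phenomenological model of analog-synapse
# conductance updates, illustrating how nonlinearity makes LTP and LTD asymmetric.
# All parameter values are assumptions for illustration only.
import numpy as np

G_MIN, G_MAX = 0.0, 1.0      # normalized conductance range (assumed)
N_PULSES = 64                # potentiation / depression pulses per sweep (assumed)

def ltp_step(g, nonlinearity):
    """One potentiation pulse; nonlinearity = 0 gives a perfectly linear update."""
    step = (G_MAX - G_MIN) / N_PULSES
    return g + step * np.exp(-nonlinearity * (g - G_MIN) / (G_MAX - G_MIN))

def ltd_step(g, nonlinearity):
    """One depression pulse; mirrors ltp_step, saturating toward G_MIN."""
    step = (G_MAX - G_MIN) / N_PULSES
    return g - step * np.exp(-nonlinearity * (G_MAX - g) / (G_MAX - G_MIN))

def weight_update_curves(nonlinearity_p, nonlinearity_d):
    """Return the LTP and LTD conductance trajectories for the given nonlinearities."""
    g, ltp = G_MIN, []
    for _ in range(N_PULSES):
        g = min(ltp_step(g, nonlinearity_p), G_MAX)
        ltp.append(g)
    ltd = []
    for _ in range(N_PULSES):
        g = max(ltd_step(g, nonlinearity_d), G_MIN)
        ltd.append(g)
    return np.array(ltp), np.array(ltd)

# A nearly linear, symmetric device (the behavior the EFM targets) vs. a strongly
# nonlinear, asymmetric one: asymmetry is measured here as the mean gap between
# the LTP curve and the reversed LTD curve.
for label, (nl_p, nl_d) in {"near-linear": (0.1, 0.1), "nonlinear": (4.0, 1.0)}.items():
    ltp, ltd = weight_update_curves(nl_p, nl_d)
    asym = np.mean(np.abs(ltp - ltd[::-1]))
    print(f"{label:12s} LTP/LTD asymmetry = {asym:.3f}")
```

Under these assumptions, the near-linear device tracks the ideal additive weight update closely in both directions, while the nonlinear, asymmetric device produces conductance steps that depend on the current state and direction, which is what degrades online training accuracy when such a device is used as an electronic synapse.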

Keywords: gate memory; analog; floating gate; enhanced floating; online training

Journal Title: IEEE Journal of the Electron Devices Society
Year Published: 2020
