Smart materials make smarter circuits

One of the several ways in which brains are not like ordinary computers is that their hardware is not fixed. While computational neural networks typically adapt the strengths of the connections between their nodes in response to training data, biological neural networks have considerably more scope for plasticity, including the ability to grow new neurons and to rearrange the physical connections between them. It is still unclear how our own cognitive flexibility flows from the brain’s reconfigurable adroitness, but our capacity to process and learn from a mass of unstructured input data far exceeds anything yet demonstrated for neural networks made from silicon.

Hence the interest in so-called neuromorphic computing, which attempts to translate the characteristics of biological information processing more directly into artificial systems. A big enough digital logic circuit should in principle be able to simulate more or less anything, which is what most efforts towards ‘whole brain simulation’ are counting on. But while such efforts rest on the assumption of functionalism, which asserts the substrate-independence of computational or cognitive algorithms, there could be advantages to building the requisite computational versatility and adaptability into the material hardware of the system: the medium, you might say, becomes an aspect of the message.

That is the philosophy behind a new study describing a dynamic neuromorphic platform for neural-network-type processing in which the devices themselves can be given different functions via electronic transformations applied to their material substrate. Achieving such flexibility of performance in a single material has previously been one of the obstacles to realizing neuromorphic hardware. Zhang et al. 
have constructed arrays of devices that can act as resistors and capacitors, as well as analogues of neurons and synapses, all made from the same materials and all inter-convertible using applied electric fields (Science 375, 533–539; 2022). They are not silicon-based, but fabricated instead from thin (50 nm) films of the nickelate perovskite NdNiO3 (NNO). At room temperature this crystalline solid is a metallic conductor, albeit one in which the electrons move in a correlated rather than independent fashion. The conductivity can be modified, however, by doping the material with hydrogen using a catalytic process, which creates mobile hydrogen ions (protons) distributed in the lattice. Electrically induced redistribution of these protons can alter the conductivity so as to switch the two-terminal devices into metastable resistive or capacitive states (so that they can act as memory devices), give them a threshold-based ‘firing’ ability to discharge pulses (like a neuron), or give them a nonlinear current–voltage characteristic that mimics a synapse. Crucially, this switching is quick, because of the relatively fast proton migration.

The researchers deposit the NNO film on a lanthanum aluminate substrate, which would be compatible with standard fabrication methods used in the semiconductor industry (a silicon substrate also works), and they control the proton distributions, and thus the device characteristics, with positive and negative pulses applied to gold and palladium electrodes. They use optical and Raman spectroscopy to identify and characterize the different functional states, finding that the proton gradient near the palladium electrode is the crucial parameter for achieving multi-functionality. Each device can typically be updated using no more energy than is needed to reset a synaptic junction in the brain. Zhang et al. demonstrate that their device arrays can be used for standard machine-learning tasks such as numeral recognition.
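The threshold-based ‘firing’ behaviour described above can be pictured with a toy leaky-integrator model. This is purely an illustrative sketch: the dynamics, parameters and function names below are assumptions made for exposition, not the device physics reported by Zhang et al.

```python
# Illustrative sketch only: a minimal threshold-'firing' element that
# integrates its input until a threshold is crossed, then discharges a
# pulse and resets. All parameters here are toy assumptions.

def fire_history(inputs, threshold=1.0, leak=0.9):
    """Accumulate a leaky internal state; emit a pulse (1) and reset
    whenever the state crosses the threshold, otherwise emit 0."""
    state, pulses = 0.0, []
    for x in inputs:
        state = leak * state + x      # leaky integration of the input
        if state >= threshold:        # threshold reached: 'fire'
            pulses.append(1)
            state = 0.0               # the discharge resets the element
        else:
            pulses.append(0)
    return pulses

print(fire_history([0.4, 0.4, 0.4, 0.4, 0.4]))  # -> [0, 0, 1, 0, 0]
```

Each input step adds to an internal state that slowly leaks away; a pulse is emitted and the state resets once the threshold is crossed, much as a biological neuron integrates inputs before spiking.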
Strikingly, the array was able to apportion its resources so as to handle the task most efficiently. When trained on the numerals 0 to 4, it spontaneously used a smaller network of devices than when confronted with all ten numerals. And when the larger network trained on 0–9 was given only 0–4 again, it deactivated some nodes and shrank to reflect the decreased ‘cognitive’ load. This ability to allocate resources according to the task, to ‘grow when required’, is another neuromorphic feature that static networks do not possess. Such features show the benefits of a broader philosophy in recent materials design: to relocate the ‘smartness’ of some structures into the fabric itself.
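The complementary ‘shrink when not required’ behaviour can be caricatured in a few lines: after training on the smaller task, units whose average activity falls below some cutoff are treated as deactivated, so the active network contracts to match the task. This is a hypothetical sketch of the general idea only; the function, the cutoff and the data are invented for illustration and do not reproduce the authors’ method.

```python
# Toy sketch of shrinking a network to fit a smaller task: units whose
# average activity falls below a cutoff are treated as deactivated.
# All names and numbers here are illustrative assumptions.
import numpy as np

def active_units(activations, cutoff=0.05):
    """Return indices of units whose mean absolute activation exceeds
    the cutoff; the remaining units are treated as deactivated."""
    mean_act = np.abs(activations).mean(axis=0)  # per-unit average activity
    return np.flatnonzero(mean_act > cutoff)

rng = np.random.default_rng(0)
# 100 samples, 8 hidden units; units 5-7 barely respond to the smaller task
acts = rng.uniform(0.2, 1.0, size=(100, 8))
acts[:, 5:] *= 0.01                              # near-silent units
print(active_units(acts))                        # -> [0 1 2 3 4]
```

In the toy example, the three near-silent units fall below the cutoff and drop out of the active set, leaving a smaller effective network for the reduced task.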


Journal Title: Nature Materials
Year Published: 2022

