Researchers develop brain-like transistor

Friday, 05 January, 2024

Researchers have developed a new synaptic transistor that is capable of higher-level thinking, akin to the human brain.

Designed by researchers at Northwestern University, Boston College and the Massachusetts Institute of Technology (MIT), the device processes and stores information like the human brain. Experiments demonstrate that the transistor goes beyond simple machine-learning tasks, such as categorising data, and is capable of performing associative learning.

While previous studies have leveraged similar strategies to develop brain-like computing devices, those transistors could only function at cryogenic temperatures. The new device, however, is stable at room temperature, operates at fast speeds, consumes very little energy and retains stored information even when power is removed. The research findings have been published in the journal Nature.

Mark C. Hersam, who co-led the research, said the brain has a fundamentally different architecture from a digital computer. “In a digital computer, data move back and forth between a microprocessor and memory, which consumes a lot of energy and creates a bottleneck when attempting to perform multiple tasks at the same time. On the other hand, in the brain, memory and information processing are co-located and fully integrated, resulting in orders-of-magnitude higher energy efficiency. Our synaptic transistor similarly achieves concurrent memory and information processing functionality to more faithfully mimic the brain,” Hersam said.

Recent advances in artificial intelligence (AI) have motivated researchers to develop computers that operate more like the human brain. Conventional digital computing systems have separate processing and storage units, causing data-intensive tasks to consume large amounts of energy. With smart devices continuously collecting vast quantities of data, researchers are looking for new ways to process it all without consuming an increasing amount of power.

Hersam said that for many decades, the paradigm in electronics has been to build everything out of transistors and use the same silicon architecture. Much progress has been made by packing more and more transistors into integrated circuits, but this consumes a lot of power, particularly in the current era of big data where digital computing is on track to overwhelm the grid. “We have to rethink computing hardware, especially for AI and machine-learning tasks,” Hersam said.

Hersam and his team explored new advances in the physics of moiré patterns, a type of geometrical design that arises when two patterns are layered on top of one another. When two-dimensional materials are stacked, new properties emerge that do not exist in one layer alone. And when those layers are twisted to form a moiré pattern, the electronic properties become tunable.

For the new device, the researchers combined two types of atomically thin materials: bilayer graphene and hexagonal boron nitride. When stacked and twisted, the materials formed a moiré pattern. By rotating one layer relative to another, the researchers achieved different electronic properties in each graphene layer even though they are separated by only atomic-scale dimensions. With the right choice of twist, the researchers harnessed moiré physics for neuromorphic functionality at room temperature.

“With twist as a new design parameter, the number of permutations is vast. Graphene and hexagonal boron nitride are very similar structurally but just different enough that you get exceptionally strong moiré effects,” Hersam said.

Image caption: A schematic showing the different layers within the new technology. Image credit: Mark C. Hersam/Northwestern University

To test the transistor, the researchers trained it to recognise similar — but not identical — patterns. First, the researchers showed the device one pattern: 000 (three zeroes in a row). Then, they asked the device to identify similar patterns, such as 111 or 101. “If we trained it to detect 000 and then gave it 111 and 101, it knows 111 is more similar to 000 than 101. 000 and 111 are not exactly the same, but both are three digits in a row. Recognising that similarity is a higher-level form of cognition known as associative learning,” Hersam said.

In experiments, the synaptic transistor recognised similar patterns, displaying its associative memory. Even when the researchers threw curveballs — like giving it incomplete patterns — it still successfully demonstrated associative learning.

  • All content Copyright © 2024 Westwick-Farrow Pty Ltd