Researchers put a new spin on machine learning

Wednesday, 10 January, 2024

Researchers at Tohoku University have demonstrated a proof-of-concept energy-efficient computer compatible with current AI. It exploits the stochastic behaviour of nanoscale spintronics devices and is suited to probabilistic computation problems such as inference and sampling.

With the slowing down of Moore’s Law, there has been an increasing demand for domain-specific hardware. Probabilistic computers with naturally stochastic building blocks (probabilistic bits, or p-bits) are a representative example, owing to their potential to efficiently address a variety of computationally hard tasks in machine learning (ML) and artificial intelligence (AI). Much as quantum computers are suited to inherently quantum problems, room-temperature probabilistic computers are suited to intrinsically probabilistic algorithms, which are used for training machines and for computationally hard problems in optimisation and sampling.
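The p-bit concept described above can be illustrated with a short simulation. This is not code from the project, just a minimal sketch under the standard assumption that a p-bit outputs a binary value whose probability follows a sigmoid of its input; the function name and parameters are illustrative.

```python
import math
import random

def p_bit(input_current, beta=1.0, rng=random.random):
    """A p-bit fluctuates between +1 and -1; its input biases the odds:
    P(m = +1) = sigmoid(beta * input_current). With zero input it is an
    unbiased coin flip; a strong input pins it to one state."""
    p = 1.0 / (1.0 + math.exp(-beta * input_current))
    return 1 if rng() < p else -1

# Strong positive input: the p-bit is almost always +1.
samples = [p_bit(5.0) for _ in range(10000)]
mean = sum(samples) / len(samples)
```

Hardware p-bits realise this sampling physically (here, through thermally fluctuating magnetic tunnel junctions) rather than through a pseudo-random number generator, which is the source of the claimed efficiency.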

The researchers showed that robust and fully asynchronous (clockless) probabilistic computers can be realised at scale using a probabilistic spintronic device, the stochastic magnetic tunnel junction (sMTJ), interfaced with powerful field-programmable gate arrays (FPGAs). Until now, sMTJ-based probabilistic computers have only been capable of implementing recurrent neural networks.

“As feedforward neural networks underpin most modern AI applications, augmenting probabilistic computers toward this direction should be a pivotal step to hit the market and enhance the computational capabilities of AI,” said Professor Kerem Camsari, the Principal Investigator of the project.

In the recent development, the researchers made two advances. First, they leveraged earlier work by the Tohoku University team on stochastic magnetic tunnel junctions at the device level to demonstrate the fastest p-bits to date at the circuit level, using in-plane sMTJs. Second, by enforcing an update order at the computing hardware level and exploiting layer-by-layer parallelism, they demonstrated the basic operation of a Bayesian network as an example of a feedforward stochastic neural network.
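The enforced update order mentioned above can be sketched as ancestral sampling of a feedforward stochastic network: each layer is sampled only after the previous layer is fixed, and nodes within a layer are conditionally independent given their parents, so they can be updated in parallel. This is a hedged illustration of the general technique, not the team's implementation; the weights, biases and function names are hypothetical.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sample_feedforward(layer_weights, layer_biases, root=1, rng=random.random):
    """Ancestral sampling with an enforced update order: layer k is
    sampled only once layer k-1 is fixed. Nodes within one layer are
    conditionally independent given the previous layer, so a hardware
    realisation can update them simultaneously."""
    state = [root]  # hypothetical root node, clamped to a known value
    for W, b in zip(layer_weights, layer_biases):
        # Each row of W holds one node's weights to its parent layer.
        state = [
            1 if rng() < sigmoid(bj + sum(w * s for w, s in zip(row, state))) else 0
            for row, bj in zip(W, b)
        ]
    return state
```

Because samples flow strictly forward, no global clock is needed to keep the network consistent, which is what makes the clockless operation described in the article possible for feedforward topologies.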

Professor Shunsuke Fukami from Tohoku University said that while the current demonstrations are small-scale, the designs can be scaled up using CMOS-compatible magnetic RAM technology. This could enable advances in machine learning applications while unlocking the potential for efficient hardware realisation of deep and convolutional neural networks.

  • All content Copyright © 2024 Westwick-Farrow Pty Ltd