Researchers have shown that artificial intelligence can be performed using tiny nanomagnets that interact like neurons in the brain.
The new method, developed by a team led by Imperial College London researchers, could significantly reduce the energy consumption of artificial intelligence (AI), whose global energy demand is currently doubling every 3.5 months.
In a paper published today in Nature Nanotechnology, the international team produced the first proof that networks of nanomagnets can be used to perform AI-like processing. The researchers showed that nanomagnets can be used for "time-series prediction" tasks, such as predicting and regulating insulin levels in diabetic patients.
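To make "time-series prediction" concrete, here is a minimal software sketch of reservoir computing, the paradigm in which physical systems like nanomagnet arrays are typically used for such tasks. The random reservoir below stands in for the physical array; the signal, sizes, and parameters are all illustrative, not taken from the study.

```python
import numpy as np

# Reservoir-computing sketch: a fixed random "reservoir" (standing in for
# the physical nanomagnet array) transforms the input signal, and only a
# simple linear readout is trained.
rng = np.random.default_rng(42)

N = 50                                           # reservoir size
W_in = rng.uniform(-0.5, 0.5, size=N)            # fixed input weights
W = rng.normal(0, 1, size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale for stable dynamics

u = np.sin(0.2 * np.arange(400))                 # toy signal to predict one step ahead
states = np.zeros((len(u), N))
x = np.zeros(N)
for t, ut in enumerate(u):
    x = np.tanh(W @ x + W_in * ut)               # reservoir update
    states[t] = x

# Train a linear readout on the first 300 steps to predict u[t+1] from state[t].
A, y = states[:300], u[1:301]
w_out = np.linalg.lstsq(A, y, rcond=None)[0]

pred = states[300:-1] @ w_out                    # one-step predictions, held-out data
err = np.mean((pred - u[301:]) ** 2)
```

The key point, mirrored in the physical device: the reservoir itself is never trained, so replacing the software reservoir with real magnet dynamics leaves only the cheap linear readout to compute.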
AI that uses "neural networks" aims to replicate the way parts of the brain work, where neurons talk to each other to process and retain information. Much of the mathematics that powers neural networks was originally invented by physicists to describe the way magnets interact, but at the time it was too difficult to use the magnets themselves, as researchers did not know how to put data in and get information out.
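The shared mathematics can be illustrated with a toy Hopfield network: a neural network whose energy function is exactly the one physicists wrote down for interacting magnets (the Ising model). This is a generic textbook sketch, not the team's system; the stored pattern and network size are arbitrary.

```python
import numpy as np

# Toy Hopfield network. Its energy, E = -1/2 * sum_ij w_ij s_i s_j, is the
# Ising-magnet energy: neurons play the role of spins (+1 / -1).
rng = np.random.default_rng(0)

pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])  # pattern to "store"
W = np.outer(pattern, pattern).astype(float)      # Hebbian weights
np.fill_diagonal(W, 0.0)                          # no self-coupling

def energy(s):
    return -0.5 * s @ W @ s

# Start from a corrupted copy of the pattern (two "spins" flipped).
state = pattern.copy()
state[[0, 3]] *= -1

# Asynchronous updates: each neuron/spin aligns with its local field, which
# can only lower the energy, so the network relaxes back to the pattern.
for _ in range(5):
    for i in rng.permutation(len(state)):
        state[i] = 1 if W[i] @ state >= 0 else -1
```

Each update is just a spin aligning with the field produced by its neighbours, which is why the same equations describe both a magnet relaxing and a network recalling a memory.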
Instead, software running on conventional silicon-based computers was used to simulate the magnet interactions, and thereby simulate the brain. The team has now been able to use the magnets themselves to process and store data, cutting out the software middleman and promising potentially huge energy savings.
Nanomagnets can take different "states" depending on their orientation. Applying a magnetic field to an array of nanomagnets changes the states of the magnets according to the properties of the input field and the states of the surrounding magnets.
The team, led by researchers from Imperial's Department of Physics, was then able to devise a technique for counting the number of magnets in each state once the field has passed through, which gives the "answer".
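The drive-then-count idea can be sketched in a few lines. The toy model below is purely illustrative and not the team's actual protocol: a chain of binary "magnets" is driven by a sequence of field values, each magnet switching only when the input field plus its neighbours' influence is strong enough, and the readout is simply the count of magnets in each state afterwards.

```python
import numpy as np

# Illustrative sketch (not the published protocol): drive a 1-D chain of
# binary "nanomagnets" with an input field sequence, then read out the
# result by counting how many magnets end up in each state.
rng = np.random.default_rng(1)

def drive(spins, fields, coupling=0.5, threshold=1.0):
    spins = spins.copy()
    for h in fields:                      # one field value per time step
        neighbour = np.roll(spins, 1) + np.roll(spins, -1)
        local = h + coupling * neighbour  # input field + neighbour influence
        flip = np.abs(local) > threshold  # only strong local fields switch a magnet
        spins[flip] = np.sign(local[flip]).astype(int)
    return spins

spins = rng.choice([-1, 1], size=16)      # random initial magnet states
final = drive(spins, fields=[-1.2, 0.3, -0.8])

# The readout: state counts after the field has passed through.
n_up = int(np.sum(final == 1))
n_down = int(np.sum(final == -1))
```

Different input sequences leave different state counts behind, which is what makes the counts usable as the output of a computation.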
Dr Jack Gartside, co-author of the study, said: "For a long time we have been trying to work out how to put data in, ask a question, and get an answer out of magnetic computing. Solving this opens the way to eliminating the computer software that performs the energy-hungry simulations."
Co-author Kilian Stenning added: "How the magnets interact gives us all the information we need; the laws of physics themselves become the computer."
Team leader Dr Will Branford said: "It has been a long-term goal to realise hardware based on the software algorithms of Sherrington and Kirkpatrick. It was not possible to use the spins on individual atoms in conventional magnets, but by scaling the spins up to nanopatterned arrays we were able to achieve the necessary control and readout."
Reducing energy consumption
AI is now used in a variety of settings, from voice recognition to self-driving cars, but training it to perform even relatively simple tasks can require huge amounts of energy. For example, training an AI to solve a Rubik's Cube took the energy equivalent of two nuclear power plants running for an hour.
In conventional silicon-chip computers, much of this energy is wasted on the inefficient movement of electrons during processing and memory storage. Nanomagnets, however, do not rely on the physical transport of particles such as electrons; instead, they process and transfer information in the form of a "magnon" wave, in which each magnet affects the state of its neighbours.
This means much less energy is lost, and processing and storing data can happen together rather than as separate processes, as in conventional computers. This innovation could make nanomagnetic computing up to 100,000 times more efficient than conventional computing.
The team will next teach the system using real-world data, such as ECG signals, with the aim of turning it into a genuine computing device. Eventually, magnetic systems could be integrated into conventional computers to improve the energy efficiency of intensive processing tasks.
Their energy efficiency also means they could feasibly be powered by renewable energy and used for "edge AI", where data is processed where it is collected, such as at weather stations in Antarctica, rather than being sent to large data centres.
It also means they could be used in wearable devices to process biometric data from the body, such as predicting and regulating insulin levels in people with diabetes, or detecting abnormal heartbeats.