MIT's New AI Chip Is 1 Million Times Faster Than the Synapses in the Human Brain

A team of researchers at the Massachusetts Institute of Technology (MIT) has been working to push the speed limits of a previously developed human-made analog synapse, a device that is cheaper to build, more energy-efficient, and promises faster computation.

The multidisciplinary team used programmable resistors, the central building blocks of "analog deep learning", which play the same role that transistors play in digital processors.

The resistors are built into repeating arrays to create a complex, layered network of artificial "neurons" and "synapses" that execute computations just like a digital neural network. Such a network can then be trained to achieve complex AI tasks such as image recognition and natural language processing.
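The computation such an array performs can be sketched in a few lines. In a crossbar of programmable resistors, each conductance acts as a synaptic weight: input voltages drive the rows, and by Ohm's and Kirchhoff's laws the current collected on each column is a weighted sum of the inputs, i.e. a matrix-vector product in a single analog step. The code below is an illustrative model only; the variable names and sizes are assumptions, not from the study.

```python
import numpy as np

# Each conductance G[i, j] (in siemens) plays the role of a synaptic weight.
rng = np.random.default_rng(0)
G = rng.uniform(0.0, 1.0, size=(4, 3))  # 4 input rows, 3 output columns
v = np.array([0.5, -0.2, 0.8, 0.1])     # input voltages on the rows

# Column currents: each output is sum_i G[i, j] * v[i],
# the analog multiply-accumulate a digital neural-network layer computes.
I = G.T @ v
print(I)  # one current per output column
```

In hardware, all of these multiply-accumulates happen simultaneously in the physics of the array, which is why analog deep learning can be so fast and energy-efficient compared with shuttling weights through a digital processor.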

The researchers used a practical inorganic material in the fabrication process that enables their devices to run 1 million times faster than previous versions; the study claims this also makes them about 1 million times faster than the synapses in the human brain.

Additionally, this inorganic material makes the resistor extremely energy-efficient. Unlike the materials used in earlier versions of their device, the newly developed material is compatible with silicon fabrication techniques and could pave the way for integration into commercial computing hardware for deep-learning applications.

"With that key insight, and the very powerful nanofabrication techniques we have at MIT.nano, we have been able to put these pieces together and demonstrate that these devices are intrinsically very fast and operate with reasonable voltages. This work has really put these devices at a point where they now look really promising for future applications," said senior author Jesús A. del Alamo, the Donner Professor in MIT's Department of Electrical Engineering and Computer Science (EECS).

"The working mechanism of the device is electrochemical insertion of the smallest ion, the proton, into an insulating oxide to modulate its electronic conductivity. Because we are working with very thin devices, we could accelerate the motion of this ion by using a strong electric field, and push these ionic devices to the nanosecond operation regime," explained senior author Bilge Yildiz, the Breene M. Kerr Professor in the departments of Nuclear Science and Engineering and Materials Science and Engineering.

"The action potential in biological cells rises and falls with a timescale of milliseconds, since the voltage difference of about 0.1 volt is constrained by the stability of water," said senior author Ju Li, the Battelle Energy Alliance Professor of Nuclear Science and Engineering and professor of materials science and engineering. "Here we apply up to 10 volts across a special solid glass film of nanoscale thickness that conducts protons, without permanently damaging it. And the stronger the field, the faster the ionic devices," he added.
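A quick back-of-the-envelope calculation shows why the nanoscale film matters. The article quotes up to 10 volts across a proton-conducting glass film of nanoscale thickness; the 10 nm film and 5 nm membrane figures below are illustrative assumptions, not values from the source.

```python
# Electric field = voltage / thickness.
voltage = 10.0        # volts, from the article
thickness = 10e-9     # metres; assumed film thickness for illustration
field = voltage / thickness
print(f"device field: {field:.1e} V/m")

# Biological comparison: ~0.1 V across an assumed ~5 nm cell membrane.
bio_field = 0.1 / 5e-9
print(f"membrane field: {bio_field:.1e} V/m")
```

Under these assumptions the device sees a field on the order of 10^9 V/m, tens of times stronger than a cell membrane's, which is the "stronger field, faster device" trade-off Li describes: water-based biology cannot tolerate such voltages, but the solid glass film can.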

These programmable resistors significantly increase the speed at which a neural network is trained, while considerably reducing the cost and energy of training.

The latest development could help scientists develop deep learning models much faster, which could then be applied in uses such as self-driving cars, fraud detection, and medical image analysis.

"Once you have an analog processor, you will no longer be training networks everyone else is working on. You will be training networks with unprecedented complexities that no one else can afford to, and therefore vastly outperform them all. In other words, this is not a faster car, this is a spacecraft," adds lead author and MIT postdoc Murat Onen.

The findings of the research were published in the journal Science.

Kavita Iyer