
Future chip architecture will revolutionise AI-powered data handling

IBM's new computer architecture will be better equipped to handle data load from AI

Unlike conventional computers, brain-inspired computers could have coexisting processing and memory units, the researchers propose | Image for representation

Scientists at IBM are developing a new computer architecture, better equipped to handle increased data loads from artificial intelligence (AI).

The designs, published in the Journal of Applied Physics, draw on concepts from the human brain and significantly outperform conventional computers in comparative studies.

Today's computers are built on the von Neumann architecture, developed in the 1940s. Such computing systems feature a central processor that executes logic and arithmetic, a memory unit, storage, and input and output devices.

In contrast to the siloed components of conventional computers, the researchers propose brain-inspired computers in which processing and memory units coexist.

Abu Sebastian, a scientist at International Business Machines (IBM) in the US, explained that executing certain computational tasks within the computer's memory would increase the system's efficiency and save energy.

"If you look at human beings, we compute with 20 to 30 watts of power, whereas AI today is based on supercomputers which run on kilowatts or megawatts of power," Sebastian said.

"In the brain, synapses are both computing and storing information. In a new architecture, going beyond von Neumann, memory has to play a more active role in computing," he said.

The IBM team drew on three different levels of inspiration from the brain. The first level exploits a memory device's state dynamics to perform computational tasks in the memory itself, similar to how the brain's memory and processing are co-located.
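One way to picture this first level is a phase-change cell whose conductance drifts upward with each programming pulse, so the cell itself accumulates a count without the data ever leaving memory. The following is a minimal, illustrative sketch; the class name, pulse step, and noise figures are assumptions for illustration, not IBM's actual device model.

```python
import random

random.seed(0)  # make the stochastic device behaviour repeatable

class PCMCell:
    """Toy phase-change memory cell that computes by accumulating state."""

    def __init__(self):
        self.conductance = 0.0            # arbitrary units; starts amorphous

    def pulse(self):
        """One SET pulse partially crystallises the cell, raising conductance."""
        noise = random.gauss(0.0, 0.02)   # real devices are stochastic
        self.conductance = min(self.conductance + 0.1 + noise, 10.0)

    def read_count(self):
        """Recover the number of pulses from the conductance alone."""
        return round(self.conductance / 0.1)

cell = PCMCell()
for _ in range(25):
    cell.pulse()
print(cell.read_count())  # close to 25, up to device noise
```

The point of the sketch is that the "computation" (counting) happens in the memory element's own state dynamics; the processor only issues pulses and one final read.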

The second level draws on the brain's synaptic network structures as inspiration for arrays of phase change memory (PCM) devices to accelerate training for deep neural networks.
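The crossbar idea behind this second level can be sketched numerically: weights are stored as device conductances G, inputs are applied as read voltages V, and Ohm's law plus current summation on each column line yield a full matrix-vector product in a single step. The sizes, noise level, and variable names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Conductances of a 4x3 crossbar encode a 4x3 weight matrix.
G = rng.uniform(0.0, 1.0, size=(4, 3))
# Input activations are applied as voltages on the 4 row lines.
V = rng.uniform(0.0, 1.0, size=4)

# Ideal analog result: currents summed on each of the 3 column lines,
# i.e. I = G^T V, computed by the physics rather than by the processor.
I_analog = G.T @ V

# Mimic read noise from real device non-idealities.
I_measured = I_analog + rng.normal(0.0, 0.01, size=3)

print(I_measured)
```

Because the multiply-accumulate dominates the cost of training deep networks, performing it in one analog step inside the array is where the claimed acceleration comes from.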

At the third level, the dynamic and stochastic nature of neurons and synapses inspired the team to create a powerful computational substrate for spiking neural networks, the researchers said.

Phase change memory is a nanoscale memory device built from compounds of germanium, tellurium and antimony sandwiched between electrodes.

These compounds exhibit different electrical properties depending on their atomic arrangement. For example, in a disordered phase, these materials exhibit high resistivity, whereas in a crystalline phase they show low resistivity.
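That resistivity contrast is what makes the cell a memory: the disordered (amorphous) phase reads as one logic level and the crystalline phase as the other. A toy illustration, with placeholder resistance values rather than measured device figures:

```python
# Amorphous phase: high resistivity  -> logical 0
# Crystalline phase: low resistivity -> logical 1
RESISTANCE_OHMS = {
    "amorphous": 1_000_000,
    "crystalline": 10_000,
}

THRESHOLD = 100_000  # read threshold separating the two states

def read_bit(phase: str) -> int:
    """Distinguish a stored bit purely from the cell's resistance."""
    return 1 if RESISTANCE_OHMS[phase] < THRESHOLD else 0

print(read_bit("crystalline"), read_bit("amorphous"))  # 1 0
```

Intermediate, partially crystallised states between these two extremes are what allow the same device to store analog values, as in the synaptic arrays described above.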