Thursday, July 6, 2017

IBM Builds A Scalable Computer Chip Inspired By The Human Brain



By Alex Knapp


"I’m holding in my hand a chip with one million neurons, 256 million synapses, and 4096 cores. With 5.4 billion transistors, it's the largest chip IBM has built."

Dr. Dharmendra S. Modha sounds positively giddy as he talks to me on the phone. This is the third time I've talked to him about his long-term project: SyNAPSE, an IBM effort to create an entirely new type of computer chip whose architecture is inspired by the human brain. This new chip is a major milestone in that project.

"Inspired" is the key word, though. The chip's architecture is based on the structure of our brains, but very simplified. Still, within that architecture lies some amazing advantages over computers today. For one thing, despite this being IBM's largest chip, it draws only a tiny amount of electricity - about 63 mW - a fraction of the power being drawn by the chip in your laptop.

What's more, the new chip is also scalable - making possible larger neural networks of several chips connected together. The details of the research have been published today in Science.

"In 2011, we had a chip with one core," Modha told me. "We have now scaled that to 4096 cores, while shrinking each core 15x by area and 100x by power."

Each core of the chip is modeled on a simplified version of the brain's neural architecture. The core contains 256 "neurons" (processors), 256 "axons" (memory) and 65,536 "synapses" (connections between neurons and axons). This structure is a radical departure from the von Neumann architecture that's the basis of virtually every computer today (including the one you're reading this on).
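To make that architecture concrete, here is a minimal sketch in Python of a single simplified core along the lines described above: a crossbar of binary synapses connecting 256 input axons to 256 integrate-and-fire neurons. This is an illustration only, not IBM's actual design - the threshold, leak, connectivity, and weight values are all assumptions.

```python
# A minimal, illustrative sketch of one "neurosynaptic core" -- NOT IBM's
# actual TrueNorth design. It mirrors the structure described above:
# 256 input axons, 256 integrate-and-fire neurons, and a 256x256 binary
# synapse crossbar connecting them. All parameter values are assumptions.
import numpy as np

AXONS, NEURONS = 256, 256

rng = np.random.default_rng(0)
synapses = rng.random((AXONS, NEURONS)) < 0.05   # sparse binary crossbar (assumed density)
weights = rng.integers(1, 4, NEURONS)            # per-neuron input weight (assumed)
potential = np.zeros(NEURONS)                    # membrane potentials
THRESHOLD, LEAK = 10.0, 0.5                      # firing threshold and leak (assumed)

def tick(axon_spikes: np.ndarray) -> np.ndarray:
    """Advance the core by one time step.

    axon_spikes: boolean vector of length AXONS -- which axons fired.
    Returns a boolean vector of length NEURONS -- which neurons fire.
    """
    global potential
    # Each firing axon delivers charge to every neuron it synapses onto:
    # (spikes @ crossbar) gives each neuron's total input, scaled by its weight.
    potential += (axon_spikes.astype(float) @ synapses) * weights
    potential = np.maximum(potential - LEAK, 0.0)  # constant leak toward rest
    fired = potential >= THRESHOLD
    potential[fired] = 0.0                         # reset neurons that spiked
    return fired

# Drive the core with random input spikes for a few time steps.
for step in range(5):
    out = tick(rng.random(AXONS) < 0.2)
    print(f"step {step}: {out.sum()} neurons fired")
```

On real hardware the output spikes would be routed to the axons of other cores rather than printed, which is what makes the design scalable across a network of chips.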

Work on this project began in 2008 as a collaboration between IBM and several universities. The project has received $53 million in funding from the Defense Advanced Research Projects Agency (DARPA). The first prototype chip was developed in 2011, and a programming language and development kit were released in 2013.

"This new chip will provide a powerful tool to researchers who are studying algorithms that use spiking neurons," Dr. Terrence J. Sejnowski told me. Sejnowski heads Computational Neurobiology Laboratory at the Salk Institute. He's unaffiliated with IBM's project but is familiar with the technology. "We know that such algorithms exist because the brain uses spiking neurons and can outperform all existing approaches, with a power budget of 20 watts, less than your laptop."

It's important to note, though, that the SyNAPSE system won't replace the computers of today - rather, it's intended to supplement them. Modha likened it to the co-processors used in high-performance computers to help them crunch data faster. Or, in a more poetic turn as he continued talking to me, he called SyNAPSE a "right-brained" computer compared to the "left-brained" architecture used in computers today.

"Current von Neumann machines are fast, symbolic, number-crunchers," he said. "SyNAPSE is slow, multi-sensory, and better at recognizing sensor data in real-time."

So to crunch big numbers and do heavy computational lifting, we'll still need conventional computers. Where these "cognitive" computers come in is in analyzing and discerning patterns in that data. Key applications include visual recognition of patterns - something that Dr. Modha notes would be very useful for applications such as driverless cars.

As Sejnowski told me, "The future is finding a path to low power computing that solves problems in sensing and moving -- what we do so well and digital computers do so awkwardly."

And that's what IBM is looking to do with SyNAPSE - finding the patterns that normal computers can't. As Modha put it, "Google Maps can plot your route, but SyNAPSE can see if there's a pothole."

What gives SyNAPSE an advantage in pattern recognition is that, unlike a traditional computer, which crunches data sequentially, its brain-inspired architecture allows for parallel processing. For example, in a facial recognition app, one core of the chip might focus on nose shape, one on hair texture and color, one on eye color, and so on. Each individual core is slower than a traditional processor, but because they all run simultaneously, the chip as a whole can perform this type of operation much more quickly and accurately.
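Here is a rough sketch, in ordinary Python rather than SyNAPSE code, of how such per-feature "cores" might divide up a recognition task. The feature detectors, their names, and the scoring logic are all invented for illustration; real detectors would do actual image analysis.

```python
# An illustrative sketch of the parallelism described above, using ordinary
# Python rather than real SyNAPSE hardware. Each "core" scores one invented
# facial feature independently; running them concurrently mimics how the
# chip's cores operate in parallel on the same input. All names are hypothetical.
from concurrent.futures import ThreadPoolExecutor

def score_nose_shape(image):   return 0.7   # stand-in for a real detector
def score_hair_texture(image): return 0.9
def score_eye_color(image):    return 0.6

FEATURE_CORES = [score_nose_shape, score_hair_texture, score_eye_color]

def recognize(image):
    # Dispatch every feature "core" at once -- each runs independently,
    # like the chip's cores -- then combine the evidence.
    with ThreadPoolExecutor(max_workers=len(FEATURE_CORES)) as pool:
        scores = list(pool.map(lambda core: core(image), FEATURE_CORES))
    return sum(scores) / len(scores)   # naive combination of the votes

print(recognize(image=None))  # placeholder input
```

The point of the sketch is the division of labor: no single worker is fast, but the answer emerges from many slow, specialized units voting at once.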

Other potential applications for the chip include use in cameras to automatically identify interesting items in cluttered environments. Modha's team also believes that the chip could be quite useful in natural language processing - being able to parse out and obey commands from people. Kind of like the computers on Star Trek that understood when they were in use and when people were just talking among themselves.

It probably won't be long before we see more of these applications in action. The scalable chip that IBM developed was built using the same conventional fabrication techniques as other chips - it just requires a somewhat different workflow.

Already, over 200 programs have been developed for the chip, thanks to a simulation of the architecture running on supercomputers at the Lawrence Livermore and Lawrence Berkeley National Laboratories. Those simulations allowed IBM to develop a programming language for the chip even before it existed.

"We've been working with IBM for the last 18 months and are extremely impressed with their achievement," Prof. Tobi Delbruck of the Institute of Neuroinformatics at UZH-ETH Zurich told me. "Applications like real time speech and vision that run continuously on battery power may finally be within reach."

"It's too soon to say who will win the race to implement practical realizations of brain-like computing in silicon," Delbruck added. "but IBM's solution is a serious contender."

Now that this new chip architecture has been developed and a fabrication technique set up, Modha said the technology is now "like the four-minute mile. Now that someone's done it, a lot of people can do it."

To help facilitate the development of the chip, both on the hardware and software side, IBM has developed a teaching curriculum for universities, its customers, its employees, and more.

On the hardware end, Modha's next goal is the development of what he calls a "neurosynaptic supercomputer." This would be a traditional supercomputer that uses both traditional and SyNAPSE chips - a computer with both a left and right brain, as it were - enabling it both to crunch numbers and quickly analyze real-time patterns as the data's crunched.

One question that Modha couldn't answer, though, was what the new chip means for video games - nobody's programmed one for SyNAPSE yet.

"That's an interesting question," he laughed. "But we're too busy for games!"
