IBM thinks silicon chips should look more like the human brain.
Big Blue has announced the development of what it calls a “software ecosystem” for next-generation silicon chips, inspired by the architecture of the human cortex. In theory, this ecosystem will result in technology that mirrors the brain’s perceptive and cognitive functions.
“Dramatically different from traditional software, IBM’s new programming model breaks the mold of sequential operation underlying today’s von Neumann architectures and computers,” read IBM’s statement on the matter. “It is instead tailored for a new class of distributed, highly interconnected, asynchronous, parallel, large-scale cognitive computing architectures.”
As part of the project, IBM researchers have developed several pieces of software, including a digital neuron model that can process information in a “brain-like” way, and which supports a “wide range of deterministic and stochastic neural computations, codes, and behaviors.” A network of these synthetic neurons could sense, remember, and act upon environmental data.
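The article doesn't detail IBM's actual neuron model, but the description — a digital neuron that integrates inputs and supports both deterministic and stochastic behaviors — resembles a leaky integrate-and-fire unit. As a rough illustration only (the class name, parameters, and update rule below are assumptions, not IBM's design), such a neuron might look like this:

```python
import random


class DigitalNeuron:
    """Toy leaky integrate-and-fire neuron (illustrative; not IBM's model).

    Each step, the neuron integrates weighted input spikes into a membrane
    potential, leaks a fixed amount toward its resting level, and fires
    when the potential crosses a threshold. An optional Gaussian noise
    term makes the firing behavior stochastic rather than deterministic.
    """

    def __init__(self, threshold=1.0, leak=0.1, noise=0.0, seed=None):
        self.threshold = threshold
        self.leak = leak
        self.noise = noise
        self.potential = 0.0
        self._rng = random.Random(seed)

    def step(self, weighted_inputs):
        # Integrate incoming weighted spikes.
        self.potential += sum(weighted_inputs)
        # Apply leak, never dropping below the resting potential (0).
        self.potential = max(0.0, self.potential - self.leak)
        # Optional stochastic perturbation.
        if self.noise:
            self.potential += self._rng.gauss(0.0, self.noise)
        # Fire and reset when the threshold is crossed.
        if self.potential >= self.threshold:
            self.potential = 0.0
            return 1  # spike emitted
        return 0  # silent this step
```

With `noise=0.0` the neuron is fully deterministic: feeding it a steady input of 0.5 per step (against the default threshold of 1.0 and leak of 0.1) produces no spike for two steps, then a spike on the third as the accumulated potential crosses threshold.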
Those researchers are also hard at work on a massively parallel and highly scalable software simulation of a “cognitive computing architecture comprising a network of neurosynaptic cores.” There’s a growing library (referred to as a “cognitive system store”) of applications and algorithms for the ecosystem, along with a programming model intended to help researchers figure out the operation of each new corelet—i.e., a blueprint of a network of neurosynaptic cores that specifies a particular function—without needing to puzzle out the underlying mechanics.
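The key idea behind a corelet, as described above, is encapsulation: a blueprint exposes named inputs and outputs while hiding the internal wiring of its neurosynaptic cores, so corelets can be composed without understanding each other's mechanics. Here is a minimal sketch of that composition pattern — the `Corelet` class, `compose` helper, and port names are all hypothetical, invented for illustration, and bear no relation to IBM's actual programming model:

```python
class Corelet:
    """Hypothetical sketch of corelet-style composition (not IBM's API).

    A corelet wraps internal behavior behind a small interface: callers
    see only the declared input and output ports, never the internals.
    """

    def __init__(self, name, inputs, outputs, compute):
        self.name = name
        self.inputs = list(inputs)     # named input ports
        self.outputs = list(outputs)   # named output ports
        self._compute = compute        # hidden internal behavior

    def run(self, **signals):
        missing = [p for p in self.inputs if p not in signals]
        if missing:
            raise ValueError(f"{self.name}: missing inputs {missing}")
        result = self._compute(**signals)
        # Only declared outputs escape the corelet boundary.
        return {port: result[port] for port in self.outputs}


def compose(first, second, wiring):
    """Chain two corelets, mapping first's output ports to second's inputs."""
    def chained(**signals):
        mid = first.run(**signals)
        mapped = {wiring[k]: v for k, v in mid.items() if k in wiring}
        return second.run(**mapped)
    return Corelet(f"{first.name}->{second.name}",
                   first.inputs, second.outputs, chained)


# Usage: wire an edge-detecting corelet into a spike-emitting one.
edge = Corelet("edge", ["pixels"], ["edges"],
               lambda pixels: {"edges": [abs(b - a)
                                         for a, b in zip(pixels, pixels[1:])]})
detect = Corelet("detect", ["signal"], ["spike"],
                 lambda signal: {"spike": int(max(signal) > 5)})
pipeline = compose(edge, detect, {"edges": "signal"})
```

The point of the sketch is the interface discipline: `pipeline.run(pixels=...)` works without the caller ever seeing how `edge` or `detect` are wired internally, which is the property the article attributes to corelets.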
IBM is also building a teaching curriculum that will incorporate all of this—the library, programming models, simulators, and more—into one package.
The company’s long-term goal is to create a chip system composed of billions of neurons and trillions of synapses that consumes very little power. Such a system could ingest a massive amount of raw information and analyze it at blazing speed, all while leveraging a widely dispersed network of technologically advanced sensors to draw in many varieties of data.
IBM argues that “traditional” computers simply can’t meet the rising demands of Big Data—at least not in the same way as the human brain, which is capable of absorbing and interpreting enormous amounts of information while burning relatively little power and occupying a skull-sized amount of space. In late 2011, IBM first demonstrated a brain-inspired chip architecture based on neurosynaptic cores—the beginning of a lengthy, multi-phase project that has attracted funding and collaboration from DARPA and others.