Scientists from IBM have unveiled what they are calling a breakthrough software ecosystem designed for programming silicon chips that have an architecture inspired by the function, low power, and compact volume of the brain.
The technology could enable a new generation of artificial intelligence sensor networks that mimic the brain’s abilities for perception, action, and cognition.
The work is being carried out as part of SyNAPSE, a DARPA-funded program that aims to develop electronic neuromorphic machine technology that scales to biological levels.
These innovations are being presented at The International Joint Conference on Neural Networks in Dallas, TX.
Modern computing systems were designed decades ago for sequential processing according to a pre-defined program. Although they are fast and precise “number crunchers,” computers of traditional design are constrained by power and size, and their effectiveness drops when they are applied to real-time processing of the noisy, analog, voluminous big data produced by the world around us. In contrast, the brain—which operates comparatively slowly and at low precision—excels at tasks such as recognizing, interpreting, and acting upon patterns, all while consuming the same amount of power as a 20-watt light bulb and occupying the volume of a two-liter bottle.
IBM’s long-term goal is to build a chip system with ten billion neurons and one hundred trillion synapses that consumes merely one kilowatt of power and occupies less than two liters of volume.
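A quick back-of-envelope calculation (ours, not IBM's) shows what that one-kilowatt budget implies per component, using only the figures stated above:

```python
# Back-of-envelope arithmetic for IBM's stated long-term target:
# 10 billion neurons, 100 trillion synapses, ~1 kW total power.
neurons = 10e9      # ten billion neurons
synapses = 100e12   # one hundred trillion synapses
power_w = 1000.0    # one kilowatt, in watts

# Average power available per neuron and per synapse
watts_per_neuron = power_w / neurons     # 1e-7 W, i.e. 100 nanowatts
watts_per_synapse = power_w / synapses   # 1e-11 W, i.e. 10 picowatts

print(f"{watts_per_neuron * 1e9:.0f} nW per neuron")
print(f"{watts_per_synapse * 1e12:.0f} pW per synapse")
```

Roughly 100 nanowatts per neuron and 10 picowatts per synapse, averaged across the system, which is orders of magnitude below what conventional processors spend per logical operation and helps explain why a brain-inspired architecture is needed to hit these targets.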
Systems built from these chips could bring the real-time capture and analysis of many types of data closer to the point of collection. They would gather not only symbolic data, which is fixed text or digital information, but also sub-symbolic data, which is sensory-based and whose values change continuously. This raw data reflects activity of every kind in the world, from commerce, social interaction, logistics, location, and movement to environmental conditions.
Take the human eyes, for example: they sift through over a terabyte of data per day. Emulating the visual cortex, low-power, lightweight eyeglasses designed to help the visually impaired could be outfitted with multiple video and auditory sensors that capture and analyze this optical flow of data.
"We are populating the Earth and space with sensors, cameras and microphones, and then moving [the data they create] to the data center. Data is going to computation. But with a low power and brain-like capabilities of our chips, and the ability to do pattern recognition, we can move intelligent computation back to the edge," Modha said. "The sensor becomes the computer."
These sensors would gather and interpret large volumes of data to signal how many individuals are ahead of the user, the distance to an upcoming curb, the number of vehicles in a given intersection, or the height of a ceiling or length of a crosswalk. Like a guide dog, the glasses would use the sub-symbolic data they perceive to plot the safest pathway through a room or outdoor setting and help the user navigate the environment via embedded speakers or ear buds.
This same technology, at increasing levels of scale, can provide sensory-based data input and on-board analytics for automobiles, medical imagers, healthcare devices, smartphones, cameras, and robots.
To help people understand how the technology could be used, the IBM team brainstormed a selection of sample applications. Think of them as cognitive apps. In this video, Bill Risk, one of the managers of the SyNAPSE project, explains some of the apps.
By 33rd Square