COMPUTERS THAT THINK LIKE WE DO

Conventional computers, even those running artificial intelligence programs, plod through calculations by sending ones and zeros along electronic pathways, one after the other.
In contrast, the brains of mammals think and remember by activating networks of neurons, or brain cells, which then activate and communicate with similar networks across the brain.
That’s what neuromorphic computers do as well, and Intel has just made one that does it even better than its predecessors.
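The basic unit these chips emulate is a spiking neuron: it accumulates incoming signals, leaks charge over time, and fires only when its internal potential crosses a threshold, staying silent otherwise. As a rough illustration only, here is a toy "leaky integrate-and-fire" neuron in Python; it is a generic textbook model, not Intel's actual circuit design:

```python
# Toy leaky integrate-and-fire neuron -- a generic sketch of the kind of
# unit neuromorphic chips emulate (not Intel's proprietary design).
# The neuron integrates input current, leaks charge each time step, and
# emits a discrete "spike" only when its potential crosses a threshold,
# unlike a conventional CPU that processes every value on every tick.

def simulate_lif(inputs, leak=0.9, threshold=1.0):
    """Return the time steps at which the neuron spikes."""
    potential = 0.0
    spikes = []
    for t, current in enumerate(inputs):
        potential = potential * leak + current  # leak, then integrate
        if potential >= threshold:              # threshold crossed:
            spikes.append(t)                    # fire a spike...
            potential = 0.0                     # ...and reset
    return spikes

# Weak inputs need several steps to push the neuron over threshold;
# a strong input fires it at once.
print(simulate_lif([0.4, 0.4, 0.4, 0.0, 1.2]))  # → [2, 4]
```

Because such neurons do nothing between spikes, a chip built from them spends energy only when signals actually arrive, which is one reason for the efficiency figures reported below.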
Intel’s new “Loihi 2” chip holds a million artificial neurons, six times more than the company’s previous most densely populated version, on a silicon wafer the size of a small fingernail.
Those million digital neurons are linked to each other through 120 million connections. 
That leap in capacity is crucial to developing the electronic brains that eventually will enable robots to see, touch, even smell, and navigate autonomously through their surroundings.
For example, Intel’s engineers trained a first-generation Loihi chip to detect the odors of 10 hazardous chemicals, such as acetone and ammonia, in a stew of compounds.
The chip was able to correctly identify each scent after only one exposure; conventional machine learning would have required a computer to be shown 3,000 samples before gaining a perfect score, the research team calculated.
At Los Alamos National Laboratory, scientists tested Loihi’s ability to learn from its mistakes as it figured out how to identify a set of handwritten numerals. The chip accomplished the task as quickly as an artificial intelligence program but used only one-hundredth of the power the AI needed to do so.
Neuromorphic chips from BrainChip, Intel, Synsense, and others also might help scientists gain insight into the workings of the brain; unlike artificial intelligence, neuromorphic chips make visible their electronic patterns of thought and perception, enabling scientists to record the flow and pattern of signals.
TRENDPOST: Neuromorphic computing progress is gaining speed. In a new paper, Samsung engineers, working with researchers at Harvard University, have suggested a way to record electronic patterns in human brains and “paste” them directly into neuromorphic chips.
While the chips would not become self-aware, they could faithfully mimic human brain patterns, such as the ways in which sight or sound is processed.
TREND FORECAST: Clusters of neuromorphic chips will make up the brains of new generations of autonomous robots, allowing them to see, hear, smell, touch, and process at speeds approaching that of the human brain, making those robots seem more like living beings instead of machines.
