IBM Neural Processor
No computer yet approaches the storage and processing abilities of the human brain. With even the most advanced equipment on the market failing to match the intelligence of a three-year-old child, things may be about to change. IBM announced a few days ago that it has developed a neural chip capable of superior performance compared to existing microprocessors (in certain areas). Before you jump to stories like “robots will conquer us all”, I invite you to read this article in full so that you understand exactly how this new chip works.
What is a neural chip?
To fully understand what a neural chip is, one must understand how it is built. Current processors contain computing units and memory units (RAM), generally designed as independent, interchangeable, and fully replaceable parts. To create this chip, IBM studied how the human brain is organized. The brain consists of neurons linked to each other by synapses, and these synapses are responsible for both memory and thinking at the same time. Although it is not yet fully understood how the brain stores memories, this did not stop IBM’s engineering team from trying to reproduce this “supercomputer”. Using this simplified model as a starting point, IBM designed the chip by intertwining the processing units with the storage units. The processing units are linked together, like synapses, through RAM, so a computing operation automatically triggers a “memory” that relays to other computing units. Of course, the real process is more complex; it is explained here in abstract terms so that readers with no prior background in computer architecture can follow it.
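To make the idea of memory living inside the computing unit more concrete, here is a minimal toy sketch in Python. It is not IBM's actual design (the class names, weights, and thresholds are all hypothetical); it simply illustrates the principle that each "core" keeps its synaptic weights locally and that one unit's output can trigger activity in a connected unit.

```python
# Toy sketch (hypothetical, not IBM's real architecture): a "neurosynaptic"
# core whose synaptic weights are stored inside the same unit that computes,
# rather than in a separate memory bank.

class NeuroSynapticCore:
    """A toy core: local synaptic weights act as both memory and compute path."""

    def __init__(self, weights, threshold):
        self.weights = weights      # synaptic "memory" lives inside the core
        self.threshold = threshold  # firing threshold

    def step(self, spikes):
        """Integrate incoming spikes (0/1) and fire (return 1) if the
        accumulated potential reaches the threshold."""
        potential = sum(w for w, s in zip(self.weights, spikes) if s)
        return 1 if potential >= self.threshold else 0


# Two cores wired together: core A's output spike feeds core B,
# mimicking how a firing neuron triggers the neurons it connects to.
core_a = NeuroSynapticCore(weights=[0.6, 0.5], threshold=1.0)
core_b = NeuroSynapticCore(weights=[1.0], threshold=1.0)

spike_a = core_a.step([1, 1])     # both inputs active: potential 1.1, core fires
spike_b = core_b.step([spike_a])  # A's spike propagates and makes B fire too
print(spike_a, spike_b)
```

In a conventional processor, the weights would sit in RAM and be fetched across a bus on every operation; here, computation and its "memory" are inseparable, which is the essence of the neural design described above.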
Performance of a neural chip
Currently this new chip has a very small storage capacity and is still at an experimental stage, not yet coming close to a contemporary computer, but it already has many advantages. Development will continue, and it will take years before a commercial version reaches the market. Even when it launches, do not expect to have one around the house: it will be used for research and security applications, since its performance on conventional computing operations will be lower than that of a regular processor. Instead, this chip will excel at tasks such as speech recognition and pattern recognition, and it will be a very good foundation for the development of artificial intelligence, which is the direction technology is heading now and probably in the near future. I now await the comments about the inevitable robot invasion.