“Memory has so far been viewed as a place where we merely store information. But in this work, we conclusively show how we can exploit the physics of these memory devices to also perform a rather high-level computational primitive. The result of the computation is also stored in the memory devices, and in this sense the concept is loosely inspired by how the brain computes.”
Dr. Abu Sebastian, IBM
IBM Research has released a paper describing technology that could make computers 200 times faster and more energy-efficient.
IBM scientists demonstrated that an unsupervised machine-learning algorithm, running on one million phase-change memory devices, successfully found temporal correlations in unknown data streams.
Compared with state-of-the-art classical computers, this prototype technology is expected to yield 200x improvements in both speed and energy efficiency, making it well suited to ultra-dense, low-power, massively parallel computing systems for AI applications.
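For readers curious what "finding temporal correlations in memory" looks like in practice, the sketch below is a minimal software analogue of the idea, not IBM's actual implementation: each stochastic process is assigned one device, and every event applies a small, crystallization-like update whose size scales with the collective activity of that time step, so the result of the computation accumulates in the devices themselves. The array sizes, probabilities, correlation strength, and threshold are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000        # number of binary stochastic processes (one per memory device)
N_CORR = 100    # how many of them are mutually correlated (assumption)
T = 5000        # number of time steps (assumption)
P = 0.01        # baseline event probability per step (assumption)
C = 0.75        # strength of the shared correlation (assumption)

# A conductance-like accumulator stands in for each phase-change device's state.
conductance = np.zeros(N)

for _ in range(T):
    # A shared latent event drives the correlated group; every process also
    # has independent background events.
    common = rng.random() < P
    events = rng.random(N) < P
    copies = rng.random(N_CORR) < C
    events[:N_CORR] = np.where(copies, common, events[:N_CORR])

    # The collective activity of this step sets the size of the
    # "partial crystallization" update applied to every device that fired,
    # so devices whose processes fire together accumulate state faster.
    activity = events.sum() / N
    conductance[events] += activity

# Correlated processes end up with markedly higher accumulated state and can
# be read out directly from the array by simple thresholding.
detected = np.flatnonzero(conductance > conductance.mean())
print(f"{len(detected)} devices flagged as correlated, e.g. {detected[:10]}")
```

Because correlated processes tend to fire during the same high-activity steps, their devices receive larger updates on average than those assigned to uncorrelated processes, which is why a simple threshold on the stored state is enough to separate the two groups.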
Dr. Evangelos Eleftheriou, IBM Fellow and research paper co-author, said:
“This is an important step forward in our research of the physics of AI, which explores new hardware materials, devices and architectures.”
Our host, David Organ, explains the science behind this technology.
Read the IBM Research blog post on which this story is based here.
Originally broadcast: 03 November 2017