Neuromorphic engineering—building machines that mimic the function of organic brains in hardware as well as software—is becoming increasingly prominent. The field has progressed rapidly, from conceptual beginnings in the late 1980s to experimental field-programmable neural arrays in 2006, early memristor-powered device proposals in 2012, IBM’s TrueNorth NPU in 2014, and Intel’s Loihi neuromorphic processor in 2017. Yesterday, Intel broke a little more new ground with the debut of a larger-scale neuromorphic system, Pohoiki Beach, which integrates 64 of its Loihi chips.
Where traditional computing works by running numbers through an optimized pipeline, neuromorphic hardware performs calculations using artificial “neurons” that communicate with one another. Much like the biological neurons it mimics, this workflow is highly specialized for particular applications—so you likely won’t replace conventional computers with Pohoiki Beach systems or their descendants, for the same reason you wouldn’t replace a desktop calculator with a human mathematics major.
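To make the contrast concrete, here is a minimal sketch of a leaky integrate-and-fire neuron, the simplest member of the family of spiking models that neuromorphic chips realize in silicon. This is an illustration only, not Loihi’s actual circuit: the function name and parameter values are arbitrary choices for the example.

```python
# Illustrative sketch only: a leaky integrate-and-fire (LIF) neuron.
# Instead of running numbers through a pipeline, the "computation" is
# when and how often the neuron spikes in response to its inputs.

def simulate_lif(input_current, threshold=1.0, leak=0.9):
    """Integrate input over discrete time steps; emit a spike (1) when
    the membrane potential crosses the threshold, then reset to zero."""
    potential = 0.0
    spikes = []
    for current in input_current:
        potential = potential * leak + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)   # fire
            potential = 0.0    # reset after spiking
        else:
            spikes.append(0)
    return spikes

# A steady drip of sub-threshold input makes the neuron fire periodically:
print(simulate_lif([0.4] * 10))  # → [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
```

In a neuromorphic system, many such neurons run in parallel and communicate only through these sparse spike events, which is where the efficiency advantage on brain-like workloads comes from.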
However, neuromorphic hardware is proving far more efficient than conventional processors or GPUs at the kinds of tasks organic brains excel at. Visual object recognition is perhaps the most widely deployed example, but others include playing foosball, adding kinesthetic intelligence to prosthetic limbs, and even interpreting skin touch much as a human or animal would.