How brain-inspired analog systems could make drones more efficient

Electrical and computer engineers want to mimic the brain’s visual system to create AI tools for guiding autonomous systems.

The artificial intelligence systems that guide drones and self-driving cars rely on neural networks—trainable computing systems inspired by the human brain. But the digital computers they run on were originally designed for general-purpose tasks ranging from word processing to scientific calculations, and they achieve ultra-high reliability at the cost of high power consumption.

To explore novel, energy-efficient computer systems for machine learning, engineers at the University of Rochester are developing new analog hardware, with potential applications in more efficient drones. Rather than relying on the conventional state-of-the-art neural networks developed for computer vision on digital hardware, they're turning to predictive coding networks, which are based on neuroscience theories that the brain maintains a mental model of the environment and constantly updates it based on feedback from the eyes.

“Research by neuroscientists has shown that the workhorse of developing neural networks—this mechanism called back propagation—is biologically implausible and our brains’ perception systems don’t work that way,” says Michael Huang, a professor of electrical and computer engineering, of computer science, and of data science and artificial intelligence at Rochester. “To solve the problem, we asked how our brains do it. The prevailing theory is predictive coding, which involves a hierarchical process of prediction and correction—think paraphrasing what you heard, telling it to the speaker, and using their feedback to refine your understanding.”
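The prediction-and-correction loop Huang describes can be sketched in a few lines of code. This is purely illustrative—a single scalar "sensory" input and one level of the hierarchy—and the variable names, weight, and learning rate are invented for the sketch, not drawn from the Rochester team's design:

```python
# Toy predictive coding loop (illustrative sketch only): the model makes a
# top-down prediction, compares it against the input, and uses the resulting
# prediction error as feedback to refine its internal estimate.

x = 3.0          # "sensory" observation
estimate = 0.0   # the model's internal estimate of the cause of x
weight = 0.5     # generative weight: prediction = weight * estimate
lr = 0.5         # step size for the correction updates

for _ in range(100):
    prediction = weight * estimate   # top-down prediction
    error = x - prediction           # prediction error (the feedback signal)
    estimate += lr * weight * error  # revise the estimate to reduce the error

print(round(weight * estimate, 3))  # prediction converges to 3.0
```

A full predictive coding network stacks many such levels, with each layer predicting the activity of the layer below and passing only the residual errors upward—a local update scheme, in contrast to the global error signal that backpropagation requires.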

Huang notes that the University of Rochester has a rich history in computer vision research and that the late computer science professor Dana Ballard was an author on one of the earliest, most influential papers on predictive coding networks.

The Rochester-led team includes Huang, electrical and computer engineering professors Hui Wu and Tong Geng, and their students, as well as two research groups from Rice University and UCLA. The team will receive up to $7.2 million from the Defense Advanced Research Projects Agency (DARPA) over the next 54 months to develop biologically inspired predictive coding networks for digital image recognition built on analog circuits. While the initial prototype will focus on classifying static images, the researchers believe that if the analog system can approach the performance of existing digital approaches, it can be extended to the more complex perception tasks needed by self-driving cars and autonomous drones.

And while the approach is novel, the system will not use any experimental devices; it will instead be manufactured using existing technologies such as complementary metal-oxide-semiconductor (CMOS) processes.
