
Neural computing should be based on insect brains, not human ones

News Analysis
Mar 31, 2020 | 3 mins
Computers and Peripherals | Data Center | Internet of Things

Insect brains are a better model for artificial intelligence in IoT than human brains are because they are simpler and focus on key processes, scientists say.

The bumble bee brain is a better model than the human brain for neural networks that might be used to run autonomous robots, an academic team believes.

“It is pretty impressive that a bee can fly over five miles, then remember its way home, with a brain the size of a pinhead,” says Professor James Marshall, of the University of Sheffield, quoted by multiple newspapers that were reporting on a presentation Marshall made to the American Association for the Advancement of Science conference in February.

“It makes sense to me that we should try and mimic a bee brain in [autonomous systems], drones and driverless cars.”

Marshall is referring to a form of deep-learning computing in which developers create electronic architectures that mimic neurobiological ones and could eventually replace traditional computing. Deep learning is a branch of artificial intelligence in which computers learn, through rewards, to recognize patterns in data; the distinguishing feature is that it uses neural processes. Variations include neuromorphic computing, which I wrote about here, and which can analyze high- and low-level detail such as edges and shapes.

Bees “are basically mini-robots,” says Marshall, quoted in the Daily Telegraph. “They’re really consistent visual navigators, they can navigate complex 3-D environments with minimal learning and using only a million neurons in a cubic millimeter of the brain.”

That size element could grab the attention of developers who are working toward tiny robots that communicate with each other to self-organize and could be used, for example, to move objects in factories.

Robots equipped with computing modeled after bee brains could be instructed to go to a particular destination and, using external reference points and internal sensing capabilities, find their own way there. Current unmanned aerial vehicles – drones – use global positioning satellites to determine their location, compasses for direction, and barometers for sensing altitude. Unlike the autonomy bee-modeled robots might achieve, that style of navigation is susceptible to processing delays when the drones encounter anomalies, such as unexpected obstacles.
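Bees are believed to find their way home by path integration: summing each leg of the outbound trip as a vector, so that negating the running total yields a direct home vector with no map or GPS required. A minimal sketch of that idea (the function name and the example legs are illustrative, not from the research described here):

```python
import math

def home_vector(legs):
    """Path-integration sketch: given the legs of an outbound trip as
    (distance, heading_deg) pairs, return (distance, heading_deg) of a
    straight-line vector pointing back to the starting point."""
    # Sum the outbound displacement in Cartesian coordinates.
    x = sum(d * math.cos(math.radians(h)) for d, h in legs)
    y = sum(d * math.sin(math.radians(h)) for d, h in legs)
    # The home vector is the negation of the total displacement.
    dist = math.hypot(x, y)
    heading = math.degrees(math.atan2(-y, -x)) % 360
    return dist, heading

# Out 3 km due east (0 deg), then 4 km due north (90 deg):
# home is 5 km away on a bearing of about 233 degrees.
d, h = home_vector([(3.0, 0.0), (4.0, 90.0)])
print(round(d, 2), round(h, 1))
```

The appeal for tiny robots is that this needs only a running sum of two numbers, not satellite fixes or stored maps.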

In experiments with actual bees, the team glued radar tags to honeybees’ backs. (It’s “easier said than done,” said Joe Woodgate of Queen Mary University of London, according to the news reports. “They’re very good at escaping from us and when we do succeed, we’re left holding an angry bee.”)

The researchers tracked the bees’ flight paths with radar and, based on the data they gathered, came up with an algorithm that simulates the way the bees apparently think about moving through space. One skill the bees use is optic flow: as they fly, they can gauge which objects are closer than others by how rapidly those objects appear to move. The ones that seem to be moving faster are closer, the Daily Mail explains.
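That optic-flow relationship can be put in one line of math: for an observer translating at speed v, a stationary object at bearing θ from the direction of travel sweeps across the visual field at angular speed ω = v·sin(θ)/d, so faster apparent motion means smaller distance d. A minimal sketch (the function and the numbers are illustrative assumptions, not the team’s actual algorithm):

```python
import math

def distance_from_optic_flow(forward_speed, angular_speed, bearing_deg=90.0):
    """Estimate distance to a stationary object from optic flow.

    For pure translation at forward_speed (m/s), an object at
    bearing_deg from the direction of travel appears to move at
    angular_speed (rad/s) = forward_speed * sin(bearing) / distance.
    Solving for distance gives the estimate returned here.
    """
    theta = math.radians(bearing_deg)
    return forward_speed * math.sin(theta) / angular_speed

# At 5 m/s, an object drifting past directly abeam (90 degrees) at
# 2 rad/s is about 2.5 m away; one drifting at only 0.5 rad/s is
# 10 m away -- faster apparent motion, closer object.
print(distance_from_optic_flow(5.0, 2.0))   # 2.5
print(distance_from_optic_flow(5.0, 0.5))   # 10.0
```

This inverse relationship is why a single cheap camera, rather than radar or GPS, can in principle give a flying robot a sense of depth.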

Unrelated to the bees, Intel has announced it has readied a cloud-based neuromorphic research system with an architecture inspired by the human brain that can be used to solve artificial-intelligence challenges as well as other computationally difficult problems.


Patrick Nelson was editor and publisher of the music industry trade publication Producer Report and has written for a number of technology blogs. Nelson wrote the cult-classic novel Sprawlism.

The opinions expressed in this blog are those of Patrick Nelson and do not necessarily represent those of IDG Communications, Inc., its parent, subsidiary or affiliated companies.