The advancement of edge computing, along with increasingly powerful chips, may make it possible for artificial intelligence (AI) to operate without wide-area networks (WANs).

Researchers at the University of Waterloo say they can make AI adapt as computational power and memory are taken away. If they succeed, neural networks could function free of the internet and the cloud, with advantages including better privacy, lower data-transmission costs, portability, and the use of AI applications in geographically remote areas.

The scientists say they can teach AI to learn that it doesn't need abundant resources.

The group says it does this by mimicking nature: it places the neural network in a virtual environment and then "progressively and repeatedly deprives it of resources." The AI subsequently evolves and adapts, the team explains in a news article on the school's website.

The engine essentially learns to work around its lack of large resources to draw on; AI typically consumes a great deal of power and processing capability.

"The deep-learning AI responds by adapting and changing itself to keep functioning," the researchers say.

Making AI smaller

Whenever computational power or memory is removed from the school's experimental AI, it becomes smaller and is thus "able to survive in these environments," says Mohammad Javad Shafiee, a research professor at Waterloo and the system's co-creator.

Possible uses for the technology include fitting the deep-learning engine onto a chip in robots, smartphones, or drones, where both connectivity and weight can be issues, the researchers say.

"When put on a chip and embedded in a smartphone, such compact AI could run its speech-activated virtual assistant and other intelligent features," the news article continues.

Edge AI

The University of Waterloo's stand-alone AI isn't the first
edge-ified AI that we've seen, though. Unrelated to the Waterloo project, Intel earlier this year launched its Movidius Neural Compute Stick.

That groundbreaking, no-cloud-required, plug-and-play neural compute device (retailing at under $100) is geared toward prototyping and then deploying neural vision networks at the edge, with no internet needed. It's no larger than a USB memory stick.

Building on that launch, Movidius's technology is also being used in Google's upcoming Raspberry Pi-based hobbyist AIY Vision Kit, a do-it-yourself neural vision processor for the Pi camera that costs less than $50. It, too, is portable, requiring only the Pi computer, its camera, and the Movidius-running VisionBonnet Raspberry Pi add-on board. Again, no network is needed. The Google TensorFlow-based software can recognize common objects, faces, and animals. Movidius's vision processing can also now be found in security cameras, drones, and industrial machines.

In the University of Waterloo project, the researchers say they have achieved a 200-times reduction in the size of the overall deep-learning AI software for object recognition.

Add to that the absence of any need for a network, and "this could be an enabler in many fields where people are struggling to get deep-learning AI in an operational form," the Waterloo scientists say.
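The article doesn't spell out how the Waterloo system shrinks itself, but the "progressively and repeatedly deprive it of resources" idea resembles a well-known family of techniques: iterative pruning, where the smallest-magnitude weights are removed round after round (with retraining in between) until only a small fraction survives. The following is a minimal NumPy sketch of that general idea, not the Waterloo team's actual method; the function names are illustrative.

```python
import numpy as np

def prune_smallest(weights, fraction):
    """Zero out the smallest-magnitude `fraction` of the still-surviving weights."""
    nonzero = np.abs(weights[weights != 0])
    k = int(nonzero.size * fraction)
    if k == 0:
        return weights
    threshold = np.sort(nonzero)[k - 1]
    # Keep only weights strictly above the cutoff; everything else becomes zero.
    return np.where(np.abs(weights) > threshold, weights, 0.0)

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64))  # stand-in for one layer's weight matrix

# Progressively "deprive" the layer of parameters over several rounds.
for _ in range(5):
    w = prune_smallest(w, fraction=0.3)
    # ... a real pipeline would retrain here so the network adapts
    #     to each round of deprivation before the next one ...

sparsity = 1.0 - np.count_nonzero(w) / w.size
print(f"final sparsity: {sparsity:.2%}")  # roughly 83% of weights removed
```

Five rounds at 30% each leave about 0.7^5 ≈ 17% of the weights, and combining such pruning with quantization and architecture search is how compression on the order of the 200× figure quoted above is typically pursued.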