Technology trends in artificial intelligence (AI) and the Internet of Things (IoT) have started to merge, a convergence the industry has named the Artificial Intelligence of Things (AIoT). The movement of AI from the cloud to the edge offers a solution to the bandwidth and security issues that have hampered wider adoption of IoT in key markets. If the history of technology development is a reliable guide, this convergence has at least two more stages to play out over the next few years.
The IoT has generated tremendous interest of late, but two significant issues have arisen for many applications. One is security: both the data flowing across the network from IoT devices and control of the devices themselves depend on adequate protection against cyberattack. Because threats constantly evolve and intensify, security demands continual vigilance and mitigation from IoT developers. At the same time, many potential users have held off on a major push into IoT technology, uncertain about the safety of their systems and data.
A second issue restricting IoT adoption is the bandwidth required to send data to the cloud for processing. As the number of installed devices and the volume of data they generate rise, IoT deployments are becoming bound by the bandwidth resources and costs of data collection. The concern has only grown as AI becomes an increasingly important tool for extracting value from all that data.
AI’s importance in data processing has grown substantially because traditional data processing techniques are becoming increasingly cumbersome. Developing and coding an effective algorithm for extracting useful information from masses of data requires time and application expertise that many potential users lack. It can also yield brittle software that is difficult to maintain and modify as needs change. AI, particularly machine learning (ML), lets the processor develop its own algorithms through training toward desired outcomes rather than depending on expert analysis and software development. Further, AI algorithms can be readily adapted to new requirements through additional training.
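To make the contrast concrete, here is a minimal sketch in Python using scikit-learn; the sensor features, threshold values, and training data are illustrative assumptions, not taken from any real system.

# A hand-coded rule: brittle, dependent on expert-chosen thresholds,
# and re-tuned by hand whenever conditions change.
def hand_coded_fault_check(vibration_rms, motor_current):
    # Threshold values below are illustrative assumptions.
    return vibration_rms > 4.2 or motor_current > 11.5

# The ML alternative: the model derives its own decision boundary
# from labeled examples, so adapting to new requirements means
# retraining on new data rather than recoding.
from sklearn.linear_model import LogisticRegression

X_train = [[3.1, 9.8], [4.8, 10.2], [2.9, 12.1], [3.0, 9.5]]  # [vibration RMS, motor amps]
y_train = [0, 1, 1, 0]  # 0 = healthy, 1 = fault observed

model = LogisticRegression().fit(X_train, y_train)
print(model.predict([[4.5, 10.0]]))  # expected to flag the suspect reading

The point is not the particular model; it is that the decision logic comes from data rather than from hand analysis.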
The recent trend of AI moving to the edge is bringing the two technologies together. Information extraction from IoT data currently occurs mostly in the cloud, but if much or all of that information can be extracted locally, the issues of bandwidth and security become less significant. With AI running in the IoT device, there is little need to send masses of raw data across the network; only compact conclusions need be communicated. And with less communication traffic, network security is easier to enhance and maintain. The local AI can even help improve device security by examining incoming traffic for signs of tampering.
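The pattern is straightforward: run inference on the device and transmit only the result. Here is a rough Python sketch assuming a hypothetical TensorFlow Lite model and MQTT broker; the model path, broker address, and topic name are placeholders, not from a real deployment.

# Sketch of edge inference: classify sensor data locally, transmit only
# the verdict. Model path, broker, and topic are hypothetical.
import json
import numpy as np
import tflite_runtime.interpreter as tflite
import paho.mqtt.client as mqtt

interpreter = tflite.Interpreter(model_path="anomaly_model.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

def classify(sensor_window):
    # Run the model on one window of sensor samples.
    interpreter.set_tensor(inp["index"], sensor_window.astype(np.float32))
    interpreter.invoke()
    return int(np.argmax(interpreter.get_tensor(out["index"])))

client = mqtt.Client()
client.connect("broker.local")  # hypothetical broker address

# Instead of streaming the raw window (potentially thousands of samples),
# publish a few bytes: just the classification result.
window = np.zeros(inp["shape"], dtype=np.float32)  # stand-in for a real reading
client.publish("factory/line1/motor3/status",
               json.dumps({"state": classify(window)}))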
Figure 1 Predictive maintenance of industrial machinery is one application where the merger of AI and IoT will see increasing evolution. (Source: Morguefile.com – Kevin Connors)
AIoT looks to be following a development path similar to the one microprocessors took in the 1980s. Processing began as separate devices handling distinct tasks: generic processor, memory, serial interface peripherals, parallel interface peripherals, and the like. Those tasks were eventually integrated into single-chip microcontrollers, which in turn evolved into specialized microcontrollers targeting specific applications.
For now, AIoT designs use processors supplemented with generic AI acceleration and AI middleware. Processors with onboard AI acceleration are also beginning to debut. If history repeats, the next stage for AIoT will be the evolution of AI-enhanced processors tailored to specific applications.
For a tailored device to be economically viable, it must address needs common to a range of thematically related applications. Such applications are already becoming visible. One theme is predictive maintenance: AI coupled with IoT sensors on industrial machinery is helping users identify unusual patterns in vibration and current consumption that are harbingers of device failure. Keeping the AI local to the sensor reduces data bandwidth and latency and allows the device’s response to be isolated from its network connection. A specialized predictive maintenance AIoT device would serve a substantial market.
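As an illustration of what such a device might compute locally, here is a Python sketch of simple vibration screening; the sample rate, frequency bands, baseline energies, and alert threshold are all assumptions for illustration, not values from a real machine.

# Illustrative on-device vibration screening for predictive maintenance.
import numpy as np

SAMPLE_RATE = 10_000  # Hz, assumed accelerometer sampling rate

def band_energy(samples, lo_hz, hi_hz):
    # Energy in one frequency band of the vibration spectrum.
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
    mask = (freqs >= lo_hz) & (freqs < hi_hz)
    return float(np.sum(spectrum[mask] ** 2))

def anomaly_score(samples, baseline):
    # Compare current band energies against a baseline recorded while
    # the machine was healthy; a rising score in a bearing-frequency
    # band can be a harbinger of failure.
    score = 0.0
    for (lo, hi), ref in baseline.items():
        score += band_energy(samples, lo, hi) / ref
    return score / len(baseline)

# Baseline energies learned while the machine was healthy (values assumed).
baseline = {(0, 500): 1.2e6, (500, 2000): 8.0e5, (2000, 5000): 3.5e5}
window = np.random.randn(4096)  # stand-in for a real accelerometer window
if anomaly_score(window, baseline) > 2.0:  # threshold is an assumption
    print("flag machine for maintenance")

Because only the score or the maintenance flag needs to leave the device, the network carries a few bytes instead of a continuous high-rate sample stream.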
A second theme is voice control. The popularity of voice assistants such as Siri and Alexa has primed consumers to expect voice control in all manner of devices. A dedicated voice-control AIoT device would sidestep bandwidth and latency issues and help ensure functionality when connectivity is unstable. The number of potential uses for such a device is staggering.
There are other potential themes for specialized AIoT devices to address. Environmental sensing, for both industrial safety and building management, is one. Chemical process control is another. Automotive systems autonomy is a third. Cameras that recognize specific targets are a fourth. More will undoubtedly arise.
AIoT seems to be here to stay, and the next development step, as with processing, will be to evolve specialized devices for key markets. Beyond that, the industry will most likely develop configurable AI accelerators that can be tailored to their application, extending the benefits of AIoT efficiently to smaller markets.
Many technical challenges still need to be overcome. Device size and power consumption are perennial issues at the edge, and AI needs to do more to address them. Development tools could do more to simplify building applications with AI. And developers need to learn more about AI as an alternative approach to application development. But if history is any guide, these challenges will be overcome quickly.
This article was originally published on EDN.
Rich Quinnell is a retired engineer and writer, and former Editor-in-Chief at EDN.