Edge vision AI adds to connected intelligence theme at embedded world 2023

Article By : Nitin Dahad

Vendors say customers increasingly want edge vision AI delivered as integrated system solutions, at lower cost and with a smaller energy footprint.

The ‘intelligent connected edge’ has been an ongoing theme at embedded world in recent years, and the 2023 exhibition in Nuremberg, Germany, continued this theme with aplomb. The extra nuance this year is that edge vision AI is gaining more attention as almost all industry sectors try to implement some form of image recognition to do whatever they do better and more intelligently.

Sally Ward-Foxton, EE Times, looked at various solutions including a demo of STMicroelectronics’ STM32N6 MCU. (Image: EE Times/Sally Ward-Foxton)

The solutions on offer all tend to claim the ability to add vision and artificial intelligence (AI) processing to systems at lower cost, lower power consumption, and better overall energy efficiency. My colleague, Sally Ward-Foxton, offers a great summary of the solutions she saw at embedded world in her article, “TinyML Comes to Embedded World” over on EE Times – this includes the demo of STMicroelectronics’ STM32N6 MCU, which will feature a dedicated, in-house-developed AI accelerator on-chip alongside an Arm Cortex-M55 core.

At the show, Texas Instruments (TI) introduced a family of six Arm Cortex-based vision processors for applications such as video doorbells, machine vision and autonomous mobile robots. Its new AM62A, AM68A and AM69A processors are supported by open-source evaluation and model development tools, and by common software that is programmable through industry-standard application programming interfaces (APIs), frameworks and models.

The company’s vice president for processors, Sameer Wasson, said, “In order to achieve real-time responsiveness in the electronics that keep our world moving, decision-making needs to happen locally and with better power efficiency. This new processor family of affordable, highly integrated SoCs will enable the future of embedded AI by allowing for more cameras and vision processing in edge applications.”

Machine vision and autonomous mobile robots (AMR) are key applications for edge AI vision solutions. Stephen Huang of ADLINK Technology (left) said machine vision is becoming more and more of a requirement among customers. (Image: Nitin Dahad)

Machine vision and autonomous mobile robots (AMR) are key applications for edge AI vision solutions. Stephen Huang, president & COO of ADLINK Technology, told us machine vision is becoming more and more of a requirement among customers, along with a trend for low-code/no-code implementation of the technology. The company’s booth included a demo of ADLINK’s all-in-one camera development kit, which aims to simplify prototyping and testing of edge AI vision systems; it comes with the hardware and software pre-installed, a full range of interfaces, including digital I/O, COM, and LAN ports, and an array of optional accessories. The GUI-based development tool provides a low-code environment with software tools for labeling and training, plus two selected open-source models, so users can build a proof-of-concept without AI expertise (a minimal sketch of such a vision loop follows below).

ADLINK’s booth featured an all-in-one camera development kit, which aims to simplify prototyping and testing of edge AI vision systems. (Image: Nitin Dahad)
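
To give a sense of what such low-code kits automate, the sketch below shows a minimal edge-vision proof-of-concept loop in Python. It is illustrative only and is not ADLINK’s actual tooling: it assumes an OpenCV-compatible camera and a generic quantized TensorFlow Lite classifier, and the model.tflite and labels.txt file names are placeholders.

```python
# Illustrative edge-vision proof-of-concept loop -- not ADLINK's SDK.
# Assumes an OpenCV-compatible camera and a generic quantized TFLite
# classifier; "model.tflite" and "labels.txt" are hypothetical placeholders.
import cv2
import numpy as np
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
_, height, width, _ = inp["shape"]

labels = [line.strip() for line in open("labels.txt")]

cap = cv2.VideoCapture(0)  # first attached camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Resize the frame to the model's input resolution and add a batch axis.
    resized = cv2.resize(frame, (width, height))
    interpreter.set_tensor(inp["index"], np.expand_dims(resized, 0).astype(np.uint8))
    interpreter.invoke()
    scores = interpreter.get_tensor(out["index"])[0]
    top = int(np.argmax(scores))
    print(labels[top], scores[top])
cap.release()
```

In practice, the labeling and training steps the GUI tool handles would produce the model and label files this loop consumes; the inference loop itself is the easy part.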

Advantech also talked about its customers’ focus on edge AI, and especially vision, for applications ranging from self-driving robots to healthcare, where vision systems are used to detect different pills. Energy consumption is something the company is conscious of. As its business development manager, Claus Giebert, told us, “A lot of customers will sooner or later notice power dissipation.” He mused that the people buying the hardware are not the people paying the energy bills, so this will become more important as sustainability drives increase. He added that this can be addressed with modules like Advantech’s, which provide the performance when needed but are otherwise low power.

Meanwhile, congatec added TI processors to its portfolio, introducing a SMARC Computer-on-Module featuring the TDA4VM processor for vision and AI processing. The module, built around the TDA4VM’s dual Arm Cortex-A72 cores, targets industrial mobile machinery requiring near-field analytics, such as automated guided vehicles and autonomous mobile robots, construction and agricultural machinery, as well as vision-focused industrial or medical solutions requiring ‘powerful but energy-efficient’ edge AI processors. Integrating the TI TDA4VM on a standardized Computer-on-Module simplifies the design-in process, allowing designers in various embedded industries to focus on their core competencies.

Martin Danzer, director of product management at congatec, said, “We see that autonomous driving based on AI and computer vision is one of the most important markets for embedded and edge computing technologies, beside the second major growth accelerator, digitization. We will make TI processors available on our credit-card-sized SMARC Computer-on-Module ecosystem with all the added values. These include fast prototyping and application development, cost-effective carrier board designs, and ultra-reliable, responsive and performant resources from design-in to series production of OEM systems.”

On the congatec booth with the company’s CTO, Konrad Garhammer (right). (Image: Nitin Dahad)

This is a common theme we heard from many vendors: the move towards solutions that require little or no coding to deploy edge vision, delivered as system-on-module (SOM) and computer-on-module (COM) products. Companies can save up-front costs and shorten time to market compared with fully customized designs, especially if they are producing small volumes.

Software-defined IoT and platforms for vision

Qualcomm used the show to launch new intelligent edge processors for IoT and robotics. Amongst these are software-defined IoT processors for both IoT devices and visual environments, as well as new robotics platforms. Dev Singh, vice president responsible for building, enterprise & industrial automation at Qualcomm Technologies, talked about the growing trend of customers looking for more and more compute but with less power. He told embedded.com, “The trends are around more AI and compute, with low power footprint. That’s a huge need where the net zero carbon neutrality is the number one care about for all big players. And how do you achieve that? You achieve that by doing more processing.”

Dev Singh of Qualcomm (left) talked about the growing trend of customers looking for more and more compute but with less power. (Image: Nitin Dahad)

“That’s what is needed because you are looking at factories of the future. You want more connectivity, but at an energy efficient footprint. And that’s where our platform plays itself, lends itself beautifully, because it’s a heterogeneous compute, and you use the right blocks for right applications. And while you’re not using it, the system goes and collapses. Unlike just using CPU for everything or just using GPU for anything, we have dedicated blocks, and that gives us that power thermal envelope advantage. And more than that is also integration of 5G, and integration of more AI.”

He said that Qualcomm is uniquely positioned to address that. “The power thing is quite important as you move beyond x86 into lower power architectures. I think that is a huge pain point for the industry, and we’ve recognized this. We’ve been talking to a lot of players who want to add more but want to do that at a much lower energy footprint. You can do general-purpose computing, but now adding AI means you’re going to add another dongle from somebody else, and that just keeps increasing. You need an integrated solution that is very energy efficient, and that’s what’s happening.”

To address this, Qualcomm introduced at the show its QCM5430 and QCS5430 processors, which it said are its first software-defined IoT solutions, designed to support up to five vision-sensor inputs and video encoding at up to 4K30. They are designed to meet machine vision requirements with low-power, advanced edge-AI processing; when necessary, the edge AI can switch to cloud processing to handle multiple camera connections, trading off response time against power efficiency as the manufacturer or user requires. The company said the processors combine performance, premium connectivity, and support for multiple OS options, enabling the platforms to scale across a wide footprint of IoT devices and deployment configurations for a visual environment. They are aimed at OEMs building products such as industrial handheld devices, retail equipment, mid-tier robots, connected cameras, and AI edge boxes. Customers can choose between premium, pre-set, or customized feature packs, and then upgrade them later to meet their own needs or to offer upgrades to their customers.
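
Qualcomm did not detail the switching logic, but the edge/cloud arbitration it describes is straightforward to sketch. The Python below is a hypothetical policy, not Qualcomm’s implementation: it keeps inference on-device while the camera count and measured latency stay within budget, and offloads to the cloud otherwise. The two backends, the thresholds, and the Dispatcher class are all invented for illustration.

```python
# Hypothetical edge/cloud arbitration policy -- not Qualcomm's implementation.
import time

LATENCY_BUDGET_MS = 50.0   # invented real-time response budget
MAX_LOCAL_CAMERAS = 5      # QCS5430-class parts support up to five vision sensors

def run_local(frames):
    # Stand-in for on-device edge-AI inference.
    time.sleep(0.01)
    return [f"edge-result-{i}" for i in range(len(frames))]

def run_cloud(frames):
    # Stand-in for batching the same frames out to a cloud endpoint.
    time.sleep(0.1)
    return [f"cloud-result-{i}" for i in range(len(frames))]

class Dispatcher:
    """Keep inference on-device while camera count and measured latency
    stay within budget; otherwise offload to the cloud."""

    def __init__(self):
        self.last_local_ms = 0.0

    def infer(self, frames):
        # Offload when the edge device is oversubscribed: too many active
        # cameras, or local inference has been missing its latency budget.
        if len(frames) > MAX_LOCAL_CAMERAS or self.last_local_ms > LATENCY_BUDGET_MS:
            return run_cloud(frames)
        start = time.monotonic()
        results = run_local(frames)
        self.last_local_ms = (time.monotonic() - start) * 1000.0
        return results

d = Dispatcher()
print(d.infer(["cam0", "cam1"]))  # two cameras: stays on-device
print(d.infer(["cam"] * 6))       # six cameras: offloaded to the cloud
```

A real deployment would fold power state and network conditions into the same decision, but the structure, a cheap local path with a guarded fallback, is the essence of the response-time versus power-efficiency trade-off described above.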

In addition to these processors, Qualcomm also launched its entry-level Qualcomm Robotics RB1 and RB2 platforms, based on the new Qualcomm QRB2210 and QRB4210 processors, respectively. Both platforms are optimized for smaller devices and lower power consumption than the previously introduced RB3, RB5 and RB6 platforms, making them more cost-effective and accessible for industry. The RB1 and RB2 platforms combine general compute, AI-focused performance, and communications technologies, with built-in machine vision support for up to three cameras, providing on-board intelligence to meld this data with high-performance sensors from TDK Corporation for applications such as autonomous navigation.

This article was originally published on embedded.com.

Nitin Dahad is the Editor-in-Chief of embedded.com, and a correspondent for EE Times and EE Times Europe. Since starting his career in the electronics industry in 1985, he’s had many different roles: from engineer to journalist, and from entrepreneur to startup mentor and government advisor. He was part of the startup team that launched 32-bit microprocessor company ARC International in the US in the late 1990s and took it public, and was a co-founder of The Chilli, which influenced much of the tech startup scene in the early 2000s. He’s also worked with many of the big names—including National Semiconductor, GEC Plessey Semiconductors, Dialog Semiconductor and Marconi Instruments.
