Does analog have a place in the machine learning world?

By Majeed Ahmad

Aspinity claims its inference-based analog signal processing technology provides a power-efficient solution for battery-operated devices.

The electronics design industry is changing considerably, driven mainly by the proliferation of sensors and the demand to generate and gather more information. These sensors are becoming smaller, easier to deploy, and far more numerous. And they do away with wires: they rely on wireless links and run on batteries.

Today, the data gathered by sensors is typically digitized locally and sent to the cloud for processing. However, as the amount of data and the number of sensors continue to grow, engineers are bound to consider the energy cost of handling that sensor data. Many applications are always sensing, listening for someone to speak, glass to break, or a machine to start malfunctioning, and keeping them always on means changing batteries far too often.

Aspinity, a Pittsburgh-based startup with roots in the neuromorphic computing work carried out at West Virginia University, claims to have a solution to this problem for battery-powered devices. The company’s co-founder and CEO Tom Doyle recently spoke with Planet Analog to explain the nitty-gritty of this inference-based solution built on analog signal processing.

Aspinity is a semiconductor company that provides chips along with the accompanying software, analog machine learning (ML) models, and firmware. “In some cases, we also do analog compression,” Doyle said. “It’s all about ease of integration with existing silicon, allowing algorithms to be more power-efficient.” Aspinity’s silicon partners include Infineon and STMicroelectronics.

Figure 1 The comparison highlights the difference between digital- and analog-centric sensor data processing. Source: Aspinity

Infineon, which now owns Cypress Semiconductor’s microcontrollers and is also very strong in the sensor space, is developing a partnership with Aspinity on both of those fronts. “We work with the sensors to bring in data very early and at low power,” Doyle said. “Then, we let the PSoC microcontrollers wake up when valuable data is there and do further processing so that they communicate relevant information to the cloud.”

STMicro also offers low-power MCUs, but they are not low power enough for always-on, battery-operated devices. “So, we are at the front end of the MCU to let it know when the relevant data is there,” Doyle said. “That’s how we see our technology integrating onto existing silicon and enabling new products.”

Here, Aspinity’s technology allows the MCU and the analog-to-digital converter (ADC) to remain in sleep mode until relevant data is available, for example, when a wake word is spoken or an alarm goes off. Such events happen sporadically, and sometimes never. Why then, Doyle asked, use the current paradigm of digitizing all of that data?
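As a rough illustration of that paradigm, the sketch below keeps the digital side idle until a stand-in for the always-on analog front end flags a relevant frame. This is not Aspinity’s firmware or algorithm; the function names, threshold, and signal values are all hypothetical.

```python
import numpy as np

def analog_front_end_detects_event(frame: np.ndarray, threshold: float = 0.5) -> bool:
    # Stand-in for the always-on analog classifier: flag the frame only when it
    # looks like a relevant event (a wake word, a glass break, an alarm).
    return float(np.abs(frame).mean()) > threshold

def heavy_digital_processing(samples: np.ndarray) -> None:
    # Placeholder for the work the woken-up MCU/DSP would do.
    print(f"MCU awake: processing {samples.size} samples")

def run(frames) -> None:
    for frame in frames:
        if not analog_front_end_detects_event(frame):
            continue                          # ADC and MCU stay in sleep mode
        digitized = np.round(frame * 127)     # only now is the frame digitized
        heavy_digital_processing(digitized)

# Mostly quiet frames plus one loud one: only the loud frame wakes the digital side.
rng = np.random.default_rng(0)
run([0.01 * rng.standard_normal(160) for _ in range(9)] + [rng.standard_normal(160)])
```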

Analog ML solution

The underlying technology in Aspinity’s inference solution, the reconfigurable analog modular processor (RAMP), is a neural-inspired processing platform. For example, Aspinity can build an algorithm that interfaces directly with the microphone signal to perform inference. The ML stages are then transformed into analog neural network (NN) stages, which make the decision and wake up the system: whether it’s speech or not, whether glass has broken, or whether there is a vibration problem in a machine.
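The following sketch emulates that decision flow numerically, with a crude feature-extraction step and a tiny threshold classifier standing in for the analog NN stages. It illustrates the idea of deciding wake versus stay-asleep directly from the sensor signal; the features, weights, and threshold are invented for this example and do not reflect Aspinity’s design.

```python
import numpy as np

def band_energies(frame: np.ndarray, n_bands: int = 4) -> np.ndarray:
    # Crude stand-in for analog feature extraction: signal energy in a few bands.
    spectrum = np.abs(np.fft.rfft(frame)) ** 2
    return np.array([band.sum() for band in np.array_split(spectrum, n_bands)])

def analog_nn_stage(features: np.ndarray, weights: np.ndarray, bias: float) -> bool:
    # Tiny classifier standing in for the analog NN stages: a weighted sum of
    # the features followed by a threshold, i.e., the wake / stay-asleep decision.
    return float(features @ weights + bias) > 0.0

# Hypothetical weights for a "glass break vs. everything else" style detector.
frame = np.random.default_rng(1).standard_normal(256)          # one microphone frame
wake = analog_nn_stage(band_energies(frame),
                       weights=np.array([-0.1, 0.2, 0.8, 1.5]), bias=-3.0)
print("wake the system" if wake else "stay asleep")
```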

To drive ML capabilities and heavy workloads, Aspinity’s solution looks at what the analog domain can do for inferencing and decision-making. A neural network, at its core, is a series of multiply-accumulate functions, and, as Doyle points out, it is a prime example of nonlinear analog. With the right IP and know-how, that compute can be carried out in analog circuitry. “Multiply-accumulates can be implemented using simple analog transistors,” said Doyle. “It doesn’t have to be gates.”
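To see what is being offloaded, the sketch below writes a single NN layer as explicit multiply-accumulate operations, the primitive that, per Doyle, can be built from simple analog transistors rather than digital gates. The layer sizes, weights, and nonlinearity are arbitrary choices for illustration.

```python
import numpy as np

def mac_layer(x: np.ndarray, weights: np.ndarray, bias: np.ndarray) -> np.ndarray:
    # One neural-network layer expressed as explicit multiply-accumulates.
    out = bias.astype(float).copy()
    for j in range(weights.shape[0]):          # each output neuron
        acc = 0.0
        for i in range(x.shape[0]):            # multiply-accumulate over inputs
            acc += weights[j, i] * x[i]
        out[j] += acc
    return np.tanh(out)                        # saturating nonlinearity

# Example: 8 input features feeding 4 neurons.
rng = np.random.default_rng(2)
y = mac_layer(rng.standard_normal(8), rng.standard_normal((4, 8)), np.zeros(4))
print(y)
```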

Figure 2 Inference using analog in-memory computing (top) requires both analog and digital processing, while inference in the analogML core (bottom) uses only analog processing. Source: Aspinity

By moving ML workloads from digital to analog and doing it closer to the sensors, engineers can handle the sensor data deluge more effectively. Workloads shift to the earliest possible point in the signal chain, where the system can determine whether the data is relevant and pick out only the relevant analog data. That’s one avenue for analog’s role in the rapidly emerging machine learning world.

Aspinity’s technology, protected through its IP, moves the ML workload from the digital domain to the analog domain. “It’s much more efficient to do it that way because the sensor data is inherently analog,” Doyle said. “If we can challenge the notion of taking the data and digitizing it, and instead look at the data to find out if it’s relevant or not, a lot of energy can be saved.”
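A back-of-envelope duty-cycle calculation shows where such savings could come from. The power numbers and event rate below are assumptions made for illustration, not measured figures from Aspinity or its partners.

```python
# Illustrative, assumed numbers only.
P_DIGITIZE_AND_ANALYZE_MW = 10.0   # assumed: MCU + ADC digitizing and analyzing everything
P_ANALOG_FRONT_END_MW     = 0.05   # assumed: always-on analog inference front end
EVENT_DUTY_CYCLE          = 0.01   # assumed: relevant events present 1% of the time

always_on_digital_mw = P_DIGITIZE_AND_ANALYZE_MW
analog_first_mw = P_ANALOG_FRONT_END_MW + EVENT_DUTY_CYCLE * P_DIGITIZE_AND_ANALYZE_MW

print(f"Always-on digital: {always_on_digital_mw:.2f} mW average")
print(f"Analog-first:      {analog_first_mw:.2f} mW average")
print(f"Improvement:       ~{always_on_digital_mw / analog_first_mw:.0f}x")
```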

Aspinity strives to move the ML workload earlier in the signal chain, where doing it in the analog realm is inherently low power and removes downstream components. That has been made possible by inferencing in the analog domain. So, yes, analog has a place in the rapidly evolving machine learning design world.

This article was originally published on Planet Analog.

Majeed Ahmad, editor-in-chief of EDN and Planet Analog, has covered the electronics design industry for more than two decades.
