Machine vision in battery-powered IoT – Alif and Edge Impulse claim big gains in tinyML
Alif Semiconductor and Edge Impulse have claimed ‘dramatic increases’ in the performance of machine learning in embedded systems from combining the former’s Ensemble family of microcontrollers (MCUs) and fusion processors with the latter’s ML development platform. The pair said the improvements are enough to power on-board machine vision in battery-powered IoT applications, as well as embedded ML for voice- and vibration-based IoT use cases.
Vision-based use cases have been difficult for developers of embedded IoT applications, they said, because processors have been either too weak in processing performance or too power-hungry, especially in battery-powered systems. In addition, traditional development workflows for embedded designs are not suited to the extra steps required to select, configure, train, and deploy an ML model on an embedded MCU.
Alif Semiconductor’s Ensemble line is powered by Arm’s Ethos-U55 microNPU and Cortex-M55 CPU, combining high-performance and high-efficiency processing around the concept of “always-available, battery-friendly, AI-accelerated environmental sensing”. The company said: “This significantly improves AI/ML performance compared to current CPU-bound approaches, while consuming only a fraction of the power of such solutions.”
The U55/M55 core in the Ensemble E7 runs convolutional neural network workloads roughly 100 times faster than a Cortex-M7 at a similar clock speed (780µs versus 74ms), the company said. Meanwhile, Edge Impulse, a specialist in miniaturised ML (tinyML) for battery-powered IoT, has extended its development platform to take advantage of the AI acceleration built into the Ensemble devices, while keeping power consumption in check.
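As a quick sanity check on the quoted figures (the timings are Alif's claims, not independent measurements), the two inference times work out to a speedup of just under 95×, consistent with the rounded "100 times faster" headline number:

```python
# Quoted inference times for the same CNN workload:
# 74 ms on a Cortex-M7 baseline, 780 us on the Ensemble E7's
# Cortex-M55 + Ethos-U55 (figures as reported by Alif).
m7_time_us = 74_000   # 74 ms, Cortex-M7
e7_time_us = 780      # 780 us, Cortex-M55 + Ethos-U55

speedup = m7_time_us / e7_time_us
print(f"Speedup: {speedup:.1f}x")  # roughly 95x, i.e. the claimed ~100x
```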
The firm – which works with the likes of Advantech, Nordic Semiconductor, and Polycom – says its platform quickly and easily guides IoT developers through the process of collecting and structuring datasets, designing ML algorithms with ready-made building blocks, validating the models with real-time data, and deploying fully optimised solutions to embedded IoT modules. The pair have been showing their collaboration at the Embedded Vision Summit this week.
Steve Pancoast, vice president of software and system design at Alif Semiconductor, said: “The improvements to neural network inference times on our Ethos-U55 based system while working with Edge Impulse have been very impressive. The ease of use that the Edge Impulse platform brings, combined with the performance improvements resulting from their EON tuner and compiler, will make a big difference for our customers’ design cycles.”
Zach Shelby, co-founder and chief executive at Edge Impulse, said: “Edge Impulse’s mission has always been to enable our users to create the next generation of intelligent devices. Our partnership with Alif really lets us take this to the next level. We are very excited to be able to highlight the combined capabilities of our development platform and the Ensemble devices, and are looking forward to seeing the kinds of products they will be able to create.”