
Challenges of AI at the edge

Spencer Acain - Siemens Digital Industries Software - VIP
Aug 10, 2022

Artificial intelligence is taking an ever-larger role in modern society, from controlling massive factories, to balancing complex product requirements through generative design, to helping guide wayward drivers to the nearest all-night burger joint. What all these use cases have in common is the need for considerable processing power to drive the AI's ability to "think". While factories and data centers can accommodate the high processing and power requirements of AI, edge devices struggle to incorporate the necessary hardware while remaining within their power envelope. Although solutions do exist today, not all of them suit every application, nor will all of them scale with the continued growth of AI across its many applications. With that in mind, it is worth considering how current solutions are applied, as well as what new technology will enable in the future.

One of the easiest ways to handle AI processing at the edge is not to handle it there at all. A common practice, especially on smartphones and other wearable tech, is to offload the computationally intensive tasks associated with AI – particularly demanding natural language processing – to remote data centers better suited to them. While this provides a quick and easy solution, it also has significant drawbacks. The first, and most obvious, is that the AI functionality is no longer truly at the edge; rather, it is beholden to the edge device's ability to maintain a stable connection to the remote server. The second major concern is privacy and data security. For a company, allowing potentially proprietary or mission-critical data to be handled by a remote server, or even a third party, may present an unacceptable security risk. A third issue is the time delay in sending data to a remote data center and awaiting its decision; in situations requiring real-time decisions, this delay can have significant, if not catastrophic, consequences. As AI becomes more integrated with professional software, it will be vital for data security that the edge devices running this software be capable of handling the necessary processing unaided by external servers.

While some of today's solutions struggle to meet the future demands of advanced AI, others are rising to the challenge. Since its inception, AI has largely run on general-purpose computing hardware, which limits its efficiency due to overhead functionality that AI does not need or use. However, this is starting to change. Established chip makers and startups alike are beginning to produce chips tailored to AI applications, realizing substantial efficiency gains without compromising on performance. In creating these next-generation AI-enabled edge chips, designers are relying on a wide range of advanced and emergent techniques, such as those enabled by Siemens' Catapult portfolio of High-Level Synthesis tools. This is especially true for natural language processing at the edge, a functionality that is being widely adopted but also requires significant processing power. No single method will solve every challenge faced by AI on edge devices. However, by combining multiple techniques as the application demands – adding a DSP to help handle audio processing, creating new analog chips that perform faster and more efficient matrix calculations, or developing systems that leverage Processing in Memory (PIM) technology – executing complex AI tasks at the edge while maintaining reasonable power requirements is well within reach.
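One efficiency technique common to many of these specialized edge chips (though not named explicitly above) is low-precision integer arithmetic: neural-network weights and activations are quantized to 8-bit integers so the hardware can replace costly floating-point matrix math with cheap integer operations. The sketch below illustrates the idea in plain Python; the scale values and vectors are illustrative, not drawn from any real device.

```python
# Minimal sketch of 8-bit quantized arithmetic, the kind of operation
# edge AI accelerators implement in hardware. All values are illustrative.

def quantize(values, scale):
    """Map floats to the int8 range [-128, 127] using a shared scale."""
    return [max(-128, min(127, round(v / scale))) for v in values]

def dot_int8(a_q, b_q, scale_a, scale_b):
    """Dot product done entirely in integer math, rescaled once at the end."""
    acc = sum(x * y for x, y in zip(a_q, b_q))  # integer accumulate
    return acc * scale_a * scale_b

a = [0.5, -1.2, 0.75]
b = [1.0, 0.25, -0.5]
scale = 0.01

a_q = quantize(a, scale)   # [50, -120, 75]
b_q = quantize(b, scale)   # [100, 25, -50]

approx = dot_int8(a_q, b_q, scale, scale)
exact = sum(x * y for x, y in zip(a, b))
print(approx, exact)  # both -0.175 here, since the scale loses nothing
```

In real deployments the quantization introduces a small accuracy loss in exchange for large savings in silicon area, memory bandwidth, and power, which is precisely the trade-off edge devices need.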

Factory devices are also getting smarter, with process equipment producing massive data sets useful in large-scale optimization, predictive maintenance, and many other areas. However, in a factory with hundreds or thousands of smart devices, collecting all this data and sending it to a data center for analysis presents a daunting task for network infrastructure. Now, thanks to more efficient AI chips, rather than each device simply collecting data and passing it on, the device itself has sufficient AI capability to process the data and forward only the pertinent results for further analysis – without incurring substantial power overhead. This creates a distributed network of smart devices capable of independently analyzing their own data rather than relying exclusively on a central server to collate data and make decisions, which in turn allows a more flexible and robust smart factory to be developed. A good example is vision processing integrated directly into cameras used for quality control (QC) checks. With built-in AI, the camera can internally decide whether a part passes and send data back to a central server only when a non-conforming part is detected. Not only is the amount of transmitted data drastically reduced, but the central server can build a database of observed failure modes to help further refine the production process.
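The camera example above can be sketched as a simple filter-at-the-edge pattern: run inference locally, and generate upstream traffic only for failures. In this hedged sketch, `run_inference` is a hypothetical stand-in for a real on-device vision model, and the report dictionary stands in for whatever protocol the central server actually uses.

```python
# Sketch of edge-side QC filtering: only non-conforming parts are reported.
# `run_inference` and the report format are hypothetical placeholders.

def run_inference(image):
    """Stand-in for an on-device defect-detection model.

    Takes a flat list of pixel intensities and returns a defect score
    in [0, 1]; higher means more likely defective.
    """
    dark = sum(1 for px in image if px < 50)  # toy heuristic, not a real model
    return dark / len(image)

def inspect_part(image, threshold=0.2):
    """Decide pass/fail locally; return a report only for failures."""
    score = run_inference(image)
    if score <= threshold:
        return None  # part passes: nothing is sent upstream
    return {"status": "fail", "score": round(score, 3), "image": image}

# Only the failing part produces network traffic to the central server.
good_part = [200] * 100
bad_part = [20] * 40 + [200] * 60
reports = [r for r in (inspect_part(p) for p in (good_part, bad_part)) if r]
```

The design choice here mirrors the article's point: the passing case, which is the overwhelming majority in a healthy production line, costs zero bandwidth, while every failure report enriches the central failure-mode database.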

AI might be a relative newcomer to the world of manufacturing and design, but it's here to stay. With that in mind, developing new technology and methods to enable AI on devices of every performance level will be key to driving widespread adoption across every industry and sector, while simultaneously allowing smarter, faster algorithms to be deployed. This positive feedback loop of growth and innovation will push not just AI but industry as a whole to new heights, helping drive technology toward a smarter future.
