Smarter, Faster, Closer: The Power of Edge AI Workloads

  • Joe Abajian

Artificial intelligence workloads are the engines driving innovation, transforming operational data into fuel for the solutions powering the connected world. At the edge, AI workloads have evolved to make the most of limited computing power through sophisticated new development tools and software resources from a wide range of organizations. AI development solution providers such as Edge Impulse and MathWorks continue to advance model optimization techniques, bringing innovative intelligence to edge systems. Computer vision models and generative neural network inference are the focal points of edge development today, but compute-intensive edge AI workloads will diversify greatly in the near future.

In many cases, enterprises using edge computing still send data to models in the cloud or datacenter, which is often slower and more expensive than on-device processing. Third-party development solution providers bear the burden of bringing large models to small devices, a daunting engineering challenge that is not without reward. Edge, IoT, and embedded engineers currently working on projects with AI workloads expect a significant increase in LLM usage, image generation, and natural language processing. This expected growth presents an opportunity for solution providers to capture additional value by targeting these workloads, while also putting pressure on engineering organizations to expand their edge AI strategies.

[Figure: Types of Artificial Intelligence Workloads Used by Edge, IoT, and Embedded Engineers in Current Projects and Expected in Three Years]

Due to their size and resource demands, engineering organizations have not yet widely deployed LLMs and image generators at the edge. Innovative model optimization techniques, smaller models (MiniGPT-4, TinyLlama, etc.), and hardware improvements, however, are making LLMs and image generation practical edge AI workloads. In fact, both workloads show the largest expected increases in usage in VDC's Voice of the IoT Engineer survey. Over the next few years, engineers expect computer vision and tabular data to remain the most commonly deployed workloads, reaffirming the continued need for tool providers to support the latest vision and analytics models while also supporting emerging edge AI workload types.
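The optimization techniques referenced above largely come down to shrinking model weights so they fit edge memory and compute budgets. As a rough illustration only, not any vendor's actual pipeline, the sketch below shows symmetric 8-bit post-training quantization: mapping 32-bit float weights to int8 integers plus a single scale factor, which cuts weight storage roughly 4x at the cost of a small rounding error.

```python
def quantize_int8(weights):
    """Symmetric per-tensor quantization of float weights to int8 range.

    Returns (quantized, scale) such that each original weight is
    approximately q * scale for the corresponding quantized value q.
    """
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127 if max_abs else 1.0
    # Round each weight to the nearest int8 step, clamped to [-127, 127].
    quantized = [max(-127, min(127, round(w / scale))) for w in weights]
    return quantized, scale

def dequantize(quantized, scale):
    """Recover approximate float weights from int8 values and the scale."""
    return [q * scale for q in quantized]

# Example: a tiny hypothetical weight tensor.
weights = [0.41, -1.27, 0.03, 0.89]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
# Each restored weight lands within one quantization step of the original,
# while per-weight storage drops from 4 bytes (float32) to 1 byte (int8).
```

Production toolchains add refinements such as per-channel scales, calibration data, and quantization-aware training, but the storage-versus-precision trade-off they manage is the one shown here.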

As enterprises bring more advanced LLMs to the edge for their versatile capabilities and ease of use, LLM use cases and end applications will diversify. Currently, LLMs are used primarily for natural language processing in control systems, predictive maintenance, interpreting unstructured data (traffic patterns, environmental conditions, safety protocols, regulations), and data augmentation. NXP, Qualcomm, and other hardware providers with development solutions have focused specifically on helping their customers derive value from custom LLM applications through model-optimized hardware and software platforms. Given the significant investment in LLM technology from open source foundations and private entities alike, LLM capabilities and use cases will evolve faster than those of other AI model types, making LLMs a must-watch workload for edge AI development solution providers.

For more information on edge AI development, see VDC’s recently published report, Edge AI Development Solutions: Navigating the Edge of Innovation.

