
Where Is The Edge AI Market And Ecosystem Headed?

The different forms of inference at the edge and the outlook for the accelerator ecosystem.

semiengineering.com, Dec. 05, 2019 – 

Until recently, most AI workloads ran in datacenters, and most of that compute was training. Things are changing quickly: projections indicate AI sales will grow to tens of billions of dollars by the mid-2020s, with most of the growth coming from edge AI inference.

Edge inference applications
Where is the Edge Inference market today? Let's look at the markets from highest throughput to lowest.

Edge Servers
Nvidia recently announced that its inference sales outstripped training sales for the first time. Much of this was likely shipped to datacenters, but there are also many applications outside of datacenters, generally referred to as "the edge." This suggests that sales of PCIe inference boards for edge inference applications are likely in the hundreds of millions of dollars per year and growing rapidly.

There is a wide range of applications: surveillance, facial recognition, retail analytics, genomics/gene sequencing, and more. Because training is done in floating point and quantization requires significant skill and investment, most edge server inference is likely done in 16-bit floating point, with only the highest-volume applications using INT8. PCIe inference boards range from 75W (Nvidia Tesla T4) to 200W (Habana Goya).
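To illustrate why INT8 deployment takes extra effort beyond floating point, the core quantization step can be sketched as below. This is a minimal illustration of symmetric per-tensor INT8 quantization (the function names and the use of NumPy are this sketch's assumptions, not anything from the article); production toolchains add calibration, per-channel scales, and accuracy-recovery steps on top of this.

```python
import numpy as np

def quantize_int8(x):
    """Symmetric per-tensor INT8 quantization (illustrative sketch).

    Maps FP32 values onto the signed range [-127, 127] using a single
    scale factor derived from the tensor's maximum absolute value.
    """
    scale = np.max(np.abs(x)) / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize_int8(q, scale):
    """Recover an FP32 approximation from the INT8 tensor and its scale."""
    return q.astype(np.float32) * scale

# Example: quantize simulated FP32 weights and check the rounding error.
rng = np.random.default_rng(0)
weights = rng.standard_normal(1000).astype(np.float32)
q, scale = quantize_int8(weights)
error = np.max(np.abs(dequantize_int8(q, scale) - weights))
```

The maximum reconstruction error is bounded by half a quantization step (scale / 2); the "skill and investment" the article refers to lies in choosing scales (often per channel, from calibration data) so that this error does not accumulate into an accuracy loss across a full network.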
