
HBM2e Top Contender for AI Applications

HBM2e and GDDR6 DRAMs are vying for AI applications, and HBM2e has a competitive edge. But you be the judge.

eetasia.com, Jun. 05, 2019 – 

Do you know which memory is best for your AI applications? These days, among artificial intelligence (AI) system designers, that's not the most frequently asked question. Instead, the most asked question deals with AI accelerators.

Which one is best for my particular AI application? Is it capable of learning new things? What levels of complex calculations can it handle? And so on, and so forth.

But as a system designer entering this new AI frontier, do you know which memory is well suited to accompany your selected AI accelerator?

Keep in mind that an AI memory gap exists to some degree: today's and next-generation AI accelerators are so fast that traditional memory technology lags behind them in AI applications.

The issues between AI accelerators and memory include bandwidth, access time, and memory energy consumption on AI processor chips. In both training and inference applications, convolutional neural networks (CNNs) and deep neural networks (DNNs) pair a processing engine with memory.
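To put the bandwidth issue in concrete terms, here is a minimal sketch comparing the peak bandwidth of one HBM2e stack against one GDDR6 device. The per-pin data rates and interface widths below are assumptions drawn from commonly quoted JEDEC-era figures (roughly 3.2 Gb/s per pin over a 1024-bit HBM2e interface, 16 Gb/s per pin over a 32-bit GDDR6 interface), not numbers taken from the article itself.

```python
# Illustrative peak-bandwidth comparison. The spec figures are assumptions
# (typical publicly quoted values), not data from the article.

def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: per-pin data rate (Gb/s) x bus width, over 8 bits/byte."""
    return data_rate_gbps * bus_width_bits / 8

# One HBM2e stack: ~3.2 Gb/s per pin across a very wide 1024-bit interface.
hbm2e_stack = peak_bandwidth_gbs(3.2, 1024)   # ~409.6 GB/s per stack

# One GDDR6 device: ~16 Gb/s per pin across a narrow 32-bit interface.
gddr6_device = peak_bandwidth_gbs(16.0, 32)   # ~64 GB/s per device

print(f"HBM2e stack : {hbm2e_stack:.1f} GB/s")
print(f"GDDR6 device: {gddr6_device:.1f} GB/s")
```

The arithmetic illustrates why HBM2e's edge comes from interface width rather than raw pin speed: a GDDR6 pin is faster, but an HBM2e stack drives 32 times as many data pins, so a single stack can match several GDDR6 devices' combined bandwidth.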




© 2020 Design And Reuse

All Rights Reserved.
