Overview

Neural networks (NNs) are enabling an explosion of technological progress across industries, and neural network accelerators (NNAs) are emerging as a fundamental class of processors, likely to become as significant as CPUs and GPUs. Potential applications for NNAs are innumerable. The new PowerVR Series2NX Neural Network Accelerator (NNA) delivers high-performance neural network computation at very low power consumption and in minimal silicon area.

Benefits

PowerVR 2NX is a completely new architecture designed from the ground up to provide:
  • The industry's highest inferences/mW IP cores, delivering the lowest power consumption
  • The industry's highest inferences/mm² IP cores, enabling the most cost-effective solutions
  • The industry's lowest-bandwidth solution, with support for fully flexible bit depths for weights and data, including low-bandwidth modes down to 4-bit
  • Industry-leading performance of 2048 MACs/cycle in a single core, scaling higher in multi-core configurations (see the throughput sketch after this list)
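To make the per-cycle figure concrete, the short Python sketch below converts 2048 MACs/cycle into raw throughput. The clock frequency and core count used here are illustrative assumptions only, not figures published for the 2NX.

  # Back-of-the-envelope throughput for a 2048 MACs/cycle NNA core.
  # ASSUMED_CLOCK_HZ and CORES are placeholders; real configurations differ.
  MACS_PER_CYCLE = 2048        # single-core figure quoted above
  ASSUMED_CLOCK_HZ = 800e6     # hypothetical 800 MHz clock (assumption)
  CORES = 1                    # multi-core configurations scale this up

  macs_per_second = MACS_PER_CYCLE * ASSUMED_CLOCK_HZ * CORES
  ops_per_second = 2 * macs_per_second           # 1 MAC = 1 multiply + 1 add

  print(f"{macs_per_second / 1e12:.2f} TMAC/s")  # -> 1.64 TMAC/s
  print(f"{ops_per_second / 1e12:.2f} TOPS")     # -> 3.28 TOPS

At the assumed clock this works out to roughly 1.6 TMAC/s (about 3.3 TOPS) per core; adding cores scales the figure accordingly.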

Applications

The PowerVR 2NX NNA is designed to power inference engines across a range of markets, and its highly scalable architecture positions it to address future solutions in many others.

Companies building SoCs for mobile, surveillance, automotive and consumer systems can integrate the PowerVR Series2NX NNA for high-performance neural network computation at very low power consumption and in minimal silicon area.

Potential applications for NNAs are innumerable, but include: photography enhancement and predictive text enhancement in mobile devices; feature detection and eye tracking in AR/VR headsets; pedestrian detection and driver alertness monitoring in automotive safety systems; facial recognition and crowd behavior analysis in smart surveillance; online fraud detection, content recommendation, and predictive UX; speech recognition and response in virtual assistants; and collision avoidance and subject tracking in drones.

Features

  • 2x the performance and half the bandwidth of the nearest competitor
  • First dedicated hardware solution with flexible bit-depth support from 16-bit down to 4-bit (see the bandwidth sketch after this list)
  • Lowest bandwidth Neural Network (NN) solution
  • Architected to support multiple operating systems, including Linux and Android
  • Includes hardware IP, software and tools to provide a complete neural network solution for SoCs
  • Efficiently runs all common neural network computational layers
  • Depending on the computation requirements of the inference tasks, it can be used standalone – with no additional hardware required – or in combination with other processors such as CPUs and GPUs
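As a rough illustration of why flexible bit depths cut bandwidth, the Python sketch below estimates raw weight traffic at 16-, 8- and 4-bit precision. The weight count is a hypothetical example network, not a figure from this brief.

  # Rough weight-traffic estimate at different bit depths.
  ASSUMED_WEIGHT_COUNT = 25_000_000   # e.g. a mid-sized CNN (assumption)

  def weight_megabytes(num_weights: int, bits_per_weight: int) -> float:
      """Raw weight storage/traffic in MB at a given bit depth."""
      return num_weights * bits_per_weight / 8 / 1e6

  for bits in (16, 8, 4):
      mb = weight_megabytes(ASSUMED_WEIGHT_COUNT, bits)
      print(f"{bits:>2}-bit weights: {mb:6.1f} MB per full pass")

Dropping from 16-bit to 4-bit weights roughly quarters the weight traffic (50 MB down to 12.5 MB in this example), before any compression or tiling the hardware may apply on top.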
