Embedded artificial intelligence: How to increase security for industrial systems?
The European InSecTT project focuses on the security of embedded AI. As a partner in this three-year project, CEA-Leti developed innovative solutions to authenticate intelligent systems and protect them from various cyberattacks. These advances coincide with the promulgation of the European AI Act.
www.leti-cea.com, May. 23, 2024 –
The EU InSecTT project wrapped up in the Fall of 2023, only a few months before the promulgation of the European AI Act. This timing is particularly significant given the notable parallels between the research conducted by CEA-Leti researchers and the issues addressed in this new regulation.
A lack of regulation to evaluate the reliability of AI
The AI Act takes note of the prodigious rise of artificial intelligence, from generative AI such as ChatGPT to embedded AI (industrial robots, production equipment, automotive, home automation, smart cities...). It lays down rules to ensure that this wave of innovation goes hand-in-hand with transparency, data governance and respect for fundamental rights.
As a result, AI could be banned in applications where the risks are considered unacceptable, such as remote biometric identification in public spaces. The text also paves the way for future certification frameworks for the safety and security of AI systems.
However, there are currently no protocols, standards or norms for assessing the reliability of an embedded AI system.
The recent work by CEA-Leti researchers has begun to fill this gap.
Important scientific results
The researchers delivered on their objectives, producing three types of advances and authoring 11 scientific publications, one of which won a Best Paper Award.
Their studies first demonstrated the technical feasibility of authenticating embedded AI systems. For instance, if a chain of industrial robots with embedded AI ceases to function, the chain manager needs to be able to verify that the robots have not been hacked and that their data and models are correct.
CEA-Leti successfully overcame this challenge thanks to a technological platform it developed in collaboration with IRT Nanoelec. Known as HistoTrust, this platform integrates blockchain technology and secure hardware modules to guarantee the authenticity of AI systems as close to the robots as possible.
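The article does not detail HistoTrust's internals, but the general pattern it describes, hardware-backed attestation of an AI system's model and data anchored in a blockchain, can be sketched as follows. This is a hypothetical illustration, not the HistoTrust API: an HMAC with a device-held key stands in for a secure hardware module's signature, and a simple hash-chained log stands in for the blockchain.

```python
import hashlib
import hmac
import json

def digest(blob: bytes) -> str:
    """SHA-256 fingerprint of a model or dataset blob."""
    return hashlib.sha256(blob).hexdigest()

def attest(device_key: bytes, model: bytes, data: bytes) -> dict:
    """Build a signed attestation of the robot's AI artifacts.
    The HMAC stands in for a signature produced inside secure hardware."""
    record = {"model": digest(model), "data": digest(data)}
    payload = json.dumps(record, sort_keys=True).encode()
    record["sig"] = hmac.new(device_key, payload, hashlib.sha256).hexdigest()
    return record

class Ledger:
    """Minimal append-only, hash-chained log standing in for the blockchain."""
    def __init__(self):
        self.blocks = []

    def append(self, record: dict) -> None:
        prev = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        body = json.dumps(record, sort_keys=True) + prev
        self.blocks.append({"record": record, "prev": prev,
                            "hash": hashlib.sha256(body.encode()).hexdigest()})

    def verify_chain(self) -> bool:
        """Detect any tampering with previously recorded attestations."""
        prev = "0" * 64
        for block in self.blocks:
            body = json.dumps(block["record"], sort_keys=True) + prev
            if block["prev"] != prev or \
               block["hash"] != hashlib.sha256(body.encode()).hexdigest():
                return False
            prev = block["hash"]
        return True

def check_robot(device_key: bytes, model: bytes, data: bytes,
                record: dict) -> bool:
    """Operator-side check: do the deployed artifacts match the attestation?"""
    expected = attest(device_key, model, data)
    return hmac.compare_digest(expected["sig"], record["sig"])
```

In this sketch, the chain manager from the example above would call `check_robot` after an incident: a mismatch means the model or data on the robot no longer matches what was attested on the ledger.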