Embedded artificial intelligence: How to increase security for industrial systems?
The European InSecTT project focuses on the security of embedded AI. As a partner in this three-year project, CEA-Leti developed innovative solutions to authenticate intelligent systems and protect them from various cyberattacks. These advances coincide with the promulgation of the European AI Act.
www.leti-cea.com, May 23, 2024 –
The EU InSecTT project wrapped up in the Fall of 2023, only a few months before the promulgation of the European AI Act. This timing is particularly significant given the notable parallels between the research conducted by CEA-Leti researchers and the issues addressed in this new regulation.
A lack of regulation to evaluate the reliability of AI
The AI Act takes note of the prodigious rise of artificial intelligence, from generative AI such as ChatGPT to embedded AI (industrial robots, production equipment, automotive, home automation, smart cities...). It lays down rules to ensure that this wave of innovation goes hand-in-hand with transparency, data governance and respect for fundamental rights.
As a result, AI could be banned in applications where the risks are considered unacceptable, such as remote biometric identification in public spaces. The text also paves the way for future certification frameworks for the safety and security of AI systems.
However, there are currently no established protocols or standards for assessing the reliability of an embedded AI system.
The recent work by CEA-Leti researchers has begun to fill this gap.
Important scientific results
The researchers delivered on this task: their work produced three types of advances and 11 scientific publications, one of which won a Best Paper Award.
Their studies first demonstrated the technical feasibility of authenticating embedded AI systems. For instance, if a production line of robots with embedded AI stops working, the line manager must be able to verify that the robots have not been hacked and that their data and models are intact.
CEA-Leti successfully overcame this challenge thanks to a technological platform it developed in collaboration with IRT Nanoelec. Known as HistoTrust, this platform integrates blockchain technology and secure hardware modules to guarantee the authenticity of AI systems as close to the robots as possible.
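The article does not detail HistoTrust's internals, but the general pattern it describes (a secure hardware module vouching for an AI model, with the attestation anchored in a blockchain) can be illustrated with a minimal sketch. Everything below is hypothetical: the device key, the function names, and the hash-chained ledger standing in for the blockchain are illustrative assumptions, not HistoTrust's actual design.

```python
import hashlib
import hmac
import json

# Illustrative sketch only -- not the HistoTrust implementation.
# A secure element holds a per-robot key and signs a digest of the
# embedded AI model; attestations are anchored in an append-only,
# hash-chained ledger (a stand-in for a blockchain).

DEVICE_KEY = b"per-robot secret provisioned in secure hardware"  # assumption


def attest_model(model_bytes: bytes) -> dict:
    """Produce a signed attestation of the embedded AI model."""
    digest = hashlib.sha256(model_bytes).hexdigest()
    tag = hmac.new(DEVICE_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return {"model_digest": digest, "tag": tag}


def verify_model(model_bytes: bytes, record: dict) -> bool:
    """Check that a deployed model matches a previously anchored attestation."""
    digest = hashlib.sha256(model_bytes).hexdigest()
    expected = hmac.new(DEVICE_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return digest == record["model_digest"] and hmac.compare_digest(
        expected, record["tag"]
    )


class Ledger:
    """Minimal append-only log: each entry commits to the previous head."""

    def __init__(self):
        self.entries = []
        self.head = "0" * 64  # genesis value

    def append(self, record: dict) -> str:
        payload = json.dumps(record, sort_keys=True)
        self.head = hashlib.sha256((self.head + payload).encode()).hexdigest()
        self.entries.append((self.head, record))
        return self.head
```

With this pattern, the line manager recomputes the model digest on a suspect robot and checks it against the attestation anchored in the ledger; any tampering with the model or its data changes the digest and the verification fails.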