Edge AI

Self-contained Edge NPUs for everything from Embedded ML to Generative AI

Harnessing neural networks to accelerate inference at the edge

Artificial Intelligence (AI) is changing our world, and Edge AI brings its power closer to the point of action. By reducing latency and enhancing privacy, Edge AI enables advanced AI capabilities on even the most resource-constrained devices. Whether it is real-time decision-making in autonomous vehicles, immediate health monitoring via wearable medical devices, or predictive maintenance in industrial machinery, Edge AI is ushering in a new era of smart devices.

Ceva offers self-contained Edge Neural Processing Units (NPUs) that operate independently of a host CPU. They’re designed to handle a broad spectrum of AI applications, from the ultra-low-power, always-on requirements of Embedded ML to the high computational demands of Generative AI. Ceva’s scalable NPU family supports AI processing capabilities ranging from tens of GOPS (giga operations per second) to hundreds of TOPS (tera operations per second).
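To make the range from tens of GOPS to hundreds of TOPS concrete, the back-of-envelope sketch below estimates the inference rate a given compute budget can sustain. The workload sizes and the 70% utilization figure are illustrative assumptions for this sketch, not Ceva benchmarks or product specifications.

# Back-of-envelope sizing: how many inferences per second a given NPU
# compute budget supports. Figures below are hypothetical examples.

GIGA = 1e9   # 1 GOPS = 1e9 operations per second
TERA = 1e12  # 1 TOPS = 1e12 operations per second

def inferences_per_second(ops_per_inference: float,
                          budget_ops_per_second: float,
                          utilization: float = 0.7) -> float:
    """Estimate the sustained inference rate for a model on a given NPU budget.

    ops_per_inference     -- operations required for one inference of the model
    budget_ops_per_second -- peak NPU throughput (e.g. 50 GOPS or 200 TOPS)
    utilization           -- assumed fraction of peak throughput actually sustained
    """
    return budget_ops_per_second * utilization / ops_per_inference

# A small always-on Embedded ML model (~10 M ops/inference) on a tens-of-GOPS budget:
print(f"{inferences_per_second(10e6, 50 * GIGA):,.0f} inferences/s")

# A heavy vision or generative workload (~100 G ops/inference) on a hundreds-of-TOPS budget:
print(f"{inferences_per_second(100e9, 200 * TERA):,.0f} inferences/s")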

TechInsights report

Ceva NPU Core Targets TinyML Workloads

Ceva’s NeuPro-Nano licensable neural processing unit (NPU) targets processors that run TinyML workloads, offering up to 200 GOPS (billion operations per second) for power-constrained edge IoT devices.


End devices powered by Edge AI technology

REOLINK Duo 3

70mai Omni Dash Cam

Jinpei 4K Mini Action Camera

Nikon Z50II

FUJI XT-5

FUJI XM-5