Tag

on-device AI

On-device AI runs inference locally on phones, laptops, wearables, and edge devices instead of sending data to the cloud. It matters for latency, privacy, offline use, and hardware design; small language models, NPUs, and assistants like Siri are driving the shift.

2 articles