Tag
on-device AI
On-device AI runs inference locally on phones, laptops, wearables, or edge devices instead of sending data to the cloud. It matters for latency, privacy, offline use, and hardware design; small language models, NPUs, and assistants like Siri are driving the shift.
2 articles

Industry News/Apr 8
Apple at 50: How it can still win in AI
Apple is betting on on-device AI and a Gemini-powered Siri reboot after years of delay. Former insiders think that may still be enough.

Model Releases/Mar 26
Small Language Models: Llama 3.2 and Phi-3 Transform AI
Llama 3.2 and Phi-3 are shifting the focus from cloud-heavy AI toward efficient, privacy-focused on-device applications.