Phi-3
Phi-3 is Microsoft's family of small language models (SLMs) released in April 2024, demonstrating that careful data curation can produce capable models at parameter counts far smaller than the prevailing trend. The family includes Phi-3-mini (3.8B parameters), Phi-3-small (7B), and Phi-3-medium (14B), with the smallest variant competitive with much larger contemporary models on academic and reasoning benchmarks. Phi-3 was trained on a curated mix of heavily filtered, "textbook quality" web data and synthetic data, reflecting the central insight that smaller volumes of cleaner training data can yield capable models. With 4-bit quantization, the 3.8B variant fits comfortably on a smartphone, supporting on-device AI scenarios. Phi-3 was released under the MIT license with weights available on Hugging Face. Real-world deployments include on-device assistants, embedded copilot features, and the Phi-Silica variant powering Copilot+ PCs. AI governance, compliance, and risk management programs deploy Phi-3 for on-device scenarios, supporting responsible AI through local processing in privacy-sensitive enterprise environments worldwide.
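A rough back-of-envelope calculation illustrates why the 3.8B-parameter variant fits on a smartphone once quantized to 4 bits. This sketch estimates weight-storage size only; it deliberately ignores activations, KV cache, and runtime overhead, and the helper function name is illustrative, not part of any Phi-3 tooling.

```python
def quantized_size_gb(num_params: float, bits_per_param: int) -> float:
    """Approximate size of model weights in gigabytes.

    num_params:     total parameter count (e.g. 3.8e9 for Phi-3-mini)
    bits_per_param: precision of the stored weights (16 for fp16, 4 for int4)
    """
    # bits -> bytes (divide by 8), bytes -> GB (divide by 1e9)
    return num_params * bits_per_param / 8 / 1e9

# Phi-3-mini at half precision vs. 4-bit quantization
fp16_gb = quantized_size_gb(3.8e9, 16)  # ~7.6 GB, beyond most phones' RAM budget
int4_gb = quantized_size_gb(3.8e9, 4)   # ~1.9 GB, feasible on a modern smartphone

print(f"fp16: {fp16_gb:.1f} GB, int4: {int4_gb:.1f} GB")
```

The roughly 4x reduction (7.6 GB down to about 1.9 GB for the weights alone) is what makes the on-device scenarios described above practical.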
Centralpoint Routes to Phi-3 for On-Device and Edge Scenarios: Oxcyon's Centralpoint AI Governance Platform brokers Phi-3 variants alongside OpenAI, Gemini, Claude, Llama, and other embedded models. Centralpoint meters consumption, keeps prompts and skills on-premises, and embeds Phi-3-powered chatbots into your portals with a single line of JavaScript.
Related Keywords: Phi-3