Microsoft’s New Phi Silica AI Model Enhances Windows Copilot Library

With 3.3 billion parameters, Phi Silica is the smallest of the Phi-3 AI family and integrates directly with the Windows Copilot Library.

As we reported yesterday, Microsoft introduced Phi-3-Vision, a new addition to its Phi-3 family of small language models, at the Build 2024 developer conference. This multimodal model aims to enhance the company's suite of AI tools, providing developers with more robust capabilities. In a separate announcement, Microsoft CEO Satya Nadella revealed Phi Silica, another member of the Phi small language model family.

With 3.3 billion parameters, Phi Silica is the smallest model in the family, designed for optimal performance on the neural processing units (NPUs) in Copilot+ PCs. The model runs inference locally and reduces first-token latency, and developers will get an API for building Phi Silica-powered experiences across the Windows ecosystem.

Windows Copilot Library Expansion

The Phi Silica model is integrated into Microsoft's new Windows Copilot Library, a comprehensive set of APIs powered by over 40 on-device models included with Windows. The library aims to enhance developers' ability to leverage AI within the Windows environment. Available APIs in the Windows App SDK release in June will include functionalities like Studio Effects, Live Captions Translations, Phi Silica, Optical Character Recognition (OCR), and Recall User Activity. Additional APIs, such as Text Summarization, Vector Embeddings, and Retrieval-Augmented Generation (RAG), are scheduled for future release.

Copilot+ PCs and Future Plans

Copilot+ PCs, the brand name for Windows PCs equipped with specialized neural network chips for running AI applications such as Copilot and Phi Silica, are set to begin shipping in mid-June. These devices will feature Qualcomm's Arm-based Snapdragon X Elite and Plus chips. Microsoft, along with several major PC manufacturers, plans to introduce these laptops this summer. Intel is also preparing its own processor for Copilot+ PCs, code-named Lunar Lake, expected in the third quarter of 2024.

Initially, Microsoft focused on large language models (LLMs) that operate in the cloud. However, in April, the company introduced Phi-3-mini, a smaller model designed to run locally on PCs. Phi Silica is a derivative of Phi-3-mini, specifically tailored for Copilot+ PCs. This shift towards small language models (SLMs) addresses a key limitation of LLMs, which require more memory and storage than typical PCs can provide. At 3.3 billion parameters, SLMs like Phi Silica aim to balance accuracy and speed at a size small enough to run on-device.

Local AI Processing for Privacy

Most AI interactions occur in the cloud, with services like Microsoft's existing Copilot communicating with remote servers. However, the upcoming Recall PC search feature will run locally on NPUs for privacy reasons, and will not index incognito or private browsing activity. This local processing capability is a key feature of the new AI models, providing enhanced privacy and performance.

Microsoft has highlighted that Windows is the first platform to feature a small language model custom-built for NPUs and included out-of-the-box. It remains unclear if Phi Silica will be pre-installed on Copilot+ PCs as a local version of Copilot, a question Microsoft has until June 23 to address. This development underscores Microsoft's commitment to integrating advanced AI capabilities directly into its operating system, offering users enhanced functionality and performance.

Source: Microsoft
Luke Jones
Luke has been writing about Microsoft and the wider tech industry for over 10 years. With a degree in creative and professional writing, Luke looks for the interesting spin when covering AI, Windows, Xbox, and more.