
BOOK OF NEWS
May 19 - 22, 2025
Introduction
Foreword from Frank X. Shaw
Welcome to Microsoft Build, our annual event focused on the developer community, and to this year’s Book of News. Within this edition, you’ll learn about more than 50 announcements designed to make it easier for developers to work faster, think bigger and build at scale.
For 50 years Microsoft has been empowering developers with tools and platforms to turn their ideas into reality, accelerating innovation at every stage, and at Build we’re showing our vision for what’s next in software development and AI.
The Book of News is your starting point for our announcements, designed to provide an easy-to-navigate guide to our most current updates and deliver critical insights into the topics and developments that pique your interest. As always, your feedback is invaluable to us. Happy building. 😊
fxs
What is the Book of News?
The Microsoft Build 2025 Book of News is your guide to key news items that we are announcing at Microsoft Build. The interactive Table of Contents gives you the option to select the items you are interested in, and the translation capabilities make the Book of News more accessible globally (just click the Translate button below the Table of Contents to enable translations).
We pulled together a folder of imagery related to a few of the news items – please take a look at the imagery here. To watch keynotes and sessions related to news items, we have links below many of the news items to give you quick access to upcoming sessions and on-demand videos.
We hope the Book of News provides all the information, executive insight and context you need. If you have any questions or feedback regarding content in the Book of News, please email [email protected].
If you are interested in speaking with an industry analyst about news announcements at Microsoft Build or Microsoft’s broader strategy and product offerings, please contact [email protected].

1. AI at Work
1.1. Agents
1.1.1. Train Copilot to be a business expert with Copilot Tuning
With Microsoft 365 Copilot Tuning, customers can use their own company knowledge to train models that perform domain-specific tasks with increased speed and accuracy. Copilot Tuning ensures that access to a tuned model is restricted to permissions of the underlying training data within an enterprise.
Agents built with Microsoft 365 Copilot’s Agent Builder can take advantage of these tuned models. For example, a legal firm can build an agent that generates documents for legal professionals that incorporate the style, structure and expertise of their prior work. A consulting company working in a specialized industry, such as aviation, may build a Q&A agent using a model tuned on internal documents dealing with regional or international regulatory requirements to help their consultants craft accurate answers to domain-specific questions.
Copilot Tuning will be rolling out in June as part of the Copilot Tuning Program, for customers with 5,000 or more Microsoft 365 Copilot licenses.
Additional resources:
1.1.2. Advanced development tools enable creation of smarter, more connected agents for Teams
Advanced development tools for Microsoft Teams will include support for the Agent2Agent (A2A) protocol, an updated Teams AI library and agentic memory. Together, these capabilities will enable the creation of smarter, collaborative agents that enhance productivity and enterprise efficiency while supporting more secure, compliant development environments. The following features will be in preview in May:
- A2A protocol: Developers will be able to build agents for Teams using the A2A protocol, which enables more secure, peer-to-peer communication between agents. These agents will exchange messages, data and credentials without relying on centralized intermediaries, so they can tackle complex enterprise tasks more effectively and efficiently (a minimal discovery sketch follows this list).
- Updated Teams AI library: This will enable developers to build more powerful collaborative agents for Teams. With the enhanced version in preview for JavaScript and C#, the library will simplify the creation of custom agents and provide access to the latest capabilities.
- Agentic memory on Teams: Agents will be able to efficiently recall user interactions in Teams, fostering more personalized and context-aware experiences.
- Automated agent validation: Developers will be able to automate compliance checks to ensure their agents meet Store policies and are optimized for high-quality performance.
- Adoption and engagement insights: The Developer portal will provide analytics to help developers track adoption and engagement with real-time insights, providing key metrics to refine and optimize agent performance.
- Agents in Teams meetings: Developers will be able to build agents for both private and group use directly from the meeting surface. Additionally, developers will be able to build agents that can access meeting AI notes via the new Insights API.
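To make the A2A item above concrete: A2A is an open protocol in which each agent publishes a discovery document (an "agent card") that peer agents fetch before exchanging tasks. The sketch below serves a minimal agent card in Python; the field names follow the public A2A specification as it stood at announcement time, and the names, skills and URLs are illustrative assumptions rather than Microsoft sample code.

```python
# Hypothetical illustration: publishing an A2A "agent card" so peer agents can
# discover this agent's capabilities. Field names follow the public A2A spec at
# announcement time; names, skills and URLs are placeholders, not Microsoft code.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

AGENT_CARD = {
    "name": "Expense Approval Agent",                # illustrative agent
    "description": "Reviews expense reports filed in Teams.",
    "url": "https://agents.contoso.example/a2a",     # placeholder task endpoint
    "version": "1.0.0",
    "capabilities": {"streaming": False},
    "skills": [
        {
            "id": "approve-expense",
            "name": "Approve expense report",
            "description": "Checks an expense report against policy.",
        }
    ],
}

class AgentCardHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Peer agents fetch the card from the well-known discovery path.
        if self.path == "/.well-known/agent.json":
            body = json.dumps(AGENT_CARD).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), AgentCardHandler).serve_forever()
```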
Additional resources:
1.2. Microsoft Copilot Studio
1.2.1. New pro-code features help developers create powerful agents
Microsoft Copilot Studio and Microsoft 365 Copilot are expanding with new capabilities that make it easier than ever for makers to use pro-code experiences to build agents. These updates provide makers with the flexibility and advanced tools needed to create more powerful, secure agents that can scale across diverse channels and include:
- Microsoft 365 Agents Toolkit for Visual Studio streamlines the development of enterprise-grade agents by integrating AI tools like Microsoft 365 Agents SDK and Azure AI Foundry, along with easy project scaffolding, testing and publishing in Visual Studio. TypeSpec for Copilot is also now natively integrated in the toolkit. This is generally available.
- Microsoft 365 Agents SDK enables building enterprise-grade, scalable, multichannel agents and offers flexibility for advanced needs, customization with Azure AI Foundry, easy integration with Copilot Studio and Visual Studio and the capability to publish across multiple platforms with more than 10 messaging channels. This is generally available.
- Microsoft 365 Copilot APIs are a suite of enterprise-ready APIs that enable developers to build fast, context-aware, secure and generative AI (GenAI) experiences with Microsoft 365 data – powered by retrieval, chat and compliance capabilities. The Retrieval API is in preview (a hedged call sketch follows this list).
- Agent Store features agents from Microsoft, partners and customer organizations. It offers immersive and personalized discovery and engagement across Microsoft 365 Copilot endpoints, including Copilot Chat, recommending agents based on usage signals. Users can quickly find and utilize agents, conduct searches and share with colleagues to boost collaboration and productivity, ultimately getting more done. This is generally available.
- Enhanced Microsoft Power Platform connector SDK will allow developers to build enhanced Power Platform connectors faster than before. These enhanced connectors expose enterprise data sources in a structured format that is easy for agents to read, understand and reason over. This is in preview.
- Bring Your Own Models (BYOM) from Azure AI Foundry will enable makers to use the more than 1,900 models from Azure AI Foundry Models in their agents built with Copilot Studio, including industry-specific and fine-tuned models, whether for scenario-specific prompts or for summarization. This is in preview.
- Microsoft Dynamics 365 data in Microsoft 365 Copilot will enable Microsoft 365 Copilot users to find Dynamics 365 CRM insights across sales, service, supply chain and marketing to drive their business. Initially, this will be for a scoped set of Dynamics 365 entities, including Contact, Opportunity, Lead and Account. This is in private preview.
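As a rough illustration of the Retrieval API mentioned above, the sketch below issues a grounding query against tenant content through Microsoft Graph. The endpoint path, request fields and response fields are assumptions based on preview documentation conventions and may differ from the published reference; the token and query are placeholders.

```python
# Hedged sketch of a Microsoft 365 Copilot Retrieval API call (preview).
# The Graph path, request fields and response fields shown are assumptions
# drawn from preview documentation conventions; verify against the reference.
import requests

ACCESS_TOKEN = "<token acquired via MSAL with the Copilot retrieval scope>"  # placeholder

payload = {
    "queryString": "What is our travel reimbursement policy?",
    "dataSource": "sharePoint",          # assumed: ground on tenant SharePoint content
    "maximumNumberOfResults": 10,
}

resp = requests.post(
    "https://graph.microsoft.com/beta/copilot/retrieval",  # assumed preview path
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()

for hit in resp.json().get("retrievalHits", []):            # assumed response field
    print(hit.get("webUrl"))
```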
Additional resources:
- Blog: Multi-agent orchestration, maker controls, and more: Microsoft Copilot Studio announcements at Microsoft Build 2025
- Download: Visual assets
- Keynote: Microsoft Build opening keynote
- Breakout: Building agents for Microsoft 365 Copilot
- Breakout: Create agents for Microsoft 365 Copilot with Microsoft 365 Agents SDK
- Breakout: Exploring the Agent landscape
- Breakout: Architecting your multi agent solutions with Copilot Studio and Microsoft 365 Agents SDK
- Breakout: Secure and govern your enterprise-scale agents built with Microsoft Copilot Studio
- Breakout: Lessons: Deploying Copilot Studio in enterprise software environments
- Breakout: Build Microsoft Teams collaborative agents as virtual colleagues with Visual Studio Code
- Breakout: Introducing the Agent Store: Build agents and publish them in the Agent Store for Microsoft 365 Copilot
1.2.2. New features help makers build better agents with natural language
Microsoft Copilot Studio and Microsoft 365 Copilot are expanding with new capabilities that give low-code makers more tools in their toolbox to help them build more capable agents within Copilot Studio. These new tools include:
- Support for multiple agent systems will allow multiple agents (agents built using Copilot Studio, Azure AI Foundry Agent Service or the Microsoft 365 Agents SDK) to work together as a team, combining specialized skills to distribute work and deliver more comprehensive answers and experiences. This is in preview. Additionally, native support for open standards, including Agent2Agent (A2A) in Copilot Studio and Azure AI Foundry, is in preview. With these capabilities, agents built on Microsoft platforms will be able to discover peer agents, negotiate tasks and complete work together while honoring the identity, governance and safety controls that offer protection for Microsoft AI workloads.
- Model Context Protocol (MCP) support within Copilot Studio helps ensure agents have consistent, governed access to data and models across external systems (a minimal MCP server sketch follows this list). This is generally available. Additionally, new MCP servers for Microsoft Dataverse and Microsoft Dynamics 365 will enable agents to work with Dynamics 365 and Dataverse tools. These servers are in private preview.
- The computer use tool will give agents the ability to perform enterprise tasks on user interfaces across both desktop and web apps. With AI vision and understanding, users will be able to automate repetitive work using Computer Using Agent (CUA) technology for tasks such as data transfer, document processing, market research and compliance monitoring. Users will be able to run these automations at scale on their own virtual machines, or on those hosted by Microsoft, to accelerate deployment, simplify management and reduce costs. This tool is currently available through the Frontier program for eligible customers with 500,000+ Copilot Studio messages and an environment in the US.
- Tools enhancements across the existing set of supported tools and actions will provide agents and apps with the ability to do more. Examples of tools include prompts, document processing, deep reasoning prompts, agent flows, connectors, REST APIs and more. These updates are in preview.
- Code Interpreter will allow agents to write and run Python code to perform complex tasks such as performing calculations, running advanced analytics and generating data visualizations. This is in preview.
- Operational database for agents, powered by Dataverse, will underpin agents built in Copilot Studio and will be optimized for speed and real-time interactions. This is in preview.
- Dataverse search is the foundational data layer that powers an agent’s ability to understand, reason and act across organizational knowledge. It connects structured and unstructured data from Microsoft 365, Dynamics 365, Power Platform and external systems like Zendesk, ServiceNow, SAP, Databricks and more – transforming fragmented data into a unified, context-rich knowledge network. Additionally, Dataverse and uploaded files will have increased efficiency and search capabilities like image extraction, multiple-language support and the ability to query embedded tabular files. This is generally available.
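To illustrate the MCP support noted above, the sketch below uses the open-source MCP Python SDK (the `mcp` package) to stand up a minimal server exposing one tool that an MCP-capable agent could call. The server name, tool and data are hypothetical placeholders, not a Microsoft sample.

```python
# Minimal MCP server sketch using the open-source MCP Python SDK ("mcp" package).
# Copilot Studio can consume MCP servers as tool sources; the server name, tool
# and data below are hypothetical placeholders, not a Microsoft sample.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("order-lookup")  # hypothetical line-of-business server


@mcp.tool()
def get_order_status(order_id: str) -> str:
    """Return the fulfillment status for an order number."""
    # A real server would query an ERP system or database here.
    fake_orders = {"A-1001": "shipped", "A-1002": "processing"}
    return fake_orders.get(order_id, "not found")


if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio for an MCP-capable agent to call
```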
With these updates, Microsoft continues to position itself as the central platform for building, managing and scaling agentic workflows – empowering developers, makers and enterprise organizations to create intelligent agents that work easily across Microsoft 365 and beyond.
Additional resources:
- Blog: Multi-agent orchestration, maker controls, and more: Microsoft Copilot Studio announcements at Microsoft Build 2025
- Download: Visual assets
- Breakout: Building agents for Microsoft 365 Copilot
- Breakout: Building Effective Agents with Microsoft 365 Copilot
- Breakout: Create agents for Microsoft 365 Copilot with Microsoft 365 Agents SDK
- Breakout: Add more knowledge to Microsoft 365 Copilot with Copilot connectors and actions
- Breakout: GenAI for Enterprise: Intelligent Apps and Agents with Dataverse & MCP
- Breakout: What’s new in Copilot Studio
- Breakout: Exploring the Agent landscape
- Breakout: Architecting your multi-agent solutions with Copilot Studio and Microsoft 365 Agents SDK
- Breakout: Use agents and automations to transform business processes with Copilot Studio
1.3. Microsoft 365 Copilot
1.3.1. Microsoft 365 Copilot Wave 2 spring release now available
The Microsoft 365 Copilot Wave 2 spring release is rolling out and will include:
- An updated Microsoft 365 Copilot app designed for human-agent collaboration.
- A new Create experience that brings the power of OpenAI GPT-4o image generation to Copilot.
- Copilot Notebooks that turn content and data into instant insights and action (generally available).
- Copilot Search and Copilot Memory that will begin rolling out in June.
- Researcher and Analyst, first-of-their-kind reasoning agents for work, will roll out to customers worldwide this month via the Frontier program.
Additional resources:
1.3.2. New Copilot features in Outlook transform how users work in the AI era
In the era of AI, the pace of work is faster than ever, but users still spend a tremendous amount of time on administrative tasks like reading and replying to emails and preparing for meetings. New Copilot features for Outlook mail and calendar, generally available, support users with rich contextual insights and advanced reasoning by:
- Making it easier to navigate inboxes, by providing summaries of search results and attached files directly within emails.
- Helping users prepare for meetings more quickly and effectively, by surfacing and summarizing relevant context, tasks, documents and other resources related to the meeting topic.
Additional resources:
1.3.3. New features in Copilot Pages increase productivity
Microsoft Copilot Pages introduced an entirely new workflow built for the Copilot era, giving users the ability to turn a Copilot response into a dynamic, editable and shareable page. New features in Pages, generally available in May, will allow users to be more productive with Copilot by providing the ability to:
- Create Pages from a mobile phone. When using Copilot Chat on a mobile device, users will be able to create a new page with a click and then share or edit that page directly on a mobile device or return to it later via web-based access.
- Turn a page into a Word document with a single click. While the shareable, editable format of Pages is great for building out content and collaborating with others, there remains the need to deliver content via traditional files and documents or to use Microsoft Word-specific workflows. This new feature will connect these workflows, bridging Copilot responses through Pages collaboration into a Word document.
- Find pages easier than ever with the ability to search and filter a list of pages in the Pages module of the Microsoft 365 Copilot app.
- Bring more interactivity and output types into work with the availability of interactive charts and code blocks as output options in Chat. Users will be able to ask Copilot for a response that’s well suited to a chart, and Copilot will turn it into a dynamic, easy-to-comprehend format to add to a page, use and share with others.
Additional resources:
1.4. Microsoft 365
1.4.1. GitHub app for Teams includes new features
With Microsoft Teams, developers have a unique opportunity to collaborate in a platform that powers human-agent collaboration and matches their needs. The enhanced GitHub app for Teams is now faster, more intuitive and packed with new features, such as notification cards, streamlined actions with slash commands and organized pull request conversations with threading support. This is generally available.
Additional resources:

2. Azure
2.1. AI
2.1.1. New models and partnerships added to Azure AI Foundry Models
Azure AI Foundry is a unified platform to design, customize and manage AI apps and agents. Azure AI Foundry Models empowers enterprises to access top-tier models, deploy with confidence and optimize for performance – accelerating AI innovation at scale.
The most powerful AI workloads run on Microsoft Azure due to its deep infrastructure advantage and strong partnership with OpenAI. While Microsoft expects continued leadership with Azure OpenAI, it also knows how important enabling choice is for customers.
Azure AI Foundry Models is expanding with new cutting-edge models, including Grok 3 from xAI available today, Flux Pro 1.1 from Black Forest Labs coming soon and Sora coming soon in preview via Azure OpenAI. There are now over 10,000 open-source models from Hugging Face available in Foundry Models. Support for full fine-tuning empowers developers to tailor fine-tunable models to their needs. Additionally, a new developer tier for fine-tuning, with no hosting fees, is rolling out.
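As a minimal sketch of calling one of these Foundry-deployed models from code, the example below uses the azure-ai-inference Python client. The endpoint, key and deployment name ("grok-3") are placeholders; substitute the values from your own Azure AI Foundry project.

```python
# Hedged sketch: calling a model deployed from Azure AI Foundry Models with the
# azure-ai-inference client. Endpoint, key and the "grok-3" deployment name are
# placeholders for whatever you deploy in your own Foundry project.
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint="https://<your-resource>.services.ai.azure.com/models",  # placeholder
    credential=AzureKeyCredential("<your-api-key>"),                  # placeholder
)

response = client.complete(
    model="grok-3",  # assumed deployment name; use the one chosen at deploy time
    messages=[
        SystemMessage(content="You are a concise assistant."),
        UserMessage(content="Summarize what Azure AI Foundry Models offers."),
    ],
)
print(response.choices[0].message.content)
```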
Additional resources:
- Blog: Azure AI Foundry: Your AI App and agent factory
- Blog: Announcing Grok 3 on Azure AI Foundry
- Breakout: Unveiling Latest Innovations in Azure AI Foundry Model Catalog
- Breakout: Solving the Unsolvable: Advanced Agentic AI Models in Azure AI Foundry
- Breakout: Bring AI Foundry to Local: Building cutting-edge on-device AI experiences
- Breakout: Revolutionizing AI Apps with Multimodal Models in Azure AI Foundry
2.1.2. Azure AI Foundry Agent Service now generally available
Azure AI Foundry Agent Service, generally available, empowers professional developers to design, deploy and scale enterprise-grade AI agents to automate business processes. With support for multi-agent workflows, developers can now orchestrate multiple specialized agents to handle complex tasks, accelerate decision-making and boost operational efficiency, allowing organizations to build and manage a digital AI-powered workforce. It also supports open protocols, such as Agent2Agent (A2A) and Model Context Protocol (MCP), enabling interoperability across agent frameworks and ecosystems while maintaining precise control over orchestration and execution.
The service simplifies agent development by integrating easily with knowledge sources, such as Microsoft Bing, Microsoft SharePoint, Microsoft Azure Databricks and Microsoft Fabric. It also connects with a centralized catalog of tools and pre-built agent templates, in preview, that developers will be able to easily customize.
Azure AI Foundry Agent Service is built to work easily with a unified runtime that combines the strengths of Semantic Kernel and AutoGen, offering developers a consistent and composable experience for building, testing and deploying AI agents. Developers can build agent systems locally, simulate interactions and deploy them unchanged to the cloud – promoting consistent behavior across environments.
To help agents perform reliably, Azure AI Foundry Agent Service introduces bring your own (BYO) thread storage with Azure Cosmos DB and robust AgentOps capabilities – such as tracing, evaluation and monitoring – helping developers validate, observe and optimize agent behavior with confidence.
The healthcare agent orchestrator code sample and streamlined deployment tools are also now available in the Azure AI Foundry Agent Catalog. With these new capabilities, developers can leverage preconfigured agents with multi-agent orchestration and open-source customization options that allow developers and researchers to build and test how agents could coordinate multi-disciplinary healthcare data workflows, such as tumor boards. Modular, specialized, multimodal AI agents work together to quickly complete tasks that would otherwise take hours, with the goal of effectively augmenting clinician specialists with customized, cutting-edge agentic AI.
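For orientation, the sketch below creates and runs a simple agent with the Python SDK for Azure AI Foundry projects. Method and parameter names follow the preview azure-ai-projects package and may have shifted at general availability; the connection string, model and instructions are placeholders.

```python
# Hedged sketch: creating and running an agent with the preview azure-ai-projects
# Python SDK for Azure AI Foundry. Method and parameter names follow that preview
# (earlier builds used assistant_id instead of agent_id) and may differ at GA;
# the connection string, model and instructions are placeholders.
import os

from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

project = AIProjectClient.from_connection_string(
    credential=DefaultAzureCredential(),
    conn_str=os.environ["AZURE_AI_PROJECT_CONNECTION_STRING"],  # placeholder
)

agent = project.agents.create_agent(
    model="gpt-4o",                      # any model available in your project
    name="expense-policy-agent",         # illustrative name
    instructions="Answer questions about the corporate expense policy.",
)
thread = project.agents.create_thread()
project.agents.create_message(
    thread_id=thread.id, role="user", content="Can I expense a hotel upgrade?"
)
run = project.agents.create_and_process_run(thread_id=thread.id, agent_id=agent.id)
print(run.status)  # the agent's replies can then be read back from the thread
```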
Additional resources:
- Blog: Announcing General Availability of Azure AI Foundry Agent Service
- Blog: Developing next-generation cancer care management with multi-agent orchestration
- Watch: Stanford Medicine and the healthcare agent orchestrator
- Keynote: Microsoft Build opening keynote
- Breakout: Azure AI Foundry: The Agent Factory
- Breakout: Developer essentials for agents and apps in Azure AI Foundry
- Breakout: Azure AI Foundry Agent Service: Transforming workflows with Azure AI Foundry
- Breakout: Building the digital workforce: Multi-agent apps with Azure AI Foundry
2.1.3. Azure AI Foundry enhances AI development with monitoring and evaluation tools
Azure AI Foundry Observability is introducing new features, in preview, for built-in observability into metrics for performance, quality, cost and safety – all incorporated alongside detailed tracing in a streamlined dashboard. These new capabilities are designed to provide deeper insights into the quality, performance and safety of agents and will include:
- Enhanced developer support: Azure AI Foundry will support developers throughout their journey, enabling evaluations during model tuning, system prompt upgrades and transitions between models. The journey begins in the agent playground, where customers will be able to evaluate their agents in a controlled environment. As developers transition to code, local evaluation will be enabled, allowing evaluations to be run automatically on every relevant commit through GitHub Actions and Microsoft Azure DevOps workflows.
- Easy production monitoring: Once agents are deployed to production, Azure AI Foundry Observability will offer a single pane of glass dashboard for continuous monitoring. With just one configuration step, quality and safety evaluators will run continuously and will be directly connected to traces, facilitating efficient debugging and ensuring optimal performance.
AI development teams will be able to operationalize and monitor workflows with ongoing, comprehensive, real-time insight and analytics across the entire AI estate. These features are in preview.
Additional resources:
- Blog: Azure AI Foundry: Your AI App and agent factory
- Keynote: Microsoft Build opening keynote
- Breakout: AI and Agent Observability in Azure AI Foundry and Azure Monitor
- Demo: Is your LLM-powered app safe? Evaluate it!
- Demo: Continuously improve your Agent in production
- Demo: From risk to reward: AI governance integrations for Azure AI Foundry
- Lab: Evaluate and improve the quality and safety of your AI apps
2.1.4. Automate model selection and AI app design with Azure AI Foundry
Forrester’s data show that 85% of enterprises are pursuing multi-model strategies, highlighting the importance of rapidly testing and deploying new models as a key competitive advantage. However, developers face challenges when navigating the vast selection of AI models to deliver business solutions, often leading to guesswork and inefficiencies. The challenge extends beyond technical integration to the full stack of prompt tuning, performance validation and deployment orchestration. Azure AI Foundry addresses these issues by simplifying model selection with automated model routing, fully customizable and context-aware templates and real-time performance monitoring.
Azure AI Foundry will offer a new Model Router, in preview, which will automatically select the best OpenAI model for prompts, leading to higher quality and lower cost outputs. Additionally, automated evaluation, A/B experimentation and tracing in Foundry Observability will support rollback to proven models if new ones underperform, enabling developers to stay on the cutting-edge of model capabilities to deliver cost-effective solutions.
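A minimal sketch of how the Model Router is expected to be consumed: once a router deployment exists in the resource, it is called like any other chat deployment, and the service selects the underlying OpenAI model per request. The endpoint, key, API version and deployment name below are placeholder assumptions.

```python
# Hedged sketch: once a Model Router deployment exists in an Azure AI Foundry /
# Azure OpenAI resource, it is called like any other chat deployment and the
# router picks the underlying OpenAI model per request. All names are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    api_key="<your-api-key>",                                   # placeholder
    api_version="2024-12-01-preview",                           # assumed preview version
)

response = client.chat.completions.create(
    model="model-router",  # assumed name of the router deployment
    messages=[{"role": "user", "content": "Draft a two-line status update."}],
)
print(response.model)                        # which underlying model the router chose
print(response.choices[0].message.content)
```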
Further streamlining productivity, Azure AI Foundry will offer new AI templates designed for common, high-value use cases and technical patterns, designated by customers. These templates will enable developers to design, customize and deploy AI solutions in minutes and scale them into production quickly, significantly reducing development time.
Additional resources:
- Blog: Announcing Developer Essentials for Agents and Apps in Azure AI Foundry
- Breakout: Azure AI Foundry: The Agent Factory
- Breakout: Developer essentials for agents and apps in Azure AI Foundry
- Breakout: Customizable Solution Patterns in Azure AI Foundry
- Breakout: AI and Agent Observability in Azure AI Foundry and Azure Monitor
- Breakout: Unveiling Latest Innovations in Azure AI Foundry Models
2.1.5. Agentic retrieval engine in Azure AI Search available in preview
Agents have changed the way users interact with data; they require a dynamic way to connect to institutional knowledge. Microsoft Azure AI Search will offer a new declarative query engine, designed for agents. Agentic retrieval is a premium feature in Azure AI Search that will analyze, plan and execute a retrieval strategy using an Azure OpenAI model. It will include conversational history for better context, return results tuned for agent needs and deliver a query activity log that shows exactly what happened in the plan. This feature is in preview.
Additional resources:
2.1.6. Delivering Azure AI Foundry Services, Models and tools to Copilot Studio for building AI business process automation solutions
Developers and makers can build agents to transform their business processes with Microsoft Copilot Studio, powered by Azure AI Foundry Models and tools.
In addition to the robust set of pre-built orchestrator, skills, knowledge sources and agent tools in Copilot Studio, low-code developers can leverage more than 1,900 Foundry Models as well as fine-tuned models (in preview) to support agent prompts and answers. They can also use vectorized indices from Azure AI Search (generally available) for retrieval augmented generation (RAG) and hand off to Copilot Studio multi-agent workflows orchestrated with Azure AI Foundry Agent Service (in preview).
The Microsoft 365 Agents Toolkit is an extension for Microsoft Visual Studio and GitHub. It simplifies the development process for building, debugging and deploying enterprise-grade custom engine and declarative agents for Microsoft 365. Powered by the Microsoft 365 Agents SDK, available in C#, JavaScript and Python, developers using Azure AI Foundry Services can publish their agents to Microsoft 365, Microsoft Teams, custom web apps and more than 15 channels.
Additional resources:
- Blog: Azure AI Foundry: Your AI App and agent factory
- Keynote: Microsoft Build opening keynote
- Breakout: Azure AI Foundry: The Agent Factory
- Breakout: Developer essentials for agents and apps in Azure AI Foundry
- Breakout: Copilot Studio Agents & Azure AI Foundry: better together
- Breakout: Architecting your multi agent solutions with Copilot Studio and Microsoft 365 Agents SDK
2.1.7. Azure AI Foundry Local brings the power of AI to Windows and macOS
Azure AI Foundry Local will be available on Windows 11 and macOS and will include model inferencing, models and agents as a service, and a model playground for fast and efficient local AI development. Foundry Local will bring the power of open-source models in Azure AI Foundry to client devices.
In preview, Foundry Local will make it easy to run AI models, tools and agents directly on-device, whether on Windows 11 or macOS.
Foundry Local will be included in Windows AI Foundry and will deliver best-in-class AI capabilities on Windows with excellent cross-silicon performance and availability on millions of Windows devices.
Leveraging ONNX Runtime, Foundry Local is designed for situations where users want to save on internet data usage, prioritize privacy and reduce costs. It will be ideal for industry-specific use cases where constant cloud connectivity is not available.
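As a hedged sketch of the local developer experience, the example below chats with a model served on-device, assuming Foundry Local exposes an OpenAI-compatible endpoint as described in its preview documentation. The local URL and model alias are placeholders; use the values the Foundry Local tooling reports after a model is loaded.

```python
# Hedged sketch: chatting with a model served on-device by Foundry Local through
# an OpenAI-compatible endpoint. The local URL and model alias are placeholders;
# use the values the Foundry Local tooling reports after a model is loaded.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:5273/v1",  # placeholder: local Foundry service address
    api_key="not-needed-for-local",       # local inference; the key is ignored
)

response = client.chat.completions.create(
    model="phi-4-mini",  # placeholder alias for whichever local model is loaded
    messages=[{"role": "user", "content": "Explain on-device inference in one sentence."}],
)
print(response.choices[0].message.content)
```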
Additional resources:
- Blog: Unlock Instant On-Device AI with Foundry Local
- Keynote: Microsoft Build opening keynote
- Breakout: Azure AI Foundry: The Agent Factory
- Breakout: Bring AI Foundry to Local: Building cutting-edge on-device AI experiences
- Demo: Build and ship cross platform apps on-device with Foundry Local and ONNX
2.1.8. Azure AI Foundry integrates enhanced security capabilities
Azure AI Foundry now integrates several significantly enhanced security capabilities, helping organizations securely build and deploy generative AI applications:
- Prompt Shields are now generally available in Azure AI Content Safety, providing protection against jailbreaks and injection attacks by intercepting malicious prompts before they influence model behavior. Microsoft is also previewing Spotlighting, which uniquely identifies adversarial prompts embedded in external data sources by separating trusted from untrusted inputs, significantly reducing cross-domain injection risks.
- Task adherence, a new Azure AI Content Safety control, helps ensure agents stay aligned with user intent by detecting when actions deviate from approved task boundaries. This feature is in preview.
- Microsoft Defender for Cloud integration brings real-time security recommendations and runtime alert monitoring into the AI development workflow. Developers can identify and address vulnerabilities earlier, reducing friction with security teams and improving delivery speed. This feature is in preview.
- Azure AI Foundry Models now includes a Personally Identifiable Information (PII) detection content filter, powered by Azure AI Language. This filter automatically detects and redacts sensitive information, such as PII and Protected Health Information (PHI), supporting compliance and data security. These combined enhancements deliver integrated security, safety and privacy capabilities across the AI development lifecycle, helping Azure AI Foundry customers scale generative AI solutions with stronger safeguards and faster deployment.
Additional resources:
2.2. Database & Analytics
2.2.1. Cosmos DB (NoSQL) in Microsoft Fabric now in preview
Semi-structured data is increasingly crucial in the era of AI apps. It provides the necessary structure for AI algorithms to learn from data and understand complex, real-world information such as text, documents, emails, graphs and more.
The unification of data estates is expanding with the preview of Cosmos DB (NoSQL) in Fabric. Cosmos DB (NoSQL) in Fabric will bring the enterprise-grade dynamic scalability, consistent reliability and low-latency serving of semi-structured data that Walmart, OpenAI, Adobe, DocuSign, Microsoft Teams and others of the world’s largest mission-critical apps rely on. Developers will be able to deploy Cosmos DB (NoSQL) in just a few clicks to build high-performance, distributed apps with ease.
Cosmos DB (NoSQL) in Fabric enables organizations to bring semi-structured operational data into Fabric, just like their structured and analytical data. With support for both SQL and NoSQL models, Microsoft Fabric will give developers the flexibility to build modern AI applications grounded on both operational and analytical data sets.
Additional resources:
- Blog: Get to insights faster with SaaS databases and “chat with your data”
- Keynote: Microsoft Build opening keynote
- Breakout: Microsoft Fabric for Developers: Build Scalable Data & AI Solutions
- Breakout: What’s New in Microsoft Databases: Empowering AI-Driven App Dev
- Breakout: Enable Agentic AI Apps with a Unified Data Estate in Microsoft Fabric
2.2.2. Create powerful AI agents in Azure AI Foundry with data in Azure Databricks
Developers will be able to use Azure AI Foundry to easily create AI agents that leverage their valuable enterprise data stored in Microsoft Azure Databricks. This new feature, in preview, will allow AI agents to connect with Azure Databricks for real-time data processing, helping design and customize reliable, enterprise-ready AI agents.
Taking it a step further, the connector will enable agents to leverage data scientists’ Spark jobs within Azure Databricks. Additionally, agents will be able to interact with data via natural language query (NLQ) using Azure Databricks Genie rooms, facilitating a multi-agent agentic workflow.
Access control, security and data governance will be maintained through authorizations between Azure AI Foundry and Azure Databricks Unity Catalog. Azure AI Foundry also includes responsible AI and content safety features. This unified experience within Azure will enhance advanced analytics and AI capabilities.
Additional resources:
2.2.3. Expanding chat with user data capabilities to Power BI and Copilot Studio
Data plays a critical role in agentic AI, enabling AI agents to operate independently, make informed decisions and take meaningful actions. That’s why Microsoft is expanding capabilities and deepening integrations between data and AI platforms.
Chat with your data experiences will allow teams to explore and reason over complex datasets simply by asking questions. This will remove the barriers between users and insights, enabling everyone – from business analysts to data scientists – to make faster, more confident decisions directly within the tools they already use.
Data agents in Microsoft Fabric are AI-powered assistants that go beyond simple data retrieval from OneLake — they also engage in natural language conversations about it. These agents can understand the structure, meaning and context of data to surface insights that are timely, relevant and actionable. Building on this are two new experiences that will allow Power BI and Copilot Studio users to chat with their data, expanding the ways they can explore, analyze and act on it:
- Chat with your data: Chat with your data is a new experience in Power BI. This full-screen Copilot experience will be easily accessible from the left pane to help find relevant reports, analyze data and answer questions from any data users have access to across multiple reports, semantic models, apps and data agents, without having to navigate to specific reports. Previously, Copilot was limited to the right pane of a single report, allowing questions only about that open report. With just a few clicks, users will be able to discover and leverage Fabric data agents, which tap into domain expertise to explore, analyze and refine findings. Chat with your data in Power BI will be coming in the next few months to Microsoft 365 Copilot, allowing business users to ask questions about and get insights from their Power BI data directly in Microsoft 365 Copilot Chat.
- Enrich custom agents in Microsoft Copilot Studio with insights from Fabric data agents: Fabric data agents can be added to any custom agent built in Microsoft Copilot Studio. These agents will be deployed across channels like Microsoft Teams and Microsoft 365 Copilot. Once connected, the custom agent will use the Fabric data agent to retrieve insights from OneLake, respecting data access permissions. Developers will also be able to define actions, such as sending an email or triggering a workflow, to automate processes, making it easier for users to interact with data and streamline tasks without leaving the chat experience. This feature will be available in preview in the next few weeks.
Additional resources:
2.2.4. Digital twin builder in Microsoft Fabric now in preview
In today’s data-driven world, there is a growing need for tools that can easily integrate and manage the vast amounts of data generated by the physical world. Having a digital twin can significantly enhance an organization’s ability to make informed decisions by bringing the physical world into the digital world. Digital twin builder in Microsoft Fabric is designed to make this process simpler and more efficient, enabling organizations to harness the power of their data like never before.
Digital twin builder will be a new capability within Real-Time Intelligence in Microsoft Fabric, in preview, that will provide a simple and fast way to build and manage digital representations of real-world environments. It will offer a no-code/low-code interface, making it easy for nontechnical users to create and manage digital twins. Additional features will include:
- Allowing customers to easily connect, map data from physical assets, processes and systems and contextualize it as a digital twin.
- Democratizing and scaling digital twins by making them more accessible and actionable for operators and decision makers.
- Using the digital twin data as an AI-ready foundation to enhance deep analytics, what-if analysis and AI-powered automation leveraging Fabric native capabilities.
Additional resources:
2.2.5. Accelerate delivery of GenAI with PostgreSQL capabilities in GitHub Copilot
GitHub Copilot will extend capabilities to PostgreSQL in the new PostgreSQL extension for Visual Studio Code (VS Code). These new capabilities, in preview, will help developers using PostgreSQL and VS Code accelerate the delivery of generative AI (GenAI) apps. Developers working with PostgreSQL need to navigate complex PostgreSQL-specific features to help maintain high query performance and data integrity and security in production environments. The new PostgreSQL capabilities in GitHub Copilot will bring Microsoft Copilot AI assistance and database context for PostgreSQL directly into the development environment, boosting developer productivity.
According to Stack Overflow’s 2024 Developer Survey, PostgreSQL and VS Code are the preferred database and integrated development environment (IDE) among developers. Bringing AI assistance directly into VS Code will allow developers to use natural language to interact with their PostgreSQL database and development tools and leverage real-time, expert-level assistance to write efficient SQL queries, design good database schemas and follow best practices for performance and security – all without any additional service setup or cost of bringing their own AI subscriptions. This will help to reduce the barrier to deliver GenAI capabilities on a broader scale.
Additional resources:
- Blog: Announcing the Public Preview of the PostgreSQL Extension for Visual Studio Code
- Demo: Boost Your Development Workflows with PostgreSQL
- Breakout: What’s New in Microsoft Databases: Empowering AI-Driven App Dev
- Breakout: Building advanced agentic apps with PostgreSQL on Azure
- Lab: Build an Agentic App with PostgreSQL, GraphRAG and Semantic Kernel
2.2.6. Global secondary index in Azure Cosmos DB now in preview
With global secondary index in Microsoft Azure Cosmos DB, now in preview, developers will be able to create an automatically updated index over a subset of the transactional NoSQL data and attributes to optimize complex queries in their apps.
Global secondary index will remove the need to scan all the operational data in an Azure Cosmos DB database. This will enable faster queries and minimize latency while also helping to make sure that queries do not negatively impact transactional performance. An example use case is AI apps and agents that need to deliver information and services quickly to users or other agents.
Global secondary indexes will be easily created via the Azure portal, eliminating the need for manual provisioning. The autoscale throughput feature will help them adapt effortlessly to varying traffic levels, maintaining performance under all circumstances. Global secondary index will further enhance the hybrid search capabilities announced for Azure Cosmos DB at Microsoft Ignite 2024.
Additional resources:
2.2.7. Integration of Azure Cosmos DB with Azure AI Foundry empowers agentic AI
Developers designing and customizing AI apps and agents can now use Azure Cosmos DB accounts to power AI solutions in Azure AI Foundry.
Customers can now securely store and manage the conversation threads between users and AI agents in their Azure Cosmos DB accounts, using the Azure AI Foundry SDK. This enables agents to recall the content of previous thread conversations and messages and pick up conversations where they left off. Thread storage is now generally available.
In the next few weeks, developers will be able to use the data stored in their Azure Cosmos DB accounts to power AI solutions in Azure AI Foundry. Customers will be able to connect and access their Azure Cosmos DB data using Azure AI Foundry in app code, and Azure Cosmos DB will be the first Azure database able to power agents and models in Azure AI Foundry with real-time, operational data.
Additional resources:
- Blog: Azure AI Foundry Connection for Azure Cosmos DB and BYO Thread Storage in Azure AI Agent Service
- Breakout: Design scalable data layers for multi-tenant apps with Azure Cosmos DB
- Breakout: What’s New in Microsoft Databases: Empowering AI-Driven App Dev
- Breakout: Azure AI Foundry Agent Service: Transforming workflows with Azure AI Foundry
2.2.8. PostgreSQL extension for VS Code simplifies database management
Microsoft continues its commitment to open-source and the enhancement of the developer experience with the release of a new high-quality PostgreSQL extension for Visual Studio Code (VS Code). According to Stack Overflow’s 2024 Developer Survey, PostgreSQL and VS Code are the preferred database and integrated development environment (IDE) among developers. The new PostgreSQL extension for VS Code, in preview, will enhance developers’ productivity by enabling them to manage their PostgreSQL databases easily within their preferred development environment.
Without a dedicated VS Code extension, developers relied on other disconnected, outdated tools and inflexible approaches to connect and manage their PostgreSQL database alongside their development environment. The new PostgreSQL extension will simplify and centralize workflows between VS Code and PostgreSQL without context switching or complex configurations. Developers will be able to easily manage their PostgreSQL databases directly within their preferred user interface whether they’re developing in Microsoft Azure, a local Docker container or on-premises. The new extension will offer improved functionality with key features including:
- Connection manager will simplify connectivity with input options for connection string parsing, which identifies the database provider, parameters and Azure connections, and Docker images for local deployment.
- Object explorer will consolidate database administration by allowing users to view and manage database objects in a structured way.
- Query editor will enhance the querying experience with stored query history and context-aware IntelliSense code editing features that recommend code completions.
- Results viewer will centralize and enhance management of query results with the ability to filter/search data directly within the results viewer tab, export and copy/paste results.
Additional resources:
- Blog: Announcing the Public Preview of the PostgreSQL Extension for Visual Studio Code
- Demo: Boost Your Development Workflows with PostgreSQL
- Breakout: What’s New in Microsoft Databases: Empowering AI-Driven App Dev
- Breakout: Building advanced agentic apps with PostgreSQL on Azure
- Lab: Build an Agentic App with PostgreSQL, GraphRAG and Semantic Kernel
2.2.9. Reimagine database development with SQL Server solutions
Microsoft SQL Server 2025, built on a foundation known for best-in-class security and performance, will empower customers to develop modern AI apps using their data. It will provide built-in, extensible AI capabilities, enhanced developer productivity and easy integration with Microsoft Azure and Microsoft Fabric – all within a SQL Server engine using the familiar T-SQL language. Microsoft Copilot, integrated into the modernized SQL Server Management Studio 21, will streamline SQL development, allowing customers to develop and deliver faster than ever before.
SQL Server 2025, in preview, will speed up time-to-market for customers’ new apps because they will be able to:
- Boost search intelligence using advanced semantic search alongside full text search and filtering; it will allow customers to run generative AI models of their choice using their own data.
- Take advantage of the most significant release of SQL Server in the past decade to process and manage data flows more simply and efficiently using native JSON support, built-in REST API, Change Event Streaming for real-time data updates and GitHub Copilot (a brief JSON example follows this list).
- Leverage the most secure database to improve credential management and reduce potential vulnerabilities with support for Microsoft Entra ID managed identities through Azure Arc.
- Increase workload uptime and improve concurrency for SQL Server apps with enhanced query optimization, optimized locking and improved failover reliability.
- Achieve zero-ETL, real-time analytics by replicating SQL Server data to OneLake with Fabric database mirroring.
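As a brief example of the native JSON support called out above, the sketch below stores and queries a JSON document from Python with pyodbc. The json column type is part of the SQL Server 2025 preview; the connection string, table and data are illustrative placeholders.

```python
# Hedged sketch: exercising SQL Server 2025's native json type from Python.
# The json data type is part of the SQL Server 2025 preview; the connection
# string, table and data are illustrative placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=localhost;DATABASE=DemoDb;"
    "Trusted_Connection=yes;TrustServerCertificate=yes"  # placeholder connection
)
cur = conn.cursor()

cur.execute("CREATE TABLE dbo.Orders (Id int PRIMARY KEY, Details json)")  # native json (preview)
cur.execute(
    "INSERT INTO dbo.Orders (Id, Details) VALUES (1, CAST(? AS json))",
    '{"customer": "Contoso", "items": [{"sku": "A-100", "qty": 2}]}',
)
# JSON_VALUE extracts scalar values from the stored document.
cur.execute("SELECT JSON_VALUE(Details, '$.customer') FROM dbo.Orders WHERE Id = 1")
print(cur.fetchone()[0])  # -> Contoso
conn.commit()
```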
SQL Server Management Studio 21, generally available, is a modern database management tool based on Visual Studio 2022 with 64-bit support. The latest release provides an improved user interface compared to earlier versions, with an enhanced Query Editor with better readability and improved tab organization. Git integration enhances source control, making it easier to use and more secure. The Always Encrypted assessment feature allows customers to quickly and easily identify columns that can be encrypted.
Copilot in SQL Server Management Studio 21, in preview, will enable customers to streamline SQL development by offering real-time suggestions, faster problem diagnosis and best practice recommendations.
Additional resources:
2.2.10. Shortcut transformations help create AI-ready data faster in OneLake
For teams tasked with building new AI and analytics solutions, finding and accessing the necessary data across a sea of disconnected data services can be challenging at the best of times. That’s where OneLake can help. OneLake is designed as the single point to discover and explore data for everyone in an entire organization. OneLake can help organizations unify their entire multicloud data estate using shortcuts and mirroring to bring in data without data duplication or movement.
OneLake is expanding the existing shortcuts offering with shortcut transformations. This new capability, in preview, will help customers automatically transform data as they bring it into OneLake or move it between OneLake data items. Shortcut transformations will virtualize data in OneLake while converting the data format to Delta Lake format or applying AI-powered transformations such as summarization, translation and document classification – all powered by Azure AI Foundry. With just a few clicks, customers will be able to virtualize data in OneLake and make it ready for analytics or AI.
Additional resources:
- Blog: Get to insights faster with SaaS databases and “chat with your data”
- Keynote: Microsoft Build opening keynote
- Breakout: Microsoft Fabric for Developers: Build Scalable Data & AI Solutions
- Demo: Bring all your data from everywhere into OneLake with Microsoft Fabric
- Demo: Simplifying Medallion Implementation with Materialized Views in Fabric
2.2.11. Translytical task flows in Microsoft Fabric in preview
Power BI reports have historically been a one-way street, with users reading reports and switching to different tools to act. With the introduction of translytical capabilities in Power BI, in preview, users will be able to automate action directly within the report to streamline decision-making and operational follow-through.
This set of capabilities – grouped together in a task flow – will bridge transactional processes with analytical insights, enabling seamless transitions from insights to actions within Power BI and other Fabric environments.
Users will be able to programmatically write back, including update, add or delete records of data, based on the filter context passed from the report. Users will also be able to automate a wide variety of tasks and even take actions in other systems via external APIs. Examples include submitting approval workflows, triggering dynamic notifications and augmenting data on the fly.
Additional resources:
2.3. Digital & App Innovation
2.3.1. Introducing agentic DevOps: Accelerating the end-to-end software lifecycle
Microsoft is evolving the software development lifecycle through autonomous and semi-autonomous agents that operate as members of dev and operations teams, automating, optimizing and accelerating every stage of the software lifecycle. New agentic capabilities in GitHub Copilot will work alongside developers and other agents to solve routine and complex tasks, bringing apps to market faster, increasing code quality and security, reducing technical debt and reframing the economics of operating, maintaining and modernizing apps in production. Through agentic DevOps, developers will have the freedom to focus on higher-value creative work, while operators can proactively identify, mitigate and resolve issues in production. New capabilities will include:
- GitHub Copilot coding agent: Developers will be able to assign GitHub Copilot a range of development tasks, from autonomously refactoring code and improving test coverage to fixing defects and implementing new features. For complex tasks, GitHub Copilot will be able to collaborate with other agents across all stages of the software lifecycle. Operating as a member of the development team, GitHub Copilot will relieve developers of routine tasks to focus developers on more impactful tasks. This is now in preview.
- New app modernization capabilities in GitHub Copilot: Developers will be able to offload complex and time-consuming tasks to rapidly update, upgrade and modernize Java and .NET apps in GitHub Copilot. Java developers will be able to modernize apps on Microsoft Azure faster using AI agents, from code assessment to remediating app code, configurations and dependencies, across thousands of files, reducing effort from days and months to hours.
Upgrading .NET and Java versions is a common and repetitive task for developers. GitHub Copilot will be able to streamline this upgrade process by autonomously generating an upgrade plan, executing the plan with full visibility and control in every step and providing a final summary of the process. These capabilities will enable developers to gain app performance efficiencies, address security vulnerabilities and reduce technical debt. This is now in preview.
- New SRE agent: The SRE agent will help cloud developers reduce the cost of operations while improving app uptime. SREs face several challenges today, including alert fatigue, analyzing root cause quickly, manual workflows and ensuring apps meet strict service level agreements (SLAs). The SRE agent will automatically respond to production alerts, autonomously mitigate issues and perform root cause analysis (RCA), reducing resolution time from hours to minutes. The SRE agent will continuously monitor app health and performance for production apps on Azure, including Kubernetes, platform as a service (PaaS), serverless and database services, to build context and provide insights for faster troubleshooting. The SRE agent will also work with the software engineering (SWE) capabilities in GitHub Copilot to proactively identify app issues and assign issues to agents in GitHub to rapidly drive to resolution. This will be in preview in the coming weeks.
- GitHub Models: GitHub Models will extend leading AI models from Azure AI Foundry directly to developers using GitHub, now with native integration into the GitHub experience and workflow. This models as a service solution (MaaS) will streamline AI development by embedding models directly into the GitHub user experience, alongside repositories, enabling familiar workflows such as pull requests, commits, code reviews and continuous integration and continuous delivery (CI/CD). This AI-native experience will automate model evaluations so developers can instantly experiment with leading models from OpenAI, Meta, Cohere, Microsoft and Mistral, evaluate the best model based on cost and performance, rapidly prototype AI apps and agents and deploy them to production. This is now in preview.
- GitHub Copilot agent mode: Agent mode will supercharge the AI-assisted coding experience to become a true peer programmer. Acting alongside developers in their Integrated Development Environment (IDE), agent mode will be able to build features, refactor legacy code and even heal itself when things break. It will go far beyond autocomplete to help developers analyze a codebase, edit across multiple files, run tests, fix errors and suggest terminal commands, all from a single prompt. Available in Visual Studio Code (VS Code) and Visual Studio, agent mode will expand to new IDEs, including JetBrains, Eclipse and Xcode. This is now in preview.
These new capabilities are designed to make software development and operations more efficient, allowing teams to focus on higher-value activities and innovate faster.
Additional resources:
- Blog: Agentic DevOps: Evolving software development with GitHub Copilot and Microsoft Azure
- Blog: Reimagining App Modernization for the Era of AI
- Blog: Introducing Azure SRE Agent
- Download: Visual assets
- Breakout: Reimagining Software Development and DevOps with Agentic AI
- Breakout: The Agent Awakens: Collaborative Development with GitHub Copilot
- Breakout: Accelerate Azure Development with GitHub Copilot, VS Code & AI
- Breakout: Java App Modernization Simplified with AI
- Breakout: Agent Mode in Action: AI Coding with Vibe and Spec-Driven Flows
- Breakout: The Future of .NET App Modernization Streamlined with AI
- Breakout: Develop, Build and Deploy LLM Apps using GitHub Models and Azure AI Foundry
- Demo: Simplifying .NET upgrades with GitHub Copilot
2.3.2. Introducing GitHub Copilot coding agent and updates to GitHub Models
GitHub’s deeper integration of AI and AI agents highlights the agentic evolution of the GitHub platform. These developments, now in preview, include:
GitHub Copilot coding agent: GitHub Copilot will evolve from an in-editor assistant to an agentic AI partner with a first-of-its-kind autonomous, asynchronous developer agent integrated into the GitHub platform. GitHub Copilot will be able to test, iterate and refine code in GitHub as an AI agent to which developers can delegate both routine and specialized tasks to move projects forward. Copilot is an AI teammate, not just a tool.
GitHub Models: As a first stop for anyone developing with AI, GitHub Models will give teams a single, trusted hub to explore best‑in‑class models and create, store, evaluate and share prompts, without leaving GitHub. By centralizing model and prompt evaluation in an intuitive space, users will be able to build, test and manage AI features directly from their repository, without switching context or tools. Plus, organization-level model controls ensure developers can experiment and move fast with the guardrails to do so securely.
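A minimal sketch of prototyping against GitHub Models: the service exposes an OpenAI-compatible inference endpoint that authenticates with a GitHub token, so the standard OpenAI client can be pointed at it. The endpoint URL and model id below reflect the preview at announcement time and should be treated as assumptions.

```python
# Hedged sketch: prototyping against GitHub Models with the standard OpenAI client.
# GitHub Models exposes an OpenAI-compatible inference endpoint authenticated with
# a GitHub token; the endpoint URL and model id reflect the preview and may change.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://models.inference.ai.azure.com",  # GitHub Models endpoint (preview)
    api_key=os.environ["GITHUB_TOKEN"],                # a GitHub token with models access
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any catalog model id enabled for your account
    messages=[{"role": "user", "content": "Suggest a name for a CI status bot."}],
)
print(response.choices[0].message.content)
```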
Additional resources:
- Blog: GitHub Copilot: Meet the new coding agent
- Download: Visual assets
- Breakout: Accelerate Azure Development with GitHub Copilot, VS Code & AI
- Breakout: Agent Mode in Action: AI Coding with Vibe and Spec-Driven Flows
- Breakout: Develop, Build and Deploy LLM apps using GitHub Models and Azure AI Foundry
- Breakout: Reimagining Software Development and DevOps with Agentic AI
- Breakout: The Agent Awakens: Collaborative Development with GitHub Copilot
- Demo: Prototype, build, and deploy AI apps quickly with GitHub Models
2.3.3. Open-sourcing GitHub Copilot Chat in Visual Studio Code
AI is quickly becoming core to the entire software lifecycle – powering everything from app creation and testing to refactoring and optimizing apps in production – with GitHub Copilot now central to development in Visual Studio Code (VS Code). This type of innovation is best when done in the open, in collaboration with the community. Today, we’re taking the next step in our open-source journey and open-sourcing GitHub Copilot Chat in VS Code.
Over the next few months, the AI-powered capabilities from the GitHub Copilot extensions will be part of the VS Code open-source repository, the same open-source repository that drives the most popular software development tool. This reflects our commitment to transparency, community-driven innovation, and to giving developers a greater voice in shaping the future of AI-assisted development.
Open source enables Microsoft and the community to collaborate and co-innovate, bring better and more creative ideas to market, and ultimately accelerate innovation across the AI-assisted software development lifecycle.
Transitioning the AI capabilities that power the Copilot experience in VS Code to open source reflects a commitment to open, transparent and community-driven development of AI-powered tools – continuing the model that made VS Code the most popular editor.
Additional resources:
2.4. Industry
2.4.1. Microsoft Planetary Computer Pro now in preview
Microsoft Planetary Computer Pro, in preview, will allow customers to deploy within Microsoft Azure and will help them generate geospatial insights and integrate those insights into enterprise data and AI workflows, such as Microsoft Fabric, or into third-party products like Esri ArcGIS. It will enable the ingestion, management and dissemination of customers’ private geospatial datasets and facilitate the integration of geospatial data with business operations for more informed decision-making.
By providing efficient access and management of geospatial data, Microsoft Planetary Computer Pro will reduce barriers to applying Microsoft Copilot and AI models to these datasets. It will also allow for the integration of partner solutions with data, simplifying the process of generating geospatial insights. These insights then will be incorporated into mainstream data and AI workflows, supporting better and faster business decision-making.
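As an illustration of the kind of integration described above, the sketch below queries a private geospatial collection through a STAC-style search endpoint, assuming Planetary Computer Pro exposes an interface similar to the public Planetary Computer’s STAC API. The endpoint URL, collection name and token handling are placeholders, not confirmed details of the product.

```typescript
// Illustrative sketch only: searching a private geospatial collection through
// a STAC-style search endpoint. The endpoint, collection name and auth token
// are assumptions for illustration.
interface StacItem {
  id: string;
  properties: { datetime: string };
  assets: Record<string, { href: string }>;
}

async function searchScenes(bbox: number[], start: string, end: string): Promise<StacItem[]> {
  const response = await fetch("https://<your-geocatalog>.example.com/stac/search", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.AZURE_ACCESS_TOKEN}`, // assumed Microsoft Entra ID token
    },
    body: JSON.stringify({
      collections: ["private-sentinel-2"], // hypothetical private collection
      bbox,                                 // [west, south, east, north]
      datetime: `${start}/${end}`,
      limit: 10,
    }),
  });
  const body = await response.json();
  return body.features as StacItem[];
}

// Example: imagery over Seattle for April 2025
searchScenes([-122.46, 47.48, -122.22, 47.73], "2025-04-01", "2025-04-30")
  .then(items => items.forEach(i => console.log(i.id, i.properties.datetime)));
```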
Additional resources:
2.5. Science
2.5.1. Microsoft Discovery will help research and development teams boost breakthroughs
Microsoft Discovery is an enterprise agentic platform that helps accelerate research and discovery by transforming the entire discovery process with agentic AI — from scientific knowledge reasoning to hypothesis formulation, candidate generation, and simulation and analysis. The platform enables scientists and researchers to collaborate with a team of specialized AI agents to help drive scientific outcomes with speed, scale and accuracy using the latest innovations in AI and supercomputing.
Additional resources:

3. Business Applications 3.1. Business Applications
3.1.1. Power Apps redefines human/agent collaboration
Microsoft Power Apps is introducing powerful new capabilities for building AI-powered solutions to enable deeper collaboration with agents for both makers and users. The capabilities will include:
- The ability to build plans in Power Apps using a unified development canvas is now generally available. In this experience, developers collaborate with agentic AI to define business requirements, generate data models and process maps (in preview) and design solution architecture. Suggested technology in plans will also include Microsoft Copilot Studio agents (generally available May 30), pages and dashboards in addition to apps and flows. Makers will be able to easily reuse existing assets like apps, tables and flows, and build plans from existing solutions.
Additionally, developers in Power Apps Studio will be able to use natural language to create fully functional, generative pages with underlying React code, offering a faster, more intuitive app-building experience. This is available in the Early Access Program.
- The agent feed for apps is a new hub for human-agent collaboration where users can view, manage and monitor agents within their app. An activity feed keeps users updated on agents’ actions and guides them to relevant screens when human input is needed. Makers in the Early Access Program can set up agents to suggest the most relevant actions based on historical data and current records and can empower business users to build their own templatized automations to expedite work.
- The ability to bring code-first apps to Power Platform will allow developers to build and deploy apps using their preferred tools, with full control over their code, and then bring those apps to Power Platform. For example, in Cursor or VS Code, a developer will be able to ask the AI assistant about the APIs available on a data source and have the assistant help write code to invoke a chosen API, while retaining the ability to edit the code. When these apps run on Power Platform, they will be able to leverage managed platform capabilities and invoke flows or agents like those created within Power Apps. This is available in the Early Access Program.
Additional resources:
- Blog: Reimagining human-agent collaboration for a new era of app development with Microsoft Power Apps
- Download: Visual assets
- Watch: Walkthrough of agent feed in Power Apps – users and makers experience
- Breakout: Build agent-first solutions with Power Platform and Copilot Studio
- Breakout: Collaborate with a team of agents to build intelligent solutions
- Breakout: Extend your Copilot Agent in Power Apps with Copilot Studio and new SDKs
3.1.2. Plan, ship and build AI-agentic native business portals with Microsoft Power Pages
New features and updates in Microsoft Power Pages are aimed at making business portal development more accessible and efficient for professional developers. These enhancements are designed to streamline the development process, integrate advanced tools and provide innovative solutions for creating dynamic and interactive business portals. Key updates include:
- Bring your own code: Makers will be able to bring their own code to Power Pages, leveraging third-party, new-age code generation tools. This capability will make coding more accessible by enabling development through natural language. This approach will shift the programmer’s role from manual coding to guiding, testing and refining AI-generated code, making the process more intuitive and engaging while maintaining enterprise standards. This capability is in preview.
- Integration with Visual Studio Code (VS Code): Power Pages will integrate with VS Code, allowing developers to preview their portals directly from VS Code without leaving their development environment. This feature will include UI actions to run CLI commands and switch environments, streamlining the development process. This feature is in preview.
- Languages feature: Power Pages is also expanding its multilingual support, allowing customers to create websites in custom languages beyond the 45 languages currently available. This functionality allows all out-of-the-box components like forms, lists, multistep forms and card galleries to utilize content snippets for specifying content translation, enabling customers to build websites in a language of their choice. This feature is generally available and set to uplevel the creation process for multilingual portals.
- Adding agents in Power Pages: Makers will be able to add agents in Power Pages, including agents from Microsoft Copilot Studio. This means makers will be able to integrate multiple agents into their sites, enabling users to securely update their Microsoft Dataverse records or perform various tasks through these agents. Users will be able to effortlessly switch between agents, such as using chat agents for Q&A. This capability, in preview, is key for creating interactive and responsive business portals.
- Client API support for business rules: This update provides a robust set of functions within the Power Pages client library for easy interaction with business portals. This update is generally available.
- Intelligent list search and customization: This feature will use natural language to query large datasets and customize AI insights, making data interaction more intuitive and efficient. It is currently in preview.
Copilot in Power Pages, generally available, makes it easier for makers to design pages or leverage new templates. Together, these updates enhance the capabilities of Power Pages, helping developers create dynamic, interactive and secure business portals and drive innovation in business portal development.
Additional resources:
3.1.3. Microsoft unlocks the autonomous enterprise with MCP servers for Dynamics 365
Microsoft Dynamics 365 will be AI-ready with new Model Context Protocol (MCP) servers, in preview, that will make Dynamics 365 data and actions accessible for AI agents. Prebuilt Microsoft agents and custom agents built in Microsoft Copilot Studio or other agent platforms will be able to use the MCP servers to enrich context and execute end-to-end process orchestration across departments and systems. Features include:
- Dynamics 365 apps go MCP-native: Microsoft’s business apps will be MCP server-compliant, instantly elevating them to first-class citizens in the new agent-driven ecosystem.
- Copilot Studio: Copilot Studio will emerge as a premier MCP host, enabling business agents to easily communicate with all Dynamics 365 apps through standardized protocols.
- Cross-app intelligence at scale: Enterprises with multiple Dynamics 365 instances – a mix of MCP-compliant apps – will be able to orchestrate intelligent agent workflows that span business silos.
- Agent development simplified: Building agents for partners and customers will be radically streamlined. MCP standardization will remove complexity and accelerate time to value.
Additional resources:

4. Edge 4.1. Edge
4.1.1. AI APIs and Phi small language model allow developers to enhance web apps
Many web developers are looking for alternative methods to integrate AI features into their apps while maintaining data privacy. Cloud-based models raise privacy and cost concerns, while managing on-device models is complicated.
With the new AI APIs in Microsoft Edge, developers can now quickly and easily incorporate AI functionality into their web apps using models built into the browser. Furthermore, these APIs give sites and extensions access to Phi-4-mini, a 3.8 billion parameter model developed by Microsoft that compares favorably with significantly larger models in testing.
The new family of AI APIs in Edge includes the Prompt API, for prompting the model directly, and a set of Writing Assistance APIs for generating, summarizing and editing text, both now available as developer trials. The Translator API, for text translation, will be released in the next couple of months. With these APIs built into Edge, developers can streamline the development process and offload high-frequency AI tasks, thereby minimizing costs and effort.
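For a sense of what prompting a built-in model could look like from a web page, here is a minimal sketch modeled on the Chromium built-in AI proposals. The exact API surface Edge ships in its developer trial may differ, so the names and signatures below are assumptions.

```typescript
// Illustrative sketch of an in-browser Prompt API shape, based on the
// Chromium built-in AI proposals; Edge's developer-trial surface may differ.
// Inference runs entirely on-device against the built-in Phi-4-mini model.
declare const LanguageModel: {
  availability(): Promise<"available" | "downloadable" | "unavailable">;
  create(options?: { temperature?: number; topK?: number }): Promise<{
    prompt(input: string): Promise<string>;
    destroy(): void;
  }>;
};

async function summarizeOnDevice(text: string): Promise<string> {
  if ((await LanguageModel.availability()) === "unavailable") {
    throw new Error("On-device model not available in this browser.");
  }
  const session = await LanguageModel.create({ temperature: 0.3 });
  try {
    // The prompt never leaves the device, so sensitive text stays local.
    return await session.prompt(`Summarize in two sentences:\n\n${text}`);
  } finally {
    session.destroy();
  }
}
```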
For developers dealing with sensitive data or working in regulated industries, these APIs offer the privacy and security of on-device processing, eliminating the need to send data to external cloud services.
These experimental APIs are intended as potential web standards and will work across platforms, browsers and with other AI models. Developers can benefit from Edge offering access to the compact yet powerful Phi-4-mini model.
The new APIs are available in the Edge Canary and Dev channels, and the Edge team is inviting feedback to iterate and develop more AI APIs for the developer community.
Additional resources:
4.1.2. PDF translation in Microsoft Edge creates new documents in a few clicks
Navigating PDF content in a different language can be frustrating. Imagine buying a new device, but the instructions are in a foreign language. Or imagine working in a global company and needing to train a team on new policies, but the documents provided aren’t in the local language. This often leads to confusion and lost productivity.
Traditionally, readers had to translate PDFs by highlighting content line by line. This is time-consuming and exhausting and resulted in fragmented translations that lost original meaning and context.
Microsoft Edge will be able to translate full pages of PDFs from more than 70 languages. With just a few clicks, users will be able to open a PDF in Edge, click the Translate icon in the Edge address bar and quickly create a new document fully translated into the language of choice. Users will get real-time translations of entire PDF documents, eliminating the struggle to understand important documents.
PDF translation will be generally available next month, and Canary users can try it now.
Additional resources:
4.1.3. Automate tasks, summarize documents with Microsoft Copilot Chat in Edge for Business
AI can provide smart answers but using it repeatedly for the same tasks can feel monotonous. Additionally, sorting through vast amounts of web information can lead to extraneous details, making it frustrating and time-consuming to refine prompts for perfect answers.
Agents in Microsoft 365 Copilot, now in preview, will address this by automating tasks and providing more meaningful results. For example, Sales Assistant Agent will generate leads, track customer interactions and offer sales insights. Agents will be available in early June through Copilot Chat in the Microsoft Edge for Business sidebar and will enable users to access them without leaving the page and breaking their flow.
Copilot Chat in Edge for Business is getting better with the new ability to summarize online Microsoft 365 Word, Excel and PowerPoint documents. Available in the sidebar, summarization will help users digest complex information, save time and stay in their flow without leaving the document.
This feature is in preview for commercial users with a Microsoft 365 Copilot license.
Additional resources:
4.1.4. Block inappropriate sites at no cost on Edge for Business
For IT admins in schools and small businesses, safeguarding students and employees on the web is a daunting task. Every day, new websites pop up that are distracting at best and full of scams and malware at worst.
Web content filtering, in preview, is a key line of defense that Microsoft Edge for Business is offering at no additional cost to schools and small businesses that standardize on Edge for Business exclusively. Not all organizations can afford costly web content filtering solutions, and some solutions depend on the customer to provide their own lists of websites to block.
Web content filtering on Edge for Business is simple. Admins can block millions of inappropriate sites just by selecting categories and the block list updates daily.
Configuration is done in the Edge management service in the Microsoft 365 admin center, making the UI simple and deployment quick. Filtering even works when the device is off the organization’s network and includes smart defaults designed for schools, such as age-appropriate content. Clear reporting is available through Power BI.
This option is available for managed Windows devices running Windows 10 or newer OS. It requires a Microsoft 365 Education or Business Premium license and the use of Microsoft Intune. With filtering that applies to all web traffic, organizations that adopt Edge for Business exclusively can take advantage of a powerful tool to help create a secure and distraction-free online environment.
Additional resources:

5. Security 5.1. Security for AI
5.1.1. Microsoft Purview SDK will offer enterprise-grade data security and compliance controls
For enterprises to confidently adopt AI apps, developers need to build their AI solutions with enterprise-grade data security and compliance controls. Yet, the challenge lies in the implementation — security measures can often become a blocker in the development process, especially when developers lack easy-to-integrate solutions.
Microsoft Purview SDK, in preview, will provide REST APIs, documentation and code samples for embedding Microsoft Purview data security and compliance into AI apps directly from any integrated development environment.
By embedding the Microsoft Purview REST APIs into an AI app, developers enable the app to push prompt- and response-related data into Microsoft Purview, which automatically surfaces signals that help security and compliance teams discover, protect and govern data during investigations. Additionally, Microsoft Purview performs real-time classification of prompts to identify and block sensitive data from being accessed by the large language model; a minimal sketch of this call pattern follows the list below. This empowers organizations and data security admins to enable AI adoption with industry-leading data security and compliance capabilities, including the ability to:
- Accelerate the adoption of generative AI (GenAI) apps and agents across the enterprise.
- Gain comprehensive visibility into data security risks and posture across the AI portfolio with a single pane of glass.
- Reduce the risk of data oversharing and data leakage.
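The sketch below shows the general call pattern described above: before the app sends a user prompt to the model, it submits the prompt for classification, and it logs the model’s response the same way. The URL, payload shape and response fields are placeholders for illustration, not the actual Microsoft Purview SDK contract.

```typescript
// Hypothetical sketch of embedding data-security checks into an AI app.
// The endpoint, payload and response fields below are placeholders.
interface ClassificationResult {
  blocked: boolean;
  matchedSensitiveTypes: string[];
}

async function classifyContent(userId: string, content: string): Promise<ClassificationResult> {
  const response = await fetch("https://purview.example.com/api/processContent", { // placeholder URL
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.PURVIEW_TOKEN}`, // assumed Microsoft Entra ID app token
    },
    body: JSON.stringify({
      userId,
      content,
      application: "contoso-support-copilot", // hypothetical app identifier
    }),
  });
  return (await response.json()) as ClassificationResult;
}

async function guardedCompletion(
  userId: string,
  prompt: string,
  callLlm: (p: string) => Promise<string>,
): Promise<string> {
  const verdict = await classifyContent(userId, prompt);
  if (verdict.blocked) {
    return "This request was blocked because it contains sensitive data.";
  }
  const answer = await callLlm(prompt);
  await classifyContent(userId, answer); // also log the response for audit and eDiscovery
  return answer;
}
```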
Additional resources:
5.1.2. New Microsoft Purview capabilities for AI interactions within Azure AI workloads
With organizations building AI apps amid the rapid pace of AI innovation, security risks such as sensitive data leakage are amplified. Security teams need both visibility and controls to manage risks in custom-built AI solutions. Microsoft Purview will now natively integrate with Azure AI, enabling data security admins to discover, protect and govern data risks in AI interactions within Azure AI workloads. These updates, in preview, include:
- Microsoft Purview Data Security Posture Management (DSPM) for AI will enable data security admins to discover data security risks, such as sensitive data in user prompts and responses, and unethical AI usage, such as harmful and violent content, in AI apps and agents built by developers using Azure AI Foundry Services. Additionally, to understand the overall risk posture of these custom AI apps, admins will also see AI usage broken down by users’ risk levels and will be able to receive recommended actions to address these risks.
- In Insider Risk Management, data security admins with appropriate permissions will be able to detect and respond to risky AI interactions. For example, when a departing employee accesses a custom AI app built on Azure AI to retrieve an anomalous amount of sensitive data, data security admins with the right permissions will receive alerts and will be able to conduct investigations and leverage risk-adaptive controls, such as removing the employee’s access to the AI app.
- All AI interactions will be logged into Microsoft Purview Audit. Data compliance admins will be able to create Data Lifecycle Management policies to retain and/or delete AI interactions, conduct eDiscovery cases and detect and investigate noncompliant usage, such as harmful content, in Communication Compliance.
Additional resources:
5.1.3. Microsoft Defender for Cloud brings security insights into Azure AI Foundry
Microsoft Defender for Cloud, Microsoft’s cloud-native application protection platform (CNAPP), will integrate AI security posture and runtime threat protection for AI services in the Azure AI Foundry portal. With this integration, developers will gain visibility into security risks, vulnerabilities and security alerts from Defender for Cloud to protect against threats, such as jailbreak attacks, sensitive data leakage and wallet abuse, within the Azure AI Foundry environment. Integrated security insights will include:
- AI security posture recommendations that will surface potential misconfigurations and vulnerabilities within AI services and provide security best practices to remediate related risks to projects.
- Threat protection alerts for AI services that will give developers visibility into active threats on their AI services, with more than 15 detections enriched with Microsoft threat intelligence and accompanied by relevant mitigation steps.
Security insights from Defender for Cloud are in preview and will be made available in the Azure AI Foundry portal by June 2025.
Additional resources:
5.1.4. New Microsoft Purview capabilities for Copilot Studio now in preview
Organizations are leveraging the power of AI agents built in Microsoft Copilot Studio to drive business innovation and efficiency. Many are creating agents to engage directly with their customers. Organizations want to gain visibility into the activity of unauthenticated users and proactively manage potential risks in these AI agent interactions. For instance, a financial services firm may want to prevent the exposure of sensitive investment data or client records when deploying AI agents that interact with external users.
Microsoft Purview Data Security Posture Management (DSPM) for AI and Audit will support Copilot Studio agent interactions, specifically for agents created by organizations for their customers. These interactions will be visible in DSPM for AI, and data security admins will be able to review them via Audit. This enhancement will help admins stay informed about the data risks associated with their agents and help keep sensitive data secure. These capabilities are in preview.
Additional resources:
5.1.5. Data Loss Prevention controls for Microsoft 365 Copilot agents
Data oversharing and leakage is top of mind for organizations adopting generative AI (GenAI) technologies. According to Microsoft market research, 80% of business leaders cite data leakage by employees using AI as their top concern regarding GenAI adoption. Microsoft Purview Data Loss Prevention (DLP) controls will extend to agents in Microsoft 365 Copilot to help reduce the risk of AI-related oversharing at scale.
Last year, Microsoft announced DLP for Microsoft 365 Copilot, which will enable data security admins to exclude Microsoft SharePoint documents with specified sensitivity labels from being summarized or used to create responses in Microsoft 365 Copilot. This capability will be generally available in late June.
This capability will also extend to Microsoft 365 Copilot agents now in preview. Admins will be able to prevent sensitive content within a labeled document from being readily available to copy and paste into other apps or processed by Copilot for grounding data. Examples include confidential legal documents with highly specific verbiage that could lead to improper guidance if summarized by an AI agent or modified by users, or “internal only” documents with data that shouldn’t be copied and pasted into emails sent outside of the organization.
Additional resources:
5.1.6. Microsoft Entra Agent ID now in preview
Microsoft is reaching the first critical milestone in extending its complete identity and access solution to AI agents with the introduction of Microsoft Entra Agent ID. Now in preview, Entra Agent ID centers on first-party integration with Microsoft Copilot Studio and Azure AI Foundry.
It will tackle the AI agent sprawl problem by assigning a unique identifier to every agent in an environment. With Entra Agent ID, admins will be able to:
- See all AI agents created using Copilot Studio and Azure AI Foundry in one place.
- Know what those agents can access inside their organization.
Additional resources:
5.1.7. Azure AI Foundry evaluation now integrated with Microsoft Purview
AI regulations and standards like the EU AI Act introduce new compliance requirements that demand organizations take accountability for documenting AI apps and agents’ use cases and their controls to address risks. As developers build AI apps and agents, they need guidance and tools to help them evaluate risks based on compliance requirements organizations are subject to, as well as ways to share the control and evaluation insights with compliance and risk teams.
The Azure AI Foundry evaluation tool will be integrated with Microsoft Purview Compliance Manager. For example, for developers building an AI agent in Europe, their compliance teams may request that a Data Protection Impact Assessment (DPIA) and an Algorithmic Impact Assessment (AIA) be completed to support compliance with Article 9 (Risk Management System) and Article 17 (Technical Documentation) of the EU AI Act. Based on Microsoft Purview Compliance Manager’s step-by-step guidance on controls implementation and testing, compliance teams will be able to evaluate risks such as potential bias, cybersecurity vulnerabilities or lack of transparency in model behavior.
Once evaluations are conducted in Azure AI Foundry, developers will be able to obtain a report with documented risk, mitigations and residual risk for compliance teams to upload to Microsoft Purview Compliance Manager to support audits and provide evidence to regulators or external stakeholders. This update is in preview.
Additional resources:

6. Supporting the Agentic Web 6.1. Model Context Protocol
6.1.1. Microsoft contributes identity and registry standards to support MCP ecosystem
Microsoft and GitHub have joined the Model Context Protocol (MCP) Steering Committee and are delivering broad first-party support for MCP across their agent platform and frameworks, spanning GitHub, Copilot Studio, Dynamics 365, Azure, Azure AI Foundry, Semantic Kernel, Foundry Agents and Windows 11.
Microsoft and GitHub are announcing two new contributions to the MCP ecosystem to help advance secure, at-scale adoption of the open protocol, ultimately benefiting developers’ agentic innovation across platforms:
- Identity and authorization specification: Microsoft’s identity and security teams have collaborated with Anthropic, the MCP Steering Committee and the broader MCP community to design an updated authorization spec that allows MCP-connected apps to more securely connect to MCP servers. This spec enables people to use their Microsoft Entra ID or other trusted sign-in methods to give agents and large language model (LLM)-powered apps access to data and services such as personal storage drives or subscription services. This is an important step toward enabling agent-based experiences in enterprise and consumer contexts where trust and accountability are essential.
- Public, community-driven registry of MCP servers: GitHub and the MCP Steering Committee have collaborated to design a registry service for MCP servers. This registry service allows anyone to implement public or private, up-to-date, centralized repositories for MCP server entries and enable the discovery and management of various MCP implementations with their associated metadata, configurations and capabilities.
Both contributions are now generally available and reflect Microsoft’s broader commitment to advancing open standards and shared infrastructure for the next generation of agents.
Additional resources:
6.2. NLWeb
6.2.1. NLWeb brings conversational interfaces directly to the web
NLWeb, a new open project, plays a similar role to HTML for the agentic web. NLWeb makes it easy for websites to provide a conversational interface for users with the model of their choice and their own data. This allows users to interact directly with web content in a rich, semantic manner. Every NLWeb endpoint is also a Model Context Protocol (MCP) server, so websites can make their content easily discoverable and accessible to AI agents if they choose. This is publicly available.
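As a rough illustration, the sketch below queries an NLWeb-enabled site from client code, assuming the site follows the reference implementation’s /ask interface and returns schema.org-style items. The parameter and field names used here are assumptions.

```typescript
// Minimal sketch of asking an NLWeb-enabled site a natural-language question.
// The /ask path, "query" parameter and result fields are assumptions based on
// the NLWeb reference implementation.
async function askSite(siteUrl: string, question: string) {
  const url = `${siteUrl}/ask?query=${encodeURIComponent(question)}`;
  const response = await fetch(url, { headers: { Accept: "application/json" } });
  const result = await response.json();

  // Each item is expected to carry the schema.org metadata the site already publishes.
  for (const item of result.results ?? []) {
    console.log(item.name, "-", item.url);
  }
  return result;
}

askSite("https://recipes.example.com", "vegetarian dinners under 30 minutes");
```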
Additional resources:

7. Windows 7.1. Windows
7.1.1. Introducing Windows AI Foundry, a unified platform for local AI development
Several updates for local AI development on Windows that developers will be able to take advantage of include:
Windows AI Foundry, an evolution of Windows Copilot Runtime, offers a unified and reliable platform supporting the AI developer lifecycle from model selection, optimization, fine-tuning and deployment across client and cloud. Windows AI Foundry includes several capabilities:
- Windows ML is the foundation of the Windows AI platform and the built-in AI inferencing runtime on Windows. It enables developers to bring their own models and deploy them efficiently across the silicon partner ecosystem, including AMD, Intel, NVIDIA and Qualcomm, spanning CPU, GPU and NPU. Windows ML is an evolution of DirectML (DML) based on our learnings from the past year, listening to feedback from many developers, our silicon partners and our own teams developing AI experiences for Copilot+ PCs. This offers several benefits, including:
- Simplified deployment: Developers will be able to ship production apps without needing to package ML runtimes, hardware execution providers or drivers with their app. Windows ML will detect the target hardware on the client device, pull down the appropriate execution providers and select the right one for inference based on developer-provided configuration. This is in preview.
- Automatically adapts to future generations of AI hardware: Developers will be able to build AI apps confidently, even in a fast-changing silicon ecosystem. As new hardware becomes available, Windows ML will keep all required AI dependencies up to date and will adapt to new silicon while maintaining model accuracy and hardware compatibility.
- New tools to prepare and ship performant models: A robust set of tools in AI Toolkit for VS Code (AI Toolkit) will support model and app preparation – conversion to ONNX from PyTorch, quantization, optimization, compilation and profiling – to help developers ship production apps with proprietary or open-source models. These tools are being designed to simplify the process of preparing and shipping performant models via Windows ML without having to create multiple builds and complex logic. This is in preview.
Windows AI Foundry integrates Foundry Local and other model catalogs, like Ollama and NVIDIA NIMs, offering developers quick access to ready-to-use open-source models on diverse Windows silicon. Developers will be able to browse, test, interact with and deploy models in their local apps. Features, in preview, include:
- Rich catalog of ready-to-use OSS models: Foundry Local will automatically detect device hardware (CPU, GPU, and NPU) and list compatible models for developers to try.
- Command Line Interface (CLI): This will allow developers to use CLI commands such as “foundry model list” and “foundry model run” to browse, test and interact with models running on a local server. Developers will also be able to use the Foundry Local SDK to easily integrate Foundry Local in their apps.
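As a rough sketch of what calling a locally served model could look like, the example below assumes Foundry Local exposes an OpenAI-compatible chat completions endpoint on localhost once a model has been started with the CLI. The port number and model alias are assumptions for illustration.

```typescript
// Sketch of chatting with a model served by Foundry Local, assuming an
// OpenAI-compatible endpoint on localhost; port and alias are assumptions.
const localEndpoint = "http://localhost:5273/v1/chat/completions"; // assumed default port

async function chatLocally(prompt: string): Promise<string> {
  const response = await fetch(localEndpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "phi-4-mini", // assumed alias; Foundry Local picks the CPU/GPU/NPU variant
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content;
}

chatLocally("Explain what an NPU is in one sentence.").then(console.log);
```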
Ready-to-use APIs powered by on-device models for Windows development: Windows AI APIs offer a fast and easy path for Windows app developers to integrate AI capabilities that run locally on Copilot+ PCs. These APIs are powered by inbox models that ship with Windows. Features include:
- LoRA for Phi Silica: LoRA (Low-Rank Adaptation) makes fine-tuning more efficient by updating only a small subset of the model’s parameters with custom data. This allows improved performance on desired tasks without affecting the model’s overall abilities. This is available in preview on Snapdragon X Series NPUs and will be available on Intel and AMD Copilot+ PCs in the coming months. Developers can now access LoRA for Phi Silica in Windows App SDK 1.8 Experimental 2.
- Knowledge Retrieval for large language models (LLMs) and Semantic Search: Semantic Search APIs help developers create powerful search experiences using their own app data. These APIs power both semantic search (search by meaning, including image search) and lexical search (search by exact words), giving users more intuitive and flexible ways to find what they need. These search APIs run locally on all device types and offer seamless performance and privacy. On Copilot+ PCs, semantic capabilities are enabled for a premium experience. Beyond traditional search, these APIs also support retrieval-augmented generation (RAG), enabling developers to ground LLM output with their own custom data. These APIs are available in private preview today.
- Existing APIs move to stable release: Developers can now use certain text and image APIs in their published apps; these APIs run locally and only on Copilot+ PCs. This is generally available.
Additional resources:
- Blog: Advancing Windows for AI development: New platform capabilities and tools introduced at Build 2025
- Download: Visual assets
- Breakout: An overview of Windows Copilot Runtime
- Breakout: Bring your own model to Windows using Windows Copilot Runtime
- Breakout: Fastest & easiest way to integrate AI using Windows Copilot Runtime
7.1.2. Introducing native support for Model Context Protocol and App Actions on Windows
Boosting app discoverability, reach and engagement is essential in today’s competitive market. Microsoft aims to transform how users find and interact with apps on Windows through two new offerings: Model Context Protocol (MCP) on Windows and App Actions on Windows.
Model Context Protocol (MCP) on Windows will offer a standardized framework for AI agents to connect with native Windows apps, enabling them to easily participate in agentic interactions on Windows. Windows apps can expose specific functionality to augment the skills and capabilities of agents installed locally on a Windows PC. This will be available in a private developer preview with select partners in the coming months and will include these components:
- MCP Registry for Windows: A single, secure and trustworthy source to make MCP servers accessible to AI agents on Windows. Agents will be able to discover installed MCP servers on Windows PCs via the MCP Registry, leverage their expertise, and offer meaningful value to users.
- MCP Servers for Windows: This will include Windows system functionalities like File System, Windowing and Windows Subsystem for Linux as MCP Servers for agents to interact with. Developers will be able to wrap desired features and capabilities in their apps as MCP servers and make them available via MCP Registry for Windows.
Security is our top priority as we expand MCP capabilities on Windows. To learn more about the security approach, visit this blog.
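For orientation, the sketch below shows how an agent attaches to an MCP server using the official TypeScript MCP SDK. How agents will discover servers through the MCP Registry for Windows has not been detailed publicly, so the server command and tool name used here are placeholders.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Sketch of an agent connecting to an MCP server with the TypeScript MCP SDK.
// The server command and tool name are stand-ins; the Windows registry-based
// discovery flow is not yet public.
async function main() {
  const transport = new StdioClientTransport({
    command: "windows-mcp-filesystem", // placeholder for a registered Windows MCP server
    args: [],
  });

  const client = new Client({ name: "sample-agent", version: "1.0.0" });
  await client.connect(transport);

  // Ask the server what it can do, then invoke one capability.
  const { tools } = await client.listTools();
  console.log("Available tools:", tools.map(t => t.name));

  const result = await client.callTool({
    name: "list_directory", // assumed tool name
    arguments: { path: "C:\\Users\\Public\\Documents" },
  });
  console.log(result.content);

  await client.close();
}

main().catch(console.error);
```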
- App Actions on Windows is a new capability for developers that will also be available as built-in MCP servers, enabling apps to provide their functionality to agents. This new capability will help app developers increase the discoverability of their features. Developers will be able to build and test actions using:
- App Actions APIs: These APIs will enable developers to author actions for their desired features. Developers will be able to consume actions developed by other relevant apps to offer complementary functionality, thereby increasing the engagement time in their apps. These APIs are available now in Windows SDK 10.0.26100.4188 or greater.
- App Actions Testing Playground: This tool will provide developers with a dedicated environment to test the functionality and user experience of their actions. Developers can now download the testing tool via the Microsoft Store.
Additional resources:
7.1.3. Productivity improvements to Windows Developer tools
Improvements to popular Windows Developer tools including Terminal, WinGet, WSL and Microsoft PowerToys will enable developers to increase their productivity. These features will allow developers to focus on code and will include:
WinGet Configuration: Users will be able to effortlessly set up and replicate development environments using a single, reliable WinGet Configuration command. Developers will be able to capture the current state of their device, including apps, packages and tools (available in a configured WinGet source) into a WinGet Configuration file. WinGet Configuration will be updated to support Microsoft DSC V3. If installed apps and packages are DSC V3 enabled, the app settings will also be included in the generated configuration file. This will be generally available in summer 2025.
Windows Subsystem for Linux (WSL) is now open source: Open-sourcing WSL enables developers to access the source code of WSL and make enhancements to suit their needs by contributing pull requests to the WSL repository. It facilitates collaboration among WSL users, enabling them to engage in issue resolution and learn together as a community. This is generally available.
Advanced Windows Settings: Developers and power users often face challenges in customizing Windows to meet their unique needs due to hidden or obscure settings. Advanced Windows Settings will allow developers to effortlessly control and personalize their Windows experience. They will be able to access and configure powerful, advanced settings with just a few clicks, all from a central place within the Windows Settings app. These will include powerful settings like enabling File Explorer with GitHub version control. This is currently available in the Windows Insider Program.
Command Palette in PowerToys: Command Palette, the next evolution of PowerToys Run, enables developers to reduce context switching efforts by providing an easy way to access all their frequently used commands, apps and workflows from a single place. It is customizable, fully extensible and highly performant, empowering developers to manage interactions with their favorite tools effectively. This is now generally available.
Edit, the new command-line text editor on Windows: Windows will include a command-line text editor called Edit, installed by default in Windows. Developers will be able to access this editor by running “edit” in the command line. This will enable developers to edit files directly in the command line, staying in their current flows and minimizing unnecessary context switching. This is currently open source and will be available in the Windows Insider Program in summer 2025.
Additional resources:
- Blog: Advancing Windows for AI development: New platform capabilities and tools introduced at Build 2025
- Download: Visual assets
- Breakout: Boost your development productivity with Windows latest tools and tips
- Demo: Easily setup dev environments with WinGet and Microsoft DSC V3
- Demo: Extending your application with the new PowerToys Command Palette
7.1.4. VBS Enclave SDK and tooling makes building enclaves easier
The Virtualization-based security (VBS) Enclave SDK is a repository and set of NuGet packages containing APIs and tooling that will make VBS enclaves easier to develop.
The API projection layer will allow developers to define an interface between the host app and the enclave, while the tooling will do all the hard work to validate parameters and handle memory management, allowing developers to focus on the enclave logic.
The SDK will make it easy for developers to handle enclave creation, manage thread pools and report telemetry. This package is in preview.
Additional resources:
7.1.5. Windows security posture changes to help protect user privacy
Two important changes to Windows security posture will help keep users in control of their security and privacy.
Administrator protection is a new platform security feature that will prevent unintended elevation by ensuring that users validate with Windows Hello before allowing actions that require administrator privileges.
Access to sensitive resources such as camera, microphone and location (C/M/L) will soon require explicit user consent. The journey begins with Windows changing the desktop access switch for these resources from default ON to OFF, ensuring users have more control over which apps can access this data.
Desktop app developers are encouraged to update their apps to include package identity. Existing apps can register package identity by adding an identity package to their installer. For more information, see Grant package identity by packaging with external location.
Both features, now in preview, may impact apps that rely on previous platform behavior. App developers will need to validate their apps work as expected when Administrator protection is enabled and the default desktop toggle for C/M/L privacy settings is turned OFF.
Additional resources:
7.1.6. New improvements in Windows App SDK
The Windows App SDK now incorporates advanced Windows AI capabilities in Copilot+ PCs, enabling developers to easily integrate intelligent features into their apps. These enhancements include local AI functionalities, such as responding to incoming prompts, recognizing text within images, describing image contents, removing objects from pictures and more.
In addition to the new APIs, the June experimental release of the Windows App SDK will introduce a significant change: the Windows App SDK NuGet package will be converted to a NuGet metapackage. This will allow developers to select specific components for their apps, enabling them to include only the APIs and functionalities that are necessary for their apps. This is in preview.
Additionally, React Native for Windows now supports the React Native New Architecture by default. With version 0.80, React Native for Windows adds support for modern React features and, on Windows, uses the modern Windows App SDK and WinUI 3. This is generally available.
Additional resources:
7.1.7. Microsoft Store on Windows helps empower developers
Microsoft Store on Windows is designed to put developers in control with tools and features to help reach new customers. New features, in preview, include:
Easy publishing with zero onboarding fees: Individual developers can sign up and publish to Microsoft Store for free starting in June 2025. This will make Microsoft Store on Windows the first global digital store to waive the fee for publishing apps. Publishing to Microsoft Store will be easy with streamlined submission experiences like easier privacy policy hosting, improved help and support, more actionable certification reports and a policy change to allow a noninteractive progress bar for Win32 app installation.
Elevated distribution reach through App campaigns: Developers will be able to run App campaigns to promote their Windows app in the Microsoft Store and other Microsoft products. This will be available on the Microsoft Advertising Platform, currently in open-beta. App campaigns are an effective way to reach new audiences, drive incremental downloads of an app and easily track user actions in the app post-download.
Additionally, Microsoft Store is introducing a new set of capabilities for developers that will include:
- An expanded set of Health Report insights in Partner Center will give users enhanced visibility into app quality and performance. New health metrics like crash rate, hang rate and affected device counts will help users prioritize failures based on user impact. Proactive notifications for unusual spikes in failures and the ability to compare quality metrics across different app versions, architectures and devices will help prioritize failure sources. The enhanced acquisition report will include an install success rate metric, offering clear insights into app conversion.
- Support for updating Win32 apps directly in Microsoft Store, even if the updates are provided by their publishers. Previously, once MSI/EXE apps were installed, users could not get the newer version via Microsoft Store. Now, when a new version is available, users will be able to update to the latest version from downloads or the product’s detail page.
- Microsoft Store will display when apps were last updated, addressing one of the top requests from the developer community. Developers will be able to signal their commitment to app quality, enhance user confidence and drive business growth on Microsoft Store.
Additional resources:
7.1.8. Post-quantum cryptography algorithms now in preview
As the digital landscape continues to evolve, the emergence of quantum computing presents both significant opportunities and challenges. One of the most pressing concerns is the potential for quantum computers to compromise traditional cryptographic algorithms, which are fundamental to the security of online communications, financial transactions and data storage. In response to this imminent threat, post-quantum cryptography (PQC) has become a crucial area of preparation for Microsoft, its customers and its partners.
PQC capabilities will be integrated into Windows Insiders builds 27852 and higher, and into SymCrypt-OpenSSL version 1.9.0 and higher. This development will allow customers to begin their exploration and experimentation with PQC within their environments.
Module-lattice based key encapsulation mechanism (ML-KEM) and digital signature algorithm (ML-DSA) are some of the first quantum-safe cryptography algorithms approved by the National Institute of Standards and Technology. In December, ML-KEM and ML-DSA were added to Microsoft’s cryptographic library, SymCrypt. Now, ML-KEM and ML-DSA are available to Windows Insiders via updates to the Cryptography API: Next-Generation (CNG) libraries and Certificate and Cryptographic messaging functions. For Linux customers, the SymCrypt-OpenSSL (SCOSSL) provider provides an interface to use the SymCrypt algorithm implementations through the OpenSSL API surface.
Customers will be able to begin experimenting with ML-KEM in scenarios where public key encapsulation or key exchange is desired, to prepare for the “harvest now, decrypt later” threat. The addition of ML-DSA in CNG will enable customers to begin experimenting with PQC algorithms for scenarios that require verification of identity, integrity or authenticity using digital signatures.
The integration of PQC into Windows Insiders and SCOSSL marks an important first step in enabling customers to explore PQC within their environments. Quantum computing has significant potential to solve some of humanity’s greatest challenges and by proactively addressing the security concerns with current cryptographic standards, Microsoft is paving the way for a digital future that both realizes the benefits of quantum and mitigates the security risks. As quantum computing continues to advance, the adoption of PQC will be crucial in safeguarding data, communications and digital infrastructure.
Additional resources: