Microsoft Build 2025 Keynote: Everything Revealed, in 14 Minutes
Summary
The Microsoft Build 2025 keynote highlighted major advancements in AI, development tools, and enterprise solutions. Satya Nadella presented an overview of new features and tools aimed at enabling developers to build scalable, agent-driven applications for an open agentic web. Visual Studio and VS Code received updates that improve cross-platform development. GitHub Copilot in VS Code is being open-sourced, bringing AI coding capabilities directly into the editor's core. An autonomous agent for site reliability engineering was introduced alongside a host of new agent features. Microsoft also announced data integrations spanning Cosmos DB and Azure Databricks to simplify building AI applications, while Windows AI Foundry and Foundry Local bring AI app development to the edge and new enterprise-grade agents address industry-specific needs. Microsoft Discovery aims to advance scientific research by integrating AI into R&D processes across industries.
Highlights
Open-source GitHub Copilot brings AI coding power to VS Code 🤖
New autonomous agents for streamlining site reliability processes 🚀
Enhanced data handling with Cosmos DB and Azure Databricks for AI 📊
Windows AI Foundry allows developing AI apps at the edge 🖥️
Microsoft Discovery integrates AI into scientific research 🔬
Key Takeaways
Microsoft unveils open-source GitHub Copilot in VS Code with AI capabilities 🤖
Introduction of autonomous agents for site reliability engineering 🚀
Seamless integration of Cosmos DB and Azure Databricks for AI applications 📊
Windows AI Foundry enables local AI app development 🖥️
Microsoft Discovery to revolutionize scientific research with AI 🔬
Overview
Microsoft Build 2025 brought a broad set of updates and tools for developers and enterprises alike. The decision to open-source GitHub Copilot in VS Code marks a major shift in AI-powered coding, folding AI capabilities directly into the editor's core. Enhancements to Visual Studio and VS Code strengthen cross-platform development with improved debugging, Git tooling, and live previews at design time.
GitHub's new autonomous agents promise to streamline site reliability engineering by automating triage, root-cause analysis, and mitigation, then logging repair items as GitHub issues. Those items can be assigned to the Copilot coding agent, which takes Copilot from pair programmer to peer programmer and handles bug fixes, new features, and code maintenance autonomously.
Data solutions took center stage as Microsoft unveiled integration strategies for Cosmos DB and Azure Databricks, paving the way for seamless development of AI applications. The introduction of Windows AI Foundry and Foundry Local empowers developers to create AI-driven apps on the edge, broadening the scope of innovation across devices. Meanwhile, Microsoft Discovery aims to accelerate R&D across industries by embedding AI into the scientific process, potentially revolutionizing the scope and speed of research endeavors.
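The keynote also notes that agents built on Foundry will be able to store and retrieve conversational history in Cosmos DB. As a rough illustration of what that persistence layer involves when wired up by hand, here is a minimal sketch using the azure-cosmos Python SDK; the endpoint, key, and database/container names are placeholders, and Foundry's built-in integration is meant to handle this plumbing for you.

```python
# Minimal sketch: persisting agent conversation history in Azure Cosmos DB.
# Assumes the azure-cosmos Python SDK; endpoint, key, and names are placeholders.
import uuid
import datetime
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient("https://<your-account>.documents.azure.com:443/", credential="<your-key>")
db = client.create_database_if_not_exists("agent_data")
container = db.create_container_if_not_exists(
    id="conversation_history",
    partition_key=PartitionKey(path="/session_id"),
)

def save_turn(session_id: str, role: str, content: str) -> None:
    """Store one chat turn so an agent can recall it later."""
    container.upsert_item({
        "id": str(uuid.uuid4()),
        "session_id": session_id,
        "role": role,            # "user" or "assistant"
        "content": content,
        "timestamp": datetime.datetime.utcnow().isoformat(),
    })

def load_history(session_id: str):
    """Retrieve all turns for a session, oldest first."""
    query = "SELECT * FROM c WHERE c.session_id = @sid ORDER BY c.timestamp"
    return list(container.query_items(
        query,
        parameters=[{"name": "@sid", "value": session_id}],
        partition_key=session_id,
    ))

save_turn("demo-session", "user", "Summarize yesterday's deployment incidents.")
print(load_history("demo-session"))
```

Partitioning on the session ID keeps each conversation's turns co-located, so an agent can reload its history with a single single-partition query.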
Chapters
00:00 - 00:30: Introduction & Vision for Open Agentic Web The chapter introduces the vision of building an open agentic web in 2025. It emphasizes the transition from a few applications with vertically integrated stacks to a platform that supports an open, scalable agentic web, with the focus on expanding opportunity for developers across every layer.
00:30 - 01:30: Visual Studio & VS Code Updates At the Build conference, new updates were announced for Visual Studio and VS Code. Visual Studio, the most powerful IDE for .NET and C++, now includes .NET 10 support, a live preview at design time, improvements to Git tooling, and a new debugger for cross-platform apps. Additionally, VS Code has received some recent updates.
01:30 - 02:30: GitHub & AI Integration in VS Code The 100th release has been shipped, featuring enhanced multi-window support and improved staging view directly in the editor. GitHub remains a pivotal platform for developers, with GitHub Enterprise gaining significant traction within the enterprise sector. There is a strong focus on supporting developers in building a variety of applications with an emphasis on trust and security.
02:30 - 03:30: Autonomous AI Agents & Site Reliability Engineering The chapter discusses the increased importance of compliance, auditability, and data residency. It highlights how GitHub Copilot has evolved within VS Code, making AI central to how developers code, and announces the open-sourcing of Copilot in VS Code. These AI-powered capabilities are being integrated directly into the core of the VS Code open-source repository.
03:30 - 05:30: Microsoft Teams Enhancements & AI Capabilities The chapter discusses enhancements in Microsoft Teams and its AI capabilities. It highlights how Copilot's functionalities are integrated into app modernization processes, including upgrading frameworks such as Java and .NET, and migrating on-premises applications to the cloud. Additionally, the introduction of an autonomous agent for site reliability is mentioned.
05:30 - 08:30: Enterprise-Grade AI Agents & Customized Models The chapter discusses the role of AI agents in automating and improving incident management. It highlights the SRE agent, which triages, identifies root causes, and mitigates issues automatically, then logs an incident management report as a GitHub issue containing all the repair items. Those repair items can be assigned to the GitHub Copilot coding agent built into GitHub, taking Copilot from pair programmer to peer programmer.
08:30 - 11:30: Introduction of Grok & Foundry Services The chapter highlights a major update, described as the biggest since the launch of Teams, that brings together chat, search, notebooks, create, and agents. It also covers the autonomous coding capability that lets issues be assigned to Copilot for bug fixes, new features, and code maintenance, now available to all users.
11:30 - 14:30: Foundry Local & Windows AI Foundry This chapter discusses the intuitive scaffolding that ties these pieces together, described as the UI for AI. It highlights how grounding chat in both web and work data is a game changer, especially with pages, and how search works across a wide range of applications like Confluence, Google Drive, Jira, and ServiceNow, beyond just M365 data. It also mentions the ability to create heterogeneous collections of data using notebooks.
14:30 - 17:30: Integration of Cosmos DB & AI in Data Services The chapter covers gathering diverse content such as chats, pages, documents, and emails into a single collection and generating audio overviews or podcasts from it. It also notes the ability to create new content from existing data, such as turning a PowerPoint presentation into an explainer video or generating images. Special mention is made of agents like Researcher, which synthesizes information across web and enterprise sources and is described as a significant advancement.
17:30 - 21:30: Scientific Research & Microsoft Discovery The chapter touches on Microsoft's push to bring AI into research and analysis workflows. It explains how deep chain-of-thought reasoning can be applied to any topic or project, and how the Analyst agent derives insights, forecasts, and visualizations from raw data across multiple source files, such as uploaded Excel files.
21:30 - 25:00: Conclusion & Microsoft’s Systems Approach The chapter covers building multiplayer agents that can be invoked in a chat or meeting. The Teams AI library now supports MCP, and agent-to-agent (A2A) communication can be enabled with a single line of code; developers can also add episodic or semantic memory using Azure search and a new retrieval system. It closes by emphasizing the new opportunity for developers, including publishing agents to the agent store for discovery across Copilot and Teams.
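Several of the announcements above lean on the Model Context Protocol (MCP), from the Teams AI library to the built-in MCP servers coming to Windows. For orientation, here is a minimal sketch of what an MCP server exposing a single tool looks like, using the FastMCP helper from the open-source MCP Python SDK (pip install mcp); the server and tool names are made up for illustration and are unrelated to the Windows built-in servers.

```python
# Minimal sketch of an MCP server exposing one tool, using the MCP Python SDK's
# FastMCP helper. Names here are illustrative only; the built-in Windows MCP
# servers mentioned in the keynote are separate components.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("build-notes")  # hypothetical server name

@mcp.tool()
def summarize_release(version: str) -> str:
    """Return a short, canned summary for a release version (stand-in for real logic)."""
    return f"Release {version}: 100th open release, multi-window support, staging view in editor."

if __name__ == "__main__":
    # Serves over stdio so an MCP-compatible client (for example, an agent host)
    # can discover and call the tool.
    mcp.run()
```

An MCP-compatible client can launch this server, discover the summarize_release tool, and call it without any bespoke integration code, which is the interoperability the keynote keeps returning to.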
Microsoft Build 2025 Keynote: Everything Revealed, in 14 Minutes Transcription
00:00 - 00:30 [Applause] Good morning and welcome to Build. And here we are in 2025 building out this open agentic web at scale. And we're going from these few apps with vertically integrated stacks to more of a platform that enables this open, scalable agentic web. More importantly, it's all about expanding that opportunity for
00:30 - 01:00 developers across every layer. We have a bunch of new updates we're rolling out at Build, starting with Visual Studio. It is the most powerful IDE for .NET and C++, and we're making it even better: .NET 10 support, a live preview at design time, improvements to Git tooling, a new debugger for cross-platform apps, and much, much more. And when it comes to VS Code, just a couple of weeks ago, we
01:00 - 01:30 shipped our 100th release in the open. It included improved multi-window support and made it easier to view staged changes directly from within the editor. And GitHub continues to be the home for developers. GitHub Enterprise has tremendous momentum in the enterprise. And we're doubling down for developers building any application. Trust, security,
01:30 - 02:00 compliance, auditability, and data residency are even more critical today. As GitHub Copilot has evolved inside VS Code, AI has become so central to how we code. And that's why we're open-sourcing Copilot in VS Code. We're really excited about this. Starting today, we will integrate these AI-powered capabilities directly into the core of VS Code, bringing them into
02:00 - 02:30 the same open-source repo that powers the world's most loved dev tool. In fact, we're building app modernization right into agent mode. So Copilot is now capable of upgrading frameworks, like Java 8 to Java 21 or .NET 6 to 9, and migrating any on-premises app to the cloud. And the next thing we're introducing is an autonomous agent for site reliability
02:30 - 03:00 engineering, or SRE. The SRE agent starts automatically triaging, root-causing, and mitigating the issue, and then it logs the incident management report as a GitHub issue with all the repair items. And from there you can even assign the repair items to GitHub Copilot, a full coding agent built right into GitHub, taking Copilot from being a pair programmer to a peer programmer. You can
03:00 - 03:30 assign issues to Copilot, bug fixes, new features, code maintenance, and it'll complete these tasks autonomously. And today, I'm super excited that it's now available to all of you. I don't think since Teams launched we've had an update of this level, and it really brings together chat, search, notebooks, create, and agents all into this one
03:30 - 04:00 scaffolding that's intuitive. I always say this is the UI for AI. Chat, for example, is grounded both on web data as well as your work data, and that's the game changer, especially with pages. Search works across all of your applications, whether it's Confluence or Google Drive or Jira or ServiceNow, not just M365 data. With notebooks, I can now create these heterogeneous collections of data,
04:00 - 04:30 right? In fact, I can have chats and pages and any documents and emails all in that collection, and then I can get audio overviews or podcasts out of it. I can use create to turn a PowerPoint into a new explainer video or generate an image. And when it comes to agents, we have a couple of special agents like Researcher. It's been perhaps the biggest game changer for me because it's synthesizing across both
04:30 - 05:00 the web and enterprise sources, applying deep chain-of-thought reasoning to any topic or any project. Analyst goes from raw data across multiple source files. I can just upload a bunch of Excel files; it will get the insights, it'll do forecasts, it'll do all the visualizations. All of the agents you build can now show up in Teams and in Copilot. And you can ask questions, assign action items, or kick off a workflow by just at-mentioning an
05:00 - 05:30 agent in a chat or meeting. And with the Teams AI library, building multiplayer agents is easier than ever. It now supports MCP, and with just one line of code you can even enable A2A, and you can add things like episodic or semantic memory by using Azure search and a new retrieval system, which I'll talk about later. And as a developer you can now publish, and
05:30 - 06:00 this is the biggest thing: you can build an agent, publish it to the agent store, and have it discovered and distributed across both Copilot and Teams, providing you access to hundreds of millions of users and unlocking that opportunity. Today, we're introducing a new class of enterprise-grade agents you can build using models fine-tuned on your company's data, workflows, and style. We call it Copilot
06:00 - 06:30 Tuning. Copilot can now learn your company's unique tone and language, and soon it'll go even further, understanding all of the company's specific expertise and knowledge. All you need to do is seed the training environment with a small set of references and kick off a training run. The customized model inherits the permissions of all the source content, and once integrated into the agent, it can be deployed to authorized users. Our
06:30 - 07:00 new model router will automatically choose the best OpenAI model for the job. No more of those manual model selections. Our approach today goes from having the apps or agents you build bind to only one model to truly becoming multi-model. That's why today we're thrilled to announce Grok from xAI is coming to Azure. When you have multiple models,
07:00 - 07:30 what you need is a new capability in how you use these models. And now you can provision throughput once on Foundry and use all of that provisioned throughput across multiple models, including Grok. That's just a game changer in terms of how you think about models and model provisioning. And the Foundry Agent Service lets you build declarative agents with just a few lines of code, right in the portal.
07:30 - 08:00 For complex workflows it supports multi-agent orchestration, and I'm excited to share that the Agent Service is now generally available. We're making it straightforward, for example, for you to connect Foundry to your container app or functions, and deploy any open-source model into AKS, whether it's in the cloud or in hybrid mode with Arc. And you can now take a model, fine-tune it or post-train it in Foundry, and then drop
08:00 - 08:30 it right into Copilot Studio so that you can use that post-trained model to automate a workflow or build an agent. The healthcare agent orchestrator that Stanford used is now available to everyone in Foundry. It's pretty awesome. We now have new observability features coming to Foundry to help you monitor and manage AI in production. You can track the impact, quality, safety, as well as cost, all in one place. With Entra ID, agents now
08:30 - 09:00 get their own identity, permissions, policies, and access controls. The agents you build in Foundry and Copilot Studio show up automatically in an agent directory in Entra. We're also partnering with ServiceNow and Workday to bring automated provisioning and management to their agents via Entra. And when it comes to data governance, Purview now integrates with Foundry. So when you write an agent,
09:00 - 09:30 automatically, because of Purview, you can ensure end-to-end data protection. Another massive safety consideration. And on the security side, Defender now integrates with Foundry, so your agents are also protected, just like an endpoint would be, from threats like wallet abuse or credential theft with Defender. Now we want to bring the power of this app server and app building capability to the edge and
09:30 - 10:00 clients as well with Foundry Local, which we're announcing today. It includes a fast, high-performance runtime, models, agents as a service, and a CLI for local app development. And yes, it's fully supported on Windows and the Mac. We're excited to announce the Windows AI Foundry. Windows AI Foundry is what we in fact used
10:00 - 10:30 ourselves internally to build our features and SDK, and now we're extending this platform to support the full dev life cycle, not just on Copilot+ PCs but across CPUs, GPUs, NPUs, and in the cloud. So you can build your application and have it run across all of that silicon. And Foundry Local is built into Windows AI Foundry, so you can tap into this rich catalog of pre-optimized open-source models that
10:30 - 11:00 you can run locally on your device. We're announcing native support for MCP in Windows. Windows will now include several built-in MCP servers, like file system, settings, app actions, as well as windowing. And we're adding a native MCP registry that lets MCP-compatible clients discover the secure MCP servers that have been vetted by us for security and performance, all while
11:00 - 11:30 keeping you in control. We first announced Bash on Ubuntu on Windows nearly 10 years ago. It subsequently became what we today call WSL. Today we're making WSL fully open source. And so we're announcing today, and you all should go check out the code in the GitHub repo, NLWeb. It is a way for anyone who has a website or an API already to very easily make their
11:30 - 12:00 website or their API an agentic application. We're integrating Cosmos DB directly into Foundry. So that means any agent can store and retrieve things like conversational history, and soon they'll also be able to use Cosmos for all their RAG application needs. And we're taking it further with Azure Databricks, connecting your data in Genie spaces or in AI
12:00 - 12:30 functions to Foundry. The other very cool capability is that now, inside of a PostgreSQL query, you can have LLM responses directly integrated. We're bringing Cosmos DB to Fabric too, because AI apps need more than just structured data. They need semi-structured data, whether it's text, images, or audio. And with Cosmos in Fabric and your data instantly available alongside SQL, you
12:30 - 13:00 can now unify your entire data estate and make it ready for AI. And there's a lot more. In fact, we are even building our digital twin builder right into Fabric. Now you can very easily build digital twins with no code, low code. As you can see here, you can map the data from your physical assets and systems super fast. We're also announcing shortcut transformations in OneLake. You can think of this as AI-driven
13:00 - 13:30 ETL. You can apply all these pre-built AI-powered transformations, you know, audio to text or sentiment analysis as the data is coming in, summarization, all powered by Foundry, straight into Fabric. So in fact the largest GB200-based supercomputer is going to be on Azure. And we're very, very excited about scaling this and making it available to all of you as developers. We're bringing together the entire stack
13:30 - 14:00 I talked about today and saying, look, let's apply it to science and the scientific workflow, the scientific process. That's our ambition with Microsoft Discovery, which we are announcing today. It understands the nuanced knowledge in the scientific domain, from the public domain as well as your own data if you're a biopharma company. Discovery is built on Foundry, bringing advanced agents highly
14:00 - 14:30 specialized in R&D, not just for reasoning but for conducting research itself. It's great to see how companies across every industry are already using Discovery to accelerate their R&D, and I can't wait to see this in the hands of more R&D labs all over and what they can do. So that was, you know, a quick, comprehensive, whatever you want to call it, walk through the full stack and how we're creating new opportunity for you across the agentic web. We're
14:30 - 15:00 taking really a systems approach, a platform approach, which you can expect from Microsoft, across every layer of the stack.
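For developers who want to try the Foundry Agent Service mentioned in the keynote, the sketch below shows the general shape of creating and running a declarative agent from code. It assumes the preview azure-ai-projects Python SDK; the connection string, model deployment name, agent name, and instructions are placeholders, and exact method and parameter names have shifted across preview releases, so treat this as an orientation sketch rather than a definitive recipe.

```python
# Orientation sketch: a declarative agent on the Foundry Agent Service.
# Assumes the preview azure-ai-projects SDK; method and parameter names
# may differ by SDK version, and all string values are placeholders.
from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient

project = AIProjectClient.from_connection_string(
    conn_str="<your-foundry-project-connection-string>",  # placeholder
    credential=DefaultAzureCredential(),
)

# Declare the agent: a model deployment plus natural-language instructions.
agent = project.agents.create_agent(
    model="gpt-4o",  # placeholder deployment name
    name="sre-helper",
    instructions="Triage incident reports and propose concrete repair items.",
)

# Start a conversation thread and post a user question.
thread = project.agents.create_thread()
project.agents.create_message(
    thread_id=thread.id,
    role="user",
    content="Service X returns 503s after the last deploy. What should I check first?",
)

# Run the agent against the thread; the keyword for the agent reference
# (agent_id vs. assistant_id) has varied across preview versions.
run = project.agents.create_and_process_run(thread_id=thread.id, agent_id=agent.id)

# Inspect the resulting messages, including the agent's reply.
messages = project.agents.list_messages(thread_id=thread.id)
print(messages)
```

The same declarative definition can be created with a few clicks in the Foundry portal, which is the path the keynote emphasizes; the code route matters mainly when agents need to be created and versioned as part of a larger application.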