Microsoft is a Leader in the 2025 Gartner® Magic Quadrant™ for Cloud-Native Application Platforms 

We’re proud to announce that Microsoft has been named a Leader in the 2025 Gartner® Magic Quadrant™ for Cloud-Native Application Platforms for the second year in a row, and positioned furthest to the right in Completeness of Vision. We believe this recognition reflects our continued product innovation, seamless developer experience, and AI leadership, enabling customers to innovate faster with cloud-native apps and AI agents.

As AI reshapes the application landscape, Microsoft continues to lead with a comprehensive application platform that spans web apps, APIs, event-driven applications, serverless functions, containers, and AI agents, backed by global scale and deep enterprise expertise. We’re committed to helping developers and customers innovate with AI while delivering scalable, cost-efficient operations for organizations of all sizes.

Read more about the Magic Quadrant for Cloud-Native Application Platforms

A developer-first experience, built for productivity 

We continue to invest deeply in improving the developer experience across our application platform—offering choice, simplicity, and integration at every layer. Whether customers are building with containers, functions, APIs, or web frameworks, Azure provides a seamless and productive environment to accelerate from idea to impact. Azure offers: 

Azure App Service: An enterprise-grade platform-as-a-service (PaaS) for web apps, with support for .NET, Java, Node.js, Python, and PHP runtimes, Windows and Linux containers, custom domain management, and deep integration with GitHub and DevOps pipelines.

Azure Container Apps: A serverless container service ideal for microservices and event-driven workloads, now enhanced with scaling improvements, serverless GPU support, and Azure AI Foundry integration. 

Azure Functions: A powerful serverless compute platform for event-driven architectures. We recently added enhanced performance with the Flex Consumption plan, .NET 8 remote MCP server support, and improved support for long-running durable workflows. 

Azure Static Web Apps: A managed service for building full-stack web applications using frameworks like React, Vue, Angular, and Blazor, with fully managed global distribution of static content. 

Agentic DevOps: All of these services are integrated with GitHub Copilot and Visual Studio. GitHub Copilot has transformed the way we code, which is why 20 million developers rely on it to build faster, and 50 million developers actively use Visual Studio and Visual Studio Code each month. With Agentic DevOps using GitHub Copilot and Azure SRE Agent, developers can seamlessly go from code to cloud scale faster, accelerating every stage of the application lifecycle. 

Powering the next wave of AI-native apps 

AI is fundamentally changing how software is built and experienced. From personalized digital experiences to autonomous agents, applications increasingly rely on embedded intelligence, real-time inference, and orchestration of complex workflows.

Microsoft is leading the way in AI innovation with Azure AI Foundry—unifying agents, models, and tools with built-in enterprise-readiness capabilities such as tracing, monitoring, and evaluations. Azure AI Foundry offers a rich catalog of AI models such as OpenAI’s GPT-5, GPT-4o, Meta’s Llama, Microsoft’s Phi-4, and more. AI Foundry integrates with our application platform services like Azure Container Apps and Azure Functions to enable developers to build agentic applications using composable, event-driven, and containerized building blocks. 

In the past year, we introduced key innovations across our application platform to make Azure the home for AI-native applications: 

Azure Container Apps Serverless GPUs let customers run AI model inference on demand without managing infrastructure. Serverless GPUs help teams scale generative AI workloads with ease and cost efficiency. 

Dynamic sessions in Azure Container Apps offer secure, on-demand, sandboxed compute environments, ideal for large language model (LLM) generated code, AI agents, and ephemeral workloads. 

Azure Container Apps integration with Azure AI Foundry lets customers deploy models from the Foundry model catalog directly to Azure Container Apps, simplifying the model deployment experience. 

Sidecars in Azure App Service simplify AI-native app deployment by integrating small language model (SLM) hosting, routing, and scaling into existing web apps. 

By combining apps, data, and AI in a single platform, Azure enables organizations to build and scale the next generation of intelligent, adaptive applications. 

Built to scale securely and cost effectively 

Applications need to do more than run: they must scale, perform reliably, and remain cost-effective. Azure is designed to meet the demands of enterprises and startups alike with built-in capabilities that reduce operational overhead and improve efficiency. Recent enhancements include: 

Azure App Service Premium v4 plan (public preview) brings a next-generation, fully managed PaaS experience on both Windows and Linux with superior performance, scalability, and cost efficiency powered by modern Azure hardware. It’s built to reduce total cost of ownership, with early testing showing up to 24% savings for Windows web apps compared to the previous Premium v3 tier. 

Azure Functions Flex Consumption offers concurrency‑based scaling, instantaneous scale‑from‑zero up to 1,000 instances with no cold starts via the “Always Ready” feature, and seamless virtual network integration. 

App Service plans with 2+ instances automatically support Availability Zones without extra setup, while still maintaining a 99.99% SLA. Support for regions with two zones, enhanced zone visibility, and mutable zone redundancy settings further simplify deployment and reduce costs. 

Customers are innovating with Azure’s application platform 

We’re honored by this recognition and thankful for the trust our customers and partners place in us. Their innovation and feedback continue to drive our roadmap and our mission. Here are just a few examples of that innovation in practice. 

Coca‑Cola leveraged Azure AI Foundry and Azure application services to build a custom, real‑time conversational “Santa” AI model that interacted with over one million people across 43 markets in 26 languages—launching the immersive “Create Real Magic” holiday campaign in just 60 days. The initiative showcased how the company’s multi‑phase migration to an AI‑ready Azure platform enabled rapid innovation through scalable services like Azure AI Speech, Functions, and Container Apps. 

Medigold Health migrated its applications to Azure App Service and adopted Azure OpenAI Service, along with Azure Cosmos DB and Azure SQL Database, to automate clinicians’ report generation and significantly cut down administrative effort. This transformation led to a 58% increase in clinician retention and greatly improved job satisfaction and workflow efficiency. 

The NFL enhanced its Combine App with a real‑time conversational AI assistant, powered by Azure OpenAI Service, Azure Container Apps, and Azure Cosmos DB, enabling scouts and coaches to ask natural‑language questions and receive fast, accurate player insights while drills are still underway. This innovation eliminated hours of manual data crunching, replacing it with instantaneous “grab‑and‑go” insights and transforming talent evaluation during the Combine. 

Audi AG used Azure AI Foundry, Azure App Service, and Azure Cosmos DB to rapidly deploy secure, scalable AI assistants that addressed urgent HR needs. The project delivers fast, modular access to information, earning leadership and employee confidence. Audi is moving from experimentation to operational AI, ready for deeper system integration and next-gen capabilities like multi-agent orchestration. 

As AI transforms the application landscape, Microsoft is committed to helping customers build what’s next with speed, intelligence, and resilience. 

Learn more about Azure

Explore Azure App Service, Azure Container Apps, and Azure Functions. 

Get started building with AI App Templates. Edit and deploy to Azure using Visual Studio Code or GitHub Codespaces. 

Get expert guidance from skilled Microsoft partners through the new Azure Accelerate program. 

Gartner, Magic Quadrant for Cloud-Native Application Platforms, By Tigran Egiazarov, Mukul Saha, Prasanna Lakshmi Narasimha, 4 August 2025. 

*Gartner is a registered trademark and service mark and Magic Quadrant is a registered trademark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and are used herein with permission. All rights reserved. 

This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available upon request here. 

Gartner does not endorse any vendor, product, or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose. 
The post Microsoft is a Leader in the 2025 Gartner® Magic Quadrant™ for Cloud-Native Application Platforms appeared first on Microsoft Azure Blog.
Source: Azure

Microsoft’s open source journey: From 20,000 lines of Linux code to AI at global scale

Microsoft’s engagement with the open source community has transformed the company from one-time skeptic to one of the world’s leading open source contributors. In fact, over the past three years, Microsoft Azure has been the largest public cloud contributor (and the second largest overall contributor) to the Cloud Native Computing Foundation (CNCF). So, how did we get here? Let’s look at some milestones in our journey and explore how open-source technologies are at the heart of the platforms powering many of Microsoft’s biggest products, like Microsoft 365, and massive-scale AI workloads, including OpenAI’s ChatGPT. Along the way, we have also introduced several open-source projects inspired by our own experiences, giving back to the community and accelerating innovation across the ecosystem.  

Innovate faster with open source on Azure

Embracing open source: Key milestones in Microsoft’s journey

2009—A new leaf: 20,000 lines to Linux. In 2009, Microsoft contributed more than 20,000 lines of code to the Linux kernel, initially Hyper‑V drivers, under the GNU General Public License, version 2 (GPLv2). It wasn’t our first open source contribution, but it was a visible moment that signaled a change in how we build and collaborate. By 2011, Microsoft was among the top 5 companies contributing to Linux. Today, 66% of customer cores in Azure run Linux.

2015—Visual Studio Code: An open source hit. In 2015, Microsoft released Visual Studio Code (VS Code), a lightweight, open-source, cross-platform code editor. Today, Visual Studio and VS Code together have more than 50 million monthly active developers, with VS Code itself widely regarded as the most popular development environment. We believe AI experiences can thrive by leveraging the open-source community, just as VS Code has successfully done over the past decade. With AI becoming an integral part of the modern coding experience, we’ve released the GitHub Copilot Chat extension as open source on GitHub.

2018—GitHub and the “all-in” commitment. In 2018, Microsoft acquired GitHub, the world’s largest developer community platform, which was already home to 28 million developers and 85 million code repositories. This acquisition underscored Microsoft’s transformation. As CEO Satya Nadella said in the announcement, “Microsoft is all-in on open source… When it comes to our commitment to open source, judge us by the actions we have taken in the recent past, our actions today, and in the future.” In the 2024 Octoverse, GitHub reported 518 million public or open-source projects, over 1 billion contributions in 2024, about 70,000 new public or open-source generative AI projects, and about a 59% year-over-year surge in contributions to generative AI projects. 

Open source at enterprise scale: Powering the world’s most demanding workloads 

Open-source technologies like Kubernetes and PostgreSQL have become foundational pillars of modern cloud-native infrastructure—Kubernetes is the second largest open-source project after Linux and now powers millions of containerized workloads globally, while PostgreSQL is one of the most widely adopted relational databases. Azure Kubernetes Service (AKS) and Azure Database for PostgreSQL take the best of these open-source innovations and elevate them into robust, enterprise-ready managed services. By abstracting away the operational complexity of provisioning, scaling, and securing these platforms, AKS and managed PostgreSQL let organizations focus on building and innovating. The combination of open source flexibility with cloud-scale reliability allows services like Microsoft 365 and OpenAI’s ChatGPT to operate at massive scale while staying highly performant.

COSMIC: Microsoft’s geo-scale, managed container platform powers Microsoft 365’s transition to containers on AKS. It runs millions of cores and is one of the largest AKS deployments in the world. COSMIC bakes in security, compliance, and resilience while embedding architectural and operational best practices into our internal services. The result: drastically reduced engineering effort, faster time-to-market, and improved cost management, all while scaling to millions of monthly users around the world. COSMIC uses Azure and open-source technologies to operate at planet-wide scale: Kubernetes Event-driven Autoscaling (KEDA) for autoscaling, and Prometheus and Grafana for real-time telemetry and dashboards, to name a few.
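The KEDA autoscaling pattern mentioned above can be sketched with a minimal ScaledObject manifest. The deployment name, queue name, and thresholds below are hypothetical illustrations, not COSMIC’s actual configuration:

```yaml
apiVersion: keda.sh/v1alpha1
kind: ScaledObject
metadata:
  name: orders-consumer-scaler    # hypothetical name
spec:
  scaleTargetRef:
    name: orders-consumer         # the Deployment KEDA scales
  minReplicaCount: 0              # scale to zero when the queue is empty
  maxReplicaCount: 100
  triggers:
    - type: azure-servicebus
      metadata:
        queueName: orders
        messageCount: "5"         # target queue messages per replica
      authenticationRef:
        name: servicebus-auth     # TriggerAuthentication holding the connection secret
```

KEDA watches the queue depth and adjusts the deployment’s replica count accordingly, which is what enables event-driven, scale-to-zero behavior without custom autoscaling code.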

OpenAI’s ChatGPT: ChatGPT is built on Azure using AKS for container orchestration, Azure Blob Storage for user and AI-generated content, and Azure Cosmos DB for globally distributed data. The scale is staggering: ChatGPT has grown to almost 700 million weekly active users, making it the fastest-growing consumer app in history.1 And yet, OpenAI operates this service with a surprisingly small engineering team. As Microsoft’s Cloud and AI Group Executive Vice President Scott Guthrie highlighted at Microsoft Build in May, ChatGPT “needs to scale … across more than 10 million compute cores around the world,” …with approximately 12 engineers to manage all that infrastructure. How? By relying on managed platforms like AKS that combine enterprise capabilities with the best of open source innovation to do the heavy lifting of provisioning, scaling, and healing Kubernetes clusters across the globe. 

Consider what happens when you chat with ChatGPT: Your prompt and conversation state are stored in an open-source database (Azure Database for PostgreSQL) so the AI can remember context. The model runs in containers across thousands of AKS nodes. Azure Cosmos DB then replicates data in milliseconds to the datacenter closest to the user, ensuring low latency. All of this is powered by open-source technologies under the hood and delivered as cloud services on Azure. The result: ChatGPT can handle “unprecedented” load—over one billion queries per day, without a hitch, and without needing a giant operations team. 

Deploy containers on Azure Kubernetes Service

What Azure teams are building in the open

At Microsoft, our commitment to building in the open runs deep, driven by engineers across Azure who actively shape the future of open-source infrastructure. Our teams don’t just use open-source technologies; they help build and evolve them.  

Our open-source philosophy is straightforward: we contribute upstream first and then integrate those innovations into our downstream products. To support this, we play a pivotal role in upstream open-source projects, collaborating across the industry with partners, customers, and even competitors. Examples of projects we have built or contributed to include:  

Dapr (Distributed Application Runtime): A CNCF-graduated project launched by Microsoft in 2019, Dapr simplifies cloud-agnostic app development with modular building blocks for service invocation, state, messaging, and secrets.

Radius: A CNCF Sandbox project that lets developers define application services and dependencies, while operators map them to resources across Azure, AWS, or private clouds—treating the app, not the cluster, as the unit of intent.

Copacetic: A CNCF Sandbox tool that patches container images without full rebuilds, speeding up security fixes—originally built to secure Microsoft’s cloud images.

Dalec: A declarative tool for building secure OS packages and containers, generating software bills of materials (SBOMs) and provenance attestations to produce minimal, reproducible base images.

SBOM Tool: A command line interface (CLI) for generating SPDX-compliant SBOMs from source or builds—open-sourced by Microsoft to boost transparency and compliance.

Drasi: A CNCF Sandbox project released in 2024, Drasi reacts to real-time data changes using a Cypher-like query language for change-driven workflows. 

Semantic Kernel and AutoGen: Open-source frameworks for building collaborative AI apps—Semantic Kernel orchestrates large language models (LLMs) and memory, while AutoGen enables multi-agent workflows.

Phi-4 Mini: A compact 3.8 billion-parameter AI model released in 2025, optimized for reasoning and mathematics on edge devices; available on Hugging Face.

Kubernetes AI Toolchain Operator (KAITO): A CNCF Sandbox Kubernetes operator that automates AI workload deployment—supporting LLMs, fine-tuning, and retrieval-augmented generation (RAG) across cloud and edge with AKS integration. 

KubeFleet: A CNCF Sandbox project for managing applications across multiple Kubernetes clusters. It offers smart scheduling, progressive deployments, and cloud-agnostic orchestration. 
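As a concrete sketch of the building-block model Dapr uses (described above), an application talks to its local Dapr sidecar over plain HTTP; for example, saving state through the documented state API on the sidecar’s default HTTP port 3500. The store name `statestore` and the key below are hypothetical, and the request is only constructed, not sent, since sending it requires a running sidecar:

```python
import json
import urllib.request

DAPR_PORT = 3500        # Dapr sidecar's default HTTP port
STORE = "statestore"    # hypothetical state store component name

def save_state_request(store: str, key: str, value) -> urllib.request.Request:
    """Build a Dapr state-save request (POST /v1.0/state/<store>).

    The body is a JSON array of key/value entries, as the Dapr
    state API expects.
    """
    body = json.dumps([{"key": key, "value": value}]).encode()
    return urllib.request.Request(
        f"http://localhost:{DAPR_PORT}/v1.0/state/{store}",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Construct (but do not send) a request saving one hypothetical record.
req = save_state_request(STORE, "order-1", {"status": "shipped"})
print(req.full_url)
```

Because the sidecar exposes these building blocks as local HTTP (or gRPC) endpoints, the application code stays free of any cloud-specific SDK; swapping the backing store is a component configuration change, not a code change.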

This is just a small sampling of some of the open-source projects that Microsoft is involved in—each one sharing, in code, the lessons we’ve learned from running systems at a global scale and inviting the community to build alongside us.  

Open Source + Azure = Empowering the next generation of innovation

Microsoft’s journey with open source has come a long way from that 20,000-line Linux patch in 2009. Today, open-source technologies are at the heart of many Azure solutions. And conversely, Microsoft’s contributions are helping drive many open-source projects forward—whether it’s commits to Kubernetes; new tools like KAITO, Dapr, and Radius; or research advancements like Semantic Kernel and Phi-4. Our engineers understand that the success of end-user solutions like Microsoft 365 and ChatGPT relies on scalable, resilient platforms like AKS—which in turn are built on and sustained by strong, vibrant open source communities. 

Join us at Open Source Summit Europe 2025

As we continue to contribute to the open source community, we’re excited to be part of Open Source Summit Europe 2025, taking place August 25–27. You’ll find us at booth D3 with live demos, in-booth sessions covering a wide range of topics, and plenty of opportunities to connect with our Open Source team. Be sure to catch our conference sessions as well, where Microsoft experts will share insights, updates, and stories from our work across the open source ecosystem. 

1 TechRepublic, ChatGPT’s On Track For 700M Weekly Users Milestone: OpenAI Goes Mainstream, August 5, 2025.

The post Microsoft’s open source journey: From 20,000 lines of Linux code to AI at global scale appeared first on Microsoft Azure Blog.