From Teams to PowerPoint: 10 ways Azure AI enhances the Microsoft apps we use every day

Azure AI is driving innovation and improving experiences for employees, users, and customers in a variety of ways, from increasing workday productivity to promoting inclusion and accessibility. The success of Azure AI—featuring Azure Cognitive Services, Azure Machine Learning, and Azure OpenAI Service—is built on a foundation of Microsoft Research, a wide range of Azure products that have been tested at scale within Microsoft apps, and Azure customers who use these services for the benefit of their end users. As 2023 begins, we are excited to highlight 10 use cases where Azure AI is utilized within Microsoft and beyond.

#1: Speech transcription and captioning in Teams

Speech transcription and captioning in Microsoft Teams are powered by Azure Cognitive Services for Speech. The Speech-to-Text service transcribes over 54 million hours of meetings in real time and captions 6.8 million characters each month. Microsoft achieved human parity in conversational speech recognition when it reached a word error rate of 5.9 percent, matching professional human transcribers. Twitter and Swedish TV are two customers using Azure AI to caption speech for accessibility.
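The 5.9 percent figure refers to word error rate (WER), the standard accuracy metric for speech recognition. As an illustration of how the metric works (not Microsoft's production implementation), WER can be computed as word-level edit distance divided by the number of reference words:

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + insertions + deletions) / reference words,
    computed here via Levenshtein distance over word sequences."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution/match
    return dp[len(ref)][len(hyp)] / len(ref)

print(word_error_rate("the quick brown fox", "the quick brown box"))
```

Note that WER can exceed 1.0 when a hypothesis inserts many spurious words, which is why it is reported as a rate rather than a percentage of correct words.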

#2: Content and design production in PowerPoint

Content and design production in PowerPoint, which has been used to create nearly two billion Designer slides since its launch in 2015, is powered by Azure Cognitive Services and Azure Machine Learning and their MLOps capabilities. These tools let the team build models faster and at scale, replacing local development, and recommend background images and videos. The 2022 launch of DALL-E 2 in Azure OpenAI Service further improves this feature's capabilities. Up to 4.1 million slides are created with Designer each day, and the Designer team regularly retrains models and experiments with model optimization to provide better content recommendations. Companies like Polaris encouraged employees to use Designer to share highlights from the year during the pandemic, when they shifted to virtual company events.

#3: Biometric detection and identity verification in Windows Hello

Biometric detection and identity verification in Windows Hello, which has been deployed to over 100,000 Microsoft employees, is powered by Azure Cognitive Services for Vision. Outside of Microsoft, Uber uses the Cognitive Services Face API, which powers Windows Hello, to ensure that drivers using the app match the account on file for fraud prevention and to increase peace of mind for both drivers and riders. This extra verification step is fast, works on all smartphones, and can scale to over one million driver-partners. Home healthcare and hospice care provider Amedisys also implemented Windows Hello for Business to improve its security strategy and patient care support.

#4: Personalized recommendations in Xbox

Personalized recommendations in Xbox, powered by Azure Machine Learning and Personalizer, have been rolled out to global audiences. In Microsoft Docs, referral traffic from content recommendations generates twice as many page views as other sources of traffic. The NBA uses Personalizer to deliver personalized content to fans at an individual level, with nearly 10 billion interactions across various delivery channels each day.
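Personalizer is built on contextual bandits, which balance exploiting known-good recommendations with exploring alternatives. The sketch below uses a much simpler epsilon-greedy strategy with invented game categories to illustrate the rank-then-reward loop; it is not the Personalizer algorithm itself:

```python
import random

class EpsilonGreedyRanker:
    """Toy stand-in for Personalizer's rank/reward loop: mostly show
    the best-performing action, sometimes explore an alternative."""

    def __init__(self, actions, epsilon=0.2, seed=0):
        self.rng = random.Random(seed)
        self.epsilon = epsilon
        self.rewards = {a: 0.0 for a in actions}   # cumulative reward
        self.counts = {a: 0 for a in actions}      # times shown

    def rank(self):
        if self.rng.random() < self.epsilon:
            return self.rng.choice(list(self.rewards))   # explore
        # exploit: highest average reward observed so far
        return max(self.rewards,
                   key=lambda a: self.rewards[a] / max(self.counts[a], 1))

    def reward(self, action, value):
        self.counts[action] += 1
        self.rewards[action] += value

ranker = EpsilonGreedyRanker(["racing", "rpg", "sports"])
for _ in range(200):
    shown = ranker.rank()
    # Simulated user: clicks racing recommendations 80% of the time.
    clicked = 1.0 if shown == "racing" and ranker.rng.random() < 0.8 else 0.0
    ranker.reward(shown, clicked)

best = max(ranker.rewards,
           key=lambda a: ranker.rewards[a] / max(ranker.counts[a], 1))
print(best)
```

The production service conditions its choices on user context as well, which is what distinguishes a contextual bandit from the plain bandit shown here.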

#5: Content reading and writing experiences in Edge and Microsoft 365

Content reading and writing experiences in Edge and Microsoft 365 are powered by Azure Cognitive Services for Speech and Applied AI Services. Immersive Reader, used by over 15 million people across applications such as Teams, Word, OneNote, and Outlook, presents text in an optimized layout and helps users read 10 percent faster with 50 percent fewer reading errors. USA Today offers premium subscribers the option to "Hear This Article" using Cognitive Services Text-to-Speech, which also powers the Read Aloud feature in Immersive Reader.

#6: Image captioning and alt text in LinkedIn

Image captioning and content accessibility on LinkedIn are powered by Azure Machine Learning, Cognitive Services for Vision, and Azure OpenAI Service. Image classification and tagging, including the automatic generation of captions and alt text for images on LinkedIn, saves users time and improves accessibility for the community. In the manufacturing industry, AI-driven image classification can also be used to detect quality issues on assembly lines, reducing manual effort and improving the efficiency and quality of the manufacturing process. Microsoft achieved human parity in image captioning in 2020.
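For a sense of how generated captions become alt text, the sketch below picks the highest-confidence caption from a describe-style response. The JSON shape mirrors Azure Computer Vision's image description output, but the threshold and fallback are illustrative choices, not LinkedIn's actual logic:

```python
def choose_alt_text(describe_response: dict, threshold: float = 0.5) -> str:
    """Pick alt text from a Computer Vision describe-style response:
    use the highest-confidence caption above `threshold`, otherwise
    fall back to a generic label so the image is never unlabeled."""
    captions = describe_response.get("description", {}).get("captions", [])
    best = max(captions, key=lambda c: c["confidence"], default=None)
    if best and best["confidence"] >= threshold:
        return best["text"]
    return "Image"  # generic fallback keeps screen readers functional

response = {"description": {"captions": [
    {"text": "a person riding a bicycle", "confidence": 0.87},
    {"text": "a person outdoors", "confidence": 0.61},
]}}
print(choose_alt_text(response))
```

A confidence threshold like this matters for accessibility: a wrong caption read aloud by a screen reader is often worse than a generic one.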

#7: Contextual Assistance in Outlook

Contextual assistance in Outlook, powered by Azure Machine Learning, has increased customer engagement. Meeting Insights appears in over 40 percent of meetings opened in supported Outlook clients, and customers report that two out of three suggestions are useful. The learnings from these Meeting Insights models are being used to integrate other intelligent content recommendation features, such as "Suggested Attachments" and "Suggested Reply with File," into Outlook.

#8: Conversational Intelligence in Viva Sales and Dynamics

Viva Sales and Dynamics use conversational intelligence powered by Azure Machine Learning and Cognitive Services for Speech & Language. Viva Sales helps sellers by providing real-time business context from their CRM system, facilitating knowledge sharing and collaboration, and offering AI-powered insights to improve customer engagements. Lufthansa Cargo uses Dynamics 365 Sales, Customer Service, and Customer Voice to centralize customer information and related shipments in one location.

#9: Assisted Programming and Content Generation in GitHub and Power Platform

Assisted programming in GitHub and Power Platform is powered by Azure Machine Learning and Azure OpenAI Service. Developers accept almost 25 percent of the completions GitHub Copilot shows them. Azure Speech also powers the Voice to Code experience for GitHub Copilot. The Power Platform team likewise realized that translating conversational language into code could make coding more accessible. CarMax used Azure OpenAI Service to help summarize 100,000 customer reviews into short descriptions, saving the company over two years of manual work.
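A summarization job at CarMax's scale has to respect model context limits, so reviews are typically grouped into batches before each call. The helper below is a hypothetical sketch of that batching step; character counts stand in for tokens, and CarMax's actual pipeline is not public:

```python
def batch_reviews(reviews, max_chars=1000):
    """Group customer reviews into prompt-sized batches so each
    summarization call stays under a model's context limit.
    (Character count is a crude stand-in for token count.)"""
    batches, current, size = [], [], 0
    for review in reviews:
        if current and size + len(review) > max_chars:
            batches.append(current)
            current, size = [], 0
        current.append(review)
        size += len(review)
    if current:
        batches.append(current)
    return batches

# Ten synthetic reviews of 210 characters each.
reviews = [f"Review {i}: " + "great car " * 20 for i in range(10)]
batches = batch_reviews(reviews, max_chars=500)
print(len(batches))
```

Each batch would then be summarized independently, and the per-batch summaries combined in a final pass, a common map-reduce pattern for long-document summarization.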

#10: Editorial Assistance in Microsoft 365

Editorial assistance in Microsoft 365, powered by Azure Machine Learning, is available through Microsoft Editor, which gives writers intelligent tools to improve their writing in documents, email, and posts across the web. The tools are available in Word, Outlook, and the Edge browser, and have been expanded to Outlook.com, Outlook on the web, and browser extensions for Microsoft Edge and Google Chrome in over 20 languages.

Looking ahead into 2023 with Azure AI

The ten scenarios above represent just a slice of how Azure AI helped Microsoft users and customers do more with less in 2022. We look forward to seeing how organizations utilize AI to transform business, employee, customer, and end-user experiences around the world in 2023.

Learn more

Discover more use cases, solutions, research, and customer inspiration with Azure AI.
Source: Azure

Azure Native Qumulo Scalable File Service provides seamless, secure data storage

The amount of data created and consumed is expected to grow exponentially over the next decade. This proliferation has radically changed enterprise data storage requirements. Today, enterprises are in search of storage solutions with unlimited capacity and performance scaling capabilities alongside their core requirements of data security and availability.

Azure cloud services have been the backbone for enterprises in their digital transformation journey. With Azure cloud services, enterprises can reduce storage costs and have access to high-availability storage products with disaster recovery capabilities. We are happy to announce that the storage suite of Azure products has grown with the addition of Azure Native Qumulo Scalable File Service. You can subscribe to this service via the Azure marketplace.

Qumulo is an industry leader in distributed file systems and object storage. Qumulo provides a highly scalable, highly performant, and simple-to-use cloud-native file system that supports a wide variety of data workloads via standard protocols (NFS, SMB, FTP, and S3). Azure Native Qumulo Scalable File Service provides seamless integration between Qumulo and Azure and enables Qumulo's distributed file system as a native service on Azure.

“Our commitment to delivering simplicity at scale starts with the initial deployment and onboarding experience of our customers. The availability of the Azure Native Qumulo Scalable File Service directly from the Azure portal creates a more streamlined procurement process for customers looking to store and manage unstructured data in the cloud at massive scale." —Kiran Bhageshpur, CTO of Qumulo

Native integration means that customers can deploy Qumulo in less than 15 minutes. Customers no longer need to set up virtual network peering to connect to Qumulo. In fact, with this integration, virtual network peering charges have been eliminated, as Qumulo is connected via private access to the customer’s chosen virtual network. The data is stored and processed by the service in the same region as the virtual network and customers can easily leverage their own network policies with Qumulo’s enterprise-scale filesystem.

Azure Native Qumulo Scalable File Service provides the following key capabilities:

Seamless onboarding—Easily onboard and use Qumulo as a natively integrated service on Azure.
Unified billing—Get a single bill for all resources you consume on Azure, including the Qumulo service.
Private access to virtual network—The service is directly connected to your virtual network and provides access to a fully managed Qumulo Scalable File service from the virtual network of your choice.

“Azure Native ISV Services enables enterprises to provision and use select ISV solutions on Azure with a seamless, first-class experience. With the public preview launch of Azure Native Qumulo Scalable File Service, we expand Azure’s storage offerings with Qumulo’s exabyte-scale data service. With this integration, enterprises can easily create and manage a multi-petabyte-scale file system in a single namespace on Azure. We are really excited about partnering with Qumulo and integrating their data services into Azure" —Balan Subramanian, Partner Director of Product Management, Azure Developer Experiences

Get started with Azure Native Qumulo Scalable File Service:

1. Prerequisite—Create a delegated subnet in a virtual network of your choosing and add network security group details as per your enterprise policy.


2. Subscribe to Azure Native Qumulo Scalable File Service on the Azure Marketplace.

3. Seamlessly create a Qumulo resource by entering the required details about the file system in the Basics tab.

4. In the Networking tab, provide the details of the virtual network and the previously created delegated subnet. Click on Review + create to create the filesystem.

Learn more

Marketplace offer: Azure Native Qumulo Scalable File Service.
Azure Native Qumulo Scalable File Service overview.
Sign up for Azure today.

Source: Azure

What's new in Azure Data & AI: Empowering retailers to streamline operations and accelerate time to value

The new year brings opportunity for thoughtful reflection about the past year, both personally and professionally. 2022 was a year of firsts for me—first time having clam chowder at Pike Place Market as a local, first time going shopping for heels with my daughter, and first time delivering an Azure keynote at Inspire as a Microsoft employee when, pre-COVID-19, I was a Partner listening in the audience. And here is another first: the start of a new blog series where I plan to share more about noteworthy and inspiring data and AI innovations we are releasing across Microsoft. Given the National Retail Federation's Big Show this week, I'll also highlight how these innovations impact retail.

Let’s explore what’s new for Azure Data & AI this month:

Microsoft underscores resilient retail at NRF

A bellwether for economic and societal trends, the retail industry continues to be on the front line of adaptive innovation. And rather than trying to predict the future, retailers are working to achieve greater business agility necessary to thrive in it. This means that, like Majid Al Futtaim Retail, they automate tedious processes so employees can focus on higher-value tasks. Like Grupo Bimbo, they unify disparate data points in real time so that employees can access a central source of truth when and where they need it. And, like CCC Group, they stay laser-focused on delivering differentiated customer experiences to build loyal fans. Across each of these organizations, data is seen as an accelerant for growth, powering more personalized customer experiences, cost-efficient supply chains, and proactive responses to market trends.

Business agility requires people, processes, and technologies to work in harmony, and to align on how massive amounts of data are managed, analyzed, and actioned to respond to market demands. Increasingly, these efforts focus on driving sustainable growth that limits carbon emissions, for example by using machine learning to more accurately forecast demand to reduce excess inventory and waste.
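Even before reaching for ML, a demand forecast can be as simple as a trailing average; a minimal baseline like the one below is often the yardstick a trained model must beat (the sales figures are invented):

```python
def forecast_demand(history, window=3):
    """Forecast next-period demand as the mean of the last `window`
    observations; a simple baseline a trained model should beat."""
    recent = history[-window:]
    return sum(recent) / len(recent)

weekly_units = [120, 135, 128, 140, 150, 149]  # invented weekly sales
print(forecast_demand(weekly_units))
```

Over-forecasting here translates directly into excess inventory, which is why even modest accuracy gains from ML models reduce waste and carbon footprint.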

We know cloud technologies like databases, containers, and AI can enable more accurate decisions, but it can be challenging to ensure these technologies speak to each other in the right way at the right time on a global scale. This is where Azure and the Microsoft Intelligent Data Platform—the name we use for all of the Data & AI products and services we offer on Azure—shine. With managed databases like Azure Cosmos DB, analytics services like Azure Synapse, and leading AI offerings, retailers can take advantage of the "by design" integration the Microsoft Intelligent Data Platform offers, which means they can invest more time in creating value rather than integrating and managing their data estate.

The Microsoft Intelligent Data Platform came to life through new immersive demo experiences this year at NRF, and I'm going to briefly highlight what we showcased at the conference. Earlier this week at NRF, Microsoft's Alysa Taylor and Shelley Bransten spoke on the topic of Resilient Retail and shared examples of organizations digitizing their businesses to do more with less. You can read more about their talks on our Industry blog.

Microsoft also met with customers at our NRF booth to discuss strategies for making sense of all their data. For example, these two demos highlight how tight integrations between data, analytics, and AI can help make resilient retail a reality.

Wide World Importers (WWI), a global supermarket chain, wants to maximize the value of its data estate. By using Azure Synapse pipelines with Azure Cosmos DB, WWI gets real-time insights that are automatically shared with the right decision makers through tools like Azure Data Explorer and Power BI. They're able to track their supply chain data in real time and use predictive AI to reduce costs. They're also able to govern their data estate from a single pane of glass using Microsoft Purview.

Next, Wide World Importers taps into the power of Azure AI to build a more connected customer experience. They use Azure Form Recognizer to detect and redeem promotional offers. Azure Cognitive Search helps customers find product information more quickly by recognizing their search intent and helps WWI deliver more personalized recommendations. Pre-built AI capabilities, such as speech recognition and computer vision, also differentiate the shopping experience and provide a more accessible flow.

The general availability of Azure OpenAI Service 

As Eric Boyd mentioned in his blog, we announced the general availability of Azure OpenAI Service as part of our ongoing partnership with OpenAI. Azure OpenAI Service provides a commercialization platform for businesses to leverage advanced AI models like GPT-3.5, Codex, and DALL·E to create innovative applications. Customers of all sizes across industries are using Azure OpenAI Service to do more with less, improve experiences for end users, and streamline operational efficiencies.

Microsoft Responsible AI Dashboard now available

The digitization of retail enables retailers to meet customer expectations with increasing precision, from providing personalized recommendations online to restocking inventory based on computer vision in physical stores. Shoppers expect the technology behind their retail experience to apply data and AI responsibly. In December, we began rolling out our new Responsible AI dashboard in Azure Machine Learning, which includes capabilities like fairness assessment, interpretability, error analysis, and causal inferencing. And today, we are excited to announce the general availability of the Microsoft Responsible AI dashboard. Retailers can leverage the Responsible AI dashboard to optimize the shopper’s experience as well as build trust and positive perception for their brands.

See how you can learn more about Responsible AI.

Full text search capabilities come to Azure Cosmos DB for Apache Cassandra

Azure Cosmos DB has seen tremendous momentum within the retail industry, given its ability to automatically and instantly scale when traffic is unpredictable without sacrificing performance or cost efficiency. Microsoft runs both the Windows store and Xbox Live on Azure Cosmos DB for this very purpose. This month, we're announcing several performance enhancements to make it easier and faster for developers to query data stored in Azure Cosmos DB. These include the ability to do full text searches in Azure Cosmos DB for Apache Cassandra through a native integration with Azure Cognitive Search, and support for GraphQL and REST through Data API builder.

For more technical detail about these and other updates to Azure Cosmos DB, visit our developer blog.

JSON support for Azure Cache for Redis Enterprise generally available

Support for JSON documents in Azure Cache for Redis Enterprise tiers, delivered via the RedisJSON module, has been made generally available as of November 2022. This turns Azure Cache for Redis Enterprise into a high-performance NoSQL document store and drives efficiency for developers to modernize their applications. The new RedisJSON module update is well suited to retail customers looking to store, search, and index product catalogs and shopper data via a single atomic operation.
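With RedisJSON, a product document is stored and queried as JSON at a path rather than as an opaque string. The sketch below builds the JSON.SET command that a client such as redis-py or redis-cli would send; no Redis server is contacted, and the key names are illustrative:

```python
import json

def json_set_command(key: str, doc: dict) -> list:
    """The RedisJSON JSON.SET command, as the argument list a client
    library would send: store `doc` at the root path `$`."""
    return ["JSON.SET", key, "$", json.dumps(doc)]

product = {"sku": "SKU-42", "name": "Trail Shoe", "price": 89.99,
           "tags": ["outdoor", "footwear"]}
cmd = json_set_command("product:SKU-42", product)
print(cmd[:3])

# A later partial read needs no full-document fetch, e.g.:
#   JSON.GET product:SKU-42 $.price
```

Because the set happens as a single atomic operation, readers never observe a half-written product document, which is the property the retail catalog scenario relies on.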

To learn more about Azure Cache for Redis Enterprise and the RedisJSON module, please check out the blog.

Microsoft named a Leader in the 2022 Gartner Magic Quadrant for Insight Engines

In December, Microsoft was named a Leader in the 2022 Gartner Magic Quadrant for Insight Engines, which evaluates the capabilities of vendors providing enterprise-scale search for app development. Organizations in any industry can benefit, but Cognitive Search is an exceptionally powerful tool for retailers, helping them quickly find and analyze data related to customer behavior, sales, and inventory. It can also be used to personalize the shopping experience for individual customers based on their past interactions and preferences.

Learn more and download the report.

Customers innovating with Azure Data & AI

I’d like to close my inaugural “what’s new” blog post with my favorite way of making everything I’ve covered above actionable—by sharing examples of our customers succeeding. I share them to help spark understanding—maybe even a little imagination—so that others can better envision how this amazing new Data & AI technology can be used in their own organizations.

I hope you enjoyed reading this month’s edition of what’s new in Azure Data & AI. We look forward to sharing more insights and inspiration in the months ahead. If you have a question or idea you’d like to hear perspective on, please share in the comments section.

Grupo Bimbo transforms the data analysis of commercial areas with Microsoft Solutions

Mexico—The global baking company carries out hundreds of thousands of transactions globally and needed a way for collaborators to quickly access and act on insights to improve sales. By adopting Power BI and Azure Synapse, Grupo Bimbo was able to unify internal and external data, increase business agility, and democratize insights for increased productivity across the organization. Read the full Case Study.

How online marketplace CDON used AI to become a market leader

Sweden—CDON began as a small online retailer of CDs, DVDs, and games and quickly became the Nordics’ largest online marketplace. When CDON realized legacy technologies were holding it back, the company looked to Azure to accelerate its innovation and scale. Now its developers use Azure Cognitive Search, Azure DevOps, and other services to better understand the user experience on the website and increase personalization for better performance. Read the full CDON Case Study.

How AEON Group increased profits and maximized inventory with data and AI

Japan—Since opening its first store in December 2005, Maibasuketto, a member of the AEON Group's supermarket business, has expanded its footprint at a pace of 100 stores every year. To ensure its growth was sustainable, the company embarked on an initiative to optimize store ordering and operations using Azure Cognitive Services, maximizing supply chain and inventory based on selling patterns throughout the day. Read the full AEON Group Case Study.

How Fashable reimagines fashion design with Azure Machine Learning and PyTorch

Portugal—Using Azure Machine Learning, Fashable created an AI application that generates original clothing designs to quickly get a pulse on consumer preferences. This helps fashion companies understand customer demand, get to market faster, and reduce clothing waste by only producing what they know will sell. Read the full Fashable Case Study.

WTW accelerates delivery time with Azure Migration and Analytics

United Kingdom—Operating in more than 140 countries with over 40,000 employees, Willis Towers Watson (WTW) has decades of experience working with the world’s largest loyalty programs. To efficiently govern the exponential growth in data for these programs and apply innovations in advanced analytics and AI, WTW moved its workload to Azure. Now their employees have more time to focus on uncovering insights for clients. Read the full WTW Case Study.

Source: Azure

General availability of Azure OpenAI Service expands access to large, advanced AI models with added enterprise benefits

Large language models are quickly becoming an essential platform for people to innovate, apply AI to solve big problems, and imagine what’s possible. Today, we are excited to announce the general availability of Azure OpenAI Service as part of Microsoft’s continued commitment to democratizing AI, and ongoing partnership with OpenAI.

With Azure OpenAI Service now generally available, more businesses can apply for access to the most advanced AI models in the world—including GPT-3.5, Codex, and DALL·E 2—backed by the trusted enterprise-grade capabilities and AI-optimized infrastructure of Microsoft Azure, to create cutting-edge applications. Customers will also be able to access ChatGPT—a fine-tuned version of GPT-3.5 that has been trained and runs inference on Azure AI infrastructure—through Azure OpenAI Service soon.

Empowering customers to achieve more

We debuted Azure OpenAI Service in November 2021 to enable customers to tap into the power of large-scale generative AI models with the enterprise promises customers have come to expect from our Azure cloud and computing infrastructure—security, reliability, compliance, data privacy, and built-in Responsible AI capabilities.

Since then, one of the most exciting things we’ve seen is the breadth of use cases Azure OpenAI Service has enabled for our customers—from generating content that helps better match shoppers with the right purchases to summarizing customer service tickets, freeing up time for employees to focus on more critical tasks.

Customers of all sizes across industries are using Azure OpenAI Service to do more with less, improve experiences for end users, and streamline operational efficiencies internally. From startups like Moveworks to multinational corporations like KPMG, organizations small and large are applying the capabilities of Azure OpenAI Service to advanced use cases such as customer support, customization, and gaining insights from data using search, data extraction, and classification.

“At Moveworks, we see Azure OpenAI Service as an important component of our machine learning architecture. It enables us to solve several novel use cases, such as identifying gaps in our customer’s internal knowledge bases and automatically drafting new knowledge articles based on those gaps. This saves IT and HR teams a significant amount of time and improves employee self-service. Azure OpenAI Service will also radically enhance our existing enterprise search capabilities and supercharge our analytics and data visualization offerings. Given that so much of the modern enterprise relies on language to get work done, the possibilities are endless—and we look forward to continued collaboration and partnership with Azure OpenAI Service."—Vaibhav Nivargi, Chief Technology Officer and Founder at Moveworks.

“Al Jazeera Digital is constantly exploring new ways to use technology to support our journalism and better serve our audience. Azure OpenAI Service has the potential to enhance our content production in several ways, including summarization and translation, selection of topics, AI tagging, content extraction, and style guide rule application. We are excited to see this service go to general availability so it can help us further contextualize our reporting by conveying the opinion and the other opinion.”—Jason McCartney, Vice President of Engineering at Al Jazeera.

“KPMG is using Azure OpenAI Service to help companies realize significant efficiencies in their Tax ESG (Environmental, Social, and Governance) initiatives. Companies are moving to make their total tax contributions publicly available. With much of these tax payments buried in IT systems outside of finance, massive data volumes, and incomplete data attributes, Azure OpenAI Service finds the data relationships to predict tax payments and tax type—making it much easier to validate accuracy and categorize payments by country and tax type.”—Brett Weaver, Partner, Tax ESG Leader at KPMG.

Azure—the best place to build AI workloads

The general availability of Azure OpenAI Service is not only an important milestone for our customers but also for Azure.

Azure OpenAI Service provides businesses and developers with high-performance AI models at production scale with industry-leading uptime. This is the same production service that Microsoft uses to power its own products, including GitHub Copilot, an AI pair programmer that helps developers write better code; Power BI, which leverages GPT-3-powered natural language to automatically generate formulae and expressions; and the recently announced Microsoft Designer, which helps creators build stunning content with natural language prompts.

All of this innovation shares a common thread: Azure’s purpose-built, AI-optimized infrastructure.

Azure is also the core computing power behind OpenAI API’s family of models for research advancement and developer production.

Azure is currently the only global public cloud that offers AI supercomputers with massive scale-up and scale-out capabilities. With a unique architecture design that combines leading GPU and networking solutions, Azure delivers best-in-class performance and scale for the most compute-intensive AI training and inference workloads. It’s the reason the world’s leading AI companies—including OpenAI, Meta, Hugging Face, and others—continue to choose Azure to advance their AI innovation. Azure currently ranks in the top 15 of the TOP500 supercomputers worldwide and is the highest-ranked global cloud services provider today. Azure continues to be the cloud and compute power that propels large-scale AI advancements across the globe.

Source: TOP500 The List: TOP500 November 2022, Green500 November 2022.

A responsible approach to AI

As an industry leader, we recognize that any innovation in AI must be done responsibly. This becomes even more important with powerful, new technologies like generative models. We have taken an iterative approach to large models, working closely with our partner OpenAI and our customers to carefully assess use cases, learn, and address potential risks. Additionally, we’ve implemented our own guardrails for Azure OpenAI Service that align with our Responsible AI principles. As part of our Limited Access Framework, developers are required to apply for access, describing their intended use case or application before they are given access to the service. Content filters uniquely designed to catch abusive, hateful, and offensive content constantly monitor the input provided to the service as well as the generated content. In the event of a confirmed policy violation, we may ask the developer to take immediate action to prevent further abuse.

We are confident in the quality of the AI models we are using and offering customers today, and we strongly believe they will empower businesses and people to innovate in entirely new and exciting ways.

The pace of innovation in the AI community is moving at lightning speed. We’re tremendously excited to be at the forefront of these advancements with our customers, and look forward to helping more people benefit from them in 2023 and beyond.

Getting started with Azure OpenAI Service

Learn more about Azure OpenAI Service and all the latest enhancements.
Apply for access to Azure OpenAI Service. Once approved, customers can log in to the Azure portal to create an Azure OpenAI Service resource and then get started either in the Azure OpenAI Studio website or via code:

How to create an Azure OpenAI Service resource
Quickstart: how to get started generating text
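Under the hood, both the Studio and the SDKs call the same REST surface. The sketch below builds (but does not send) a completion request against a hypothetical Azure OpenAI resource; the endpoint, deployment name, and api-version are placeholders, so check the quickstart for current values:

```python
import json
from urllib import request

def build_completion_request(endpoint, deployment, api_key, prompt):
    """Build (but do not send) a completion request for an Azure
    OpenAI deployment. All identifiers here are placeholders."""
    url = (f"{endpoint}/openai/deployments/{deployment}"
           f"/completions?api-version=2022-12-01")
    body = json.dumps({"prompt": prompt, "max_tokens": 50}).encode("utf-8")
    return request.Request(url, data=body, method="POST",
                           headers={"api-key": api_key,
                                    "Content-Type": "application/json"})

req = build_completion_request(
    "https://my-openai.openai.azure.com",  # hypothetical resource
    "text-davinci-003", "<api-key>",
    "Write a product description for a trail running shoe.")
print(req.full_url)
```

Unlike the public OpenAI API, authentication uses an `api-key` header (or Azure Active Directory tokens) scoped to your own resource, and the model is addressed by your deployment name rather than a global model name.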

Read the blog: AI and the need for purpose-built cloud infrastructure.
Watch a video with tips on how to get started with Azure OpenAI Service:

Source: Azure

Azure Confidential Computing on 4th Gen Intel Xeon Scalable Processors with Intel TDX

Microsoft continues to be the cloud leader in confidential computing, and the Azure team is excited to extend that leadership by partnering with Intel to offer confidential computing on 4th Gen Intel Xeon Scalable processors with Intel Trust Domain Extensions (Intel TDX) later this year. This enables organizations in highly regulated industries to lift and shift workloads that handle sensitive data and scale them in the cloud. Intel TDX meets the Confidential Computing Consortium (CCC) standard for hardware-enforced memory protection not controlled by the cloud provider, all while delivering minimal performance impact with no code changes.

Azure and Intel enable innovative use cases

Across industries, Microsoft Azure customers use confidential computing with Intel processors to achieve higher levels of data privacy and mitigate risks associated with unauthorized access to sensitive data or intellectual property. They are leveraging innovative solutions such as data clean rooms to accelerate the development of new healthcare therapies, and privacy-preserving digital asset management solutions for the financial industry. These scenarios and more are in production today, leveraging 3rd Gen Intel Xeon Scalable processors with Intel Software Guard Extensions (Intel SGX), a foundational technology of the Azure confidential computing portfolio. In fact, Azure was the first major cloud provider to offer confidential computing in the cloud with virtual machines (VMs) enabled with Intel SGX application isolation. As founding members of the CCC, Microsoft and Intel work with numerous other member organizations to define and accelerate adoption of confidential computing. This effort includes contributions to several open source projects. The Azure team looks forward to extending this collaboration by bringing to market Intel TDX–based services in Azure.

Intel TDX extends Azure's existing confidential computing offerings

Today, Azure's DCsv3 VMs offer application isolation using Intel SGX, delivering the smallest trust boundary of any confidential computing technology. The addition of Intel TDX expands our portfolio to offer isolation at the VM, container, or application level to meet the diversity of customer needs. Azure is the only major cloud provider committed to offering both VM-level and application-level confidential computing. Both are supported by Intel's hardware root of trust and address the attestation requirements of the confidential computing industry standard. Both Intel TDX and Intel SGX provide capabilities that help remove the cloud operator's access to data, including removing the hypervisor from the trust boundary.

Removing trust in the hypervisor

While Azure has engineered our hypervisor to be very secure, a growing number of customers are seeking further protections to meet data sovereignty and regulatory compliance requirements. These customers require increased isolation and protection of their workloads to reduce the risk of unauthorized data access. As such, Microsoft relies on hardware-based controls, rather than the hypervisor alone, to protect customer data. With Intel-based confidential computing solutions on Azure, even altering the hypervisor does not allow Azure operators to read or alter customer data in memory.

Establishing trust via attestation

Attestation is a critical concept of confidential computing. It allows customers to verify the third-party hardware root of trust and software stack prior to allowing any code to access and process data. With Intel TDX, the attestation is done against the entire VM or container, each with a unique hardware key to keep memory protected. With Intel TDX, we will offer attestation support with Microsoft Azure Attestation as standard and will also partner closely with Intel on their upcoming trust service, code-named "Project Amber," to meet the security requirements of customers.
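To make the attestation flow concrete, here is a minimal, hypothetical sketch of the policy check a relying party might run over an already-decoded attestation token before releasing a secret to a workload. The claim names, values, and thresholds are purely illustrative and are not the actual Microsoft Azure Attestation token schema.

```python
# Hypothetical sketch: gate secret release on claims from an already-decoded
# attestation token. Claim names below are invented for illustration.

def is_environment_trusted(claims: dict, expected_measurement: str) -> bool:
    """Return True only if the attested environment matches our policy."""
    # 1. The hardware root of trust must report a debug-disabled environment.
    if claims.get("is-debuggable", True):
        return False
    # 2. The measurement of the loaded code must match the build we expect.
    if claims.get("enclave-measurement") != expected_measurement:
        return False
    # 3. The security version number must meet our minimum.
    if claims.get("svn", 0) < 2:
        return False
    return True

claims = {"is-debuggable": False,
          "enclave-measurement": "abc123",
          "svn": 3}
print(is_environment_trusted(claims, "abc123"))  # True
```

In a real deployment the token would first be cryptographically verified against the attestation service's signing keys; the sketch assumes that step has already succeeded.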

Confidential computing takes off

Many Azure confidential computing customers can attest to the value they receive from our existing Intel confidential computing offerings.

Novartis Biome uses BeeKeeperAI’s EscrowAI confidential clean room solution on Azure confidential computing for the training and validation of algorithms to predict instances of a rare childhood condition using real patient data from health records, while maintaining privacy and compliance.

“Rare diseases are often challenging to diagnose and if left untreated, they can significantly diminish a patient’s quality of life. With BeeKeeperAI, our scientists were able to securely access a large gold standard dataset that enabled us to improve the predictive capabilities of our algorithm, bringing us much closer to identifying patients early in the disease course and to improving their outcomes.” —Robin Roberts, Co-founder and Chief Operating Officer, Novartis Biome

Fireblocks provides enterprise-grade secure infrastructure for moving, storing, and issuing digital assets. It uses Intel confidential computing technology on Azure to hold one of the keys to its wallets.

"Some of the biggest cryptocurrency businesses, financial institutions, and enterprises in the world trust Fireblocks software and APIs to provide digital custody solutions, manage treasury operations, access DeFi, mint and burn tokens, and manage their digital asset operations. We leverage Azure to hold one of the keys to our wallets due to Azure Confidential Computing … " —Michael Shaulov, CEO and Co-founder, Fireblocks

Carbon Asset Solutions' soil-based carbon credit collection and tracking system uses immutable ledger technology provided by Azure confidential ledger.

"Carbon Asset Solutions is a world-first precision measurement, recording, and verification platform focused on atmospheric carbon removal through soil carbon sequestration. With Azure, we deliver higher integrity Carbon Credits than any other method." —Sara Saeidi, Chief Operating Officer, Carbon Asset Solutions

Azure’s vision for the confidential cloud

We see a future where confidential computing is standard and pervasive both in the cloud and at the edge within all Azure service offerings. Customers will be able to more confidently use the cloud for their most sensitive data workloads while verifying the environment and staying in full control of data access. We look forward to the launch of 4th Gen Intel Xeon Scalable processors and offering Intel TDX–enabled instances with VM-level data protection and performance improvements later this year, continuing our partnership with Intel to help transition Azure to the confidential cloud.

Learn more

Sign up for early access to Intel TDX confidential VMs coming later this year.

Get started today deploying VMs and AKS nodes with Intel SGX application enclaves.

Current Azure confidential computing–based services featuring Intel technology:

Foundational infrastructure-as-a-service (IaaS) elements utilizing Intel SGX, such as Virtual Machines with Application Enclaves and Intel SGX-based confidential computing nodes on Azure Kubernetes Service.
Azure first-party confidential computing software as a service (SaaS) such as Microsoft Azure Attestation, Azure confidential ledger, Azure Managed Confidential Consortium Framework (preview), and Azure Key Vault Managed HSM.
Various third-party confidential computing SaaS, many of which are captured in this webinar series.

Open source tools for developing Intel-based confidential computing apps on Azure:

The Open Enclave (OE) Software Development Kit (SDK)
The EGo SDK
The Intel SGX SDK
The Confidential Consortium Framework (CCF)
Gramine
Occlum
MarbleRun
SCONE

Intel, the Intel logo, and other Intel marks are trademarks of Intel Corporation or its subsidiaries.
Source: Azure

Microsoft named a Leader in 2022 Gartner® Magic Quadrant™ for Insight Engines

How your organization can benefit, no matter the industry.

As the amount of data being generated continues to grow at an exponential rate, it's becoming increasingly important for organizations to have a rich set of tools that can help them make sense of it all. That's where insight engines come in. These powerful solutions apply relevancy methods to data of all types, from structured to highly unstructured, allowing users to describe, discover, organize, and analyze it to deliver information proactively or interactively at the right time, in the right context.

Microsoft has recently been named a Leader in the 2022 Gartner Magic Quadrant for Insight Engines, a report that evaluates the capabilities of various vendors in the market.

Microsoft offers two integrated solutions in this space: Microsoft Search, which is available with Microsoft 365, and Azure Cognitive Search, which is available as a platform-as-a-service (PaaS) with Microsoft Azure. These solutions are designed to help professionals and developers build impactful AI-powered search solutions that can solve complex problems and enhance the customer experience by enabling information discovery across the spectrum from unstructured to structured data. Whether you need a turnkey solution to reason over enterprise data or the flexibility to tailor search to specific scenarios, Microsoft has you covered.

Azure Cognitive Search can be used in a variety of industries to improve efficiency and decision-making. Some specific examples of how it can be used include:

Manufacturing: Cognitive Search can be used to help manufacturers quickly find information about production processes, equipment, and materials. It can be applied to structured data scenarios such as part catalogs as well as unstructured content such as equipment manuals, safety procedures, and imagery. 
Energy: Cognitive Search can be used to quickly find information related to exploration, drilling, and production. Geo-location search combined with traditional search input enables discovery experiences that get the most out of past and present geological site studies, and extensibility allows incorporating energy industry-specific information.
Retail: Cognitive Search can be used to develop a powerful product catalog search experience for retail websites and apps. Customizable ranking options, the ability to scale for peak traffic with low latency, and near-real-time updates for critical data such as inventory make it a great fit for this scenario.
Financial services: Cognitive Search can be used by financial institutions to quickly find data related to investments, market trends, and regulatory compliance. Its sophisticated semantic ranking and question-answering capabilities can enable users to answer business questions faster and more confidently.
Healthcare: Cognitive Search can be used by healthcare organizations to improve patient care, streamline operations, and make better informed decisions by quickly finding and accessing relevant information within electronic medical record systems, providing real-time access to clinical guidelines and evidence-based best practices.

Nearly every user knows what to do when they see a search box. SaaS applications targeting any audience, from consumer to enterprise, can benefit greatly from a first-class search experience over their own data. Azure Cognitive Search delivers an out-of-the-box solution, including various multi-tenancy strategies, support for over 50 languages, and a global presence to ensure your solution is delivered in the right location for your customers.
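As a small flavor of tailoring search to a scenario, the sketch below composes an OData-style `$filter` expression of the kind Azure Cognitive Search accepts for equality filters. The field names are hypothetical, only `eq` comparisons are covered, and this is not an official SDK helper.

```python
# Minimal sketch: build an OData-style $filter string for equality clauses.
# Cognitive Search filters quote strings with single quotes and escape an
# embedded quote by doubling it; only `eq` is handled in this toy helper.

def build_filter(**clauses) -> str:
    parts = []
    for field, value in clauses.items():
        if isinstance(value, str):
            escaped = value.replace("'", "''")
            parts.append(f"{field} eq '{escaped}'")
        else:
            parts.append(f"{field} eq {value}")
    return " and ".join(parts)

# Hypothetical index fields for a product catalog scenario:
print(build_filter(category="retail", inStock=10))
# category eq 'retail' and inStock eq 10
```

A real application would pass the resulting string as the `filter` argument of a search query and would also need operators such as `lt`, `gt`, and `search.in`.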

If you're a technical decision maker in one of these industries, or any other industry, and you're interested in learning more about how Microsoft's cognitive search solutions can help you unlock the full potential of your data, you can visit the Azure Cognitive Search website and the Microsoft Search website.

You can also download a complimentary copy of the Gartner Magic Quadrant for Insight Engines to see how Microsoft is recognized in the space.


Gartner, Magic Quadrant for Insight Engines, Stephen Emmott, Anthony Mullen, David Pidsley, Tim Nelms, 12 December 2022

Gartner is a registered trademark and service mark, and Magic Quadrant is a registered trademark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and are used herein with permission. All rights reserved.

This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available upon request from Gartner Reprint.

Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
Source: Azure

Microsoft named a Leader in The Forrester Wave™: Public Cloud Development and Infrastructure Platforms, 2022

Forrester recently published its report, The Forrester Wave™: Public Cloud Development and Infrastructure Platforms, Global, Q4 2022, placing Microsoft in the “Leaders” category. It’s an honor to be named as one of only two leaders in Forrester’s definitive report on the public cloud development and infrastructure platform market.

The Forrester report recognized Microsoft for its long-term focus on Kubernetes, hybrid, and multicloud capabilities and noted that it is seeking to lead in hybrid and multicloud environments with platform management tools and capabilities. Reference customers praised Microsoft’s service improvements and partnerships. With Microsoft Azure, customers have a trusted cloud partner and the most advanced, highly integrated enterprise IT infrastructure to help them navigate ever-changing environments and achieve business success today, while they build for the future.

Helping developers build any app for any platform

We recognize that developers are the driving engine of innovation. When they are empowered to set up a complete engineering system in seconds, contribute and collaborate with anyone on any device, use the right tool for the job, and integrate with the rest of the organization's digital estate, organizations can bring innovation to market faster with greater confidence. Azure makes all that possible. For example, with Microsoft Visual Studio, developers can deploy iOS, Android, Windows, web, or embedded apps wherever they'd like: Azure, hybrid, on-premises, and multicloud environments. Further, Azure fully supports some of the most popular open source technologies, from Linux to open source databases to Grafana, allowing organizations to leverage existing investments when running on Azure.

In August, we introduced Microsoft Dev Box, a managed service for developers to create on-demand, high-performance, secure, ready-to-code, project-specific workstations in the cloud, so they can work and innovate anywhere. And we’ve continued to bring new Kubernetes capabilities across Azure, which I’ll cover a little later in this article.

As the range of application development tools continues to grow, we're seeing a surge in low-code technologies that spur innovation and lower the barrier to entry. Microsoft makes it easy with Power Apps, which provides prebuilt templates, drag-and-drop simplicity, quick deployment, and AI-powered assistance, helping anyone create apps using natural language, while enabling the same DevOps practices for low-code tools that customers expect when building trusted enterprise solutions.

In this new world, organizations are harnessing the cloud to create a culture where everyone feels empowered to innovate, while lowering the barrier to creating new types of apps that can take businesses to new heights.

The Microsoft Intelligent Data Platform

We're entering the age of the "intelligent app," where every app is AI-enabled and adapts to each organization's modern data capabilities. However, fragmented digital estates make it difficult for organizations to harness their data to add layers of intelligence to their apps.

At our Build event in May, we announced the Microsoft Intelligent Data Platform that fully integrates databases, analytics, and governance for a unified data estate. With this integration, organizations can power applications at any scale, get actionable insights from all their data, and properly govern data where it resides. To accelerate time to value, customers can use pre-built, customizable, and production-ready AI models as the building blocks for intelligent solutions with Azure Cognitive Services and Azure Applied AI Services.

As AI becomes more mainstream across organizations, it’s essential that employees have the tools to leverage this technology responsibly. We apply Microsoft's Responsible AI Standards to our product development, and have made it a priority to help customers understand, protect, and control their AI solutions with tools and resources like the Responsible AI Dashboard, bot development guidelines, and built-in tools to help explain model behavior, test for fairness and more.

By unifying and integrating data to create more intelligent apps, customers are opening the door to new innovations never thought possible.

From cloud to edge: Innovate securely, anywhere

More and more organizations are embracing hybrid and multicloud as part of their migration and modernization journeys, and they want to continue this flexible approach in a secure, compliant, reliable, and integrated way. Forrester credits Microsoft with "seeking to lead in hybrid and multicloud environments with platform management tools and capabilities, including the Azure Arc management platform."

Azure Arc operates as a bridge extending across the Azure platform by allowing applications and services the flexibility to run across on-premises, edge, and multicloud environments. One of the key challenges organizations face is securing and managing their distributed environments consistently while building innovative applications using cloud-native technologies.

Recently, we announced new deployment options for Azure Kubernetes Services enabled by Azure Arc so customers can run containerized apps, in addition to many first-party Azure application, data, and machine learning services, anywhere regardless of their location.

They can also take advantage of Azure’s comprehensive security, governance, and management capabilities for their Windows, Linux, SQL Server, and Kubernetes deployments in their datacenters, at the edge, or multicloud.

Azure is the only cloud platform built by a security vendor, and ensuring that our customers' data is safe and secure is at the forefront of everything we do. For example, our Microsoft Defender for Cloud security service spans all clouds, including AWS and Google Cloud, for a seamless, consistent, and secure cloud journey.

Our deep commitment to our customers is baked into every aspect of our vision and roadmap—to be the trusted partner with the most advanced, yet flexible cloud technologies that enable anyone in any organization to innovate anywhere. It’s an honor to be recognized for that commitment and a great way to usher in the New Year.

Learn more

Read The Forrester Wave™: Public Cloud Development And Infrastructure Platforms, Global, Q4 2022.
Learn how Toyota employees used low-code to create more than 400 apps to meet business needs.
Learn how Sutherland, a professional services company, built a data-driven culture with Microsoft Azure.
Learn how the National Basketball Association delivers compelling experiences for its fans through intelligent applications. 
Learn how Royal Bank of Canada, the largest bank in Canada, is using Azure Arc–enabled data services to take advantage of always up-to-date cloud-native data services to modernize its large data estate.
Read how a study found a 228 percent ROI when modernizing apps on Azure's platform as a service.

Source: Azure

Microsoft Innovation in RAN Analytics and Control

Currently, Microsoft is working on RAN Analytics and Control technologies for virtualized RAN running on Microsoft Edge platforms. Our goal is to empower virtualized RAN solution providers and operators to realize the full potential of disaggregated and programmable networks. We aim to develop platform technologies that virtualized RAN vendors can leverage to gain analytics insights into their RAN software operations, and to use these insights for operational automation, machine learning, and AI-driven optimizations.

Microsoft has recently made important progress in RAN analytics and control technology. Microsoft Azure for Operators is introducing flexible, dynamically loaded service models to both the RAN software stack and cloud/edge platforms hosting the RAN, to accelerate the pace of innovation in Open RAN.

The goal of Open RAN is to accelerate innovation in the RAN space through the disaggregation of functions and the exposure of internal interfaces for interoperability, controllability, and programmability. The O-RAN Alliance's current standardization effort specifies the RAN Intelligent Controller (RIC) architecture, which exposes a set of telemetry and control interfaces with predefined service models (known as the E2 interface). Open RAN vendors are expected to implement all E2 service models specified in the standard. Near-real-time RAN control is made possible by xApps accessing these service models.

Microsoft’s innovation extends this standard-yet-static interface. It introduces the capability of getting detailed internal states and real-time telemetric data out of the live RAN software in a dynamic fashion for new RAN control applications. With this technology, together with detailed platform telemetry, operators can achieve better network monitoring and performance optimization for their 5G networks, and enable new AI, analytics, and automation capabilities that were not possible before.

This year, Microsoft, together with contributions from Intel and Capgemini, has developed an analytics and control approach that was recognized with the Light Reading Editor’s Choice award under the category of Outstanding Use case: Service provider AI. This innovation calls for dynamic services models for Open RAN.

Dynamic service models for real-time RAN control

There are many RAN control use cases that require dynamic service models beyond those specified in O-RAN today, such as access to IQ samples, RLC and MAC queue sizes, and packet retransmission information. This high-volume, real-time data needs to be aggregated and compressed before being delivered to the xApp. Also, detailed data from different RAN modules across different layers, like L1, L2, and L3, may need to be collected and correlated in real time before any useful insight can be derived and shared with the xApp. Further, a virtualized RAN offers so many possibilities that any static interface or service model may be ineffective in meeting the more advanced real-time control needs.

One such example occurs with interference detection. Today, operators typically need to do a drive test to detect external interference in a macro cell. But now, Open RAN has the potential to replace the expensive truck roll with a software program that detects interference signals at the RAN’s L1 layer. However, this will require a new data service model with direct access to raw IQ samples at the physical layer. Another example exists in dynamic power saving. If a RAN power controller can see the number of packets queued at various places in the live RAN system, then it can estimate the pending process loads and optimize the CPU frequency at a very high pace, in order to reduce the RAN server power consumption. Our study has shown that we can reduce the RAN power consumption by 30 percent through this method—even during busy periods. To support this in Open RAN, we will need a new service model that exposes packet queuing information.
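The dynamic power-saving idea above can be sketched as a toy frequency picker driven by observed user and queue counts. The thresholds, frequency range, and load formula below are invented for illustration; this is not Microsoft's production controller.

```python
# Illustrative sketch: choose a CPU frequency from observed RAN load signals
# (active users and packet-queue depth), stepping down when load is light.
# All constants are made-up for illustration.

def pick_frequency_mhz(active_users: int, queued_packets: int,
                       f_min: int = 1200, f_max: int = 3000) -> int:
    """Scale CPU frequency with estimated pending load, never below f_min."""
    # Crude load estimate in [0, 1]: 64 users or 1000 queued packets
    # saturates the estimate (hypothetical capacity figures).
    load = min(1.0, active_users / 64 + queued_packets / 1000)
    return int(f_min + (f_max - f_min) * load)

print(pick_frequency_mhz(0, 0))      # 1200  (idle: lowest frequency)
print(pick_frequency_mhz(64, 1000))  # 3000  (saturated: full frequency)
```

A real controller would, as the article notes, need millisecond-scale reaction and accurate prediction so that traffic bursts never hit a stepped-down CPU.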

These new use cases were envisioned after the current E2 interface was standardized. To achieve them, we need new RAN platform technologies that can quickly extend this interface to support these and future advanced RAN control applications.

The Microsoft RAN analytics and control framework

The Microsoft RAN analytics and control framework extends the current RIC service models in O-RAN architecture to be both flexible and dynamic. In the process, the framework allows RAN solution providers and operators to define their own service models for dynamic RAN monitoring and control. Here, the underlying technology is a runtime system that can dynamically load and execute third-party code in a trusted and safe manner.

This system enables operators and trusted third-party developers to write their own telemetry, control, and inference pieces of code (called “codelets”) that can be deployed at runtime at various points in the RAN software stack, without disrupting the RAN operations. The codelets are executed inline in the live RAN system and on its critical paths, allowing them to get direct access to all important internal raw RAN data structures, to collect statistics, and to make real-time inference and control decisions.

To ensure security and safety, the codelets are statically checked with verification tools before they can be loaded, and they are automatically pre-empted if they run longer than their predefined execution budgets. The dynamic code extension system follows the same approach as the Extended Berkeley Packet Filter (eBPF), a proven technology that has been trusted to run custom code in Linux kernels on millions of mission-critical servers around the globe. The inline execution is also extremely fast, typically incurring less than one percent of overhead on existing RAN operations.
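As a rough illustration of the execution-budget idea: real systems verify codelets statically and preempt them in the runtime, whereas this toy wrapper only measures elapsed time after the fact. The codelet shape and budget value are hypothetical.

```python
import time

# Toy sketch of budget-limited codelets: a codelet is a small read-only
# function over live RAN state that returns a telemetry value. Real systems
# (eBPF-style) verify the code before loading and preempt overruns; here we
# merely time the call and report whether it stayed within budget.

def run_codelet(codelet, state: dict, budget_us: float = 1000.0):
    """Run a codelet and report (result, stayed_within_budget)."""
    start = time.perf_counter()
    result = codelet(state)
    elapsed_us = (time.perf_counter() - start) * 1e6
    return result, elapsed_us <= budget_us

# Example codelet: total packets queued across all RAN queues.
queue_depth_codelet = lambda state: sum(state["queues"])
result, within_budget = run_codelet(queue_depth_codelet,
                                    {"queues": [3, 5, 2]})
print(result)  # 10
```

The key property the article describes, inline execution on the critical path with direct access to internal data structures, is what makes static verification and strict budgets necessary in the first place.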

The following image illustrates the overall framework and the dynamic service model denoted by the star circle with the letter D.

The benefit of the dynamic extension framework with low-latency control is that it opens the opportunity for third-party real-time control algorithms. Traditionally, due to tight timing constraints, a real-time control algorithm had to be tightly implemented and integrated inside the RAN system. The Microsoft RAN analytics framework allows RAN software to delegate certain real-time controls to the RIC, potentially leading to a future marketplace for real-time control algorithms, machine learning, and AI optimization models.

Microsoft, Intel, and Capgemini have jointly prototyped this technology in Intel’s FlexRAN™ reference software and Capgemini’s 5G RAN. We have also identified standard instrumentation points aligned with the standard 3GPP RAN architecture to achieve higher visibility into the RAN’s internal state. We have further developed 17 dynamic service models, and enabled many new and exciting applications that were previously not thought possible.

Examples of new applications of RAN analytics

With this new Analytics and Control Framework, applications of dynamic power savings and interference detection described earlier can now be realized.

RAN-agnostic dynamic power saving

5G RAN energy consumption is a major OPEX item for any mobile operator. As a result, it is paramount for a RAN platform provider to find any opportunity to save power when running the RAN software. One such opportunity can be found by stepping down the RAN server CPU frequency when the RAN processing load is not at full capacity. This is indeed promising because internet traffic is intrinsically “bursty”; even during peak hours, the network is rarely operated at full capacity.

However, any dynamic RAN power controller must also have accurate load prediction and fast reaction at millisecond timescales. Otherwise, if one part of the RAN is in hibernation, any instant traffic burst will cause serious performance issues, or even crashes. The Microsoft RAN analytics framework, with dynamic service models and a low-latency control loop, makes it possible to write a novel CPU frequency prediction algorithm based on the number of active users and changes in different queue sizes. We have implemented this algorithm on top of Capgemini 5G RAN and Intel FlexRAN™ reference software, and achieved up to 30 percent energy savings—even during busy periods.

Interference detection

External wireless interference has long been a source of performance issues in cellular networks. Detecting it is difficult and often requires a truck roll with specialized equipment and experts. With dynamic service models, we can turn an O-RAN 5G base station into a software-defined radio that detects and characterizes external wireless interference without affecting radio performance. We have developed a dynamic service model that averages the received IQ samples across frequency chunks and time inside the L1 of the FlexRAN™ reference software stack. The service model in turn reports the averages to an application that runs an AI and machine learning model for anomaly detection, in order to detect when the noise floor increases.
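The averaging-and-flagging step can be sketched as follows: compute mean received power per frequency chunk, then flag chunks well above the baseline noise floor. The chunking, threshold factor, and anomaly rule are illustrative stand-ins for the production service model and its ML-based detector.

```python
# Hedged sketch of the interference-detection idea: average |I|^2 + |Q|^2
# per frequency chunk, then flag chunks far above the baseline noise floor.
# A simple threshold stands in for the article's anomaly-detection model.

def chunk_powers(iq_samples, chunk_size):
    """Mean power per frequency chunk (assumes len divides evenly)."""
    powers = [i * i + q * q for i, q in iq_samples]
    return [sum(powers[k:k + chunk_size]) / chunk_size
            for k in range(0, len(powers), chunk_size)]

def flag_interference(powers, noise_floor, factor=4.0):
    """Indices of chunks whose mean power exceeds factor * noise floor."""
    return [idx for idx, p in enumerate(powers) if p > factor * noise_floor]

samples = [(1, 0)] * 4 + [(3, 0)] * 4   # second chunk is 9x hotter
per_chunk = chunk_powers(samples, chunk_size=4)
print(flag_interference(per_chunk, noise_floor=1.0))  # [1]
```

The aggregation happening inside L1, before data leaves the RAN, is what keeps the data volume manageable for the consuming application.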

Virtualized, software-based RAN solutions offer the immense potential of programmable networks that can leverage AI, machine learning, and analytics to improve network efficiency. Dynamic service models for O-RAN interfaces further accelerate the pace of innovation with added flexibility and security.

Learn more

Learn more about Microsoft Azure for Operators from our website.
Microsoft Research Technical Report.
Microsoft's innovation in RAN analytics was the Editor's Choice for "Outstanding Use Case: Service Provider AI" in the 2022 Leading Lights Awards (Leading Lights 2022: The Winners, Light Reading).
Finalist in the Fierce Innovation Awards, Telecom Edition 2022 (Finalists, Fierce Telecom Awards).

Source: Azure

Microsoft Cost Management 2022 year in review

In some ways, 2022 is what we expected out of 2021. Perhaps it’s better late than never, but the world is beginning to get back to normal—albeit a new normal where hybrid work is a default rather than an exception for many of us. In this new world, demands on our time have increased exponentially, making it more critical than ever to focus on maximizing value, return on investment, and cloud efficiencies. And that’s exactly what you saw as we doubled down on savings opportunities in 2022.

Streamlined management behind a single pane of glass

The last few years have been focused on building a new commerce platform that brings all Microsoft offers together under a single billing relationship. Microsoft Customer Agreement really puts you in control with consolidated invoicing that you can customize and split to meet your needs. 2022 continued down this path by expanding support to an even broader audience.

Perhaps the most notable change was the major shift you saw as Azure Cost Management expanded coverage to Microsoft 365, Dynamics 365, and more in January. At the same time, Cost Management was made available directly within the Microsoft 365 admin center, giving you more freedom to view costs where you are. Then, in May, Azure Cost Management was rebranded to Microsoft Cost Management to align with our vision of a single cost management experience across all your commercial products and services.

While Cost Management expanded to a new portal, Enterprise Agreement billing account management took the next step towards consolidating into the Azure portal in March.

Going back to Microsoft Customer Agreement: since May, organizations with billing accounts spread across multiple tenants can link and centrally manage those accounts from a single tenant. In October, you also saw new licensing benefits that make it easier for Cloud Solution Provider partners to bring workloads and licenses to partners' clouds.

A few of the things we didn’t announce, but some of you may have noticed include invoice details improvements in the portal, a streamlined support experience for refunds, transitioning education accounts to Microsoft Customer Agreement (which comes with Cost Management support), and a faster usage pipeline to get cost details to you faster than ever before.

What's next?

Looking ahead to 2023, expect to see more organizations transitioning to Microsoft Customer Agreement, new Microsoft 365 offers covered by Cost Management, more Cost Management capabilities in the Microsoft 365 admin center, and further consolidation of Enterprise Agreement billing account management capabilities in the Azure portal for partners. You'll also see continued rollout of the faster usage pipeline to more accounts, which will be available for Microsoft Customer Agreement accounts first, then Enterprise Agreement accounts later in the year. We're very excited to get these into your hands and hear what you'd like to see next.

Rich cost reporting and analytics

I've talked about how the Cost analysis preview is the future of analytics and insights in Cost Management, and 2022 introduced many improvements that show where we're headed. It started with multitasking in January, showing how you'll be able to investigate multiple perspectives of your data simultaneously, speeding up investigation times. You then saw the next evolution of smart insights with the anomaly detection preview in February and cost savings insights in September. In April, summarized totals gave you an at-a-glance view of your total and average cost compared to your budget, and building on that, November added a callout for the change since the previous period.

You also saw improvements in how resources are grouped in the Resources view, making it easier than ever to review your costs quickly: child resources like Microsoft Azure SQL databases were grouped automatically starting in May, and the cm-resource-parent tag let you group resources your own way starting in October. And if that wasn't enough, you may have noticed performance and reliability improvements in the underlying APIs, as well as a few more preview features in Cost Management Labs that are coming soon.
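To illustrate the parent-grouping idea, here is a small sketch that rolls up cost rows under a parent resource via a tag such as cm-resource-parent. The tag name comes from the article; the row shape and values are hypothetical.

```python
from collections import defaultdict

# Sketch: sum costs per logical parent. Rows carrying a cm-resource-parent
# tag roll up under that parent; untagged rows stand alone. Row fields
# (resource_id, cost, tags) are a hypothetical cost-export shape.

def group_by_parent(rows, tag="cm-resource-parent"):
    totals = defaultdict(float)
    for row in rows:
        parent = row.get("tags", {}).get(tag, row["resource_id"])
        totals[parent] += row["cost"]
    return dict(totals)

rows = [
    {"resource_id": "sqlserver1", "cost": 10.0, "tags": {}},
    {"resource_id": "db1", "cost": 4.0,
     "tags": {"cm-resource-parent": "sqlserver1"}},
]
print(group_by_parent(rows))  # {'sqlserver1': 14.0}
```

The same roll-up is what the Resources view performs for you in the portal; the sketch is only useful if you are post-processing exported cost data yourself.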

For those working on automation and integration with our APIs, check out the option to download your Azure prices as a ZIP file, added in April, or the new Cost Details API from July. I would especially encourage those using the older Usage Details or Consumption APIs to take a hard look at the Cost Details API: the older APIs will be deprecated, so it’s important to switch over to the new, more scalable API.
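As a rough sketch of what moving to the Cost Details API involves, the snippet below builds the request URL and body for the generateCostDetailsReport endpoint. The subscription ID and date range are placeholders; an actual call also requires an Azure AD bearer token, which is omitted here.

```python
# Sketch: constructing a Cost Details API (generateCostDetailsReport)
# request. The scope and dates are placeholders; sending the request
# also requires an Authorization header with an Azure AD token.

def build_cost_details_request(scope: str, start: str, end: str):
    """Return the URL and JSON body for a Cost Details report request."""
    url = (
        f"https://management.azure.com/{scope}"
        "/providers/Microsoft.CostManagement/generateCostDetailsReport"
        "?api-version=2022-05-01"
    )
    body = {
        "metric": "ActualCost",
        "timePeriod": {"start": start, "end": end},
    }
    return url, body

url, body = build_cost_details_request(
    "subscriptions/00000000-0000-0000-0000-000000000000",
    "2022-12-01",
    "2022-12-31",
)
print(url)
```

The endpoint is asynchronous: the POST returns a polling location, and the finished report is delivered as a downloadable file rather than an inline payload, which is what makes it more scalable than the older APIs.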

And for those on the go, you can now view your costs in the Azure mobile app as of June. You can see your current and forecasted cost for the month and check any budgets you have set up on your subscriptions and resource groups.

What's next?

2023 will see the general availability of the Cost analysis preview. That will start with a series of navigation updates that help you pick which cost view to start from, followed by improvements that help you visualize and drill into costs. Classic cost analysis will remain available while we bring rich filtering and customization to the Cost analysis preview. Our goal is to bring each of those capabilities over better than they were before, including more built-in cost views that help you do more than you can today.

When it comes to automation and integration, expect continued evolution of scheduled exports: support for storage accounts behind a firewall, overwriting files for current-month exports instead of generating new files every day, more datasets becoming available for export (like price sheets), and new guidance and templates to help you better manage data at scale on top of your exports.

On top of all this, you’ll also continue to see latency, performance, reliability, and usability improvements throughout the year. Our ultimate goal is to reduce the time it takes for cost data to reach you, whether through APIs or portal experiences, to one to two hours.

Flexible cost control that puts the power in your hands

When it comes to cost governance and driving accountability throughout a large organization, tags are critical. With that in mind, I want to start by calling out the tag inheritance preview from November. This is a very powerful tool that applies your subscription and resource group tags down to the cost data of your resources. You’ll see the applied tags both in Cost analysis in the portal and in any data you pull via APIs or exports. Note that Cost Management tag inheritance works differently from Azure Policy: tags are not applied to the resources themselves (they appear only in cost data), and inherited tags are applied only to resources whose cost data doesn’t already include tags today.
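The inheritance behavior described above can be modeled with a few lines of code. This is an illustrative simulation only, not the service's implementation; the record shape and tag names are hypothetical.

```python
# Illustrative model of tag inheritance: subscription or resource group
# tags are merged into cost records, and only records that carry no tags
# of their own pick up the inherited values. Field names are hypothetical.

def apply_tag_inheritance(record: dict, group_tags: dict) -> dict:
    """Return a copy of a cost record, adding inherited tags when the
    record has no tags of its own in its cost data."""
    tagged = dict(record)
    if not tagged.get("tags"):  # untagged record inherits
        tagged["tags"] = dict(group_tags)
    return tagged

group_tags = {"costCenter": "1234", "env": "prod"}
records = [
    {"resource": "vm-web-01", "cost": 12.5, "tags": {}},
    {"resource": "sql-db-01", "cost": 30.0, "tags": {"env": "dev"}},
]
result = [apply_tag_inheritance(r, group_tags) for r in records]
```

Here `vm-web-01` inherits the group tags because its cost data has none, while `sql-db-01` keeps its own `env: dev` tag untouched, matching the behavior the preview describes.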

As for alerts, anomaly alerts became configurable in the Azure portal in May, followed by the Scheduled Actions API for configuring anomaly and scheduled alerts programmatically in June. Also in June, budgets added support for the action group common alert schema. Then in September came budgets in the Azure mobile app, which I mentioned earlier. We haven’t fully rolled it out yet, but some of you are also getting the faster budget alerts we mentioned were in progress.

What's next?

Insights and alerts are a big area of discussion for us lately as we plan out the next set of changes to expand anomaly detection, identify new insights you might find useful, and add more options for cost alerts, including the full rollout of faster budget alerts, which come with a goal of alerting you within two hours of going over your budget. First up on this list is reservation utilization alerts, followed by resource group anomaly detection, but stay tuned as we continue to flesh out these plans.
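To make the budget-alert idea concrete, here is a minimal sketch of the kind of threshold evaluation a budget alert performs: given actual spend, a budget amount, and a set of alert thresholds expressed as percentages, it reports which thresholds have been crossed. The function name and shape are illustrative, not the service's API.

```python
# Minimal sketch of budget threshold evaluation: which configured alert
# thresholds (percentages of the budget) has current spend crossed?

def exceeded_thresholds(actual: float, budget: float, thresholds) -> list:
    """Return the thresholds (in percent of budget) that actual spend
    has reached or exceeded, in ascending order."""
    pct_of_budget = actual / budget * 100
    return [t for t in sorted(thresholds) if pct_of_budget >= t]

# Spend of $920 against a $1,000 budget crosses the 50%, 75%, and 90%
# thresholds but not 100%.
crossed = exceeded_thresholds(actual=920.0, budget=1000.0,
                              thresholds=[50, 75, 90, 100])
print(crossed)  # [50, 75, 90]
```

The "faster budget alerts" goal mentioned above is about shrinking the lag between spend occurring and this kind of check firing, to within two hours of exceeding the budget.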

New ways to save and do more with less

As you heard in many of the keynotes and sessions at Microsoft Ignite this year, cost optimization is a major focus for us at Microsoft. I’m going to start with the biggest announcement in this space, which was the release of Azure savings plans in October. Savings plans are a commitment-based discount that help you reduce your costs by committing to consistent usage over one or three years. The one-liner is that they’re like a more flexible version of reservations, except based on cost instead of usage quantity. Of course, there’s a lot more to that story, so I encourage you to learn more. But while I’m on the subject of commitment-based discounts, I should also mention the availability of reservations for Azure Cache for Redis in August, Azure Backup Storage in September, VM software reservations in November, and a new on-demand capacity reservation type for virtual machines and Azure Site Recovery in March and AKS in April.
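Because savings plans commit you to a fixed hourly spend rather than a usage quantity, the tradeoff is easy to express in arithmetic. The rates below are purely hypothetical for illustration; actual discounts vary by service, region, and term.

```python
# Hypothetical rates for illustration only; real savings plan discounts
# vary by service, region, and term length.
payg_hourly = 1.00   # pay-as-you-go spend per hour of usage
plan_hourly = 0.70   # hourly spend commitment under a 1-year savings plan

hours_per_year = 24 * 365
payg_total = payg_hourly * hours_per_year
# Committed spend is billed every hour whether or not it is fully used.
plan_total = plan_hourly * hours_per_year
savings_pct = (payg_total - plan_total) / payg_total
print(f"Annual savings at full utilization: {savings_pct:.0%}")
```

The same arithmetic also shows the risk side of the commitment: if usage drops below the committed hourly spend, the unused portion of the commitment is still billed, which is why consistent usage is the prerequisite called out above.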

You saw Azure Advisor improvements like filtering cost recommendations by tag in July, cost savings insights in Cost analysis in September, Advisor Score general availability in October, and new cost recommendations for virtual machine scale sets in November.

Throughout the year, you saw several blog posts focused on helping you learn how to drive efficiency across the different services you use:

Rightsize to maximize your cloud investment in January.
Save big by using your on-premises licenses in January.
Unlock cloud savings on the fly with autoscale in April.
How to choose the right Azure services for your applications—It’s not A or B in July.
What is desktop as a service (DaaS) and how can it help your organization? in July.
Migrate and modernize with Azure to power innovation across the entire digital estate in July.
5 steps to prepare developers for cloud modernization in August.
SQL Server discovery and assessment with Azure Migrate in September.
Drive efficiency through automation and AI in October.

And lastly, here's a summary of the services you saw new cost-saving opportunities for in 2022:

General

Azure reduced prices in West US 3 in August and expanded to Qatar in September and Sweden in November.
Microsoft Teams Premium in October.

AI and machine learning

Machine Learning added auto-shutdown for idle compute instances in September.

Analytics

Stream Analytics increased the size of jobs and clusters and added autoscaling for jobs in May and expanded to 10 new regions in March; China East 3, China North 3, US DoD East, and US DoD Texas in July; and Qatar in October.
Azure Databricks added Serverless SQL support in August and expanded to Sweden Central and West Central US in June and West US 3 in August.

Compute

Virtual machines reduced prices for DCsv2/DCsv3 in January, then expanded them to Switzerland and West US in April and to Australia East, Japan East, South Central US, and Southeast Asia in May; added the ability to auto-delete associated resources and added disk bursting in February; introduced the Ebsv5 SKU in April and expanded it to 13 additional regions in May; introduced the DCsv3/DCdsv3 SKU in May, NC A100 v4 in June, NVads A10 v5 in July, and HX and HBv4 in November; and upgraded HBv3 VMs in March.
RedHat Enterprise Linux (RHEL) and SUSE Linux Enterprise Server (SLES) improved their Azure Hybrid Benefit support in February.
Virtual machine scale sets added support for both standard and Spot VMs in the same scale set in October.
Azure Batch added support for spot VMs in March.
Azure VMWare Solution expanded to Sweden Central in August and introduced the AV36P and AV52 SKUs in November.

Containers

Azure Kubernetes Service (AKS) added scale-down mode in April; support for Azure Dedicated Host, node pool user start/stop, and the DCav5/ECav5 SKU in August; and Azure Hybrid Benefit in October.

Databases

Azure SQL Database Hyperscale released the general availability of named replicas in June.
SQL virtual machines released the general availability of best practices assessment.
Azure Database for MySQL Flexible Server expanded to US Gov Virginia and China East 2 and China North 2 in March and added support for B-series VMs in April.
Azure Database for PostgreSQL Flexible Server added support for more high availability regions and expanded to US Gov Virginia and US Gov Arizona in March and China North 3 and China East 3 in November.
Cosmos DB lowered the autoscale RU/s requirement in April; added features for scalable, cost-effective application development in May; and increased the serverless container storage limit, improved the Try Azure for free experience, and added a 16 MB per-document limit in the API for MongoDB in June.

Developer tools

App Configuration added geo replication support in August.

Hybrid and multicloud

Azure Arc added support for SQL Managed Instance Business Critical in May and expanded support for South Africa North and China East 2 and China North 2 in August.
Azure Stack HCI added support for Windows Server guest licensing offer in April and Azure Hybrid Benefit in October.

Management

Application Insights expanded to China North 3 and China East 3 in August.
Log Analytics expanded to China North 3 and China East 3 in August.
Azure Monitor added the ability to configure high-volume verbose log tables as basic logs to reduce cost.
Azure Backup added support for zone-redundant storage in October.

Networking

Azure Load Balancer introduced Gateway Load Balancer in July.
Azure Firewall introduced a Basic tier in October.
Virtual network IP services made IPv6 offers free in July.
Network Watcher expanded support for hybrid networks in October.

Security

Azure Key Vault increased service limits in January.

Storage

Azure Storage added the ability to create an additional 5000 storage accounts per subscription in June and Premium SSD v2 disk storage in October.
Azure Archive Storage expanded to Switzerland North in April.
Azure NetApp Files expanded to Australia Central 2 in February and added backup support in Southeast Asia and UK South in September.

Web

Azure SignalR introduced a Premium tier in March.

What's next?

As usual, you'll see more of the same types of cost optimization opportunities throughout 2023. We'll also continue to partner with service teams to help them deliver cost recommendations and find new ways to help you save more on your existing workloads.

Making it easier to learn and use Cost Management and Billing

We're constantly on the lookout for ways to make Cost Management easier to learn and use. From ratings and reviews in the portal to user research, like the numerous surveys and research interviews we shared in 2022, and many, many conversations with you all—your feedback is critical.

With 21 previews throughout the year, it's hard to pick favorites, but I’d love to ask each of you to check out the latest changes in the Cost analysis preview, given those are planned to be rolled out soon.

There were also many videos and documentation updates from us, our partners, and the community. It's truly amazing to see how important cost visibility, accountability, and optimization are for everyone from early learners to the largest organizations. We covered 16 videos and 45 of the main documentation updates, but that barely scratches the surface of all the great learning content out there.

What's next?

As always, you can expect more of the same in 2023: continued dedication to ease of use through early access to previews and experimentation, and further improvements to documentation to facilitate your learning. We’re still finalizing a few aspects of what’s coming in this area, but I’ll leave you with one takeaway: we are absolutely dedicated to evolving and sharing proven FinOps practices. That spans native capabilities within our platform and tools, guidance that helps you make the right decisions on how best to manage your costs, and broader alignment across the industry, making it easier than ever to align people and implement proven practices, regardless of the tools you use.

Looking forward to another year

With all the things that happened in 2022, we couldn't list everything here. Check out and subscribe to the Microsoft Cost Management monthly updates for the latest news.

We look forward to hearing your feedback as new and updated capabilities become available. And if you're interested in the latest features before they're available to everyone, check out Cost Management Labs and don’t hesitate to reach out with any feedback. Cost Management Labs gives you a direct line to the Microsoft Cost Management engineering team and is the best way to influence and make an immediate impact on features being actively developed and tuned for you.

Follow Microsoft Cost Management on Twitter and subscribe to the YouTube channel for updates, tips, and tricks! And, as always, share your ideas and vote up others in the Cost Management feedback forum.

Best wishes from the Microsoft Cost Management team.
Source: Azure

Microsoft named a Leader in the 2022 Gartner® Magic Quadrant™ for Global Industrial IoT Platforms

As the industrial Internet of Things (IoT) matures and transforms critical business functions, Microsoft continues to innovate and invest in this area and to engage with a large ecosystem of solution partners.

The Microsoft Azure IoT platform can help industries improve their operations to become more efficient, agile, and sustainable. Industrial-focused IoT technologies from Microsoft are especially important for enterprises looking to do more with less in the current macro-economic environment.

We’re focused on enabling industrial IoT solution providers and customer operations to be successful, and those efforts haven’t gone unnoticed. For the third year in a row, Gartner has positioned Microsoft as a Leader in the 2022 Gartner Magic Quadrant for Global Industrial IoT Platforms.

Leading the way in industrial IoT

In naming Microsoft a Leader, Gartner cites our customer success, our technical strengths in security and manageability, our focus on supporting end-to-end solutions, and our support for our ecosystem of partners. With a long history stretching back to the early days of IoT technology, Microsoft remains committed to providing the evolving infrastructure that companies need to excel with IoT deployments. Our IoT technology is used in everything from inspecting vehicle build quality to monitoring and shifting energy across utility grids to recycling surplus manufacturing materials.

To continue leading on IoT technology and applications, we’re especially focused on supporting key areas that customers are prioritizing:

IT/OT convergence: Digital transformation in the industrial sector is increasingly bringing together information technology (IT), the systems that collect and analyze data to manage processes, with operational technology (OT), which detects and controls physical processes. This IT/OT convergence allows more direct, real-time monitoring, analysis, and control of industrial processes and enables increased automation, faster responses to challenges, and shifts in manufacturing or production rates. With Microsoft Azure IoT, operators can move toward this convergence with system openness, visibility, and security.

Cloud and on-site innovations: Microsoft Azure IoT provides a holistic approach to maintaining and securing digital infrastructure, handling and controlling data, and managing apps and databases no matter where they are. From high-performance computing in the cloud to edge and on-premises equipment, Azure can govern and help secure servers, Kubernetes clusters, and apps with tools such as Azure Arc, Azure private multi-access edge compute (MEC), and Azure Stack HCI. Together, these tools provide consistent, fast, and seamless performance for scalable IoT systems spread across multicloud and on-premises locations.

Using digital twins and the metaverse: As IT/OT convergence continues, the ability to use digital replicas and virtual environments to evaluate or control machinery or simulate factory floor environments becomes more useful. Azure Digital Twins uses IoT spatial intelligence to create accurate models of processes or physical things. This lets operators test the effects of new production line configurations, new processes, and modifications to machinery without disrupting physical world operations. Immersive metaverse technology allows employees to virtually perform maintenance procedures or train for operating new industrial equipment before touching any physical assets. This is safer for employees and creates fewer production disruptions.

Security advancements: With IoT workloads moving between multiple clouds and on-premises environments with more flexibility than ever before, security is also more important than ever. This is especially critical with industrial IoT, where compromising critical systems could have serious consequences. The zero-trust security measures built into Azure are extensive and robust to secure cloud-based applications and data. With Azure Arc, customers can extend the familiar security controls of Azure to companies’ on-premises and edge infrastructures. Additionally, many of our solutions partners offer additional security as part of their offerings.

Collaborating with partners: The strength of Microsoft Azure IoT offerings lies in our partner ecosystem, which includes hundreds of systems integrators, independent software vendors, and device partners. These partners offer solution choices for a wide range of industries, including many industrial-related solutions. We work to give their solutions visibility to our customers to help match solutions with needs to address emerging challenges.

Shaping the future of IoT

We’re pleased that Gartner has once again recognized us as a Leader in its Magic Quadrant for Global Industrial IoT Platforms. As IoT applications continue to grow, more organizations are connecting their physical operations to the digital world to gain insights, optimize performance, and work toward sustainability. We continue to invest in and evolve our broad Azure IoT platform, including our industrial IoT technologies, to help customers accelerate their time to value on IoT investments and enable even more capabilities.

With our partners, we take our role as a Leader in IoT platforms and products seriously, and we plan to keep providing the products and support our customers need to move ahead with their digital transformations.

Learn more

We invite you to learn more about the Azure IoT platform and products and explore the Azure Marketplace. You can also read the full complimentary Gartner Report.

Gartner, Magic Quadrant for Global Industrial IoT Platforms, Al Velosa, Eric Goodness, et al., December 12, 2022.

Gartner is a registered trademark and service mark and Magic Quadrant is a registered trademark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and are used herein with permission. All rights reserved.

This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available upon request from Magic Quadrant for Global Industrial IoT Platforms.

Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
Source: Azure