Microsoft Azure 1st Hyperscale Cloud Computing Platform to Enable UK Law Enforcement Community

We’re excited and proud to announce that Microsoft Azure is the first hyper-scale cloud computing platform able to serve UK law enforcement IT customers. This announcement comes in the wake of the United Kingdom’s National Police Information Risk Management Team (NPIRMT) completing a comprehensive physical security review of a Microsoft UK Data Centre. This review is a necessary step to provide assurance to UK law enforcement agencies that their information management systems will be hosted in Police Approved Secure Facilities (PASF).

As stated by the College of Policing’s Authorized Professional Practice (APP), “Policing is an information-led activity, and information assurance is fundamental to how the police service manages many of the challenges faced in policing today.” Azure is proud to be recognized in this way as we contribute to the information assurance tapestry needed to enable the UK law enforcement community.

The actual NPIRMT PASF assessment is available to policing customers from the Home Office for individual Police Services to review as part of their own approach to risk assessment in utilizing cloud services.

* It is important to note that the NPIRMT does not offer any warranty of the physical security of the Microsoft data center.

Source: Azure

Announcing larger, higher scale storage accounts

One of the fastest areas of growth in cloud computing is data storage. With a variety of workloads such as IoT telemetry, logging, media, genomics, and archival driving cloud data growth, the need for scalable capacity, bandwidth, and transactions for storing and analyzing data for business insights is more important than ever.

Up to 10x increase to Blob storage account scalability

We are excited to announce improvements in the capacity and scalability of standard Azure storage accounts, which greatly improves your experience building cloud-scale applications using Azure Storage. Effective immediately, Azure Blob storage accounts can support the following larger limits:

 

Resource                                    New limit
Max capacity for Blob storage accounts      5 PB (10x increase)
Max TPS/IOPS for Blob storage accounts      50K (2.5x increase)
Max ingress for Blob storage accounts       50 Gbps (2.5-10x increase)
Max egress for Blob storage accounts        50 Gbps (2.5-5x increase)

These new limits apply to both new and existing Blob storage accounts. You can continue to leverage all of your favorite features in these storage accounts at scale, without any changes required. The new limits are available in all commercial clouds. We plan to raise limits similarly for Government and National clouds in the near future. Stay tuned for more information!

Please contact Azure Support to get your Blob storage accounts raised to these new limits for storage capacity, TPS/IOPS, ingress or egress. We are also actively working to increase these limits further, so contact us even if your workload requires more than the new limits.
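As a rough illustration of what the new ceilings mean in practice, the sketch below checks a hypothetical workload against the announced Blob storage account limits. The limit values come from the announcement; the workload numbers are made up for the example.

```python
# Back-of-the-envelope check of a workload against the new Blob storage
# account limits. The workload figures are hypothetical.

LIMITS = {
    "capacity_pb": 5,       # max capacity: 5 PB
    "tps": 50_000,          # max transactions (TPS/IOPS): 50K
    "ingress_gbps": 50,     # max ingress: 50 Gbps
    "egress_gbps": 50,      # max egress: 50 Gbps
}

def fits_in_one_account(capacity_pb, tps, ingress_gbps, egress_gbps):
    """Return the list of limits a workload would exceed (empty = fits)."""
    workload = {
        "capacity_pb": capacity_pb,
        "tps": tps,
        "ingress_gbps": ingress_gbps,
        "egress_gbps": egress_gbps,
    }
    return [k for k, v in workload.items() if v > LIMITS[k]]

# A 2 PB telemetry archive doing 30K IOPS fits in a single account...
print(fits_in_one_account(2, 30_000, 20, 25))    # []
# ...while an 8 PB archive would need sharding or a support request.
print(fits_in_one_account(8, 30_000, 20, 25))    # ['capacity_pb']
```

Workloads that exceed one or more limits can either be split across accounts or, as noted above, raised further via a support request.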

Frequently asked questions

1. What if I need higher ingress or egress, or capacity limits for standard General Purpose accounts?

If you have an immediate need for more capacity, ingress, or egress for General Purpose storage accounts, you can submit a request to Azure Support to raise the limits; each request will be evaluated and responded to individually.

2. Do the new limits apply only to Blob Storage?

The new limits specified on Blob storage accounts apply only to Blob Storage. The increase to General Purpose storage accounts does, however, apply to all services at the account level. Other service-specific limits such as File share sizes and Queue TPS/IOPS will continue to apply.

3. Will I be able to apply the new limits for Disk traffic?

We recommend that you use Managed Disks, which already handle disk scalability for your VMs.

4. What are the differences between Azure Blob Storage and Azure Data Lake Store?

For a comparison of Azure Blob Storage and Azure Data Lake Storage, please visit the Comparing Azure Data Lake Store and Azure Blob Storage article.

For further details on scalability targets, please visit our Azure Storage scalability documentation. We are excited to help you build and scale out your applications. Please let us know how we can further assist.
Source: Azure

Servicing Azure Stack using the Update resource provider

In today’s world, security is paramount. Microsoft is committed to ensuring your Azure Stack environment stays both secure and functional – as it delivers consistency to build and deploy applications using the same approach, APIs, DevOps tools, and portal, as you use for Azure.

Azure Stack operators must be able to safely and reliably update their Azure Stack infrastructure, while at the same time, provide highly-available, mission-critical services to their customers. Updates can range in scope from software to firmware, across core components of the system. The update process must be easy and predictable, allowing customers to focus on other aspects of their business.

Challenges

How do Azure Stack operators determine where to download updates, how to apply them, what order to apply them in, whom to call if there is a problem, how to ensure minimal disruption, or how long maintenance windows will last? Enter the Update resource provider, an integral part of Azure Stack.

What we built

Azure Stack has a built-in, dependency-aware orchestration engine that allows Azure Stack operators to import, run, and monitor updates for Azure Stack. No additional tooling, internet connectivity, or integration is required. Operators simply download the updates for Azure Stack, then import and run them using the Update tile in the Administrator portal during a pre-defined maintenance window. The fully native Update resource provider will ensure updates are applied across all physical hosts, Service Fabric applications and runtimes, as well as all infrastructure roles.

Using the Update tile is easy, and managing updates from the administrator portal is a simple process. Operators navigate to the Updates tile to:

View important information, such as the current cloud version
Install available updates
Review update history for previously installed updates
View the cloud’s current OEM package version

As you can see below, an Azure Stack operator has downloaded and imported an update into Azure Stack that has been processed by the Update resource provider and is ready to be installed.

As updates are installed, an operator can easily view high-level status as the update process iterates through various subsystems in Azure Stack. Example subsystems include physical hosts, Service Fabric, infrastructure virtual machines, and services that provide both the administrator and user portals. High-level logging can be easily viewed during the update process using the “Download full logs” button from the Update run details blade.

Throughout the update process, the Update resource provider will report back to the operator additional details about the update, such as the number of steps that have succeeded, as well as the number in progress.

Once completed, the Update resource provider provides a “Succeeded” confirmation to the operator, informing them that the update process has completed and how long it took. From there, operators can view information about all updates, available updates, or installed updates using the filter as seen below. Should an update fail to apply, it will report as “Needs attention” and will, in most cases, require a support ticket to be initiated. Use the “Download full logs” button as described above to get a high-level view of where the update failed. In most cases, Azure Stack log collection will help facilitate diagnostics and troubleshooting.

Looking ahead

Customers can expect updates for Azure Stack to release at least monthly – and while Microsoft strongly encourages installing the updates as soon as possible, we understand there may be circumstances where updates are unable to be installed. In this case, customers can defer updates for up to three months to stay within our support boundaries. Please note, the updates for Azure Stack are non-cumulative and must be installed sequentially, so plan on extended maintenance windows under these circumstances. 

Firmware updates, provided by the OEM, are applied outside the Update resource provider. As with software updates provided by Microsoft, maintenance windows are strongly recommended. Contact your Azure Stack OEM for more information on firmware updates for Microsoft Azure Stack.

More information

For more information about managing updates in Azure Stack, see the Manage updates in Azure Stack section of our online documentation. Also if you are heading to Microsoft Ignite, drop by the Updating and Servicing Microsoft Azure Stack session (THR3007R2). For more information on our support policy, visit the Azure Stack product lifecycle policy page.
Source: Azure

Extending Microsoft Azure IP Advantage to China

This blog post was authored by Erich Andersen, Corporate Vice President and Chief IP Counsel, Microsoft Intellectual Property. 

Cloud-fueled digital transformation enables companies around the world to create new products and services, and engage with their customers at an unprecedented pace and scale. As they become digital businesses, companies need to address legal challenges which come with participating in the digital economy. Microsoft has developed strategies and assets to manage the intellectual property infringement risks that come with digital transformation. As our customers and partners become digital businesses, we are using our IP expertise and patent portfolio to help our customers protect their innovations in the cloud and focus on developing their business to succeed in their transformation.

Today, we are announcing that Microsoft Azure IP Advantage will be available in China beginning October 1, 2017, ensuring that Azure customers in China can enjoy the same great IP protection benefits as customers in the rest of the world. 

We have had a tremendous response to the program since we launched it last February. Customers recognize that uncapped indemnification coverage, including for open source software that powers Azure experiences, access to 10,000 Microsoft patents, and the springing license right are valuable benefits that help them manage IP risk.

Many customers tell us that the patent pick benefit alone serves as a significant deterrent against patent assertions and that the breadth of our indemnification pledge is unmatched by competitors. ISVs building on Azure are excited by the ability to access 10,000 Microsoft patents to complement their own patent portfolio. TechInsights confirms that, “Microsoft Azure IP Advantage outranks competitors Oracle, Google, Amazon and VMware’s portfolios.” None of Microsoft’s Azure competitors offer a similar package of offerings. The fact that these tools are available for free to eligible Azure customers makes it all the more compelling.

Extending these benefits to China aligns well with Microsoft’s approach to delivering cloud services on a truly global scale. Azure has 42 regions around the world and that number is growing. In China, Microsoft has partnered with 21Vianet to deliver Microsoft Azure services to our customers since March 2014. No other cloud service provider can match the Azure global data center footprint, and many of them are just getting started in China while Microsoft has already been in market for several years. Beyond the public cloud, customers can leverage Azure Stack to use Azure services in their private data centers or in markets where Azure public cloud is not available yet, all through a consistent set of services and APIs.

The benefit of Azure IP Advantage is obvious. A recent study by IPlytics has shown that patent assertion entities have increased their stockpile of cloud computing patents by 130% since 2011. Worse, cloud-related patent litigation in the US has grown by 700% since 2012. We can see these trends taking hold in China as well where patent litigation has increased 158% between 2011 and 2016. Patent filings in China have surpassed the US since 2015.

We’re pleased to be supported in our Azure IP Advantage launch in China by valued customers. MoBike, the world’s largest bicycle-sharing company, headquartered in Beijing, is using the Azure platform to rapidly expand its business outside of China to Manchester in the UK and other cities worldwide. Azure IP Advantage protections follow MoBike in its international expansion.

Azure IP Advantage is already available outside of China. With this announcement, customers can rely on Azure IP Advantage protections anywhere they deploy their SaaS applications.
Source: Azure

More and more fun with Terraform on Azure

Just one month ago, we announced our increased investment in Terraform. It is amazing to see the progress we have already made together with HashiCorp and the Terraform community. In the last month alone, we added support for Azure Container Instances and Azure Event Grid to the Terraform provider. Today at HashiConf, I announced native Terraform support built-in to the Azure Cloud Shell. I also announced 8 verified Azure Modules as part of the Terraform Module Registry launch. Now is a great time for you to try Terraform on Azure.

Terraform Module Registry

HashiCorp just announced their Terraform Module Registry, allowing users to generate Terraform modules that represent infrastructure topologies that can then be expressed on the cloud platform of their choice. I am pleased to announce that there are 8 Azure Modules available in the Terraform Registry at launch, including Load Balancer, Virtual Network, Virtual Machine Scale Sets, Virtual Machines, Azure SQL, Consul, and Vault.

As I demonstrated on stage, with these modules, you should be able to deploy complex topologies in an easy way on Azure.

Terraform in the Azure Cloud Shell

We want to make it incredibly easy for you to get started with Terraform on Azure. Today, I also announced that Terraform is available to every Azure user directly in the Azure Portal via Azure Cloud Shell.

The Azure Cloud Shell is a browser-based command-line experience that enables bash commands directly in the portal. This shell can run on any machine and any browser. It even runs on your phone, enabling provisioning using Terraform from anywhere your phone can go.

With the shell, any Azure user can start using Terraform in the portal. You have nothing to install. You have nothing to configure. We even authenticate Terraform to your subscription for you!

I am really excited about the progress we’re making with HashiCorp and the Terraform community. Go ahead, try out Terraform in Azure and the new Terraform Module Registry. Tell us what you think! I hope you find these improvements helpful to deploy your services and solutions.

See ya around,

Corey
Source: Azure

Simplifying OPC UA security for everyone

At the IoT Expo in Taipei, we were excited to announce our contribution of an open-source, cross-platform OPC UA Global Discovery Server (GDS) to the OPC Foundation. As we did with our UA-.Net Standard cross-platform reference stack contribution, we will check it in to the OPC Foundation’s GitHub in the next couple of weeks. While an OPC UA GDS also manages OPC UA server configuration and handles centralized discovery, the greatest value of a GDS deployment is its certificate management capability, which is described here.

The most important aspect of the digital factory and other connected industrial infrastructure is security. A defense-in-depth security approach is needed on premises, as the air gap traditionally used to protect the Operational Technology infrastructure (i.e. the factory floor) from the Information Technology infrastructure (i.e. the back office and public Internet) was proven insufficient over 7 years ago. For example, Stuxnet managed to “jump” the air gap by infecting the laptops of engineers working in the factory, who hand-carried the virus on premises. Defense-in-depth means that each machine on the factory floor handles its own security and doesn’t rely on a perimeter security concept alone.

Until now, no open-source GDS reference implementations have been available to the public. Given this limitation, it is not surprising that the majority of factory operators turn off security (i.e. authentication and encryption) on their machines altogether, or rely on a complicated and time-intensive manual exchange of self-signed OPC UA certificates (one per machine/server and one per connecting client). To make this process easier to manage, operators also use insecure locations to store certificates, such as file shares and USB keys. Furthermore, self-signed certificates are not only management-intensive; they also force the factory operator to make trust decisions based on hard-to-understand information located in the certificate, information which can easily be spoofed, since a self-signed certificate cannot be independently validated. Self-signed certificates are therefore not recommended for establishing trust and should not be used. Certificate Authority (CA)-signed certificates as provided by a GDS, on the other hand, can be validated via the certificate “chain” leading back to the root CA, and the manual exchange of certificates is eliminated, since all certificates signed by a CA are trusted by any application that trusts that CA. A GDS can also handle the automatic installation of a CA-signed certificate on the machine.
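To make the difference concrete, here is a toy model of why a CA-signed certificate can be validated while a self-signed one cannot: trust is established by walking the issuer chain back to a trusted root. This is an illustration only; real OPC UA certificate validation performs X.509 signature checks, and all names below are invented.

```python
# Toy model of certificate chain validation. Real validation checks
# cryptographic signatures; here we only model the issuer chain walk.

class Cert:
    def __init__(self, subject, issuer):
        self.subject = subject
        self.issuer = issuer   # issuer == subject means self-signed

def chain_to_trusted_root(cert, cert_store, trusted_roots):
    """Walk the issuer chain; True iff it ends at a trusted root CA."""
    seen = set()
    while cert.subject not in seen:
        seen.add(cert.subject)
        if cert.subject in trusted_roots:
            return True
        if cert.issuer == cert.subject:    # self-signed, not a trusted root
            return False
        cert = cert_store.get(cert.issuer)
        if cert is None:                   # chain is incomplete
            return False
    return False                           # cycle detected

# Hypothetical factory PKI: root CA -> GDS CA -> machine certificate.
root = Cert("FactoryRootCA", "FactoryRootCA")
intermediate = Cert("GDS-CA", "FactoryRootCA")
machine = Cert("press-line-07", "GDS-CA")
store = {c.subject: c for c in (root, intermediate, machine)}

print(chain_to_trusted_root(machine, store, {"FactoryRootCA"}))                  # True
print(chain_to_trusted_root(Cert("rogue", "rogue"), store, {"FactoryRootCA"}))   # False
```

The self-signed "rogue" certificate fails immediately because there is no independent chain to follow, which is exactly the trust decision a GDS-issued CA chain removes from the operator's shoulders.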

Now, we realize that not everyone will be able to download, compile and run a GDS reference implementation from GitHub. We have therefore decided that we will additionally offer an Azure IoT Edge-based GDS, integrated with our upcoming Azure IoT Hub Device Provisioning Service. This fully cloud-managed GDS will also be available open-source on GitHub and as a Docker container on Docker Hub and will be the first truly global GDS, containing data from a customer’s worldwide industrial OPC UA-enabled machine deployments.

As you can see, we continue to invest in making the factory of the future more secure by simplifying and supporting the leading open industrial interoperability standard OPC UA.
Source: Azure

Ask us anything about the new Azure Log Analytics language

Join our first AMA session on Thursday, September 21, 2017 from 9:00 AM-10:00 AM Pacific Time. Add the event to your calendar!

Last month, we announced a new query language for Azure Log Analytics, offering advanced search and analytics capabilities, a straightforward syntax, and a variety of new features. These features include joins, search-time calculated fields, rich date-time and string manipulation, machine-learning operators, and much more. We’ve also held a webinar, which reviewed the short upgrade process and the new experiences we offer based on the new language.

We’ve since seen a lot of users upgrade their workspaces and get familiar with the language, through different channels:

The Advanced Analytics playground: A free analytics environment that already includes demo data, and is open to anyone who wants to play around with the new language and portal. It also offers some basic examples to get started.
The new language site: Everything you need to know about the new language – language references, cheat-sheets for users that are already familiar with SQL or the Log Analytics legacy language, videos, tutorials and guides for writing queries and using the Analytic portal, and lots of examples!
An open GitHub repo for shared examples: You are invited to share your own examples with us! We will publish them on the language site as well.
Finally – where the real action happens – our brand new community space: This is the place to post and answer questions, check out our announcement, and stay in touch.

Today, we’re excited to invite you to a live AMA (Ask Microsoft Anything) session we'll hold on Thursday, September 21, 2017 from 9:00 AM-10:00 AM Pacific Time – an hour of live Q&A with the product team! This is your opportunity to ask us anything about the new language features, and our opportunity to hear what you want.

To join the AMA session, first join Microsoft Tech Community and familiarize yourself with the Azure Log Analytics space. The AMA session will take place in the Azure AMA space.

See you there!
Source: Azure

EDNS Client Subnet support in Azure Traffic Manager

Over the past few months, we announced the support for Geographic Traffic Routing, Fast failover, and TCP probing using Azure Traffic Manager. It is our constant endeavor to add new capabilities that add value to our customers. Today, we are excited to announce the support for EDNS Client Subnet (ECS) in Azure Traffic Manager.

When customers choose to use Performance or Geographic routing methods with Azure Traffic Manager, the routing decision made depends on the origin of the Domain Name System (DNS) request. Azure Traffic Manager determines the request origin region by inspecting the source IP address of the query, which in most cases will be the IP address of the local DNS resolver that does the recursive DNS lookup on behalf of the end user.

While this is a good proxy for the location of the end user, there are many cases where a user can be using a resolver outside of their geographical location. This results in our query response not being optimized.

With the support for ECS, Azure Traffic Manager will use this information, if it is passed by the DNS resolver proxying the query, to make routing decisions. This will result in increased accuracy when Performance routing method is used and increased correctness of geographic location identification if Geographic routing method is used.

Specifically, this feature provides support for RFC 7871 – Client Subnet in DNS Queries that provides an Extension Mechanism for DNS (EDNS0) which can pass on the client subnet address to resolvers.
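The ECS option itself is a small, well-defined structure. As a sketch of what resolvers actually pass along, the following encodes and decodes an EDNS Client Subnet option for an IPv4 subnet following the RFC 7871 wire format (option code 8; family, source and scope prefix lengths, then only the prefix octets of the address). This is the option body only, not a full DNS message.

```python
import struct

ECS_OPTION_CODE = 8  # EDNS Client Subnet, per RFC 7871

def encode_ecs(ipv4, source_prefix, scope_prefix=0):
    """Encode an EDNS Client Subnet option (IPv4) as wire bytes."""
    addr = bytes(int(o) for o in ipv4.split("."))
    n = (source_prefix + 7) // 8           # only the prefix octets are sent
    body = struct.pack("!HBB", 1, source_prefix, scope_prefix) + addr[:n]
    return struct.pack("!HH", ECS_OPTION_CODE, len(body)) + body

def decode_ecs(wire):
    """Decode the option back to (ipv4, source_prefix, scope_prefix)."""
    code, length = struct.unpack("!HH", wire[:4])
    assert code == ECS_OPTION_CODE
    family, source, scope = struct.unpack("!HBB", wire[4:8])
    assert family == 1                     # address family 1 = IPv4
    addr = wire[8:8 + length - 4] + b"\x00" * (4 - (length - 4))
    return ".".join(str(b) for b in addr), source, scope

wire = encode_ecs("203.0.113.0", 24)
print(decode_ecs(wire))  # ('203.0.113.0', 24, 0)
```

A resolver that forwards a /24 client subnet like this gives the authoritative service (here, Traffic Manager) a far better location signal than the resolver's own source IP.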

There is no customer action needed to enable this feature and it is available in all the Azure clouds. All your end user queries with ECS information are already benefitting from this new capability from Azure Traffic Manager!
Source: Azure

Microsoft Azure announces new capabilities and partnerships at IBC 2017

Over the past months, we have aggressively added innovative new capabilities in Azure to enable content owners and partners to prepare, store, protect, distribute and monetize media in the cloud. We are thrilled to see media & cable organizations choosing Microsoft Azure for their digital transformation needs. Whether helping organizations move their focus away from infrastructure to content, distributing content with digital era velocity, or providing reliable and relevant content, Microsoft Azure is the trusted and global-scale cloud for the media industry’s needs.

Azure Media Services

In response to enthusiastic customer feedback, we have added the following new capabilities to Azure Media Services.

Democratizing AI for the media industry: Video Indexer, a powerful new capability that brings AI to life for the media industry, was introduced a few months back at BUILD 2017 as a unique integrated bundling of Microsoft's cloud-based artificial intelligence and cognitive capabilities applied specifically to video content. It brings capabilities like face detection, sentiment extraction, object detection, audio transcripts, and more. Enhancements to this service include support for the Egyptian Arabic language for speech analysis, playback at multiple speeds, and many more.
Azure Media Redactor is a powerful cloud video processing service leveraging world-class artificial intelligence, capable of automatically detecting and blurring faces in your videos for use cases such as public safety and news media. This is a new addition to the broad set of existing capabilities of Azure Media Analytics.
HEVC Encoding support: The improved encoding efficiency of HEVC helps video service operators deliver higher resolution and higher quality video to their viewers, while saving on storage and CDN costs. Apple’s iOS11 release will support this codec, which will ensure a large addressable market for operators who adopt this codec. HEVC encoding is now generally available in Azure Media Services via the Premium Encoder. Further, HEVC encoding is priced the same as H.264 – we believe this to be critical to the success of our customers.

Azure CDN

The following new capabilities to optimize performance for media streaming and dynamic content were added since NAB for Azure CDN.

Media streaming optimized delivery with Azure CDN: We have added the ability to easily optimize the delivery of streaming media. With a single click, media streaming optimizations are automatically applied. This results in fast and efficient delivery of media assets to users and optimized requests to origins.

Dynamic content acceleration: Azure CDN now supports Dynamic Site Acceleration (DSA) to optimize the delivery of dynamically served content. When DSA is enabled, the performance of web pages with dynamic content is significantly improved as a result of route/network optimizations, resource prefetching, and smart image compression.

One-click integration with Storage and Web Apps: In line with our goal of simplifying CDN and making it transparent to our users, we have enabled one-click CDN enablement for assets in Azure Storage, as well as simple one-click enablement of CDN for websites and web apps built on Azure.

Azure Storage

Azure Storage has made substantial improvements in 2017 to address the needs of Media & Entertainment customers. This includes:

Azure Archive Blob Storage – Azure Archive Blob storage is designed to provide Media organizations with a low cost means of delivering durable, highly available, secure cloud storage for rarely accessed data with flexible latency requirements. This is in addition to the existing Hot & Cool tiers which provide instantaneous high throughput object storage.
Blob-Level Tiering – To simplify data lifecycle management, we now allow customers to tier their data at the blob level.  Customers can easily change the access tier of a blob among the Hot, Cool, or Archive tiers as usage patterns change, without having to move data between accounts. Blobs in all three access tiers can co-exist within the same account.
Larger Object Sizes – Azure Blob Storage now supports up to 5TB objects to address the needs of 4K and 8K media.
Higher performance – Azure Blob Storage now supports significantly higher throughput for object reads, up to gigabytes per second for a single object to support post production, transcoding and other workloads. Write throughput has also improved to support large media ingestion including Media Archival.
Encryption at rest (including customer managed keys)– Azure Blob Storage has support for Encryption at rest to meet the security needs of our media customers. Azure is the only public cloud compliant with the MPAA certification. This summer, we launched a preview for customer managed keys, which allows customers to further enhance security by encrypting the data with their own keys.
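To make blob-level tiering concrete, here is an illustrative lifecycle policy that picks a target access tier from how long ago a blob was last read. The thresholds are made up for the example; real accounts set tiers explicitly per blob, and this is not an Azure default.

```python
from datetime import datetime, timedelta

# Illustrative tiering policy: choose Hot/Cool/Archive from the blob's
# last-access age. Threshold values here are hypothetical.
def target_tier(last_access, now, cool_after_days=30, archive_after_days=180):
    age = now - last_access
    if age >= timedelta(days=archive_after_days):
        return "Archive"   # rarely accessed, flexible latency requirements
    if age >= timedelta(days=cool_after_days):
        return "Cool"
    return "Hot"

now = datetime(2017, 9, 20)
print(target_tier(datetime(2017, 9, 1), now))   # Hot
print(target_tier(datetime(2017, 7, 1), now))   # Cool
print(target_tier(datetime(2017, 1, 1), now))   # Archive
```

Because all three tiers co-exist within the same account, a policy like this can move data through its lifecycle without ever copying blobs between accounts.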

Growing Partner Ecosystem

At IBC this year, we are excited to announce that our partner ecosystem is growing at a rapid pace, enabling our customers with solutions that span the length and breadth of digital workflows.

Earlier this year, we announced that Avid has selected Microsoft Azure as their preferred partner to power their business in the cloud. Visitors to the Microsoft booth at IBC can see a demonstration showcasing an integrated Microsoft Azure and Avid offering.

In late July, it was announced that Ooyala’s Flex platform, which runs on Microsoft Azure, will now be utilized by Zone TV for its end-to-end workflow needs. Also, other notable customers who are driving value from Ooyala's solution are National Rugby League (NRL), SiriusXM and Zoomin.TV (Netherlands). We are pleased to announce that visitors to Hall 15 at IBC can also see a joint demo between Ooyala’s Flex and Azure Media Services.

We are also pleased to announce our new partners – Amagi and axle Video. Amagi, which provides cloud-managed broadcast services and targeted advertising solutions, announced that their CLOUDPORT product, a cloud playout platform, can now be deployed on Microsoft Azure. axle Video, which provides a media management solution, announced a deep integration with Microsoft Video Indexer to enhance their axle ai product.

We have also expanded the partnership with LiveArena as it launches LiveArena Broadcast Room, an end-to-end service allowing production and broadcast of Live and on-demand TV to any device, built entirely on Microsoft Azure.

Additionally, Levels Beyond, maker of Reach Engine, has announced the release of Reach Engine Media Services, which is built on Microsoft Azure and integrates with Microsoft Azure Storage. XenData, a provider of high-capacity data storage solutions, announced the launch of its managed hybrid cloud service by integrating with Microsoft Azure Storage.

Stay tuned for more Azure blog posts that will dive deeper into these announcements.

We’re continually innovating and forming new partnerships when it comes to enabling media workflows on the cloud, and invite you to take advantage by building on Microsoft Azure.

Come see us at IBC 2017

Learn more about Azure Media Services and Azure Storage, and visit us at our IBC Booth in Hall 15 to meet the Azure Media Services and Azure Storage team and see these services in action. We would also like to invite you to the Microsoft Hall to speak with our partners – Amagi, GameOn Technology, GrayMeta, IPV Limited, LiveArena, Make.TV, Oceaneering, Ooyala, Ownzones, StreamingBuzz, Veritone Media, and x.news information technology.

If you are not attending the conference but would like to learn more about our services for the media industry, follow the Azure Blog to stay up to date on new announcements.
Source: Azure

Keep credentials out of code: Introducing Azure AD Managed Service Identity

A common challenge in cloud development is managing the credentials used to authenticate to cloud services. Today, I am happy to announce the Azure Active Directory Managed Service Identity (MSI) preview. MSI gives your code an automatically managed identity for authenticating to Azure services, so that you can keep credentials out of your code.

What is Managed Service Identity and how do I use it?

Your code needs credentials to authenticate to cloud services, but you want to limit the visibility of those credentials as much as possible. Ideally, they never appear on a developer’s workstation or get checked in to source control. Azure Key Vault can store credentials securely so they aren’t in your code, but to retrieve them you need to authenticate to Azure Key Vault. To authenticate to Key Vault, you need a credential! A classic bootstrap problem. Through the magic of Azure and Azure AD, MSI provides a “bootstrap identity” that makes it much simpler to get things started.

Here’s how it works! When you enable MSI for an Azure service such as Virtual Machines, App Service, or Functions, Azure creates a Service Principal for the instance of the service in Azure AD, and injects the credentials (client ID and certificate) for the Service Principal into the instance of the service. Next:

Your code calls a local MSI endpoint to get an access token
MSI uses the locally injected credentials to get an access token from Azure AD
Your code uses this access token to authenticate to an Azure service

And that’s it! The access token can be used directly with a service that supports Azure AD authentication, such as Azure Resource Manager. If you need to authenticate to a service that doesn’t natively support Azure AD, you can use the token to authenticate to Key Vault and retrieve credentials from there. Azure and Azure AD take care of rolling the Service Principal’s credentials. Your code and your developers will never see or manage them.
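As a sketch of the first step of that flow, the code below constructs the request your code would send to the local MSI endpoint. The endpoint address, port, and header shown are illustrative placeholders (they vary by service and preview version; consult the tutorials for the exact values), and no network call is made here.

```python
from urllib.parse import urlencode

# Build the token request for a local MSI endpoint. The endpoint URL is
# a hypothetical placeholder; the real address depends on the service.
def build_msi_token_request(resource,
                            endpoint="http://localhost:50342/oauth2/token"):
    """Return (url, headers) for requesting an access token for `resource`."""
    query = urlencode({"resource": resource})
    headers = {"Metadata": "true"}  # marks the call as intentionally local
    return f"{endpoint}?{query}", headers

url, headers = build_msi_token_request("https://management.azure.com/")
print(url)
print(headers)  # {'Metadata': 'true'}
```

Your code would issue a GET to that URL with those headers, receive a JSON payload containing the access token, and present the token to the target Azure service; the credentials themselves never appear in your code.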
Try it

Today we are announcing previews of Managed Service Identity for:

Azure Virtual Machines (Windows)
Azure Virtual Machines (Linux)
Azure App Service
Azure Functions

Click the links to try a tutorial! Managed Service Identity is a feature of Azure AD Free, which comes with every Azure subscription. There is no additional charge for using Managed Service Identity. We would love to hear from you! You can ask how-to questions on Stack Overflow using the tag “azure-msi”, or post feature feedback or suggestions to the Azure AD developer feedback forum.
Source: Azure