Announcing new AMD EPYC™-based Azure Virtual Machines

Microsoft is committed to giving our customers industry-leading performance for all their workloads. After becoming the first global cloud provider to announce the deployment of AMD EPYC™-based Azure Virtual Machines in 2017, we’ve continued working closely with AMD to bring the latest innovation to enterprises.

Today, we are announcing our second-generation HB-series Azure Virtual Machines, HBv2, which feature the latest AMD EPYC™ 7002-series processors. Customers will be able to increase HPC performance and scalability to run materially larger workloads on Azure. We’ll also be bringing the AMD EPYC 7002-series processors and Radeon Instinct GPUs to our family of cloud-based virtual desktops. Finally, our new Da_v3 and Ea_v3-series Azure Virtual Machines, in preview today, provide more customer choice to meet a broad range of requirements for general purpose workloads using the new AMD EPYC™ 7452 processor.

Our growing Azure HPC offerings

Customers are choosing our Azure HPC offerings (HB-series), built on first-generation AMD EPYC™ (“Naples”) processors, for their performance and scalability. We’ve seen a 33 percent memory bandwidth advantage with EPYC, and that’s a key factor for many of our customers’ HPC workloads. For example, fluid dynamics is one workload in which this advantage is especially valuable, and Azure has an increasing number of customers for whom it is a core part of their R&D and even production activities. On ANSYS Fluent, a widely used fluid dynamics application, we have measured EPYC-powered HB instances delivering a 54x performance improvement by scaling across nearly 6,000 processor cores. That result is 24 percent faster than a leading bare-metal solution with an identical InfiniBand network. Additionally, earlier this year, Azure became the first cloud to scale a tightly coupled HPC application to 10,000 cores, 10x more than had previously been possible on any other cloud. Azure customers will be among the first to take advantage of this capability to tackle the toughest challenges and innovate with purpose.

New HPC, general purpose, and memory optimized Azure Virtual Machines

Azure is continuing to increase its HPC capabilities, thanks in part to our collaboration with AMD. In preliminary benchmarking, HBv2 VMs featuring 120 CPU cores from the second-generation EPYC processor are demonstrating performance gains of over 100 percent on HPC workloads like fluid dynamics and automotive crash test analysis. HBv2 scalability limits are also increasing with the cloud’s first deployment of 200 gigabit InfiniBand, made possible by the second-generation EPYC processor’s PCIe 4.0 capability. HBv2 virtual machines (VMs) will support up to 36,000 cores for MPI workloads in a single virtual machine scale set, and up to 80,000 cores for our largest customers.
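For readers who want to try HBv2 when it becomes available, a minimal Azure CLI sketch for provisioning a single HB-series node follows. The Standard_HB120rs_v2 size name and the OpenLogic CentOS-HPC image URN are assumptions based on current HB-series conventions and may differ at launch.

# Sketch: provision an HBv2 node (size name and image URN are assumptions)
az group create --name HpcRG --location southcentralus
az vm create \
  --resource-group HpcRG \
  --name hbv2-node01 \
  --size Standard_HB120rs_v2 \
  --image OpenLogic:CentOS-HPC:7.6:latest \
  --admin-username azureuser \
  --generate-ssh-keys

For MPI jobs at scale, you would place such VMs in a virtual machine scale set rather than creating them one at a time, which is how the 36,000-core limit above is reached.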

We’ll also be bringing the AMD EPYC 7002-series processors to our family of cloud-based remote desktops, paired with the AMD Radeon Instinct MI25 GPU for customers running Windows-based environments. The new series offers unprecedented GPU resourcing flexibility, giving customers more choice than ever before to size virtual machines all the way from 1/8th of a single GPU up to a whole GPU.

Finally, we are also announcing new Azure Virtual Machines in the Da_v3 and Ea_v3-series, optimized for general purpose and memory intensive workloads respectively. These new VM sizes feature AMD’s EPYC™ 7452 processor. The new general purpose Da_v3 and Das_v3 Azure Virtual Machines provide up to 64 vCPUs, 256 GiB of RAM, and 1,600 GiB of SSD-based temporary storage. Additionally, the new memory optimized Ea_v3 and Eas_v3 Azure Virtual Machines provide up to 64 vCPUs, 432 GiB of RAM, and 1,600 GiB of SSD-based temporary storage. Both VM series support Premium SSD disk storage. The new VMs are currently in preview in the East US Azure region, with availability coming soon to other regions.
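As a rough sketch of what provisioning one of the new general purpose VMs might look like from the Azure CLI once you have preview access, see below. The Standard_D64a_v3 size name is an assumption inferred from the series naming, not a confirmed identifier.

# Sketch: create a Da_v3-series VM (size name assumed from the series naming;
# preview access must be requested first via the links below)
az vm create \
  --resource-group MyRG \
  --name epyc-gp01 \
  --size Standard_D64a_v3 \
  --image Win2019Datacenter \
  --admin-username azureuser \
  --admin-password "<strong-password>"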

Da_v3 and Das_v3 virtual machines can be used for a broad range of general-purpose applications. Example use cases include most enterprise-grade applications, relational databases, in-memory caching, and analytics. Applications that demand faster CPUs, better local disk performance, or more memory can also benefit from these new VMs. Additionally, the Ea_v3 and Eas_v3 VM series are optimized for large in-memory, business-critical workloads.

Taking advantage of these new offerings

Request access to the latest HPC and Remote Desktop virtual machines.
Request access to the new general purpose and memory intensive Azure Virtual Machines. 

Source: Azure

Better security with enhanced access control experience in Azure Files

We are making it easier for customers to “lift and shift” applications to the cloud while maintaining the same security model used on-premises with the general availability of Azure Active Directory Domain Services (Azure AD DS) authentication for Azure Files. With this integration, you can mount your Azure file share over SMB using Azure Active Directory (Azure AD) credentials from Azure AD DS domain-joined Windows virtual machines (VMs), with NTFS access control lists (ACLs) enforced.
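As a minimal sketch, mounting a share from an Azure AD DS domain-joined Windows VM looks like any other SMB mount; the signed-in user’s Kerberos ticket is used, so no storage account key is required. The storage account and share names below are placeholders.

# Run from a domain-joined Windows VM; authentication uses the signed-in
# user's Azure AD DS credentials rather than the storage account key
net use Z: \\mystorageaccount.file.core.windows.net\myshare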

Azure AD DS authentication for Azure Files allows users to specify granular permissions on shares, files, and folders. It unblocks common use cases like single-writer, multi-reader scenarios for your line-of-business applications. Because the file permission assignment and enforcement experience matches that of NTFS, lifting and shifting your application into Azure is as easy as moving it to a new SMB file server. This also makes Azure Files an ideal shared storage solution for cloud-based services. For example, Windows Virtual Desktop recommends using Azure Files to host user profiles and leveraging Azure AD DS authentication for access control.

Since Azure Files strictly enforces NTFS discretionary access control lists (DACLs), you can use familiar tools like Robocopy to move data into an Azure file share while preserving all of your important security controls. Azure Files access control lists are also captured in Azure file share snapshots for backup and disaster recovery scenarios. This ensures that file access control lists are preserved on data recovery using services like Azure Backup that leverage file share snapshots.
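For illustration, here is a hedged Robocopy sketch that mirrors a local folder into a mounted Azure file share while carrying the NTFS security descriptors along; the paths are placeholders.

# /MIR mirrors the directory tree; /COPY:DATS copies Data, Attributes,
# Timestamps, and the NTFS Security descriptor (the ACLs)
robocopy C:\LOBAppData Z:\LOBAppData /MIR /COPY:DATS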

Follow the step-by-step guidance to get started today. To better understand the benefits and capabilities, you can refer to our overview of Azure AD DS authentication for Azure Files.

What’s new in general availability?

Based on your feedback, there are several new features to share since the preview:

Seamless integration with Windows File Explorer for permission assignments: When we demoed this feature at Microsoft Ignite 2018, we showed viewing and changing permissions with a Windows command-line tool called icacls. There were clearly some challenges, since icacls is not easily discoverable or consistent with common user behavior. Starting with general availability, you can view or modify the permissions on a file or folder with Windows File Explorer, just like on any regular file share.
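For comparison, the icacls equivalent of what File Explorer now exposes through its Security tab looks like the sketch below; the drive letter and account name are placeholders.

# View the ACL on a folder in a mounted Azure file share
icacls Z:\shared\reports
# Grant a domain user Modify rights, inherited by subfolders and files
icacls Z:\shared\reports /grant "CONTOSO\jsmith:(OI)(CI)M"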

New built-in role-based access controls to simplify share-level access management: We have introduced three new built-in roles—Storage File Data SMB Share Elevated Contributor, Storage File Data SMB Share Contributor, and Storage File Data SMB Share Reader. Instead of creating custom roles, you can use these built-in roles to grant share-level permissions for SMB access to Azure Files.
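A minimal Azure CLI sketch of assigning one of these built-in roles at the share scope follows; the subscription ID, names, and scope path are placeholders, and the scope format shown is an assumption based on the documented pattern.

az role assignment create \
  --role "Storage File Data SMB Share Contributor" \
  --assignee user@contoso.com \
  --scope "/subscriptions/<subscription-id>/resourceGroups/MyRG/providers/Microsoft.Storage/storageAccounts/mystorageaccount/fileServices/default/fileshares/myshare"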

What’s next for the Azure Files access control experience?

Supporting authentication with Azure Active Directory Domain Services is most useful for application lift and shift scenarios, but Azure Files can help with moving all on-premises file shares, regardless of whether they are providing storage for an application or for end users. Our team is working to extend authentication support to Windows Server Active Directory hosted on-premises or in the cloud.

If you are interested in hearing future updates on Azure Files Active Directory authentication, sign up today. For general feedback on Azure Files, email us at AzureFiles@microsoft.com.
Source: Azure

Disaster recovery of Azure disk encryption (V2) enabled virtual machines

Choosing Azure for your applications and services allows you to take advantage of a wide array of security tools and capabilities that help make it possible to create secure solutions on Azure. Among these capabilities is Azure Disk Encryption, designed to help protect and safeguard your data to meet your organizational security and compliance commitments. It uses the industry-standard BitLocker Drive Encryption for Windows and DM-Crypt for Linux to provide volume encryption for OS and data disks. The solution is integrated with Azure Key Vault to help you control and manage disk encryption keys and secrets, and it ensures that all data on virtual machine (VM) disks is encrypted both in transit and at rest in Azure Storage.
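As a sketch, enabling Azure Disk Encryption on a running VM from the Azure CLI looks like the following; resource names are placeholders. Note that with the current (V2) flow, no Azure AD application parameters are required.

# Encrypt OS and data volumes; keys and secrets are managed in the Key Vault
az vm encryption enable \
  --resource-group MyRG \
  --name MyVM \
  --disk-encryption-keyvault MyKeyVault \
  --volume-type All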

Beyond securing your applications, it is important to have a disaster recovery plan in place to keep your mission critical applications up and running when planned and unplanned outages occur. Azure Site Recovery helps orchestrate replication, failover, and recovery of applications running on Azure Virtual Machines so that they are available from a secondary region if you have any outages in the primary region.

Azure Site Recovery now supports disaster recovery of virtual machines with Azure Disk Encryption (V2) enabled, without requiring an Azure Active Directory application. While enabling replication of your VM for disaster recovery, all the required disk encryption keys and secrets are copied from the source region to the target region in the user’s context. If the user managing disaster recovery does not have the appropriate permissions, they can hand a ready-to-use script to the security administrator to copy the keys and secrets, and then proceed with configuration.

This feature currently supports only Windows VMs using managed disks; support for Linux VMs using managed disks will be available in the coming weeks. The feature is available in all Azure regions where Azure Site Recovery is available. Configure disaster recovery for your Azure Disk Encryption-enabled virtual machines using Azure Site Recovery today and stay both secure and protected from outages.
Source: Azure

High Availability Add-On updates for Red Hat Enterprise Linux on Azure

High availability is crucial to mission-critical production environments. The Red Hat Enterprise Linux High Availability Add-On provides reliability and availability to critical production services that use it. Today, we’re sharing performance improvements and image updates around the High Availability Add-On for Red Hat Enterprise Linux (RHEL) on Azure.

Pacemaker

Pacemaker is a robust and powerful open-source resource manager used in highly available compute clusters. It is a key part of the High Availability Add-On for RHEL.

Pacemaker has been updated with performance improvements in the Azure Fencing Agent to significantly decrease Azure failover time, which greatly reduces customer downtime. This update is available to all RHEL 7.4+ users using either the Pay-As-You-Go images or Bring-Your-Own-Subscription images from the Azure Marketplace.
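For context, the Azure Fencing Agent is typically configured in a RHEL Pacemaker cluster through the fence_azure_arm agent, as in the hedged sketch below; the service principal credentials and timeout values are placeholders, and the exact parameters may vary by RHEL version.

# Create a STONITH resource backed by the Azure fence agent
pcs stonith create rsc_st_azure fence_azure_arm \
  login="<app-id>" passwd="<service-principal-secret>" \
  tenantId="<tenant-id>" subscriptionId="<subscription-id>" \
  resourceGroup="<resource-group>" \
  power_timeout=240 pcmk_reboot_timeout=900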

New pay-as-you-go RHEL images with the High Availability Add-On

We now have RHEL Pay-As-You-Go (PAYG) images with the High Availability Add-On available in the Azure Marketplace. These RHEL images include access to the High Availability Add-On repositories. Pricing details for these images are available in the pricing calculator.

The following RHEL HA PAYG images are now available in the Marketplace for all Azure regions, including US Government Cloud:

RHEL 7.4 with HA
RHEL 7.5 with HA
RHEL 7.6 with HA

New pay-as-you-go RHEL for SAP images with the High Availability Add-On

We also have RHEL images that include both SAP packages and the High Availability Add-On available in the Marketplace. These images come with access to SAP repositories as well as four years of support per standard Red Hat policies. Pricing details for these images are available in the pricing calculator.

The following RHEL for SAP with HA and Update Services images are available in the Marketplace for all Azure regions, including US Government Cloud:

RHEL 7.4 for SAP with HA and Update Services
RHEL 7.5 for SAP with HA and Update Services
RHEL 7.6 for SAP with HA and Update Services

Refer to the Certified and Supported SAP HANA Hardware Directory to see the list of SAP-certified Azure VM sizes.

You can also get a full listing of RHEL images on Azure, including the RHEL with HA and RHEL for SAP with HA images with the following Azure CLI command:

az vm image list --publisher RedHat --all

Support

All the RHEL with HA and RHEL for SAP with HA images on Azure are fully supported by the Red Hat and Microsoft integrated support team.

See the support site here and the Red Hat support site here.

Full details on the Red Hat Enterprise Linux support lifecycle are available here.

Next steps

Visit the Red Hat on Azure site to learn more about Red Hat workloads on Azure.
View pricing information at the pricing calculator.
Get started with the RHEL HA PAYG images and the RHEL for SAP with HA PAYG images.
Learn to create a Pacemaker cluster for SAP using RHEL by following our instructions here.
Deploy SAP on RHEL with our Quickstart Guide.

Source: Azure

When to use Azure Service Health versus the status page

If you’re experiencing problems with your applications, a great place to start investigating solutions is through your Azure Service Health dashboard. In this blog post, we’ll explore the differences between the Azure status page and Azure Service Health. We’ll also show you how to get started with Service Health alerts so you can stay better informed about service issues and take action to improve your workloads’ availability.

How and when to use the Azure status page

The Azure status page works best for tracking major outages, especially if you’re unable to log into the Azure portal or access Azure Service Health. Many Azure users visit the status page regularly. It predates Azure Service Health and has a friendly format that shows the status of all Azure services and regions at a glance.

The Azure status page, however, doesn’t show all information about the health of your Azure services and regions. The status page isn’t personalized, so you need to know exactly which services and regions you’re using and locate them in the grid. The status page also doesn’t include information about non-outage events that could affect your availability, such as planned maintenance events and health advisories (think service retirements and misconfigurations). Finally, the status page has no means of notifying you automatically in the event of an outage or a planned maintenance window that might affect you.

For all of these use cases, we created Azure Service Health.

How and when to use Azure Service Health

At the top of the Azure status page, you’ll find a button directing you to your personalized dashboard. One common misunderstanding is that this button allows you to personalize the status page grid of services and regions. Instead, the button takes you into the Azure portal to Azure Service Health, the best option for viewing Azure events that may impact the availability of your resources.

In Service Health, you’ll find information about everything from minor outages that affect you to planned maintenance events and other health advisories. The dashboard is personalized, so it knows which services and regions you’re using and can even help you troubleshoot by offering a list of potentially impacted resources for any given event.

Service Health’s most useful feature is Service Health alerts. With Service Health alerts, you’ll proactively receive notifications via your preferred channel—email, SMS, push notification, or even webhook into your internal ticketing system like ServiceNow or PagerDuty—if there’s an issue with your services and regions. You don’t have to keep checking Service Health or the status page for updates and can instead focus on other important work.
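A minimal Azure CLI sketch of creating a Service Health alert follows; it assumes an action group already exists, and the names and IDs are placeholders.

# Fire the action group whenever a Service Health event touches the subscription
az monitor activity-log alert create \
  --name ServiceHealthAlert \
  --resource-group MyRG \
  --scope "/subscriptions/<subscription-id>" \
  --condition "category=ServiceHealth" \
  --action-group "/subscriptions/<subscription-id>/resourceGroups/MyRG/providers/microsoft.insights/actionGroups/MyActionGroup"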

Set up your Service Health alerts today

Feel free to keep using the status page for quick updates on major outages. However, we highly encourage you to make it a habit to visit Service Health to stay informed of all potential impacts to your availability and to take advantage of rich features like automated alerting.

Set up your Azure Service Health alerts today in the Azure portal. For more in-depth guidance, visit the Azure Service Health documentation. Let us know if you have a suggestion by submitting an idea here.
Source: Azure

Announcing Azure Databricks unit pre-purchase plan and new regional availability

Azure Databricks is a fast, easy, and collaborative Apache Spark-based analytics platform that simplifies the process of building big data and artificial intelligence (AI) solutions. Azure Databricks provides data engineers and data scientists an interactive workspace where they can use the languages and frameworks of their choice. Natively integrated with services like Azure Machine Learning and Azure SQL Data Warehouse, Azure Databricks enables customers to build end-to-end modern data warehousing, real-time analytics, and machine learning solutions.

Save up to 37 percent on your Azure Databricks workloads

Azure Databricks Unit pre-purchase plan is now generally available—expanding our commitment to make Azure the most cost-effective cloud for running your analytics workloads.

Today, with the Azure Databricks Unit pre-purchase plan, you can start unlocking the benefits of Azure Databricks at significantly reduced costs when you pre-pay for Databricks compute for a one- or three-year term. With this new pricing option, you can achieve savings of up to 37 percent compared to pay-as-you-go pricing. You can learn more about the discount tiers on our pricing page. All Azure Databricks SKUs—Premium and Standard SKUs for Data Engineering Light, Data Engineering, and Data Analytics—are eligible for DBU pre-purchase.

Unlike other Azure services with reserved capacity pricing, which require a per-hour capacity purchase, this plan allows you to pre-purchase DBUs that can be used at any time. You also have the flexibility to consume units across all workload types and tiers.

Azure Databricks is offered as a first-party Azure service. You can pre-purchase Databricks compute using either your Azure prepayment or existing payment instruments.

Azure Databricks is now available in South Africa and South Korea

Azure Databricks is now generally available in two additional regions, South Africa and South Korea, bringing the product’s worldwide availability to 26 regions backed by a 99.95 percent SLA.

Driven by innovation and accessibility, we aim to ensure that we build cloud infrastructure that serves the needs of customers globally. Stay updated with the region availability for Azure Databricks.

Organizations also benefit from Azure Databricks' native integration with other services like Azure Blob storage, Azure Data Factory, Azure SQL Data Warehouse, Azure Machine Learning, and Azure Cosmos DB. This enables new analytics solutions that support modern data warehousing, advanced analytics, and real-time analytics scenarios.

Get started today

Getting started with DBU pre-purchase is easy, and is done via the Azure portal. For details on how to get started, see our documentation. For more information on discount tiers, please visit the pricing page.
Source: Azure

Azure and Informatica team up to remove barriers for cloud analytics migration

Today, we are announcing the most comprehensive and compelling migration offer available in the industry to help customers simplify their cloud analytics journey.

This collaboration between Microsoft and Informatica provides customers an accelerated path for their digital transformation. As customers modernize their analytics systems, they can truly begin integrating emerging technologies, such as AI and machine learning, into their business. Without migrating analytics workloads to the cloud, it is difficult for customers to maximize the potential their data holds.

For customers that have spent years tuning analytics appliances such as Teradata and Netezza, starting the journey to the cloud can seem overwhelming. Customers have invested valuable time, skills, and personnel to achieve optimal performance from their analytics systems, which contain their business’s most sensitive and valuable data. We understand that the idea of migrating these systems to the cloud can seem risky and daunting. This is why we are partnering with Informatica to help customers begin their cloud analytics journey today with an industry-leading offer.

Free evaluation

With this offering, customers can now work with Azure and Informatica to easily understand their current data estate, determine what data is connected to their current data warehouse, and replicate tables without moving any data in order to conduct a robust proof of value.

This enables customers to get an end-to-end view of their data, execute a proof of value without disrupting their existing systems, and quickly see the possibilities of moving to Azure.

Free code conversion

A critical aspect of migrating on-premises appliances to the cloud is converting existing schemas to take advantage of cloud innovation. This conversion can quickly become expensive, even during a proof of value.

With this joint offering from Azure and Informatica, customers receive free code conversion both for the proof of value phase and when fully migrating to the cloud, as well as an Azure SQL Data Warehouse subscription for the duration of the proof of value (up to 30 days).

Hands-on approach

Both Azure and Informatica are dedicating the personnel and resources to have analytics experts on-site helping customers as they begin migrating to Azure.

Customers that qualify for this offering will have full support from Azure SQL Data Warehouse experts, who will help with the initial assessment, execute the proof of value, and provide best-practice guidance during migration.

Everything you need to start your cloud analytics journey

Get started today

Analytics in Azure is up to 14 times faster and costs 94 percent less than other cloud providers, and is the leader in both the TPC-H and TPC-DS industry benchmarks. Now with this joint offer, customers can easily get started on their cloud analytics journey.

Register for the Azure and Informatica webinar to learn more about this offer.
Sign up for a free Informatica: Cloud Data Warehouse Modernization on Azure workshop.

Source: Azure

We’re making Azure Archive Storage better with new lower pricing

As part of our commitment to provide the most cost-effective storage offering, we’re excited to share that we have dropped Azure Archive Storage prices by up to 50 percent in some regions. The new pricing is effective immediately.

In 2017 we launched Azure Archive Storage to provide cloud storage for rarely accessed data with flexible latency requirements at an industry-leading price point. Since then we’ve seen both small and large customers across all industries use Archive Storage to significantly reduce their storage bills, improve data durability, and meet legal compliance requirements. Forrester Consulting interviewed four of these customers and conducted a commissioned Total Economic Impact™ (TEI) study to evaluate the value customers achieved by moving both on-premises data and existing cloud data to Archive Storage. Below are some of the highlights from that study.

112 percent return on investment (ROI). Forrester’s interviews with four existing customers and subsequent financial analysis found that a composite organization based on the interviewed organizations projects expected benefits of $296,941 over three years versus costs of $140,376, adding up to a net present value (NPV) of $156,565 and an ROI of 112 percent.
Reduced or eliminated more than $173,000 in operational and hardware expenses over a three-year period. Organizations were able to reduce spending in their on-premises storage environments by transitioning data to the cloud. Moving to the cloud enabled users to eliminate their tape and hard disk backups, while also reducing overall operating expenditures.
Reduced monthly cloud storage costs by 95 percent. Organizations identified infrequently accessed data stored in active cloud storage tiers and transitioned them to the Archive tier, reducing their monthly per gigabyte (GB) storage costs by 95 percent. The Archive tier allowed organizations to augment their existing cloud storage savings. Over a three-year period, this saves an estimated $123,692.

How are customers using Archive Storage?

Toshiba America Business Solutions (TABS) sells digital signage and multifunction printers (MFPs), along with a complete set of maintenance and management services to help customers optimize their digital and paper communications. TABS created two Internet of Things (IoT) analytics solutions, e-BRIDGE™ CloudConnect and CloudConnect Data Services that are based on Microsoft Azure platform-as-a-service (PaaS) offerings, including Azure SQL Data Warehouse. Using e-BRIDGE, TABS remotely gathers device health data from thousands of installed devices and preemptively dispatches service technicians with the correct parts to perform repairs. With CloudConnect Data Services, TABS analyzes device health and repair history data to continuously improve product design and component choices. These solutions have helped the company improve device uptime and reduce service costs.

The daily configuration updates from printer devices were being stored in hot Blob Storage for four years even though they were rarely accessed. With Archive Blob Storage, Toshiba now moves these files to Archive Storage after 30 days once the probability of them being accessed goes down significantly. At this point, they also don’t need the files immediately available and can wait hours to get the files back. Archive Storage has allowed Toshiba to reduce their storage costs for this data by almost 90 percent.
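A pattern like Toshiba’s can be automated with a blob lifecycle management rule. The sketch below is illustrative only; the account name, resource group, and prefix are placeholders.

# Contents of policy.json: move matching block blobs to the Archive tier
# 30 days after their last modification
{
  "rules": [
    {
      "name": "archive-after-30-days",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": { "blobTypes": [ "blockBlob" ], "prefixMatch": [ "deviceconfigs/" ] },
        "actions": { "baseBlob": { "tierToArchive": { "daysAfterModificationGreaterThan": 30 } } }
      }
    }
  ]
}

# Apply the policy to the storage account
az storage account management-policy create \
  --account-name mystorageaccount \
  --resource-group MyRG \
  --policy @policy.json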

Oceaneering uses remotely operated vehicles (ROVs) to capture video of operations and inspection work. The increase in overall video quality over the last few years has driven the need for the more efficient storage capabilities provided by the Azure platform. The satellite links onboard the vessels provide limited bandwidth to stream the video, so media must sometimes be transported manually using devices such as Azure Data Box. The large amount of data per inspection, 2 TB a day in some instances, is maintained in Azure Storage. For the larger library of historical video, Azure Archive Storage provides the most cost-effective solution for customers who access the video via the Oceaneering Media Vault (OMV). Oceaneering has seen 60 percent savings utilizing Azure Archive capabilities.

Regional availability

Archive Storage is currently available in a total of 29 regions worldwide, and we’re continuing to expand that list. Over the past year we have added support for Archive Storage in Australia East, Australia Southeast, East Asia, Southeast Asia, UK West, UK South, Japan East, Japan West, Canada Central, Canada East, US Gov Virginia, US Gov Texas, US Gov Arizona, China East 2, and China North 2.

Additional information

Azure Archive Storage provides an extremely cost-effective alternative to on-premises storage for cold data as highlighted in the Forrester TEI study. Customers can significantly reduce operational and hardware expenses to realize an ROI of up to 112 percent over three years by moving their data to the Archive tier.

Archive exists alongside the Hot and Cool access tiers. All archive operations are consistent with the other tiers so you can seamlessly move your data among tiers programmatically or using lifecycle management policies. Archive is supported by a broad and diverse set of storage partners.

For more information on Archive Storage features and capabilities, please visit our product page. For more information on Archive Storage pricing, please visit the Azure Block Blob Pricing page. If you have any further questions or feedback, please reach out to us at archivefeedback@microsoft.com.
Source: Azure

Improved developer experience for Azure Blockchain development kit

As digital transformation expands beyond the walls of one company and into processes shared across organizations, businesses are looking to blockchain as a way to share workflow data and logic.

This spring we introduced Azure Blockchain Service, a fully-managed blockchain service that simplifies the formation, management, and governance of consortium blockchain networks. With a few simple clicks, users can create and deploy a permissioned blockchain network and manage consortium membership using an intuitive interface in the Azure portal.

To help developers building applications on the service, we also introduced our Azure Blockchain development kit for Ethereum. Delivered via Visual Studio Code, the dev kit runs on all major operating systems, and brings together the best of Microsoft and open source blockchain tooling, including deep integration with leading OSS tools from Truffle. These integrations enable developers to create, compile, test, and manage smart contract code before deploying it to a managed network in Azure.

We’re constantly listening to feedback for areas where we can lean in and help developers go further, faster. This week at TruffleCon, we’re releasing some exciting new features that make it easier than ever to build blockchain applications:

Interactive debugger: Debugging Ethereum smart contracts has so far been a challenging effort. While there are some great command-line tools (e.g., the Truffle Debugger), they aren’t integrated into integrated development environments (IDEs) like Visual Studio Code. Native integration of the Truffle Debugger into Visual Studio Code brings all the standard debugging features developers have come to rely on (e.g., breakpoints, step in/over/out, call stacks, watch windows, and IntelliSense pop-ups), letting developers quickly identify, debug, and resolve issues.
Auto-generated prototype UI: The dev kit now generates a UI that is rendered and activated inside of Visual Studio Code. This simple, graphical interface lets developers interact with their deployed contracts and test out basic functionality directly in the IDE, without writing custom UI code or other test software, which is a huge improvement in productivity.

With the addition of these new debugger capabilities, we are bringing all the major components of software development for smart contracts, including build, debug, test, and deploy, into the popular Visual Studio Code developer environment.

If you’re in Redmond, Washington this weekend, August 2-4, 2019, come by TruffleCon to meet the team or head to the Visual Studio Marketplace to try these new features today!
Source: Azure

New Azure Blueprint simplifies compliance with NIST SP 800-53

To help our customers manage their compliance obligations when hosting their environments in Microsoft Azure, we are publishing a series of blueprint samples built into Azure. Our most recent release is the NIST SP 800-53 R4 blueprint, which maps a core set of Azure Policy definitions to specific NIST SP 800-53 R4 controls. For US government entities and others with compliance requirements based on NIST SP 800-53, this blueprint helps customers proactively manage and monitor the compliance of their Azure environments.

The free Azure Blueprints service helps enable cloud architects and information technology groups to define a repeatable set of Azure resources that implements and adheres to an organization’s standards, patterns, and requirements. Blueprints may help speed the creation of governed subscriptions, supporting the design of environments that comply with organizational standards and best practices and scale to support production implementations for large-scale migrations.

Azure leads the industry with more than 90 compliance offerings that meet a broad set of international and industry-specific compliance standards. This puts Microsoft in a unique position to help ease our customers’ burden to meet their compliance obligations. In fact, many of our customers, particularly those in regulated industries, have expressed strong interest in being able to leverage our internal compliance practices for their environments with a service that maps compliance settings automatically. The Azure Blueprints service is our natural response to that interest.  Customers are ultimately responsible for meeting the compliance requirements applicable to their environments and must determine for themselves whether particular information helps meet their compliance needs.

The US National Institute of Standards and Technology (NIST) publishes a catalog of security and privacy controls, Special Publication (SP) 800-53, for all federal information systems in the United States (except those related to national security). It provides a process for selecting controls to protect organizations against cyberattacks, natural disasters, structural failures, and other threats.

The NIST SP 800-53 R4 blueprint provides governance guardrails using Azure Policy to help customers assess specific NIST SP 800-53 R4 controls. It also enables customers to deploy a core set of policies for any Azure-deployed architecture that must implement these controls.

NIST SP 800-53 R4 control mappings provide details on the policies included within this blueprint and how those policies address various NIST SP 800-53 R4 controls. When the blueprint is assigned to an architecture, Azure Policy evaluates resources for non-compliance with the assigned policies (a sketch for querying compliance results follows this list). These control mappings include:

Account management. Helps with the review of accounts that may not comply with an organization’s account management requirements.
Separation of duties. Helps in maintaining an appropriate number of Azure subscription owners.
Least privilege. Audits accounts that should be prioritized for review.
Remote access. Helps with monitoring and control of remote access.
Audit review, analysis, and reporting. Helps ensure that events are logged and enforces deployment of the Log Analytics agent on Azure virtual machines.
Least functionality. Helps monitor virtual machines where an application whitelist is recommended but has not yet been configured.
Identification and authentication. Helps restrict and control privileged access.
Vulnerability scanning. Helps with the management of information system vulnerabilities.
Denial of service protection. Audits if the Azure DDoS Protection standard tier is enabled.
Boundary protection. Helps with the management and control of the system boundary.
Transmission confidentiality and integrity. Helps protect the confidentiality and integrity of transmitted information.
Flaw remediation. Helps with the management of information system flaws.
Malicious code protection. Helps with the management of endpoint protection, including malicious code protection.
Information system monitoring. Helps with monitoring a system by auditing and enforcing logging across Azure resources.
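As noted above, once the blueprint’s policies are assigned, you can query compliance results from the Azure CLI; the sketch below assumes a recent CLI version that includes the policy insights commands.

# Summarize policy compliance across the current subscription
az policy state summarize
# List the resources that are currently non-compliant
az policy state list --filter "complianceState eq 'NonCompliant'"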

At Microsoft, we will continue this commitment to helping our customers leverage Azure in a secure and compliant manner. Over the next few months we plan to release more new built-in blueprints for HITRUST, FedRAMP, NIST SP 800-171, the Center for Internet Security (CIS) Benchmark, and other standards.

If you would like to participate in any early previews, please sign up. In addition, learn more about the Azure NIST SP 800-53 R4 blueprint.
Source: Azure