Microsoft publishes compliance guidelines for the German Cloud

Microsoft is the first to bring a sovereign cloud to Germany. Built on a data trustee model, the Microsoft Cloud Germany enables customers in the European Union (EU) and European Free Trade Association (EFTA) to store and manage customer data in compliance with applicable German laws and regulations, as well as key international standards.

As a first step after our successful launch, we want to provide our customers with the workbooks below. Our customers have told us about their strong need for guidance on regional regulations and compliance requirements, and IT Grundschutz is one of the most important methodologies published by the German Federal Office for Information Security (Bundesamt für Sicherheit in der Informationstechnik).

New workbooks for Microsoft Azure Germany

• Microsoft Cloud Germany – Compliance in the cloud for organizations in EU/EFTA is a new document published by Microsoft. It outlines the data trustee model that delivers the power and flexibility of Microsoft cloud services in an environment that provides both technical and legal protections for German customer data. With the data trustee model, all data that belongs to German, EU, and EFTA region customers is stored exclusively in datacenters on German soil, and a third party – the Data Trustee – alone controls access to Customer Data other than access initiated by Customer or Customer’s end users. Please visit this link to download the document.

• IT Grundschutz Compliance Workbook – Microsoft Azure Germany is a new workbook developed by Hisolutions AG, one of the most renowned consulting and auditing companies in Germany. It helps our clients achieve their IT Grundschutz certification with solutions and workloads deployed on Microsoft Azure Germany. It's based on the most recent version of IT Grundschutz, covering the sections relevant to cloud usage. Please visit this link to download the workbook.

For access to any of the documents mentioned above or any other compliance certifications achieved by Microsoft Azure, visit our Service Trust Portal or Microsoft Trust Center.
Source: Azure

New security, performance and ISV solutions build on Azure HDInsight’s leadership to make Hadoop enterprise-ready for the cloud

This week in New York, thousands of people are at Strata + Hadoop World to explore the technology and business of big data and advanced analytics. As part of our participation in the conference, we are pleased to announce new capabilities in Azure HDInsight, Microsoft's managed Hadoop and Spark cloud service, that build on our leadership in making Hadoop enterprise-ready and easy to use in the cloud: the most security capabilities of any cloud Hadoop solution, big data query speeds that approach data warehousing performance, and new notebook experiences for data scientists, all on the latest Hortonworks Data Platform 2.5 and Spark 2.0.

The highest levels of security in a managed Cloud Hadoop solution

To support the adoption of Hadoop in the cloud, Microsoft understands that enterprises need peace of mind that the solution will help protect sensitive corporate data and intellectual property. With the new security features of Azure HDInsight, we provide you with the highest levels of security for authentication, authorization, auditing and encryption available in the cloud for Hadoop.

Authentication and identity management in a few clicks

Azure HDInsight is the first big data service to seamlessly integrate Azure Active Directory and Azure Active Directory Domain Services for enterprise-grade authentication and identity management. This is accomplished with a few clicks, making it easy to secure your Hadoop clusters. This also makes it easy to leverage your existing on-premises Active Directory deployment, which currently supports 1.3 billion daily authentications across 600 million user accounts. You can build sophisticated access control policies around users or security groups supported by features such as multifactor authentication.

Authorization with central security policy administration and auditing

Azure HDInsight is the first managed cloud Hadoop service to include Apache Ranger, which provides a central policy and management portal where administrators can author and maintain fine-grained access control policies over Hadoop data access, components and services. In addition, you can now analyze detailed audit records in the familiar Apache Ranger user interface.

Encryption for data protection

Data processed by Azure HDInsight is stored in Azure Data Lake Store or Azure Storage, both of which offer server-side encryption to secure data at rest. The encryption works transparently with HDInsight, with no extra configuration needed. For Azure Data Lake Store, enterprises can rely on service-managed encryption keys or manage their own keys in Azure Key Vault. Azure Key Vault protects your keys using hardware security modules (HSMs) and gives you the ability to revoke access to the keys at any time.

These advanced security capabilities will be available as a public preview in October.

HDInsight now at data warehousing speeds with the latest Hive using LLAP

Microsoft has been involved from the beginning in making Hive run faster, with contributions to Project Stinger and Tez that sped up Hive query performance 100x. We are now pleased to be the first cloud Hadoop solution to onboard LLAP (Live Long and Process) from the Stinger.next initiative, which promises sub-second querying on big data, up to 25x faster than existing Hive.

LLAP keeps compressed data in memory while retaining the ability to scale elastically within a Hadoop cluster. It also brings many enhancements to the Hive execution engine, such as smarter map joins, better MapJoin vectorization, a fully vectorized pipeline, and a smarter cost-based optimizer. In addition to these LLAP enhancements, the latest version of Hive also has faster type conversions, dynamic partitioning optimizations, and vectorization support for text files. Collectively, these enhancements have brought a speed improvement of up to 25x when comparing LLAP to Hive on Tez, opening up new scenarios for interactive BI and reporting on top of big data.

In addition, Microsoft has partnered with Simba to deliver an ODBC driver for Azure HDInsight that can be used with world-class BI tools like Power BI, Tableau and QlikView. Together, this allows business analysts to gain insights over big data using their tool of choice. 

Figure 1: Hortonworks TPC-DS benchmark on 15 queries using the hive-testbench repository

Microsoft continues commitment to Spark with a fully managed, SLA-backed Spark 2.0 offering

Spark 2.0 is a major release that overhauls the core query engine with “Project Tungsten,” which upgrades Spark with capabilities of a modern compiler to perform cache-efficient vectorized computations. This has enabled up to 10x faster performance with Spark 2.0 on an already-fast platform. In addition to faster performance, Spark 2.0 also has broader support of the SQL syntax, an improved streaming engine that makes it easier to build real-time solutions, improvements to the Machine Learning pipelines, and more algorithms supported in SparkR. Finally, in response to customer demand, Microsoft and Hortonworks included 100+ fixes for Spark 2.0, improving its stability for production deployments.

With the latest release of Apache HBase for HDInsight, we are also introducing a Spark-HBase connector, letting you use the performance and power of Spark SQL to query HBase. This lets you perform advanced analytics on top of all the data available in your NoSQL database.

Both the latest Hortonworks Data Platform 2.5 and Spark 2.0 will be available in Azure HDInsight later today. Hive with LLAP is a new cluster type available as a public preview.

New data science experiences with Zeppelin notebook support

Our goal with big data is to make it accessible for everybody. With Spark for HDInsight, we have designed productivity experiences for the different audiences that use Spark, including the data engineer working on ETL jobs with IntelliJ support, the data scientists performing experimentation with R Server and Jupyter notebook support, and the business analysts creating dashboards with Power BI, Tableau, SAP Lumira and Qlik support.

As part of HDInsight’s support for Hortonworks Data Platform 2.5, we now provide out-of-the-box support for Zeppelin notebooks available later today to give data scientists even more options to create narratives that combine code, statistical equations and visualizations that tell a story about the data.

The easiest way to spin up third-party ISV applications with HDInsight

In the broader Hadoop ecosystem, there is a thriving market of independent software vendors (ISVs) providing value-added solutions that help organizations with data preparation, visualization, advanced security, and streaming. In the past, these applications would sit outside the cluster, which required spinning up separate virtual machines, and connectivity to the Hadoop cluster was limited. Azure HDInsight introduced a way for ISVs such as Datameer to run their applications directly on HDInsight clusters, letting customers spin up Hadoop and Spark clusters pre-integrated and pre-tuned with the ISV application out-of-the-box.

"Azure HDInsight Application Platform is the most robust and stable framework we've seen to quickly configure and test Datameer deployments in the cloud,” says Stefan Groschupf, Datameer CEO. “We had all the flexibility to iteratively test different deployment options for our solution as well as marketing collateral within the same portal. It is by far the easiest and fastest way to take your cloud-based solution to market. As a partner, HDInsight application platform has allowed us to connect with customers easily and reduce the time for customers to try Datameer on HDInsight."

Today, we are excited to announce that new partners Cask and StreamSets join the Azure HDInsight ISV program. Cask provides a self-service, extendable open source framework to visually develop, run, automate and operate data pipelines. StreamSets Dataflow Performance Manager provides a single pane of glass for management of big data flows, so enterprises can map and measure all their data in motion.

This week the big data world is focused on Strata + Hadoop World, a great event for the industry and community. It’s exciting to consider the new ideas and innovations happening around the world every day with data. Here at Microsoft, we’re thrilled to be part of it and to fuel that innovation with data solutions that give customers simple but powerful capabilities, using their choice of tools and platforms in the cloud.
Source: Azure

Azure Media OCR Simplified Output

Not sure what Azure Media OCR is? Check out the introductory blog post on Azure Media OCR.

Thanks to all of the customers and partners who have been part of the Azure Media OCR public and private previews. We have continued to iterate on valuable first-hand feedback over the past five months, and today we are tackling one of the most common pain points we have heard about: the complexity of the output format.

It turned out that, for most customers, we were simply providing too much detail in our default output format, which led to a lot of frustration.

Most customers utilize Azure Media OCR to index videos by the text displayed within them at various times.  In conjunction with Azure Media Indexer, this creates an excellent alternative to manual video tagging for augmenting the discoverability of your video content in a search engine. 

By providing positional data for every single word (in addition to the phrases and regions), we were needlessly inflating the output with little to no additional value. 

Today we are releasing our new output format, a simpler schema that covers most end-user scenarios with fewer headaches. If you are one of the advanced users who found value in the additional data of our previous output format, you can simply set the “AdvancedOutput” flag.

Advanced Output

The advanced output format is a JSON object made up of time fragments, each of which contains separate events comprising regions, lines, and words, all tagged with positional and language data.

The new “simple” output format contains only time fragments with the recognized text.

Here’s an example of one fragment in the simple output format:

New output

"fragments": [
    {
        "start": 0,
        "duration": 1435434,
        "interval": 1435434,
        "events": [
            [
                {
                    "language": "English",
                    "text": "Notes File WOF MYM Edit Format View oo Window Help index-html – MvWebSite Q Search Visual Studio Code for Mac Developers Today June 1, 2016, 1:07 PM Visual Studio Code for Mac Developers 211. Google Chrome extensions Sergii Baidachnyi Principal Technical Evangelist Microsoft Canada sbaydach@microsoft.com @sbaidachni"
                }
            ]
        ]
    },

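To make the simplicity concrete, here is a short Python sketch (not an official SDK sample) that pulls the recognized text and timing out of a simplified output document. The top-level "timescale" value of 30,000 ticks per second is an assumption for illustration; take it from your own result file.

```python
import json

# One fragment of the simplified OCR output, shortened from the example above.
# The "timescale" (ticks per second) used here is an assumed value.
doc = json.loads("""
{
  "timescale": 30000,
  "fragments": [
    {
      "start": 0,
      "duration": 1435434,
      "interval": 1435434,
      "events": [
        [
          {"language": "English", "text": "Visual Studio Code for Mac Developers"}
        ]
      ]
    }
  ]
}
""")

def fragment_texts(doc):
    """Yield (start_seconds, end_seconds, text) for every OCR text event."""
    timescale = doc.get("timescale", 30000)
    for frag in doc["fragments"]:
        start = frag["start"] / timescale
        end = (frag["start"] + frag["duration"]) / timescale
        for group in frag.get("events", []):
            for event in group:
                yield start, end, event["text"]

for start, end, text in fragment_texts(doc):
    print(f"{start:.1f}s to {end:.1f}s: {text}")
```

This is all the client code a search-indexing scenario typically needs, which is exactly why the positional detail moved behind the "AdvancedOutput" flag.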
The following is a heavily-truncated equivalent “advanced output” from the same fragment.

Note: the actual advanced output that corresponds to the above fragment is over 600 lines of JSON!

Old output

"fragments": [
    {
        "start": 0,
        "duration": 120000,
        "interval": 120000,
        "events": [
            [
                {
                    "region": {
                        "language": "English",
                        "orientation": "Up",
                        "lines": [
                            {
                                "text": "Notes File",
                                "left": 74,
                                "top": 7,
                                "width": 109,
                                "height": 15,
                                "word": [
                                    {
                                        "text": "Notes",
                                        "left": 74,
                                        "top": 8,
                                        "width": 54,
                                        "height": 14,
                                        "confidence": 974
                                    },
                                    {
                                        "text": "File",
                                        "left": 154,
                                        "top": 7,
                                        "width": 29,
                                        "height": 15,
                                        "confidence": 848
                                    }
                                ]
                            },
                            {
                                "text": "WOF",
                                "left": 155,
                                "top": 117,
                                "width": 33,
                                "height": 12,
                                "word": [
                                    {
                                        "text": "WOF",
                                        "left": 155,
                                        "top": 117,
                                        "width": 33,
                                        "height": 12,
                                        "confidence": 397
                                    }
                                ]
                            },
                            {
                                "text": "MYM",
                                "left": 156,
                                "top": 206,
                                "width": 32,
                                "height": 12,
                                "word": [
                                    {
                                        "text": "MYM",
                                        "left": 156,
                                        "top": 206,
                                        "width": 32,
                                        "height": 12,
                                        "confidence": 309
                                    }
                                ]
                            }
                        ]
                    }
                },
                {
                    "region": {

As you can see, unless you need all of the detail, a lot of the advanced output features may be redundant for your scenario.
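The relationship between the two formats can be sketched in a few lines of Python: flattening the advanced output's lines reproduces the text that the simple format now delivers directly. The sample below reuses the truncated region from above.

```python
import json

# A truncated "advanced output" event, as in the example above.
advanced_event = json.loads("""
{
  "region": {
    "language": "English",
    "orientation": "Up",
    "lines": [
      {"text": "Notes File", "left": 74, "top": 7, "width": 109, "height": 15},
      {"text": "WOF", "left": 155, "top": 117, "width": 33, "height": 12},
      {"text": "MYM", "left": 156, "top": 206, "width": 32, "height": 12}
    ]
  }
}
""")

def flatten(event):
    """Join all line texts in a region, dropping the positional detail;
    roughly what the new simple output format now does for you."""
    return " ".join(line["text"] for line in event["region"]["lines"])

print(flatten(advanced_event))  # -> "Notes File WOF MYM"
```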

How do I use this?

Minimal preset for Old Output

{
    "Version": "1.0",
    "Options": {
        "AdvancedOutput": "true"
    }
}

Minimal preset for New Output

{
    "Version": "1.0"
}

Want to learn more about the input configuration? Check out our previous blog post introducing the configuration.

Love the new output? Hate it? Share your feedback with us!

If you want to learn more about this product, and the scenarios that it enables, read the introductory blog post on Azure Media OCR.

To learn more about Azure Media Analytics, check out the introductory blog post.

If you have any questions about any of the Media Analytics products, send an email to amsanalytics@microsoft.com.
Source: Azure

Resource health exposes historical health

Today we are pleased to announce the public preview of the Azure Resource health history blade, a new feature that exposes up to 14 days of historical health information.

Until now, Resource health has helped customers reduce the time spent troubleshooting ongoing problems, in particular the time spent determining whether a problem is caused by an event inside the Azure platform or by a problem in the application. This new feature makes it easier to investigate problems that occurred during the last 14 days.

Getting the current and historical health of a resource

The easiest way to open the Resource health blade is to navigate to the resource blade and click on Resource Health.  This blade will show the current health of the resource, as well as recommended troubleshooting steps that are customized based on the current health status. It is important to highlight that since entering Public Preview, we have made a number of improvements to help with troubleshooting, including tighter integration with the troubleshooting experience in the portal.

To access the historical health data, click on the View History link located under the current health state.

The history blade shows any changes in the health of the resource during the last 14 days, including the starting time, the end time, and a summary of the text customers would have seen had they visited the Resource health blade during that period. In the screenshot above, you can see that the virtual machine was available until September 19th at 4:19 PM, when a disk failure made it unavailable; it was recovered at 4:39 PM.
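To make the shape of this history concrete, here is a small Python sketch that models records like the one described above (the field names are invented for illustration, not the Resource health API schema) and computes how long the resource was unavailable.

```python
from datetime import datetime

# Illustrative model of the health history the blade exposes: each record
# has a state, a starting time, an end time, and a summary (names invented).
history = [
    {"state": "Available", "start": datetime(2016, 9, 19, 0, 0),
     "end": datetime(2016, 9, 19, 16, 19)},
    {"state": "Unavailable", "start": datetime(2016, 9, 19, 16, 19),
     "end": datetime(2016, 9, 19, 16, 39),
     "summary": "The virtual machine was impacted by a disk failure."},
    {"state": "Available", "start": datetime(2016, 9, 19, 16, 39),
     "end": datetime(2016, 9, 19, 23, 59)},
]

def downtime_minutes(history):
    """Total minutes spent in an unavailable state over the window."""
    seconds = sum((rec["end"] - rec["start"]).total_seconds()
                  for rec in history if rec["state"] == "Unavailable")
    return seconds / 60

print(downtime_minutes(history))  # the 4:19 PM to 4:39 PM outage: 20.0
```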

New ways to access resource health

As mentioned above, the easiest way is to click on Resource health in the resource blade. Keep in mind that Resource health will only be displayed for resource types available in Resource health. 

Another way is by browsing to the Resource Health List Blade which displays the health of all resources in all your subscriptions. Open this blade by clicking on the Resource health tile located in the Help + Support blade. Once in the Resource Health List blade you can filter by subscription or by resource type.

Resource health is a key data point when troubleshooting problems, so during the past few weeks we incorporated the Resource health signal into the Troubleshooting and Case submission blades.

You can access the Troubleshooting blade by clicking on Diagnose and solve problems in the resource blade. Once the troubleshooting blade opens, Resource health will be displayed at the top. Clicking on More Details will take you to the Resource health blade.

In the case submission workflow, you will see the health once you have selected a resource.

Moving forward

Exposing the historical health of a resource is a big step forward on our journey to provide you with the data and tools you need to troubleshoot problems. During the upcoming months, stay tuned for additional improvements to Resource health and for more services to become available through it.
Source: Azure

Improving Developer / IT Collaboration with Application Insights Connector for OMS Log Analytics

Now IT Pros can see the health of applications monitored by Visual Studio Application Insights. Application Insights can now send app data to Microsoft Operations Management Suite (OMS), enabling the app developer and their IT Pro counterpart to each monitor the health of critical applications in the tool they are most experienced with.  Views of the same application telemetry you see in Application Insights will be available in OMS, facilitating more effective collaboration between developers and IT Pros, and thus reducing the time to detect and resolve both application and platform issues.

Information about your application’s health and usage is surfaced in OMS by the Public Preview of the Application Insights Connector for OMS Log Analytics. Telemetry streams from web tests, page views, server requests, exceptions, and custom events are included.

With this facility, OMS users can correlate any infrastructure issues with the impact on applications running in that environment. IT staff can contribute more fully to monitoring the whole stack from the infrastructure and configuration management provided by other OMS solutions, to the application layer data provided by Application Insights.

Drill into different applications

The solution provides a summary of the Application Insights resources that have been integrated with OMS:

Click any of the rows from the summary to show  data from a particular application:

From here, the visualization can be changed to the OMS list or table view:

Then fine tune a query to a time range or other property of interest:

One click to diagnosis in Application Insights

The solution makes it easy to dive deeply into problems, leveraging Application Insights' powerful developer-focused diagnostics and analysis tools to get to the bottom of application issues.

It’s a one-click pivot from the OMS solution to this application’s telemetry overview in the Azure portal:

Get started

Want to get started? Check out this blog post by Cigdem Kontaci from the OMS team with the details of how an OMS workspace administrator can configure this solution and dramatically enhance the collaboration between your organization’s developers and IT pros.

To use this solution your app must be in either the Standard or Premium pricing tier of Application Insights.

As you get this integration enabled in your organization, feel free to leave feedback about this integration at UserVoice.

Source: Azure

Microsoft Cloud is first CSP behind the Privacy Shield

Microsoft was proud to become the first global cloud service provider to appear on the Department of Commerce's list of Privacy Shield certified entities as of August 12, 2016. The European Commission adopted the EU-U.S. Privacy Shield Framework on July 12, 2016, replacing the International Safe Harbor Privacy Principles as the mechanism for allowing companies in the EU and the US to transfer personal data across the Atlantic in a manner compliant with EU data protection requirements. As stated on PrivacyShield.gov:

“The EU-U.S. Privacy Shield Framework was designed by the U.S. Department of Commerce and European Commission to provide companies on both sides of the Atlantic with a mechanism to comply with EU data protection requirements when transferring personal data from the European Union to the United States in support of transatlantic commerce.”

Adherence to this framework underscores the importance and priority we at Microsoft put on privacy, compliance, security, and protection of customer data around the globe. A link to Microsoft's statement of compliance can be found here. The Microsoft Cloud offers an array of integrated tools that can enhance an IT professional's productivity, supports a broad spectrum of operating systems, is highly scalable, and can integrate with existing customer IT environments. These highly competitive attributes attract a globally diverse customer population whose compliance needs and regulations we are ready and able to support. Check out the Microsoft Trust Center to learn more about our expansive compliance capabilities, including our commitment to and compliance with the Privacy Shield Framework.
Source: Azure

Announcing Azure Command-Line Interface (Azure CLI) 2.0 Preview

With the continued growth of Azure, we’ve seen a lot of customers using our command-line tools, particularly the Windows PowerShell tools and our Azure XPlat command-line interface (CLI).  We’ve received a lot of feedback on the great productivity provided by command-line tools, but have also heard, especially from customers working with Linux, about our XPlat CLI and its poor integration with popular Linux command-line tools as well as difficulties with installing and maintaining the Node environment (on which it was based).

Based on this feedback – along with the growth of the Azure Resource Manager-based configuration model – we have rebuilt the CLI to provide a great experience for Azure. Starting today, we're making this new CLI available. We're calling it the Azure Command-Line Interface (Azure CLI) 2.0 Preview, now available as a beta on GitHub. Please try it out and give us your feedback!

Now, if you’re interested in how we approached this project and what it means for you, read on!

What Makes a Great, Modern CLI?

As we set out to develop our next generation of command-line tools, we quickly settled on some guiding principles:

It must be natural and easy to install: Regardless of your platform, our CLI should be installed from where you expect it, be it "brew install azure-cli" on a MacBook or "apt-get install azure-cli" for Bash on Windows (coming soon).

It must be consistent with POSIX tools: Success with command-line tools is the result of the ease and predictability that comes with the implementation of well-understood standards.

It must be part of the open source ecosystem: The value of open source comes from the community and the amazing features and integrations they develop, from DevOps (Chef, Ansible) solutions to query languages (JMESPath).

It must be evergreen and current with Azure: In an age of continuous delivery, it's not enough to simply deploy a service. We must have up-to-date tools that let our customers immediately take advantage of that service.

As we applied these principles, we realized that the scope of improvements went beyond a few breaking changes, and when combined with the feedback we’ve received about our XPlat CLI, it made sense to start from the ground up. This choice allowed us to focus exclusively on our ARM management and address another common point of feedback: the ASM/ARM “config mode” switch of our XPlat CLI.

Introducing the Azure CLI 2.0 Preview

While we are building out support for core Azure services at this time, we would like to introduce you to the next generation of our command-line tool: Azure CLI 2.0 Preview.

Get Started without delay with a quick and easy install, regardless of platform

Your tools should always be easy to access and install, whether you work in operations or development. Soon, Azure CLI 2.0 Preview will be available on all popular platform package services.

Love using command-line tools such as GREP, AWK, JQ?  So do we!

Command-line tools are the most productive when they work together well. The Azure CLI 2.0 Preview provides clean and pipe-able outputs for interacting with popular command-line tools, such as grep, cut, and jq.
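As an illustration of the kind of post-processing this enables, the Python sketch below filters a hypothetical `az vm list`-style JSON payload the way you might with `jq` on the command line (the CLI's built-in `--query` option uses JMESPath for the same purpose). The field names are invented for the example, not the CLI's exact schema.

```python
import json

# Hypothetical JSON as you might capture from "az vm list";
# the field names here are illustrative only.
payload = json.loads("""
[
  {"name": "web-01", "location": "westeurope", "powerState": "running"},
  {"name": "web-02", "location": "westeurope", "powerState": "deallocated"},
  {"name": "db-01",  "location": "eastus",     "powerState": "running"}
]
""")

# Rough equivalent of piping the output through:
#   jq -r '.[] | select(.powerState == "running") | .name'
running = [vm["name"] for vm in payload if vm["powerState"] == "running"]
print("\n".join(running))
```

Because the CLI emits clean JSON on stdout, the same payload can be consumed by grep, cut, jq, or any script, without scraping human-formatted tables.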

Feel like an Azure Ninja with consistent patterns and help at your fingertips

Getting started in the cloud can feel overwhelming, given all the tools and options available, but the Azure CLI 2.0 Preview can help you on your journey, guiding you with examples and educational content for common commands. We've completely redesigned our help system with improved in-tool help.

In future releases, we will expand our documentation to include detailed man-pages and online documentation in popular repositories.

The less you type, the more productive you are

We offer 'tab completion' for commands and parameter names. This makes it easy to find the right command or parameter without interrupting your flow. For parameters with known choices, as well as resource groups and resource names, you can use tab completion to look up appropriate values.

Moving to the Azure CLI 2.0 Preview

What does this mean to existing users of the XPlat CLI? We're glad you asked! Here are a few key answers to some questions we've anticipated:

You don't need to change anything: The XPlat CLI will continue to work and scripts will continue to function. We are continuing to support and add new features to the XPlat CLI.

You can install and use both CLIs side-by-side: Credentials and some defaults, such as default subscriptions, are shared between CLIs. This allows you to try out the CLI 2.0 Preview while leaving your existing Azure XPlat CLI installation untouched. 

No, ASM/Classic mode is not supported in the Azure CLI 2.0 Preview: We've designed around ARM primitives, such as resource groups and templates. ASM/Classic mode will continue to be supported by the XPlat CLI.

Yes, we'll help you along the way: While we can't convert scripts for you, we've created an online conversion guide, including a conversion table that maps commands between the CLIs.

Please note: credential sharing with the Azure XPlat CLI requires version 0.10.5 or later.

Interested in trying us out?

We're on GitHub, but we also publish on Docker: get the latest release by running "$ docker run -it azuresdk/azure-cli-python".

If you have any feedback, please type "az feedback" into the CLI and let us know!

Attending the Microsoft Ignite conference (September 26-30, 2016, Atlanta, GA)? Come visit us at the Azure Tools booth for a demo or attend our session titled:  Build cloud-ready apps that rock with open and flexible tools for Azure.

Frequently Asked Questions

What does this mean to existing users of the XPlat CLI?

The XPlat CLI will continue to work and scripts will continue to function. The two tools use different top-level commands (‘azure’ vs ‘az’), and you can use them together for specific scenarios. Credentials and some defaults (such as the default subscription) are shared between the CLIs, allowing you to try out the Azure CLI 2.0 Preview while leaving your existing CLI installation untouched. We are continuing to support and add new features to the XPlat CLI.

I have scripts that call the “azure” command – will those work with the new tool?

Existing scripts built against the Azure XPlat CLI (the "azure" command) will not work with the Azure CLI 2.0 Preview. While most commands have similar naming conventions, the structure of the input and output has changed. For most customers, this means updating scripts to remove 'workarounds' required by the Azure XPlat CLI, or relying on the co-existence of both tools.
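A first pass at such an update can be as simple as a lookup table in the spirit of the conversion guide. The sketch below is purely hypothetical; the mappings shown are plausible examples, not the official conversion table.

```python
# Hypothetical command-mapping table; entries are illustrative examples only.
CONVERSIONS = {
    "azure vm list": "az vm list",
    "azure group create": "az group create",
    "azure storage account list": "az storage account list",
}

def suggest(xplat_command: str) -> str:
    """Return the Azure CLI 2.0 equivalent of an XPlat CLI command, if known."""
    return CONVERSIONS.get(
        xplat_command,
        "no direct mapping; see the online conversion guide")

print(suggest("azure vm list"))  # -> az vm list
```

Even with such a table, the differing output structures mean each converted script still needs to be re-tested by hand.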

Are you going to discontinue the Azure XPlat CLI? When will you take Azure CLI 2.0 out of preview?

The current XPlat CLI will continue to be available and supported, as it is needed for all ASM/Classic based services. The new Azure CLI 2.0 will stay in preview for now as we collect early user feedback to drive improvements up until the final release (date TBD).

Is .NET Core and PowerShell support changing on this release?

Support for .NET Core and PowerShell is not changing with this release. They will continue to be available and fully supported. We feel that PowerShell and POSIX-based CLIs serve different sets of users and provide the best choices for automation and scripting scenarios from the command line. Both options are available on multiple platforms, both are now open source, and we are investing in both of them.
Source: Azure

ISO 22301 highlights Azure's unmatched business continuity & disaster recovery preparedness

Many of you have asked us how we plan and prepare in Azure, so that you can learn from our best practices and have the peace of mind that your applications and data are safe and available in Azure. Today we are pleased to announce that Microsoft Azure has achieved ISO 22301 certification. Microsoft is the first hyper-scale cloud provider to achieve this important certification, which ensures your Azure applications are backed by the highest standard for business continuity and disaster preparedness.

For years, we’ve heard from organizations about the importance of disaster preparedness and continuous improvement in their operations, to ensure their IT systems can survive, and be restored in, the aftermath of major incidents (such as natural disasters, power outages, or cyber-attacks). As of today, we are the only major cloud provider to prove our commitment to being fully prepared for all eventualities through this internationally recognized standard for business continuity, ISO 22301. Our ISO 22301 certification applies across both our Azure public and Azure Government clouds.

What does Azure achieving ISO 22301 provide? It gives you the assurance that you can trust Microsoft Azure with your mission-critical applications, backed by an extensive, independent third-party audit of all aspects of Azure’s business continuity. This includes the following:

how backups are validated
how recovery is tested
the competency/training of critical staff
the level of resources available
buy-in by senior management
how risks are assessed/mitigated
adherence to legal/regulatory requirements
the process for response to incidents
the process for learning from incidents

Being prepared for whatever happens is not easy, but it’s something that we in Azure take seriously. We test at all levels of our infrastructure to ensure that every day we are working to improve the cloud experience for you. We do tests as small as fault injections at the individual service layer, all the way up to entire region fail-over tests. We’ve been doing these tests for years and this work has helped us continuously improve the Azure infrastructure and services you use every day. ISO 22301 reviews and validates that we are selecting the right tests of our cloud services, that we’ve created programs to continuously run those tests, and that we implement improvements based on those test results.
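Fault injection at the service layer can be as simple as wrapping a dependency so that it fails on demand and verifying that the caller recovers. The sketch below is a generic illustration of that idea, not Azure's internal test tooling; all names are invented.

```python
import random

def flaky(func, failure_rate=0.3, rng=random.Random(42)):
    """Wrap a call so it sometimes raises, simulating an injected fault."""
    def wrapper(*args, **kwargs):
        if rng.random() < failure_rate:
            raise ConnectionError("injected fault")
        return func(*args, **kwargs)
    return wrapper

def call_with_retry(func, attempts=5):
    """Client-side recovery logic under test: retry on transient faults."""
    for _ in range(attempts):
        try:
            return func()
        except ConnectionError:
            continue
    raise RuntimeError("service unavailable after retries")

# Inject faults into a stand-in dependency and confirm the retry logic copes.
storage_read = flaky(lambda: "blob-bytes")
print(call_with_retry(storage_read))
```

Running such wrappers continuously, and fixing whatever they expose, is the small-scale analogue of the region fail-over tests described above.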

Achieving the ISO 22301 certification demonstrates the seriousness of our commitment to providing you the highest quality of service, and this rigorous third-party attestation is part of our promise to provide you the most robust infrastructure possible for deploying your applications in the cloud. To learn more about Microsoft Azure’s ISO 22301 certification and to download a copy of the certificate, please visit https://aka.ms/iso22301cert.
Source: Azure

Availability of H-series VMs in Microsoft Azure

We are excited to announce the availability of the new H-series virtual machines in Azure. With these new VM sizes we continue our mission to deliver great performance for HPC applications in Azure. The H-series VM sizes are an excellent fit for any compute-intensive workload. They are designed to deliver cutting-edge performance for complex engineering and scientific workloads such as computational fluid dynamics, crash simulations, seismic exploration, and weather forecasting simulations.

The new H-series sizes are initially available in the South Central US Azure region and will be rolled out across other regions in the near future. 

The H-series VMs will be available in six different sizes, all based on the Intel E5-2667 v3 3.2 GHz processor (with turbo up to 3.5 GHz), with DDR4 memory and SSD-based local storage. The new H-series VMs also feature a dedicated RDMA back-end network enabled by FDR InfiniBand, capable of delivering ultra-low latency. The RDMA network is dedicated to MPI (Message Passing Interface) traffic when running tightly coupled applications.

(Size suffixes: m denotes high memory, r denotes an RDMA-enabled network.)

We see a large number of enterprise customers embracing Microsoft Azure for their enterprise HPC workloads. Enterprise customers burst their HPC jobs to Azure for additional compute power, helping them solve complex design of experiments (DOE), optimizations, and other critical projects. One of our premier partners, Altair Engineering Inc., with its suite of enterprise computing products, is an excellent current example of integration between customers' on-premises environments and H-series VMs in Azure.

“We are excited about the introduction of the new non-hyper-threaded, compute- and network-optimized H-series VMs in Azure. We worked closely with the Microsoft team to test our solutions for performance and scaling on the H-series VMs. Based on the testing, we are confident not only to deliver high performance to our customers but also to provide deep integration with the Azure environment to enable HPC cloud environments.” – Sam Mahalingam, CTO, Altair Engineering Inc.

H-series VMs can deliver great performance across a variety of applications, helping businesses around the world reduce their product development cycles and bring products to market faster.

"We are pleased to see the launch of the new Azure high performance H series VMs with InfiniBand and Linux RDMA technology. This performance accelerates the pace of product design cycles using simulation and helps engineers discover better designs, faster." – Keith Foston, Product Manager, CD-adapco, a Siemens Business

H-series virtual machines provide on demand compute capacity for our customers that want to solve complex automotive crash simulation problems.  Through partners like d3View we can bring large scale computing capabilities to our customers when needed.

“We see a great need for the best-in-class compute power and capacity. With the introduction of Azure’s new H-series with E5-2667 processors and an RDMA InfiniBand network, running large-scale simulations across hundreds of cores will offer reduced turnaround time for simulation engineers and scientists. Multi-physics simulation software like LS-DYNA is designed to scale to thousands of processors, and with the H-series, we look forward to helping our customers evaluate designs quickly.” – Suri Bala, CEO, d3View

The low-latency RDMA network enabled by FDR InfiniBand in the H-series VMs, particularly in the H16r and H16mr sizes, makes them an ideal fit for delivering the scale and performance needed for very large CFD (computational fluid dynamics) simulations.

“The RDMA technology in the new H-series VMs is critical when running large scale-out jobs on the cloud. With clock frequencies flattening out over the last few years, RDMA technology enables jobs to scale out to a large number of nodes. Our testing of CFD codes with the TotalCAE Portal enabled us to achieve reduced runtimes at large core counts that would not be possible without RDMA technology in Azure.” – Rod Mach, CEO, TotalCAE

Non-RDMA-enabled H-series sizes can be deployed with the various Linux distribution and Windows Server OS images available in the Azure Marketplace. RDMA-enabled H-series VM sizes can be deployed using Windows Server 2008 R2, Windows Server 2012, Windows Server 2012 R2, CentOS-based 7.1 HPC, and SUSE Linux Enterprise Server 12 SP1 HPC images. For more information and a quick guide on how to get started, see the Linux and Windows documentation.
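As an illustrative sketch only (the resource group and VM names are placeholders, and the marketplace image URN should be verified against the current listing), creating an RDMA-capable H16r VM from the CentOS-based HPC image with the Azure CLI might look like this:

```shell
# Hypothetical sketch: deploy an H16r VM from a CentOS-based HPC
# marketplace image. All resource names are placeholders; verify the
# exact image URN with `az vm image list` before use.
az group create --name hpc-rg --location southcentralus

az vm create \
  --resource-group hpc-rg \
  --name hpc-node-01 \
  --size Standard_H16r \
  --image OpenLogic:CentOS-HPC:7.1:latest \
  --admin-username azureuser \
  --generate-ssh-keys
```

For tightly coupled MPI runs, the RDMA-enabled nodes would typically be deployed into the same availability set so that they share the InfiniBand back-end network.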

With this milestone in our HPC journey in the cloud, we’d like to re-emphasize our excitement about bringing world-class High Performance Computing infrastructure capabilities through the cloud to every engineer and scientist in the world.
Source: Azure