Azure introduces new capabilities for live video analytics

In June 2020, we announced the preview of the Live Video Analytics platform—a groundbreaking new set of capabilities in Azure Media Services that allows you to build workflows that capture and process video with real-time analytics from the intelligent edge to intelligent cloud. We continue to see customers across industries enthusiastically using Live Video Analytics on IoT Edge in preview, to drive positive outcomes for their organizations. Last week at Microsoft Ignite, we announced new features, partner integrations, and reference apps that unlock additional scenarios that include social distancing, factory floor safety, security perimeter monitoring, and more. The new product capabilities that enable these scenarios include:

Spatial Analysis in Computer Vision, part of Azure Cognitive Services: Enhanced video analytics that factor in the spatial relationships between people and movement in the physical domain.
Intel OpenVINO Model Server integration: Build complex, highly performant live video analytics solutions powered by OpenVINO toolkit, with optimized pre-trained models running on Intel CPUs (Atom, Core, Xeon), FPGAs, and VPUs.
NVIDIA DeepStream integration: Support for hardware accelerated hybrid video analytics apps that combine the power of NVIDIA GPUs with Azure services.
Arm64 support: Develop and deploy live video analytics solutions on low power, low footprint Linux Arm64 devices.
Azure IoT Central Custom Vision Template: Build rich custom vision applications in as little as a few minutes to a few hours with no coding required.
High frame rate inferencing with Cognitive Services Custom Vision integration: Demonstrated in a manufacturing industry reference app that supports six useful out-of-the-box scenarios for factory environments.

Making video AI easier to use

Given the wide array of available CPU architectures (x86-64, Arm, and more) and hardware acceleration options (Intel Movidius VPU, iGPU, FPGA, NVIDIA GPU), plus the dearth of data science professionals to build customized AI, putting together a traditional video analytics solution entails significant time, effort and complexity.

The announcements we’re making today further our mission of making video analytics more accessible and useful for everyone—with support for widely used chip architectures, including Intel, NVIDIA and Arm, integration with hardware optimized AI frameworks like NVIDIA DeepStream and Intel OpenVINO, closer integration with complementary technologies across Microsoft’s AI ecosystem—Computer Vision for Spatial Analysis and Cognitive Services Custom Vision, as well as an improved development experience via the Azure IoT Central Custom Vision template and a manufacturing floor reference application.

Live video analytics with Computer Vision for Spatial Analysis

The Spatial Analysis capability of Computer Vision, part of Azure Cognitive Services, can be used in conjunction with Live Video Analytics on IoT Edge to better understand the spatial relationships between people and movement in physical environments. We’ve added new operations that enable you to count people in a designated zone within the camera’s field of view, and to detect when a person crosses a designated line or area or when people violate a distance rule.
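At heart, these operations reduce to geometry over detected person positions. As a toy illustration of the distance-rule check (the threshold and coordinates below are arbitrary; the real service works on calibrated camera space):

```python
from itertools import combinations
from math import dist  # Python 3.8+

def distance_violations(people, min_distance: float):
    """Return pairs of person indices closer than min_distance.
    `people` is a list of (x, y) positions, e.g. bounding-box centroids."""
    return [(i, j) for (i, p), (j, q)
            in combinations(enumerate(people), 2)
            if dist(p, q) < min_distance]

positions = [(0.0, 0.0), (1.0, 0.0), (5.0, 5.0)]
print(distance_violations(positions, min_distance=2.0))  # [(0, 1)]
```

Zone counting and line crossing follow the same pattern: a point-in-polygon or segment-intersection test applied per frame to each detected person.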

The Live Video Analytics module will capture live video from real-time streaming protocol (RTSP) cameras and invoke the spatial analysis module for AI processing. These modules can be configured to enable video analysis and the recording of clips locally or to Azure Blob storage.
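This configuration is expressed as a pipeline topology: a JSON graph of sources, processors, and sinks. As a rough sketch (node types and property names below are illustrative approximations of the media graph schema, not an exact contract), a topology pairing an RTSP source with a spatial-analysis extension and a file sink might be assembled like this:

```python
import json

def build_topology(rtsp_url: str, extension_url: str) -> dict:
    """Assemble a minimal Live Video Analytics-style media graph:
    RTSP source -> HTTP extension (spatial analysis) -> file sink.
    Names and @type values are illustrative, not the exact schema."""
    return {
        "name": "SpatialAnalysisTopology",
        "properties": {
            "sources": [{
                "@type": "#Microsoft.Media.MediaGraphRtspSource",
                "name": "rtspSource",
                "endpoint": {"url": rtsp_url},
            }],
            "processors": [{
                "@type": "#Microsoft.Media.MediaGraphHttpExtension",
                "name": "spatialAnalysis",
                "inputs": [{"nodeName": "rtspSource"}],
                "endpoint": {"url": extension_url},
            }],
            "sinks": [{
                "@type": "#Microsoft.Media.MediaGraphFileSink",
                "name": "fileSink",
                "inputs": [{"nodeName": "spatialAnalysis"}],
                "filePathPattern": "/var/media/clip-${System.DateTime}",
            }],
        },
    }

topology = build_topology("rtsp://camera:554/stream",
                          "http://spatial-analysis:5000/score")
print(json.dumps(topology, indent=2)[:80])
```

In practice the topology is sent to the edge module via an IoT Hub direct method; consult the product documentation for the exact schema.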

Deploying the Live Video Analytics and the Spatial Analysis modules on edge devices is made easier by Azure IoT Hub. Our recommended edge device is Azure Stack Edge with the NVIDIA T4 Tensor Core GPU. You can learn more about how to analyze live video with Computer Vision for Spatial Analysis in our documentation.

Live Video Analytics with Intel’s OpenVINO Model Server

You can pair the Live Video Analytics on IoT Edge module with the OpenVINO Model Server (OVMS) – AI Extension from Intel to build complex, highly performant live video analytics solutions. OVMS is an inference server powered by the OpenVINO toolkit that’s highly optimized for computer vision workloads running on Intel hardware. As an extension, HTTP support and samples have been added to OVMS to facilitate the easy exchange of video frames and inference results between the inference server and the Live Video Analytics module, empowering you to run any object detection, classification, or segmentation model supported by the OpenVINO toolkit.

You can customize the inference server module to use any optimized pre-trained model in the Open Model Zoo repository, and select from a wide variety of acceleration mechanisms supported by Intel hardware without having to change your application, including CPUs (Atom, Core, Xeon), field programmable gate arrays (FPGAs), and vision processing units (VPUs), choosing whichever best suits your use case. In addition, you can select from a wide variety of use case-specific Intel-based solutions, such as Developer Kits or Market Ready Solutions, and incorporate them into the easily pluggable Live Video Analytics platform for scale.
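On the client side, acting on the inference results returned over HTTP typically means filtering detected entities by confidence. A minimal sketch (the response shape below is an assumption modeled loosely on the Live Video Analytics inference metadata schema, simplified for illustration):

```python
def filter_detections(response: dict, min_confidence: float = 0.6):
    """Keep only entity inferences at or above a confidence threshold.
    The 'inferences'/'entity' layout is a simplified stand-in for the
    inference metadata the extension returns."""
    hits = []
    for inference in response.get("inferences", []):
        if inference.get("type") != "entity":
            continue
        tag = inference["entity"]["tag"]
        if tag["confidence"] >= min_confidence:
            hits.append((tag["value"], tag["confidence"]))
    return hits

sample = {
    "inferences": [
        {"type": "entity", "entity": {"tag": {"value": "vehicle", "confidence": 0.92}}},
        {"type": "entity", "entity": {"tag": {"value": "person", "confidence": 0.31}}},
    ]
}
print(filter_detections(sample))  # [('vehicle', 0.92)]
```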

“We are delighted to unleash the power of AI at the edge by extending OpenVINO Model Server for Azure Live Video Analytics. This extension will simplify the process of developing complex video solutions through a modular analytics platform. Developers are empowered to quickly build their edge to cloud applications once and deploy to Intel’s broad range of compute and AI accelerator platforms through our rich ecosystems.”—Adam Burns, VP, Edge AI Developer Tools, Internet of Things Group, Intel




Live Video Analytics with NVIDIA’s DeepStream SDK

Live Video Analytics and NVIDIA DeepStream SDK can be used to build hardware-accelerated AI video analytics apps that combine the power of NVIDIA graphics processing units (GPUs) with Azure cloud services, such as Azure Media Services, Azure Storage, Azure IoT, and more. You can build sophisticated real-time apps that can scale across thousands of locations and can manage the video workflows on the edge devices at those locations via the cloud. You can explore some related samples on GitHub.

You can use Live Video Analytics to build video workflows that span the edge and cloud, and then use the DeepStream SDK to build pipelines that extract insights from video using the AI models of your choice.

The diagram above illustrates how you can record video clips triggered by AI events to Azure Media Services in the cloud. The samples are a testament to the robust design and openness of both platforms.
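Event-triggered recording of this kind is usually implemented with a signal-gate pattern: video flows to the sink only while a recent AI event holds the gate open. A small, self-contained sketch of that timing logic (the hold window below is illustrative):

```python
class SignalGate:
    """Open on an AI event; stay open for `hold_seconds`, then close.
    Mirrors the signal-gate pattern used to record clips around events."""
    def __init__(self, hold_seconds: float = 30.0):
        self.hold_seconds = hold_seconds
        self._open_until = float("-inf")

    def on_event(self, timestamp: float) -> None:
        # Each qualifying inference event extends the recording window.
        self._open_until = timestamp + self.hold_seconds

    def is_open(self, timestamp: float) -> bool:
        return timestamp <= self._open_until

gate = SignalGate(hold_seconds=30)
gate.on_event(100.0)
print(gate.is_open(120.0), gate.is_open(131.0))  # True False
```

In a real pipeline the timestamps come from the media stream, and opening the gate routes frames to an asset sink in Azure Media Services.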

“The powerful combination of NVIDIA DeepStream SDK and Live Video Analytics powered by the NVIDIA computing stack helps accelerate the development and deployment of world-class video analytics. Our partnership with Microsoft will advance adoption of AI-enabled video analytics from edge to cloud across all industries and use cases.”—Deepu Talla, Vice President and General Manager of Edge Computing, NVIDIA



Live Video Analytics now runs on Arm

You can now run Live Video Analytics on IoT Edge on Linux Arm64v8 devices, enabling you to use low power-consumption, low-footprint devices such as the NVIDIA® Jetson™ series.

Develop solutions rapidly using the IoT Central video analytics template

The new IoT Central video analytics template simplifies the setup of an Azure IoT Edge device to act as a gateway between cameras and Azure cloud services. It integrates the Live Video Analytics video inferencing pipeline and the OpenVINO Model Server, an AI inference server from Intel, enabling customers to build a fully working end-to-end solution in a couple of hours with no code. It’s fully integrated with the Azure Media Services pipeline to capture, record, and play back analyzed videos from the cloud.

The template installs IoT Edge modules such as an IoT Central gateway, Live Video Analytics on IoT Edge, the Intel OpenVINO Model Server, and an ONVIF module on your edge devices. These modules help the IoT Central application configure and manage the devices, ingest live video streams from the cameras, and easily apply AI models such as vehicle or person detection. Simultaneously in the cloud, Azure Media Services and Azure Storage record and stream relevant portions of the live video feed. Refer to our IoT Show episode and related blog post for a full overview and guidance on how to get started.

Integration of Cognitive Services Custom Vision models in Live Video Analytics

Many organizations already have a large number of cameras deployed to capture video data but are not conducting any meaningful analysis on the streams. With the advent of Live Video Analytics, applying even basic image classification and object detection algorithms to live video feeds can help unlock truly useful insights and make businesses safer, more secure, more efficient, and ultimately more profitable. Potential scenarios include:

Detecting if employees in an industrial/manufacturing plant are wearing hard hats to ensure their safety and compliance with local regulations.
Counting products or detecting defective products on a conveyor belt.
Detecting the presence of unwanted objects (people, vehicles, and more) on-premises and notifying security.
Detecting low-stock and out-of-stock products on retail store shelves or on factory parts shelves.
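The hard-hat scenario above, for instance, reduces to cross-checking two sets of detections per frame. A toy sketch, assuming a hypothetical model output of labeled boxes as (label, x, y, w, h) tuples:

```python
def box_contains(box, point):
    """True if point (px, py) lies inside box (x, y, w, h)."""
    x, y, w, h = box
    px, py = point
    return x <= px <= x + w and y <= py <= y + h

def unprotected_workers(detections):
    """Flag 'person' boxes whose top-center has no overlapping 'hardhat' box.
    The detection format is a simplification for illustration."""
    persons = [d[1:] for d in detections if d[0] == "person"]
    hats = [d[1:] for d in detections if d[0] == "hardhat"]
    flagged = []
    for (x, y, w, h) in persons:
        head = (x + w / 2, y)  # top-center of the person box
        if not any(box_contains(hat, head) for hat in hats):
            flagged.append((x, y, w, h))
    return flagged

frame = [("person", 10, 10, 20, 60), ("hardhat", 15, 5, 12, 10),
         ("person", 50, 10, 20, 60)]
print(unprotected_workers(frame))  # [(50, 10, 20, 60)]
```

A production solution would of course use the model's actual output schema and calibrated association logic, but the business rule itself stays this simple.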

Developing AI models from scratch to perform tasks like these and deploying them at scale to work on live video streams on the edge entails a non-trivial amount of work. Doing it in a scalable and reliable way is even harder and more expensive. The integration of Live Video Analytics on IoT Edge with Cognitive Services Custom Vision makes it possible to implement working solutions for all of these scenarios in a matter of minutes to a few hours.

You begin by building and training a computer vision model, uploading pre-labeled images to the Custom Vision service. This doesn’t require any prior knowledge of data science, machine learning, or AI. You can then use Live Video Analytics to deploy the trained custom model as a container on the edge and analyze multiple camera streams in a cost-effective manner.

Live Video Analytics powered manufacturing floor reference app

We have partnered with the Azure Stack team to evolve the Factory.AI solution, a turn-key application that makes it easy to train and deploy vision models without the need for data science knowledge. The solution includes capabilities for object counting, employee safety, defect detection, machine misalignment, tool detection, and part confirmation. All these scenarios are powered by the integration of Live Video Analytics running on Azure Stack Edge devices.

In addition, the Factory.AI solution allows customers to train and deploy their own custom ONNX models using the Custom Vision SDK. Once a custom model is deployed on the edge, the reference app uses the gRPC interface of Live Video Analytics for accurate, high-frame-rate inferencing. You can learn more about the manufacturing reference app at Microsoft Ignite or by visiting the Azure intelligent edge patterns page.

Get started today

In closing, we’d like to thank everyone who is already participating in the Live Video Analytics on IoT Edge preview. We appreciate your ongoing feedback to our engineering team as we work together to fuel your success with video analytics both in the cloud and on the edge. For those of you who are new to our technology, we’d encourage you to get started today with these helpful resources:

Watch the Live Video Analytics introduction video.
Find more information on the product details page.
Watch the Live Video Analytics demo.
Try the new Live Video Analytics features today with an Azure free trial account.
Register on the Media Services Tech Community and hear directly from the Engineering team on upcoming new features, to provide feedback and discuss roadmap requests.
Download Live Video Analytics on IoT Edge from the Azure Marketplace.
Get started quickly with our C# and Python code samples.
Review our product documentation.
Search the GitHub repo for Live Video Analytics open source projects.
Contact for questions.

Intel, the Intel logo, Atom, Core, Xeon, and OpenVINO are registered trademarks of Intel Corporation or its subsidiaries.

NVIDIA and the NVIDIA logo are registered trademarks or trademarks of NVIDIA Corporation in the U.S. and/or other countries. Other company and product names may be trademarks of the respective companies with which they are associated.
Source: Azure

Microsoft named a leader in Gartner’s Magic Quadrant for Enterprise Integration Platform as a Service

We are excited to share that Gartner has positioned Microsoft as a leader in the 2020 Enterprise Integration Platform as a Service (EiPaaS) Magic Quadrant, based on our ability to execute and completeness of vision. This marks the third consecutive year that Microsoft has been named a leader in the EiPaaS Magic Quadrant.

According to Gartner, “Enterprise iPaaS providers continue to broaden their go-to-market strategies to cover an increasing range of enterprise integration scenarios.” Our vision is to help customers enable integration in all areas of their operations, from traditional central IT to business-led activities. Azure Integration Services (AIS), comprising API Management, Logic Apps, Service Bus, and Event Grid, helps customers connect applications and data seamlessly to the cloud for services such as machine learning, cognitive services, and analytics, enabling increased enterprise-wide visibility and agility.

Best-in-class integration capabilities and platform

As applications and data are becoming more connected than ever, integration has become a key part of building applications. Azure Integration Services provides a comprehensive and cohesive set of integration capabilities spanning applications, data, and AI, with over 370 connectors and UI automation with Robotic Process Automation (RPA), for customers to connect everything quickly and easily. We provide these capabilities through high productivity and low code serverless integration and automation across Azure, edge, on-premises, and multi-cloud.

Global availability and customer momentum

Microsoft Azure has a global presence with more than 60 regions and a large, active partner community around the globe. Azure Integration Services has over 40,000 customers across the globe, with more than 60 percent of Fortune 500 companies using AIS for their business integration needs. Learn how customers such as ABB, RXR Realty, Rockefeller Capital Management, and Flexigroup use Azure Integration Services, including Logic Apps, API Management, and Service Bus, to connect their business applications, data, and processes seamlessly. We look forward to continuing to partner with customers on their integration journeys.

Next steps

Read the full Gartner report here. Visit our website to learn more about Azure Integration Services.
Catch up on the latest Logic Apps product announcements at Microsoft Ignite. 

This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available upon request.

Gartner does not endorse any vendor, product, or service depicted in its research publications and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
Source: Azure

General availability of Azure Maps support for Azure Active Directory, Power Apps integration, and more

This blog post was co-authored by Chad Raynor, Principal Program Manager, Azure Maps.

New and recent updates for Microsoft Azure Maps include support for Azure Maps integration with Azure Active Directory (generally available), integration with the Microsoft Power Apps platform, Search and Routing services enhancements, new Weather services REST APIs (in preview), and expanded coverage for Mobility services.

Read on to learn more about the latest features and integrations for Azure Maps:

Azure Maps integration with Azure AD support now generally available

Azure Maps integration with Azure Active Directory (Azure AD) is now generally available, allowing customers to rely on Azure Maps and Azure AD for enterprise-level security. This update includes support for additional role-based access control (RBAC) roles for fine-grained access control. Azure Maps supports access for all Azure RBAC principal types, including individual Azure AD users, groups, applications, Azure resources, and Azure managed identities. Additionally, Azure Maps documentation now includes expanded implementation examples and estimates of implementation effort to help you choose the right implementation method for your scenario.

Geospatial features in Power Apps powered by Azure Maps now in preview

Microsoft Power Apps announced the preview of geospatial capabilities powered by Azure Maps. This includes both an interactive map component and an address suggestion component. Power Apps is a suite of apps, services, and connectors, along with a data platform, that provides a rapid app development environment to build custom apps for your business needs.

You don’t need to be a professional developer to take advantage of the new geospatial components: you can add them with the ease of drag-and-drop, low-code development. As an example, you could build an app for your field service workers to report repair needs, including the location of the work and detailed pictures.

The preview includes the following:

Interactive vector tile maps supporting multiple base map styles including satellite imagery.
Address search with dynamic address suggestions as you type and geocoding. The component suggests multiple potential address matches that the user can select and returns the address as structured data. This allows your app to extract information like city, street, municipality—and even latitude and longitude—in a format friendly to many locales and international address formats.

Figure 1. Azure Maps interactive vector tiles in Power Apps.

Search and Route services enhancements

Search business locations by brand name

Through Azure Maps, Search services customers have access to almost 100 million point of interest (POI) locations globally. To help our customers restrict POI search results to specific brands, we introduced the brandSet parameter, which is supported by all Search APIs covering POI search capabilities.

As an example, suppose the users of your application search for restaurants by typing ‘restaurant’ or selecting the ‘restaurant’ category, and you want to show them all the restaurants under your brand umbrella. In your API request, you can pass one or more brand names, and the results will be filtered accordingly.
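The brand filter is passed like any other query parameter. A sketch of the request shape (the brandSet and query parameters come from the text above; the exact endpoint path, API version, and key are placeholder assumptions, so check the API reference before use):

```python
from urllib.parse import urlencode

def build_poi_search_url(query: str, brands: list[str], subscription_key: str) -> str:
    """Compose an Azure Maps POI search request restricted to given brands.
    brandSet takes a comma-separated list of brand names."""
    base = "https://atlas.microsoft.com/search/poi/json"  # assumed endpoint
    params = {
        "api-version": "1.0",
        "query": query,
        "brandSet": ",".join(brands),
        "subscription-key": subscription_key,
    }
    return f"{base}?{urlencode(params)}"

url = build_poi_search_url("restaurant", ["Contoso Grill"], "<your-key>")
print(url)
```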

POI Category Tree API in preview

Azure Maps POI Search APIs support almost 600 categories and subcategories. To make it easier for our customers to understand and leverage this information to optimize their queries, we have added the POI Category Tree API to our Search services portfolio and introduced the categorySet parameter, which is supported by all Search APIs covering POI search capabilities. The POI Category Tree API provides a full list of supported POI categories and subcategories, along with an ID for each category.

Customers can use the POI Category Tree API along with the Search APIs to query for POIs mapped to a specific category ID. Using category IDs instead of description strings increases the accuracy of POI search results.

Request reachable range by travel distance

Azure Maps offers a full suite of vehicle routing capabilities, for example, to support fleet and logistics scenarios. The Azure Maps Get Route Range API, also known as the Isochrone API, allows customers to determine the area a user can reach from a single point based on time, fuel, or energy. The API also allows customers to request the reachable range based on travel distance, returning the polygon boundary (isochrone). The returned isochrone is useful for visualization and can also be used as a filter for spatial queries, which opens up a wide variety of applications for spatial filtering.

For example, when there is a traffic incident at a given point, you need to know which electronic traffic signs should be updated to alert drivers of the incident. To figure out which signs to update, you want to know which signs drivers would pass within 5 kilometers of driving to the incident. You can use the Route Range API to calculate the reachable range and then use the returned polygon to check which signs reside in that polygon. See our code sample to test route range features in action.
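Checking which signs fall inside the returned isochrone is a point-in-polygon test. A minimal ray-casting sketch (the square isochrone and sign coordinates below are made up; a real polygon comes from the Route Range API response):

```python
def point_in_polygon(point, polygon):
    """Ray-casting test: count how many polygon edges a horizontal
    ray from the point crosses; an odd count means 'inside'."""
    px, py = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > py) != (y2 > py):  # edge spans the ray's latitude
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

isochrone = [(0, 0), (10, 0), (10, 10), (0, 10)]  # placeholder polygon
signs = {"A": (5, 5), "B": (12, 3)}
reachable = [name for name, pos in signs.items() if point_in_polygon(pos, isochrone)]
print(reachable)  # ['A']
```

For production use on geographic coordinates, a geospatial library with proper handling of edge cases and projections is the safer choice; the sketch just shows the shape of the filter.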

Figure 2. Map visualizing reachable range based on travel distance.

Search Electric Vehicle (EV) charging stations by connector type

Azure Maps Route services today support multiple routing-related customer scenarios, covering routing for private, electric, and commercial vehicles such as trucks. To make it possible for our customers to build advanced features into their electric vehicle (EV) applications, they can now restrict results to EV charging stations that support specific connector types, making it easy to find the most suitable stations. Consider an application that lets users determine whether there are suitable charging stations along their planned routes, or a business fleet owner who wants to know whether there are enough charging stations of a specific type for employees to charge their vehicles while delivering goods to customers. You can now accomplish this through all Azure Maps Search service APIs with POI search capability, such as the Search Fuzzy and Search Nearby APIs.

Mobility services updates

Expanded coverage

We have expanded our Mobility services coverage globally, today covering almost 3,200 cities, to offer best-in-class public transit data. To highlight some improvements, we now support countries and regions such as Liechtenstein and metro areas such as Hyderabad (India), Côte d’Azur (France), Dakar (Senegal), Rio Verde (Brazil), and San Francisco-San Jose (CA, US). You can find all the supported metro areas on our full Mobility services coverage page. A metro area can be a country/region, a city, or a group of multiple cities.

MetroID is no longer a required parameter for multiple Mobility APIs. To make it easier for our customers to request transit data, we have made the metroID request parameter optional for the following Mobility services APIs:

Get Nearby Transit
Get Transit Stop Info
Get Transit Route
Get Real-time Arrivals
Get Transit Line Info

As a result, Azure Maps customers no longer need to first obtain a metro ID by calling the Get Metro ID API before calling the public transit route directions API.

Weather services updates

Severe Weather Alerts

Severe weather phenomena can significantly impact our everyday life and business operations. For example, severe weather conditions such as tropical storms, high winds, or flooding can close roads and force logistics companies to reroute their fleets, causing delays in reaching destinations and breaking the cold chain of refrigerated food products. The Azure Maps Severe Weather Alerts API returns severe weather alerts that are available worldwide from both official government meteorological agencies and leading global and regional weather alert providers.

The service can return details such as alert type, category, level, and detailed descriptions about the active severe alerts for the requested location, such as hurricanes, thunderstorms, lightning, heat waves or forest fires. As an example, logistics managers can visualize severe weather conditions on a map along with business locations and planned routes, and further coordinate with drivers and local workers.

Figure 3. Active Severe Weather Alerts on a map.

Azure Maps Indices API

There may be times when you want to know if weather conditions are optimal for a specific activity, such as outdoor construction, indoor activities, running, or farming (including soil moisture information). The Azure Maps Indices API returns index values that help users plan future activities. For example, a health mobile application can notify users that today’s weather is good for running or other outdoor activities such as playing golf, and retail stores can optimize their digital marketing campaigns based on predicted index values. The service returns daily index values for the current day and the next 5, 10, and 15 days.

Request past and future radar and satellite tiles

In addition to real-time radar and satellite tiles, Azure Maps customers can now request past and future tiles to enhance data visualizations with map overlays, by calling the Get Map Tile v2 API directly or by requesting tiles via the Azure Maps web and mobile SDKs. Radar tiles are provided up to 1.5 hours in the past and up to 2 hours in the future, in 5-minute intervals. Infrared tiles are provided up to 3 hours in the past, in 10-minute intervals.
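Given those windows, a client building a radar animation loop just enumerates the valid timestamps. A sketch of that enumeration (the interval and window sizes are taken from the text above; the API's actual timestamp format is not shown here):

```python
from datetime import datetime, timedelta

def radar_timestamps(now: datetime) -> list[datetime]:
    """Enumerate radar tile timestamps: 1.5 h past to 2 h future,
    in 5-minute steps, aligned down to the nearest 5-minute boundary."""
    step = timedelta(minutes=5)
    aligned = now - timedelta(minutes=now.minute % 5,
                              seconds=now.second,
                              microseconds=now.microsecond)
    start = aligned - timedelta(minutes=90)   # 1.5 hours in the past
    end = aligned + timedelta(minutes=120)    # 2 hours in the future
    stamps = []
    t = start
    while t <= end:
        stamps.append(t)
        t += step
    return stamps

stamps = radar_timestamps(datetime(2020, 10, 1, 12, 3))
print(len(stamps), stamps[0], stamps[-1])
```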

Figure 4. Historical radar tiles visualized on a map.

We want to hear from you

We are always working to grow and improve the Azure Maps platform and want to hear from you. We’re here to help and want to make sure you get the most out of the Azure Maps platform.

Have a feature request? Add it or vote up the request on our feedback site.
Having an issue getting your code to work? Have a topic you would like us to cover on the Azure blog? Ask us on the Azure Maps forums.
Looking for code samples? There’s a plethora of them on our Azure Maps Code Sample site. Wrote a great one you want to share? Join us on GitHub.
To learn more, read the Azure Maps documentation.

Source: Azure

Scaling Microsoft Kaizala on Azure

This post was co-authored by Anubhav Mehendru, Group Engineering Manager, Kaizala.

Mobile-only workers depend on Microsoft Kaizala—a simple and secure work management and mobile messaging app—to get work done. Since COVID-19 has forced many of us around the world to work from home, Kaizala usage has surged to close to 3x pre-COVID-19 levels. While this is a good opportunity for the product to grow, it has increased pressure on the engineering team to ensure that the service scales along with the increased usage while maintaining the customer-promised SLA of 99.99 percent.

Today, we’re sharing some of our learnings about managing and scaling an enterprise-grade secure productivity app and the backend service behind it.

Foundation of Kaizala

Kaizala is a productivity tool primarily targeted at mobile-only users and is based on a microservices architecture with Microsoft Azure as the core cloud platform. Our workload runs on Azure Cloud Services, with Azure SQL Database and Azure Blob storage used for primary storage. We use Azure Cache for Redis to handle caching, while Azure Service Bus and Azure Notification Hubs manage async processing of events. Azure Active Directory (Azure AD) is used for user authentication. We use Azure Data Explorer and Azure Monitor for data analytics. Azure Pipelines is used for automated safe deployments, allowing us to roll out updates multiple times a week with high confidence.

We follow a safe deployment process, ensuring minimal customer impact, with stage-wise release of new features and optimizations, full control over exposure, and the ability to roll back.

In addition, we use a centralized configuration management system through which all our configuration can be controlled, such as exposure of a new feature to a set of users, groups, or tenants. We have fine-grained control over message processing rate, receipt processing, user classification, priority processing, slowing down non-core functionalities, and more. This allows us to rapidly prototype new features and optimizations over a user set.

Key resiliency strategies

We employ the following key resilience strategies:

API rate limit

To protect our service from misuse, we need to keep incoming calls from multiple clients within a safe limit. We incorporated a rate limiter based entirely on in-memory caching that does the work with negligible latency impact on customer operations.
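An in-memory rate limiter of this kind can be as simple as a token bucket per client, refilled continuously with time. A sketch under illustrative limits (this is the general pattern, not Kaizala's actual implementation):

```python
class TokenBucket:
    """In-memory per-client rate limiter: refills `rate` tokens/second,
    allows bursts up to `capacity`. No external store is involved,
    so the latency cost per check is negligible."""
    def __init__(self, rate: float, capacity: float, now: float = 0.0):
        self.rate, self.capacity = rate, capacity
        self.tokens, self.last = capacity, now

    def allow(self, now: float) -> bool:
        # Refill proportionally to elapsed time, then try to spend one token.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(rate=5, capacity=10)  # 5 req/s, burst of 10
results = [bucket.allow(now=i * 0.01) for i in range(12)]
print(results.count(True))  # 10 allowed from the initial burst
# in production, pass a real clock, e.g. bucket.allow(time.monotonic())
```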

Optimized caching

To provide an optimal user experience, we created a generic in-memory caching infrastructure in which multiple compute nodes can quickly sync state changes using Azure Cache for Redis pub/sub. This avoided a significant number of external API calls, which effectively reduced our SQL load.

Prioritize critical operations

If the service is overloaded due to heavy customer traffic, we prioritize critical customer operations such as messaging over non-core operations such as receipts.

Isolation of core components

Core system components that support messaging are now fully isolated from non-core parts so that overload does not impact core messaging operations. The isolation is applied at every resource level: separate compute nodes, a separate service bus for event processing, and completely separate storage for non-core operations.

Reduction in intra node communication

We made multiple enhancements to our message processing system, significantly reducing scenarios of intra-node communication that created heavy dependencies between nodes and slowed down message processing.

Controlled service rollout

We made several changes to our rollout process to ensure controlled rollout of new features and optimizations and minimize any negative customer impact. Deployments moved to early-morning slots, when customer load is minimal, to prevent downtime.

Monitoring and telemetry

We set up dedicated monitoring dashboards that give a quick overview of service health by tracking important parameters such as CPU consumption, thread count, garbage collection (GC) rate, rate of incoming messages, unprocessed messages, lock contention rate, and connected clients.

GC rate

We have fine-tuned the options controlling the rate of Gen2 GC in a cloud service to match the needs of the web and worker instances, ensuring minimal GC latency impact during customer operations.

Node partitioning

Users are partitioned across multiple nodes to distribute ownership responsibility using a consistent hashing mechanism. This master ownership helps ensure that only the required users' information is stored in the in-memory cache of a particular node.
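Consistent hashing keeps most user-to-node assignments stable when nodes join or leave. A compact hash-ring sketch of the idea (the virtual-node count and node names are illustrative, not Kaizala's actual implementation):

```python
import bisect
import hashlib

class ConsistentHashRing:
    """Map user IDs to owning nodes via a hash ring with virtual nodes,
    so adding or removing a node only remaps a small fraction of users."""
    def __init__(self, nodes, vnodes: int = 100):
        self._ring = sorted(
            (self._hash(f"{node}:{v}"), node)
            for node in nodes for v in range(vnodes)
        )
        self._keys = [h for h, _ in self._ring]

    @staticmethod
    def _hash(key: str) -> int:
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def owner(self, user_id: str) -> str:
        # The first ring position clockwise from the user's hash owns the user.
        i = bisect.bisect(self._keys, self._hash(user_id)) % len(self._keys)
        return self._ring[i][1]

ring = ConsistentHashRing(["node-a", "node-b", "node-c"])
print(ring.owner("user-42"))  # deterministic: same node on every call
```

Because only the keys hashing between a removed node's ring positions and their successors change owners, cache warm-up after a topology change stays bounded.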

Active passive user

In large-group messaging operations, some users are actively using the app while many others are not. We prioritize message delivery for active users so that even the last active user in the delivery order receives the message quickly.

Serialization optimization

Default JSON serialization is costly when input/output operations are very frequent, burning precious CPU cycles. ProtoBuf offers a fast binary serialization protocol, which we leveraged to optimize operations on large data structures.
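The size (and parsing-cost) gap between text and binary encodings is easy to see. Since Protocol Buffers requires generated message classes, this sketch uses the stdlib struct module as a stand-in for a fixed binary schema (the record layout is a made-up example):

```python
import json
import struct

# A receipt-like record: (user_id, message_id, timestamp_seconds).
record = (12345, 987654321, 1_600_000_000)

# Text encoding: self-describing but verbose and CPU-heavy to parse.
as_json = json.dumps({"userId": record[0], "messageId": record[1],
                      "ts": record[2]}).encode()

# Binary encoding with a fixed schema (three unsigned 64-bit ints),
# standing in for a ProtoBuf message definition.
as_binary = struct.pack("<QQQ", *record)

print(len(as_json), len(as_binary))  # the binary form is far smaller
assert struct.unpack("<QQQ", as_binary) == record  # round-trips losslessly
```

ProtoBuf adds varint compression, optional fields, and schema evolution on top of this basic idea, while keeping the decode path far cheaper than JSON parsing.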

Scaling compute

We re-evaluated our compute usage across our multiple internal test and scale environments and judiciously reduced the compute node SKU to match our needs and optimize cost of goods sold (COGS). While most of our traffic in an Azure region occurs during the day, load is minimal at night, which we leverage for heavy tasks such as redundant data cleanup, cache cleanup, GC, database re-indexing, and compliance jobs.

Scaling storage

With increasing scale, the load of receipts on the backend service became huge and consumed a lot of storage. While critical operations require highly consistent data, the requirement is relaxed for non-critical operations. We moved receipts to highly available NoSQL storage, which costs a tenth of the SQL storage.

Queries for background operations were spread out lazily to reduce the overall peak load on SQL storage. Certain non-critical operations were moved from a strongly consistent to an eventually consistent model to flatten the peak storage load, creating more capacity for additional users.
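Spreading background queries lazily can be sketched as giving each job a random start offset within an off-peak window, so they never hit storage at the same instant. This is a hypothetical Python sketch; the job names and window size are illustrative:

```python
import random

def schedule_with_jitter(jobs, window_seconds=3600, seed=None):
    """Assign each background job a random start offset inside the window,
    returning (offset, job) pairs sorted by start time. The jitter spreads
    query load instead of letting all jobs fire simultaneously."""
    rng = random.Random(seed)  # seedable for reproducible tests
    return sorted((rng.uniform(0, window_seconds), job) for job in jobs)

plan = schedule_with_jitter(["receipt-cleanup", "reindex", "compliance"],
                            window_seconds=3600, seed=7)
for offset, job in plan:
    pass  # in production, each job would be enqueued to run at its offset
```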

Our future plans

As the COVID-19 situation continues to be grave, we expect an accelerated pace of Kaizala adoption from multiple customers. To keep up with the increase in messaging load and high customer usage, we are working on new enhancements and optimizations to stay ahead of the curve, including:

Developing alternative messaging flows where users actively using the app can directly pull group messages even if the backend system is overloaded. Message delivery is prioritized for active users over passive users.
Aggressively working on distributed in-memory caching of data entities to enable fast user response and alternative designs to keep cache in sync while minimizing stale data.
Moving to container-based deployment model from the current virtual machine (VM)-based model to bring more agility and reduce operational cost.
Exploring alternative storage mechanisms that scale well with massive write operations for large consumer groups, supporting batched data flushes over a single connection.
Actively exploring ideas around active-active service configuration to minimize downtime due to data center outages and minimize Recovery Time Objective (RTO) and Recovery Point Objective (RPO).
Exploring ideas around moving some of the non-core functionalities to passive scale units to utilize the standby compute/storage resources there.
Evaluating the dynamic scaling abilities of Azure Cloud Services, where we can automatically reduce the number of compute nodes during nighttime hours when our user load is less than a fifth of the peak.

Source: Azure

Azure Cost Management + Billing updates – September 2020

Whether you're a new student, thriving startup, or the largest enterprise, you have financial constraints, and you need to know what you're spending, where, and how to plan for the future. Nobody wants a surprise when it comes to the bill, and this is where Azure Cost Management + Billing comes in.

We're always looking for ways to learn more about your challenges and how Azure Cost Management + Billing can help you better understand where you're accruing costs in the cloud, identify and prevent bad spending patterns, and optimize costs to empower you to do more with less. Here are a few of the latest improvements and updates based on your feedback:

Simplify financial reporting with cost allocation, now in preview.
Connector for AWS is now generally available.
Get pay-as-you-go rates for all Azure products and services.
What's new in Cost Management Labs.
Expanded availability of resource tags in cost reporting.
15 ways to optimize your Azure costs.
New ways to save money with Azure.
Upcoming changes to Azure usage data.
Documentation updates.

Let's dig into the details.


Simplify financial reporting with cost allocation, now in preview

Managing cloud costs can be challenging, especially if your organization needs to break down costs for an internal chargeback. You might have separate business units, or you might need to facilitate external billing for distinct customer solutions. This becomes even more difficult when you employ shared services to reduce costs, since there may not be a clear way to break those shared services down by business unit or customer. This is where the Azure Cost Management + Billing cost allocation preview for Enterprise Agreement (EA) and Microsoft Customer Agreement (MCA) accounts comes in.

Cost allocation is the process of breaking down and distributing shared costs. A couple examples of this are networking infrastructure, shared by multiple virtual machines, or shared databases or storage accounts, used by different teams or customers. If you're sharing services across business units or customers, take a look at how cost allocation can help you drive more accountability and streamline cost reporting within your organization.

Learn more about the cost allocation preview and let us know what you'd like to see next.


Connector for AWS is now generally available

In 2019, we announced the preview of the connector for AWS in Azure Cost Management + Billing, which allows you to view and manage your Azure and AWS costs from a single pane of glass in the Azure portal. Support for AWS is now generally available. This connector simplifies handling different cost models and numerous billing cycles so you can visualize your costs across clouds and always stay up to date.

For more details and a walkthrough for how to get started, see the Connector for AWS general availability announcement.


Get pay-as-you-go rates for all Azure products and services

Have you ever needed access to pay-as-you-go (PAYG) prices for different Azure services, but didn't know where to start? Maybe you're looking to optimize costs with the cheapest SKU or location combination or generating a savings report to show how much you've saved with discounted rates. Perhaps you want to get notified about price changes or build your own pricing calculator. Whatever the need, the Retail Prices API gives you a simple, unauthenticated way to get PAYG rates for all Azure products and services. Give it a spin and let us know what you'd like to see next.
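Querying the API amounts to one unauthenticated GET with an OData filter. A minimal Python sketch follows; the endpoint and `$filter` syntax are from the public API, while the helper name is ours and the response fields named in the comment reflect our understanding of the documented shape:

```python
from urllib.parse import urlencode

RETAIL_PRICES_ENDPOINT = "https://prices.azure.com/api/retail/prices"

def retail_prices_url(service_name, arm_region):
    """Build a Retail Prices API query URL filtered to one service
    and one Azure region, using OData $filter syntax."""
    flt = f"serviceName eq '{service_name}' and armRegionName eq '{arm_region}'"
    return f"{RETAIL_PRICES_ENDPOINT}?{urlencode({'$filter': flt})}"

url = retail_prices_url("Virtual Machines", "westus")
# No authentication is needed; fetch with any HTTP client and page
# through results via the next-page link in each JSON response, e.g.:
#   import json, urllib.request
#   page = json.loads(urllib.request.urlopen(url).read())
#   items = page["Items"]  # assumed field name from the API docs
```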


What's new in Cost Management Labs

With Cost Management Labs, you get a sneak peek at what's coming in Azure Cost Management and can engage directly with us to share feedback and help us better understand how you use the service, so we can deliver more tuned and optimized experiences. Here are a few features you can see in Cost Management Labs:

Dark theme support in dashboard tiles – Now available in the public portal
Pinned dashboard tiles now support the dark theme.
Pin the Cost Management overview for quick access to scopes – Now available in the public portal
Similar to pinning cost analysis, you can also pin the Cost Management overview to get quick access to the scopes you need. Each pinned tile will remember and pre-select the desired scope, helping you get to where you need to be quicker than ever.
Improved getting started
Expanded getting started experience covering more options across Cost Management + Billing.

Of course, that's not all. Every change in Azure Cost Management is available in Cost Management Labs a week before it's in the full Azure portal. We're eager to hear your thoughts and understand what you'd like to see next. What are you waiting for? Try Cost Management Labs today.


Expanded availability of resource tags in cost reporting

Tagging is the best way to organize and categorize your resources outside of the built-in management group, subscription, and resource group hierarchy. Add your own metadata and build custom reports using cost analysis. While most Azure resources support tags, some resource types do not. Here are the latest resource types which now support tags:

Data Factories
Databricks workspaces

Remember: tags are part of every usage record and appear in Cost Management reporting only after the tag is applied; historical costs are not tagged retroactively. Learn more about resource tagging limitations and update your resources today for the best cost reporting.


15 ways to optimize your Azure costs

With a global health pandemic continuing to challenge us to find a new normal, cost optimization is at the forefront of many minds. We've talked a lot about the many different ways to optimize costs, but I'd like to offer a quick checklist for anyone getting started with cost optimization in Azure, as well as offer up a couple service-specific checklists for those using Azure SQL or Azure Backup.

First, let's start with basics:

Before you start architecting your solutions, review and factor proven practices from the Azure Well-Architected Framework into your solutions.
If you already have solutions deployed, your next step is to review and take action on cost-saving recommendations in Azure Advisor. 
Look for additional savings, up to 72 percent, for over 18 Azure and third-party services with Azure reservations. 
Take advantage of unused on-premises Windows and SQL Server licenses with Azure Hybrid Benefit.
Make sure you understand how the services you use are charged with the Azure Pricing Calculator and architect your solutions in ways that maximize investments while minimizing costs.

If you're using SQL in Azure, on-premises, or from another cloud provider, here are a few extra tips that might help you save even more on top of reservations, Azure Hybrid Benefit, and other Advisor recommendations:

Maintain business continuity in the cloud with free SQL Server licenses.
Shift capex to opex with SQL Server on Azure Virtual Machines.
Protect your data with free security updates.
Boost productivity with fully managed Azure SQL database services.
Pay only for the resources you use with per-second billing on SQL Database serverless.

And lastly, if you're using a backup solution today, here are a few ways Azure Backup can help you reduce costs:

Clean up backups for deleted resources using the inactive resources report.
Consider using daily differential backups with weekly full backups to save compared to daily full backups.
Consider shorter retention durations to reduce costs while still meeting compliance requirements.
Backup only what you need with selective disk backup to include or exclude certain virtual machine data disks.
Opt for locally-redundant storage (LRS) for dev/test or other workloads that don't need geo-redundant storage (GRS) replication to cut storage costs in half.

If you're interested in more details on any of the above, here are a few resources you may want to check out:

How to optimize your Azure workload costs.
8 ways to optimize costs on Azure SQL.
5 ways to optimize your backup costs with Azure Backup.


New ways to save money with Azure

Lots of cost optimization improvements over the past month! Here are a few you might be interested in:

Look for new and updated cost-saving recommendations in Azure Advisor:

Right-sizing underutilized MariaDB, MySQL, and PostgreSQL database server resources.
Reservations for Cosmos DB, SQL PaaS, Blob storage, App Service Stamp Fee, MariaDB, MySQL, PostgreSQL, and Synapse Analytics.
CPU, memory, and network utilization data, now available via API and Azure Resource Graph.

Save on Dsv3, Dsv4, Ddsv4, Esv3, Esv4, and Edsv4 virtual machines with Azure Dedicated Host.
Save up to 94 percent compared to other providers with Azure Synapse Analytics.


Upcoming changes to Azure usage data

Many organizations use the full Azure usage and charges data to understand what's being used, identify which charges should be internally billed to which teams, and look for opportunities to optimize costs with Azure reservations and Azure Hybrid Benefit, to name a few. If you're doing any analysis or have set up integrations based on product details in the usage data, please update your logic for the following services.

The following Azure Bastion meter IDs for Azure Gov are changing effective October 1:

Azure Bastion Basic:
Old: 0e3e1208-1bcd-41a4-a98b-a82d5928e448
New: 077ecfba-a126-5c32-8c15-506554457f05
Azure Bastion data transfer out:
Old: 43f56dc9-e2c7-47bc-935e-56e24f531111
New: 88d1ca2d-4cd2-50a9-b104-d83f41e8a2dc

Also, remember the key-based Enterprise Agreement (EA) billing APIs have been replaced by new Azure Resource Manager APIs. The key-based APIs will still work through the end of your enrollment, but will no longer be available when you renew and transition into Microsoft Customer Agreement. Please plan your migration to the latest version of the UsageDetails API to ease your transition to Microsoft Customer Agreement at your next renewal.


Documentation updates

Lots of documentation updates this month. Here are a few you might be interested in:

Subscription transfer for CSP partners.
Selecting a language for budget alerts.
Documented invoice status values shown in the Azure portal.
Added note about subscription quota changes after subscription transfers.
Billing admins can see reservation recommendations.
Updated the list of cost recommendations.
Set the default management group for new subscriptions.
New management group tutorials using the portal, Azure CLI, PowerShell, or API.

Want to keep an eye on all of the documentation updates? Check out the Cost Management + Billing doc change history in the azure-docs repository on GitHub. If you see something missing, select Edit at the top of the document and submit a quick pull request.


What's next?

These are just a few of the big updates from last month. Don't forget to check out the previous Azure Cost Management + Billing updates. We're always listening and making constant improvements based on your feedback, so please keep the feedback coming.

Follow @AzureCostMgmt on Twitter and subscribe to the YouTube channel for updates, tips, and tricks. And, as always, share your ideas and vote up others in the Cost Management feedback forum.

We know these are trying times for everyone. Best wishes from the Azure Cost Management + Billing team. Stay safe and stay healthy!
Source: Azure

The new Azure VMware Solution is now generally available

Last week, during Microsoft Ignite, we announced the general availability of the new Azure VMware Solution. Designed, built, and supported by Microsoft, and Cloud Verified by VMware, Azure VMware Solution runs VMware Cloud Foundation technologies and enables customers to seamlessly extend or migrate VMware workloads to the cloud. Organizations can maintain existing VMware skills and operational processes while leveraging the benefits of Azure, all at the same time.

Since announcing the preview, we've seen tremendous interest in Azure VMware Solution. Driven in part by the need to adapt to and recover from the global health crisis, organizations are increasingly adopting the cloud to ensure continuity, resiliency, and cost efficiency for their business.

As always, Microsoft is focused on delivering solutions that meet our customers where they are today. In a time when speed, simplicity, and skill retention are critical, Azure VMware Solution provides organizations with a fast path to the cloud, so your business can continue to use the VMware platform you know and modernize on-premises workloads at your pace.

Engineered to meet you where you are

The goal of Azure VMware Solution has always been to deliver the best, most secure, most functional cloud of choice for our customers. As a first-party Microsoft Azure service, built in partnership with VMware, we bring the best of both platforms together to deliver a high quality, integrated solution.

Azure VMware Solution has been engineered as a core Azure compute service to deliver the speed, scale, and high availability of our global infrastructure. With a focus on simplicity, the solution features a unified Azure portal experience and seamless access to other Azure resources. Our Microsoft and VMware engineering teams also worked closely together to deliver new functionality for running familiar VMware Cloud Foundation technology, including vSphere, HCX, NSX-T, and vSAN. We also heard from our enterprise customers about the importance of large-scale bulk migration, so to further ease migration efforts, you can now take advantage of HCX Enterprise edition (currently in preview), which includes Replication Assisted vMotion (RAV).

Finally, from our experience with enterprise migrations, we know the importance of planning and estimating your migration to the cloud. We are pleased to share that Azure Migrate supports Azure VMware Solution, helping you discover all of your VMs running on-premises and create assessments based on sizing and cost analysis, so you can size your Azure VMware Solution private cloud to your needs.

Unmatched cost efficiency with Azure Hybrid Benefit

As a core Azure service, Azure VMware Solution also supports Azure Hybrid Benefit, allowing you to bring your existing Microsoft workloads running on-premises to the cloud in the most cost-effective way. You can now maximize the value of existing on-premises Windows Server and SQL Server license investments when migrating or extending to Azure. In addition, Azure VMware Solution customers are also eligible for three years of free Extended Security Updates on 2008 versions of Windows Server and SQL Server. These pricing benefits are only available on Azure and create simplicity and cost efficiency for your journey to the cloud.

Seamless access to Azure services

Throughout development, delivering on a seamless connection to Azure services has been paramount. Azure VMware Solution is tightly integrated with the Azure global network backbone to ensure you can centralize all your cloud resources in Azure. Now, whether you are looking to migrate completely or extend your on-premises VMware-based applications, you gain access to Azure services that can enhance security, management, and unlock modernization across your entire environment. We know from our customer conversations that organizations need time to develop cloud competencies within the organization. Azure VMware Solution gets you to the cloud quickly, maintaining consistency in the VMware tools and operations that you have, and growing cloud skills over time.

To help ensure business continuity and improve security and management, Azure VMware Solution customers can incrementally attach Azure services to enhance the existing environment and processes, including:

Azure Backup, combined with the Recovery Services vault, for VM backup and recovery. It provides geo-redundant, longer-term, off-site storage for compliance purposes and, at the same time, addresses short-term retention needs for restoring data. This is a cost-effective, scalable approach to backup.
Connect Azure Security Center and Azure Sentinel with Azure VMware Solution virtual machines (VMs) to quickly strengthen your security posture and protect against threats. As security threats continue to increase, this provides a streamlined way to apply advanced best practices to your environment.
Create efficiencies with enhanced management functionality that supports cloud and hybrid environments. Integration with Azure Monitor for vCenter logs provides visibility into VM usage; Azure Update Manager handles lifecycle management of Windows VMs; Azure Traffic Manager balances application workloads running on Azure VMware Solution across multiple endpoints; Azure Application Gateway manages traffic to web apps running on Azure VMware Solution; and API Management publishes and protects APIs on Azure VMware Solution VMs for the developer community.
Optimize storage for VMs running on Azure VMware Solution through integration with Azure NetApp Files and Azure Files. As you modernize application storage strategies, you can now also integrate with Azure SQL services.

Expanding partner ecosystem

As you look to move workloads from your current datacenter, and extend existing VMware workloads from on-premises to the cloud, we recognize that confidence in supportability for partner technologies that you may use today is also important. Microsoft is working closely with key partners that are integral to your IT environment, including leading solutions for backup and disaster recovery, as well as other enterprise services that run on-premises today.

“Zerto and Azure have long been an ideal combination for businesses accelerating cloud adoption and looking to simplify data protection and disaster recovery (DR). Now, with Microsoft’s new generally available release of Azure VMware Solution, customers can use Zerto to replicate and protect VMs into the cloud and within the cloud, with the same seamless experience they have on premise. Users can replicate and recover into Azure VMware Solution in under two hours, providing the ability to implement a real-time, enterprise DR solution in less time than it takes to watch a movie. It’s fast, it’s easy to manage, and it’s a great platform for disaster recovery and data protection for hybrid cloud deployments.”

Gil Levonai, CMO and SVP of Product, Zerto

“Partnering with Microsoft Azure enables us to provide a robust and cost-effective solution for disaster recovery and business continuity. It's an ideal combination: the JetStream DR software brings continuous data protection to enterprise VMware environments, Azure Blob Storage provides a cost effective means of maintaining recovery assets, and the Azure VMware Solution provides a highly available, reliable VMware platform that can scale to meet customers' recovery and failover requirements.”

Tom Critser, Co-Founder and CEO, JetStream

“Using Microsoft Azure has been a critical component of the success of Commvault’s solutions and we are happy to add support for Azure VMware Solution as part of a customer’s heterogeneous, enterprise-wide data environment. In leveraging Microsoft’s Azure infrastructure, our joint customers gain the benefits of ease, scalability, security, and cost reduction seamlessly combined with leading features of our products, making it a winning combination for customers.”

Randy De Meno, VP/CTO, Microsoft Products & Solutions, Commvault

“Veritas Technologies and the Microsoft Azure teams are mutually invested in our ongoing partnership to solve the most important customer needs in Azure VMware Solution. We worked very closely together in early testing and certification to deliver a level of protection and recovery that surpasses VMware admin’s expectations in Azure VMware Solution. Veritas data protection solutions ensure that no matter where VM data resides, it meets the enterprise data protection requirements of storage reduction, and automated intelligent policies. Veritas’s continued support for the next version of Azure VMware Solution further illustrates our strong partnership and aligned commitment to helping our joint customers in their ongoing adoption of hybrid cloud, including their VMware estate. VMware admins no longer have to choose between local or hosted when it comes to their workloads and can rest easy knowing their data is protected and recoverable from anywhere, to anywhere with Veritas NetBackup.”

Doug Mathews, VP Product Management, Enterprise Data Protection and Compliance, Veritas

“Veeam currently supports the backup of Azure-native virtual machines via Veeam Backup for Microsoft Azure. Now, Veeam support for Azure VMware Solutions makes it possible to easily backup and replicate vSphere workloads to and from Azure. Through Veeam Availability Suite, simple workload protection and portability is possible across on-premises VMware, Azure VMware Solution, and Azure-native VM workloads. As the leader in Cloud Data Management, Veeam strives to partner with our 375,000 customers and help them achieve their business objectives. Our day one support for Azure VMware Solution is not only a testament to our customer commitment, but also in our investment and strong, long-term relationships with VMware and Microsoft.“

Danny Allan, Chief Technology Officer and SVP of Product Strategy, Veeam

Learn more about Azure VMware Solution

Watch our Ignite sessions: To learn more about the solution and the general availability announcement, watch the Azure VMware Solution Ignite overview session or visit the Azure VMware Solution web page. You can also check out the following technical sessions from Ignite: AVS Technical Overview, AVS Business Continuity and Disaster Recovery, AVS Networking.
See Azure VMware Solution in Action: If you're interested in seeing the solution in action, watch the latest Microsoft Mechanics Azure VMware Solution demo here. 
Join us at VMworld 2020: Microsoft is excited to be a diamond sponsor at VMworld 2020 and we look forward to connecting with you in the new digital format. We have a great line-up of speakers and sessions across a number of solution areas where we partner together, including: Azure VMware Solution, Horizon Cloud and Windows Virtual Desktop, VMware Tanzu on Azure, Azure Spring Cloud.

Source: Azure

New Datadog integration with Azure offers a seamless configuration experience

This post was co-authored by Sreekanth Thirthala Venkata, Principal Program Manager, Visual Studio and .NET.

Microsoft Azure enables customers to migrate and modernize their applications to run in the cloud, in coordination with many partner solutions. One such partner is Datadog, which provides observability and security tools for users to understand the health and performance of their applications across hybrid and multi-cloud environments. But configuring the necessary integrations often requires navigating between the Azure portal and Datadog.

This adds complexity, takes time, and makes it difficult to troubleshoot when things aren't working. To reduce the burden of managing across multiple portals, we worked with Datadog to create an integrated Datadog solution on the Azure cloud platform. Available via the Azure Marketplace, this solution provides a seamless experience for using Datadog's cloud monitoring solution in Azure.

“The Microsoft cloud is the first to enable a seamless configuration and management experience for customers using third-party solutions like Datadog. With Datadog, customers are empowered to use this experience to monitor their Azure workloads and enable an accelerated transition to the cloud.” —Corey Sanders, Corporate Vice President, Microsoft Solutions

With the new Azure integration with Datadog, organizations can now fully map their legacy and cloud-based systems, monitor real-time data during every phase of the cloud transition, and ensure that migrated applications meet performance targets. This integration combines Azure's global presence, flexibility, security, and compliance with Datadog's logging and monitoring capabilities to create the best experience for enterprises.

Through this unified experience, customers will be able to:

Provision a new Datadog organization and configure their Azure resources to send logs and metrics to Datadog—a fully managed setup with no infrastructure for customers to set up and operate.
Seamlessly send logs and metrics to Datadog. The log-forwarding process has been completely automated; rather than building out a log-forwarding pipeline with Diagnostic Settings, Event Hubs, and Functions, you can configure everything with just a few clicks.
Easily install the Datadog agent on VM hosts with a single click.
Streamline single sign-on (SSO) to Datadog—a separate sign-in to the Datadog portal is no longer required.
Get unified billing for the Datadog service through Azure subscription invoicing.

"Observability is a key capability for any successful cloud migration. Through our new partnership with Microsoft Azure, customers will now have access to the Datadog platform directly in the Azure console, enabling them to migrate, optimize and secure new and migrated workloads." —Amit Agarwal, Chief Product Officer, Datadog

Here’s a quick look at this integrated experience:

Acquire and setup the Datadog solution

Now let’s follow the step-by-step process to acquire and setup the Datadog solution:

Procuring the Datadog app: Azure customers can procure the Datadog app through the Azure Marketplace.

Provisioning in the Azure portal: After procuring in Azure Marketplace, customers can seamlessly provision Datadog as an integrated service on Azure via the Azure portal.

Configuring logs and metrics: Customers create a Datadog resource in Azure and configure which Azure resources send logs and metrics to Datadog.

Installing Datadog agent: Customers can install the Datadog agent as an extension on virtual machines (VMs) and app services with a single click.

Access via SSO: Customers access Datadog from the Azure portal through a streamlined SSO experience and configure Datadog as a destination for logs and metrics from Azure services.

Next steps

Sign up for the preview of the new Datadog integration with Azure. The preview will be available on Azure Marketplace starting October 2020.
Read more about the Azure Monitor partner integration with Datadog.

Source: Azure

What’s new in Azure Backup

At Microsoft Ignite, we announced several new Azure Backup features that enhance the protection of your data in Azure. Azure Backup is Azure’s native backup solution that provides data protection capabilities for diverse and mission-critical workloads in a simple, secure, and cost-effective manner. The latest capabilities that we announced this Ignite let you expand your data protection to new workload types, enhance security, and improve the availability of your backup data. Furthermore, you can now increase operational efficiencies with built-in capabilities for managing backups at scale, along with the ability to automatically onboard Azure Backup to Windows Server Virtual Machines.

Protect Azure Database for PostgreSQL and retain the backups for 10 years

Azure Backup and Azure Database Services have come together to build an enterprise-class backup solution for Azure PostgreSQL (now in preview). Now you can meet your data protection and compliance needs with a customer-controlled backup policy that enables retention of backups for up to ten years. With this, you have granular control to manage the backup and restore operations at the individual database level. Likewise, you can restore across PostgreSQL versions or to blob storage with ease. Please visit Backup for Azure Database for PostgreSQL for more details and start using the service today.

Ensure database consistent snapshots for Oracle and MySQL databases running on Azure Linux virtual machines

Application-consistent backups ensure that the backed-up data is transactionally consistent, and that applications will boot up post-virtual machine (VM) restore. To ensure transactional consistency, applications need to be quiesced, and there should be no unfinished transactions when taking a VM snapshot. The Azure Backup service already provides a framework to achieve application consistency during backups of Azure Linux VMs. This framework gives you the flexibility to execute scripts that are orchestrated pre and post backup of the VM.

At Microsoft Ignite, we announced an improved application-consistency framework that ships packaged pre- and post-scripts to deliver database-consistent backups for a few select databases. Sign up for the preview, which supports backing up Oracle and MySQL databases running on Azure Linux VMs.

Scale up and scale out your data protection along with your estate

As you scale up your workloads to use larger configurations or scale out across regions, you can also scale your data protection in line with the expansion of your business, with new backup capabilities across various workload types.

Enhanced backup capabilities for Azure Virtual Machines

Azure Backup supports backing up all the disks (operating system and data) in a VM together using the VM backup solution. Using the new selective disks backup and restore functionality, you can back up a subset of the data disks in a VM (now generally available). This provides an efficient and cost-effective solution for your backup and restore needs. Each recovery point contains only the disks that are included in the backup operation. This further allows you to have a subset of disks restored from the given recovery point during the restore operation. Until now, Azure Backup has supported 16 managed disks per VM. Now, Azure Backup supports backup of up to 32 managed disks per VM (now generally available).

Simplified backup configuration experience for SQL in Azure Virtual Machines

Configuring backups for your SQL Server in Azure VMs is now even easier with inline backup configuration integrated into the VM blade of the Azure portal (now generally available). With just a few clicks, you can enable backup of your SQL Server to protect all the existing databases as well as the ones that get added in the future.

Back up SAP HANA in RHEL Azure Virtual Machines

Azure Backup is the native backup solution for Azure and is BackInt certified by SAP. Azure Backup has now added support for Red Hat Enterprise Linux (RHEL) (now generally available), one of the most widely used Linux operating systems running SAP HANA.

Seamless backup configuration for Azure Files

Azure Backup has simplified the experience of configuring backup for Azure file shares by giving you the ability to seamlessly enable backup from the file share management experience in the Azure portal. With this new capability (now generally available), you can perform the backup and restore related operations directly from the file share experience, as well as the vault experience.
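Backup for a file share can also be enabled from the Azure CLI. A sketch, with placeholder resource names:

```shell
# Enable backup for an Azure file share against an existing
# Recovery Services vault and backup policy (names are placeholders)
az backup protection enable-for-azurefileshare \
  --resource-group myRG --vault-name myVault \
  --storage-account mystorageacct \
  --azure-file-share myshare \
  --policy-name DailyPolicy
```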

Improve the durability of your backup data

Azure Storage provides a great balance of high performance, high availability, and high data resiliency with its varied redundancy options. Azure Backup allows you to extend these benefits to backup data as well, with options to store your backups in locally redundant storage (LRS) and geo-redundant storage (GRS). At Microsoft Ignite, we added a further durability option for your backups with support for zone-redundant storage (ZRS) for backup data (now in preview).
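Storage redundancy is a vault-level property that must be chosen before the first item is protected. With the Azure CLI, the choice might look like this sketch (vault and resource-group names are placeholders, and the zone-redundant option assumes the preview is available to your subscription):

```shell
# Set the vault's backup storage redundancy; other accepted values
# are GeoRedundant and LocallyRedundant
az backup vault backup-properties set \
  --name myVault --resource-group myRG \
  --backup-storage-redundancy ZoneRedundant
```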

Ensuring data protection requires not just being able to back up your data, but also being able to control what, when, and where you restore it. Azure Backup provides this control by allowing cross-region restore for Azure Virtual Machines. At Microsoft Ignite, we extended this capability to SQL and SAP HANA databases backed up in Azure Virtual Machines, which can now be restored across regions (now in preview).

Enhance security for your backups

Concerns about increasing security threats such as malware, ransomware, and intrusion have been top of mind for many of our customers. These security threats can be costly from both a monetary and reputational perspective. To guard against such attacks, Azure Backup provides security features to help protect backup data even after deletion. One such feature is soft delete. With soft delete, even if a malicious actor deletes a backup (or backup data is accidentally deleted), the backup data is retained for 14 additional days, allowing the recovery of that backup item with no data loss. The additional 14 days of retention for backup data in the soft-deleted state does not incur any additional cost. Azure Backup now provides soft delete for SQL Server in Azure Virtual Machines and SAP HANA in Azure Virtual Machines (now generally available) workloads. This is in addition to the already supported soft-delete capability for Azure Virtual Machines.
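Recovering a soft-deleted item is a two-step undelete-then-resume flow. A sketch with the Azure CLI—resource names are placeholders, and the commands reflect the az backup command group at the time of writing:

```shell
# Inspect the vault's backup properties, including soft-delete state
az backup vault backup-properties show \
  --name myVault --resource-group myRG

# Recover a soft-deleted VM backup item within the 14-day window;
# backup can then be resumed or the item restored as usual
az backup protection undelete \
  --resource-group myRG --vault-name myVault \
  --container-name myVM --item-name myVM \
  --backup-management-type AzureIaasVM --workload-type VM
```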

Increase efficiencies in backup operations with new built-in capabilities to manage your backups at scale with Backup Center

Azure Backup has enabled a new native management capability to manage your entire backup estate from a central console. Backup Center provides customers with the capability to monitor, operate, govern, and optimize data protection at scale in a unified manner consistent with Azure’s native management experiences.

With Backup Center, you get an aggregated view of your inventory across subscriptions, locations, resource groups, vaults, and even tenants using Azure Lighthouse. Backup Center is also an action center from which you can trigger your backup-related activities, such as configuring backup, restoring, and creating policies or vaults, all from a single place. In addition, with seamless integration with Azure Policy, you can now govern your environment and track compliance from a backup perspective. Built-in Azure policies specific to backup also allow you to configure backups at scale.

Backup Center supports Azure Virtual Machines and Azure Database for PostgreSQL servers (now in preview), along with Azure Files, SQL in Azure Virtual Machines, and SAP HANA in Azure Virtual Machines (now in limited preview).

Plan and optimize backup costs

Achieving cost efficiency with your cloud usage is more critical today than ever before. The Azure Backup team is committed to helping you optimize your backup costs. New capabilities in Azure Backup Reports (now in preview) help you get more visibility into your backup usage and also help you take action to right-size backup storage and achieve significant cost savings. An advanced pricing calculator (now generally available) also allows you to estimate your costs for budgeting and price comparisons.

Automatically onboard, configure, and monitor Azure Backup on your virtual machines

With Azure Automanage (now in preview), it is possible to automate frequent and time-consuming management tasks such as the onboarding and configuration of Azure Backup. Through Automanage, Backup will be automatically configured when using the Azure Best Practices Production configuration profile. The Backup policy can be customized through Automanage preferences to meet company or compliance requirements. Furthermore, Automanage will continuously monitor the Backup settings and will automatically remediate back to the desired state if drift is detected. By using Azure Automanage, it is possible to realize further operational savings while improving business continuity. 

Additional resources

Tell us how we can improve Azure Backup by contributing new ideas and voting up existing ones.
New to Azure Backup? Sign up for an Azure trial subscription.
Need help? Browse the Azure Backup documentation.
Write to for preview signups.
Get guidance and best practices for Azure Backup.
Utilize these resources for training.
Discover resources for learning.

Source: Azure

Microsoft partners with the telecommunications industry to roll out 5G and more

The increasing demand for always-on connectivity, immersive experiences, secure collaboration, and remote human relationships is pushing networks to their limits, while the market drives down prices. Network infrastructure must ensure operators are able to optimize costs and gain efficiencies while enabling the development of personalized and differentiated services. To address the requirements of rolling out 5G, operators will face strong challenges, including high capital expenditure (CapEx) investments and an increased need for scale, automation, and secure management of the massive volume of data 5G will generate.

Today starts a new chapter in our close collaboration with the telecommunications industry to unlock the power of 5G and bring cloud and edge closer than ever. We're building a carrier-grade cloud and bringing more Microsoft technology to the operator’s edge. This, in combination with our developer ecosystem, will help operators to future proof their networks, drive down costs, and create new services and business models.

In Microsoft, operators get a trusted partner who will empower them to unlock the potential of 5G, enabling them to offer a range of new services such as ultra-reliable low-latency connectivity, mixed reality communications services, network slicing, and highly scalable IoT applications to transform entire industries and communities.

By harnessing the power of Microsoft Azure, on their edge or in the cloud, operators can transition to a more flexible and scalable model, drive down infrastructure costs, and use AI and machine learning (ML) to automate operations and create service differentiation. Furthermore, a hybrid and hyper-scale infrastructure will provide operators with the agility they need to rapidly innovate and experiment with new 5G services on a programmable network.

More specifically, we will further support operators as they evolve their infrastructure and operations using technologies such as software-defined networking, network function virtualization, and service-based architectures. We are bringing to market a carrier-grade platform for edge and cloud to support the operator’s goals to future proof their infrastructure with disaggregated and containerized network architectures. Recognizing that not everything will move to the public cloud, we will meet operators where they are—whether at the enterprise edge, the network edge, or in the cloud.

Our approach is built on the acquisitions of industry leaders in cloud-native network functions—Affirmed Networks and Metaswitch—and on the development of Azure Edge Zones. By bringing together hundreds of engineers with deep experience in the telecommunications space, we are ensuring that our product development process is catering to the most relevant networking needs of the operators. We will leverage the strengths of Microsoft to extend and enhance the current capabilities of industry-leading products such as Affirmed’s 5G core and Metaswitch’s UC portfolio. These capabilities, combined with Microsoft’s broad developer ecosystem and deep business-to-business partnership programs, provide Microsoft with a unique ability to support the operators as they seek to monetize the capabilities of their networks.

Your customer, your service, powered by our technology

As we build out our partnerships with different operators, it is clear to us that there will be different approaches to technology adoption based on business needs. Some operators may choose to adopt the Azure platform and select a varied mix of virtualized or containerized network function providers. We also have operators that have requested complete end-to-end services as components for their offers. As a part of these discussions, many operators have identified points of control that are important to them, for example:

Control over where a slice, network API, or function is presented to the customer.
Definition of where and how traffic enters and exits their network.
Visibility and control over where key functions are executed for a given customer scenario.
Configuration and performance parameters of core network functions.

As we build out Azure for Operators, we recognize the importance of ensuring operators have the control and visibility they require to manage their unique industry requirements. To that end, here is how our assets come together to provide operators with the platform they need.


It starts with the ability to interconnect deeply with the operator’s network around the globe. We have one of the largest networks that connect with operators at more than 170 points of presence and over 20,000 peering connections around the globe, putting direct connectivity within 25 miles of 85 percent of the world’s GDP. More than 200 operators have already chosen to integrate with the Azure network through our ExpressRoute service, enabling enterprises and partners to link their corporate networks privately and securely to Azure services. We also provide additional routes to connect to the service through options as varied as satellite connectivity and TV White Space spectrum.

Edge platform

This reach helps us to supply operators with cloud computing options that meet the customer wherever those capabilities are needed: at the enterprise edge, the network edge, the network core, or in the cloud. The various form factors, optimized to support the location in which they are deployed, are supported by the Azure platform—providing virtual machine and container services with a common management framework, DevOps support, and security control.

Network functions

We believe in an open platform that leverages the strengths of our partners. Our solutions are a combination of virtualized and containerized services as composable functions, developed by us and by our Network Equipment Provider partners, to support operators’ services such as the Radio Access Network, Mobile Packet Core, Voice and Interconnect services, and other network functions.

Technology from Affirmed and Metaswitch Networks will provide services for Mobile Packet Core, Voice, and Interconnect services.

Cloud solutions and Azure IoT for operators

By exposing these services through the Azure platform, we can combine them with other Azure capabilities such as Azure Cognitive Services (used by more than 1 million developers processing more than 10 billion transactions per day), Azure Machine Learning, and Azure IoT to bring the power of AI and automation to the delivery of network services. These capabilities, in concert with our partnerships with OSS and BSS providers, enable us to help operators streamline and simplify operations, create new services to monetize the network, and gain greater insights into customer behavior.

In IoT, our primary focus is simplifying our solutions to accelerate what we can do together from the edge to the cloud. We’ve done so by creating a platform that provides simple and secure provisioning of applications and devices to Azure cloud solutions through Azure IoT Central, which is the fastest and easiest way to build IoT solutions at scale. IoT Central enables customers to provision an IoT app in seconds, customize it in hours, and go to production the same day. IoT Plug and Play dramatically simplifies all aspects of IoT device support, provides devices that “just work” with any solution, and is the perfect complement to achieve speed and simplicity through IoT Central. Azure IoT Central also gives mobile operators the opportunity to monetize more of the IoT solution and puts them in a position to be a reseller of the IoT Central application platform through their own solutions. Learn more about using Azure IoT for operators here.

Cellular connectivity is increasingly important for IoT solutions and represents a vast and generational shift for mobile operators as the share of devices in the market shifts towards the enterprise. We will continue our deep partnership with operators to enable fast and efficient app development and deployment, which is critical to success at the edge. This will help support scenarios such as asset tracking across industries, manufacturing and distribution of smart products, and responsive supply chains. It will also help support scenarios where things are geographically dispersed, such as smart city automation, utility monitoring, and precision agriculture.

Where we go next

Our early engagement with partners such as Telstra and Etisalat helped us shape this path. We joined the 5G Open Innovation Lab as the founding public cloud partner to accelerate enterprise startups and launch new innovations to foster new 5G use cases with even greater access to leading-edge networks. The Lab will create long-term, sustainable developer and commercial ecosystems that will accelerate the delivery of exciting new capabilities at the edge, including pervasive IoT intelligence and immersive mixed reality. And this is just the beginning. I invite you to learn more about our solutions and watch the series of videos we have curated for you.
Source: Azure

Accelerating genomics workflows and data analysis on Azure

Genomics is foundational to the development of targeted therapeutics and precision medicine. Advances in DNA sequencing technologies have driven a revolution in genomics-based research and are helping facilitate a better understanding of human biology and disease conditions. This expanded knowledge is leading to the proliferation of personalized medicine strategies to prevent, diagnose, and treat diseases. The trend will continue to accelerate in the coming decade as use of genomics information becomes central to clinical decision support and healthcare delivery.

Sequencing genomes at the population level will be required to decipher the genomic fingerprint of a disease, predict interpersonal variability in progression and treatment response, and develop models for clinical decision support. The resulting explosion in genomics data and the computational power required for analysis (tens of exabytes and trillions of core hours in the next five years1) will require agility, easier management, data security, and access to scalable storage and compute capacity.

The demand for cloud-based solutions is evident. It is increasingly being recognized that community-driven standards and open-source tools will be necessary in enabling data accessibility, tool interoperability, and reliability of results and models. Microsoft not only supports open standards and open-source projects but has been actively contributing to these community-driven efforts by making it easier to use these tools and software on Azure.

To that end, Microsoft Genomics has released several open source projects on GitHub, including Cromwell on Azure, Genomics Notebooks and Bioconductor support for Azure. We have also made available a growing list of genomics public datasets on the Azure Open Dataset platform.

Scale and automate genomic workflows on Azure with Cromwell

Cromwell is an open-source workflow management system geared toward scientific workflows, originally developed by the Broad Institute. With Cromwell on Azure, users can accelerate their genomic research with the hyperscale compute capabilities of Azure. Cromwell orchestrates the dynamic provisioning of computing resources via Azure Batch and integrates with customers’ Azure Blob storage account for easy data access.
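Cromwell on Azure is driven by trigger files: per its GitHub documentation, submitting a workflow amounts to dropping a small JSON file into the new/ path of the workflows container in the linked storage account. A minimal sketch—the storage account name and blob URLs below are placeholders:

```json
{
  "WorkflowUrl": "https://mystorageacct.blob.core.windows.net/inputs/hello.wdl",
  "WorkflowInputsUrl": "https://mystorageacct.blob.core.windows.net/inputs/hello-inputs.json",
  "WorkflowOptionsUrl": null,
  "WorkflowDependenciesUrl": null
}
```

Cromwell picks up the trigger, provisions compute through Azure Batch, and moves the trigger file through in-progress, succeeded, or failed paths as the workflow runs.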

Propelling a novel next-generation sequencing (NGS)-based detection and characterization assay for COVID-19 with Biotia

Biotia is an emerging startup focused on building a platform leveraging next-generation DNA sequencing (NGS) and artificial intelligence (AI) for precision disease detection and diagnosis. They were looking for a cloud-based workflow solution to manage their NGS pipelines and Cromwell on Azure was able to meet their key requirements.

"At Biotia, we have achieved substantial parallelization, thorough version control, and novel COVID-19 detection results by using Cromwell on Azure to back our compute-intensive genomics workflows. We are pleased to include Cromwell on Azure in our bioinformatics software stack." —Joe Barrows, Director of Software Engineering at Biotia

Enable collaborative and repeatable data analysis using Genomics Notebooks powered by Jupyter Notebooks on Azure

Jupyter Notebooks provides users with an environment for analyzing data using R or Python, enabling reusability of methods and reproducibility of results. Biomedical researchers and data scientists are increasingly using notebooks for their genomics data analysis needs and for building machine learning models based on multi-modal datasets (genomic, phenotypic, clinical, EMR, demographic, and more).

Microsoft’s Genomics Notebooks open-source project provides a growing collection of pre-configured notebooks that users can easily launch and use in their Azure workspace. These pre-configured notebooks cover scenarios ranging from genomic variant detection, filtration, and annotation to the transformation of genomic, phenotypic, and clinical data into the multi-modal data frames needed for data querying and building machine learning models.

Leveraging genomic data to assess the impact of environmental change with the Canadian Department of Fisheries and Oceans

The Canadian Department of Fisheries and Oceans (DFO) is responsible for preserving Canada’s aquatic natural resources. DFO researchers at the Bedford Institute of Oceanography in Dartmouth, Nova Scotia have been using genomics to understand the impact of climate change and human activity on the migration patterns, genetic diversity and population demography of fish such as Atlantic Salmon and Atlantic Cod, which can have major socio-economic implications for the communities that rely on these resources.

The research teams are starting to sequence fish genomes in the hundreds and were looking for Azure-based solutions for scaling and streamlining their growing genomics and data analysis needs. The team successfully deployed and scale-tested Cromwell on Azure and is now looking to adopt it as a common genomics workflow platform across their various institutions.

“Leveraging Cromwell on Azure for running our genomics pipelines give us the ability to scale our analysis to thousands of genomes for any species of fish with automation. We can essentially eliminate three months’ time of manual work to generate all the variant calls we need and move directly into connecting that data with other data sources we have. The data science tools will help us easily build and train complex multi-modal data models to gain deeper insights into the impact resulting from interactions between genetic factors, climate information, and human impacts on these species, and predict how they might respond to environmental challenges in the future.” —Dr. Tony Kess, Researcher at the Bradbury Population Genomics Lab, a part of the Bedford Institute of Oceanography in Dartmouth, Nova Scotia

Easily access the vast collection of community-built bioinformatics tools with Bioconductor on Azure

Bioconductor is an open source, open development project that focuses on providing a repository of extensible statistical and graphical software packages, developed in R, for the analysis of high-throughput genomic and biomedical data. Microsoft is collaborating with the Bioconductor core team in bringing Azure support for this wide-ranging OSS software repository.

Bioinformaticians and data scientists can now easily use their preferred Bioconductor software packages on Azure by deploying the preconfigured Bioconductor Docker image hosted in the Microsoft Container Registry and on Docker Hub. Additionally, users can use Azure Virtual Machine (VM) templates to deploy a Genomics Data Science VM preconfigured with popular tools for data exploration, analysis, machine learning, and deep learning model development.
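Running the image locally or on an Azure VM is a one-line docker run; the container starts RStudio Server on port 8787. A sketch—the image path and tag follow the Bioconductor Docker documentation at the time, and the password is a placeholder you should change:

```shell
# Pull and run the Bioconductor image, exposing RStudio Server
docker run -d -p 8787:8787 \
  -e PASSWORD=changeme \
  mcr.microsoft.com/bioconductor/bioconductor_docker:devel
```

You can then browse to http://localhost:8787 and sign in as the rstudio user with the password you set.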

Power data analysis and machine learning models with genomics datasets available through the Azure Open Data platform

The Genomics Data Lake on the Azure Open Dataset platform provides a growing compendium of curated and publicly available genomics datasets. These datasets have been generated by key international collaborative efforts with a focus on providing resources for the biomedical research community. Users across healthcare, pharma and life sciences can now use the Genomics Data Lake on Azure to access these datasets for free and easily integrate the data into their genomics analysis workflows.

Accelerate whole exome and genome processing using Microsoft Genomics turnkey service on Azure

Microsoft Genomics is a highly scalable Azure service to perform secondary analysis of the human genome using the Burrows-Wheeler Aligner (BWA) and the Genome Analysis Toolkit (GATK) open-source software. The service is ISO-certified, enables customers’ compliance with HIPAA, and is covered under the Microsoft Business Associate Agreement (BAA). Microsoft continues to optimize the performance of the service by leveraging the innovations in Azure’s high-performance compute infrastructure enabling customers to generate durable genetic variant data from whole genome sequence data (WGS) within hours. Compliance, performance, data durability and provenance make the service ideal for integration into genomics-based clinical decision support workflows.

Accelerating scientific discoveries to advance cures for childhood cancers through access to real-time clinical genome sequencing at St. Jude Children’s Research Hospital

Whole-genome sequencing offers the most comprehensive assessment of differences between patients’ normal and cancer genomes. Real-time access to the genomic information is not only important for clinical decision support, but it can also accelerate research and novel discoveries and cures. St. Jude Children’s Research Hospital has partnered with Microsoft and DNAnexus to build St. Jude Cloud—the world’s largest public repository of pediatric genomics data.

This first-of-its-kind initiative provides researchers from around the world access to high-quality whole-genome, whole-exome and transcriptome data from appropriately consented St. Jude patients who have undergone clinical genomic profiling. St. Jude Cloud uses Azure and the Microsoft Genomics service to quickly upload, analyze, and harmonize the genomics data, which is subsequently made available through the St. Jude Cloud data browser to researchers worldwide.

“Access to high-quality clinical genomic data, generated leveraging the Microsoft Genomics service and streamed to St. Jude Cloud, will help further research in precision medicine for childhood cancer and other diseases.”—Dr. Jinghui Zhang, Chair of Department of Computational Biology at the St. Jude Children’s Research Hospital

Learn more and get started

Microsoft Genomics and the open-source projects are fully supported by a team of Microsoft developers and scientists committed to driving the innovation needed to advance genomics and precision medicine. Learn more about Microsoft Genomics solutions and help contribute to the open-source projects by visiting our GitHub repositories.

Microsoft Genomics service on Azure.
Cromwell on Azure.
Genomics Notebooks.
Bioconductor Docker Image for Azure.
Genomics Data Science VM.

1 Big Data: Astronomical or Genomical?

Azure. Invent with purpose.
Source: Azure