Preparing for what’s next: Financial considerations for cloud migration

Co-authored by Jorge Magana, Director, Azure Finance (Financial Planning and Analysis).

In the kickoff blog of this series, I shared our top recommendations to accelerate your cloud migration journey, one of which was aligning key stakeholders across your organization. As you move through assessments and plan your migration, it is critical to get buy-in from your CFO and other financial stakeholders—even more so in today’s challenging macro-climate.

IT and finance organizations need to be aligned around how to be agile to adjust to rapidly shifting demands while ensuring that their cost structure is lean enough to weather tough market conditions. With this dual focus, it is critical to understand not only the technical benefits of a cloud transition, but also the financial and economic opportunities associated with it. Today I'm sharing my own experience of partnering with finance along with the wisdom that customers have shared about their journey.

How can cloud migration affect CFO priorities?

Here are three key areas that IT organizations need to internalize and align on with their finance organization as they plan cloud migration:

What’s the holistic impact to the organization’s financial posture? 
What will the impact be on external and internal finance KPIs and processes?
What operational changes are required during and after migration to ensure that budget/ROI controls are met? 

How is the organization’s financial posture going to change?

Azure customers constantly unlock new, positive-ROI projects that were previously not possible on-premises as they migrate workloads. By design, Azure is built to facilitate business agility, creating opportunities for true competitive advantage and a substantial decrease in time to market. As a result, our customers recognize significant financial benefits, driven in large part by cloud flexibility and elasticity and by changes in their financial operating models that reduce asset purchases and upfront cash investments.

Cloud flexibility and elasticity

First, Azure customers can adjust their cost structure to improve their organization’s bottom line, which is table stakes in today’s environment. In recent earnings calls, CFOs of companies not leveraging the cloud mentioned their inability to reduce fixed expenses, which hurt profitability. As our customers migrate to Azure, they are shifting to a cost structure that is variable by design:

Figure 1: Cloud cost structure provides flexibility

Next, Azure customers can maximize resource efficiency. We have worked directly with large and small customers alike who were running on-premises workloads at very low resource utilization. These customers purchased assets to cover peak demand and procurement lead times, but most of the time those servers, and even some datacenters, sat idle and underused. By rightsizing and optimizing capacity when migrating to Azure, customers can realize economic benefits from cloud scale and elasticity. As an example, the built-in scalability in Azure has helped Maersk quickly scale up on demand, eliminating the need to maintain idle resources during off-peak times.

“Scalability is one of the big benefits we get from Azure. In the past, it might have taken us months to procure and configure servers and get them into production. Now, we can scale up on demand in Azure in a matter of minutes.” – Musaddique Alatoor, Head of Equipment Innovation, A.P. Moller – Maersk

Finally, shifting to a cloud model can reduce costs by enabling customers to consume resources only during peak usage periods, while reducing capacity when demand drops.
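As a back-of-the-envelope illustration of this elasticity benefit (the demand profile and hourly rate below are hypothetical, not Azure prices), compare provisioning for peak with paying only for the capacity each hour actually needs:

```python
# Hypothetical hourly demand (in server units) over a four-hour window.
demand = [10, 40, 100, 30]
cost_per_server_hour = 1.0   # illustrative rate, not an Azure price

# On-premises: capacity is provisioned for peak demand and runs every hour.
on_prem_cost = max(demand) * len(demand) * cost_per_server_hour

# Elastic cloud: pay only for the capacity each hour actually uses.
cloud_cost = sum(demand) * cost_per_server_hour

print(on_prem_cost, cloud_cost)                 # 400.0 180.0
print(f"{1 - cloud_cost / on_prem_cost:.0%}")   # 55% savings
```

The more demand varies around its average, the wider the gap between the two numbers.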

Changes in the financial operating model

Key financial benefits of Azure are driven by a fundamental shift in the IT operating model, which benefits the organization’s core financial statements in the following ways:

Balance sheet: Prior to migrating to Azure, many of our customers owned or operated their datacenters. These were expensive long-term assets that tied up the cash and capital needed to grow the business, support strategic initiatives, and respond to market conditions. Once on Azure, our customers avoid buying equipment, repurpose expensive real estate, and shift datacenter operations costs into developing cloud applications and other projects that drive business growth. This makes their balance sheet more agile, shifting fixed assets to cash. This is what drove Maersk to move their five regional datacenters to Azure to lower the company’s risks and position them for continued growth.
Cash flow statement: Azure customers save immediate cash by avoiding cyclical and sporadic IT asset purchases. With the “pay for what you use” model, along with platform capabilities like policy and tagging that Azure enables, CFOs gain visibility and predictability and can delay cash spend.
Income statement (profit and loss): Over time, Azure customers can improve profitability by reducing the cost of delivering equal or greater IT value, taking advantage of Azure’s flexibility, low management costs, and broad portfolio of services and pricing models. Learn how CYTI was able to take advantage of Azure’s flexibility to reduce infrastructure costs.

"We're now saving about 30 percent a year on infrastructure costs just by moving to Azure, with more flexibility, better servers, greater customization, and more freedom to do what we want." – Darren Gourley, Chief Technology Officer, CYTI

How will financial KPIs and processes change?

When migrating from on-premises to Azure, there are several financial benefits that subsequently impact KPIs and finance processes. The two most prominent are: 1) budget and financial reporting processes, as expense shifts from capital expenditure (CAPEX) to operational expenditure (OPEX); and 2) the impact on EBITDA (earnings before interest, taxes, depreciation, and amortization).

CAPEX to OPEX: During an Azure migration, spend that was previously allocated to CAPEX is redeployed to OPEX. This is optimal from a cash flow timing and balance sheet flexibility perspective, but requires CFOs to shift budgets to support the new model. Capstone Mining used this approach to significantly lower their capital costs by moving to Azure.
"We wanted to eliminate $3 million (USD) in capital costs over about three years, and to reduce our operating costs by approximately the same amount. At the same time, we wanted to improve our quality of service. With Azure, we're confident about meeting these goals." – Jim Slattery, Chief Financial Officer, Capstone Mining
EBITDA: EBITDA is a financial metric that companies use to measure profitability. Because it excludes depreciation and amortization, it ignores real costs like on-premises server purchases, which are capitalized and depreciated over time. After a migration, cloud consumption is an operating expense, so those costs reduce EBITDA directly; if your company tracks EBITDA, expect the metric to be affected by the shift. Rather than overly focusing on EBITDA, many customers choose to identify additional financial metrics that better measure business value improvements (such as cash flows, operating income, or cost of goods sold efficiency).
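A simplified sketch of the EBITDA mechanics (all figures below are hypothetical): on-premises, server purchases are capitalized and their depreciation sits below the EBITDA line, while cloud consumption is an operating expense that reduces EBITDA directly, even as operating income improves:

```python
revenue = 1_000.0        # hypothetical annual figures (in thousands)
other_opex = 600.0

# On-premises: a 300 server purchase, depreciated straight-line over 3 years,
# is excluded from EBITDA (the "D" in EBITDA) but hits operating income.
depreciation = 300.0 / 3
ebitda_on_prem = revenue - other_opex
operating_income_on_prem = ebitda_on_prem - depreciation

# Cloud: an 80/year consumption bill is pure OPEX and reduces EBITDA directly.
cloud_opex = 80.0
ebitda_cloud = revenue - other_opex - cloud_opex
operating_income_cloud = ebitda_cloud   # no server depreciation left

print(ebitda_on_prem, ebitda_cloud)                      # 400.0 320.0
print(operating_income_on_prem, operating_income_cloud)  # 300.0 320.0
```

In this toy example EBITDA falls after migration even though operating income rises, which is exactly why metrics beyond EBITDA are worth tracking.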

Managing financial KPIs and processes is a critical component of a CFO’s job. By creating a channel of communication with your financial stakeholders and highlighting how a cloud migration affects these KPIs and processes, you can begin working with your finance team to proactively reset expectations around both capital/operating budgets and EBITDA targets in a cloud versus on-premises world.

Implementing the business case: Ongoing cost-optimization and management

Once the cloud migration project begins, here are a few tips and best financial practices for success:

Reducing on-premises asset acquisitions: There must be broad internal alignment and processes to evaluate and control how and when teams buy new on-premises assets. Every new purchase adds fixed costs that delay cloud savings.
Initial resource clean-up, rightsizing, and optimization: When migrating to Azure, consider which workloads are no longer needed and can be turned off. For workloads still needed, consider what can be done to optimize those resources and operational hours, leveraging tools such as Azure Migrate.
Continuous cost optimization: Workloads aren’t static. Once in Azure, leverage our tools (including Azure Cost Management and Azure Advisor) and establish processes to monitor resources and patterns to continuously optimize cloud costs.
Resource tagging and spend categorization: Azure allows for simplified resource tagging and cost allocation compared with on-premises. This helps increase spend accountability while evaluating workload ROI. Through resource tagging, you can better align your spend to cost categories like cost of goods sold (COGS) or research and development, and allocate workload costs directly to the underlying business units. Targeted cost allocation can directly help drive efficiencies and reductions.
Billing models: Azure billing models like reserved instances and spot pricing are fantastic opportunities to save money. As an example, Azure three-year Reserved Instances (RI) do not require upfront payment, have tremendous flexibility, and provide discounts up to 72 percent.
Azure Hybrid Benefit: With Azure you can take advantage of your existing Microsoft licenses with Software Assurance to avoid incremental licensing costs for migrating workloads and maximize previous investments.
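As an illustration of the kind of roll-up that tagging enables (the resource names, tags, and amounts below are made up), spend exported with tags can be aggregated by cost category and business unit in a few lines:

```python
from collections import defaultdict

# Hypothetical exported billing records: (resource, tags, monthly cost).
records = [
    ("vm-web-01",  {"costCategory": "COGS", "businessUnit": "retail"}, 420.0),
    ("vm-web-02",  {"costCategory": "COGS", "businessUnit": "retail"}, 410.0),
    ("vm-ml-dev",  {"costCategory": "R&D",  "businessUnit": "labs"},   275.0),
    ("sql-orders", {"costCategory": "COGS", "businessUnit": "retail"}, 180.0),
]

by_category = defaultdict(float)
by_unit = defaultdict(float)
for _, tags, cost in records:
    by_category[tags["costCategory"]] += cost
    by_unit[tags["businessUnit"]] += cost

print(dict(by_category))   # spend split between COGS and R&D
print(dict(by_unit))       # spend charged back to each business unit
```

The same roll-up done with consistent tags in Azure gives finance a direct line from cloud spend to the income statement categories they already manage.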

Figure 2: Well-optimized cloud usage can free up excess capacity

Aligning cloud spend with underlying workload usage

A) Idle capacity: Azure allows customers to eliminate idle capacity intended to cover future growth across workloads. Actions like rightsizing or eliminating unnecessary workloads can help you reduce your idle capacity when moving to the cloud.

B) Variable workloads: Azure customers only pay for the hours they need when demand temporarily peaks above average levels on variable workloads. Taking advantage of tools and actions like VM scale sets and “snoozing” can help you only pay for the resources needed.

C) Predictable workloads: Azure customers can minimize costs of predictable workloads by taking advantage of Azure Reserved Instances and Spot prices.
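One rough way to price these three patterns together (rates, instance counts, and hours below are hypothetical, not Azure pricing): reserve the predictable baseline at a discounted rate, pay on-demand only for the peak hours of the variable spike, and run snoozed dev/test capacity only part-time:

```python
HOURS_PER_MONTH = 730
on_demand_rate = 1.00    # illustrative $/instance-hour, not Azure pricing
reserved_rate = 0.40     # illustrative reserved-instance rate

# (C) Predictable 24x7 baseline: cover with reserved capacity.
baseline = 10 * HOURS_PER_MONTH * reserved_rate
# (B) Variable spike: 6 extra instances, on-demand, for 60 peak hours.
variable = 6 * 60 * on_demand_rate
# (A) Idle capacity: 4 dev/test instances "snoozed" outside working hours,
#     running ~250 of 730 hours instead of all month.
dev_test = 4 * 250 * on_demand_rate

optimized = baseline + variable + dev_test
# Naive alternative: everything sized for peak and left running 24x7.
naive = (10 + 6 + 4) * HOURS_PER_MONTH * on_demand_rate

print(optimized, naive)                # 4280.0 14600.0
print(f"{1 - optimized / naive:.0%}")  # roughly 71% lower in this example
```

The split matters: applying the wrong model (for example, reserving spiky capacity or running dev/test around the clock) erodes most of the savings.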

What’s next?

As the cloud migration team in IT, ensure finance partners and key stakeholders are brought in from the beginning, and include them in appropriate decision-making and progress review forums. Reach out to your finance peers to better understand their expectations and how you can collaborate as you embark on your cloud migration project. Use the Cloud Adoption Framework for Azure for best practice guidance around aligning your organization to a common vision and approach.
Leverage cost-savings offers (including Azure Hybrid Benefit and Reserved Instances) and free tools (Azure TCO calculator, Azure pricing calculator, Azure Migrate) as you plan and prepare for cloud migration.
Use tools like Azure Cost Management and Azure Advisor once on Azure to drive continuous optimization; ensure financial stakeholders have appropriate access and visibility.
For expert assistance from Microsoft or our qualified partners, check out our Cloud Solution Assessment offerings or join the Azure Migration Program (AMP).

We hope this gives you a good understanding of the critical intersection between IT and finance in the context of your organization’s cloud migration journey. Engaging the migration leadership team within your organization to collaboratively create both the technical and the correlating financial roadmap ensures alignment and facilitates migration success and long-term organizational success. In the coming weeks, we will continue this blog series with deeper dives on topics like assessments, landing zones, infrastructure, data, and application migration best practices.

Share your feedback

Please share your experiences or thoughts in the comments section below—we appreciate your feedback.
Source: Azure

Now save up to 52 percent when migrating to Azure Databricks

More than ever before, companies are relying on their big data and artificial intelligence (AI) systems to find new ways to reduce costs and accelerate decision-making. However, customers using on-premises systems struggle to realize these benefits due to administrative complexity, inability to scale their fixed infrastructure cost-effectively, and lack of a shared collaborative environment for data engineers, data scientists and developers.

To make it easier for customers to modernize their on-premises Spark and big data workloads to the cloud, we’re announcing a new migration offer with Azure Databricks. The offer includes:

Up to a 52 percent discount over the pay-as-you-go pricing when using the Azure Databricks Unit pre-purchase plans. This means that customers can free themselves from the complexities and constraints of their on-premises solutions and realize the benefits of the fully managed Azure Databricks service at a significant discount.
Free migration assessment for qualified customers.

Azure Databricks is a fast, easy, and collaborative Apache Spark-based service that simplifies building big data and AI solutions. Since its debut two years ago, Azure Databricks has experienced significant adoption from customers, such as Shell, Cerner, Advocate Aurora Health, and Bosch, which are using it to run mission-critical big data and AI workloads.

We’ve also seen several customers accelerating their migration of on-premises systems to Azure Databricks for the following reasons:

Reduced costs and enhanced security: Moving to the fully managed Azure Databricks environment enables customers to reduce administrative costs while also helping increase overall security and compliance of their solutions. Autoscaling and auto-termination of jobs help reduce operational costs. In addition, native integration with Azure Data Lake Storage Gen 2, which supports the Hadoop Distributed File System (HDFS) format, helps reduce migration costs.
Increased agility: On-premises systems are limited to a fixed amount of compute and storage. With Azure Databricks, customers can quickly scale up or down compute resources as needed to accelerate jobs and increase productivity.
Enhanced collaboration: Azure Databricks empowers data engineers, data scientists and developers to collaborate in an interactive workspace using the languages and frameworks of their choice. Integration with Azure Machine Learning, Synapse Analytics, and Cosmos DB provides users easy access to new technologies, thereby accelerating overall time to value.

This new offer is designed to help customers who are still using on-premises big data systems but are looking to move to the cloud and take advantage of Azure Databricks capabilities.

Offer details

The Azure Databricks Unit pre-purchase plan already enables customers to save up to 37 percent over pay-as-you-go pricing when they pre-pay for one- or three-year commitments. With the migration offer, we are adding an extra 25 percent discount for three-year pre-purchase plans larger than 150,000 DBCUs and a 15 percent discount for one-year pre-purchase plans larger than 100,000 DBCUs. The offer is valid until January 31, 2021. More information on the Azure Databricks Unit pre-purchase plan can be found on the pricing page.
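The headline "up to 52 percent" is consistent with the two discounts stacking multiplicatively rather than additively; a quick sanity check (this stacking model is our reading of the offer, not official pricing guidance):

```python
base_discount = 0.37    # three-year DBCU pre-purchase plan vs pay-as-you-go
extra_discount = 0.25   # additional migration-offer discount (>150,000 DBCUs)

# Assumed stacking: pay (1 - 0.37) of list price, then (1 - 0.25) of that.
effective_price = (1 - base_discount) * (1 - extra_discount)
effective_discount = 1 - effective_price
print(f"{effective_discount:.2%}")   # 52.75%, in line with "up to 52 percent"
```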

All Azure Databricks SKUs—Premium and Standard SKUs for Data Engineering Light, Data Engineering, and Data Analytics—are eligible for this migration offer. The Azure Databricks pre-purchase units can be used at any time and can be consumed across all Databricks workload types and tiers.

Qualified customers will also receive a free migration evaluation. This includes an assessment of current tools, systems, and processes, and a two-day workshop to identify value drivers, prioritize use cases, and define the future state architecture.

Get started today

Learn more about migration to Azure Databricks and the offer by watching this webinar. For more information on discount tiers, please visit the Azure Databricks pricing page and contact your sales team to take advantage of this offer.
Source: Azure

Azure Monitor for SAP Solutions is now in preview

Some of the largest enterprises in the world are currently running their SAP solutions on Microsoft Azure. Since these SAP applications are mission critical, a delay or disruption of service for even a minute can have a significant financial and reputational impact on an organization.

To help our customers effectively monitor their SAP on Azure deployments, today we are announcing the preview of Azure Monitor for SAP Solutions. With this Azure-native monitoring solution, customers running their SAP landscapes on Azure now have access to simplified monitoring, efficient troubleshooting, and flexible customizations. Watch Introducing Azure Monitor for SAP Solutions on Azure Friday.

Before we announced a private preview of Azure Monitor for SAP Solutions in September 2019, we heard from customers that they relied on complex and unmanageable disparate tools and dashboards. Customers wanted to collect the required SAP telemetry in one location for an end-to-end view to easily recognize patterns and correlate data between various components within their SAP landscapes.

“Azure Monitor for SAP Solutions enables infrastructure teams to quickly identify the state of the enterprise critical SAP HANA DB without being an SAP HANA Expert. We had several occasions where functional teams pointed at infrastructure for system issue and with the use of the monitor we could quickly confirm or point at the real root cause for the issue. The tool speeds up the time it takes to identify who needs to be involved in solving whatever problem the customer faces…” —Thomas Kremer, Sr. Manager II Cloud and Service Delivery, Walgreens

Key features of Azure Monitor for SAP Solutions

Key features of Azure Monitor for SAP Solutions include:

Multi-instance/multi-provider: Customers can get telemetry data from multiple systems of the same source system type or from multiple systems of different source system types. For example, customers can deploy just one monitoring resource to monitor multiple SAP HANA instances and multiple Pacemaker clusters.
SAP HANA DB telemetry: Customers can collect and view HANA Backup and HSR telemetry, in addition to the infrastructure utilization data from various SAP HANA instances in one location with the Azure portal.
Microsoft SQL Server telemetry: Customers can get telemetry from Microsoft SQL Server, can visualize and correlate telemetry data—such as CPU and memory with top SQL statements—and can also get information about ‘Always On.’
High-availability (HA) cluster telemetry: Customers can get telemetry data from Pacemaker clusters and identify which clusters are healthy versus unhealthy and correlate this with the health of underlying node and resource health.

Benefits of Azure Monitor for SAP Solutions

Benefits of Azure Monitor for SAP Solutions include the ability to:

Easily collect and consolidate telemetry data from Azure infrastructure and databases in a central location, independent of the underlying infrastructure (Azure Virtual Machines, Azure Large Instances, or both). Customers can use this data to visually correlate telemetry between different components for faster troubleshooting.
Create Azure dashboards to see telemetry from both the SAP and non-SAP components running on Azure. This can be done with ‘pinning’ to combine telemetry from Azure Monitor for SAP Solutions (used to monitor SAP landscape components) with telemetry from Application Insights or Log Analytics (used to monitor non-SAP components).
Edit the visualizations to create customized charts and graphs. Customers can run custom Kusto queries on the raw data collected by Azure Monitor for SAP Solutions to identify patterns, configure alerts to get proactive notifications, and configure custom data retention period to retain telemetry data for trend analysis.
Integrate with Azure Lighthouse. With this, partners can view telemetry across different tenants as per appropriate access policies. This enables partners to help their customers with monitoring and troubleshooting their SAP on Azure landscapes.

In addition, Azure Monitor for SAP Solutions is open source, so customers can see the inner workings of the product and offer feedback by visiting this GitHub repository.

Pricing and availability

Azure Monitor for SAP Solutions is available in West Europe, East US, East US 2, and West US 2.
There is no licensing fee for the product. Customers only pay for the underlying infrastructure which is deployed as part of the product.

Learn more

To learn more about the product and pricing, check out the Azure Monitor for SAP Solutions documentation. To get started, watch this QuickStart video and head to Azure Marketplace to create your first resource.
Source: Azure

Run high scale workloads on Blob storage with new 200 TB object sizes

Azure Blob storage is a massively scalable object storage solution that serves from small amounts to hundreds of petabytes of data per customer across a diverse set of data types, including logging, documents, media, genomics, seismic processing, and more. Read the Introduction to Azure Blob storage to learn more about how it can be used in a wide variety of scenarios.

Increasing file size support for Blob storage

Customers with on-premises workloads today use files that are limited only by the filesystem, with maximum file sizes reaching into the exabytes. Most usage does not approach those limits, but specific workloads that rely on large files do scale into the tens of terabytes. We recently announced the preview of our new maximum blob size of 200 TB (specifically 209.7 TB), a 40x increase over the current 5 TB limit. This is also far larger than the 5 TB maximum object size offered by other vendors. The increase allows workloads that currently require multi-TB files to be moved to Azure without additional work to break up these large objects.

This increase in object size limit unblocks workloads, including seismic analysis, backup files, and media and entertainment (video rendering and processing), that rely on multi-TB objects. As an example, a media company trying to move from a private datacenter to Azure can now do so with support for files up to 200 TB in size. Increasing our object size removes the need to carefully inventory existing file sizes as part of a plan to migrate a workload to Azure. Given that many on-premises solutions can store files in the tens to hundreds of terabytes, removing this gap simplifies migration to Azure.

With large file size support, being able to break an object into blocks to ease upload and download is critical. Every Azure blob is made up of up to 50,000 blocks. This allows a multi-terabyte object to be broken down into manageable pieces for write. The previous maximum of 5 TB (4.75 TiB) was based on a max block size of 100 MiB x 50,000 blocks. The preview increases the block size to 4,000 MiB and keeps 50,000 blocks per object, for a maximum object size of 4,000 MiB x 50,000 = 190.7 TiB. Conceptually, in your application (or within the utility or SDK), the large file is broken into blocks, each block is written to Azure Storage, and, after all blocks have successfully been uploaded, the entire file (object) is committed.
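The arithmetic behind the new limit, plus the chunking idea, can be sketched as follows (illustrative code, not the Azure Storage SDK):

```python
MIB = 1024 ** 2
TIB = 1024 ** 4

MAX_BLOCK_BYTES = 4_000 * MIB   # new preview block size: 4,000 MiB
MAX_BLOCKS = 50_000             # blocks per blob (unchanged)

max_blob_bytes = MAX_BLOCK_BYTES * MAX_BLOCKS
print(f"Max blob size: {max_blob_bytes / TIB:.1f} TiB")   # 190.7 TiB

def block_ranges(total_bytes, block_size=MAX_BLOCK_BYTES):
    """Yield (offset, length) pieces an uploader would stage as blocks."""
    for offset in range(0, total_bytes, block_size):
        yield offset, min(block_size, total_bytes - offset)

# A hypothetical 10 TiB file fits comfortably under the 50,000-block cap.
blocks = list(block_ranges(10 * TIB))
print(len(blocks))   # 2622 blocks
```

Each (offset, length) pair corresponds to one staged block; committing the block list at the end is what turns the pieces into a single blob.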

As an example of the overall relationship within a storage account, the following diagram shows a storage account, Contososa, which contains one container with two blobs. The first is a large blob made up of 50,000 blocks. The second is a small blob made of a single block.

The 200 TB preview block blob size is supported in all regions and across tiers, including Premium, Hot, Cool, and Archive. There is no additional charge for this preview capability. Upload of very large objects is not yet supported in the Azure portal. The various methods to transfer data into Azure will be updated to make use of this new blob size. To get started today with your choice of language:

.Net.
Java.
JavaScript.
Python.
REST.

Next steps

We look forward to hearing your feedback via email or a post in the Azure Storage TechNet forum.

Learn more about Azure Blob storage.
Source: Azure

Save up to 76 percent on Azure Synapse Analytics and gain breathtaking insights of your ERP data

To help customers save on data warehouse migration costs and accelerate time-to-insight on critical SAP data, we are announcing two new analytics offers from Azure Synapse Analytics.

Business disruptions, tactical pivots, and remote work have all emphasized the critical role analytics plays for every organization. Uncharted situations demand charted performance insights, so businesses can quickly determine what is and is not working. In recent months, the urgency for these business-guiding insights has only been heightened—leading to a need for real-time analytics solutions. And equally important is the need to discover and share these insights in the most cost-effective manner.

Azure Synapse has you covered. It is the undisputed leader in price-performance: compared to other cloud providers, it is up to 14 times faster and costs 94 percent less. In fact, businesses using Azure Synapse today report an average ROI of 271 percent.

To help customers get started today, we are announcing the following new offers aimed at empowering businesses to act now wherever they are on their cloud analytics journey.

Save up to 76 percent when migrating to Azure Synapse

For customers that use an on-premises data warehouse, migrating to the cloud offers both significant cost savings and accelerated access to innovative features. Today, customers experience cost savings with our existing reserved capacity discount for cloud data warehousing with Azure Synapse. To boost these cost savings further, today we are announcing a new limited time offer that provides additional savings on top of the existing reserved capacity discount—enabling qualifying customers who currently use an on-premises data warehouse to save up to 76 percent when migrating to Azure Synapse.

To learn more about the terms and conditions and the qualification criteria of this offer, contact your Microsoft account representative. The migration offer is available until January 31, 2021.

Gain breathtaking insights of your ERP data with new offering from Azure, Power BI, and Qlik Data Integration

For companies worldwide, SAP data is at the core of their business applications—housing critical information on sales, manufacturing, and financial processes. However, due to the inherent complexity of SAP systems, many organizations struggle to integrate SAP data into modern analytics projects. To enable businesses to gain real-time insights from their SAP data, we are announcing a new joint offer with Qlik (formerly Attunity) that brings Azure Synapse, Power BI, and Qlik Data Integration together for end-to-end supply chain intelligence, finance analytics, and more.

With this new offer, customers can now work with Azure, Power BI, and Qlik Data Integration to easily understand how to enable real-time insights on SAP data through a robust proof of value. This joint proof-of-value offer provides customers a free solution architecture workshop, software subscriptions, and hands-on technical expertise from dedicated personnel and resources from both Microsoft and Qlik.

To learn more about this joint offer and how to apply, register for the upcoming webinar.

Get started today

Register for the webinar, Gain Real-Time SAP Data Insights with Azure Synapse Analytics, airing July 30, 2020 at 10:00 AM PT.
Try the new Azure Synapse features and  create an Azure Synapse workspace in minutes.
Learn more about the new joint offer, Unleash your SAP data with Microsoft and Qlik.

Migration offer details

The Azure Synapse Analytics reserved capacity plan for data warehousing (formerly SQL Data Warehouse) already enables customers to save up to 65 percent over pay-as-you-go pricing when they pre-pay for a three-year commitment. With this new migration offer, we are adding an extra 33 percent discount for three-year pre-commitments that spend over $60,000/year in year 2 and year 3 on SQL Data Warehouse Compute Optimized Gen2. Terms and conditions apply and can be discussed in full with your Microsoft account representative. More information on the Azure Synapse Analytics (formerly SQL Data Warehouse) reserved capacity plan can be found on the pricing page.
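As with similar offers, "up to 76 percent" is consistent with the two discounts compounding multiplicatively; a quick sanity check (this stacking model is an assumption, not official pricing guidance):

```python
base_discount = 0.65    # reserved capacity (three-year) vs pay-as-you-go
extra_discount = 0.33   # additional migration-offer discount

# Assumed stacking: pay (1 - 0.65) of list price, then (1 - 0.33) of that.
effective_price = (1 - base_discount) * (1 - extra_discount)
effective_discount = 1 - effective_price
print(f"{effective_discount:.2%}")   # 76.55%, in line with "up to 76 percent"
```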
Source: Azure

Azure Partner Zone brings new resources and special events for Partners

On July 1, 2020, the Microsoft Azure team launched a new experience for Azure Partners on Azure Partner Zone. The site will feature the latest news, resources, and content to help partners stay abreast of emerging trends, learn about new products and features, and find the tools needed to build a thriving Azure practice.

In addition to the new web experience for Azure Partners, engineering and marketing teams across Azure will be hosting special events for partners, designed to help build, scale, and secure your Azure practice. The most recent event, with industry experts Service Leadership, Inc., was aimed at helping our IT Service Partners prepare for and manage a new global economic climate. The webinar illustrated likely upcoming macro-economic scenarios and Service Provider revenue and profit paths through recovery, and suggested actions for Service Providers to maximize revenue, profit, and safety. A recording of the event is available on Partner Zone.

The next event is an exclusive pre-Inspire workshop for Microsoft Partners on Monday, July 20. This full-day, eight-session series is an interactive experience for partners designed to help you build and scale managed services on Azure.  Hear from leadership about the future of Azure and learn how you can leverage our cloud-native management solution to reduce operational costs, generate new revenue opportunities, and expand the value of your offerings. 

Azure Management Workshop details

The Azure Management Workshop: An interactive journey for building, scaling, and optimizing your Azure practice using our one native management solution for Azure.

Monday, July 20 from 9 AM to 5 PM Eastern Time.

Sessions include:

Unplugged with Azure Leadership, an Interview with Julia White, CVP Azure Marketing and Erin Chapple, CVP Azure Compute.
Partnering to Drive Customer Success Pre-, During, and Post-Migration to Azure.
Building Scalable Managed Services on Azure with Azure Lighthouse.
Enabling Governance for Managed Services with Azure Policy.
Enabling Hybrid Managed Service Capabilities with Azure Arc.
Deploying Azure Managed Services at Scale Using ARM Templates.
Building and Scaling a Managed Security Practice on Azure Using Azure Sentinel and Azure Lighthouse. 
Taking the Next Step: Becoming an Azure Expert MSP.

Register today for this one-time event on Azure Partner Zone.

The relaunch of Azure Partner Zone

Azure Partner Zone will be updated regularly to help partners migrate existing applications to Azure, innovate with new apps, and enable customers for success. Explore the Practices pages to discover Azure services aligned to support each of these business needs and alleviate challenges. To learn more about each one and drill further into available training documentation, head over to the Solutions pages. You’ll also find a section for building a COVID-19 strategy, showcasing the best tools and resources to help you and your customers navigate this uncharted climate.

For extended education, check out the Training Library, Partner Resources Catalog, and Partner News Center. Here you can dive into self-paced learning for almost any Azure topic, plus find white papers, datasheets, infographics, videos, latest industry blogs, and more. If you are short on time, try the ‘save’ feature to post things to your Partner Zone profile for viewing or revisiting later on. Visitors who log in to the site with their LinkedIn profile can find saved content under View my Toolkit from the profile dropdown.

Partner Zone also helps partners stay connected to the Azure team through the year-round Partner Newsletter and Azure Partner Community. Newsletter subscribers will get early access to the latest news, events, and opportunities for partner engagement while the Azure Partner Community offers a place for partner-to-partner connections and direct communications with the Azure team.

Build and expand your Azure practice today and visit Azure Partner Zone.
Source: Azure

Azure Maps Power BI visual now in preview

The Azure Maps visual for Power BI is releasing in preview this week. Power BI is a powerful analysis and visualization tool, and Azure Maps adds geospatial context and insights that can be used in decision making.

This initial release includes the following visualization layers:

Bubble layer
3D bar chart layer
Reference layer
Custom tile layer
Real-time traffic overlay

In addition to these visualization layers, the visual also leverages built-in Power BI features such as tooltips, color themes, and filter and slicer support.

Bubble layer—represent location data as scaled circles

Bubble layers are a great way to represent location data as scaled circles on the map. Customers can use a linear scaling method or customize the scaling logic using a logarithmic or cubic Bezier curve. Additionally, users can pass a value into the legend field to have the fill color of the circles set dynamically. Circles can be outlined with a single color, or the high-contrast outline option can assign a high-contrast variant of the fill color so that the circles remain clearly visible regardless of which style the map is set to. Together, these options let users easily visualize two metrics for each location on the map: scale and category.
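
To make the scaling options concrete, here is a small, self-contained Python sketch of how a linear or cubic Bezier radius scale could work. The function names and default pixel range are illustrative assumptions, not the visual's actual implementation:

```python
def cubic_bezier(t, p1, p2):
    """Evaluate a cubic Bezier easing curve anchored at (0,0) and (1,1).

    p1 and p2 are the y-coordinates of the two control points; t in [0, 1].
    """
    mt = 1.0 - t
    return 3 * mt * mt * t * p1 + 3 * mt * t * t * p2 + t ** 3


def bubble_radius(value, vmin, vmax, rmin=2.0, rmax=20.0, easing=None):
    """Map a data value to a bubble radius in pixels.

    With easing=None this is linear scaling; pass a function such as a
    cubic Bezier curve to emphasize small or large values instead.
    """
    t = (value - vmin) / (vmax - vmin) if vmax > vmin else 0.0
    if easing is not None:
        t = easing(t)
    return rmin + t * (rmax - rmin)


# Linear: a value halfway through the range gets the midpoint radius.
print(bubble_radius(50, 0, 100))  # 11.0
# Eased: an ease-out style curve grows small values faster.
print(bubble_radius(50, 0, 100, easing=lambda t: cubic_bezier(t, 0.8, 1.0)))
```

Different control points change how quickly small values grow relative to large ones, which is the kind of knob the visual's scaling settings expose.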

For example, the following image shows bicycle accident locations in North Carolina. The color indicates the speed limit of the road the accident occurred on and the size is based on the number of individuals involved in the accident.

3D bar chart layer—visualize location data as 3D bars or cylinders

3D bar charts are useful for taking data to the next dimension by allowing visualization of location data as 3D bars or cylinders on the map. Users can tilt and rotate the map by holding down the right mouse button and dragging, or use one of the navigation controls, to view their data from different perspectives.

Similar to the bubble layer, the bar chart layer can easily visualize two metrics at the same time using color and relative height. The following map displays store locations with bar heights representing the revenue generated from each location, colored by sales region.

Reference layer—overlay additional data layers to add more context

Power BI currently allows a single data set to be connected to a visual. However, when working with maps, it's often desirable to overlay additional data layers to add more context to a report. With this feature, a GeoJSON file containing custom location data can be uploaded and overlaid on the map. Properties in the GeoJSON file can be used to customize the style of the shapes.

For example, the following map image adds a GeoJSON file of census tract boundaries colored by population below a layer of addresses colored by real estate value. This provides insights on how population density is related to property values.
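
As a sketch of what such a reference layer file might contain, here is a minimal GeoJSON FeatureCollection built in Python. The styling keys ("fillColor", "fillOpacity") are illustrative assumptions; check the visual's documentation for the property names it actually honors:

```python
import json

# One census-tract polygon with a population attribute, in GeoJSON form.
tract_layer = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "geometry": {
                "type": "Polygon",
                # A closed ring: the first and last positions are identical.
                "coordinates": [[
                    [-78.95, 35.99], [-78.90, 35.99],
                    [-78.90, 36.03], [-78.95, 36.03],
                    [-78.95, 35.99],
                ]],
            },
            "properties": {
                "population": 4210,
                "fillColor": "#1f77b4",   # assumed styling key
                "fillOpacity": 0.5,       # assumed styling key
            },
        }
    ],
}

geojson_text = json.dumps(tract_layer, indent=2)
```

Saved as a .json file, this is the kind of document a reference layer upload expects.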

Custom tile layer—superimpose images on top of Azure Maps base map tiles

Overlay a custom tile layer on the map to add an additional layer of context. Tile layers allow you to superimpose images on top of Azure Maps base map tiles. Overlay weather data from the Azure Maps weather services or bring your own tile service.
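
Behind a {z}/{x}/{y} tile URL template is the standard Web Mercator tiling scheme. As a sketch of the math a tile service uses to locate images, here is the usual coordinate-to-tile conversion in Python, written from the well-known slippy-map formula rather than from Azure Maps internals:

```python
import math

def lat_lon_to_tile(lat, lon, zoom):
    """Convert WGS84 coordinates to XYZ tile indices at a zoom level.

    At zoom z the world is a 2^z by 2^z grid of tiles; x grows eastward
    and y grows southward from the top-left corner.
    """
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# Example: the tile covering Seattle at zoom level 10.
print(lat_lon_to_tile(47.6062, -122.3321, 10))
```

A custom tile service simply serves one image per (z, x, y) triple, which is what the URL template's placeholders are filled with.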

The following map displays a bubble layer of sales data for stores selling sunglasses above a tile layer showing current weather radar from Azure Maps. In this case, we can easily see that fewer sunglasses are being sold where it is raining.

Real-time traffic overlay—see how traffic congestion relates to your data

Users can overlay real-time traffic flow data to see how traffic congestion relates to their data. For example, the following map is showing the position of field technicians rendered as a bubble layer on the map colored by their experience level and scaled by the amount of remaining time on their current job. Real-time traffic is overlaid on the map and provides a quick visual reference of which technicians are most likely be delayed getting to their next job due to traffic congestion.

Get started with the Azure Maps visual for Power BI

To get started using the Azure Maps visual, first enable it in the Power BI desktop app. To do this, open the options panel through File > Options and settings. Go to the Preview features options and select the Azure Maps visual. Once this is done you will also be able to use this visual in the Power BI website.

This is just the beginning! We have lots of exciting new features planned. Have a feature request? Let us know or vote for an existing request on our feedback site.

Learn more about the Azure Maps Power BI visual.
Source: Azure

Azure Files support and new updates in advanced threat protection for Azure Storage

A year ago we announced the general availability of advanced threat protection for Azure Storage, to help our customers better protect their data in blob containers from the growing risk of cyberattacks. Since then, advanced threat protection for Azure Storage has been protecting millions of storage accounts and helping customers to detect common threats such as malware, access from suspicious sources (including TOR exit nodes), data exfiltration activities, and more.

Today we’re excited to announce the preview of extending advanced threat protection for Azure Storage to support Azure Files and Azure Data Lake Storage Gen2 API, helping our customers to protect their data stored in file shares and data stores designed for enterprise big data analytics.

Growing demand to secure file shares and data lakes

More and more organizations are moving their data to the cloud, seeking better security and data protection, data modernization, and optimized cost and performance of IT operations. It’s expected that over 80 percent of enterprise workloads will be in the cloud by the end of 2020.

This growing demand has also increased the popularity of Azure Files, which delivers secure, Server Message Block (SMB)-based, fully managed cloud file shares that can also be cached on-premises for performance and compatibility.

With Azure Files, organizations get the added benefit of a secure storage infrastructure that is massively scalable, and globally available. Even with all these capabilities, it’s still essential to bolster cybersecurity, especially with the growing complexity and sophistication of cyberattacks.

In addition, we’re seeing growing demand for data stores optimized for big data analytics and the need to serve and manage massive amounts of data. Azure Data Lake Storage Gen2 is a set of capabilities dedicated to big data analytics, built on Azure Blob storage with a focus on performance, management, and security. It supports serving multiple petabytes of information while sustaining hundreds of gigabits of throughput.

What’s included in advanced threat protection for Azure Files and ADLS Gen2 API

Advanced threat protection for Azure Storage provides an additional layer of security intelligence that provides alerts when it detects unusual and potentially harmful attempts to access or exploit your storage accounts. This layer of protection allows you to address threats without being a security expert or managing security monitoring systems.

Security alerts are triggered when anomalies in activity occur. These security alerts are integrated with Azure Security Center and are also sent via email to subscription administrators, with details of suspicious activity and recommendations on how to investigate and remediate threats.
 

Besides the built-in security of Azure file shares and data lakes, customers of advanced threat protection for Azure Storage also benefit from:

World-class algorithms that learn, profile, and detect unusual or suspicious activity in your file shares.
Actionable alerts in a centralized view in Azure Security Center with optional email notifications.
Integration with Azure Sentinel for efficient threat investigation.
Azure-native support for Azure Files with one click enablement from the Azure portal and with no need to modify your application code.

Get started today

We encourage you to try out advanced threat protection for Azure Storage and start detecting potential threats on your Azure Files shares and Azure Blob containers. Advanced threat protection for Azure Storage needs to be enabled on the storage accounts containing the file shares and blob containers you want to protect.

We recommend enabling advanced threat protection for Azure Storage at the subscription level by following the instructions here: Configure advanced threat protection for Azure Storage.
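
For programmatic rollout, the setting can also be flipped per storage account through the Azure Resource Manager REST API. The sketch below only constructs the request; the resource path and API version shown are our assumptions, so verify them against the Microsoft.Security REST reference, and note that authentication is omitted:

```python
# Build (but do not send) the ARM request that enables advanced threat
# protection on a single storage account.
ARM = "https://management.azure.com"
API_VERSION = "2019-01-01"  # assumed api-version; confirm in the REST docs

def atp_enable_request(subscription_id, resource_group, account):
    """Return the method, URL, and JSON body for the enablement call."""
    url = (
        f"{ARM}/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Storage/storageAccounts/{account}"
        "/providers/Microsoft.Security/advancedThreatProtectionSettings/current"
        f"?api-version={API_VERSION}"
    )
    body = {"properties": {"isEnabled": True}}
    return "PUT", url, body

method, url, body = atp_enable_request("<sub-id>", "my-rg", "mystorageacct")
```

The Azure CLI offers an equivalent shortcut (`az security atp storage update --is-enabled true` with the resource group and account name), which may be more convenient for scripting.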

Learn more about the pricing of advanced threat protection for Azure Storage on the Azure Security Center pricing page.

For more information on Azure Security Center, please visit the Azure Security Center web page.
Source: Azure

Making buildings smarter with Azure IoT

Commercial real estate developers, building owners, facilities management companies, and tenants have a huge opportunity to address, and solve for, the unique business challenges faced by their industry, by applying the Internet of Things (IoT) to buildings. For example, by leveraging data from IoT sensors and building management systems, companies can gain insights that enable them to save energy, reduce operational expenses, increase occupant comfort, and optimize space.

The COVID-19 crisis, however, has presented a new set of challenges for developers, owners, and management companies. Even so, new forecasts show the smart building market growing between 7.3 percent and 11.6 percent annually, reaching overall market revenues of between $65.2 billion and $82.7 billion USD in 2025.1

Smart buildings also help companies meet regulations for tracking and reducing greenhouse gas emissions.

Let’s look at how Bosch Building Technologies, Bentley Systems, Schneider Electric, and ICONICS use Azure IoT to deliver the benefits of smart buildings.

Decreasing energy requirements

The American Council for an Energy-Efficient Economy estimates that implementing smart building technology in an existing building can result in energy savings of 30–50 percent.2 For example, companies can combine data from occupancy sensors with data from HVAC and lighting systems to lower room temperatures and turn lights off in unoccupied rooms.

Bosch Building Technologies developed an in-house Energy Platform to analyze energy consumption and pursue ongoing energy efficiency. Based on Microsoft Azure, the Energy Platform monitors and analyzes energy consumption in real-time. Bosch customers use the Energy Platform to connect to IoT enabled devices and then link to existing meters, sensors, and machines. Customers can make informed decisions to improve energy and resource efficiency.

Bosch offers the solution to customers and uses it internally at more than 100 manufacturing plants worldwide. At one of their larger plants, Bosch saves up to €1.2 million (approximately $1.3 million USD) a year.

Bosch also created a Building Intelligence as a Service program to provide new IoT-based services for customers. Bosch adopted Azure Digital Twins as part of their Connected Building Services offering. By leveraging Azure Digital Twins, the company can query data from entire rooms or spaces, rather than from disparate sensors, to build complete digital models of the physical building environment.

By using Azure Digital Twins, Bosch gains more precise data for a wide range of building technology systems. With this level of precision, it’s easier for customers to fully understand data points, consumption results, context, and how they relate to the physical environment to quickly gain insights on energy usage to inform their business decisions.

Human factor design of new buildings can help decrease energy requirements.

Creating a connected workplace

At Microsoft’s Frasers Tower in Singapore, Bentley Systems and Schneider Electric implemented sensors and telemetry to create a connected workplace. They used a mix of 179 Bluetooth beacons in meeting rooms and 900 sensors for lighting, air quality, and temperature. The platform generates nearly 2,100 data points that are stored and analyzed in Azure. Using the data, Microsoft optimizes various aspects of the spaces, making them more comfortable for employees, while reducing energy consumption in a sustainable and economical manner.

Additionally, Bentley Systems built a digital twin of Frasers Tower on its Bentley iTwin platform—using Azure Digital Twins, Azure IoT Hub, and Azure Time Series Insights. The iTwin platform uses both historical and real-time data from IoT sensors to create an exact digital replica of the physical building. The building management team uses the information to dynamically allocate space, increase utilization, reduce costs, improve competitiveness, and enhance collaboration and productivity.

Sensors generate data that is stored and analyzed to decrease energy use.

Monitoring occupancy and reducing costs

ICONICS smart building software has run on Microsoft Azure since 2015. The software is an integration hub for building management systems that control heating, ventilation, and lighting and collect and centralize each system’s sensor data. ICONICS relies on Azure Digital Twins to boost solution scalability and rapidly deliver innovative capabilities to customers, such as viewing space occupancy and spatial analytics.

Microsoft uses the ICONICS smart building software to collect sensor data in office buildings in the Puget Sound area of Washington State. The ICONICS solution aggregates the data over multiple buildings to give facility managers visibility into building health and applies big data analytics to provide insights that drive decisions in order to deliver energy savings. In fact, the Microsoft Energy Smart Buildings program, leveraging ICONICS software, has saved Microsoft 20 percent off its energy bills.

Next steps

Smart buildings provide insights that enable real estate developers, commercial building owners, facilities managers, and tenants to save energy, reduce operational expenses, increase occupant comfort, and meet regulatory and sustainability goals.

To learn more about best practices for planning smart building projects, download the white paper, Smart buildings: From design to reality, co-written by Microsoft and L&T Technology Services.

Also visit, Azure IoT to find the right IoT approach for your solutions.

 

1Impact of COVID-19 on the Global IoT in Smart Commercial Buildings Market to 2025 – ResearchAndMarkets.com.

2 Smart Buildings: Using Smart Technology to Save Energy in Existing Buildings.
Source: Azure

Azure AI: Build mission-critical AI apps with new Cognitive Services capabilities

As the world adjusts to new ways of working and staying connected, we remain committed to providing Azure AI solutions to help organizations invent with purpose.

Building on our vision to empower all developers to use AI to achieve more, today we’re excited to announce expanded capabilities within Azure Cognitive Services, including:

Text Analytics for health preview.
Form Recognizer general availability.
Custom Commands general availability.
New Neural Text to Speech voices.

Companies in healthcare, insurance, sustainable farming, and other fields continue to choose Azure AI to build and deploy AI applications to transform their businesses. According to IDC1, by 2022, 75 percent of enterprises will deploy AI-based solutions to improve operational efficiencies and deliver enhanced customer experiences.

To meet this growing demand, today’s product updates expand on existing language, vision, and speech capabilities in Azure Cognitive Services to help developers build mission-critical AI apps that enable richer insights, save time and reduce costs, and improve customer engagement.

Get rich insights with powerful natural language processing

One of the ways organizations are adapting is scaling the ability to rapidly process data and generate new insights from data. COVID-19 has accelerated the urgency, particularly for the healthcare industry. With the overwhelming amount of healthcare data generated every year2, it is increasingly critical for providers to quickly unlock access to this information to find new solutions that improve patient outcomes.

We are excited to introduce Text Analytics for health, a new feature of Text Analytics that enables health care providers, researchers, and companies to extract rich insights and relationships from unstructured medical data. Trained on a diverse range of medical data—covering various formats of clinical notes, clinical trials protocols, and more—the health feature is capable of processing a broad range of data types and tasks, without the need for time-intensive, manual development of custom models to extract insights from the data.
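
To sketch what a call to the new feature might look like, the snippet below only assembles a request payload in the document-list shape the Text Analytics v3 REST API expects. The endpoint host and preview route shown are assumptions; confirm them, along with any preview signup requirements, in the Text Analytics for health documentation:

```python
import json

# Illustrative request payload for Text Analytics for health.
endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
path = "/text/analytics/v3.1-preview.1/entities/health"  # assumed route

payload = {
    "documents": [
        {
            "id": "1",
            "language": "en",
            "text": "Patient was prescribed 200mg ibuprofen twice daily.",
        }
    ]
}

print(json.dumps(payload, indent=2))
```

The response would carry the extracted medical entities and the relationships between them (for example, linking the dosage to the medication).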

In response to the COVID-19 pandemic, Microsoft partnered with the Allen Institute for AI and leading research groups to prepare the COVID-19 Open Research Dataset. Drawing on this resource of over 47,000 scholarly articles, we developed a COVID-19 search engine using Text Analytics for health and Cognitive Search, enabling researchers to generate new insights in support of the fight against the disease.

Additionally, we continue to make advancements in natural language processing (NLP) so developers can more quickly build apps that generate insights about sentiment in text. The opinion mining feature in Text Analytics assigns sentiment to specific features or topics so that users can better understand customer feedback from social media data, review sites, and more.

Save time and reduce costs by turning forms into usable data

A lot of unstructured data is contained in forms that have tables, objects, and other elements. These types of documents typically require manual labeling by document type or intensive coding to extract insights.

We’re making Form Recognizer generally available to help developers extract information from millions of documents efficiently and accurately—no data science expertise needed.

Customers like Sogeti, part of the Capgemini Group, are using Form Recognizer to help their clients more quickly process large volumes of digital documents.

“Sogeti constantly looks for new ways to help clients in their digital transformation journey by providing cutting-edge solutions in AI and machine learning. Our Cognitive Document Processing (CDP) offer enables clients to process and classify unstructured documents and extract data with high accuracy, resulting in reduced operating costs and processing time. CDP leverages the powerful cognitive and tagging capabilities of Form Recognizer to effortlessly extract key-value paired data and other relevant information from scanned and digital unstructured documents, further reducing the overall process time.” – Mark Oost, Chief Technology Officer, Artificial Intelligence and Machine Learning, at Sogeti

Wilson Allen, a leading provider of consulting and analytics solutions, is using Form Recognizer to help law and other professional services firms process and evaluate documents (PDFs and images, including financial forms, loan applications, and more), and train custom models to accurately extract values from complex forms.

“The addition of Form Recognizer to our toolkit is helping us turn large amounts of unstructured data into valuable information, saving more than 400 hours of manual data entry and freeing up time for employees to work on more strategic tasks.” – Norm Mullock – VP of Strategy at Wilson Allen

Improve customer engagement with voice-enabled apps

People and organizations continue to look for ways to enrich customer experiences while balancing the transition to digital-led, touch-free operations2. Advancements in voice technology are empowering developers to create more seamless, natural, voice-enabled experiences for customers to interact with brands.

One of those advancements, Custom Commands, a capability of Speech in Cognitive Services, is now generally available. Custom Commands allows developers to create task-oriented voice applications more easily for command-and-control scenarios that have a well-defined set of variables, like voice-controlled smart home thermostats. It brings together Speech to Text for speech recognition, Language Understanding for capturing spoken entities, and voice response with Text to Speech, to accelerate the addition of voice capabilities to your apps with a low-code authoring experience.

In addition, Neural Text to Speech is expanding language support with 15 new natural-sounding voices based on state-of-the-art neural speech synthesis models: Salma in Arabic (Egypt), Zariyah in Arabic (Saudi Arabia), Alba in Catalan (Spain), Christel in Danish (Denmark), Neerja in English (India), Noora in Finnish (Finland), Swara in Hindi (India), Colette in Dutch (Netherlands), Zofia in Polish (Poland), Fernanda in Portuguese (Portugal), Dariya in Russian (Russia), Hillevi in Swedish (Sweden), Achara in Thai (Thailand), HiuGaai in Chinese (Cantonese, Traditional), and HsiaoYu in Chinese (Taiwanese Mandarin).
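
When driving these voices through the Speech SDK or REST API, the voice is typically selected in SSML. The helper below builds a minimal SSML document; the voice name follows the service’s "locale-NameNeural" convention (for example, en-IN-NeerjaNeural for Neerja), but confirm exact names against the published voice list before use:

```python
def build_ssml(text, voice="en-IN-NeerjaNeural", locale="en-IN"):
    """Wrap plain text in a minimal SSML document selecting a neural voice."""
    return (
        f'<speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis" '
        f'xml:lang="{locale}">'
        f'<voice name="{voice}">{text}</voice>'
        "</speak>"
    )

ssml = build_ssml("Welcome to Azure Cognitive Services.")
print(ssml)
```

The same document shape works for any of the new voices by swapping the voice name and locale.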

Customers are already adding speech capabilities to their apps to improve customer engagement. With Cognitive Services and Bot Service, the BBC created an AI-enabled voice assistant, Beeb, that delivers a more engaging, tailored experience for its diverse audiences.

We are excited to introduce these new product innovations that empower all developers to build mission-critical AI apps. To learn more, check out our resources below.

Get started today

Learn more with the resources below and get started with Azure Cognitive Services and an Azure free account.

Text Analytics for health: Read the technical blog for more information. See it in action with the COVID-19 search engine demo. Enter medical terms such as “ibuprofen” in the search bar and try exploring graph relationships.
Form Recognizer: Read the technical blog for more information. See it in action with the Form Recognizer demo, showcasing the ability to extract information from different types of forms. Access the code samples.
Custom Commands: Read the technical blog for more information. See it in action with the inventory, hospitality, and automotive demos. Start by selecting your scenario and saying a command out loud per the prompt. Access the code samples.
Neural Text to Speech: Read the technical blog for more information. See it in action with the demo. Use the pre-populated text or add your own, and try finetuning audio output. Access the code samples.

1 Worldwide Artificial Intelligence Predictions (IDC FutureScape 2020).

2 Adapting customer experience in the time of coronavirus (McKinsey 2020).

Source: Azure