Town of Cary innovates flood prediction with IoT

This post was co-authored by Daniel Sumner, Worldwide Industry Director, Government—Smart Infrastructure at Microsoft.

According to Flood Safety, flooding is the most common type of natural disaster worldwide, affecting tens of millions of people each year and causing, on average, more than $200 billion in damages. Many communities face flood-related challenges, and the Town of Cary in North Carolina, United States, is no different. Its flood-prone areas are affected by heavy rains, which are often exacerbated by the yearly Atlantic hurricane season. When the town sees excessive rainfall, its personnel often find themselves scrambling to address overflowing stormwater systems, and even a burst water main can create a sudden flood event.

Town of Cary parking lot during a flood event.

As a leader in innovative city solutions, the Town of Cary was already committed to using smart technology, data, and analytics to optimize city functions, drive economic growth, and improve quality of life. Chief Information Officer Nicole Raimundo, Smart City Strategist Terry Yates, and Stormwater Operations Manager Billy Lee saw another opportunity: use technology to predict and manage flood events.

Envisioning a flood prediction solution

In October 2019, Cary’s leaders met with partners Microsoft and the SAS IoT division to envision a new solution. The team started by assessing the current situation.

During storm events, Cary had no visibility into river levels or how quickly the water was rising. Traditionally, the town relied on citizens to alert it to floods through phone calls, text messages, and other means. Town staff processed these requests manually, dispatching public works personnel to erect barriers and close roads, and first responders to emergencies.

The team came away with a vision for building a flood prediction system leveraging Azure IoT and SAS Analytics for IoT. Raimundo explained the need for the change.

“We felt strongly that the existing system wasn’t serving citizens in flood-prone areas well. We knew we needed a scalable solution to get us from reactive to proactive and ultimately predictive. The scalability of Azure IoT platform became a critical component of our IoT architecture. In addition, we required a robust set of analytical tools that could deliver insight from both real-time and historical data and SAS Analytics for IoT offered that.” —Nicole Raimundo, Chief Information Officer, Town of Cary

“There are thousands of cities that are similar to the Town of Cary that are looking to deploy solutions to solve urban issues such as flooding. Leveraging the Azure IoT platform and SAS Analytics for IoT these cities can move from being reactive to proactive and, ultimately, predictive in a cost-effective, scalable manner.” —Daniel Sumner, Worldwide Industry Director, Government—Smart Infrastructure at Microsoft

Defining project goals

Cary, Microsoft, and SAS agreed on several project goals:

Improve the situational awareness of town staff.
Automate stormwater personnel notifications and work order generation.
Alert citizens of flooding events.
Provide data to downstream regional and state entities.
Analyze captured data and predict future flood events.

A key requirement for the Town of Cary was that its new flood prediction system integrate with existing business systems. These included the SAS Visual Analytics dashboard integrated with ArcGIS for real-time visualization; Salesforce for alerts, automated notifications, and work orders; and data sharing with regional partner response systems.

“The Azure IoT platform has been a critical piece of our technology ecosystem and accelerates our ability to scale.” —Terry Yates, Smart City Strategist, Town of Cary

Through a series of work sessions with the partners in February 2020, the team created a project plan and system architecture. Then the implementation work began.

Town of Cary working session with Microsoft and SAS resources.

Implementing the solution

The Town of Cary installed water level sensors at various points along the Walnut Creek stream basin and rain gauges at several Town of Cary-owned facilities.

Water sensors were placed at strategic locations.

Below are highlights of how the solution was built.

Microsoft Azure IoT Hub enabled highly secure and reliable communication to ingest stormwater levels over a FirstNet LTE wireless connection. The team used Azure IoT Hub to provision, authenticate, and manage two-way communication with the sensors (a minimal device-side telemetry sketch follows this list).
SAS Analytics for IoT combined streaming sensor, gauge, and weather data for real-time scoring, dashboarding, and historical reporting.
SAS Visual Analytics provided interactive dashboards, reports, business intelligence, and analytics. The dashboard is integrated with Esri ArcGIS for additional geographic analysis and data visualization.
Microsoft Azure Logic Apps seamlessly integrated with Salesforce and other third-party applications.
Microsoft Azure Synapse Analytics provided data warehousing for big data analytics.
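
As an illustration of the ingestion path described above, here is a minimal device-side sketch using the azure-iot-device Python SDK. The connection string, sensor ID, payload fields, and reporting interval are hypothetical placeholders rather than details from the Town of Cary deployment.

```python
# Minimal device-side telemetry sketch, assuming the azure-iot-device SDK.
# The connection string, sensor ID, payload fields, and interval are placeholders.
import json
import time

from azure.iot.device import IoTHubDeviceClient, Message

CONNECTION_STRING = "HostName=<hub>.azure-devices.net;DeviceId=<sensor-id>;SharedAccessKey=<key>"


def read_water_level_inches() -> float:
    """Placeholder for the actual reading taken at the stream gauge."""
    return 12.7


def main():
    client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
    client.connect()
    try:
        while True:
            payload = {
                "sensorId": "walnut-creek-01",
                "waterLevelInches": read_water_level_inches(),
                "timestamp": time.time(),
            }
            message = Message(
                json.dumps(payload),
                content_encoding="utf-8",
                content_type="application/json",
            )
            client.send_message(message)  # device-to-cloud telemetry into IoT Hub
            time.sleep(60)                # report once a minute
    finally:
        client.disconnect()


if __name__ == "__main__":
    main()
```

Downstream, IoT Hub routes these messages to the analytics, visualization, and workflow services listed above.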

Evaluating results

The solution’s initial phase has been running for several months with positive results.

Town staff can now visualize flooding events in real time.
Stormwater personnel receive notifications and can generate work orders automatically.
A mechanism has been established to share data with regional partners.

“We’re still connecting some of the dots, but we’re already seeing real benefits in the automation of formerly manual processes. Previously, we might get a call from a citizen, which would cause us to dispatch public works or emergency services depending on the type of flooding. Now the data triggers alerts that automatically notify stormwater personnel, who can react and address the flooded areas. It’s much more efficient and could ultimately save lives.” —Nicole Raimundo, Chief Information Officer, Town of Cary

Lee explained how exciting it is to visualize water flow using the SAS Visual Analytics dashboard, which is fully integrated with Esri ArcGIS.

“Now we can see a storm event in real time. We can pull up the dashboard and see how much rain we’re getting. We can see the stream levels rising and share this data with our regional partners. It’s amazing to see the data in real-time.” —Billy Lee, Stormwater Operations Manager, Town of Cary

Town of Cary storm water IoT dashboard.

Applying analytics

As the Atlantic region nears the peak of hurricane season, Cary’s leaders are looking forward to better predicting potential flood events. Leveraging SAS Analytics for IoT and SAS Event Stream Processing (ESP), the Town of Cary has enhanced its ability to acquire and manage new data from Azure IoT, generate and deploy predictive models, manage the lifecycle of those models over time, and gain insight it can act on.

“Using Microsoft Azure IoT with the capabilities to integrate the water sensor data, Accuweather data from Azure Maps, and SAS analytics we are able to create a digital twin of the watershed. This allows the Town of Cary to be proactive in addressing floodwater issues so action can be taken ahead of the storm or flooding event.” —Brad Klenz, Distinguished IoT Analytics Architect, SAS

In the flood detection and management solution, the Town of Cary integrates weather forecast data with real-time sensor data measuring water and rain levels to better identify anomalies, such as rising water, and to deliver advance warnings and predictions of flooding events both within the Town of Cary and downstream in surrounding municipalities.

“Cary sits on top of several rain basins. We will now be able to predict flooding and share this information with our regional neighbors. This data and predictability will have a huge economic impact, not just in the Town of Cary, but for many municipalities, including local businesses and citizens, downstream.” —Nicole Raimundo, Chief Information Officer, Town of Cary

Advice to other cities

The Town of Cary has implemented a series of smart city initiatives, and its flood prediction solution shows amazing promise. What advice would Raimundo and Yates provide to other cities looking to implement similar projects?

“It’s really about selecting the right partners that understands your platform strategy vision for building solutions on a future-proof scalable architecture and that offer a flexible and open set of tools.” —Nicole Raimundo, Chief Information Officer, Town of Cary

Yates encouraged his peers to get the buy-in of all stakeholders.

“Include all departments, all subject matter experts in the digital transformation process and especially people working out in the field. You’ll need everyone’s buy-in and participation to be successful.” —Terry Yates, Smart City Program Strategist, Town of Cary

Next steps

Learn more about Azure IoT, SAS Analytics for IoT, and Microsoft for smart cities.
Source: Azure

Six reasons customers trust Azure to run their SAP solutions

As global organizations across every industry adjust to the new normal, SAP solutions are playing an increasingly vital role in addressing immediate needs and paving a path to a resilient future. Now more than ever, companies are realizing the value of running their SAP solutions in the cloud. While some are using advanced analytics to process their SAP data to make real-time business decisions, others are integrating their SAP and non-SAP data to build stronger supply chains. Whether it’s meeting urgent customer needs, empowering employees to make quick decisions, or planning for the future, customers running SAP solutions in the cloud have been well prepared to face the new reality. Check out how Walgreens delivers superior customer service with SAP solutions on Microsoft Azure.

Many organizations running their SAP solutions on-premises have become increasingly aware of the need to be more agile and responsive to real-time business needs. According to an IDC survey, 54 percent of enterprises expect future demand for cloud software to increase. As global organizations seek agility, cost savings, risk reduction, and immediate insights from their ERP solutions, here are some reasons many of the largest enterprises choose Microsoft Azure as their trusted partner when moving their SAP solutions to the cloud.

1. Running SAP solutions on Azure delivers immediate insights and increased agility

“Now that we have SAP in the cloud … we have a platform for digital innovation in the cloud … With Azure, we’ve lifted our entire IT landscape up to a higher level where we can drive experimentation with much less risk and much less cost.”—Sarah Haywood, Chief Technology Officer and Vice President of Technology at Carlsberg Group

Organizations running SAP solutions on Azure gain real-time and predictive insights that empower them to break into new ways of doing business. Azure offers the ability to tap into more than 100 cloud services, access SAP Cloud Platform, apply intelligent analytics, and also integrate with an organization’s existing productivity and collaboration tools such as Microsoft 365, Microsoft Teams, Microsoft Power Apps, and Microsoft Power BI.

With Azure, organizations can integrate their SAP and non-SAP data through an extensive portfolio of Azure data services and create real-time dashboard views of current operations using SAP and Microsoft business intelligence tools. Intelligent analytics deepens real-time and predictive insights, improving decision-making by responding dynamically as business conditions change and as that change impacts your customers or products. Integration with Teams and Microsoft 365 improves team collaboration and enhances user experience and productivity. Using Microsoft Power Automate, Power Apps, and Power BI, organizations can create customized workflows, apps, and business insight reports without having to write any code.

 

2. An ever-evolving and growing set of Azure cloud services drives continuous innovation

“We are looking at drones, IoT, RFID sensors, artificial intelligence, chatbots, and every other futuristic technology you can think of to do mining better, and with Azure we have a broad foundation for exploring all that.”—Head of Enterprise IT Services, Rio Tinto

While Zuellig Pharma is building an app that uses Azure blockchain services and data from the SAP Business Suite on HANA to track and capture counterfeit products and illegal parallel imports in its region, Walgreens plans to use AI and machine learning to develop new customer offerings quickly and respond in real time to changes in the marketplace.

Customers such as Rio Tinto are using Azure’s secure and scalable IoT applications to pilot a solution that takes real-time data from trucks, drills, smelters, and other equipment and analyzes it for equipment health, preemptive maintenance, supply chain efficiency, and other operational intelligence. Additionally, with DevOps with GitHub and Azure Kubernetes Service, customers can build, manage, and deploy applications on a massive global network.

3. Running SAP solutions on Azure offers cost savings

“We chose to migrate to Azure for three main reasons: cost, strategy, and speed … We saw a big cost advantage with SAP HANA on Azure over the cloud we currently used”—David South, Director of Architecture at Coke One North America Services

A Forrester study showed customers achieved more than 100 percent ROI, a 50 percent reduction in data center costs, and a 100 percent reduction in SAP release delays by migrating their SAP systems to Azure. Moving to Azure not only eliminates capital expenditure and the cost of underutilized hardware, but also offers cost management options such as scaling on demand during peak usage periods, using cheaper storage, and optimizing disaster recovery environments.

By running SAP solutions on Azure, organizations replace expensive, manual, and error-prone processes with automated, flexible processes, and with a single ticket-to-solution experience, enterprises empower employees to focus on value-added activities by putting data in their hands.

4. Running SAP solutions on Azure offers immense flexibility and scalability

“Moving to Azure gives us the scalability we need … running SAP on Azure gives us the agility and flexibility we need to disrupt the healthcare industry in a way that improves our customers’ access to the products and services they need.”—Dan Regalado, Vice President of Global Technology Transformation and Strategic Partnerships, Walgreens

Customers across every industry run their largest production SAP landscapes on Azure because it is a proven cloud platform certified by SAP to run their most mission-critical SAP applications. Azure offers the industry’s most performant and scalable cloud infrastructure—offering 192 GB to 12 TB SAP HANA-certified VMs in more regions than any other public cloud provider, along with support for both Linux and Windows OS. Azure offers on-demand scalability and agility that reduce time to market—customers can spin up or spin down resources as needed. For instance, Daimler AG reduced operational costs by 50 percent and increased agility by spinning up resources on demand in 30 minutes with SAP S/4HANA and Azure.

Azure also offers access to more than 1,000 pre-built integrations, out-of-the-box business services, SAP HANA services, and apps built by SAP and our partners. Customers such as Tate & Lyle appreciate that with Azure, they get access to compute, network, and storage resources preconfigured for SAP HANA that they didn’t have to build, install, or manage.

5. SAP solutions on Azure offer best-in-class security, compliance, and business continuity

“If you go to the Microsoft Trust Center, you can see the tremendous investment Microsoft makes in security certifications and compliance. It would have been very costly for Kennametal to implement that level of security within our own environment. Instead, we get to inherit it from Microsoft.”—John Johnston, Senior Manager, Global Information Security and Compliance, Kennametal

Azure’s intelligent security services are backed by a $1 billion annual investment in enterprise-grade security and compliance and by 3,500 cybersecurity professionals. Azure has the most compliance offerings of any public cloud. Azure offers best-in-class security services such as Azure Sentinel for SIEM, Azure Security Center for threat monitoring, and Azure Active Directory for identity management. Additionally, customers can leverage built-in availability and recovery options such as Azure Backup and Azure Site Recovery to ensure business continuity and data protection. Microsoft teams work closely with partners to ensure that critical systems remain online during migration and offer a robust set of joint planning workshops, migration programs such as FastTrack, proofs of concept (POCs), and training and certifications.

6. Organizations benefit from the trusted partnership between SAP and Microsoft

“We needed a provider that enjoys a close partnership with SAP, understands our needs, and can accelerate our migration and expand our capabilities. Azure answered every need.”—Joshua Sefchek, Manager of Cloud and Enterprise Services, Toyota Material Handling North America

After decades of working together to serve our customers, SAP and Microsoft deepened their relationship by signing the Embrace initiative. As part of Embrace, SAP will lead with Azure to move on-premises SAP ERP and SAP S/4HANA customers to the cloud through industry-specific best practices, reference architectures, and cloud-delivered services. Our engineering teams, co-located in Germany and Redmond, Washington, work together to develop joint reference architectures, product integration roadmaps, and best practices; our industry teams are jointly developing industry-specific transformation roadmaps; and our support teams have developed collaborative support models.

SAP and Microsoft have been partners for more than 25 years and are also mutual customers. Microsoft is the only cloud provider that’s been running SAP for its own finance, HR, and supply chains for the last 20 years, including SAP S/4HANA. Likewise, SAP has chosen Azure to run a growing number of its own internal system landscapes, including those based on SAP S/4HANA. Microsoft IT and SAP IT generously share their learnings from running SAP solutions on Azure with our customers. Check out the latest MSIT webinar and SAP IT webinar for some best practices.

More than 95 percent of Fortune 500 companies run their business on Azure. Our experience and history give us a powerful understanding of the needs of enterprise customers. Together with SAP, customers have trusted us with their most critical workloads for decades because we understand what it takes to support our customers in their journey to the cloud.

We look forward to seeing you this month at the virtual SAPPHIRE and ASUG events. Learn more about SAP solutions on Azure and read today’s announcement about new offerings to help our SAP customers optimize costs and increase agility. 
Source: Azure

Optimize costs and increase agility with the latest SAP on Azure offerings

SAP SAPPHIRE NOW is an event we look forward to year after year, as it’s always a place to meet our customers and learn how we can continue to support their evolving needs. This year, those conversations will take a different format, but thanks to technology, we can still connect with our customers across the globe.

We’re hearing from enterprises that more than ever before they need a trusted cloud partner to support business continuity, agility, and real-time decision making for their mission-critical business processes. In addition, they need help to manage costs effectively.

To help our customers achieve these goals, and as part of the Embrace initiative with SAP, today we're announcing a series of new offerings and reference architectures. These offerings unlock the power of Azure for SAP workloads, including integration with SAP Cloud Platform and Microsoft products like Office 365 and Power Platform.

M-series virtual machines (VMs): Our latest updates to M-series virtual machine (VM) offerings will help increase agility through seamless scale-up and scale-down.
New use cases supported by reference architectures for integration scenarios: As part of our Embrace initiative with SAP, we are announcing five use cases supported by reference architectures that cover integration scenarios that will help our customers get immediate insights by integrating their SAP and non-SAP environments.
More Azure Large Instances options: We are also launching 18 new SKUs for Azure Large Instances including the largest Intel Optane bare metal instances available in the cloud, so customers can choose the optimum configuration for their workload profiles while optimizing costs.
New DevOps capabilities: We are making it even easier for customers to automate and integrate SAP workloads in Azure using ready-made building blocks to support a DevOps model.
NetWeaver-certified virtual machines: Our latest SAP NetWeaver-certified virtual machines deliver a lower price to performance ratio and help drive total cost of ownership (TCO) reduction.

M-series updates increase agility and cost effectiveness

One year ago at SAPPHIRE 2019, we introduced our Mv2 virtual machine series to support customers with databases from 6 to 12 TB, the largest-memory SAP HANA-certified configuration available on virtual machines in the public cloud. Since then, adoption has grown rapidly, with customers like Luxottica Group, Kennametal, Coats, and Accenture relying on Mv2 virtual machines for their production workloads.

Seamlessly scale up or down from 2 to 416 vCPUs and from 16 GiB to 2 TiB of memory

As the needs of your SAP workloads on Azure change, you can move to different virtual machine families or sizes by simply resizing your virtual machines, without worrying about the underlying hardware. With our latest investments in Gen2 virtual machine support for Mv1 (aka M-series), you can start small with your SAP database on the Esv3-series, move to larger sizes on the Mv1 (aka M-series) as your workload needs grow, and seamlessly scale up to the Mv2-series as your workloads approach over 400 vCPUs and up to 12 TiB of memory. Learn more about the advantages of Gen 2 virtual machines.
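
As a rough illustration of that resize path, the sketch below shows how a VM might be moved to a larger M-series size with the Azure SDK for Python (azure-identity and azure-mgmt-compute). The subscription ID, resource group, VM name, and target size are placeholders, and deallocating before resizing is a common precaution rather than a documented requirement for every size change.

```python
# Minimal sketch of resizing a VM to a larger M-series size, assuming the
# azure-identity and azure-mgmt-compute packages. Subscription ID, resource
# group, VM name, and target size are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

subscription_id = "<subscription-id>"
resource_group = "sap-prod-rg"
vm_name = "sap-hana-db01"

compute = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

# Deallocate first so the platform can place the VM on suitable hardware.
compute.virtual_machines.begin_deallocate(resource_group, vm_name).result()

# Change only the size; the rest of the VM definition stays the same.
vm = compute.virtual_machines.get(resource_group, vm_name)
vm.hardware_profile.vm_size = "Standard_M64s"  # example M-series size
compute.virtual_machines.begin_create_or_update(resource_group, vm_name, vm).result()

compute.virtual_machines.begin_start(resource_group, vm_name).result()
```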

Achieve flexibility and agility with our expanding regional footprint for the M-series

The Mv1 (aka M-series) virtual machines are available in 34 regions and Mv2 virtual machines are available in 12 regions, and we are now expanding our regional availability footprint even further. For Mv1 (aka M-series) virtual machines we have recently added United Arab Emirates (UAE) Central and will be adding US West, US Central, and North Central US in 2020. For Mv2 virtual machines, we will be expanding to Brazil South, Germany West Central, Japan East, UAE Central, North Central US, and West US in 2020. Please refer to virtual machine availability by region for the latest regional availability.

Reduce software licensing costs with our new Mv2 constrained core sizes

Starting in July 2020, customers can constrain the Mv2 virtual machine vCPU count while maintaining the same memory, storage, and I/O bandwidth of the unconstrained core Mv2 size.

New use cases supported by reference architectures drive immediate insights and agility by integrating SAP and non-SAP data

As part of the unique Embrace initiative between Microsoft and SAP, we are focused on accelerating time to business outcomes by helping customers integrate across Azure, SAP Cloud Platform, and Microsoft offerings. To support this, we have jointly released a series of five use cases supported by reference architectures and will continue to release unique integration patterns to create great customer experiences. The use cases announced today are focused on identity, workflow and service integration. Over the coming months, we will start to integrate further with Office 365 Graph, Microsoft Teams, and Power Platform:

1. How to consume Microsoft Azure services in SAP Cloud Platform

2. Establish identity and authentication workflow between SAP and Microsoft

3. Extend SAP S/4HANA with SAP and Microsoft services

4. Simplify business process integration across SAP and Microsoft through enterprise integration and extension

5. Intelligently optimize and automate processes with SAP and Microsoft services

New Azure Large Instances SKUs help achieve a lower total cost of ownership and the fastest recovery times in the market

With the launch of 18 new SKUs, we now offer 24 SKUs powered by the 2nd Generation Intel Xeon Platinum processors, supporting Intel® Optane™ persistent memory, making it the most comprehensive portfolio in the market. Learn more by reviewing our portfolio. Additionally, Azure Large Instances is the first to bring to market two unique capabilities:

NetApp SnapCenter, which enables customers to use a SnapCenter console to take and restore consistent live snapshots of databases as large as 96 TB within seconds.
A solution to combine the power of Bare Metal with the agility of virtual machines: With the ability to mount Azure Large Instance database volumes on virtual machines, Azure Large Instances customers can now dynamically spin up virtual machines to accomplish tasks such as refresh, consistency checks, and data distribution within minutes at a significantly lower cost than before.

Latest DevOps offerings simplify monitoring, backup, and deployment of SAP workloads

We have added new capabilities to help our customers more easily deploy, monitor, and back up SAP solutions on Azure:

Simplify deployment with SAP Automation using Terraform and Ansible

Since the initial announcement of SAP Automation for Azure (v1), we have evolved the vision and broadened the scope of automation for SAP on Azure. With the leading industry automation tools Terraform and Ansible, we are developing common building blocks to simplify deployment of SAP landscapes on Azure and provide consistency in these deployments. Today, we are announcing that these building blocks will be available in a GitHub open source repository (sap-hana) as v2 of SAP Automation by July 2020. The automation solution is based on the best practices specified by Microsoft and SAP as part of our reference architectures for SAP. You can learn more about the scenarios supported in our GitHub.

Seamlessly monitor SAP landscapes with Azure Monitor for SAP solutions

With the preview of Azure Monitor for SAP solutions, customers will be able to centrally collect and visually correlate telemetry data from Azure infrastructure and databases in one location for faster troubleshooting. Customers will be able to deploy Azure Monitor for SAP solutions resources with a few clicks from the Azure portal and monitor the following components: SAP HANA on Azure virtual machines or Azure Large Instances, SQL Server on Azure virtual machines, and Pacemaker high-availability clusters on Azure virtual machines or Azure Large Instances. With the preview starting in July 2020, the product will be available in the US East, US East 2, US West 2, and West Europe regions, with more regions to follow soon.

Ensure business continuity by instantly backing up your SAP HANA databases running on SUSE Linux Enterprise Server (SLES) and Red Hat Enterprise Linux (RHEL) platform

In addition to supporting SAP HANA workloads on SUSE Linux Enterprise Server (SLES), Azure Backup for SAP HANA workloads running on Red Hat Enterprise Linux (RHEL) is now in preview. Azure’s native backup solution for SAP HANA offers zero-infrastructure backup, one-click point-in-time restore capability, and long-term retention and backup management capabilities. The preview will be available across all Azure regions except for Germany Northeast, Germany Central, France South, and US Gov Iowa. Please leverage the tutorial for backing up your SAP HANA databases for your RHEL-based systems.

NetWeaver-certified virtual machine updates

Achieve a lower price to performance ratio with our new NetWeaver-certified virtual machines

We are excited to be the first hyperscaler to offer new SAP NetWeaver-certified virtual machine families based on AMD EPYC™ 7452 processors. These new Dasv4 and Easv4 virtual machine families offer superior performance for the SAP application layer and SAP-supported databases (excluding HANA). The increased performance of these virtual machine families provides a lower price to performance ratio, driving down total cost of ownership. You can learn more about the global availability of these virtual machines by referring to virtual machine availability by region.

We are glad we could share these updates with you and we want to hear from you on how we could continue to build solutions to help you meet your evolving needs. We look forward to seeing you at virtual SAPPHIRE NOW, where we are sponsoring the Intelligent Enterprise track on Monday, June 15. To learn more, check out our latest blog on reasons customers trust Azure to run their SAP solutions or visit our website. Please share your feedback and join the conversation with other SAP experts on the Microsoft Tech Community.
Source: Azure

New general purpose and memory-optimized Azure Virtual Machines with Intel now available

Today we're announcing the availability of new general purpose and memory-optimized Azure Virtual Machines based on the 2nd generation Intel Xeon Platinum 8272CL (Cascade Lake). This custom processor runs at a base speed of 2.5 GHz and can achieve an all-core turbo frequency of 3.4 GHz. It features Intel® Deep Learning Boost Technology, Intel® Advanced Vector Extensions 512 (Intel® AVX-512), Intel® Turbo Boost Technology 2.0, and Intel® Hyper-Threading Technology.

With this announcement, we are introducing two new Azure Virtual Machines families, one of which represents a brand-new product category in our portfolio:

The Azure Ddv4 and Ddsv4 and Edv4 and Edsv4 virtual machines, which include a local temporary disk (now generally available)
The Azure Dv4 and Dsv4 and Ev4 and Esv4 virtual machines, a new category of virtual machines, which rely on remote disks and do not provide temporary local storage (now in preview).

The new virtual machine (VM) sizes deliver up to roughly 20 percent CPU performance improvement compared to their predecessors, the Dv3 and Ev3 VM families.

New Ddv4 and Ddsv4 and Edv4 and Edsv4 VMs are generally available

The new Ddv4 and Ddsv4 and Edv4 and Edsv4 VM sizes include fast, larger local solid state drive (SSD) storage and are designed for applications that benefit from low latency, high-speed local storage, such as applications that need fast reads and writes to temporary storage, or applications that need temporary storage for caches or temporary files. These new VM sizes offer 50 percent larger local storage, as well as better local disk IOPS for both Read and Write operations compared to the Dv3 and Dsv3 and Ev3 and Esv3 sizes with generation 2 (Gen 2) VMs. The new Ddv4 and Ddsv4 and Edv4 and Edsv4 VM sizes can be attached to standard HDD, standard SSD, premium SSD, or ultra SSD persistent disks.

The new Ddv4 and Ddsv4 VM sizes provide a good balance of memory-to-vCPU performance, with up to 64 vCPUs, 256 GiB of RAM, and include local SSD storage (up to 2,400 GiB). These VM families are ideal for development and testing, small to medium databases, and low-to-medium traffic web servers.

The new Edv4 and Edsv4 VM sizes feature a high memory-to-CPU ratio, with up to 64 vCPUs, 504 GiB of RAM, and also include local SSD storage (up to 2,400 GiB). These VM families are great for relational database servers and in-memory analytics.
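
To see which of these new sizes are offered in a given region, one approach is to enumerate VM sizes with the Azure SDK for Python. The sketch below assumes the azure-identity and azure-mgmt-compute packages; the subscription ID, region, and name filter are illustrative.

```python
# Minimal sketch of listing D- and E-family v4 sizes offered in a region,
# assuming azure-identity and azure-mgmt-compute. Subscription ID and region
# are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

compute = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")

for size in compute.virtual_machine_sizes.list(location="eastus"):
    # Keep D- and E-family v4 sizes, e.g. Standard_D16d_v4 or Standard_E32ds_v4.
    if "_v4" in size.name and size.name.startswith(("Standard_D", "Standard_E")):
        print(f"{size.name}: {size.number_of_cores} vCPUs, {size.memory_in_mb / 1024:.0f} GiB RAM")
```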

New Dv4 and Dsv4 and Ev4 and Esv4 VMs now in preview

The Dv4 and Dsv4 and Ev4 and Esv4 VM sizes are new offerings that do not include the local temporary disk. These new VM families offer a 20 percent CPU performance improvement over the Dv3 and Ev3 VM families. The new Dv4 and Ev4 VM sizes can be attached to standard HDD and standard SSD persistent disks, while the Dsv4 and Esv4 VM sizes can be attached to standard HDD, standard SSD, premium SSD, or ultra SSD persistent disks. If you are currently using v3 VM sizes, switching to v4 sizes will provide you a better price-per-core performance option.

The new Dv4 and Dsv4 VM sizes provide a good balance of memory to vCPU performance, with up to 64 vCPUs and 256 GiB of RAM. These VM families are ideal for development and testing workloads, small-to-medium databases, and low-to-medium traffic web servers.

The new Ev4 and Esv4 VM sizes feature a high memory-to-CPU ratio, with up to 64 vCPUs and 504 GiB of RAM. These VM families are great for relational database servers and in-memory analytics.

Customers can request access to these new VMs (with no local temporary disk) currently in preview today by filling out this form. If you have any further questions or feedback, please reach out to us directly.

Working in collaboration with Intel while meeting our customer needs

“The launch of Azure D-v4 and E-v4-series virtual machines further extends the Microsoft IaaS portfolio to meet the diverse needs of our customers. Powered by 2nd Generation Intel® Xeon Scalable Processors, these virtual machines offer optimized application performance for web and data services, desktop virtualization and business applications moving to Azure.” —Jason Grebe, Intel CVP Cloud and Enterprise

With these new VM sizes, we are providing more customer value with better CPU performance.

“Silicon design workloads require high CPU performance, large number of cores, high memory-to-core ratios, and sufficient local storage. The newly introduced Edsv4 family meets all these requirements, making it an ideal choice for our use cases. Using the Edsv4 VMs, TSMC was able to successfully create a brand new Scale-Out/Scale-In silicon design strategy, helping designers achieve significant run-time speedup and cost optimization.” —Willy Chen, Deputy Director, Design & Technology Platform, TSMC

Frequently asked questions

Customers regularly ask what the differences are between the new VMs and the general purpose Dv3/Dsv3 or memory-optimized Ev3/Esv3 VM sizes that they’re currently using. The answer is that you’ll now have more options to choose from. The table below summarizes the key differences:

Customers also ask what happens if they still need a local temp disk for their VM. You can choose the Ddv4 and Ddsv4 or Edv4 and Edsv4 VM sizes for your application if a local disk is still required.

For more frequently asked questions related to these VM sizes, refer to Azure VM sizes with no local temp disk.

Region availability for Ddv4, Ddsv4, Edv4, and Edsv4 VM sizes

The new Ddv4, Ddsv4, Edv4, and Edsv4 VM families are available with Pay-As-You-Go, Reserved Instance, and Spot pricing in the following regions. Prices vary by region.

Get started today

Learn more about Ddv4 and Ddsv4-series or Edv4 and Edsv4-series (with local temporary disk) now generally available.
You can learn more about the Dv4, Dsv4-series or Ev4, Esv4-series VMs (without local temporary disk) that are currently in preview.
You can also request access to the new VMs currently in preview by filling out this form. If you have any further questions or feedback, please reach out to us directly.

Source: Azure

Be prepared for what’s next: Accelerate cloud migration

We are in the midst of unprecedented times, with the global health crisis having far-reaching implications for healthcare, public policy, and the economy. Organizations are fundamentally changing how they run their businesses, ensure the safety of their workforce, and keep their IT operations running. Most IT leaders that we have had the opportunity to speak with over the past couple of months are thinking hard about how to adapt to rapidly changing conditions. They are also trying to retain momentum on their well-intentioned strategic initiatives.

Across our customers in many industries and geographies, we continue to see the cloud deliver tangible benefits. Azure enables our customers to act faster, continue to innovate, and pivot their IT operations to what matters most. We understand the challenges our customers are facing. We also recognize that our customers are counting on us more than ever.

Common, consistent goals for businesses

Even though our customers’ challenges are often unique to the industries they serve, we hear many common, consistent goals.

Cloud-based productivity and remote collaboration are enabling workers, IT professionals, and developers to work from anywhere. As our customers enable an increase in remote work, there’s increased importance on scaling networking capacity while securely connecting employees to the resources they need.
Azure is critical to our customers’ ability to rapidly scale their compute and storage infrastructure to meet their business needs. This is made possible because of how customers have transformed their IT operations with Azure. Driving operational efficiency with Azure can also enable businesses to scale on-demand and meet business needs.
IT budgets will be constrained over the coming year—optimizing existing cloud investments and improving cash flow via migration to Azure are top of mind. Our customers are exploring ways to run their businesses with reduced IT budgets. Owning and managing on-premises datacenters is expensive and makes customers vulnerable to business continuity risk. An Azure migration approach is resonating with these customers to transition spend to OpEx, improving cash flow and reducing business risk.
The downtime is also becoming an opportunity to accelerate projects. CIOs are looking at this as an opportunity to deliver planned projects and find ways to innovate with Azure. They are counting on this innovation to help their businesses experience a steep recovery as we exit the current situation.

In many of my discussions with customers, we still hear uncertainty about how to navigate the cloud migration journey. There is an urgency to act, but often a hesitation to start. There is, no doubt, a learning curve, but Microsoft has traversed it with many customers over the past few years. Businesses need best practices and prescriptive guidance on where to begin, how best to steer, and how to avoid the pitfalls. This blog aims to help you make progress on this pressing need. We’ll dive deeper into the steps of the cloud migration journey in upcoming posts.

To get you started on your accelerated journey to Azure, here are our top three recommendations. While these aren’t meant to be one-size-fits-all, they are based on learnings from hundreds of scale migration engagements that our team has helped our customers with.

1. Prioritize assessments

Perform a comprehensive discovery of your datacenters using our free tools such as Azure Migrate or Movere. Creating an inventory of your on-premises infrastructure, databases, and applications is the first step in generating right-sized and optimized cost projections for running your applications in Azure. Between your existing configuration management database (CMDB), Active Directory, management tools, and our discovery tools, you have everything you need to make crucial migration decisions.

The priority should be to cover the entire fleet and then arrive at key decisions about which candidate apps you can migrate first and the appropriate migration approach for them. As you run your assessments, identify applications that could be quick wins—hardware refresh, software end-of-support, OS end-of-support, and capacity-constrained resources are all great places to prioritize for the first project. A bias toward action, and urgency on triggers that need immediate attention, can ensure that you drive operational efficiencies and flip your capital expenses to operational ones.

Many Azure customers are doing this effectively. One example is GlaxoSmithKline (GSK). In partnership with Azure engineering and Microsoft FastTrack for Azure, and by leveraging the Azure Migration Program (AMP), GSK was able to quickly discover their VMware virtual machines and physical servers with Azure Migrate. By leveraging features such as application inventory and application dependency mapping, GSK was able to build a prioritized list of applications that they could migrate. They then used the discovery and assessment data and incorporated it with their CMDB to build Power BI dashboards to track the progress of their strategic migration initiatives.

“Microsoft engineering and FastTrack’s ability to quickly aggregate and visualize our application hosting estate is the cornerstone to our migration planning activities. GSK is comprised of many different business units, and we are able to tailor migration priorities for each of these business units. In addition, we also now have clear visibility for each server, what they are dependent on, and can now also determine the appropriate server size in Azure to create our migration bundles and landing zones. With this excellent foundation of data, we are able to quickly move into the migration phase of our cloud journey with a high degree of confidence in our approach.”—Jim Funk, Director, Hosting Services, GlaxoSmithKline

2. Anticipate and mitigate complexities

You will run into complexities as you drive your migration strategy—some of these will be related to the foundational architecture of your cloud deployments, but a lot of it will be about how your organization is aligned for change. It is important that you prepare people, business processes, and IT environments for the change, based on a prioritized and agreed cloud adoption plan. Every migration we’ve been involved in has had its own unique requirements. We find that customers who are moving quickly are those who have established clarity in ownership and requirements across stakeholders from security, networking, IT, and application teams.

“The migration to the cloud was more about the mindset in the organization and that transformation we needed to do in IT to become the driver of change in the company instead of maintaining the old. A big part of the migration was to reinvent the digital for the company." —Mark Dajani, CIO, Carlsberg Group

On the technical front, anticipate complexities and plan for your platform foundation for identity, security, operations, compliance, and governance. With established baselines across these shared-architectural pillars, deploy purpose-built landing zones that leverage these centralized controls. Simply put, landing zones and the platform foundation capture everything that must be in place and ready to enable cloud adoption across the IT portfolio.

In addition to designing your baseline environment, you would also want to consider your approach to managing your applications as they migrate to Azure. Azure offers comprehensive management solutions for backup, disaster recovery, security, monitoring, governance, and cost management, which can help you achieve IT effectiveness as you migrate. Most customers run in a hybrid reality even when they intend to evacuate on-premises datacenters. Azure Arc is a terrific option for customers who want to simplify complex and distributed environments across on-premises and Azure, extending Azure management to any infrastructure.

3. Execute iteratively

Customers who have the most success in executing on their migration strategy are customers who follow an iterative, workload-based, wave-oriented approach to migration. These customers are using our free first-party migration tools to achieve the scale that works best for their business—from a few hundred to thousands of servers and databases. With Azure Migrate you have coverage for Windows Server and Linux, SQL Server and other databases, .NET and PHP-based web applications, and virtual desktop infrastructure (VDI). These capabilities give you options for migration to infrastructure as a service (IaaS) and platform as a service (PaaS) offerings like Azure App Service and Azure SQL.

The key to success and executing effectively is targeting specific workloads and then executing in phases. In addition, leveraging capabilities like dependency mapping and test migration ensures that your migration cutovers are predictable and have high success rates. We strongly recommend using a lift-optimize-shift approach and then innovating in the cloud, especially during these times.

One such customer that has leveraged the Azure Migrate toolset as part of their cloud transformation is the Malaysian telecommunications operator Celcom. Celcom leveraged Azure Migrate’s discovery and assessment features to securely catalog their applications, virtual machines (VMs), and other IT assets, and to determine the best way to host them in the cloud. With their foundational architecture and management strategy in place, Celcom executed in waves, transitioning their complex multi-vendor on-premises environment with multiple applications over to Azure. Read more about Celcom’s digital transformation with Azure.

Share your feedback

In the coming weeks and months, we will dive deeper into these topics. Please share your experiences or thoughts as this series comes together in the comments below—we appreciate your feedback. You can also visit Azure migration center to learn more and get started.
Source: Azure

New features and insights in Azure Monitor

Customers need full-stack observability for their apps and infrastructure across Azure and hybrid environments to ensure their workloads are always up and running, and they rely on Azure Monitor for this. Over the past few months, we have released many new capabilities that improve native integration with Azure, enable easier onboarding at scale, support enterprise security and compliance needs, provide rich full-stack distributed tracing, and much more. In this blog, we're sharing the newest enhancements to Azure Monitor announced at Microsoft Build, including:

Preview of Azure Monitor Application Insights logs available directly in Log Analytics Workspaces.
General availability of Azure Monitor for Storage and Azure Cosmos DB, with previews for Key Vault and Azure Cache for Redis.

Be sure to read through the blog post to learn about the full list of announcements at the end.

Application Insights on Log Analytics Workspaces

Logs from Application Insights could previously only be stored separately for each monitored application, and you had to resort to cross-workspace queries to correlate with logs in Log Analytics Workspaces. Continuing the integration of Application Insights with Log Analytics, we are announcing a preview of a major milestone today in our Application Performance Management (APM) story. You can now choose to send your Application Insights logs to a common Log Analytics Workspace, keeping application, infrastructure, and platform logs altogether.

 
This would let you apply common role-based access control across your resources and not have to worry about cross-application or workspace queries anymore (even the Application Insights logs schema is now integrated and made consistent with other data tables in Log Analytics).
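
For example, once Application Insights data lands in a workspace, a single query client can read the application tables alongside other workspace tables. The sketch below assumes the azure-monitor-query and azure-identity Python packages; the workspace ID is a placeholder, and the AppRequests table and its columns reflect the workspace-based schema as we understand it.

```python
# Minimal sketch of querying workspace-based Application Insights data, assuming
# the azure-monitor-query and azure-identity packages. The workspace ID is a
# placeholder; AppRequests and its columns follow the workspace-based schema.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

query = """
AppRequests
| where Success == false
| summarize failedRequests = count() by AppRoleName, bin(TimeGenerated, 15m)
| order by TimeGenerated desc
"""

response = client.query_workspace(
    workspace_id="<log-analytics-workspace-id>",
    query=query,
    timespan=timedelta(hours=24),
)

for table in response.tables:
    for row in table.rows:
        print(dict(zip(table.columns, row)))
```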

 

You will now be able to export both your metrics and logs data to firewalled Azure Storage accounts or stream to Azure Event Hub via Diagnostic Settings. Note that exporting to Storage or Event Hub is a premium feature that would start getting billed once it reaches general availability. You can also start optimizing cost through reserved capacity pricing at workspace level with Log Analytics.

One of the biggest benefits with this upgrade is that you will now be able to easily drive enterprise readiness for your application logs, with all the new enhancements coming soon in Azure Monitor Logs (including Customer Managed Key Encryption, Network Isolation with Private Link support, Business Continuity Disaster Recovery with Globally Distributed Workspaces, High Availability with Availability Zones, Global Availability with Satellite Regions, and much more).

Out-of-the-box insights for Azure Resources

Customers have asked us to provide more out-of-the-box insights like the ones we provide for virtual machines, containers, and networks. We are now happy to provide out-of-the-box insights on more Azure resources using the platform metrics that Azure Monitor already collects. These insights are built on workbooks, a platform for creating and sharing rich interactive reports. You can access any of these insights out of the box and customize them even further. They can be accessed either directly from an individual resource blade or at scale from the Azure Monitor blade in the Azure portal.

Azure Monitor for Storage is now generally available, offering comprehensive monitoring of your Azure Storage accounts covering insights across health and capacity, with the ability to focus on hotspots and help you diagnose latency, throttling, and availability issues.
  
Azure Monitor for Azure Cosmos DB is also now generally available; you can access insights on usage, failures, capacity, throughput, and operations for your Azure Cosmos DB resources across subscriptions.
  
We also announced previews of Azure Monitor for Key Vault and Azure Monitor for Azure Cache for Redis which will provide similar out-of-the-box insights for these resources, helping you use them optimally.

More enhancements

In addition to the two highlights above, here are other exciting announcements from Microsoft Build:

Data encryption at rest with customer-managed keys (CMK) in Azure Key Vault, providing complete control over log data access with key revocation. Available only when using dedicated clusters with a capacity reservation of more than 1 TB/day.
Out-of-the-box support for distributed tracing in Java Azure Functions, providing richer data pertaining to requests, dependencies, logs, and metrics.
Application Insights codeless attach for Node.js apps on Azure App Service (Linux) with automatic dependency collection.
Notifications with enhanced visibility into all Azure resource changes across subscriptions with application change analysis.

Next steps with Azure Monitor

To learn more about Azure Monitor and monitoring best practices, check out the documentation and recorded sessions from our recent virtual series. If you have any questions or suggestions, reach out to us through our Tech Community forum.
Source: Azure

General availability of Azure Files on-premises Active Directory Domain Services authentication

Today we're announcing the general availability of Azure Files support for authentication with on-premises Active Directory Domain Services (AD DS).

Since the preview in February 2020, we’ve received great feedback and growing interest from our customers, especially because of increased work-from-home scenarios. With file shares migrated to the cloud, maintaining access using Active Directory credentials greatly simplifies the IT management experience and provides better mobility for remote work. Most importantly, you do not need to reconfigure your clients. As long as your on-premises servers or user laptops are domain-joined to AD DS, you can sync Active Directory to Azure AD, enable AD DS authentication on the storage account, and mount the file share directly. This makes migration from on-premises to the cloud extremely simple, as existing Windows ACLs can be seamlessly carried over to Azure Files and continue to be enforced for authorization. Along with private endpoint support for Azure Files, you can access data in Azure Files just as you would on an on-premises file server within the secure network boundary.

On-premises AD DS integration also simplifies the setup experience of using Azure Files as user profile storage for Virtual Desktop scenarios. Leveraging Azure Files for Virtual Desktop Infrastructure (VDI) environments eliminates the need for self-hosted file servers. With AD DS integration, it extends the same authentication and authorization as traditional file servers to Azure. User profiles will be loaded from the file share to the desktop session, supporting a single sign-on login experience. You can continue to use your existing AD DS setup and carry over Windows access control lists (ACLs) if needed. Beyond that, Azure Files as a cloud-native file service provides dynamic scaling to better accommodate changes in capacity and traffic patterns. For example, your VDI farm may have started by supporting 500 users, but with more people working remotely you need to scale up to 5,000 users (a 10x increase). The Azure Files premium tier allows you to scale up capacity along with performance on the fly to handle the increase. This also reduces the management overhead of deploying additional file servers and managing reconfigurations.
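
As a rough illustration of that on-the-fly scaling, the sketch below uses the azure-storage-file-share Python SDK to create a share and later raise its quota. The connection string, share name, and quota values are placeholders chosen for the 500-to-5,000-user example above, not prescriptive sizing guidance.

```python
# Minimal sketch of creating a file share and raising its quota later, assuming
# the azure-storage-file-share SDK. Connection string, share name, and quota
# values are placeholders.
from azure.storage.fileshare import ShareClient

conn_str = "<storage-account-connection-string>"

# Create the share with an initial quota sized for roughly 500 user profiles.
share = ShareClient.from_connection_string(conn_str, share_name="fslogix-profiles")
share.create_share(quota=1024)      # quota in GiB

# Later, as the VDI farm grows, scale the share quota on the fly.
share.set_share_quota(quota=10240)  # no remount of existing clients required
```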

To help with your setup, we have collaborated with first and third-party VDI providers to provide detailed guidance. You can follow this step-by-step walkthrough to configure Windows Virtual Desktop FSLogix profile containers with Azure Files. Citrix has partnered with Microsoft to provide day-one support for Azure Files as a certified storage solution for both User Profile Management and User Personalization Layer technologies. Leveraging Azure Files provides a simple and cost-effective persistent storage solution for user data in your VDI environment. Detailed configuration information for integrating Azure Files with Citrix technologies is available in Citrix Tech Zone.

In addition, we want to share with you the recent updates on Azure Files:

Enhanced data protection with soft delete. To protect your Azure file shares from accidental deletion, we released the preview of soft delete for Azure file shares. Think of soft delete like a recycle bin for your file shares. When a file share is deleted, it transitions to a soft deleted state in the form of a soft deleted snapshot. You get to configure how long soft deleted data is recoverable for before it is permanently erased.
Better scaling with max file size increased to 4 TiB. We have increased the maximum size supported for a single file from 1 TiB to 4 TiB. If you are using file shares to store engineering files or virtual hard disks (VHDs), this addresses concerns about size limitations. As you grow your data footprint, you can also scale up the share size at runtime. Larger file sizes are supported over the Server Message Block (SMB) protocol and will be enabled for REST access in the upcoming weeks.
Private endpoint support for Azure File Sync. Starting with version 10.1 of the Azure File Sync agent, you can create private endpoints for your Storage Sync Services. Private endpoints enable you to securely connect to your Azure resources from on-premises using an ExpressRoute with private peering or a Site-to-Site VPN connection.

Getting started

You can deploy a file share and mount it for your data storage within 5 minutes. Here are some materials to help you get started:

What is Azure Files?
Quickstart: Create and manage Azure Files share.
Planning for an Azure Files deployment.
Planning for an Azure File Sync deployment.
Enable Active Directory Domain Services authentication on Azure Files.

You can share your feedback via Azure Storage forum or just send us an email at AzureFiles@microsoft.com.
Source: Azure

Introducing live video analytics from Azure Media Services—now in preview

Azure Media Services is pleased to announce the preview of a new platform capability called Live Video Analytics, or LVA for short. LVA provides a platform for you to build hybrid applications with video analytics capabilities. The platform offers the capability of capturing, recording, and analyzing live video and publishing the results (which could be video and/or video analytics) to Azure services in the cloud and/or at the edge.

With this announcement, the LVA platform is now available as an Azure IoT Edge module via the Azure Marketplace. The module, referred to as “Live Video Analytics on IoT Edge,” is built to run on a Linux x86-64 edge device in your business location. This enables you to build IoT solutions with video analytics capabilities without worrying about the complexity of designing, building, and operating a live video pipeline.

LVA is designed to be a “pluggable” platform, so you can integrate video analysis modules, whether they are custom edge modules built by you with open source machine learning models, custom models trained with your own data (using Azure Machine Learning or other equivalent services) or Microsoft Cognitive Services containers. You can combine LVA functionality with other Azure edge modules such as Stream Analytics on IoT Edge to analyze video analytics in real-time to drive business actions (e.g. generate an alert when a certain type of object is detected with a probability above a threshold).

You can also choose to integrate LVA with Azure services such as Event Hubs (to route video analytics messages to the appropriate destinations), Cognitive Services Anomaly Detector (to detect anomalies in time-series data), Azure Time Series Insights (to visualize video analytics data), and so on. This enables you to build powerful hybrid (that is, edge plus cloud) applications.

With LVA on IoT Edge, you can continue to use your CCTV cameras and your existing video management systems (VMS) while building video analytics apps independently. It can also be used in conjunction with existing computer vision SDKs (for example, to extract text from video frames) to build cutting-edge, hardware-accelerated IoT solutions enabled by live video analytics.


Use cases for LVA

LVA lets you bring the AI of your choice, whether first-party Microsoft AI models, open-source models, or third-party models, and integrate it for a range of use cases.

Retail

Retailers can use LVA to analyze video from cameras in their parking lots, detecting and matching incoming cars to registered consumers to enable curbside pickup of items ordered through their online store. This helps consumers and employees maintain a safe physical distance from each other, which is particularly important in the current pandemic environment.

In addition, retailers can use video analytics to understand how consumers view and interact with products and displays in their stores to make decisions about product placement. They can also use real-time video analytics to build interactive displays that respond to consumer behavior.

Transportation

In transportation and traffic scenarios, video analytics can monitor parking spots, track usage to drive automated “No parking available” signs, and reroute drivers looking for a space. It can also be used in public transportation to monitor queues and crowds and identify capacity needs, enabling organizations to add capacity or open new entrances or exits. By feeding in business data, pricing can be adjusted in real time based on demand and capacity.

Manufacturing

Manufacturers can use LVA to monitor lines for quality assurance or to ensure safety equipment is being used and procedures are being followed. For example, they can monitor personnel to confirm that helmets are worn where required or that face shields are lowered when needed.

Platform capabilities

The LVA on IoT Edge platform offers the following capabilities for you to develop video analytics functionality in IoT solutions.

Process video in your own environment

Live Video Analytics on IoT Edge can be deployed on your own appliance in your business environment. Depending on your business needs, you can choose to process the video on your device and send only analytics data to cloud services such as Power BI. This avoids the cost of moving video from the edge to the cloud and helps address privacy or compliance concerns.

Analyze video with your own AI

Live Video Analytics on IoT Edge enables you to plug in your own AI and be in control of analyzing your video per your business needs. You have flexibility in using your own custom-built AI, open source AI, or AI built by companies specializing in your business domain.

Flexible live video workflows

You can define a variety of live video workflows using the concept of Media Graph. Media Graph lets you define where video should be captured from, how it should be processed, and where the results should be delivered. You accomplish this by connecting components, or nodes, in the desired manner. The diagram below provides a graphical representation of a Media Graph. You can learn more about it on the Media Graph concept page.
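For a sense of what working with Media Graph looks like in practice, here is a hedged sketch that pushes a graph topology, authored as JSON (for example, one of the samples from the LVA documentation), to the Live Video Analytics on IoT Edge module by invoking its GraphTopologySet direct method with the azure-iot-hub Python SDK. The IoT Hub connection string, device ID, module name, and topology file are placeholders.

```python
# pip install azure-iot-hub
import json
from azure.iot.hub import IoTHubRegistryManager
from azure.iot.hub.models import CloudToDeviceMethod

IOTHUB_CONNECTION_STRING = "<iot-hub-service-connection-string>"  # placeholder
DEVICE_ID = "my-edge-device"   # placeholder edge device
MODULE_ID = "lvaEdge"          # name given to the LVA on IoT Edge module in your deployment

# Load a Media Graph topology authored as JSON (e.g., a sample from the LVA docs).
with open("topology.json") as f:
    topology = json.load(f)

registry_manager = IoTHubRegistryManager(IOTHUB_CONNECTION_STRING)

# GraphTopologySet is the LVA direct method that creates or updates a topology on the module.
method = CloudToDeviceMethod(method_name="GraphTopologySet", payload=topology)
response = registry_manager.invoke_device_module_method(DEVICE_ID, MODULE_ID, method)
print(response.status, response.payload)
```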

Integrate with other Azure services

LVA on IoT Edge can be combined with other Azure services on the edge and in the cloud to build powerful business applications with relative ease. As an example, you can use LVA on IoT Edge to capture video from cameras, sample frames at a frequency of your choice, and use an open-source AI model such as YOLO to detect objects. You can then use Azure Stream Analytics on IoT Edge to count or filter the objects YOLO detects and Azure Time Series Insights to visualize the analytics data in the cloud, all while using Azure Media Services to record the video and make it available for consumption by video players in browsers and mobile apps.

Next steps

To learn more, visit LVA on IoT Edge, watch this demo, and see the LVA on IoT Edge documentation.

Microsoft is committed to designing responsible AI and has published a set of Responsible AI principles. Please review the Transparency Note: Live Video Analytics (LVA) to learn more about designing responsible AI integrations.
Source: Azure

Azure Firewall forced tunneling and SQL FQDN filtering now generally available

Two new key features in Azure Firewall—forced tunneling and SQL FQDN filtering—are now generally available. Additionally, we increased the limit for multiple public IP addresses from 100 to 250 for both Destination Network Address Translation (DNAT) and Source Network Address Translation (SNAT).

Azure Firewall is a cloud native Firewall as a Service (FWaaS) offering that allows you to centrally govern and log all your traffic flows using a DevOps approach. The service supports both application and network level filtering rules and is integrated with the Microsoft Threat Intelligence feed for filtering known malicious IP addresses and domains. Azure Firewall is highly available with built-in auto scaling.

Forced tunneling support now generally available

Forced tunneling lets you redirect all internet-bound traffic from Azure Firewall to your on-premises firewall, or chain it to a nearby network virtual appliance (NVA), for additional inspection. You enable forced tunneling when you create a new firewall; as of today, it is not possible to migrate an existing firewall deployment to forced tunneling mode.

To support forced tunneling, service management traffic is separated from customer traffic. An additional dedicated subnet named AzureFirewallManagementSubnet is required with its own associated public IP address. The only route allowed on this subnet is a default route to the internet, and Border Gateway Protocol (BGP) route propagation must be disabled.

Within this configuration, the AzureFirewallSubnet can now include routes to any on-premises firewall or NVA to process traffic before it's passed to the internet. You can also publish these routes via BGP to AzureFirewallSubnet if BGP route propagation is enabled on this subnet.

Figure 1. Azure Firewall in forced tunneling mode.
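As an illustration of the management subnet requirement described above, here is a hedged sketch using the azure-mgmt-network Python SDK to create the route table you would associate with AzureFirewallManagementSubnet: a single default route to the internet with BGP route propagation disabled. The subscription, resource group, route table name, and region are placeholders, not values from this announcement.

```python
# pip install azure-identity azure-mgmt-network
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import Route, RouteTable

network = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Route table for AzureFirewallManagementSubnet: only a default route to the
# internet is allowed, and BGP route propagation must be disabled.
network.route_tables.begin_create_or_update(
    "firewall-rg",        # resource group (placeholder)
    "fw-mgmt-rt",         # route table name (placeholder)
    RouteTable(
        location="eastus",  # placeholder region
        disable_bgp_route_propagation=True,
        routes=[
            Route(name="default-to-internet", address_prefix="0.0.0.0/0", next_hop_type="Internet")
        ],
    ),
).result()
```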

Avoiding SNAT with forced tunneling

Azure Firewall provides automatic SNAT for all outbound traffic to public IP addresses. Azure Firewall doesn’t SNAT when the destination IP address is a private IP address range per IANA RFC 1918. This logic works perfectly when you egress directly to the internet. However, with forced tunneling enabled, internet-bound traffic ends up SNATed to one of the firewall private IP addresses in AzureFirewallSubnet, hiding the source from your on-premises firewall. You can configure Azure Firewall to not SNAT regardless of the destination IP address by adding “0.0.0.0/0” as your private IP address range. Note that with this configuration, Azure Firewall can never egress directly to the internet. For more information, see Azure Firewall SNAT private IP address ranges.

Figure 2. Azure Firewall doesn’t SNAT private IP prefixes configuration.

Routing to public PaaS and Office 365

While Azure Firewall forced tunneling allows you to direct all internet-bound traffic to your on-premises firewall or a nearby NVA, this is not always desirable. For example, it is often preferable to egress directly to public Platform as a Service (PaaS) offerings or Office 365. You can achieve this by adding User Defined Routes (UDRs) to the AzureFirewallSubnet with next hop type “Internet” for specific destinations. Because such a route is more specific than the default route, it takes precedence. See Azure IP Ranges and Service Tags and Office 365 IP addresses for more information.
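The sketch below shows one such UDR being added with the azure-mgmt-network Python SDK. The subscription, resource group, route table name, and destination prefix are placeholders; substitute the published range of the PaaS service or Office 365 endpoint you want to reach directly.

```python
# pip install azure-identity azure-mgmt-network
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import Route

network = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Add a route that sends a specific destination straight to the internet,
# bypassing forced tunneling for that prefix only.
network.routes.begin_create_or_update(
    "firewall-rg",       # resource group (placeholder)
    "azfw-subnet-rt",    # route table associated with AzureFirewallSubnet (placeholder)
    "paas-direct-egress",
    Route(address_prefix="52.239.0.0/16", next_hop_type="Internet"),  # placeholder prefix
).result()
```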

As an alternative approach for egressing directly to public PaaS, you can enable Virtual Network (VNet) service endpoints on the AzureFirewallSubnet. These endpoints extend your virtual network private address space and identity to the Azure PaaS services over a direct connection. When enabled, specific routes to the corresponding PaaS services are automatically created. Service endpoints allow you to secure your critical Azure service resources to your VNet only. Traffic from your VNet to the Azure service always remains on the Microsoft Azure backbone network.

It is important to note that with this configuration, you will not be able to add “0.0.0.0/0” as your private IP prefix as shown previously, but you can still add custom ranges that will not be SNATed.

Finally, it is also possible to use Azure Private Endpoint to connect privately and securely to public PaaS services powered by Azure Private Link. However, these connections bypass your default route to Azure Firewall, as described in this documentation. If you require all traffic to go via your firewall, you can mitigate this by adding a UDR on all client subnets with the private endpoint IP address and a /32 suffix as the destination and Azure Firewall as the next hop. Note that for this configuration to work, and for return traffic from your private endpoint to go via your firewall as well, you must configure the firewall to always SNAT by using 255.255.255.255/32 as your private IP address range.

Figure 3. A UDR to a Storage Private Endpoint pointing to the firewall as a next hop.

SQL FQDN filtering now generally available

You can now configure SQL FQDNs in Azure Firewall application rules. This allows you to limit access from your VNet to only the specified SQL Server instances. You can filter traffic from VNets to an Azure SQL Database, Azure SQL Data Warehouse, Azure SQL Managed Instance, or SQL IaaS instances deployed in your VNets.

SQL FQDN filtering is currently supported in proxy-mode only (port 1433). If you use non-default ports for SQL Infrastructure as a Service (IaaS) traffic, you can configure those ports in the firewall application rules.

If you use SQL in the default redirect mode, you can still filter access using the SQL service tag as part of network rules. Adding redirect mode support to application rules is on our roadmap.

Figure 4. SQL FQDN filtering in Azure Firewall application rules.
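For illustration, here is a hedged sketch of adding such an application rule with the azure-mgmt-network Python SDK, allowing a client subnet to reach one SQL logical server over the Mssql protocol on port 1433. The subscription, resource group, firewall name, source subnet, and SQL FQDN are placeholders.

```python
# pip install azure-identity azure-mgmt-network
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import (
    AzureFirewallApplicationRule,
    AzureFirewallApplicationRuleCollection,
    AzureFirewallApplicationRuleProtocol,
    AzureFirewallRCAction,
)

network = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")
firewall = network.azure_firewalls.get("firewall-rg", "my-firewall")  # placeholders

# Allow the client subnet to reach one SQL FQDN over the Mssql protocol (port 1433).
firewall.application_rule_collections.append(
    AzureFirewallApplicationRuleCollection(
        name="sql-rules",
        priority=200,
        action=AzureFirewallRCAction(type="Allow"),
        rules=[
            AzureFirewallApplicationRule(
                name="allow-sql-server",
                source_addresses=["10.0.0.0/24"],                # client subnet (placeholder)
                target_fqdns=["myserver.database.windows.net"],  # placeholder SQL FQDN
                protocols=[AzureFirewallApplicationRuleProtocol(protocol_type="Mssql", port=1433)],
            )
        ],
    )
)
network.azure_firewalls.begin_create_or_update("firewall-rg", "my-firewall", firewall).result()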

Multiple public IP addresses limit increase

You can now use up to 250 public IP addresses with your Azure Firewall for both DNAT and SNAT.

DNAT—You can translate multiple standard port instances to your backend servers. For example, if you have two public IP addresses, you can translate TCP port 3389 (RDP) for both IP addresses (see the sketch after this list).
SNAT—Additional ports are available for outbound SNAT connections, reducing the potential for SNAT port exhaustion. Currently, Azure Firewall randomly selects the source public IP address to use for a connection. If you have any downstream filtering on your network, you need to allow all public IP addresses associated with your firewall. Consider using a public IP address prefix to simplify this configuration.
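As referenced in the DNAT item above, here is a hedged sketch of adding a DNAT rule with the azure-mgmt-network Python SDK that translates RDP arriving on one of the firewall's public IPs to a backend server. The subscription, resource group, firewall name, and IP addresses are placeholders.

```python
# pip install azure-identity azure-mgmt-network
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import (
    AzureFirewallNatRCAction,
    AzureFirewallNatRule,
    AzureFirewallNatRuleCollection,
)

network = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")
firewall = network.azure_firewalls.get("firewall-rg", "my-firewall")  # placeholders

# Translate RDP arriving on one of the firewall's public IPs to a backend VM.
firewall.nat_rule_collections.append(
    AzureFirewallNatRuleCollection(
        name="rdp-dnat",
        priority=100,
        action=AzureFirewallNatRCAction(type="Dnat"),
        rules=[
            AzureFirewallNatRule(
                name="rdp-to-app-vm",
                source_addresses=["*"],
                destination_addresses=["203.0.113.10"],  # firewall public IP (placeholder)
                destination_ports=["3389"],
                protocols=["TCP"],
                translated_address="10.0.1.4",            # backend server (placeholder)
                translated_port="3389",
            )
        ],
    )
)
network.azure_firewalls.begin_create_or_update("firewall-rg", "my-firewall", firewall).result()
```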

For more information, see Deploy an Azure Firewall with multiple public IP addresses.

Next steps

For more information on everything we covered here, see the following:


Azure Firewall documentation
Azure Firewall forced tunneling
SQL FQDN filtering with Azure Firewall
Azure Firewall–multiple public IP addresses
What is Azure Firewall Manager preview
Use Azure Firewall for secure and cost-effective Windows Virtual Desktop protection

Source: Azure

Azure Files enhances data protection capabilities

Protecting your production data is critical for any business. That’s why Azure Files takes a multi-layered approach to ensuring your data is highly available, backed up, and recoverable. Whether it’s a ransomware attack, a datacenter outage, or a file share that was accidentally deleted, we want to make sure you can get everything back up and running again pronto. To give you peace of mind about your data in Azure Files, we are enhancing capabilities including the new soft delete feature, share snapshots, redundancy options, and access control for data and administrative functions.

Soft delete: a recycle bin for your Azure file shares

Soft delete protects your Azure file shares from accidental deletion, and we are announcing its preview for Azure file shares. Think of soft delete as a recycle bin for your file shares. When a file share is deleted, it transitions to a soft-deleted state in the form of a soft-deleted snapshot. You can configure how long soft-deleted data remains recoverable before it is permanently erased.

Soft-deleted shares can be listed, but to mount them or view their contents, you must undelete them. Upon undelete, the share will be recovered to its previous state, including all metadata as well as snapshots (Previous Versions).

We recommend turning on soft delete for most shares. If you have a workflow where share deletion is common and expected, you may decide to have a very short retention period or not have soft delete enabled at all. Soft delete is one part of a data protection strategy and can help prevent inadvertent data loss.

Soft delete is currently off by default for both new and existing storage accounts, but it will be enabled by default for new storage accounts created in the portal later this year. In the API, it will be on by default beginning January 1, 2021. You can toggle the feature on and off at any time during the life of a storage account, and the setting applies to all file shares within the account. If you are using Azure Backup, soft delete is automatically enabled for all protected instances. Soft delete does not protect against individual file deletions—for those, you should restore from your snapshot backups. To learn more about soft delete, read Prevent accidental deletion of Azure file shares.
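For illustration, here is a minimal sketch that lists soft-deleted shares and restores one using the azure-storage-file-share Python SDK; the connection string is a placeholder, and the loop simply restores the first deleted share it finds.

```python
# pip install azure-storage-file-share
from azure.storage.fileshare import ShareServiceClient

CONNECTION_STRING = "<storage-account-connection-string>"  # placeholder

service = ShareServiceClient.from_connection_string(CONNECTION_STRING)

# List shares, including soft-deleted ones, and restore the first deleted share found.
for share in service.list_shares(include_deleted=True):
    if share.deleted:
        print(f"Restoring '{share.name}' (version {share.version})")
        service.undelete_share(share.name, share.version)
        break
```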

Snapshot backups you can restore from

Snapshots are read-only, point-in-time copies of your Azure file share. They’re incremental, meaning they’re very efficient—a snapshot only contains as much data as has changed since the previous snapshot. You can have up to 200 snapshots per file share and retain them for up to 10 years. You can take these snapshots manually in the Azure portal, via PowerShell, or via the command-line interface (CLI), or you can use Azure Backup, whose snapshot management service for Azure Files recently became generally available. Snapshots are stored within your file share, meaning that if you delete your file share, your snapshots are also deleted. To protect your snapshot backups from accidental deletion, ensure soft delete is enabled for your share.
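In addition to the portal, PowerShell, and CLI, you can take a snapshot programmatically; here is a hedged sketch using the azure-storage-file-share Python SDK, with the connection string and share name as placeholders.

```python
# pip install azure-storage-file-share
from azure.storage.fileshare import ShareClient

CONNECTION_STRING = "<storage-account-connection-string>"  # placeholder

share = ShareClient.from_connection_string(CONNECTION_STRING, share_name="myshare")

# Take a point-in-time, read-only snapshot of the share.
snapshot = share.create_snapshot()
print("Snapshot taken at:", snapshot["snapshot"])
```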

Azure Backup handles the scheduling and retention of snapshots: you define the backup policy you want when setting up your Recovery Services vault, and Backup does the rest. Its new grandfather-father-son (GFS) capabilities mean that you can take daily, weekly, monthly, and yearly snapshots, each with its own distinct retention period. Azure Backup also orchestrates the enablement of soft delete and takes a delete lock on a storage account as soon as any file share within it is configured for backup. Lastly, Azure Backup provides key monitoring and alerting capabilities that give customers a consolidated view of their backup estate.

You can perform both item-level and share-level restores in the Azure portal using Azure Backup. All you need to do is choose the restore point (a particular snapshot), the particular file or directory if relevant, and the location (original or alternate) you wish to restore to. The backup service handles copying the snapshot data over and shows your restore progress in the portal.

If you aren’t using Azure Backup, you can perform manual restores from snapshots. If you are using Windows and have mounted your Azure file share, you can use File Explorer to view and restore from snapshots using the “Previous Versions” feature (meaning that users can perform item-level restores on their own). When used on a single file, it shows any versions that differ across previous snapshots. When used on an entire share, it shows all snapshots, which you can then browse and copy from.

You can also restore by copying data from your snapshots using your copy tool of choice. We recommend using AzCopy (requires the latest version, v10.4) or Robocopy (requires port 445 to be open). Alternatively, you can simply mount your snapshot and copy and paste the data back into your primary share.
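The same copy-based restore can be scripted; here is a hedged sketch that restores a single file from a snapshot back to the live share with the azure-storage-file-share Python SDK. The connection string, snapshot timestamp, share name, and file path are placeholders.

```python
# pip install azure-storage-file-share
from azure.storage.fileshare import ShareClient

CONNECTION_STRING = "<storage-account-connection-string>"  # placeholder
SNAPSHOT_TIME = "2020-06-01T00:00:00.0000000Z"             # snapshot timestamp (placeholder)
FILE_PATH = "reports/q2.xlsx"                              # file to restore (placeholder)

# Read the file from the snapshot view of the share ...
snapshot_share = ShareClient.from_connection_string(
    CONNECTION_STRING, share_name="myshare", snapshot=SNAPSHOT_TIME
)
data = snapshot_share.get_file_client(FILE_PATH).download_file().readall()

# ... and write it back to the live (primary) share.
live_share = ShareClient.from_connection_string(CONNECTION_STRING, share_name="myshare")
live_share.get_file_client(FILE_PATH).upload_file(data)
```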

If you are using Azure File Sync, you can also utilize server-side Volume Shadow Copy Service (VSS) snapshots with Previous Versions to allow users to perform self-service restores. Note that these are different from snapshots of your Azure file share and can be used alongside—but not as a replacement for—cloud-side backups.

Data replication and redundancy options

Azure Files offers different redundancy options to protect your data from planned and unplanned events, ranging from transient hardware failures and network or power outages to massive natural disasters. All Azure file shares can use locally redundant storage (LRS) or zone-redundant storage (ZRS). Geo-redundant storage (GRS) and geo-zone-redundant storage (GZRS) are available for standard file shares under 5 TiB, and we are actively working on geo-redundant storage for standard file shares of up to 100 TiB.

You can achieve geographic redundancy for your premium file shares in a couple of ways. You can set up Azure File Sync to sync between your Azure file share (your cloud endpoint) and a file share on a virtual machine (VM) in another Azure region (your server endpoint). You must disable cloud tiering to ensure all data is present locally, and note that data on the server endpoint may be up to 24 hours out of date, because changes made directly to the Azure file share are only picked up when the daily change detection process runs. Alternatively, you can create your own script to copy data to a storage account in a secondary region using a tool such as AzCopy (use version 10.4 or later to preserve access control lists (ACLs) and timestamps).

Access control options to secure your data

Another part of data protection is securing your data, and you have a few options here. Azure Files has long supported access control via the storage account key, which is Windows Challenge/Response (NTLM)-based and can be rotated on a regular basis; any user with storage account key access has superuser permissions. Azure Files now also supports identity-based authentication and access control over Server Message Block (SMB) using on-premises Active Directory (preview) or Azure Active Directory Domain Services (Azure AD DS). Identity-based authentication is Kerberos-based and allows you to enforce granular access control on your Azure file shares.

Once either on-premises Active Directory or Azure AD DS is configured, you can grant share-level access to Azure AD identities via built-in role-based access control (RBAC) roles or custom roles, and you can configure directory- and file-level permissions using standard Windows file permissions (also known as NTFS ACLs).

Multiple data protection strategies for Azure Files

Azure Files gives you many tools to protect your data. Soft delete for Azure file shares protects against accidental deletion, while share snapshots are point-in-time copies of your Azure file share that you can take manually or automatically via Azure Backup and then restore from. To ensure high availability, you have a variety of replication and redundancy options to choose from. In addition, you can ensure appropriate access to your Azure file share with identity-based access control.

Let us know what you think

We look forward to hearing your feedback on these features and suggestions for future improvements through email at azurefiles@microsoft.com. You can also upvote or add new suggestions for Azure Files via UserVoice.
Source: Azure