Data scientists assist medical researchers in the fight against COVID-19

Cutting-edge technological innovation will be a key component of overcoming the COVID-19 pandemic. Kaggle—the world’s largest community of data scientists, with nearly 5 million users—is currently hosting multiple data science challenges focused on helping the medical community better understand COVID-19, with the hope that AI can help scientists in their quest to beat the pandemic. The Kaggle community has been working hard: forecasting COVID-19 fatalities, summarizing the COVID-19 literature, and sharing its work under open-source Apache 2.0 licenses on Kaggle.com. In this post, we’ll take a detailed look at a few of the challenges underway right now, and some interesting strategies our community is using to solve them.

NLP vs. COVID-19

The volume of COVID-19 research is becoming unmanageable. In May there were about 357 scientific papers on COVID-19 published per day, up from 16 per day in February. In March, officials from the White House and global research organizations asked Kaggle to host a natural language processing (NLP) challenge with the goal of distilling knowledge from a large number of continuously released pre-print publications. Specifically, Kaggle’s community is trying to answer nine key questions drawn from both the National Academies of Sciences, Engineering, and Medicine’s Standing Committee on Emerging Infectious Diseases research topics and the World Health Organization’s R&D Blueprint for COVID-19. To answer these questions, we’re sharing a corpus of more than 139,000 scientific articles stored in a machine-readable format. Already, there’s a lot of interesting work being done using transformer language models such as SciBERT and BioBERT, and we encourage you to check out the code (Python/R), which has all been open-sourced.

Figure 1, for instance, illustrates the first two rows from an article summary table that describes recent findings concerning the impact of temperature and humidity on the transmission of COVID-19. Preliminary tables are generated by Kaggle notebooks that extract as much relevant information as possible, and the results are then double-checked for accuracy and missing values by a team of medical experts. The article summary tables contain text excerpts extracted directly from the original publications. Summary tables like these, which can be produced in an expedited fashion, make it much easier for researchers to keep up with the rapid rate of publication.

Figure 1: A representative article summary table from here. The articles are sorted chronologically, and the table provides information about the study results, the study type, and the study design. Each row also shows the title of the study, complete with a link to the full-text PDF, and a reference to the journal the article was published in.

“My initial approach was to build a semantic similarity index over the data, enabling researchers to find topic/keyword matches. I learned that while search is important, researchers need more context to evaluate the study behind the paper,” explained David Mezzetti, a US-based contributor on Kaggle and founder of NeuML. “Much of my efforts have been focused on using NLP to extract study metadata (design, sample size/method, risk factor stats), allowing researchers to not only find relevant papers but also judge the credibility of its conclusions.”
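
To give a feel for how such a semantic similarity index can work, here is a minimal sketch using sentence embeddings. The sentence-transformers package, the all-MiniLM-L6-v2 model (a stand-in for domain models like SciBERT or BioBERT), and the toy abstracts are our own illustrative choices, not code from any Kaggle notebook.

```python
# Minimal sketch of a semantic search index over paper abstracts.
# Assumes the `sentence-transformers` package; the model name and the
# toy abstracts below are illustrative, not the CORD-19 corpus itself.
from sentence_transformers import SentenceTransformer, util

abstracts = [
    "Higher ambient temperature is associated with reduced transmission.",
    "We model ICU demand under several lockdown scenarios.",
    "A BERT-based pipeline for extracting study design from preprints.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # stand-in for SciBERT/BioBERT
doc_emb = model.encode(abstracts, convert_to_tensor=True)

query = "effect of temperature and humidity on COVID-19 spread"
q_emb = model.encode(query, convert_to_tensor=True)

# Rank abstracts by cosine similarity to the query.
scores = util.cos_sim(q_emb, doc_emb)[0]
for score, text in sorted(zip(scores.tolist(), abstracts), reverse=True):
    print(f"{score:.3f}  {text}")
```
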
Time series forecasting vs. COVID-19

On March 23, Kaggle also started hosting a series of global transmission forecasting competitions to explore new approaches to modeling that may be useful for epidemiologists. The goal is to predict the total number of infections and fatalities for various regions—with the idea being that these numbers should correlate well with the actual numbers of hospitalizations, ICU patients, and deaths—as well as the total quantity of scarce resources that will be needed to respond to the crisis.

Forecasting COVID-19 has been a very challenging task, but we hope that our community can generate approaches that are useful for medical researchers. So far, the results have been promising. As we can see in the plot below, the winning solution from the Kaggle competitions performed on par with the best epidemiological models in April in terms of RMSLE—Root Mean Square Log Error, a measure of the differences between the logarithms of the predicted and actual values—for predicting fatalities in 51 U.S. states and territories over the following 29 days. (Models may have been optimized for varying objective functions, so this is an approximate comparison.)

Figure 2: Measurements of error for four different COVID-19 forecasting models, taken from here. The y-axis is the root mean square log error (RMSLE) for predictions over the next 29 days; lower is better.

“This competition series showed that it is still a challenging problem to solve and currently a combination of transforming data into a consumable format from various sources, understanding the difference in modelling short-term forecasts vs. long-term forecasts, and using simpler machine learning models with some adjustments seems to perform the best,” said Rohan Rao, a Kaggle competitor based in India. “I hope with more data availability and research of how the virus spreads in various countries, we should be able to add intelligent features to improve and optimize these forecasts and tune it for each geography.”

Participants have had success using advanced ensembles of machine learning models such as XGBoost and LightGBM (ex1, ex2, ex3). Participants have also identified important sources of external data that can potentially help make more accurate predictions (ex1), including population size, population density, age distribution, smoking rates, economic indicators, and nationwide lockdown dates. By examining the relative contribution of different model features using techniques such as feature importances and SHAP values (SHapley Additive exPlanations), participants have been able to shed light on the factors that are most predictive in forecasting COVID-19 infections and fatalities. There is a lot of interesting work being done using neural networks and gradient-boosted machines, and we encourage you to check out the code (Python/R), which has all been open-sourced.
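
To make the metric and the modeling approach concrete, here is a minimal, self-contained sketch on synthetic data: a gradient-boosted regressor fit in log space, scored with RMSLE, and explained with SHAP values. The feature names and data are invented stand-ins for the external sources mentioned above, not code from any winning solution.

```python
# Hedged sketch: gradient-boosted forecaster, SHAP attribution, and the
# competition's RMSLE metric. All data here is synthetic.
import numpy as np
import lightgbm as lgb
import shap

rng = np.random.default_rng(0)
X = rng.random((500, 3))  # toy features, e.g. density, age, days since lockdown
y = np.expm1(3 * X[:, 0] + X[:, 2] + rng.normal(0, 0.1, 500))  # toy fatality counts

model = lgb.LGBMRegressor(n_estimators=200).fit(X, np.log1p(y))  # fit in log space
pred = np.expm1(model.predict(X))

def rmsle(y_true, y_pred):
    """Root Mean Square Log Error: penalizes relative, not absolute, error.
    log1p handles zero counts, which are common early in an outbreak."""
    return np.sqrt(np.mean((np.log1p(y_pred) - np.log1p(y_true)) ** 2))

print("RMSLE:", rmsle(y, pred))

# SHAP values quantify each feature's contribution to individual predictions.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)
print("mean |SHAP| per feature:", np.abs(shap_values).mean(axis=0))
```
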
Public data vs. COVID-19

Kaggle also hosted a dataset curation challenge with the goal of finding, curating, and sharing useful COVID-19-related datasets—especially those that can be useful for forecasting the virus’s spread. Winning submissions thus far include:

County-level Dataset for Informing the United States’ Response to COVID-19: describes behaviors concerning demographics, healthcare, and social distancing interventions that can potentially be used to predict the progress of the pandemic.
COVID-19 Lockdown Dates by Country: can potentially inform models by indicating a point in time when the rate of growth should slow.
COVID-19 Tests Conducted by Country: can potentially inform models of whether an increased number of infections reflects the spread of the disease or the spread of our testing capabilities.

By considering these regional policies, dates of enforcement, and testing protocols, you can draw much better data-driven conclusions.

Along those same lines, dataset publishers can also quickly spin up self-service tasks or challenges on Kaggle. For example, the Roche Data Science Coalition (RDSC) recently published a collection of publicly available COVID-related datasets and formed a challenge focused on answering the most pressing questions forwarded to them by frontline responders in healthcare and public policy. Kaggle is a free platform that allows all users to upload datasets, host data analysis challenges, and publish notebooks—and we encourage data scientists and data publishers to come together to fight COVID-19.

Conclusion

Data scientists across the globe are collaborating to help the medical community defeat COVID-19, and we could use your help. You can keep up to date with our challenges at kaggle.com/covid19, and see the progress our community is making towards the goals we’ve discussed here at kaggle.com/covid-19-contributions.
Source: Google Cloud Platform

Six reasons customers trust Azure to run their SAP solutions

As global organizations across every industry adjust to the new normal, SAP solutions are playing an increasingly vital role in addressing immediate needs and paving a path to a resilient future. Now more than ever, companies are realizing the value of running their SAP solutions in the cloud. While some are using advanced analytics to process their SAP data to make real-time business decisions, others are integrating their SAP and non-SAP data to build stronger supply chains. Whether it’s meeting urgent customer needs, empowering employees to make quick decisions, or planning for the future, customers running SAP solutions in the cloud have been well prepared to face the new reality. Check out how Walgreens delivers superior customer service with SAP solutions on Microsoft Azure.

Many organizations running their SAP solutions on-premises have become increasingly aware of the need to be more agile and responsive to real-time business needs. According to an IDC survey, 54 percent of enterprises expect future demand for cloud software to increase. As global organizations seek agility, cost savings, risk reduction, and immediate insights from their ERP solutions, here are some reasons many of the largest enterprises choose Microsoft Azure as their trusted partner when moving their SAP solutions to the cloud.

1. Running SAP solutions on Azure delivers immediate insights and increased agility

“Now that we have SAP in the cloud … we have a platform for digital innovation in the cloud … With Azure, we’ve lifted our entire IT landscape up to a higher level where we can drive experimentation with much less risk and much less cost.”—Sarah Haywood, Chief Technology Officer and Vice President of Technology at Carlsberg Group

Organizations running SAP solutions on Azure gain real-time and predictive insights that empower them to break into new ways of doing business. Azure offers the ability to tap into more than 100 cloud services, access SAP Cloud Platform, apply intelligent analytics, and also integrate with an organization’s existing productivity and collaboration tools such as Microsoft 365, Microsoft Teams, Microsoft Power Apps, and Microsoft Power BI.

With Azure, organizations can integrate their SAP and non-SAP data through an extensive portfolio of Azure data services and create real-time dashboard views of current operations using SAP and Microsoft business intelligence tools. Intelligent analytics deepen real-time and predictive insights, improving decision-making by helping organizations respond dynamically as business conditions change and understand how those changes affect their customers and products. Integration with Teams and Microsoft 365 improves team collaboration and enhances user experience and productivity. Using Microsoft Power Automate, Power Apps, and Power BI, organizations can create customized workflows, apps, and business insight reports without having to write any code.


2. An ever-evolving and growing set of Azure cloud services drives continuous innovation

“We are looking at drones, IoT, RFID sensors, artificial intelligence, chatbots, and every other futuristic technology you can think of to do mining better, and with Azure we have a broad foundation for exploring all that.”—Head of Enterprise IT Services, Rio Tinto

While Zuellig Pharma is building an app that uses Azure blockchain services and data from the SAP Business Suite on HANA to track and capture counterfeit products and illegal parallel imports in its region, Walgreens plans to use AI and machine learning to develop new customer offerings quickly and respond in real time to changes in the marketplace.

Customers such as Rio Tinto are using Azure’s secure and scalable IoT services to pilot a solution that takes real-time data from trucks, drills, smelters, and other equipment and analyzes it for equipment health, preemptive maintenance, supply chain efficiency, and other operational intelligence. Additionally, with DevOps using GitHub and Azure Kubernetes Service, customers can build, manage, and deploy applications on a massive global network.

3. Running SAP solutions on Azure offers cost savings

“We chose to migrate to Azure for three main reasons: cost, strategy, and speed … We saw a big cost advantage with SAP HANA on Azure over the cloud we currently used.”—David South, Director of Architecture at Coke One North America Services

A Forrester study showed customers achieved more than 100 percent ROI, a 50 percent reduction in data center costs, and a 100 percent reduction in SAP release delays by migrating their SAP systems to Azure. Moving to Azure not only eliminates capital expenditure and the cost of underutilized hardware, it also enables cost optimization practices such as on-demand scaling during peak usage periods, using cheaper storage tiers, and optimizing disaster recovery environments.

By running SAP solutions on Azure, organizations replace expensive, manual, error-prone processes with automated, flexible ones, and with a single ticket-to-solution experience, enterprises put data in employees’ hands and free them to focus on value-added activities.

4. Running SAP solutions on Azure offers immense flexibility and scalability

“Moving to Azure gives us the scalability we need … running SAP on Azure gives us the agility and flexibility we need to disrupt the healthcare industry in a way that improves our customers’ access to the products and services they need.”—Dan Regalado, Vice President of Global Technology Transformation and Strategic Partnerships, Walgreens

Customers across every industry run their largest production SAP landscapes on Azure because it is a proven cloud platform, certified by SAP to run their most mission-critical SAP applications. Azure offers the industry’s most performant and scalable cloud infrastructure—offering 192 GB to 12 TB SAP HANA-certified VMs in more regions than any other public cloud provider, along with support for both Linux and Windows operating systems. Azure offers on-demand scalability and agility that reduce time to market—customers can spin up or spin down resources as needed. For instance, Daimler AG reduced operational costs by 50 percent and increased agility by spinning up resources on demand in 30 minutes with SAP S/4HANA and Azure.

Azure also offers access to more than 1,000 pre-built integrations, out-of-the-box business services, SAP HANA services, and apps built by SAP and our partners. Customers such as Tate and Lyle appreciate that with Azure, they get access to compute, network, and storage resources preconfigured for SAP HANA that they didn’t have to build, install, or manage.

5. SAP solutions on Azure offer best-in-class security, compliance, and business continuity

“If you go to the Microsoft Trust Center, you can see the tremendous investment Microsoft makes in security certifications and compliance. It would have been very costly for Kennametal to implement that level of security within our own environment. Instead, we get to inherit it from Microsoft.”—John Johnston, Senior Manager, Global Information Security and Compliance, Kennametal

Azure’s intelligent security services are backed by a $1 billion annual investment in enterprise-grade security and compliance and by 3,500 cybersecurity professionals. Azure has the most compliance offerings of any public cloud. Azure offers best-in-class security services such as Azure Sentinel for SIEM, Azure Security Center for threat monitoring, and Azure Active Directory for identity management. Additionally, customers can leverage built-in availability and recovery options such as Azure Backup and Azure Site Recovery to ensure business continuity and data protection. Microsoft teams work closely with partners to ensure that critical systems remain online during migration, and offer a robust set of joint planning workshops, migration programs such as FastTrack, POCs, and training and certifications.

6. Organizations benefit from the trusted partnership between SAP and Microsoft

“We needed a provider that enjoys a close partnership with SAP, understands our needs, and can accelerate our migration and expand our capabilities. Azure answered every need.”—Joshua Sefchek, Manager of Cloud and Enterprise Services, Toyota Material Handling North America

After decades of working together to serve our customers, SAP and Microsoft deepened their relationship by signing the Embrace initiative. As part of Embrace, SAP will lead with Azure to move on-premises SAP ERP and SAP S/4HANA customers to the cloud through industry-specific best practices, reference architectures, and cloud-delivered services. Our engineering teams, co-located in Germany and Redmond, Washington, work together to develop joint reference architectures, product integration roadmaps, and best practices; our industry teams are jointly developing industry-specific transformation roadmaps; and our support teams have developed collaborative support models.

SAP and Microsoft have been partners for more than 25 years and are also mutual customers. Microsoft is the only cloud provider that’s been running SAP for its own finance, HR, and supply chains for the last 20 years, including SAP S/4HANA. Likewise, SAP has chosen Azure to run a growing number of its own internal system landscapes, including those based on SAP S/4HANA. Microsoft IT and SAP IT generously share their learnings from running SAP solutions on Azure with our customers. Check out the latest MSIT webinar and SAP IT webinar for some best practices.

More than 95 percent of Fortune 500 companies run their business on Azure. Our experience and history give us a powerful understanding of the needs of enterprise customers. Together with SAP, customers have trusted us with their most critical workloads for decades because we understand what it takes to support our customers in their journey to the cloud.

We look forward to seeing you this month at the virtual SAPPHIRE and ASUG events. Learn more about SAP solutions on Azure and read today’s announcement about new offerings to help our SAP customers optimize costs and increase agility. 
Source: Azure

Optimize costs and increase agility with the latest SAP on Azure offerings

SAP SAPPHIRE NOW is an event we look forward to year after year, as it’s always a place to meet our customers and learn how we can continue to support their evolving needs. This year, those conversations will take a different format, but thanks to technology, we can still connect with our customers across the globe.

We’re hearing from enterprises that more than ever before they need a trusted cloud partner to support business continuity, agility, and real-time decision making for their mission-critical business processes. In addition, they need help to manage costs effectively.

To help our customers achieve these goals, and as part of the Embrace initiative with SAP, today we're announcing a series of new offerings and reference architectures. These offerings unlock the power of Azure for SAP workloads, including integration with SAP Cloud Platform and Microsoft products like Office 365 and Power Platform.

M-series virtual machines (VMs): Our latest updates to M-series VM offerings will help increase agility through seamless scale-up and scale-down.
New use cases supported by reference architectures for integration scenarios: As part of our Embrace initiative with SAP, we are announcing five use cases supported by reference architectures that cover integration scenarios that will help our customers get immediate insights by integrating their SAP and non-SAP environments.
More Azure Large Instances options: We are also launching 18 new SKUs for Azure Large Instances including the largest Intel Optane bare metal instances available in the cloud, so customers can choose the optimum configuration for their workload profiles while optimizing costs.
New DevOps capabilities: We are making it even easier for customers to automate and integrate SAP workloads in Azure using ready-made building blocks to support a DevOps model.
NetWeaver-certified virtual machines: Our latest SAP NetWeaver-certified virtual machines deliver a lower price-to-performance ratio and help drive total cost of ownership (TCO) reduction.

M-series updates increase agility and cost effectiveness

One year ago at SAPPHIRE 2019, we introduced our Mv2 virtual machine series to support customers with databases from 6 TB to 12 TB, the largest-memory SAP HANA-certified configuration available on virtual machines in the public cloud. Since then, adoption has grown rapidly, with customers like Luxottica Group, Kennametal, Coats, and Accenture relying on Mv2 virtual machines for their production workloads.

Seamlessly scale up or down from 2 to 416 vCPUs and from 16 GiB to 12 TiB of memory

As the needs of your SAP workloads on Azure change, you can move to different virtual machine families or sizes simply by resizing your virtual machines, without worrying about the underlying hardware. With our latest investments in Gen2 virtual machine support for Mv1 (aka M-series), you can start small with your SAP database on the Esv3-series, move to larger sizes on the Mv1 series as your workload grows, and seamlessly scale up to the Mv2-series as your workloads approach more than 400 vCPUs and up to 12 TiB of memory. Learn more about the advantages of Gen2 virtual machines.
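
As an illustration of what such a resize looks like in practice, here is a hedged sketch using the Azure SDK for Python (azure-identity and azure-mgmt-compute). The subscription, resource group, VM name, and target size are placeholders, and some resizes require deallocating the VM first.

```python
# Hypothetical sketch: resize an SAP database VM to a larger M-series size.
# Assumes azure-identity and azure-mgmt-compute are installed and that
# DefaultAzureCredential can authenticate; all names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

client = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")

RG, VM = "sap-prod-rg", "sap-hana-db01"  # placeholder resource group / VM name

# Check which sizes the VM can be resized to on its current hardware cluster.
available = {s.name for s in client.virtual_machines.list_available_sizes(RG, VM)}
target = "Standard_M64s"  # example M-series target size
if target not in available:
    # Sizes outside the current cluster typically require deallocation first.
    client.virtual_machines.begin_deallocate(RG, VM).result()

vm = client.virtual_machines.get(RG, VM)
vm.hardware_profile.vm_size = target
client.virtual_machines.begin_create_or_update(RG, VM, vm).result()
client.virtual_machines.begin_start(RG, VM).result()
```
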

Achieve flexibility and agility with our expanding regional footprint for the M-series

The Mv1 (aka M-series) virtual machines are available in 34 regions and Mv2 virtual machines are available in 12 regions, and we are now expanding our regional availability footprint even further. For Mv1 (aka M-series) virtual machines we have recently added United Arab Emirates (UAE) Central and will be adding US West, US Central, and North Central US in 2020. For Mv2 virtual machines, we will be expanding to Brazil South, Germany West Central, Japan East, UAE Central, North Central US, and West US in 2020. Please refer to virtual machine availability by region for the latest regional availability.

Reduce software licensing costs with our new Mv2 constrained core sizes

Starting in July 2020, customers can constrain the Mv2 virtual machine vCPU count while maintaining the same memory, storage, and I/O bandwidth as the unconstrained-core Mv2 sizes.

New use cases supported by reference architectures drive immediate insights and agility by integrating SAP and non-SAP data

As part of the unique Embrace initiative between Microsoft and SAP, we are focused on accelerating time to business outcomes by helping customers integrate across Azure, SAP Cloud Platform, and Microsoft offerings. To support this, we have jointly released a series of five use cases supported by reference architectures and will continue to release unique integration patterns to create great customer experiences. The use cases announced today are focused on identity, workflow, and service integration. Over the coming months, we will start to integrate further with Office 365 Graph, Microsoft Teams, and Power Platform:

1. How to consume Microsoft Azure services in SAP Cloud Platform

2. Establish identity and authentication workflow between SAP and Microsoft

3. Extend SAP S/4HANA with SAP and Microsoft services

4. Simplify business process integration across SAP and Microsoft through enterprise integration and extension

5. Intelligently optimize and automate processes with SAP and Microsoft services

New Azure Large Instances SKUs help achieve a lower total cost of ownership and the fastest recovery times in the market

With the launch of 18 new SKUs, we now offer 24 SKUs powered by 2nd Generation Intel Xeon Platinum processors and supporting Intel® Optane™ persistent memory, making this the most comprehensive portfolio in the market. Learn more by reviewing our portfolio. Additionally, Azure Large Instances is the first to bring to market two unique capabilities:

NetApp SnapCenter, which enables customers to use a SnapCenter console to take and restore consistent live snapshots of databases as large as 96 TB within seconds.
A solution combining the power of bare metal with the agility of virtual machines: with the ability to mount Azure Large Instance database volumes on virtual machines, Azure Large Instances customers can now dynamically spin up virtual machines to accomplish tasks such as refreshes, consistency checks, and data distribution within minutes, at a significantly lower cost than before.

Latest DevOps offerings simplify monitoring, backup, and deployment of SAP workloads

We have added new capabilities to help our customers more easily deploy, monitor, and back up SAP solutions on Azure:

Simplify deployment with SAP Automation using Terraform and Ansible

Since the initial announcement of SAP Automation for Azure (v1), we have evolved the vision and broadened the scope of automation for SAP on Azure. Using the leading industry automation tools Terraform and Ansible, we are developing common building blocks to simplify deployment of SAP landscapes on Azure and to provide consistency across these deployments. Today, we are announcing that these building blocks will be available in an open-source GitHub repository (sap-hana) as v2 of SAP Automation by July 2020. The automation solution is based on the best practices specified by Microsoft and SAP as part of our reference architectures for SAP. You can learn more about the supported scenarios in our GitHub repository.

Seamlessly monitor SAP landscapes with Azure Monitor for SAP solutions

With the preview of Azure Monitor for SAP solutions, customers will be able to centrally collect and visually correlate telemetry data from Azure infrastructure and databases in one location for faster troubleshooting. Customers will be able to deploy Azure Monitor for SAP solutions resources with a few clicks from the Azure portal and monitor the following components: SAP HANA on Azure virtual machines or Azure Large Instances, SQL Server on Azure virtual machines, and Pacemaker high-availability clusters on Azure virtual machines or Azure Large Instances. With the preview starting in July 2020, the product will be available in the East US, East US 2, West US 2, and West Europe regions, with more regions to follow soon.

Ensure business continuity by instantly backing up your SAP HANA databases running on SUSE Linux Enterprise Server (SLES) and Red Hat Enterprise Linux (RHEL)

In addition to supporting SAP HANA workloads on SUSE Linux Enterprise Server (SLES), Azure Backup for SAP HANA workloads running on Red Hat Enterprise Linux (RHEL) is now in preview. Azure’s native backup solution for SAP HANA offers zero-infrastructure backup, one-click point-in-time restore, and long-term retention and backup management capabilities. The preview will be available across all Azure regions except Germany Northeast, Germany Central, France South, and US Gov Iowa. Please follow the tutorial for backing up your SAP HANA databases on RHEL-based systems.

NetWeaver-certified virtual machine updates

Achieve a lower price-to-performance ratio with our new NetWeaver-certified virtual machines

We are excited to be the first hyperscaler to offer new SAP NetWeaver-certified virtual machine families based on AMD EPYC™ 7452 processors. These new Dasv4 and Easv4 virtual machine families offer superior performance for the SAP application layer and SAP-supported databases (excluding HANA). The increased performance of these virtual machine families provides a lower price-to-performance ratio, driving down total cost of ownership. You can learn more about the global availability of these virtual machines by referring to virtual machine availability by region.

We are glad we could share these updates with you and we want to hear from you on how we could continue to build solutions to help you meet your evolving needs. We look forward to seeing you at virtual SAPPHIRE NOW, where we are sponsoring the Intelligent Enterprise track on Monday, June 15. To learn more, check out our latest blog on reasons customers trust Azure to run their SAP solutions or visit our website. Please share your feedback and join the conversation with other SAP experts on the Microsoft Tech Community.
Source: Azure

New general purpose and memory-optimized Azure Virtual Machines with Intel now available

Today we're announcing the availability of new general purpose and memory-optimized Azure Virtual Machines based on the 2nd-generation Intel Xeon Platinum 8272CL (Cascade Lake) processor. This custom processor runs at a base speed of 2.5 GHz and can achieve an all-core turbo frequency of 3.4 GHz. It features Intel® Deep Learning Boost Technology, Intel® Advanced Vector Extensions 512 (Intel® AVX-512), Intel® Turbo Boost Technology 2.0, and Intel® Hyper-Threading Technology.

With this announcement, we are introducing two new Azure Virtual Machine families, one of which represents a brand-new product category in our portfolio:

The Azure Ddv4, Ddsv4, Edv4, and Edsv4 virtual machines, which include a local temporary disk (now generally available)
The Azure Dv4, Dsv4, Ev4, and Esv4 virtual machines, a new category of virtual machines that rely on remote disks and do not provide temporary local storage (now in preview).

The new virtual machine (VM) sizes deliver roughly 20 percent better CPU performance compared to their predecessors, the Dv3 and Ev3 VM families.

New Ddv4, Ddsv4, Edv4, and Edsv4 VMs are generally available

The new Ddv4, Ddsv4, Edv4, and Edsv4 VM sizes include fast, larger local solid-state drive (SSD) storage and are designed for applications that benefit from low-latency, high-speed local storage, such as applications that need fast reads and writes to temporary storage or that use temporary storage for caches or temporary files. These new VM sizes offer 50 percent larger local storage, as well as better local disk IOPS for both read and write operations, compared to the Dv3/Dsv3 and Ev3/Esv3 sizes with generation 2 (Gen2) VMs. The new Ddv4, Ddsv4, Edv4, and Edsv4 VM sizes support standard HDD, standard SSD, premium SSD, and ultra SSD persistent disks.

The new Ddv4 and Ddsv4 VM sizes provide a good balance of memory-to-vCPU performance, with up to 64 vCPUs, 256 GiB of RAM, and include local SSD storage (up to 2,400 GiB). These VM families are ideal for development and testing, small to medium databases, and low-to-medium traffic web servers.

The new Edv4 and Edsv4 VM sizes feature a high memory-to-CPU ratio, with up to 64 vCPUs, 504 GiB of RAM, and also include local SSD storage (up to 2,400 GiB). These VM families are great for relational database servers and in-memory analytics.
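
To check the concrete vCPU, memory, and local disk figures for these families in a given region, a sketch along the following lines can help. It assumes the Azure SDK for Python (azure-identity and azure-mgmt-compute); the region and subscription values are placeholders, and the suffix match is just a convenience filter.

```python
# Hypothetical sketch: enumerate Ddv4/Ddsv4/Edv4/Edsv4 sizes in one region.
# Assumes azure-identity and azure-mgmt-compute; the region and subscription
# placeholders below are illustrative.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

client = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")

for size in client.virtual_machine_sizes.list(location="eastus"):
    # The "d" before the version suffix marks sizes with a local temporary disk.
    if size.name.endswith(("d_v4", "ds_v4")):
        print(f"{size.name}: {size.number_of_cores} vCPUs, "
              f"{size.memory_in_mb // 1024} GiB RAM, "
              f"{size.resource_disk_size_in_mb // 1024} GiB local temp disk")
```
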

New Dv4, Dsv4, Ev4, and Esv4 VMs now in preview

The Dv4, Dsv4, Ev4, and Esv4 VM sizes are new offerings that do not include a local temporary disk. These new VM families offer a 20 percent CPU improvement over the Dv3 and Ev3 VM families. The new Dv4 and Ev4 VM sizes support standard HDD and standard SSD persistent disks, while the Dsv4 and Esv4 VM sizes support standard HDD, standard SSD, premium SSD, and ultra SSD persistent disks. If you are currently using v3 VM sizes, switching to v4 sizes will provide you a better price-per-core performance option.

The new Dv4 and Dsv4 VM sizes provide a good balance of memory to vCPU performance, with up to 64 vCPUs and 256 GiB of RAM. These VM families are ideal for development and testing workloads, small-to-medium databases, and low-to-medium traffic web servers.

The new Ev4 and Esv4 VM sizes feature a high memory-to-CPU ratio, with up to 64 vCPUs and 504 GiB of RAM. These VM families are great for relational database servers and in-memory analytics.

Customers can request access to these new VMs (with no local temporary disk) currently in preview today by filling out this form. If you have any further questions or feedback, please reach out to us directly.

Working in collaboration with Intel while meeting our customer needs

“The launch of Azure D-v4 and E-v4-series virtual machines further extends the Microsoft IaaS portfolio to meet the diverse needs of our customers. Powered by 2nd Generation Intel® Xeon Scalable Processors, these virtual machines offer optimized application performance for web and data services, desktop virtualization and business applications moving to Azure.” —Jason Grebe, Intel CVP Cloud and Enterprise

With these new VM sizes, we are providing more customer value with better CPU performance.

“Silicon design workloads require high CPU performance, large number of cores, high memory-to-core ratios, and sufficient local storage. The newly introduced Edsv4 family meets all these requirements, making it an ideal choice for our use cases. Using the Edsv4 VMs, TSMC was able to successfully create a brand new Scale-Out/Scale-In silicon design strategy, helping designers achieve significant run-time speedup and cost optimization.” —Willy Chen, Deputy Director, Design & Technology Platform, TSMC

Frequently asked questions

Customers regularly ask what the differences are between the new VMs and the general purpose Dv3/Dsv3 or memory-optimized Ev3/Esv3 VM sizes that they’re currently using. The answer is that you’ll now have more options to choose from: the new Dv4, Dsv4, Ev4, and Esv4 sizes omit the local temporary disk, the Ddv4, Ddsv4, Edv4, and Edsv4 sizes include one, and all of them deliver roughly 20 percent better CPU performance than their v3 predecessors.

Customers also ask what happens if they still need a local temp disk for their VM. You can choose the Ddv4 and Ddsv4 or Edv4 and Edsv4 VM sizes for your application if a local disk is still required.

For more frequently asked questions related to these VM sizes, refer to Azure VM sizes with no local temp disk.

Region availability for Ddv4, Ddsv4, Edv4, and Edsv4 VM sizes

The new Ddv4, Ddsv4, Edv4, and Edsv4 VM families are available with Pay-As-You-Go, Reserved Instance, and Spot pricing. Prices vary by region; refer to virtual machine availability by region for the latest regional availability.

Get started today

Learn more about the Ddv4 and Ddsv4-series or Edv4 and Edsv4-series VMs (with local temporary disk), now generally available.
You can learn more about the Dv4 and Dsv4-series or Ev4 and Esv4-series VMs (without local temporary disk) that are currently in preview.
You can also request access to the new VMs currently in preview by filling out this form. If you have any further questions or feedback, please reach out to us directly.

Source: Azure

Be prepared for what’s next: Accelerate cloud migration

We are in the midst of unprecedented times, with far-reaching implications of the global health crisis for healthcare, public policy, and the economy. Organizations are fundamentally changing how they run their businesses, ensure the safety of their workforce, and keep their IT operations running. Most IT leaders we have had the opportunity to speak with over the past couple of months are thinking hard about how to adapt to rapidly changing conditions. They are also trying to retain momentum on their well-intentioned strategic initiatives.

Across our customers in many industries and geographies, we continue to see the cloud deliver tangible benefits. Azure enables our customers to act faster, continue to innovate, and pivot their IT operations to what matters most. We understand the challenges our customers are facing. We also recognize that our customers are counting on us more than ever.

Common, consistent goals for businesses

Even though our customers’ challenges are often unique to the industries they serve, we hear many common, consistent goals.

Cloud-based productivity and remote collaboration are enabling workers, IT professionals, and developers to work from anywhere. As our customers enable more remote work, there’s increased importance on scaling networking capacity while securely connecting employees to the resources they need.
Azure is critical to our customers’ ability to rapidly scale their compute and storage infrastructure to meet their business needs. This is made possible by how customers have transformed their IT operations with Azure; driving operational efficiency with Azure also enables businesses to scale on demand.
IT budgets will be constrained over the coming year—optimizing existing cloud investments and improving cash flow via migration to Azure are top of mind. Our customers are exploring ways to run their businesses with reduced IT budgets. Owning and managing on-premises datacenters is expensive and exposes customers to business continuity risk. An Azure migration approach resonates with these customers, transitioning spend to Opex, improving cash flow, and reducing business risk.
The downtime is also becoming an opportunity to accelerate projects. CIOs see a chance to deliver planned projects and find ways to innovate with Azure, and they are counting on this innovation to help their businesses make a steep recovery as we exit the current crisis.

In many of my discussions with customers, we still hear uncertainty about how to navigate the cloud migration journey. There is an urgency to act, but often a hesitation to start. There is, no doubt, a learning curve, but Microsoft has traversed it with many customers over the past few years. Businesses need best practices and prescriptive guidance on where to begin, how best to steer, and how to avoid the pitfalls. This blog aims to help you make progress on that pressing need. We’ll dive deeper into the steps of the cloud migration journey in upcoming posts.

To get you started on your accelerated journey to Azure, here are our top three recommendations. While they aren’t meant to be one-size-fits-all, they are based on learnings from the hundreds of scale migration engagements our team has helped customers with.

1. Prioritize assessments

Perform a comprehensive discovery of your datacenters using our free tools such as Azure Migrate or Movere. Creating an inventory of your on-premises infrastructure, databases, and applications is the first step in generating right-sized and optimized cost projections for running your applications in Azure. Between your existing configuration management database (CMDB), Active Directory, management tools, and our discovery tools, you have everything you need to make crucial migration decisions.

The priority should be to cover the entire fleet and then arrive at key decisions: which candidate apps you can migrate first and the appropriate migration approach for each. As you run your assessments, identify applications that could be quick wins—hardware refresh, software end-of-support, OS end-of-support, and capacity-constrained resources are all great places to prioritize for the first project. A bias toward action on triggers that need immediate attention can ensure that you drive operational efficiencies and flip your capital expenses to operational ones.

Many Azure customers are doing this effectively. One example is GlaxoSmithKline (GSK). In partnership with Azure engineering and Microsoft FastTrack for Azure, and by leveraging the Azure Migration Program (AMP), GSK was able to quickly discover their VMware virtual machines and physical servers with Azure Migrate. By leveraging features such as application inventory and application dependency mapping, GSK built a prioritized list of applications to migrate. They then combined the discovery and assessment data with their CMDB to build Power BI dashboards that track the progress of their strategic migration initiatives.

“Microsoft engineering and FastTrack’s ability to quickly aggregate and visualize our application hosting estate is the cornerstone to our migration planning activities. GSK is comprised of many different business units, and we are able to tailor migration priorities for each of these business units. In addition, we also now have clear visibility for each server, what they are dependent on, and can now also determine the appropriate server size in Azure to create our migration bundles and landing zones. With this excellent foundation of data, we are able to quickly move into the migration phase of our cloud journey with a high degree of confidence in our approach.”—Jim Funk, Director, Hosting Services, GlaxoSmithKline

2. Anticipate and mitigate complexities

You will run into complexities as you drive your migration strategy—some related to the foundational architecture of your cloud deployments, but many related to how your organization is aligned for change. It is important that you prepare people, business processes, and IT environments for the change, based on a prioritized and agreed-upon cloud adoption plan. Every migration we’ve been involved in has had its own unique requirements. We find that the customers who move quickly are those who have established clear ownership and requirements across stakeholders from security, networking, IT, and application teams.

“The migration to the cloud was more about the mindset in the organization and that transformation we needed to do in IT to become the driver of change in the company instead of maintaining the old. A big part of the migration was to reinvent the digital for the company." —Mark Dajani, CIO, Carlsberg Group

On the technical front, anticipate complexities and plan your platform foundation for identity, security, operations, compliance, and governance. With established baselines across these shared architectural pillars, deploy purpose-built landing zones that leverage centralized controls. Simply put, landing zones and the platform foundation capture everything that must be in place and ready to enable cloud adoption across the IT portfolio.

In addition to designing your baseline environment, you would also want to consider your approach to managing your applications as they migrate to Azure. Azure offers comprehensive management solutions for backup, disaster recovery, security, monitoring, governance, and cost management, which can help you achieve IT effectiveness as you migrate. Most customers run in a hybrid reality even when they intend to evacuate on-premises datacenters. Azure Arc is a terrific option for customers who want to simplify complex and distributed environments across on-premises and Azure, extending Azure management to any infrastructure.

3. Execute iteratively

Customers who have the most success in executing on their migration strategy are customers who follow an iterative, workload-based, wave-oriented approach to migration. These customers are using our free first-party migration tools to achieve the scale that works best for their business—from a few hundred to thousands of servers and databases. With Azure Migrate you have coverage for Windows Server and Linux, SQL Server and other databases, .NET and PHP-based web applications, and virtual desktop infrastructure (VDI). These capabilities give you options for migration to infrastructure as a service (IaaS) and platform as a service (PaaS) offerings like Azure App Service and Azure SQL.

The key to success is targeting specific workloads and then executing in phases. In addition, leveraging capabilities like dependency mapping and test migration ensures that your migration cutovers are predictable and have high success rates. We strongly recommend a lift-optimize-shift approach, followed by innovating in the cloud, especially during these times.

One customer that has leveraged the Azure Migrate toolset as part of its cloud transformation is the Malaysian telecommunications operator Celcom. Celcom used Azure Migrate’s discovery and assessment features to securely catalog their applications, virtual machines (VMs), and other IT assets, and to determine the best way to host them in the cloud. With their foundational architecture and management strategy in place, Celcom executed in waves, transitioning their complex multi-vendor on-premises environment with multiple applications over to Azure. Read more about Celcom’s digital transformation with Azure.

Share your feedback

In the coming weeks and months, we will dive deeper into these topics. Please share your experiences or thoughts as this series comes together in the comments below—we appreciate your feedback. You can also visit Azure migration center to learn more and get started.
Source: Azure