Help us shape new Azure migration capabilities: Sign up for early access!

Based on Azure Migrate and Azure Site Recovery usage trends, we know that many of you are well along on your Azure migration journey. We’re now working on the next wave of innovation to further enhance and simplify your migration experience. We have a great opportunity for you to influence and shape product direction through early access to new capabilities.

Our goal is to deliver an integrated end-to-end migration experience that enables you to discover, assess, and migrate servers to Azure. To that end, our roadmap includes several new capabilities, including a new user experience with partner tool integration, Hyper-V environment assessment, and server migration enhancements. You are welcome to migrate your workloads to Azure using these new features, and we will back you with production support.

If you’d like to be part of this awesome opportunity, please fill out and submit this form as soon as possible. We will review your submission and follow up with onboarding steps, including detailed guidance on how to participate and provide feedback.

Your feedback is extremely valuable in helping us improve our product offerings. We look forward to sharing more about what we’ve been working on and to hearing your input!

Regards,

Azure Migrate Team
Source: Azure

Completers in Azure PowerShell

Since version 3.0, PowerShell has supported applying argument completers to cmdlet parameters. These argument completers allow you to tab through a set of values that are valid for the parameter. However, unlike ValidateSet, which enforces that only the provided values are passed to the cmdlet, argument completers do not restrict the values that can be passed to the string parameter. Additionally, argument completers can return either a static or a dynamic set of strings. Using this feature, we have added argument completers to the Azure PowerShell modules that allow you to select valid parameter values without having to query Azure yourself; the completers make the required calls to Azure to obtain the valid values on your behalf.
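To make the distinction concrete, here is a rough Python analogy (an illustration only; the real feature is implemented in PowerShell, and the region names are examples): a completer merely suggests values, while ValidateSet-style validation rejects anything outside the allowed list.

```python
ALLOWED = ["eastus", "eastus2", "westeurope", "southeastasia"]

def complete(prefix):
    """Completer behavior: suggest values that start with the prefix."""
    return [v for v in ALLOWED if v.startswith(prefix)]

def validate(value):
    """ValidateSet behavior: reject anything outside the allowed list."""
    if value not in ALLOWED:
        raise ValueError(f"{value!r} is not in the allowed set")
    return value

print(complete("east"))   # ['eastus', 'eastus2']
print(complete("mars"))   # [] -- yet the value could still be passed through
print(validate("westeurope"))
```

Note that `complete("mars")` returns no suggestions but does not block the value, which is exactly why completers work for parameters whose full value set cannot be enumerated up front.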

To best capture the functionality of the completers, I have modified the key binding for Tab in the examples below to display all the possible values at once. If you want to replicate this setup, simply run “Set-PSReadLineKeyHandler -Key Tab -Function Complete”.

Location completer

The first completer that we created was the Location completer. Since each resource type has a distinct list of available Azure regions, we wanted to create an easy, quick way to select a valid region when creating a resource. Thus, for every parameter in our modules which accepts an Azure region (which in most cases is called Location), we added an argument completer that returns only the regions in which the resource type can be created. In the example below, you can see the result of pressing tab immediately after -Location for the New-AzResourceGroup cmdlet.

In addition to listing all of the available regions in which a Resource Group can be created, the Location completer allows you to filter the results by typing the first few characters of the region you are looking for.

Resource Group Name completer

The second completer that we added to the PowerShell modules is the Resource Group Name completer. This completer was applied to all parameters which accept an existing resource group and returns all resource groups in the current subscription. Similar to the Location completer, you can filter the results by typing the first few characters of the resource group before pressing tab.

Resource Name completer

The third completer that we added to the PowerShell modules is the Resource Name completer. This completer returns the list of names of all resources that match the resource type required by the parameter. Additionally, this argument completer will filter by the resource group name if it has already been provided to the cmdlet invocation. For example, in the screenshot below, when we tab after typing “Get-AzVM -Name test,” we see all four VMs in the current subscription that start with “test.” Then, when we tab after typing “Get-AzVM -ResourceGroupName maddie1 -Name test,” we only see the two VMs that are contained in the “maddie1” resource group.

Not only does the Resource Name completer filter by the resource group name, but, for all subresources, it also filters by the parent resources, if they are provided to the cmdlet invocation. The results will be filtered by each of the parent resources provided. In the example below, you can see the results of tab completion over “maddiessqldatabase” for various combinations of parameters being provided.
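Conceptually, the narrowing works like this hedged Python sketch (the VM names and resource group names are invented for illustration; the actual completer queries Azure):

```python
VMS = [
    {"name": "test1", "resourceGroup": "maddie1"},
    {"name": "test2", "resourceGroup": "maddie1"},
    {"name": "test3", "resourceGroup": "other"},
    {"name": "test4", "resourceGroup": "other"},
]

def complete_name(prefix, **parents):
    """Suggest resource names matching the prefix, narrowed by any
    parent parameters (e.g., resource group) already provided."""
    matches = [r for r in VMS if r["name"].startswith(prefix)]
    for key, value in parents.items():
        matches = [r for r in matches if r.get(key) == value]
    return [r["name"] for r in matches]

print(complete_name("test"))                           # all four VMs
print(complete_name("test", resourceGroup="maddie1"))  # only two
```

Each parent parameter that has been typed simply adds one more filter, which is why providing more of the cmdlet invocation yields a shorter, more relevant completion list.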

At the moment, this completer has only been applied to the Compute, Network, KeyVault, and SQL modules. If you enjoy this feature and would like to see it applied to more modules, please let us know by sending us feedback using the Send-Feedback cmdlet.

Resource Id completer

The final completer that we added to the Az modules is the Resource Id completer. This completer returns all resource Ids in the current subscription, filtered to the resource type that the parameter requires. The Resource Id completer allows you to filter the results by typing a few characters, using the '*<characters>*' wildcard pattern. This completer was applied to all parameters in our cmdlets that accept an Azure resource Id.
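The wildcard matching can be sketched in Python with the standard library (the resource Ids below are made up for illustration):

```python
from fnmatch import fnmatchcase

# Hypothetical resource Ids standing in for what the completer fetches.
IDS = [
    "/subscriptions/0000/resourceGroups/rg1/providers/Microsoft.Compute/virtualMachines/testvm",
    "/subscriptions/0000/resourceGroups/rg2/providers/Microsoft.Compute/virtualMachines/prodvm",
]

def complete_id(pattern):
    """Return the resource Ids matching a '*<characters>*' wildcard pattern."""
    return [i for i in IDS if fnmatchcase(i, pattern)]

print(complete_id("*testvm*"))  # only the Id containing "testvm"
```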

Try it out

To try out Azure PowerShell for yourself, install our Az module via the PowerShell Gallery. For more information about our new Az module, please check out our Az announcement blog. We look forward to getting your feedback, suggestions or issues via the built-in “Send-Feedback” cmdlet. Alternatively, you can always open an issue in our GitHub repository.
Source: Azure

Modernizing payment management for online merchants

E-commerce merchants all over the world are innovating every day to offer customers the best user experience. To keep customers coming back, the buying experience should leave only good impressions, from beginning to end. To achieve this, merchants want to examine every step—especially the payment checkout. So, payment processors need to complement and support the innovations of the merchants. And the final experience needs to be as intuitive and seamless as possible, so it does not break the checkout flow; it should support the brand experience and leave customers with a pleasing memory. Helping a merchant craft a seamless payment experience is the domain of Newgen.

Solution key features

Guru is Newgen's fully integrated portal that enables merchants to have a complete view of their payments, generate reports, capture/void transactions, and perform refunds. It is a fully managed SaaS solution that comes as a value addition with Newgen's Payment Gateway, a cutting-edge payment technology for merchants. The solution competes in the market with these key features:

Intelligent transaction routing: Newgen’s engine automatically routes transactions taking into account the country, credit provider, volume and ratio (selecting the best destination based on the transaction amount), currency, and transaction fees. Using machine learning that is continually improving, the engine bases its decisions on platform health, performance, and fees—it will select the optimal route to maximize your gains. The service provides capabilities that would otherwise consume a merchant’s resources to reproduce.
Split payments: When friends and family want to split a charge, Newgen will make it easy for them to do so. The end user provides the email addresses for the participants, and sets the split ratio (for example, “evenly”). The engine generates the email inviting others to participate. The initiator can check the status of the payment progress.
Page builders: Your brand and UI should be distinct, and the payment process needs to be integrated into it. Newgen lets you build a custom UI with a drag-and-drop approach. You can craft the checkout experience to ensure your brand is present and reassuring to your customers.
Flash checkout: Requiring customers to fill in form data should be a one-time event. Newgen gives the customer the option of saving the data for reuse and instant checkout. The data is securely stored, and Newgen strictly adheres to PCI DSS level 1 for maximum security.
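The even-split case described above can be sketched in a few lines. This is a minimal illustration of the concept, not Newgen's actual algorithm: working in cents avoids floating-point rounding, and any remainder is spread one cent at a time so shares differ by at most one cent.

```python
def split_evenly(total_cents, participants):
    """Split an amount in cents so shares differ by at most one cent."""
    base, remainder = divmod(total_cents, participants)
    # The first `remainder` participants pay one extra cent.
    return [base + (1 if i < remainder else 0) for i in range(participants)]

shares = split_evenly(10000, 3)   # $100.00 among 3 people
print(shares)                     # [3334, 3333, 3333]
assert sum(shares) == 10000       # no cent is lost or invented
```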

Azure services

Guru is a fully cloud-based solution hosted completely on Microsoft Azure. It benefits from Azure’s highly scalable and secure technologies, with the flexibility to develop non-trivial technology stacks. Guru specifically uses these Azure technologies:

Azure Virtual Machines
Azure SQL Database
Azure Files
Azure Site Recovery
Azure Functions

Recommended next steps

Explore Newgen’s various solutions to see what works for you. Or, go to the Azure Marketplace, and click Contact me.
Source: Azure

Azure.Source – Volume 68

Now available in preview

Read Replicas for Azure Database for PostgreSQL now in preview

Scale out read-heavy workloads on Azure Database for PostgreSQL with read replicas, which enable continuous, asynchronous replication of data from one Azure Database for PostgreSQL master server to up to five Azure Database for PostgreSQL read replica servers in the same region. Replica servers are read-only except for writes replicated from data changes on the master. Stopping replication to a replica server causes it to become a standalone server that accepts reads and writes. Replicas are new servers that can be managed in similar ways as normal standalone Azure Database for PostgreSQL servers. For each read replica, you are billed for the provisioned compute in vCores and provisioned storage in GB/month.

Scaling out a BI reporting workload with read replicas
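To benefit from replicas, an application routes its traffic itself: writes go to the master, reads are spread across replica endpoints. A conceptual Python sketch (the server names are hypothetical, and real read/write detection would be more robust than this keyword check):

```python
from itertools import cycle

MASTER = "mydb-master.postgres.database.azure.com"
REPLICAS = cycle([
    "mydb-replica1.postgres.database.azure.com",
    "mydb-replica2.postgres.database.azure.com",
])

def endpoint_for(sql):
    """Route writes to the master and reads round-robin across replicas."""
    is_write = sql.lstrip().split()[0].upper() in {"INSERT", "UPDATE", "DELETE"}
    return MASTER if is_write else next(REPLICAS)

print(endpoint_for("INSERT INTO t VALUES (1)"))  # master
print(endpoint_for("SELECT * FROM t"))           # a replica
print(endpoint_for("SELECT * FROM t"))           # the next replica
```

Because replication is asynchronous, reads routed this way may briefly lag the master, which is the usual trade-off for scaling out read-heavy workloads.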

Now generally available

Announcing the general availability of Lsv2-series Azure Virtual Machines

Power your high-throughput and high-IOPS workloads including big data applications, SQL and NoSQL databases, data warehousing, and large transactional databases with Lsv2-series Azure Virtual Machines (VMs), which run on the AMD EPYC™ 7551 processor with an all-core boost of 2.55 GHz. The Lsv2-series VMs offer configurations from 8 to 80 vCPUs with simultaneous multi-threading. Each VM features 8 GiB of memory per vCPU and one 1.92 TB NVMe SSD M.2 device per 8 vCPUs, with up to 19.2 TB (10 x 1.92 TB) available on the 80-vCPU L80s_v2. We are launching the Lsv2-series in the following regions: East US, East US 2, West Europe, and Southeast Asia. We plan to make these new VMs available in more regions in the coming months, including West US 2 in February and North Europe in April.

Announcing the general availability of Query Store for Azure SQL Data Warehouse

Get insight into performance of your Azure SQL Data Warehouse queries using Query Store. The SQL Server Query Store feature provides you with insight on query plan choice and performance. It simplifies performance troubleshooting by helping you quickly find performance differences caused by query plan changes. Query Store automatically captures a history of queries, plans, and runtime statistics, and retains these for your review. It separates data by time windows so you can see database usage patterns and understand when query plan changes happened on the server. Query Store is available in all regions for all generations of SQL Data Warehouse with no additional charges.
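The kind of analysis Query Store enables can be sketched as follows: runtime statistics are bucketed into time windows, so a plan change between windows stands out. The data below is invented for illustration; Query Store itself exposes this through catalog views rather than application code.

```python
from collections import defaultdict

stats = [  # (query_id, plan_id, window, avg_duration_ms) -- made-up data
    (1, 10, "09:00", 12.0),
    (1, 10, "10:00", 11.5),
    (1, 20, "11:00", 95.0),  # plan change with a regression
]

# Index the latest plan and duration per query per time window.
plans_by_window = defaultdict(dict)
for query_id, plan_id, window, duration in stats:
    plans_by_window[query_id][window] = (plan_id, duration)

# Flag windows where the active plan changed from the previous window.
for query_id, windows in plans_by_window.items():
    seen = [(w, p, d) for w, (p, d) in sorted(windows.items())]
    for (w0, p0, d0), (w1, p1, d1) in zip(seen, seen[1:]):
        if p0 != p1:
            print(f"query {query_id}: plan {p0} -> {p1} at {w1}, "
                  f"avg duration {d0}ms -> {d1}ms")
```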

Also generally available

General availability: Global VNet Peering in Azure China cloud
General availability: Move MariaDB servers to new resource groups and subscriptions
Azure DNS: Getting ready for DNS Flag Day
Avere vFXT for Azure: New ARM Template Deployment now available
Azure Site Recovery: Azure VM disaster recovery updates
Schedules feature released for Azure Lab Services classroom labs
New release of the Microsoft Threat Modeling Tool


News and updates

Microsoft joins the SciKit-learn Consortium

SciKit-learn is a first-class, open-source machine-learning library in Python that is used by many companies and individuals around the world for scenarios ranging from fraud detection to process optimization. Microsoft joined the SciKit-learn consortium as a platinum sponsor and released tools to enable increased usage of SciKit-learn pipelines. Support is now available for using SciKit-learn in inference scenarios through the high-performance, cross-platform ONNX Runtime. The SKlearn-ONNX converter exports common SciKit-learn pipelines directly to the ONNX-ML standard format for use on Linux, Windows, or Mac. In addition, support is also available for SciKit-learn training in Azure Machine Learning to automatically generate the best SciKit-learn pipeline according to your training data and problem scenario.

Disaster Recovery support for Linux on VMware

Ensure business continuity by keeping your business apps and workloads running during outages. Azure Site Recovery supports replication of any workload running on a supported machine from a primary site to a secondary location, including all major Linux server versions on VMware. Site Recovery service is updated and improved on an ongoing basis. Over the last six months, we extended support for the latest OS version releases from multiple providers such as Red Hat Enterprise Linux (RHEL), CentOS, Ubuntu, Debian, SUSE, and Oracle.

VMware to Azure replication architecture

Azure Site Recovery: Disaster Recovery as a Service (DRaaS) for Azure, by Azure

Orchestrate and manage disaster recovery for your Azure VMs with Azure Site Recovery. Azure provides native high availability and reliability for your mission-critical workloads running on IaaS virtual machines (VMs). Improve your protection and meet compliance requirements using the disaster recovery provided by Azure Site Recovery. Check out this post to get a recap of all the new capabilities from the last few months, including support for zone-pinned Azure VMs and disaster recovery of Azure Disk Encryption-enabled VMs.

Adventure awaits: Azure Trivia is back!

If you loved playing #AzureTrivia on Twitter last year, it’s back! This year's #AzureTrivia celebrates your love of problem solving, growth, and adventure. Players will be taken on an exciting, mystical journey where you’ll not only pick up new skills, but test your technical prowess in order to unlock new lands and win some sweet, sweet swag. Check out @Azure on Twitter every Monday for a new Azure-related question to tackle. See this post for more details, including a handy FAQ and a link to the official rules.

Play #AzureTrivia – follow @Azure on Twitter

Additional updates

Extend alerts created in the Azure Government cloud's OMS portal to Azure
What's new in Azure Log Analytics – January 2019 
In Development: AKS Pod Identity, AKS cluster auto-upgrade, Node auto-repair support for AKS, AKS private cluster, Availability Zones (AZ) support for AKS, Multiple node pools for your AKS workloads, Authorized IP Ranges for Kubernetes API server, AKS pod security policy, AKS cluster autoscaling, AKS control plane audit logs, and AKS Network Policy

Tech content

Development, source control, and CI/CD for Azure Stream Analytics jobs

Stream Analytics Visual Studio tools together with Azure Pipelines provide an integrated environment that helps you develop and source control your Azure Stream Analytics jobs and set up automated processes to build, test, and deploy these jobs to multiple environments. In this post, Jie Su covers the end-to-end development and CI/CD process using Stream Analytics Visual Studio tools, Stream Analytics CI/CD NuGet package, and Azure Pipelines.

Azure Security Center can detect emerging vulnerabilities in Linux

Recently a flaw was discovered in PolKit, a component that controls system-wide privileges in Unix-like operating systems. This vulnerability potentially allows an unprivileged account to gain root permissions. In this blog post, learn about this vulnerability and how an attacker can easily abuse and weaponize it. In addition, learn how Azure Security Center can help you detect threats and provide recommendations for mitigation steps.

An example security alert in Azure Security Center

QnA Maker simplifies knowledge base management for your Q&A bot

QnA Maker is an easy-to-use web-based service that makes it easy to power a question-answer application or chatbot from semi-structured content like FAQ documents and product manuals. With QnA Maker, developers can build, train, and publish question and answer bots in minutes. Active Learning in QnA Maker helps identify and recommend question variations for any question and allows you to add them to your knowledge base. Your knowledge base content won’t change unless you choose to add or edit the suggestions to the knowledge base.

Configuring Global Multi-region Reads and Writes for Mongo DB with Azure Cosmos DB

Jay Gordon shows you how quickly you can configure your data's consistency level and replication across the Azure Cloud.


Using PowerShell to Import and Export Azure Blueprints

In the second part of a series on Azure Blueprints, Sonia Cuff writes about a new command in the PowerShell Gallery to import and export Azure Blueprints and their artifacts (JSON and infrastructure as code). The post also describes use cases (multi-subscription, promoting test to production, multi-tenancy/multi-org) and command syntax, and links to official resources.

Build Pipelines for GitHub Projects

If you have a project on GitHub, chances are you will want to build it continually to ensure it still compiles and all tests pass, and possibly to create a release of the project so that you and others can simply use the latest version without having to manually compile and package it. For all that, you need a build pipeline. In this post, you'll learn how to use Azure Pipelines' GitHub integration for more complicated builds and projects, including installing from the GitHub Marketplace, finding your repo, and defining your first build.
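As a sketch of what such a pipeline definition can look like, here is a minimal hypothetical azure-pipelines.yml; the trigger branch, pool image, and script steps are illustrative assumptions, not taken from the post:

```yaml
# Minimal illustrative pipeline: build and test on every push to master.
trigger:
  - master

pool:
  vmImage: 'ubuntu-16.04'   # hosted agent image (assumed)

steps:
  - script: make build
    displayName: 'Build'
  - script: make test
    displayName: 'Run tests'
```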

Azure shows

Episode 264 – OnMSFT.com migrating to Azure | The Azure Podcast

The team talks to Kip Kniskern, managing editor of OnMSFT.com, about his impressions of Azure after he finished migrating OnMSFT.com to Azure.


Azure is the new mainframe | Azure Friday

Steve Steuart from Astadia joins Scott Hanselman to show how a mainframe reference architecture based on Azure enables you to deploy your mainframe assets to a native Azure environment. You can then leverage all the available Azure services with your new mainframe in the cloud, including AI, Power BI, IoT, and Machine Learning.

A Closer Look at Intelligent Retail | AI Show

Get ideas about how to build engaging conversational applications using this fun retail example that leverages services from across Microsoft.

Five Things About JavaScript in DevOps | Five Things

What does Azure DevOps have to do with generators, Captain Kirk, and ponies? Where can you get therapy for your VB 6 scars? Why don't Angular, React, and Vue developers have a cool logo like Docker's? How can you automate your development pipeline using whatever tools you want with Azure DevOps? We turn to the triple amazing Donovan Brown to lay this all out.

What’s new with Windows IoT in 2019 | Internet of Things Show

Find out what Microsoft Windows IoT has planned for 2019! Get an early look into how Windows IoT can help you quickly build safe, smart devices that can be the foundation of your IoT solutions.

How to create a virtual machine in Azure | Azure Portal Series

Azure provides multiple virtual machine configuration options for you to set up your virtual machine exactly how you want for whatever you're trying to do. In this video of the Azure Portal “How To” Series, learn how to configure a basic virtual machine and connect to it in minutes in the Azure portal.

How to push a container image to a Docker Repo | Azure Tips and Tricks

In this edition of Azure Tips and Tricks, learn how to push a container image to a Docker Repo. Once you've signed up for a Docker account, you can easily come to the Docker Hub and push your containers to your registry.

Overview of Azure Database for MySQL in Azure Government | Azure Government

In this episode of the Azure Government video series, Steve Michelotti talks with Sachin Dubey, of the Azure Government Engineering team, about Azure Database for MySQL in Azure Government. MySQL is one of the most popular open-source relational databases ever, but running it at enterprise scale has its challenges, just like any other database. Now, with a fully managed PaaS offering, Azure Database for MySQL unites the underlying MySQL database that developers know and love with the world-class security, scalability, and reliability of Azure. Sachin also discusses the benefits of Azure Database for MySQL and demonstrates how easy it is to get up and running with it in Azure Government.

Reviewing Current Azure DevOps News, Tips, and Strategies – Episode 2 | The Azure DevOps Podcast

In this week's episode, Jeffrey Palermo reviews some of the current industry news and tips, including: an interesting announcement in the AI space about Cortana, ServiceNow Change Management in Azure Pipelines, Azure DevOps Agents on Azure Container Instances (ACI), .NET Core 3 and 4.8, and an article about Razor Components. He also gives his 10 tips for rapidly recovering when a deployment breaks badly.


Events

The Things Network and Azure IoT connect LoRaWAN devices

Last week at The Things Conference in Amsterdam, Microsoft and The Things Network Foundation collaborated with 2,000 LoRaWAN developers, innovators, and integrators on connecting devices to Azure IoT Central using the open source project Azure IoT Central Device Bridge. If you are planning to develop, or are already developing, LoRaWAN solutions, join the project today and contribute your code, comments, and suggestions.

IoT Central dashboard

Make healthcare more intelligent by putting IoT into action

Intelligent healthcare, including IoT technology, is a key way to address the critical challenges facing the industry. Intelligent, connected solutions are lowering costs, improving patient care and health, and increasing clinician and patient satisfaction. If you’re planning to attend HIMSS, you’ll definitely want to register for IoT in Action in Orlando on February 11, 2019, where you’ll discover how IoT and intelligent solutions are transforming the healthcare industry. IoT in Action is a fantastic opportunity to share ideas and build connections in the rich Microsoft customer and partner ecosystem.

Customers, partners, and industries

Azure Marketplace new offers – Volume 30

The Azure Marketplace is the premier destination for all your software needs – certified and optimized to run on Azure. Find, try, purchase, and provision applications and services from hundreds of leading software providers. You can also connect with Gold and Silver Microsoft Cloud Competency partners to help your adoption of Azure. In the second half of December, we published 46 new offers.

Ansible solution now available in the Azure Marketplace

Get the Ansible solution template from the Azure Marketplace, which you can use to configure an Ansible instance following best practices with minimal Azure knowledge. With a handful of user inputs and a simple single-click deployment through the Azure portal, you can provision a fully configured Ansible instance in minutes, which can use Azure services anywhere across the globe. This solution template will install Ansible on a Linux VM based on CentOS 7.5 along with tools configured to work with Azure.

Transitioning big data workloads to the cloud: Best practices from Unravel Data

In this first of a five-part series, Shivnath Babu, CTO and Co-Founder at Unravel Data, discusses key considerations in planning for migrations. Unravel Data is an AI-driven Application Performance Management (APM) solution for managing and optimizing big data workloads. Unravel Data provides a unified, full-stack view of apps, resources, data, and users, enabling users to baseline and manage app performance and reliability, control costs and SLAs proactively, and apply automation to minimize support overhead. Upcoming posts will outline the best practices for the migration, operation, and optimization phases of the cloud adoption lifecycle for big data.

A Cloud Guru – Azure This Week – 1 February 2019 | A Cloud Guru – Azure This Week

This time on Azure This Week: Lars talks about Azure Security Center making it easier to stay compliant with regulatory requirements, OpenAPI Specification v3 support in Azure API Management now in preview, and the team from A Cloud Guru heading to The Ignite Tour in Sydney.

Source: Azure

Advancing tactical edge scenarios with Dell EMC Tactical Microsoft Azure Stack and Azure Data Box family

Today, Microsoft is announcing new intelligent cloud and intelligent edge capabilities for U.S. government customers. These new capabilities will help government customers uniquely address “the tactical edge”—that is, a dependence on information systems and connectivity in harsh environments or other situations where users have critical data availability, integrity, and transparency needs.

As U.S. government agencies support missions around the world, in remote locations, and beyond the reach of standard infrastructure, new technology is required for mission success. Microsoft offers a comprehensive portfolio designed to bring data analysis and insight to the tactical edge. Azure Stack and our Data Box family of products help government agencies with remote operations access the information they need to make decisions at the edge, along with access to the full range of cloud data analytics as connectivity allows.

Just last year we announced the integration of Azure Stack with Azure Government cloud, which unlocks a wide range of hybrid cloud use cases for government customers. By connecting the tactical edge to Azure Government, the mission-critical cloud for U.S. government customers and their partners, federal, state, and local government agencies can now operate with full regulatory compliance and the most up-to-date edge capabilities.

To that end, today, Microsoft, working with partners like Dell EMC, is sharing new capabilities that continue to deliver the power of the intelligent cloud and intelligent edge to government customers and their partners:

Dell EMC Tactical Microsoft Azure Stack, in partnership with Tracewell Systems, brings Azure-consistent cloud to operating environments where network connectivity is an issue or where mobility and high portability are required, for example in remote and rugged environments. To learn more, please check out Dell EMC’s announcement.

Azure Data Box family for Azure Government:

Azure Data Box Edge, an on-premises appliance with AI-enabled edge compute capabilities, is now available in preview in Azure Government. Azure Data Box Gateway, a virtual storage appliance, will be available in Azure Government in March 2019.

Azure Data Box will be available in Azure Government in March 2019. Azure Data Box Disk and Azure Data Box Heavy will be available in Azure Government in mid-2019. Together, they provide a spectrum of options to move data to Azure Government in a secure and simple way.

From supporting military and humanitarian missions to the needs of the U.S. State Department and other U.S. government organizations, these new systems offer unprecedented opportunities to expedite decision making and bring the power of the cloud to areas far beyond the reach of a traditional datacenter.

Enabling the tactical edge

As we have shared, in field operations, speed is of the essence and insights empower decisions. In certain situations, a connection can be a security liability or might not even be available. With Azure Stack, agencies can bring core and advanced cloud services to the edge — right to where they’re needed, making it possible to process data in the field without worrying about latency or even Internet connectivity.

See how Azure Stack brings intelligence and cloud services to remote sites here.

Being able to gather, discern, and distribute mission data is essential for making critical decisions. Tools that help process and transfer data directly at the edge make this possible. For example, Data Box Edge, with its in-built hardware acceleration for ML inferencing and light footprint, is useful to further the intelligence of forward-operating units or similar mission needs with AI solutions designed for the tactical edge. Data transfer from the field, which is complex and slow, is made seamless with the Data Box family of products.

This unites the best of edge and cloud computing to unlock never-before-possible capabilities like synthetic mapping and tracking water or air quality. From submarines to aircraft to remote bases, Azure Stack and Azure Data Box allow for the harnessing of the power of cloud at the edge.

Accelerating IT modernization

By merging the best of commercial innovation and investment with a secure, compliant, truly hybrid cloud platform, agencies can now up-level their infrastructures at a scale, pace, and speed that meets their unique needs.

Last year, we announced Azure Stack availability for Azure Government customers. With Azure Stack for Azure Government, agencies can efficiently modernize on-premises legacy applications that are not ready for, or a fit for, the public cloud due to cyber-defense concerns, regulations, or other requirements. These applications can be moved without any change to code, DevOps tools, processes, or people skills. Solutions like blockchain on Azure Stack enhance transparency and ensure data privacy for various government use cases.

Data Box products help agencies migrate large amounts of data, for example backup, archive, or big data analytics data, to Azure when they are limited by time, network availability, or costs.

Finally, I’d like to share an example of a customer solving a real-world challenge using an intelligent cloud and intelligent edge solution with Azure.

You can get started with Azure Stack by contacting our solution partners for details on how to order integrated systems. Or, learn more about Azure Stack by downloading the Azure Stack Development Kit, a single-server deployment designed for trials and proofs of concept.

Get started with the Azure Data Box Edge preview in Azure Government by signing up. You can learn more about the Data Box family of products at http://azure.com/DataBox.

Learn more about how the new paradigm of intelligent cloud and intelligent edge helps unlock new mission scenarios for government agencies by reading the blog post, “Enabling intelligent cloud and intelligent edge solutions for government.”
Source: Azure

Announcing the general availability of Lsv2-series Azure Virtual Machines

After wrapping up a successful preview with fantastic customer engagement, we are excited to officially announce the general availability of the Lsv2-series Azure Virtual Machines (VMs). Customers from all over the globe and across a broad range of industries participated in the Lsv2-series VMs preview during the second half of 2018.

General overview

The Lsv2-series features high throughput, low latency, and directly mapped local NVMe storage. The Lsv2 VMs run on the AMD EPYC™ 7551 processor with an all-core boost of 2.55 GHz. The Lsv2-series VMs offer configurations from 8 to 80 vCPUs with simultaneous multi-threading. Each VM features 8 GiB of memory per vCPU and one 1.92 TB NVMe SSD M.2 device per 8 vCPUs, with up to 19.2 TB (10 x 1.92 TB) available on the 80-vCPU L80s_v2.
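The per-size numbers follow directly from those ratios (one 1.92 TB NVMe device per 8 vCPUs and, per the size table in this post, 8 GiB of memory per vCPU); a quick arithmetic check:

```python
def lsv2_spec(vcpus):
    """Derive Lsv2 memory and local NVMe capacity from the vCPU count."""
    disks = vcpus // 8                      # one NVMe device per 8 vCPUs
    return {
        "memory_gib": vcpus * 8,            # 8 GiB of memory per vCPU
        "nvme_disks": disks,
        "nvme_tb": round(disks * 1.92, 2),  # 1.92 TB per device
    }

for size, vcpus in [("L8s_v2", 8), ("L32s_v2", 32), ("L80s_v2", 80)]:
    print(size, lsv2_spec(vcpus))
# L80s_v2 works out to 640 GiB of memory and 19.2 TB of local NVMe.
```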

Target workloads

The Lsv2-series is well suited for your high throughput and high IOPS workloads including big data applications, SQL and NoSQL databases, data warehousing, and large transactional databases. Examples include Cassandra, MongoDB, Cloudera, and Redis. In general, applications that can benefit from large in-memory databases are a good fit for these VMs.

We have worked closely with AMD to maximize potential customer value:

“Designed from the ground up for the modern IT enterprise, the AMD EPYC™ 7551 processor featured in the Microsoft Azure Lsv2 VM instance has today’s highest core count for a server processor, exceptional memory capacity and bandwidth, coupled with phenomenal I/O density. Combine this with the strength of Azure, and it creates a perfect environment for workloads such as in memory databases and big data. Microsoft Azure was the first global cloud provider to deploy AMD EPYC processors and we’re excited to continue this partnership as we target even greater performance in 2019.”

– Daniel Bounds, Senior Director, Datacenter Solutions, AMD

Performance

With learnings from our preview, we have optimized our Lsv2-series VMs to drive maximum performance from the local NVMe disks. These performance levels are possible thanks to the optimization of Windows Server 2019 on Azure and Canonical’s latest Ubuntu 18.04 and 16.04 releases in the Azure Marketplace. Throughout 2019 we will continue to add Lsv2-series optimized Linux distributions in the Azure Marketplace. Please check the documentation, “Storage optimized virtual machine sizes” for future updates.

Available configurations and regional availability

The Lsv2-series VMs are available in the following sizes:

| Size    | vCPUs | Memory (GiB) | NVMe Disk     | NVMe Disk Throughput (Read IOPS/MBps) |
|---------|-------|--------------|---------------|---------------------------------------|
| L8s_v2  | 8     | 64           | 1 x 1.92 TB   | 340,000 / 2,000                       |
| L16s_v2 | 16    | 128          | 2 x 1.92 TB   | 680,000 / 4,500                       |
| L32s_v2 | 32    | 256          | 4 x 1.92 TB   | 1,400,000 / 9,000                     |
| L64s_v2 | 64    | 512          | 8 x 1.92 TB   | 2,700,000 / 18,000                    |
| L80s_v2 | 80    | 640          | 10 x 1.92 TB  | 3,400,000 / 22,000                    |
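
The sizing pattern above is linear: memory and NVMe devices scale with vCPU count. As a quick sanity check, that relationship can be sketched in a few lines of illustrative Python (this is not an Azure API, just arithmetic over the published specs):

```python
# Illustrative sketch: Lsv2 sizes scale linearly with vCPU count —
# 8 GiB of memory per vCPU and one 1.92 TB NVMe device per 8 vCPUs.

def lsv2_specs(vcpus: int) -> dict:
    """Derive memory and local NVMe storage for an Lsv2 size."""
    if vcpus % 8 != 0:
        raise ValueError("Lsv2 sizes come in multiples of 8 vCPUs")
    nvme_devices = vcpus // 8
    return {
        "memory_gib": vcpus * 8,          # 8 GiB per vCPU
        "nvme_devices": nvme_devices,     # one 1.92 TB device per 8 vCPUs
        "nvme_total_tb": round(nvme_devices * 1.92, 2),
    }

# The largest size, L80s_v2: 80 vCPUs, 640 GiB, 10 x 1.92 TB = 19.2 TB
print(lsv2_specs(80))
```

Plugging in any row of the table reproduces its memory and disk columns.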

Your performance levels will vary depending on your workload, configuration, and choice of operating system. For example, on the L80s_v2 with 19.2 TB of NVMe direct to the VM, we were able to reach up to 3.7M read IOPS on Windows Server 2019 and 3.6M read IOPS on Ubuntu 18.04/16.04. To learn more watch the video, “Inside Azure datacenter architecture with Mark Russinovich” from Ignite 2018.

We are launching the Lsv2-series in the following regions: East US, East US 2, West Europe, and Southeast Asia. We plan to make these new VMs available in more regions in the coming months, including West US 2 in February and North Europe in April.

Next steps

You can get more information about the Lsv2-series by reading the documentation, “Storage optimized virtual machine sizes.” Lsv2-series VMs support pay-as-you-go, low priority, and 1 or 3 year reserved instance (RI) pricing for both Windows and Linux.

Source: Azure

Transitioning big data workloads to the cloud: Best practices from Unravel Data

Migrating on-premises Apache Hadoop® and Spark workloads to the cloud remains a key priority for many organizations. In my last post, I shared “Tips and tricks for migrating on-premises Hadoop infrastructure to Azure HDInsight.” In this series, one of HDInsight’s partners, Unravel Data, will share their learnings, best practices, and guidance based on their insights from helping migrate many on-premises Hadoop and Spark deployments to the cloud.

Unravel Data is an AI-driven Application Performance Management (APM) solution for managing and optimizing big data workloads. Unravel Data provides a unified, full-stack view of apps, resources, data, and users, enabling users to baseline and manage app performance and reliability, control costs and SLAs proactively, and apply automation to minimize support overhead. Ops and Dev teams use Unravel Data’s unified capability for on-premises workloads and to plan, migrate, and operate workloads on Azure. Unravel Data is available on the HDInsight Application Platform.

Today’s post, which kicks off the five-part series, comes from Shivnath Babu, CTO and Co-Founder at Unravel Data. This blog series will discuss key considerations in planning for migrations. Upcoming posts will outline the best practices for the migration, operation, and optimization phases of the cloud adoption lifecycle for big data.

Unravel Data’s perspective on migration planning

The cloud is helping to accelerate big data adoption across the enterprise. But while this provides the potential for much greater scalability, flexibility, optimization, and lower costs for big data, there are certain operational and visibility challenges that exist on-premises that don’t disappear once you’ve migrated workloads away from your data center.

Time and time again, we have experienced situations where migration is oversimplified and considerations such as application dependencies and system version mapping are not given due attention. This results in cost overruns through over-provisioning or production delays through provisioning gaps.

Businesses today are powered by modern data applications that rely on a multitude of platforms. These organizations desperately need a unified way to understand, plan, optimize, and automate the performance of their modern data apps and infrastructure. They need a solution that will allow them to quickly and intelligently resolve performance issues for any system through full-stack observability and AI-driven automation. Only then can these organizations keep up as the business landscape continues to evolve, and be certain that big data investments are delivering on their promises.

Current challenges in big data

Today, IT uses many disparate technologies and siloed approaches to manage the various aspects of their modern data apps and big data infrastructure.

Many existing monitoring solutions do not provide end-to-end support for big data environments, lack full-stack compatibility, or require complex instrumentation. This includes configuration changes to applications and their components, which requires deep subject matter expertise. The murky soup of monitoring solutions that organizations currently rely on doesn’t deliver the application agility that the business requires.

Consequently, this results in poor user experience, inefficiencies, and mounting costs as organizations buy more and more tools to solve these problems and then have to spend additional resources managing and maintaining those tools.

Additionally, organizations see high Mean Time to Identify (MTTI) and Mean Time to Resolve (MTTR) figures because it is hard to understand the dependencies and stay focused on root cause analysis. The lack of granularity and end-to-end visibility makes it impossible to remedy all of these problems, and businesses are stuck in a state of limbo.

It’s not an option to continue doing what was done in the past. Teams need a detailed appreciation of what they are doing today, what gaps they still have, and what steps they can take to improve business outcomes. It’s not uncommon to see 10x or more improvements in root cause analysis and remediation times for customers who are able to gain a deep understanding of the current state of their big data strategy and make a plan for where they need to be.

Starting your big data journey to the cloud

Without a unified APM platform, the challenges only intensify as enterprises move big data to the cloud. Cloud adoption is not a finite process with a clear start and end date; it’s an ongoing lifecycle with four broad phases: planning, migration, operation, and optimization. Below, we briefly discuss some of the key challenges and questions that arise for organizations, which we will dive into in further detail in subsequent posts.

In the planning phase, key questions may include:

“Which apps are best suited for a move to the cloud?”
“What are the resource requirements?”
“How much disk, compute, and memory am I using today?”
“What do I need over the next 3, 6, 9, and 12 months?”
“Which datasets should I migrate?”
“Should I use permanent, transient, autoscaling, or spot instances?”
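
Answering the sizing questions above usually starts with aggregating what your clusters consume today and projecting that forward. A minimal, hypothetical sketch (the node figures and growth rate are placeholders, not recommendations):

```python
# Hypothetical sketch: aggregate per-node usage to estimate what a
# cluster consumes today, then project it forward at a monthly growth
# rate to answer the "3, 6, 9, and 12 months" question.

def estimate_requirements(nodes, growth_rate=0.10, months=12):
    """Sum per-node usage and project it at a monthly growth rate."""
    totals = {
        "vcpus": sum(n["vcpus"] for n in nodes),
        "memory_gib": sum(n["memory_gib"] for n in nodes),
        "disk_tb": sum(n["disk_tb"] for n in nodes),
    }
    projections = {}
    for horizon in (3, 6, 9, months):
        factor = (1 + growth_rate) ** horizon
        projections[f"{horizon}_months"] = {
            k: round(v * factor, 1) for k, v in totals.items()
        }
    return totals, projections

# Example: a small (fictional) 3-node Hadoop cluster
cluster = [{"vcpus": 16, "memory_gib": 128, "disk_tb": 24}] * 3
today, future = estimate_requirements(cluster)
print(today)  # → {'vcpus': 48, 'memory_gib': 384, 'disk_tb': 72}
```

In practice a tool such as Unravel Data derives these inputs from observed workload telemetry rather than static node specs, but the arithmetic of the planning exercise is the same.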

During migration, which can be a long running process as workloads are iteratively moved, there is a need for continuous monitoring of performance and costs. Key questions may include:

“Is the migration successful?”
“How does the performance compare to on-premises?”
“Have I correctly assessed all the critical dependencies and service mapping?”

Once workloads are in production on the cloud, key considerations include:

“How do I continue to optimize for cost and for performance to guarantee SLAs?”
“How do I ensure Ops teams are as efficient and as automated as possible?”
“How do I empower application owners to leverage self-service to solve their own issues easily to improve agility?”

The challenges of managing disparate big data technologies both on-premises and in the cloud can be solved with a comprehensive approach to operational planning. In this blog series, we will dive deeper into each stage of the cloud adoption lifecycle and provide practical advice for every part of the journey. Upcoming posts will outline the best practices for the planning, migration, operation, and optimization phases of this lifecycle.

About HDInsight application platform

The HDInsight application platform provides a one-click deployment experience for discovering and installing popular applications from the big data ecosystem. The applications cater to a variety of scenarios such as data ingestion, data preparation, data management, cataloging, lineage, data processing, analytical solutions, business intelligence, visualization, security, governance, data replication, and many more. The applications are installed on edge nodes which are created within the same Azure Virtual Network boundary as the other cluster nodes so you can access these applications in a secure manner.

Additional resources

Learn more about Azure HDInsight
Migrate on-premises Apache Hadoop clusters to Azure HDInsight
Get up to speed with Azure HDInsight: The comprehensive guide
Open Source component guide on HDInsight
HDInsight release notes
Ask HDInsight questions on MSDN forums
Ask HDInsight questions on StackOverflow

Source: Azure

Announcing the general availability of Query Store for Azure SQL Data Warehouse

Since our preview announcement, hundreds of customers have been enabling Query Store to provide insight on query performance. We’re excited to share the general availability of Query Store worldwide for Azure SQL Data Warehouse.

Query Store automatically captures a history of queries, plans, and runtime statistics and retains them for your review when monitoring your data warehouse. Query Store separates data by time windows so you can see database usage patterns and understand when plan changes happen.

Top three reasons to use Query Store right now

1. Find the full text of any query: Using the sys.query_store_query and sys.query_store_query_text catalog views, you can see the full text of queries executed against your data warehouse over the last 7 days.

SELECT
    q.query_id
    , t.query_sql_text
FROM
    sys.query_store_query q
    JOIN sys.query_store_query_text t ON q.query_text_id = t.query_text_id;

2. Find your top executing queries: Query Store tracks all query executions for your review. On a busy data warehouse, thousands or millions of queries may execute daily. Using the Query Store catalog views, you can retrieve the top executing queries for further analysis:

SELECT TOP 10
    q.query_id [query_id]
    , t.query_sql_text [command]
    , SUM(rs.count_executions) [execution_count]
FROM
    sys.query_store_query q
    JOIN sys.query_store_query_text t ON q.query_text_id = t.query_text_id
    JOIN sys.query_store_plan p ON p.query_id = q.query_id
    JOIN sys.query_store_runtime_stats rs ON rs.plan_id = p.plan_id
GROUP BY
    q.query_id
    , t.query_sql_text
ORDER BY 3 DESC;

3. Find the execution times for a query: Query Store also gathers runtime query statistics to help you focus on queries with variance in execution times. The variance can have a variety of causes, such as a large load of new data.

SELECT
    q.query_id [query_id]
    , t.query_sql_text [command]
    , rs.avg_duration [avg_duration]
    , rs.min_duration [min_duration]
    , rs.max_duration [max_duration]
FROM
    sys.query_store_query q
    JOIN sys.query_store_query_text t ON q.query_text_id = t.query_text_id
    JOIN sys.query_store_plan p ON p.query_id = q.query_id
    JOIN sys.query_store_runtime_stats rs ON rs.plan_id = p.plan_id
WHERE
    q.query_id = 10
    AND rs.avg_duration > 0;
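
Once rows like these have been fetched from the catalog views (for example over an ODBC connection), a little client-side post-processing can surface the queries with the most variable execution times. A minimal sketch; the sample rows below are hypothetical stand-ins for runtime-statistics output, not real data:

```python
# Illustrative sketch: rank queries by duration spread (max - min) to
# surface the ones with the most variable execution times. The sample
# rows are hypothetical stand-ins for catalog-view results.

rows = [
    {"query_id": 10, "avg_duration": 1200, "min_duration": 800, "max_duration": 5400},
    {"query_id": 11, "avg_duration": 900,  "min_duration": 850, "max_duration": 1000},
    {"query_id": 12, "avg_duration": 3000, "min_duration": 500, "max_duration": 9500},
]

def by_duration_spread(stats):
    """Return rows ordered by (max - min) duration, most variable first."""
    return sorted(stats,
                  key=lambda r: r["max_duration"] - r["min_duration"],
                  reverse=True)

for r in by_duration_spread(rows):
    print(r["query_id"], r["max_duration"] - r["min_duration"])
```

Queries near the top of that ordering are the best candidates for investigating data-load timing or plan changes.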

Get started now

Query Store is available in all regions for all generations of SQL Data Warehouse with no additional charges. You can enable Query Store by running the ALTER DATABASE <database name> SET QUERY_STORE = ON; command.

To get started, you can read the monitoring performance by using the Query Store overview topic. A complete list of supported operations can be found in the Query Store Catalog Views documentation.

Next steps

Azure SQL Data Warehouse continues to lead in the areas of security, compliance, privacy, and auditing. For more information, refer to the whitepaper, “Guide to enhancing privacy and addressing GDPR requirements with the Microsoft SQL platform,” on Microsoft Trust Center, or our documentation, “Secure a database in SQL Data Warehouse.”

For more information on Query Store in Azure SQL Data Warehouse, refer to the article, “Monitoring performance by using the Query Store,” and the Query Store DMVs, such as sys.query_store_query.
For feature requests, please vote on our UserVoice.
To get started today, create an Azure SQL Data Warehouse.
To stay up-to-date on the latest Azure SQL Data Warehouse news and features, follow us on Twitter @AzureSQLDW.

Source: Azure

The Things Network and Azure IoT connect LoRaWAN devices

This week, at The Things Conference in Amsterdam, Microsoft and The Things Network Foundation are collaborating with 2,000 LoRaWAN developers, innovators, and integrators on connecting devices to Azure IoT Central using the open source project Azure IoT Central Device Bridge.

Internet of Things (IoT) applications are about harnessing sensor and device data to transform processes and businesses. They require pervasive connectivity so that compute at the intelligent edge, connected devices, and sensors can communicate and share learnings with the intelligent cloud. The heterogeneous nature of IoT devices, networks, and infrastructures has led to the creation of different protocols and technologies for wirelessly connecting IoT devices, each addressing specific needs and requirements for battery consumption, range, security, frequency usage, and more.

LoRaWAN™ is one of these technologies – a specification developed by the LoRa Alliance as a low power, wide area networking protocol based on a star-of-stars topology in which gateways relay messages between end devices and a central network server. Many companies have adopted LoRaWAN and often offer IoT connectivity services that simplify connectivity for IoT devices. The Things Network, an active member of the LoRa Alliance, is a foundation that aims to build a global open LoRaWAN network and to support developers in building industrial-grade LoRaWAN solutions. It fosters an active global community of over 60,000 developers and offers a marketplace of LoRaWAN-compatible solutions, devices, and services.

To bring the intelligence of Microsoft Azure to existing IoT cloud solutions such as The Things Network, the Azure IoT team created the Azure IoT Central Device Bridge, an open source, ready-to-deploy project that leverages Azure services and makes it trivial to connect LoRaWAN devices to Microsoft’s SaaS offering for IoT. The combination of the two solutions, integrated in a matter of minutes, allows you to ingest data from The Things Network devices and sensors into Azure IoT Central, where it can be displayed, analyzed, and used to trigger actions in business applications.
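
At its core, a bridge like this transforms a network-specific uplink message into the flat telemetry shape a cloud IoT service expects. A minimal sketch of that mapping step; the field names below are hypothetical, not The Things Network’s or IoT Central’s exact schemas:

```python
# Hypothetical sketch of the payload mapping a device bridge performs:
# take a made-up LoRaWAN uplink envelope and flatten the decoded sensor
# fields into a simple telemetry record keyed by device ID.

def to_telemetry(uplink: dict) -> dict:
    """Map a hypothetical uplink message to a flat telemetry record."""
    return {
        "device_id": uplink["dev_id"],
        "measurements": dict(uplink["payload_fields"]),
    }

uplink = {
    "dev_id": "soil-sensor-01",
    "payload_fields": {"temperature_c": 21.5, "humidity_pct": 48},
}
print(to_telemetry(uplink))
```

The real bridge runs as deployable Azure resources and handles authentication and provisioning as well, but the heart of it is this kind of per-message translation.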

“The bridge with Azure IoT Central is a necessary step to enhance the experience and reduce complexity for LoRaWAN developers. We are thrilled to collaborate with the Azure IoT team on this open source project, which enables the community to develop end-to-end IoT applications faster and with less effort.”

– Alexander Overtoom, Head of Business Development at The Things Industries

The Things Conference brings 2,000 LoRaWAN developers, innovators, and integrators together and is the ideal playground to work with LoRaWAN experts on this simple and robust integration. We are eager to make the Azure IoT Central Device Bridge project grow with the help of the community and look forward to connecting the millions of LoRaWAN things to Azure IoT. If you are planning to or already developing LoRaWAN solutions, join the project today and contribute your code, comments, and suggestions!
Source: Azure

Make healthcare more intelligent by putting IoT into action

The healthcare industry is poised for massive transformation. With an aging population, chronic diseases, and increased regulations, healthcare costs continue to balloon—along with patient and provider expectations for better outcomes.

Intelligent healthcare, including IoT technology, is a key way to address these critical challenges. Intelligent, connected solutions are lowering costs, improving patient care and health, and increasing clinician and patient satisfaction. The opportunity is immense, with the potential IoT healthcare market projected to reach $158 billion by 2022.¹

Below I will share how IoT solutions can drive revenue and transform the healthcare industry. For even more information and to connect with other partners, I highly encourage you to attend the IoT in Action event in Orlando on February 11, 2019.

The intelligent health opportunity

Intelligent health solutions can solve any number of issues. Below, I’ve called out some key areas where we see partner IoT solutions making a difference.

Patient engagement: Intelligent cloud solutions that give patients more control of their health can help strengthen relationships with their providers, reduce re-admissions, and improve outcomes.
Secure communications: Solutions that empower care teams to efficiently coordinate and easily share patient insights help to deliver improved patient-centric care while supporting secure, compliant, and timely communications.
Operational efficiency: Solutions that reduce operational costs and improve outcomes by identifying patterns and trends, connecting data systems, and remotely monitoring critical systems improve healthcare delivery.
Precision care: Giving healthcare providers the ability to tailor treatments with genomics and remotely monitor patients enables more personalized care and better compliance.

A secure platform for data access and communications

With increasingly stringent privacy and security regulations, IoT solutions must ensure data confidentiality, integrity, and accessibility while meeting regulations such as HIPAA and GDPR. Azure Sphere is an intelligent, trusted, and secure platform upon which connected solutions that engage patients and empower care teams can be built. It offers an Azure Security and Compliance Blueprint that provides a secure, end-to-end foundation for organizations handling sensitive, regulated data and helps ensure a HIPAA- and HITRUST-ready environment.

A hospital in Missouri recently built their own IoT solution on Azure Sphere. They needed a more proactive way to remotely monitor babies born with congenital heart disease. They created a solution that collects vital signs such as heart rate, weight, and oxygen saturation. Healthcare providers can see the data within two minutes of its being collected and get alerts if there are issues, enabling them to provide proactive care in a secure way.
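
The alerting logic in such a solution can be as simple as checking each reading against clinician-configured bounds. An illustrative sketch; the thresholds here are arbitrary placeholders, not medical guidance:

```python
# Illustrative sketch: flag vital-sign readings that fall outside
# configured bounds. Thresholds are arbitrary placeholders, not
# clinical values.

THRESHOLDS = {
    "heart_rate_bpm": (100, 180),       # (low, high)
    "oxygen_saturation_pct": (90, 100),
}

def check_vitals(reading: dict) -> list:
    """Return a list of alert strings for out-of-range measurements."""
    alerts = []
    for name, value in reading.items():
        low, high = THRESHOLDS[name]
        if not (low <= value <= high):
            alerts.append(f"{name}={value} outside [{low}, {high}]")
    return alerts

reading = {"heart_rate_bpm": 195, "oxygen_saturation_pct": 96}
print(check_vitals(reading))
```

In a production system the thresholds would be set per patient by clinicians, and the alerts routed through a secure notification channel rather than printed.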

Intelligent health plays a key role in life sciences

The pharmaceutical industry faces its own pressures, such as reduced payments, shifting regulations, and increased innovation costs. It’s never been more critical to find ways to increase operational efficiencies and deliver better results at a lower cost.

Counterfeit drugs and product integrity are among the major issues plaguing the industry, resulting in not only lost income and productivity but also negative health impacts. But IoT-enabled solutions like Titan Secure from Wipro are helping to solve these challenges. Titan Secure ensures end-to-end integrity across the pharmaceutical supply chain by providing real-time shipment visibility, data streams, and alerts around the following key areas:

Temperature, humidity, vibration, and shock
Environmental excursion and tampering
Geospatial location and documents in blockchain

Gain greater insight into the benefits of using IoT when you register for the life sciences healthcare webinar on February 20, 2019.

Register for the Orlando IoT in Action event

If you’re planning to attend HIMSS, you’ll definitely want to register for IoT in Action in Orlando on February 11, 2019, where you’ll discover how IoT and intelligent solutions are transforming the healthcare industry. IoT in Action is a fantastic opportunity to share ideas and build connections in the rich Microsoft customer and partner ecosystem.

If you can’t attend the in-person event, take advantage of the IoT in Action webinar, Health efficiencies with IoT – Healthy outcomes for patients on February 20, 2019. Together, we’ll explore how IoT is helping the healthcare industry streamline and improve communications, medical supply transport, and regulation compliance.


¹ Markets and Markets, “IoT Healthcare Market by Component,” available at: https://www.marketsandmarkets.com/Market-Reports/iot-healthcare-market-160082804.html (accessed January 8, 2019)
Source: Azure