How HSBC built its PayMe for Business app on Microsoft Azure

Bank-grade security, super-fast transactions, and analytics 

If you live in Asia or have ever traveled there, you’ve probably witnessed the dramatic impact that mobile technology has had on all aspects of day-to-day life. In Hong Kong in particular, most consumers now use a smartphone daily, presenting new opportunities for organizations to deliver content and services directly to their mobile devices.

As one of the world’s largest international banks, HSBC is building new services on the cloud to organize its data more efficiently, analyze it to understand its customers better, and make more core customer journeys and features available on mobile first.

HSBC’s retail and business banking teams in Hong Kong have combined the convenience afforded by smartphones with cloud services to allow “cashless” transactions where people can use their smartphone to perform payments digitally. Today, over one and a half million people use HSBC’s PayMe app to exchange money with people in their personal network for free. And businesses are using HSBC’s new PayMe for Business app, built natively on Azure, to collect payments instantly, with 98 percent of all transactions completed in 500 milliseconds or less. Additionally, businesses can leverage the app’s powerful built-in intelligence to improve their sales and operations.

On today’s Microsoft Mechanics episode of “How We Built it,” Alessio Basso, Chief Architect of PayMe from HSBC, explains the approach they took and why.

Bank-grade security, faster time to delivery, dynamic scale and resiliency

The first decision Alessio and team made was to use fully managed services to allow them to go from ideation to a fully operational service in just a few months. Critical to their approach was adopting a microservices-based architecture with Azure Kubernetes Service and Azure Database for MySQL.

They designed each microservice to be independent, with its own instances of Azure managed services, including Azure Database for MySQL, Azure Event Hubs, Azure Storage, Azure Key Vault for credentials and secrets management, and more. They architected for this level of isolation to strengthen security and overall application uptime, as shared dependencies are eliminated.

Each microservice can rapidly scale compute and database resources elastically and independently, based on demand. What’s more, Azure Database for MySQL allows for the creation of read replicas to offload read-only and analytical queries without impacting payment transaction response times.
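As a hedged illustration of the read-replica pattern described above, the sketch below routes read-only and analytical queries to replica endpoints while writes stay on the primary. The hostnames and the round-robin policy are hypothetical placeholders, not HSBC's actual setup:

```python
# Sketch: send payment writes to the primary Azure Database for MySQL
# server and offload read-only/analytical queries to read replicas.
# All hostnames here are made-up placeholders.

PRIMARY_HOST = "payments-mysql.mysql.database.azure.com"
REPLICA_HOSTS = [
    "payments-mysql-replica-1.mysql.database.azure.com",
    "payments-mysql-replica-2.mysql.database.azure.com",
]

def pick_host(read_only: bool, request_id: int) -> str:
    """Choose a server: primary for writes, round-robin replica for reads."""
    if not read_only:
        return PRIMARY_HOST
    # Spread read traffic evenly across the replicas.
    return REPLICA_HOSTS[request_id % len(REPLICA_HOSTS)]
```

A real client would pass the chosen host to its MySQL driver's connection call; replicas are read-only, so writes must always go to the primary.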

Also, from a security perspective, because each microservice runs within its own subnet inside an Azure Virtual Network, the team is able to isolate network communications between Azure resources using service principals and Virtual Network service endpoints.

Fast and responsive analytics platform

At its core, HSBC’s PayMe is a social app that allows consumers to establish their personal networks while facilitating interactions and transactions with the people and business entities in their circle. To create more value for both businesses and consumers, Azure Cosmos DB is used to store customer-merchant-transaction relationships modelled as graph data.

Massive amounts of structured and unstructured data from Azure Database for MySQL, Event Hubs, and Storage are streamed and transformed. The team built an internal data ingestion process that feeds an analytical model called S.L.I.M. (simple, lightly integrated model), optimized for analytical query performance, and that makes data virtually available to the analytics platform using Azure Databricks Delta’s unmanaged table capability.

Then machine learning within their analytics platform built on Azure Databricks allows for the quick determination of patterns and relationships, as well as for the detection of anomalous activity.

With Azure, organizations can immediately take advantage of new opportunities to deliver content and services directly to mobile devices, including a next-level digital payment platform.

To learn more about how HSBC architected their cashless digital transaction platform, please watch the full episode.
Learn more about achieving microservice independence with your own instance of an Azure managed service like Azure Database for MySQL.

Source: Azure

New ways to train custom language models – effortlessly!

Video Indexer (VI), the AI service for Azure Media Services, enables the customization of language models by allowing customers to upload examples of sentences or words belonging to the vocabulary of their specific use case. Since speech recognition can sometimes be tricky, VI enables you to train and adapt the models for your specific domain. Harnessing this capability allows organizations to improve the accuracy of Video Indexer-generated transcriptions in their accounts.

Over the past few months, we have worked on a series of enhancements to make this customization process even more effective and easy to accomplish. Enhancements include automatically capturing any transcript edits done manually or via API as well as allowing customers to add closed caption files to further train their custom language models.

The idea behind these additions is to create a feedback loop where organizations begin with a base out-of-the-box language model and gradually improve its accuracy through manual edits and other resources over time, resulting in a model that is fine-tuned to their needs with minimal effort.

An account’s custom language models, and all the enhancements this blog describes, are private and are not shared between accounts.

In the following sections I will drill down on the different ways that this can be done.

Improving your custom language model using transcript updates

Once a video is indexed in VI, customers can use the Video Indexer portal to introduce manual edits and fixes to the automatic transcription of the video. This can be done by clicking on the Edit button at the top right corner of the Timeline pane of a video to move to edit mode, and then simply update the text, as seen in the image below.

 

The changes are reflected in the transcript, captured in a text file named From transcript edits, and automatically inserted into the language model used to index the video. If you were not already using a custom language model, the updates will be added to a new Account Adaptations language model created in the account.

You can manage the language models in your account and see the From transcript edits files by going to the Language tab in the content model customization page of the VI website.

Once one of the From transcript edits files is opened, you can review the old and new sentences created by the manual updates, and the differences between them, as shown below.

All that is left to do is click Train to update the language model with the latest changes. From that point on, these changes will be reflected in all future videos indexed using that model. Of course, you do not have to use the portal to train the model; the same can be done via the Video Indexer train language model API. Using the API opens new possibilities, such as automating a recurring training process to leverage ongoing updates.
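A recurring training job would ultimately issue a request against the train language model API. The sketch below only builds the request URL; the route shown follows the documented pattern at the time of writing, so treat the exact path and parameters as assumptions to verify against the current Video Indexer API reference:

```python
# Sketch: build the URL for the Video Indexer train-language-model call.
# The route below is an assumption based on the documented API pattern;
# verify it against the current API reference before relying on it.

API_BASE = "https://api.videoindexer.ai"

def train_model_url(location: str, account_id: str, model_id: str, access_token: str) -> str:
    """Build the URL used to trigger training of a custom language model."""
    return (
        f"{API_BASE}/{location}/Accounts/{account_id}"
        f"/Customization/Language/{model_id}/Train"
        f"?accessToken={access_token}"
    )

# A scheduled automation job would then issue, e.g.:
#   requests.put(train_model_url("trial", account_id, model_id, token))
```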

There is also an update video transcript API that allows customers to update the entire transcript of a video in their account by uploading a VTT file that includes the updates. As part of the new enhancements, when a customer uses this API, Video Indexer also adds the uploaded transcript to the relevant custom model automatically in order to leverage the content as training material. For example, calling update video transcript for a video titled “Godfather” will result in a new transcript file named “Godfather” in the custom language model that was used to index that video.

Improving your custom language model using closed caption files

Another quick and effective way to train your custom language model is to leverage existing closed caption files as training material. This can be done manually, by uploading a new closed caption file to an existing model in the portal, as shown in the image below, or by using the create language model and update language model APIs to upload a VTT, SRT, or TTML file (similar to what was done until now with TXT files).

 

Once uploaded, VI cleans up all the metadata in the file and strips it down to the text itself. You can see the before and after results in the following table.

 

VTT

Before:
NOTE Confidence: 0.891635
00:00:02.620 --> 00:00:05.080
but you don't like meetings before 10 AM.

After:
but you don’t like meetings before 10 AM.

SRT

Before:
2
00:00:02,620 --> 00:00:05,080
but you don't like meetings before 10 AM.

After:
but you don’t like meetings before 10 AM.

TTML

Before:
<!-- Confidence: 0.891635 -->
<p begin="00:00:02.620" end="00:00:05.080">but you don't like meetings before 10 AM.</p>

After:
but you don’t like meetings before 10 AM.

From that point on, all that is left to do is review the additions to the model and click Train or use the train language model API to update the model.
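The kind of metadata cleanup shown in the table above can be sketched roughly as follows. This is an illustrative approximation in Python for the VTT case only, not Video Indexer's actual implementation (which also handles SRT and TTML):

```python
import re

# Rough sketch of VTT cleanup: drop the WEBVTT header, NOTE metadata,
# numeric cue identifiers, and timestamp lines, keeping only caption text.
TIMESTAMP = re.compile(r"\d{2}:\d{2}:\d{2}[.,]\d{3}\s*-->\s*\d{2}:\d{2}:\d{2}[.,]\d{3}")

def vtt_to_text(vtt: str) -> list[str]:
    """Return only the caption text lines from a VTT document."""
    kept = []
    for line in vtt.splitlines():
        line = line.strip()
        if not line or line == "WEBVTT" or line.startswith("NOTE"):
            continue
        if line.isdigit() or TIMESTAMP.match(line):
            continue
        kept.append(line)
    return kept

SAMPLE = (
    "WEBVTT\n\n"
    "NOTE Confidence: 0.891635\n"
    "00:00:02.620 --> 00:00:05.080\n"
    "but you don't like meetings before 10 AM.\n"
)
# vtt_to_text(SAMPLE) → ["but you don't like meetings before 10 AM."]
```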

Next Steps

The new additions to the custom language model training flow make it easy for you and your organization to get more accurate transcription results. Now it is up to you to add data to your custom language models, using any of the ways discussed above, to get more accurate results for your specific content the next time you index your videos.

Have questions or feedback? We would love to hear from you! Use our UserVoice page to help us prioritize features, or email VISupport@Microsoft.com for any questions.
Source: Azure

Silo busting 2.0—Multi-protocol access for Azure Data Lake Storage

Cloud data lakes solve a foundational problem for big data analytics—providing secure, scalable storage for data that traditionally lives in separate data silos. Data lakes were designed from the start to break down data barriers and jump start big data analytics efforts. However, a final “silo busting” frontier remained, enabling multiple data access methods for all data—structured, semi-structured, and unstructured—that lives in the data lake.

Providing multiple data access points to shared data sets allows tools and data applications to interact with the data in their most natural way. Additionally, this allows your data lake to benefit from the tools and frameworks built for a wide variety of ecosystems. For example, you may ingest your data via an object storage API, process the data using the Hadoop Distributed File System (HDFS) API, and then load the transformed data into a data warehouse using an object storage API.

Single storage solution for every scenario

We are very excited to announce the preview of multi-protocol access for Azure Data Lake Storage! Azure Data Lake Storage is a unique cloud storage solution for analytics that offers multi-protocol access to the same data. Multi-protocol access to the same data, via Azure Blob storage API and Azure Data Lake Storage API, allows you to leverage existing object storage capabilities on Data Lake Storage accounts, which are hierarchical namespace-enabled storage accounts built on top of Blob storage. This gives you the flexibility to put all your different types of data in your cloud data lake knowing that you can make the best use of your data as your use case evolves.
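To make the "same data, two APIs" idea concrete: a hierarchical namespace-enabled account exposes each object through both the Blob endpoint and the Data Lake Storage endpoint. A minimal sketch is below; the account and container names are made up for illustration:

```python
# Sketch: the same object in a hierarchical namespace-enabled account is
# addressable via the Blob storage endpoint (.blob.core.windows.net) and
# the Data Lake Storage endpoint (.dfs.core.windows.net).
# Account/filesystem names here are hypothetical.

def blob_and_dfs_urls(account: str, filesystem: str, path: str) -> tuple[str, str]:
    """Return the Blob API URL and the Data Lake Storage API URL for one object."""
    blob_url = f"https://{account}.blob.core.windows.net/{filesystem}/{path}"
    dfs_url = f"https://{account}.dfs.core.windows.net/{filesystem}/{path}"
    return blob_url, dfs_url
```

Tools built for either endpoint read and write the same underlying data, with directory- and file-level ACLs enforced on both paths.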

Single storage solution

Expanded feature set, ecosystem, and applications

Existing blob features such as access tiers and lifecycle management policies are now unlocked for your Data Lake Storage accounts. This is paradigm-shifting because your blob data can now be used for analytics. Additionally, services such as Azure Stream Analytics, IoT Hub, Azure Event Hubs capture, Azure Data Box, Azure Search, and many others integrate seamlessly with Data Lake Storage. Important scenarios like on-premises migration to the cloud can now easily move PB-sized datasets to Data Lake Storage using Data Box.

Multi-protocol access for Data Lake Storage also enables the partner ecosystem to use their existing Blob storage connector with Data Lake Storage.  Here is what our ecosystem partners are saying:

“Multi-protocol access for Azure Data Lake Storage is a game changer for our customers. Informatica is committed to Azure Data Lake Storage native support, and Multi-protocol access will help customers accelerate their analytics and data lake modernization initiatives with a minimum of disruption.”

– Ronen Schwartz, Senior Vice President and General Manager of Data Integration, Big Data, and Cloud, Informatica

You will not need to update existing applications to gain access to your data stored in Data Lake Storage. Furthermore, you can leverage the power of both your analytics and object storage applications to use your data most effectively.

Multi-protocol access enables features and ecosystem

Multiple API endpoints—Same data, shared features

This capability is unprecedented for cloud analytics services because it supports not only multiple protocols but also multiple storage paradigms. We now bring this powerful capability to your storage in the cloud. Existing tools and applications that use the Blob storage API gain these benefits without any modification. Directory- and file-level access control lists (ACLs) are consistently enforced regardless of whether the Azure Data Lake Storage API or the Blob storage API is used to access the data.

Multi-protocol access on Azure Data Lake Storage

Features and expanded ecosystem now available on Data Lake Storage

Multi-protocol access for Data Lake Storage brings together the best features of Data Lake Storage and Blob storage into one holistic package. It enables many Blob storage features and ecosystem support for your data lake storage.

Features
More information

Access tiers
Cool and Archive tiers are now available for Data Lake Storage. To learn more, see the documentation “Azure Blob storage: hot, cool, and archive access tiers.”

Lifecycle management policies
You can now set policies to tier or delete data in Data Lake Storage. To learn more, see the documentation “Manage the Azure Blob storage lifecycle.”

Diagnostics logs
Logs for the Blob storage API and Azure Data Lake Storage API are now available in v1.0 and v2.0 formats. To learn more, see the documentation "Azure Storage analytics logging."

SDKs
Existing blob SDKs can now be used with Data Lake Storage. To learn more, see the below documentation:

Azure Blob storage client library for .NET
Azure Blob storage client library for Java
Azure Blob storage client library for Python

PowerShell
PowerShell for data plane operations is now available for Data Lake Storage. To learn more, see the Azure PowerShell quickstart.

CLI
Azure CLI for data plane operations is now available for Data Lake Storage. To learn more, see the Azure CLI quickstart.

Notifications via Azure Event Grid
You can now get Blob notifications through Event Grid. To learn more, see the documentation “Reacting to Blob storage events.” Azure Data Lake Storage Gen2 notifications are currently available.

 

Ecosystem partner
More information

Azure Stream Analytics
Azure Stream Analytics now writes to, as well as reads from, Data Lake Storage.

Azure Event Hubs capture
The capture feature within Azure Event Hubs now lets you pick Data Lake Storage as one of its destinations.

IoT Hub
IoT Hub message routing now allows routing to Azure Data Lake Storage Gen 2.

Azure Search
You can now index and apply machine learning models to your Data Lake Storage content using Azure Search.

Azure Data Box
You can now ingest huge amounts of data from on-premises to Data Lake Storage using Data Box.

Please stay tuned as we enable more Blob storage features using this amazing capability.

Next steps

All these new capabilities are available today in West US 2 and West Central US. Sign up for the preview today. For more information, please see our documentation on multi-protocol access for Azure Data Lake Storage.
Source: Azure

Making it easier to bring your Linux based web apps to Azure App Service

Application development has radically changed over the years: from hosting all the physical hardware for the app and its dependencies on-premises, to a model where the hardware is hosted by external companies yet still managed by the users, to hosting your apps on a fully managed platform where all hardware and software management is done by the hosting provider, and finally to a full serverless solution where no resources need to be set up to run applications.

The perception of complexity in running smaller solutions in the cloud is slowly being eradicated as solutions move to a managed platform, where even non-technical audiences can manage their application in the cloud.

A great example in the managed platform realm is Azure App Service. Azure App Service provides an easy way to bring source code or containers and deploy full web apps in minutes, with configuration settings in the hands of the app owner. Built-in features such as secure sockets layer (SSL) certificates, custom domains, auto-scaling, continuous integration and deployment (CI/CD) pipelines, diagnostics, troubleshooting, and much more provide a powerful platform for full-cycle build and management of applications. Azure App Service also abstracts all of the infrastructure and its management overhead away from users, maintaining the physical hardware running the service, patching security vulnerabilities, and continuously updating the underlying operating system.

Even in the managed platform world where customers shouldn’t care about the underlying platform they are physically running on, the reality is that some applications, depending on their framework, perform better on a specific operating system. This is the reason the team is putting a lot of work into the Linux hosting offering and making it easier to try it out. This includes our recent announcement about the free tier for Linux web apps, making it quick and simple to try out the platform with no commitments.

We’re excited to introduce a promotional price on the Basic app service plan for Linux, which depending on regional meters in your datacenter of choice, leads to a 66 percent price drop!

You can use the free tier to test the platform out, and then move up to the Basic tier and enjoy more of the platform’s capabilities. You can host many frameworks on this tier, including WordPress sites, Node.js, Python, Java, and PHP sites, and one of the most popular options that we’ve seen on the Linux offering – custom docker containers. Running a container hosted in Azure App Service provides an easy on-ramp for customers wanting to enjoy a fully managed platform, but also want a single deployable artifact containing an app and all of its dependencies, or want to work with a custom framework or version beyond the defaults built into the Azure App Service platform.

You can even use the Linux offering with networking solutions to secure your app, using the preview of Azure Virtual Network (VNet) integration to connect to an on-premises database or to call into an Azure virtual network of your choice. You may also use access restrictions to control where your app may receive traffic from and place additional safeguards at the platform level.

What now? If you have a web workload you’re thinking of taking to the next level, try out Azure App Service now! Explore all of the possibilities waiting for you as you host your code or container on a managed platform that currently hosts more than two million sites!

Create your free Azure trial today.

Post on the Microsoft Developer Network forum for questions about Azure App Service.

If you have a feature suggestion for the product, please enter it in the feedback forum.
Source: Azure

Conversational AI updates for July 2019

At Build, we highlighted a few customers who are building conversational experiences using the Bot Framework to transform their customer experiences. For example, BMW discussed its work on the BMW Intelligent Personal Assistant to deliver conversational experiences across multiple canvases by leveraging the Bot Framework and Cognitive Services. LaLiga built their own virtual assistant which allows fans to experience and interact with LaLiga across multiple platforms.

With the Bot Framework release in July, we are happy to share new releases of Bot Framework SDK 4.5 and preview of 4.6, updates to our developer tools, and new channels in Azure Bot Service. We’ll use the opportunity to provide additional updates for the Conversational AI releases from Microsoft.

Bot Framework channels

We continue to expand channel support and functionality for the Bot Framework and Azure Bot Service.

Voice-first bot applications: Direct Line Speech preview

The Microsoft Bot Framework lets you connect with your users wherever your users are. We offer thirteen supported channels, including popular messaging apps like Skype, Microsoft Teams, Slack, Facebook Messenger, Telegram, and Kik, as well as a growing number of community adapters.

Today, we are happy to share the preview of the Direct Line Speech channel. This is a new channel designed for voice-first experiences with your Bot Framework bot, utilizing Microsoft’s Speech Services technologies. The Direct Line Speech channel is a native implementation of speech for mobile applications and IoT devices, with support for text-to-speech, speech-to-text, and custom wake words. The preview is now open to all Bot Framework customers.

Getting started with voice support for your bot is easy. Simply update to the latest Bot Framework SDK, configure the Direct Line Speech channel for your bot, and use the Speech SDK to embed voice into your mobile application or device today.

Better isolation for your bot: Direct Line App Service Extension

Direct Line and Webchat are used broadly by Bot Framework customers to provide chat experiences on their web pages, mobile apps, and devices. For some scenarios, customers have given us the feedback that they’d like a version of Direct Line that can be deployed in isolation, such as in a Virtual Network (VNET). A VNET lets you create your own private space in Azure and is crucial to your cloud network as it offers isolation, segmentation, and other key benefits. The Direct Line App Service Extension can be deployed as part of a VNET, allowing IT administrators to have more control over conversation traffic and improve latency in conversations due to reduction in the number of hops. Feel free to get started with Direct Line App Service Extension.

Bot Framework SDK

As part of the Bot Framework SDK 4.6 preview we updated Adaptive Dialog, which allows developers to dynamically update conversation flow based on context and events. This is especially handy when dealing with conversation context switches and interruptions in the middle of a conversation. Learn more by reading the documentation and reviewing the samples.

Continuing our commitment to the open source community, and following on our promise to let developers use their favorite programming language, we updated the Bot Framework Python SDK. The Python SDK now supports OAuth, Prompts, and Cosmos DB, and includes all major functionality in SDK 4.5. In addition, new samples are available.

Addressing customers’ and developers’ requests for better testing tools, the July version of the SDK introduces a new unit testing capability. The Microsoft.Bot.Builder.testing package simplifies the process of unit testing dialogs in your bot. Check out the documentation and samples.

Introduced at Microsoft Build 2019, the Bot Inspector is a new feature in the Bot Framework Emulator which lets you debug and test bots on channels like Microsoft Teams, Slack, Cortana, and more. As you use the bot on specific channels, messages will be mirrored to the Bot Framework Emulator where you can inspect the message data that the bot received. Additionally, a snapshot of the bot memory state for any given turn between the channel and the bot is rendered as well.

Following requests from enterprise customers, we put together a web chat sample for single sign-on to enterprise apps using OAuth. In this sample, we show how to authorize a user to access resources on an enterprise app with a bot. Two types of resources, Microsoft Graph and the GitHub API, are used to demonstrate the interoperability of OAuth.

Solutions

Virtual agent solution accelerator

We updated the Virtual Assistant and associated skills to enable out-of-box support for Direct Line Speech, enabling voice assistant experiences with no additional steps. This includes middleware to enable control of the voice being used. Once a new Virtual Assistant has been deployed, you can follow instructions for configuring it with the Direct Line Speech channel. An example test harness application is also provided so you can quickly and easily test speech scenarios.

An Android app client for Virtual Assistant is also available which integrates with Direct Line Speech and Virtual Assistant, demonstrating how a device client can interact with your Virtual Assistant and render Adaptive Cards.

In addition, we have added out-of-box support for Microsoft Teams ensuring that your Virtual Assistant and skills work including authentication and adaptive cards. You can follow steps for creating the associated application manifest.

The Virtual Assistant Solution Accelerator provides a set of templates, solution accelerators, and skills to help build sophisticated conversational experiences.

The Dynamics 365 Virtual Agent for Customer Service preview provides exceptional customer service with intelligent, adaptable virtual agents. Customer service experts can easily create and enhance bots with AI-driven insights. The Dynamics 365 Virtual Agent is built on top of the Bot Framework and Azure.
Source: Azure

Azure Monitor for containers with Prometheus now in preview

Prometheus is a popular open source metric monitoring solution and a part of the Cloud Native Computing Foundation. Many of our customers like the extensive metrics that Prometheus provides on Kubernetes. However, they also like how easy it is to use Azure Monitor for containers, which provides fully managed, out-of-the-box monitoring for Azure Kubernetes Service (AKS) clusters. We have been receiving requests to funnel Prometheus data into Azure Monitor, and today we are excited to share that Prometheus integration with Azure Monitor for containers is now in preview, bringing together the best of both worlds.

Typically, to use Prometheus you need to set up and manage a Prometheus server with a database. With the Azure Monitor integration, no Prometheus server is needed. You just need to expose the Prometheus endpoint through your exporters or pods (application), and the containerized agent for Azure Monitor for containers can scrape the metrics for you. We have provided a seamless onboarding experience to collect Prometheus metrics with Azure Monitor. The example below shows how the coredns metrics, which are part of kube-dns, are collected into Azure Monitor logs.

You can also collect workload metrics from your containers by instrumenting Prometheus SDK into your application. The example below shows the collection of the prommetrics_demo_requests_counter. You can collect workload metrics through URL, endpoints, or pod annotation as well.
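Scrape targets for the containerized agent are declared in its ConfigMap rather than on a Prometheus server. A minimal sketch follows; the ConfigMap name and setting keys shown reflect the preview documentation at the time of writing, and the service URL is a hypothetical example, so verify both against the current docs before use:

```yaml
# Apply to the cluster with: kubectl apply -f container-azm-ms-agentconfig.yaml
kind: ConfigMap
apiVersion: v1
metadata:
  name: container-azm-ms-agentconfig
  namespace: kube-system
data:
  prometheus-data-collection-settings: |-
    [prometheus_data_collection_settings.cluster]
      # How often the agent scrapes the configured targets.
      interval = "1m"
      # Scrape pods annotated with prometheus.io/scrape: "true".
      monitor_kubernetes_pods = true
      # Explicit service endpoints to scrape (hypothetical example URL).
      kubernetes_services = ["http://my-service.my-namespace:9102/metrics"]
```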

Full stack monitoring with Azure Monitor for containers

So how do Prometheus metrics fit in with the rest of the metrics, including the recently added storage and network performance metrics, that Azure Monitor for containers already provides? You can see how the metrics all fit together below. Azure Monitor for containers provides out-of-the-box telemetry at the platform, container, and orchestrator levels, and to an extent the workload level. With the additional workload metrics from Prometheus, you now get a full-stack, end-to-end monitoring view for your Azure Kubernetes Service (AKS) clusters in Azure Monitor for containers.

Visualizing Prometheus metrics on Azure dashboard and alerting

Once the metrics are stored in Azure Monitor logs, you can query them using Log Analytics with the Kusto Query Language (KQL). Here’s a sample query against the counter emitted by the app instrumented with the Prometheus SDK. You can quickly plot the result using queries in the Azure portal.

InsightsMetrics
| where Name == "prommetrics_demo_requests_counter_total"
| extend dimensions = parse_json(Tags)
| extend request_status = tostring(dimensions.request_status)
| where request_status == "bad"
| where TimeGenerated > todatetime('2019-07-02T09:40:00.000')
| where TimeGenerated < todatetime('2019-07-02T09:54:00.000')
| project request_status, Val, TimeGenerated
| render timechart

You can pin the chart to your Azure dashboard and create your own customized dashboard. You can also pin your current pod and node charts to the dashboard from the Azure Monitor for container cluster view.

If you would like to alert against the Prometheus metrics, you can do so using alerts in Azure. 

This has been an exciting integration for us, and we will continue our efforts to help our customers monitor Kubernetes. For more information on configuring the agent to collect Prometheus data, querying, and using the data in Azure Monitor for containers, visit our documentation. Prometheus provides rich and extensive telemetry; if you need to understand the cost implications, here’s a query that will show you the data ingested from Prometheus into Azure Monitor logs.

For available metrics on Prometheus, please visit the Prometheus website.

For any feedback or suggestions, please reach out to us through the tech forum or Stack Overflow.
Source: Azure

Azure Marketplace new offers – Volume 41

We continue to expand the Azure Marketplace ecosystem. For this volume, 109 new offers successfully met the onboarding criteria and went live. See details of the new offers below:

Applications

Active Directory Domain Controller 2019: This virtual machine comes pre-loaded with the Active Directory Domain Services role, DNS server role, remote administration tools for Active Directory, DNS, and the required PowerShell modules.

ADQTM-aThingz Data Quality Tracking and Management: Improve operational efficiency, optimize cost, gain productivity, and eliminate recurring problems in your data with ADQTM’s seamless pre-built dashboards, KPIs, data models, machine learning, and cognitive features.

Anqlave Data Vault: Anqlave Data Vault is a secure and scalable key management system that leverages the Intel Software Guard Extensions technology on Azure and allows users to run and manage its Hardware Security Module on Azure.

ANS GLASS Cloud Management Portal: Managed Azure with ANS GLASS delivers business-critical cloud support to ensure you realize the full business value of Azure while keeping your business operationally agile and efficient in the cloud.

ArcBlock ABT Blockchain Node Taiwan: ArcBlock ABT Blockchain Node is fully decentralized and uses ArcBlock's blockchain development platform to easily build, run, and use DApps and blockchain-ready services.

ArcSight ArcMC 2.90: ArcSight Management Center (ArcMC) is a centralized security management system that manages large deployments of ArcSight solutions such as ArcSight Logger, SmartConnectors, FlexConnectors, and Connector Appliance through a single interface.

Array vAPV ADC for Azure: Array Networks vAPV is an easy-to-use, secure, high capacity application delivery controller that integrates with the Azure cloud environment while maintaining feature parity across physical, virtual, and cloud computing environments.

Attivo Networks ThreatDefend Deception: The Attivo ThreatDefend Deception Platform provides a comprehensive, customer-proven platform for proactive security and threat detection in user networks, data centers, clouds, and a variety of specialized attack surfaces.

Azure Blockchain Service Explorer: Azure Blockchain Service Explorer provides a rich interface for interpreting data on your ledger, with detailed views of tokens, contracts, accounts, transactions, and blocks.

Blue Ocean Note: Blue Ocean Note is a Software-as-a-Service care record management system for welfare facilities and nursery schools. This application is available only in Japanese.

BrightSkool: BrightSkool is designed to help schools manage challenges in a single unified solution. It is a reliable, affordable, web-based platform with a proven record of increased productivity and efficiency.

Cloud Edition for Lustre Client: Cloud Edition for Lustre Client is a scalable, parallel file system purpose-built for high performance computing (HPC) and ideally suited for dynamic, pay-as-you-go applications from rapid simulation and prototyping to peak HPC workloads.

Cloud Security Center: Ensure your cloud environments are equipped with the latest Microsoft 365 E5 Security features, yielding protection for user and administrator identities including all devices, applications, and data. This application is available only in Dutch.

Cloud Velocity: Cloud Velocity offers an “easy button” for monitoring Office 365. Devices are placed in critical locations, escalation protocols are defined, and then the outcomes simply happen: automated phone calls, emails, texts, alerts, and more.

CloudBees Jenkins Distribution: CloudBees Jenkins Distribution provides development teams with a highly dependable, secure Jenkins environment curated from the most recent supported Jenkins release. The distribution comes with a recommended catalog of tested plugins.

Corda Enterprise Virtual Machine: With Corda Enterprise Virtual Machine deployed on Microsoft Azure, developers can quickly and easily deploy nodes on a long-lived Corda network using pre-made cloud templates.

CrateDB Cloud: CrateDB Cloud is a scalable SQL cloud service hosted on Azure and operated 24/7 by Crate.io. It is ideal for industrial time series data processing and other IoT and machine data analytic workloads.

DataFabric for Azure: DataFabric automatically creates a live graph of stateful objects to represent real-world data sources, such as sensors, devices, and systems, and then dynamically interlinks these objects to maintain concurrency through a secure mesh of connections.

Datavard Glue: Datavard Glue seamlessly integrates your SAP landscape with big data applications running on Hadoop. Adapt big data technologies with no extra effort and leverage your SAP experience for big data operations.

DMX ETL: DMX enables data transformation to extract, transfer, and load data from multiple sources to on-premises SQL or Azure targets. DMX ETL is an easy-to-configure transformation tool that does not require coding or tuning.

DMX-H ETL: Syncsort’s DMX-H offers a single software environment for accessing and integrating all your enterprise data sources – both batch and streaming – while managing, governing, and securing the entire process.

eCourt: eCourt provides information and communication technology enablement for courts. It features an approval system at every stage in the court process, case approval notification, decentralized blockchain record storage, and more.

Ericom Connect VDI: Ericom Connect VDI provides virtual desktops running Windows 10 on Azure. The solution implements multifactor authentication and provides single sign-on, clientless access, and more. This application is available only in Japanese.

GAPTEQ | Low-Code-Platform: GAPTEQ is a professional front end for your SQL database. Use GAPTEQ to connect to a Microsoft SQL Server database or MySQL and then use its metadata, tables, logic, and data.

hd backup 365: Azure Backup replaces your existing on-premises or off-site backup solution with a cloud-based solution that is reliable, secure, and cost-competitive. This application is available only in Korean.

Human Risks: Human Risks is an online platform that enables you to manage the entire enterprise security risk management process from risk assessments to incident response.

Hyperproof Trial: Hyperproof is targeted at technical teams, compliance managers, and auditors considering new compliance programs or looking to make existing programs more efficient. Hyperproof makes it easy to manage all your day-to-day compliance tasks.

IncrediBuild Demo: Easily accelerate a Visual Studio sample with IncrediBuild or upload your code to gain exceptional build performance. This instance includes a pre-installed IncrediBuild Coordinator and Agent with Visual Studio Community and a Visual Studio sample project.

Infosys Analytics Workbench: Infosys Analytics Workbench provides leading capabilities for data discovery, analytical modeling, model management, visualization, and self-service model consumption to deliver end-to-end self-service capabilities.

Intellicus BI Server V18.1 (10 Users – Linux): Intellicus BI Server on Microsoft Azure is an end-to-end self-service BI platform that offers advanced reporting and analytics capabilities, a semantic layer, and integrated ETL capabilities.

Intellicus BI Server V18.1 (5 Users – Linux): Intellicus BI Server on Microsoft Azure is an end-to-end self-service BI platform that offers advanced reporting and analytics capabilities, a semantic layer, and integrated ETL capabilities.

Johnson Controls Digital Vault: Johnson Controls Digital Vault is a flexible, scalable platform that reaches across silos to gather data from disparate sources, store it securely, and standardize it. It then converts that data into something you can leverage to gain new insights.

Kamstrup Analytics – District Analyser: Kamstrup’s analytics platform for water utilities comprises two systems – Water Intelligence and Incidents – to help you effectively go from imagining “What if” to knowing “How to.”

KEYRUS – Chatbot CODY – Smart Assistant Integrado: The CODY Smart Assistant solution with Microsoft LUIS offers an intelligent conversation platform that allows quick access to results and KPI placements for sales and operations. This application is available only in Portuguese.

Lenses.io: Lenses is an innovative DataOps platform providing SQL access and processing on streaming data. Lenses on Azure is optimized for both Azure HDInsight and your own Kafka clusters to streamline the configuration.

M365 Workplace Cloud Storage | Easy Intune Storage: Microsoft cloud-managed devices get relevant policies and configurations from Microsoft Intune, with some settings relying on files available by URL. This application manages these files with an easy, web-based approach.

Managed Detection and Response for Azure: Protect your Azure deployment with Paladion’s comprehensive Managed Detection and Response service that leverages next-generation AI to defend your Azure deployment at every stage of a threat’s lifecycle.

ManageEngine Mobile Device Manager Plus MSP: Mobile Device Manager Plus MSP is mobile device management software that features device enrollment, app management, profile management, security management, and more.

MinIO Client Container Image: MinIO Client is a Golang command line interface tool that offers alternatives for ls, cp, mkdir, diff, and rsync commands for file systems and object storage systems.

MinIO Container Image: MinIO is an object storage server that is compatible with cloud storage services and is mainly used for storing unstructured data such as photos, videos, and log files.

movingimage Secure Enterprise Video Platform: This Azure-based platform offers a smooth, secure video streaming experience for large companies across different verticals – including 26 of 30 DAX-listed companies.

OpenCities DigitalWorkplace Intranet: Empower your city employees with tools that elevate communication and collaboration. OpenCities DigitalWorkplace is a powerful cloud-based intranet that can streamline processes to save your city time and money.

OpenCities Web CMS: OpenCities makes it easier for cities to transform their websites into digital government platforms. OpenCities’ user-tested templates deliver beautiful and functional sites that allow staff to create content and online services that engage citizens.

OPSAI.COM: The OPSAI platform delivers deep insight into your IT estate, allowing IT and business users to have a common view of systems and processes. Ensure a secure, compliant IT infrastructure and automate your operations.

Phish Hunter: Phish Hunter offers an automated solution to phishing and identity compromise. The solution simplifies the process of detecting and remediating phishing incidents, eliminating the risk of compromised credentials.

Photographic Asset Inspection: Photographic Asset Inspection is for infrastructure owners responsible for maintaining and operating concrete surface infrastructure such as bridges.

PI3: PI3 is an Azure-based card payments analytics and reporting platform for financial institutions. The PI3 platform enables businesses connected to card transactions (credit or debit) to gain insights from their data.

PiXYZ Studio: PiXYZ Studio prepares and transforms 3D CAD data into 3D assets that are ready to use in real-time experiences for various business purposes, including design, marketing, and training.

Precision Campus Analytics: Precision Campus provides an online query tool and dashboard system for your college or university. Enable colleagues to explore enrollment, retention rates, course success rates, and other metrics you choose.

Publico24 sp. z o.o.: Publico24 is a digital press newsstand that offers news stories in HTML as a service that can help boost customer satisfaction. This application is available only in Polish.

PyTorch Container Image: PyTorch is a deep learning platform that accelerates the transition from research prototyping to production deployment. This Bitnami image includes Torchvision for specific computer vision support.

Real-World Audiences and Triggers for Dynamics: Neura enables you to attribute events such as session starts, app opens, push engagement attempts, and in-app features to real-world user behavior, uncovering actionable insights for campaign optimization.

Rhapsody Golf: Get comprehensive golf course management with Rhapsody Golf. The Front Office module handles bag drop, locker assignment, flight and caddy, and tournaments and scoring while the Membership module seamlessly handles privileges, statements, and renewals.

Rhapsody Hospitality Management System: The comprehensive Rhapsody Hospitality Management System handles financial consolidation, centralized purchasing, and group-level business intelligence while helping manage multiple properties.

RStudio Connect: Publish R and Python data products in one IT-managed and monitored location with flexible security policies to bring the power of data science to your enterprise.

SD-INTERNET: SD-INTERNET helps small and midsize enterprises accelerate their digital transformation journey by enhancing Azure cloud application experiences through Adaptiv Networks' high-performance Network-as-a-Service platform.

Service Sheeft: Developed on Microsoft technologies to be used as Software-as-a-Service on Azure, Service Sheeft is a ticketing system focused on improving communication and collaboration when solving end user support issues.

SmartCursors Marketplace: SmartCursors is a marketplace of integrated cloud applications for managing, driving, and transforming every aspect of business.

Smetric Business Intelligence Service: Smetric’s business intelligence tools automatically extract, analyze, and present data from various sources in beautiful, customized dashboards. The visual formats are easy to read, easy to share, and accessible from anywhere, anytime.

Solucion Neurona: Designed for financial institutions, Neurona works as a transactional switch specialized in managing electronic money transactions (mass payments, electronic collections, and funds transfers). This application is available only in Spanish.

SwaggerHub Cloud: Create a single source of truth for OpenAPI definitions with SwaggerHub's API design platform. Collaborate on changes and new development, define and enforce standards across your API catalog, and integrate seamlessly with other API lifecycle solutions.

SysTrack Digital Experience Monitoring: SysTrack is an experience monitoring solution that gathers data on what affects your users and their productivity in the digital workplace – including CPU, RAM, application resource use, and over 10,000 other data points.

Teradata Data Mover: Teradata Data Mover is a powerful data movement tool that intelligently chooses the fastest method to copy data and database objects between databases.

Teradata Data Mover (IntelliSphere): Teradata Data Mover is a powerful data movement tool that intelligently chooses the fastest method to copy data and database objects between databases.

Teradata Ecosystem Manager: Teradata Ecosystem Manager provides an end-to-end approach to meeting application SLAs through monitoring, administration, and control of data warehouse environments to let you more effectively manage your deployment.

Teradata Query Service: Teradata Query Service provides application developers a simplified, modern interface to connect to data from a web page or application.

Teradata QueryGrid Manager (IntelliSphere): Teradata QueryGrid Manager (IntelliSphere) provides federated query capability that allows users to access and query data in remote servers that are part of the Teradata QueryGrid data fabric.

Teradata Vantage with IntelliSphere: Teradata Vantage is Teradata's flagship analytics platform that provides a fast path to secure, scalable, high-performance analytics for tackling your most complex business challenges.

Teradata Viewpoint (IntelliSphere): Teradata Viewpoint (IntelliSphere) is an advanced web-based management portal for up to 10 Teradata systems whether in the cloud or on-premises. Entitlement comes from a paid Teradata Vantage with IntelliSphere subscription.

UiPath Robot: This solution template delivers provisioning of UiPath robots including automatic connection to your UiPath Orchestrator for secure scheduling, management, and control of your enterprise-wide digital workforce.

Unscrambl Answers: Unscrambl Answers has been trained with domain-specific knowledge about your business, understands your data, and has deeply embedded machine learning algorithms that help you discover and present relevant insights in natural language.

Voyado: The powerful, user-friendly Voyado loyalty system helps strengthen your customer relations and uses data to increase sales, cut costs, and reach maximum profitability.

VSBLTY VisionCaptor: The VisionCaptor content management system provides a wide variety of capabilities for bringing proximity-aware, interactive brand messaging to life on any digital screen or platform.

VULCAN: VULCAN analyzes your trainees’ performance in real time while they execute an exercise, providing you with the information needed to increase efficiency and flexibility.

WebFOCUS BUE 8201m: WebFOCUS BUE is for business users and analysts who would like to generate and share reports, charts, dashboards, and in-document analytics to conduct data discovery and explore data for trends, patterns, and opportunities.

Consulting services

Azure 4 Week Briefing, Assessment, and POC Offer: Pyramid Consulting Solutions offers an innovative Azure consulting service in three phases: on-site kickoff briefing, 30-day assessment, and proof of concept for migration of an initial workload to Azure.

Azure Backup & Restore: 4-Wk POC: Using Azure for backup and data protection presents an opportunity to address a number of risk and compliance objectives. Test-drive Azure Backup for four weeks and protect up to five workloads in Azure with this offer from Foundation IT.

Azure Cloud Assessment with Rackspace: 2-Wk: Rackspace consultants will assess your application estate and infrastructure platform to set a strategy for moving workloads to the cloud. They will also provide a report to determine a roadmap and high-level Azure design.

Azure Data Warehouse & Data Lake: 2-Hr Assessment: Neudesic will review your Azure logical data warehouse and Azure Data Lake requirements, then explain how a repeatable approach can deliver opportunities in predictive and prescriptive analytics in your environment.

Azure DevOps Jumpstart: 1-Week Implementation: Wintellect's one-week consulting offer jump-starts your dev-ops move to the cloud. Build, test, automate, and deploy applications more efficiently while reducing costs and increasing team productivity.

Azure Foundations Service: 10-day Implementation: Transparity Solutions will help you lead your Azure journey with governance and security. Learn how to architect key components such as networking integration, identity, network security, compute, and storage.

Azure Migration Planning Free 4 Hour Workshop: SystemsUp offers a free four-hour workshop to discuss whether your existing compute environment could be successfully migrated to Azure, resulting in a statement of work or proposal for work to deliver the engagement.

Azure Migration: 1-day Assessment: Atmosera offers a customer-proven assessment practice that ensures a match of your needs with the optimal cloud solution, delivering a clear roadmap with options to make informed decisions on Azure migration.

Azure Migration: 3-week Assessment: TCS offers a detailed, three-week cloud suitability assessment of up to 20 business applications along with associated infrastructure. An outcome report covers deployment model, cost benefit analysis, and migration plan.

Cloud Backup: 2-Day Assessment: Find out the ROI of moving to cloud backup and disaster recovery with this assessment by Insight, which compiles and clarifies the data you need to make well-informed decisions that will affect your organization’s operational resiliency.

CloudForte Consulting for Azure: Unisys CloudForte for Azure is a comprehensive and customizable managed services offering that addresses the most critical and trickiest cloud adoption challenges, especially around compliance and security.

CSP Migration: 2 Week Free Rapid Migration: Hanu Software offers a no-cost assessment and migration for existing Azure customers to Hanu's Cloud Service Provider (CSP). Hanu will provide a migration roadmap and recommendations on cost and performance optimizations.

CTA for Azure Migration: 2-Wk Assessment: This two-week datacenter-to-Azure migration assessment by Silicus is focused on helping enterprises with cloud adoption strategy, roadmap planning, current-state assessment, and solution gap analysis.

Datacom Enabling Services: 3-Wk Implementation: Standardize, automate, and securely deliver business-grade cloud applications with help from Datacom, an Azure Expert MSP. Customers benefit from the scale, security, and expert skillsets available via Datacom Enabling Services.

DevOps Practices and Platform: 1 Day Assessment: This professional dev-ops service from CloudOps supports current Azure users (or customers looking to get started with Azure) who care about speed to market and modernizing their applications and infrastructure practices.

Digital Media Assessment: 2-Hr Assessment: Globant will assess your digital media strategy and content delivery requirements and will provide a recommendation on how Azure can deliver your media to multiple endpoints for accelerated business impact.

Disaster Recovery trial in Azure: 4-Wk POC: This trial lets you test how Azure can protect workloads. Foundation IT will set up and test Azure Site Recovery on your behalf and simulate a DR test so that you have a complete view of how a managed cloud DR service works.

Domino & Notes App Modernization: 1-Day Workshop: This workshop from Binary Tree will detail the technical, business, and end user considerations and options available to pursue a Notes/Domino retirement or retention program.

DRaaS on Azure – 1 Week Assessment: Cloud4C, a Microsoft CSP Gold partner, can help assess and execute your disaster recovery plan. Azure-certified architects will create a roadmap to understand, define, and plan an optimal DR strategy for your organization.

Encrypted Briefcase: 1-Hour Briefing: Communication Square offers a briefing on best practices to implement its Encrypted Briefcase solution, which provides data protection, access tracking, and permissions revocation for Word, Excel, PowerPoint, and PDF files.

Encrypted Briefcase: 1-Wk Assessment: This assessment from Communication Square analyzes your file storage and document collaboration platforms and then explains how to deploy Encrypted Briefcase to track file access, revoke permissions, and restore older data.

Encrypted Briefcase: 2 Weeks PoC: Communication Square offers this proof of concept for Encrypted Briefcase, which provides data protection for Word, Excel, PowerPoint, and PDF files. Experts will set up, provision, and provide admin and onboarding guides for your solution.

Encrypted Briefcase: 4 Weeks Implementation: This four-week training by Communication Square leads to a deeper understanding and implementation of data protection provided by Encrypted Briefcase and delivers complimentary email support for one year.

Hybrid Identity and Access Management: 10-Wk Imp: Conterra will design a solution based on Microsoft Identity Manager (MIM) 2016 and Azure Active Directory and deploy it in your production environment, either on-premises or via Azure IaaS.

Mobile App Innovation: 1hr Briefing: This one-hour mobile app innovation briefing from Dootrix shares best practices and helps you learn how to build a next-generation mobile app on the Azure cloud platform.

Running SCOM in Azure: 5 Day Assessment: The SCOM to Azure service assesses your System Center Operations Manager infrastructure and provides a framework to move it to Azure effectively and cost-efficiently.

SAP on Azure – 1 week Assessment: Cloud4C's SAP-certified consultants participate in a detailed assessment and workshop to define the best path for SAP migration and onboarding on Microsoft Azure.

Secure Communication System – 2 Week PoC: This Azure-based offering is designed to reimagine how your business approaches secure communication and compliance with industry-defined standards.

Secure Communication System: 1-Week Assessment: This assessment helps IT directors verify that their communication and collaboration are compliant, validate that data access controls are in place and functioning properly, and confirm that company information is secure.

Secure Communication System – 1 Hour Briefing: Communication Square's Azure-based offering covers making, receiving, and transferring business calls in the office, at home, or on the road using your phone or PC without the need for a traditional phone system.

Secure Emailing System – 1-Hour Briefing: This briefing will address the following topics: How to protect your data no matter where it is, how to automatically classify sensitive information, and how to track and revoke access to emails and attachments.

Secure Emailing System – 2-Week Proof of Concept: This Azure-based offering is designed to reimagine how your business approaches securing email systems, compliance with industry-defined standards, and secure access to data.

Secure Emailing System: 1 Week Assessment: This assessment illustrates encryption and decryption methods, automatic data classification, protection against non-compliance, and how to combine the right set of tools, knowledge, and expertise to benefit your email security.

Secure Emailing System: 4 Wk Implementation: Communication Square's Azure-based offering helps classify your data based on sensitivity, protect your data, and leverage deployment and management flexibility.

Source: Azure

Microsoft makes it easier to build popular language representation model BERT at large scale

This post is co-authored by Rangan Majumder, Group Program Manager, Bing, and Maxim Lukiyanov, Principal Program Manager, Azure Machine Learning.

Today we are announcing the open sourcing of our recipe to pre-train BERT (Bidirectional Encoder Representations from Transformers) built by the Bing team, including code that works on Azure Machine Learning, so that customers can unlock the power of training custom versions of BERT-large models using their own data. This will enable developers and data scientists to build their own general-purpose language representation beyond BERT.

The area of natural language processing has seen an incredible amount of innovation over the past few years with one of the most recent being BERT. BERT, a language representation created by Google AI language research, made significant advancements in the ability to capture the intricacies of language and improved the state of the art for many natural language applications, such as text classification, extraction, and question answering. The creation of this new language representation enables developers and data scientists to use BERT as a stepping-stone to solve specialized language tasks and get much better results than when building natural language processing systems from scratch.

The broad applicability of BERT means that most developers and data scientists are able to use a pre-trained variant of BERT rather than building a new version from the ground up with new data. While this is a reasonable solution if the domain’s data is similar to the original model’s data, it will not deliver best-in-class accuracy when crossing over to a new problem space. For example, training a model for the analysis of medical notes requires a deep understanding of the medical domain, providing career recommendations depends on insights from a large corpus of text about jobs and candidates, and legal document processing requires training on legal domain data. In these cases, to maximize the accuracy of the natural language processing (NLP) algorithms, one needs to go beyond fine-tuning to pre-training the BERT model.

Additionally, to advance language representation beyond BERT’s accuracy, users will need to change the model architecture, training data, cost function, tasks, and optimization routines. All these changes need to be explored at large parameter and training-data sizes. In the case of BERT-large, this can be quite substantial: the model has 340 million parameters and was trained on 2.5 billion words from Wikipedia and 800 million words from BookCorpus. To support this on graphics processing units (GPUs), the most common hardware used to train deep learning-based NLP models, machine learning engineers need distributed training support to train these large models. However, distributed environments are complex and fragile to configure, and even with expert tweaking the trained models can end up with inferior results.
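The core idea behind the data-parallel training described above is simple: each GPU computes gradients on its own shard of the batch, and the shards' gradients are then averaged (an all-reduce) so that every replica applies the identical update. The following is a minimal, framework-free sketch of that averaging step, using a toy one-parameter squared-error model rather than a real network; the model, data, and function names are illustrative, not part of the released recipe:

```python
# Toy illustration of data-parallel gradient averaging (all-reduce).
# Model: scalar parameter w; per-example loss (w * x - y)^2, so the
# per-example gradient with respect to w is 2 * (w * x - y) * x.

def batch_grad(w, batch):
    """Mean gradient of the squared error over a list of (x, y) pairs."""
    return sum(2 * (w * x - y) * x for x, y in batch) / len(batch)

def all_reduce_mean(values):
    """Stand-in for an all-reduce: average the per-worker gradients."""
    return sum(values) / len(values)

w = 0.5
batch = [(1.0, 2.0), (2.0, 3.0), (3.0, 5.0), (4.0, 7.0)]

# Single worker: gradient over the full batch.
full_grad = batch_grad(w, batch)

# Two workers each compute a gradient on half the batch, then all-reduce.
shards = [batch[:2], batch[2:]]
local_grads = [batch_grad(w, shard) for shard in shards]
synced_grad = all_reduce_mean(local_grads)

# With equal shard sizes the averaged gradient matches the full-batch one,
# so every replica takes an identical optimizer step.
assert abs(full_grad - synced_grad) < 1e-12
```

The fragility the paragraph mentions comes from everything this sketch leaves out: in a real cluster the all-reduce runs over interconnects, and convergence depends on how learning rate and batch size are rescaled as workers are added.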

To address these issues, Microsoft is open sourcing a first-of-its-kind, end-to-end recipe for training custom versions of BERT-large models on Azure. Overall, this is a stable, predictable recipe that converges to a good optimum, giving developers and data scientists a foundation for their own explorations.

“Fine-tuning BERT was really helpful to improve the quality of various tasks important for Bing search relevance,” says Rangan Majumder, Group Program Manager at Bing, who led the open sourcing of this work.  “But there were some tasks where the underlying data was different from the original corpus BERT was pre-trained on, and we wanted to experiment with modifying the tasks and model architecture.  In order to enable these explorations, our team of scientists and researchers worked hard to solve how to pre-train BERT on GPUs. We could then build improved representations leading to significantly better accuracy on our internal tasks over BERT.  We are excited to open source the work we did at Bing to empower the community to replicate our experiences and extend it in new directions that meet their needs.”

“To get the training to converge to the same quality as the original BERT release on GPUs was non-trivial,” says Saurabh Tiwary, Applied Science Manager at Bing.  “To pre-train BERT we need massive computation and memory, which means we had to distribute the computation across multiple GPUs. However, doing that in a cost effective and efficient way with predictable behaviors in terms of convergence and quality of the final resulting model was quite challenging. We’re releasing the work that we did to simplify the distributed training process so others can benefit from our efforts.”

Results

To test the code, we trained a BERT-large model on a standard dataset and reproduced the results of the original paper on a set of GLUE tasks, as shown in Table 1. To give you an estimate of the compute required, in our case we ran training on an Azure ML cluster of 8x ND40_v2 nodes (64 NVIDIA V100 GPUs in total) for 6 days to reach the accuracy listed in the table. The actual numbers you see will vary based on your dataset and your choice of BERT model checkpoint for the upstream tasks.

Table 1. GLUE test results, evaluated by the provided test script on the GLUE development set. The “Average” column is a simple average over the table results. F1 scores are reported for QQP and MRPC, Spearman correlations are reported for STS-B, and accuracy scores are reported for the other tasks. The results for tasks with smaller dataset sizes have significant variation and may require multiple fine-tuning runs to reproduce.

The code is available in open source on the Azure Machine Learning BERT GitHub repo. The repo includes:

A PyTorch implementation of the BERT model from Hugging Face repo.
Raw and pre-processed English Wikipedia dataset.
Data preparation scripts.
Implementation of optimization techniques such as gradient accumulation and mixed precision.
An Azure Machine Learning service Jupyter notebook to launch pre-training of the model.
A set of pre-trained models that can be used in fine-tuning experiments.
Example code with a notebook to perform fine-tuning experiments.
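Of the optimization techniques listed above, gradient accumulation is what makes BERT-large's effective batch sizes fit in GPU memory: gradients from several small micro-batches are summed before a single optimizer step, which is mathematically equivalent to one step on the larger batch. Below is a hedged, framework-free sketch of the idea using a toy one-parameter model (a real PyTorch implementation would call loss.backward() per micro-batch and defer optimizer.step(); none of these names come from the repo itself):

```python
# Minimal sketch of gradient accumulation with a scalar model w and
# per-example squared-error loss (w * x - y)^2. Summing per-example
# gradients across micro-batches and dividing by the total example count
# reproduces the mean gradient of the full (large) batch exactly.

def example_grad(w, x, y):
    """Gradient of (w * x - y)^2 with respect to w."""
    return 2 * (w * x - y) * x

def step_full_batch(w, batch, lr):
    """One SGD step using the whole batch at once (high peak memory)."""
    g = sum(example_grad(w, x, y) for x, y in batch) / len(batch)
    return w - lr * g

def step_accumulated(w, batch, lr, micro_batch_size):
    """Same SGD step, but gradients accumulated over micro-batches."""
    acc = 0.0  # running sum of per-example gradients
    for i in range(0, len(batch), micro_batch_size):
        micro = batch[i:i + micro_batch_size]
        acc += sum(example_grad(w, x, y) for x, y in micro)
    # A single optimizer step using the mean over all examples seen.
    return w - lr * acc / len(batch)

batch = [(1.0, 2.0), (2.0, 3.0), (3.0, 5.0), (4.0, 7.0)]
w0, lr = 0.5, 0.01

w_full = step_full_batch(w0, batch, lr)
w_accum = step_accumulated(w0, batch, lr, micro_batch_size=2)
assert abs(w_full - w_accum) < 1e-12  # identical update, lower peak memory
```

Mixed precision, the other listed technique, is complementary: it stores activations and gradients in half precision to roughly halve memory and speed up tensor-core math, while keeping a full-precision master copy of the weights for the update.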

With a simple “Run All” command, developers and data scientists can train their own BERT model using the provided Jupyter notebook in Azure Machine Learning service. The code, data, scripts, and tooling can also run in any other training environment.

Summary

We could not have achieved these results without leveraging the amazing work of the researchers before us, and we hope that the community can take our work and go even further. If you have any questions or feedback, please head over to our GitHub repo and let us know how we can make it better.

Learn how Azure Machine Learning can help you streamline the building, training, and deployment of machine learning models. Start free today.
Source: Azure

Assess the readiness of SQL Server data estates migrating to Azure SQL Database

Migrating hundreds of SQL Server instances and thousands of databases to Azure SQL Database, our platform as a service (PaaS) offering, is a considerable task, and to streamline the process as much as possible, you need to feel confident about your relative readiness for migration. Being able to identify the low-hanging fruit, that is, the servers and databases that are fully ready or that require minimal effort to prepare for migration, eases and accelerates your efforts. We are pleased to share that Azure database target readiness recommendations have been enabled.

The Azure Migrate hub provides a unified view of all your migrations across servers, applications, and databases. This integration gives customers a seamless migration experience beginning in the discovery phase. Assessment tools provide visibility into the applications currently running on-premises, so customers can determine cloud suitability, project the cost of running their applications in the cloud, and compare competing public and hybrid cloud options.

Assessing and viewing results

Assessing the overall readiness of your data estate for a migration to Azure SQL Database requires only a few steps:

Provision an instance of Azure Migrate, create a migration project, and then add Data Migration Assistant to the migration solution to perform the assessment.
After you create the migration project, download Data Migration Assistant and run an assessment against one or more SQL Server instances.
Upload the Data Migration Assistant assessment results to the Azure Migrate hub.

In a few minutes, the Azure SQL Database target readiness results will be available in your Azure Migrate project.

You can use a single assessment for as many SQL Server instances as you want, or you can run multiple parallel assessments and upload them to the Azure Migrate hub. The Azure Migrate hub consolidates all the assessments and provides a summarized view of SQL Server and database readiness.

The Azure Migrate dashboard provides a view of your data estate and its overall readiness for migration. This includes the number of databases that are ready to migrate to Azure SQL Database and to SQL Server hosted on an Azure virtual machine. Readiness is computed based on feature parity and schema compatibility with the various Azure SQL Database offerings. The dashboard also provides insight into overall migration blockers and the total effort involved in migrating to Azure.
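
To make the roll-up concrete, here is a toy sketch of how per-database findings could be consolidated into dashboard-style readiness counts. This is not Azure Migrate's actual algorithm, and the category names and data shapes are purely illustrative:

```python
def classify_database(blockers):
    """Classify one database by its list of migration blockers (illustrative)."""
    if not blockers:
        return "Ready for Azure SQL Database"
    if all(b["severity"] == "warning" for b in blockers):
        return "Ready with conditions"
    # Hard blockers: a SQL Server on an Azure VM may be the better target.
    return "Needs remediation"

def summarize(assessments):
    """Consolidate many per-database assessment results into summary counts."""
    summary = {}
    for db in assessments:
        status = classify_database(db["blockers"])
        summary[status] = summary.get(status, 0) + 1
    return summary
```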

IT pros and database administrators can drill down further into a specific set of SQL Server instances and databases to better understand their readiness for migration.

The “Assessed database” view provides an overview of individual databases, showing information such as migration blockers and readiness for Azure SQL Database and for SQL Server hosted on an Azure virtual machine.

Get started

Migrations can be daunting, but we’re here with the expertise and tools, like Data Migration Assistant, to support you along the way. Discover your readiness results and accelerate your migration.

Get started:

Step-by-step guide on how to assess your readiness
Perform a SQL Server migration assessment with Data Migration Assistant

Source: Azure

New capabilities in Stream Analytics reduce development time for big data apps

Azure Stream Analytics is a fully managed PaaS offering that enables real-time analytics and complex event processing on fast-moving data streams. Thanks to zero-code integration with over 15 Azure services, developers and data engineers can build complex pipelines for hot-path analytics in minutes. Today at Inspire, we are announcing several new capabilities in Stream Analytics that further reduce the time to value for solutions powered by real-time insights:

Bringing the power of real-time insights to Azure Event Hubs customers

Today, we are announcing one-click integration with Event Hubs. Available in public preview, this feature lets an Event Hubs customer visualize incoming data and start writing a Stream Analytics query with one click from the Event Hubs portal. Once the query is ready, they can operationalize it in a few clicks and start deriving real-time insights. This significantly reduces the time and cost of developing real-time analytics solutions.

One-click integration between Event Hubs and Azure Stream Analytics

Augmenting streaming data with SQL reference data support

Reference data is a static or slowly changing dataset used to augment real-time data streams with more contextual insights. A typical example is a set of currency exchange rates, updated regularly to reflect market trends, used to convert a stream of billing events in different currencies into a common currency of choice.

Now generally available (GA), this feature provides out-of-the-box support for Azure SQL Database as reference data input. This includes the ability to automatically refresh your reference dataset periodically. Also, to preserve the performance of your Stream Analytics job, we provide the option to fetch incremental changes from your Azure SQL Database by writing a delta query. Finally, Stream Analytics leverages versioning of reference data to augment streaming data with the reference data that was valid at the time the event was generated. This ensures repeatability of results.
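
The versioning behavior described above, where each event is joined with the reference data that was valid at the event's timestamp, can be sketched as follows. This is an illustrative, in-memory Python model of the concept, not Stream Analytics internals, and all names and shapes are hypothetical:

```python
import bisect

def build_reference(versions):
    """versions: list of (valid_from, rates_dict) pairs, sorted by valid_from."""
    times = [t for t, _ in versions]
    tables = [r for _, r in versions]
    return times, tables

def lookup(reference, event_time):
    """Return the reference snapshot that was valid at event_time."""
    times, tables = reference
    i = bisect.bisect_right(times, event_time) - 1
    if i < 0:
        raise ValueError("no reference data valid at this time")
    return tables[i]

def to_common_currency(event, reference):
    """Convert a billing event using the rate valid when the event occurred."""
    rates = lookup(reference, event["ts"])
    return event["amount"] * rates[event["currency"]]
```

Because the lookup depends only on the event's own timestamp, replaying the same stream against the same reference versions reproduces identical results, which is the repeatability property mentioned above.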

New analytics functions for stream processing

Pattern matching:

With the new MATCH_RECOGNIZE function, you can define event patterns using regular expressions and use aggregate methods to verify and extract values from the match. This lets you easily express and run complex event processing (CEP) on your data streams. For example, this function enables users to author a query that detects “head and shoulders” patterns in a stock market feed.
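
The core idea behind MATCH_RECOGNIZE can be illustrated outside the Stream Analytics query language: classify each event into a symbol, then run a regular expression over the symbol sequence. The sketch below is a deliberately simplified Python analogue (a real head-and-shoulders detector would also compare peak heights), and the function names are illustrative only:

```python
import re

def classify(prev, cur):
    """Map a pair of consecutive prices to an up ('U') or down ('D') tick."""
    return "U" if cur > prev else "D"

def matches_rise_fall_pattern(prices, pattern=r"U+D+U+D+"):
    """True if the tick sequence contains a rise-fall-rise-fall shape."""
    symbols = "".join(classify(a, b) for a, b in zip(prices, prices[1:]))
    return re.search(pattern, symbols) is not None
```

MATCH_RECOGNIZE applies the same principle declaratively inside a streaming query, with DEFINE clauses playing the role of `classify` and the PATTERN clause playing the role of the regular expression.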

Use of analytics function as aggregate:

You can now use aggregates such as SUM, COUNT, AVG, MIN, and MAX directly with the OVER clause, without having to define a window. Using analytics functions as aggregates lets users easily express queries such as “Is the latest temperature greater than the maximum temperature reported in the last 24 hours?”
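
The temperature query above can be modeled in plain Python over an in-memory event list, which may help clarify the semantics. This is a conceptual sketch, not Stream Analytics SQL, and the function name is hypothetical:

```python
def exceeds_recent_max(events, window_hours=24):
    """events: time-ordered list of (timestamp_in_hours, temperature) pairs.

    Returns True if the latest temperature exceeds the maximum reported
    in the preceding `window_hours` hours.
    """
    latest_t, latest_temp = events[-1]
    window = [temp for t, temp in events[:-1] if latest_t - t <= window_hours]
    return bool(window) and latest_temp > max(window)
```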

Egress to Azure Data Lake Storage Gen2

Azure Stream Analytics is a central component within the Big Data analytics pipelines of Azure customers. While Stream Analytics focuses on the real-time or hot-path analytics, services like Azure Data Lake help enable batch processing and advanced machine learning. Azure Data Lake Storage Gen2 takes core capabilities from Azure Data Lake Storage Gen1 such as a Hadoop compatible file system, Azure Active Directory, and POSIX based ACLs and integrates them into Azure Blob Storage. This combination enables best in class analytics performance along with storage tiering and data lifecycle management capabilities and the fundamental availability, security, and durability capabilities of Azure Storage.

Azure Stream Analytics now offers native zero-code integration with Azure Data Lake Storage Gen2 as an output (preview).

Enhancements to blob output

Native support for Apache parquet format:

Native support for egress in Apache Parquet format into Azure Blob Storage is now generally available. Parquet is a columnar format that enables efficient big data processing. By outputting data in Parquet format to a blob store or a data lake, you can use Azure Stream Analytics to power large-scale streaming extract, transform, and load (ETL), run batch processing, train machine learning algorithms, or run interactive queries on your historical data.

Managed identities (formerly MSI) authentication:

Azure Stream Analytics now offers full support for managed identity-based authentication with Azure Blob Storage on the output side. Customers can continue to use connection string-based authentication. This feature is available in public preview.

Many of these features just started rolling out worldwide and will be available in all regions within several weeks.

Feedback

The Azure Stream Analytics team is committed to listening to your feedback and letting user input influence our future investments. We welcome you to join the conversation and make your voice heard via our UserVoice page.
Source: Azure