SAP HANA backup using Azure Backup is now generally available

Today, we are sharing the general availability of Microsoft Azure Backup’s solution for SAP HANA databases in the UK South region.

Azure Backup is Azure's native backup solution and is Backint certified by SAP. This offering aligns with Azure Backup's mantra of zero-infrastructure backups, eliminating the need to deploy and manage backup infrastructure. You can now seamlessly back up and restore SAP HANA databases running on Microsoft Azure Virtual Machines (VMs), including M-series VMs, and leverage the enterprise management capabilities that Azure Backup provides.

Benefits

15-minute Recovery Point Objective (RPO): Recover critical data with an RPO as low as 15 minutes.
One-click, point-in-time restores: Easily restore production data from SAP HANA databases to alternate servers. Azure manages the chaining of backups and catalogs needed to perform restores behind the scenes.
Long-term retention: For rigorous compliance and audit needs, you can retain your backups for years; beyond the retention duration, recovery points are pruned automatically by the built-in lifecycle management capability.
Backup management from Azure: Use Azure Backup's management and monitoring capabilities for an improved management experience.

Watch this space for more updates on GA rollout to other regions. We are currently in preview in these Azure geos.

Getting started

We are working on making the SAP HANA backup experience even better. Find out the scenarios we support today.
See how to back up and restore SAP HANA databases.
Manage and monitor backed-up SAP HANA databases.
Need help? Read the troubleshooting documentation, or reach out to the Azure Backup forum for support.
Learn more about Azure Backup.
Follow us on Twitter @AzureBackup for more updates.

Source: Azure

Azure Cost Management updates – November 2019

Whether you're a new student, thriving startup, or the largest enterprise, you have financial constraints and you need to know what you're spending, where, and how to plan for the future. Nobody wants a surprise when it comes to the bill, and this is where Microsoft Azure Cost Management comes in.

We're always looking for ways to learn more about your challenges and how Cost Management can help you better understand where you're accruing costs in the cloud, identify and prevent bad spending patterns, and optimize costs to empower you to do more with less. Here are a few of the latest improvements and updates based on your feedback:

Cost Management now available for Cloud Solution Providers
What's new in Cost Management Labs
Customizing the name on dashboard tiles
Upcoming changes to Azure usage data
Save up to 72 percent with Azure reservations – now available for 16 services
New videos
Documentation updates

Let's dig into the details.

Cost Management now available for Cloud Solution Providers

In case you missed it, as of November 1, Cloud Solution Provider (CSP) partners can now see and manage costs for their customers using Azure Cost Management in the Azure portal by transitioning them to Azure plan subscriptions via Microsoft Customer Agreement. Partners can also enable Azure Cost Management for customers to allow them to see and manage the cost of their subscriptions.

If you're working with a CSP partner to manage your Azure subscriptions, talk to them about getting you onboarded and your subscriptions switched over to the new Azure plan using Microsoft Customer Agreement. Not only will this allow you to see and manage costs in the Azure portal, but you'll also be able to use some Azure services that aren't currently available to your classic CSP subscriptions. As an example, some organizations have dependencies on external solutions that still require classic services, including virtual machines. To work around this, organizations create separate pay-as-you-go subscriptions for those resources, which adds the overhead of managing separate billing accounts with Microsoft and your partner. Once you've switched over to Azure plan subscriptions, you may be able to consolidate any existing CSP and non-CSP subscriptions into a single billing account, managed by your partner. In general, you'll have the same benefits and offerings at the same time as everyone else using Microsoft Customer Agreement. Make sure you talk to your partner today!

If you're a CSP partner, enabling Cost Management for your customers involves three steps:

1. Confirm acceptance of the Microsoft Customer Agreement on behalf of your customers
Present the Microsoft Customer Agreement to your customers and, once they've agreed, confirm their official acceptance in Partner Center or via the API/SDK.
2. Transition your customers to Azure plan
The last step for you, as the partner, to see and manage costs in the Azure portal is to transition existing CSP offers to an Azure plan. You'll need to do this once for each reseller and direct customer.
3. Enable Azure Cost Management for your customers
For your customers to see and manage costs in Azure Cost Management, they need access to view charges for their subscriptions. You can enable this from the Azure portal for each customer; it shows costs based on pay-as-you-go prices and does not include partner discounts or any discounts you may offer. Please ensure your customers understand that the cost will not match your invoice if you offer additional discounts or use custom prices.

To learn more about what you'll see after enabling Azure Cost Management for your customers, read Get started with Azure Cost Management for partners.
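Once a customer is on the Azure plan, the same cost data behind cost analysis can also be pulled programmatically through the Cost Management Query API. Here's a minimal sketch in Python, assuming the requests package, a valid Azure AD bearer token for management.azure.com, and a subscription scope; treat the api-version as the one current at the time of writing and subject to change:

import requests

TOKEN = "<azure-ad-access-token>"      # placeholder
SUBSCRIPTION_ID = "<subscription-id>"  # placeholder

url = (f"https://management.azure.com/subscriptions/{SUBSCRIPTION_ID}"
       "/providers/Microsoft.CostManagement/query"
       "?api-version=2019-11-01")

# Month-to-date actual cost, grouped by service name.
body = {
    "type": "ActualCost",
    "timeframe": "MonthToDate",
    "dataset": {
        "granularity": "None",
        "aggregation": {
            "totalCost": {"name": "PreTaxCost", "function": "Sum"}
        },
        "grouping": [{"type": "Dimension", "name": "ServiceName"}],
    },
}

resp = requests.post(url, json=body,
                     headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
for row in resp.json()["properties"]["rows"]:
    print(row)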

What's new in Cost Management Labs

With Cost Management Labs, you get a sneak peek at what's coming in Azure Cost Management and can engage directly with us to share feedback and help us better understand how you use the service so we can deliver more tuned and optimized experiences. Here are a few features you can see in Cost Management Labs:

Get started quicker with the cost analysis Home view
Cost Management offers five built-in views to get started with understanding and drilling into your costs. The Home view gives you quick access to those views so you get to what you need faster.
Performance optimizations in cost analysis and dashboard tiles—Now available in the public portal
Whether you're using tiles pinned to the dashboard or the full experience, you'll find cost analysis loads faster than ever.
NEW: Show view name on pinned cost analysis tiles—Now available in the public portal
When you pin cost analysis to the dashboard, it now shows the name of the view you pinned. To change it, simply save the view with the desired name and pin cost analysis again!
NEW: Quick access to cost analysis help and support—Now available in the public portal
Have a question? Need help? The quickstart tutorial is now one click away in cost analysis. And if you run into an issue, create a support request from cost analysis to send additional context to help you submit and resolve your issue quicker than ever.

Of course, that's not all. Every change in Cost Management is available in Cost Management Labs a week before it's in the full Azure portal. We're eager to hear your thoughts and understand what you'd like to see next. What are you waiting for? Try Cost Management Labs today.

Customizing the name on dashboard tiles

You already know you can save and share views in cost analysis. You'll typically start by saving a customized view in cost analysis so others can use it. You might share a link so they can jump directly into the view from outside the portal or share an image of the view to include in an email or presentation. But if you really want to keep an eye on specific perspectives of your cost every time you sign in to the portal, the best option is to pin your view to the dashboard.

Pinning is easy: Just click the pin icon in the top-right corner of cost analysis and you're done. When you pin your view, the tile shows the name of your view, the scope it represents, and the main chart or table from cost analysis. If you have an older tile you need to rename, open it in cost analysis, click Save as to change the name of the view, then pin it again.

Enjoy and let us know what you'd like to see next!

Upcoming changes to Azure usage data

Many organizations use the full Azure usage and charges data to understand what's being used, identify which charges should be billed internally to which teams, and look for opportunities to optimize costs with Azure reservations and Azure Hybrid Benefit. If you're doing any analysis or have set up integration based on product details in the usage data, please update your logic for the following services. All of the following changes take effect December 1:

VNet Gateway service will become VPN Gateway.
Process Automation Watcher will have a new meter ID.
Azure Monitor custom metrics will become Network watcher perf monitor connection metrics.

Also, remember that the key-based EA billing APIs have been replaced by new Azure Resource Manager APIs. The key-based APIs will still work through the end of your enrollment but will no longer be available once you renew and transition to Microsoft Customer Agreement. Please plan your migration to the latest version of the UsageDetails API to ease your transition to Microsoft Customer Agreement at your next renewal.
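If you're migrating off the key-based APIs, a hedged Python sketch of the Resource Manager-based UsageDetails call looks like the following; the token and subscription are placeholders, property names vary by offer type, and you should confirm the latest api-version in the documentation:

import requests

TOKEN = "<azure-ad-access-token>"      # placeholder
SUBSCRIPTION_ID = "<subscription-id>"  # placeholder

url = (f"https://management.azure.com/subscriptions/{SUBSCRIPTION_ID}"
       "/providers/Microsoft.Consumption/usageDetails"
       "?api-version=2019-10-01")
headers = {"Authorization": f"Bearer {TOKEN}"}

# Page through the usage records for the current billing period.
while url:
    resp = requests.get(url, headers=headers)
    resp.raise_for_status()
    page = resp.json()
    for item in page.get("value", []):
        props = item["properties"]
        # Property names differ between EA and Microsoft Customer
        # Agreement schemas; adjust to the fields your offer returns.
        print(props.get("instanceName"), props.get("cost"))
    url = page.get("nextLink")  # follow server-side paging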

Save up to 72 percent with Azure reservations – now available for 16 services

Azure reservations help you save up to 72% compared to pay-as-you-go rates when you commit to one or three years of usage. You may know Azure Advisor tells you when you can save money with virtual machine reservations, but did you know with the addition of six new services, you can now purchase reservations for a total of 16 services? Here's the full list as of today:

Virtual machines and managed disks
Blob storage
App Service
SQL database and data warehouse
Azure Database for MySQL, MariaDB, and PostgreSQL
Cosmos DB
Data Explorer
Databricks
SUSE and Red Hat Linux
Azure Red Hat OpenShift
Azure VMware Solution by CloudSimple

What services would you like to see next? Learn more about Azure reservations and start saving today!

New videos

If you weren't able to make it to Microsoft Ignite 2019 or didn't catch the Azure Cost Management sessions, they're now available online and open for everyone:

Analyze, manage, and optimize your cloud cost with Azure Cost Management (46 minutes)
Learn how Azure Cost Management can help you gain visibility, drive accountability, and optimize your cloud costs. Special guest Mars, Inc. will show how they use Azure Cost Management to get the most value out of Azure.
 
Manage and optimize your cloud cost with Azure Cost Management (21 minutes)
Just getting started with Azure, or can't make the full hour? Join us for a quick overview of Azure Cost Management in this short theater session.

If you're looking for something a little shorter, you can also check out these videos:

Azure Cost Management at a glance (6 minutes)
Azure Cost Management overview (14 minutes)

Subscribe to the Azure Cost Management YouTube channel to stay in the loop with new videos as they're released and let us know what you'd like to see next.

Documentation updates

There were many documentation updates. Here are a few you might be interested in:

Added information about Data Explorer reservations
Outlined a set of account management tasks available in the Azure portal for account admins
Introducing the Microsoft Cloud Adoption Framework for Azure

Want to keep an eye on all of the documentation updates? Check out the Cost Management doc change history in the azure-docs repository on GitHub. If you see something missing, select Edit at the top of the document and submit a quick pull request.

What's next?

These are just a few of the big updates from last month. We're always listening and making constant improvements based on your feedback, so please keep the feedback coming.

Follow @AzureCostMgmt on Twitter and subscribe to the YouTube channel for updates, tips, and tricks. And, as always, share your ideas and vote up others in the Cost Management feedback forum.
Source: Azure

Application Gateway Ingress Controller for Azure Kubernetes Service

Today we are excited to offer a new solution to bind Azure Kubernetes Service (AKS) and Application Gateway. The new solution provides an open source Application Gateway Ingress Controller (AGIC) for Kubernetes, which makes it possible for AKS customers to leverage Application Gateway to expose their cloud software to the Internet.

Bringing together the benefits of Azure Kubernetes Service, our managed Kubernetes service that makes it easy to operate advanced Kubernetes environments, and Azure Application Gateway, our native, scalable, and highly available L7 load balancer, has been highly requested by our customers.

How does it work?

Application Gateway Ingress Controller runs in its own pod on the customer's AKS cluster. It monitors a subset of Kubernetes resources for changes. The state of the AKS cluster is translated to Application Gateway-specific configuration and applied via Azure Resource Manager. The continuous re-configuration of Application Gateway ensures an uninterrupted flow of traffic to AKS services. The diagram below illustrates the flow of state and configuration changes from the Kubernetes API, via Application Gateway Ingress Controller, to Resource Manager, and then to Application Gateway.
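In practice, AGIC is driven by ordinary Kubernetes Ingress resources carrying an Application Gateway ingress class annotation. As a minimal sketch, the Python snippet below builds such an Ingress as a dict and pipes it to kubectl; the service name and path are placeholders, and the manifest uses the networking.k8s.io/v1beta1 schema current at the time of writing:

import json
import subprocess

# Placeholder names throughout; the ingress.class annotation is what
# tells AGIC to program this Ingress into Application Gateway.
ingress = {
    "apiVersion": "networking.k8s.io/v1beta1",
    "kind": "Ingress",
    "metadata": {
        "name": "demo-ingress",
        "annotations": {
            "kubernetes.io/ingress.class": "azure/application-gateway",
        },
    },
    "spec": {
        "rules": [{
            "http": {
                "paths": [{
                    "path": "/",
                    "backend": {
                        "serviceName": "demo-service",  # placeholder
                        "servicePort": 80,
                    },
                }],
            },
        }],
    },
}

# kubectl accepts JSON anywhere it accepts YAML.
subprocess.run(["kubectl", "apply", "-f", "-"],
               input=json.dumps(ingress), text=True, check=True)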

Much like the most popular Kubernetes Ingress Controllers, the Application Gateway Ingress Controller provides several features, leveraging Azure’s native Application Gateway L7 load balancer. To name a few:

URL routing
Cookie-based affinity
Secure Sockets Layer (SSL) termination
End-to-end SSL
Support for public, private, and hybrid web sites
Integrated web application firewall

The architecture of the Application Gateway Ingress Controller differs from that of a traditional in-cluster L7 load balancer. The architectural differences are shown in this diagram:

An in-cluster load balancer performs all data path operations using the Kubernetes cluster's compute resources, so it competes for resources with the business apps it fronts. In-cluster ingress controllers create Kubernetes Service resources and leverage kubenet for network traffic; compared to the Application Gateway Ingress Controller, traffic flows through an extra hop.
The Application Gateway Ingress Controller leverages AKS advanced networking, which allocates an IP address for each pod from a subnet shared with Application Gateway. Application Gateway therefore has direct access to all Kubernetes pods, eliminating the need for data to pass through kubenet. For more information on this topic, see our "Network concepts for applications in Azure Kubernetes Service" article, specifically the "Comparing network models" section.

Solution performance

Because Application Gateway has direct connectivity to the Kubernetes pods, the Application Gateway Ingress Controller can achieve up to 50 percent lower network latency than in-cluster ingress controllers. Application Gateway is a managed service backed by Azure virtual machine scale sets, so it does not use AKS compute resources for data path processing and does not share or interfere with the resources allocated to the Kubernetes deployment. Autoscaling Application Gateway at peak times, unlike an in-cluster ingress, will not impede the ability to quickly scale up the apps' pods. And of course, switching from an in-cluster L7 ingress to Application Gateway will immediately decrease the compute load on AKS.

We compared the performance of an in-cluster ingress controller and Application Gateway Ingress Controller on a three node AKS cluster with a simple web app running 22 pods per node. A total of 66 web app pods shared resources with three in-cluster ingresses – one per node. We configured Application Gateway with an instance count of two. We used Apache Bench to create a total of 100K requests with concurrency set at 3K requests. We launched Apache Bench twice: once pointing it to the SLB fronting the in-cluster ingress controller, and a second time connecting to the public IP of Application Gateway. On this very busy AKS cluster we recorded the mean latency across all requests:

Application Gateway: 480ms per request
In-cluster Ingress: 710ms per request

As the data above shows, under heavy load the in-cluster ingress controller has approximately 48 percent higher latency per request than the Application Gateway ingress (710 ms versus 480 ms, and 710 / 480 ≈ 1.48). Running the same benchmark on the same cluster but with two web app pods per node, a total of six pods, we observed the in-cluster ingress controller performing with approximately 17 percent higher latency than Application Gateway.

What’s next?

Application Gateway Ingress Controller is now stable and available for use in production environments. The project is maturing quickly, and we are working actively to add new capabilities. We are working on enhancing the product with features that customers have been asking for, such as using certificates stored on Application Gateway, mutual TLS authentication, gRPC, and HTTP/2. We invite you to try the new Application Gateway Ingress Controller for AKS, follow our progress, and most importantly – give us feedback on GitHub.
Source: Azure

Preview: Live transcription with Azure Media Services

Azure Media Services provides a platform with which you can broadcast live events. You can use our APIs to ingest, transcode, and dynamically package and encrypt your live video feeds for delivery via industry-standard protocols like HTTP Live Streaming (HLS) and MPEG-DASH. You can also use our APIs to integrate with CDNs and deliver to millions of concurrent viewers. Customers are using this platform for scenarios ranging from multi-day sporting events and entire seasons of professional sports, to webinars and town-hall meetings.

Live transcription is a new preview feature in our v3 APIs that lets you enhance the streams delivered to your viewers with machine-generated text transcribed from spoken words in the audio feed. This feature can be enabled for any type of Live Event that you create in our service, including pass-through Live Events, where you configure a live encoder upstream to generate and push a multiple-bitrate live feed into the service (visualized in the diagram below).
  

Figure 1. Schematic diagram for live transcription

When a live contribution feed is sent to the service, it extracts the audio signal, decodes it, and calls the Azure Cognitive Services speech-to-text APIs to get the speech transcribed. The resultant text is then packaged into formats suitable for delivery via streaming protocols. For the HTTP Live Streaming (HLS) protocol with media packaged into MPEG Transport Stream (TS) fragments, the text is packaged into WebVTT fragments. For delivery via MPEG-DASH or HLS with CMAF, the text is wrapped in IMSC1.1-compatible TTML and then packaged into MPEG-4 Part 30 (ISO/IEC 14496-30) fragments.
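Enabling the feature is a matter of setting a property on the Live Event when you create it. Here's a hedged Python sketch of the general shape of the Resource Manager call; the resource names are placeholders, and the preview api-version and the transcriptions property shape are assumptions drawn from the preview documentation, so verify them against the current API reference:

import requests

TOKEN = "<azure-ad-access-token>"  # placeholder
SUB, RG, ACCOUNT = "<subscription-id>", "<resource-group>", "<media-account>"

url = (f"https://management.azure.com/subscriptions/{SUB}"
       f"/resourceGroups/{RG}/providers/Microsoft.Media"
       f"/mediaservices/{ACCOUNT}/liveEvents/myLiveEvent"
       "?api-version=2019-05-01-preview")  # assumed preview version

body = {
    "location": "West US 2",  # preview region noted below
    "properties": {
        "input": {"streamingProtocol": "RTMP"},
        # Assumed shape: one transcription language per live event.
        "transcriptions": [{"language": "en-US"}],
    },
}

resp = requests.put(url, json=body,
                    headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
print(resp.json()["properties"]["resourceState"])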

You can use Azure Media Player (version 2.3.3 or newer) to play the video, as well as display the text on a wide variety of browsers and devices. You can also play back the streams on the iOS native player. If building an app for Android devices, playback of transcriptions has been verified by NexPlayer. You can contact them to request a demo.

Figure 2. Display of live transcription on Azure Media Player

The live transcription feature is now available in preview in the West US 2 region. Read the full article here to learn how to get started with this preview feature.
Source: Azure

Multi-protocol access on Data Lake Storage now generally available

We are excited to announce the general availability of multi-protocol access for Azure Data Lake Storage. Azure Data Lake Storage is a unique cloud storage solution for analytics that offers multi-protocol access to the same data. This no-compromise solution allows both the Azure Blob Storage API and the Azure Data Lake Storage API to access the same data in a single storage account. You can store all your different types of data in one place, which gives you the flexibility to make the best use of your data as your use case evolves. The general availability of multi-protocol access creates the foundation to enable object storage capabilities on Data Lake Storage. This brings together the best of both object storage and the Hadoop Distributed File System (HDFS) to enable scenarios that, until today, were not possible without copying data.

Broader ecosystem of applications and features

Multi-protocol access provides a powerful foundation for integrations and features on Data Lake Storage. Existing object storage applications and connectors can now be used to access data stored in Data Lake Storage with no changes, which has vastly accelerated the integration of Azure services and the partner ecosystem with Data Lake Storage. We are also announcing the general availability of multiple Azure service integrations with Data Lake Storage, including Azure Stream Analytics, IoT Hub, Azure Event Hubs Capture, Azure Data Box, and Logic Apps. These Azure services now integrate seamlessly with Data Lake Storage, and real-time scenarios are enabled by easily ingesting streaming data into Data Lake Storage via IoT Hub, Stream Analytics, and Event Hubs Capture.

Ecosystem partners have also strongly leveraged multi-protocol access for their applications. Here is what our partners are saying:

“Multi-protocol access is a massive paradigm shift that enables cloud analytics to run on a single account for both blob data and analytics data. We believe that multi-protocol access helps customers rapidly achieve integration with Azure Data Lake Storage using our existing blob connector. This brings tremendous value to customers without needing to do costly re-development efforts.” – Rob Cornell, Head of Cloud Alliances, Talend

Our customers are excited about how their existing blob applications and workloads "just work" with the multi-protocol capability. No changes are required to their existing blob applications, saving them precious development and validation resources. We have customers today running multiple workloads seamlessly against the same data using both the blob connector and the Azure Data Lake Storage connector.
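To make the idea concrete, here's a minimal sketch of the same account being written through the Blob API and read back through the Data Lake Storage API. It assumes the azure-storage-blob and azure-storage-file-datalake Python packages, a storage account with the hierarchical namespace enabled, an existing "analytics" container, and a connection string in an environment variable:

import os

from azure.storage.blob import BlobServiceClient
from azure.storage.filedatalake import DataLakeServiceClient

conn = os.environ["AZURE_STORAGE_CONNECTION_STRING"]  # assumed env var

# Write through the Blob API...
blob_service = BlobServiceClient.from_connection_string(conn)
blob = blob_service.get_blob_client(container="analytics",
                                    blob="raw/events.csv")
blob.upload_blob(b"id,value\n1,42\n", overwrite=True)

# ...and read the same object back through the Data Lake Storage API,
# where it appears as a file within a directory hierarchy.
dfs_service = DataLakeServiceClient.from_connection_string(conn)
file_client = (dfs_service
               .get_file_system_client("analytics")
               .get_file_client("raw/events.csv"))
print(file_client.download_file().readall())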

We are also making the ability to tier data between hot and cool tiers for Data Lake Storage generally available. This is great for analytics customers who want to keep frequently used analytics data in the hot tier and move less used data to cooler storage tiers for cost efficiencies. As we continue our journey, we will be enabling more capabilities on Data Lake Storage in upcoming releases. Stay tuned for more announcements in the future!

Get started with multi-protocol access

Visit our multi-protocol access documentation to get started. For additional information see our preview announcement. To learn more about pricing, see our pricing page.
Source: Azure

A year of bringing AI to the edge

This post is co-authored by Anny Dow, Product Marketing Manager, Azure Cognitive Services.

In an age where low-latency and data security can be the lifeblood of an organization, containers make it possible for enterprises to meet these needs when harnessing artificial intelligence (AI).

Since introducing Azure Cognitive Services in containers this time last year, businesses across industries have unlocked new productivity gains and insights. The combination of both the most comprehensive set of domain-specific AI services in the market and containers enables enterprises to apply AI to more scenarios with Azure than with any other major cloud provider. Organizations ranging from healthcare to financial services have transformed their processes and customer experiences as a result.

 

These are some of the highlights from the past year:

Employing anomaly detection for predictive maintenance

Airbus Defense and Space, one of the world's largest aerospace and defense companies, has tested Azure Cognitive Services in containers to develop a proof of concept in predictive maintenance. The company runs Anomaly Detector to immediately spot unusual behavior in voltage levels and mitigate unexpected downtime. By employing advanced anomaly detection in containers without further burdening its data science team, Airbus can scale this critical capability across the business globally.

“Innovation has always been a driving force at Airbus. Using Anomaly Detector, an Azure Cognitive Service, we can solve some aircraft predictive maintenance use cases more easily.”  —Peter Weckesser, Digital Transformation Officer, Airbus

Automating data extraction for highly-regulated businesses

As enterprises grow, they accumulate thousands of hours of repetitive but critically important work every week, and high-value domain specialists spend too much of their time on it. Today, innovative organizations use robotic process automation (RPA) to help manage, scale, and accelerate processes, and in doing so free people to create more value.

Automation Anywhere, a leader in robotic process automation, partners with these companies eager to streamline operations by applying AI. IQ Bot, their unique RPA software, automates data extraction from documents of various types. By deploying Cognitive Services in containers, Automation Anywhere can now handle documents on-premises and at the edge for highly regulated industries:

“Azure Cognitive Services in containers gives us the headroom to scale, both on-premises and in the cloud, especially for verticals such as insurance, finance, and health care where there are millions of documents to process.” —Prince Kohli, Chief Technology Officer for Products and Engineering, Automation Anywhere

For more about Automation Anywhere's partnership with Microsoft to democratize AI for organizations, check out this blog post.

Delighting customers and employees with an intelligent virtual agent

Lowell, one of the largest credit management services companies in Europe, wants credit to work better for everybody, so it works hard to make every consumer interaction as painless as possible with the help of AI. Partnering with Crayon, a global leader in cloud services and solutions, Lowell set out to fix the outdated processes that kept the company's highly trained credit counselors too busy with routine inquiries and created friction in the customer experience. Lowell turned to Cognitive Services to create an AI-enabled virtual agent that now handles 40 percent of all inquiries, making it easier for service agents to deliver greater value to consumers and better outcomes for Lowell clients.

With GDPR requirements, chatbots weren't an option for many businesses before containers became available. Now companies like Lowell can ensure their data handling meets stringent compliance standards by running Cognitive Services in containers. As Carl Udvang, Product Manager at Lowell, explains:

"By taking advantage of container support in Cognitive Services, we built a bot that safeguards consumer information, analyzes it, and compares it to case studies about defaulted payments to find the solutions that work for each individual."

One-to-one customer care at scale in data-sensitive environments has become easier to achieve.

Empowering disaster relief organizations on the ground

A few years ago, there was a major Ebola outbreak in Liberia, and a team from USAID was sent to help mitigate the crisis. Their first task on the ground was to find and categorize information such as the state of healthcare facilities, Wi-Fi networks, and population density centers. They tracked this information manually and had to extract insights from a complex corpus of data to determine the best course of action.

With the rugged versions of Azure Stack Edge, teams responding to such crises can carry a device running Cognitive Services in their backpack. They can upload unstructured data like maps, images, and pictures of documents, and then extract content, translate it, draw relationships among entities, and apply a search layer. With these cloud AI capabilities available offline at their fingertips, response teams can find the information they need in a matter of moments. In Satya Nadella's Ignite 2019 keynote, Dean Paron, Partner Director of Azure Storage and Edge, walks us through how Cognitive Services on Azure Stack Edge can be applied in such disaster relief scenarios (starting at 27:07).

Transforming customer support with call center analytics

Call centers are a critical customer touchpoint for many businesses, and being able to derive insights from customer calls is key to improving customer support. With Cognitive Services, businesses can transcribe calls with Speech to Text, analyze sentiment in real-time with Text Analytics, and develop a virtual agent to respond to questions with Text to Speech. However, in highly regulated industries, businesses are typically prohibited from running AI services in the cloud due to policies against uploading, processing, and storing any data in public cloud environments. This is especially true for financial institutions.

A leading bank in Europe addressed regulatory requirements and brought the latest transcription technology to their own on-premises environment by deploying Cognitive Services in containers. Through transcribing calls, customer service agents could not only get real-time feedback on customer sentiment and call effectiveness, but also batch process data to identify broad themes and unlock deeper insights on millions of hours of audio. Using containers also gave them flexibility to integrate with their own custom workflows and scale throughput at low latency.

What's next?

These stories touch on just a handful of the organizations leading innovation by bringing AI to where data lives. As running AI anywhere becomes more mainstream, the opportunities for empowering people and organizations will only be limited by the imagination.

Visit the container support page to get started with containers today.

For a deeper dive into these stories, visit the following

Automation Anywhere case study
Automation Anywhere’s partnership with Microsoft
Lowell case study
Azure Stack Edge update from Microsoft Ignite 2019
Cognitive Services in Azure Stack Edge demo (at 27:07)

Source: Azure

Multi-language identification and transcription in Video Indexer

Multi-language speech transcription was recently introduced into Microsoft Video Indexer at the International Broadcasting Convention (IBC). It is available as a preview capability, and customers can already start experiencing it in our portal. More details on all our IBC2019 enhancements can be found here.

Multi-language videos are common media assets in a globalized world: global political summits, economic forums, and sports press conferences are examples of venues where speakers use their native language to convey their statements. Such videos pose a unique challenge for companies that need to provide automatic transcription for large video archives. Automatic transcription technologies expect users to explicitly specify the video's language in advance to convert speech to text. This manual step becomes a scalability obstacle when transcribing multi-language content, as one would have to manually tag audio segments with the appropriate language.

Microsoft Video Indexer provides a unique capability of automatic spoken language identification for multi-language content. This solution allows users to easily transcribe multi-language content without tedious manual preparation steps before triggering it. In doing so, it can save anyone with a large archive of videos both time and money, and enable discoverability and accessibility scenarios.

Multi-language audio transcription in Video Indexer

The multi-language transcription capability is available as part of the Video Indexer portal. Currently, it supports four languages (English, French, German, and Spanish) and can handle up to three different languages in a single input media asset. While uploading a new media asset, you can select the "Auto-detect multi-language" option as shown below.

Our application programming interface (API) supports this capability as well by enabling users to specify 'multi' as the language in the upload API. Once the indexing process is completed, the index JavaScript object notation (JSON) will include the underlying languages. Refer to our documentation for more details.
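For example, a minimal Python sketch of an upload call that opts into multi-language identification might look like the following; the account details, access-token flow, and video URL are placeholders, and the exact query parameters should be checked against the Video Indexer developer portal:

import requests

LOCATION = "trial"                       # or your Azure region
ACCOUNT_ID = "<account-id>"              # placeholder
ACCESS_TOKEN = "<account-access-token>"  # obtained via the auth API

url = f"https://api.videoindexer.ai/{LOCATION}/Accounts/{ACCOUNT_ID}/Videos"
params = {
    "name": "press-conference",
    "language": "multi",  # enable multi-language identification
    "videoUrl": "https://example.com/video.mp4",  # placeholder
    "accessToken": ACCESS_TOKEN,
}

resp = requests.post(url, params=params)
resp.raise_for_status()
print(resp.json()["id"])  # video ID to poll for the index JSON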

Additionally, each instance in the transcription section will include the language in which it was transcribed.

Customers can view the transcript and identified languages by time, jump to specific places in the video for each language, and even see the multi-language transcription as video captions. The resulting transcription is also available as closed-caption files (VTT, TTML, SRT, TXT, and CSV).

Methodology

Language identification from an audio signal is a complex task. Acoustic environment, speaker gender, and speaker age are among the many factors that affect it. We represent the audio signal visually, for example as a spectrogram, on the assumption that different languages induce unique visual patterns that can be learned using deep neural networks.

Our solution has two main stages to determine the languages used in multi-language media content. First, it employs a deep neural network to classify audio segments at very high granularity, in other words, segments of just a few seconds. While a good model will successfully identify the underlying language, it can still misidentify some segments due to similarities between languages. Therefore, we apply a second stage that examines these misses and smooths the results accordingly.

Next steps

We introduced a differentiated capability for multi-language speech transcription. With this unique capability in Video Indexer, you can work more effectively with the content of your videos, as it allows you to immediately start searching across videos for segments in different languages. During the coming few months, we will improve this capability by adding support for more languages and improving the model's accuracy.

For more information, visit Video Indexer’s portal or the Video Indexer developer portal, and try this new capability. Read more about the new multi-language option and how to use it in our documentation.

Please use our UserVoice to share feedback and help us prioritize features or email visupport@microsoft.com with any questions.
Source: Azure

Azure Backup support for SQL Server 2019 and Restore as files

As SQL Server 2019 continues to push the boundaries of availability, performance, and data intelligence, a centrally managed, enterprise-scale backup solution is imperative to ensure the protection of all that data. This is especially true if you are running the SQL Server in the cloud to leverage the benefits of dynamic scale and don't want to continue using the legacy backup methods that are tedious, infrastructure-heavy, and difficult to scale.

We are excited to share native backup support for SQL Server 2019 running in Azure Virtual Machines. This is a key addition to the general availability of Azure Backup for SQL Server in Azure Virtual Machines, announced earlier this year. Azure Backup is a zero-infrastructure solution that protects standalone SQL Server and SQL Server Always On configurations in Azure Virtual Machines without the need to deploy and manage any backup infrastructure. While it offers long-term retention and central monitoring capabilities to help IT admins govern and meet their compliance requirements, it lets SQL admins continue to exercise the power of self-service backup and restore for operational recoveries.

In addition to this, we are also sharing Azure Backup general availability for:

SQL Server 2008 and 2008 R2 migrating to Azure as SQL Server running in virtual machines
SQL Server running on Windows Server 2019

Restore as files:

Adding to the list of enhancements is the key capability of Restore as files: you can now restore anywhere by recovering the backed-up data as .bak files. Move these backup files across subscriptions, regions, or even to on-premises SQL Servers, and trigger a database restore wherever you want. Besides aiding cross-subscription and cross-region restore scenarios, this feature helps users stay compliant by giving them greater control over storing and recovering backup data to any destination of their choice.

 

Getting started:

Under the Restore operation, you will see a newly introduced Restore as files option. Specify the destination server (this server should be a SQL Server virtual machine registered to the vault) and a path on that server. The service will write all the .bak files specific to the recovery point you have chosen to this path. Typically, specifying a network share path, or the path of a mounted Azure file share, as the destination makes it easier for other machines in the same network, or with the same Azure file share mounted, to access these files.

Once the restore operation is completed, you can move these files to any machine across subscriptions or locations and restore them as a database using SQL Server Management Studio. Learn more.
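If you'd rather script that last step than use SQL Server Management Studio, here's a minimal sketch using the pyodbc Python package for a single full-backup file; the server, credentials, database name, and path are placeholders, and restoring over an existing database may need additional WITH options such as MOVE or REPLACE:

import pyodbc

# RESTORE cannot run inside a transaction, hence autocommit=True.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=target-sql-vm;DATABASE=master;"  # placeholder server
    "UID=restore_user;PWD=<password>",       # placeholder credentials
    autocommit=True,
)

# Path to a .bak produced by the Restore as files operation; it must
# be readable by the SQL Server service account.
bak = r"\\fileshare\restores\ContosoDb.bak"  # placeholder path

conn.cursor().execute(
    f"RESTORE DATABASE ContosoDb FROM DISK = N'{bak}' WITH RECOVERY"
)
conn.close()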

Additional resources

Watch the getting started video.
Want more details about this feature? Review the Azure Backup for SQL Server documentation.
Get the Azure Backup pricing details for this feature.

Source: Azure

Change feed support now available in preview for Azure Blob Storage

Change feed support for Microsoft Azure Blob storage is now available in preview. Change feed provides a guaranteed, ordered, durable, read-only log of all the creation, modification, and deletion change events that occur to the blobs in your storage account. This log is stored as append blobs within your storage account, therefore you can manage the data retention and access control based on your requirements.

Change feed is the ideal solution for bulk handling of large volumes of blob changes in your storage account, as opposed to periodically listing and manually comparing for changes. It enables cost-efficient recording and processing by providing programmatic access such that event-driven applications can simply consume the change feed log and process change events from the last checkpoint.

Some scenarios that would benefit from consuming a blob change feed include:

Bulk processing a group of newly uploaded files for virus scanning, resizing, or backups.
Storing, auditing, and analyzing changes to your objects over any period of time for data management or compliance.
Combining data uploaded by various IoT sensors into a single collection for data transformation and insights.
Additional data movement by synchronizing with a cache, search engine, or data warehouse.

How to get started

To enroll in the preview, you will need to submit a request to register this feature for your subscription. After your request is approved (within a few days), any existing or new GPv2 or Blob storage accounts in West US 2 and West Central US can enable the change feed feature.

To submit a request, run the following PowerShell or Microsoft Azure CLI commands:

Register by using PowerShell

Register-AzProviderFeature -FeatureName Changefeed -ProviderNamespace Microsoft.Storage

Register-AzResourceProvider -ProviderNamespace Microsoft.Storage

 

Register by using Azure CLI

az feature register --namespace Microsoft.Storage --name Changefeed

az provider register --namespace 'Microsoft.Storage'

Once you’re registered and approved for the change feed preview, you can then turn it on for your storage accounts and start consuming the log. For more information, please see change feed support in Azure Blob Storage and process change feed in Azure Blob Storage. As with most previews, this feature should not be used for production workloads until it reaches general availability.
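As a flavor of what consuming the log looks like, here's a minimal Python sketch using the preview azure-storage-blob-changefeed package; the package, client, and event field names reflect the preview SDK as we understand it today and may change, so check the documentation for current names:

from azure.storage.blob.changefeed import ChangeFeedClient

conn = "<storage-account-connection-string>"  # placeholder
client = ChangeFeedClient.from_connection_string(conn)

# Iterate the ordered change events; each event describes a create,
# modify, or delete that occurred on a blob in the account.
for event in client.list_changes():
    print(event["eventType"], event["subject"])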

Cost

Change feed pricing is currently in preview and subject to change for general availability. Customers are charged for the blob change events captured by change feed as well as the data storage costs of the change feed log. See block blob pricing to learn more about pricing.

Build it, use it, and tell us about it

We will continue to improve our feature capabilities and would like to hear your feedback regarding change feed or other features through email at AzureStorageFeedback@microsoft.com. As a reminder, we love hearing all of your ideas and suggestions about Azure Storage, which you can post at Azure Storage feedback forum.
Source: Azure

How AI can supercharge content understanding for businesses

Organizations face challenges when it comes to extracting insights, finding meaning, and uncovering new opportunities in the vast troves of content at their disposal. In fact, 82 percent of organizations surveyed in the latest Harvard Business Review (HBR) Analytic Services report say that exploring and understanding their content in a timely manner is a significant challenge. This is exacerbated because content is not only spread across multiple systems but also stored in multiple formats, such as PDF, JPEG, spreadsheets, and audio files.

The first wave of artificial intelligence (AI) was designed for narrow applications, training a single model to address a specific task such as handwriting recognition. What’s been challenging, however, is that these models individually can’t capture all the different attributes hidden in various types of content. This means developers must painfully stitch together disparate components to fully understand their content.

Instead, organizations need a solution that spans vision, speech, and language to fully unlock insights from all content types. We are heavily investing in this new category of AI, called knowledge mining, to enable organizations to maximize the value of their content.

Knowledge mining with Azure Cognitive Search

Organizations can take advantage of knowledge mining today with Azure Cognitive Search, easily gleaning insights from all their content through web applications, bots, and Power BI visualizations. With Azure Cognitive Search, organizations can not only benefit from the industry's most comprehensive domain-specific models but also integrate their own custom models. What used to take months to accomplish can be realized in mere hours, without needing data science expertise.
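To give a flavor of how simple the developer surface is, here's a hedged Python sketch that queries an existing Azure Cognitive Search index over REST; the service name, index name, key field, and API key are placeholders, and the api-version shown was current at the time of writing:

import requests

SERVICE = "<search-service-name>"  # placeholder
INDEX = "<index-name>"             # placeholder
API_KEY = "<query-api-key>"        # placeholder

url = (f"https://{SERVICE}.search.windows.net"
       f"/indexes/{INDEX}/docs/search?api-version=2019-05-06")

# Full-text search across content enriched by built-in skills,
# e.g., OCR text from scanned PDFs or entities from transcripts.
body = {"search": "safety inspection", "top": 5}

resp = requests.post(url, json=body, headers={"api-key": API_KEY})
resp.raise_for_status()
for doc in resp.json()["value"]:
    # Replace "id" with your index's key field.
    print(doc.get("@search.score"), doc.get("id"))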

Delivering real business impact

The same Harvard Business Review report describes how our customers across industries are benefiting from knowledge mining in ways that were previously unimaginable.

 Financial Services: “The return on investment (ROI) for knowledge mining at a small fund with one or two analysts is 30 percent to 58 percent. For much larger funds, with 50 or more analysts, it is over 500 percent.”—Subra Bose, CEO of Financial Fabric.
 Healthcare: “A reliable knowledge mining platform can drive roughly a third of the costs out of the medical claims process.” —Ram Swaminathan, CEO at BUDDI Health.
 Manufacturing: “Unlocking this potential will significantly change the way we do business with our customers and how we service their equipment.” —Chris van Ravenswaay, global business solutions manager for Howden.
 Legal: “AI tells you what is inside the contract. It also tells you what the relationship of the contract is with the outside world.” —Monish Darda, CTO of Icertis.

And we’re just getting started. You can expect even deeper integration and more great knowledge mining experiences built with Azure Cognitive Search as we continue this journey. I encourage you to take a look at Harvard Business Review’s survey and findings and hear their perspective on the landscape of knowledge mining.

Getting started

Read the Harvard Business Review Analytics Services report, Knowledge Mining: The Next Wave of Artificial Intelligence-Led Transformation.
Learn more about Azure Cognitive Search.

Source: Azure