Announcing the preview of OpenAPI Specification v3 support in Azure API Management

Azure API Management has just introduced preview support for OpenAPI Specification v3 – the latest version of the broadly used open-source standard for describing APIs. The implementation of the feature is based on the OpenAPI.NET SDK.

In this blog post we will explore:

The benefits of using OpenAPI Specification for your APIs.
How you can create APIs from OpenAPI Specification documents in Azure API Management.
How you can export your APIs as OpenAPI Specification documents.
The remaining work for the general availability release.

Why you should use OpenAPI Specification for your APIs

OpenAPI Specification is a widely adopted industry standard. The OpenAPI Initiative has been backed by over 30 companies, including large corporations such as Microsoft.

OpenAPI Specification lets you abstract your APIs from their implementation. The API definitions are language-agnostic.

They are also easy to understand, yet precise. Your APIs are represented through YAML or JSON files, readable for humans as well as machines.

The wide adoption of OpenAPI Specification has resulted in an extensive tooling ecosystem. Functionality of the tools ranges from facilitating the collaborative process of designing APIs to automatically generating client SDKs and server implementations in popular programming languages.

How to import OpenAPI Specification v3 definitions in Azure API Management

If your APIs are defined in an OpenAPI Specification file, you can easily import them in Azure API Management. The Azure portal will automatically recognize the right version of your OpenAPI Specification files. You can learn how to import your APIs through the visual interface by following our tutorial, “Import and publish your first API.”

Alternatively, you can import APIs using the REST API call, with the contentFormat payload parameter set to openapi, openapi+json, or openapi-link.
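As an illustrative sketch (not an official client), the REST import call could be built like this in Python. The api-version value, the contentValue property name, and the path suffix are assumptions that may differ for your service version; the subscription, resource group, service, and token values are placeholders:

```python
import json
import urllib.request

# Placeholder identifiers -- substitute your own subscription, resource
# group, API Management service name, API id, and AAD bearer token.
SUBSCRIPTION = "{subscription-id}"
RESOURCE_GROUP = "{resource-group}"
SERVICE = "{apim-service}"
API_ID = "petstore"
TOKEN = "{bearer-token}"

def build_import_request(spec_text, content_format="openapi"):
    """Build the PUT request that imports an OpenAPI definition.

    content_format is one of the values named in the post: openapi
    (inline YAML), openapi+json (inline JSON), or openapi-link
    (a URL pointing at the document).
    """
    url = (
        "https://management.azure.com"
        f"/subscriptions/{SUBSCRIPTION}/resourceGroups/{RESOURCE_GROUP}"
        f"/providers/Microsoft.ApiManagement/service/{SERVICE}"
        f"/apis/{API_ID}?api-version=2018-06-01-preview"  # assumed version
    )
    body = {
        "properties": {
            "contentFormat": content_format,
            "contentValue": spec_text,  # or a URL when using openapi-link
            "path": "petstore",         # public URL suffix for the API
        }
    }
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        method="PUT",
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
    )
```

Passing the returned request to urllib.request.urlopen would perform the import.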

During import, if the servers field of the specification contains multiple entries, API Management will select the first HTTPS URL. If there aren't any HTTPS URLs, the first HTTP URL will be selected. If there aren't any HTTP URLs, the backend service URL will be empty.
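The selection rule above can be sketched as a small function (illustrative only, not API Management's actual implementation):

```python
def pick_backend_url(servers):
    """Mimic the documented selection rule for the OpenAPI `servers` list.

    Returns the first HTTPS URL; failing that, the first HTTP URL;
    failing that, an empty string (i.e. an empty backend service URL).
    """
    urls = [entry.get("url", "") for entry in servers]
    for url in urls:
        if url.startswith("https://"):
            return url
    for url in urls:
        if url.startswith("http://"):
            return url
    return ""
```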

The import functionality has a few restrictions. For example, it does not support examples and multipart/form-data fields.

How to export OpenAPI Specification v3 definitions in Azure API Management

With Azure API Management, you can also easily export your APIs in the OpenAPI Specification v3 format.

API specifications can be downloaded from your developer portal as JSON or YAML files. The developer portal is an automatically generated, fully customizable website, where visitors can discover APIs, learn how to use them, try them out interactively, and finally sign up to acquire API keys.

You can also export the specifications through the visual interface of the Azure portal or a REST API call, with the format query parameter set to openapi-link.
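A hedged sketch of the export URL construction follows; the export=true flag and the api-version value are assumptions about the management API shape, while the format=openapi-link parameter comes from the post:

```python
from urllib.parse import urlencode

def build_export_url(subscription, resource_group, service, api_id):
    """Construct the management URL that exports an API definition as an
    OpenAPI Specification v3 link. Query-string shape is illustrative."""
    base = (
        "https://management.azure.com"
        f"/subscriptions/{subscription}/resourceGroups/{resource_group}"
        f"/providers/Microsoft.ApiManagement/service/{service}/apis/{api_id}"
    )
    query = urlencode({
        "format": "openapi-link",          # per the announcement
        "export": "true",                  # assumed export flag
        "api-version": "2018-06-01-preview",  # assumed version
    })
    return f"{base}?{query}"
```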

How to get started and what’s coming next

You can try the current functionality in a matter of minutes by importing your APIs from OpenAPI Specification files. Before the feature becomes generally available, we will implement export in a JSON format through a REST API call. In the coming months, we will also add OpenAPI Specification v3 import and export support in the PowerShell SDK.

Let us know what you think! 
Source: Azure

Connecting Node-RED to Azure IoT Central

Today I want to show how simple it is to connect a temperature/humidity sensor to Azure IoT Central using a Raspberry Pi and Node-RED.

As many of you know, Raspberry Pi is a small, single-board computer. Its low cost, low power nature makes it a natural fit for IoT projects. Node-RED is a flow-based, drag and drop programming tool designed for IoT. It enables the creation of robust automation flows in a web browser, simplifying IoT project development.

For my example, I’m using a Raspberry Pi 3 Model B and a simple DHT22 temperature and humidity sensor, but it should work with other models of the Pi. If you have a different kind of sensor, you should be able to adapt the guide below to use it, provided you can connect Node-RED to your sensor.

Configuring Azure IoT Central

Create an app.
Create a new device template.

Temp (temp)
Humidity (humidity)

Create a real device and get the DPS connection information.
Use dps-keygen to provision the device and get a device connection string.

Identify the three parts of the resulting connection string and save them for later.
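A device connection string has the form HostName=…;DeviceId=…;SharedAccessKey=…, and the three parts can be split out with a small helper (illustrative sketch):

```python
def parse_connection_string(conn_str):
    """Split an IoT device connection string into its parts:
    HostName, DeviceId, and SharedAccessKey."""
    parts = {}
    for segment in conn_str.strip().split(";"):
        # partition on the FIRST '=' only -- the base64 SharedAccessKey
        # may itself end in '=' padding characters.
        key, _, value = segment.partition("=")
        parts[key] = value
    return parts
```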

Connecting the DHT22 sensor

Before we can get data from our DHT22 sensor, we need to connect it to the Pi. The DHT22 typically has three pins broken out, but some have four. If you have one with four, check the datasheet to confirm which pins are voltage (may be shown as +, VCC, or VDD), data (or signal), and ground.

With the pi powered off, use jumper wires to connect your DHT22 as shown below:

NOTE: The power jumper (red) should go to 3.3V, data jumper (yellow) should go to GPIO4 and the ground jumper (black) should go to ground. Some boards are different, so double-check your connections!

Installing required software

I started by installing Raspbian Lite using the guide. Then, I installed Node-RED. At this point you should be able to open a browser and visit http://raspberrypi.lan:1880 to see the Node-RED interface. Next, install the Azure IoT Hub nodes for Node-RED. The easiest way to do this is from the Node-RED interface, using the Manage Palette command.

Install the DHT22 nodes. Unfortunately, since this node has some lower-level hardware requirements, it can’t be installed through the Manage Palette command. Please follow the instructions using the link above.

Configuring the flow

Now that you have Node-RED up and running on your Pi, you’re ready to create your flow. By default, Node-RED should already have a flow called “Flow 1,” but you can easily create a new one by selecting the (+) icon above the canvas.

Starting the flow with the inject node

The first node we will add to this flow is an input node. For this example, we will use the inject node, which simply injects an arbitrary JSON document into the flow. From the input section of the palette on the left, drag the inject node onto the canvas. Then, double-click it to open the configuration window. Set the node properties as shown below:

This node will simply inject a JSON object where the payload is set to a timestamp. We don’t really care about that value. This is just a simple way to kick off the flow.

Getting data from the DHT22

In the Node-RED palette, find the rpi dht22 node and drag it onto the canvas. Double click on it to open the configuration window, and set the node properties as shown below:

Connect the inject node to the rpi dht22 node by dragging the little handle from one to the other.

Reformatting the message

The JSON message produced by the DHT22 node isn’t formatted correctly for sending to Azure IoT, so we need to fix that. We will use the change node to do this, so drag it out from the palette onto the canvas and connect it to the DHT22 node. Double click on it to open the configuration window and set the node properties as shown below:

For the functional part of this node, we will use JSONata, which is a query and transformation language for JSON documents. After selecting the JSONata type in the “to” selector, select the […] button to open the editor and enter the following:

Here we are extracting the temperature and humidity values from the input JSON message and putting them inside the data element in the resulting JSON message. We’re also adding the device ID and shared access key which you got from the Device Connection String earlier.
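The same reshaping that the JSONata expression performs can be sketched in Python (the device ID and key values are placeholders you substitute from your connection string):

```python
def reshape_for_iot_hub(msg, device_id, key):
    """Reshape the DHT22 node's output the way the change node does:
    lift temperature and humidity into a `data` element and attach the
    device identity the Azure IoT Hub node needs."""
    return {
        "deviceId": device_id,
        "key": key,
        "protocol": "mqtt",
        "data": {
            "temp": float(msg["payload"]),       # JSONata: $number(payload)
            "humidity": float(msg["humidity"]),  # JSONata: $number(humidity)
        },
    }
```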

Sending the data to Azure IoT Central

Now that we’ve got the JSON message ready, find the Azure IoT Hub node in the palette and drag it onto the canvas. Again, double click on it to open the configuration window and set the properties as shown here:

Confirming your message and debugging

The final node we will add to our flow is a debug node, which simply outputs the message it is given to the debug panel in Node-RED. Connect it to the end of the flow (after Azure IoT Hub) and set the name to “Hub Response.”

If you’re interested in seeing the JSON message at any point in the flow, you can add more debug nodes anywhere you want. You can enable or disable the output of a debug node by selecting the little box on the right side of the node.

The flow

Here is what your flow should look like. I’ve added a couple of extra debug nodes while developing this flow, but you can see that only the Hub Response node is enabled.

Before you can run the flow, you need to deploy it from the workspace. To do this, select the red Deploy button at the top right of the Node-RED screen. Then, simply select the little box on the left of the “every minute” inject node and it will start. Since we configured that node to run every minute, it will continue to send messages to Azure IoT Central until you stop it by either disabling the flow or redeploying.

Pop back over to your IoT Central app and you should start seeing data within a minute or so.

As you can see, connecting Node-RED to Azure IoT Central is pretty simple. This is a great way to quickly prototype and experiment with different sensors and message payloads without having to write any code! You can also use this approach for creating gateways or protocol translators so you can easily connect almost anything to Azure IoT Central.

Appendix: Flow source

If you want to just copy-paste the whole thing in instead of building it up yourself, you can import the following JSON into Node-RED and just update the three values from your Device Connection String (see the instructions above).

[{"id":"9e47273a.f12738","type":"tab","label":"DHT22-IoTC","disabled":false,"info":""},{"id":"b3d8f5b6.a243b8","type":"debug","z":"9e47273a.f12738","name":"Hub Response","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","x":740,"y":340,"wires":[]},{"id":"117b0c09.6b3a04","type":"azureiothub","z":"9e47273a.f12738","name":"Azure IoT Hub","protocol":"mqtt","x":520,"y":340,"wires":[["b3d8f5b6.a243b8"]]},{"id":"ee333823.1d33a8","type":"inject","z":"9e47273a.f12738","name":"","topic":"","payload":"","payloadType":"date","repeat":"60","crontab":"","once":false,"onceDelay":"","x":210,"y":120,"wires":[["38f14b0d.96eb14"]]},{"id":"38f14b0d.96eb14","type":"rpi-dht22","z":"9e47273a.f12738","name":"","topic":"rpi-dht22","dht":22,"pintype":"0","pin":4,"x":400,"y":120,"wires":[["f0bfed44.e988b"]]},{"id":"f0bfed44.e988b","type":"change","z":"9e47273a.f12738","name":"","rules":[{"t":"set","p":"payload","pt":"msg","to":"{ \"deviceId\": \"{YOUR DEVICE ID}\", \"key\": \"{YOUR KEY}\", \"protocol\": \"mqtt\", \"data\": { \"temp\": $number(payload), \"humidity\": $number(humidity) } }","tot":"jsonata"}],"action":"","property":"","from":"","to":"","reg":false,"x":280,"y":340,"wires":[["117b0c09.6b3a04","db5b70be.81e2a"]]},{"id":"db5b70be.81e2a","type":"debug","z":"9e47273a.f12738","name":"Payload","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"payload","x":500,"y":420,"wires":[]}]
Source: Azure

Azure Backup now supports PowerShell and ACLs for Azure Files

We are excited to reveal a set of new features for backing up Microsoft Azure file shares natively using Azure Backup. All backup-related features have also been released to support file shares connected to Azure File Sync.

Azure files with NTFS ACLs

Azure Backup now supports preserving and restoring New Technology File System (NTFS) access control lists (ACLs) for Azure Files in preview. Starting in 2019, Azure Backup automatically started capturing your file ACLs when backing up file shares. When you need to go back in time, the file ACLs are restored along with the files and folders.

Use Azure Backup with PowerShell

You can now script your backups for Azure File Shares using PowerShell. Make use of the PowerShell commands to configure backups, take on-demand backups, or even restore files from your file shares protected by Azure Backup.

We have enabled on-demand backups that can retain your snapshots for up to 10 years using PowerShell. Schedulers can be used to run on-demand PowerShell scripts with a chosen retention, taking snapshots at regular intervals every week, month, or year. Please refer to the limitations of on-demand backups using Azure Backup.

If you are looking for sample scripts, please write to AskAzureBackupTeam@microsoft.com. We have created a sample script using Azure Automation runbook that enables you to schedule backups on a periodic basis and retain them even up to 10 years.

Manage backups

A key enabler we introduced last year was the ability to “Manage backups” right from the Azure Files portal. As soon as you configure protection for a file share using Azure Backup, the “Snapshots” button on your Azure Files portal changes to “Manage backups.”

Using “Manage backups,” you can take on-demand backups, restore file shares or individual files and folders, and even change the policy used for scheduling backups. You can also go to the Recovery Services vault that backs up the file share and edit the policies used to back up Azure file shares.

Email alerts

Backup alerts for the backup and restore jobs of Azure file shares have been enabled. The alerting capability allows you to configure notifications of job failures to chosen email addresses.

Best practices

Accidental deletion of data can happen to storage accounts, file shares, and snapshots taken by Azure Backup. It is a best practice to lock storage accounts that have Azure Backup enabled to ensure your restore points are not deleted. Also, warnings are displayed before protected file shares or snapshots created by Azure Backup are deleted. This helps you prevent data loss through accidental deletion.

Related links and additional content

If you are new to Azure Backup, start configuring the backup on the Azure portal.
Want more details? Check out Azure Backup documentation or the preview blog, “Introducing backup for Azure file shares.”
Need help? Reach out to the Azure Backup forum for support.
Tell us how we can improve Azure Backup by contributing new ideas and voting up existing ones.
Follow us on Twitter @AzureBackup for the latest news and updates.

Source: Azure

HDInsight Metastore Migration Tool open source release now available

We are excited to share the release of the Microsoft Azure HDInsight Metastore Migration Tool (HMMT), an open-source script that can be used for applying bulk edits to the Hive metastore.

The HDInsight Metastore Migration Tool is a low-latency, no-installation solution for challenges related to data migrations in Azure HDInsight. There are many reasons why a Hive data migration may need to take place. You may need to protect your data by enabling secure transfer on your Azure storage accounts. Perhaps you will be migrating your Hive tables from WASB to Azure Data Lake Storage (ADLS) Gen2 as part of your upgrade from HDInsight 3.6 to 4.0. Or you may have decided to organize the locations of your databases, tables, and user-defined functions (UDF) to follow a cohesive structure. With HMMT, these migration scenarios and many others no longer require manual intervention.

HMMT handles Hive metadata migration scenarios in a quick, safe, and controllable environment. This blog post is divided into three sections. First, the background to HMMT is outlined with respect to the Hive metastore and Hive storage patterns. The second section covers the design of HMMT and describes initial setup steps. Finally, some sample migrations are described and solved with HMMT as a demonstration of its usage and value.

Background

The Hive metastore

The Hive metastore is a SQL database for Hive metadata such as table, database, and user defined function storage locations. The Hive metastore is provisioned automatically when an HDInsight cluster is created. Alternatively, an existing SQL database may be used to persist metadata across multiple clusters. The existing SQL database is then referred to as an external metastore. HMMT is intended to be used against external metastores to persist metadata migrations over time and across multiple clusters.

Hive storage uniform resource identifiers

For each Hive table, database, or UDF available to the cluster, the Hive metastore keeps a record of that artifact’s location in external storage. Artifact locations are persisted in a Windows Azure Storage Blob or in Azure Data Lake Storage. Each location is represented as an Azure storage uniform resource identifier (URI), which describes the account-type, account, container, and subcontainer path that the artifact lives in. The above diagram describes the schema used to represent Hive table URIs. The same schema pattern applies to Hive databases and UDFs. 
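For instance, a WASB table location such as wasbs://data@acc1.blob.core.windows.net/hive/table1 breaks down into exactly those elements. A small illustrative parser (the URI grammar here is a simplification):

```python
import re

# account type, optional container, account host, subcontainer path
URI_PATTERN = re.compile(
    r"^(?P<scheme>wasbs?|abfss?|adl)://"
    r"(?:(?P<container>[^@/]+)@)?"   # 'adl' URIs carry no container@ part
    r"(?P<account>[^/]+)"
    r"(?P<path>/.*)?$"
)

def parse_storage_uri(uri):
    """Split an Azure storage URI into the four elements the metastore
    schema records for each Hive artifact."""
    match = URI_PATTERN.match(uri)
    if not match:
        raise ValueError(f"not a recognized Azure storage URI: {uri}")
    return match.groupdict()
```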

Suppose a Hive query is executed against table1. Hive will first attempt to read the table contents from the corresponding storage entry found in the Hive metastore. Hive supports commands for displaying and updating a table’s storage location (for example, DESCRIBE FORMATTED and ALTER TABLE <table> SET LOCATION).

Changing the storage location of a table requires the execution of an update command corresponding to the table of interest. If multiple table locations are to be changed, multiple update commands must be executed. Since storage locations must be updated manually, wholesale changes to the metastore can be an error-prone and time-consuming task. The location update story concerning non-table artifacts is even less favorable – the location of a database or UDF cannot be changed from within Hive. Therefore, the motivation behind releasing HMMT to the public is to provide a pain-free way to update the storage location of Hive artifacts. HMMT directly alters the Hive metastore, which is the fastest (and only) way to make changes to Hive artifacts at scale.

How HMMT works

HMMT generates a series of SQL commands that will directly update the Hive metastore based on the input parameters. Only storage URIs that match the input parameters will be affected by the script. The tool can alter any combination of Hive storage accounts, account-types, containers, and subcontainer paths. Note that HMMT is exclusively supported on HDInsight 3.6 and onwards.
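To illustrate the kind of statement such a tool emits (a simplified sketch, not HMMT's actual output — the real tool also handles containers, subcontainer paths, and dry runs), a WASB-to-WASBS migration against the SDS table could reduce to a single UPDATE:

```python
def build_migration_sql(target_table, accounts,
                        old_scheme="wasb", new_scheme="wasbs"):
    """Generate an illustrative UPDATE against a Hive metastore table
    (e.g. SDS, which holds table locations), switching the URI scheme
    for the named storage accounts."""
    like_clauses = " OR ".join(
        f"LOCATION LIKE '{old_scheme}://%@{acc}.blob.core.windows.net%'"
        for acc in accounts
    )
    return (
        f"UPDATE {target_table} "
        f"SET LOCATION = REPLACE(LOCATION, '{old_scheme}://', '{new_scheme}://') "
        f"WHERE {like_clauses};"
    )
```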

Start using HMMT right away by downloading it directly from the Microsoft HDInsight GitHub page. HMMT requires no installation. Make sure the script itself is run from an IP address that is whitelisted to the Hive metastore SQL Server. HMMT can be run from any UNIX command-line that has one of the supported query clients installed. The script does not necessarily need to be run from within the HDInsight cluster. Initially supported clients are Beeline and SqlCmd. Since Beeline is supported, HMMT can be run directly from any HDInsight cluster headnode.

Disclaimer: Since HMMT directly alters the contents of the Hive metastore, it is recommended to use the script with caution and care. When executing the script, the post-migration contents of the metastore will be shown as console output in order to describe the potential impact of the execution. For the specified migration parameters to take effect, the flag “liverun” must be passed to the HMMT command. The tool launches as a dry run by default. In addition, it is strongly recommended to keep backups of the Hive metastore even if you do not intend to use HMMT. More information regarding Hive metastore backups can be found at the end of this blog.

Usage examples

HMMT supports a wide variety of use cases related to the migration and organization of Hive metadata. The benefit of HMMT is that the tool provides an easy way to make sure that the Hive metastore reflects the results of a data migration. HMMT may also be executed against a set of artifacts in anticipation of an upcoming data migration. This section demonstrates the usage and value of HMMT using two examples. One example will cover a table migration related to secure storage transfer, and the other will describe the process to migrate Hive UDF JAR metadata.

Example 1: Enabling secure transfer

Suppose your Hive tables are stored across many different storage accounts, and you have recently enabled secure transfer on a selection of these accounts. Since enabling secure transfer does not automatically update the Hive metastore, the storage URIs must be modified to reflect the change (for example, from WASB to WASBS). With your IP whitelisted and a supported client installed, HMMT will update all matching URIs with the following command:

The first four arguments passed to the script correspond to the SQL server, database, and credentials used to access the metastore.
The next four arguments correspond to the ‘source’ attributes to be searched for. In this case the script will affect WASB accounts Acc1, Acc2 and Acc3. There will be no filtering for the container or subcontainer path. HMMT supports WASB, WASBS, ABFS, ABFSS, and ADL as storage migration options.
The target flag represents the table in the Hive metastore to be changed. The table SDS stores Hive table locations. Other table options include DBS for Hive databases, FUNC_RU for Hive UDFs, and SKEWED_COL_VALUE_LOC_MAP for a skewed store of Hive tables.
The Query Client flag corresponds to the query command line tool to be used. In this case, the client of choice is Apache Beeline.

The remaining flags correspond to the ‘destination’ attributes for affected URIs. In this case, all matching URIs specified by the source options will have their account type moved to WASBS. Up to one entry per destination flag is permitted. The values of these flags are merged together to form the post-migration URI pattern.

This sample script command will only pick up table URIs corresponding to WASB accounts, where the account name is “Acc1”, “Acc2”, or “Acc3.” The container and path options are left as a wildcard, meaning that every table under any of these three accounts will have its URI adjusted. The adjustment made by the script is to set the storage type to WASBS. No other aspects of the table URIs will be affected.

Example 2: UDF JAR organization

In this example, suppose you have loaded many UDFs into Hive over time. UDFs are implemented in JAR files, which may be stored in various account containers depending on which cluster the JAR was introduced from. As a result, the table FUNC_RU will have many entries across a variety of account containers and paths. If you wanted to clean up the locations of UDF JARs, you could do so using this command:

This command will pick up UDF JAR URIs, which are exclusively found in the table FUNC_RU, in the WASB storage account “Acc1” for any container and subcontainer path. Once the script is complete, the Hive metastore will show that all JARs from that account can be found in the /jarfiles/ directory under the container “jarstoragecontainer”.

Feedback and contributions

We would love to get your feedback. Please reach us with any feature requests, suggestions, and inquiries at askhdinsight@microsoft.com. We also encourage feature asks and source-code contributions to HMMT itself via the HDInsight GitHub repository.

Other resources

HMMT GitHub repository
Hive external metastore
Beeline usage
Microsoft SqlCmd
Azure SQL Server Whitelisting
Azure HDInsight
HDInsight on GitHub
MSDN forum
Stack Overflow
SQL Database Backup steps
Guide to HIVE UDFs

Source: Azure

Export data in near real-time from Azure IoT Central

We are happy to share that you can now export data to Azure Event Hubs and Azure Service Bus in near real-time from your Azure IoT Central app! Previously, Continuous Data Export enabled exporting your IoT Central measurements, devices, and device templates data to your Azure Blob Storage account once every minute for cold path storage and analytics. Now you can export this data in near real-time to your Azure Event Hubs and Azure Service Bus instances for analytics and monitoring.

For example, an energy company wants to understand and predict trends in energy consumption in different areas over time of day and throughout the week. With electrical equipment connected to IoT Central, they can use Continuous Data Export to export their IoT data to Azure Event Hubs. They run their deployed machine learning models to gain insight over consumption and perform anomaly detection by connecting their Event Hubs to Azure Databricks. They can run highly custom rules for detecting specific outages by sending data from Event Hubs to Azure Stream Analytics. For long term data storage, they can continue to use Continuous Data Export to store all of their device data in Azure Blob Storage.

Continuous Data Export in Azure IoT Central

New capabilities

These are the new features and changes to Continuous Data Export in Azure IoT Central:

New export destinations include Azure Event Hubs and Azure Service Bus, in addition to Azure Blob Storage.
Export to all supported destinations using a valid connection string, including destinations that are in a different subscription than your IoT Central app.
Create up to 5 exports per app.
Export is available in both Trial apps and Pay-As-You-Go apps.
Continuous Data Export has moved! Find it in the left navigation menu.

Get started

For more information about Continuous Data Export and how to set it up, visit the documentation “Export your data in Azure IoT Central.” You can use your existing Blob Storage, Event Hubs, or Service Bus instance, or create a new instance.

Next steps

Use the new features in Continuous Data Export to export data to your own Azure Event Hubs, Azure Service Bus, and Azure Blob Storage instances for custom warm path and cold path processing, and analytics on your IoT data.

Have ideas or suggestions for new features? Post it on Uservoice.
Have feedback or questions? Don’t hesitate to write us at iotcfeedback@microsoft.com.
To explore the full set of features and capabilities start your free trial and learn more on the IoT Central website.
Check out our documentation including tutorials to connect your first device.
To learn more about the Azure IoT portfolio including the latest news, visit the Microsoft Azure IoT page.

Source: Azure

Azure.Source – Volume 66

Now in preview

Azure Monitor logs in Grafana – now in public preview

Grafana offers great dashboarding capabilities, rich visualizations, and integrations with over 40 data sources. Grafana integration is now available in preview for Microsoft Azure Monitor logs. This integration is achieved through the new Log Analytics plugin, now available as part of the Azure Monitor data source. If you’re already using Grafana for your dashboards, this new plugin can help you create a single pane of glass for your various monitoring needs. The new plugin enables you to display any data available in Log Analytics, such as logs related to virtual machine performance, security, Azure Active Directory (which has recently been integrated with Log Analytics), and many other log types including custom logs.

HDInsight now supported in Azure CLI as a public preview

Support for HDInsight in Microsoft Azure CLI is now available in public preview. With the addition of the new HDInsight command group, you can now utilize all of the features and benefits that come with the familiar cross-platform Azure CLI to manage your HDInsight clusters. Azure HDInsight is an easy, cost-effective, enterprise-grade service for open source analytics that enables customers to easily run popular open source frameworks including Apache Hadoop, Spark, Kafka, and others.

Also in preview

Public preview: Token-based identity for Azure AD users in SQL Database

Now generally available

Azure SQL Database: Upgraded infrastructure for monitoring and alerts
Azure Cognitive Services Custom Speech Model Hosting resource GUID change
General availability: Azure Availability Zones in East US 2
.NET Core 2.2 available for App Service on Windows
Azure Migrate is now available in Asia and Europe
Power BI Embedded supports Q&A with row-level security
Power BI Embedded API for capacity workload configuration
Power BI Embedded Status API tracks workspace assignment to capacities

News and updates

Microsoft Azure portal January 2019 update

This month we’re bringing you updates that improve the ease of navigation of the landing page, add to dashboard tile features, and increase functionality in Azure Container Instances. The new Azure portal home page is a quick and easy entry point into Azure, and includes a link to Azure.Source to help you keep current with what’s new in Azure. With the Azure portal, you can test features in preview by visiting preview.portal.azure.com.


AI is the new normal: Recap of 2018

Get a recap of the top 10 Azure AI highlights from 2018, across AI Services, tools and frameworks, and infrastructure at a glance. AI catalyzes digital transformation. Microsoft believes in making AI accessible so that developers, data scientists and enterprises can build systems that augment human ingenuity to tackle meaningful challenges. AI is the new normal. Microsoft has more than 20 years of AI research applied to our products and services. Everyone can now access this AI through simple, yet powerful productivity tools such as Excel and Power BI. In continual support of bringing AI to all, Microsoft introduced new AI capabilities for Power BI. These features enable all Power BI users to discover hidden, actionable insights in their data and drive better business outcomes with easy-to-use AI.

Our 2019 Resolution: Help you transform your 2008 server applications with Azure!

At Microsoft, with the end of support for 2008 servers looming, we’ve been thinking about how we can help you with your server refresh journey. We believe that the three reasons why Azure is the best place to transform your 2008 server applications are security, innovation, and cost savings. The end of support for SQL Server 2008/R2 is now less than six months away on July 9, 2019, and support for Windows Server 2008/R2 ends on January 14, 2020. Windows 7, Office 2010, and Exchange Server are also ending their extended support soon. Microsoft and our partners are here to help you every step of the way.

Microsoft Azure obtains Korea-Information Security Management System (K-ISMS) certification

Microsoft helps organizations all over the world comply with national, regional, and industry-specific regulatory requirements. The K-ISMS certification was introduced by the Korea Internet and Security Agency (KISA) and is designed to ensure the security and privacy of data in the region through a stringent set of control requirements. Achieving this certification means Azure customers in South Korea can more easily demonstrate adherence to local legal requirements for the protection of key digital information assets and meet KISA compliance standards. KISA established the K-ISMS to safeguard the information technology infrastructure within Korea. This helps organizations implement and operate information security management systems that facilitate effective risk management and enable them to apply best practice security measures.

Additional news and updates

GitHub Enterprise support and automatic GitHub service connections in pipelines – Sprint 146 Update
Azure Container Instances Memory Duration and vCPU Duration GUID migration

Technical content

Azure Backup for virtual machines behind an Azure Firewall

Learn more about the Azure Backup for SQL Server capability, which was made available in public preview in June 2018. This workload backup capability is built as an infrastructure-less, Pay as You Go (PAYG) service that leverages native SQL backup and restore APIs to provide a comprehensive solution for backing up SQL Server instances running in Azure IaaS VMs. Azure Backup protects the data in your VMs by safely storing it in your Recovery Services vault. Backing up SQL Server running inside an Azure VM requires the backup extension to communicate with the Azure Backup service in order to upload backup data and emit monitoring information. Azure Backup and Azure Firewall complement each other well to provide complete protection for your resources and data in Azure.

Create alerts to proactively monitor your data factory pipelines

Organizations want to reduce the risk of data integration activity failures and the impact they cause to other downstream processes. Manual approaches to monitoring data integration projects are inefficient and time consuming. As a result, organizations want automated processes to monitor and manage data integration projects, removing inefficiencies and catching issues before they affect the entire system. Organizations can now improve operational productivity by creating alerts on data integration events (success/failure) and proactively monitoring with Azure Data Factory. Creating alerts ensures 24/7 monitoring of your data integration projects and makes sure that you are notified of issues before they potentially corrupt your data or affect downstream processes. This helps your organization be more agile and increases confidence in your overall data integration processes. Learn more in this episode of Azure Friday, Monitor your Azure Data Factory pipelines proactively with alerts:
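The kind of success/failure check such an alert rule automates can be sketched in a few lines of Python. This is purely illustrative — the PipelineRun shape and the "alert" below are stand-ins, not the Data Factory API.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PipelineRun:
    pipeline: str
    status: str  # e.g. "Succeeded" or "Failed"

def failed_runs(runs: List[PipelineRun]) -> List[PipelineRun]:
    # An alert rule on the "failure" event fires for each run like these.
    return [r for r in runs if r.status == "Failed"]

runs = [
    PipelineRun("copy-sales-data", "Succeeded"),
    PipelineRun("load-warehouse", "Failed"),
]
for r in failed_runs(runs):
    print(f"ALERT: pipeline {r.pipeline} failed")
```

With Data Factory alerts, this polling loop disappears: the service evaluates the event and notifies you directly.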

Azure IoT automatic device management helps deploy firmware updates at scale

Automatic device management in Azure IoT Hub automates many of the repetitive and complex tasks of managing large device fleets over the entirety of their lifecycles. The Azure IoT DevKit over-the-air (OTA) firmware update project is a great implementation of automatic device management. With automatic device management, you can target a set of devices based on their properties, define a desired configuration, and let IoT Hub update devices whenever they come into scope. This post highlights some of the ways you can kickstart your own implementation of the firmware update use case.
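As a rough sketch of what "target a set of devices and define a desired configuration" looks like, here is an automatic device configuration expressed as a Python dict. The overall shape (targetCondition, priority, deviceContent) follows the IoT Hub pattern, but treat the specific property names and values as illustrative.

```python
import json

# Hypothetical firmware-update configuration: devices tagged as devkits
# receive the desired firmware properties whenever they come into scope.
configuration = {
    "id": "devkit-firmware-1-6-2",
    "targetCondition": "tags.deviceType='devkit'",
    "priority": 10,
    "content": {
        "deviceContent": {
            "properties.desired.firmware": {
                "fwVersion": "1.6.2",
                "fwPackageURI": "<blob-url-of-firmware-package>",
            }
        }
    },
}
print(json.dumps(configuration, indent=2))
```

IoT Hub continuously evaluates the target condition, so newly registered devices that match it pick up the firmware properties without any extra orchestration.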

Pix2Story: Neural storyteller that creates machine-generated stories in several literature genres

As one of Microsoft’s AI Lab projects, Pix2Story is a neural-storyteller web application on Azure that enables you to upload a picture and get a machine-generated story based on several literature genres. The idea is to obtain the captions from the uploaded picture and feed them to the recurrent neural network model to generate the narrative based on the genre and the picture. Source code is available on GitHub so that you can train your own model.

Azure Data Explorer plugin for Grafana dashboards

Grafana is a leading open source software designed for visualizing time series analytics. It is an analytics and metrics platform that enables you to query and visualize data and create and share dashboards based on those visualizations. Combining Grafana’s beautiful visualizations with Azure Data Explorer’s snappy ad hoc queries over massive amounts of data, creates impressive usage potential. This post depicts the benefits of using Grafana for building dashboards on top of your Azure Data Explorer datasets. The Grafana and Azure Data Explorer teams have created a dedicated plugin which enables you to connect to and visualize data from Azure Data Explorer using its intuitive and powerful Kusto Query Language. Additional connectors and plugins to analytics tools and services will be added in the weeks to come.

Additional technical content

Who is @horse_js
Text Annotation on a Budget with Azure Web Apps & Doccano
Cognitive Services API — I need your clothes, boots and your motorcycle
Azure Cosmos DB + Functions Cookbook — Shared throughput and new health logs
Creating a Massively Scalable WordPress Site on Azure’s Hosted Bits
Running Node Apps Locally and in the Cloud with Docker and Azure
Create a Cosmos DB Azure Database for MongoDB
AzureR packages now on CRAN

Azure shows

The Azure Podcast | Episode 262 – Operationalizing Cosmos DB

John Kozell is a Principal Consultant at Microsoft and an expert in all things Azure Cosmos DB, especially when it comes to the enterprise world. He gives us some unique perspectives on what enterprises should do in order to make effective use of Azure Cosmos DB and also meet their compliance and operational goals.


Block Talk | Azure Blockchain Workbench 1.6 Highlights

In this episode, we dive into some of the new features available in Workbench 1.6, such as application versioning and troubleshooting.

Internet of Things Show | Build workflows with Azure IoT Central connector for Microsoft Flow

Learn about how to send a message to your Microsoft Teams channel when a rule is fired in your IoT Central app using Microsoft Flow. We'll cover what is Microsoft Flow, and go through how to build workflows easily using the hundreds of connectors available.

AI Show | Using Cognitive Services in Containers

In this video we will talk about our new capability that allows developers to deploy some of our cognitive services as containers. Get ready for the intelligent edge. Process data in the cloud or on devices at the edge; the choice is yours.

The Open Source Show | Intro to Service Meshes: Data Planes, Control Planes, and More

Armon Dadgar (@armon), HashiCorp CTO and co-founder, joins Aaron Schlesinger (@arschles) to school him on all things service meshes. You'll understand what a service mesh actually does, when and why it makes sense to use one, the role of observability, and the differences between data planes and control planes (and what's relevant to app developers). Armon makes concepts real with specific examples and analogies, Aaron sees how to easily apply it to his favorite project (Kubernetes, of course), and they sign off with their favorite resources, so you can apply it all to your apps.

Azure Friday | Using HashiCorp Consul to connect Kubernetes clusters on Azure

HashiCorp Consul is a distributed service mesh to connect, secure, and configure services across any runtime platform and public or private cloud. In this episode, Scott Hanselman is joined by HashiCorp's Geoffrey Grossenbach, who uses Helm to install a Consul server to an Azure Kubernetes Service (AKS) cluster. Next, he deploys and secures a pair of microservices with Consul.

Azure Friday | Run Azure Functions from Azure Data Factory pipelines

Azure Functions is a serverless compute service that enables you to run code on-demand without having to explicitly provision or manage infrastructure. Using Azure Functions, you can run a script or piece of code in response to a variety of events. Azure Data Factory (ADF) is a managed data integration service in Azure that enables you to iteratively build, orchestrate, and monitor your Extract Transform Load (ETL) workflows. Azure Functions is now integrated with ADF, enabling you to run an Azure function as a step in your data factory pipelines.
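As a rough sketch of what this integration looks like in a pipeline definition, here is the Azure Function activity expressed as a Python dict. The shape follows the AzureFunctionActivity pattern, but treat the names, linked service, and properties below as illustrative placeholders.

```python
import json

# Hypothetical ADF pipeline with one step that invokes an Azure Function.
pipeline = {
    "name": "RunFunctionFromADF",
    "properties": {
        "activities": [
            {
                "name": "CallMyFunction",
                "type": "AzureFunctionActivity",
                "linkedServiceName": {
                    "referenceName": "MyAzureFunctionLinkedService",
                    "type": "LinkedServiceReference",
                },
                "typeProperties": {
                    "functionName": "ProcessBatch",
                    "method": "POST",
                    "body": {"runDate": "2019-01-18"},
                },
            }
        ]
    },
}
print(json.dumps(pipeline, indent=2))
```

Because the function is just another activity, it can be chained after copy or transformation steps and share the pipeline's scheduling and monitoring.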

This Week On Channel 9 | TWC9: Alexa Azure DevOps Skills, Hacking Your Career, ML.NET 0.9, 6502 Assembly in VS Code, and more

This week on Channel 9, Christina Warren is reliving the days of Tom from MySpace, while also breaking down the latest developer news.

The Xamarin Show | Azure Blockchain Development Kit for Mobile Apps

This week, James is joined by friend of the show Marc Mercuri, Program Manager on the Azure Blockchain Development Kit team, who introduces us to the world of blockchain. He shows us a full end-to-end scenario of why and how you would use blockchain in applications. He then walks us through the new Azure Blockchain Development Kit, which simplifies development using blockchain for web and mobile with some fantastic Xamarin mobile apps.

Azure Tips and Tricks | How to deploy Azure Logic Apps through Visual Studio 2017

Learn how to deploy Azure Logic Apps from Visual Studio 2017 with just a few clicks. Once you've downloaded the Visual Studio extension, you can easily deploy your logic apps straight into the cloud.

The Azure DevOps Podcast | Greg Leonardo on Architecting, Developing, and Deploying the Azure Way – Episode 019

In today’s episode, Greg Leonardo, a Cloud Architect at Campus Management Corp. and Webonology, and Jeffrey Palermo discuss the components of Greg’s new book and dive deep into topics such as; architecture, app service environments, web apps, web jobs, Windows Containers, and more.


Events

Cloud Commercial Communities webinar and podcast newsletter – January 2019

Each month, the Cloud Commercial Communities team hosts webinars and podcasts that cover core programs, updates, trends, and technologies that Microsoft partners and customers need to know to increase their success with Microsoft Azure and Dynamics. Check out this post for information and links to three live webinars and several podcasts available this month, as well as recaps of webinars and podcasts from last month.

Azure Site Recovery team is hosting an Ask Me Anything session

The Azure Site Recovery (ASR) team will host a special Ask Me Anything (AMA) session on Twitter, Tuesday, January 22, 2019 from 8:30 AM to 10:00 AM Pacific Standard Time. You can tweet to @AzSiteRecovery or @AzureSupport with #ASR_AMA. With an AMA, you’ll get answers directly from the team and have a conversation with the people who build these products and services.

Microsoft Ignite | The Tour

Learn new ways to code, optimize your cloud infrastructure, and modernize your organization with deep technical training. Join us at the place where developers and tech professionals continue learning alongside experts. Explore the latest developer tools and cloud technologies and learn how to put your skills to work in new areas. Connect with our community to gain practical insights and best practices on the future of cloud development, data, IT, and business intelligence. Find a city near you and register today. In February, the tour visits London, Sydney, Hong Kong, and Washington, DC.

Customers and partners

Dynamic mission planning for drones with Azure Maps

This post highlights a customer, AirMap, whose software solutions rely on Azure Maps for real-time location intelligence in a new frontier of technology called dynamic mission planning for drones. AirMap is the leading global airspace management platform for drones. AirMap’s Unmanned Traffic Management (UTM) platform enables the deployment and operations of safe, efficient, and advanced drone operations for enterprises and drone solution providers.

Azure Marketplace new offers – Volume 29

The Azure Marketplace is the premier destination for all your software needs – certified and optimized to run on Azure. Find, try, purchase, and provision applications & services from hundreds of leading software providers. You can also connect with Gold and Silver Microsoft Cloud Competency partners to help your adoption of Azure. In the first half of December we published 60 new offers that successfully met the onboarding criteria.

A Cloud Guru's Azure This Week – 18 January 2019

This time on Azure This Week, Lars talks about Azure Data Box Disk which is in general availability, new Azure Migrate and Azure Site Recovery enhancements for cloud migration and multi-modal topic inferencing with Azure Video Indexer.

Source: Azure

Azure Cognitive Services adds important certifications, greater availability, and new unified key

One of the most important considerations when choosing an AI service is security and regulatory compliance. Can you trust that your AI workloads are processed with the high standards and safeguards you have come to expect from hardened, durable software systems?

Cognitive Services today includes 14 generally available products. Below is an overview of current certifications in support of greater security and regulatory compliance for your business.

Added industry certifications and compliance

Significant progress has been made in meeting major security standards. In the past six months, Cognitive Services added 31 certifications across services and will continue to add more in 2019. With these certifications, hundreds of healthcare, manufacturing, and financial use cases are now supported. 

The following certifications have been added:

ISO 20000-1:2011, ISO 27001:2013, ISO 27017:2015, ISO 27018:2014, and ISO 9001:2015 certification
HIPAA BAA
HITRUST CSF certification
SOC 1 Type 2, SOC 2 Type 2, and SOC 3 attestation
PCI DSS Level 1 attestation

For additional details on industry certifications and compliance for Cognitive Services, visit the Overview of Microsoft Azure Compliance page.

Enhanced data storage commitments

Cognitive Services now offers more assurances for where customer data is stored at rest. These assurances have been enabled by graduating several Cognitive Services to Microsoft Azure Core Services. The first services to make Azure Core Service commitments (effective January 1, 2019) are Content Moderator, Computer Vision, Face, Text Analytics, and QnA Maker.

For your reference, you can learn more about Microsoft Azure Core Services through the Online Services Terms (OST).

Greater regional availability

Additionally, more customers around the world can now take advantage of these intelligence services that are closer to their data. The global footprint for Cognitive Services has expanded over the past several months — going from 15 to 25 Azure data center regions.

For further reference, visit the Azure product availability page for the complete list where Cognitive Services are now available.

Simplified experience with a unified API key

When building large AI systems, many use cases require multiple Cognitive Services and as such, there are efficiencies in adding more services using a single key. Recently, we launched a new bundle of multiple services, enabling the use of a single API key for most of our generally available services: Computer Vision, Content Moderator, Face, Text Analytics, Language Understanding, and Translator Text. Now developers can provision all these services in 21 Azure regions around the world1. More regions and APIs will be added to this unified service throughout 2019.
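In practice, the unified key means the same subscription key header authenticates calls to any of the bundled services — only the API path changes. The sketch below builds such a request with the standard library; the endpoint, path, and key are placeholders, so treat the specifics as assumptions rather than a definitive call shape.

```python
from urllib import request

UNIFIED_KEY = "<your-unified-key>"  # placeholder — one key for the bundle
ENDPOINT = "https://westus.api.cognitive.microsoft.com"

def cognitive_request(path: str, body: bytes) -> request.Request:
    # The same Ocp-Apim-Subscription-Key header works across the bundled
    # services; swap the path to target a different API.
    return request.Request(
        ENDPOINT + path,
        data=body,
        headers={
            "Ocp-Apim-Subscription-Key": UNIFIED_KEY,
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = cognitive_request("/text/analytics/v2.1/sentiment", b'{"documents": []}')
print(req.full_url)
```

Sending the request (for example with `request.urlopen`) requires a real key and endpoint; the sketch stops at constructing it.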

Get started today by creating a Cognitive Service resource in the Azure portal. To learn more, watch the latest “This Week in Cognitive,” video on using the unified key.

If you haven’t yet started using Cognitive Services for your business, you can try it for free. Visit the Cognitive Services page to learn more.

1 Note: When a unified key resource is provisioned, the services that are regional will be provisioned in the selected region. Services that are non-regional will still be provisioned globally even though they can now be accessed using the unified key and API endpoint.

Dynamic mission planning for drones with Azure Maps

Real-time location intelligence is critical for business operations, from getting real-time road data to building asset-tracking solutions for navigating drone fleets. Today, we’re excited to highlight a customer, AirMap, whose software solutions rely on Azure Maps for real-time location intelligence in a new frontier of technology called dynamic mission planning for drones.

AirMap is the leading global airspace management platform for drones. AirMap’s Unmanned Traffic Management (UTM) platform enables the deployment and operations of safe, efficient, and advanced drone operations for enterprises and drone solution providers. Since 2017, AirMap has been part of the Microsoft Ventures portfolio and has chosen Microsoft Azure as its trusted cloud for its cloud-based UTM platform. AirMap offers open, standardized APIs and SDKs that make it easy for software developers to integrate AirMap’s intelligence services and capabilities into third party applications. This includes situational awareness of flight conditions, airspace advisories, and global airspace regulations. The AirMap developer platform also offers easy access to AirMap’s global network of airspace authorities, who offer notification, authorization, and more to drone operators on the AirMap UTM platform.

Figure 1: AirMap dynamically renders polygons representing different geographic areas subject to airspace regulations.

When faced with the decision of selecting location intelligence services, AirMap didn’t have to venture far with Azure Maps offering world-class geospatial capabilities natively in Azure. This allowed for seamless, secure, and scalable integration with AirMap’s existing Azure solution.

“The speed and performance of Azure Maps is a strong complement to AirMap’s safety-critical airspace intelligence services.”

– Andreas Lamprecht, Chief Technology Officer, AirMap

AirMap utilized the vector tile service (Figure 1) on Azure Maps to create an AirMap contextual airspace plugin for Azure Maps. This plugin allows users to view and interact with AirMap’s contextual airspace advisory layers, rendered on dynamic map tiles from Azure Maps. The Azure Maps custom vector tile service supported AirMap’s high-performance needs of visualizing a large data set with custom data-driven styling. The Azure intelligent cloud platform provides the ideal infrastructure for operating AirMap’s complex and real-time tracking solutions. The AirMap widget for Azure Maps enables developers to include drone-specific data and capabilities in a variety of Azure solutions, which is critical for safe drone operation. Azure Maps developers can further enrich map visualization by adding imagery captured by drones using image layers. Other Azure Maps capabilities include satellite imagery, search, and routing, which can be used to implement solutions for agriculture, construction sites, insurance firms, and many other industries that will increasingly leverage drone technology.

To get started, you can install the AirMap contextual airspace plugin for Azure Maps.

HDInsight now supported in Azure CLI as a public preview

We recently introduced support for HDInsight in Microsoft Azure CLI as a public preview. With the addition of the new HDInsight command group, you can now utilize all of the features and benefits that come with the familiar cross-platform Azure CLI to manage your HDInsight clusters.

Key Features

Cluster CRUD: Create, delete, list, resize, and show properties for your HDInsight clusters.
Script actions: Execute script actions, list and delete persistent script actions, promote ad-hoc script executions to persistent script actions, and show the execution history of script actions on HDInsight clusters.
Operations Management Suite (OMS): Enable, disable, and show the status of OMS/Log Analytics integration on HDInsight clusters.
Applications: Create, delete, list, and show properties for applications on your HDInsight clusters.
Core usage: View available core counts by region before deploying large clusters.

Azure CLI benefits

Cross platform: Use Azure CLI on Windows, macOS, Linux, or the Azure Cloud Shell in a browser to manage your HDInsight clusters with the same commands and syntax across platforms.
Tab completion and interactive mode: Autocomplete command and parameter names as well as subscription-specific details like resource group names, cluster names, and storage account names. Don't remember your 88-character storage account key off the top of your head? Azure CLI can tab complete that as well!
Customize output: Make use of Azure CLI's globally available arguments to show verbose or debug output, filter output using the JMESPath query language, and change the output format between json, tab-separated values, or ASCII tables, and more.
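For example, a JMESPath query such as [?properties.clusterState=='Running'].name filters the JSON that a listing command returns before it reaches your terminal. The same filter, applied in plain Python to a hypothetical payload (the field names below are illustrative), looks like this:

```python
import json

# Hypothetical JSON output of a cluster listing command.
clusters_json = """
[
  {"name": "spark-prod", "properties": {"clusterState": "Running"}},
  {"name": "hadoop-dev", "properties": {"clusterState": "Deleting"}}
]
"""
clusters = json.loads(clusters_json)

# Equivalent of the JMESPath query [?properties.clusterState=='Running'].name
running = [c["name"] for c in clusters
           if c["properties"]["clusterState"] == "Running"]
print(running)  # ['spark-prod']
```

With the CLI's `--query` argument, this filtering happens inside the tool, so you can pipe just the names you need into scripts.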

Getting started

Install Azure CLI for Windows, macOS, or Linux. Alternatively, you can use Azure Cloud Shell to use Azure CLI in a browser.
Log in using the az login command.
Run az account show to view your currently active subscription.

If you want to change your active subscription, run az account set -s <Subscription Name>

Take a look at our reference documentation, “az hdinsight” or run az hdinsight -h to see a full list of supported HDInsight commands and descriptions and start using Azure CLI to manage your HDInsight clusters.

Try HDInsight now

We hope you will take full advantage of HDInsight support in Azure CLI and we are excited to see what you will build with Azure HDInsight. Read this developer guide and follow the quick start guide to learn more about implementing these pipelines and architectures on Azure HDInsight. Stay up-to-date on the latest Azure HDInsight news and features by following us on Twitter #AzureHDInsight and @AzureHDInsight. For questions and feedback, reach out to AskHDInsight@microsoft.com.

About HDInsight

Azure HDInsight is an easy, cost-effective, enterprise-grade service for open source analytics that enables customers to easily run popular open source frameworks including Apache Hadoop, Spark, Kafka, and others. The service is available in 27 public regions and Azure Government Clouds in the US and Germany. Azure HDInsight powers mission-critical applications in a wide variety of sectors and enables a wide range of use cases including ETL, streaming, and interactive querying.

Azure Data Explorer plugin for Grafana dashboards

Are you using Azure Data Explorer to query vast amounts of data? Are you following business metrics and KPIs with Grafana dashboards? Creating a Grafana data source with Azure Data Explorer has never been easier.

Grafana is a leading open source software designed for visualizing time series analytics. It is an analytics and metrics platform that enables you to query and visualize data and create and share dashboards based on those visualizations. Combining Grafana’s beautiful visualizations with Azure Data Explorer’s snappy ad hoc queries over massive amounts of data, creates impressive usage potential.

The Grafana and Azure Data Explorer teams have created a dedicated plugin which enables you to connect to and visualize data from Azure Data Explorer using its intuitive and powerful Kusto Query Language. In just a few minutes, you can unlock the potential of your data and create your first Grafana dashboard with Azure Data Explorer.

Once you build an Azure Data Explorer data source in Grafana, you can create a dashboard panel and select Edit to add your query.

Kusto Query Language is available for executing queries in the Metrics tab. The built-in IntelliSense, which proposes query term completion, assists in query formulation. Run the query to visualize the data.

GithubEvent
| where Repo.name has 'Microsoft'
| summarize TotalEvents = count() by bin(CreatedAt, 30d)
| order by CreatedAt asc
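To make the aggregation concrete, here is what summarize ... by bin(CreatedAt, 30d) computes, sketched in plain Python over illustrative timestamps: each event is floored to the start of a fixed 30-day bin and the events per bin are counted. (The epoch alignment here is an assumption for the sketch; Kusto's actual bin anchoring may differ.)

```python
from collections import Counter
from datetime import datetime, timedelta

def bin_start(ts: datetime, size: timedelta) -> datetime:
    # Floor the timestamp to the start of its fixed-size, epoch-aligned bin.
    epoch = datetime(1970, 1, 1)
    return epoch + ((ts - epoch) // size) * size

# Illustrative event timestamps standing in for the CreatedAt column.
events = [datetime(2019, 1, 2), datetime(2019, 1, 20), datetime(2019, 3, 1)]

counts = Counter(bin_start(ts, timedelta(days=30)) for ts in events)
for start in sorted(counts):
    print(start.date(), counts[start])
```

Grafana then plots one point per bin, which is what turns a raw event table into a time series panel.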

For more details on visualizing data from Azure Data Explorer in Grafana, please visit our documentation, “Visualize data from Azure Data Explorer in Grafana.” It walks through the step-by-step process needed to set up Azure Data Explorer as a data source for Grafana, and then visualizes data from a sample cluster.

Next steps

In this blog, we depict the benefits of using Grafana for building dashboards on top of your Azure Data Explorer datasets. Additional connectors and plugins to analytics tools and services will be added in the weeks to come. Stay tuned for more updates.

To find out more about Azure Data Explorer you can:

Try Azure Data Explorer in preview now.
Find pricing information for Azure Data Explorer.
Access documentation for Azure Data Explorer.
