Azure.Source – Volume 67

Now in preview

Introducing IoT Hub device streams in public preview

Azure IoT Hub device streams is a new PaaS service that addresses the need for security and organization policy compliance by providing a foundation for secure end-to-end connectivity to IoT devices. At its core, an IoT Hub device stream is a data transfer tunnel that provides connectivity between two TCP/IP-enabled endpoints: one side of the tunnel is an IoT device and the other side is a customer endpoint that intends to communicate with the device. IoT Hub device streams address end-to-end connectivity needs by leveraging an IoT Hub cloud endpoint that acts as a proxy for application traffic exchanged between the device and service. IoT Hub device streams are particularly helpful when devices are placed behind a firewall or inside a private network.
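At its simplest, the tunnel concept is a byte relay spliced between two TCP endpoints. The sketch below illustrates that idea locally with plain sockets; it is a conceptual illustration only, not the IoT Hub device streams API (in the service, the relay point is an IoT Hub cloud endpoint with authentication on both sides, and the port numbers and helper names here are invented):

```python
import socket
import threading

def pump(src, dst):
    """Copy bytes one way until the source closes."""
    while True:
        data = src.recv(4096)
        if not data:
            break
        dst.sendall(data)

def relay(listen_port, device_host, device_port):
    """Accept one service-side connection and splice it to the device endpoint."""
    server = socket.socket()
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("127.0.0.1", listen_port))
    server.listen(1)
    service, _ = server.accept()
    device = socket.create_connection((device_host, device_port))
    # Forward application traffic in both directions, which is the role the
    # IoT Hub streaming endpoint plays between service and device.
    threading.Thread(target=pump, args=(device, service), daemon=True).start()
    pump(service, device)
```

Because both sides dial out to the relay point, neither endpoint needs an inbound firewall rule, which is why this pattern suits devices behind firewalls or on private networks.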

Azure IoT Hub Device Streams

Announcing the preview of OpenAPI Specification v3 support in Azure API Management

Azure API Management has just introduced preview support of OpenAPI Specification v3 – the latest version of the broadly used open-source standard for describing APIs. We based the implementation of this feature on the OpenAPI.NET SDK. OpenAPI Specification is a widely adopted industry standard that enables you to abstract your APIs from their implementation in a language-agnostic and easy-to-understand format. The wide adoption of OpenAPI Specification (formerly known as Swagger) has resulted in an extensive tooling ecosystem. If your APIs are defined in an OpenAPI Specification file, you can easily import them into Azure API Management (APIM). APIM helps organizations publish APIs to external, partner, and internal developers to unlock the potential of their data and services. Once the backend API is imported into APIM, the APIM API becomes a façade for the backend API.
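For reference, an OpenAPI v3 description is just a structured document whose root declares the specification version (v3 uses an "openapi" field where Swagger 2.0 used "swagger"). A minimal sketch in Python, serializable to JSON before import; the API title and path are made up for illustration:

```python
import json

# A minimal, hypothetical OpenAPI v3 description. Real documents would add
# servers, schemas, security, and richer operation metadata.
minimal_api = {
    "openapi": "3.0.1",   # v3 documents declare "openapi", not "swagger"
    "info": {"title": "Demo Conference API", "version": "1.0.0"},
    "paths": {
        "/speakers": {
            "get": {
                "summary": "List conference speakers",
                "responses": {"200": {"description": "OK"}},
            }
        }
    },
}

def is_openapi_v3(doc):
    """Cheap check mirroring how importers distinguish v3 from Swagger 2.0."""
    return doc.get("openapi", "").startswith("3.")

print(is_openapi_v3(minimal_api))          # True
print(is_openapi_v3({"swagger": "2.0"}))   # False
print(json.dumps(minimal_api, indent=2)[:40])
```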

Regulatory compliance dashboard in Azure Security Center now available

The regulatory compliance dashboard in Azure Security Center (ASC) provides insight into your compliance posture for a set of supported standards and regulations, based on continuous assessments of your Azure environment. The ASC regulatory compliance dashboard is designed to help you improve your compliance posture by resolving recommendations directly within the dashboard. Click through to each recommendation to discover its details, including the resources for which the recommendation should be implemented. The regulatory compliance dashboard preview is available within the standard pricing tier of Azure Security Center, and you can try it for free for the first 30 days.

Public preview: Read replicas in Azure Database for PostgreSQL

You can now replicate data from a single Azure Database for PostgreSQL server (master) to up to five read-only servers (read replicas) within the same Azure region. This feature uses PostgreSQL's native asynchronous replication. With read replicas, you can scale out your read-intensive workloads. Read replicas can also be used for BI and reporting scenarios. You can choose to stop replication to a replica, in which case it becomes a normal read/write server. Replicas are new servers that can be managed in similar ways as normal standalone Azure Database for PostgreSQL servers. For each read replica, you are billed for the provisioned compute in vCores and provisioned storage in GB/month.
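Because replication is asynchronous and replicas are read-only, applications typically split traffic themselves: writes target the master's connection string, while reads fan out across replica connection strings. A minimal routing sketch; the class and the server hostnames are illustrative, not an Azure API:

```python
import itertools

class PostgresRouter:
    """Send writes to the master and round-robin reads across replicas."""

    def __init__(self, master, replicas):
        self.master = master
        self._replicas = itertools.cycle(replicas) if replicas else None

    def connection_for(self, statement):
        verb = statement.lstrip().split(None, 1)[0].upper()
        if verb == "SELECT" and self._replicas is not None:
            return next(self._replicas)
        return self.master  # writes and DDL always target the master

router = PostgresRouter(
    master="mydb.postgres.database.azure.com",
    replicas=["mydb-replica1.postgres.database.azure.com",
              "mydb-replica2.postgres.database.azure.com"],
)
print(router.connection_for("SELECT * FROM orders"))        # first replica
print(router.connection_for("INSERT INTO orders VALUES (1)"))  # master
```

Note that asynchronous replication means a replica can lag the master, so reads that must observe the latest write should still go to the master.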

Now generally available

HDInsight Tools for Visual Studio Code now generally available

The Azure HDInsight Tools for Visual Studio Code are now generally available on Windows, Linux and Mac. These tools provide best-in-class authoring experiences for Apache Hive batch jobs, interactive Hive queries, and PySpark jobs. The tools feature a cross-platform, lightweight, keyboard-focused code editor which removes constraints and dependencies on a platform. Azure HDInsight Tools for Visual Studio Code is available for download from Visual Studio Marketplace.

Azure Service Bus and Azure Event Hubs expand availability

Availability Zones is a high-availability offering that protects applications and data from datacenter failures. Availability Zones support is now generally available for Azure Service Bus Premium and Azure Event Hubs Standard in every Azure region that has zone-redundant datacenters. Note that this feature won’t work with existing namespaces—you will need to provision new namespaces to use it. Availability Zones support for Azure Service Bus Premium and Azure Event Hubs Standard is available in the following regions: East US 2, West US 2, West Europe, North Europe, France Central, and Southeast Asia.

Azure Cognitive Services adds important certifications, greater availability, and new unified key

Over the past six months, we added 31 certifications across services in Cognitive Services and will continue to add more in 2019. With these certifications, hundreds of healthcare, manufacturing, and financial use cases are now supported. In addition, Cognitive Services now offers more assurances for where customer data is stored at rest. These assurances have been enabled by graduating several Cognitive Services to Microsoft Azure Core Services. Also, the global footprint for Cognitive Services has expanded over the past several months — going from 15 to 25 Azure data center regions. Recently, we launched a new bundle of multiple services, enabling the use of a single API key for most of our generally available services: Computer Vision, Content Moderator, Face, Text Analytics, Language Understanding, and Translator Text.

Also generally available

Access generally available functionality in Azure Database Migration Service to migrate Amazon RDS for SQL Server, PostgreSQL, and MySQL to Azure while the source database remains online during migration:

Support for Amazon RDS for SQL Server to Azure SQL Database online migrations
Support for Amazon RDS for PostgreSQL to Azure Database for PostgreSQL online migrations
Support for Amazon RDS for MySQL to Azure Database for MySQL online migrations

News and updates

Microsoft and Citus Data: Providing the best PostgreSQL service in the cloud

On Thursday, Microsoft announced the acquisition of Citus Data, an innovative open source extension to scale out PostgreSQL databases without the need to re-architect existing applications. Citus Data delivers unparalleled performance and scalability by intelligently distributing data and queries across multiple nodes, which makes sharding simple. Because Citus Data is packaged as an extension (not a fork) to PostgreSQL, customers can take advantage of all the innovations in community PostgreSQL with queries that are significantly faster compared to proprietary implementations of PostgreSQL. More information is available in this post by Rohan Kumar, Corporate Vice President, Azure Data: Microsoft acquires Citus Data, re-affirming its commitment to Open Source and accelerating Azure PostgreSQL performance and scale.

Export data in near real-time from Azure IoT Central

You can now export data in near real-time from your Azure IoT Central app to your own Azure Event Hubs and Azure Service Bus instances. Use the new features in Continuous Data Export to export data to your own Azure Event Hubs, Azure Service Bus, and Azure Blob Storage instances for custom warm path and cold path processing and analytics on your IoT data. Watch this episode of the Internet of Things Show to learn how to export device data to your Azure Blob Storage, Azure Event Hubs, and Azure Service Bus instances using continuous data export in IoT Central. You’ll also learn how to set up continuous export to send measurements, devices, and device template data to your destination, and how to use this data.

Export data from your IoT Central app to Azure Event Hubs and Azure Service Bus

HDInsight Metastore Migration Tool open source release now available

Microsoft Azure HDInsight Metastore Migration Tool (HMMT) is an open-source shell script that you can use to apply bulk edits to the Hive metastore. HMMT is a low-latency, no-installation solution for challenges related to data migrations in Azure HDInsight. The blog post positions HMMT with respect to the Hive metastore and Hive storage patterns, describes the tool's design and initial setup steps, and finally walks through some sample migrations that demonstrate its usage and value.

Azure Backup now supports PowerShell and ACLs for Azure Files

Azure Backup now supports preserving and restoring New Technology File System (NTFS) access control lists (ACLs) for Azure Files in preview. You can now script your backups for Azure file shares using PowerShell. Use the PowerShell commands to configure backups, take on-demand backups, or even restore files from your file shares protected by Azure Backup. Using the “Manage backups” capability in the Azure Files portal, you can take on-demand backups, restore file shares or individual files and folders, and even change the policy used for scheduling backups. You can also go to the Recovery Services vault that backs up the file share and edit the policies used to back up Azure file shares. Backup alerts for the backup and restore jobs of Azure file shares are enabled, which lets you configure notifications of job failures to chosen email addresses.

Analyze data in Azure Data Explorer using KQL magic for Jupyter Notebook

Jupyter Notebook enables you to create and share documents that contain live code, equations, visualizations, and explanatory text. Its uses include data cleaning and transformation, numerical simulation, statistical modeling, and machine learning. KQL magic commands extend the functionality of the Python kernel in Jupyter Notebook, enabling you to write KQL queries natively and query data from Microsoft Azure Data Explorer. You can easily interchange between Python and KQL, and visualize data using the rich Plotly library integrated with KQL render commands. KQL magic supports Azure Data Explorer, Application Insights, and Log Analytics as data sources to run queries against. KQL magic also works with Azure Notebooks, Jupyter Lab, and the Visual Studio Code Jupyter extension.

Hyperledger Fabric updates now available

Hyperledger Fabric is an enterprise-grade distributed ledger that provides modular components, enabling customization of components to fit various scenarios. You can now download from the Azure Marketplace an updated template for Hyperledger Fabric that supports Hyperledger Fabric version 1.3. The automation provided by this solution is designed to make it easier to deploy, configure and govern a multi-member consortium using the Hyperledger Fabric software stack. This episode of Block Talk walks through the Hyperledger Fabric ledger and discusses the core features you can use to customize the deployment of Hyperledger Fabric in your environment. 

Hyperledger Fabric on Azure

Additional news and updates

Azure FXT Edge Filer (Avere Update)
M-series virtual machines (VMs) are now available in the Australia Central region.

Technical content

Connecting Node-RED to Azure IoT Central

In this post, Peter Provost, Principal PM Manager, Azure IoT, shows how simple it is to connect a temperature/humidity sensor to Azure IoT Central using a Raspberry Pi and Node-RED. Node-RED is a flow-based, drag and drop programming tool designed for IoT. It enables the creation of robust automation flows in a web browser, simplifying IoT project development.

Getting started with Azure Blueprints

Azure Blueprints (currently in Preview) helps you define which policies – including policy initiatives – RBAC settings, and ARM templates to apply on a subscription basis, making it easy to set configurations at scale, knowing that any resources created in those subscriptions will comply with those settings (or will show as non-compliant in the case of audit policies). Sonia provides an intro to the service, showing how they group configuration controls, like Azure Policy and RBAC, and then uses an example scenario to demonstrate how and why to use Blueprints to simplify compliance and governance.

RStudio Server on Azure

RStudio Server Pro, the premier IDE for the R programming language, is now available on the Azure Marketplace, letting you launch it on a virtual machine of your choice. David details the benefits of this new offering and also lists alternative solutions for developers interested in running a self-managed instance of RStudio Server.

Sneak Peek: Making Petabyte Scale Data Actionable with ADX Part 2

To celebrate the recent announcement of free private repos in GitHub, Ari released a sneak peek of what he's working on for Part 2 of his "Making Petabyte Scale Data Actionable with Azure Data Explorer" series.

Azure shows

The Azure Podcast | Episode 263 – Partner Spotlight – Aqua Security

Liz Rice, Technical Evangelist at Aqua Security and master of all things security in Kubernetes, talks to us about her philosophy on security and gives us some great tips and tricks on how to secure your container workloads in Azure, on-premises, or any cloud.


Azure Friday | An intro to Azure Cosmos DB JavaScript SDK 2.0

Chris Anderson joins Donovan Brown to discuss Azure Cosmos DB JavaScript SDK 2.0, which adds support for multi-region writes, a new fluent-style object model—making it easier to reference Azure Cosmos DB resources without an explicit URL—and support for promises and other modern JavaScript features. It is also written in TypeScript and supports the latest TypeScript 3.0.

AI Show | Learn by Doing: A Look at Samples

Gain an understanding of the landscape of sample projects available for Cognitive Services.

Five Things | Five Reasons Why You Should Check Out Cosmos DB

What does a giant Jenga tower have in common with NoSQL databases? NOTHING. But we're giving you both anyway. In this episode, Burke and Jasmine Greenaway bring you five reasons that you should check out Cosmos DB today. They also play a dangerous game of Jenga with an oversized tower made out of 2x4s, and someone nearly gets crushed.

The DevOps Lab | Verifying your Database Deployment with Azure DevOps

While at Microsoft Ignite | The Tour in Berlin, Damian speaks to Microsoft MVP Houssem Dellai about some options for deploying your database alongside your application. Houssem shows a few different ways to deploy database changes, including a clever pre-production verification process for ensuring your production deployment will succeed. Database upgrades are often the scariest part of your deployment process, so having a robust check before getting to production is very important.

Overview of Managed Identities on Azure Government

In this episode of the Azure Government video series, Steve Michelotti talks with Mohit Dewan, of the Azure Government Engineering team, about Managed Identities on Azure Government. Whether you’re storing certificates, connection strings, keys, or any other secrets, Managed Identities is a valuable tool to have in your toolbox. Watch this video to see how quick and easy it is to get up and running with Managed Identities in Azure Government.

Azure Tips and Tricks | How to create a container image with Docker

In this edition of Azure Tips and Tricks, learn how to create a container image to run applications with Docker. You’ll see how to create a folder inside a container and create a script to execute it.

Azure Tips and Tricks | How to manage multiple accounts, directories, and subscriptions in Azure

Discover how to easily manage multiple accounts, directories, and subscriptions in the Microsoft Azure portal. In this video, you'll learn how to log in to the portal and manage multiple accounts, establish the contexts between accounts and directories, and filter and scope the portal at a few different levels to your billable subscriptions.

The Azure DevOps Podcast | Paul Hacker on DevOps Processes and Migrations – Episode 020

In this episode, Paul Hacker is joining the Azure DevOps Podcast to discuss DevOps processes and migrations. Paul has some really interesting perspectives on today’s topic and provides some valuable insights on patterns that are emerging in the space, steps to migrating to Azure DevOps, and common challenges (and how to overcome them). Listen to his insight on migrations, DevOps processes, and more.


Events

Microsoft Ignite | The Tour

Learn new ways to code, optimize your cloud infrastructure, and modernize your organization with deep technical training. Join us at the place where developers and tech professionals continue learning alongside experts. Explore the latest developer tools and cloud technologies and learn how to put your skills to work in new areas. Connect with our community to gain practical insights and best practices on the future of cloud development, data, IT, and business intelligence. Find a city near you and register today. In February, the tour visits London, Sydney, Hong Kong, and Washington, DC.

Customers, partners, and industries

Security for healthcare through cloud agents and virtual patching

For a healthcare organization, security and protection of data is a primary value, but solutions can be attacked from a variety of vectors such as malware, ransomware, and other exploits. The attack surface of an organization can be complex; email and web browsers are immediate targets of sophisticated hackers. One Microsoft Azure partner, XentIT (pronounced ex-ent-it), is devoted to protecting healthcare organizations despite the complexity of the attack surface. XentIT leverages two other security services with deep capabilities and adds its own expertise to create a dashboard-driven security solution that lets healthcare organizations better monitor and protect all assets.

AI & IoT Insider Labs: Helping transform smallholder farming

Microsoft’s AI & IoT Insider Labs was created to help all types of organizations accelerate their digital transformation. Learn how AI & IoT Insider Labs is helping one partner, SunCulture, leverage new technology to provide solar-powered water pumping and irrigation systems for smallholder farmers in Kenya. SunCulture, a 2017 Airband Grant Fund winner, believed sustainable technology could make irrigation affordable enough that even the poorest farmers could use it without further aggravating water shortages. The company set out to build an IoT platform to support a pay-as-you-grow payment model that would make solar-powered precision irrigation financially accessible for smallholders across Kenya.

A Cloud Guru | Azure This Week – 25 January 2019

This time on Azure This Week, Lars talks about Azure Monitor logs for Grafana in public preview, the new Azure portal landing page, and why it is time to move on from Windows Server 2008.

Source: Azure

AI & IoT Insider Labs: Helping transform smallholder farming

This blog post was authored by Peter Cooper, Senior Product Manager, Microsoft IoT.

From smart factories and smart cities to virtual personal assistants and self-driving cars, artificial intelligence (AI) and the Internet of Things (IoT) are transforming how people around the world live, work, and play.

But fundamentally changing the ways people, devices, and data interact is not simple or easy work. Microsoft’s AI & IoT Insider Labs was created to help all types of organizations accelerate their digital transformation. Member organizations around the world get access to support both technology development and product commercialization, for everything from hardware design to manufacturing to building applications and turning data into insights using machine learning.

Here’s how AI & IoT Insider Labs is helping one partner, SunCulture, leverage new technology to provide solar-powered water pumping and irrigation systems for smallholder farmers in Kenya.

Affordable irrigation for all

Kenyan smallholdings face some of the most challenging growing conditions in the world. 97 percent rely on natural rainfall to support their crops and livestock—and the families that depend on them. But just 17 percent of the country’s farmland is suitable for rainfed agriculture. Electricity is unavailable in most places and diesel power is often financially out of reach, so farmers spend hours every day pumping and transporting water. This limits them to low-value crops like maize and small yields, all because they lack the resources to irrigate their crops. Additionally, irrigation technologies have an important role to play in reducing the impact agriculture has on the earth’s freshwater resources, especially in Africa.

SunCulture, a 2017 Airband Grant Fund winner, believed sustainable technology could make irrigation affordable enough that even the poorest farmers could use it without further aggravating water shortages. The company set out to build an IoT platform to support a pay-as-you-grow payment model that would make solar-powered precision irrigation financially accessible for smallholders across Kenya.

How SunCulture’s solution works

SunCulture’s RainMaker2 pump combines the energy efficiency of solar power with the effectiveness of precision irrigation, making it cheaper and easier for farmers to grow high-quality fruits and vegetables. Using the energy of the sun, the SunCulture system pulls water from any source—lake, stream, well, etc.—and pumps it directly to the farm with sprinklers and drip irrigation.

This cutting-edge solution combines ClimateSmart™ solar and lithium-ion energy storage technology with cloud-based remote monitoring and optimization software developed with support from AI & IoT Insider Labs. It’s a powerful platform that makes it simple and cheap to deploy off-grid energy and connected solutions.

Farmers get the information they need to make good irrigation decisions at scale, without the costs involved in sending agronomy experts into the field. How? SunCulture processes a steady flow of sensor data, like soil moisture, pump efficiency, solar battery storage, and other factors, that is analyzed within Microsoft Azure’s cloud environment. This sensor data is combined with data from SunCulture’s network of 2,000 hyperlocal weather stations to leverage Azure machine learning tools and provide simple, real-time, precision irrigation recommendations directly to the farmer via text messaging (SMS).
 
The platform also enables real-time locking and unlocking of devices that makes the pay-as-you-grow model feasible. The platform is smart enough to shut off pumps automatically when power levels are getting low on a cloudy day, or when optimal irrigation thresholds are reached.
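The automatic shut-off described above reduces to simple threshold rules evaluated against each telemetry reading. A minimal sketch of the idea (the field names and threshold values are invented for illustration; SunCulture's actual logic is not public):

```python
def pump_should_run(telemetry,
                    min_battery_pct=20,
                    target_soil_moisture_pct=60):
    """Decide whether to keep the pump on for one telemetry reading.

    Shuts off when solar battery storage runs low (e.g. on a cloudy day)
    or when the optimal irrigation threshold has been reached.
    """
    if telemetry["battery_pct"] < min_battery_pct:
        return False   # conserve power
    if telemetry["soil_moisture_pct"] >= target_soil_moisture_pct:
        return False   # optimal irrigation threshold reached
    return True

print(pump_should_run({"battery_pct": 80, "soil_moisture_pct": 35}))  # True
print(pump_should_run({"battery_pct": 10, "soil_moisture_pct": 35}))  # False
```

In production, rules like these would run against the stream of sensor readings arriving in the cloud, with the resulting lock/unlock commands sent back down to the device.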

How farmers are benefiting from SunCulture

SunCulture's pay-as-you-grow revenue model allows farmers to make small, monthly payments until they own their precision sensor-based irrigation system outright, empowering even the region’s poorest smallholder farmers to take control of their environment.

On average, SunCulture customers enjoy a 300 percent increase in crop yields and a 10x increase in annual income. Farmers with livestock double their milk yield, earning an extra $3.50/day in income from milk alone. The 17 hours per week they used to spend moving water manually is now directed to better tending their crops and livestock. At a price point of $1.25/day for the RainMaker2 with ClimateSmart™, a farmer’s investment is recouped quickly, and profit starts flowing from increased agricultural productivity.

Download SunCulture’s case study to learn more.

Hyperledger Fabric updates now available

In late 2017, a growing number of customers were interested in using Hyperledger Fabric (HLF) to build their applications on Azure. At that time, we announced support for this popular offering through the Azure Marketplace. Over the early part of this year we added support for Visual Studio Code (VS Code) and the Go extension for VS Code, which enabled users to both deploy an HLF network and write chain code in Go on Windows, Mac, and Linux.

We are happy to share a series of new enhancements for developers building solutions using Hyperledger Fabric on Azure.

Updated template for Hyperledger Fabric 1.3

Today, we’re sharing an updated template for Hyperledger Fabric that you can download from the Azure Marketplace. This version includes:

Hyperledger Fabric version 1.3 support
Unified template to allow both single VM (multi container) development and multi VM (scale out) models
The ability to connect multiple subscriptions via a private connection, automated by the template
Orderers run by using a full, highly available Kafka backend for production quality deployments
Peers run using either LevelDB or CouchDB for persistence and to enable analytics

Complete documentation for the architecture can be found on GitHub. This is the initial release of the unified template. Future roadmap items include:

Network joining automation
Hyperledger Explorer integration

On behalf of the team, thank you for choosing Azure. We’re excited to bring these new assets to the community and we’re looking forward to seeing what you build.

Analyze data in Azure Data Explorer using KQL magic for Jupyter Notebook

Exploring data is like solving a puzzle. You create queries and receive instant satisfaction when you discover insights, just like adding pieces to complete a puzzle. Imagine you have to repeat the same analysis multiple times, use libraries from an open-source community, share your steps and output with others, and save your work as an artifact. A notebook gives you one place to write your queries, add documentation, and save your work as output in a reusable format.

Jupyter Notebook allows you to create and share documents that contain live code, equations, visualizations, and explanatory text. Its uses include data cleaning and transformation, numerical simulation, statistical modeling, and machine learning.

We are excited to announce KQL magic commands, which extend the functionality of the Python kernel in Jupyter Notebook. KQL magic allows you to write KQL queries natively and query data from Microsoft Azure Data Explorer. You can easily interchange between Python and KQL, and visualize data using the rich Plotly library integrated with KQL render commands. KQL magic supports Azure Data Explorer, Application Insights, and Log Analytics as data sources to run queries against.

Use the line magic “%kql” to run a single-line query, or use the cell magic “%%kql” to run multi-line queries. In the following example, we run a multi-line query and render a pie chart using the Plotly Python library:
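A sketch of what those notebook cells look like (the `help` cluster and `Samples` database follow the public Kqlmagic examples; treat the exact connection string and query as illustrative):

```text
%reload_ext Kqlmagic
%kql kusto://code;cluster='help';database='Samples'

%%kql
StormEvents
| summarize event_count = count() by State
| top 5 by event_count
| render piechart
```

The render command draws the chart through the integrated Plotly support, and the last result set can then be pulled into Python (for example via the result object's `to_dataframe()` helper) for further processing.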

If you are a Python user, you can place the result set into a pandas DataFrame.

Common use cases

Data science: Data scientists use KQL magic to analyze and visualize data from Azure Data Explorer, easily interchange Python code with KQL queries to experiment with, train, and score machine learning models, and save notebooks as artifacts.
Data analytics: Use KQL magic to query, analyze, and visualize data, with no Python knowledge needed. For Python users, easily query data from Azure Data Explorer and use various open-source libraries from the Python ecosystem.
Business reviews: Use KQL magic for business and product reviews. Create the notebook once and refresh with new values every time you use it.
Incident response: Use KQL magic to create operational documents, chain up your queries for easy investigation, and save the notebook for reproducibility and as an artifact for root cause analysis (RCA).
Security analytics: Query data from Azure Data Explorer and use the rich Python ecosystem for security analytics to analyze and visualize your data. For example, one of the internal Microsoft security teams uses KQL magic with Jupyter for standard analysis patterns to triage security alerts; they have been transforming incident response playbooks into parameterized Jupyter Notebooks to automate repetitive investigation workflows. A sample notebook is available in the Azure Data Explorer KQL magic demo and in the GitHub repo under Threat-hunting-with-notebooks.

Getting started

These capabilities make it easy to explore and analyze your data interactively. For additional documentation and examples of KQL magic, see “Analyze data using Jupyter Notebook and KQL magic.”

Security for healthcare through vigilant agents and virtual patching

Healthcare organizations depend on data-driven decisions. To enable better decisions and better health outcomes, healthcare organizations are moving to the cloud. There, the latest advances in artificial intelligence, machine learning, and analytics can be more easily tested and implemented. For a healthcare organization, security and protection of data is a primary value, but solutions can be attacked from a variety of vectors such as malware, ransomware, and other exploits. The attack surface of an organization can be complex; email and web browsers are immediate targets of sophisticated hackers. One Microsoft Azure partner is devoted to protecting healthcare organizations despite the complexity of the attack surface. XentIT (pronounced ex-ent-it) leverages two other security services with deep capabilities and adds its own expertise to create a dashboard-driven security solution that lets healthcare organizations better monitor and protect all assets.

Problem: Slow information velocity

Anyone in a critical health condition wants their medical professionals to be up to date. Speed matters, and making a medical decision requires all sources of information to be available as soon as possible. The inability to quickly access and process patient data due to outdated infrastructure may result in a life or death situation.

Solution: Agents and virtual patching

The healthcare cloud security stack (HCSS) for Azure helps healthcare entities modernize their IT infrastructure while maintaining focus on cloud security and compliance. The unified dashboard of HCSS provides a single pane of glass into the vulnerabilities identified by Qualys, the number and types of threats stopped by Trend Micro Deep Security, and intelligence for further investigation and remediation by security analysts and engineers. The unified stack eliminates the overhead of security automation and orchestration after migration to the cloud.

The figure below shows the architecture of a solution built on Azure with the XentIT dashboard monitoring the components, and the Qualys service built in.

Benefits

Full implementation of Qualys Vulnerability Management, Cloud Agents, and Trend Micro™ Deep Security™.
Extensive environment configuration.
A unified dashboard view of security protection, tailored for healthcare organizations.
Optimized for Microsoft Azure to ensure flexible, scalable protection of your operating systems, applications, and data against vulnerabilities. Detect suspicious activity, stop targeted attacks, and meet compliance from within one security console.
Vulnerability Management continuously scans and identifies vulnerabilities with Six Sigma (99.99966 percent) accuracy, protecting your Azure infrastructure. The Cloud Agent is lightweight, self-updating, and provides continuous data collection for IT security and compliance applications.

Microsoft technologies

The solution uses several Azure services.

Azure SQL Database
Data Encryption — such as SQL Server Always Encrypted
Key Vault
Load Balancer
Azure Monitor
Network Security Groups
Storage
Web Application Firewall (WAF)

“HCSS’ continuous monitoring, fast detection, and efficient remediation of vulnerabilities enables healthcare organizations to effectively secure workloads on an ongoing basis, empowering them to fully realize the benefits of Azure, accelerate the delivery of better patient care, reduce healthcare costs, and improve the experience of both patients and healthcare professionals.”

– Hector Rodriguez, Director, Worldwide Commercial Health at Microsoft

To find out more about the solution, go to XentIT in the marketplace and select “Contact me.”
Source: Azure

Microsoft and Citus Data: Providing the best PostgreSQL service in the cloud

Today, we announced the acquisition of Citus Data, creator of an innovative open source extension that scales out PostgreSQL databases without the need to re-architect existing applications. Citus delivers unparalleled performance and scalability by intelligently distributing data and queries across multiple nodes, which makes sharding simple. Because Citus is packaged as an extension (not a fork) of PostgreSQL, customers can take advantage of all the innovations in community PostgreSQL, with queries that are significantly faster than in proprietary implementations of PostgreSQL.

In this blog, I would like to share how we will work closely with the Citus Data team to bring a differentiated PostgreSQL offering to our customers. Since the beginning of our journey bringing PostgreSQL to Microsoft Azure, our objective has been to provide customers with the best enterprise-grade managed PostgreSQL service in the cloud. We are proud to offer innovations on Azure Database for PostgreSQL such as built-in high availability, which can save customers more than half the cost versus other clouds, and the ability to scale compute up or down in seconds, helping customers easily adjust to changes in workload demand. Additionally, built-in intelligent features such as auto-tuning help customers further lower their TCO with customized recommendations and insights that maximize the performance of their PostgreSQL databases. These benefits, coupled with built-in security and compliance, Azure's global reach, and the Azure IP Advantage, free up more time for customers to focus on their business and applications. We pair all this value with a community-based engine model, so our customers are not locked in.

As open source relational databases grow on Azure, we continue to see accelerated adoption from customers across various industry verticals serving diverse, mission-critical application patterns ranging from complex analytical applications for digital marketing to machine learning based IoT applications. These applications demand low latency and high query processing throughput, and our investment in Citus Data is a powerful accelerant to deliver high performance and scalable distributed databases.

PostgreSQL is one of the fastest growing open source relational database engines that is loved and embraced by enterprise developers. Citus Data complements Microsoft’s approach to community PostgreSQL with its innovative open source extension built on PostgreSQL. With Citus Data technology, Microsoft will be able to offer customers unparalleled performance and scale for their PostgreSQL databases on top of the innovative, intelligent and secure Azure Database for PostgreSQL.

We are very excited to welcome the Citus Data team to Microsoft and to build on their culture of contribution to the open source community! We look forward to accelerating an open PostgreSQL ecosystem and can't wait to see how this technology advances our customers' business needs.

Check out the blog post from Rohan Kumar, CVP of the Azure Data group, to learn more about this exciting development.
Source: Azure

Regulatory compliance dashboard in Azure Security Center now available

Meeting regulatory compliance obligations and complying with all the requirements of benchmark standards can be a significant challenge in a cloud or hybrid environment. Identifying which assessments to perform, evaluating the status, and resolving the gaps can be a very daunting task. Azure Security Center (ASC) now helps streamline this process with the new regulatory compliance dashboard, which was recently released to public preview.

The regulatory compliance dashboard provides insight into your compliance posture for a set of supported standards and regulations, based on continuous assessments of your Azure environment.

The assessments performed by Azure Security Center analyze risk factors in your hybrid cloud environment in accordance with security best practices. These assessments are mapped to selective compliance controls from a supported set of standards. In the regulatory compliance dashboard, you get a single view of the status of all assessments within your environment, in the context of a particular standard or regulation. As you act on the recommendations and reduce risk factors in your environment, you can see your compliance posture improve.

Assess and improve your compliance posture

In the Azure Security Center regulatory compliance blade, you can get an overview of key portions of your compliance posture with respect to a set of supported standards. Currently supported standards are Azure CIS, PCI DSS 3.2, ISO 27001, and SOC TSP.

In the dashboard, you will find your overall compliance score and the number of passing versus failing assessments for each standard. You can then focus your attention on the gaps in compliance for a standard or regulation that is important to you.
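To make the pass/fail roll-up concrete, here is a minimal sketch of how a per-standard summary of this kind could be computed. The function name, the input shape, and the simple pass-rate score are all illustrative assumptions; this is not ASC's actual scoring model.

```python
from collections import Counter

def compliance_summary(assessments):
    """Roll up (standard, passed) assessment results into per-standard
    pass/fail counts and a simple pass-rate score.

    Illustrative only: the input shape and the percentage score are
    assumptions, not ASC's documented scoring formula.
    """
    counts = {}
    for standard, passed in assessments:
        c = counts.setdefault(standard, Counter())
        c["passed" if passed else "failed"] += 1
    return {
        standard: {
            "passed": c["passed"],
            "failed": c["failed"],
            "score": round(100 * c["passed"] / (c["passed"] + c["failed"]), 1),
        }
        for standard, c in counts.items()
    }

example = compliance_summary([
    ("PCI DSS 3.2", True),
    ("PCI DSS 3.2", False),
    ("ISO 27001", True),
])
```

As you resolve failing recommendations, the passed count rises and the per-standard score improves, mirroring how the dashboard's posture view updates as you act on recommendations.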

The ASC regulatory compliance dashboard is designed to help you improve your compliance posture by resolving recommendations directly within the dashboard. Click through to each recommendation to discover its details, including the resources for which the recommendation should be implemented.

The information provided by the regulatory compliance dashboard can be very useful for providing evidence to internal and external auditors as to your compliance status with the supported standards. Stay tuned for additional features, like the ability to create and export reports that can be readily shared with stakeholders. In addition, data from the ASC compliance dashboard will soon be integrated into Compliance Manager, delivering the benefit of automated assessments from Azure directly into the Compliance Manager experience instead of requiring manual processes.

The regulatory compliance dashboard preview is available within the standard pricing tier of Azure Security Center, and you can try it for free for the first 30 days.

To learn more about regulatory compliance in Azure Security Center see the documentation, “Tutorial: Improve your regulatory compliance.” Try it out and let us know what you think!
Source: Azure

HDInsight Tools for Visual Studio Code now generally available

We are pleased to announce the general availability of Azure HDInsight Tools for Visual Studio Code (VSCode). HDInsight Tools for VSCode give developers a cross-platform, lightweight code editor for developing HDInsight PySpark and Hive batch jobs and interactive queries.

For PySpark developers who value the productivity Python enables, HDInsight Tools for VSCode offer a responsive Python editor with a simple getting-started experience, and allow you to submit PySpark statements to HDInsight clusters and receive interactive responses. This interactivity brings the best properties of Python and Spark to developers and empowers you to gain insights faster.

For Hive developers, HDInsight Tools for VSCode offer a rich data warehouse query experience for big data, along with helpful features for querying log files and gaining insights.

Key customer benefits   

Integration with Azure worldwide environments for Azure sign-in and HDInsight cluster management 
HDInsight Hive and Spark job submission with integration with Spark UI and Yarn UI
Interactive responses with the flexibility to execute one or multiple selected Hive and Python scripts
Preview and export of interactive query results to CSV, JSON, and Excel formats
Built-in Hive language services such as IntelliSense auto-suggest, autocomplete, and error markers, among others
Support for HDInsight ESP clusters and Ambari connections
Simplified cluster and Spark job configuration management

Latest improvements

Since public preview, we have worked closely with customers to address feedback, implement new functionality, and constantly improve user experiences. Some key improvements include:

HDInsight Tools for VSCode can be connected to all the Azure environments which host HDInsight services. Read more in the blog post, “HDInsight Tools for VSCode supports Azure environments worldwide.”
Support for HDInsight Enterprise Security Package. Read more in the blog post, “HDInsight Tools for VSCode integrates with Ambari and HDInsight Enterprise Secure Package.”
Leverage VSCode built-in user settings and workspace settings for clusters and job configuration management. Read more in the blog post, “HDInsight tools for Visual Studio Code: simplifying cluster and Spark job configuration management.”
Integrate with VSCode Azure Account and HDInsight explorer to improve Azure sign-in experience, as well as cluster and Hive metadata browse. Read more in the blog post, “HDInsight Tools for VSCode: Integrations with Azure Account and HDInsight Explorer.”

How to get started

First, install Visual Studio Code and download Mono 4.2.x (for Linux and Mac). Then, get the latest HDInsight Tools by going to the VSCode Extension repository or the VSCode Marketplace and searching for "HDInsight Tools for VSCode."

For more information about HDInsight Tools for VSCode, please see the following resources:

User Manual: HDInsight Tools for VSCode
User Manual: Set Up PySpark Interactive Environment
Demo Video: “HDInsight Tools for VSCode to support Hive Interactive, Hive Batch and PySpark”

If you have questions, feedback, comments, or bug reports, please send a note to hdivstool@microsoft.com.
Source: Azure

Azure Service Bus and Azure Event Hubs expand availability

The Azure Messaging team is continually working to enhance the resiliency and availability of our service offerings: Azure Service Bus, Azure Event Hubs, and Azure Event Grid. As part of this effort, in June 2018 we previewed Availability Zones support for the Azure Service Bus Premium tier and the Azure Event Hubs Standard tier in three regions: Central US, East US 2, and France Central.

Today, we’re happy to announce that we’ve added Availability Zones support for Azure Service Bus Premium and Azure Event Hubs Standard in the following regions:

East US 2
West US 2
West Europe
North Europe
France Central
Southeast Asia

Availability Zones is a high-availability offering from Azure that protects applications and data from datacenter failures. Availability Zones are unique physical locations within an Azure region. Each zone is made up of one or more datacenters equipped with independent power, cooling, and networking. To ensure resiliency, there is a minimum of three separate zones in all enabled regions. The physical separation of Availability Zones within a region protects applications and data from datacenter failures, and zone-redundant services replicate your applications and data across Availability Zones to protect them from single points of failure.

With this, Availability Zones support for Azure Service Bus Premium and Azure Event Hubs Standard is generally available in every Azure region that has zone-redundant datacenters.

How do you enable Availability Zones on your Azure Service Bus Premium namespace or Azure Event Hubs Standard?

You can enable Availability Zones on new namespaces only. Migration of existing namespaces is not supported.

If you use an ARM template to create a namespace, it is as simple as specifying a region that supports Availability Zones and setting the zoneRedundant property to true in the template.

For Azure Service Bus Premium namespace:

"resources": [{
"apiVersion": "2018-01-01-preview",
"name": "[parameters('serviceBusNamespaceName')]",
"type": "Microsoft.ServiceBus/namespaces",
"location": "[parameters('location')]",
"sku": {
"name": "Premium"
},
"properties": {
"zoneRedundant": true
}
}],

For Azure Event Hubs Standard namespace:

"resources": [{
"apiVersion": "2018-01-01-preview",
"name": "[parameters('eventHubNamespaceName')]",
"type": "Microsoft.EventHub/namespaces",
"location": "[parameters('location')]",
"sku": {
"name": "Standard"
},
"properties": {
"zoneRedundant": true
}
}],
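If you generate deployment templates programmatically, the same resource shape can be produced in a few lines. The sketch below builds both namespace resources with zoneRedundant enabled; the helper name and the example namespace names (my-sb-ns, my-eh-ns) are hypothetical, while the api-version and property names mirror the templates above.

```python
import json

def namespace_resource(resource_type, name, location, sku_name):
    """Build one ARM resource entry for a zone-redundant messaging
    namespace, mirroring the template snippets shown above.
    (Helper and example values are illustrative, not an official API.)"""
    return {
        "apiVersion": "2018-01-01-preview",
        "name": name,
        "type": resource_type,
        "location": location,
        "sku": {"name": sku_name},
        "properties": {"zoneRedundant": True},
    }

template = {
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "resources": [
        namespace_resource("Microsoft.ServiceBus/namespaces", "my-sb-ns", "eastus2", "Premium"),
        namespace_resource("Microsoft.EventHub/namespaces", "my-eh-ns", "eastus2", "Standard"),
    ],
}

# Serialize to the JSON you would hand to an ARM deployment.
arm_json = json.dumps(template, indent=2)
```

The resulting JSON can then be deployed like any other ARM template; remember that zone redundancy can only be set at namespace creation time.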

You can also enable zone-redundancy by creating a new namespace in the Azure portal as shown below. It is important to note that you cannot disable zone redundancy after enabling it on your namespace.

Azure Service Bus Premium:

Azure Event Hubs Standard:

Azure Service Bus Premium now generally available in more regions

In addition to the announcements regarding Availability Zones, we’re happy to announce that we’ve added support for Azure Service Bus Premium tier in the following regions:

China North 2
China East 2
Australia Central
Australia Central 2
France Central
France South

We introduced Azure Service Bus Premium in 2015, built on the successful and reliable foundation of Azure Service Bus messaging. The Premium tier allows customers to provision dedicated resources for their Azure Service Bus namespace, ensuring greater predictability and performance for the most demanding workloads, paired with an equally predictable pricing model. With Service Bus Premium Messaging, customers benefit from the economics and operational flexibility of a multi-tenant public cloud system while getting single-tenant reliability and predictability.

Azure Service Bus Premium also provides access to advanced enterprise features such as Availability Zones, Geo-Disaster recovery, and Virtual Network Service Endpoints along with Firewall rules. These additional features make the Premium tier tremendously valuable for customers looking for a highly reliable, resilient, and secure enterprise messaging solution.

For more information on Availability Zones:

Azure Availability Zones
What are Availability Zones in Azure?

For more information on Service Bus Premium:

Azure Service Bus Premium Messaging
Azure Service Bus Premium Messaging launch blog

Source: Azure

Introducing IoT Hub device streams in public preview

In today's security-first digital age, ensuring secure connectivity to IoT devices is of paramount importance. A wide range of operational and maintenance scenarios in the IoT space rely on end-to-end device connectivity to enable users and services to interact with, log in to, troubleshoot, and send data to or receive data from devices. Security and compliance with the organization's policies are therefore an essential ingredient in all of these scenarios.

Azure IoT Hub device streams is a new PaaS service that addresses these needs by providing a foundation for secure end-to-end connectivity to IoT devices. Customers, partners, application developers, and third-party platform providers can leverage device streams to communicate securely with IoT devices that reside behind firewalls or are deployed inside private networks. Furthermore, built-in compatibility with the TCP/IP stack makes device streams applicable to a wide range of applications involving both custom proprietary protocols and standards-based protocols such as remote shell, web, file transfer, and video streaming, among others.

At its core, an IoT Hub device stream is a data transfer tunnel that provides connectivity between two TCP/IP-enabled endpoints: one side of the tunnel is an IoT device and the other side is a customer endpoint that intends to communicate with the device (the latter is referred to here as the service endpoint). We have seen many setups where direct connectivity to a device is prohibited by the organization's security policies and the connectivity restrictions placed on its networks. These restrictions, while justified, frequently impact various legitimate scenarios that require connectivity to an IoT device.

Examples of these scenarios include:

An operator wishes to log in to a device for inspection or maintenance. This scenario commonly involves logging in to the device using Secure Shell (SSH) for Linux or Remote Desktop Protocol (RDP) for Windows. The device or network firewall configuration often blocks the operator's workstation from reaching the device.
An operator needs to remotely access a device's diagnostics portal for troubleshooting. Diagnostics portals typically take the form of a web server hosted on the device. A device's private IP or its firewall configuration may similarly block the user from interacting with the device's web server.
An application developer needs to remotely retrieve logs and other runtime diagnostic information from a device's file system. Protocols commonly used for this purpose include File Transfer Protocol (FTP) and Secure Copy (SCP), among others. Again, firewall configurations typically restrict these types of traffic.

IoT Hub device streams address the end-to-end connectivity needs of the above scenarios by leveraging an IoT Hub cloud endpoint that acts as a proxy for application traffic exchanged between the device and service. This setup is depicted in the figure below and works as follows.

Device and service endpoints each create separate outbound connections to an IoT Hub endpoint that acts as a proxy for the traffic being transmitted between them.
The IoT Hub endpoint relays traffic packets sent from the device to the service and vice versa, establishing an end-to-end bidirectional tunnel through which the device and service applications can communicate.
The established tunnel through IoT Hub provides reliable, ordered packet delivery guarantees. Furthermore, the transfer of traffic through IoT Hub as an intermediary is masked from the applications, giving them the seamless experience of direct bidirectional communication that is on par with TCP.
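The relay behavior described in these steps can be sketched in a few lines of plain TCP code. This is an illustrative stand-in for what the IoT Hub endpoint does, not the IoT Hub implementation or its SDK: splice takes two already-accepted connections (one from the device, one from the service) and pumps bytes in both directions.

```python
import socket
import threading

def pump(src, dst):
    # Copy bytes from src to dst until src signals EOF, then close
    # dst's send side so the far end observes the EOF as well.
    while True:
        data = src.recv(4096)
        if not data:
            break
        dst.sendall(data)
    try:
        dst.shutdown(socket.SHUT_WR)
    except OSError:
        pass

def splice(device_conn, service_conn):
    # Relay traffic in both directions between two already-accepted
    # outbound connections, giving both endpoints the illusion of a
    # direct TCP connection to each other.
    for a, b in ((device_conn, service_conn), (service_conn, device_conn)):
        threading.Thread(target=pump, args=(a, b), daemon=True).start()
```

Because both endpoints dial out to the relay (rather than accepting inbound connections), the same pattern works when the device sits behind a firewall, which is exactly the property device streams exploit over port 443.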

Benefits

IoT Hub device streams provide the following benefits:

Firewall-friendly secure connectivity: IoT devices can be reached from service endpoints without opening an inbound firewall port at the device or network perimeter. All that is needed is the ability to create outbound connections to IoT Hub cloud endpoints over port 443 (devices that use the IoT Hub SDK already maintain such a connection).
Authentication enforcement: To establish a stream, both device and service endpoints need to authenticate with IoT Hub using their corresponding credentials. This enhances the security of the device communication layer by ensuring that the identity of each side of the tunnel is verified before any communication takes place between them.
Encryption: By default, IoT Hub device streams use TLS-enabled connections. This ensures that the application traffic is encrypted regardless of whether the application uses encryption or not.
Simplicity of connectivity: Device streams eliminate the need for complex Virtual Private Network (VPN) setups to enable connectivity to IoT devices. Furthermore, unlike VPNs, which give broad access to the entire network, device streams are point-to-point, involving a single device and a single service at each side of the tunnel.
Compatibility with the TCP/IP stack: IoT Hub device streams can accommodate TCP/IP application traffic. This means that a wide range of proprietary as well as standards-based protocols can leverage this feature. This includes well established protocols such as Remote Desktop Protocol (RDP), Secure Shell (SSH), File Transfer Protocol (FTP), and HTTP/REST, among many others.
Ease of use in private network setups: Devices deployed inside private networks can be reached without assigning a publicly routable IP address to each device. A similar case involves devices with dynamic IP assignment, whose addresses might not be known to the service at all times. In both cases, device streams enable connectivity to a target device using its device ID (rather than its IP address) as the identifier.

As outlined above, IoT Hub device streams are particularly helpful when devices are placed behind a firewall or inside a private network (with no publicly reachable IP address). Next, we review one such setup as a case study where direct connectivity to the device is restricted.

A case study: Remote device access in a manufacturing setup

To further illustrate the applicability of device streams in real-world IoT scenarios, consider a setup involving equipment and machinery (i.e., IoT devices) on a factory floor that are connected to the factory's local area network. The LAN is typically connected to the Internet through a network gateway or an HTTP proxy and is protected by a firewall at the network boundary. In this setup, the firewall is configured based on the organization's security policies, which may prohibit opening certain firewall ports. For example, port 3389, used by Remote Desktop Protocol, is often blocked. Therefore, users from outside of the network cannot access devices over this port.

While such a network setup is in widespread use, it introduces challenges to many common IoT scenarios. For example, if operators need to access equipment from outside of the LAN, the firewall may need to allow inbound connectivity on arbitrary ports used by the application. In the case of a Windows machine that uses the RDP protocol, this is at odds with security policies that block port 3389.

Using device streams, the RDP traffic to target devices is tunneled through IoT Hub. Specifically, this tunnel is established over port 443 using outbound connections originating from the device and service. As a result, there is no need to relax firewall policies in the factory network. In our quickstart guides, available in C, C#, and Node.js, we have included instructions on how to leverage IoT Hub device streams to enable the RDP scenario. Other protocols can take a similar approach by simply configuring their corresponding communication port.

Next steps

We are excited about the possibilities that IoT Hub device streams open up for communicating with IoT devices securely. Use the following links to learn more about this feature:

Device streams documentation page
IoT Show recording on Channel 9

Source: Azure