Service Fabric Processor in public preview

Microsoft clients for Azure Event Hubs have always had two levels of abstraction. There is the low-level client, whose event sender and receiver classes allow for maximum control by the application, but also force the application to understand the configuration of the Event Hub and maintain an event receiver connected to each partition. Built on top of that low-level client is a higher-level library, Event Processor Host, which hides most of those details on the receiving side. Event Processor Host automatically distributes ownership of Event Hub partitions across multiple host instances and delivers events to a processing method provided by the application.

Service Fabric is another Microsoft technology: a generalized framework for dividing an application into shards and distributing those shards across multiple compute nodes. Many customers use Service Fabric for their applications, and some of those applications need to receive events from an Event Hub. It is possible to use Event Processor Host within a Service Fabric application, but it is inelegant and redundant: the combination means that two separate layers are attempting to distribute load across nodes, and neither is aware of the other. It also introduces a dependency on Azure Storage, which Event Processor Host instances use to coordinate partition ownership, along with the associated costs.

Service Fabric Processor is a new library for consuming events from an Event Hub that is directly integrated with Service Fabric: it uses Service Fabric's facilities for managing partitions and reliable storage, and for more sophisticated load balancing. At the same time, it provides a simple programming interface that will be familiar to anyone who has worked with Event Processor Host. The only specific requirement that Service Fabric Processor imposes is that the Service Fabric application in which it runs must have the same number of partitions as the Event Hub from which it consumes. This allows a simple one-to-one mapping of Event Hub partitions to application partitions, and lets Service Fabric distribute the load most effectively.

Service Fabric Processor is currently in preview and available on NuGet as the “Microsoft.Azure.EventHubs.ServiceFabricProcessor” package. The source code is on GitHub in our .NET Event Hubs client repository, and a sample application is also available on GitHub.

From the developer's point of view, there are two major pieces to creating an application using Service Fabric Processor. The first piece is creating a class that implements the IEventProcessor interface. IEventProcessor specifies methods that are called when processing is starting up for a partition (OpenAsync), when processing is shutting down (CloseAsync), for handling notifications when an error has occurred (ProcessErrorAsync), and for processing events as they come in (ProcessEventsAsync). The last one is where the application's business logic goes and is the key part of most applications.

The second piece is integrating with Service Fabric by adding code to the application's RunAsync method, which is called by Service Fabric to run the application's functionality. The basic steps are:

1. Create an instance of EventProcessorOptions and set any options desired.

2. Create an instance of the IEventProcessor implementation. This is the instance that will be used to process events for this partition.

3. Create an instance of ServiceFabricProcessor, passing the options and processor objects to the constructor.

4. Call RunAsync on the ServiceFabricProcessor instance, which starts the processing of events.
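Putting those steps together, a minimal RunAsync integration might look like the following sketch. The constructor parameters and the SampleEventProcessor class shown here are illustrative; the preview API may differ, so consult the programming guide on GitHub for the authoritative signatures.

```csharp
// Sketch of wiring Service Fabric Processor into a stateful service's RunAsync.
// The connection string, consumer group, and SampleEventProcessor are placeholders.
protected override async Task RunAsync(CancellationToken cancellationToken)
{
    // 1. Create the options object and set any desired options.
    EventProcessorOptions options = new EventProcessorOptions();

    // 2. The IEventProcessor implementation that handles this partition's events.
    IEventProcessor processor = new SampleEventProcessor();

    // 3. Construct the processor for this service partition, passing in
    //    the service context so it can map to the matching Event Hub partition.
    ServiceFabricProcessor sfp = new ServiceFabricProcessor(
        this.Context.ServiceName,         // Uri of this Service Fabric service
        this.Context.PartitionId,         // Guid of this service partition
        this.StateManager,                // reliable state, used for checkpoints
        this.Partition,                   // IStatefulServicePartition
        processor,
        "<event hub connection string>",
        "$Default",                       // consumer group
        options);

    // 4. Start pumping events; completes when cancellation is signaled.
    await sfp.RunAsync(cancellationToken);
}
```

Because the application has the same number of partitions as the Event Hub, each service replica processes exactly one Event Hub partition, and checkpoints live in Service Fabric reliable storage rather than Azure Storage.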

Next steps

For more details, follow our programming guide, which is available on GitHub. Did you enjoy this blog post? Don't forget to leave your thoughts and feedback in the comment section below. You can also learn more about Event Hubs by visiting our product page.
Source: Azure

Announcing new capabilities in Azure Firewall

Today we are excited to launch two key new capabilities in Azure Firewall.

Threat intelligence based filtering
Service tags filtering

Azure Firewall is a cloud-native firewall-as-a-service offering that enables customers to centrally govern all their traffic flows using a DevOps approach. The service supports both application-level (such as *.github.com) and network-level filtering rules. It is highly available and scales automatically as your traffic grows.

Threat intelligence based filtering (preview)

Microsoft has a rich signal of both internal threat intelligence data and third-party sourced data. Our team of data scientists and cybersecurity experts constantly mines this data to create a high-confidence list of known malicious IP addresses and domains. Azure Firewall can now be configured to alert on and deny traffic to and from known malicious IP addresses and domains in near real-time. The IP addresses and domains are sourced from the Microsoft Threat Intelligence feed. The Microsoft Intelligent Security Graph powers Microsoft Threat Intelligence and provides security in multiple Microsoft products and services, including Azure Security Center and Azure Sentinel.

Threat intelligence-based filtering is enabled by default in alert mode for all Azure Firewall deployments, providing logging of all matching indicators. Customers can change this behavior to alert and deny.
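As an illustrative sketch, switching a deployment from the default alert-only mode to alert-and-deny can be done with the Azure CLI. The firewall and resource group names are placeholders, and the `--threat-intel-mode` parameter reflects the CLI at the time of writing, so check `az network firewall update --help` in your version:

```shell
# The azure-firewall commands ship as an Azure CLI extension.
az extension add --name azure-firewall

# Switch threat intelligence-based filtering from Alert (the default) to Deny.
az network firewall update \
    --name MyFirewall \
    --resource-group MyResourceGroup \
    --threat-intel-mode Deny
```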

Figure 1 – Azure Firewall concept architecture

Managing your firewall

Logging, analysis of threat data, and actionable insights are all crucial and central themes in planning, building, and operating applications and infrastructure.

Azure Firewall provides full integration with Azure Monitor. Logs can be sent to Log Analytics, Storage, and Event Hubs. Azure Log Analytics allows for the creation of rich dashboards and visualizations. Along with custom data queries, this powerful integration provides a common place for all your logging needs, with vast options to customize the way you consume your data. Customers can also send data from Azure Monitor to SIEM systems such as Splunk, ArcSight, and similar third-party offerings.

Figure 2 – Azure Firewall detecting a compromised VM using threat intelligence and blocking these outbound connections

Figure 3 – Azure Firewall detecting port scan attempts using threat intelligence and blocking these inbound connections

Service tags filtering

Along with threat intelligence-based filtering, we are adding support for service tags, which has also been a highly requested feature from our users. A service tag represents a group of IP address prefixes for specific Microsoft services, such as Azure SQL, Azure Key Vault, and Azure Service Bus, to simplify network rule creation. Microsoft today supports service tags for a rich set of Azure services, which includes managing the address prefixes encompassed by each service tag and automatically updating the tag as addresses change. Azure Firewall service tags can be used in the network rules destination field. We will continue to add support for additional service tags over time.
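For example, a network rule that allows outbound traffic to Azure Key Vault by service tag rather than by explicit IP prefixes might be created as follows. This is a sketch based on the current Azure CLI; the firewall, resource group, collection, and rule names and the source range are placeholders:

```shell
# Use the AzureKeyVault service tag in the destination field of a network rule.
az network firewall network-rule create \
    --firewall-name MyFirewall \
    --resource-group MyResourceGroup \
    --collection-name Allow-PaaS-Services \
    --name KeyVault-HTTPS \
    --priority 200 \
    --action Allow \
    --protocols TCP \
    --source-addresses 10.0.0.0/24 \
    --destination-addresses AzureKeyVault \
    --destination-ports 443
```

Because the tag's prefixes are maintained by Microsoft, the rule keeps working as the service's addresses change, with no rule updates on your side.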

Central management

Azure Firewall public REST APIs can be used by third-party security policy management tools to provide a centralized management experience for Azure Firewall, Network Security Groups, and network virtual appliances (NVAs). In September 2018, we announced private previews with Barracuda, AlgoSec CloudFlow, and Tufin. We are happy to announce that AlgoSec CloudFlow is now available as a public beta. Learn more and join at the AlgoSec website.

We want to thank all our customers for their amazing feedback since Azure Firewall became generally available in September 2018. We continue to be amazed by the adoption, interest, positive feedback, and the breadth of use cases customers are finding for our service. Please do keep your feedback coming and we look forward to continuing to advance the service to meet your needs.

Learn more

Azure Firewall Documentation
Azure Firewall Threat Intelligence
Azure Firewall Service Tags
Pricing

Source: Azure

Presenting the new IIC Security Maturity Model for IoT

Organizations deploying IoT solutions often ask similar questions as they address security—What is the risk my organization takes on as we adopt IoT? How much security do we need for our scenario? Where should we invest for the biggest impact? To answer those questions, Microsoft co-authored and edited the Industrial Internet Consortium (IIC) IoT Security Maturity Model (SMM) Practitioner’s Guide. The SMM leads organizations as they assess the security maturity state of their current organization or system, and as they set the target level of security maturity required for their IoT deployment. Once organizations set their target maturity, the SMM gives them an actionable roadmap that guides them from lower levels of security maturity to the state required for their deployment.

Because not all IoT scenarios require the same level of security maturity, the goal of the SMM is to allow organizations to meet their scenario needs without over-investing in security mechanisms. For example, a manufacturing or an oil and gas solution involving safety needs an especially high maturity state.

The SMM complements Microsoft’s body of existing research and standards for IoT security, such as the “Seven Properties of Highly Secure Devices.” While the research in the Seven Properties paper provides a comprehensive deep dive into device security, the SMM takes a broader view of IoT security. This comprehensive model is used in the IoT security space to assess the maturity of organizations’ systems including governance/process, technology, and system security management. Other models typically address IT but not IoT, or IoT but not security, or security but not IoT. The SMM covers all these aspects and leverages other models where appropriate.

Applying the SMM to your organization

While the SMM’s intended audience is owners of IoT systems, decision makers, and security leaders, we expect assessment companies, and assessment groups within organizations, to be the main practitioners of the model. The SMM allows decision makers and security leaders to understand and consistently apply assessments performed by different groups. It also provides flexibility for industry extensions (currently being explored with several industry groups and associations) and allows for different types of visualization of the model results.

The SMM is organized as a hierarchy and includes domains, subdomains, and practices. This hierarchical approach enables the maturity and gap analysis to be viewed at different levels of detail, making it easier for organizations to prioritize gaps.

The SMM also makes an important distinction between security levels and security maturity states, helping organizations to understand the differences between what their goals need to be and where they are in their security journey. In the SMM, a security level is a measure of how much security you have. The SMM does not dictate what the appropriate security level should be for your organization. Rather, it provides guidance and structure so organizations can identify considerations for different security maturity states appropriate for their industry and systems.

Security maturity, on the other hand, is a measure of how well your security technology, processes, and operations meet your organization’s specific needs. The SMM helps you determine how much security you need, based on cost, benefit, and risk. The model allows you to consider various factors such as the specific threats to your organization's industry vertical, regulatory, and compliance requirements, the unique risks present in the environments your IoT operates in, and your organization's threat profile.

As you begin working with the SMM, it guides you through each step of the assessment using the model. Your organization begins by establishing a target state or identifying a relevant industry profile you want to target. Your organization then conducts an assessment to capture a current maturity state. By comparing the target and current states, organizations identify gaps. Based on the gap analysis, business and technical stakeholders can establish a roadmap, take action, and measure the progress. Organizations improve their security state by making continued security assessments and improvements over time. No matter how far along you are with IoT security, the model will help you close gaps that bring you to your desired security maturity.

Assessing security details with the security maturity model

Once you begin working with the SMM, it guides you through a rigorous approach to defining how well your security state meets your needs. To help you identify actionable areas to improve and to avoid blind spots in your plan, the SMM introduces domains, subdomains, and practices. You can gauge how well your organization is doing in each domain, subdomain, and practice along two dimensions: comprehensiveness and scope. Comprehensiveness is a measure of depth, with higher levels indicating a higher degree of maturity of a process or technology. Scope allows for identifying general, industry-specific, and system-specific requirements, ensuring the SMM can be tailored to your industry and use case with more precision than previous models could achieve.

SMM Hierarchy

The domains in the SMM include governance, enablement, and hardening. These domains determine the priorities of security maturity enhancements at the strategic level.

Governance influences business process and includes program management, risk management, and supply chain and third-party management.
Enablement covers architecture considerations and security technology mechanisms and includes identity management, access control, and physical protection.
Hardening defines countermeasures to deal with incidents and includes monitoring, events detection, and remediation.

The subdomains reflect the means of obtaining the priorities at the planning level. The practices define typical activities associated with subdomains identified at the tactical level.

The SMM includes practice tables grouped by domains and subdomains. Each SMM practice includes a table describing what must be done to reach a given comprehensiveness level at the general scope. For each comprehensiveness level, the table describes the objective and general considerations, a description of the level, practices that should be in place to achieve that level, and indicators of accomplishment to help assessors determine if the organization has met the requirements of the level.

Of course, general guidelines are often difficult to apply to specific scenarios. For that reason, an example follows each table using various industry use cases to demonstrate how an organization might use the table to pick a target state or to evaluate a current state. The guide also contains three case studies that show IoT stakeholders how to apply the process. The case studies include a smarter data-driven bottling line, an automotive gateway supporting Over the Air (OTA) updates, and consumer residential settings using security cameras. As our work on the SMM continues, we will work with industry organizations and associations to define industry profiles for the SMM.

Getting started with the SMM

If you want more information on exactly how the SMM works or how you can begin, the best spot to start is with the model itself: evaluate or improve your IoT security with the SMM today. To learn more about the SMM from its authors, watch our SMM introduction webinar.

For details on building your secure IoT solution on the trusted Azure IoT cloud services, see our Azure IoT Security Architecture for more information or start your free trial to get immediate hands on experience.
Source: Azure

Announcing new Azure Security Center capabilities at RSA 2019

This is an exciting week for us at Microsoft. At RSA Conference 2019, we are announcing new and exciting capabilities in Azure and Microsoft 365. With this blog post, we wanted to share with you what we have been working on for Azure Security Center. Azure Security Center now leverages machine learning to reduce the attack surface of internet-facing virtual machines, extends its adaptive application controls to Linux and on-premises servers, and extends network map support to peered virtual network (VNet) configurations.

Leveraging machine learning to reduce attack surface

One of the biggest attack surfaces for workloads running in the public cloud is connections to and from the public Internet. Our customers find it hard to know which Network Security Group (NSG) rules should be in place to make sure that Azure workloads are only available to required source ranges. Security Center can now learn the network traffic and connectivity patterns of your Azure workloads and provide you with NSG rule recommendations for your internet-facing virtual machines. This helps you better configure your network access policies and limit your exposure to attacks.

Azure Security Center uses machine learning to fully automate this process, including an automated enforcement mechanism, enabling its customers to better protect their internet facing virtual machines with only a few clicks. These recommendations also use Microsoft’s extensive threat intelligence reports to make sure that known bad actors are blocked.
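A recommendation of this kind ultimately materializes as an ordinary NSG rule. As a purely hypothetical sketch, a hardened rule that restricts SSH to a learned source range might look like this in the Azure CLI (all names and the address range below are invented for illustration):

```shell
# Replace a broad "allow SSH from anywhere" rule with a learned source range.
az network nsg rule create \
    --resource-group MyResourceGroup \
    --nsg-name MyVmNsg \
    --name Allow-SSH-Learned-Range \
    --priority 100 \
    --direction Inbound \
    --access Allow \
    --protocol Tcp \
    --source-address-prefixes 203.0.113.0/24 \
    --destination-port-ranges 22
```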

Extending adaptive application controls

Adaptive application control is an intelligent, automated end-to-end application whitelisting solution from Azure Security Center. It helps you control which applications can run on your VMs located in Azure, which, among other benefits, helps harden your VMs against malware. Security Center uses machine learning to analyze the applications running on your VMs and helps you apply the specific whitelisting rules using this intelligence.

We are extending adaptive application controls in Azure Security Center to include Linux VMs and servers/VMs external to Azure (Windows and Linux) in audit mode. This means that Azure Security Center will identify applications running on your servers that are not in compliance with the whitelisting rules it generated, and will audit those violations. This enables you to detect threats that might otherwise be missed by antimalware solutions, comply with an organizational security policy that dictates the use of only licensed software, and audit unwanted software in use in your environment.

Network map support for VNet peering

Azure Security Center’s network map has added support for virtual network peering, a configuration in which traffic flows between Azure Virtual Networks through the Microsoft backbone, as if the virtual machines were in the same virtual network, using private IP addresses only. The support includes displaying allowed traffic flows between peered VNets and peering-related information on Security Center’s network map.
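For reference, a peering of the kind the network map now visualizes is configured per direction between two VNets. A sketch with placeholder names follows; note that the `--remote-vnet` parameter name varies across Azure CLI versions (older releases use `--remote-vnet-id`):

```shell
# Peer SpokeVNet to HubVNet; a matching peering in the reverse
# direction is also required for traffic to flow.
az network vnet peering create \
    --name SpokeToHub \
    --resource-group MyResourceGroup \
    --vnet-name SpokeVNet \
    --remote-vnet HubVNet \
    --allow-vnet-access
```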

With these additions, Azure Security Center strengthens its role as the unified security management and advanced threat protection solution for your hybrid cloud workloads. We encourage you to take advantage of these new capabilities for all your Internet-exposed Azure resources. If you have not started using Azure Security Center in your Azure subscription, get started today.
Source: Azure

Guardian modules: Bringing Azure Sphere security to brownfield IoT

When someone mentions the words “Internet of Things,” often the first picture that comes to mind is some sort of device with the Internet “built-in.” However, a built-in design involves months or years of design work and applies only to devices that have yet to come to market. How do businesses leverage IoT for the billions of devices already in the field without creating a large security risk? Within the Azure Sphere team, we have a term for those scenarios: “brownfield” deployments. Josh Nash, our product planner, is guest blogging today to tell you more about brownfield IoT and how Azure Sphere can safely connect devices already deployed in the field.

Happy reading!

– Ed Nightingale

As a product planner, I have spent thousands of hours meeting with partners and customers to understand their needs to ensure that our product is not only secured, but also practical and useful. Our first focus is often on new devices, which are devices that have Azure Sphere deeply integrated into the product platform. We refer to these devices as “greenfield” scenarios. These scenarios shine as Azure Sphere’s value proposition resonates soundly, and the implementation is comparatively more straightforward due to the flexibility before a design is considered final. OEMs can focus on how to integrate Azure Sphere within a device’s internal design to meet power, performance, and functionality goals. Devices already in service without connectivity represent a “brownfield” opportunity to create meaningful, new connected experiences across a wide range of equipment. This blog post outlines the challenges of these brownfield devices and how Azure Sphere can help.

The problem

The act of connecting enterprise equipment represents an opportunity to enable markedly better outcomes by enabling scenarios and innovative business models for the enterprise such as preventive maintenance, just-in-time reporting, and even role-based access to equipment and data. But there are millions of devices in enterprises around the world that were either put into service before connectivity for non-IT devices was considered feasible or were intentionally not connected because they were deemed too mission critical to be subjected to the unsafe world of the Internet.

While connecting these unconnected devices creates positive outcomes and opportunities for the enterprise, in either case there are risks associated with doing so. Whether the devices predated connectivity or were deemed too important to connect, the outcome is the same: their lack of connectedness is their security model, a concept known as air-gap security. However, as cloud capabilities for optimization and other resourcing decisions improve, the value of connectedness increases, while the risk of connecting air-gapped devices remains at least as high as before. Businesses need a mechanism to balance the equation, mitigating risk by infusing a new security model into the system design so that devices already in service can be connected securely.

Securing existing equipment and devices with a guardian module

Microsoft announced Azure Sphere at RSA last year to enable secure, connected, microcontroller- (MCU-) based devices. Azure Sphere unlocks IoT by establishing a foundation on which an enterprise can trust a device to run securely in any environment. Based on the Microsoft whitepaper, “The 7 Properties of Highly Secure Devices,” Azure Sphere delivers device security by combining hardware, operating system software, and cloud services that has been purpose-built for secure IoT applications. Azure Sphere is raising the bar for manufacturers and enterprises to enable secure connectivity in new devices by delivering the seven properties, but it can do more.

Azure Sphere can also deliver secured connectivity to established devices already in service. By utilizing existing ports on an unconnected device, Azure Sphere can be built into a “guardian module” that can be paired with existing equipment to enable secured connectivity that rebalances the opportunity versus risk debate in favor of device connectivity. The key to security in these scenarios is that Azure Sphere, not the device, communicates with the cloud. By separating the device from the network, a guardian module enabled by Azure Sphere can protect the equipment from attack, ensure data is only transmitted between trusted cloud and device communications partners, and ensure the software of the module and the equipment remains intact and secured.

With Azure Sphere in a guardian module, enterprises can enable a variety of secured operations between the device and the cloud. From a device health and security perspective, the device can utilize the Azure Sphere Security Service for certificate-based authentication, failure reporting, and over-the-air software updates. When a guardian module passes authentication, it receives a certificate signed by the Azure Sphere Security Service that can be used to identify the device as genuine when communicating with cloud services. These communications between the guardian module and the cloud could comprise data coming off the device to signal events or inform decision-making, as well as messages from the cloud to trigger activity on the device itself.

In both directions, applications running on Azure Sphere can be used to validate that messages are properly formed before they are passed to or from the brownfield device and Azure Sphere ensures that the guardian module can only communicate with trusted endpoints. The Azure Sphere Security Service can also be used as a distribution point for software updates not only to the Azure Sphere OS and guardian module manufacturer’s device software, but also for updates to the downstream, previously unconnected device’s software. In this situation, the enterprise can avoid costly truck rolls to update software on their equipment. With a guardian module enabled by Azure Sphere, these brownfield devices can reap most of the benefits of a device that integrated Azure Sphere into its design.

With an Azure Sphere-enabled device, enterprise customers can more confidently connect existing devices to the cloud and unlock scenarios related to preventive maintenance, optimizing utilization, and even role-based access control. When linking a multitude of devices together in the cloud, the possibilities are almost endless.

To start the discussion on how Azure Sphere can help your business, email us at nextinfo@microsoft.com.
Source: Azure

Azure.Source – Volume 72

Now in preview

Announcing Azure Spatial Anchors for collaborative, cross-platform mixed reality apps

Azure Spatial Anchors, a mixed reality service that enables you to build a new generation of mixed reality applications that are collaborative, cross-platform, and spatially aware, is now in public preview. Across industries, developers and businesses are using mixed reality in their daily workflows and giving us feedback on what they’d like to see next. When we look across all the mixed reality solutions that customers have built, two things stand out: they want to easily share their mixed reality experiences and place applications in the context of the real world. Learn about two application patterns gaining momentum across industries, and how Azure Spatial Anchors can help you deliver them with greater ease and speed.

Introducing Microsoft Azure Sentinel, intelligent security analytics for your entire enterprise

Microsoft Azure Sentinel is available in preview in the Azure portal. Security can be a never-ending saga—a chronicle of increasingly sophisticated attacks, volumes of alerts, and long resolution timeframes where today’s Security Information and Event Management (SIEM) products can’t keep pace. We’ve reimagined a new cloud-native SIEM tool called Microsoft Azure Sentinel to provide intelligent security analytics at cloud scale while making it easy to collect security data across your entire hybrid organization; from devices, to users, to apps, to servers on any cloud.

Announcing Azure Monitor AIOps Alerts with Dynamic Thresholds

Metric Alerts with Dynamic Thresholds are now available in public preview. Dynamic Thresholds significantly enhance Azure Monitor Metric Alerts, so you no longer need to manually identify and set thresholds. The alert rule now leverages advanced machine learning (ML) capabilities to learn a metric's historical behavior while identifying patterns and anomalies that indicate possible service issues. Metric Alerts with Dynamic Thresholds are currently available for free during the public preview.

Working with AZCopy 10 and Azure Storage Blob Access Tiers

AzCopy v10 is now available in public preview. Azure Blob Storage offers three different access tiers so you can save money based on your storage requirements. Get high-performance, reliable data transfers that work with mixed access tiers inside an Azure storage account using the latest AzCopy, a command-line tool for transferring data to and from Azure Storage.
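For example, AzCopy v10's `--block-blob-tier` flag lets an upload land directly in a chosen access tier. In this sketch, the account, container, and SAS token are placeholders:

```shell
# Upload a local directory and place the resulting block blobs in the Cool tier.
azcopy copy "./logs" \
    "https://<account>.blob.core.windows.net/<container>?<SAS-token>" \
    --recursive \
    --block-blob-tier=Cool
```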

Announcing Azure Integration Service Environment for Logic Apps

Integration Service Environment is now available in nearly every region in which Logic Apps is currently available. In critical business scenarios, you need to have the confidence that your data is flowing between all the moving parts. The core Logic Apps offering is a multi-faceted service for integrating between data sources and services, but sometimes you also need a dedicated service to ensure that your integration processes are as performant as can be. That’s why we developed the Integration Service Environment (ISE), a fully isolated and dedicated environment for all enterprise-scale integration needs. Integration Service Environments are available in every region in which Logic Apps is currently available, except West Central US, Brazil South, and Canada East.

Also available in preview

Public preview: Azure Log Analytics is available in new regions in Australia
Azure Container Registry firewall rules and Virtual Network (in preview)
Azure App Service – WildFly on Linux is in preview
Code-free data transformation at scale using Azure Data Factory
Data Migration Assistant support for Cassandra to Azure Cosmos DB assessment
Azure Maps events in Azure Event Grid
Azure SQL DB as reference data input
Machine learning-based anomaly detection functions in Azure Stream Analytics (preview)
Azure DevOps CLI preview and feature updates in Pipelines – Sprint 148 Update

Now generally available

 

Announcing the general availability of Java support in Azure Functions

Announcing the general availability of Java support in Azure Functions 2.0, enabling a wide range of options for you to build and run your Java apps in the 50+ regions offered by Azure around the world. Azure Functions provides a productive programming model based on triggers and bindings for accelerated development and serverless hosting of event-driven applications. It enables developers to build apps using the platform (Windows, Mac, or Linux), programming languages, and tools of their choice, with an end-to-end developer experience that spans from building and debugging locally to deploying and monitoring in the cloud.
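To give a flavor of the trigger-and-binding programming model, here is a minimal HTTP-triggered function using the annotations from the Azure Functions Java library; the class, function, and query parameter names are arbitrary examples:

```java
import java.util.Optional;
import com.microsoft.azure.functions.*;
import com.microsoft.azure.functions.annotation.*;

public class HelloFunction {
    // Binds this method to an HTTP trigger; the runtime injects the request.
    @FunctionName("hello")
    public HttpResponseMessage run(
            @HttpTrigger(name = "req",
                         methods = {HttpMethod.GET, HttpMethod.POST},
                         authLevel = AuthorizationLevel.ANONYMOUS)
            HttpRequestMessage<Optional<String>> request,
            final ExecutionContext context) {
        // Read an optional "name" query parameter and build the response.
        String name = request.getQueryParameters().getOrDefault("name", "world");
        return request.createResponseBuilder(HttpStatus.OK)
                      .body("Hello, " + name)
                      .build();
    }
}
```

The same model applies to other triggers (queues, timers, Event Hubs), with the binding expressed as an annotation on the method parameter.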

Announcing the general availability of Azure Lab Services

Announcing the general availability of Azure Lab Services, computer labs in the cloud. Provisioning and managing a lab’s underlying infrastructure can make it difficult to prepare the right lab experience for your users. With Azure Lab Services, you can easily set up and provide on-demand access to preconfigured virtual machines (VMs) to teach a class, train professionals, run hackathons or hands-on labs, and more. Azure Lab Services GA pricing goes into effect on May 1, 2019. Learn more about using Azure Lab Services in this post from the Premier Developer blog: Azure Lab services scheduling now Available.

Running Cognitive Services on Azure IoT Edge

Announcing support for running Azure Cognitive Services containers for Text Analytics and Language Understanding on edge devices with Azure IoT Edge, so you can run workloads locally. Whether you don’t have a reliable internet connection, want to save on bandwidth cost, have super low latency requirements, or are dealing with sensitive data that needs to be analyzed on-site, Azure IoT Edge with the Cognitive Services containers gives you consistency with the cloud. These container images are directly available to try as IoT Edge modules on the Azure Marketplace.

Also generally available

Azure Database for MySQL: Read replica now generally available
Virtual network service endpoints for Azure Database for MariaDB are now generally available
New features are now generally available in Event Grid

General availability: Azure Availability Zones in East US
General availability: Global VNet Peering in Azure Government regions

Events

MWC 2019: Azure IoT customers, partners accelerate innovation from cloud to edge

The Internet of Things (IoT) has expanded the world of computing far beyond mobile and PC, bringing a new and ever-growing class of cloud-connected devices that is on track to reach 20 billion devices by 2020. Announcing new IoT partnerships for global-scale IoT solutions with new devices and cloud services that further increase the strategic value of IoT.

Learn more in this post by Julia White, Corporate Vice President, Microsoft Azure on the Official Microsoft Blog: Microsoft at MWC Barcelona: Introducing Microsoft HoloLens 2.

News and updates

Instantly restore your Azure Virtual Machines using Azure Backup

Azure Backup Instant Restore capability for Azure Virtual Machines (VMs) is now available. If you use Azure Backup, Instant Restore helps you quickly recover VMs from the snapshots stored together with the disks. In addition, you get complete flexibility in configuring the retention range of snapshots at the backup policy level depending on the requirements and criticality of the associated virtual machines, giving users more granular control over their resources. We are enabling and rolling out this experience on a region-by-region basis.

Cognitive Services Speech SDK 1.3 – February update

Developers can now access the latest Cognitive Services Speech SDK which now supports: selection of the input microphone through the AudioConfig class, expanded support for Debian 9, Unity in C# (beta) on Windows x86 and x64 (desktop or Universal Windows Platform applications), and Android (ARM32/64, x86), and more. Read about all the updates made to the Cognitive Services Speech SDK in February.

Improving the TypeScript support in Azure Functions

TypeScript is becoming increasingly popular in the JavaScript community. Announcing a set of tooling improvements for TypeScript support in Azure Functions so you can more easily develop with TypeScript when building your event-driven applications. With this set of changes to the Azure Functions Core Tools and the Azure Functions Extension for Visual Studio Code, Azure Functions now supports TypeScript out of the box. Included with these changes are a set of templates for TypeScript, type definitions, and npm scripts.

New device modeling experience in Azure IoT Central

Optimize your device workflow for easier management and clarity. Introducing a new “Device Templates” navigation tab that replaces the existing “Application Builder” tab, as well as updated patterns for viewing or editing device templates. We've begun a flighted rollout of this new device modeling experience.

Azure Data Factory updates

Monitor Azure Data Factory pipelines by using Azure Monitor and Log Analytics

Azure Data Factory visual tools integrated with GitHub
Self-hosted IR sharing across multiple data factories
Parameterize connections to your data stores in Azure Data Factory
Linked Resource Manager template support available for CI/CD in your data factories
Azure Functions now supported as a step in Azure Data Factory pipelines
Create alerts to proactively monitor your data factory pipelines
Enhanced monitoring capabilities and tags/annotations in Azure Data Factory
Support for Enterprise Security Package–enabled HDInsight clusters in Azure Data Factory

Additional news and updates

FastTrack for Azure
Global VNet Peering now supports Standard Load Balancer
Database rename is now supported in Azure SQL Database Managed Instance
Remove Application Insights connections via PowerShell
Instantly provision GraphQL on Azure with Hasura and Azure Database for PostgreSQL
Azure Database for MySQL and PostgreSQL: New compute options are now available
Azure Policy non-compliance reasoning and change history features
Azure Blockchain: Updated Hyperledger Fabric template 1.3 now available
M-series virtual machines (VMs) are now available in the Korea South region
Azure Event Grid Availability in Azure US Gov regions

Technical content

Five tips for securing your IaaS workloads

Implementing IaaS security best practices is an essential step in securing your IaaS resources. Get specific recommendations for improving your IaaS security posture, with a focus on data protection, stronger network security, and streamlined security management, including threat protection. Read the Azure Government Security documentation to understand features and variations for Azure Government.

Azure Stack IaaS – part two

Every organization has a unique journey to the cloud, based on the organization’s history, business specifics, culture, and maybe most importantly, their starting point. Typically in your migration journey, you use a mixture of tools, so you need to understand the options available in order to select the right tool for the specific workloads. The Azure migration center provides a good model and helpful resources to get you started on your migration to the cloud and to make sure you can create the proper frame for your migration.

AZX.ms – A Collection of Azure CLI Recipes

Simplify your Azure development with a collection of Azure CLI scripts accessible right at your fingertips.

Latest enhancements now available for Cognitive Services' Computer Vision

With the latest enhancements to Cognitive Services’ Computer Vision service, you can extract insights, unlock new workflows, and easily customize and deploy your model without requiring machine-learning expertise.

Creating IoT applications with Azure Database for PostgreSQL

There are numerous IoT use cases in different industries with common categories like predictive maintenance, connected vehicles, anomaly detection, asset monitoring, and many others. Azure IoT is a complete stack of IoT solutions; a collection of Microsoft managed cloud services that connect, monitor, and control billions of IoT assets. See how to implement an end-to-end Azure IoT solution and use Azure Database for PostgreSQL to store IoT event data in the JSONB format.

3 Reasons To Add Deep Learning to Your Time-Series Toolkit

In this article, Francesca shares 3-5 lessons learned while building neural networks for time series (leading up to two-day trainings at the AI Conference NYC and the Strata Data Conference in San Francisco). With clear explanations and standard Python libraries, readers discover tips and tricks for developing deep learning models for their own time series forecasting projects.

Understanding routing in Istio

This is the first in a series of blog posts that will go into depth on how to use popular OSS on top of Azure Kubernetes Service. In this post, Scott Coulton runs through how to install Istio with Helm and how to deploy two versions of the same application and route traffic by weight (percentage). The post also contains all of the source code and a demo application so you can go and test the topic out for yourself.

How to Lock Azure Resources to Prevent Modification or Deletion

This article demonstrates how to configure Azure resource locking, which, together with Azure Role Based Access Control (RBAC), lets you restrict access to resources and resource actions and helps prevent inadvertent resource deletion and modification.

Running Micro Focus Enterprise Server 4.0 in a Docker Container in Azure

Running Micro Focus Enterprise Server 4.0 in a Docker container is new and provides portability, performance, agility, and isolation. See how to effectively run and manage a mainframe CICS application in a Docker container using the Windows 2016 Datacenter with Containers VM now available from the Azure Marketplace.

Exploring Feature Weights using R and Azure Machine Learning Studio

Suppose we have to design a black box that displays a “thumbs up” or “thumbs down” depending on hundreds of different combinations of inputs. This post describes how to conduct exploratory data analysis using R and Azure Machine Learning Studio to train a “black-box” model in a case when it is difficult to explain how the model characterizes the relationship between the features and the target variable.

Intro to Microsoft Azure Resource Manager Templates

Cloud Advocate, Jay Gordon gives you an introduction to Azure Resource Manager Templates and how to begin using them with Azure Cloud Shell. You'll see the number of options available to quickly create Azure resources.

Additional technical content

Introduction to DevOps for Dynamics 365 Customer Engagement using YAML Based Azure Pipelines
Xamarin and Azure Office Hours Recap from December 2018 and January 2019
Azure Development Community – Same Great Blog… NEW HOME!
Lesson Learned #73: Azure Database for MySQL – CONVERT_TZ returns null

Azure shows

Episode 268 – ExpressRoute Roadmap | The Azure Podcast

Paresh Mundade, a Senior PM in the Azure ExpressRoute team, gives the guys an update on the service and a glimpse into the roadmap of planned features.

Azure Cosmos DB update: SDKs, CORS, multi-region strong consistency, and more | Azure Friday

Learn about the new SDKs for Azure Cosmos DB with the JavaScript SDK used as an example as well as learn about CORS (Cross-Origin Resource Sharing) via a simple JavaScript app for demonstration. See a new, low-cost database offer in the Azure portal, watch an Azure DevOps build task setup, get some cost saving tips, and learn about support for multi-region strong consistency.

An intro to Azure Cosmos DB .NET SDK 3.0 | Azure Friday

Learn about the new improvements to the Azure Cosmos DB SDKs, including the new, idiomatic .NET SDK with a friendlier, more intuitive programming model, better testability, better performance, and .NET Standard 2.0 support. Plus, it is now open source.

Azure Maps – The Microsoft Azure Enterprise Location Platform | Internet of Things Show

Azure Maps is the de facto location intelligence platform natively hosted in the Microsoft Azure cloud. Chris Pendleton, PM Lead for the service, gives us an overview of what Azure Maps is, who uses Azure Maps, how Azure Maps is being used across our customer base, and how you can start using Azure Maps today.

How to get started with Azure Front Door | Azure Tips and Tricks

Learn how to get started with Azure Front Door. Azure Front Door easily makes your applications globally available and secure.

How to create, share, and use Azure Portal dashboards | Azure Portal Series

In this video of the Azure Portal "How To" series, learn how to easily create, share, and use dashboards in the Azure Portal. Learn more about the series: Introducing the Azure portal “how to” video series

What is Identity Protection? | Azure Active Directory

In this video, get a high-level overview of Identity Protection, a feature of Azure Active Directory. You’ll learn about the different types of detections, risks, and risk policies that exist in Identity Protection, the benefits of the risk policies, recent UX enhancements, powerful APIs, improved risk assessment, and overall alignment around risky users and risky sign-ins. In addition, this series also includes videos on How to deploy Identity Protection and How to use Identity Protection.

Martin Woodward on Azure DevOps With GitHub – Episode 25 | Azure DevOps Podcast

In this episode, Martin Woodward and Jeffrey Palermo dive right into the topic of Azure DevOps with GitHub; discussing some of the changes since Microsoft acquired GitHub, whether you should choose to work with Azure Repos or GitHub, and how to use Azure DevOps Services with GitHub.

Customers, partners, and industries

Microsoft and SAP extend partnership to Internet of Things

The Internet of Things (IoT) is becoming mainstream as companies see market-making benefits from IoT and deploying at scale – from transforming operations and logistics, remote monitoring, and predictive maintenance at the edge, to new consumer experiences powered by connected devices. In all of these solutions, IoT data and AI are producing powerful insights that lead to new opportunities. Microsoft and SAP have announced an expansion of their partnership to include physical devices and assets with a new collaboration in the IoT space. SAP Leonardo IoT will integrate with Azure IoT services, providing customers with the ability to contextualize and enrich their IoT data with SAP business data and to seamlessly extend their SAP solution-based business processes to the Azure IoT Edge platform.

Azure Marketplace new offers – Volume 32

The Azure Marketplace is the premier destination for all your software needs – certified and optimized to run on Azure. Find, try, purchase, and provision applications & services from hundreds of leading software providers. You can also connect with Gold and Silver Microsoft Cloud Competency partners to help your adoption of Azure. In the second half of January we published 70 new offers.

Azure This Week – 1 March 2019 | A Cloud Guru – Azure This Week

This time on Azure This Week, Lars talks with JT from Microsoft about the brand new HoloLens 2 and how it will be cloud connected, Azure DevOps Projects Kubernetes support gets new features, and you can now protect Azure VMs by using Storage Spaces Direct with Azure Site Recovery.

Source: Azure

New device modeling experience in Azure IoT Central

On the Azure IoT Central team, we are constantly talking with our customers to understand how we can continue to provide more value. One of our top pieces of product feedback has been a request for a clearer device modeling experience that separates the device instance from the device template. Previously, viewing the device and editing the device template took place on the same page through an “Edit Template” button, which made it unclear whether a change applied only to that device or to all devices in that template. Recently we've begun a flighted rollout of a new device modeling experience that begins to directly address this feedback.

For app builder roles, we have introduced a new “Device Templates” navigation tab that replaces the existing “Application Builder” tab, as well as updated the pattern in which you view or edit your device templates. To edit your device templates, you can visit the “Device Templates” tab to make changes. To view or interact with your device instance, you can still find this under the “Explorer” tab. We’re excited to get the first set of changes in your hands so that device templates and device explorer can continue to evolve independently from one another in order to best support how our users interact with their devices. These changes will both optimize the operator experience of viewing or interacting with devices, as well as streamline the builder workflow of creating or modifying a template.

These changes are an important first step towards continuing to optimize your device workflow for easier management and clarity. Please leave us feedback at Azure IoT Central UserVoice, as we continue to invest in understanding and solving our customer needs.

To learn more, please visit our documentation, “Set up a device template.”

Creating IoT applications with Azure Database for PostgreSQL

There are numerous IoT use cases in different industries, with common categories like predictive maintenance, connected vehicles, anomaly detection, asset monitoring, and many others. For example, in water treatment facilities in the state of California, IoT devices can be installed in water pumps to measure horsepower, flow rate, and electric usage of the water pumps. The events emitted from these devices get sent to an IoT hub every 30 seconds for aggregation and processing. A water treatment facility company could build a dashboard to monitor the water pumps and build notifications to alert the maintenance team when the event data is beyond a certain threshold. They could then alert the maintenance team to repair the water pump if the flow rate is dangerously low. This is a very typical proactive maintenance IoT use case.
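
The threshold-based alerting described above can be sketched in a few lines. The field names (`deviceId`, `flowRate`) and the threshold value here are hypothetical, chosen only to illustrate the pattern, not taken from any Azure SDK or the QuickStart:

```python
# Sketch of a proactive-maintenance check over pump telemetry events.
# The event fields and the flow-rate floor are illustrative assumptions.

LOW_FLOW_RATE = 10.0  # hypothetical safe floor, e.g. gallons per minute

def needs_maintenance(event: dict) -> bool:
    """Flag a pump event whose flow rate falls below the safe floor."""
    return event.get("flowRate", float("inf")) < LOW_FLOW_RATE

def alerts(events) -> list:
    """Return the device IDs that should be routed to the maintenance team."""
    return [e["deviceId"] for e in events if needs_maintenance(e)]
```

In a real deployment this kind of check would run downstream of the IoT hub, for example in the stream-processing step, with the alert list feeding the notification dashboard.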

Azure IoT is a complete stack of IoT solutions. It’s a collection of Microsoft managed cloud services that connect, monitor, and control billions of IoT assets. The common set of components in the Azure IoT core subsystem include:

IoT devices that stream the events
Cloud gateway, where Azure IoT is most often used to enable communication to and from devices and edge devices
Stream processing that ingests events from the device and triggers actions based on the output of the analysis. A common workflow takes input telemetry encoded in Avro and may return output telemetry encoded in JSON for storage
Storage, usually a database used to store IoT event data for reporting and visualization purposes

Let’s take a look at how we implement an end to end Azure IoT solution and use Azure Database for PostgreSQL to store IoT event data in the JSONB format. Using PostgreSQL as the NoSQL data store has its own advantages with its strong native JSON processing, indexing capabilities, and plv8 extension that further enhances it by integrating the JavaScript v8 engine with SQL. Besides the managed services capabilities and lower cost, one of the key advantages of using Azure Database for PostgreSQL is its native integration with the Azure ecosystem that enables modern applications with improved developer productivity.

In this implementation, we use Azure Database for PostgreSQL with the plv8 extension as a persistent layer for the IoT telemetry stream for storage, analytics, and reporting. The high-speed streaming data is first loaded into the PostgreSQL database (master server) as a persistent layer. The master server is used for high-speed data ingestion, and the read replicas are leveraged for reporting and downstream data processing to take data-driven actions. You can leverage Azure IoT Hub as the event processing hub and an Azure Function to trigger the processing steps and extract what’s needed from emitted events to store them in Azure Database for PostgreSQL.

In this post, we’ll walk through the high-level implementation to get you started. Our GitHub repository has sample applications and a detailed QuickStart tutorial with step-by-step instructions for implementing the solution below. The QuickStart uses Node.js applications to send telemetry to the IoT Hub.

Step 1: Create an Azure IoT Hub and register a device with the Hub

In this implementation, the IoT sensor simulators are constantly emitting temperature and humidity data back to the cloud. The first step would be creating an Azure IoT Hub in the Azure portal using these instructions. Next, you’ll want to register the device name in the IoT Hub so that the IoT Hub can receive and process the telemetry from the registered devices.

In GitHub, you will see sample scripts to register the device using CLI and export the IoT Hub service connection string.

Step 2: Create an Azure Database for PostgreSQL server and a database IoT demo to store the telemetry data stream

Provision an Azure Database for PostgreSQL with the appropriate size. You can use the Azure portal or the Azure CLI to provision the Azure Database for PostgreSQL.

In the database, you will enable the plv8 extension and create a sample plv8 function that extracts a temperature column from the JSON documents for querying. You can use the JSON table to store the IoT telemetry data. The script to create the database and table and enable the plv8 extension is available in GitHub.
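
The exact scripts live in the QuickStart's GitHub repo; the setup below is a hypothetical sketch of what they look like (table and function names are assumptions), with a pure-Python equivalent of the plv8 helper so the extraction logic can be exercised locally:

```python
import json

# Illustrative DDL: enable plv8, create a jsonb telemetry table, and define
# a plv8 helper that pulls the temperature out of each JSON document.
# These statements are a sketch, not the repo's actual scripts.
SETUP_SQL = """
CREATE EXTENSION IF NOT EXISTS plv8;

CREATE TABLE IF NOT EXISTS telemetry (
    id       bigserial PRIMARY KEY,
    payload  jsonb NOT NULL
);

CREATE OR REPLACE FUNCTION get_temperature(payload jsonb)
RETURNS numeric AS $$
    // plv8 passes jsonb in as a JS object (or a string on older versions)
    var doc = (typeof payload === 'string') ? JSON.parse(payload) : payload;
    return doc.temperature;
$$ LANGUAGE plv8 IMMUTABLE;
"""

def get_temperature(payload: str):
    """Pure-Python equivalent of the plv8 helper, for local testing."""
    return json.loads(payload).get("temperature")
```

Once the function exists, a query such as `SELECT get_temperature(payload) FROM telemetry;` surfaces the temperature column directly from the stored JSONB documents.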

Step 3: Create an Azure Function with an Event Hub trigger to extract messages and store them in PostgreSQL

Next you will create a JavaScript Azure Function with Event Hub trigger bindings to the Azure IoT Hub created in Step 1. Use the JavaScript index.js sample to create this function. The function is triggered for each incoming message stream in the IoT Hub; it extracts the JSON message and inserts the data into the PostgreSQL database created in Step 2.
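
The extract-and-insert step can be sketched language-agnostically as follows. The QuickStart's actual function is the JavaScript index.js sample with its Event Hub trigger bindings; the table and column names here are assumptions, and the validation rule is purely illustrative:

```python
import json

# Parameterized INSERT the function body would bind the payload to.
# "telemetry" and its single jsonb column are assumptions for this sketch.
INSERT_SQL = "INSERT INTO telemetry (payload) VALUES (%s);"

def parse_telemetry(message: bytes) -> str:
    """Validate an incoming IoT Hub message and return the JSON document
    that gets bound to the INSERT statement."""
    doc = json.loads(message)
    if "deviceId" not in doc:
        raise ValueError("telemetry message is missing deviceId")
    return json.dumps(doc)
```

Inside the function, the returned payload would be passed to a parameterized execute, e.g. `cursor.execute(INSERT_SQL, (payload,))` with a psycopg2-style driver, so the JSON is never interpolated into the SQL string directly.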

Getting started by running the IoT solution end to end

We recommend that you try to implement this solution using the sample application in our GitHub repository. In GitHub, you will find steps on running the Node.js application to simulate the generation of event data, creating an IoT Hub with device registration, sending the event data to the IoT Hub, deploying the Azure Function to extract the data from the JSON message, and inserting it into Azure Database for PostgreSQL.

At the end of implementing all the steps in GitHub, you will be able to query and analyze the data using reporting tools like Power BI that allow you to build real-time dashboards.

We hope that you enjoy working with the latest features and functionality available in Azure Database for PostgreSQL. Be sure to share your feedback via UserVoice for PostgreSQL.

If you need any help or have questions, please check out the Azure Database for PostgreSQL documentation.

Acknowledgements

Special thanks to Qingqing Yuan, Bassu Hiremath, Parikshit Savjani, Anitah Cantele, and Rachel Agyemang for their contributions to this post.

Cognitive Services Speech SDK 1.3 – February update

Developers can now access the latest Cognitive Services Speech SDK which now supports:

Selection of the input microphone through the AudioConfig class
Expanded support for Debian 9
Unity in C# (beta)
Additional sample code

Read the updated Speech Services documentation to get started today.

What’s new

The Speech SDK supports a selection of the input microphone through the AudioConfig class, meaning you can stream audio data to the Speech Service from a non-default microphone. For more details see the documentation and the how-to guide on selecting an audio input device with the Speech SDK. This is not yet available from JavaScript.

The Speech SDK now also supports Unity in a beta version. Since this is new functionality, please provide feedback through the issue section in the GitHub sample repository. This release supports Unity on Windows x86 and x64 (desktop or Universal Windows Platform applications), and Android (ARM32/64, x86). More information is available in our Unity quickstart.

Samples

The following new content is available in our sample repository.

Samples for AudioConfig.FromMicrophoneInput.
Python samples for intent recognition and translation.
Samples for using the Connection object in iOS.
Java samples for translation with audio output.
New sample for use of the Batch Transcription REST API.

Improvements and changes

A number of improvements and changes have been made since our last release, including:

Python

Improved parameter verification and error messages in SpeechConfig
Added support for the Connection object
Support for 32-bit Python (x86) on Windows
The Speech SDK for Python is out of beta

iOS

The SDK is now built against iOS SDK version 12.1 and supports iOS versions 9.2 and later
Improved reference documentation and fixed several property names

JavaScript

Added support for the Connection object
Added type definition files for bundled JavaScript
Initial support and implementation for phrase hints
Returned properties collection with service JSON for recognition

Windows DLLs now contain a version resource.

Bug fixes

Empty proxy username and proxy password were not handled correctly before. With this release, if you set proxy username and proxy password to an empty string, they will not be submitted when connecting to the proxy.
Session IDs created by the SDK were not always truly random for some languages and environments. Random generator initialization has been added to fix this.
Improved handling of authorization tokens. If you want to use an authorization token, specify it in the SpeechConfig and leave the subscription key empty, then create the recognizer as usual.
In some cases, the Connection object wasn't released correctly. This has been fixed.

For more details and examples for how your business can benefit from the new functionality for Speech Services, check out release notes and samples in the GitHub sample repository for Speech Services.

Introducing Microsoft Azure Sentinel, intelligent security analytics for your entire enterprise

Security can be a never-ending saga—a chronicle of increasingly sophisticated attacks, volumes of alerts, and long resolution timeframes where today’s Security Information and Event Management (SIEM) products can’t keep pace.

SecOps teams are inundated with a very high volume of alerts and spend far too much time on tasks like infrastructure setup and maintenance. As a result, many legitimate threats go unnoticed. An expected shortfall of 3.5 million security professionals by 2021 will further increase the challenges for security operations teams. You need a solution that empowers your existing SecOps team to see threats more clearly and eliminate the distractions.

That’s why we reimagined the SIEM tool as a new cloud-native solution called Microsoft Azure Sentinel. Azure Sentinel provides intelligent security analytics at cloud scale for your entire enterprise. Azure Sentinel makes it easy to collect security data across your entire hybrid organization from devices, to users, to apps, to servers on any cloud. It uses the power of artificial intelligence to ensure you identify real threats quickly and unleashes you from the burden of traditional SIEMs by eliminating the need to spend time setting up, maintaining, and scaling infrastructure. Since it is built on Azure, it offers nearly limitless cloud scale and speed to address your security needs. Traditional SIEMs have also proven expensive to own and operate, often requiring you to commit upfront and incur high costs for infrastructure maintenance and data ingestion. With Azure Sentinel there are no upfront costs; you pay for what you use.

Many enterprises are using Office 365 and are increasingly adopting the advanced security and compliance offerings included in Microsoft 365. There are many cases when you want to combine security data from users and endpooint applications with information from your infrastructure environment and third-party data to understand a complete attack.

It would be ideal if you could do this all within the compliance boundaries of a single cloud provider. Today we are announcing that you can bring your Office 365 activity data to Azure Sentinel for free. It takes just a few clicks and you retain the data within the Microsoft cloud.

“With Microsoft Azure Sentinel, we can better address the main SIEM landscape challenges for our clients, along with simplifying data residency and GDPR concerns.”

Andrew Winkelmann, Global Security Consulting Practice Lead, Accenture

Let’s look at how Azure Sentinel will help you deliver cloud-native security operations:

Collect data across your enterprise easily – With Azure Sentinel you can aggregate all security data with built-in connectors, native integration of Microsoft signals, and support for industry-standard log formats like Common Event Format and syslog. In just a few clicks you can import your Microsoft Office 365 data for free and combine it with other security data for analysis. Azure Sentinel uses Azure Monitor, which is built on a proven and scalable log analytics database that ingests more than 10 petabytes every day and provides a very fast query engine that can sort through millions of records in seconds.
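
To make the log-format support concrete: a Common Event Format (CEF) record is a `CEF:`-prefixed line with seven pipe-delimited header fields followed by space-separated key=value extensions. The sketch below is a minimal illustration of the format itself, not Sentinel's connector implementation:

```python
# Minimal sketch of parsing a Common Event Format (CEF) record, one of the
# industry-standard log formats mentioned above. Illustrative only.

CEF_HEADER_FIELDS = [
    "version", "device_vendor", "device_product",
    "device_version", "signature_id", "name", "severity",
]

def parse_cef(line: str) -> dict:
    """Split a CEF line into its seven header fields plus extensions."""
    if not line.startswith("CEF:"):
        raise ValueError("not a CEF record")
    parts = line[len("CEF:"):].split("|", 7)
    record = dict(zip(CEF_HEADER_FIELDS, parts[:7]))
    # Remaining text is a list of key=value extensions. Splitting on
    # whitespace is a simplification: real CEF values may contain spaces.
    extensions = parts[7] if len(parts) > 7 else ""
    record["extensions"] = dict(
        kv.split("=", 1) for kv in extensions.split() if "=" in kv
    )
    return record
```

For example, `parse_cef("CEF:0|Palo Alto Networks|PAN-OS|8.0|THREAT|worm stopped|10|src=10.0.0.1 dst=10.0.0.2")` yields the vendor, severity, and the `src`/`dst` extension pairs as structured fields.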

We continue to collaborate with many partners in the Microsoft Intelligent Security Association. Azure Sentinel connects to popular solutions including Palo Alto Networks, F5, Symantec, Fortinet, and Check Point, with many more to come. Azure Sentinel also integrates with the Microsoft Graph Security API, enabling you to import your own threat intelligence feeds and customize threat detection and alert rules. Custom dashboards give you a view optimized for your specific use case.

Adam Geller, Senior Vice President, SaaS, virtualization, and cloud-delivered security of Palo Alto Networks said, “We’re pleased with our ongoing collaboration with Microsoft and the work we’re doing to deliver greater security orchestration for our joint customers. This latest integration allows customers to forward their physical and virtualized next generation firewall logs to Azure Sentinel and use custom dashboards and artificial intelligence to rapidly uncover potential security incidents. Palo Alto Networks customers can also extend AutoFocus and other third-party threat intelligence to Azure Sentinel via our new integration between MineMeld and the Microsoft Graph Security API.”

Analyze and detect threats quickly with AI on your side – Security analysts face a huge triage burden as they sift through a sea of alerts and correlate alerts from different products manually or using a traditional correlation engine. That’s why Azure Sentinel uses state-of-the-art, scalable machine learning algorithms to correlate millions of low-fidelity anomalies and present a few high-fidelity security incidents to the analyst. ML technologies will help you quickly get value from the large amounts of security data you are ingesting and connect the dots for you. For example, you can quickly see a compromised account that was used to deploy ransomware in a cloud application. This helps reduce noise drastically; in fact, we have seen an overall reduction of up to 90 percent in alert fatigue during evaluations. Early adopters are seeing the benefits of threat detection with AI. Reed M. Wiedower, CTO of New Signature, said, “We see a huge value with Azure Sentinel because of its ability to generate insights across a vast array of different pieces of infrastructure.”

These built-in machine learning models are based on the learnings of the Microsoft security team over many years of defending our customers’ cloud assets. You do not need to be a data scientist to leverage these benefits; you just turn them on. Of course, if you are a data scientist and want to customize and enrich the detections, you can bring your own models to Azure Sentinel using the built-in Azure Machine Learning service. Additionally, Azure Sentinel can connect to user activity and behavior data from Microsoft 365 security products, which can be combined with other sources to provide visibility into an entire attack sequence.

Investigate and hunt for suspicious activities – Graphical and AI-based investigation will reduce the time it takes to understand the full scope of an attack and its impact. You can visualize the attack and take quick actions in the same dashboard.  

Proactive hunting for suspicious activities is another critical task for security analysts. Often the process by which SecOps teams collect and analyze data is repeatable and can be automated. Today, Azure Sentinel provides two capabilities that enable you to automate your analysis: building hunting queries and Azure Notebooks that are based on Jupyter notebooks. We have developed a set of queries and Azure Notebooks based on the proactive hunting that Microsoft’s Incident Response and Threat Analysts teams perform. As the threat landscape evolves, so will our queries and Azure Notebooks, and we will provide new ones via the Azure Sentinel GitHub community.
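
A repeatable hunting check of this kind, such as flagging accounts with an unusual number of failed sign-ins, can be captured as a small function. The event shape and threshold below are hypothetical; in Azure Sentinel this logic would live in a hunting query or an Azure Notebook rather than standalone code:

```python
from collections import Counter

# Illustrative sketch of an automatable hunting check: flag accounts whose
# failed sign-in count meets a threshold. Event fields are assumptions.

def suspicious_accounts(events, threshold=5):
    """Return, sorted, the accounts whose failed sign-ins meet the threshold."""
    failures = Counter(
        e["account"] for e in events if e.get("result") == "failure"
    )
    return sorted(a for a, n in failures.items() if n >= threshold)
```

The point of encoding the check this way is that once it is written down, it can be re-run on a schedule against fresh data instead of being repeated by hand, which is exactly what hunting queries and notebooks enable.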

Automate common tasks and threat response – While AI sharpens your focus on finding problems, once you have solved the problem you don’t want to keep finding the same problems over and over – rather you want to automate response to these issues. Azure Sentinel provides built-in automation and orchestration with pre-defined or custom playbooks to solve repetitive tasks and to respond to threats quickly. Azure Sentinel will augment existing enterprise defense and investigation tools, including best-of-breed security products, homegrown tools, and other systems like HR management applications and workflow management systems like ServiceNow.

Microsoft’s unparalleled threat intelligence, informed by analyzing 6.5+ trillion signals daily and decades of security expertise at cloud scale, will help you modernize your security operations.

“Azure Sentinel provides a proactive and responsive cloud-native SIEM that will help customers simplify their security operations and scale as they grow.”

Richard Diver, Cloud Security Architect, Insight Enterprises

Security doesn’t have to be an endless saga. Instead, put the cloud and large-scale intelligence to work. Make your threat protection smarter and faster with artificial intelligence. Import Microsoft Office 365 data for security analytics for free. Get started with Microsoft Azure Sentinel.

Microsoft Azure Sentinel is available in preview today in the Azure portal.