AI is the new normal: Recap of 2018

The year 2018 was a banner year for Azure AI, as over a million Azure developers, customers, and partners engaged in the conversation on digital transformation. The next generation of AI capabilities is now infused across Microsoft products and services, including AI capabilities for Power BI.

Here are the top 10 Azure AI highlights from 2018 at a glance, spanning AI services, tools and frameworks, and infrastructure:

AI services

1. Azure Machine Learning (AML) service with new automated machine learning capabilities.

2. Historical milestones in Cognitive Services including unified Speech service.

3. Microsoft is first to enable Cognitive Services in containers.

4. Azure Cognitive Search and basketball.

5. Bot Framework v4 SDK, offering broader language support (C#, Python, Java, and JavaScript) and extensibility models.

AI tools and frameworks

6. Data science features in Visual Studio Code.

7. Open Neural Network Exchange (ONNX) runtime is now open source.

8. ML.NET and AI Platform for Windows developers.

AI infrastructure

9. Azure Databricks.

10. Project Brainwave, integrated with AML.

With so many exciting developments, why do these ten moments stand out? Read on, as this blog explains their significance.

AI services

These services span pre-built AI capabilities such as Azure Cognitive Services and Cognitive Search, Conversational AI with Azure Bot Service, and custom AI development with Azure Machine Learning (AML).

1. Azure Machine Learning

At Microsoft Connect, the Azure Machine Learning (AML) service became generally available with new automated machine learning (automated ML) capabilities. With AML, data scientists and developers can quickly and easily build, train, and deploy machine learning models anywhere, from the intelligent cloud to the intelligent edge. Once a model is developed, organizations can deploy and manage it in the cloud and on the edge, including on IoT devices, with integrated continuous integration and delivery (CI/CD) tooling.

To learn more, read our announcement blog, “Announcing the general availability of Azure Machine Learning service.”

Few people know the story behind how automated ML came to be. It all started in a gene-editing lab in 2016.

Dr. Nicolo Fusi, a machine learning researcher at Microsoft, encountered a problem while working with a new gene-editing technology called CRISPR. He was trying to use machine learning to predict the best way to edit a gene, but his model contained thousands of hyperparameters, making it too difficult and time consuming to optimize with existing methods. Then Dr. Fusi had a breakthrough idea: why not apply the same approach and algorithms used for recommending movies and products to this problem of model optimization? The result is a recommendation system for machine learning pipelines. The approach combines ideas from collaborative filtering and Bayesian optimization to identify promising machine learning pipelines, allowing data scientists and developers to automate model selection and hyperparameter tuning.
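The core idea, treating the choice of pipeline and hyperparameters as a search problem scored on held-out data, can be illustrated with a deliberately tiny, self-contained sketch in plain Python. Everything here (the dataset, the two feature transforms, the regularization grid) is invented for illustration and is not the actual AML implementation; a real recommender such as automated ML would predict which candidates are worth evaluating rather than scoring them all.

```python
import random

# Toy dataset: y = x^2 with noise, so the squared-feature pipeline should win.
random.seed(0)
train = [(x, x * x + random.uniform(-0.1, 0.1)) for x in [i / 10 for i in range(-20, 21)]]
holdout = [(x, x * x) for x in [-1.5, -0.5, 0.5, 1.5]]

def fit_ridge(data, transform, lam):
    """Closed-form 1-D ridge regression: w = sum(f(x)*y) / (sum(f(x)^2) + lam)."""
    num = sum(transform(x) * y for x, y in data)
    den = sum(transform(x) ** 2 for x, _ in data) + lam
    return num / den

def mse(w, data, transform):
    return sum((w * transform(x) - y) ** 2 for x, y in data) / len(data)

# The "search space": every combination of feature transform and regularization.
pipelines = [(name, f, lam)
             for name, f in [("linear", lambda x: x), ("squared", lambda x: x * x)]
             for lam in [0.0, 0.1, 1.0]]

# Evaluate each candidate pipeline on held-out data and keep the best.
scored = [(mse(fit_ridge(train, f, lam), holdout, f), name, lam)
          for name, f, lam in pipelines]
best_err, best_name, best_lam = min(scored)
print(best_name, best_lam, round(best_err, 4))
```

The exhaustive loop here is the naive baseline; the paper's contribution is replacing it with a model that recommends promising pipelines, much as a movie recommender suggests films.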

In this interview, Dr. Fusi gives you an inside look at how automated ML empowers decision-making and takes the tedium out of data science.

Check out the white paper “Probabilistic Matrix Factorization for Automated Machine Learning,” published on arXiv, to learn more.

2. New milestones for Azure Cognitive Services

Azure Cognitive Services is a collection of APIs that lets developers easily add vision, speech, language, and search capabilities to applications and machines. To date, more than 1.2 million developers use Cognitive Services.

At the Build 2018 conference, Microsoft unveiled the next wave of innovation for Cognitive Services:

New Services:

A unified Speech service, enabling developers to perform Speech to Text (speech transcription), Text to Speech (speech synthesis), and Speech Translation for providing real-time speech translation capabilities all through a single API.
A Custom Vision Service that makes it effortless to train an image recognition system by simply dragging and dropping a collection of images.
The preview of the Speech devices SDK as well as the new Speech client SDK.

Enhancements to existing services:

Updates to Video Indexer, which now automatically detects known brands in speech and visual text and can be trained to recognize custom brands.
Updates to Bing Custom Search, Custom Decision Service, and Cognitive Services Labs, with previews of emerging Cognitive Services technologies, as well as newly announced support for customizing neural machine translation.

For more details, read “Microsoft empowers developers with new Cognitive Services capabilities.”

3. Microsoft is the first company to deliver Cognitive Services in containers

In November Azure Cognitive Services containers became available in preview, making Azure the first platform with pre-built Cognitive Services that span the cloud and the edge.

To learn more, please read the technical blog “Getting started with Azure Cognitive Services in containers.”

4. Azure Cognitive Search and Basketball

Azure Cognitive Search, an AI-first approach to content understanding, became available in preview. Cognitive Search expands Azure Search with built-in cognitive skills that extract knowledge from content. This knowledge is then organized and stored in a search index, enabling new ways to explore the data.
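For a sense of what this looks like in practice, a Cognitive Search skillset is declared as JSON and attached to an indexer. The sketch below is an illustration based on the preview REST API; the skill types are real, but the field values are hypothetical and the shapes abbreviated. It chains entity recognition and key phrase extraction over each document's text, and the outputs can then be mapped into index fields:

```json
{
  "description": "Enrich documents with organizations and key phrases",
  "skills": [
    {
      "@odata.type": "#Microsoft.Skills.Text.EntityRecognitionSkill",
      "categories": [ "Organization" ],
      "inputs":  [ { "name": "text", "source": "/document/content" } ],
      "outputs": [ { "name": "organizations" } ]
    },
    {
      "@odata.type": "#Microsoft.Skills.Text.KeyPhraseExtractionSkill",
      "inputs":  [ { "name": "text", "source": "/document/content" } ],
      "outputs": [ { "name": "keyPhrases" } ]
    }
  ]
}
```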

Check out how the National Basketball Association (NBA) used Cognitive Search, Cognitive Services, and custom models to power rich data exploration in the //BUILD 2018 keynote.

Read “Announcing Cognitive Search: Azure Search + cognitive capabilities” for more details.

5. Bot Framework v4 SDK

With the general availability of the Bot Framework v4 SDK announced in September, developers can take advantage of broader language support: C# and JavaScript are generally available, while Python and Java are in preview. Developers can also take advantage of improved extensibility to harness a vibrant ecosystem of pluggable components, such as dialog management and machine translation. The Bot Framework also includes an emulator and a set of CLI tools to streamline the creation and management of a bot's language understanding services. Today the service has over 340,000 users and growing.

To learn more, check out Conversational AI Updates.

AI tools and frameworks

These tools and frameworks include Visual Studio tools for AI, Azure Notebooks, Data Science VMs, Azure Machine Learning Studio, ONNX, and the AI Toolkit for Azure IoT Edge.

6. Data science features in Visual Studio Code

As of November, data science features are available in the Python extension for Visual Studio Code! With these features, developers can work with data interactively in Visual Studio Code. Whether exploring data or incorporating machine learning models into applications, this makes Visual Studio Code an exciting new option for those who prefer an editor for data science tasks.

Visual Studio Tools for AI provides additional details for taking advantage of these new features.

7. ONNX Runtime is now open source

ONNX Runtime is now open source. ONNX is an open format for representing machine learning models that enables developers and data scientists to use the frameworks and tools that work best for them, including PyTorch, TensorFlow, scikit-learn, and more. ONNX Runtime is the first inference engine that fully supports the ONNX specification. Users typically see around a two-times improvement in performance.

At Microsoft, teams are using ONNX Runtime to improve the scoring latency and efficiency of their models. For models the teams converted to ONNX, average performance improved by two times compared to scoring in previous solutions. Leading hardware companies such as Qualcomm, Intel, and NVIDIA are actively integrating their custom accelerators into ONNX Runtime.

More details are available in the blog post “ONNX Runtime is now open source.”

8. ML.NET and AI Platform for Windows Developers

Developers can now access ML.NET, a new open-source, cross-platform machine learning framework. The technology behind AI features in Office and Windows has been released as a project on GitHub.

In addition, the AI Platform for Windows developers allows ONNX models to run natively on Windows devices.

Check out this blog post and video, “How Three Lines of Code and Windows Machine Learning Empower .NET Developers to Run AI Locally on Windows 10 Devices” for a helpful example of how to use these platforms.

AI infrastructure

This category covers Azure Data Services, compute services including Azure Kubernetes Services (AKS), and AI Silicon support including GPUs and FPGAs.

9. Azure Databricks

Azure Databricks, a fast, easy, and collaborative Apache® Spark™-based analytics platform optimized for Azure, became generally available. Today, organizations benefit from Azure Databricks' native integration with other Azure services such as Azure Blob Storage, Azure Data Factory, Azure SQL Data Warehouse, and Azure Cosmos DB. The platform enables new analytics solutions that support modern data warehousing, advanced analytics, and real-time analytics scenarios.

To learn more, read “Ignite 2018 – Making AI real for your business with Azure Data.”

10. Project Brainwave, integrated with Azure Machine Learning

Microsoft showcased the preview of Project Brainwave, integrated with Azure Machine Learning. This service brings hardware-accelerated real-time inference for AI to Azure. The Project Brainwave architecture is deployed on a type of computer chip from Intel called a field programmable gate array (FPGA), which makes real-time AI calculations at a competitive cost and with the industry's lowest lag time.

In addition, customers got a sneak peek of Project Brainwave coming to the edge, meaning customers will be able to take advantage of this computing speed in their businesses and facilities, even if their systems aren't connected to a network or the Internet.

Read “Real-time AI: Microsoft announces a preview of Project Brainwave” for more details.

AI is the new normal

AI catalyzes digital transformation. Microsoft believes in making AI accessible so that developers, data scientists, and enterprises can build systems that augment human ingenuity to tackle meaningful challenges.

AI is the new normal. Microsoft has more than 20 years of AI research applied to our products and services. Everyone can now access this AI through simple, yet powerful productivity tools such as Excel and Power BI.

In continual support of bringing AI to all, Microsoft introduced new AI capabilities for Power BI. These features enable all Power BI users to discover hidden, actionable insights in their data and drive better business outcomes with easy-to-use AI. No code needed to get started. Here are a few highlights:

Integration of Azure Cognitive Services.
Key driver analysis, which helps users understand what influences key business metrics.
The ability to create machine learning models directly in Power BI using automated ML.
Seamless integration of Azure Machine Learning within Power BI.

Moving forward into 2019

Many thanks to you, our customers, MVPs, developers, and partners, for being a part of Microsoft’s journey to empower businesses to build globally scalable AI applications. A new year is on the way, and the possibilities are endless. We can’t wait to share what we have in store for 2019 and to see what you will build with Azure this year. Happy New Year from the Azure AI team!
Source: Azure

Azure.Source – Volume 65

Now generally available

Announcing the general availability of Azure Data Box Disk

Azure Data Box Disk, an SSD-based solution for offline data transfer to Azure, is now generally available in the US, EU, Canada, and Australia, with more countries/regions to be added over time. Each disk is an 8 TB SSD that can copy data at up to USB 3.1 speeds and supports the SATA II and III interfaces. The disks are encrypted using 128-bit AES encryption and can be locked with your custom passkeys. In addition, check out the end of this post for an announcement about the public preview of Blob Storage on Azure Data Box. When this feature is enabled, you will be able to copy data to Blob Storage on Data Box using blob service REST APIs.

New year, newly available IoT Hub Device Provisioning Service features

The following Azure IoT Hub Device Provisioning Service features are now generally available: Symmetric key attestation support; Re-provisioning support; Enrollment-level allocation rules; and Custom allocation logic. The IoT Hub Device Provisioning Service is a helper service for IoT Hub that enables zero-touch, just-in-time provisioning to the right IoT hub without requiring human intervention, enabling you to provision millions of devices in a secure and scalable manner. All features are available in all provisioning service regions, through the Azure portal, and the SDKs will support these new features by the end of January 2019 (with the exception of the Python SDK).
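As a concrete example of symmetric key attestation with an enrollment group: each device presents a key derived from the group key by signing its registration ID with HMAC-SHA256, which is the derivation scheme DPS documents for group enrollments. A small sketch using only the Python standard library (the group key and registration ID below are placeholders, not real credentials):

```python
import base64
import hashlib
import hmac

def derive_device_key(group_key_b64: str, registration_id: str) -> str:
    """Derive a per-device symmetric key from an enrollment-group key by
    HMAC-SHA256-signing the device's registration ID with the decoded key."""
    key = base64.b64decode(group_key_b64)
    mac = hmac.new(key, registration_id.encode("utf-8"), hashlib.sha256)
    return base64.b64encode(mac.digest()).decode("ascii")

# Placeholder group key and registration ID, for illustration only.
group_key = base64.b64encode(b"not-a-real-enrollment-group-key").decode("ascii")
device_key = derive_device_key(group_key, "device-001")
print(device_key)
```

The derived key is what the device (or a provisioning tool acting for it) uses to sign its SAS token when it first contacts DPS, so the group key itself never has to be distributed to devices.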

News and updates

Cognitive Services Speech SDK 1.2 – December update – Python, Node.js/NPM and other improvements

Developers can now access the latest improvements to the Cognitive Services Speech Service, including a new Python API and more. See this post for what's new: the Python API for the Speech Service, Node.js support, Linux support, a more lightweight SDK for greater performance, control over server connectivity and connection status, and audio file buffering for unlimited audio session length. Support for ProGuard during Android APK generation is also now available.

New Azure Migrate and Azure Site Recovery enhancements for cloud migration

This post covers some of the new features added to Microsoft Azure Migrate and Azure Site Recovery that will help you in your lift-and-shift migration journey to Azure. Azure Migrate enables you to discover your on-premises environment and plan your migration to Azure. Based on popular demand, Azure Migrate is now enabled in two new geographies, Azure Government and Europe, with support for other Azure geographies to come. Azure Site Recovery (ASR) helps you migrate your on-premises virtual machines (VMs) to IaaS VMs in Azure (the lift-and-shift migration itself), and now includes support for physical servers with the UEFI boot type, Linux disk support, and migration from anywhere (public or private clouds).

Additional updates for migration support and Azure Site Recovery:

Support for SQL to Azure SQL Database online migrations
Support for MySQL to Azure Database for MySQL online migrations
Support for PostgreSQL to Azure Database for PostgreSQL online migrations
Azure Site Recovery – Update Rollup 32

Streamlined development experience with Azure Blockchain Workbench 1.6.0

Azure Blockchain Workbench 1.6.0 is now available and includes new features such as application versioning, updated messaging, and streamlined smart contract development. You can deploy a new instance of Workbench through the Azure portal or upgrade existing deployments to 1.6.0 using an upgrade script. Be advised that this release does include some breaking changes, so check the blog post for details. In addition, information for the latest updates is available from within the Workbench UI.

New smart device security research: Consumers call on manufacturers to do more

To better understand how the allure of smart device experiences stacks up against the concern for security, and whom consumers hold responsible for securing smart devices, we partnered with Greenberg Strategy, a consumer research firm, to poll more than 3,000 people in the US, UK, and Germany. The research showed that more than 90% of people expect manufacturers to do more for device security, and most people will avoid brands that have had public breaches. Security is the top consideration for consumers thinking of buying a device, and consumers are willing to pay more for highly secured devices. See the blog post for a detailed infographic that outlines the details of the research. Note that devices built with Azure Sphere always maintain their security health through a combination of secured hardware, a secured OS, and cloud security that provides automated software updates.

Multi-modal topic inferencing from videos

Azure Video Indexer is a cloud application built on Azure Media Analytics, Azure Search, and Cognitive Services. It enables you to extract insights from your videos using Video Indexer models. Multi-modal topic inferencing in Video Indexer is a new capability that can intuitively index media content using a cross-channel model to automatically infer topics. The model does so by projecting the video concepts onto three different ontologies: IPTC, Wikipedia, and the Video Indexer hierarchical topic ontology. Video Indexer's topic model empowers media users to categorize their content using an intuitive methodology and optimize their content discovery. Multi-modality is a key ingredient for recognizing high-level concepts in video.

The January release of Azure Data Studio

Azure Data Studio (formerly known as SQL Operations Studio) is a new cross-platform desktop environment for data professionals using the family of on-premises and cloud data platforms (such as SQL Server, Azure SQL Database, and Azure SQL Data Warehouse) on Windows, macOS, and Linux. The January release includes: Azure Active Directory authentication support; Data-Tier Application Wizard support; IDERA SQL DM Performance Insights (preview); updates to the SQL Server 2019 preview extension; SQL Server Profiler improvements; results streaming for large queries (preview); user setup installation support; and various bug fixes.

Additional updates

Azure Sphere: Update to the 18.11 release
Azure Sphere – Anatomy of a secured MCU
Final reminder: OMS portal moving to the Azure portal
Additional compute levels added to vCore-based Azure SQL databases and elastic pools

Technical content

To infinity and beyond: The definitive guide to scaling 10k VMs on Azure

Every platform has limits: workstations and physical servers have resource boundaries, APIs may be rate-limited, and even the perceived endlessness of the public cloud enforces limitations that protect the platform from overuse or misuse. Sometimes, however, scenarios push platforms to their extreme, those limits become real, and thought must be put into overcoming them. Solving this challenge must take into account not only the limitations and thresholds applied near the edge of the cloud platform's capabilities, but also cost, performance, and usability. Buzz is a scaling platform that uses Azure Virtual Machine Scale Sets (VMSS) to scale beyond the limits of a single scale set, enabling hyper-scale stress tests, DDoS simulators, and HPC use cases. Buzz orchestrates a number of Azure components to manage high-scale clusters of VMs running and performing the same actions, such as generating load on an endpoint.

Teradata to Azure SQL Data Warehouse migration guide

With the increasing benefits of cloud-based data warehouses, there has been a surge in the number of customers migrating from traditional on-premises data warehouses to the cloud. Teradata is a relational database management system and one of the legacy on-premises systems from which customers are looking to migrate. This post introduces a technical white paper on how to approach a Teradata to Azure SQL Data Warehouse migration. It is broken into sections that detail the migration phases, the preparation required for data migration (including schema migration), the migration of business logic, the data migration approach itself, and the testing strategy.

5 Microsoft Learn Modules for Getting Started with Azure

In this quick read, Ari Bornstein shares his top five recommendations for getting up to speed with Azure Services to help you navigate through fundamentals, storing data, deploying to the cloud, administering containers, and using serverless APIs.

Performance troubleshooting using new Azure Database for PostgreSQL features

At Ignite 2018, in response to customer feedback, the Azure Database for PostgreSQL team announced previews of Query Store (QS), Query Performance Insight (QPI), and Performance Recommendations (PR) to help ease performance troubleshooting. This post builds on a previous one, “Performance best practices for using Azure Database for PostgreSQL,” to show how you can use these recently announced features to troubleshoot some common scenarios.

Questions on data residency and compliance in Microsoft Azure? We got answers!

Transparency and control are essential to establishing and maintaining trust in cloud technology, while restricted and regulated industries have additional requirements for risk management and to ensure ongoing compliance. To address this, Microsoft provides an industry-leading security and compliance portfolio. See this post for a link to the white paper, Achieving Compliant Data Residency and Security with Azure. This paper provides guidance about the security, data residency, data flows, and compliance aspects of Azure. It is designed to help you ensure that your data on Microsoft Azure is handled in a way that meets data protection, regulatory, and sovereignty requirements.

Best practices for alerting on metrics with Azure Database for MariaDB monitoring

Whether you are a developer, a database analyst, a site reliability engineer, or a DevOps professional at your company, monitoring databases is an important part of maintaining the reliability, availability, and performance of your MariaDB server. This post provides guidance and best practices for alerting on the most commonly monitored metrics for MariaDB and areas you can consider improving based on these various metrics.

Azure shows

The Azure Podcast | Episode 261 – Outage Communications

Kendall, Cale, and Evan talk to Sami Kubba, a Senior PM Lead in the Azure CXP org, about how the team handles communications around outages and other issues in Azure. Great insight into what goes on behind the scenes to maintain full transparency into the workings of Azure.


Global real-time multi-user apps with Azure Cosmos DB | Azure Friday

Chris Anderson joins Donovan Brown to discuss how to use Azure Cosmos DB and other great Azure services to build a highly-scalable, real-time, collaborative application. You'll see techniques for using the Azure Cosmos DB change feed in both Azure Functions and SignalR applications. We also briefly touch on how custom authentication works with Azure Functions.
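For reference, the change feed side of such an app is wired up declaratively in the function's `function.json`. A sketch along these lines, with hypothetical database, collection, and connection setting names, is roughly what the Cosmos DB trigger binding looks like; the function then pushes each batch of changed documents out over SignalR:

```json
{
  "bindings": [
    {
      "type": "cosmosDBTrigger",
      "name": "documents",
      "direction": "in",
      "connectionStringSetting": "CosmosDBConnection",
      "databaseName": "chat",
      "collectionName": "messages",
      "leaseCollectionName": "leases",
      "createLeaseCollectionIfNotExists": true
    }
  ]
}
```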

What’s New? A Single Key for Cognitive Services | AI Show

In this video, we talk about the work we are doing to simplify the use of Cognitive Services in your applications. We now have a single key, which eliminates having to reference and manage a separate key for each service used by a single application.

Azure IoT Microsoft Professional Program | Internet of Things Show

Accelerate your career in one of the fastest-growing cloud technology fields: IoT. This program will teach you the device programming, data analytics, machine learning, and solution design skills you need for a successful career in IoT. Learn the skills necessary to start or progress a career working on a team that implements IoT solutions.

Consensus in Private Blockchains | Block Talk

This episode provides a review of consensus algorithms used primarily for consortium-based deployments, including the popular Proof of Authority, Proof of Work, and a variant of BFT. The core concepts of the algorithms are introduced, followed by a demonstration of using the popular Geth client to provision a PoA-based network and of how the consensus algorithm can be chosen at blockchain creation time, demonstrating pluggable consensus.

TWC9: Unlimited Free Private GitHub Repos, Python in Azure App Service, CES Highlights and more

This Week on Channel 9, Christina Warren reports on the latest developer news.

How to add logic to your Testing in Production sites with PowerShell | Azure Tips and Tricks

Learn how to add additional logic by using PowerShell to automatically distribute the load between your production and deployment slot sites with the Testing in Production feature.

How to work with connectors in Azure Logic Apps | Azure Tips and Tricks

Learn how to work with connectors in Azure Logic Apps. Azure Logic Apps has a collection of connectors that you can use to integrate with third-party services, such as the Twitter connector.

Learn about Serverless technology in Azure Government

Steve Michelotti, Principal Program Manager on the Azure Government team, sits down with Yujin Hong, Program Manager on the Azure Government Engineering team, to discuss serverless computing in Azure Government.

Azure DevOps Podcast | Aaron Palermo on Cybersecurity and SDP – Episode 018

Jeffrey Palermo interviews his older brother, Aaron Palermo. Aaron is a DevOps engineer, solution architect, and all-around cybersecurity expert. This episode is jam-packed with incredibly useful information applicable to software developers, but also to anybody who has a Wi-Fi network. Stay tuned to hear how an SDP replaces a VPN, Aaron's recommendations on how people can fully protect themselves online, which state-of-the-art multi-factor authentication people should be using, how to keep your data safe and protect against Wi-Fi vulnerabilities, and more.


Events

CES 2019: Microsoft partners, customers showcase breakthrough innovation with Azure IoT, AI, and Mixed Reality

We are continuing to see great momentum for Azure IoT and Azure AI for connected devices and experiences, and new partners and customers choosing Azure IoT and Azure AI to accelerate their business. From connected home products to connected car experiences, check out this post for a few examples from CES 2019 in Las Vegas. Then take a look at a couple of examples that demonstrate innovation for immersive, secured digital experiences.

CES 2019: The rise of AI in automotive

CES 2019 was the perfect venue to demonstrate how our customers and partners are enhancing their connected vehicle, autonomous vehicle, and smart mobility strategies using the power of the Microsoft intelligent cloud, intelligent edge, and AI capabilities. This post covers just a few examples of the innovative work happening today. As AI takes on more and more roles across the automotive ecosystem, it is inspiring to imagine the transformational possibilities that lie ahead for the industry.

Four ways to take your apps further with cloud, data, and AI solutions with Microsoft

Companies today demand the latest innovations for every solution they deliver. How can you make sure your infrastructure and data estate keep up with the demands of your business? Read this post for four tips on transforming your business with a modern data estate. Then register to attend a free webinar on Thursday, January 24, to learn more about the new features and products that can help you optimize value and overcome challenges in modernizing your data estate.

Customers, partners, and industries

Implement predictive analytics for manufacturing with Symphony Industrial AI

Symphony Industrial AI has a mission: to bring the promise of industrial IoT and AI to reality by delivering real value to its customers through predictive operations solutions. Two of Symphony's solutions are specially tailored to the process manufacturing sector (chemicals, refining, pulp and paper, metals and mining, and oil and gas). Check out this post to learn about the two solutions: Asset 360 AI and Process 360 AI.

Gain insight into your Azure Cosmos DB data with QlikView and Qlik Sense

Connecting data from various sources in a unified view can produce valuable insights that are otherwise invisible. Because Azure Cosmos DB can collect data from various sources in various formats, the ability to mix and match this data becomes even more important for empowering your business with additional knowledge and intelligence. This is what Qlik's analytics and visualization products, QlikView and Qlik Sense, have done for years, and they now support Azure Cosmos DB as a first-class data source. Qlik Sense and QlikView are data visualization tools that combine data from different sources into a single view.

Microsoft Azure-powered Opti platform helps Atlanta prevent flooding

The City of Atlanta Department of Watershed Management will use the Opti platform to prevent flooding by making a retention pond at a local park more efficient. Microsoft CityNext partners with Opti to prevent flooding in Atlanta and other cities. Microsoft CityNext is helping cities around the world become more competitive, sustainable, and prosperous. With partners like Opti, Microsoft is working with cities to engage their citizens, empower city employees, optimize city operations and infrastructure, and transform to accelerate innovation and opportunity. The portfolio organizes solution categories across five broad functional areas: Digital Cities, Educated Cities, Healthier Cities, Safer Cities, and Sustainable Cities.

3 ways AI can help retailers stay relevant

Microsoft recently partnered with Blue Yonder, a JDA company, to survey retailers everywhere on how they are adapting to the rapidly evolving retail market by using new technologies. The findings show that as retailers face new challenges in customer loyalty, online competition, and changing consumer expectations, they are more committed than ever to investing in technologies like the cloud and artificial intelligence (AI). Check out this post for three ways AI can help retailers survive in an unpredictable market. If you're attending NRF this week, drop by the Microsoft booth to visit with JDA and learn more about price optimization solutions.

How Microsoft AI empowers transformation in your industry

AI presents incredible opportunities for organizations to change the way they do business. With 1,000 researchers—including winners of the Turing Award and Fields Medal—in 11 labs, Microsoft has established itself as a leader in AI through its dogged focus on innovation, empowerment, and ethics. Now, the groundbreaking capabilities of AI can move beyond the lab to make a positive impact on every enterprise, every industry. As Microsoft continues to research AI and incorporate its capabilities into the technologies of everyday life, it also remains committed to an ethical future. Microsoft has identified six principles—fairness, reliability and safety, privacy and security, inclusivity, transparency, and accountability—to guide the development and use of artificial intelligence so technology reflects the diversity of those who use it. In the end, it’s less about what AI can do than what people can do with AI. Visit this post to download the white paper, Microsoft’s vision for AI in the enterprise.


A Cloud Guru’s Azure This Week – 11 January 2019

This time on Azure This Week, Lars Klint talks about the definitive guide to scaling 10k VMs on Azure, Teradata to Azure SQL Data Warehouse migration guide, and using QlikView and Qlik Sense with Azure Cosmos DB.

Source: Azure

Azure Backup for virtual machines behind an Azure Firewall

This blog post talks about how Azure Firewall and Azure Backup can be leveraged together to provide comprehensive protection for your data: the former protects your network, while the latter backs up your data to the cloud. Azure Firewall, now generally available, is a cloud-based network security service that protects your Azure Virtual Network resources. It is a fully stateful firewall as a service with built-in high availability and unrestricted cloud scalability. With Azure Firewall, you can centrally create, enforce, and log application and network connectivity policies across subscriptions and virtual networks. It uses a static public IP address for your virtual network resources, allowing outside firewalls to identify traffic originating from your virtual network.

Backup of Azure Virtual Machines

In a typical scenario, you may have Azure Virtual Machines (VMs) running business-critical workloads behind an Azure Firewall. While this is an effective means of shielding your VMs against network threats, you would also want to protect the data in the VMs using Azure VM Backup, which further reduces your exposure to data-loss risks. Azure Backup protects the data in your VMs by safely storing it in your Recovery Services vault. This involves moving data from your virtual machine storage to the vault and requires a network. However, all of this communication is performed over the secure Azure backbone network, with no need to access your virtual networks. You don't need to open any ports, allowlist any IPs, or grant any access to Azure Backup in your Azure Firewall. Hence, your backups work under the enhanced security of Azure Firewall without requiring any action on your part.

It is worth noting that this capability extends to other security measures that lock a VM down under network restrictions, such as network security groups (NSGs). Backup of Azure VMs therefore works seamlessly, regardless of the network restrictions you apply to keep your data within selected networks, and requires no additional action on your part.

Backup of SQL Server running inside an Azure VM (in preview)

Backup of SQL Servers running inside an Azure VM requires the backup extension to communicate with the Azure Backup service in order to upload backup data and emit monitoring information. This extension resides inside the virtual machine and requires network access. Hence, when backing up SQL Servers running inside Azure VMs, you need to permit the Azure Backup service to access the workload. This is a simple process that ensures the data is restricted to Azure Backup and maintains your desired level of security.

All you need to do is complete the following steps:

1. Navigate to your Azure Firewall.

2. Go to Rules and select the Application rule collection tab. Here you can create a new application rule collection, or edit an existing one if you have created application rule collections before.

3. Create a rule with the following details in an existing or new Application Rule Collection, under the FQDN tags section.

Field              Value
Priority           Enter an appropriate priority for the rule.
Action             Select Allow from the dropdown.
Name               Type a name for the rule.
Source Addresses   Enter * in the text box if you want this rule to apply to VMs in all subnets within the scope of the firewall; otherwise, specify the desired IPs or IP ranges.
FQDN Tags          Select AzureBackup from the dropdown.

The following is a sample rule for allowing Azure Backup to protect your SQL Servers in Azure VMs.

4. Select Add to create the aforementioned rule.

Once the rule is created, you can back up the databases inside your Azure virtual machine without any interruptions, all while Azure Firewall continues to protect the VM from external threats. For more on backing up your SQL Servers in Azure virtual machines, please read the blog, “Azure Backup for SQL Server on Azure now in public preview.”

Azure Backup and Azure Firewall complement each other well to provide complete protection for your resources and data in Azure. You do not need any special configuration or infrastructure to reap the benefits of using both services together. Read about backing up Azure Virtual Machines and backing up SQL Servers inside Azure Virtual Machines for more details.
Source: Azure

Best practices for alerting on metrics with Azure Database for MariaDB monitoring

On December 4, 2018, Microsoft announced the general availability of Azure Database for MariaDB. This blog shares some guidance and best practices for alerting on the most commonly monitored metrics for MariaDB.

Whether you are a developer, a database analyst, a site reliability engineer, or a DevOps professional at your company, monitoring databases is an important part of maintaining the reliability, availability, and performance of your MariaDB server. There are various metrics available for you in Azure Database for MariaDB to get insights on the behavior of the server. You can also set alerts on these metrics using the Azure portal or Azure CLI.

With modern applications evolving from a traditional on-premises approach to becoming more hybrid or cloud native, there is also a need to adopt some best practices for a successful monitoring strategy on a hybrid/public cloud. Here are some example best practices on how you can use monitoring data on your MariaDB server and areas you can consider improving based on these various metrics.

Active connections

Sample threshold (percentage or value): 80 percent of total connection limit for greater than or equal to 30 minutes, checked every five minutes.

Things to check

If you notice that active connections are at 80 percent of the total limit for the past half hour, verify if this is expected based on the workload.
If you think the load is expected, active connection limits can be increased by upgrading the pricing tier or vCores. You can check active connection limits for each SKU in our documentation, “Limitations in Azure Database for MariaDB.”
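As a rough illustration of the sample alert condition above, the sketch below checks whether every five-minute sample in a 30-minute window sits at or above 80 percent of the connection limit. The function name, the sample values, and the 50-connection limit are hypothetical; real metric values would come from Azure Monitor.

```python
# Hypothetical sketch of the sample alert condition: fire when every
# 5-minute sample in a 30-minute window (6 samples) is at or above
# 80 percent of the server's connection limit.
def breaches_connection_threshold(samples, connection_limit,
                                  threshold_pct=0.80, window_samples=6):
    """samples: chronological active-connection counts, one per 5 minutes."""
    if len(samples) < window_samples:
        return False  # not enough history to evaluate the window yet
    window = samples[-window_samples:]
    return all(s >= threshold_pct * connection_limit for s in window)

# Made-up example: a server with a 50-connection limit (threshold = 40).
recent = [30, 42, 45, 44, 41, 43, 45]
print(breaches_connection_threshold(recent, 50))  # True
```

In practice you would configure this as an Azure Monitor metric alert rather than polling yourself; the sketch only makes the threshold arithmetic concrete.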

Failed connections

Sample threshold (percentage or value): 10 failed connections in the last 30 minutes, checked every five minutes.

Things to check

If you see connection request failures over the last half hour, verify if this is expected by checking the logs for failure reasons.

If this is a user error, take the appropriate action. For example, if authentication failed, check your username and password.
If the error is SSL related, check that the SSL settings and input parameters are properly configured.

Example: mysql -h mydemoserver.mariadb.database.azure.com -u mylogin@mydemoserver -p --ssl-ca=root.crt --ssl-verify-server-cert

CPU percent or memory percent

Sample threshold (percent or value): 100 percent for five minutes or 95 percent for more than two hours.

Things to check

If you have hit 100 percent CPU or memory usage, check your application telemetry or logs to understand the impact of the errors.
Review the number of active connections. Check for connection limits in our documentation, “Limitations in Azure Database for MariaDB.” If your application has exceeded the max connections or is reaching the limits, then consider scaling up compute.

IO percent

Sample threshold (percent or value): 90 percent usage for greater than or equal to 60 minutes.

Things to check

If you see that IOPS is at 90 percent for one hour or more, verify if this is expected based on the application workload.
If you expect a high load, then increase the IOPS limit by increasing storage. The storage-to-IOPS mapping is illustrated below as a reference.

Storage

The storage you provision is the amount of storage capacity available to your Azure Database for MariaDB server. The storage is used for the database files, temporary files, transaction logs, and the MariaDB server logs. The total amount of storage you provision also defines the I/O capacity available to your server.

 
                         Basic                    General purpose                 Memory optimized
Storage type             Azure Standard Storage   Azure Premium Storage           Azure Premium Storage
Storage size             5 GB to 1 TB             5 GB to 4 TB                    5 GB to 4 TB
Storage increment size   1 GB                     1 GB                            1 GB
IOPS                     Variable                 3 IOPS/GB (min 100, max 6000)   3 IOPS/GB (min 100, max 6000)
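For the General purpose and Memory optimized tiers, the table's storage-to-IOPS mapping can be expressed as a one-line formula, sketched below (Basic-tier IOPS is variable and not modeled here; the function name is ours):

```python
# Storage-to-IOPS mapping for General purpose and Memory optimized tiers:
# 3 IOPS per provisioned GB, clamped to a 100 IOPS floor and a 6000 IOPS cap.
def provisioned_iops(storage_gb):
    return min(max(3 * storage_gb, 100), 6000)

print(provisioned_iops(5))     # 100  (floor applies to small volumes)
print(provisioned_iops(1024))  # 3072
print(provisioned_iops(4096))  # 6000 (cap)
```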

Storage percent

Sample threshold (percent or value): 80 percent

Things to check

If your server is approaching the provisioned storage limit, it will soon run out of space and be set to read-only.
Monitor your usage, and provision more storage if needed so you can continue using the server without deleting files, logs, and more.

If you have tried everything and none of the monitoring tips mentioned above lead you to a resolution, please don't hesitate to contact Microsoft Azure Support.

Acknowledgments

Special thanks to Andrea Lam, Program Manager, Azure Database for MariaDB for her contributions to this blog.
Source: Azure

New year, newly available IoT Hub Device Provisioning Service features

We’re ringing in 2019 by announcing the general availability of the Azure IoT Hub Device Provisioning Service features we first released back in September 2018! The following features are all generally available to you today:

Symmetric key attestation support
Re-provisioning support
Enrollment-level allocation rules
Custom allocation logic

All features are available in all provisioning service regions through the Azure portal, and the SDKs will support these new features by the end of January 2019 (with the exception of the Python SDK). Let’s talk a little more about each feature.

Symmetric key attestation

Symmetric keys are one of the easiest ways to start off using the provisioning service and provide an easy "Hello world" experience for those of you who want to get started with provisioning but haven’t yet decided on an authentication method. Furthermore, symmetric key enrollment groups provide a great way for legacy devices with limited existing security functionality to bootstrap to the cloud via Azure IoT. Check the docs to learn more about how to connect legacy devices.

Symmetric key support is available in two ways:

Individual enrollments, in which devices connect to the Device Provisioning Service just like they do in IoT Hub.
Enrollment groups, in which devices connect to the Device Provisioning Service using a symmetric key derived from a group key.

The documentation has more about how to use symmetric keys to verify a device's identity.
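For enrollment groups, the derived device key mentioned above is an HMAC-SHA256 of the device's registration ID, keyed with the group master key. A minimal sketch, with made-up key and registration ID values:

```python
import base64
import hashlib
import hmac

# Derive a per-device key from an enrollment-group key: HMAC-SHA256 of the
# device's registration ID, keyed with the (base64-decoded) group key.
def derive_device_key(group_key_b64, registration_id):
    key = base64.b64decode(group_key_b64)
    digest = hmac.new(key, registration_id.encode("utf-8"),
                      hashlib.sha256).digest()
    return base64.b64encode(digest).decode("ascii")

# Placeholder group key and registration ID, for illustration only.
group_key = base64.b64encode(b"example-group-master-key").decode("ascii")
print(derive_device_key(group_key, "my-device-001"))
```

The device then uses the derived key, rather than the group key itself, when it authenticates to the provisioning service.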

Automated re-provisioning support

We added first-class support for device re-provisioning which allows devices to be reassigned to a different IoT solution sometime after the initial solution assignment. Re-provisioning support is available in two options:

Factory reset, in which the device twin data for the new IoT hub is populated from the enrollment list instead of the old IoT hub. This is common for factory reset scenarios as well as leased device scenarios.
Migration, in which device twin data is moved from the old IoT hub to the new IoT hub. This is common for scenarios in which a device is moving between geographies.

We’ve also taken steps to preserve backward compatibility for those who need it. Check the documentation, “IoT Hub Device reprovisioning concepts,” to learn the details. The documentation also has more on how to use re-provisioning.

Enrollment-level allocation rules

Customers need fine-grain control over how their devices are assigned to the proper IoT hub. For example, Contoso is a solution provider with two large multinational companies as customers. Each of Contoso’s customers is using Contoso devices across the globe in a geo-sharded setup. Contoso needs the ability to tell the provisioning service that customer A’s devices need to go to one set of hubs distributed geographically and that customer B’s devices need to go to another set of hubs distributed geographically. Enrollment-level allocation rules allow Contoso to do just that.

There are two pieces of functionality that light up:

Specifying an allocation policy per enrollment gives finer-grained control.
Linked hub scoping allows the allocation policy to run over a subset of hubs.

This is available for both individual and group enrollments.

Custom allocation logic

With custom allocation logic, the Device Provisioning Service will trigger an Azure Function to determine where a device ought to go and what configuration should be applied to the device. Custom allocation logic is set at the enrollment level.
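To make the idea concrete, here is a minimal sketch of what such allocation logic might compute. The real version runs as an Azure Function invoked by the provisioning service; the dictionary shape, hub names, and tag below are simplified illustrations, not the exact service contract.

```python
import hashlib

# Pick a linked IoT hub deterministically by hashing the device's
# registration ID, so a given device always lands on the same hub,
# and attach an initial twin tag as the device configuration.
def allocate(registration_id, linked_hubs):
    digest = hashlib.sha256(registration_id.encode("utf-8")).hexdigest()
    chosen = linked_hubs[int(digest, 16) % len(linked_hubs)]
    return {
        "iotHubHostName": chosen,
        "initialTwin": {"tags": {"allocatedBy": "custom-logic"}},
    }

# Hypothetical linked hubs for illustration.
hubs = ["hub-us.azure-devices.net", "hub-eu.azure-devices.net"]
print(allocate("device-001", hubs)["iotHubHostName"])
```

Hashing the registration ID is one simple deterministic policy; any rule that maps device context to one of the linked hubs would fit the same slot.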

To sum things up with a limerick:

New features we announced last fall

Are ready for one and for all.

More flexibility

Makes provisioning easy

For devices from big to the small.
Source: Azure

Implement predictive analytics for manufacturing with Symphony Industrial AI

Technology allows manufacturers to generate more data than traditional systems and users can digest. Predictive analytics, enabled by big data and cloud technologies, can take advantage of this data and provide new and unique insights into the health of manufacturing equipment and processes. While most manufacturers understand the value of predictive analytics, many find it challenging to introduce into the line of business. Symphony Industrial AI has a mission: to bring the promise of Industrial IoT (IIoT) and artificial intelligence (AI) to reality by delivering real value to their customers through predictive operations solutions. Two solutions by Symphony are specially tailored to the process manufacturing sector (chemicals, refining, pulp and paper, metals and mining, and oil and gas).

There are two solutions offered by Symphony Industrial AI:

Asset 360 AI
Process 360 AI

The first focuses on existing machinery, and the second on common processes.

Problem: the complexity of data science

Manufacturers have deep knowledge of their manufacturing processes, but they typically lack the expertise of data scientists, who have a deep understanding of statistical modeling, a fundamental component of most predictive analytics applications. And even when the application of predictive analytics is a success, most deployments fail to provide users with the root causes, or contributing factors, of identified (predicted) issues so that they can take quick and decisive action on the new-found insight.

Solution: predictive analytics made easy

Symphony Industrial AI answers with a pre-built, template-driven approach that minimizes data scientist requirements and promotes rapid predictive analytics deployments. The solution features a data management platform for the process manufacturing sector. It provides real-time stream processing on time-series and related data for predictive analytics, leveraging cloud and big data technologies. The figure below shows an example of the solution’s dashboard.

Symphony Industrial AI’s solution speeds time-to-value through rapid deployment for minimized time and financial investment. Some of its features include:

Operations Data Lake (ODL): Pre-built integrations to existing systems of record (historians, EAM/CMMS, SCADA, and more).
Equipment and process template library: A library of equipment and process templates (pre-packaged analytics) that accelerate implementation and time-to-value.
AI/ML algorithms: Pre-packaged algorithms for failure/anomaly prediction.
Asset 360 AI and Process 360 AI: Pre-packaged solutions for asset performance intelligence and operations/process intelligence, respectively.

Two solutions: equipment models and process models

Predictive analytics solutions tend to focus on equipment health scenarios, as the data is readily modeled. To ease implementation, Asset 360 AI deploys equipment models (also known as asset models) from a template library, which includes heat exchangers, pumps, compressors, and so forth.

Symphony Industrial AI’s second solution, Process 360 AI, helps users create predictive models of their processes. A process is defined at a high level as the items (such as chemicals, fuels, metals, and other intermediate and finished products) that are produced through the equipment. Process template examples include an ammonia process, an ethylene process, an LNG process, and a polypropylene process. Process models help predict process upsets and trips, which equipment models alone may not be able to predict.

Benefits

Built with AI and machine learning (ML), Asset 360 AI and Process 360 AI integrate seamlessly with the equipment and devices already owned. The solutions predict failures before they happen, resulting in several benefits.

A reduction in unplanned downtime and process trips.
A reduction in capital expenditure and asset maintenance costs.
Improvement in quality using gathered process and product data.
Improvement in safety and in tracking workforce effectiveness.

Microsoft technologies

Symphony Industrial AI’s solution is delivered as a SaaS model on Azure using the following services:

Azure IoT Hub
Azure Machine Learning

These services ensure the latest features of IoT and AI advances can be implemented. Additionally, Power BI gives users a rich surface to use for finding insights and monitoring processes.

For manufacturers looking for a way to introduce predictive analytics, Symphony Industrial AI offers two solutions that are easy to implement through a template-driven process. The template libraries include models for existing equipment and standard manufacturing flows. To find out more, go to Asset 360 AI or Process 360 AI and select Contact me.
Source: Azure

Questions on data residency and compliance in Microsoft Azure? We got answers!

Questions about the security of and control over customer data, and where it resides, are on the minds of cloud customers today. We’re hearing you, and in response, we published a whitepaper that gives clear answers and guidance on the security, data residency, data flow, and compliance aspects of Microsoft Azure. The paper is designed to help our customers ensure that their customer data on Azure is handled in a way that meets their data protection, regulatory, and sovereignty requirements.

Transparency and control are essential to establishing and maintaining trust in cloud technology, and restricted and regulated industries have additional requirements for risk management and ongoing compliance. To address this, Microsoft provides an industry-leading security and compliance portfolio.

Security is built into the Azure platform beginning with the development process, which is conducted in accordance with the Security Development Lifecycle (SDL). Azure also includes technologies, controls, and tools that address data management and governance, such as Active Directory identity and access controls, network and infrastructure security technologies and tools, threat protection, and encryption to protect data in transit and at rest.

Microsoft gives customers options so they can control the types of data and the locations where customer data is stored on Azure. With these security and compliance frameworks, customers in regulated industries can confidently run mission-critical workloads in the cloud and leverage all the advantages of Microsoft’s hyperscale cloud.

Download the whitepaper, “Achieving compliant data residency and security with Azure.”

Learn more and get a list of Microsoft‘s compliance offerings on the Microsoft Trust Center site.
Source: Azure

Performance troubleshooting using new Azure Database for PostgreSQL features

At Ignite 2018, in response to customer feedback, Microsoft announced the preview of Query Store (QS), Query Performance Insight (QPI), and Performance Recommendations (PR) for Azure Database for PostgreSQL to help ease performance troubleshooting. This blog intends to inspire ideas on how you can use these currently available features to troubleshoot some common scenarios.

A previous blog post on performance best practices touched upon the layers at which you might be experiencing issues based on the application pattern that you are using. This blog nicely categorizes the problem space into several areas and the common techniques to rule out possibilities to quickly get to the root cause. We would like to further expand on this with the help of these newly announced features (QS, QPI, and PR).

In order to use these features, you will need to enable data collection by setting pg_qs.query_capture_mode and pgms_wait_sampling.query_capture_mode to ALL.

You can use Query Store for a wide variety of troubleshooting scenarios once data collection is enabled. In this article, we will limit the scope to the regressed queries scenario.

Regressed queries

One of the important scenarios that Query Store enables you to monitor is regressed queries. By setting pg_qs.query_capture_mode to ALL, you get a history of your query performance over time. You can leverage this data for simple or more complex comparisons, based on your needs.

One of the challenges you face when generating a regressed query list is the selection of comparison period in which you baseline your query runtime statistics. There are a handful of factors to think about when selecting the comparison period:

Seasonality: Does the workload or the query of your concern occur periodically rather than continuously?
History: Is there enough historical data?
Threshold: Are you comfortable with a flat percentage change threshold or do you require a more complex method to prove the statistical significance of the regression?

Now, let’s assume no seasonality in the workload and that the default seven days of history will be enough to evaluate a simple threshold of change to pick regressed queries. All you need to do is to pick a baseline start and end time, and a test start and end time to calculate the amount of regression for the metric you would like to track.

Looking at the past seven-day history compared to the last two hours of execution, the function below returns the top regressed queries in descending order of percent change. Note that a negative value indicates an improvement from the baseline to the test period; a value of zero means the query was either unchanged or not executed during the baseline period.

create or replace function get_ordered_query_performance_changes(
baseline_interval_start int,
baseline_interval_type text,
current_interval_start int,
current_interval_type text)
returns table (
query_id bigint,
baseline_value numeric,
current_value numeric,
percent_change numeric
) as $$
with data_set as (
select query_id
, round(avg( case when start_time >= current_timestamp - ($1 || $2)::interval and start_time < current_timestamp - ($3 || $4)::interval then mean_time else 0 end )::numeric,2) as baseline_value
, round(avg( case when start_time >= current_timestamp - ($3 || $4)::interval then mean_time else 0 end )::numeric,2) as current_value
from query_store.qs_view where query_id != 0 and user_id != 10 group by query_id ) ,
query_regression_data as (
select *
, round(( case when baseline_value = 0 then 0 else (100*(current_value - baseline_value) / baseline_value) end )::numeric,2) as percent_change
from data_set )
select * from query_regression_data order by percent_change desc;
$$
language 'sql';

If you create this function and execute the following, you will get the top regressed queries in the last two hours in descending order compared to their calculated baseline value over the last seven days up to two hours ago.

select * from get_ordered_query_performance_changes (7, 'days', 2, 'hours');

The top changes are all good candidates to go after, unless you expect that kind of delta from your baseline period because, say, you know the data size will change or the volume of transactions will increase. Once you have identified the query you would like to investigate further, the next step is to look deeper into the Query Store data, see how the baseline statistics compare to the current period, and collect additional clues.

create or replace function compare_baseline_to_current_by_query_id(baseline_interval_cutoff int,baseline_interval_type text,query_id bigint,percentile decimal default 1.00)
returns table(
query_id bigint,
period text,
percentile numeric,
total_time numeric,
min_time numeric,
max_time numeric,
rows numeric,
shared_blks_hit numeric,
shared_blks_read numeric,
shared_blks_dirtied numeric,
shared_blks_written numeric,
local_blks_hit numeric,
local_blks_read numeric,
local_blks_dirtied numeric,
local_blks_written numeric,
temp_blks_read numeric,
temp_blks_written numeric,
blk_read_time numeric,
blk_write_time numeric
)
as $$

with data_set as
( select *
, ( case when start_time >= current_timestamp - ($1 || $2)::interval then 'current' else 'baseline' end ) as period
from query_store.qs_view where query_id = ( $3 )
)
select query_id
, period
, round((case when $4 <= 1 then 100 * $4 else $4 end)::numeric,2) as percentile
, round(percentile_cont($4) within group ( order by total_time asc)::numeric,2) as total_time
, round(percentile_cont($4) within group ( order by min_time asc)::numeric,2) as min_time
, round(percentile_cont($4) within group ( order by max_time asc)::numeric,2) as max_time
, round(percentile_cont($4) within group ( order by rows asc)::numeric,2) as rows
, round(percentile_cont($4) within group ( order by shared_blks_hit asc)::numeric,2) as shared_blks_hit
, round(percentile_cont($4) within group ( order by shared_blks_read asc)::numeric,2) as shared_blks_read
, round(percentile_cont($4) within group ( order by shared_blks_dirtied asc)::numeric,2) as shared_blks_dirtied
, round(percentile_cont($4) within group ( order by shared_blks_written asc)::numeric,2) as shared_blks_written
, round(percentile_cont($4) within group ( order by local_blks_hit asc)::numeric,2) as local_blks_hit
, round(percentile_cont($4) within group ( order by local_blks_read asc)::numeric,2) as local_blks_read
, round(percentile_cont($4) within group ( order by local_blks_dirtied asc)::numeric,2) as local_blks_dirtied
, round(percentile_cont($4) within group ( order by local_blks_written asc)::numeric,2) as local_blks_written
, round(percentile_cont($4) within group ( order by temp_blks_read asc)::numeric,2) as temp_blks_read
, round(percentile_cont($4) within group ( order by temp_blks_written asc)::numeric,2) as temp_blks_written
, round(percentile_cont($4) within group ( order by blk_read_time asc)::numeric,2) as blk_read_time
, round(percentile_cont($4) within group ( order by blk_write_time asc)::numeric,2) as blk_write_time
from data_set
group by 1, 2
order by 1, 2 asc;
$$
language 'sql';

Once you create the function, provide the query ID you would like to investigate. The function compares the aggregate values before and after the cutoff time you provide. For instance, the statement below compares all points prior to 30 minutes ago with points from the last 30 minutes for the given query. If you are aware of outliers that you want to exclude, you can use a percentile value.

select * from compare_baseline_to_current_by_query_id(30, 'minutes', 4271834468, 0.95);

If you don’t provide one, the default value is 1.00 (the 100th percentile), which includes all data points.

select * from compare_baseline_to_current_by_query_id(2, 'hours', 4271834468);

If you have ruled out a significant data size change and the cache hit ratio is rather steady, you may also want to investigate any obvious changes in wait event occurrences within the same period. Because wait event types combine similar wait types into buckets, there is no single prescription for how to analyze the data. However, a general comparison may give us ideas about changes in system state.

create or replace function compare_baseline_to_current_by_wait_event (baseline_interval_start int,baseline_interval_type text,current_interval_start int,current_interval_type text)
returns table(
wait_event text,
baseline_count bigint,
current_count bigint,
current_to_baseline_factor double precision,
percent_change numeric
)
as $$
with data_set as
( select event_type || ':' || event as wait_event
, sum( case when start_time >= current_timestamp - ($1 || $2)::interval and start_time < current_timestamp - ($3 || $4)::interval then 1 else 0 end ) as baseline_count
, sum( case when start_time >= current_timestamp - ($3 || $4)::interval then 1 else 0 end ) as current_count
, extract(epoch from ( $1 || $2 ) ::interval) / extract(epoch from ( $3 || $4 ) ::interval) as current_to_baseline_factor
from query_store.pgms_wait_sampling_view where query_id != 0
group by event_type || ':' || event
) ,
wait_event_data as
( select *
, round(( case when baseline_count = 0 then 0 else (100*((current_to_baseline_factor*current_count) - baseline_count) / baseline_count) end )::numeric,2) as percent_change
from data_set
)
select * from wait_event_data order by percent_change desc;
$$
language 'sql';

select * from compare_baseline_to_current_by_wait_event (7, 'days', 2, 'hours');

The query above lets you see abnormal changes between the two periods. Note that the event counts here are approximations, and the numbers should be interpreted in the context of the instance's comparative load over each period.
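The normalization the function applies can be restated in a few lines: because the baseline window (seven days) is much longer than the current window (two hours), raw event counts are not comparable, so the current count is scaled by the ratio of the window lengths before the percent change is computed. A standalone sketch of that arithmetic (function name ours):

```python
# Percent change in wait-event counts between two windows of different
# lengths: scale the current count by baseline_seconds / current_seconds
# so both counts refer to the same amount of elapsed time.
def wait_event_percent_change(baseline_count, current_count,
                              baseline_seconds, current_seconds):
    if baseline_count == 0:
        return 0.0  # mirrors the SQL: no baseline data, report no change
    factor = baseline_seconds / current_seconds
    return round(100 * (factor * current_count - baseline_count)
                 / baseline_count, 2)

# 7 days vs. 2 hours: factor = 84, so 12 recent events extrapolate to 1008.
print(wait_event_percent_change(900, 12, 7 * 86400, 2 * 3600))  # 12.0
```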

As you can see, with the time series data available in Query Store, your creativity is the only limit to the kinds of analysis and algorithms you could implement. We showed some simple calculations that apply straightforward techniques to identify candidates for improvement. We hope this serves as your starting point, and that you share with us what works, what doesn’t, and how you take this to the next level.

We are always looking forward to hearing feedback from you!

Acknowledgments

Special thanks to Senior Data Scientist Korhan Ileri, and Principal Data Scientist Managers Intaik Park and Saikat Sen for their contributions to this blog post.
Source: Azure

CES 2019: Microsoft partners, customers showcase breakthrough innovation with Azure IoT, AI, and Mixed Reality

Each year at CES, we see dozens of new product innovations that bring additional convenience, entertainment, efficiency – or completely new experiences to our daily lives. By bringing the power of the cloud to connected devices, the Internet of Things (IoT) and artificial intelligence (AI) have played an ever-expanding role in driving the connected product business opportunity. Today, smart thermostats, speakers, TVs, appliances, cars, and more are no longer serving an “early adopter” market – they are entering the mainstream – as people look for technology to help enrich how they plan and experience their daily lives.

Our Azure IoT and AI strategy enables customers to build these new products and solutions using the power of the intelligent cloud and intelligent edge, at scale. The Azure IoT platform helps customers build consistent AI-based applications and experiences from the cloud to the edge, that are adaptive and responsive to physical environments – from smart cities and spaces to connected products in homes and on the manufacturing floor. Our Azure AI services combine the latest advances in technologies like machine learning and deep learning, with our comprehensive data, Azure cloud and productivity platform, and a trusted, enterprise grade approach.

We are continuing to see great momentum for Azure IoT and Azure AI for connected devices and experiences, and new partners and customers choosing Azure IoT and Azure AI to accelerate their business. Here are just a few examples at CES:

Connected home products

Universal Electronics Inc. (UEI), a company specializing in universal control and sensing technologies for the smart home, and Microsoft are collaborating to launch a new digital assistant platform developed with Microsoft’s cloud, AI, and IoT services. Together, we have created a white-label Smart Home hub platform, Nevo Butler, with an integrated digital assistant (nevo.ai), as well as a range of turnkey kits addressing home safety and security, energy management, and entertainment control, in residential or hospitality domains, so companies who want to offer connected experiences to their customers can do so in a seamless and managed way. The QuickSet Cloud platform running on Azure IoT is powering millions of connected devices in the home through customers including Comcast, Sony, LGI, Samsung and others. The nevo.ai digital assistant development leverages the Azure Bot solution accelerator for virtual assistant, where complex and evolving features can be delivered through simple natural language interface, making these experiences accessible to a wide range of audiences.
Hampton Products International, makers of ARRAY By Hampton family of connected devices, has selected the Microsoft Azure IoT cloud platform to power its next generation smart devices and future IoT product development. Hampton is a trusted leader in the home security space, having produced more than 1.5 billion door locks, padlocks, door hardware, and security lights. With the ARRAY By Hampton Connected Lock and Video Coach Lights, Hampton Products is bringing IoT to home security for thousands of devices.

Connected car experiences

ZF, a company specializing in driveline, chassis and vehicle safety technology, is showcasing an intelligent platform that allows customers to integrate a variety of capabilities for a seamless end-user experience. This includes functions ranging from fleet management and ride-sharing to innovative delivery services – all built on the Azure IoT platform.
LG Electronics (LGE), a global technology and manufacturing company, is partnering with Microsoft to accelerate the growth of its connected and smart vehicle component business. LGE will build its Advanced Driver Assistance Systems on the Azure IoT platform, using Azure Data Box for data ingestion and transfer, and will incorporate the Azure Bot solution accelerator for virtual assistant.
Visteon, a Fortune 500 cockpit technology company, introduced its DriveCore Studio autonomous driving platform at CES 2018. This year, it announced that it is moving DriveCore to the cloud with Microsoft Azure, giving OEMs a trusted and secure location to collaborate.
The BMW Group announced its own Intelligent Personal Assistant for its cars, coming March 2019 with support for 23 languages. BMW Intelligent Personal Assistant will be supported by Microsoft’s cloud, AI, and machine learning capabilities. Over time, it will learn from your habits and get smarter. Try out BMW’s personal assistant and learn more about their connected car vision at CES 2019.

Innovation for immersive, secured digital experiences

Itron has been in the energy and water markets for decades. To help customers create a more resourceful world, the company established Itron Idea Labs to scale innovation. Itron Idea Labs is exploring technologies that can be combined with Itron’s fully standards-compliant IPv6 multi-application network to connect IoT devices through a powerful distributed computing platform. At CES 2019, Itron Idea Labs is demoing several new ideas that rethink how communities operate. A featured demo will allow visitors to step into the shoes of tomorrow’s urban planner in an experience using Microsoft’s HoloLens and Azure Digital Twins technologies. Visitors will interact with a virtualized architectural model of the Lincoln Heights neighborhood in downtown Los Angeles to simulate the impact of infrastructure improvements before they are installed.
Avnet, a Microsoft partner and leading global technology solutions provider, is announcing the new Azure Sphere MT3620 Starter Kit to further enhance the developer experience for creating secured, connected microcontroller-unit (MCU) devices. Billions of MCU-powered IoT devices ship every year. Azure Sphere, in public preview since September, was designed to address the security of these devices holistically at every layer, from the silicon to the cloud. Avnet is a lead partner for Azure Sphere and the first to distribute the solution.

These partners and more illustrate how IoT is reaching its full potential with new solutions that make it easier to create intelligent applications from the cloud to the edge. To learn more about our IoT platform, visit https://azure.microsoft.com/en-us/overview/iot/. We are excited to see what our customers and partners create next.
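To make the cloud-to-edge pattern these solutions share a little more concrete, here is a minimal, purely illustrative sketch of the kind of JSON telemetry message a connected device might send to a cloud hub. The device ID, field names, and payload shape are assumptions for illustration only, not part of any partner solution described above; in a real deployment, the Azure IoT device SDKs would handle authentication and transport.

```python
import json
import time

def build_telemetry(device_id: str, temperature_c: float, humidity_pct: float) -> str:
    """Build a JSON telemetry payload of the kind an IoT edge device
    might send to a cloud hub. All field names here are illustrative."""
    payload = {
        "deviceId": device_id,
        "timestamp": int(time.time()),  # Unix epoch seconds
        "temperature_c": temperature_c,
        "humidity_pct": humidity_pct,
    }
    return json.dumps(payload)

# Example: a hypothetical smart home sensor reporting a reading
message = build_telemetry("home-hub-001", 21.5, 43.0)
print(message)
```

On the cloud side, messages like this would typically be ingested by a hub service, routed to storage or stream analytics, and used to drive the adaptive experiences described above.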
Source: Azure

New smart device security research: Consumers call on manufacturers to do more

It’s become a reliable January tradition for manufacturers to introduce an amazing array of consumer devices at the Consumer Electronics Show (CES). These new devices enter a booming market: Gartner predicts “14.2 billion connected things will be in use in 2019, and that the total will reach 25 billion by 2021.” This year, connected devices will dominate CES again as device manufacturers lean further into their vision for the smart home. The promise is compelling: smart home experiences will remove friction from our day-to-day lives, save us money, keep us healthy, and help us lower our environmental footprints, ultimately empowering us all to achieve more.

As people open their personal lives and spaces to these smart experiences, they’re also becoming increasingly attuned to the security risks that smart technology can introduce. Their concern builds as news headlines give shape to the many ways that smart devices are being weaponized by attackers to invade personal privacy, steal sensitive data, and take down infrastructure with scaled attacks.

We set out to better understand how the allure of smart device experiences stacks up against the concern for security, and whom consumers hold responsible for securing smart devices. To this end, we partnered with Greenberg Strategy, a consumer research firm, to poll more than 3,000 people in the US, UK, and Germany.

The research showed that consumers demand action from manufacturers. More than 90% of people expect manufacturers to do more for device security, and most will avoid brands that have had public breaches. On the other hand, consumer security awareness creates opportunities for manufacturers who design devices with security in mind. Security is the top consideration for consumers thinking of buying a device, and consumers are willing to pay more for highly secured devices.

In sum, security is a revenue driver for manufacturers, not a cost driver, when it comes to connected devices.

Research details: The opportunity of highly-secured devices


Azure Sphere, an intelligent solution for smart device security

As consumer excitement builds around smart devices, cybersecurity risks grow each day. To minimize the security risks that smart devices introduce to homes, businesses, and society, it’s time for manufacturers across industries to redefine “smart.” If a device is not secure, it is not smart. This need for smart, secure devices is why Microsoft introduced Azure Sphere – to make it easy for manufacturers to create smart products that are innately secured.

Devices built with Azure Sphere always maintain their security health through a combination of secured hardware, a secured OS, and cloud security that provides automated software updates. When device security is built-in, manufacturers and their customers can confidently embrace the opportunities and benefits of smart devices.

Learn more about how customers like E.ON are building smart experiences secured by Azure Sphere.

Get started today with Azure Sphere


Dig deeper into the data

Check out our interactive graphs to explore how responses varied by age and country.
Source: Azure