Advancing the developer experience for serverless apps with Azure Functions

Azure Functions is constantly innovating so that you can achieve more with serverless applications, enabling developers to overcome common serverless challenges through a productive, event-driven programming model. Several releases from the last few weeks are good examples of this, including:

The Azure Functions premium plan enables a whole new range of low-latency and networking scenarios.
The preview of PowerShell support in Azure Functions provides a way to tackle cloud automation scenarios, a common challenge for IT pros and SREs around the globe.

The new releases and improvements do not stop there, and today we are pleased to present several advancements intended to provide a better end-to-end experience when building serverless applications. Keep reading below to learn more about the following:

A new way to host Azure Functions in Kubernetes environments
Stateful entities with Durable Functions (in preview)
Less cluttered .NET applications with dependency injection
Streamlined deployment with Azure DevOps
Improved integration with Azure API Management (in preview)

Bring Azure Functions to Kubernetes with KEDA

There’s no better way to leverage the serverless advantages than using a fully managed service in the cloud like Azure Functions. But some applications might need to run in disconnected environments, or they may require custom hardware and dependencies. Customers usually take a containerized approach for these scenarios, in which Kubernetes is the de facto industry standard. Managing application-aware, event-driven scale in these environments is non-trivial, and the default autoscaling is often insufficient because it is based only on resource usage, such as CPU or memory.

Microsoft and Red Hat partnered to build Kubernetes-based event-driven autoscaling (KEDA). KEDA is an open source component for Kubernetes that provides event-driven scale for any container workload, enabling containers to scale from zero to thousands of instances based on event metrics, such as the length of an Azure Storage queue or a Kafka stream, and back to zero again when processing is done.

Since Azure Functions can be containerized, you can now deploy a Function App to any Kubernetes cluster, keeping the same scaling behavior you would have on the Azure Functions service.
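As a hedged sketch of how this scaling is configured, a KEDA ScaledObject declares which deployment to scale and which event source to watch. The names below (orders-function, the orders queue) are placeholders, and the exact schema may differ in the current KEDA preview:

```yaml
apiVersion: keda.k8s.io/v1alpha1
kind: ScaledObject
metadata:
  name: orders-queue-scaler
  labels:
    deploymentName: orders-function
spec:
  scaleTargetRef:
    deploymentName: orders-function   # the containerized function app deployment
  pollingInterval: 15                 # check the event source every 15 seconds
  minReplicaCount: 0                  # scale to zero when the queue is empty
  maxReplicaCount: 50
  triggers:
  - type: azure-queue
    metadata:
      queueName: orders
      queueLength: "10"               # target messages per replica
```

With a definition like this applied to the cluster, KEDA activates the deployment when messages arrive and removes all replicas once the queue drains.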

This is a significant milestone for the open source ecosystem around Kubernetes, so we’re sharing much more detail in a separate blog post titled “Announcing KEDA: bringing event-driven containers and functions to Kubernetes.” To learn more, register today for the Azure webinar series scheduled for later in May, where we will go deeper on this topic.

Durable Functions stateful patterns

We have been thrilled with the excitement and energy from the community around Durable Functions, our extension to the Functions runtime that unlocks new stateful and workflow patterns for serverless applications. Today we are releasing some new capabilities in a preview package of Durable Functions.

For stateful functions that map to an entity like an IoT device or a gaming session, you can use the new stateful entity trigger for actor-like capabilities in Azure Functions. We are also making the state management of your stateful functions more flexible with preview support for Redis cache as the state provider for Durable Functions, enabling scenarios where applications may run in a disconnected or edge environment.
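As a minimal sketch of what an entity might look like with the new trigger in the 2.0 preview, the C# function below models a simple counter; the entity name and its operations ("add", "reset") are illustrative:

```csharp
// Sketch of a durable entity, assuming the Durable Functions 2.0 preview package
// (Microsoft.Azure.WebJobs.Extensions.DurableTask). Names are illustrative.
[FunctionName("Counter")]
public static void Counter([EntityTrigger] IDurableEntityContext ctx)
{
    switch (ctx.OperationName.ToLowerInvariant())
    {
        case "add":
            // Read the current state (defaults to 0) and add the operation input.
            ctx.SetState(ctx.GetState<int>() + ctx.GetInput<int>());
            break;
        case "reset":
            ctx.SetState(0);
            break;
    }
}
```

Other functions can then signal a specific instance of this entity, identified by a key such as a device ID or session ID, through the durable client binding, with the runtime persisting each entity's state between invocations.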

You can learn more about the new durable features in our documentation, “Durable Functions 2.0 preview (Azure Functions).”

Dependency injection for .NET applications

We are constantly striving to add new patterns and capabilities that make functions easier to code, test, and manage. .NET developers have been taking advantage of dependency injection (DI) to better architect their applications, and today we’re excited to support DI in Azure Functions written in .NET. This enables simplified management of connections plus dependent services, and unlocks easier testability for functions that you author.
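As a rough sketch of the pattern, a startup class registers services that your function classes can then receive through constructor injection. This assumes the Functions extensions package, and IGreeter/Greeter are hypothetical services:

```csharp
// Sketch of DI registration in a .NET function app, assuming the
// Microsoft.Azure.Functions.Extensions package. IGreeter is hypothetical.
[assembly: FunctionsStartup(typeof(MyFunctionApp.Startup))]

namespace MyFunctionApp
{
    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            // Register a shared HttpClient factory and an application service.
            builder.Services.AddHttpClient();
            builder.Services.AddSingleton<IGreeter, Greeter>();
        }
    }
}
```

Functions are then written as instance methods on classes that take IGreeter as a constructor parameter, which keeps connection management in one place and makes the dependency easy to mock in tests.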

Learn more about dependency injection in our documentation, “Use dependency injection in .NET Azure Functions.”

Streamlined Azure DevOps experience

With new build templates in Azure Pipelines, you can quickly configure your pipeline with function-optimized tasks to build your .NET, Node.js, and Python applications. We are also announcing the general availability of the Azure Functions deployment task, which is optimized to work with the best deployment option for your function app.
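As an illustrative sketch of such a pipeline (the service connection and app names are placeholders), a minimal azure-pipelines.yml for a .NET function app might build, package, and then deploy with the Azure Functions deployment task:

```yaml
trigger:
- master

pool:
  vmImage: 'ubuntu-16.04'

steps:
- script: dotnet build --configuration Release --output $(Build.ArtifactStagingDirectory)
  displayName: 'Build the function app'

- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(Build.ArtifactStagingDirectory)'
    includeRootFolder: false
    archiveFile: '$(Build.ArtifactStagingDirectory)/functionapp.zip'

- task: AzureFunctionApp@1                       # the Azure Functions deployment task
  inputs:
    azureSubscription: 'my-azure-connection'     # placeholder service connection
    appType: 'functionApp'
    appName: 'my-function-app'                   # placeholder function app name
    package: '$(Build.ArtifactStagingDirectory)/functionapp.zip'
```

Because the definition is YAML checked in alongside the code, changes to the build and deployment steps are versioned and reviewed like any other change.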

Additionally, with the latest Azure CLI release we introduced a new command that can automatically create and configure an Azure DevOps pipeline for your function app. The pipeline definition now lives with your code, which allows you to fine-tune build and deployment tasks.

For more detailed information please check our documentation, “Continuous delivery using Azure DevOps.”

Defining and managing your Functions APIs with serverless API Management

We have also simplified how you can expose and manage APIs built with Azure Functions through API Management. With this improved integration, the Function Apps blade in the Azure portal presents an option to expose your HTTP-triggered functions through a new or an existing API in API Management.

Once the Function App is linked with API Management, you can manage API operations, apply policies, edit and download OpenAPI specification files, or navigate to your API Management instance for a full-featured experience.

Learn more about how to expose your Function Apps with API Management in our documentation.

Sharing is caring

We have also included a set of improvements to the Azure Serverless Community Library, including an updated look, a streamlined sample submission process, and more detailed information about each sample. Check out the Serverless Community Library to gain inspiration for your next serverless project, and share something cool once you’ve built it.

Get started today

With Functions options expanding and quickly improving, we’d sincerely love to hear your feedback. You can reach the team on Twitter and on GitHub, and we also actively monitor StackOverflow and UserVoice. For the latest updates, please subscribe to our monthly live webcast.

Tell us what you love about Azure Functions, and start learning more about all the new capabilities we are presenting today:

Learn more about using KEDA to host your function apps in Kubernetes in its blog post, “Announcing KEDA: bringing event-driven containers and functions to Kubernetes” and register for the Azure webinar series to see it in action.
Take a look at how you can benefit from dependency injection in .NET, the session-enabled Service Bus trigger, and extension bundles for your function apps in our docs.
Understand how you can have stateful functions mapping to entities with the preview capabilities added in Durable Functions.
Simplify the deployment of your serverless applications with the streamlined Azure DevOps experience through new tasks and CLI commands.
Learn how to expose and manage serverless APIs by integrating Azure Functions with Azure API Management.
Sign up for an Azure free account if you don’t have one yet, and start building awesome serverless applications in the cloud today.

Source: Azure

What’s new in Azure Monitor

At Ignite 2018, we shared the vision to bring monitoring of infrastructure, applications, and the network into one unified offering, providing full-stack monitoring for your applications. Over the last few months, individual capabilities such as Application Insights and Azure Monitor logs have come together to provide a seamless and integrated Azure Monitor experience.

We’d like to share our three favorites:

End-to-end monitoring for Azure Kubernetes Service
Integrated access control for logs
Intelligent scalable alerts

End-to-end monitoring for Azure Kubernetes Service

Today, Azure Kubernetes Service (AKS) customers rely on Azure Monitor for containers to get out-of-the-box monitoring for their AKS clusters. Kubernetes event logs are now available in real time, in addition to live container logs. You can now filter the charts and metrics for specific AKS node pools and see node storage capacity metrics when you drill down into node details.

To monitor your applications running on AKS, you can instrument them with the Application Insights SDKs. But if you cannot instrument your workloads (for example, you may be running a legacy app or a third-party app), we now have an alternative that doesn’t require any instrumentation! Application Insights can leverage your existing service mesh investments (the preview currently supports Istio) to provide application monitoring for AKS without any modification to your app’s code. This enables you to immediately start taking advantage of out-of-the-box capabilities like Application Map, Live Metrics Stream, Application Dashboards, Workbooks, User Behavior Analytics, and more.

Through a combined view of application and infrastructure, Azure Monitor now provides a full-stack monitoring view of Kubernetes clusters.

Integrated access control for logs

Azure Monitor is the central platform for collecting logs across monitoring, management, security, and other log types in Azure. Customers love the powerful, embedded Azure Monitor Logs experience that allows you to run diagnostics, root-cause analysis, statistics, visualizations, and answer any other ad-hoc questions. One challenge customers faced was configuring access control based on a resource. For example, how do you ensure that anyone who has access to a virtual machine (VM) also has access to the logs generated by that VM? In line with our vision to provide you a seamless and native monitoring experience, we now provide granular role-based access control for logs that helps you cascade the permissions you have set at the resource level down to the operational logs.

Users can now also access logs scoped to their resource, allowing them to explore and query logs without needing to understand the entire workspace structure.

Intelligent and scalable alerts

Metric Alerts with Dynamic Thresholds, now generally available, enable Azure Monitor to determine the right thresholds for alert rules. Multi-resource alerts make it easy to create a single alert rule and apply it across multiple VMs.

The new Action Rules, available in preview, add more flexibility and finer controls for Action Groups. With Action Rules, scaling Action Groups to suppress alerts during a maintenance window takes only a couple of clicks.

We shared three examples of how we are making Azure Monitor integrated, intelligent, and scalable, but that’s only a part of the story. Here is a list of other exciting announcements coming to you from Build.

Preview of Azure Monitor application change analysis, providing a centralized view and analysis of changes at different layers of a web app. The first iteration of this feature is now available in the App Services Diagnose and Solve Problems experience.
Improved visualizations in Application Map with better filtering to quickly scope to specific components, and ability to group/expand common dependencies, including Azure Functions v2.
Improved codeless instrumentation experience for ASP.NET apps on IIS with the preview of Status Monitor v2. This enables clean redeployments, the latest SDKs, support for TLS 1.2, offline install support, and more!
The Application Insights SDK for Java workloads now fully supports W3C trace context and provides monitoring support for async Java apps with a manual instrumentation API. We have also improved ILogger log collection support for .NET Core apps and added support for Live Metrics Stream for Node.js apps.
Workbooks are now a first-class citizen in Azure Monitor and available in the main menu. Use the sample templates to customize interactive reports or troubleshooting guides with rich text, analytics queries, metrics, and various parameters across your apps and infrastructure resources! New templates are also available for Azure Monitor for VMs to monitor open ports and their connections.

Get monitoring

Azure Monitor is constantly evolving to discover new insights and reduce potential issues with applications. Find the latest updates for Azure Monitor in the Azure portal. We want to hear from you! Ask questions or provide feedback.

Ready to get started?

Monitor metrics quickstart
Subscription Audit and Alerts quickstart
Learn how to view or analyze data collected.
Learn how to find and diagnose run-time exceptions.

Source: Azure

Azure IoT at Build: making IoT solutions easier to develop, more powerful to use

IoT is transforming every business on the planet, and that transformation is accelerating. Companies are harnessing billions of IoT devices to help them find valuable insights into critical parts of their business that were previously not connected—how customers are using their products, when to service assets before they break down, how to reduce energy consumption, how to optimize operations, and thousands of other use cases limited only by companies’ imagination.

Microsoft is leading in IoT because we’re passionate about simplifying IoT so any company can benefit from it quickly and securely.

Last year we announced a $5 billion commitment, and this year we highlighted the momentum we are seeing in the industry. This week, at our premier developer conference, Microsoft Build in Seattle, we’re thrilled to share our latest innovations that further simplify IoT and dramatically accelerate time to value for customers and partners.

Accelerating IoT

Developing a cloud-based IoT solution with Azure IoT has never been faster or more secure, yet we’re always looking for ways to make it easier. From working with customers and partners, we’ve seen an opportunity to accelerate on the device side.

Part of the challenge we see is the tight coupling between the software written on devices and the software that has to match it in the cloud. To illustrate this, it’s worth looking at a similar problem from the past and how it was solved.

Early versions of Windows faced a challenge in supporting a broad set of connected devices like keyboards and mice. Each device came with its own software, which had to be installed on Windows for the device to function. The software on the device and the software that had to be installed on Windows had a tight coupling, and this tight coupling made the development process slow and fragile for device makers.

Windows solved this with Plug and Play, which at its core was a capability model that devices could declare and present to Windows when they were connected. This capability model made it possible for thousands of different devices to connect to Windows and be used without any software having to be installed on Windows.

IoT Plug and Play

Late last week, we announced IoT Plug and Play, which is based on an open modeling language that allows IoT devices to declare their capabilities. That declaration, called a device capability model, is presented when IoT devices connect to cloud solutions like Azure IoT Central and partner solutions, which can then automatically understand the device and start interacting with it—all without writing any code.

IoT Plug and Play also enables our hardware partners to build IoT Plug and Play compatible devices, which can then be certified with our Azure Certified for IoT program and used by customers and partners right away. This approach works with devices running any operating system, be it Linux, Android, Azure Sphere OS, Windows IoT, RTOSs, and more. And all of our IoT Plug and Play support is open source as always.

Finally, Visual Studio Code will support modeling an IoT Plug and Play device capability model as well as generating IoT device software based on that model, which dramatically accelerates IoT device software development.

We’ll be demonstrating IoT Plug and Play at Build, and it will be available in preview this summer. To design IoT Plug and Play, we’ve worked with a large set of launch partners to ensure their hardware is certified ready:

Certified-ready devices are now published in the Azure IoT Device Catalog for the Preview, and while Azure IoT Central and Azure IoT Hub will be the first services integrated with IoT Plug and Play, we will add support for Azure Digital Twins and other solutions in the months to come. Watch this video to learn more about IoT Plug and Play and read this blog post for more details on IoT Plug and Play support in Azure IoT Central.

Announcing IoT Plug and Play connectivity partners

With increased options for low-power networking, the role of cellular technologies in IoT projects is on the rise. Today we’re introducing IoT Plug and Play connectivity partners. Deep integration between these partners’ technologies and Azure IoT simplifies customer deployments and adds new capabilities.

This week at Build, we are highlighting the first of these integrations, which leverages Trust Onboard from Twilio. The integration uses security features built into the SIM to automatically authenticate and connect to Azure, providing a secure means of uniquely identifying IoT devices that work with current manufacturing processes.

These are some of the many connectivity partners we are working with:

Making Azure IoT Central more powerful for developers

Last year we announced the general availability of Azure IoT Central, which enables customers and partners to provision an IoT application in 15 seconds, customize it in hours, and go to production the same day—all without writing code in the cloud.

While many customers build their IoT solutions directly on our Azure IoT platform services, we’re seeing an upswell in customers and partners that like the rapid application development Azure IoT Central provides. And, of course, Azure IoT Central is built on the same great Azure IoT platform services.

Today at Build, we’re announcing a set of new features that speak to how we’re enabling and simplifying Azure IoT Central for developers. We’ll show some of these innovations, such as new personalization features that make it easy for customers and partners to modify Azure IoT Central’s UI to conform with their own look and feel. In the Build keynote, we’ll show how Starbucks is using this personalization feature for their Azure IoT Central solution connected to Azure Sphere devices in their stores.

We’ll also demonstrate Azure IoT Central working with IoT Plug and Play to show how fast and easy this makes it to build an end-to-end IoT solution, with Microsoft still wearing the pager and keeping everything up and running so customers and partners can focus on the benefits IoT provides. Watch this video to learn more about Azure IoT Central announcements.

The growing Azure Sphere hardware ecosystem

Azure Sphere is Microsoft’s comprehensive solution for easily creating secured MCU-powered IoT devices. Azure Sphere is an integrated system that includes MCUs with built-in Microsoft security technology, an OS based on a custom Linux kernel, and a cloud-based security service. Azure Sphere delivers secured communications between device and cloud, device authentication and attestation, and ongoing OS and security updates. Azure Sphere provides robust defense-in-depth device security to limit the reach and impact of remote attacks and to renew device health through security updates.

At Build this week, we’ll showcase a new set of solutions such as hardware modules that speed up time to market for device makers, development kits that help organizations prototype quickly, and our new guardian modules.

Guardian modules are a new class of device built on Azure Sphere that protect brownfield equipment, mitigating risks and unlocking the benefits of IoT. They attach physically to brownfield equipment with no equipment redesign required, processing data and controlling devices without ever exposing vital operational equipment to the network. Through guardian modules, Azure Sphere secures brownfield devices, protects operational equipment from disabling attacks, simplifies device retrofit projects, and boosts equipment efficiency through over-the-air updates and IoT connectivity.

The seven modules and devkits on display at Build are:

Avnet Guardian Module. Unlocks brownfield IoT by bringing Azure Sphere’s security to equipment previously deemed too critical to be connected. Available soon.
Avnet MT3620 Starter Kit. Azure Sphere prototyping and development platform. Connectors allow easy expandability options with a range of MikroE Click and Grove modules. Available May 2019.
Avnet Wi-Fi Module. Azure Sphere-based module designed for easy final product assembly. Simplifies quality assurance with stamp hole (castellated) pin design. Available June 2019.
AI-Link WF-M620-RSC1 Wi-Fi Module. Designed for cost-sensitive applications. Simplifies quality assurance with stamp hole (castellated) pin design. Available now.
SEEED MT3620 Development Board. Designed for comprehensive prototyping. Available expansion shields enable Ethernet connectivity and support for Grove modules. Available now.
SEEED MT3620 Mini Development Board. Designed for size-constrained prototypes. Built on the AI-Link module for a quick path from prototype to commercialization. Available May 2019.
USI Dual Band Wi-Fi + Bluetooth Combo Module. Supports BLE and Bluetooth 5 Mesh. Can also work as an NFC tag (for non-contact Bluetooth pairing and device provisioning). Available soon.

For those who want to learn more about the modules, you can find specs for each and links to more information on our Azure Sphere hardware ecosystem page.

See Azure Sphere in action at Build

Azure Sphere is also taking center stage at Build during Satya Nadella’s keynote this week. Microsoft customer and fellow Seattle-area company Starbucks will showcase how it is testing Azure IoT capabilities and guardian modules built on Azure Sphere within select equipment to enable partners and employees to better engage with customers, manage energy consumption and waste reduction, ensure beverage consistency, and facilitate predictive maintenance. The company’s solution will also be on display in the Starbucks Technology booth.

Announcing new Azure IoT Edge innovations

Today, we are announcing the public preview of Azure IoT Edge support for Kubernetes. This enables customers and partners to deploy an Azure IoT Edge workload to a Kubernetes cluster on premises. We’re seeing Azure IoT Edge workloads being used in business-critical systems at the edge. With this new integration, customers can use the feature-rich and resilient infrastructure layer that Kubernetes provides to run their Azure IoT Edge workloads, which are managed centrally and securely from Azure IoT Hub. Watch this video to learn more.

Additional IoT Edge announcements include:

Preview of Azure IoT Edge support for Linux ARM64 (expected to be available in June 2019).
General availability of IoT Edge extended offline support.
General availability of IoT Edge support for Windows 10 IoT Enterprise x64.
New provisioning capabilities using X.509 certificates and SAS tokens.
New built-in troubleshooting tooling.

A common use case for IoT Edge is transforming cameras into smart sensors to understand the physical world and enable a digital feedback loop: finding a missing product on a shelf, detecting damaged goods, etc. These examples require demanding computer vision algorithms to deliver consistent and reliable results, large-scale streaming capabilities, and specialized hardware for faster processing to provide real-time insights to businesses. At Build, we’re partnering with Lenovo and NVIDIA to simplify the development and deployment of these applications at scale. With the NVIDIA DeepStream SDK for general-purpose streaming analytics, a single IoT Edge server running Lenovo hardware can process up to 70 channels of 1080p/30fps H.265 video streams to offer a cost-effective and faster time-to-market solution.

This summer, NVIDIA DeepStream SDK will be available from the IoT Edge marketplace. In addition, Lenovo’s new ThinkServer SE350 and GPU-powered “tiny” edge gateways will be certified for IoT Edge.

Announcing Mobility Services through Azure Maps

Today, an increasing number of apps built on Azure are designed to take advantage of location information in some way.

Last November, we announced a new platform partnership for Azure Maps with the world’s number-one transit service provider, Moovit. What we’re achieving through this partnership is similar to what we’ve built today with TomTom. At Build this week, we’re announcing Azure Maps Mobility Services, which will be a set of APIs that leverage Moovit’s APIs for building modern mobility solutions.

Through these new services, we’re able to integrate public transit, bike shares, scooter shares, and more to deliver transit route recommendations that let customers plan routes using alternative modes of transportation, optimizing for travel time and minimizing traffic congestion. Customers will also be able to access real-time intelligence on bike and scooter docking stations and car-share vehicle availability, including present and expected availability and real-time transit stop arrivals.

Customers can use Azure Maps for IoT applications—or any application that uses geospatial or location data, such as apps for field service, logistics, manufacturing, and smart cities. Retail apps may integrate mobility intelligence to help customers access their stores or plan future store locations that optimize for transit accessibility. Field services apps may guide employees from one customer to another based on real-time service demand. City planners may use mobility intelligence to analyze the movement of occupants to plan their own mobility services, visualize new developments, and prioritize locations in the interests of occupants.

You can stay up to date about how Azure Maps is paving the way for the next generation of location services on the Azure Maps blog, and if you’re at Build this week, be sure to visit the Azure Maps booth to see our mobility and spatial operations services in action.

Simplifying development of robotic systems with Windows 10 IoT

Microsoft and Open Robotics have worked together to make the Robot Operating System (ROS) generally available for Windows 10 IoT. Additionally, we’re making it even easier to build ROS solutions in Visual Studio Code by adding upcoming support for Windows, debugging, and visualization to a community-supported Visual Studio Code extension. Read more about the integration between Windows 10 IoT and ROS.

Come see us at Build

If you’re in Seattle this week, you can see some of these new technologies in our booth, and even play around with them at our IoT Hands-on Lab. I’ll also be hosting a session on our IoT Vision and Roadmap. Stop by to hear more details about these announcements and see some of these exciting new technologies in action.
Source: Azure

Introducing new product innovations for SAP HANA, Expanded AI collaboration with SAP and more

For many enterprises, modernizing ERP systems is key to achieving their digital transformation goals. At Microsoft we are committed to supporting our customers by offering the single best infrastructure choice that exists for SAP HANA, bar none.

In terms of raw capabilities, we not only have the largest number of SAP HANA-certified offerings (25 configurations that span virtual machines and purpose-built bare metal instances from 192 GB to 24 TB), but also the widest footprint of regions with SAP HANA-certified infrastructure (26, with plans to launch 8 more by the end of 2019). We also support some of the largest deployments of SAP HANA in the public cloud, such as CONA Services.

In partnership with SAP, we are very happy to announce multiple enhancements to SAP on Azure at SAPPHIRE NOW. We will offer our customers even more infrastructure choices, giving them greater VM memory, more options around bare metal instances, and improved business continuity.

In addition to this we are announcing deeper integration between SAP and Azure around AI, data protection and identity integration. These integrations will help our joint customers accelerate their digital transformation with the power of the cloud.

Here’s what’s new:

6 TB and 12 TB VMs for SAP HANA: Azure’s Mv2 VM series will be available on May 13, offering virtual machines with up to 6TB RAM on a single VM. This is by far the largest-memory SAP HANA-certified configuration offered on any virtual machine in the public cloud. 6TB Mv2 VMs will be generally available and production certified in U.S. East and U.S. East 2 regions. U.S. West 2, Europe West, Europe North and Southeast Asia regions will be available in the coming months.

In addition, 12TB Mv2 VMs will become available and production certified for SAP HANA in Q3 2019. With this, customers with large-scale SAP HANA deployments can take advantage of the agility offered by Azure Virtual Machines to speed SAP release cycles by spinning up dev/test systems in minutes and simplify operational processes with Azure’s integrated tools for automated patching, monitoring, backup and disaster recovery.
Largest Bare Metal Instance with Intel Optane for SAP HANA: In Q4 2019 we plan to launch the largest Intel Optane optimized bare metal instances in the cloud with our SAP HANA on Azure Large Instances, including general availability of a 4 socket, 9TB memory instance and a preview of an 8 socket, 18TB memory instance. These instances enable customers to benefit from faster load times for SAP HANA data in case of a restart, offering lower Recovery Time Objective (RTO) and a reduced TCO. To learn more, please get in touch with your Microsoft representative.
Integration of Azure AI in SAP’s digital platform: SAP’s machine learning capabilities will leverage Azure Cognitive Services containers in preview for face recognition and text recognition. By deploying Cognitive Services in containers, SAP will be able to analyze information closer to the physical world where the data resides and deliver real-time insights and immersive experiences that are highly responsive and contextually aware.

“SAP’s Machine Learning team is working with Microsoft Azure Cognitive services team to augment its own portfolio of home grown and partner services by leveraging the containerized Vision and Text Recognition services for solving identity validation and text understanding use cases.” – Dr. Sebastian Wieczorek, VP, Head of SAP Leonardo Machine Learning Foundation
SAP Data Custodian on Microsoft Azure is now available: In September 2018, we announced our intent to make SAP Data Custodian, a SaaS offering, available on Microsoft Azure. We deliver on that promise today. Together, SAP and Microsoft offer unprecedented levels of data governance and compliance for our joint customers. Additionally, Microsoft will be a beta customer for SAP Data Custodian for our implementation of SAP SuccessFactors on Azure. For more information, you can read this blog from SAP.
Managed business continuity with Azure Backup for SAP HANA: Azure Backup support for SAP HANA databases is now in public preview. With this, customers can manage large-scale SAP HANA implementations with no infrastructure for backup. For more information, please refer to the Azure Backup for SAP HANA documentation.
Simplified integrations with Logic Apps connector for SAP: Today, the Logic Apps connector for SAP ECC and SAP S/4HANA is generally available for all customers. Azure Logic Apps is an integration platform-as-a-service offering connectors to 250+ applications and SaaS services. With this, customers can dramatically reduce time to market for integrations between SAP and best-in-class SaaS applications. For more information, check out our Logic Apps SAP connector documentation.
Boosted productivity and enhanced security with Azure Active Directory and SAP Cloud Platform: Today, standards-based integration between Azure Active Directory and SAP Cloud Platform is in preview, enabling enhanced security and a streamlined sign-in experience. For example, when using SAP Cloud Platform Identity Provisioning and Identity Authentication Services, customers can integrate SAP SuccessFactors with Azure Active Directory and ensure seamless access to SAP applications such as SAP S/4HANA, improving end-user productivity while meeting enterprise security needs.

Customers benefiting from SAP on Azure

With more than 90% of the Fortune 500 using Microsoft Azure and SAP, our 25-year partnership with SAP has always been about mutual customer success. We are confident the announcements made today will help customers using SAP on Azure grow and innovate even more than they already are: Forrester’s Total Economic Impact Study found that SAP customers on Azure, on average, can realize an ROI of 102% with a payback in under nine months from their cloud investments.

Here are five reasons SAP customers increasingly choose Azure for their digital transformation, and some customers who are benefitting:

Business agility: With Azure’s on-demand, SAP-certified infrastructure, customers can speed up dev/test processes, shorten SAP release cycles, and scale instantaneously to meet peak business usage. By moving to SAP S/4HANA on Azure’s M-series virtual machines, Daimler AG sped up procurement processes, delivering months faster than would have been possible in its on-premises environment, and now powers 400,000 suppliers worldwide.
Efficient insights: Dairy Farmers of America migrated its fragmented IT application landscape, spread across 18 data centers and including mission-critical SAP systems, over to Azure. It leverages Azure Data Services and Power BI to enable remote users to access SAP data easily and securely.
Real-time operations with IoT: Coats, a world leader in industrial threads, migrated away from SAP on Oracle to SAP HANA on Azure several years ago, enabling Coats to optimize operations with newer IoT-driven processes. With IoT monitoring, Coats now predicts inventory, manufacturing and sales trends more accurately than ever before.
Transforming with AI: Carlsberg, a world leader in beer brewing, migrated 80% of its enterprise applications to Microsoft Azure, including mission critical SAP apps. By leveraging Azure AI and sensors from research universities in Denmark, Carlsberg’s Beer Fingerprinting Project enabled them to map a flavor fingerprint for each sample and reduce the time it takes to research taste combinations and processes by up to a third, helping the company get more distinct beers to market faster.
Mission-critical infrastructure: CONA Services, the services arm for Coca-Cola bottlers, chose Azure to run its 24 TB mission-critical SAP BW on HANA system on Azure’s purpose-built SAP HANA infrastructure, powering 160,000 orders a day and representing $21 billion in annual net sales value.

Over the past few years, we have seen customers across all industries and geographies running their mission critical SAP workloads on Azure. Whether it’s customers in retail such as Co-op and Coca-Cola, Accenture and Malaysia Airlines in services, Astellas Pharma and Zuellig Pharma in pharmaceuticals, or Rio Tinto and Devon Energy in oil and gas, SAP on Azure helps businesses around the world with their digital transformation.

If you are at SAPPHIRE NOW, drop by the Microsoft booth #729 to learn about these product enhancements and to experience hands-on demos of these scenarios.
Source: Azure

Announcing new Marketplace revenue opportunities

Microsoft’s success has always been based on working closely with our vibrant partner community to meet our collective customers’ needs. With the rapid growth of the cloud, we have invested to ensure our partners realize this growth as well.

Today, we're excited to announce expanded opportunities for Microsoft partners through new marketplace investments and expanding the very successful IP Co-sell Program.

New Marketplace capabilities

Customers are pursuing cloud technology to find the right solution for their needs quickly and efficiently. To make this easier, we are expanding our marketplace investments in both AppSource and Azure Marketplace so customers can find a broader set of solutions from Microsoft and our partner ecosystem.

Our cloud marketplaces will now support transaction capabilities for SaaS solutions from partners, making it a one-stop experience for customers looking for a cloud-based solution. AppSource and Azure Marketplace will offer both per-seat and per-site SaaS transaction capabilities for partners who build on, extend, or connect to Microsoft Azure, Microsoft 365, Microsoft Dynamics 365, and the Microsoft Power Platform. ISVs who offer transactable SaaS solutions in our marketplace and participate in the co-sell program will earn a reduced transaction fee of 10 percent. We will continue to invest in new marketplace capabilities to help customers find the solutions they need and enable greater partner growth.

Expanding our IP co-sell program

Two years ago, we introduced our IP Co-sell Program that brought the Microsoft salesforce, the largest enterprise salesforce in the world, to sell our partners’ solutions built on Microsoft Azure. This collaboration between ISV partners and Azure sellers generated over $5B in partner revenue in the past year from nearly 3,000 ISVs, engaging with more than 30,000 enterprise customer opportunities. Based on this incredible success, we are expanding the IP Co-sell Program beyond Azure to now also include partners building on Microsoft 365, Microsoft Dynamics 365, and the Microsoft Power Platform. 

A new channel for ISVs

We are also expanding the IP Co-sell Program to Microsoft’s global resellers, creating a new channel for ISVs to grow their business. Starting on July 1, 2019, all Microsoft ISVs with published offers in AppSource or Azure Marketplace will have access to a new distribution method through Microsoft’s worldwide reseller channel. This opens new growth opportunities for both ISVs and reseller partners while bringing a broader range of solutions to customers.

 

Learn more about how to publish your solutions on AppSource and Azure Marketplace and how to take advantage of the new go-to-market services and onboarding resources.
Source: Azure

Analytics in Azure remains unmatched with new innovations

Digital disruption has created unlimited potential for companies to embrace data as a competitive advantage for their business. As a result, analytics continues to be a key priority for enterprises. When it comes to analytics, customers tell us that they need a solution that provides them with the best price, performance, security, and privacy, as well as a system that can easily deliver powerful insights across the organization. Azure has them covered.
Source: Azure

Partnering with the community to make Kubernetes easier

Offering serverless Kubernetes has been a key part of our vision to make Kubernetes simpler for everyone – by providing an end-to-end experience optimized for developer productivity on top of an enterprise-grade platform with hardened security and layers of isolation. We are working closely with the community on open source projects that make Kubernetes easier for everyone, wherever they run it.
Source: Azure

New Azure Machine Learning updates simplify and accelerate the ML lifecycle

With the exponential rise of data, we are undergoing a technology transformation as organizations realize the need for insight-driven decisions. Artificial intelligence (AI) and machine learning (ML) technologies can help harness this data to drive real business outcomes across industries. Azure AI and Azure Machine Learning service are leading customers to the world of ubiquitous insights and enabling intelligent applications such as product recommendations in retail, load forecasting in energy production, image processing in healthcare, predictive maintenance in manufacturing, and many more.

Microsoft Build 2019 represents a major milestone in the growth and expansion of Azure Machine Learning with new announcements powering the entire machine learning lifecycle.

Boost productivity for developers and data scientists across skill levels with integrated zero-code and code-first authoring experiences as well as automated machine learning advancements for building high-quality models easily.
Enterprise-grade capabilities to deploy, manage, and monitor models with MLOps (DevOps for machine learning), hardware-accelerated models for unparalleled scale and cost performance, and model interpretability for transparency in model predictions.
Open-source capabilities that provide choice and flexibility to customers with MLflow implementation, ONNX runtime support for TensorRT and Intel nGraph, and the new Azure Open Datasets service that delivers curated open data to improve model accuracy.

With these announcements and other improvements being added weekly, Azure Machine Learning continues to help customers easily apply machine learning to grow, compete and meet their objectives.

“By seamlessly integrating Walgreens stores and our other points of care with Microsoft’s Azure AI platform and Azure Machine Learning, the partnership will offer personalized lifestyle, wellness and disease management solutions, available via customers’ delivery method of choice.” 

— Vish Sankaran, Chief Innovation Officer, Walgreens Boots Alliance, Inc.

Boost productivity with simplified machine learning

“Using Azure Machine Learning service, we get peace of mind with automated machine learning, knowing that we are exhausting all the possible scenarios and using the best model for our inputs.”

— Diana Kennedy, Vice President, Strategy, Architecture, and Planning, BP

Automated machine learning advancements

Doubling down on our mission to simplify AI, the new automated machine learning user interface (Preview) enables business domain experts to train machine learning models on their data without writing a single line of code, in just a few clicks. Learn how to run an automated ML experiment in the portal.

Automated machine learning UI

Feature engineering updates include new featurizers that provide tailor-made inputs for any given dataset, delivering optimal models. Improvements in sweeping different combinations of algorithms and hyperparameters, plus the addition of popular learners such as the XGBoost algorithm, enable greater model accuracy. Compute optimization automatically guides which algorithms to evaluate and where to focus, while early termination ensures training runs deliver models efficiently. Automated machine learning also provides complete transparency into algorithms, so developers and data scientists can manually override and control the process. All these advancements help ensure the best model is delivered.
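The sweeping and early-termination behavior described above can be sketched in plain Python. This is an illustrative toy, not the service's actual implementation; the configurations and the scoring function are hypothetical stand-ins for a real train-and-validate step:

```python
def sweep(configs, evaluate, patience=3):
    """Try candidate configurations in order, stopping early once
    `patience` consecutive candidates fail to improve the best score."""
    best_cfg, best_score = None, float("-inf")
    stale = 0
    for cfg in configs:
        score = evaluate(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
            stale = 0
        else:
            stale += 1
            if stale >= patience:  # early termination: abandon an unpromising sweep
                break
    return best_cfg, best_score

# Toy search space: algorithm choice crossed with one hyperparameter.
configs = [{"algo": a, "depth": d} for a in ("tree", "boosted") for d in (2, 4, 8)]
# Hypothetical scoring function standing in for real model training.
best, score = sweep(configs, lambda c: c["depth"] * (2 if c["algo"] == "boosted" else 1))
```

In the real service, the evaluation step is a full training run and the early-termination policy is tuned to cut off low-performing runs while preserving the chance of finding the best model.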

Building forecasts is an integral part of any business, whether it’s revenue, inventory, sales, or customer demand. Forecasting with automated machine learning includes new capabilities that improve the accuracy and performance of recommended models with time series data, including a new forecast prediction function, rolling cross-validation splits for time series data, configurable lags, window aggregation, and a holiday featurizer.

These capabilities deliver highly accurate forecasting models and extend automated machine learning support across many scenarios.
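The rolling cross-validation splits and lag features mentioned above can be illustrated with a small plain-Python sketch (a simplified stand-in for what the service configures for you automatically):

```python
def rolling_splits(n, train_size, horizon):
    """Yield (train_indices, test_indices) pairs with an expanding training
    window, so each validation fold only ever sees the past."""
    start = train_size
    while start + horizon <= n:
        yield list(range(start)), list(range(start, start + horizon))
        start += horizon

def add_lags(series, lags):
    """Build lagged feature rows: each row holds the previous `lags` values."""
    return [series[i - lags:i] for i in range(lags, len(series))]

series = [10, 12, 13, 15, 18, 21, 25, 30]
splits = list(rolling_splits(len(series), train_size=4, horizon=2))
features = add_lags(series, lags=2)
```

The key property is that no training window ever includes points later than its validation window, which is what makes cross-validation sound for time series.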

Azure Machine Learning visual interface (Preview)

The visual interface is a powerful drag-and-drop workflow capability that simplifies the process of building, training, and deploying machine learning models. Customers who are new to machine learning or prefer a zero-code experience can take advantage of capabilities similar to those found in Azure Machine Learning Studio, inside Azure Machine Learning service. Data preparation, feature engineering, training algorithms, and model evaluation are presented in an intuitive web user experience, backed by the scale, version control, and enterprise security of Azure Machine Learning service.

Azure Machine Learning visual interface

With this new visual interface, we have started to combine the best of Azure Machine Learning Studio in Azure Machine Learning service. We will continue to share more updates throughout the year as we move from Preview towards General Availability.

Try it out yourself with this tutorial.

Hosted notebooks in Azure Machine Learning (Preview)

The new notebook-VM-based authoring experience is directly integrated into Azure Machine Learning, providing a code-first way for Python developers to conveniently build and deploy models in the workspace. Developers and data scientists can perform every operation supported by the Azure Machine Learning Python SDK using a familiar Jupyter notebook in a secure, enterprise-ready environment.

Hosted notebook VM (Preview) in Azure Machine Learning

Get started quickly and access a notebook directly in Azure Machine Learning, use preconfigured notebooks with no setup required, and fully customize notebook VMs by adding custom packages and drivers.

Enterprise-grade model deployment, management, and monitoring

MLOps – DevOps for machine learning

MLOps (also known as DevOps for Machine Learning) is the practice for collaboration and communication between data scientists and DevOps professionals to help manage the production machine learning lifecycle.

New MLOps capabilities in Azure Machine Learning bring the sophistication of DevOps to data science, with orchestration and management capabilities that enable effective ML lifecycle management:

Model reproducibility and version control to track and manage the assets used to create a model, and to share ML pipelines, using environment, code, and data versioning capabilities.
Audit trails to ensure asset integrity and provide control logs that help meet regulatory requirements.
Packaging and validation for model portability and to certify model performance.
Deployment and monitoring support with a simplified experience for debugging, profiling, and deploying models, so teams can release models with confidence and know when to retrain.
Azure DevOps extension for Machine Learning and the Azure ML CLI to submit experiments from a DevOps Pipeline, track code from Azure Repos or GitHub, trigger release pipelines when an ML model is registered, and automate end-to-end ML deployment workflows using Azure DevOps Pipelines.
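To make the Azure DevOps integration above concrete, here is a minimal sketch of what such a pipeline definition might look like. The azure-pipelines.yml schema and the UsePythonVersion task are real, but train.py and register_model.py are hypothetical placeholder scripts, not a prescribed layout:

```yaml
# Illustrative azure-pipelines.yml: train a model and register it on every push.
trigger:
  branches:
    include: [master]

pool:
  vmImage: 'ubuntu-16.04'

steps:
- task: UsePythonVersion@0
  inputs:
    versionSpec: '3.6'
- script: pip install azureml-sdk
  displayName: Install Azure ML SDK
- script: python train.py            # hypothetical training script
  displayName: Train model
- script: python register_model.py   # hypothetical script registering the model in the workspace
  displayName: Register model
```

A registered model can then serve as the trigger for a release pipeline that packages and deploys it, giving the audit trail and automation described above.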

Operationalize models efficiently with MLOps

These capabilities enable customers to bring their machine learning scenarios to production by supporting reproducibility, auditability, and automation of the end-to-end lifecycle, leading to improved model quality over time.

Learn more about MLOps with Azure Machine Learning.

Hardware accelerated models and FPGA on Data Box Edge

In addition to the acceleration available with GPUs, customers can now scale from cloud to edge with Azure Machine Learning Hardware Accelerated Models, powered by FPGAs. These Hardware Accelerated Models are now generally available in the cloud, along with a preview of models deployed to Data Box Edge.

FPGA technology supports compute-intensive scenarios like deep neural networks (DNNs), which have ushered in breakthroughs in computer vision, without forcing tradeoffs between price and performance. With FPGAs, it is possible to achieve ultra-low latency with ResNet 50, ResNet 152, VGG-16, DenseNet-121, and SSD-VGG. FPGAs enable real-time insights for scenarios like manufacturing defect analysis, satellite imagery, or video footage from autonomous vehicles to drive business-critical decisions.

Learn more about FPGAs and Azure Machine Learning.

Model interpretability

Microsoft is committed to supporting transparency, intelligibility, and explanation in machine learning models. Model interpretability brings us one step closer to understanding the predictions a model makes to ensure fairness and avoid model bias. This deeper understanding of models is key when uncovering insights about the model itself both in order to improve model accuracy during training and to uncover model behaviors and explain model prediction outcomes during inferencing.

Model interpretability is available in Preview, bringing cutting-edge open-source technologies (e.g., SHAP, LIME) under a common API and giving data scientists the tools to explain machine learning models globally on all data, or locally on a specific data point, in an easy-to-use and scalable fashion.
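As a rough illustration of the idea of global feature attribution: the service wraps techniques like SHAP and LIME under one API, but the same intuition can be shown with simple permutation importance in plain Python. The toy model and data below are hypothetical:

```python
import random

def permutation_importance(predict, X, y, feature, trials=5, seed=0):
    """Measure how much shuffling one feature column degrades accuracy:
    a larger increase in error means the model leans on that feature more."""
    rng = random.Random(seed)
    def error(rows):
        return sum(abs(predict(r) - t) for r, t in zip(rows, y)) / len(rows)
    base = error(X)
    drops = []
    for _ in range(trials):
        col = [row[feature] for row in X]
        rng.shuffle(col)  # break the feature's relationship to the target
        shuffled = [row[:feature] + [v] + row[feature + 1:] for row, v in zip(X, col)]
        drops.append(error(shuffled) - base)
    return sum(drops) / trials

# Toy "model" that only uses feature 0, so feature 1 should score zero.
predict = lambda row: 2 * row[0]
X = [[1, 9], [2, 7], [3, 5], [4, 3]]
y = [2, 4, 6, 8]
imp0 = permutation_importance(predict, X, y, feature=0)
imp1 = permutation_importance(predict, X, y, feature=1)
```

Real interpretability tooling goes further, attributing individual predictions (local explanations) as well as overall model behavior (global explanations).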

Learn more about model interpretability.

Open and interoperable platform providing flexibility and choice 

“All the data scientists on our team enjoy using Azure Machine Learning, because it’s fully interoperable with all the other tools they use in their day-to-day work—no extra training is needed, and they get more done faster now.”

— Matthieu Boujonnier, Analytics Application Architect and Data Scientist, Schneider Electric

ONNX Runtime with Azure Machine Learning

Azure Machine Learning service supports ONNX (Open Neural Network Exchange), the open standard for representing machine learning models from TensorFlow, PyTorch, Keras, Scikit-learn, and many other frameworks. An updated version of ONNX Runtime is now available, fully supporting the ONNX 1.5 specification, including state-of-the-art object detection models such as YOLOv3 and SSD. With ONNX Runtime, developers now have a consistent scoring API that enables hardware acceleration, thanks to the general availability of NVIDIA TensorRT integration and the public preview of Intel nGraph integration. ONNX Runtime is used on millions of Windows devices as part of Windows ML. It also handles billions of requests in hyperscale Microsoft services such as Office, Bing, and Cognitive Services, where an average of two times the performance gains have been seen.

Learn more about ONNX and Azure Machine Learning.

MLflow integration

Azure Machine Learning supports popular open-source frameworks to build highly accurate machine learning models easily, and enables training to run in a variety of environments, whether on-premises or in the cloud. Now developers can use MLflow with their Azure Machine Learning workspace to log metrics and artifacts from training runs in a centralized, secure, and scalable location.

Azure Open Datasets (Preview)

Azure Open Datasets is a new service providing curated, open datasets hosted on Azure and easily accessible from Azure Machine Learning. Use these datasets for exploration or combine them with other data to improve the accuracy of machine learning models. The datasets currently provided are historical and forecast weather data from NOAA, and many more will be added over time. Developers and data scientists can also nominate datasets for Azure to host, supporting the global machine learning community with relevant and optimized data.

Azure Open Datasets

Learn more about Azure Open Datasets.

Start building experiences

Envisioning, building, and delivering these advancements to the Azure Machine Learning service has been made possible by closely working with our customers and partners. We look forward to helping simplify and accelerate machine learning even further by providing the most open, productive, and easy-to-use machine learning platform. Together, we can shape the next phase of innovation, making AI a reality for your business and enabling breakthrough experiences.

Get started with a free trial of Azure Machine Learning service.

Learn more about the Azure Machine Learning service and follow the quickstarts and tutorials. Explore the service using the Jupyter notebook samples. 

Read all the Azure AI news from Microsoft Build 2019.
Source: Azure