9 ways to back up your SAP systems in Google Cloud

At the heart of every modern business is data. Use it right, and you open the door to emerging technologies that help you compete. But as you continue to innovate and invest in your technology, the data you create and produce becomes even more critical to protect from loss and outages. For SAP customers using new systems like S/4HANA, including backup and storage design as part of your overall business continuity planning is particularly important. Reasons for data loss and outages can be physical or logical. In this blog post, we'll focus on protecting against physical outages, like those caused by data center failures or environmental disasters, so your business is ready for anything.

Technology 101: How backups work in the SAP ecosystem

Each of your SAP deployments has unique Recovery Point Objective (RPO) and Recovery Time Objective (RTO) requirements, which influence your entire backup strategy and toolset. RPO describes how much data, measured in time, you can afford to lose: the more capable your backup operations, the closer your recovery point sits to the moment of failure. RTO refers to the time it takes for your systems to recover and get back online. Most of the time, a trade-off is made between the overall cost of backup operations and the cost of downtime and lost data.

A typical SAP workload consists of virtual machines (VMs) running databases and application servers on disks. There is a dedicated boot disk for the operating system (OS), and most of the remaining disks are used for applications. Because of this, we recommend that all of our SAP customers allocate a separate disk, like Persistent Disk, for all files and data that aren't part of the OS. This makes systems easy to replace and move, and simplifies data capture and storage processes.

Backup strategies for SAP customers leveraging the cloud

The core principle for backup solutions is to segregate backup data copies from the primary storage location.
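The RPO side of that trade-off can be made concrete with a small sketch: with periodic backups, the worst-case data loss is the interval between two backups. The snippet below is illustrative only; the interval and RPO values are hypothetical, not SAP or Google Cloud defaults.

```java
// Illustrative sketch: does a periodic backup schedule meet an RPO target?
// With a backup every N minutes, the worst case loses up to N minutes of data.
public class RpoCheck {

    // True if the worst-case data loss stays within the RPO budget.
    static boolean meetsRpo(int backupIntervalMinutes, int rpoMinutes) {
        return backupIntervalMinutes <= rpoMinutes;
    }

    public static void main(String[] args) {
        int rpoMinutes = 30; // hypothetical: business tolerates 30 minutes of loss
        System.out.println(meetsRpo(15, rpoMinutes)); // 15-minute snapshots: true
        System.out.println(meetsRpo(60, rpoMinutes)); // hourly snapshots: false
    }
}
```

Tightening the interval moves the recovery point closer to the failure, at the cost of more frequent backup operations; that is exactly the cost trade-off described above.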
But in an on-premises setting, data has only one place to go: the in-house storage unit. The good news is that, as more SAP workloads have moved to the cloud on HANA, you now have multiple cloud-based backup solutions that are flexible, scalable, and self-manageable.

Persistent disk snapshots

Persistent disk snapshots are fast and cost-effective. You can specify the storage location for snapshots as regional or multi-regional. In an SAP HANA database running on Google Cloud, you can store backup folders on separate persistent disks to capture and replicate the database server independently.

Machine images (Beta)

A Google Compute Engine resource, machine images store all the configuration, metadata, permissions, and data from disks needed to create a VM instance. Machine images are ideal resources for disk backups as well as instance cloning and replication.

Shared file storage

SAP systems can use shared file storage (for example, Google Cloud Filestore or Elastifile) to fulfill high availability and disaster recovery requirements. Shared file systems can be combined with appropriately chosen Cloud Storage buckets (multi-region, dual-region) to ensure availability of data backups across zones and regions.

HANA Backint agent for Cloud Storage (Beta)

For SAP HANA database backup, Google Cloud offers customers a free, SAP-certified, application-aware Cloud Storage Backint agent for SAP, which eliminates the need to back up with persistent disks.

Third-party network storage

Third-party network file system (NFS) solutions back up all relevant file system volumes of an SAP instance, for both the application and database layers, with scheduled snapshots that are stored in Cloud Storage.
For SAP HANA, this solution is only suitable for hosting backup and share volumes.

Third-party backup agents and managed services

These solutions offer advanced technical features that enable rapid backup and recovery times, because third-party providers do not rely on database-level incremental backups. For enterprise-scale SAP landscapes, this reduces storage sizes. A word of advice, though: stick to SAP HANA-certified backup solutions.

SAP HANA data snapshot

SAP HANA databases can also create data snapshots independently, using native SQL. This doesn't require certification, but it is a highly complex technique, since some systems need to be deactivated before snapshots can be taken.

SAP HANA stop/start snapshot of secondary HANA instance

This solution is suitable for non-production cases where cost considerations supersede RPO requirements. Creating snapshots involves using a smaller standby instance in an SAP HANA system replication setup. You can also take this instance offline and make a complete VM snapshot for point-in-time recoverability.

Snapshot and disk deallocation

If cost is a high priority, Google Cloud offers services that let you allocate a persistent disk just in time for a snapshot and deallocate it once the backup is complete. A cloud-based infrastructure allows you to create disks for backup on an as-needed, pay-as-you-use basis.

While we wish we could say data loss and disasters will never happen, the reality is that the next outage or triggering event is just around the corner. For businesses rapidly modernizing and transforming in a digital landscape, like SAP customers migrating to HANA, protecting your data will determine whether you are able to compete in an unpredictable, complex, and dynamic business environment.
From persistent disk snapshots to machine images, Google and SAP's cloud solutions work seamlessly together to provide an ecosystem of customizable solutions.

Explore your HA options

We've only scratched the surface when it comes to understanding the many ways Google Cloud supports and extends backup and recovery for your SAP instances. For an even deeper dive, read our white paper, "SAP on Google Cloud: Backup strategies and solutions."
Source: Google Cloud Platform

Introducing Java 11 on Google Cloud Functions

The Java programming language recently turned 25 years old, and it's still one of the most-used languages powering today's enterprise applications. On Google Cloud, you can already run serverless Java microservices in App Engine and Cloud Run. Today we're bringing Java 11 to Google Cloud Functions, an event-driven serverless compute platform that lets you run code locally or in the cloud without having to provision servers. That means you can now write Cloud Functions using your favorite JVM languages (Java, Kotlin, Groovy, Scala, etc.) with our Functions Framework for Java, and also with Spring Cloud Function and Micronaut!

With Cloud Functions for Java 11, now in beta, you can use Java to build business-critical applications and integration layers, and deploy the function in a fully managed environment, complete with access to resources in a private VPC network. Java functions will scale automatically based on your load. You can write HTTP functions to respond to HTTP events, and background functions to process events sourced from various cloud and GCP services, such as Pub/Sub, Cloud Storage, Firestore, and more.

Functions are a great fit for serverless application backends, for integrating with third-party services and APIs, or for mobile or IoT backends. You can also use functions for real-time data processing systems, like processing files as they are uploaded to Cloud Storage, or handling real-time streams of events from Pub/Sub. Last but not least, functions can serve intelligent applications like virtual assistants and chatbots, or video, image, and sentiment analysis.

Cloud Functions for Java 11 example

You can develop functions using the Functions Framework for Java, an open source functions-as-a-service framework for writing portable Java functions.
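To give a feel for the programming model, here is a minimal sketch of an HTTP function: a class implementing the framework's HttpFunction interface. In a real project, HttpFunction, HttpRequest, and HttpResponse come from the functions-framework-api Maven dependency; they are redeclared here in stripped-down form only so the snippet is self-contained, and the tiny main harness stands in for the real Functions Framework runtime.

```java
import java.io.IOException;
import java.io.StringWriter;
import java.io.Writer;

// Stripped-down stand-ins for the Functions Framework interfaces
// (normally imported from com.google.cloud.functions).
interface HttpRequest {}

interface HttpResponse {
    Writer getWriter() throws IOException;
}

interface HttpFunction {
    void service(HttpRequest request, HttpResponse response) throws Exception;
}

// A minimal HTTP function: write a greeting to the response body.
public class HelloHttp implements HttpFunction {
    @Override
    public void service(HttpRequest request, HttpResponse response) throws Exception {
        response.getWriter().write("Hello, World!");
    }

    // Tiny local harness standing in for the framework runtime.
    public static void main(String[] args) throws Exception {
        StringWriter out = new StringWriter();
        new HelloHttp().service(null, () -> out);
        System.out.println(out); // prints Hello, World!
    }
}
```

With the real dependency in place, a function like this runs locally via the Function Maven plugin and deploys through the gcloud CLI or the same plugin, as described below.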
You can develop and run your functions locally, deploy them to Cloud Functions, or deploy them to another Java environment.

An HTTP function simply implements the HttpFunction interface. Add the Functions Framework API dependency to the Maven pom.xml, then add the Function Maven plugin so you can run the function locally. You can also use your IDE to launch this Maven target in debugger mode to debug the function locally. To deploy the function, you can use the gcloud command line; alternatively, you can deploy with the Function Maven plugin. You can find the full example on GitHub.

In addition to running this function in the fully managed Cloud Functions environment, you can also bring the Functions Framework runtime with you to other environments, such as Cloud Run, Google Kubernetes Engine, or a virtual machine.

Third-party framework support

In addition to our Functions Framework for Java, both the Micronaut framework and the Spring Cloud Function project now have out-of-the-box support for Google Cloud Functions. You can create both HTTP functions and background functions using the respective framework's programming model, including capabilities like dependency injection.

Micronaut

The Micronaut team implemented dedicated support for the Cloud Functions Java 11 runtime. Instead of implementing the Functions Framework's HttpFunction interface directly, you can use Micronaut's programming model, such that a HelloWorld HTTP function can simply be a Micronaut controller. You can find a full example of Micronaut with Cloud Functions, along with its documentation, on GitHub.

Spring Cloud Function

The Google Cloud Java Frameworks team worked with the Spring team on the Spring Cloud GCP project to help Spring Boot users easily leverage Google Cloud services. More recently, the team worked with the Spring Cloud Function team to bring you the Spring Cloud Function GCP Adapter.
A function can just be a vanilla Java function, so you can run a Spring Cloud Function application on Cloud Functions without having to modify your code to run on Google Cloud. You can find a full example of a Spring Cloud Function with Cloud Functions on GitHub.

JVM languages

In addition to using the latest Java 11 language features with Cloud Functions, you can also use your favorite JVM languages, such as Kotlin, Groovy, and Scala. You can take a deeper dive into a Groovy example, and find all the examples on GitHub (Kotlin, Groovy, Scala).

Try Cloud Functions for Java 11 today

Cloud Functions for Java 11 is now in beta, so you can try it today with your favorite JVM language and frameworks. Read the Quick Start guide, learn how to write your first functions, and try it out with a Google Cloud Platform free trial. If you want to dive a little deeper into the technical aspects, you can also read this article on the Google Developers blog. If you're interested in the open-source Functions Framework for Java, please don't hesitate to have a look at the project and potentially even contribute to it. We're looking forward to seeing all the Java functions you write!

Special thanks to Googlers Éamonn McManus, Magda Zakrzewska‎, Sławek Walkowski, Ludovic Champenois, Katie McCormick, Grant Timmerman, Ace Nassri, Averi Kitsch, Les Vogel, Kurtis Van Gent, Ronald Laeremans, Mike Eltsufin, Dmitry Solomakha, Daniel Zou, Jason Polites, Stewart Reichling, and Vinod Ramachandran. We also want to thank the Micronaut and Spring Cloud Function teams for working on the Cloud Functions support!
Source: Google Cloud Platform

Azure Lighthouse—managing cloud, hybrid, and edge environments at-scale through a single control plane

Thousands of partners and enterprises use Azure Lighthouse to manage services across Azure tenants, representing tens of thousands of subscriptions and more than one million Azure resources, all from Azure Resource Manager, a unified control plane. With Azure Lighthouse, service providers, as well as self-managing enterprises, can achieve higher operational efficiency using Azure's comprehensive and robust management tools. You can now view and manage resources, with higher automation, scale, and enhanced governance, across hybrid estates and on-premises environments.

It is common for Managed Service Providers (MSPs) to service customer resources across hybrid estates and on-premises environments. Many MSP partners rely on Azure Lighthouse, and now Azure Arc, to achieve a unified management solution in these advanced scenarios. MSPs can extend their service offerings to manage their customers’ on-premises environments through Azure Resource Manager, managing resources at scale and governing compliance using Azure policy.

ClearDATA—delivering robust governance across hybrid environments for healthcare customers

Using Azure Lighthouse, Azure Policy, and Azure Arc, ClearDATA—an Azure Expert MSP—provides compliance insights to enterprise customers in regulated industries, such as healthcare. Azure Arc enables ClearDATA to easily perform virtual machine inventories in hybrid environments, while Azure Policy used with Azure Lighthouse helps them to achieve consistency, security, and compliance across all of their customers in all of the clouds and private datacenters or branch offices the customers use.

ClearDATA provides compliance state insights across hybrid environments to enterprise customers.

“ClearDATA’s HIPAA compliant and HITRUST 9.1 certified solutions on Azure help enterprise organizations easily transition and accelerate their move to the cloud with greater confidence. A rich library of compliance reference architecture for Azure services, coupled with our unique Automated Safeguards and Remediation technology, unlocks the true potential of Azure Lighthouse and Azure cloud. Our visual and easy-to-use compliance dashboard and flexible reports provide transparency and visibility needed to demonstrate compliance.”—Suhas Kelkar, Chief Product Officer, ClearDATA.

Yorktel—monitoring customer edge devices

Yorktel manages the health states of Microsoft collaboration devices (Surface Hub 1 and 2, and Microsoft Teams Rooms), including displays, microphones, cameras, speakers, and real-time Microsoft Teams features, on behalf of its end customers. By pivoting to Azure Monitor as its primary monitoring tool, with Azure Lighthouse as its secure access mechanism, Yorktel is shaking up edge device management. Consolidated views across all its customers provide Yorktel with comprehensive oversight, enabling timely alerts that trigger response workflows for speedy problem resolution. Azure Lighthouse has created smoother user experiences and higher customer satisfaction.

Yorktel’s Azure-based monitoring workflow for edge devices.

“Yorktel’s Azure Lighthouse enabled monitoring and management solution couldn’t have come at a better time. As the post-COVID-19 world prepares to return to work, this proactive problem and resolution technology presents the potential for dramatic impact, both for managed services providers and their customers. The efficiencies generated by faster, large-scale problem resolution will allow companies to focus on the strategic and transformational initiatives that will help them grow and acclimate to the post-COVID-19 world, rather than the tactical, day-to-day ‘keeping the lights on’.” —Jeremy Short, SVP of Microsoft Solutions, Yorktel

Vandis—delivering managed network services

Azure Lighthouse has also enabled multiple service providers, such as Azure Networking MSPs, to build and operate optimized hybrid connectivity from customer premises to customer subscriptions in Azure. Vandis, for example, uses Azure Lighthouse to plan, build, and operate a hybrid network for customers based on Azure Virtual WAN and Azure Express Route.

“Azure Lighthouse has enabled us to expand our Network-as-a-Service Platform to our customers as well as drive work-from-home solutions such as Windows Virtual Desktop on Azure.” —Ryan Young, CTO, Vandis

Azure Lighthouse—continuing to innovate for management-at-scale scenarios in Azure

Congratulations to all our partners who continue to add value to our joint customers with enhanced services for managing Azure and hybrid estates. Our team is as motivated as ever to innovate for our partner ecosystem, and we’ve been constantly adding new Azure Lighthouse capabilities as a result.

Here are a few highlights:

Service providers can now trigger notification and onboarding workflows for their teams, in their own Azure control plane, through activity logs that monitor customers’ resource delegation actions.
Customers can now upgrade their managed services offers from within their own Azure portal experience, in the service providers view, rather than visiting other portals or marketplaces.
Automation tools of choice, including the command-line interface (CLI), APIs (subscription function), and PowerShell, can now display the managed and managing tenant context of an Azure subscription.
Service providers can opt out of managing customer-delegated Azure scopes on their own, accelerating compliance and offboarding needs.
Azure Backup Explorer and Backup reports now offer cross-customer consolidated views for service providers, driving operator efficiency.
Azure Lighthouse is now a FedRAMP High certified service available in Microsoft Azure Government.
Partners can now draft and publish managed services offers to the Azure Marketplace directly from the Partner Center, streamlining offer and lead management into a single portal.
Azure Lighthouse Help and Support experiences have been enhanced, including recommended solutions for common issues, empowering managing tenants with more insights to solve issues themselves.

And that’s a wrap for Build 2020 with Azure Lighthouse. I cannot wait to share more with you at Inspire 2020 in July. In the meantime, check out our new Azure Lighthouse learning content.
Source: Azure

Virtual Build spotlights IoT updates and rollouts

As people around the globe adapt to new ways of working, the Microsoft Build 2020 conference took a new approach as well. Rather than gathering the developer community in person as planned, Microsoft shifted gears and put together 48 hours of streaming content for a virtual event.

Despite the new format, Microsoft Build’s goals remained the same: Connect our developers with the best of Microsoft so they can bring their ideas to life. For IoT, that included a lot of new innovations and training for developers, all geared toward simplifying IoT and empowering developers to build new breakthrough solutions.

On the training side, we’re especially excited to launch a new IoT certification to help build skills in the community and unlock the creativity of developers. We’ve also added some industry-leading capabilities with an all-new Azure Digital Twins release that can model just about any scenario.

Below is a roundup of the key news. I encourage you to click down into the individual announcements for more detail, and if you weren’t able to virtually attend the Microsoft Build conference, access the sessions online.

New IoT certification for developers

One of the biggest challenges for developers building IoT applications is acquiring the skills to do so. Microsoft offers multiple training options that empower developers to increase technical skills and prepare for Microsoft Certifications.

At Microsoft Build 2020, we announced the general availability of a recent addition to the Microsoft Certification portfolio: The Azure IoT Developer Specialty certification. Earning this certification can help developers become recognized as experts and advance their careers by validating technical knowledge and ability.

Developers can start the IoT learning and certification journey at Microsoft Learn, with free online, self-paced courses covering all the essentials like provisioning and managing devices, processing data, deploying cloud workloads to the edge, securing the solution, and more. Check out the Microsoft Learning Blog to explore all the resources available to skill up and get certified.

Azure Digital Twins: New preview features

A “digital twin” is a digital replica of real-world things—assets, environments, business systems—designed to understand, control, simulate, analyze, and improve how those things work in the real world.

At Microsoft Build 2020, we announced the next iteration of Azure Digital Twins, making it even easier for developers to build these dynamic virtual replicas. New capabilities include rich and flexible modeling that supports full graph topologies, a live execution environment, easy integration with other Azure services, and broad query APIs.

To drive openness in building IoT applications, the new Azure Digital Twins also uses an open modeling language, the Digital Twins Definition Language, based on the JSON-LD standard. This provides great flexibility, ease of use, and easy integration with other Azure platform offerings such as IoT Hub and Time Series Insights.

It also allows for expanded integration outside Azure, so partners can use Digital Twins as part of their existing modeling frameworks and third-party systems. The new features are expected to be out in the coming months.

We also highlighted two partners using new capabilities in exciting ways. Pennsylvania-based ANSYS is building physics-based simulations that can aid in designing large physical assets. Another partner, Bentley Systems, is creating a digital representation of major infrastructure including road and rail networks, public works and utilities, industrial plants, and commercial and institutional facilities to help customers better design, build, and operate.

Finally, as part of our commitment to openness and interoperability, we announced that Microsoft has joined Dell, Ansys, and LendLease in founding the Digital Twin Consortium, where we will work to build an open community that promotes best practices and standard digital twin models for all businesses and industry domains.

IoT Plug and Play: New preview features

IoT Plug and Play is an open approach that dramatically accelerates IoT by making it much easier to develop software on devices, connect them quickly to IoT solutions, and update each independently. Since our initial preview last year, we have been busy responding to customer feedback, and at Build we announced a set of new preview features that will be available soon:

Alignment with Digital Twins: IoT Plug and Play and Azure Digital Twins now share the same modeling language: the Digital Twins Definition Language (DTDL). This makes it simple to connect an IoT Plug and Play device to Azure Digital Twins and have the device appear instantly as a Digital Twin. 
Support for existing devices: We have made it easy to update existing devices to be IoT Plug and Play compatible. Developers simply author a DTDL document that describes the interaction model of their device, make targeted code changes, and then send the model when the device connects.
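As a rough illustration of what such a DTDL document looks like, here is a minimal interface sketch. The id, display name, and telemetry field are hypothetical examples, not from this announcement; consult the DTDL specification for the authoritative schema.

```json
{
  "@context": "dtmi:dtdl:context;2",
  "@id": "dtmi:com:example:Thermostat;1",
  "@type": "Interface",
  "displayName": "Thermostat",
  "contents": [
    {
      "@type": "Telemetry",
      "name": "temperature",
      "schema": "double"
    }
  ]
}
```

When a device sends the model id for a document like this at connection time, IoT Hub and Azure Digital Twins can interpret its telemetry without device-specific code.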

We will also be enabling our device providers to start their final certifications ahead of our IoT Plug and Play general availability.

Azure Time Series Insights: New features general availability

Traditionally, comparing historical trends with time series data has meant spending days normalizing the data before analyzing it. With Azure Time Series Insights, developers can process, analyze, and get data insights in just minutes.

This year at Microsoft Build, we announced that new features for Azure Time Series Insights will be generally available in the coming months.

Several months ago we announced a preview of Azure Time Series Insights features, including an enhanced analytics user experience through Time Series explorer, seamless integration with advanced machine learning platforms and analytics tools, a native connector to Power BI, semantic model support for metadata, and more.

This version builds on our commitment to deliver a truly flexible analytics platform with the introduction of Azure Data Lake Storage Gen2 support. By combining customer-owned Azure Data Lake Storage with our native support for the open source, highly-efficient Apache Parquet, customers can gain insights over decades of IoT data. They can also integrate with other analytics tools of their choice to unlock significant business value and operational intelligence.

When our customers use Azure Time Series Insights together with Azure Digital Twins, they gain highly contextualized representations of their connected environments to better understand how assets, customers, and processes interact.

We also announced improvements in scale, security, and user experience that will be available in the next few months. Learn more about Azure Time Series Insights and start getting insights from your IoT data today.

Azure Maps: Creator feature in preview

Azure Maps is an enterprise location platform that enables developers to add spatial analytics and mobility to their IoT applications.

At Microsoft Build, we announced the preview of Azure Maps Creator, which offers a fundamental shift in building and managing private map data, moving geographic information systems (GIS) data management into the Azure cloud.

With Azure Maps Creator, developers can upload private map information such as indoor floorplans, spaces, and physical assets into a customer-controlled, highly-secure, and fully-compliant geospatial storage system within Azure Maps.

Azure Maps Creator also helps Azure Digital Twins customers by handling private map data associated with Digital Twins for private spaces like building interiors, campuses, factories, and more. The combination of Azure Maps Creator and Azure Digital Twins helps customers manage, monitor, and track IoT assets within their environments through the Azure Maps interface. Learn more about Azure Maps Creator.

Azure IoT Central: First-class support for Azure Sphere and Azure IoT Edge

IoT Central is a fully managed software as a service (SaaS) IoT app platform that allows developers to easily create IoT applications without managing the underlying infrastructure. Developers can either use existing IoT Central industry templates or create customized solutions of their own design. Of particular note during our current public health crisis is IoT Central’s continuous patient monitoring health template designed to accelerate the assembly and deployment of healthcare wearables and patient monitoring solutions.

At Microsoft Build, IoT Central announced several new features, including first-class support for both Azure Sphere and Azure IoT Edge.

Integrating IoT Edge with IoT Central allows developers to deploy cloud workloads such as artificial intelligence and machine learning on edge devices. It dramatically increases the possibilities for IoT applications by allowing developers to deploy Edge software modules, find insights from them, and take actions—all from within IoT Central.

Pairing IoT Central with Azure Sphere’s integrated security solution provides the foundation needed to build, monitor, and safely manage IoT devices and products. It allows application builders to ensure device-to-cloud security through simplified security management from a single pane of glass. Developers can also model Azure Sphere devices in IoT Central using device templates integrated with Azure Sphere cloud services to facilitate secure error and device status reporting.

For more information on how IoT Central and Azure Sphere can help in the design and management of a robust IoT strategy, read the blog to learn more.

Follow the latest IoT Central innovations by subscribing to our monthly service updates.

Azure IoT Hub and Azure IoT Edge: New breakthrough capabilities for enterprise-grade IoT

At Microsoft Build, we announced another industry first: Azure IoT Hub now supports Azure Private Link for device connectivity as well as Managed Identity for securely connecting to locked-down Azure resources. As a result, customers can now bring IoT Hub into their Azure Virtual Network (VNET) and secure their IoT solution by eliminating exposure to the public internet. To learn more, see the full blog.

We also announced new industry-leading features that elevate Azure IoT Edge to the most sophisticated, production-grade edge platform in the industry:

IoT Edge added X.509 certificate attestation for IoT Hub Device Provisioning Service (DPS). This takes advantage of X.509 certificate chains to automate device provisioning, allowing for greater scale.
Additional features will make supportability and debugging quick and easy. A new feature called Support Bundle reduces the work required to debug issues across IoT Edge components. This feature allows collection of module, IoT Edge security manager, and container engine logs, along with iotedge check output and other useful debug information, in a single compressed file with a single command.
IoT Edge, together with IoT Hub Automatic Device Management, allows layered deployments that enable reuse of the same module in different combinations, reducing the number of unique deployments that need to be created.
Azure IoT Edge also works on Kubernetes, and we recently added new features for this support. These include an integrated, production-grade security architecture, a built-in lightweight proxy to deploy IoT Edge modules on Kubernetes with no code changes, integration of IoT Edge features like automatic provisioning using IoT Hub Device Provisioning Service, and application model extensions that allow the use of select Kubernetes primitives in an edge deployment manifest.

And we are not done—based on our customers’ needs, we are working on the following new features that will be released soon as part of IoT Edge release 1.0.10 in the coming months:

Priority messages and Time-to-Live (TTL) support, which will allow greater control over network usage in constrained and expensive networking environments by letting our customers choose which data they want to receive first from an IoT Edge device.
IoT Edge runtime will be enhanced to emit rich operational metrics in an industry-standard Prometheus format, enabling powerful monitoring and alerting features both locally and remotely.

Azure RTOS

Getting intelligent, reliable hardware products to market can be time-consuming and complex. Azure RTOS is an embedded IoT development suite that includes a lightweight real-time operating system for microcontrollers (MCUs) and microprocessors (MPUs) to streamline the process of building high-performing devices.

At Microsoft Build we announced the general availability of Azure RTOS, the fastest, smallest, industry-grade RTOS on the planet. We also announced that Microsoft now supports Azure RTOS on development kits from ST, Renesas, NXP, and Microchip. This turnkey integration helps simplify many steps in the development cycle.

Full source code for all Azure RTOS components is now available on GitHub for developers to freely test and explore. Azure RTOS includes a preview integration of an Azure Security Center module. Later this year we will offer an add-on industrial certification package to help developers get to market even faster. For more details, read the full announcement.

Azure Sphere

Azure Sphere is a device security solution purpose-built with Azure Sphere-certified hardware, a highly secured OS, and a cloud security service, backed by more than a decade of ongoing, on-chip security improvements.

Since we announced its general availability in February 2020, Microsoft has relied on Azure Sphere in our own datacenters to securely connect the critical infrastructure that delivers cloud services at scale. 
 
At Microsoft Build, we demonstrated Azure Sphere and Azure RTOS’s collective capability to address critical needs across the full spectrum of MCU and embedded-class IoT devices, enabling developers to build highly secure devices with real-time processing capabilities.

Windows for IoT: A broad range of updates, including something for every developer

At Microsoft Build, we also laid out the road map for the continued integration of IoT capabilities into Windows.

Customers love the security and manageability of Windows for IoT, and we are making it even easier to integrate with Azure and to access Linux modules by enabling the Linux version of Azure IoT Edge on Windows 10 IoT Enterprise. We are also creating new market opportunities for device builders by shrinking the footprint of Windows 10 IoT Enterprise, enabling NXP’s i.MX8 silicon, and adding new features for appliance scenarios and business models.

Our partners continue to build innovative solutions with Windows IoT. Democracy Live and Dover Fueling Solutions are examples of partners enabling secure, accessible, and empowered solutions with Windows 10 IoT Enterprise. It is also exciting to see Clearpath Robotics adding support for Robot Operating System (ROS) on Windows, and HIWIN enabling speech and vision cognition capabilities for robots running ROS on Windows.

For more detail on all the IoT updates happening around our upcoming releases of Windows IoT, check out the announcement blog.

Get more from Microsoft IoT

All of us at Microsoft IoT want to thank the developers who participated in our first virtual Microsoft Build. Shifting gears to put on this event in an accessible, inclusive way involved groups across Microsoft, and we hope the content helps the community stay connected to the platform and advance their own offerings.

Watch the virtual sessions and check out the detailed announcement blog posts linked above. We’ll be adding more in the coming months, so stay tuned—and stay safe.
Source: Azure