Helping Smart Cities become more Inclusive

According to the UN, the world's urban population will grow from today's 55 percent to 68 percent by 2050. With almost a billion people on the path to becoming urban dwellers, most cities are still unfriendly to people with disabilities. As more people flock to cities, making them smarter and more inclusive will become increasingly important. The concept of smart cities is all about developing strategies that leverage data and technology to enhance urban life. The IoT plays a central role: collecting sensor data and then using the insights gained from that data to manage assets, resources, and services efficiently.

As city planners tackle the complex challenges of increasing urbanization, scarce resources, climate change, and the need for safer, more accessible cities, Azure Maps (a collection of geospatial APIs) becomes a critical tool. A key aspect of IoT and technology solutions is that they should be intuitive, easy to use, and accessible.

Azure Maps and accessibility

Azure Maps makes it easy for all users to navigate an interactive map experience. Users can interact with maps using a mouse, touch, or keyboard. Azure Maps provides screen readers with enhanced descriptions that can combine multiple updates into a single message that is easier to digest and understand. Recently, Azure Maps added exciting new accessibility capabilities and achieved Microsoft accessibility certification. All apps that use Azure Maps, both Microsoft-owned and third-party, benefit from the accessibility features that are provided out of the box.

Azure Maps also relies on best-of-breed content partnerships for everything from map data and traffic to real-time transit, ride-share, and weather data.

Moovit helps people with disabilities ride transit

One of the Azure Maps content partnerships is with Moovit. Launched in 2011 in Israel, Moovit has become the world’s most popular transit-planning and navigation app, with more than 500 million users and service in over 3,000 cities across 94 countries. The company is also a leader in inclusive technology, with innovative work that helps people across the disability spectrum use buses, trains, subways, ride-hailing services, and other modes of public transit.

In addition to offering a consumer app in 45 languages, Moovit has partnered with Microsoft to provide its multi-modal transit data to developers who use Azure Maps, and a set of mobility-as-a-service solutions to cities, governments, and organizations. The partnership will enable the creation of more inclusive smart cities and more accessible transit apps.

How Aira helps smart cities become more accessible

One of the companies leveraging the geospatial and mapping capabilities of Azure Maps and the transit capabilities of Moovit is Aira. Aira is a technology company dedicated to making lives simpler and more engaging. Based in San Diego, California, Aira builds solutions that connect people who are blind, have low vision, or are simply aging into a digital world with highly trained professionals who provide visual information on demand.

Public transportation is the lifeline to jobs, education, healthcare, and more, yet many blind or low-vision riders still have trouble getting to their destination. They may be uncertain that they’ve caught the right bus, or unable to read the entrance sign they need to follow to access the subway. In addition, as populations age, the number of people experiencing age-related vision loss rises every day. Moovit, Microsoft, and Aira are joining forces to overcome these obstacles and make public transit more accessible and inclusive, empowering blind and low-vision riders to travel with more confidence.

“In Azure Maps, we invested significant time and resources to define accessibility requirements, implement capabilities for those with needs, and push ourselves to serve this segment of users,” says Chris Pendleton, Head of Azure Maps at Microsoft Corp. “I’m elated to see Aira, Moovit, and Azure Maps providing services together, further justifying our investments for the benefit of those who need them most.”

Smart City Expo World Congress

In order to connect with cities on their digital transformation journeys, the Azure Maps team, along with Moovit and Aira, will be at Smart City Expo World Congress, the industry-leading event for urbanization, showcasing technologies and partners enabling the digital transformation of smart cities. For updates from SCEWC, follow us on Twitter.

 
Source: Azure

Accelerate IoMT on FHIR with new Microsoft OSS Connector

Microsoft is expanding the ecosystem of FHIR® for developers with a new tool to securely ingest, normalize, and persist Protected Health Information (PHI) from IoMT devices in the cloud.  

Continuing our commitment to removing the barriers to interoperability in healthcare, we are excited to expand our portfolio of Open Source Software (OSS) to support the HL7 FHIR standard (Fast Healthcare Interoperability Resources). The new IoMT FHIR Connector for Azure is available today on GitHub.

The Internet of Medical Things (IoMT) is the subset of IoT devices that capture and transmit patient health data. It represents one of the largest technology revolutions changing the way we deliver healthcare, but IoMT also presents a big challenge for data management.

Data from IoMT devices is often high frequency and high volume, and requires sub-second measurements. Developers have to deal with a range of devices and schemas, from sensors worn on the body to ambient data-capture devices, applications that document patient-reported outcomes, and even devices that only require the patient to be within a few meters of a sensor.

Traditional healthcare providers, innovators, and even pharma and life sciences researchers are ushering in a new era of healthcare that leverages machine learning and analytics from IoMT devices. Most see a future where devices monitoring patients in their daily lives will be used as a standard approach to deliver cost savings, improve patient visibility outside of the physician’s office, and create new insights for patient care. Yet as new IoMT apps and solutions are developed, two consistent barriers are preventing broad scalability of these solutions: interoperability of IoMT device data with the rest of the healthcare data, such as clinical or pharmaceutical records, and the secure and private exchange of protected health information (PHI) from these devices in the cloud.

In the last several years, the provider ecosystem began to embrace the open source standard of FHIR as a solution for interoperability. FHIR is rapidly becoming the preferred standard for exchanging and managing healthcare information in electronic format and has been most successful in the exchange of clinical health records. We wanted to expand the ecosystem and help developers working with IoMT devices to normalize their data output in FHIR. The robust, extensible data model of FHIR standardizes the semantics of healthcare data and defines standards for exchange, so it fuels interoperability across data systems. We imagined a world where data from multiple device inputs and clinical health data sets could be quickly normalized around FHIR and work together in just minutes, without the added cost and engineering work to manage custom configurations and integration with each and every device and app interface. We wanted to deliver foundational technology that developers could trust so they could focus on innovation. And today, we’re releasing the IoMT FHIR Connector for Azure.
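To make the normalization idea concrete, here is a minimal, self-contained sketch (illustrative only, not the connector's actual mapping; the raw payload shape and the helper name are assumptions) of turning a raw device reading into a FHIR R4 Observation resource:

```python
# Illustrative sketch: shows the *shape* of normalizing a raw IoMT reading
# into a FHIR R4 Observation resource. The input payload schema is a made-up
# example; only the FHIR resource structure and the LOINC/UCUM codes are
# standard (LOINC 8867-4 is heart rate, UCUM "/min" is beats per minute).

def to_fhir_observation(reading: dict) -> dict:
    """Map a hypothetical heart-rate reading to a FHIR R4 Observation."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {
            "coding": [{
                "system": "http://loinc.org",
                "code": "8867-4",
                "display": "Heart rate",
            }]
        },
        "subject": {"reference": f"Patient/{reading['patientId']}"},
        "effectiveDateTime": reading["timestamp"],
        "valueQuantity": {
            "value": reading["heartRate"],
            "unit": "beats/minute",
            "system": "http://unitsofmeasure.org",
            "code": "/min",
        },
    }

# Example raw payload as a wearable might emit it (hypothetical schema).
raw = {"patientId": "abc123", "timestamp": "2019-11-13T09:30:00Z", "heartRate": 72}
obs = to_fhir_observation(raw)
print(obs["resourceType"], obs["valueQuantity"]["value"])  # Observation 72
```

Once every device's output is normalized into this common shape, downstream clinical systems can query heart rates without knowing which vendor's device produced them, which is the interoperability payoff the connector is after.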

This OSS release opens an exciting new horizon for healthcare data management. It provides a simple tool that can empower application developers and technical professionals working with data from devices to quickly ingest and transform that data into FHIR. By connecting to the Azure API for FHIR, developers can set up a robust and secure pipeline to manage data from IoMT devices in the cloud.

The IoMT FHIR Connector for Azure enables easy deployment in minutes, so developers can begin managing IoMT data in a FHIR Server that supports the latest R4 version of FHIR:

Rapid provisioning for ingestion of IoMT data and connectivity to a designated FHIR Server for secure, private, and compliant persistence of PHI data in the cloud
Normalization and integrated mapping to transform data to the HL7 FHIR R4 Standard
Seamless connectivity with Azure Stream Analytics to query and refine IoMT data in real-time
Simplified IoMT device management and the ability to scale through Azure IoT services (including Azure IoT Hub or Azure IoT Central)
Secure management of PHI data in the cloud: the IoMT FHIR Connector for Azure has been developed for HIPAA, HITRUST, and GDPR compliance, with full support for the requirements around protected health information (PHI)

To enhance scale and connectivity with common patient-facing platforms that collect device data, we’ve also created a FHIR HealthKit framework that works with the IoMT FHIR Connector. If patients are managing data from multiple devices through the Apple Health application, a developer can use the IoMT FHIR Connector to quickly ingest data from all of the devices through the HealthKit API and export it to their FHIR server.

Playing with FHIR
The Microsoft Health engineering team is fully backing this open source project, but like all open source, we are excited to see it grow and improve based on the community's feedback and contributions. Next week we’ll be joining developers around the world for FHIR Dev Days in Amsterdam to play with the new IoMT FHIR Connector for Azure. Learn more about the architecture of the IoMT FHIR Connector and how to contribute to the project on our GitHub page.

FHIR® is the registered trademark of HL7 and is used with the permission of HL7.
Source: Azure

Forrester names Microsoft a leader in Wave report for Industrial IoT Software Platforms

As a company, we work every day to empower every person on the planet to achieve more. As part of that, we’re committed to investing in IoT and intelligent edge, two technology trends accelerating ubiquitous computing and bringing unparalleled opportunity for transformation across industries. We’ve been working hard to make our Azure IoT platform more open, security-enhanced, and scalable, as well as to create opportunities in new market areas and our growing partner ecosystem. Our core focus is addressing the industry challenge of securing connected devices at every layer and advancing IoT to create a more seamless experience between the physical and digital worlds.

Today, Microsoft is positioned as a Leader in The Forrester Wave™: Industrial IoT Software Platforms, Q4 2019, receiving the highest score possible, 5.00, in the partner strategy, innovation roadmap, and platform differentiation criteria, the highest score in the market presence category, and the second-highest score in the current offering category.

According to the Forrester report, “Microsoft powers industrial partners but also delivers a credible platform of its own. Microsoft continues to add features to the platform at an impressive rate, with the richer edge capabilities of Azure IoT Edge and the simplified application and device onboarding offered by Azure IoT Central formally launching since we last evaluated this market.”

We believe this latest recognition spotlights our commitment and ability to:

Support a comprehensive set of deployment models, from edge to cloud. According to our own IoT Signals research, the decision-makers surveyed believe that in the next two years, AI, edge computing, and 5G will be critical technological drivers for IoT success. And they want tools that can drive success across diverse deployment models.

Deliver business integration that goes beyond connectivity and device management. It’s become increasingly important for businesses to be able to link IoT workflows to data and processes across the operation, and we’re helping customers accelerate time to value.

Turn analytics into actionable intelligence. Industrial firms capture and generate mountains of time-series data in real time. Transforming this data into timely insights is key to making decisions that move the business forward.

We’re committed to making Azure the ideal IoT platform, and this recognition comes at a great point in our journey. Download this complimentary full report and read the analysis behind Microsoft’s positioning as a Leader.

More information on our Azure IoT Industrial platform.

The Forrester Wave™: Industrial IoT Software Platforms, Q4 2019, Michele Pelino and Paul Miller, November 13, 2019. This graphic was published by Forrester Research as part of a larger research document and should be evaluated in the context of the entire document. 

Source: Azure

How to build globally distributed applications with Azure Cosmos DB and Pulumi

This post was co-authored by Mikhail Shilkov, Software Engineer, Pulumi.

Pulumi is reinventing how people build modern cloud applications, with a unique platform that combines deep systems and infrastructure innovation with elegant programming models and developer tools.

We live in amazing times when people and businesses on different continents can interact at the speed of light. Numerous industries and applications target users around the globe: e-commerce websites, multiplayer online games, connected IoT devices, collaborative work and leisure experiences, and many more. All of these applications demand computing and data infrastructure in proximity to the end-customers to minimize latency and keep the user experience engaging. The modern cloud makes these scenarios possible. 

Azure infrastructure

Azure Cosmos DB provides turnkey data distribution to any number of regions, meaning that locations can be added or removed at any time, even while running production workloads. Azure takes care of data replication, resiliency, and efficiency while providing APIs for read and write operations with latencies of less than 10 milliseconds.

In contrast, compute services—virtual machines, container instances, Azure App Services, Azure Functions, and managed Azure Kubernetes Service—are located in a single Azure region. To make good use of the geographic redundancy of the database, users should deploy their application to each of the target regions.

 

Globally distributed application

Application regions must stay in sync with Azure Cosmos DB regions to enjoy low-latency benefits. Operational teams must manage the pool of applications and services to provide the correct locality in addition to auto-scaling configuration, efficient networking, security, and maintainability.

To help manage the complexity, the approach of infrastructure as code comes to the rescue.

Infrastructure as code

While the Azure portal is an excellent pane of glass for all Azure services, it shouldn’t be used directly to provision production applications. Instead, we should strive to describe the infrastructure as a program that can be executed to create all the required cloud resources.

Traditionally, this could be achieved with an automation script, e.g., a PowerShell cmdlet or a bash script calling the Azure CLI. However, this approach is laborious and error-prone. Bringing an environment from its current state to the desired one is often non-trivial, and a failure in the middle of the script often requires manual intervention to repair the environment, leading to downtime.

Desired state configuration is another style of infrastructure definition. A user describes the desired final state of the infrastructure in a declarative manner, and the tooling takes care of bringing the environment from its current state to parity with the desired state. Such a program is easier to evolve, and its changes are easier to track.

Azure Resource Manager templates are the native desired-state-configuration tool in the world of Azure. The state is described as a JSON template listing all the resources and their properties. However, large JSON templates can be quite hard to write manually: they have a steep learning curve and quickly become large, complex, verbose, and repetitive. Developers find themselves missing simple programming-language constructs like iteration or custom functions.

Pulumi solves this problem by using general-purpose programming languages to describe the desired state of cloud infrastructure. Using JavaScript, TypeScript, or Python reduces the amount of code many-fold, while bringing constructs like functions and components to the DevOps toolbox.

Global applications with Pulumi

To illustrate the point, we developed a TypeScript program to provision a distributed application in Azure.

The target scenario requires quite a few steps to distribute the application across multiple Azure regions:

Provision an Azure Cosmos DB account in multiple regions
Deploy a copy of the application layer to each of those regions
Connect each application to the Azure Cosmos DB local replica
Add a Traffic Manager to route user requests to the nearest application endpoint
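The Traffic Manager step conceptually picks the lowest-latency endpoint for each user. Here is a toy sketch of that routing decision; Traffic Manager actually makes this choice at the DNS level using its own latency measurements, and the endpoint names and numbers below are made up:

```python
# Toy illustration of performance-based routing: given latency measurements
# from one user's location to each regional deployment, pick the fastest
# endpoint. This mimics what Traffic Manager's "Performance" routing method
# achieves, not how it is implemented.

def nearest_endpoint(latencies_ms: dict) -> str:
    """Return the endpoint with the lowest measured latency for this user."""
    return min(latencies_ms, key=latencies_ms.get)

# Hypothetical measurements for a user in Europe.
measured = {
    "app-westus.azurewebsites.net": 140,
    "app-westeurope.azurewebsites.net": 25,
    "app-southeastasia.azurewebsites.net": 210,
}
print(nearest_endpoint(measured))  # app-westeurope.azurewebsites.net
```

Because each regional application talks to the Azure Cosmos DB replica in its own region, routing the user to the nearest application endpoint keeps the whole request path local and low-latency.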

Global application with Azure and Pulumi

 

However, instead of coding this manually, we can rely on Pulumi's CosmosApp component as described in How To Build Globally Distributed Applications with Azure Cosmos DB and Pulumi. The component creates distributed Azure Cosmos DB resources, as well as the front-end routing component while allowing pluggable compute layer implementation.

You can find the sample code in Reusable Component to Create Globally-distributed Applications with Azure Cosmos DB.

The Pulumi CLI executes the code, translates it into a tree of resources to create, and deploys them all to Azure:

After the command succeeds, the application is up and running in three regions of my choice.

Next steps

Infrastructure as code is instrumental in enabling modern DevOps practices in the universe of global and scalable cloud applications.

Pulumi lets you use a general-purpose programming language to define infrastructure. It brings the best tools and practices from the software development world to the domain of infrastructure management.

Try the CosmosApp (available on GitHub—TypeScript, C#) with serverless functions, containers, or virtual machines to get started with Pulumi and Azure.
Source: Azure

Democratizing Smart City solutions with Azure IoT Central

One of the most dynamic landscapes embracing the Internet of Things (IoT) is the modern city. As urbanization grows, city leaders are under increasing pressure to make cities safer, more accessible, sustainable, and prosperous.

Underlying all these important goals is the bedrock that makes a city run: infrastructure. Whether it be water, electricity, streets, or traffic lights, cities are increasingly using the Internet of Things (IoT) to manage their infrastructure by capturing and analyzing data from connected devices and sensors. This gives city managers real-time insights to improve operational efficiency and outcomes and to altogether rethink and reinvent city government functions and operations.

Microsoft and its ecosystem of service and hardware providers are deeply engaged with cities and communities around the world, addressing the most pressing issues that government leaders face. For instance, traffic congestion continues to increase in most urban areas, placing growing pressure on existing physical infrastructure. In the emerging world, new physical infrastructure needs to be built altogether. Citizens also have growing concerns about public safety and security. Investments in IoT-based solutions for city operations are accelerating to address these concerns, led by applications like smart street lighting, smart waste, and smart parking. Cities are also realizing the benefit of IoT for optimizing the management of globally scarce resources, such as water and energy. Amidst this growing investment, early results from the world's leading smart cities are promising. Some cities have seen approximately 60 percent energy savings from leveraging LED-based smart streetlights, while others have been able to save 25-80 liters of water per person per day. Optimized traffic flow in some areas is helping commuters shave 15-30 minutes off their daily commutes, resulting in a 10-15 percent reduction in emissions, while smart waste management has delivered a 66 percent reduction in operational costs.

Despite a growing consensus around the benefits of adopting IoT solutions, scaling beyond the proof of concept remains difficult. Most smart city solutions today consist of bespoke pilots, unable to scale or repeat due to growing costs, complexity, and lack of specialized technical talent, in a market landscape that is already incredibly fragmented. Earlier this year we surveyed 3,000 enterprise decision-makers across the world, including government organizations, of whom 83 percent consider IoT “critical” to success, notably for public safety and for infrastructure and facilities management. At the same time, the vast majority of the decision-makers expressed concerns about persistent knowledge gaps for how to scale their solutions securely, reliably, and affordably, which is the main reason why the average maturity of production-level IoT projects remains extremely low (read the full IoT Signals report). To help IoT solution builders navigate the complexity of designing enterprise-grade IoT systems, we published our learnings in a whitepaper called “The 8 attributes of successful IoT solutions,” which helps solution builders ask the right questions up front as they design their systems and select the right technology platforms.

Building Smart Cities IoT solutions with Azure IoT Central

To further help IoT solution builders confidently scale their projects, we recently announced updates to Azure IoT Central, our IoT app platform for designing, deploying, and managing enterprise-grade solutions. IoT Central provides a fully managed platform for building and customizing solutions, designed to support solution builders with each of the attributes of successful IoT systems, including security, disaster recovery, high availability, and more. By removing the complexity and overhead of setup, management, and operations, IoT Central is lowering the barrier for IoT solution builders and accelerating the creation of innovative solutions across industries, from retail to healthcare to energy to government. Check out our recent IoT Central blog for a full list of our updates and examples of solution builders across different industries.

As part of our mission to democratize IoT for all, we released an initial set of Azure IoT Central government app templates to help solution builders start building IoT solutions quickly with out-of-box device command and control, monitoring and alerting, a user interface with built-in permissions, configurable dashboards, and extensibility APIs. Solution builders can brand, customize, and easily connect their solutions to their line of business applications, such as Dynamics 365 for integrated field service, Azure ML services, or their third-party services of choice.

Developers can get started today with any of the government app templates for free and access starter resources, including sample operator dashboards, simulated devices, pre-configured rules, and alerting to explore what is possible. We’ve also provided guidance for customizing and extending solutions with documentation, tutorials, and how-to’s. Ultimately you can brand and sell your finished solution to your customers, either directly or through Microsoft AppSource.

Government app templates available today:

Connected waste management: Sensors deployed in garbage containers in cities can report how full a trash bin is and help optimize waste collection routes. Moreover, advanced smart waste applications use analytics to detect bin contamination.

Water quality monitoring: Traditional water quality monitoring relies on manual sampling techniques and field laboratory analysis, which is both time-consuming and costly. By remotely monitoring water quality in real time, water quality issues can be managed before citizens are impacted.

Water consumption monitoring: Traditional water consumption tracking relies on water operators manually reading water meters across various sites. More and more cities are replacing traditional meters with advanced smart meters, enabling remote monitoring of consumption as well as remote control of valves to manage water flow. Water consumption monitoring coupled with information and insights flowing back to individual households can increase awareness and reduce water consumption.

Expect to see more app templates for solution builders over time to cover other smart city scenarios, with templates for smart streetlights, air quality monitoring, smart parking, and more.

Innovative smart cities solution partners using Azure IoT Central

From established leading research organizations to enterprises to public utilities, we are seeing solution builders leverage Azure IoT Central to transform their public sector services.

Smart water infrastructure

Dutch company Oasen supplies 48 billion liters of high-quality drinking water every year to 750,000 residents across municipalities in the South Holland region. Oasen turned to Microsoft and OrangeNXT to digitally transform its water infrastructure. Using Azure IoT Central, the company is introducing scalability, flexibility, and greater innovation to its operations through remote management of its water distribution network. Leveraging Azure Digital Twins and Azure IoT Central, Oasen connects multiple sources of data (including data extracted from smart water meters and smart valves in pipelines) to create a true digital twin of the water grid.

By remotely controlling and monitoring valves, Oasen can now automatically test grid sections (step-testing) to radically improve grid quality, as well as predict burst water mains and assess which pipelines are most at risk of damage and need repair. These smart water shutters and smart meter implementations significantly reduce manual work. Furthermore, the smart grid solution allows the automatic shutdown of sections of the distribution network if a leak is detected, preventing damage, and reducing water quality hazards.

Water quality monitoring

Other solution builders have built solutions for water quality management. According to the World Health Organization, nearly one-fourth of people across the globe drink water contaminated with feces, and an estimated 50 percent of the global population is projected to live in water-stressed areas by 2025 (either in close proximity to polluted or otherwise scarce water sources). There has never been a greater need for high-quality data from liquid sensor networks to track ion levels in the water, which can fluctuate dramatically within the scope of several hundred meters and can have devastating impacts on public health. Imec, a leading international research and development firm specializing in nanoelectronics and digital technology, has developed water sensor devices from inexpensive ion sensors on silicon substrates for monitoring water quality in real time.

Imec, together with partners, will pilot this solution in a testbed of about 2,500 sensors installed across the Flanders region in Belgium. The sensors detect salinity in the water in real-time, allowing officials to track water quality fluctuations over time. Imec’s water quality monitoring solution was built on Azure IoT Central, which provides the flexible foundation required to design, test, and scale the solution across the city.

“IoT Central is a fast and easy-to-use platform, suitable for an innovative R&D organization such as ours. This means we can dedicate ourselves to enabling large, fine-grained networks of water quality sensors and, through the collected data, improving visibility into water quality and enabling better water management.” —Marcel Zevenbergen, Program Manager, Imec

Smart street lighting

Combined with LED conversion, smart street lighting solutions have helped uncover massive efficiency opportunities for cities, with operational savings typically reaching over 65 percent. Telensa is a world leader in connected streetlight solutions, managing over 1.7 million poles in 400 cities around the world. Telensa PLANet is an end-to-end smart street lighting system consisting of wireless nodes that connect individual lights to a dedicated network and a central management application. The system helps cities reduce energy and maintenance costs while improving the efficiency of maintenance through automatic fault reporting, and it turns streetlight poles into hubs for other smart city sensors, such as those for air quality and traffic monitoring. Since no two cities are the same, Telensa has developed its Urban IQ solution to enable cities to add any third-party sensors to their connected street lighting, make the insights available across city departments, and provide sophisticated real-time visualization out of the box. Telensa built Urban IQ with Azure IoT Central to fit with current systems and to be ready for future directions. By moving device management and connectivity functions to IoT Central, dramatically lowering the cost of adding other sense-and-control apps to its Azure data fabric, Telensa can focus on enhancing smart city functionality and adding value for its customers.

Connecting the dots for smarter cities

With solutions that take full advantage of the intelligent cloud and intelligent edge, we continue to demonstrate how cloud, IoT, and artificial intelligence (AI) have the power to drastically transform and enhance cities to be more sustainable, enjoyable, and inclusive. Azure IoT continues to accelerate results with a growing and diverse set of partners creating solutions relevant to smart cities, from spatially aware solutions that provide real-world context, to smart grids of the future, to urban mobility and spatial intelligence. Together, we can build more intelligent and connected cities that empower people and organizations to achieve more.

Get started today with Azure IoT Central.

Smart City Expo World Congress

Microsoft will be at Smart City Expo World Congress, the industry-leading event for urbanization, to connect smart city technologies and partners with cities on a digital transformation journey. Visit our booth at Gran Via, Hall P2, Stand B223 and learn more about our conference presence at SCEWC 2019. We also encourage you to meet with us at the following sessions:

Congress | Solutions Talk: Keys to Achieving Digital Transformation in Government – Wednesday, November 20 at 10:30 AM
Microsoft Booth 33 – Learning Hub: Talking cities: from smart streetlights to smart water to smart traffic – Wednesday, November 20 at 3:00 PM
Microsoft Booth 33 – Learning Hub: Mobility insights for Smart City AI – Tuesday, November 19 at 12:00 PM

Source: Azure

Azure Container Registry: Preview of diagnostic and audit logs

The Azure Container Registry team is happy to announce the preview of audit logs – one of our top items on UserVoice. In this release, we have new Azure portal and command-line interface (CLI) experiences to enable resource logs for diagnostic and audit evaluation of your registry logs.

This feature lets you monitor your container registry by providing an audit trail of all relevant user-driven activities on the registry. These logs contain information related to authentication, login details, repository-level activities, and other user-driven events. In addition to these logs, Azure also provides a generic activity log, which maintains a range of Azure Resource Manager information, including service health and other Azure management operations on the registry.

You can turn on resource logs for your container registry to help meet compliance and diagnostic needs such as:

Security- and compliance-related tracking.
Diagnosing operational issues related to registry activities, such as pull and push events.

Resource logs are not turned on by default, however, so collecting them requires some additional steps. Figure 1 shows how to configure diagnostic settings to enable Log Analytics. The logs can be viewed in Azure Monitor, but they must first be collected into a Log Analytics workspace.

Figure 1

The documentation has detailed steps for setting up a Log Analytics workspace to collect the logs and for viewing the registry logs in Azure Monitor.
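If you prefer the command line, the same diagnostic settings can be sketched with the Azure CLI. This is a sketch only; the registry, resource group, and workspace names below are placeholders.

```shell
# Sketch: enable ACR resource logs with the Azure CLI (names are placeholders).
# Look up the resource IDs of the registry and the Log Analytics workspace.
REGISTRY_ID=$(az acr show --name myregistry --query id --output tsv)
WORKSPACE_ID=$(az monitor log-analytics workspace show \
  --resource-group myresourcegroup --workspace-name myworkspace \
  --query id --output tsv)

# Route the registry's login and repository events to the workspace.
az monitor diagnostic-settings create \
  --name acr-audit-logs \
  --resource "$REGISTRY_ID" \
  --workspace "$WORKSPACE_ID" \
  --logs '[{"category": "ContainerRegistryLoginEvents", "enabled": true},
           {"category": "ContainerRegistryRepositoryEvents", "enabled": true}]'
```

The commands require an authenticated Azure CLI session with access to both the registry and the workspace.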

Azure Monitor is the consistent means to view and visualize your resource logs in Azure. Once log collection has been set up in Log Analytics, you can view the log data by running queries. Figure 2 shows an example of running one of the sample queries.

Figure 2
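As an illustration, a query along the following lines counts pull and push operations per repository over the past day. The table and column names below follow the ACR resource log schema; adjust them to your own logs as needed.

```kusto
ContainerRegistryRepositoryEvents
| where TimeGenerated > ago(1d)
| where OperationName in ("Pull", "Push")
| summarize OperationCount = count() by Repository, OperationName
| order by OperationCount desc
```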

The current release is a preview; in the future, we will provide logs for other registry events such as Delete, Untag, Replication, and more. Please continue to provide your feedback to help us prioritize these feature requests.

Availability and feedback

Push, Pull, and Login event logs are currently available, with delete and untag event logs to follow shortly. As always, we'd love to hear your feedback on existing features as well as ideas for the product roadmap.

Here’s a list of resources you can use to engage with our team and provide feedback:

Roadmap – For visibility into our planned work.
UserVoice – To vote for existing requests or create a new request.
Issues – To view existing bugs and issues, or to log new ones.
Azure Container Registry documents – For Container Registry tutorials and documentation.

Source: Azure

Improving observability of your Kubernetes deployments with Azure Monitor for containers

Over the past few years, we’ve seen significant changes in how an application is thought of and developed, especially with the adoption of containers and the move from traditional monolithic applications to microservices applications. This shift also affects how we think about modern application monitoring, now with greater adoption of open source technologies and the introduction of observability concepts.

In the past, vendors owned both the application and the infrastructure, and as a result they knew what metrics to monitor. With open source products growing in number, vendors no longer own all the metrics, and custom metrics have become essential in modern monitoring tools. Unlike a monolithic application, which is a single deployment unit with a simple healthy-or-not status, a modern application may consist of dozens of different microservices in many intermediate states. This is due to sophisticated deployment strategies and rollbacks, where customers may be running different versions of the same service in production, especially on Kubernetes. Embracing these shifts is thus essential to monitoring.

Custom metrics and open source technologies help improve the observability of specific components of your application, but you also need to monitor the full stack. Azure Monitor for containers embraces both observability through live data and collecting custom metrics using Prometheus, providing the full stack end-to-end monitoring from nodes to Kubernetes infrastructure to workloads.

Collecting Prometheus metrics and viewing using Grafana dashboards

By instrumenting the Prometheus SDK into your workloads, Azure Monitor for containers can scrape the metrics exposed from Prometheus endpoints so you can quickly gather failure rates, requests per second, and latency. You can also use Prometheus to collect some of the Kubernetes infrastructure metrics that are not provided out of the box by Azure Monitor by configuring the containerized agent.
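The containerized agent is configured through a ConfigMap in the kube-system namespace. A minimal sketch follows; the scrape interval and URL below are illustrative, and the full schema is described in the Azure Monitor documentation.

```yaml
# Sketch of the containerized agent's ConfigMap for Prometheus scraping.
# The settings shown are illustrative; consult the Azure Monitor docs for the full schema.
apiVersion: v1
kind: ConfigMap
metadata:
  name: container-azm-ms-agentconfig
  namespace: kube-system
data:
  prometheus-data-collection-settings: |-
    [prometheus_data_collection_settings.cluster]
      interval = "1m"
      # Scrape pods annotated with prometheus.io/scrape: "true".
      monitor_kubernetes_pods = true
      # Or scrape fixed endpoints exposed by your workloads.
      urls = ["http://myservice.mynamespace:9090/metrics"]
```

Applying the ConfigMap with kubectl restarts the agent pods, which then begin scraping the configured targets.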

From Log Analytics, you can easily run a Kusto Query Language (KQL) query and create your custom dashboard in the Azure portal dashboard. For many customers using Grafana to support their dashboard requirements, you can visualize the container and Prometheus metrics in a Grafana dashboard.

Below is an example of a dashboard that provides an end-to-end Azure Kubernetes Service (AKS) cluster overview, node performances, Kubernetes infrastructure, and workloads.

If you would like to monitor or troubleshoot other scenarios, such as viewing the live sites of all workloads or diagnosing noisy-neighbor issues on a worker node, you can always switch from the Grafana dashboard to Azure Monitor for containers by clicking Azure Monitor – Container Insights in the top right-hand corner.

Azure Monitor for containers provides live, real-time container logs and Kubernetes event logs to deliver the observability seen above. You can see your deployments immediately and observe any anomalies using the live data.

If you are interested in trying Azure Monitor for containers, please check the documentation. Once you have enabled the monitoring, and if you would like to try the Grafana template, please go to the Grafana gallery. This template will light up using the out-of-the-box data collected from Azure Monitor for containers. If you want to add more charts to view other metrics collected, you can do so by checking our documentation.

Prometheus data collection and Grafana are also supported for AKS Engine.

For any feedback or suggestions, please reach out to us through Azure Community Support or Stack Overflow.
Source: Azure

Save more on Azure usage—Announcing reservations for six more services

With reserved capacity, you get significant discounts over your on-demand costs by committing to long-term usage of a service. We are pleased to share reserved capacity offerings for the following additional services. With the addition of these services, we now support reservations for 16 services, giving you more options to save and get better cost predictability across more workloads.

Blob Storage (GPv2) and Azure Data Lake Storage (Gen2).
Azure Database for MySQL.
Azure Database for PostgreSQL.
Azure Database for MariaDB.
Azure Data Explorer.
Premium SSD Managed Disks.

Blob Storage (GPv2) and Azure Data Lake Storage (Gen2)

Save up to 38 percent on your Azure data storage costs by pre-purchasing reserved capacity for one or three years. Reserved capacity can be pre-purchased in units of 100 TB and 1 PB, and is available for the hot, cool, and archive storage tiers for all applicable storage redundancies. You can also choose an upfront or monthly payment option, depending on your cash flow requirements.

The reservation discount will automatically apply to data stored on Azure Blob (GPv2) and Azure Data Lake Storage (Gen2). Discounts are applied hourly on the total data stored in that hour. Unused reserved capacity doesn’t carry over.

Storage reservations are flexible, which means you can exchange or cancel your reservation should your storage requirements change in the future (limits apply).

Purchase reserved capacity from Azure portal, or read the documentation.

Azure Database for MySQL, PostgreSQL, and MariaDB

Save up to 51 percent on your Azure Database costs for MySQL, PostgreSQL, and MariaDB by pre-purchasing reserved capacity. Reservation discount applies to the compute usage for these products and is available for both general-purpose and memory-optimized deployments. You can choose to pay monthly for the reservations.

As with all reservations, reservation discounts will automatically apply to the matching database deployments, so you don't need to make any changes to your resources to get reservation discounts. The discount applies hourly on the compute usage. Unused reserved hours don't carry over.

You can exchange your reservations to move from general-purpose to memory-optimized, or vice-versa, any time after purchase. You can also cancel the reservation to receive a prorated amount back (limits apply).

Purchase reserved capacity from Azure portal, or read the documentation.

Azure Data Explorer Markup reserved capacity

Save up to 30 percent on your Azure Data Explorer markup costs with reserved capacity. The reservation discount applies only to the markup meter; other charges, including compute and storage, are billed separately. You can also purchase reservations for virtual machines (VM) and storage to save even more on your total cost of ownership for Azure Data Explorer (Kusto) clusters. You can choose to pay monthly for the Azure Data Explorer markup reservations.

After purchase, the reservation discount will automatically apply to the matching cluster. The discount applies hourly on the markup usage. Unused reserved hours don't carry over. As usual, you can exchange or cancel the reservation should your needs change (limits apply).

Purchase reserved capacity from Azure portal, or read the documentation.

Premium SSD Managed Disks

Save up to 5 percent on your Premium SSD managed disk usage with reserved capacity. Discounts are applied hourly to the disks deployed in that hour, regardless of whether the disks are attached to a VM. Unused reserved hours don't carry over. The reservation discount does not apply to Premium SSD unmanaged disks or page blob consumption.

Disk reservations are flexible, which means you can exchange or cancel your reservation should your storage requirements change in the future (limits apply).

Purchase reserved capacity from Azure portal, or read the documentation.
Source: Azure

GitHub Actions for Azure is now generally available

GitHub Actions make it possible to create simple yet powerful workflows to automate software compilation and delivery integrated with GitHub. These actions, defined in YAML files, allow you to trigger an automated workflow process on any GitHub event, such as code commits, creation of Pull Requests or new GitHub Releases, and more.

As GitHub just announced the public availability of their Actions feature today, we’re announcing that the GitHub Actions for Azure are now generally available.

You can find all the GitHub Actions for Azure and their repositories listed on GitHub with documentation and sample templates to help you easily create workflows to build, test, package, release and deploy to Azure, following a push or pull request.

You can also use Azure starter templates to easily create GitHub CI/CD workflows targeting Azure to deploy your apps created with popular languages and frameworks including .NET, Node.js, Java, PHP, Ruby, or Python, in containers or running on any operating system.

Connect to Azure

Authenticate your Azure subscription using the Azure login (azure/login) action and a service principal. You can then run Azure CLI scripts to create and manage any Azure resource using the Azure CLI (azure/cli) action, which sets up the GitHub Action runner environment with the latest (or any user-specified) version of the Azure CLI.
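A minimal workflow along these lines shows the two actions together. The secret name AZURE_CREDENTIALS is an assumption; it would hold the service principal's JSON credentials.

```yaml
# Sketch: log in with a service principal, then run an Azure CLI script.
# AZURE_CREDENTIALS is an assumed repository secret holding the service principal JSON.
name: azure-cli-example
on: [push]
jobs:
  run-cli:
    runs-on: ubuntu-latest
    steps:
      - uses: azure/login@v1
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}
      - uses: azure/cli@v1
        with:
          inlineScript: az group list --output table
```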

Deploy a Web app

Azure App Service is a managed platform for deploying and scaling web applications. You can easily deploy your web app to Azure App Service with the Azure WebApp (azure/webapps-deploy) and Azure Web App for Containers (azure/webapps-container-deploy) actions. You can also configure app settings and connection strings using the Azure App Service Settings (azure/appservice-settings) action.

Learn more about Azure App Service.

Deploy a serverless Function app

Streamline the deployment of your serverless applications to Azure Functions, an event-driven serverless compute platform, by bringing either your code using the Azure Functions action (azure/functions-action) or your custom container image using the Azure Functions for containers action (azure/functions-container-action).

Learn more about Azure Functions.

Build and Deploy containerized Apps

For containerized apps (single- or multi-containers) use the Docker Login action (azure/docker-login) to create a complete workflow to build container images, push to a container registry (Docker Hub or Azure Container Registry), and then deploy the images to an Azure web app, Azure Function for Containers, or to Kubernetes.
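A build-and-push job might be sketched as follows. The registry name and the REGISTRY_USERNAME/REGISTRY_PASSWORD secrets are placeholders.

```yaml
# Sketch: build an image and push it to a container registry (ACR here).
# Registry name and secrets are placeholders.
name: build-and-push
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - uses: azure/docker-login@v1
        with:
          login-server: myregistry.azurecr.io
          username: ${{ secrets.REGISTRY_USERNAME }}
          password: ${{ secrets.REGISTRY_PASSWORD }}
      - run: |
          docker build . -t myregistry.azurecr.io/myapp:${{ github.sha }}
          docker push myregistry.azurecr.io/myapp:${{ github.sha }}
```

A deployment step (for example, azure/webapps-container-deploy) can then reference the pushed image tag.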

Deploy to Kubernetes

We have released multiple actions to help you connect to a Kubernetes cluster running on-premises or on any cloud (including Azure Kubernetes Service), bake and deploy manifests, substitute artifacts, check rollout status, and handle secrets within the cluster.

Kubectl tool installer (azure/setup-kubectl): Installs a specific version of kubectl on the runner.
Kubernetes set context (azure/k8s-set-context): Sets the target Kubernetes cluster context to be used by other actions, or to run any kubectl commands.
AKS set context (azure/aks-set-context): Sets the target Azure Kubernetes Service cluster context.
Kubernetes create secret (azure/k8s-create-secret): Creates a generic secret or docker-registry secret in the Kubernetes cluster.
Kubernetes deploy (azure/k8s-deploy): Deploys manifests to Kubernetes clusters.
Setup Helm (azure/setup-helm): Installs a specific version of the Helm binary on the runner.
Kubernetes bake (azure/k8s-bake): Bakes manifest files to be used for deployments using Helm 2, kustomize, or Kompose.

To deploy to a cluster on Azure Kubernetes Service (AKS), you can use azure/aks-set-context to communicate with the AKS cluster, azure/k8s-create-secret to create an image pull secret, and finally azure/k8s-deploy to deploy the manifest files.
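That sequence can be sketched as workflow steps. The cluster, registry, secret names, and manifest path below are placeholders.

```yaml
# Sketch of the AKS deployment sequence; names and secrets are placeholders.
steps:
  - uses: actions/checkout@v1
  - uses: azure/aks-set-context@v1
    with:
      creds: ${{ secrets.AZURE_CREDENTIALS }}
      cluster-name: my-aks-cluster
      resource-group: my-resource-group
  - uses: azure/k8s-create-secret@v1
    with:
      container-registry-url: myregistry.azurecr.io
      container-registry-username: ${{ secrets.REGISTRY_USERNAME }}
      container-registry-password: ${{ secrets.REGISTRY_PASSWORD }}
      secret-name: registry-pull-secret
  - uses: azure/k8s-deploy@v1
    with:
      manifests: manifests/deployment.yml
      images: myregistry.azurecr.io/myapp:${{ github.sha }}
      imagepullsecrets: registry-pull-secret
```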

Deploy to Azure SQL or MySQL databases

We now have an action for Azure SQL Databases (azure/sql-action) that uses a connection string for authentication and DACPAC/SQL scripts to deploy to your Azure SQL Database.

If you would like to deploy to an Azure Database for MySQL database using MySQL scripts, use the MySQL action (azure/mysql-action) instead.
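A deployment step for Azure SQL might be sketched like this. The server name, secret name, and DACPAC path are placeholders.

```yaml
# Sketch: deploy a DACPAC to an Azure SQL Database.
# Server, secret, and file names are placeholders.
steps:
  - uses: actions/checkout@v1
  - uses: azure/sql-action@v1
    with:
      server-name: myserver.database.windows.net
      connection-string: ${{ secrets.AZURE_SQL_CONNECTION_STRING }}
      dacpac-package: ./mydb.dacpac
```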

Trigger a run in Azure Pipelines

GitHub Actions make it easy to build, test, and deploy your code right from GitHub, but you can also use them to trigger external CI/CD tools and services, including Azure Pipelines. If your workflow requires an Azure Pipelines run for deployment to a specific Azure Pipelines environment, for example, the Azure Pipelines (azure/pipelines) action will enable you to trigger that run as part of your Actions workflow.

Utility Actions

Finally, we also released an action for variable substitution, Microsoft/variable-substitution, which enables you to parameterize the values in JSON, XML, or YAML files (including configuration files, manifests, and more) within a GitHub Actions workflow.

More coming soon

We will continue improving upon our available set of GitHub Actions, and will release new ones to cover more Azure services.

Please try out the GitHub Actions for Azure and share your feedback via Twitter on @Azure. If you encounter a problem, please open an issue on the GitHub repository for the specific action.
Source: Azure

Azure Container Registry: preview of repository-scoped permissions

The Azure Container Registry (ACR) team is rolling out the preview of repository scoped role-based access control (RBAC) permissions, our top-voted item on UserVoice. In this release, we have a command-line interface (CLI) experience for you to try and provide feedback.

ACR already supports several authentication options using identities that have role-based access to an entire registry. However, for multi-team scenarios, you might want to consolidate multiple teams into a single registry, limiting each team’s access to their specific repositories. Repository scoped RBAC now enables this functionality.

Here are some of the scenarios where repository scoped permissions might come in handy:

Limit repository access to specific user groups within your organization. For example, provide write access to developers who build images that target specific repositories, and read access to teams that deploy from those repositories.

Provide millions of IoT devices with individual access to pull images from specific repositories.

Provide an external organization with permissions to specific repositories.

In this release, we have introduced tokens as a mechanism to implement repository-scoped RBAC permissions. A token is a credential used to authenticate with the registry. It can be backed by a username and password or by Azure Active Directory (AAD) objects such as users, service principals, and managed identities. In this release, tokens are backed by a username and password; future releases will support tokens backed by AAD objects. See Figure 1.

*Support for Azure Active Directory (AAD)-backed tokens will be available in a future release.

Figure 1

Figure 2 below describes the relationship between tokens and scope-maps.

A token is a credential used to authenticate with the registry. It has a permitted set of actions which are scoped to one or more repositories. Once you have generated a token, you can use it to authenticate with your registry. You can do a docker login using the following command:

echo $TOKEN_PASSWORD | docker login --username mytoken --password-stdin myregistry.azurecr.io

A scope map is a registry object that groups repository permissions you apply to a token. It provides a graph of access to one or more repositories. You can apply scoped repository permissions to a token or reapply them to other tokens. If you don't apply a scope map when creating a token, a scope map is automatically created for you, to save the permission settings.

A scope map helps you configure multiple users with identical access to a set of repositories.
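As an illustration, a scope map and token can be created with the preview Azure CLI commands along these lines. The registry, repository, scope map, and token names below are placeholders.

```shell
# Sketch: create a scope map granting pull-only access, then a token bound to it.
# Registry, repository, scope-map, and token names are placeholders.
az acr scope-map create \
  --name pull-hello-world \
  --registry myregistry \
  --repository samples/hello-world content/read \
  --description "Pull-only access to samples/hello-world"

az acr token create \
  --name mytoken \
  --registry myregistry \
  --scope-map pull-hello-world
```

The token-create command returns the generated passwords, which can then be used with docker login as shown above.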

Figure 2

As customers use containers and other artifacts for their IoT deployments, the number of devices can grow into the millions. To support the scale of IoT, Azure Container Registry has implemented repository-based RBAC using tokens (Figure 3). Tokens are not a replacement for service principals or managed identities; they are an additional option that provides scalability for IoT deployment scenarios.

This article shows how to create a token with permissions restricted to a specific repository within a registry. With the introduction of token-based repository permissions, you can now provide users or services with scoped and time-limited access to repositories without requiring an Azure Active Directory identity. In the future, we will support tokens backed by Azure Active Directory objects. Check out this new feature and let us know your feedback on GitHub.

Figure 3

Availability and feedback

The Azure CLI experience is now in preview. As always, we'd love to hear your feedback on existing features as well as ideas for our product roadmap.

Roadmap: For visibility into our planned work.

UserVoice: To vote for existing requests or create a new request.

Issues: To view existing bugs and issues, or log new ones.

ACR documents: For ACR tutorials and documentation.
Source: Azure