How smart buildings can help combat climate change

The Internet of Things (IoT) is changing the way governments and organizations tackle some of humanity’s thorniest challenges. In this three-part series, we first looked at common issues leaders must address to drive digital transformation in their cities. Now we’re focusing on major, exciting applications of IoT: this post covers combating climate change with technology, and a companion piece highlights advances in disaster management. 

With rising temperatures, extreme weather events, and other environmental impacts, signs of climate change are on the rise. Scientists are now warning that global temperatures are accelerating past the goal of the 2015 Paris Agreement (an increase of no more than 1.5°C to 2°C) and are expected to climb an average of 3.2°C globally by 2100 if left unchecked. In addition, significant demographic shifts are driving parallel impacts. The world’s population is anticipated to soar from 7.6 billion to 9.8 billion by 2050, with 70 percent living in urban areas.

Fast-paced urbanization will require large cities to maintain an uninterrupted supply of energy to power food and water production, transportation, residential and commercial life, and health and human services, all of which occur in and around buildings. In fact, buildings and construction currently account for 36 percent of global final energy use and 39 percent of energy-related carbon dioxide emissions when upstream power generation is considered.

Smart buildings are a faster fix to reduce energy use 

Fast-paced urbanization offers an exciting opportunity to immediately reduce climate impacts. Because buildings—office complexes, multifamily housing, hotels, stores, schools, hospitals, and malls, among others—comprise a big part of city infrastructure, making them smarter can dramatically lower the energy and carbon footprint of a city. Thus, reducing climate change can happen one building at a time, by updating older buildings and by building technology into new construction. We don’t have to wait for shifts in public policy or huge budget allocations to make an impact, individually and collectively.

Connected building technology can manage lighting, heating, and cooling, reducing unnecessary use while maximizing usability and comfort. In addition, smart building software can schedule preventive maintenance, automatically identify and prioritize issues for resolution by cost and impact, and continually optimize buildings for comfort and energy efficiency. Despite these benefits, building owners have historically been hesitant to engage in these solutions, fearing a prolonged, costly, “strip to the studs” retrofit and remodel. However, changes can start as simply as using technology to manage and optimize HVAC usage.

Cutting energy usage at Microsoft headquarters by one-fifth

When Microsoft’s building team analyzed the company’s own portfolio, it was concerned that massive CapEx investments would be needed. But the team discovered another way—a software retrofit instead of a physical retrofit. The company worked with ICONICS to develop an “analytical blanket” that enables 30,000 sensor-connected pieces of equipment and diverse building management systems across 125 buildings to talk to each other. It also provides building managers with the analytics, machine learning, and online dashboards to drive optimization programs. Imagine being able to see at a glance how all your buildings consume energy and other resources, and share that information with decision makers to shape priorities. With data visualization, courtesy of online dashboards, that is possible.

“The net result of the 88 Acres project was that it led to a really hyper-scalable project that could be deployed across buildings and cities,” says Daniel Lee, Senior Program Manager for digital transformation at Microsoft. “Microsoft has expanded its smart building program to more than 300 buildings across several campuses. With other efficiencies, we were able to reduce power usage from 59 megawatts of power, which is a small power plant, to 43 megawatts. That’s a 22 percent power reduction which saves millions of dollars annually.”

Companies like Microsoft are realizing that smart buildings don’t just make good environmental sense. They offer a compelling business rationale, reaping ROI in one to two years and then delivering significant cost savings for years to come.

Taking smart buildings to the next level

All of these gains are impressive, and they can be pushed even further with new technology. In addition to reviewing online dashboards, building managers and facilities crews can now use analytics and modeling to build smarter spaces with Microsoft Azure Digital Twins. A spatial intelligence solution, Azure Digital Twins helps users understand relationships and interactions between people, places, and devices; identify needs and issues for addressing; and connect them to automated ticketing systems such as Dynamics 365 for Field Service.

“Azure Digital Twins enables you to take assets, like an air handler, a filter, a pump, or a motor, and describe them in the platform,” says Russ Agrusa, President and CEO of ICONICS, which delivers a SaaS-based, automated smart-building solution. “Meanwhile, you have IoT sensors—flow, pressure, temperature—this data is all flowing into the Azure Digital Twins service. While there, you can do a lot of smart-system work, such as visualizing and analyzing data, scheduling preventive maintenance, and empowering connected field workers.”

For example, sensors, Azure Digital Twins, and ICONICS software can be used to identify issues such as open windows that should be closed, heating and cooling systems that are running simultaneously, or excessive heating and cooling on weekends and holidays. This technology also can be used to adjust lighting, temperature, and technology to real needs—such as a meeting in 10 minutes—reducing unnecessary use without sacrificing comfort. 
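To make the kind of rule illustrated above concrete, here is a small Python sketch that flags two of the waste patterns mentioned: heating and cooling running simultaneously, and HVAC running next to an open window. The reading format and field names are hypothetical, not the actual ICONICS or Azure Digital Twins schema.

```python
# Illustrative sketch only: simple rules that flag obvious energy waste
# in zone-level sensor readings. Field names are hypothetical.
from dataclasses import dataclass

@dataclass
class ZoneReading:
    zone: str
    heating_on: bool
    cooling_on: bool
    window_open: bool

def find_waste(readings):
    """Return a list of (zone, issue) pairs describing energy waste."""
    issues = []
    for r in readings:
        if r.heating_on and r.cooling_on:
            issues.append((r.zone, "heating and cooling running simultaneously"))
        if r.window_open and (r.heating_on or r.cooling_on):
            issues.append((r.zone, "open window while HVAC is running"))
    return issues

readings = [
    ZoneReading("Lobby", heating_on=True, cooling_on=True, window_open=False),
    ZoneReading("Lab 2", heating_on=True, cooling_on=False, window_open=True),
    ZoneReading("Office 5", heating_on=False, cooling_on=False, window_open=True),
]
for zone, issue in find_waste(readings):
    print(f"{zone}: {issue}")
```

In a real deployment these rules would run continuously over live telemetry and feed a ticketing system rather than print to a console.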

Using commercial space efficiently: a double win

Azure Digital Twins can also be used to analyze how space is used, helping reduce the real estate footprint required by companies, which further contributes to reducing climate impact. For example, at Microsoft, the building team looked at real space occupancy and utilization data instead of traditional “space-per-head” metrics and found that on average the company had built spaces 20 percent larger than necessary. Knowing this, the company can make more efficient use of its owned and leased spaces. 

While any organization with buildings can benefit from Azure Digital Twins, the gains are especially significant for those managing large campuses or installations, such as global companies, education institutions, city governments, and the military.

Commercial office space is slated to grow from 87 billion square feet in 2012, the last year it was measured, to over 126 billion square feet by 2050. Nearly half of that space could reach net-zero energy use if current energy-efficiency technologies and rooftop solar panels were implemented. IoT can push those gains much further, enabling the world to move from being energy-wasteful, to energy-neutral, to energy-positive. Here’s how.

From smart buildings to smart cities

Succeeding with smart buildings opens the door to achieving the vision of smart cities: interconnected systems that help maximize precious natural resources, create new sources of energy, and give energy back to the grid.

Smart building campuses can become connected ecosystems, with technology-enabled businesses, solar panels, smart-battery storage, and electric vehicle fleets, managing energy efficiently to avoid overloading the grid or adding new infrastructure. Azure Digital Twins can help model energy generation, transmission, and distribution relationships, to increase their just-in-time effectiveness. With the electric-vehicle-charging value chain, Azure Digital Twins connects consumers, cars, utility companies, and more to drive smart utilization and growth of this energy-saving form of transportation. And, finally, even individual consumers can become energy producers, by using solar roofs and electric cars to store energy for when they need it or to produce energy to sell back to utility companies.

A business case that’s hard to argue against

Decreasing building energy usage around the world would significantly reduce carbon emissions and slow global warming, giving climate change programs a running start to succeed. Since lower energy use and falling alternative energy prices equal lower costs, smart buildings also provide a compelling business case for motivating government mandates and driving faster adoption. Some 66 percent of global organizations say that better energy management is their primary business driver for making smart building investments, according to a recent Harvard Business Review survey.

“Azure Digital Twins technology is absolutely ideal for smart cities,” says Agrusa. “You can’t do connected cities without it, because everything is siloed otherwise and on-premises, and you really need a cloud, hyper-scale solution to do that.”

In closing, there is a path forward when it comes to climate change that is business-led and economically motivated. As companies and other organizations share their successes with smart-building energy savings and connectivity gains, it’s likely that governments and cities will join in with mandates, investments, and resources. Smart buildings provide a rare opportunity to do what is morally responsible, deliver better services and outcomes, create a sustainable future, and deliver pro-business cost savings.

Don’t forget to read about digital transformation in smart cities and how IoT is reinventing disaster management.

Learn about technology for smart cities.
Source: Azure

How news platforms can improve uptake with Microsoft Azure’s Video AI service

I’m Anna Thomas, an Applied Data Scientist within Microsoft Engineering. My goals are to enable the field and partners to better integrate various AI tools into their applications. Recently, my team reached out to Microsoft News to see how they’re analyzing their data, and how our services may be able to help.

Microsoft News ingests more than 100,000 articles and videos every day from various news providers. With so many different processing tasks, such as classifying news topics and tagging and translating content, I was immediately interested in understanding how they process all of that information.

As it turns out, Microsoft News has for years been working on some pretty advanced algorithms that analyze articles and determine how to increase personalization, which ultimately increases consumption. However, when I asked whether there were any gaps, they were quick to answer that they would love more insight on their videos.

Analyzing videos at scale to obtain insights is a nontrivial task. Having insights on videos, especially for a news platform, can help with increasing search quality, user engagement through personalization, and the accessibility of videos through captioning, translating, and more. There are many different aspects to this: classifying news topics (potentially even opinion detection, provider authority, or sentiment), tagging content, translating content, summarizing content, grouping similar content together, and so on.

Exploring Video Indexer

I set off to determine how we could meet the requested requirements from Microsoft News by using what I thought would be a combination of Video Indexer, Cognitive Services (Text Analytics, Language Understanding, Computer Vision, Face, Content Moderator, and more), and maybe even some custom solutions.

Here are the results of my research: 

Requested feature | Service that could help
----------------- | -----------------------
Speech-to-text extraction | Video Indexer API
Subtitling and captions for browsing the video | Video Indexer API
Profane language detection | Video Indexer API
Visual profanity detection (moderation use case) | Video Indexer API
Topic identification based on OCR and transcript (classification use case) | Video Indexer API
Extracting all frames and detecting salient frames from keyframes using vision services (no reprocessing expected) | Video Indexer API
Choosing the best frame for the promo card image | Video Indexer API
Enhancing the site with a filmstrip for users navigating videos | Video Indexer API
Keyword/label/entity extraction (Bing/Satori IDs) based on vision and text | Video Indexer API

I must admit I was surprised – the Video Indexer API does an impressive job of combining the various insights that other Cognitive Service APIs could give you, such as profane language detection from the Content Moderator API or Speech-to-text from the Speech API. You can see the full list of features in the documentation.

Intrigued by how much the Video Indexer API claimed to be capable of, I decided to check it out myself. It was pretty easy to create a sample in C#: a simple console application that bulk uploads and processes Microsoft News videos (you can check out my sample on GitHub).

For the purposes of this blog, I wanted to share the results from one video, an episode I did with the AI Show a while back on Bots and Cognitive Services. Since it’s only one video, I decided to use the Video Indexer portal, which has a simple UI for uploading and processing videos. The video is about 12 minutes long, and it took about 4 minutes to process. (I checked with the team, and they reported that processing typically takes 30-50 percent of the video’s length, so this number seems about right.)

Once the video was processed, I was provided with a link to a widget that I could use to embed my video with the enhanced filmstrip in a website. I’m also able to download all the insights as a JSON file. I’m able to change the language of the insights and transcripts and even just download the transcript.
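Once you have that insights JSON, pulling data out of it is ordinary JSON handling. The snippet below extracts transcript text from a simplified, stand-in structure; the real Video Indexer schema is much richer, so treat the field names here as an assumption and check the service documentation for the actual shape.

```python
# Sketch of extracting a transcript from downloaded insights JSON.
# The structure below is a simplified stand-in, not the full
# Video Indexer schema.
import json

insights_json = """
{
  "videos": [
    {"insights": {"transcript": [
        {"text": "Welcome to the AI Show.", "speakerId": 1},
        {"text": "Today we talk about bots.", "speakerId": 2}
    ]}}
  ]
}
"""

def transcript_lines(raw: str):
    """Return the transcript text segments, in order."""
    doc = json.loads(raw)
    return [seg["text"]
            for video in doc["videos"]
            for seg in video["insights"]["transcript"]]

print("\n".join(transcript_lines(insights_json)))
```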

Below, you’re able to see who was in the video. And, if Rodrigo Souza and I happened to be famous (apparently we’re not and now it’s proven!), Bing would return our names and the beginning of our Wikipedia biographies.

You can see that with the widget, I can skip ahead to the clips where Rodrigo is the focus and talking. There are also keywords, and just as with the clips of people, I can skip through the video to hear the clips covering different topics.

Video Indexer creates several other enhanced filmstrips that I can use to search the video, including visual labels, brands present, emotions, and keyframes. Video Indexer also gives the option to layer the transcript. Here, I’ve chosen to include people and the transcript.

I can also decide that I want all the insights or the layered transcript in another language. See a snippet of the same layered transcript in Spanish below.

Video Indexer makes it really easy to unlock insights from videos. While this specific scenario shows how valuable using Video Indexer for news can be, I think that using Video Indexer is relevant in other scenarios as well. Here are a few other ideas of how and where:

In large enterprises, there are tons of documents (and videos!) circulating on the intranet. Videos could be related to sales, marketing, engineering, learning, and more. As an employee, it can be hard to find the videos that you need. To reuse IP and increase the accessibility of existing materials, you could use Video Indexer (which has Azure Search built in) to create an enhanced pipeline for searching videos.

Taking the previous example one step further, you could create a custom skill in Azure Cognitive Search to get insights from videos via an index on a schedule. You may also choose to use Predefined Skills to get insights on other types of documents (i.e., images, PDFs, PowerPoints, etc.). By using Azure Search to configure an enhanced indexing pipeline, you can provide users with access to an intelligent search service for all documents. Learn more about Cognitive Search in this LearnAI workshop.

In the education space, a student or researcher can sometimes spend a very long time trying to find the answer to a specific question from a lecture or presentation. With the help of Video Indexer, finding videos that contain certain keywords or topics becomes easier. To take it one step further, you could use the insights obtained from the Video Indexer API to create a transfer-learning model that performs machine reading comprehension (essentially, training a model to retrieve the answer to a question a student or researcher may have).

Have other use-cases or tips for using Video Indexer or Azure Cognitive Services? Reach out, and together we can continue to democratize AI. Follow me on Twitter and LinkedIn.

Extracting insights from IoT data using the warm path data flow

This blog continues our coverage of the solution guide published by Microsoft’s Industry Experiences team. The guide includes the following components:

Ingesting data
Hot path processing
Cold path processing
Analytics clients

We already covered the recommendation for processing data for an IoT application in the solution guide and suggested using Lambda architecture for data flow. To reiterate the data paths:

A batch layer (cold path) stores all incoming data in its raw form and performs batch processing on the data. The result of this processing is stored as a batch view. It is a slow-processing pipeline, executing complex analysis, combining data from multiple sources over a longer period (such as hours or days), and generating new information such as reports and machine learning models.
A speed layer and a serving layer (warm path) analyze data in real time. This layer is designed for low latency, at the expense of accuracy. It is a faster-processing pipeline that archives and displays incoming messages, and analyzes these records, generating short-term critical information and actions such as alarms.

This blog post covers the warm path processing components of the solution guide.

Azure Event Hubs is a big data streaming platform and event ingestion service, and is a great option for ingesting data from the data stream. This article explores the relevant Azure native services. However, if you prefer to use Apache Software Foundation tools and technologies, Event Hubs also supports Apache Kafka MirrorMaker, Apache Flink, and Kafka Streams from the Apache Kafka ecosystem.

Using Event Hubs in a scalable way becomes important as the volume of data streamed through them increases. Please review the documentation on Event Hubs features and the programming guide for the partition and consumer group concepts. Choosing the best combination of partitions and consumer groups depends on the nature of the data and how you want to process it. The aggregate number of partitions on an Event Hub should be consistent with its throughput units (TUs) — please read the frequently asked questions for details. You can scale message ingestion on a single Event Hub by using partitions, and if the default maximum is not enough, you can submit a support request to increase it. Another option, depending on your scenario, is to use multiple Event Hubs, sending different message types to separate hubs in a fan-out arrangement.
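The value of partitions is easiest to see with a toy model. The sketch below is a conceptual illustration only — it is not the actual hashing Event Hubs performs on the service side — but it captures the key property: events that share a partition key always land on the same partition, which preserves per-device ordering while spreading load across partitions.

```python
# Conceptual model of partition-key routing. This is NOT the real
# Event Hubs algorithm; it only illustrates that a stable hash of the
# partition key keeps each device's events on one partition.
import hashlib

NUM_PARTITIONS = 4  # hypothetical partition count for an Event Hub

def partition_for(key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    """Map a partition key to a partition index deterministically."""
    digest = hashlib.sha256(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % num_partitions

# Events for one device always route to the same partition...
assert partition_for("Machine1") == partition_for("Machine1")

# ...while many devices spread across the available partitions.
devices = [f"Machine{i}" for i in range(100)]
used = {partition_for(d) for d in devices}
print(f"100 devices spread over {len(used)} of {NUM_PARTITIONS} partitions")
```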

You can fan out the processing of the incoming data stream using consumer groups on the Event Hub, and compose complex processing topologies as your business requirements demand, as shown in the following figure.

Azure Stream Analytics (ASA) is an event-processing engine that enables you to examine high volumes of data streaming from devices. It can accept data from Event Hub, IoT Hub, or Blob Storage. You can use other stream processing frameworks such as Apache Flink for processing the incoming data stream. Please see this tutorial on how to use Apache Flink with Event Hubs for Apache Kafka.

Let’s assume we want to stream the changes on the temperature of a rotary component on a machine. We can write a query as follows to get the average over every five seconds using a tumbling window.

SELECT System.Timestamp AS WindowTime, DeviceName, ComponentName, AVG(CAST(SampleValue AS float)) AS Average
FROM s2tempsamples TIMESTAMP BY SampleTimestamp
WHERE SampleName = 'S2temp' AND TRY_CAST(SampleValue AS float) IS NOT NULL AND DeviceName = 'Machine1'
GROUP BY DeviceName, ComponentName, TumblingWindow(second, 5)
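To make concrete what the query computes, here is a plain-Python sketch of the same five-second tumbling-window average: non-overlapping buckets, each reduced to a single mean. The sample timestamps and values are invented for illustration.

```python
# Plain-Python sketch of a tumbling-window average, mirroring the ASA
# query above: non-overlapping 5-second windows, one average per window.
from collections import defaultdict

def tumbling_avg(samples, window_s=5):
    """samples: iterable of (timestamp_seconds, value).
    Returns {window_end_seconds: average} per non-overlapping window."""
    buckets = defaultdict(list)
    for ts, value in samples:
        window_end = (int(ts) // window_s + 1) * window_s
        buckets[window_end].append(value)
    return {end: sum(vs) / len(vs) for end, vs in sorted(buckets.items())}

samples = [(0.5, 20.0), (2.0, 22.0), (4.9, 21.0), (5.1, 30.0), (8.0, 32.0)]
print(tumbling_avg(samples))  # {5: 21.0, 10: 31.0}
```

Every event falls into exactly one window, which is what distinguishes tumbling windows from the overlapping hopping and sliding windows discussed next.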

ASA provides four kinds of temporal windows to choose from: Tumbling, Hopping, Sliding, and Session. The following diagrams describe these window functions in order, and more information is available in the documentation.

A recent feature of ASA is the AnomalyDetection operator. It is used to detect three different types of anomalies:

Bi-directional Level Change. A sustained increase or decrease in the level of values, in either direction. This is different from spikes and dips, which are instantaneous or short-lived changes.
Slow Positive Trend. A slow increase in the trend over time.
Slow Negative Trend. A slow decrease in the trend over time.
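The intuition behind a bi-directional level change can be shown with a deliberately naive detector: compare the mean of the most recent window with the mean of the window before it. This is emphatically not the model inside ASA's ANOMALYDETECTION operator, just a minimal illustration of the concept, with an arbitrary threshold.

```python
# Naive illustration of bi-directional level-change detection: flag a
# sustained shift when the recent window's mean departs from the prior
# window's mean. NOT the actual ASA anomaly model.
def level_change(values, window=5, threshold=3.0):
    """Return True if the last `window` values sit at a sustained new
    level relative to the `window` values before them."""
    if len(values) < 2 * window:
        return False
    prior = values[-2 * window:-window]
    recent = values[-window:]
    prior_mean = sum(prior) / window
    recent_mean = sum(recent) / window
    return abs(recent_mean - prior_mean) > threshold

steady = [21.0, 21.2, 20.8, 21.1, 20.9, 21.0, 21.1, 20.9, 21.0, 21.2]
shifted = steady[:5] + [27.0, 27.2, 26.9, 27.1, 27.0]
print(level_change(steady))   # no sustained shift
print(level_change(shifted))  # level jumped by roughly 6 degrees
```

A production detector would also have to handle spikes, missing data, and gradual trends, which is exactly what the built-in operator abstracts away.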

If we are interested in detecting gradual increases or bi-directional changes in the temperature of the rotary component, the sample query in the documentation can be modified to analyze the incoming stream as follows:

WITH AggregationStep AS
(
    SELECT System.Timestamp AS tumblingWindowEnd,
        AVG(CAST(SampleValue AS float)) AS avgValue
    FROM s2tempsamples TIMESTAMP BY SampleTimestamp
    WHERE SampleName = 'S2temp'
        AND TRY_CAST(SampleValue AS float) IS NOT NULL
        AND DeviceName = 'Machine1'
    GROUP BY TumblingWindow(second, 5)
),

FillInMissingValuesStep AS
(
    SELECT System.Timestamp AS hoppingWindowEnd,
        TopOne() OVER (ORDER BY tumblingWindowEnd DESC) AS lastEvent
    FROM AggregationStep
    GROUP BY HoppingWindow(second, 300, 5)
),

AnomalyDetectionStep AS
(
    SELECT
        hoppingWindowEnd,
        lastEvent.tumblingWindowEnd AS lastTumblingWindowEnd,
        lastEvent.avgValue AS lastEventAvgValue,
        System.Timestamp AS anomalyDetectionStepTimestamp,
        ANOMALYDETECTION(lastEvent.avgValue) OVER (LIMIT DURATION(hour, 1)) AS scores
    FROM FillInMissingValuesStep
)

SELECT
    alert = 1,
    hoppingWindowEnd,
    lastTumblingWindowEnd,
    lastEventAvgValue,
    anomalyDetectionStepTimestamp,
    scores
INTO output
FROM AnomalyDetectionStep
WHERE
    CAST(GetRecordPropertyValue(scores, 'BiLevelChangeScore') AS float) >= 3.25
    OR CAST(GetRecordPropertyValue(scores, 'SlowPosTrendScore') AS float) >= 3.25

The query output can be directed to various outputs with an “INTO” clause in the query. You can find the currently available output types in this document. One of the output types is Power BI, which allows (near) real-time visualization. The ASA job output can also be directed to a data store, such as Azure Cosmos DB or Azure SQL Database.

You may notice a recurring theme from the previous posts. Analyzing IoT data is very context-dependent. Taking “utilization” as an example, determining the time a machine spends producing parts depends on how your shifts are organized, the type of part you are manufacturing, and the machines you are using, in addition to technical factors such as the communication protocol the machine supports and its configuration.

This blog has presented technology options on Azure for processing an incoming data stream, but these are just examples. The actual solution architecture and implementation depend on your business needs and context.

Next steps

Complete this tutorial if you want to use Apache Flink with Event Hubs for Apache Kafka.
Complete the Power BI and Stream Analytics tutorial.
Get the larger picture for extracting insights from IoT data from the solution guide.


Native Python support on Azure App Service on Linux: new public preview!

We’re excited to officially announce the public preview of the built-in Python images for Azure App Service on Linux, a much requested feature by our customers. Developers can get started today deploying Python Web Apps to the cloud, on a fully-managed environment running on top of the Linux operating system.

This new preview runtime adds to the growing list of stacks supported by Azure App Service on Linux, which also includes Node.js, .NET Core, PHP, Java SE, Tomcat, and Ruby. With the choice of Python 3.7, 3.6, and soon 2.7, developers can get started quickly and deploy Python applications to the cloud, including Django and Flask apps, and leverage the full suite of features of Azure App Service on Linux. This includes support for deployments via “git push” and the ability to deploy and debug live applications using Visual Studio Code (our free and open source editor for macOS, Linux, and Windows).

When you use the official images for Python on App Service on Linux, the platform automatically installs the dependencies specified in the requirements.txt​ file. Additionally, it detects common Flask and Django application structures and hosts them using gunicorn, and includes the necessary modules for connecting to Azure DB for PostgreSQL.
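Under gunicorn, what gets hosted is ultimately a WSGI callable; Flask and Django apps expose exactly that interface. As a framework-free illustration (not an official App Service sample), a minimal app of the kind gunicorn can serve, for example via `gunicorn app:application`, looks like this:

```python
# Minimal WSGI application of the kind gunicorn can host on App Service
# on Linux. A Flask or Django app exposes this same callable interface;
# this bare version simply avoids any framework dependency.
def application(environ, start_response):
    body = b"Hello from App Service on Linux!"
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ])
    return [body]

# Quick local check without starting a server: invoke the callable directly.
captured = {}
def fake_start_response(status, headers):
    captured["status"] = status

result = b"".join(application({}, fake_start_response))
print(captured["status"], result.decode())
```

With a `requirements.txt` alongside it, the platform installs the listed dependencies and starts the server for you.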

While the underlying infrastructure of Azure App Service on Linux has been generally available (GA) for over a year, at the moment we’re releasing the runtime for Python in public preview, with GA expected in a few months. In addition to using the built-in images, Python developers can deploy their applications using a custom Docker container on Web Apps for Containers.

Learn more about Python on Azure and Visual Studio Code

Carlton Gibson, Django Software Foundation fellow and core maintainer of the Django project, recently joined our developer advocate Nina Zakharenko for a video series on using Python/Django on Visual Studio Code, Azure, and Azure DevOps.

The full walkthrough is available on the Microsoft + Open Source blog.

Next steps

Try out Python on App Service on Linux using the Azure CLI.
Get started experience using Visual Studio Code.

Let us know your feedback!

Azure Monitor for containers now generally available

We are happy to announce that Azure Monitor for containers is now generally available. Azure Monitor for containers monitors the health and performance of Kubernetes clusters hosted on Azure Kubernetes Service (AKS). Since the launch of the public preview at Build in May 2018, we have seen a lot of excitement from customers. Customers love the fact that you can enable monitoring as soon as you create an AKS cluster and get all the monitoring telemetry in a centralized location in Azure without having to login to containers or rely on other tools. Since the public preview, we have been adding more capabilities and refining the experience based on your feedback. Let’s look at some of the recent changes.

Multi-cluster view – You often have multiple AKS clusters to manage. Wouldn’t it be great to view and manage all your clusters together? The multi-cluster view discovers all AKS clusters across subscriptions, resource groups, and workspaces, and provides a health roll-up view. You can even discover clusters that aren’t being monitored and, with just a few clicks, start monitoring them. 

Drill down further into an AKS cluster with the performance grid view – To investigate further, you can drill down to the performance grid view, which shows the health and performance of your nodes, controllers, and containers. From the node view tab, you can easily spot a noisy-neighbor issue on a pod and drill further to see the controller it is part of. You can then compare the controller’s limits and request settings with actual usage to determine whether you have configured it correctly, and continue investigating by looking at the Kubernetes event logs associated with that controller.

Live debugging – We all know the importance of verifying that your application is working as expected, especially after you deploy an update. With live logs you get a real time, live stream of your container logs directly in your Azure portal. You can pause the live stream and search within the log file for errors or issues. Unlike the Azure Monitor logs, the live stream data is ephemeral and is meant for real time troubleshooting.

Onboarding – In addition to the Azure portal, we have added more ways for you to automate onboarding Azure Monitor for containers.

Azure CLI and ARM template – With the add-on option you can onboard Azure Monitor for containers with a single command. The command will automatically create the default Log Analytics workspace and deploy the agent for you.

For new AKS clusters:

az aks create --resource-group myAKSCluster --name myAKSCluster --node-count 1 --enable-addons monitoring --generate-ssh-keys 

For existing AKS clusters:

az aks enable-addons -a monitoring -n MyExistingAKSCluster -g MyExistingAKSClusterRG 

You can also enable monitoring for your containers by using Azure Resource Manager (ARM) template. To learn more, please review the detailed instructions for onboarding using Azure CLI and ARM template.

Terraform – Similar to ARM template, if you are using Terraform to deploy AKS clusters, you can enable monitoring right from the template. To learn more read the documentation from Terraform on setting up AKS cluster, Log Analytics solution, and workspace.

We would like to conclude with some inspiring words from one of our customers, Hafslund, a Nordic power company, with whom we recently published a case study:

“We found it easy to get Azure Monitor up and running for containers. The metrics and charts right out of the Monitor box are perfect to help us quickly tune our clusters and services and resolve technical issues.”

– Ståle Heitmann, CTO, Hafslund Nett AS

To learn more about Azure Monitor for containers, read our documentation, “Azure Monitor for containers overview.” Thank you for your feedback during the public preview and we look forward to your continued support as we add more exciting features and capabilities to Azure Monitor for containers.

KubeCon North America 2018: Serverless Kubernetes and community led innovation!

Welcome to KubeCon North America 2018, and welcome to Seattle. It’s amazing to get the chance to welcome you to my hometown, and the site of Kubernetes’ birth. It was barely five years ago that Joe, Craig, and I had the first small ideas and demos that eventually turned into this amazing project and community. I’m honored that all of you over the years have chosen to invest your time, energy, and enthusiasm in Kubernetes. Whether this is your first KubeCon or you’ve been here since the first one in San Francisco four years ago, welcome!

For the Azure Kubernetes team, KubeCon is especially exciting. It’s been a busy and fulfilling year: Azure Kubernetes Service (AKS) has been the fastest-growing service in the history of Azure Compute, and that’s been quite a ride! With KubeCon here, it’s a great chance to meet up with our customers and community collaborators to celebrate all the incredible things happening in this community.

For the Azure Kubernetes Service, we started with the journey of "how to make Kubernetes easier for our customers." For example, by letting Azure take care of deployment, operations, and management of Kubernetes APIs and leveraging integrated tools, Maersk was able to free its engineers and talents to focus on the things that make the most business impact. Furthermore, by taking advantage of the fully managed runtime environment provided by AKS, Siemens Healthineers realized shorter release cycles and achieved its desired continuous-delivery approach in a highly regulated environment.

We're seeing more and more Java customers port their existing Java application stacks to AKS with little or no change. Xerox, for example, was able to run its Java apps in containers with no code modifications and leveraged Helm charts to automate customer onboarding. As a result, for its DocuShare Flex Content Management platform, Xerox reduced provisioning time from 24 hours to less than 10 minutes, accelerating sales and customer onboarding.

While we’re discussing Azure Kubernetes Service, it’s great to see more and more Azure services bring their strengths to Kubernetes. Here at KubeCon, we’re announcing the general availability (GA) of Azure Monitor for containers. The Azure Cognitive Services team has also announced containerization of its cognitive APIs, allowing you to take advantage of core cognitive technology on-premises, at the edge, or wherever your data lives. For the Azure Kubernetes team, it’s been an exceptionally busy month, starting with the announcement at KubeCon Shanghai of AKS in Azure’s China region. Just last week in Las Vegas, we announced the public preview of AKS virtual nodes, which, together with Azure Container Instances (ACI), helps customers take advantage of a serverless container infrastructure.
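To make the virtual-nodes idea concrete, a workload typically opts in to running on the ACI-backed virtual node with a node selector and a toleration for the taint the virtual node advertises. The sketch below builds such a pod manifest as a plain Python dictionary; the specific label and taint values are assumptions drawn from the virtual-kubelet convention, so check your own cluster's node labels and taints before using them.

```python
# Sketch: a pod manifest that opts in to running on an AKS virtual node.
# The nodeSelector/toleration values follow the virtual-kubelet convention
# and are assumptions -- verify them against your cluster's actual nodes.
pod_spec = {
    "apiVersion": "v1",
    "kind": "Pod",
    "metadata": {"name": "hello-aci"},
    "spec": {
        "containers": [{"name": "hello", "image": "nginx:alpine"}],
        # Steer the pod onto the virtual node (backed by ACI)...
        "nodeSelector": {
            "kubernetes.io/role": "agent",
            "type": "virtual-kubelet",
        },
        # ...and tolerate the taint the virtual node carries so that
        # ordinary workloads don't land there by accident.
        "tolerations": [
            {"key": "virtual-kubelet.io/provider", "operator": "Exists"}
        ],
    },
}
```

Serialized to YAML or JSON, a manifest like this can be applied with `kubectl apply -f`, and the scheduler will place the pod on the virtual node rather than on a VM-backed agent node.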

But honestly, the service that we build is only one (albeit very important) piece of what we work on as a team. Of equal importance is the work that we do in the open source community to develop novel solutions to our customers’ problems. With help from the community, like the great folks on the Open Policy Agent project, we launched an open source policy controller for Kubernetes. This policy agent installs on Kubernetes clusters anywhere and can provide enterprises with assurances that developers will build reliable and compliant systems. We are also announcing the Osiris open source project, which enables efficient “scale-to-zero” for Kubernetes containers. This technology can power Functions-as-a-Service, or any programming paradigm where you need rapid scale-up in response to customer traffic.
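The scale-to-zero idea itself is simple: drop a workload's replica count to zero after an idle window, and bring it back up when traffic arrives. The toy controller below sketches that loop in plain Python; it is illustrative only (Osiris itself works inside Kubernetes, watching Services and buffering requests while the backing pods wake up).

```python
import time

class ScaleToZero:
    """Toy sketch of the scale-to-zero pattern behind projects like Osiris:
    scale replicas to zero after an idle window, and back up on traffic.
    A real controller would patch a Kubernetes Deployment instead of
    mutating a local counter."""

    def __init__(self, idle_seconds=300, clock=time.monotonic):
        self.idle_seconds = idle_seconds
        self.clock = clock            # injectable clock, handy for testing
        self.last_request = clock()
        self.replicas = 1

    def on_request(self):
        # Traffic arrived: record it and wake the workload if asleep.
        self.last_request = self.clock()
        if self.replicas == 0:
            self.replicas = 1

    def tick(self):
        # Called periodically: scale to zero once the idle window elapses.
        if self.replicas > 0 and self.clock() - self.last_request > self.idle_seconds:
            self.replicas = 0
```

Injecting the clock makes the idle-window logic easy to exercise without waiting in real time, which is the same trick most controller test suites use.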

With Docker, Bitnami, HashiCorp, and others, we’ve announced the Cloud Native Application Bundle (CNAB) specification. CNAB is a new distributed-application package format that combines Helm or other configuration tools with Docker images to provide complete, self-installing cloud applications. To see what CNAB can do for you, imagine being able to hand out a USB key to KubeCon attendees that could install your complete application. Finally, we’re celebrating the adoption of the Virtual Kubelet project into the CNCF Sandbox, as we continue to work with VMware, AWS, hyper.sh, and others in the community to make nodeless Kubernetes a reality.
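At its core, a CNAB bundle is described by a small JSON document that pairs application metadata with an "invocation image" carrying the installer logic (a Helm chart, Terraform config, or plain scripts). The sketch below builds such a descriptor; the field names follow the early draft of the specification and the image names are made up for illustration, so consult the published spec for the exact schema.

```python
import json

# Illustrative CNAB-style bundle descriptor. Field names follow the early
# draft spec and image references are hypothetical -- check the published
# CNAB specification for the authoritative schema.
bundle = {
    "name": "helloworld",
    "version": "0.1.0",
    "description": "A complete, self-installing cloud application",
    "invocationImages": [
        {
            # The invocation image carries the installer logic (e.g. Helm).
            "imageType": "docker",
            "image": "example/helloworld-installer:0.1.0",
        }
    ],
}

print(json.dumps(bundle, indent=2))
```

A CNAB-aware runtime reads this descriptor, runs the invocation image, and lets that image drive the actual provisioning, which is what makes a bundle installable from a single file (or a USB key).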

At KubeCon Shanghai, I talked about my thoughts on serverless Kubernetes and the evolution of cloud native development. It’s a future driven by our mission of “Kubernetes for Everyone.” This includes reducing the complexity of Kubernetes operations by running the API for you in AKS and developing “nodeless” Kubernetes with virtual nodes. It also means working on tools that make Kubernetes a more integrated, easy-to-use experience, like Draft and the Kubernetes extension for Visual Studio Code, which has been installed by nearly 175,000 people.

At KubeCon North America, I’m taking off my forward-looking cap and instead talking about the development and maintenance of the Java, .NET, TypeScript, and Python clients for Kubernetes. Whether you’re interested in talking about the future of cloud computing or in adding features like port-forwarding to the TypeScript client, I’ll be around the conference all week at the Azure booth and in the hallway track.

When it comes to explaining Kubernetes, one of my favorites is the Children’s Illustrated Guide to Kubernetes. For this KubeCon, I’m incredibly excited to announce that Microsoft is donating the likeness of Phippy and all of your favorites from the book to the CNCF. To celebrate, we’re sharing a special second episode of the Children’s Illustrated Guide to Kubernetes. You can learn about the core concepts of Kubernetes in a fun way!

Whether you’re joining us in Seattle for KubeCon or watching the talk streams from afar, we’ve got some great resources to get you started with Kubernetes, including recently published best practices we’ve gathered from our customers and a webinar I will be sharing on structuring Kubernetes projects in production.

Welcome to Seattle!

–brendan
Source: Azure

A hybrid approach to Kubernetes

We’re excited to see everyone at KubeCon this week! We’ve been working with our customers to understand how they’re thinking about Kubernetes and what we can do to make it easier for them. Azure Stack unleashes new hybrid capabilities for developing applications. You design, develop, and maintain your applications just as you do with Azure, and you can deploy to any of the Azure clouds. Your application’s location becomes a configuration parameter rather than a design constraint.

So how exactly does Azure Stack work with containers? The combination of containers and hybrid cloud can solve many problems. You can create a set of apps in containers using the languages you love, like Node.js, Python, and Ruby, and take advantage of the wide array of tooling available, including Visual Studio Code. You can then deploy your container or set of containers to whichever mix of environments meets your users’ requirements. For instance, you can keep your sensitive data local in Azure Stack while accessing functionality such as Azure Cognitive Services in global Azure. Or you can develop your apps in global Azure, where your developers are, and then deploy the containerized apps to a private cloud in Azure Stack that becomes completely disconnected on board a submarine. The possibilities are endless.
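Treating "location as a configuration parameter" can be as simple as resolving the management endpoint from configuration at startup instead of hard-coding it. The sketch below shows one way to do that; the environment variable name is a made-up convention, and while `management.azure.com` is the public Azure Resource Manager endpoint, the Azure Stack value varies per deployment, so both entries here are best treated as placeholders.

```python
import os

# Sketch: picking the target cloud at runtime from configuration.
# The endpoint values are placeholders -- real Azure Stack endpoints come
# from your stamp's environment metadata, and TARGET_CLOUD is a made-up
# variable name for this example.
ENDPOINTS = {
    "azure": "management.azure.com",
    "azure-stack": "management.local.azurestack.external",
}

def management_endpoint(cloud=None):
    """Return the resource-manager endpoint for the configured cloud."""
    cloud = cloud or os.environ.get("TARGET_CLOUD", "azure")
    try:
        return ENDPOINTS[cloud]
    except KeyError:
        raise ValueError(f"unknown cloud: {cloud!r}")
```

Because only this lookup changes between global Azure, Azure Stack, and sovereign clouds, the same container image can be promoted across all of them with nothing but an environment-variable change.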

Azure Stack allows you to run your containers on-premises in much the same way as you do with global Azure. You can choose the best place for your containers depending on data gravity, data sovereignty, or other business needs. Containers let you use Azure services from a host running on-premises while taking advantage of the secure infrastructure, integrated role-based access control, and seamless DevOps tools, allowing you to create a single pipeline targeting multiple Azure clouds. Your containers and supporting services are hosted in a secure infrastructure that integrates with your corporate network.

The Kubernetes marketplace item, available in preview for Azure Stack, is consistent with Azure: because the template is generated by the Azure Container Service Engine, the resulting cluster runs the same containers as in AKS. It is also conformant with the Cloud Native Computing Foundation's standards.

Your developers can also use the OpenShift Container Platform in Azure Stack. OpenShift provides a consistent container experience across Azure, Azure Stack, bare metal, Windows, and RHEL, and it brings together Microsoft and Red Hat developer frameworks and partner ecosystems, as previously announced in September.

When you take your containers across Azure, Azure Stack, and Azure sovereign clouds, you should also consider that your application architecture likely depends on more than containers: it likely depends on numerous resources with different, specific versions. To make this easier to manage, we recently announced Cloud Native Application Bundles, a new open source package format specification created in close partnership with Docker and broadly supported by HashiCorp, Bitnami, and others. With Cloud Native Application Bundles, you can manage distributed applications using a single installable file, reliably provision application resources in different environments, and easily manage your application lifecycle without having to use multiple tools.

This week is KubeCon, and if you are attending, you can see Kubernetes and Azure Stack in action in the Expo Hall. Please drop by our booth, #P18, to see great demos of the technologies I mentioned in this post.

I hope you find the information in this post useful! Stay tuned for new topics around developing hybrid applications and feel free to follow me on Twitter.

To learn more about hybrid application development, read the previous post in this series: "What you need to know when writing hybrid applications."
Source: Azure

Azure Marketplace new offers – Volume 26

We continue to expand the Azure Marketplace ecosystem. During September and October, 149 new consulting offers successfully met the onboarding criteria and went live. See details of the new offers below:

Consulting Services

 
1-Day Big Data Cloud Migration Workshop: This one-day workshop from Hashmap is for business and technical leaders and key stakeholders, and it's held at the client's facility. Receive guidance and assistance in defining services and the overall solution.

 
2-Week Azure Assessment: Assess your current datacenter environment (up to 100 VMs) and receive optimization recommendations for a cloud migration. An Azure architect from Catapult Systems will review the results of an assessment tool and provide recommendations.

 
3 Day Cloud Adoption Workshop: Black Marble's on-site cloud adoption workshop is free-form and wide-ranging, usually covering technology and architecture; development, deployment, and testing; monitoring and maintenance; security and compliance; and more.

 
3 Day Integration Health Check: A Black Marble consultant will spend one day with the customer, assessing the integration solution and producing a report based on the findings. Two days will be spent off-site to document system performance and make recommendations.

 
3-Day Office 365 Security Assessment: This assessment by ProServeIT is for any business stakeholder responsible for customer data and security. It includes a kickoff presentation, a questionnaire, and a presentation with a list of Office 365 security recommendations.

 
5 Day Azure DevOps Migration Workshop: A Black Marble consultant will deliver a two-day on-site workshop to discuss options for migrating on-premises TFS to Azure DevOps (formerly VSTS), followed by a three-day report covering discussions, recommendations, and next steps.

 
Active Directory Health Check: 5-Day Assessment: Tallan will work with you and your team to make documented suggestions to improve the administration, health, monitoring, auditing, alerting, backup, and replication of Active Directory.

 
AgileDataCenter: 2-Week Assessment: The Azure Readiness Assessment Planning offer is intended to assist a company's internal IT department in preparing for a datacenter migration to Microsoft's Infrastructure-as-a-Service platform.

 
AI and Insights Ideation Workshop: Catapult Systems will lead you through a co-creation event. Our AI ideation workshops guide organizations through the art of the possible so that we build a practical and actionable AI adoption road map.

 
AI Innovation: 2-Week Workshop: An innovation sprint helps clients think through the art of the possible using AI technologies and develop a set of ideas and a business case that can be rapidly built and tested. A small BJSS team works with client teams to answer critical questions.

 
ANSYS Cloud HPC Proof of Concept: This two-week proof of concept is designed to demonstrate how ANSYS can be configured and scaled on Azure HPC. Azure HPC is a scalable modern infrastructure solution that can expand or shrink based on your engineers’ needs.

 
API Economy – 1-Week Proof of Concept: For many organizations, APIs are now a critical component of solutions that impact the bottom line. The API Economy Platform Blueprint takes away the risk and steep learning curve of bringing an API-based proposition to market.

 
Application Modernization: 2 Week Trial Workshop: This engagement is designed to demonstrate the ease and value that Microsoft Azure and Docker containers deliver as a core part of an application modernization strategy.

 
Artificial Intelligence: 1-Hour Briefing: This briefing will articulate the benefits of a product-centric approach to designing and building AI solutions, from two-week proof-of-concept sprints and the rapid delivery of testable alphas to fully integrated enterprise-scale solutions.

 
Ask the Azure Expert – 3-Day Assessment: Ask the Azure Expert is your easy, fast, and free way to receive technical recommendations from the Topcoder community. Topcoder gives you on-demand access to a global network of developers ready to tackle your Azure-related questions.

 
Automated Lead Generation with AI Implementation: Week 1 will involve requirements engineering, a workshop, model training, infrastructure, and implementation of the solution. Week 2 will involve quality assurance, documentation, training, and change management.

 

Aztek CloudCare 1–30 Days Implementation: Aztek CloudCare delivers an array of support services for Microsoft Azure. Aztek CloudCare will supply business continuity, let you manage your budget wisely, and enable your organization to keep up with the latest cloud technology.

 

Azure – SQL Database, Disaster Recovery Assessment: CodeCenters International will review your SQL server, Azure database, and/or Analysis Services solution and provide an end-to-end backup and disaster recovery plan, including timelines and Azure cost projections.

 

Azure Active Directory: 1/2 Day Virtual Workshop: In this one-on-one workshop from 360 Visibility, you will gain an understanding of what Azure Active Directory is and how it can provide a more secure solution for your organization.

 
Azure ASM to ARM Planning and Migration: 2-Weeks: Presidio's team will review every aspect of your Azure ASM Resource Pool and produce a plan to provide a seamless transition to ARM-based solutions.

 
Azure Backup – 5 Days Assessment: This on-premises assessment by Programmer's will define a backup/archiving plan for a Microsoft Azure storage environment.

 
Azure Backup and Site Recovery: 1/2 Day Workshop: This one-on-one session from 360 Visibility will turn a complex subject into a simple, comprehensive, and robust solution that can be customized to meet your needs.

 
Azure Backup Recovery Planning and Delivery: 2-Weeks: Presidio's team will review your on-premises VMware and physically hosted application pool and produce a plan to provide advanced backup, disaster recovery, and business continuity plan failover solutions.

 
Azure Cloud Introduction: 1/2 Day Virtual Workshop: In this one-on-one session, 360 Visibility will develop a plan to make your transition to Microsoft Azure as easy as possible while mitigating potential challenges so you can maximize your return on investment.

 
Azure Data Center Migration: 1-Hour Briefing: This briefing from Communication Square will introduce Microsoft Azure and detail how you can benefit from it, covering cloud security, backups, disaster recovery, and a look at different industries using Azure.

 
Azure Data Center Migration: 2-Day Assessment: Communication Square will assess your environment and the cloud migration processes that fit the needs of your organization, then devise a road map for the migration.

 
Azure DevOps Hackathon: 3-Day Workshop: This workshop will cover the latest Azure technologies and foster team building. Teams will be provided quick-start materials so they can dive right into the challenge rather than spending time provisioning resources.

 
Azure Financial Management: 3-Day Workshop: This 3-day workshop by Catapult Systems is designed to help customers put processes in place for Azure financial awareness and optimization, departmental chargebacks, and account and subscription resource tags.

 
Azure Governance Solution: When paired with governance and security measures established and maintained via a proven governance model, the benefits of the public cloud can be fully realized.

 
Azure Governance: 1-Wk Workshop: By the end of this workshop by Thomas Duryea Logicalis, clients will understand their security profile, cost savings/management strategies, billing and reporting, and optimization through automation.

 
Azure Governance: 2-Day Workshop: This workshop by BrainScale will help you understand your governance requirements, presenting overall cloud governance pillars, covering compliance and billing, and developing an implementation plan.

 
Azure Governance: 3-Day Workshop: This three-day workshop by Catapult Systems is designed to help customers understand the role of governance in provisioning and managing Azure subscriptions and services.

 
Azure HPC Linux Cluster Implementation: 2-Weeks: Presidio's team will review your on-premises VMware and physically hosted application pool and produce a plan to provide high-performance computing solutions at a lower cost than on-premises infrastructure alone.

 
Azure Hybrid Cloud 5-Day Proof of Concept: During this proof-of-concept engagement, Meritum Cloud will carry out a discovery workshop and build an Azure tenant for the customer, configuring the environment based on security best practices.

 
Azure IaaS 10-Day Workshop: Whether you are new to the cloud, looking to enhance productivity, or wanting to streamline costs, this 10-day workshop by Tallan will help you understand what Azure Infrastructure-as-a-Service can do for you.

 
Azure IaaS Proof of Concept: 3-Day Engagement: In this engagement by Foundation IT, you will gain a quick and focused understanding of the benefits that Microsoft Azure Infrastructure-as-a-Service can bring to your organization.

 
Azure Migration Evaluation: Half-Day Workshop: This free workshop by Sysgain will educate business and IT professionals about migrating workloads to Azure. The workshop will be delivered remotely by Sysgain cloud experts and will be customized to each customer's needs.

 
Azure Migration Plan and Funding: 4-Hour Assessment: In this free engagement, Xerillion will assess your on-premises servers and networking, then create a proposal for a migration or a proof-of-concept project that can qualify for funding by Microsoft.

 
Azure OMS and Log Analytics Implementation: 2-Weeks: Presidio's team will review your Azure subscription and produce a plan to provide advanced automation, orchestration, logging analytics, and cloud-based network operation center solutions.

 
Azure Readiness Workshop and Report: LANET will evaluate the customer’s environment to determine a solution and provide an estimate of cost and timing, then present a workshop online or on-site and follow up with an assessment report.

 
Azure Resilience: 3 Days Proof of Concept: If you are a U.K. company looking to become certified in the British Standards Institute’s ISO 22301 for Business Continuity Management, then Nero Blanco's Azure Resilience proof of concept is well worth reviewing.

 
Azure Resilience: Implementation > 1 Week: This Azure Resilience implementation from Nero Blanco leverages our expertise in Azure Site Recovery, Azure Storage, and Azure Backup. We will implement a disaster recovery solution for your workloads.

 
Azure Site Recovery Planning and Delivery: 2-Weeks: Presidio's team will review your on-premises VMware and physically hosted application pool and produce a plan to provide advanced replication, high availability, and business continuity planning solutions.

 
Azure Site Recovery to Azure: 10-Wk Implementation: Azure Site Recovery to Azure replicates workloads to Azure without adding secondary datacenter costs. This implementation by Infront Consulting Group will enable true business continuity and disaster recovery.

Azure SQL Cluster and HA: 2-Weeks: Presidio's team will review your on-premises SQL environment and produce a plan to provide advanced replication, high availability, and business continuity planning solutions at a lower cost than on-premises infrastructure.

 
Azure Visual Studio Dev/Test: 5-Day Workshop: Ready to move to the cloud? During this workshop, Tallan experts will discuss virtual networks, virtual machines in Azure, load testing, coded UI tests, automated deployment scenarios, and more.

 

AzureFactory Cloud Migration Assessment 1-week: The AzureFactory Cloud Migration Assessment by Cubesys leverages the power of software and AI to map your journey to Azure and provide detailed information for your optimized migration.

 
AzureFactory DR as a Service 3-week Implementation: AzureFactory Disaster Recovery-as-a-Service (DRaaS) is built on Microsoft Azure. It provides a fast and reliable way to protect your critical workloads. Let Cubesys take care of your disaster recovery plan.

 
AzureFactory Foundations 1-Week Implementation: AzureFactory Foundations by Cubesys gives you complete control of your Azure subscription by providing out-of-the-box governance and connectivity.

 
AzureFactory Foundations Assessment 3-Days: The AzureFactory Foundations Assessment by Cubesys delivers a vital assessment of your Azure foundations. You'll be provided with a clear report indicating the current state and targets for remediation.

 

AzureFactory Migration 4-Week Implementation: Cubesys’ AzureFactory Migration allows you to migrate to Azure, build secure cloud foundations, and optimize your workloads.

 
Bitnami Stacksmith 5-Day Proof-of-Concept: In five days, Bitnami will package and deploy your application to Azure and teach you how to maintain your app to keep it up to date and secure.

 
BizTalk Data Migration to Azure+PowerBI: 3-Day PoC: Migrate your BizTalk operations data from desktop tools to Azure and empower your users to view the status of business processes and interactions in real time without custom programming.

 

Bot in a Day: 1-Day Workshop: Tallan’s Bot in a Day workshop will give developers training on the basics of building and understanding AI bot technology while giving upper management insight to the art of the possible.

 
Briefing: Azure On-Ramp: This briefing by Catapult Systems will cover configuring and monitoring Azure, optimizing costs, adhering to governance policies, and personalizing cloud best practices.

 

Broadband Asset Assessment: Tilson will develop a checklist of potentially useful broadband assets that may be present in the region, identifying which are present and where. Tilson uses ESRI ArcGIS and QGIS for its geographic information system work.

 
Build and Enable a Self-Service Data Culture: In one hour, Decisive Data will discuss how we can work with you to build and enable a self-service data platform and culture utilizing a match made in the cloud: Microsoft Azure plus Alteryx plus Tableau.

 
Business Intelligence Roadmap: 4-Week Assessment: For decision-makers struggling to obtain information to make the best decisions, a business intelligence road map is a good way to take stock of your current situation and define your needs and objectives.

 
Capstone Windows Server Migration Workshop: This engagement by Capstone Consulting will demonstrate a modernization process designed to migrate legacy Windows Server and SQL Server applications to Azure.

 
Chat Bot Concept Workshop and Report: A designer and a developer at Black Marble will deliver a one-day workshop that will include an introduction to bots, a demo, a Q&A session, a creative session for identifying key requirements, and a road map of next steps.

 
Chatbot Proof of Concept with BotStack: Black Marble will explore the potential of bots, delivering a workshop, creating a high-level report, and providing a proof-of-concept bot application aligned with the requirements identified at the concept stage.

 
Citadel-IX: Citadel Group has a flexible model to meet your needs, and we can migrate Citadel Information Exchange (Citadel-IX) as an on-premises solution to Azure or do implementations directly into Azure.

 
Cloud Adoption Assessment: 3-week assessment: Eduserv's Cloud Adoption Assessment examines your IT services and business objectives against the benefits, costs, and risks of moving to the cloud. Eduserv will also outline a migration plan and a development road map.

 
Cloud Adoption: 1-Day Assessment: This one-day session by Tallan is part education, part brainstorm. Our cloud architects will work with you to identify your specific business needs and challenges, recommend next steps, and put a plan in place to avoid common adoption pitfalls.

 
Cloud Architectural Review: 8-day assessment: Eduserv's Cloud Architectural Review assesses your public cloud estate and operational processes against your business objectives from the perspectives of operations, security, reliability, performance, and cost optimization.

 
Cloud Assure ISV QuickStart: 3-Day Assessment: Grey Matter’s Cloud Assure ISV QuickStart service will help you, as an ISV or application builder, deploy your application to Microsoft Azure regardless of the existing deployment model.

 
Cloud IaaS Migration: 3-Day Workshop: This three-day workshop by Thomas Duryea Logicalis is for technical leaders and key stakeholders and is held on-site at the client’s facility. It will focus on Azure governance, migration, deployment, compatibility, and costs.

 
Cloud Infrastructure Transformation 4 Week Impl-UK: New Signature's Cloud Infrastructure Transformation will help you rapidly establish a secure, scalable virtual datacenter in Azure based on Microsoft best practices and New Signature’s extensive expertise.

 
Cloud Managed Services: 2-Week Briefing: Aricent's cloud offering provides a comprehensive set of services and in-house frameworks to efficiently deploy and migrate workloads from on-premises environments to the cloud.

 
Cloud Migration Cost Assessment: 6-Weeks: 2nd Watch’s six-week assessment simplifies and accelerates the path from on-premises to the cloud with instance rightsizing and total cost of ownership analysis using TSO Logic.

 
Cloud Readiness Assessment: 1-Day Evaluation: This service by Foundation IT provides a one-day review of the client's IT infrastructure and a report detailing an Azure migration approach and recommendations.

 
Cloudhouse Containers: 10 Day Migration Service: Cloudhouse's migration services help businesses prepare to move an application to the cloud. This 10-day engagement is for technical and business leaders and will be delivered partly on-site and partly via webinars.

 

Configuring and Managing Virtual Networks: This workshop by Dynamics Edge teaches IT professionals to create and manage virtual machines and to configure and manage Azure virtual networks (VNets). It also covers basic network configuration tasks.

 

Containers and DevOps – 1 Day Workshop: Spektra Systems' one-day training with a hands-on lab guides you through the process of building and deploying Docker images to the Kubernetes platform hosted on Azure Kubernetes Service.

 
CoPilot for Azure: As organizations adopt Azure at scale, CoPilot allows clients to continue optimizing their environments with the help of AHEAD's team of certified Azure engineers.

 
CST Cloud Consultation 1-Hour Briefing: This one-hour online briefing by UberCloud will teach you how Microsoft Azure can benefit your CST simulations. By running CST Studio Suite on Azure, you can take advantage of scalable, state-of-the-art hardware resources in the cloud.

 
Customer Service Intelligent Chatbots Implementation: In this engagement, adesso AG implements customer service chatbots. The benefits of digital customer service include constant availability, cost reductions, and integration into the Microsoft landscape.

 

Cybersecurity Assessment with GDPR – 2 weeks: This assessment by Halian will result in a comprehensive Microsoft Power BI report that will give you specific recommendations on where and how you can achieve a higher level of security and GDPR compliance.

 
Data Analytics: 15-Day Proof of Concept: In this three-week engagement, Programmer's will explore one business scenario at your organization and leverage data science to improve results.

 
Data Estate Modernisation 5-Day Assessment: Satalyst will assess your data environment, document the current state, define a high-level future-state data platform environment, estimate the costs of the new environment, and deliver a data estate modernization guide.

 

Data Quality Health Check: 4-Week Assessment: Sign up for a four-week engagement with data governance architects from Hitachi Solutions, who will give you step-by-step, personalized guidance on operational and analytics master data for a given domain.

 
Data Science Operationalization: 1-Week Implementation: With endjin, design and develop flexible, extensible, scalable, multi-tenant polyglot data-processing pipelines to power your intelligent solutions.

 
Data Strategy and Modernization: 5-Day Assessment: Through a series of workshops and whiteboard sessions, CloudMoyo's experts will assess your data layer, ETL layer, and reporting/analysis interface to help you determine how your BI environment is performing.

 
DB Health Check – 1 Week Assessment: Ascent Technology will take an inventory of your database system and check key performance indicators to provide you with a comprehensive health report with suggestions for maintenance and architecture.

 
Demand for Broadband Assessment: In this assessment, Tilson will evaluate current and future demand for broadband services in the region. We will use Form 477 data to determine where broadband services are available and the network speeds advertised in those areas.

 
Deploying and Managing Virtual Machines: This one-day workshop is for Azure administrators and will help them manage the cloud services that span storage, networking, and compute capabilities, with a deep understanding of each service across the full IT lifecycle.

 

Deploying SCCM: 3-Week Implementation: Infront Consulting Group will deploy Microsoft System Center Configuration Manager (SCCM) and coach the customer (via informal knowledge transfer) on how to accomplish some basic tasks.

 

Designing a Data Platform Solution: This workshop by Dynamics Edge will compare Azure database options, identify data streaming options for large-scale data ingestion, and identify longer-term data storage options.

 
Designing an Infrastructure Strategy: This workshop will describe DNS IP strategies for virtual networks in Azure, compare connectivity options, distribute network traffic, and design a hybrid connectivity scenario between the cloud and an on-premises environment.

 
Designing for Deployment Migration and Integration: This workshop will cover deploying an ARM template to a resource group, authoring a complex deployment using Azure Building Blocks tools, and integrating an API or logic app with Azure API Management.

 

Designing for Identity and Security: In this workshop by Dynamics Edge, cloud solution architects will learn about security and identity management within the context of Azure and multiple Software-as-a-Service solutions.

 
Develop Azure Cognitive Services, Bot, and IoT Solutions: In this workshop, Dynamics Edge will detail how to integrate Azure Cognitive Services, how to create bots using Bot Framework and the Azure portal, and how to leverage Azure Time Series Insights, Stream Analytics, and IoT Hub.

 
Develop Azure Platform as a Service Solutions: This workshop covers Azure Service Fabric, Azure Media Services, Azure Kubernetes Service, creating web apps with Azure App Service, and managing bulk operations through the Batch Service API.

 
Develop for an Azure Cloud Model: This workshop by Dynamics Edge covers asynchronous processing, autoscaling, long-running tasks, distributed transactions, Azure Search, and how to ensure a solution meets performance expectations.

 
Develop for Azure Storage: In this workshop, Dynamics Edge will discuss developing solutions leveraging Azure Storage options, including blob, table, or file storage; Cosmos DB; relational databases; caching; and content delivery networks.

 

DevOps Acceleration Engine: 4-Month Delivery: The Sirrus7 DevOps Acceleration Engine is a tailored and seamlessly integrated DevOps pipeline coupled with in-depth consulting services to quickly get your enterprise shipping code with Azure-native output.

 
DevOps Assessment and PoC: BCS Technology experts will guide you through your DevOps journey, which includes setting up application pipelines, monitoring performance, utilizing insights, and adding manual and automated quality checks.

 
DevOps with OSS on Azure – 1 Day Workshop: With this one-day training and hands-on lab by Spektra Systems, learn about building a continuous integration/continuous delivery environment on Azure using your favorite open-source tools.

 
Digital Transformation: 1-Hour Briefing: Legacy technology can be the biggest barrier to digital transformation. This introductory briefing by BJSS will help customers plan their journey.

 
Discovery Detailed 15-Day Workshop: This collaborative workshop by Kiandra IT examines user experience, considering how your new software solution can deliver a great one.

 
Discovery Lite 7-Day Workshop: Each of the activities in this Lite workshop is timeboxed and tailored to meet intended outcomes. From there, we’ll document and visualize the results from the workshop so that your team is empowered to make decisions on a path forward.

 
Discovery Standard 10-Day Workshop: With more time, the Discovery Standard workshop from Kiandra IT allows us to dig deeper, resulting in a more accurate, detailed outcome.

 

Docker 1-Day Workshop: By the end of this one-day course from Architech, developers will understand how to create, deploy, secure, and manage the lifecycle of Docker containers.

 

Enterprise Blockchain Deep Dive: 5-Day Assessment: Participants of this workshop by Envision Blockchain Solutions will learn about the business value and benefits of blockchain solutions. This deep dive defines both functional and technical requirements.

 
Enterprise Blockchain Immersion: 1/2 Day Workshop: This half-day workshop by Envision Blockchain Solutions will provide a relaxed guided tour of today’s blockchain and IoT technology, aiming to help clients find the "aha" moment they are seeking.

 
Fiber-to-the-Home Industry Overview Briefing: Tilson will provide an overview of the Fiber-to-the-Home (FTTH) industry, addressing dark and lit networks, network owners, consumer and enterprise markets, internet service providers, and incumbents vs. new entrants.

 
First Analytical Model 2-Week implementation: EXIA's First Analytical Model (FAM) solution includes the development of a personalized analytical model based on client data, enabling the optimization of a business process.

 
Free Azure Optimization Assessment: 5 Days: In this free assessment, Azure experts from The Henson Group will review (no tools) every aspect of your tenant and produce a list of recommendations to improve performance, lower costs, add availability, and improve security.

 
GP Migration to Azure: 1-Week assessment: In this engagement, Dynamic Consulting will migrate a Dynamics GP 2016 server from on-premises infrastructure to a virtual machine hosted on Microsoft Azure.

 
Hands-on Labs for Cloud Workshops: 1-Hour Session: Learn how to run successful trainings and events with hands-on lab environments for Microsoft cloud workshops. Spektra Systems' training expert will guide participants through the setup process and best practices.

 
Hybrid Cloud: 3-Day Implementation: Emm&mmE Informatica will deploy a hybrid solution to your on-premises environment and Azure subscription.

 
Implement Advanced Virtual Networking: This workshop by Dynamics Edge will teach IT professionals how to implement and configure Azure networking traffic distribution tools, including Azure Load Balancer, Azure Traffic Manager, and Azure Application Gateway.

 
Implement Azure Development Integration Solutions: This workshop will show participants how to integrate and manage APIs by using Azure API Management; how to configure a message-based integration architecture; and how to develop an application message model.

 
Implement Security in Azure Development Solutions: This workshop by Dynamics Edge details how authentication and authorization work in Azure, and how to implement secure data solutions with encryption, Azure Key Vault, and SSL and TLS communications.

 
Implementing and Managing Application: This workshop teaches IT professionals how to manage and maintain infrastructure for core web apps and services. Learn how Azure App Service is used as a Platform-as-a-Service and app service environment.

 
Implementing and Managing Storage: This one-day workshop will teach IT professionals about Azure storage solutions as well as basic data replication concepts and schemes. Azure Storage Explorer will be introduced.

 
Implementing Workloads and Security: This workshop will help IT professionals assess, plan, and implement a migration of on-premises resources and infrastructure to Azure. Azure Migrate and Azure Site Recovery on Hyper-V will also be covered.

 
Intelligent Mail Management 3-Week Implementation: Intelligent Mail Management is an automated solution for the recognition and processing of letters. adesso AG will identify your business problems and domain model requirements, then implement its mail solution.

 
Intro to Machine Learning 1-Day Workshop: In this one-day workshop by Aware Group, participants will be introduced to the fundamentals of data science theory, tools, and practice. This course is suitable for anyone with basic computer skills.

 
ITSM360 for Financial Services: 1-Hour Workshop Demo: Get a one-hour demo of BDO Canada's ITSM360, a complete IT service management solution powered by Microsoft Office 365 and SharePoint Online. The service is tuned for the Canadian financial services industry.

 
Lift Oracle to Azure 2-weeks Implementation: This two-week engagement from Ascent Technology will lift and shift your Oracle database into Azure so you can achieve high performance and scalability while reducing your yearly Oracle licensing costs.

 
Linux Lift and Shift to Azure – 1 Day Workshop: By the end of this workshop from Spektra Systems, you will be able to configure Linux virtual machines and virtual machine scale sets in Azure for availability, storage, and connectivity.

 
Manage Identities: This workshop by Dynamics Edge teaches IT professionals how to use Azure Active Directory. Participants will also learn about Azure AD’s differences compared to Active Directory Domain Services and how to integrate it with Software-as-a-Service solutions.

 
Manage Subscriptions and Resources: This one-day workshop by Dynamics Edge will help IT professionals manage their Azure subscriptions, their cloud resources through user and group accounts, and their core monitoring tools.

 
Manufacturing website / e-commerce: 1/2-Day Workshop: This half-day workshop by Profound Works is aimed at manufacturing businesses focused on improving websites and e-commerce, from UX design to CMS choice to system integration.

 
Migrate on-Premise to Azure (SQL, SSAS, SSIS): A successful cloud migration plan starts with a clear, data-driven understanding of your infrastructure. This assessment by CodeCenters International will deliver a migration road map for your SQL databases, cubes, or SSIS.

 
Migrate Servers to Azure: This workshop by Dynamics Edge will teach IT professionals how to assess, plan, and implement a migration of on-premises resources and infrastructure to Azure using Azure Migrate and Azure Site Recovery, including Site Recovery on Hyper-V.

 
Migrate TFS On-premise to Online VSTS – Azure: Cognosys will migrate your on-premises Team Foundation Server to Azure DevOps Services (formerly known as Visual Studio Team Services). Retain access to Team Foundation Server even after the move.

 

Migrate to Azure: 1-Day Implementation: In this implementation service, Intercept will confidently migrate your IT environment, infrastructure, applications, and workloads to Azure.

 
Migrate to Azure: 4-Wk Implementation: Techstern can conduct a smooth transition of your business-critical applications, websites, and Infrastructure-as-a-Service solutions to Azure.

 
Modern Enterprise Analytics Platform Assessment: Adastra will assess your current state, perform a gap analysis, establish a target architecture, and develop a road map toward a scalable, stable, secure, and high-performing enterprise analytics platform powered by Azure.

 
Retail website / e-commerce: 1/2 Day Workshop: This half-day workshop by Profound Works will help retail or fast-moving consumer goods businesses improve websites and e-commerce, from UX design to CMS choice to system integration.

 
Satalyst Enhanced Bot 2-Hour Workshop: Bots and AI present a huge opportunity for companies to streamline customer data collection. During the workshop, Satalyst experts will work with you to identify opportunities to use bots for automated information collection.

 
SC:Strategy: Hybrid Cloud Strategy and Business Case: Shaping Cloud's SC:Strategy provides customers with strategic direction and thought leadership on their journey to the cloud, building a business case for Microsoft Azure.

 
Security and Compliance Workshop – 1 Week Workshop: Risk management strategies require multiple layers of protection that limit the pathways that could result in a data loss or breach. This workshop by endjin will provide an end-to-end review of your application's risks.

 
Select the Appropriate Azure Technology Dvlpt Sol: This workshop by Dynamics Edge covers Azure architecture, design, and connectivity patterns, and it will help you choose the right storage solution for your development needs.

 
SMB Ascend 1 week assessment: For IT directors considering migrating to the cloud, EXIA’s SMB Ascend is a solution that lets you quickly obtain a portrait of the different migration scenarios, the related costs and savings, and the support options available.

 
Snowflake on Azure Consultation: 1-Hour Briefing: In this briefing, Decisive Data will cover a variety of topics related to utilizing Snowflake on Azure, including data migration, Snowflake data sharing, data warehousing, ETL/ELT, and best practices.

 
Software Project Discovery 1-Day Workshop: Find out exactly what type of software solution your business needs with this collaborative discovery workshop from Kiandra IT.

 
SQL Platform Modernization 10 Day Assessment: Adastra will apply an ideal migration strategy while taking into account security, network sensitivity, configuration, and backup recovery.

 
SQL Server Modernization – 1 Week Assessment: Trianz will help you migrate your legacy SQL Server database to a modern Azure database under a fixed-fee engagement model, a predefined project plan, and tried-and-tested methodologies.

 
Stratiform Azure Scaffold Offer 5-Day Assessment: In Stratiform's Azure Scaffold engagement, our experts will help you deploy your Azure environment, taking you from design to implementation while supplying you with best practices to optimize the environment.

 
Stratiform Identity Management: Is identity management restricting your move to the cloud? PCM Canada's identity management specialists can help. Give us five days, and we'll give you a secure path to the cloud.

 
TFS Migration from OnPremise To Azure IaaS: Cognosys will migrate Team Foundation Server from the client’s on-premises environment to Azure Infrastructure-as-a-Service.

 
Understanding Cloud Architect Technology Solutions: This workshop by Dynamics Edge will teach IT professionals how operations are done in parallel/asynchronously, how an enterprise system must be resilient when failures occur, and how deployments can be automated.

 
Understanding Your Users 5-Day Workshop: Through a series of user group sessions, Kiandra IT aims to deepen your understanding of your users, their motivations, their needs, and their goals.

Windows 2008 Azure Migration 4-Week Implementation: Don't let your infrastructure and applications go unprotected. The Cubesys team is here to help you migrate to Azure for greater security, performance, and innovation.

 
Windows Server/SQL 2008 and 2008R2: 2-day Workshop: This workshop from Thomas Duryea Logicalis is for technical leaders and is held at the client’s facility. It will identify workloads that are suitable to migrate and provide an understanding of cost-saving strategies.

Your Business Reviewal 5-Day Workshop: Kiandra IT will work closely with you to review your current digital offerings and processes, identifying roadblocks, capabilities, and opportunities.

Source: Azure

Automatic performance monitoring in Azure SQL Data Warehouse (preview)

Monitoring and managing the performance of your data warehouse is critical to the overall health of your data estate. As data volumes and query velocities increase, tracking query metrics such as usage frequency, resource consumption, and regressions becomes essential to efficiently drawing meaningful insights from your data.

To increase your efficiency, we’re excited to announce the preview of Query Store for Azure SQL Data Warehouse for both our Gen1 and Gen2 offers. Query Store is designed to help with query performance troubleshooting by tracking queries, query plans, runtime statistics, and query history, so you can monitor the activity and performance of your data warehouse. Query Store is a set of internal stores and Dynamic Management Views (DMVs) that allow you to:

Identify and tune top resource consuming queries.
Identify and improve ad hoc workloads.
Evaluate how changes in statistics, indexes, or system size (DWU setting) affect query performance and plans.
See full query text for all queries executed.

The Query Store contains three actual stores: a plan store for persisting the execution plan information, a runtime stats store for persisting the execution statistics information, and a wait stats store for persisting wait stats information. These stores are managed automatically by SQL Data Warehouse and retain an unlimited number of queries stored over the last 7 days at no additional charge.

Enabling Query Store

Enabling Query Store is as simple as running an ALTER DATABASE T-SQL statement:

ALTER DATABASE [Database Name] SET QUERY_STORE = ON;

Note: You can disable Query Store by running the ALTER DATABASE command specifying OFF.
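The note above can be made concrete. The first statement turns Query Store off again, and the second checks the current state via the is_query_store_on column of sys.databases (an assumption carried over from SQL Server; verify the column is exposed on your instance):

-- Disable Query Store for the database
ALTER DATABASE [Database Name] SET QUERY_STORE = OFF;

-- Check whether Query Store is currently enabled (1 = on, 0 = off)
SELECT name, is_query_store_on
FROM sys.databases
WHERE name = 'Database Name';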

Finding the full text for any query

With Query Store, you can retrieve the full text of any query executed over the last 7 days by using the sys.query_store_query and sys.query_store_query_text DMVs.

SELECT
q.query_id
, t.query_sql_text
FROM
sys.query_store_query q
JOIN sys.query_store_query_text t ON q.query_text_id = t.query_text_id;

The results show the query_id and the text of the query being executed.
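Because the full text is stored, you can also search it, for example to find every tracked query that touched a particular table. The table name FactSales below is purely illustrative:

SELECT
q.query_id
, t.query_sql_text
FROM
sys.query_store_query q
JOIN sys.query_store_query_text t ON q.query_text_id = t.query_text_id
WHERE
t.query_sql_text LIKE '%FactSales%';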

Finding your top executing queries

When enabled, Query Store tracks all query executions. On a busy data warehouse, you may want to look at the top queries by execution count. Using the Query Store views, we can get the query execution count for the 10 commands executed most frequently.

SELECT TOP 10
q.query_id [query_id]
, t.query_sql_text [command]
, SUM(rs.count_executions) [execution_count]
FROM
sys.query_store_query q
JOIN sys.query_store_query_text t ON q.query_text_id = t.query_text_id
JOIN sys.query_store_plan p ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats rs ON rs.plan_id = p.plan_id
GROUP BY
q.query_id
, t.query_sql_text
ORDER BY
3 DESC;
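The same joins can also rank queries by total time consumed rather than execution count, a sketch of the "top resource consuming queries" scenario listed earlier. Multiplying avg_duration by count_executions approximates the total duration per plan:

SELECT TOP 10
q.query_id [query_id]
, t.query_sql_text [command]
, SUM(rs.avg_duration * rs.count_executions) [total_duration]
FROM
sys.query_store_query q
JOIN sys.query_store_query_text t ON q.query_text_id = t.query_text_id
JOIN sys.query_store_plan p ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats rs ON rs.plan_id = p.plan_id
GROUP BY
q.query_id
, t.query_sql_text
ORDER BY
3 DESC;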

Finding the execution times for a query

Query Store also gathers runtime query statistics to help you focus on queries with high variance in execution times. Using the sys.query_store_plan and sys.query_store_runtime_stats DMVs, we can gather the average, minimum, and maximum durations for a given query:

SELECT
q.query_id [query_id]
, t.query_sql_text [command]
, rs.avg_duration [avg_duration]
, rs.min_duration [min_duration]
, rs.max_duration [max_duration]
FROM
sys.query_store_query q
JOIN sys.query_store_query_text t ON q.query_text_id = t.query_text_id
JOIN sys.query_store_plan p ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats rs ON rs.plan_id = p.plan_id
WHERE
q.query_id = 10
AND rs.avg_duration > 0;

Finding the queries with the highest variance in execution

Building on the duration statistics above, we can compute the percentage spread between each query’s minimum and maximum durations and rank queries by it:

WITH RawData AS
(
SELECT
q.query_id [query_id]
, t.query_sql_text [command]
, rs.avg_duration [avg_duration]
, rs.min_duration [min_duration]
, rs.max_duration [max_duration]
, (((rs.max_duration * 1.0) / (rs.min_duration * 1.0)) - 1) * 100 [variance_pct]
FROM
sys.query_store_query q
JOIN sys.query_store_query_text t ON q.query_text_id = t.query_text_id
JOIN sys.query_store_plan p ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats rs ON rs.plan_id = p.plan_id
WHERE
rs.min_duration > 0
)
SELECT
*
FROM
RawData
ORDER BY
variance_pct DESC;

Next steps

Query Store is available in all Azure regions with no additional charge. Azure SQL Data Warehouse continues to lead in the areas of security, compliance, privacy, and auditing. For more information, refer to the whitepaper, “Guide to enhancing privacy and addressing GDPR requirements with the Microsoft SQL platform,” on Microsoft Trust Center, or our documentation, “Secure a database in SQL Data Warehouse.”

For more information on Query Store in Azure SQL Data Warehouse, refer to the article, “Monitoring performance by using the Query Store,” and the Query Store DMVs, such as sys.query_store_query.
For feature requests, please vote on our UserVoice.
To get started today, create an Azure SQL Data Warehouse.
To stay up-to-date on the latest Azure SQL Data Warehouse news and features, follow us on Twitter @AzureSQLDW.

Source: Azure

Cloud Commercial Communities webinar and podcast update

Welcome to the Cloud Commercial Communities monthly webinar and podcast update. Each month the team focuses on core programs, updates, trends, and technologies that Microsoft partners and customers need to know to increase success using Azure and Dynamics. Make sure you catch a live webinar and participate in live Q&A. If you miss a session, you can review it on demand. Also consider subscribing to the industry podcasts to keep up to date with industry news.

Happening in December

Webinars

Transform Your Business with AI at Microsoft

December 4, 2018 at 11:00 AM Pacific Time

Explore AI industry trends and how the Microsoft AI platform can empower your business processes with Azure AI Services including bots, cognitive services, and Azure machine learning.

Azure Marketplace and AppSource Publisher Onboarding and Support

December 11, 2018 at 11:00 AM Pacific Time

Learn the publisher onboarding process, best practices around common blockers, plus support resources available.

Build Scalable Cloud Applications with Containers on Azure

December 17, 2018 at 1:00 PM Pacific Time

Overview of Azure Container Registry, Azure Container Instances (ACI), Azure Kubernetes Services (AKS), and release automation tools with live demos.

Podcasts

Blockchain, Artificial Intelligence, Machine Learning – what does it mean for healthcare?

December 11, 2018

David Houlding, a Microsoft Principal Healthcare Program Manager, discusses topics such as Blockchain, Artificial Intelligence, and Machine Learning as they impact healthcare.

Real world insights working with Machine Learning projects

December 17, 2018

Jess Panni and David Starr share insights learned from machine learning projects and the use of Machine Learning Studio to get actionable insights from the data produced.

Recap for November

Webinars

Get Started with Azure Applications: Solution Template Offer on Azure Marketplace

November 7, 2018

Learn more about publishing a solution in Azure Marketplace. With Azure Applications, publishers can automate the provisioning of one or more VMs/Azure Services using Azure Resource Manager, provision networking and storage resources, and more.

How Barracuda Reached Enterprise Customers Through Private Offers on Azure Marketplace

November 13, 2018

Learn about Private Offers on Azure Marketplace, which helps publishers create exclusive offers for their customers, offer customized software and terms, and run limited beta releases. Also learn how Barracuda Networks (a Microsoft partner) leveraged Private Offers to win larger deals and optimize their procurement process.

Check out more Cloud + AI events and join in the discussions in the C+AI partner community.

Podcasts

IoT with Streaming Data and Analytics and a Little Design Thinking with Element

This IoT episode with element ranges from oil and gas to Raspberry Pi, real-time streaming analytics, and even design thinking.

SKU Assortment using AI with experts Neal Analytics

How retail and consumer goods focus on improving product availability leveraging AI and ML.

The digital transformation of insurance with Nick Leimer

Nick Leimer from Microsoft brings us an astounding view into the digital transformation occurring inside the insurance industry.

Connecting IoT data with artificial intelligence at scale

In this episode of the podcast, we visit with Hari Menon and Diego Tamburini about the intersection of IoT and artificial intelligence.

Emerging and transformative technologies in retail with Mariya Zorotovich

Mariya Zorotovich talks with Vince Menzione about emerging and disruptive technologies in retail with a focus on Artificial Intelligence.

Current state, disruptors and technology trends in financial services with Howard Bush

Howard Bush discusses the current industry climate and technology disruptions occurring in financial services.

Accelerating your AI in healthcare initiative with blueprints

David Houlding and Gururaj Pandurangi take us on a journey through the technical advances Artificial Intelligence is bringing to healthcare and help you get started.

Paul Maher expands on the Industry Experiences team at Microsoft and his journey to the cloud

Paul Maher discusses the Industry Experiences team, why it was created, and how it can help your organization.

Check out recent podcast episodes at the Microsoft Industry Experiences team podcast page.
Source: Azure