Azure Site Recovery team is hosting an Ask Me Anything session

You can start asking your questions with #ASR_AMA soon!

The Azure Site Recovery (ASR) team will host a special Ask Me Anything (AMA) session on Twitter, Tuesday, January 22, 2019 from 8:30 AM to 10:00 AM Pacific Standard Time. You can tweet to @AzSiteRecovery or @AzureSupport with #ASR_AMA.

What’s an AMA session?

We'll have folks from across the ASR product team available to answer any questions you have. You can ask us anything about our products, services, or even our team!

Why are you doing an AMA?

We like reaching out and learning from our customers and the community. We want to know how you use ASR and how your experience has been. Your questions provide insights into how we can make the service better.

How do I ask questions on Twitter?

You can ask us your questions by mentioning #ASR_AMA in your tweet. Your question can span multiple tweets by replying to the first tweet you post with this hashtag. You can also directly message @AzSiteRecovery or @AzureSupport if you want to keep your questions private. For customers in different time zones who may not be able to attend the event at the specified time, you can start posting your questions one day before the scheduled time of the AMA (Monday, January 21) and we will answer them during the event. You can catch us for a live conversation during the scheduled hours, and if there are further follow-ups, we will continue the conversation after the event. Go ahead and tweet us!

Who will be there?

You, of course! We'll also have Program Managers from the ASR team participating.

Have any questions about the following topics? Bring them to the AMA.

Disaster Recovery of VMs from a primary Azure region to a secondary Azure region
Disaster Recovery of VMs from VMware to Azure
Disaster Recovery of VMs from Hyper-V to Azure
Disaster Recovery of physical servers to Azure

Why should I ask questions here instead of StackOverflow or MSDN? Can I really ask anything?

An AMA is a great place to ask us anything. StackOverflow and MSDN have restrictions on which questions can be asked. With an AMA, you’ll get answers directly from the team and have a conversation with the people who build these products and services.

Here are some question ideas:

What is ASR?
How is ASR priced?
Can I perform Disaster Recovery of Azure VMs with managed disks?
Does ASR encrypt replication?
How often can data be replicated with ASR?

Go ahead, ask us anything about our public products or the team. Please note that we cannot comment on unreleased features, future plans, or issues that require deep-level debugging.

We're looking forward to having a conversation with you!
Source: Azure

Check up on your remote fleet: Cloud IoT now makes Device Activity Logging generally available

Embedded systems engineers who develop IoT systems often face a number of challenges. Getting devices connected to the internet for the first time, and every time, is a tall order in today's fragmented IoT market. Even determining whether or not a particular device is working properly can be a challenge. Device Activity Logging (or just Logging, for short) allows customers to receive device activity logs from Cloud IoT Core, right in Stackdriver. Cloud IoT Core produces two types of logs: audit logs and now device logs as well. Both are available for viewing in Stackdriver.

[Image: A view of the device activity logs available in IoT Core, using Stackdriver tools to sort and filter the logs.]

Logging

During development, it can be frustrating to understand why a device isn't connecting as designed, or is otherwise not behaving as intended. Furthermore, debugging a deployed device, or a group of devices in a fleet of thousands, can be near impossible without the correct instrumentation. Detailed activity logs are usually the only way to understand the lifecycle a specific device has gone through, and many IoT platforms make this information difficult to access.

With Cloud IoT Core, you can now enable device logging for a single device or for an entire fleet. Device Activity Logging allows users to select different log levels, depending on the verbosity of logs they are interested in. Users can choose to see just errors, the full connection history, or even a log of every time a device sends a message (note: message content is not logged, only the event itself). Device Activity Logs are written to Stackdriver, which means they are available alongside the rest of your GCP (and IoT) audit logs.
This makes debugging errors or solving connectivity problems a snap.

[Image: An example of what a monitoring dashboard for a small number of devices might look like.]

Monitoring

Logs are great for diagnosing problems with devices, but sometimes it's necessary to understand the health of your entire fleet of devices with just a quick glance. You can get a good idea of what's going on within your business just by seeing how many devices are connected and how often they communicate. To meet this need, many businesses build custom dashboards, or at least employ a simple visualization tool.

The new Monitoring tab in IoT Core can help you get a complete picture of your fleet with no additional setup. IoT Core automatically reports this data to Stackdriver Metrics, where it can be queried to create custom dashboards, if you wish. However, we have already created a standard dashboard of the most useful metrics, right in IoT Core. Simply click on the "Monitoring" tab to see information about how many devices are connected, how many messages they are sending, and how much data they are using. If you need more granular information, you can easily follow the link to Stackdriver Metrics.

Conclusion

To find out more about which events are logged under each logging level, take a look at the documentation. Logging and Monitoring are now generally available, so try enabling them for some of your devices today. If you'd prefer to explore the functionality of Cloud IoT Core in an interactive, educational format, try our Cloud IoT Qwiklab, which includes logging examples.
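Because device logs land in Stackdriver, they can also be pulled programmatically rather than only viewed in the console. The sketch below is illustrative only (it is not from the original post): it assumes the `google-cloud-logging` client library, application default credentials, and the `cloudiot_device` monitored resource type with its `device_num_id` label.

```python
# Sketch: read Cloud IoT Core device activity logs out of Stackdriver.
# Illustrative only -- assumes the google-cloud-logging client library
# ("pip install google-cloud-logging") and application default credentials.

def device_log_filter(device_num_id, min_severity="DEFAULT"):
    """Build a Stackdriver Logging filter for one device's activity logs."""
    return (
        'resource.type="cloudiot_device" '
        'AND resource.labels.device_num_id="{}" '
        "AND severity>={}".format(device_num_id, min_severity)
    )

def print_device_logs(device_num_id):
    # Deferred import so the filter helper works without the library installed.
    from google.cloud import logging as gcp_logging

    client = gcp_logging.Client()
    for entry in client.list_entries(filter_=device_log_filter(device_num_id)):
        print(entry.timestamp, entry.severity, entry.payload)

# Example: build a filter that shows only errors for a single device.
print(device_log_filter("2756781968242637", "ERROR"))
```

The same filter string works directly in the Stackdriver Logs Viewer, so you can prototype it in the console first and then reuse it in code.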
Source: Google Cloud Platform

Pix2Story: A neural storyteller that creates machine-generated stories in several literary genres

Storytelling is at the heart of human nature. We were storytellers long before we were able to write; we shared our values and built our societies mostly through oral storytelling. Then we found ways to record our stories, and ever more advanced ways to share them broadly: from Gutenberg's printing press to television and the internet. Writing stories is not easy, especially if one must write a story in different literary genres just by looking at a picture.

Natural Language Processing (NLP) is a field that is driving a revolution in human-computer interaction. We have seen the amazing accuracy achievable today with computer vision, but we wanted to see if we could create a more natural and cohesive narrative showcasing NLP. We developed Pix2Story, a neural-storyteller web application on Azure that allows users to upload a picture and get a machine-generated story based on several literary genres. We based our work on several papers, “Skip-Thought Vectors,” “Show, Attend and Tell: Neural Image Caption Generation with Visual Attention,” and “Aligning Books and Movies: Towards Story-like Visual Explanations by Watching Movies and Reading Books,” as well as the neural-storyteller repository. The idea is to obtain captions from the uploaded picture and feed them to a recurrent neural network model that generates a narrative based on the genre and the picture.

The solution

As part of the process, we trained a visual semantic embedding model on the MS COCO captions dataset of 300,000 images to make sense of the visual input by analyzing the uploaded image and generating captions. We then transformed the captions and generated a narrative based on the selected genre: adventure, sci-fi, or thriller. For this, we trained an encoder-decoder model for two weeks on more than 2,000 novels. This training maps each passage of the novels to a skip-thought vector, a way of embedding thoughts in vector space. This allowed us to capture not only words but the meaning of those words in context, in order to reconstruct the sentences surrounding an encoded passage. We used the new Azure Machine Learning service as well as the Azure model management SDK with Python 3 to create a Docker image with these models and deploy it using Azure Kubernetes Service (AKS) with GPU capability, making the project production-ready. Let's look at the process flow in detail.

Visual semantic embedding

The first part of the project transforms the input picture into captions. Captions briefly describe the picture, as shown in the example below.

A white dog that is looking at a Frisbee
Small white dog with green and white bow tie on
A white dog with black spots is sitting on the grass

The model employed to generate these captions is composed of two different networks. The first is a convolutional neural network that extracts a set of feature vectors referred to as annotation vectors.

The second part of the model is a long short-term memory (LSTM) network that produces a caption by generating one word at every time step conditioned on a context vector, the previously hidden state, and the previously generated words.

Skipthought Vectors

Skip-thought vectors, introduced by R. Kiros et al., are a model that generates generic sentence representations usable across different tasks. For this project, the idea is to train an encoder-decoder model that tries to reconstruct the surrounding sentences of an encoded passage, exploiting the continuity of text in books.

The model is an encoder-decoder model. The encoder is used to map a sentence into a vector. The decoder then conditions on this vector to generate a translation for the source sentence.

The vocabulary has been expanded using Google News pre-trained word vectors, by fitting a linear regression that maps words from the vocabulary found in the books into this vector space.

Style shifting

Given how skip-thoughts work, if the sentences fed to the encoder from the visual semantic embedding are short descriptive captions, the final output will also be a short sentence. For that reason, if the desired output is a more literary passage, we need to perform a style shift: we operate on the skip-thought vector representations to give the input the characteristics we want to induce in the output. The operation is the following:

skip-thought decoder input = encoded captions of the picture − average of all encoded captions + average of encoded passages with similar length and features to the output we expect
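In code, the style shift is plain arithmetic on the skip-thought embeddings. A minimal sketch (illustrative only; the function and variable names are hypothetical, and real skip-thought vectors would be high-dimensional rather than the toy 3-d vectors used here):

```python
import numpy as np

def style_shift(caption_vectors, caption_mean, passage_mean):
    """Shift encoded captions from 'caption style' toward 'literary style'.

    caption_vectors: (n, d) skip-thought vectors of the picture's captions
    caption_mean:    (d,)   mean of all encoded captions (caption-style bias)
    passage_mean:    (d,)   mean of encoded literary passages (genre-style bias)
    """
    # Remove the average caption style, then add the average passage style.
    return caption_vectors - caption_mean + passage_mean

# Toy example with 3-dimensional vectors:
captions = np.array([[1.0, 0.0, 0.0]])
caption_bias = np.array([0.5, 0.5, 0.0])
passage_bias = np.array([0.0, 1.0, 1.0])
shifted = style_shift(captions, caption_bias, passage_bias)
print(shifted)  # each row: caption - caption_bias + passage_bias
```

The shifted vectors are then handed to the skip-thought decoder, which generates the longer, genre-flavored passage.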

Deployment

This project has been deployed using Azure Machine Learning Services Workspaces to generate the Docker image with the files and all models involved in prediction. The deployment for consumption has been made using AKS to automatically scale the solution.

Train your own model

For training new models:

Create conda environment:

conda env create --file environment.yml

Activate conda env:

activate storytelling

Set paths to your books or texts and your training settings in config.py.
Run training.py to train an encoder, generate necessary files and train a decoder based on your texts.
Generate bias: Mean of encoded sentences and mean of encoded captions.
To generate stories run the following on a python console:

>>> import generate
>>> story_generator = generate.StoryGenerator()
>>> story_generator.story(image_loc='path/to/your/image')

Congratulations! You should now have a fully working application. Have fun testing the project, and thank you for your contribution!

"Pix2Story: Neural Storyteller" is a web app that allows users to upload a picture and get an AI-generated story based on several literary genres.

Additional resources

AI Lab
Playground

You can find the code, solution development process and all other details on GitHub.

We hope this post helps you get started with AI and motivates you to become an AI developer.
Source: Azure

Azure Monitor logs in Grafana – now in public preview

We’re happy to introduce the new Grafana integration with Microsoft Azure Monitor logs. This integration is achieved through the new Log Analytics plugin, now available as part of the Azure Monitor data source.

The new plugin continues our promise to make Azure's monitoring data available and easy to consume. Last year, in v1 of this data source, we exposed Azure Monitor metric data in Grafana. While you can natively consume all logs in Azure Monitor Log Analytics, our customers also asked us to make logs available in Grafana. We heard this request and partnered with Grafana to let you use OSS tools more broadly on Azure.

The new plugin allows you to display any data available in Log Analytics, such as logs related to virtual machine performance, security, and Azure Active Directory (which recently integrated with Log Analytics), as well as many other log types, including custom logs.

How can I use it?

The new plugin requires Grafana version 5.3 or newer. After the initial data source configuration, you can easily start embedding Azure Monitor logs in your dashboards and panels: simply select the Azure Log Analytics service and your workspace, then provide a query. You can reuse any existing queries you already have in Azure or write a new one. Writing queries in Grafana is made simple by the familiar IntelliSense auto-complete you've already seen in the Azure Log Analytics query editor.

The plugin runs the query through the Log Analytics API, which means data is available to query as soon as it's ingested into Log Analytics and is not copied to a separate store. In addition to the standard query language, Grafana supports specific macros such as $__timeFilter, which enables features like zooming in on charts or using variables.
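For example, a query along these lines (the Perf table and counter name are illustrative, not from the original post) uses the $__timeFilter macro so the panel automatically respects Grafana's time picker:

```kusto
Perf
| where $__timeFilter(TimeGenerated)
| where CounterName == "% Processor Time"
| summarize avg(CounterValue) by bin(TimeGenerated, 5m), Computer
| order by TimeGenerated asc
```

When you zoom in on the chart, Grafana re-runs the query with the macro expanded to the new time range.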

Added value

Grafana offers great dashboarding capabilities, rich visualizations, and integrations with over 40 data sources. If you're already using Grafana for your dashboards, this new plugin can help you create a single pane of glass for your various monitoring needs, covering both Azure metrics data and logs data.

Additionally, the plugin utilizes the powerful query language so you can do a lot more than display raw data.

Calculate data that’s meaningful, such as your resources’ 95th percentile versus your SLA.
Combine data from multiple workspaces and even from Application Insights using cross-workspace and cross-app queries.
Correlate data with joins.
Apply machine learning algorithms. The query language offers operators that perform smart analytics, so you can run advanced analytics, such as detect service disruptions based on trace logs or run a cohort analysis.
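As an illustration of the first point, a query such as the following (the table, counter name, and time grain are example values, not from the original post) charts a 95th-percentile latency metric that you can compare against an SLA threshold:

```kusto
Perf
| where $__timeFilter(TimeGenerated)
| where CounterName == "Avg. Disk sec/Read"
| summarize percentile(CounterValue, 95) by bin(TimeGenerated, 15m)
```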

To learn more about the Grafana Azure Log Analytics plugin review the documentation, “Monitor your Azure services in Grafana.”
Source: Azure

Cloud Commercial Communities webinar and podcast newsletter – January 2019

Welcome to the Cloud Commercial Communities monthly webinar and podcast update. Each month the team focuses on core programs, updates, trends, and technologies that Microsoft partners and customers need to know so that they can increase success in using Microsoft Azure and Dynamics. Make sure you catch a live webinar and participate in a live Q&A. If you miss a session, you can review it on demand. Also consider subscribing to the industry podcasts to keep up to date with industry news.

Upcoming in January

Webinars

Grow, Build, and Connect with Microsoft for Startups – January 23, 2019 at 11:00 AM PST
Microsoft for Startups is a unique program designed to help startups become a Microsoft business partner through access to technology, channels, markets and customers. Tune into this session to learn more about the Microsoft for Startups program, a $500 million initiative to provide startups access to both the technology and customer base needed to build and grow their business.
Transform Data into Dollars by Enabling Intelligent Retail with Microsoft – January 29, 2019 at 10:00 AM PST
Microsoft is enabling retailers to deliver personalized customer experiences by empowering employees, driving digital transformation, and capturing data-based insights to accelerate growth for our partners and customers. This 30-minute session will arm partners with real case studies and actionable solutions for each intelligent retail scenario with an opportunity for live Q&A with our retail expert.
Azure Marketplace and AppSource Publisher Payouts and Seller Insights – January 30, 2019 at 11:00 AM PST
Azure Marketplace and AppSource is your launchpad to go-to-market with Microsoft and promote your offerings to customers. Join this exciting session to learn more about how Azure Marketplace and AppSource Publisher payouts work and gain exposure to the seller insights tool within Cloud Partner Portal.

Podcasts

Evolving actuarial risk compute and modeling on Azure – Nick Leimer shares changes occurring in the insurance industry and how companies are dealing with it. Specifically, we look at computing risk for regulatory compliance and how it might be a good match for Azure services, like Azure Batch or Azure High-Performance Computing.
Reduce healthcare costs with digital transformation: security, compliance, and backup on Azure – Healthcare IT veteran, David Houlding, chats with us about reducing costs in healthcare as part of an organization’s digital transformation and specifically outlines the tools and techniques needed for these transformations to succeed.
Adopting Azure for real-time payments – In this episode, Howard Bush talks with us about enabling real-time transactions instead of the customary batch transactions that financial institutions use today.
The full lifecycle of implementing IoT with PTC – From planning to streaming analytics, this episode looks at all phases of introducing IoT to a company. Just having the data is often not enough to make decisions. Insights must be gleaned from that data.
Live now – Joel Neidig of SIMBA Chain talks with us about blockchain as a service –  The podcast focuses on blockchain as a service and how it can be leveraged in manufacturing. With very real use cases and stories of success, we'll see how blockchain is affecting manufacturing in various ways today.
Live now – Using Cognitive Services with containers – Container support in Azure Cognitive Services allows developers to use the same rich APIs that are available in Azure and enables flexibility in where to deploy and host the services that come with Docker containers.

Recap for December

Webinars

Transform Your Business with AI at Microsoft – Explore AI industry trends, how the Microsoft AI platform can empower your business processes, Azure AI Services including bots, cognitive services, and Azure machine learning.
Azure Marketplace and AppSource Publisher Onboarding and Support – Learn the publisher onboarding process, best practices around common blockers, plus support resources available.
Build Scalable Cloud Applications with Containers on Azure – Overview of Azure Container Registry, Azure Container Instances (ACI), Azure Kubernetes Services (AKS), and release automation tools with live demos.

Podcasts

Blockchain, artificial intelligence, and machine learning: what does it mean for healthcare? – David Houlding, a Microsoft Principal Healthcare Program Manager, discusses topics such as blockchain, artificial intelligence, and machine learning as they impact healthcare.
Real world insights working with machine learning projects – Jess Panni and David Starr share insights learned from machine learning projects and the use of Machine Learning Studio to get actionable insights from the data produced.

Check out recent podcast episodes at the Microsoft industry experiences team podcast page.
Source: Azure

Microsoft Azure portal January 2019 update

This month we’re bringing you updates that improve the ease of navigation of the landing page, add to dashboard tile features, and increase functionality in Azure Container Instances.

Sign in to the Azure portal now and see for yourself everything that’s new. Download the Azure mobile app.

Here’s the list of January updates to the Azure portal:

Landing page

New Azure portal home page

Dashboard

Shared time range for tiles on dashboards

Azure Container Instances (ACI)

New start functionality

Let’s look at each of these updates in detail.

Landing page

New Azure portal home page

The new Azure portal home page is a quick and easy entry point into Azure. From there, you can find recently visited resources, navigate to commonly used services, and discover how to use specialized services to learn, monitor, secure, and optimize your applications and infrastructure.

The landing page has been designed with the following goals:

Improve discoverability of our services. We start by highlighting some of the most popular services in the “Azure services” section at the top, but we also provide access to the entire list.
Help you to make better use of Azure. All the services in the “Make the most out of Azure” section are either free or have free offerings so you can start using them right away!
Provide quick access to recently used resources. We display up to 12 resources, but if you have more, we provide access to the full list of recently used resources.  
Offer immediate access to important resources. The useful links section points to resources for learning about the platform, such as technical docs and product information, and for staying informed about what is going on, like Azure updates and news.
Provide easy access to the Azure mobile app. Download the Azure mobile app so you can stay connected, informed, and in control when you are on the go.

The new home page is an addition to the user experience and does not need to replace your previously customized dashboard. You can choose to keep the new home page as your default, or you can change the default landing page back to your customized dashboard by following these steps:

Select Settings in the upper right corner.
Under “Choose your default view,” select Dashboard.

Dashboard

Shared time range for tiles on dashboards

Previously, the time range would need to be set individually for each tile on Azure dashboards, but you can now globally specify the time range for supported tiles. Not all tiles support the globally shared time range, but those that do will show a filter icon in the top left and will light up when the shared time range dialog is open as shown in the screenshot below.

Tiles for Log Analytics queries and Azure Monitor metrics do support shared time range, but if you have a metric chart that was pinned a long time ago, it might not support the shared time range. In that case, you will have to go to Azure Monitor and re-pin that chart.

To set time ranges:

Select Monitor from the left navigation pane or from the All services menu.
From Monitor, select the Metrics menu item.
From Metrics, configure a metric and then select Pin to dashboard.
Go back to the dashboard. You should see your tile with the filter icon in the top left. You should also see a time filter widget appear under the dashboard name.
Select the time filter widget to expose the time range dialog. You should see your new tile light up. Tiles that don't support the shared time range will be dimmed.
Modify the time range and select the Apply button. All tiles that support shared time range should reload their data scoped to the desired range.

Azure Container Instances

New start functionality

ACI allows you to quickly and easily run containers on Azure without managing servers or having to learn new tools. It is now possible to easily start and restart your containers in ACI via the portal. If you have any stopped containers, a new "Start" command will be available on the ACI overview page that will let you start all of your containers quickly and easily.

 

Did you know?

With the Azure portal, you can test features in preview by visiting preview.portal.azure.com.

Let us know what you think

Thank you for all your terrific feedback. The Azure portal is built by a large team of engineers who are always interested in hearing from you. If you’d like to learn how we streamlined resource creation in Microsoft Azure to improve usability, consistency, and accessibility, read the new Medium article “Creation at Cloud Scale.” If you’re curious to learn more about how the Azure portal is built, be sure to watch the Microsoft Ignite 2018 session, “Building a scalable solution to millions of users.”

We will soon be introducing Azure portal “how to” videos on YouTube on the Azure channel where you can learn about specific features in order to become more productive using the portal. Stay tuned for more details!

Don’t forget to sign in to the Azure portal and download the Azure mobile app today to see everything that’s new, and let us know your feedback in the comments section or on Twitter. See you next month!
Source: Azure