Azure Media Services announces support for AAD and deprecation of ACS authentication

This month we are announcing the release of support for Azure Active Directory (AAD) authentication in Azure Media Services. Customers of our REST API and .NET client libraries can now use AAD authentication to authorize requests. In addition, we are releasing a new management blade in the Azure Portal to simplify the use of User and Service Principal authentication with AAD. With this update to our REST API, we can now provide the same role-based access control (RBAC) as provided by the Azure Resource Management (ARM) service. By moving to AAD authentication, you will also be able to track and audit all changes made by specific users or by an application connected to your Media Services account. The new Azure Media REST API requires that the user or application making REST API requests have either Contributor or Owner level access to the resources it is attempting to manage. More details on how role-based access control works for Azure resources are available at Azure Role-based Access Control.

IMPORTANT! 12-month deprecation notice of ACS authentication support in Azure Media Services

Because Azure Active Directory provides powerful role-based access control features and support for more fine-grained access to resources in your account compared to the ACS token authentication model ("account keys"), we strongly recommend that you update your code and migrate from ACS to AAD-based authentication by June 22, 2018. A key reason for the rapid migration is the announced upcoming deprecation of the ACS key-based authentication system.

What does this mean for you?

Microsoft Azure Media Services will end support for Microsoft Azure Access Control Service (ACS)-based authentication on June 22, 2018. To provide customers sufficient time to update their application code, we are providing 12 months' notice to manage the necessary transition.

What actions should you take?
We recommend that you take the following actions prior to June 22, 2018 to ensure that your applications continue to work as expected:

Update the code for your applications authored for Media Services.
Migrate from ACS-based authentication.
Begin using AAD-based authentication.

Mitigation steps must be taken on or before June 22, 2018 to ensure that your applications authored for Media Services using ACS authentication tokens continue to function as expected without failures in production. Please review each of the new authentication scenarios below closely and take the appropriate action to update your source code to use AAD authentication.

The Azure Media Services REST API supports authentication for both interactive users and web API, middle-tier, or daemon applications. The following sections provide details on how to use AAD authentication when working directly with the REST API or through the .NET client library.

User Authentication with AAD in Media Services

If you are looking to build a management application for your Azure Media Services account, like the Azure Media Services Explorer tool, you can simply log in with the credentials of a user that has been granted access to the Media Services resource in the portal via the Access Control (IAM) blade. This type of solution is very useful when you want human interaction with the service in one of the following scenarios:

Monitoring dashboard for your Encoding jobs
Monitoring dashboard for your Live Streams
Management application for desktop or mobile users to administer resources in a Media Services account

A native application would first acquire an access token from Azure Active Directory and then use that access token to make all REST API calls. The following diagram shows a typical interactive application authentication flow. For a REST API request to succeed, the calling user must be a "Contributor" or "Owner" of the Azure Media Services account it is trying to access.
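Outside the .NET SDK, that flow reduces to attaching the AAD access token as a Bearer token on every REST call. The following is a minimal sketch of the request headers involved; the header names and version values shown are the commonly documented ones for the classic Media Services REST API, so treat them as assumptions to verify against the current documentation:

```python
def build_media_services_headers(access_token):
    """Build the HTTP headers for an Azure Media Services REST call.

    The AAD access token is passed as a Bearer token; the remaining
    headers are typical values the classic Media Services REST API
    expects (verify the versions for your environment).
    """
    return {
        "Authorization": "Bearer " + access_token,
        "x-ms-version": "2.11",
        "DataServiceVersion": "3.0",
        "MaxDataServiceVersion": "3.0",
        "Accept": "application/json",
    }

headers = build_media_services_headers("eyJ0eXAi...")  # token truncated
print(headers["Authorization"][:10])  # Bearer eyJ
```

A request carrying these headers against your account's REST API endpoint succeeds only when the token's user or application has the required Contributor or Owner role.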
Unauthorized requests fail with status code 401. If you see this failure, please double-check that you have configured your user as "Contributor" or "Owner" on the Media Services account. You can check this through the Azure portal by searching for your media account and clicking the "Access control" tab.

Users of the .NET client SDK for Media Services must upgrade to the latest version on NuGet (windowsazure.mediaservices version 4.0.0.4 or greater) to use AAD authentication when communicating with the REST API. The following example shows the differences between the previous way of authenticating with the .NET client SDK using ACS and the new way that uses AAD credentials.

NOTE: Applications will also need to update their references to include the new assembly "Microsoft.WindowsAzure.MediaServices.Client.Common.Authentication.dll" and add references to that namespace, as well as a reference to the "Microsoft.IdentityModel.Clients.ActiveDirectory" assembly to get access to the ITokenProvider interface. For more information and a detailed sample on using the .NET SDK with AAD, see this overview and the available environment settings and constants.

DEPRECATED method of authenticating using ACS credentials

// Create and cache Media Services credentials in a static class variable.
_cachedCredentials = new MediaServicesCredentials(
    _mediaServicesAccountName,
    _mediaServicesAccountKey,
    "urn:windowsazuremediaservices",
    "https://wamsprodglobal001acs.accesscontrol.windows.net");

// Use the cached credentials to create a CloudMediaContext.
var mediaContext = new CloudMediaContext(_cachedCredentials);
mediaContext.Assets.FirstOrDefault();
New method of authenticating using AAD credentials and User authentication

var tokenCredentials = new AzureAdTokenCredentials("{YOUR AAD TENANT DOMAIN HERE}", AzureEnvironments.AzureCloudEnvironment);
var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
var mediaContext = new CloudMediaContext(new Uri("YOUR REST API ENDPOINT HERE"), tokenProvider);
mediaContext.Assets.FirstOrDefault(); // This returns a 401 Unauthorized if you are not set up as an authorized user

The "AzureEnvironments.AzureCloudEnvironment" constant is a helper in the .NET SDK that provides the right environment settings for the public Azure data centers. It contains pre-defined environment settings for accessing Media Services in the public data centers only. For the Azure China, US Government, or German cloud regions, you can use the "AzureChinaCloudEnvironment", "AzureUsGovernmentEnvironment", or "AzureGermanCloudEnvironment" constants, respectively.
Many of the details involved in acquiring an AAD access token have been wrapped and simplified for you in the AzureAdTokenProvider and AzureAdTokenCredentials classes. For example, you do not need to provide the AAD authority, the Media Services resource URI, or the native AAD application details. These are well-known values that are already configured by the AAD access token provider class. If you are not using our .NET client SDK, we recommend using the ADAL library to simplify the creation of the access token request using these parameters. The following values are used by default in the AzureAdTokenProvider and AzureAdTokenCredentials classes.
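For callers outside the .NET SDK, the token request itself is a plain OAuth 2.0 client-credentials POST against the AAD token endpoint. The sketch below builds the endpoint URL and form body for such a request; the authority and resource URI shown are illustrative defaults (https://login.microsoftonline.com and https://rest.media.azure.net are the commonly documented values, but verify them for your environment):

```python
from urllib.parse import urlencode

# Illustrative defaults -- verify against the current documentation.
AAD_AUTHORITY = "https://login.microsoftonline.com"
MEDIA_SERVICES_RESOURCE = "https://rest.media.azure.net"

def build_token_request(tenant_domain, client_id, client_secret):
    """Return the AAD token endpoint URL and the form-encoded body for
    an OAuth 2.0 client-credentials request."""
    url = "{0}/{1}/oauth2/token".format(AAD_AUTHORITY, tenant_domain)
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": MEDIA_SERVICES_RESOURCE,
    })
    return url, body

url, body = build_token_request("contoso.onmicrosoft.com", "app-id", "app-secret")
print(url)  # https://login.microsoftonline.com/contoso.onmicrosoft.com/oauth2/token
```

Posting that body to the URL returns a JSON payload whose access_token field is then sent as the Bearer token on REST calls; an ADAL client library does the same thing for you and also handles token caching and renewal.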
You also have the option of replacing the default implementation of the AzureAdTokenProvider with your own implementation.
AAD Service Principal Authentication in Media Services
For non-human interaction through daemon services, web APIs, consumer (mobile or desktop) applications, and web applications, where interactive login or direct user management/monitoring of resources in the Media Services account is not required, you will first need to create an Azure Active Directory application in your tenant.
Once it is created, you must give this application "Contributor" or "Owner" level access to the Media Services account in the Access Control (IAM) blade. Both steps can easily be done through the Azure Portal, the Azure CLI, or a PowerShell script. Note that for Azure resources, "Contributor" has the same access to the resource as "Owner", but only the "Owner" role can grant access to other users. Currently this version of the Media Services REST API does not provide RBAC at the entity level, but that is on our roadmap for a future API update in the Fall. We have also provided the new "API Access" blade in your Media Services account to make it easy to generate the required application or select an existing one. If you would like to use x509 certificates instead of a ClientID and ClientKey, see the documentation for details on how to configure the SDK.
The following examples show how a daemon application may use AAD web application credentials to authenticate requests with the REST service.

Deprecated method of authenticating using ACS credentials

// Create and cache Media Services credentials in a static class variable.
_cachedCredentials = new MediaServicesCredentials(
    _mediaServicesAccountName,
    _mediaServicesAccountKey,
    "urn:windowsazuremediaservices",
    "https://wamsprodglobal001acs.accesscontrol.windows.net");

// Use the cached credentials to create a CloudMediaContext.
var mediaContext = new CloudMediaContext(_cachedCredentials);

New method of authenticating with an AAD Service Principal and client symmetric key

var tokenCredentials = new AzureAdTokenCredentials("{YOUR AAD TENANT DOMAIN HERE}", new AzureAdClientSymmetricKey("{YOUR CLIENT ID HERE}", "{YOUR CLIENT SECRET}"), AzureEnvironments.AzureCloudEnvironment);
var tokenProvider = new AzureAdTokenProvider(tokenCredentials);

var mediaContext = new CloudMediaContext(_mediaServicesApiServerUri, tokenProvider);
mediaContext.Assets.FirstOrDefault();

Making it easy to get started with the new API Access Blade for Media Services
Azure Active Directory authentication can be complex for users unfamiliar with the details of AAD, so we wanted to make it very easy to get started with very little knowledge of AAD. For that reason, we are introducing a new "API Access" blade for Media Services accounts in the portal that replaces the previous ACS "Account keys" blade. We are also disabling the ability to rotate the ACS keys to prompt users to update their code and move to AAD support.
The new API Access blade makes the process of connecting to Azure Media Services with AAD much simpler. When you first select the API Access blade, you will be presented with a choice of using either user authentication for human interactive management applications, or creating a Service Principal and AAD application for non-human interaction with the Media Services API.
 
When selecting the user based authentication option, you will see a new panel that contains all the Active Directory information needed to authenticate with the API. This includes the API endpoint that you need to call, along with the ClientID, Domain, and Resource.
 
For Service Principal authentication, you will see additional values and the ability to select from an existing AAD Application or create a new one directly in the panel.

When the Service Principal blade opens, it selects the first AAD application that meets the following criteria:

It is a registered AAD application
It has “Contributor” or “Owner” RBAC permissions on the account
After creating or selecting an AAD app, you will be able to create and copy a Key (Client Secret) and copy the Client ID (Application ID), which are required to get the access token in this scenario. In the blade, you can choose to "Create New" AAD application or select an existing one in your subscription. When selecting an existing one, you will see a new blade listing your existing applications to choose from.

Once you select an existing application or create a new one, you will see additional buttons to "Manage Permissions" or "Manage Application". You can use these settings to open the AAD application management blade directly to perform management tasks such as changing keys or the reply URL, or customizing the application's manifest.
Clicking the "Manage Application" button brings up the AAD application management blade, which allows you to create keys for use with the API through this application.

If you do not have permissions to create AAD apps in your domain, the AAD app controls of the blade are hidden and a warning message is displayed instead.
Next Steps and Actions for Media Services Customers
We are very excited to be making the transition from the older ACS key-based authentication to the more secure, flexible, role-based Azure Active Directory service. All Azure Media Services customers should immediately begin migrating to the new AAD-based authentication model by downloading the latest .NET SDK or updating their existing REST-based API calls.
In addition, we are working on a new version of our REST APIs with support for more client SDK languages with AAD authentication. More details on that updated API will come in a later blog post.
The key actions you should be taking today:

If you are using .NET, update to the latest SDK and migrate to AAD authentication.
Plan early for the deprecation of ACS authentication support in the Media Services API. ACS authentication support will officially shut off on June 22, 2018.
Note on Java SDK and Open Source and Community driven client SDKs
If you are currently using the Java SDK or one of the many community or open-source client SDKs for Media Services, you have a couple of options at this time. The existing Java SDK for Media Services will be updated in the coming months to support AAD authentication so that customers can begin migration immediately. The open-source libraries are not supported directly by the Media Services team, so you will need to work with the community SDK developer to prioritize updating the SDK to support AAD for your scenario. We also recommend that you stay with the community SDK a little longer while we reach out to the developers and work with them to update their libraries. In addition, we are working hard on an updated REST API (v3), coming later this Fall, with support for AutoRest-generated client SDKs across PHP, Java, Python, and more, all of which will support AAD authentication. We will follow up with more blog posts on migrating to the new v3 API and client SDKs when they are ready for preview.
Resources and Additional Documentation
For more details, sample code and specific scenario documentation, please refer to the following articles:

Use AAD auth to access the API
Use the Azure Portal to Manage AAD Auth
Access the Media Services API with .NET and AAD
Use the Azure CLI 2.0 to create and configure an AAD app
Use PowerShell to create and configure an AAD app 
Contact Us with Questions
As always, if you have any questions, comments, or feedback, please post a message or question to our MSDN Forum or use Stack Overflow. For direct support, please submit a support request through the Azure Portal. We are also available on Twitter via @MSFTAzureMedia.
Source: Azure

Building an Azure Analysis Services Model for Azure Blobs — Part 3

The third and final part of the article series “Building an Azure Analysis Services Model on Top of Azure Blob Storage” has been published on the Analysis Services team blog. It shows that an Azure Analysis Services S9 server can successfully ingest more than 1,000 source files in a 1-terabyte TPC-DS data set stored in Azure Blob Storage in less than 12 hours. Among other things, Part 3 demonstrates how to analyze the memory consumption of the model and how to optimize model size and processing times. This part also discusses the advantages of more sophisticated data sources, such as Azure SQL Data Warehouse, over Azure Blob Storage for reducing the data volume that must be transferred to Azure Analysis Services for processing. Thanks to the modern Get Data experience, you can build a flexible data import pipeline directly in a Tabular model and import even very large data sets with reasonable processing times.

 

Read the full article “Building an Azure Analysis Services Model on Top of Azure Blob Storage”.
Source: Azure

ASOS: How they migrated from local monolith to microservices in Azure

Today we are kicking off a new series on Microsoft Mechanics called "How we built it" to share real-world architectural back stories and best practices as told by the technology architects at our customer organizations.

We kick off the series with lead architect Dave Green from British online fashion retailer ASOS, taking a closer look at their design goals and approach for moving from a locally operated monolith to a fully architected, built-for-cloud online retail system.

Breaking down the monolith

Founded almost two decades ago, ASOS was ahead of its time in establishing a fully online retail presence. The company has a growing base of 14 million customers spanning 230 countries worldwide. Their exponential growth resulted in the need to move beyond a single currency and support multiple languages.

This meant building more agility, scale, and efficiency into their tech architecture. Rather than perform a lift-and-shift migration, the team decided to adopt a cloud-native, microservices architecture from scratch for faster iteration and release of new features.

Then they set to work breaking down their monolithic retail app, which comprised stateful, intertwined services, into core microservices, swapping out functions from the monolith piece by piece.

As Dave Green explains, core to their approach was the decoupling of their presentation layer from their compute layer. This gave them more freedom to easily add or change features and scale their developer teams to work on multiple features at once. Further, as each service maintains its own state and data, this made it easier to scale the data layer and compute layer independently.

Multi-geo footprint and performance

Breaking down the monolith and moving core services to the cloud also enabled them to architect for greater performance and resiliency for their global customer base. This included geo-redundancy across the North Europe and West Europe regions, where most of their customers reside, with a handful of latency-sensitive core services running in Asian and North American regions.

The combined approach with cloud microservices paid off on some of their busiest shopping days. This includes Black Friday, where they saw peak order handling increase from 9 orders per second with their original monolithic retail system to 33 orders per second with their cloud microservices architecture.

To learn more, watch Dave Green's account on "How we built it." You can also join Dave and the ASOS team on June 29th at 9am PDT for their AMA session on the Microsoft Tech Community at http://aka.ms/how-we-built-it-ASOS.
Source: Azure

Announcing Microsoft Azure Government services in the Cloud Solution Provider program

We are excited to announce that Azure Government services are now available through the Cloud Solution Provider program (CSP). The CSP program enables partners to make Microsoft Cloud services part of their customer offerings, expanding U.S. government customer options in meeting their mission goals. With the CSP program, partners can now create high-value service offerings that combine use of Azure Government with solution management, customer support, and billing to U.S. government customers. The security, privacy, compliance and transparency of the Azure Government platform give U.S. government partners the right foundation for meeting regulatory requirements while delivering innovation to the U.S. public sector.

The CSP program is a great fit for the U.S. public sector where partners already build, deploy and manage solutions on behalf of federal, state and local government entities. Over the last 60 days, we piloted Azure Government in CSP with a variety of partners and their customers running production workloads. The feedback reinforces the expected demand from U.S. Government customers for secure, compliant cloud-based solutions through a broad ecosystem of partners. Reduced cost, cloud speed, and increased efficiency are all potential benefits of the managed services that U.S. government partners can enable through CSP. 

“Smartronix and Microsoft share a common commitment to deliver innovative and valuable services to our government customers. AzureGov on CSP makes it easier for us to deliver these solutions.”   – Robert Groat, Executive Vice President Smartronix 

"As compliance with regulation and public demand rapidly increase reliance on technology in the public sector, a consistent, reliable and accessible cloud platform is the backbone supporting that change, with Microsoft Azure, through the SYNNEX CLOUDSolv marketplace, we enable our partners to drive deployment and provide comprehensive solutions that help them compete and grow their business in the government vertical.” – Darren Harbaugh, Vice President, Cloud & Emerging Services, SYNNEX Corporation

“Our partnership with Microsoft enabled us to seize a tremendous opportunity as we were closing the year. Not only did the fast turnaround have us transacting on the Azure Cloud in less than a week, but now we can provide our services on Microsoft’s infrastructure which opens a wave of new possibilities for new government customers.” – Eric Van Heel, VP of Cloud Solutions Support Avtex Solutions

“In partnering with Microsoft through CSP and leveraging technologies such as the Azure Pricing Calculator, which is bleeding edge to provide a comprehensive and easy to use web-based user interface to select and dynamically price out all of Azure Cloud Services, Microsoft enabled us to turn around quotes and spin up resources based on our clients’ requirements within minutes not days.” – Sonoka Ho, VP of Business Operations @ TechTrend.us

The array of services available in Azure Government is rapidly increasing, and nearly all are available today through CSP, from infrastructure services like virtual machines, storage, and networking to platform services like data, analytics, web, and mobile services. Transacting Azure in the Government Cloud through CSP can be done in three steps:

Learn about CSP: the requirements for participation and the option to own the customer relationship end-to-end.
Decide on the model: direct or indirect.
Get started: sign up, get ready to sell, transact, and support.

Once you decide which model is right for you, follow the enrollment path in CSP for Azure Government.

Become an indirect reseller

As an indirect reseller in the CSP program, you’ll work with an indirect provider (also known as a distributor). Indirect providers can provide your customers with product support, provide you with technical assistance and marketing, and help you establish financing and credit terms.

Minimum requirements include having a Microsoft Partner Network (MPN) ID and the ability to sign legal agreements on behalf of your organization.

If you don't have the infrastructure to provide customer support and billing, you can connect with an indirect provider. This gives you more time to spend with your customers building specialized service offers. Review the authorized indirect providers in your area to get help with value-added services, support, and billing. Here's where you can learn more about the indirect reseller model and find a provider.

Become a direct partner

As a direct partner, you provide your customers with cloud services, cloud products, and customer support, and you bill your customers directly. If you don't have the infrastructure for doing this type of business, join as an indirect reseller. Minimum requirements for direct partners include: a services business model, a customer support infrastructure, customer billing and invoicing capabilities, and the ability to scale. There is a deeper commitment required to be a direct partner, and details on support requirements, billing and invoicing, managing customers, incentives, and licensing can be found here.

Enroll in the CSP Program

After you understand the requirements and commitment, apply now and we’ll review your application.  Please note it can take several days to review and verify your information.

Have questions? Please email azgovcsp@microsoft.com and join us on Yammer.
Source: Azure

Deep learning for predictive maintenance with Long Short Term Memory Networks

Deep learning has shown superior performance in certain areas such as object recognition and image classification. It has also gained popularity in other domains, such as finance, where time-series data plays an important role. Similarly, in predictive maintenance, data is collected over time to monitor the state of an asset with the goal of finding patterns that predict failures, a task that can benefit from certain deep learning algorithms. Among deep learning networks, Long Short Term Memory (LSTM) networks are especially appealing to the predictive maintenance domain because they are very good at learning from sequences. This lends itself to applications using time-series data, making it possible to look back over longer periods of time to detect failure patterns.

Traditional predictive maintenance machine learning models are based on feature engineering, the manual construction of the right features using domain expertise and similar methods. This usually makes such models hard to reuse, since feature engineering is specific to the problem scenario and the available data, which varies from one business to another. Perhaps the most attractive part of applying deep learning in the predictive maintenance domain is that these networks can automatically extract the right features from the data, eliminating the need for manual feature engineering. However, determining the topology of a deep learning network, such as deciding on the optimal number of layers, the number of nodes, and the hyperparameters, is also very labor-intensive, with no clear guidance available.

In the notebook Deep Learning Basics for Predictive Maintenance, we build an LSTM network for the data set and scenario described in the Predictive Maintenance Template to predict the remaining useful life of aircraft engines using the Turbofan Engine Degradation Simulation Data Set. The notebook serves as a tutorial for beginners looking to apply deep learning in the predictive maintenance domain; it uses simulated aircraft sensor values to predict when an aircraft engine will fail in the future so that maintenance can be planned in advance.
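To make the "learning from sequences" point concrete, LSTM input is typically prepared by slicing each engine's time-ordered sensor readings into fixed-length look-back windows. The sketch below uses plain Python with hypothetical sensor values (the real notebook works on the Turbofan data set) just to show the shape of that transformation:

```python
def make_sequences(readings, window):
    """Slice a time-ordered list of sensor readings into overlapping
    look-back windows of length `window` -- the shape an LSTM consumes.
    Each window's label would be the remaining-useful-life value (or
    failure flag) at the window's last time step."""
    return [readings[i:i + window] for i in range(len(readings) - window + 1)]

# Hypothetical single-sensor series for one engine, window of 3 cycles.
series = [0.2, 0.3, 0.5, 0.4, 0.9]
windows = make_sequences(series, 3)
print(windows)  # [[0.2, 0.3, 0.5], [0.3, 0.5, 0.4], [0.5, 0.4, 0.9]]
```

With multiple sensors, each window becomes a (window length x number of sensors) matrix, which is exactly the longer look-back that lets an LSTM pick up degradation patterns a single snapshot would miss.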
Source: Azure

Predictive maintenance using PySpark

Predictive maintenance is one of the most common machine learning use cases, and with the latest advancements in information technology, the volume of stored data is growing faster in this domain than ever before. This makes it necessary to leverage big data analytics capabilities to efficiently transform large amounts of data into business intelligence. Microsoft has published a series of learning materials, including blogs, solution templates, modeling guides, and sample tutorials, in the domain of predictive maintenance. Recently, we extended those materials with a detailed step-by-step tutorial that uses the Spark Python API, PySpark, to demonstrate how to approach predictive maintenance for big data scenarios. The tutorial covers typical data science steps such as data ingestion, cleansing, feature engineering, and model development.

Business Scenario and Data

The input data is simulated to reflect features that are generic for most of the predictive maintenance scenarios. To enable the tutorial to be completed very quickly, the data was simulated to be around 1.3 GB but the same PySpark framework can be easily applied to a much larger data set. The data is hosted on a publicly accessible Azure Blob Storage container and can be downloaded by clicking this link. In this tutorial, we import the data directly from the blob storage.

The data set has around 2 million records with 172 columns, simulated for 1,900 machines collected over 4 years. Each machine includes a device which stores data such as warnings, problems, and errors generated by the machine over time. Each record has a device ID and a time stamp for each day, plus aggregated features for that day, such as the total number of a certain type of warning received in a day. Four categorical columns were also included to demonstrate generic handling of categorical variables. The goal is to predict whether a machine will fail in the next 7 days; the last column of the data set indicates whether a failure occurred on that day.
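As a rough sketch of the labeling idea described above, each daily record can be marked positive if the same device fails within the following 7 days. The device and day values below are hypothetical, and the tutorial's notebooks implement the equivalent logic in PySpark over the full data set:

```python
def label_records(days, failure_days, horizon=7):
    """Given sorted day indices for one device and the set of days on
    which it failed, label each day 1 if a failure occurs within the
    next `horizon` days (inclusive of the day itself), else 0."""
    labels = []
    for d in days:
        fails_soon = any(d <= f <= d + horizon for f in failure_days)
        labels.append(1 if fails_soon else 0)
    return labels

# Hypothetical device observed for 10 days, failing on day 9.
print(label_records(range(1, 11), {9}))  # [0, 1, 1, 1, 1, 1, 1, 1, 1, 0]
```

This turns the problem into binary classification on each daily record, which is the form the modeling notebook consumes.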

Jupyter Notebooks

There are three Jupyter Notebooks in the GitHub repository. To visit the repository, click the green "View Tutorial" button at the right of the gallery page.

Notebook_1_DataCleansing_FeatureEngineering
Notebook_2_FeatureEngineering_RollingCompute
Notebook_3_Labeling_FeatureSelection_Modeling

We formatted this tutorial as Jupyter notebooks because it is easy to show the step-by-step process this way. You can also easily compile the executable PySpark script(s) using your favorite IDE.

Specifications & Configurations

The hardware used in this tutorial is a Linux Data Science Virtual Machine with 32 cores and 448 GB of memory. For more detailed information on the Data Science Virtual Machine, please visit the link. For the size of the data used in this tutorial (1.3 GB), a machine with fewer cores and less memory would also be adequate. However, in real-life scenarios, you should choose the hardware configuration appropriate for the specific big data use case. The Jupyter Notebooks included in this tutorial can also be downloaded and run on any machine that has PySpark enabled.

The Spark version installed on the Linux Data Science Virtual Machine for this tutorial is 2.0.2, with Python version 2.7.5. Please see the tutorial page for some configurations that need to be performed before running this tutorial on a Linux machine.

Prerequisites

The user should already know some basics of PySpark. This is not meant to be a PySpark 101 tutorial.
Have PySpark (Spark 2.0.2, Python 2.7) already configured. Please note that if you are using Python 3 on your machine, a few functions in this tutorial require some very minor tweaks because some Python 2 functions are deprecated in Python 3.

References

Blog post: Predictive Maintenance Modelling Guide in the Cortana Intelligence Gallery
Predictive Maintenance Modelling Guide
Predictive Maintenance Modelling Guide R Notebook
Predictive Maintenance Modelling Guide Python Notebook
Predictive Maintenance solution
Predictive Maintenance Template

Acknowledgement

Special thanks to Said Bleik, Yiyu Chen, and Ke Huang for learning PySpark together. Thanks to Fidan Boylu Uz and Danielle Dean for proofreading and modifying the tutorial materials.
Source: Azure

Azure Database for PostgreSQL team hosts Ask Me Anything session

The Azure Database for PostgreSQL team will host a special Ask Me Anything session on /r/Azure, Thursday, June 22, 2017 from 9:00 am to 1:00 pm Pacific Time.

What's an AMA session?

We'll have folks from our Azure Database for PostgreSQL engineering team available to answer any questions you have. You can ask us anything about our product or even our team!

Why are you doing an AMA?

We have done a few AMAs for Microsoft databases, both in the Azure cloud and on-premises, over the past several months and have received some great questions and valuable feedback from you! We are now extending our scope to the recently announced Azure Database for PostgreSQL service and want to hear from you. Your questions will give us insight into how we can continue to make your experience better.

Who will be there?

You, of course! We'll also have PMs and Developers from the Engineering team participating throughout the day.

Have any questions about Azure Database for PostgreSQL? Bring them to the AMA!

Why should I ask questions here instead of StackOverflow, MSDN, or Twitter? Can I really ask anything?

An AMA is a great place to ask us anything. StackOverflow and MSDN restrict which questions can be asked, and Twitter only allows 140 characters. In an AMA, you'll get answers directly from the team and have a conversation with the people who build this product.

Here are some question ideas:

What are the advantages of using Azure Database for PostgreSQL instead of a local PostgreSQL server?
What aspects of the database service are managed by Azure?
Do I need to worry about minor version upgrades of the database engine?

Go ahead, ask us anything about our public products or the team. Please note, we cannot comment on unreleased features and future plans.

Join us! We're looking forward to having a conversation with you!
Source: Azure

Managing updates for your Azure VM

In this blog post, I will talk about how to use the Update Management solution to manage updates for your Azure VMs. Right from within your Azure VM, you can quickly assess the status of available updates, initiate the installation of required updates, and review deployment results to verify that updates were applied successfully to the VM.

This feature is currently in private preview. If you’re interested in giving it a try, please sign up!

Enabling Update Management

From your VM, select “Manage Updates” on the virtual machines blade, under Automation + Control. After you select it, validation is performed to determine whether the Update Management solution is enabled for this VM. If it is not, you will have the option to enable it.

The solution enablement process can take up to 15 minutes, and during this time you should not close the browser window. Once the solution is enabled and log data starts to flow to the workspace, it can take more than 30 minutes for data to be available for analysis in the dashboard described in the next section. We expect this timing to significantly improve in the future.

Review update assessment

From the Manage Updates dashboard, you can review the update compliance state of the VM from the Missing updates by severity tile, which displays a count and graphical representation of the number of updates missing on the VM. The table below shows how the tile categorizes the updates missing by update classification.

To create an update deployment and bring the VM into compliance, configure a deployment that follows your release schedule and service window. This includes specifying which update types to include in the deployment, such as only critical or security updates, and whether to exclude certain updates.
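The include/exclude logic described above can be pictured as a simple filter over the available updates. This is an illustrative sketch only; the function name, field names, and KB numbers are invented for the example and do not reflect the solution's internal implementation:

```python
def select_updates(available, include_classifications, excluded_kbs=()):
    """Hypothetical filter mirroring the deployment options: keep updates
    whose classification is included and whose KB is not explicitly excluded."""
    return [
        u for u in available
        if u["classification"] in include_classifications
        and u["kb"] not in excluded_kbs
    ]

# Invented sample data for illustration.
available = [
    {"kb": "KB4012212", "classification": "Security"},
    {"kb": "KB4012213", "classification": "Critical"},
    {"kb": "KB4012214", "classification": "Optional"},
]

# Deploy only critical and security updates, excluding one specific KB.
to_deploy = select_updates(available, {"Critical", "Security"},
                           excluded_kbs={"KB4012213"})
print([u["kb"] for u in to_deploy])  # ['KB4012212']
```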

Create a new Update Deployment for the VM by clicking the “Schedule deployment for this VM” button at the top of the blade and specify the required values. 

After you finish configuring the schedule, click the “OK” button to return to the status dashboard. You will notice that the Scheduled table shows the deployment schedule you just created.

View update deployment state

When the scheduled deployment executes, you see the status appear for that deployment under the Completed and in-progress table. Double-clicking the completed update deployment takes you to the detailed deployment status page.

To review all detailed activities performed as part of the update deployment, select the “All Logs” and “Output” tiles. These show the job stream of the runbook responsible for managing the update deployment on the target VM.

OS support

Windows: Windows Server 2012 and above
Linux: Red Hat Enterprise Linux 6 and 7, Ubuntu Server 12.04 LTS, 14.04 LTS, 15.10, and 16.04

New to OMS Update Management

If you are new to OMS Update Management, you can review its current capabilities, which include update insights across Windows and Linux and the ability to deploy updates, as well as the documentation.

In future posts, I’ll talk about how to manage updates for multiple VMs in your subscription and how to orchestrate the update deployments including running pre/post steps, sequencing, and much more!
Source: Azure

Enable client side monitoring in Azure with Application Insights

With the Application Insights JavaScript SDK, you can collect and investigate the performance and usage of your web page or app. Historically, onboarding required manually adding a script to your application and redeploying. Manually adding the script is still supported, but you can now also enable client-side monitoring from the Azure portal in a few clicks.

Enablement

If you have enabled Application Insights in Azure, you can add page view and user telemetry. You can learn how to switch on server-side monitoring in our documentation.

     1. Select Settings -> Application Settings

     2. Under App Settings, add a new key value pair:

Key: APPINSIGHTS_JAVASCRIPT_ENABLED

Value: true

 

     3. Save the settings and Restart your app in the Overview tab.

 

The Application Insights JavaScript SDK is now injected into each web page.
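App Settings are exposed to your application as environment variables, so your own code can verify that the flag is set. The sketch below only illustrates how the flag is read; the actual script injection is performed by the platform, and the function name here is invented for the example:

```python
import os

def javascript_injection_enabled():
    """Illustrative check of the APPINSIGHTS_JAVASCRIPT_ENABLED app setting.
    App Settings surface to the app as environment variables, so the flag
    can be read the same way as any other configuration value."""
    return os.environ.get("APPINSIGHTS_JAVASCRIPT_ENABLED", "").lower() == "true"

# Simulate the App Setting locally for demonstration.
os.environ["APPINSIGHTS_JAVASCRIPT_ENABLED"] = "true"
print(javascript_injection_enabled())  # True
```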

Feedback

If you have any questions or are experiencing any problems with the JavaScript SDK, feel free to open an issue on GitHub.
Source: Azure

GDPR Questions? Azure has answers.

Microsoft is here to help

Please have a look at our white paper How Microsoft Azure Can Help Organizations Become Compliant with the EU General Data Protection Regulation to gain an understanding of how your organization can use currently available features in Azure to optimize your preparation for GDPR compliance. We are here to help you with your compliance efforts in the face of the coming EU law.

May 25, 2018: a new era begins for data privacy

On this date, a little less than a year from now, the new European Union (EU) data protection law takes effect, replacing the Data Protection Directive that has been in force since 1995. The new law, known as the General Data Protection Regulation (GDPR), gives individuals greater control over their personal data and imposes many new obligations on organizations that collect, handle, or analyze personal data.

This is what we do

Azure has a long tradition of compliance, giving our customers the tools they need to comply with complex regulations. Our attention to, and preparation for, the impact of the GDPR shows that we prioritize compliance offerings just as highly as cloud technology.

Additional information about how Microsoft helps you fulfill specific GDPR requirements is available in the GDPR section of the Microsoft Trust Center.
Source: Azure