Azure Container Registry Private Link support preview for virtual networks

Azure Container Registry announces preview support for Azure Private Link, a means of limiting a registry's network traffic to a customer-defined virtual network within Azure.

With Private Link, the registry endpoints are assigned private IP addresses, routing traffic within a customer-defined virtual network. Private network support has been one of the top customer asks; it allows customers to benefit from Azure's management of their registry while maintaining tightly controlled network ingress and egress.
  

Private Link is available across a wide range of Azure resources, with more coming soon, enabling diverse container workloads to run with the security of a private virtual network.

Private Endpoints and Public Endpoints

Private Link makes private endpoints available through private IP addresses. For example, the contoso.azurecr.io registry can have a private IP of 10.0.0.6 that is reachable only from resources in contoso-aks-eastus-vnet. This allows the resources in this VNet to communicate with the registry securely, and the other resources in the solution may likewise be restricted to the VNet.

At the same time, the public endpoint for the contoso.azurecr.io registry may remain available for the development team. In a coming release, Azure Container Registry (ACR) Private Link will support disabling the public endpoint, limiting access to only the private endpoints configured under Private Link.
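Conceptually, the same registry name resolves differently depending on where the lookup originates. The sketch below is an illustrative model of that split resolution; the names and IPs echo the example above, the public IP is hypothetical, and real resolution is handled by Azure Private DNS zones rather than application code:

```python
# Illustrative model of split-horizon name resolution for a Private
# Link-enabled registry. Not how Azure Private DNS is implemented.

PRIVATE_RECORDS = {
    # Visible only to resources inside contoso-aks-eastus-vnet
    "contoso.azurecr.io": "10.0.0.6",
}
PUBLIC_RECORDS = {
    # Hypothetical public IP for the registry's public endpoint
    "contoso.azurecr.io": "20.42.0.10",
}

def resolve(name, client_vnet=None):
    """Return the private IP when the client sits in the linked VNet,
    otherwise fall back to the public endpoint."""
    if client_vnet == "contoso-aks-eastus-vnet" and name in PRIVATE_RECORDS:
        return PRIVATE_RECORDS[name]
    return PUBLIC_RECORDS[name]
```

A node in the AKS VNet resolves the registry to 10.0.0.6 and its image pulls stay on the private network; a developer workstation outside the VNet still reaches the public endpoint.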

Cross-tenant manual approval support

Customers looking to establish a private link between two Azure tenants, where an Azure container registry is in one tenant and container hosts are in other tenants, can use the Private Link manual approval workflow. This workflow enables many Azure services, including Azure Machine Learning, to securely interact with your registry. Development teams working in different subscriptions and tenants may also use manual approval to grant access.

Service Endpoints and Private Links

ACR Service Endpoint preview support was released in March 2019. Service Endpoints provide access from Azure VNets through IP tagging, and all traffic to the service endpoint is routed over the Azure backbone network. The public endpoint still exists; however, firewall rules limit public access. Private Link takes this a step further by providing a private endpoint (IP address). Because Private Link is more secure and offers a superset of Service Endpoint capabilities, Private Link support will replace Azure Container Registry Service Endpoint support. While both Service Endpoints and Private Link are currently in preview, we plan to release Private Link capabilities as generally available shortly. We encourage Service Endpoint customers to evaluate ACR Private Link capabilities.

Preview support and limitations

During the preview period, private link support is limited to registries that are not geo-replicated. The feature will move to general availability as we assess feedback and geo-replication support is complete.

We’ve heard clearly that customers requiring private networks also require production support. As such, all support requests will be honored through standard support channels.

Regional support and pricing

Azure Container Registry Private Link support is available across 28 regions through the premium tier.

Additional links:

Learn more about Azure Container Registry.
Azure Container Registry pricing.
Configure Azure Private Link for an Azure Container Registry.

Source: Azure

Azure Government Secret accredited at DoD IL6, ICD 503 with IaaS and PaaS

Accelerate classified missions with unparalleled connectivity, high availability, and resiliency across three regions with more than 35 services

Azure Government Secret recently achieved Provisional Authorization (PA) at Department of Defense Impact Level 6 (IL6) and Intelligence Community Directive (ICD) 503 with facilities at ICD 705. We’re also announcing a third region to enable even higher availability for national security missions to stay ahead of their unique threats.

Built exclusively for the needs of the US government and operated by cleared US citizens, Azure Government Secret delivers dedicated regions to maintain the security and integrity of classified Secret workloads while enabling reliable access to critical data. As the first cloud natively connected to classified networks, Azure Government Secret enables customers to leverage options for private, resilient, high-bandwidth connectivity.

Protect national security production workloads with geodiversity across three regions

Azure Government Secret is designed for the unique requirements of critical national security workloads that cannot be served out of a single geographic location. To provide the geodiversity required, Azure Government Secret delivers across three dedicated regions for US Federal Civilian, Department of Defense (DoD), Intelligence Community (IC), and US government partners working within Secret enclaves. These dedicated Azure regions are located over 500 miles apart to enable applications to stay running in the face of a disaster without a break in continuity of operations.

In addition, these regions provide greater choice when working across multiple locations and delivering cloud-to-edge scenarios. With comprehensive cloud services, Azure Government Secret enables faster innovation for the mission from cloud to tactical edge, meeting the critical availability needs of the warfighter.

Enabling classified missions at scale with more than 35 services

Designed and built for Infrastructure as a Service (IaaS), Platform as a Service (PaaS), Software as a Service (SaaS), and Marketplace solutions, Azure Government Secret provides a broad range of commercial innovation for classified workloads. Services include identity, analytics, security, and high-performance computing to support advanced artificial intelligence (AI) and machine learning.

Operated by cleared US citizens, these new regions are part of Azure Government, delivering a familiar, consistent experience and alignment with existing resellers and programs. Eligible customers can also leverage cleared Microsoft cloud support for their workloads.

Gain speed by connecting directly or extending on-premises networks

With Azure Government Secret, customers can connect natively to classified networks or leverage options for private, resilient, high-bandwidth connectivity using ExpressRoute and ExpressRoute Direct:

Native connection: Agencies with direct connections through US government classified networks can connect natively to Azure Government Secret.
ExpressRoute: Extend on-premises networks into Azure Government Secret regions over a private connection facilitated by a connectivity provider with ExpressRoute.
ExpressRoute Direct: Get the ability to connect directly into Azure Government Secret locations using ExpressRoute Direct.

Continued investments in commercial parity across data classifications

In addition to serving mission customers at DoD IL6 and ICD 503, we continue to invest in rapidly delivering new Azure Government capabilities to support mission needs across all data classifications for any US government customer. In the last six months we’ve continued our drive toward commercial parity, adding hundreds of features and launching more than 40 new services, for a total of 101 services at FedRAMP High, with more to come across Azure commercial, Azure Government, and Azure Government Secret.

These continued investments enable customers across the full spectrum of government, including departments in every state, all the federal cabinet agencies, and each military branch, to modernize their IT to better achieve their missions.

To learn more about Azure Government Secret, contact us or visit Azure Government for national security.
Source: Azure

Azure Dedicated Host: New capabilities and benefits

Late last year, we announced the general availability of Azure Dedicated Hosts. This blog provides an update on the new capabilities added since we introduced Azure Dedicated Hosts in preview.

Azure Dedicated Host provides a single-tenant physical server to run your Azure Virtual Machines for Windows Server and Linux. With Azure Dedicated Host, you can address specific compliance requirements while increasing visibility and control over your underlying infrastructure.

What’s new

Save costs with Azure Dedicated Hosts reservations

We recently introduced the ability to purchase Azure reservations for Dedicated Hosts, so you can now reduce costs by buying Azure Dedicated Hosts reservations. The reservation discount is applied automatically to the number of running dedicated hosts that match the reservation scope and attributes; you don't need to assign a reservation to a specific dedicated host to get the discounts. You may also delete and create hosts, and the reservation will apply to whichever matching hosts are deployed at any given time.

The Azure Dedicated Hosts pricing page contains the complete list of Dedicated Hosts SKUs, their CPU information, and various pricing options including Azure reservations discounts.

Azure Dedicated Host SKUs, unlike Azure Virtual Machines, are defined based on the virtual machine (VM) series and hardware generation. With Azure Dedicated Hosts, your reservation will automatically apply to any host SKUs supporting the same VM series. For example, if you acquired a reservation for Dsv3_Type1 dedicated host, you would be able to use it with Dsv3_Type2 dedicated hosts.
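Since reservations match on VM series rather than on an exact host SKU, the matching rule can be sketched as follows (an illustrative helper based on the SKU naming in the example above, not an Azure API):

```python
def host_series(host_sku):
    """Extract the VM series from a dedicated host SKU like 'Dsv3_Type1'."""
    return host_sku.split("_")[0]

def reservation_applies(reservation_sku, host_sku):
    """A Dedicated Host reservation applies to any host SKU of the same
    VM series, regardless of hardware generation (Type1, Type2, ...)."""
    return host_series(reservation_sku) == host_series(host_sku)
```

So a reservation bought for Dsv3_Type1 also covers Dsv3_Type2 hosts, but not Esv3 hosts.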

Maintenance control for platform updates supports Azure Dedicated Hosts

The maintenance control feature for Azure Dedicated Hosts gives customers with highly sensitive workloads control over platform maintenance operations. Using this feature, customers can manage platform updates that don’t require a reboot. Maintenance control batches updates into one update package and gives you the option to delay platform updates and apply them within a 35-day rolling window.

You can take advantage of this new capability by creating a maintenance configuration object and then applying it to your dedicated hosts. You can then check for pending updates and apply them at the host level. All VMs assigned to the host will be impacted at the same time.

Prior to applying the maintenance, you can check the impact type and expected duration of the impact.
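As a rough model of the 35-day rolling window (illustrative only; the platform enforces the actual deadlines):

```python
from datetime import date, timedelta

# Pending non-reboot updates can be deferred within a 35-day rolling window.
MAINTENANCE_WINDOW = timedelta(days=35)

def apply_by(update_available):
    """Latest date a pending platform update can be deferred to before
    the platform applies it automatically."""
    return update_available + MAINTENANCE_WINDOW

def must_apply_now(update_available, today):
    """True once the deferral window for an update has elapsed."""
    return today >= apply_by(update_available)
```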

To learn more, refer to our documentation Control updates with Maintenance Control.

More options with new SKUs

Since the preview was announced, we have added support for additional VM series and host types. We currently support both Intel and AMD SKUs across a variety of VM series: Dsv3, Esv3, Dasv4, Easv4, Fsv2, Lsv2, and Msv2. This enables our customers to run a broad range of workloads on Dedicated Hosts, including, but not limited to, general-purpose, memory-intensive, storage-intensive, and compute-intensive applications.

Visit the Azure Dedicated Host pricing page to learn more about these new SKUs and the options available to you.

Resource Health Activity Log Alerts for Dedicated Hosts

Azure Resource Health alerts can notify you in near real-time when your dedicated hosts experience a change in their health status. Creating Resource Health alerts programmatically lets users create and customize alerts in bulk. You can create an action group and specify the steps to take once an alert is triggered. Follow the steps to create activity log alerts using an Azure Resource Manager template, and remember to modify the template to include resources of type dedicated hosts.
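The key piece when adapting the template is the alert condition. Below is a sketch of the relevant fragment, expressed as a Python dict mirroring the ARM JSON; the property names follow the common activityLogAlerts pattern and the dedicated-host resource type is an assumption, so verify both against the template reference before use:

```python
# Sketch of the condition/actions portion of a Resource Health activity
# log alert for dedicated hosts. Schema details and the resourceType
# value are assumptions to be checked against the ARM template reference.
resource_health_alert = {
    "condition": {
        "allOf": [
            {"field": "category", "equals": "ResourceHealth"},
            # Scope the alert to dedicated host resources (assumed type):
            {"field": "resourceType", "equals": "microsoft.compute/hostgroups/hosts"},
        ]
    },
    "actions": {
        # Placeholder: resource ID of the action group to notify.
        "actionGroups": [{"actionGroupId": "<action-group-resource-id>"}]
    },
}
```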

 

Get started

Start by visiting the Azure Dedicated Host page, read more in the documentation page, or watch a video introduction to Azure Dedicated Host.

Deploy Dedicated Host using Azure CLI, the Azure portal, Azure REST API, or Azure PowerShell.
Source: Azure

Azure Cost Management + Billing updates – March 2020

Whether you're a new student, thriving startup, or the largest enterprise, you have financial constraints and you need to know what you're spending, where, and how to plan for the future. Nobody wants a surprise when it comes to the bill, and this is where Azure Cost Management + Billing comes in.

We're always looking for ways to learn more about your challenges and how Azure Cost Management + Billing can help you better understand where you're accruing costs in the cloud, identify and prevent bad spending patterns, and optimize costs to empower you to do more with less. Here are a few of the latest improvements and updates based on your feedback:

Pay-As-You-Go (PAYG) invoice improvements.
Tell us about your reporting goals.
What's new in Cost Management Labs.
New ways to save money with Azure.
Upcoming changes to Azure usage data.
New videos and learning opportunities.
Documentation updates.

Let's dig into the details.

 

Pay-As-You-Go (PAYG) invoice improvements

Managing and staying up to date on your Azure invoices just got a whole lot better with a few key improvements for Pay-As-You-Go (PAYG) subscriptions:

You can now view and download your support plan invoices in the Azure portal, giving you a one-stop destination for all invoices.
You can opt in to email notifications for your support plan invoices to have a PDF copy sent directly to your inbox. This is available for all PAYG invoices in the Azure portal.
You can verify payment status for all your invoices at a glance with the new Payment status column.
And, if you pay with a credit card, you can also correlate the charge on your credit card statement back to the portal with the new Invoice ID column. You can also use this to break down your charges in cost analysis.
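For example, reconciling a card statement against exported cost data amounts to summing charges per invoice ID. A pure-Python sketch over hypothetical exported rows (field names are illustrative, not the actual export schema):

```python
from collections import defaultdict

def charges_by_invoice(rows):
    """Sum cost rows by invoice ID so each total can be matched against
    the corresponding charge on a credit card statement."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["invoice_id"]] += row["cost"]
    return dict(totals)

# Hypothetical exported cost records
rows = [
    {"invoice_id": "INV-001", "cost": 12.50},
    {"invoice_id": "INV-001", "cost": 7.25},
    {"invoice_id": "INV-002", "cost": 3.00},
]
```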

These are all based on your feedback, so please keep it coming. Our goal is to make it easier than ever to manage and pay your invoices. What would you like to see next?

 

Tell us about your reporting goals

As you know, we're always looking for ways to learn more about your needs and expectations. This month, we'd like to learn about the most important reporting tasks and goals you have when managing and optimizing costs. We'll use your inputs from this survey to help prioritize reporting improvements within Cost Management + Billing experiences over the coming months. The 12-question survey should take about 10 minutes.

Take the survey.

 

What's new in Cost Management Labs

With Cost Management Labs, you get a sneak peek at what's coming in Azure Cost Management and can engage directly with us to share feedback and help us better understand how you use the service, so we can deliver more tuned and optimized experiences. Here are a few features you can see in Cost Management Labs:

More details in the cost by resource view—Now available in the public portal
Drill in to the cost of your resources to break them down by meter. Simply expand the row to see more details or click the link to open and take action on your resources.
Explain what "not applicable" means—Now available in the public portal
Break down "not applicable" to explain why specific properties don't have values within cost analysis.

Of course, that's not all. Every change in Azure Cost Management is available in Cost Management Labs a week before it's in the full Azure portal. We're eager to hear your thoughts and understand what you'd like to see next. What are you waiting for? Try Cost Management Labs today.

 

New ways to save money with Azure

Lots of cost optimization improvements over the past month. Here are a few you might be interested in:

Save more when you prepay for Azure Cache for Redis.
Save up to 33 percent on Cosmos DB multi-master accounts.
Develop, test, and run small workloads with the Cosmos DB free tier.
Take advantage of EventGrid Premium tier for free while in preview.

 

Upcoming changes to Azure usage data

Many organizations use the full Azure usage and charges data to understand what's being used, identify which charges should be internally billed to which teams, or look for opportunities to optimize costs with Azure reservations and Azure Hybrid Benefit, just to name a few. If you're doing any analysis or have set up integration based on product details in the usage data, please update your logic for the following services.

The following changes will take effect starting April 1:

VM NVv4 meter names changing.
IP prefix meter ID changing for Azure Government.

Also, remember the key-based Enterprise Agreement (EA) billing APIs have been replaced by new Azure Resource Manager APIs. The key-based APIs will still work through the end of your enrollment, but will no longer be available when you renew and transition into Microsoft Customer Agreement. Please plan your migration to the latest version of the UsageDetails API to ease your transition to Microsoft Customer Agreement at your next renewal.

 

New videos and learning opportunities

For those visual learners out there, we have a wealth of new videos this month:

Brief overview of Azure Cost Management and Cloudyn (6 minutes). 
Optimizing cloud investments in Azure Cost Management (6 minutes).
Sharing and saving views in Azure Cost Management (4 minutes).
Achieve accountability through budgeting in ACM (7 minutes).
Tools and tips to optimize cost and performance with Azure Cosmos DB (9 minutes). 
Choosing the right partition key for cost and performance with Azure Cosmos DB (14 minutes).
Azure Cosmos DB Free Tier and Autopilot (14 minutes).
Is Azure Free Account really free? (5 minutes). 

Follow the Azure Cost Management + Billing YouTube channel to stay in the loop with new videos as they're released and let us know what you'd like to see next.

Want a more guided experience? Start with Predict costs and optimize spending for Azure.

 

Documentation updates

Here are a few documentation updates you might be interested in:

Documented how rounding is handled in Cost Management + Billing.
Added details about Azure Lighthouse to the Link partner ID article.
Updated the list of available reservation types.
Noted Cloudyn deprecation at the end of 2020.

Want to keep an eye on all of the documentation updates? Check out the Cost Management + Billing doc change history in the azure-docs repository on GitHub. If you see something missing, select Edit at the top of the document and submit a quick pull request.

 

What's next?

These are just a few of the big updates from last month. We're always listening and making constant improvements based on your feedback, so please keep the feedback coming.

Follow @AzureCostMgmt on Twitter and subscribe to the YouTube channel for updates, tips, and tricks. And, as always, share your ideas and vote up others in the Cost Management feedback forum.
Source: Azure

Keeping your cloud deployments secure during challenging times

As the world comes together to combat COVID-19, and remote work becomes a critical capability for many companies, customers have asked us how to best maintain the security posture of their cloud assets while enabling more remote workers to access them.

Misconfiguration of cloud security controls has been at the root of several recent data breaches, so it’s extremely important to continue monitoring your security posture as usage of cloud assets increases.

To help you prioritize the actions that you need to take, we are listing three common scenarios for remote workers and how to leverage Azure Security Center security controls to prioritize relevant recommendations for these scenarios:

1. As more users need to access resources remotely, you need to ensure that Multi-Factor Authentication (MFA) is enabled to enhance their identity protection.

Azure Security Center has a security control called Enable MFA. Ideally, you should remediate all recommendations that are part of this security control.

2. Some users might need remote access via RDP or SSH to servers that are in your Azure infrastructure.

Instead of allowing full 24 x 7 access to those servers, ensure that you are using Just-In-Time (JIT) VM access to those servers. Make sure to review the Secure management ports control in Azure Security Center and remediate the recommendations that are relevant for this scenario.

3. Some of the workloads (servers, containers, databases) that will be accessed remotely by users might be missing critical security updates.

Review the Remediate vulnerabilities control in Azure Security Center to prioritize the updates that must be installed. Make sure to review the results of all recommendations from the built-in vulnerability assessment and remediate those items.

Security posture management is an ongoing process. Review your secure score to understand your progress towards a fully compliant environment.

Users of Azure are likely just a portion of your user base. Below is additional guidance on enabling and securing remote work for the rest of your organization:

The top 9 ways Microsoft IT is enabling remote work for its employees
Staying productive while working remotely with Microsoft Teams
Working remotely during challenging times
Work remotely, stay secure—guidance for CISOs

Source: Azure

Microsoft powers transformation at NVIDIA’s GTC Digital Conference

The world of supercomputing is evolving. Work once limited to high-performance computing (HPC) on-premises clusters and traditional HPC scenarios, is now being performed at the edge, on-premises, in the cloud, and everywhere in between. Whether it’s a manufacturer running advanced simulations, an energy company optimizing drilling through real-time well monitoring, an architecture firm providing professional virtual graphics workstations to employees who need to work remotely, or a financial services company using AI to navigate market risk, Microsoft’s collaboration with NVIDIA makes access to NVIDIA graphics processing units (GPU) platforms easier than ever.

These modern needs require advanced solutions that were traditionally limited to a few organizations because they were hard to scale and took a long time to deliver. Today, Microsoft Azure delivers HPC capabilities, a comprehensive AI platform, and the Azure Stack family of hybrid and edge offerings that directly address these challenges.

This year during GTC Digital, we’re spotlighting some of the most transformational applications powered by NVIDIA GPU acceleration that highlight our commitment to edge, on-prem, and cloud computing. Registration is free, so sign up to learn how Microsoft is powering transformation.

Visualization and GPU workstations

Azure enables a wide range of visualization workloads, which are critical for desktop virtualization as well as professional graphics such as computer-aided design, content creation, and interactive rendering. Visualization workloads on Azure are powered by NVIDIA’s world-class GPUs and Quadro technology, the world’s preeminent visual computing platform. With access to graphics workstations on Azure cloud, artists, designers, and technical professionals can work remotely, from anywhere, and from any connected device. See our NV-Series virtual machines (VMs) for Windows and Linux.

Artificial intelligence

We’re sharing the release of the updated execution provider in ONNX Runtime with integration for NVIDIA TensorRT 7. With this update, ONNX Runtime can execute Open Neural Network Exchange (ONNX) models on NVIDIA GPUs in the Azure cloud and at the edge using Azure Stack Edge, taking advantage of new features in TensorRT 7 like dynamic shape, mixed precision optimizations, and INT8 execution.

Dynamic shape support enables users to run variable batch sizes, which ONNX Runtime uses to process recurrent neural network (RNN) and BERT (Bidirectional Encoder Representations from Transformers) models. Mixed precision and INT8 execution speed up execution on the GPU, which enables ONNX Runtime to better balance performance across CPU and GPU. Originally released in March 2019, TensorRT with ONNX Runtime delivers better inferencing performance on the same hardware compared to generic GPU acceleration.
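ONNX Runtime selects execution providers in priority order, falling back from TensorRT to CUDA to CPU when a provider is unavailable. A minimal sketch of that selection logic (the provider names are the real ONNX Runtime identifiers; session creation itself is omitted, so this only models the ordering):

```python
# Preferred execution-provider order for a TensorRT-capable deployment.
PREFERRED_PROVIDERS = [
    "TensorrtExecutionProvider",  # NVIDIA TensorRT 7 (dynamic shape, mixed precision, INT8)
    "CUDAExecutionProvider",      # generic CUDA GPU acceleration
    "CPUExecutionProvider",       # always-available fallback
]

def choose_providers(available):
    """Keep only the providers present in this build/runtime, preserving
    the preferred fallback order. The result mirrors what would be passed
    as the providers argument when creating an inference session."""
    available = set(available)
    return [p for p in PREFERRED_PROVIDERS if p in available]
```

On a CPU-only build, `choose_providers(["CPUExecutionProvider"])` leaves just the CPU fallback.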

Additionally, the Azure Machine Learning service now supports RAPIDS, a high-performance, GPU-accelerated data science framework built on the NVIDIA CUDA platform. Azure developers can use RAPIDS in the same way they currently use other machine learning frameworks, and in conjunction with Pandas, Scikit-learn, PyTorch, and TensorFlow. These two developments represent major milestones toward a truly open and interoperable ecosystem for AI. We’re working to ensure these platform additions will simplify and enrich developer experiences.

Edge

Microsoft provides various solutions in the Intelligent Edge portfolio to empower customers to make sure that machine learning not only happens in the cloud but also at the edge. The solutions include Azure Stack Hub, Azure Stack Edge, and IoT Edge.

Whether you are capturing sensor data and inferencing at the edge, or performing end-to-end processing with model training in Azure and leveraging the trained models at the edge for enhanced inferencing operations, Microsoft can support your needs however and wherever you need.

Supercomputing scale

Time-to-decision is incredibly important with a global economy that is constantly on the move. With the accelerated pace of change, companies are looking for new ways to gather vast amounts of data, train models, and perform real-time inferencing in the cloud and at the edge. The Azure HPC portfolio consists of purpose-built computing, networking, storage, and application services to help you seamlessly connect your data and processing needs with infrastructure options optimized for various workload characteristics.

Azure Stack Hub announced preview

Microsoft, in collaboration with NVIDIA, is announcing that Azure Stack Hub with Azure NC-Series Virtual Machine (VM) support is now in preview. Azure NC-Series VMs are GPU-enabled Azure Virtual Machines available on the edge. GPU support in Azure Stack Hub unlocks a variety of new solution opportunities. With our Azure Stack Hub hardware partners, customers can choose the appropriate GPU for their workloads to enable artificial intelligence training, inference, and visualization scenarios.

Azure Stack Hub brings together the full capabilities of the cloud to effectively deploy and manage workloads that otherwise are not possible to bring into a single solution. We are offering two NVIDIA GPU models during the preview period: the NVIDIA V100 Tensor Core and NVIDIA T4 Tensor Core GPUs. These physical GPUs align with Azure N-Series VM types as follows:

NCv3 (NVIDIA V100 Tensor Core GPU): These enable learning, inference, and visualization scenarios. See Standard_NC6s_v3 for a similar configuration.
TBD (NVIDIA T4 Tensor Core GPU): This new VM size (available only on Azure Stack Hub) enables light learning, inference, and visualization scenarios.

Hewlett Packard Enterprise is supporting the Microsoft GPU preview program as part of its HPE ProLiant for Microsoft Azure Stack Hub solution. “The HPE ProLiant for Microsoft Azure Stack Hub solution with the HPE ProLiant DL380 server nodes are GPU-enabled to support the maximum CPU, RAM, and all-flash storage configurations for GPU workloads,” said Mark Evans, WW product manager, HPE ProLiant for Microsoft Azure Stack Hub, at HPE. “We look forward to this collaboration that will help customers explore new workload options enabled by GPU capabilities.”

As the leading cloud infrastructure provider1, Dell Technologies helps organizations remove cloud complexity and extend a consistent operating model across clouds. Working closely with Microsoft, the Dell EMC Integrated System for Azure Stack Hub will support additional GPU configurations, which include NVIDIA V100 Tensor Core GPUs, in a 2U form factor. This will provide customers increased performance density and workload flexibility for the growing predictive analytics and AI/ML markets. These new configurations also come with automated lifecycle management capabilities and exceptional support.

To participate in the Azure Stack Hub GPU preview, please send us an email today. 

Azure Stack Edge preview

We also announced the expansion of our Microsoft Azure Stack Edge preview with the NVIDIA T4 Tensor Core GPU. Azure Stack Edge is a cloud managed appliance that provides processing for fast local analysis and insights to the data. With the addition of an NVIDIA GPU, you’re able to build in the cloud then run at the edge. For more information about this exciting release please see the detailed blog.

GTC Digital

Microsoft session recordings will be available on the GTC Digital site starting March 26. You can find a list of the Microsoft digital sessions along with corresponding links in the Microsoft Tech Community blog here.

1 IDC WW Quarterly Cloud IT Infrastructure Tracker, Q3 2019, January 2020, Vendor Revenue
Source: Azure

Microsoft is expanding the Azure Stack Edge with NVIDIA GPU preview

We’re expanding the Microsoft Azure Stack Edge with NVIDIA T4 Tensor Core GPU preview during the GPU Technology Conference (GTC Digital). Azure Stack Edge is a cloud-managed appliance that brings Azure’s compute, storage, and machine learning capabilities to the edge for fast local analysis and insights. With the included NVIDIA GPU, you can bring hardware acceleration to a diverse set of machine learning (ML) workloads.

What’s new with Azure Stack Edge

At Mobile World Congress in November 2019, we announced a preview of the NVIDIA GPU version of Azure Stack Edge and we’ve seen incredible interest in the months that followed. Customers in industries including retail, manufacturing, and public safety are using Azure Stack Edge to bring Azure capabilities into the physical world and unlock scenarios such as the real-time processing of video powered by Azure Machine Learning.

These past few months, we’ve taken our customers' feedback to make key improvements and are excited to make our preview available to even more customers today.

If you’re not already familiar with Azure Stack Edge, here are a few of the benefits:

Azure Machine Learning: Build and train your model in the cloud, then deploy it to the edge for FPGA or GPU-accelerated inferencing.
Edge Compute: Run IoT, AI, and business applications in containers at your location. Use these to interact with your local systems, or to pre-process your data before it transfers to Azure.
Cloud Storage Gateway: Automatically transfer data between the local appliance and your Azure Storage account.  Azure Stack Edge caches the hottest data locally and speaks file and object protocols to your on-prem applications.
Azure-managed appliance: Easily order and manage Azure Stack Edge from the Azure Portal.  No initial capex fees; pay as you go, just like any other Azure service.

Enabling our partners to bring you world-class business applications

Equally important to bringing you a great device is enabling our partners to bring you innovative applications to meet your business needs.  We’d love to share some of the continued investment we’re making with partners to bring their exciting developments to you.

As self-checkouts grow in prevalence, Malong Technologies is innovating in AI applications for loss prevention.

“For our customers in the retail industry, artificial intelligence innovation is happening at the edge,” said Matt Scott, co-founder and chief executive officer, Malong Technologies. “Along with our state-of-the-art solutions, our customers need hardware that is powerful, reliable, and custom-tailored for the cloud. Microsoft’s Azure Stack Edge fits the bill perfectly. We’re proud to be a Microsoft Gold Certified Partner, working with Microsoft to help our retail customers succeed.”

Increasing your manufacturing organization’s quality inspection accuracy is key to Mariner’s Spyglass Visual Inspection application.

“Mariner has standardized on Microsoft’s Azure Stack Edge for our Spyglass Visual Inspection and Spyglass Connected Factory products. These solutions are mission critical to our manufacturing customers. Azure Stack Edge provides the performance, stability and availability they require.” – Phil Morris, CEO, Mariner

Building computer vision solutions to improve performance and safety in manufacturing and other industries is a key area of innovation for XXII.

“XXII is thrilled to be a Microsoft partner, and we are working together to provide our clients with real-time video analysis software on the edge with the Azure Stack Edge box. With this solution, Azure allows us to harness the full potential of NVIDIA GPUs directly on the edge and to provide our clients in retail, industry, and smart cities with smart video analysis that is easily deployable, scalable, and manageable with Azure Stack Edge.” – Souheil Hanoune, Chief Scientific Officer, XXII

More to come with Azure Stack Edge

There are even more exciting developments with Azure Stack Edge coming. We’re putting the final touches on much-awaited new compute and AI capabilities including virtual machines, Kubernetes clusters, and multi-node support. Along with these new features announced at Ignite 2019, Data Box Edge was renamed Azure Stack Edge to align with the Azure Stack portfolio.

Our Rugged series for sites with harsh or remote environments is also coming this year, including the battery-powered form-factor that can be carried in a backpack. The versatility of these Azure Stack Edge form-factors and cloud-managed capabilities brings cloud intelligence and compute to retail stores, factory floors, hospitals, field operations, disaster zones, and rescue operations.

Get started with the Azure Stack Edge with NVIDIA GPU preview

Thank you for continuing to partner with us as we bring new capabilities to Azure Stack Edge. We’re looking forward to hearing from you.

To get started with the preview, please email us and we’ll follow up to learn more about your scenarios.
Learn more about Azure Stack Edge.

Learn more about Azure’s Hybrid Strategy

Read about more updates from Azure during NVIDIA’s GTC.
Quelle: Azure

How Azure Machine Learning service powers suggested replies in Outlook

Microsoft 365 applications are so commonplace that it’s easy to overlook some of the amazing capabilities that are enabled with breakthrough technologies, including artificial intelligence (AI). Microsoft Outlook is an email client that helps you work efficiently with email, calendar, contacts, tasks, and more in a single place.

To help users be more productive and deliberate in their actions while emailing, the web version of Outlook and the Outlook for iOS and Android app have introduced suggested replies, a new feature powered by Azure Machine Learning service. Now when you receive an email message that can be answered with a quick response, Outlook on the web and the Outlook mobile app suggest three response options that you can use to reply with only a couple of clicks or taps. By reducing the time and effort involved in replying to an email, the feature helps people communicate in both their work and personal lives.

The team behind suggested replies comprises data scientists, designers, and machine learning engineers with diverse backgrounds, working to improve the lives of Microsoft Outlook users by expediting and simplifying communications. They apply cutting-edge natural language processing (NLP) and machine learning (ML) technologies to understand how users communicate through email and to improve those interactions, creating a more productive experience for users.

A peek under the hood

To process the massive amount of raw data that these interactions provide, the team uses Azure Machine Learning pipelines to build their training models. Azure Machine Learning pipelines allow the team to divide training into discrete steps such as data cleanup, transforms, feature extraction, training, and evaluation, with the pipeline converting raw data into a trained model. The pipeline also lets the data scientists build training workflows in a compliant manner that enforces privacy and compliance checks.

In order to train this model, the team needed a way to build and prepare a large data set comprising over 100 million messages. To do this, the team leveraged a distributed processing framework to sample and retrieve data from a broad user base.

Azure Data Lake Storage (ADLS) is used to store the training data for the suggested replies models. The team cleans and curates the data into message-reply pairs (a message together with potential responses to it), which are stored in ADLS and consumed by the training pipelines. To conduct the machine learning training itself, the team uses GPU pools available in Azure. The training pipelines use these curated message-reply pairs to learn how to suggest appropriate replies to a given message. Once a model is created, data scientists can compare its performance with previous models and evaluate which approaches are better at recommending relevant suggested replies.
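The chained-discrete-steps pattern the team describes can be sketched in plain Python. This is an illustration of the pattern only, with invented step names and toy data, not the Azure Machine Learning SDK or the Outlook team's actual pipeline; in a real Azure ML pipeline each function below would be a pipeline step running on Azure compute.

```python
# Illustrative sketch of a discrete-step training pipeline: the output of
# each stage is the input of the next. Step names and data are toy examples.

def clean(raw_messages):
    """Data cleanup: drop empty messages and normalize whitespace."""
    return [m.strip() for m in raw_messages if m and m.strip()]

def extract_pairs(messages):
    """Feature extraction: form (message, reply) pairs from a threaded list."""
    return list(zip(messages[:-1], messages[1:]))

def train(pairs):
    """Training: a toy 'model' that memorizes a reply per message."""
    model = {}
    for msg, reply in pairs:
        model.setdefault(msg, reply)
    return model

def evaluate(model, pairs):
    """Evaluation: fraction of pairs whose reply the model reproduces."""
    hits = sum(1 for msg, reply in pairs if model.get(msg) == reply)
    return hits / len(pairs) if pairs else 0.0

# Chain the steps, mirroring cleanup -> pair extraction -> training -> evaluation.
raw = ["Can you meet at 3pm? ", "", "Sure, see you then.", "Thanks!"]
pairs = extract_pairs(clean(raw))
model = train(pairs)
print(evaluate(model, pairs))  # a toy model that memorizes its training data scores 1.0
```

Because each stage has a well-defined input and output, stages can be developed, cached, and re-run independently, which is the property the team relies on when splitting work across pipeline steps.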

The Outlook team helps protect your data by using the Azure platform to prepare large-scale data sets that are required to build a feature like suggested replies in accordance with Office 365 compliance standards. The data scientists use Azure compute and workflow solutions that enforce privacy policies to create experiments and train multiple models on GPUs. This helps with the overall developer experience and provides agility in the inner development loop cycle.

This is just one of many examples of how Microsoft products are powered by the breakthrough capabilities of Azure AI to create better user experiences. The team is learning from feedback every day and improving the feature for users while also expanding the types of suggested replies offered. Keep following the Azure blog to stay up-to-date with the team and be among the first to know when this feature is released.

Learn more

Learn more about the Azure Machine Learning service.

Get started with a free trial of Azure Machine Learning service.
Quelle: Azure

Our commitment to customers and Microsoft cloud services continuity

Over the past several weeks, all of us have come together to battle the global health pandemic. During this time, organizations around the world are adjusting the way they manage their daily work and how their workforce continues to operate in the face of extraordinary changes to their professional and personal lives.

With this blog we wanted to share a bit about what we have learned over the last few weeks, resources to help organizations manage through these times, support for critical first responders and emergency organizations, and the criteria we have put in place to manage cloud services capacity to support critical operations. 

We will continue to communicate regularly and openly, so you can have insight into what we are seeing, learning and doing.

As companies operationalize to address new and unique challenges, we have mobilized our global response plan to help customers stay up and running during this critical time. We are actively monitoring performance and usage trends 24/7 to ensure we are optimizing our services for customers worldwide, while accommodating new demand. We are working closely with first responder organizations and critical government agencies to ensure we are prioritizing their unique needs and providing them our fullest support. We are also partnering with governments around the globe to ensure our local datacenters have on-site staffing and all functions are running properly.

In response to health authorities emphasizing the importance of social distancing, we are supporting many large-scale corporations, schools, and governments in the mobilization of remote workforces. Microsoft Teams is helping millions of people adapt to remote work. Organizations have been using Dynamics 365 Customer Service to help contact center employees provide consistent, personalized support while working remotely. Ensuring government and organizational functions can continue while keeping safe distances is critical to our society today.

As demand continues to grow, if we are faced with any capacity constraints in any region during this time, we have established clear criteria for the priority of new cloud capacity. Top priority will go to first responders, health and emergency management services, critical government infrastructure organizations, and ensuring remote workers stay up and running with the core functionality of Teams. We will also consider adjusting free offers, as necessary, to ensure support of existing customers.

We will continue to communicate with customers proactively and transparently about our cloud policies through the Microsoft Trust Center and we are committed to supporting every customer through this difficult period. 

These are certainly unprecedented and challenging times. It is not business as usual. But, together, we can and will get through this. We will be back in touch soon. In the meantime, if you have any immediate questions or needs, please refer to the following resources.

Azure Service Health – for tracking and understanding your Azure service health
Microsoft 365 Service health and continuity – tools and resources for understanding your Microsoft 365 service health

Quelle: Azure

Filesystem SDKs for Azure Data Lake Storage Gen2 now generally available

Since the general availability of Azure Data Lake Storage (ADLS) Gen2 in Feb 2019, customers have been getting insights for their big data analytics workloads at cloud scale. Integration to analytics engines is critical for their analytics workloads, and equally important is the ability to programmatically ingest, manage, and analyze data. This ability is critical for key areas of enterprise data lakes such as data ingestion, event-driven big data platforms, machine learning (ML), and advanced analytics. Programmatic access is possible today using ADLS Gen2 REST APIs, Blob REST APIs, or capabilities via Multi-Protocol Access. As part of our developer ecosystem journey, our goal is to make customer application development for programmatic access easier than ever before.

Towards this goal, we're announcing the general availability of the Python, .NET, Java, and JS filesystem SDKs for Azure Data Lake Storage (ADLS) Gen2 in all Azure regions. This includes support for CRUD operations on filesystems, directories, files, and permissions, with filesystem semantics for ADLS Gen2. Customers can now use this familiar filesystem programming model to simplify application development for ADLS Gen2. These filesystem SDKs streamline our customers’ ability to ingest, manage, and analyze data in ADLS Gen2 and help them gain insights at cloud scale faster than ever before.
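As a rough sketch of what this looks like with the Python SDK (`azure-storage-file-datalake`), the snippet below creates a directory and a file, appends data in chunks, and sets POSIX-style permissions. The account, filesystem, and path names are placeholders, not real resources; only the pure `chunked` helper runs without an Azure account.

```python
# Sketch of filesystem-semantics CRUD with the ADLS Gen2 Python SDK
# (pip install azure-storage-file-datalake). Account name, key, filesystem,
# and paths below are placeholders for illustration.

def chunked(data, chunk_size):
    """Split data into (offset, bytes) pieces for successive append_data calls."""
    return [(off, data[off:off + chunk_size]) for off in range(0, len(data), chunk_size)]

def write_file(account_name, account_key, data):
    from azure.storage.filedatalake import DataLakeServiceClient

    service = DataLakeServiceClient(
        account_url=f"https://{account_name}.dfs.core.windows.net",
        credential=account_key,
    )
    fs = service.get_file_system_client("telemetry")    # filesystem (container)
    directory = fs.create_directory("raw/2020/04")      # directory CRUD
    file_client = directory.create_file("readings.csv") # file CRUD

    # Append in chunks at increasing offsets, then flush once at the
    # final length to commit the data.
    for offset, chunk in chunked(data, 4 * 1024 * 1024):
        file_client.append_data(chunk, offset=offset, length=len(chunk))
    file_client.flush_data(len(data))

    # POSIX-style permissions on the directory.
    directory.set_access_control(permissions="rwxr-x---")

print(chunked(b"0123456789", 4))  # [(0, b'0123'), (4, b'4567'), (8, b'89')]
```

The append/flush split mirrors the underlying ADLS Gen2 REST semantics: appended data is not visible until it is flushed at the cumulative length.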

Preview feedback

Many of our customers have tried out the ADLS Gen2 SDK preview builds for their scenarios successfully. Here are some common themes based on preview feedback:

The SDK is working seamlessly with the new filesystem semantics and has enabled moving key data domains to ADLS Gen2, expediting the transfer of 450 GB of data from ADLS Gen1 to ADLS Gen2 within a few hours. Permissions set up at the root-level directory are working well with hierarchical namespace enabled, and all permissions are propagating correctly to child items through the folder hierarchy.
The SDK is critical to the way customers orchestrate their deployments.
The SDK has helped ingest large amounts of IoT data to be used by data scientists for their analytics workloads. This has been instrumental in providing self-service environments for the researchers with access to their own set of directories.
Data ingestion pipelines have used the SDK to integrate drone image data, satellite image data, ground sensor data, and weather data into ADLS Gen2. This helps build custom ML models which generate additional business insights for customers. Customers can use these ML models or aggregate raw data based on their needs and store processed results back into ADLS Gen2.
Customers appreciate that the SDK preview feedback has been addressed as part of the preview builds and are eagerly awaiting general availability.
Customers have successfully executed various tests including creating and appending files using the ADLS Gen2 SDK and testing reads using the Blob REST API. 

Based on your preview feedback, we have also introduced new APIs for bulk upload that simplify the experience for larger data writes and appends to ADLS Gen2. Detailed documentation is available in the links below:

.NET SDK
Python SDK
Java SDK
JS SDK
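In the Python SDK, the bulk-upload path can be sketched as below: a single `upload_data` call replaces the explicit create/append/flush sequence for large writes. As before, the account, filesystem, and path names are placeholders, and the function is only an illustration of the API shape.

```python
# Sketch of the bulk-upload API in the ADLS Gen2 Python SDK
# (pip install azure-storage-file-datalake). Names are placeholders.

def upload_whole_file(account_name, account_key, data):
    from azure.storage.filedatalake import DataLakeServiceClient

    service = DataLakeServiceClient(
        account_url=f"https://{account_name}.dfs.core.windows.net",
        credential=account_key,
    )
    file_client = service.get_file_system_client("telemetry").get_file_client(
        "raw/2020/04/readings.csv"
    )
    # One call replaces the create/append/flush sequence for large writes;
    # the SDK handles chunking internally.
    file_client.upload_data(data, overwrite=True)
```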

PowerShell and CLI will continue to be available in preview globally in all Azure regions. We will announce general availability for PowerShell and CLI once we have addressed preview feedback.

PowerShell
Azure CLI

Next steps 

We welcome your feedback to continue to enrich the ADLS Gen2 developer experience, and we thank everyone for their collaboration towards achieving this high-value release. We look forward to these strong partnerships in future investments for our developer ecosystem journey.
Quelle: Azure