Why Azure for Government is your best choice – the most comprehensive CJIS compliance

Government clouds are not all the same, so we’re starting a dialogue about our strategy for building the Microsoft Government Cloud, exploring several areas we believe are highly valuable differentiators. Details matter when government, defense and intelligence agencies decide which cloud platform will deliver their mission critical services to citizens and constituents.

Today I’m going to talk about how we support compliance with Criminal Justice Information Services, or CJIS. It’s part of our industry leading compliance portfolio and represents our commitment to doing the hard work to make compliance simpler for our customers. CJIS is the most critical compliance requirement for state and local governments adopting the cloud as it ensures police and public safety personnel use information technology securely and with the right privacy controls.

CJIS has been built into Azure from the beginning

Nearly four years ago, we made a very deliberate decision to comply with the applicable controls in the CJIS Security Policy, starting work on an approach that would ultimately satisfy the US Justice Department and CJIS Systems Agencies in all 50 states. We quickly learned there was no silver bullet: no simple step to achieving compliance and no single federal agreement covering all states. Our approach is to attest to each state CJIS Systems Agency that Microsoft meets the applicable requirements of the Policy. The Policy has 13 security areas, of which four are particularly critical to agency requirements and Microsoft’s approach:

State approval of applicable Microsoft employees through fingerprint and background screening. Yes – we are really giving the fingerprints of our operational staff to each state!
CJIS Security Awareness Training completed by each employee with potential access to data within 30 days of assignment. This exceeds by five months the CJIS Security Policy requirement of six months.
A signed addendum from Microsoft as a corporation and from each applicable employee. Every employee individually commits to meeting CJIS standards as a condition of employment.
State review and acceptance of Microsoft’s attestation to CJIS controls through review of security reports and physical data center inspection.

No other provider offers the transparency and insight that Microsoft does with regard to CJIS.

With 23 states (and growing) – Microsoft is the CJIS cloud leader

In December 2012, Microsoft became the first hyperscale cloud provider to contractually attest to the applicable CJIS controls with a signed CJIS management agreement and CJIS Security Addendum – three years before any other major cloud provider. Since 2012, we have worked state by state to sign agreements with 23 states, and our goal of reaching all 50 remains very much in sight.

Why CJIS matters – smarter, faster and better mission focus

Police body worn video cameras are one of the biggest emerging technology disruptions in justice and public safety today, and their applicability under CJIS continues to evolve in every state. We’re prepared to support each state’s decision with our comprehensive CJIS strategy.

A compelling example of how law enforcement and the cloud come together is innovation by the Miami-Dade Police Department. Miami-Dade police chose to partner with Microsoft and VIEVU to build and deploy secure solutions based on Microsoft Azure Government. Already, the solutions have helped the Miami-Dade Police Department transform its operations, better engage the public, and increase transparency with citizens. As said by Juan J. Perez, director of the Miami-Dade Police Department:

“We wanted to be at the forefront of technology to be able to capture evidence that couldn’t be gleaned another way.  But we also knew we needed to approach the implementation of body-worn cameras thoughtfully to ensure the most positive impact for everyone involved.”

Read more about Miami-Dade’s story on our Transform Blog.

Miami-Dade and VIEVU are not alone – hundreds of customers and partners, including the State of California and Taser, have made the same choice.

What this means for you

We are committed to building a Government Cloud you can trust, and that means meeting or exceeding the standards required to move mission critical government and defense workloads to the cloud. Public safety services provided by US state and local governments are among these workloads, and we are proud to offer the deepest and most comprehensive CJIS compliance in the market today.

Unlike other providers in the market, we don’t offer partial coverage, we don’t make unsubstantiated claims, and we don’t require our Government customers to use third party software. We believe compliance should be simple and transparent, allowing government agencies to focus on meeting their missions.

I invite you to sign up for a Free Azure Government Trial, to experience the security and trust of the Microsoft Government Cloud.

–Tom
Source: Azure

Service Fabric on Linux support available this month

Over the past few years, it’s become increasingly clear that businesses are relying on cloud applications to fuel innovation and gain competitive advantage. Much of this pressure is falling on the shoulders of developers, who need to be able to rapidly create new, game-changing applications that have the potential to disrupt and transform entire industries. These same developers need to be able to release and update their applications more quickly so they can respond to customer feedback and have faster time to market than their business’s competitors.

A number of trends are emerging to make these possibilities a reality for developers and businesses alike. One of these is the microservice architecture, which lets developers create applications from multiple single-purpose, independently versioned services – a scalable way to build cloud-native applications that enables rapid innovation. Service Fabric is Microsoft’s microservices application platform, released last year to help developers build and manage cloud-scale applications. Battle-hardened internally at Microsoft for almost a decade, Service Fabric has been powering highly scalable services like Cortana, Intune, Azure SQL Database, Azure DocumentDB, and Azure’s infrastructure. We’ve seen tremendous response from our customers and great momentum since our GA at Build 2016, including BMW, CareOtter, Ilyriad, Bentley Systems and Assurant.

Given its beginnings, Service Fabric supports Windows servers and .NET applications, but many enterprises today run heterogeneous workloads, including Windows and Linux servers, .NET and Java applications, and SQL and NoSQL databases. That’s why I am excited to announce today that the preview of Service Fabric for Linux will be publicly available at our Ignite conference on September 26. With today’s announcement, customers can now provision Service Fabric clusters in Azure using Linux as the host operating system and deploy Java applications to those clusters. Service Fabric on Linux will initially be available for Ubuntu, with support for RHEL coming soon.

In addition, with CLI (Command-Line Interface), Eclipse and Jenkins support, developers can use the tools they know to build and deploy on Service Fabric on Linux. Just as on Windows, developers can build and test their Service Fabric applications on Linux on a one-box setup, meaning you don’t need a cluster in Azure to build and test your Service Fabric app. Our vision is to enable developers to build Service Fabric applications on the OS of their choice and run them wherever they want. In the near future, we will release a Linux standalone installer to enable Service Fabric to be used outside of Azure for on-premises, hybrid and multi-cloud deployments. We also plan on open sourcing parts of the platform, beginning with Service Fabric’s programming models. This will allow developers to enhance the standard programming models and use them as starting points to create their own programming models and to support other languages.
 
We’re excited that with our ongoing enhancements of Service Fabric’s capabilities and reach, more businesses will be able to take advantage of our innovations to power their own applications. To learn more about how to get started with Service Fabric on Linux, check out our episode on Channel 9.
Source: Azure

Announcing the General Availability of Storage Service Encryption for Data at Rest

Storage Service Encryption for Azure Blob Storage helps you address organizational security and compliance requirements by encrypting your Blob storage (Block Blobs, Page Blobs and Append Blobs).

Today, we are excited to announce the General Availability of Storage Service Encryption for Azure Blob Storage. You can enable this feature on any Azure Resource Manager storage account using the Azure Portal, Azure PowerShell, Azure CLI, or the Microsoft Azure Storage Resource Provider API.

Microsoft Azure Storage handles all the encryption, decryption and key management in a totally transparent fashion. All data is encrypted using 256-bit AES encryption, also known as AES-256, one of the strongest block ciphers available. Customers can enable this feature on all available redundancy types of Azure Storage – LRS, GRS, ZRS, RA-GRS and Premium-LRS for all Azure Resource Manager Storage accounts and Blob Storage accounts. There is no additional charge for enabling this feature. 

Note that SSE encrypts when blobs are written or updated. This means that when you enable SSE for an existing storage account, only new writes are encrypted; it does not go back and encrypt the data already present.
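The write-time semantics above can be made concrete with a small Python sketch. This is an illustrative model, not the Azure Storage SDK – the class and method names are invented for the example:

```python
# Illustrative model of Storage Service Encryption semantics (not the Azure
# SDK): only writes made while SSE is enabled produce encrypted blobs.

class StorageAccount:
    def __init__(self):
        self.sse_enabled = False
        self.blobs = {}  # blob name -> encrypted? (True/False)

    def enable_sse(self):
        # Enabling SSE affects future writes only; existing blobs are untouched.
        self.sse_enabled = True

    def write_blob(self, name):
        # A write (create or update) is encrypted iff SSE is on at write time.
        self.blobs[name] = self.sse_enabled

    def is_encrypted(self, name):
        return self.blobs[name]

account = StorageAccount()
account.write_blob("old.vhd")      # written before SSE is enabled
account.enable_sse()
account.write_blob("new.vhd")      # written after SSE is enabled
account.write_blob("old.vhd")      # rewriting an old blob encrypts it

print(account.is_encrypted("old.vhd"))  # True, because it was rewritten
print(account.is_encrypted("new.vhd"))  # True
```

Rewriting a pre-existing blob – for example, by copying it onto itself – is therefore one way to bring old data under encryption.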

Find out more about Storage Service Encryption with Service Managed Keys.
Source: Azure

StorSimple Virtual Array: Critical update, software version 10.0.10288.0 (Update 0.3) now available

A new critical update for StorSimple Virtual Array, version 10.0.10288.0 (Update 0.3) is now available for download.

Release notes and issues addressed in this release: StorSimple Virtual Array Update 0.3 release notes.
Update install guide: Install Updates on your StorSimple Virtual Array.

Update 0.3 contains the following bug fixes and improvements:

Backups: Backups would fail to complete under certain conditions. The root cause of this issue is fixed in this release. The fix does not apply retroactively to shares that are already seeing this issue; customers who are seeing it should first apply Update 0.3, then contact Microsoft Support to perform a full system backup.
iSCSI: An issue observed in earlier releases, in which the iSCSI session was disconnected when copying large amounts of data to a volume on the StorSimple Virtual Array, is fixed in this release.
New virtual disk images: New VHD, VHDX, and VMDK images are now available via the Azure classic portal. You can download these images to provision new Update 0.3 devices.

We strongly recommend that you apply this update. The update is a disruptive update (requires reboot of the SVA) and should be applied during a planned downtime. Should you encounter any issues, please contact Microsoft Support.
Source: Azure

Announcing Face Redaction for Azure Media Analytics

Azure Media Redactor is part of Azure Media Analytics and offers scalable redaction in the cloud. This Media Processor (MP) performs anonymization by blurring the faces of selected individuals, and is ideal for use in public safety and news media scenarios.

The use of body worn cameras in policing and public spaces is becoming increasingly commonplace, which places a larger burden on these departments when videos are requested for disclosure through Freedom of Information or Public Records acts. Responding to these requests takes time and money, as the faces of minors or bystanders must be blurred out. Manually redacting just a few minutes of footage from a video with multiple faces can take hours. This service can reduce that labor intensive task to a few simple touchups.

Azure Media Analytics

Azure Media Analytics is a collection of speech and vision services offered with enterprise scale, compliance, security, and global reach. For other Media Analytics processors offered by Azure, see Milan Gadas’ blog post Introducing Azure Media Analytics. You can access these features in our new Azure portal, through our APIs with the presets below, or using the free Azure Media Services Explorer tool. Redaction will be a free public preview for a limited time, available in all public datacenters starting around mid September. China and US Gov datacenters will be included in the GA release.

Face Redaction

Facial redaction works by detecting faces in every frame of video and tracking the face object both forwards and backwards in time, so that the same individual can be blurred from other angles as well. Redaction is still a difficult problem for computers to solve, and accuracy is not at the level of a real person; expect false positives and false negatives, especially with difficult video such as low light or high movement scenes. Since automated redaction may not be 100% accurate, we provide a couple of ways to modify the final output.
In addition to a fully automatic mode, there is a two pass workflow which allows the selection/de-selection of found faces via a list of IDs, and arbitrary per frame adjustments using a metadata file in JSON format. This workflow is split into ‘Analyze’ and ‘Redact’ modes, as well as a single pass ‘Combined’ mode that runs both in one job.

Combined mode

This mode produces a redacted MP4 automatically without any manual input.

Media Processor name: “Azure Media Redactor”

Stage | File name | Notes
Input asset | foo.bar | Video in WMV, MOV, or MP4 format
Input config | Job configuration preset | {'version':'1.0', 'options': {'Mode':'combined'}}
Output asset | foo_redacted.mp4 | Video with blurring applied

Analyze mode

The ‘analyze’ pass of the two pass workflow takes a video input and produces a JSON file of face locations plus JPG images of each detected face.

Stage | File name | Notes
Input asset | foo.bar | Video in WMV, MOV, or MP4 format
Input config | Job configuration preset | {'version':'1.0', 'options': {'Mode':'analyze'}}
Output asset | foo_annotations.json | Annotation data of face locations in JSON format. This can be edited by the user to modify the blurring bounding boxes. See the sample below.
Output asset | foo_thumb%06d.jpg (e.g. foo_thumb000001.jpg, foo_thumb000002.jpg) | A cropped JPG of each detected face, where the number indicates the labelId of the face

Output example (truncated):

{
  "version": 1,
  "timescale": 50,
  "offset": 0,
  "framerate": 25.0,
  "width": 1280,
  "height": 720,
  "fragments": [
    {
      "start": 0,
      "duration": 2,
      "interval": 2,
      "events": [
        [
          {
            "id": 1,
            "x": 0.306415737,
            "y": 0.03199235,
            "width": 0.15357475,
            "height": 0.322126418
          },
          {
            "id": 2,
            "x": 0.5625317,
            "y": 0.0868245438,
            "width": 0.149155334,
            "height": 0.355517566
          }
        ]
      ]
    },
    ... (truncated)
  ]
}

Redact mode

The second pass of the workflow takes a larger number of inputs that must be combined into a single asset: the original video, the annotations JSON, and an optional list of IDs to blur. This mode uses the annotations to apply blurring to the input video.

Stage | File name | Notes
Input asset | foo.bar | Video in WMV, MOV, or MP4 format. Same video as in step 1.
Input asset | foo_annotations.json | Annotations metadata file from phase one, with optional modifications
Input asset | foo_IDList.txt (optional) | Newline-separated list of face IDs to redact. If left blank, all faces are blurred.
Input config | Job configuration preset | {'version':'1.0', 'options': {'Mode':'redact'}}
Output asset | foo_redacted.mp4 | Video with blurring applied based on annotations

The example output is from an IDList with one ID selected.

Understanding the annotations

The Redaction MP provides high precision face location detection and tracking that can detect up to 64 human faces in a video frame. Frontal faces provide the best results, while side faces and small faces (less than or equal to 24×24 pixels) are challenging.
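To illustrate how the annotation JSON can be consumed between the two passes, here is a short Python sketch (standard library only) that converts the normalized face bounding boxes into pixel coordinates; the embedded JSON is an abbreviated version of the sample output above:

```python
import json

# Post-process Analyze-mode annotations: convert normalized bounding boxes
# to pixel coordinates, e.g. before hand-editing which faces to blur.
annotations = json.loads("""
{
  "version": 1, "timescale": 50, "offset": 0, "framerate": 25.0,
  "width": 1280, "height": 720,
  "fragments": [
    { "start": 0, "duration": 2, "interval": 2,
      "events": [[
        {"id": 1, "x": 0.306415737, "y": 0.03199235,
         "width": 0.15357475, "height": 0.322126418}
      ]]
    }
  ]
}
""")

def faces_in_pixels(doc):
    w, h = doc["width"], doc["height"]
    boxes = []
    for frag in doc["fragments"]:
        for event in frag["events"]:  # one event list per interval
            for face in event:
                boxes.append({
                    "id": face["id"],
                    "x": round(face["x"] * w),
                    "y": round(face["y"] * h),
                    "width": round(face["width"] * w),
                    "height": round(face["height"] * h),
                })
    return boxes

print(faces_in_pixels(annotations))
# [{'id': 1, 'x': 392, 'y': 23, 'width': 197, 'height': 232}]
```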
The detected and tracked faces are returned with coordinates indicating the location of each face, as well as a face ID number indicating the tracking of that individual. Face ID numbers are prone to reset when the frontal face is lost or overlapped in the frame, which can result in some individuals being assigned multiple IDs. For detailed explanations of each attribute, visit the Face Detector blog.

Getting started

To use this service, simply create a Media Services account within your Azure subscription and use our REST API/SDKs or the Azure Media Services Explorer (v3.44.0.0 or higher). For sample code, check out our documentation page, replacing the presets with the ones above and using the Media Processor name “Azure Media Redactor”.

Contact us

Keep up with the Azure Media Services blog to hear more updates on the Face Detection Media Processor and the Media Analytics initiative! Send your feedback and feature requests to our UserVoice page. If you have any questions about any of the Media Analytics products, send an email to amsanalytics@microsoft.com.
Source: Azure

Microsoft Azure at IBC 2016

Media Services is excited to be back at IBC and to be a part of the IBC Hackfest.

It has been a busy year for the Azure Media Services team since the last IBC. We launched several products including Azure Media Analytics, General Availability of Azure CDN from Akamai Standard and Microsoft Stream.

We also just wrapped up streaming the Rio Olympics to record audiences worldwide. One of the key highlights: viewers live streamed 2.71 billion minutes of Olympic coverage – a new record for live event coverage. In addition, we had zero downtime. The streaming of the Olympics demonstrates the strength of our platform in handling large media workflows reliably.

“Rio 2016 set numerous consumption records, including over 3 billion minutes streamed. More than one-third of those who streamed the Games did so from connected TV devices. Our partnership with Microsoft Azure helped us extend our reach to more people and more devices via cloud streaming than ever before.” – Eric Black, Digital CTO, NBC Sports Group

In addition to building trusted partnerships, we often hear from our customers that the reason they pick Azure is twofold: first, the scalable end-to-end capabilities we offer to build media workflows in the cloud, and second, our commitment to continuous innovation, which brings cutting edge video workflows to the Azure cloud.

In keeping with this commitment, at IBC 2016, we are thrilled about the following key announcements:

Multi-DRM: With the recent general availability of the FairPlay Streaming DRM service, Azure Media Services now provides license delivery for all major DRMs: Microsoft PlayReady, Google Widevine and Apple FairPlay. This helps media customers deliver DRM-protected premium content to a wide audience on a variety of devices, easily and quickly. We also offer dynamic encryption on our streaming server to encrypt media streams on the fly for both video-on-demand (VOD) and live streaming. With these leading content protection services, we provide a single point of control for you to quickly build a multi-DRM solution with high scalability and reliability.
Media Analytics: We launched Azure Media Analytics, a collection of speech and vision services including Indexer, Hyperlapse, Motion detection, Face and emotions detection, Video summarization, Video Optical Character Recognition and Content moderation. We are adding Azure Media Redactor, which performs anonymization by blurring the faces of selected individuals and is ideal for use in public safety and news media scenarios. Application areas include law enforcement and public safety agencies, which can now efficiently process large volumes of video footage from sources including surveillance cameras and body cams.
HEVC: We are conducting trials of HEVC encoding and delivery in Azure Media Services with our customers. Built on the same platform as the Premium Encoder, the new encoder provides enhancements including HEVC encoding at UHD/4K resolutions (up to 4096×2160 pixels at 60 frames/second) and support for high dynamic range (HDR) content.
GA of Azure Media Services in the new Azure Portal: We are pleased to announce the general availability of Azure Media Services in the new Azure portal. Besides all the features of the old portal, the new portal adds several more: new Media Analytics processors, asset details, closed caption configuration, FairPlay DRM support, and an enhanced live streaming workflow. You can still access the old portal and manage your existing account by going directly to manage.windowsazure.com; however, you won’t be able to create a new Azure Media Services account there, and we will sunset old portal access in the near future. This is definitely a good time to switch over to our brand new experience!
Azure Cool Storage: We now support Azure Cool Storage in media workflows. Customers can reduce cost by uploading mezzanines to Cool Storage, then transcoding and keeping prepared assets on Hot Storage.
IBC Hackfest: We are also very excited to be part of the IBC Hackfest. Microsoft Azure is a trusted platform for innovation for developers. We look forward to the creative ideas the teams come up with that leverage the Azure platform, and Azure Media Analytics in particular.

We’re continually moving the needle when it comes to scalable, secure, and cost-effective media workflows in the cloud, and invite you to take advantage by building on Microsoft Azure.

We are located in Hall 15 MS.1. Drop by to see a demonstration of several of our services including our cutting edge Media Analytics capabilities.
Source: Azure

Encoding and delivery of UHD content

As recent studies have demonstrated, high efficiency codecs like HEVC/H.265 now have mature software implementations that deliver on the promise of 50% bitrate savings over AVC/H.264, at the same quality. Such efficiency improvements are critical to enable delivery of video at ultra high definition (UHD) and beyond. Today I am pleased to announce that we are conducting trials of HEVC encoding and delivery in Azure Media Services with our customers. Built on the same platform as the Premium Encoder, the new encoder provides enhancements including the following:

HEVC encoding at UHD/4K resolutions, including 4096×2160 pixels at 60 frames/second
Support for high dynamic range (HDR) content via HDR10 Media Profile with bit depth of 10-bits per color sample, BT.2020 color primaries, encoded at HEVC Main 10 Profile
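To put the 50% bitrate saving mentioned above in perspective, a back-of-envelope calculation shows what it means for delivery. The AVC bitrate below is an assumed figure for 4K/60 content, not an Azure Media Services recommendation:

```python
# Back-of-envelope illustration of the ~50% bitrate saving HEVC offers
# over AVC at the same quality.

avc_kbps = 40000             # assumed AVC/H.264 bitrate for UHD 4K at 60 fps
hevc_kbps = avc_kbps * 0.5   # ~50% saving at equivalent quality

def gigabytes_per_hour(kbps):
    bits = kbps * 1000 * 3600   # bits delivered in one hour
    return bits / 8 / 1e9       # convert to gigabytes

print(round(gigabytes_per_hour(avc_kbps), 1))   # 18.0 GB/hour with AVC
print(round(gigabytes_per_hour(hevc_kbps), 1))  # 9.0 GB/hour with HEVC
```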

Your encoding process will be very similar to the one using the Premium Encoder. Your mezzanine files can be encoded to HEVC video with Dolby Digital Plus or AAC audio, and stored as ISO MP4 files. These can subsequently be dynamically packaged into MPEG DASH and streamed to devices such as 4K TVs. TSUTAYA, a video rental and online VoD service company based in Japan, is currently testing such a workflow, validating that the compression tools meet their requirements. You can check the output of the encoder yourself – here is a link to a test video that you can view in the Edge browser on a Windows 10 device with hardware HEVC decode support.

Are you interested in participating in these trials? Please contact us at ams-uhdsup@microsoft.com
Source: Azure

Improving the D/L Method using Machine Learning

Technology is continuing to play an integral part in sports. In cricket too, there are many areas where technology can be used. Machine learning will play an important role in Sports Analytics.

We believe that we can use Machine Learning to analyze historical cricket games, and use this to continuously improve the Duckworth Lewis (D/L) Method of computing target scores in rain-shortened matches.

The current D/L method is a statistical method invented by statisticians Frank Duckworth and Tony Lewis. It is designed to calculate the target score (or PAR score) that the team batting second in a rain-interrupted match needs to achieve. Today, two D/L models/editions are available to the cricket community: the Standard Edition and the Professional Edition. The Standard Edition is a chart-based model used for non-ICC and local matches. The Professional Edition is a software-based black box model used by the ICC for all official matches.

After Duckworth and Lewis retired, Professor Steven Stern (from the Queensland University of Technology) became the custodian for the method, and the method was renamed as the Duckworth-Lewis-Stern method (or D/L/S method). In many pieces of existing literature, it continues to be referred to as the D/L method.

Improving the D/L Method

The D/L table is static and does not take into consideration the latest game statistics (e.g., which teams are playing better this season, rankings of players, etc.).
We believe we can use historical Twenty20 data to derive an always up-to-date D/L table that takes these latest statistics into account. This can be operationalized using Azure Machine Learning and run on a frequent basis to always produce an updated D/L table.

To achieve this, we analyze T20 data from http://cricsheet.org/, which provides ball-by-ball data for international and IPL cricket matches. The historical data covers about 620 matches, amounting to 153K rows of ball-by-ball data.

Using a Jupyter notebook, we show how data exploration can derive a better D/L table by applying quadratic curve fitting with constraints to the T20 data. This Jupyter notebook is now available to the data science and cricket communities so they can take this foundational information and work together to improve the state of cricket analytics.
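As a rough illustration of the curve-fitting step, the sketch below fits a quadratic through the origin to a handful of made-up (overs remaining, expected runs) points using ordinary least squares in pure Python; the real notebook works from the full ball-by-ball data:

```python
# Simplified sketch of a data-driven D/L-style fit: expected remaining runs
# as a quadratic in overs remaining, constrained through the origin
# (0 overs -> 0 runs). Sample points are illustrative, not from cricsheet.org.

def fit_quadratic_through_origin(overs, runs):
    # Least squares for runs ~ a*u^2 + b*u via the 2x2 normal equations.
    s4 = sum(u**4 for u in overs)
    s3 = sum(u**3 for u in overs)
    s2 = sum(u**2 for u in overs)
    t2 = sum(r * u**2 for u, r in zip(overs, runs))
    t1 = sum(r * u for u, r in zip(overs, runs))
    det = s4 * s2 - s3 * s3
    a = (t2 * s2 - s3 * t1) / det
    b = (s4 * t1 - s3 * t2) / det
    return a, b

overs = [5, 10, 15, 20]
runs = [45, 85, 120, 160]   # illustrative average T20 scores
a, b = fit_quadratic_through_origin(overs, runs)

def expected_runs(u):
    return a * u**2 + b * u

# A PAR-style ratio: fraction of full-innings scoring resources left
# with 10 of 20 overs remaining.
print(round(expected_runs(10) / expected_runs(20), 2))  # 0.53
```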
Source: Azure

Enterprise-level policy management for Azure VM backup in Recovery Services vault

Azure Backup announced general availability of the Recovery Services vault, based on the Azure Resource Manager model, in May 2016. With a Recovery Services vault, customers can back up Classic as well as Resource Manager virtual machines using custom defined backup policies. Today, we are adding the capability for our customers to manage backup policies and model them to meet their changing requirements from a single window, making Azure Backup an enterprise-level backup solution for Azure virtual machines.

Features

With this release, Azure Backup provides:

Ability to view all backup policies in a Recovery Services vault from a single window
Ability to add a new policy from policy list view
Ability to edit a backup policy to match modified backup schedule and retention requirements – once a backup policy is updated, changes are pushed automatically to all virtual machines configured with the policy
Ability to add items to a backup policy – add more virtual machines to an existing backup policy in a single click
Get a view of virtual machines protected with a policy
Delete a backup policy which is no longer in use
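The central idea in the list above – one policy object shared by many VMs, so a single edit propagates everywhere – can be sketched in a few lines of Python. The class below is purely illustrative and is not the Azure Backup API:

```python
# Illustrative model of shared backup policies: editing a policy in one
# place propagates to every VM configured with it.

class BackupPolicy:
    def __init__(self, name, retention_days):
        self.name = name
        self.retention_days = retention_days
        self.vms = []

    def add_item(self, vm_name):
        # "Add items to a backup policy" from the feature list above.
        self.vms.append(vm_name)

    def update_retention(self, days):
        # One edit; every associated VM now backs up with the new retention.
        self.retention_days = days

    def effective_retention(self):
        return {vm: self.retention_days for vm in self.vms}

policy = BackupPolicy("daily-policy", retention_days=30)
policy.add_item("vm-web-01")
policy.add_item("vm-sql-01")
policy.update_retention(90)
print(policy.effective_retention())
# {'vm-web-01': 90, 'vm-sql-01': 90}
```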

Getting Started

To get started with policy management,

1. Open a Recovery Services vault. Create a Recovery Services vault if one doesn’t exist.

2. From Settings Menu, click on Backup Policies to bring up policy management blade.

3. To edit policy or add more virtual machines to the policy, select a policy to bring up the detailed policy blade.

Related links and additional content

Learn more about Azure Backup.
Want more details? Check out Azure Backup documentation.
Need help? Reach out to Azure Backup forum for support.
Tell us how we can improve Azure Backup by contributing new ideas and voting up existing ones.
Follow us on Twitter @AzureBackup for the latest news and updates.

Source: Azure

Microsoft Ignite: All things Azure Stack

In less than 30 days, Atlanta, Georgia will be the center of gravity for Microsoft’s enterprise solutions and technologies when our Ignite conference kicks off on September 26. The agenda is fully stacked with sessions, demos, hands-on experiences, panels and more and Azure Stack will be fully represented in that, um, stack.

As we look forward to Day 1, the Azure Stack team will engage with you over the next weeks and days via a series of blogs that will provide both an overview and a navigation guide into all things Azure Stack at Ignite. More importantly, we anticipate and welcome your feedback on anything and everything Azure Stack as you plan for Ignite – your expectations from the sessions we have planned, which sessions you are most excited about, your technical preview experiences thus far, questions or concerns you have, your favorite ice cream flavor to pair with Azure Stack – anything, everything Azure Stack.

My name is Wale Martins and I work as a program manager on the Azure Stack team – specifically in the infrastructure area, ensuring every software component that stands up Azure Stack in your datacenter is justified and makes optimal use of resources, leaving more capacity for your workloads. My focus in this initial blog, though, is to tell you what we have planned for you at Ignite – to give you the lay of the land (see figure below), from which subsequent blogs by session owners will take the baton and provide more details to whet your appetites. Let’s get started.

Referencing the figure below, the first thing I want to call out is that none of the sessions we’ve planned are overlapping so you can attend each one without scheduling conflicts.

Azure Stack overview sessions

We have planned four overview sessions that cover different aspects of Azure Stack presented at a 200 and above level and targeted to equip attendees with a 360-degree view on what Azure Stack is, where we are with Azure Stack today and what to expect on the rest of the journey to the General Availability release next year. Whatever your role – IT Pro, developer, business or technical decision maker – you will find these sessions useful and informative. I have listed them below with links to their sign up pages.

Natalia Mackevicius and Vijay Tewari will “provide you with an overview of everything you need to know about Azure Stack, from the value proposition to the business model to the technical capabilities” at Explore Azure Stack “State of the Union” – Foundation 1.
Join Charles Joy and Bradley Bartz to Learn about Azure Stack Agile Service Delivery, which uses exactly the same delivery model as Azure.
At the Learn about Azure Stack Infrastructure Operations and Management session, Vijay Tewari will comprehensively cover the infrastructure components and the scenarios they enable, including hardware that Azure Stack will run on.
And Spencer Shepler will round out our overview sessions with an in-depth Dive into Microsoft Azure Stack Architecture. This is a 400 level deeply technical session targeted at IT influencers and implementers.

Azure Stack Technical Sessions

Our next set of sessions are technical sessions that take a 300+ level deep dive into most of the topics and areas covered in the overviews.

Mallikarjun Chadalapaka and Scott Napolitan will dive into the foundational resource providers for core IaaS in their Dive deep in the Microsoft Azure Stack IaaS session.
Charlie Satterfield and Thomas Roettinger will show you the Azure Stack admin experience we are developing and how you can start your journey to Becoming a Microsoft Azure Stack infrastructure rockstar.
In Learn about the community of templates for Azure Stack, Ricardo Mendes and Marc van Eijk cover the authoring, versatility and universality of Azure Stack templates including across Azure.
In a very complicated act requiring deft onstage collaboration and communication, our trio of Matthew McGlynn, Anjay Ajodha and Shriram Natarajan will Discuss Microsoft DevOps on Azure Stack.
And Igor Sedukhin rounds out this set with a session for you to Learn about hybrid applications with Azure and Azure Stack and how to build these applications.

What’s up with CPS?

So what about CPS – what’s the latest with Microsoft’s Cloud Platform System? If you have invested, are investing or will invest in CPS, we have a session to show you how we carry this first member of our on-premises cloud solutions into the Azure Stack era. Join Cheng Wei and John Haskin to Explore Microsoft Cloud Platform System – delivering Azure experiences in an integrated system. This session will also discuss the WAP/CPS Connector for Azure Stack we are developing.

Those are our planned sessions at a summary level. I have added another figure of the sessions pivoted by date, and you will observe again that we have ensured there are no overlapping sessions, so it’s an all-you-can-eat offering. Expect more details in subsequent blogs. There are also other activities besides sessions, and we will talk about those too. Stay plugged in. Did I mention that we really want to hear from you? Give us your feedback, tell us your expectations and ask us questions. You may do so in one or more of the ways below:

In the comments section below
Collaborate and chat with the Azure Stack engineering team to shape Azure Stack. Sign up to participate.
Reach out on Twitter using the hashtag

Whichever you choose, the Azure Stack team is listening.

Source: Azure