Build your data science skills, and your network, in Atlanta

Over the last decade, data science has gone from a murmur to a deafening roar.

The demand for skilled data scientists is showing up everywhere, in every industry. And whatever their titles, the community of people who can do analysis, machine learning, big data, or visualization has never been more universally valued.

It’s a good time to be a data scientist.

As a senior recruiter in AI and machine learning at Microsoft, I’ve experienced this rapid evolution from the front row. We now have approximately 5,000 members in the company’s internal Machine Learning and Data Science community, sharing insights and technical expertise across nearly every team and discipline. They’re woven into our very DNA.

A lot of these folks will be in Atlanta this September 26–27 at the Microsoft Data Science Summit—together with leading thinkers, researchers, and experts from across data science and machine learning. They’ll be sharing strategies, tools, tips, and breakthroughs with the data science community.

Build your skills, see what your peers are up to, and get hands-on with the latest tech.

Data scientists of all stripes are integral for driving business forward, understanding customers, and helping organizations innovate. And right now, new tools are emerging quickly to help you move even faster. In Atlanta, you’ll get an insider’s look at the new ways that Microsoft and other businesses are applying these technologies, and get hands-on training in using them yourself. Questions about Cortana Intelligence Suite, SQL Server, or Microsoft R? This is the place to deepen your expertise. Set up a 1:1 meeting with the people who build and run these technologies to get answers that fit your particular scenario.

Data science is exploding in so many directions that there’s a breathtaking array of demos and real-world examples awaiting you as well. Come see what others are doing. Share ideas, puzzles, and successes with your peers and Microsoft presenters. And discover new ways to help your organization and expand your career.

But register soon! The Microsoft Data Science Summit is coming up fast.

Register now
Source: Azure

Networking for a hybrid application environment

Supporting a network in transition: Q&A blog post series with David Lef

In a series of blog posts, this being the fourth and last, David Lef, Principal Network Architect at Microsoft IT, chats with us about supporting a network as it transitions from a traditional infrastructure to a fully wireless platform. Microsoft IT is responsible for supporting 900 locations and 220,000 users around the world. David is helping to define the evolution of the network topology to a cloud-based model in Azure that supports changing customer demands and modern application designs.

David Lef identifies the considerations and challenges of supporting a network environment as line of business applications move from an on-premises environment to the cloud using Azure.

Q: Can you explain your role and the environment you support?

A: My role at Microsoft is principal network architect with Microsoft IT. My team supports almost 900 sites around the world and the networking components that connect those sites, which are used by a combination of over 220,000 Microsoft employees and vendors that work on our behalf. Our network supports over 2,500 individual applications and business processes. We’re responsible for providing wired, wireless, and remote network access for the organization, implementing network security across our network (including our network edges), and connectivity to Microsoft Azure in the cloud. We support a large Azure tenancy using a single Azure Active Directory tenancy that syncs with our internal Windows Server Active Directory forests. We have several connections from our on-premises datacenters to Azure using ExpressRoute. Our Azure tenancy supports a huge breadth of Azure resources, some of which are public-facing and some that are hosted as apps and services internal to Microsoft, but hosted on the Azure platform. We currently host forty percent of our LOB apps using Azure, and that number is continually increasing.

Q: Can you give me a quick history of how this migration has happened and how the network teams have supported it?

A: It started with Azure platform as a service (PaaS), which we were using for internal and external app and service solutions. PaaS was the first primary component that was offered on Azure, so we naturally started there when developing solutions. Most of our early applications were hosted completely in Azure. The hybrid scenario wasn’t fully developed or supported, so we didn’t implement that in our solutions.

However, as Azure networking and infrastructure as a service (IaaS) components have been introduced and matured, we’ve had much more flexibility in how we implement Azure-based solutions and how those solutions interact with each other and our on-premises infrastructure.

We adopted a strategy for Azure migration that addressed the most logical migration scenarios first: basic web apps, any new solutions, and any solutions that were targeted for redesign. Next, we looked at some more challenging apps, such as those with large bandwidth or resource usage requirements, or those that had regulatory implications or significantly affected business-critical operations. Finally, the most difficult and costly apps are left, such as customized legacy solutions with code that is difficult to update.

The main enablers for hybrid line of business (LOB) apps were the addition of ExpressRoute, which gives any Azure tenant the ability to obtain a private, dedicated connection to Azure from their datacenter, and the maturation of Azure IaaS, which has enabled us to take on-premises infrastructure and migrate it directly to Azure-hosted virtual machines without having to modify the components of the infrastructure. We have also widely adopted software as a solution (SaaS) solutions, such as Office 365.

Our primary support channel has been to facilitate the connectivity required by hybrid solutions. A lot of the connectivity is on the back end through ExpressRoute and the configuration and management required to connect our datacenters to Azure, but we also manage networking within Azure. There are some important security and compliance considerations when cloud and on-premises mix, and we’ve been diligent in ensuring that our data and infrastructure in the cloud is as secure as it is in our datacenters. Most of our Azure apps and services are available from some Internet touch-point, so it’s important for us to delineate between front end and back end within our solutions. We don’t want our infrastructure living in IaaS exposed to the public Internet.

Q: Have the challenges changed through this journey?

A: They’re always changing! Change is the nature of the cloud, and that’s really the first challenge we faced: understanding that solutions, processes, and methods are fluid and changing in Azure. There is a continuous stream of features being offered and changed; some of them can be inconsequential to a solution, while others can be game changers.

We understand that there are many ways to connect to Azure, from the datacenter perspective and from the user perspective. As much as possible, we try to give our application owners in Azure the freedom to choose how they connect their apps to the datacenter and to their customers.

As an organization, we’ve had to learn how to modify our strategies to focus on the cloud first. Hosting LOB apps in Azure is a fundamental change in how the apps are used and supported. We’ve been diligent in keeping support and communication channels open with our application owners and our customers, and it’s critical that they understand the nature of the cloud and what that means to them and their business. Our cloud-first, mobile-first strategy means that everything is directed to Azure first, and it’s a cultural change that has been spearheaded by our CEO, Satya Nadella, and cascaded down through the rest of the organization. There is an excellent article that highlights the continuing evolution of our cloud strategy.

From a technical perspective, we have a lot of tools and processes in place to help us manage our Azure resources in a way that matches the fluidity of the environment. The integration of Azure Resource Manager (ARM) has been critical to centralizing and standardizing solution management and configuration within Azure. Resource groups and ARM templates provide the functionality we need to configure things properly and ensure they stay configured properly through changes to app requirements or Azure functionality. It’s a constant challenge to maintain both the datacenter and Azure concurrently, both technically and logistically. Migrations to Azure are a constant stream, so the configuration of our datacenters and Azure environments is in constant flux. We have, and will continue to have, a long legacy tail, so we have to ensure that we’re providing and managing communications between Azure and our on-premises environments as proactively as possible, and educating our tenants and users on how to use both effectively.

Learn more

Other blog posts in this series:

Supporting network architecture that enables modern work styles
Engineering the move to cloud-based services
Cloud-first, mobile-first: Microsoft moves to a fully wireless network

Learn more about how Microsoft IT is evolving its network architecture here.
Source: Azure

JSON support is generally available in Azure SQL Database

We are happy to announce that you can now query and store both relational and textual data formatted in JavaScript Object Notation (JSON) using Azure SQL Database. Azure SQL Database provides simple built-in functions that read data from JSON text, transform JSON text into a table, and format data from SQL tables as JSON.

You can use JSON functions that enable you to extract a value from JSON text (JSON_VALUE), extract an object from JSON (JSON_QUERY), update a value in JSON text (JSON_MODIFY), and verify that JSON text is properly formatted (ISJSON). The OPENJSON function enables you to convert JSON text into a table structure. Finally, JSON functionalities enable you to easily format the results of any SQL query as JSON text using the FOR JSON clause.
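To make the semantics concrete, here is a rough sketch in Python of what three of these functions do. The sample document, paths, and helper names are ours for illustration only; they support simple “$.a.b” paths, unlike the real T-SQL functions:

```python
import json

doc = '{"info": {"name": "Jane", "skills": ["SQL", "C#"]}}'

# ISJSON: is the text well-formed JSON? Returns 1 or 0, as in T-SQL.
def isjson(text):
    try:
        json.loads(text)
        return 1
    except ValueError:
        return 0

# JSON_VALUE: extract a scalar by a simple "$.a.b" path.
def json_value(text, path):
    obj = json.loads(text)
    for key in path.lstrip("$.").split("."):
        obj = obj[key]
    return obj

# JSON_QUERY: extract an object or array as JSON text.
def json_query(text, path):
    obj = json.loads(text)
    for key in path.lstrip("$.").split("."):
        obj = obj[key]
    return json.dumps(obj)

print(isjson(doc))                       # 1
print(json_value(doc, "$.info.name"))    # Jane
print(json_query(doc, "$.info.skills"))  # ["SQL", "C#"]
```

In T-SQL, the equivalent calls would appear directly in a SELECT list, for example SELECT JSON_VALUE(col, '$.info.name') FROM mytable.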

What can you do with JSON?

JSON in Azure SQL Database enables you to build and exchange data with modern web, mobile, and HTML5/JavaScript single-page applications and with NoSQL stores such as Azure DocumentDB that contain data formatted as JSON, and to analyze logs and messages collected from different systems and services. Now you can easily integrate your Azure SQL Database with any service that uses JSON.

Easily expose your data to modern frameworks and services

Do you use services that exchange data in JSON format, such as REST services or Azure App Services? Do you have components or frameworks that use JSON, such as Angular JS, ReactJS, D3, or JQuery? With new JSON functionalities, you can easily format data stored in Azure SQL Database as JSON and expose it to any modern service or application.

Easy ingestion of JSON data

Are you working with mobile devices or sensors, services that produce JSON such as Azure Stream Analytics or Application Insights, or systems that store data in JSON format such as Azure DocumentDB or MongoDB? Do you need to query and analyze JSON data using the well-known SQL language or tools that work with Azure SQL Database? Now, you can easily ingest JSON data and store it in Azure SQL Database, and use any language or tool that works with Azure SQL Database to query and analyze the loaded information.
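The ingestion step above relies on OPENJSON, which shreds JSON text into rows. As a rough Python approximation (the sensor payload and function below are hypothetical illustrations, not the actual T-SQL behavior, which also supports a WITH schema clause):

```python
import json

# Hypothetical device telemetry payload, e.g. from Azure Stream Analytics.
payload = '[{"device": "d1", "temp": 21.5}, {"device": "d2", "temp": 19.0}]'

# OPENJSON-like shredding: turn a JSON array into (key, value) rows,
# the way OPENJSON turns JSON text into a table you can query or INSERT.
def openjson(text):
    data = json.loads(text)
    return [(i, item) for i, item in enumerate(data)]

rows = openjson(payload)
for key, value in rows:
    print(key, value["device"], value["temp"])
```

In T-SQL, the same shredding would be written as SELECT * FROM OPENJSON(@payload), optionally followed by an INSERT into a relational table.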

Simplify your data models

Do you need to store and query both relational and semi-structured data in your database? Do you need to simplify your data models like in NoSQL data platforms? Now you can combine structured relational data with schema-less data stored as JSON text in the same table. In Azure SQL Database you can use the best approaches both from relational and NoSQL worlds to tune your data model. Azure SQL Database enables you to query both relational and JSON data with the standard Transact-SQL language. Applications and tools would not see any difference between values taken from table columns and the values extracted from JSON text.

Next steps

To learn how to integrate JSON in your application, check out our Getting Started page or Channel 9 video. To learn about various scenarios that show how to integrate JSON in your application, see demos in this Channel 9 video or find some scenario that might be interesting for your use case in these JSON Blog posts.

Stay tuned because we will constantly add new JSON features and make JSON support even better.
Source: Azure

Retrieve platform notification system error details with Azure Notification Hubs

We enabled Platform Notification System Feedback a while back to improve monitoring and debugging, where all channel error feedback from Platform Notification Systems associated with your hub is put in a storage blob for you to peruse. If you haven’t yet, I recommend checking out the feature and the simple sample we prepared. Many customers found this very useful, but wished for a way to see this feedback per message request to Notification Hubs.

We thought about it and it was a wonderful idea!

With our latest updates, we’ve added a new field, PnsErrorDetailsUri, to Per Message Telemetry; if you haven’t worked with Per Message Telemetry, you can read about it here. This means that, as part of Per Message Telemetry, we process per-message feedback from Platform Notification Systems as we push notifications out, extract the errors, and put them in a blob whose URI is then presented. This makes the feedback much more targeted and useful, helping you detect any errors in your pushes.

Here is an overview of the differences between Platform Notification System Feedback and the PNS Error Details we added to Per Message Telemetry:

 
         Platform Notification System Feedback            Per Message Telemetry’s PNS Error Details
Scope    Notification hub                                 Notification ID
Content  Expired channel and bad channel errors from PNS  Any errors from PNS

Both PNS Feedback and PNS Error Details are available for Standard Tier namespaces.

If you are using REST, the Per Message Telemetry response will include an additional PnsErrorDetailsUri field when you work with Api-Version 2016-07 or above. The errors can be any of the following:

Invalid PNS credentials
PNS unreachable
Bad channel
Expired channel
Wrong channel
PNS throttled
Invalid token
Wrong token
Dropped

Note that the error details are only fully available after the associated notification send operation is complete, and that you will get NULL for PnsErrorDetailsUri if there is no error.
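As a sketch of what a raw REST call might involve, the snippet below builds the per-message telemetry request URL and a shared access signature (SAS) token in Python. The exact endpoint path is our assumption based on common Notification Hubs REST conventions, and the namespace, hub, key, and notification ID are placeholders; check the REST API reference for the authoritative format:

```python
import hmac, hashlib, base64, time, urllib.parse

# Hypothetical endpoint shape for per-message telemetry (our assumption).
def telemetry_url(namespace, hub, notification_id, api_version="2016-07"):
    return ("https://{0}.servicebus.windows.net/{1}/messages/{2}"
            "?api-version={3}").format(namespace, hub, notification_id, api_version)

# Standard Service Bus-style SAS token: HMAC-SHA256 over the
# URL-encoded resource URI and an expiry timestamp.
def sas_token(resource_uri, key_name, key, ttl_seconds=300):
    expiry = str(int(time.time()) + ttl_seconds)
    encoded = urllib.parse.quote_plus(resource_uri)
    to_sign = encoded + "\n" + expiry
    sig = base64.b64encode(
        hmac.new(key.encode(), to_sign.encode(), hashlib.sha256).digest())
    return ("SharedAccessSignature sr={0}&sig={1}&se={2}&skn={3}"
            .format(encoded, urllib.parse.quote_plus(sig), expiry, key_name))

# Placeholder values for illustration only.
url = telemetry_url("mynamespace", "myhub", "some-notification-id")
token = sas_token(url, "DefaultFullSharedAccessSignature", "base64KeyHere")
print(url)
```

The token goes in the Authorization header of the GET request; the returned telemetry document then contains the PnsErrorDetailsUri value when an error occurred.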

If you are using our NuGet package, simply add a few lines of code to extract the PnsErrorDetailsUri from the notification outcome details and read its blob content.

// Get Notification ID from any send request
var outcome = await client.SendWindowsNativeNotificationAsync(winPayload.ToString(), tag);

// Get pns error detail uri once notification processing is complete
var feedbackUri = string.Empty;
var retryCount = 0;
while (retryCount++ < 6)
{
    var result = client.GetNotificationOutcomeDetailsAsync(outcome.NotificationId).Result;
    if (result.State != NotificationOutcomeState.Completed)
    {
        await Task.Delay(TimeSpan.FromSeconds(10));
    }
    else
    {
        feedbackUri = result.PnsErrorDetailsUri;
        break;
    }
}
if (!string.IsNullOrEmpty(feedbackUri))
{
    Console.WriteLine("feedbackBlobUri: {0}", feedbackUri);
    var feedbackFromBlob = ReadFeedbackFromBlob(new Uri(feedbackUri));
    Console.WriteLine("Feedback from blob: {0}", feedbackFromBlob);
}

You can easily read the blob with the following method, using the Azure Storage NuGet package.

private static string ReadFeedbackFromBlob(Uri uri)
{
    var currentBlock = new CloudAppendBlob(uri);
    var stringbuilder = new StringBuilder();
    using (var streamReader = new StreamReader(currentBlock.OpenRead()))
    {
        while (!streamReader.EndOfStream)
        {
            string currentFeedbackString = streamReader.ReadLine();
            if (currentFeedbackString != null)
            {
                stringbuilder.AppendLine(currentFeedbackString);
            }
        }
    }
    return stringbuilder.ToString();
}

We will be updating the Node.js SDK soon to enable this feature as well. Meanwhile, give it a try with our NuGet or REST APIs and let us know what you think!
Source: Azure

Why enterprises trust Azure with their apps and data

It takes a lot to earn the trust of enterprise IT, and rightly so: software runs the operations of almost every business around the world. For Microsoft, earning your trust has been a multi-decade investment, not something we started after we got into the cloud business. Everything we’ve done to earn your trust over the years we have applied to Azure.

At Microsoft, security, privacy, and compliance considerations have been baked into the development process for a very long time – it’s core to our culture. The Secure Development Lifecycle, an open methodology which developers can use to help them build more secure software, was invented at Microsoft over a decade ago and has been adopted broadly, across industries: safer software helps everybody.

Of the nearly $12 billion Microsoft spent on research and development last year, $1 billion was focused on our cybersecurity efforts. Because so many individuals and businesses rely on Microsoft, we feel a great responsibility here; and we have a distinct vantage point. As Ann Johnson, VP of our enterprise cybersecurity team, writes, “Microsoft has a unique position in cybersecurity. Because of the massive scale of information that Microsoft processes, for example, billions of device updates and hundreds of billions of emails and authentications, we’re able to synthesize threat data far faster than your organization could ever do it alone.”

Microsoft’s Digital Crimes Unit (pictured) works with attorneys and law enforcement around the globe to catch digital criminals. They use sophisticated analytics and visualization tools, running in Azure, of course.

Thinking about regulatory compliance like HIPAA, PCI, FedRAMP, and hundreds of other standards in other countries, Microsoft has more certifications than any other cloud provider, and is continually adding more. Check out my colleague Alice Rison’s frequent updates on the Azure blog.

We’re continually researching new technologies to further advance the state of the art in digital security and privacy. For example, with homomorphic encryption, it’s possible to perform operations on data while never decrypting it, and you can download open-source code from Microsoft Research to try it today. And Microsoft’s work in post-quantum cryptography helps ensure that security can be maintained even when, in the future, quantum computers are able to break the RSA cryptosystem, the standard today.
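As a toy illustration of the idea (not of Microsoft Research’s actual homomorphic encryption scheme), textbook RSA is multiplicatively homomorphic: multiplying two ciphertexts yields a ciphertext of the product, with no decryption in between. This sketch uses deliberately tiny, wholly insecure parameters:

```python
# Toy textbook RSA: E(m1) * E(m2) mod n decrypts to m1 * m2 mod n.
p, q = 61, 53
n = p * q                  # modulus: 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent (modular inverse, Python 3.8+)

def enc(m): return pow(m, e, n)   # encrypt
def dec(c): return pow(c, d, n)   # decrypt

m1, m2 = 7, 11
# Multiply the ciphertexts without ever decrypting them...
product_cipher = (enc(m1) * enc(m2)) % n
# ...and the result decrypts to the product of the plaintexts.
print(dec(product_cipher))  # 77
```

Practical homomorphic encryption schemes support richer operations (and real security), but the underlying promise is the same: compute on data while it stays encrypted.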

The point? Microsoft has your back.

Is it any wonder then that enterprises increasingly are turning to Azure? Two important studies, one from infrastructure security firm HyTrust and another from Cowen & Company, both show that the majority of you are thinking about Azure. In particular, Cowen’s study showed that 73 percent of you expect to adopt Azure in the next 12 to 18 months.

We appreciate your confidence! We’ll have a lot more to say about security, and a wide array of other topics, at Ignite on September 26-30 in Atlanta. If you can’t make it, be sure to save the date and watch online.
Source: Azure

Implementing Microsoft's hybrid cloud solution

Azure StorSimple is Microsoft’s hybrid cloud storage solution that leverages the cloud to solve one of the most pressing challenges of the enterprise customer: managing rapid data growth. In the last 12 months, rapid innovations like StorSimple Virtual Array (a software version of StorSimple solution) and Local Volumes (which allows storage administrators to decide which data needs to remain in the local tiers) have expanded the value of Azure StorSimple for customers and partners.

Between August and October 2016, Microsoft is offering a series of educational sessions for IT Pros to become intimately familiar with some of these innovations, and to get a practitioner’s view of planning, implementing, and managing the deployment scenarios. All sessions are delivered (over Skype) by seasoned professionals with several years of consulting and implementation experience. The sessions consist of 40% theory and 60% demos and practical tips. As a follow-up to the sessions, participants can request personalized 1-1 consulting sessions on actual customer scenarios that they are working on. In addition, the sessions introduce the audience to several resources for continued learning. In the last 6 months, these sessions have been widely acclaimed for their efficacy and usefulness to the IT Pro community, training over 900 professionals across 650 unique organizations globally. While it’s best to have a background in storage, the sessions will also benefit infrastructure professionals who want to grow their expertise in Azure hybrid cloud storage.

A synopsis of the topics and the registration links can be accessed here: https://infopedia.eventbuilder.com/Hybrid-Cloud-Storage-with-StorSimple

Topics Summary: Azure StorSimple Implementation and Scenarios: Local Volumes, Virtual Array, Backup, DR, Azure Migration, Enterprise File Collaboration
Source: Azure

Announcing Azure App Service MySQL in-app (preview)

Today, we’re announcing a cool new feature (in preview) for Web developers using Azure App Service to create Web applications that use MySQL. MySQL in-app enables developers to run the MySQL server side-by-side with their Web application within the same environment, which makes it easier to develop and test PHP applications that use MySQL.

We’re also making it very easy to get started with this feature via the Azure portal. During the creation of your Web App, you’ll be able to select a “MySQL in-app (preview)” provider for your database, which will help provision the database.

We think this feature will be very welcome among Web developers who are looking to accelerate their testing because:

It supports many PHP applications that use MySQL, such as WordPress, Joomla, and Drupal.
It’s cost-effective since there’s no additional cost to use this feature and you only pay for the App Service plan (since resources are shared).
The MySQL and Web processes are co-located in the same environment (hence the term in-app) which means storage is shared.
It includes support for slow query logging and general logging, which you can turn on as needed (logging impacts performance, so you shouldn’t leave it on all the time).

Since this feature is in preview, and shares its resources with the Web application in the same App Service plan, MySQL in-app is not recommended for production applications. Please also keep in mind the following tips and limitations when using this feature:

Check your storage limits and upgrade the web app pricing plan as needed to accommodate data for both MySQL and your web app. For storage and memory limits for your pricing tier, review the quota limitations for all App Service plans pricing tiers.
Note you only get one MySQL database per web application. In a scenario where you have a deployment slot web app and a production web app, you will get one MySQL database for the deployment slot and one MySQL database for the production web app, if you decide to turn on this feature for each app. The database contents will not be synchronized, which makes it easier for you to try different schema versions and content.
The auto scale feature is not supported since MySQL currently runs on a single instance. Similarly, enabling local cache is not supported.
The MySQL database cannot be accessed remotely using the MySQL CLI or other tools that access external endpoints. You can only access your database content using PHPMyAdmin (which is bootstrapped upon provisioning) or using the KUDU debug console.
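Because the database is co-located with the web app, applications typically read the local connection details from an App Service setting (for example, a MYSQLCONNSTR_localdb value; check the documentation for the exact setting name and format, which are our assumptions here). A minimal Python sketch of parsing such a key=value connection string, with illustrative sample values:

```python
# Parse an App Service-style "Key=Value;Key=Value" connection string.
# The setting name and sample values below are illustrative assumptions.
def parse_connstr(connstr):
    parts = (p for p in connstr.split(";") if p)
    return dict(p.split("=", 1) for p in parts)

sample = "Database=localdb;Data Source=127.0.0.1:49667;User Id=azuredb_user;Password=secret"
cfg = parse_connstr(sample)
host, port = cfg["Data Source"].split(":")
print(cfg["Database"], host, port)  # localdb 127.0.0.1 49667
```

Note that the port is dynamically assigned per instance, which is one reason apps should read the connection string at runtime rather than hard-coding it.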

The team continues to work with Web developers to improve their experience in Azure App Service, particularly when it comes to data solutions. Over the last few months, we’ve come a long way in our data solution portfolio for Web developers, including revamping our PHP client drivers for Azure SQL, a new version of the JDBC drivers, expanded Linux support in our ODBC drivers, MongoDB protocol support in DocumentDB and, earlier this week, an early technical preview of the new PHP on Linux SQL Server drivers. We will continue working on more data solutions that make it easier for Web developers to bring great applications to market in Microsoft Azure, whatever the language, stack, and platform.

If you’re using MySQL in-app for development and testing and you are interested in migrating this application to production, Azure offers many solutions, including:

ClearDB Database
ClearDB Clusters
Marketplace solutions for MySQL, MariaDB, and other MySQL-compatible solutions from partners like Bitnami and MariaDB
Community-contributed Azure Resource Manager (ARM) templates deploying on VMs
MySQL on virtual machine on Linux or Windows OS

We hope you get started with MySQL in-app in Azure App Service today, and share with us your feedback. Don’t have a subscription? Sign up for a free trial! And if you’re interested in getting more details about this feature, make sure you check out the detailed blog post.
Source: Azure

Live from LinuxCon – Sharing the latest news and learnings on Microsoft’s open journey

Greetings from LinuxCon North America in Toronto, where I am representing Microsoft as a keynote speaker for the first time! I’m excited to share new open source developments from Microsoft and what we’ve learned on our journey with Linux and open source. Of course, I also look forward to catching up with old friends and meeting up with some customers and partners.

Over the past few months I’ve been asked more times than I can count, “Wim, why did you join Microsoft?” As a Linux guy who has watched the company from afar, I am the first to admit that Microsoft hasn’t always been the most open company. After talking to some of the executives at the company, I found that the days of a closed Microsoft are over.

The reality is customers use more than one tool and more than one platform to operate their businesses. They need tools that support Linux and Windows, and they need a cloud that allows them to run any application. One of the things I shared with linux.com recently was how blown away I was to see how large Microsoft’s investment in Linux already is. We brought .NET Core, PowerShell, and SQL Server to Linux. We also open sourced Visual Studio Code and just recently PowerShell. And, we are contributing to and participating in numerous community projects. It’s incredible to be a part of it.

Our latest open source and Linux advancements

One of the areas we are focused on is delivering open management solutions. In today’s multi-cloud, multi-OS world, customers need simple, unified tools to reduce complexity. That’s why just last week, we announced that we’re open sourcing PowerShell and making it available on Linux. Now PowerShell users across Windows and Linux can use our popular command-line shell and scripting language to manage almost everything from almost anywhere. My colleague Jeffrey Snover wrote a fantastic story about the journey to open source PowerShell and how customer-centricity brought us here – go check it out!

We’re also investing in making Microsoft Operations Management Suite (OMS), which gives you visibility and control of your applications and workloads across Azure and other clouds, a first-class tool for managing Linux environments. Last week, we announced that the OMS Monitoring Agent for Linux is generally available, delivering rich insights and real-time visibility into customers’ Linux workloads to quickly remediate issues. A lot of the tools we use and integrate with are open source-based, such as fluentd and integration with auditd and the like.

Today, I’m also excited to share that OMS Docker Container monitoring is available in preview. By nature, containers are lightweight and easily provisioned, so without a centralized approach to monitoring, customers may find it difficult to manage and respond to critical issues quickly. With OMS Docker Container monitoring, you get visibility into your container inventory, performance, and logs from one place, get a simplified view of containers’ usage, and can diagnose issues whether your containers are running in the cloud or on-premises. You may have seen Mark Russinovich demo this live at DockerCon in June, and we’re thrilled you can try it for yourself.

What we’ve learned on our journey and what’s next

These are all important milestones for Microsoft that reflect our journey of learning and the thoroughness of our open source approach across infrastructure investments; new governance processes that work with and for the community; new ways to incorporate customer and partner feedback; and the deepening of partnerships to make great experiences possible for organizations of all types. In my keynote tomorrow, I will talk about how we are applying our learnings to the Linux ecosystem, what our approach to open source is, what it means for Linux users, and how my team and I are working to take this to the next level.

Our experiences with Linux in Azure, where nearly 1 in 3 VMs today are Linux, have brought us closer to our customers and what they need to succeed in a rapidly advancing world. We have made significant investments in making Microsoft’s platform a great place to run open source software, and I will be working with my team to accelerate this effort over the coming months.

Choice and flexibility are important tenets of our platform. Also critical are our efforts to contribute to open source projects, integrate open source technologies in our platform, and forge commercial and community partnerships with the ecosystem. It’s not just about what we’re open sourcing or making available on Linux. Microsoft is committed to contributing and participating in open source projects, like our investments in OMI and fluentd, our focus on Chakra and TypeScript, and many other projects including the fantastic work from our Microsoft Research organization. To take it a step further, one of the things my team and I have learned is how to partner with the community to make our contributions viable and sustainable, in ways that work for the community. I will be sharing many of those examples in my keynote.

It’s now been a few months since I joined Microsoft. It’s an exciting time to be at this company. I have to say that Linux and open source have become a normal part of our day-to-day business at Microsoft – from our people, our products, our vision, and our investments. I’m excited at what the future will bring with more first- and third-party projects, technologies, and partnerships that will bring great experiences to our customers using Linux and open source technologies.

If you’re at LinuxCon, please join me and the open source team in booth 3 this week, and follow us on Twitter for more details about my keynote. If you’re not attending, make sure you visit the Azure.com Linux page to learn more about our work with Linux and open source technologies.
Source: Azure

Azure Government Blog: Introduction & highlights

In case you haven’t noticed, we’ve been super busy here at Azure Government, and have been posting all our exciting updates and items of interest on our Azure Government Blog!

In this blog, we announce new functionality, how to’s, security and compliance updates, and invites to our Meetups and other related events. We’re super excited about everything that’s going on with Azure Government, and we’re committed to keeping you informed through this blog. We also welcome comments and feedback on the site, and encourage you to engage with us there.

Recent major new capabilities we have made available include: expanded compliance coverage with FedRAMP High, DISA L4, and ITAR support; CJIS support in five additional states (Tennessee, Rhode Island, Montana, Alaska, and Virginia); the extension of the Microsoft and Red Hat partnership to Azure Government; a preview release of our new Azure Government portal; and a host of new VM images. It’s been busy!

If there are any topics in particular that you would like to see, please comment below. To stay up to date on all things Azure Government, be sure to subscribe to our RSS feed and to receive emails by clicking “Subscribe by Email!” on the Azure Government Blog.

To experience the power of Azure Government for your organization, sign up for an Azure Government Trial. For all things related to security, privacy, transparency, and compliance, check out the Microsoft Trust Center.

Happy reading everyone!
Source: Azure

PowerShell is open sourced and is available on Linux

Today’s customers live in a multi-platform, multi-cloud, multi-OS world – that’s just reality. This world brings new challenges and customers need tools to make everything work together. Microsoft is working company-wide to deliver management tools that empower customers to manage any platform, from anywhere, on any device, using Linux or Windows. This shift to a more open, customer-obsessed approach to deliver innovation is one of the things that makes me most excited to come to work every day.

You’ve heard Satya Nadella say “Microsoft loves Linux,” and that’s never been more true than now. Nearly one in three VMs on Azure are Linux. Nearly 60 percent of third-party IaaS offers in the Azure Marketplace are open source software (OSS). We have forged strong industry partnerships to extend choice to our customers. We’ve announced SQL Server on Linux, as well as open sourced .NET. We added Bash to Windows 10 to make it a great platform for developing OSS. And, we’re active contributors and participants in numerous open source projects (e.g. OpenSSH, FreeBSD, Mesos, Docker, Linux, and many more) across the industry.

Today, we are taking the next step in our journey. I am extremely excited to share that PowerShell is open sourced and available on Linux. (For those of you who need a refresher, PowerShell is a task-based command-line shell and scripting language built on the .NET Framework to help IT professionals control and automate the administration of the Windows, and now Linux, operating systems and the applications that run on them.) I’m going to share a bit more about our journey getting here, and will tell you how Microsoft Operations Management Suite can enhance the PowerShell experience.
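For readers new to PowerShell, a short illustration of that object-based pipeline helps: the command below pipes live process objects (not text) between cmdlets, so it behaves the same on Windows and on Linux.

```powershell
# List the five processes consuming the most CPU time.
# Each stage receives full .NET objects, so properties like
# Name, Id, and CPU can be sorted and selected by name.
Get-Process |
    Sort-Object -Property CPU -Descending |
    Select-Object -First 5 -Property Name, Id, CPU
```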

The journey to opening up PowerShell

Let’s start the journey to PowerShell on Linux, and an open sourced version with a story…

My customer was upset.

Early in the days of Monad (before it was PowerShell), I gave a demo to an executive of a large customer. He loved it but now he was angry with me.

He had asked me in what language it was implemented and was unhappy with my answer: C#. Confused, I asked why that was a problem. “Because” he told me, “Monad is exactly the right approach and I want to standardize my entire company on it, but I can’t because .NET is not available on Linux!”

In the past, Microsoft’s business focus meant that .NET, and thus PowerShell, was only available on Windows. But this is a new Microsoft. Satya’s leadership and Azure have shifted us to a more open, customer-centric mindset.

Microsoft wants to earn customers’ preference as the platform for running all their workloads – Linux as well as Windows. This new thinking empowered the .NET team to port .NET Core to Linux, and that in turn enabled us to port PowerShell to Linux as well. PowerShell on Linux is designed to enable customers to use the same tools, and the same people, to manage everything from anywhere. It is initially available on Ubuntu, CentOS, and Red Hat, and also runs on Mac OS X. More platforms will be added in the future. You can download Alpha builds and check out the source code from GitHub.

Now, users across Windows and Linux – current and new PowerShell users, even application developers – can experience a rich interactive scripting language as well as heterogeneous automation and configuration management that works well with existing tools. Your PowerShell skills are now even more marketable, and your Windows and Linux teams, who may have had to work separately, can now work together more easily.

So, where are we in this journey? We are in the beginning stages and in learning mode. We started by open sourcing small portions of PowerShell and talking to a number of our partners who were experienced with open source to understand what it takes to succeed. What we learned is that it is critical that individual users can use Git to check out code, make changes, compile everything on their machines, and run all the tests to verify that their changes didn’t break anything. This required a large investment in our engineering/build/test systems. We also worked to define a governance model with clear roles, responsibilities, and processes so that community contributions could be smoothly incorporated into the product.

The PowerShell team has always prided itself on being a very community-focused team, but this announcement takes it to the next level: we are making the source code available and adopting an open source development model where we can enjoy a deeper connection with the community through RFCs and issues, and accept contributions directly. We also needed to extend our community, since open source, like so many things, takes a village – and that village is key to a great experience! We are partnering with third-party companies – Chef, Amazon Web Services, VMware, and Google, to name a few – to create a rich, seamless experience across the platforms you know and use.

As we port PowerShell to Linux, we are making sure that we are a first-class citizen on that platform: we fit in well with the architecture, idioms, and existing tools. This was pretty easy, as most of the original PowerShell team had deep Unix backgrounds, and that shows in our design. There were a number of small changes that we made, and two big things:

We created a PowerShell Editor Service. This allows users to choose from a range of editors (VS Code and Sublime, with others to follow) and get a great PowerShell authoring experience with IntelliSense, debugging, and more.
We will be extending the PowerShell Remoting Protocol (MS-PSRP) to use OpenSSH as a native transport. Users will have the option to use SSH or WinRM as a transport.
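To make the second point concrete, here is a sketch of what remoting could look like once the OpenSSH transport is in place; the host names are placeholders, and the SSH-based parameter names shown are illustrative of the planned parameter set, not a shipped API.

```powershell
# Hypothetical: open an interactive session to a Linux host
# over the planned OpenSSH transport.
Enter-PSSession -HostName linuxbox01 -UserName admin -SSHTransport

# The existing WinRM-based transport remains available:
Enter-PSSession -ComputerName winbox01 -Credential (Get-Credential)
```

Either way, the session you land in is the same PowerShell, so scripts and modules carry over regardless of the transport underneath.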

The initial release is an “alpha” and is community supported. In the future, we will deliver an official Microsoft released version of PowerShell based on open source to anyone running a supported version of Windows or *nix. The timing of the official Microsoft version will be based upon community input and business needs. We hope all of you will help us get it right!

Extending the PowerShell experience with Microsoft Operations Management Suite

I also want to tell you more about how today’s PowerShell news extends what you can do with our cloud management solution, Operations Management Suite (OMS). OMS gives you visibility and control of your applications and workloads across Azure and other clouds. Integral to this, it enables customers to transform their cloud experience when using PowerShell on both Linux and Windows Server. OMS Automation elevates PowerShell and Desired State Configuration (DSC) with a highly available and scalable management service from Azure. You can graphically author and manage all PowerShell resources – including runbooks, DSC configurations, and DSC node configurations – from one place.
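For readers who haven’t used DSC before, a minimal configuration looks something like the sketch below; the node name, file path, and contents are placeholder values, and compiling it requires the PSDesiredStateConfiguration module.

```powershell
# A minimal DSC configuration that declares a file which
# must exist with specific contents on the target node.
Configuration EnsureMotd {
    Node "localhost" {
        File Motd {
            DestinationPath = "/etc/motd"       # placeholder path
            Contents        = "Managed by DSC"  # placeholder contents
            Ensure          = "Present"
        }
    }
}

# Invoking the configuration compiles it into a .mof document
# that the Local Configuration Manager can then apply.
EnsureMotd -OutputPath ./EnsureMotd
```

Services like OMS Automation build on exactly this model: you author the configuration once, and the service distributes and enforces it across your nodes.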

With the OMS hybrid runbook worker, you can extend your OMS Automation capability and apply, monitor, and update configurations anywhere, including on-premises. Today we also made the OMS monitoring agent for Linux generally available. Together, these give customers rich insights and real-time visibility into their Linux workloads, and the power to quickly remediate any issues that may arise.

We hope that all of you will take the time to test drive PowerShell on Linux and let us know what you think! You can also learn more about OMS Automation here. And, be sure to check us out at LinuxCon next week. Wim Coekaerts will be giving a keynote address, we’ll have a booth where we’ll be showing PowerShell, and I’ll be doing a session that will be packed with demos.

If you are new to PowerShell, a great way to start learning is with our Learning PowerShell repository on GitHub. We also offer a free Microsoft Virtual Academy online course: Getting Started with PowerShell 3.0 Jump Start. You’ll want to join/participate in the PowerShell Community at powershell.org/ and follow the PowerShell Team blog. We’ll be updating these to meet the needs of the Linux community (e.g. examples) in the near future.

This has been a long time coming, and it is going to be a lot of fun – so please join us, so that together we can produce a tool that knocks our socks off every time we use it.

Source: Azure