SQL Server 2016 innovations power Azure SQL Data Warehouse to deliver faster insights

Azure SQL Data Warehouse (SQL DW) is a SQL-based, petabyte-scale, massively parallel cloud solution for data warehousing. It is fully managed and highly elastic, enabling you to provision and scale capacity in minutes. You can scale compute and storage independently, allowing you to range from burst to archival scenarios, and pay based on what you use instead of being locked into a cluster configuration.

The engine underneath Azure SQL Data Warehouse that runs the queries on each individual node is the industry-leading SQL Server database engine from Microsoft. With general availability in 2016, Azure SQL DW received an upgrade to SQL Server 2016 that transparently provided a 40% performance increase to user workloads comprising analytic queries.

The two performance pillars of SQL DW are its columnstore and the batch mode execution engine, also known as vectorized query execution. In this blog, we highlight the improvements in SQL Server 2016 that took SQL Data Warehouse performance to a new level. These are all in addition to existing features such as columnar compression and segment elimination. We already had batch mode execution, which can process multiple rows at a time instead of one value at a time and take advantage of SIMD hardware innovations. SQL Server 2016 further extends batch mode execution to more operators and scenarios.

The following are the key SQL Server 2016 performance innovations for columnstore and batch mode. Each links to a detailed blog providing examples and observed performance gains.

Aggregate Pushdown

Aggregates are very common in analytic queries. With columnstore tables, SQL Server processes aggregates in batch mode, delivering an order of magnitude better performance. SQL Server 2016 further dials up aggregate computation performance by pushing the aggregation down to the SCAN node. This allows the aggregate to be computed on the compressed data during the scan itself.
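As a sketch, an aggregate query of the following shape (table and column names are illustrative) is a candidate for pushdown to the scan:

```sql
-- Hypothetical clustered columnstore fact table. The SUM and GROUP BY
-- can be evaluated on compressed column segments during the scan itself,
-- rather than in a separate aggregation operator above it.
SELECT ProductKey,
       SUM(SalesAmount) AS TotalSales
FROM dbo.FactSales
GROUP BY ProductKey;
```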

String Predicate Pushdown

Columnstore in SQL Server 2016 allows string predicates to be pushed down to the SCAN node, resulting in a significant improvement in query performance. String predicate pushdown leverages dictionaries to minimize the number of string comparisons.
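For illustration, a filter like the one below (hypothetical table and value) can be evaluated against the columnstore dictionary during the scan, so each distinct string is compared only once rather than once per row:

```sql
-- The string comparison is done against dictionary entries before
-- qualifying rows are returned from the scan.
SELECT COUNT(*)
FROM dbo.FactSales
WHERE ProductName = N'Road Bike';
```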

Multiple Aggregates

SQL Server 2016 now processes multiple aggregates on a table scan more efficiently, in a single batch mode aggregation operator. Previously, multiple aggregation paths and operators would be instantiated, resulting in slower performance.
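A sketch of the pattern that benefits (hypothetical table): all three aggregates below share one scan and one batch mode aggregation operator, instead of separate aggregation paths:

```sql
SELECT MIN(SalesAmount) AS MinSale,
       MAX(SalesAmount) AS MaxSale,
       SUM(SalesAmount) AS TotalSales
FROM dbo.FactSales;
```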

Batch Mode Window Aggregates

SQL Server 2016 introduces batch mode execution for window aggregates. Batch mode has the potential to speed up certain queries by up to 300 times, as measured in some of our internal tests.
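For example, a running total computed with a window aggregate, such as the hypothetical query below, can now execute in batch mode:

```sql
-- Window aggregate over a columnstore table; previously this ran in row mode.
SELECT ProductKey,
       OrderDateKey,
       SUM(SalesAmount) OVER (PARTITION BY ProductKey
                              ORDER BY OrderDateKey) AS RunningTotal
FROM dbo.FactSales;
```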

Batch Mode in Serial Execution

High concurrent activity and/or a low number of cores can force queries to run serially. Previously, serial queries were forced to run in row mode, taking a double hit from the lack of parallelism and the lack of batch mode. SQL Server 2016 can run batch mode even when the degree of parallelism (DOP) for a query is 1 (DOP 1 means the query runs serially). SQL Data Warehouse at lower SLOs (less than DWU1000) runs each distribution query serially, as there is less than one core per distribution. With this improvement, these queries now run in batch mode.
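One way to observe this behavior (a sketch, using a hypothetical table) is to force a serial plan and inspect the actual execution plan; on SQL Server 2016 the columnstore scan and aggregate can still report batch mode at DOP 1:

```sql
SELECT OrderDateKey,
       SUM(SalesAmount) AS TotalSales
FROM dbo.FactSales
GROUP BY OrderDateKey
OPTION (MAXDOP 1);   -- serial plan; operators can still run in batch mode
```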

The above is quite an extensive list of performance boosts that SQL Data Warehouse now benefits from. Best of all, no change is required to SQL Data Warehouse user queries to get the above performance benefits – it is all automatic under the hood!

Next steps

In this blog we described how SQL Server 2016 innovations in columnstore and batch mode technologies give a huge performance boost to Azure SQL Data Warehouse queries. We encourage you to try it out by moving your on-premises data warehouse into the cloud.

Learn more

Check out the many resources to learn more about SQL Data Warehouse.

What is Azure SQL Data Warehouse?

SQL Data Warehouse best practices

Video library

MSDN forum

Stack Overflow forum
Source: Azure

Now available: Azure invoices emailed direct to your inbox

Instead of downloading your invoice every month, you can now opt in to receive your invoice statement attached to your monthly billing email. In addition, you can configure additional recipients for this email. Save time and send the invoice directly to your accounts receivable department.

How to opt in

Select your subscription from the subscriptions blade. You have to opt in for each subscription you own. Click Send my invoice (you may not see this option if you are not the account admin), and then opt in.
Once you've accepted the agreement, you can configure additional recipients:
You can also access this blade from the billing history blade, or through a deep link in your monthly statement notification email:

If you can't access the email settings blade:

• You must be the account administrator to configure this setting. Not sure what this means? Learn more here.

• If you have a monthly invoice but aren't receiving an email, make sure you have your communication email properly set.

• This feature is only available in the direct channel and may not be available for certain subscriptions such as support offers or Azure in Open.
Source: Azure

Microsoft and Tierion collaborate on attestations & Blockchain proofs

Daniel Buchner from the Identity team is working with a wide array of organizations on the decentralized identity initiative here at Microsoft. One such organization, Tierion, has just committed to a collaboration on attestations, which is described in this post.

The goal of Microsoft’s decentralized identity initiative is to give people and organizations control of their identity and related data. We’re building technology that lets users sign data, claims, or agreements with their identities. These bits of identity-signed data are called attestations. Microsoft and Tierion are collaborating on a service that generates, manages, and validates attestations. Together we’re exploring how this technology serves the needs of developers and organizations.

In the future, you might take an online course and receive an attestation proving you completed the required work. This attestation is digitally signed by the educational organization’s decentralized identifier and a timestamp proof that is rooted in a secure public blockchain. Anyone can verify the identities and validate this data without trusting the signers or their service providers. The blockchain serves as the root of trust. Attestations will be kept in secure datastores that are fully controlled by users. The industry sometimes calls this self-sovereign identity.

How does Tierion fit into this picture? Non-repudiation is important to regulated industries such as financial services, healthcare, and insurance. These organizations need to prove there hasn’t been collusion to backdate or modify data. Tierion links data to the blockchain and generates a timestamp proof of the data’s integrity and existence. Anyone with this proof can independently verify the data without relying on a trusted authority.

Public blockchains such as Bitcoin are exceptionally secure, but slow. The current throughput of the Bitcoin network is about four transactions per second. Tierion solves this scalability problem by cryptographically linking millions of data points to a single transaction. We’re working with Tierion on a service that leverages the open source Chainpoint protocol their team developed for using the blockchain as a trust anchor.

The collaboration between Microsoft and Tierion is another important step in bringing the best blockchain-based tools and services to developers. As we move forward with our decentralized identity initiative, look for more content, collaborations, and announcements in the coming months.

 
Source: Azure

SQL Database Query Editor available in Azure Portal

We are excited to announce the availability of an in-browser query tool that provides you an efficient way to execute queries on your Azure SQL Databases and SQL Data Warehouses without leaving the Azure Portal. This SQL Database Query Editor is now in public preview in the Azure Portal.

With this editor, you can access and query your database without needing to connect from a client tool or configure firewall rules.

The various features in this new editor create a seamless experience for querying your database.

Query Editor capabilities

Connect to your database

Before executing queries against your database, you must log in with either your SQL server or Azure Active Directory (AAD) credentials. If you are the AAD admin for this SQL server, you will be automatically logged in when you first open the Query Editor using AAD single sign-on.

Learn more about how to configure your AAD server admin. If you are not currently taking advantage of Azure Active Directory, you can learn more.

Write and execute T-SQL scripts

If you are already familiar with writing queries in SSMS, you will feel right at home in the in-browser Query Editor.

Many common queries can be run in this editor, such as creating a new table, displaying table data, editing table data, creating a stored procedure, or dropping a table. You have the flexibility to execute partial queries or batch queries in this editor. And with syntax highlighting and error indicators, this editor makes writing scripts a breeze.
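For instance, the common operations just mentioned can all be run directly in the editor (table and column names are illustrative):

```sql
-- Create a table, load a row, query it, and clean up.
CREATE TABLE dbo.Customers
(
    Id   INT           NOT NULL PRIMARY KEY,
    Name NVARCHAR(100) NOT NULL
);

INSERT INTO dbo.Customers (Id, Name) VALUES (1, N'Contoso');

SELECT Id, Name FROM dbo.Customers;

DROP TABLE dbo.Customers;
```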

Additionally, you can easily load an existing query file into the Query Editor or save your current script to your local machine, making it convenient to save and port queries between editors.

Manage query results

Another similarity between this Query Editor and SSMS is the ability to resize the Results pane to get the desired ratio between the Editor and Results sections. You can also filter results by keyword rather than having to scroll through all the output.

How to find Query Editor

SQL Database

You can find this experience by navigating to your SQL database and clicking the Tools command and then clicking Query Editor (preview), as shown in the screenshots below. While this feature is in public preview, you will need to accept the preview terms before using the editor.

SQL Data Warehouse

You can find this experience by navigating to your SQL data warehouse and clicking on Query Editor (preview), shown in the screenshot below. While this feature is in public preview, you will need to accept the preview terms before using the editor.

Run sample query

You can quickly test out the editor by running a simple query, such as in the screenshot below.

Send us feedback!

Please reach out to us with feedback at sqlqueryfeedback@microsoft.com.
Source: Azure

What’s brewing in Visual Studio Team Services: January 2017 Digest

This post series provides the latest updates and news for Visual Studio Team Services and is a great way for Azure users to keep up-to-date with new features being released every three weeks. Visual Studio Team Services offers the best DevOps tooling to create an efficient continuous integration and release pipeline to Azure. With the rapidly expanding list of features in Team Services, teams can start to leverage it more efficiently for all areas of their Azure workflow, for apps written in any language and deployed to any OS.

Release Management is generally available

Release Management is now generally available. Release Management enables you to create a continuous deployment pipeline for your applications with fully automated deployments, seamless integration with Azure, and end-to-end traceability.

Azure App Services Continuous Delivery

Additionally, Release Management is now available in the Azure Portal. You can start using this feature today by navigating to your app’s menu blade in the Azure portal and clicking on APP DEPLOYMENT > Continuous Delivery (Preview).

Package Management is generally available

Package Management is available as an extension to Team Services and Team Foundation Server 2017 for hosting your packages and making them available to your team, your builds, and your releases. In addition to support for NuGet packages, Package Management now supports npm packages. If you’re a developer working with Node.js, JavaScript, or any of its variants, you can now use Team Services to host private npm packages right alongside your NuGet packages.

Work Item Search is now in public preview

While Code Search is the most popular extension for Team Services and has been available for a while now, Work Item Search is now available in public preview. You can install the free Work Item Search extension from the Team Services Marketplace. With Work Item Search you can quickly and easily find relevant work items by searching across all work item fields over all projects in an account. You can perform full text searches across all fields to efficiently locate relevant work items. Use in-line search filters, on any work item field, to quickly narrow down to a list of work items.

Import TFS servers directly into Team Services

We are very excited to announce the preview of the TFS Database Import Service for Visual Studio Team Services. In the past, we have had various options that offered a low-fidelity method for migrating your data. The difference today is that the TFS Database Import Service is a high-fidelity migration that brings over your source code history, work items, builds, etc., and keeps the same ID numbers, traceability, settings, permissions, personalizations, and much more. Our goal for your final production import is that your team will be working in TFS on a Friday and then continue their work in Visual Studio Team Services when they come back to work on Monday.

Public preview of Linux in the hosted build pool

That’s right – we have added Linux containers to our hosted build pool. These are running on Ubuntu Linux inside the vsts-agent-docker container. This container includes all the standard Java, Node, Docker, and .NET Core tooling. You can create or spawn other Docker containers as part of your build or release process using either a script or the Docker extension in the Visual Studio Marketplace. To use Linux, just choose Hosted Linux Preview for the default agent queue in the General section of your build definition.

Improvements to the pull request experience

We continue to enhance the pull request experience, and we’ve now added the ability to see the changes in a PR since you last looked at it, add attachments in comments, and to see and resolve merge conflicts.

JBoss and WildFly extension

The JBoss and WildFly extension provides a task to deploy your Java applications to an instance of JBoss Enterprise Application Platform (EAP) 7 or WildFly Application Server 8 and above over the HTTP management interface.  It also includes a utility to run CLI commands as part of your build/release process.  Check out this video for a demo. This extension is open sourced on GitHub so reach out to us with any suggestions or issues.  We welcome contributions.

There are many more updates, so I recommend taking a look at the full list of new features in the release notes for November 23rd and January 5th.

Happy coding!
Source: Azure

Microsoft Azure Germany expands services and boosts compliance

Since launching in September 2016, Microsoft Azure Germany has continued to expand service and assurance capabilities in our first-of-its-kind cloud for Europe. Today we are excited to announce the availability of several new services and the achievement of ISO/IEC 27001 certification and ISO/IEC 27018 compliance attestation.

Expanding innovation for customers

In addition to the comprehensive set of services offered already, we are announcing four major functionality releases: HDInsight on Linux, Power BI, Cool Blob Storage, and Mobile Apps for Azure App Service.

HDInsight on Linux is Azure’s fully-managed cloud Hadoop offering – providing optimized open source analytic clusters for Spark, Hive, MapReduce, HBase, Storm, Kafka and R Server.  These big data technologies and ISV applications are easily deployable as managed clusters with enterprise-level security and monitoring – enabling customers to extract critical business insights from their valuable data sources.
With the combination of Azure services and Power BI in Azure Germany, customers can turn their data processing efforts into analytics and reports that provide real-time insights into their businesses. Power BI has several Azure connections available, and customers can connect as few as one Azure data source, or several, then shape and refine their data to build customized reports.
As customer and application data continues to grow at an exponential pace in the cloud, Microsoft Azure Germany now offers options for managing large volumes of data in a more cost-effective way. Azure Cool Blob Storage provides a low-cost storage option for cool object data such as backups, media content, or archive data. 
With the Mobile Apps capabilities of the Azure App Service in Germany, customers can leverage a highly scalable, mobile application development platform that enables them to build native and cross platform mobile experiences with a rich set of enterprise connected capabilities.

Bringing a global standard to our first-of-its-kind cloud for Europe

The ISO/IEC 27001 and ISO/IEC 27018 achievements underscore Microsoft’s commitment to our assurance promises for Microsoft Azure Germany customers.

By following the ISO/IEC 27001 standard, Microsoft Azure Germany demonstrates that its security policies and procedures are robust and can successfully protect organizational information assets in line with internationally recognized standards.
The ISO/IEC 27018 code of practice is based on EU data-protection laws, and provides specific guidance to cloud service providers (CSPs) acting as processors of personally identifiable information (PII) on assessing risks and implementing state-of-the-art controls for protecting PII data.

Moreover, ISO/IEC 27001 and ISO/IEC 27018 emphasize a well-defined, layered approach to security which allows Microsoft to proactively monitor and evaluate its security controls against global security standards, while continuously adapting to local security laws, regulations and best practices in information security.

Microsoft has established a continuous monitoring program that involves a detailed review and validation of information security and privacy-related controls on an ongoing basis. This allows us to identify threats and mitigate vulnerabilities with speed. Additionally, a third-party accredited auditor must validate in their statement of applicability that Microsoft in-scope cloud and commercial technical support services have incorporated controls for the protection of all data including PII on Microsoft Azure Germany. To remain compliant, Microsoft cloud services undergo third-party reviews annually.

Support for customers leveraging IT-Grundschutz framework

In addition to these recent ISO achievements, we remind customers who wish to implement the IT-Grundschutz framework within the scope of their existing or planned ISO/IEC 27001 certification that an “IT Grundschutz Compliance Workbook” is available free of charge, both in English and in German. Released in late 2016, this workbook is based on the standards outlined by the German Federal Office for Information Security (BSI), and provides guidance to customers seeking certification for solutions and workloads deployed on Microsoft Azure Germany.

Empowering app-building capability assured by the confidence of compliance depth

While each of these capabilities further empowers customers to achieve more with Azure, it is the combination of the innovative platform capabilities with our commitment to a comprehensive assurance program that makes Microsoft Azure Germany the ideal platform for our partners and customers.  You can read more about customers choosing Microsoft Azure Germany at the Microsoft News Centre Europe, and learn more about the products we offer at the Azure Germany product page. Additional information about Microsoft Azure compliance depth and achievements can be found at the Trust Center.
Source: Azure

Our journey on building the Go SDK for Azure

Over the last few months, we've been busy adding new functionality to the Azure Go SDK, and we'll keep doing so as we march toward public preview next year.

If you have followed the recent changes on our GitHub repo, you have probably noticed a few general improvements we made to the SDK:

Model Flattening

In the last release we added model flattening to many of our APIs (i.e., you can type resource.Sku.Family instead of resource.Properties.Sku.Family), which makes for more readable code.

Better error messages during parameter validation

During parameter validation, we enabled the SDK to return an error with the info needed to fix the JSON before sending the request out – making it easier to identify/correct potential coding mistakes.

For example, consider a scenario where a user wants to create a resource group; the location property is required for that operation, but the user forgets to include it in the request.

In previous SDK versions, the operation would fail inside Azure and the user would get the following error:

resources.GroupsClient#: Failure responding to request: StatusCode=400 — Original Error: autorest/azure: Service returned an error. Status=400 Code="LocationRequired" Message="The location property is required for this definition."

In the latest SDK version, the user would get:

resources.GroupsClientCreateOrUpdate: Invalid input: autorest/validation: validation failed: parameter=parameters.Location constraint=Null value=(*string)(nil) details: value can not be null; required parameter

 

We also improved the coverage and functionality of the data plane of the SDK by adding support for file and directory manipulation, getting and setting ACLs on containers, working with the Storage Emulator, and various other storage blob and queue operations.

Some of the fixes and improvements added to the SDK have been provided by enthusiastic developers outside of our Microsoft team and we would like to extend our sincere gratitude and appreciation to everyone who sent us feedback and/or pull requests. We took note of your requests for better API coverage in the data plane, better documentation, release notes and samples, and we are making progress in incorporating them into our future releases.

Breaking changes

Speaking of future releases: while many API changes are expected to be additive in nature, some of the changes we are introducing will break existing clients. A recent example was issue 1559, which arose when we added parameter validation; in the near future, some methods and parameters may be added or deleted, parameters may change order, and structs may change as we consider model flattening on more APIs. This is part of the reason why we keep the 'beta' label on the Go SDK, and we are carefully examining every proposed change for alternatives that will not break existing functionality.

We’d like to thank in advance all of you who continue to use our Go SDK and send us feedback; we are committed to building the best experience for developers on our platform, and we'd like to make sure the changes have minimal impact on your development cycle as the SDK moves toward the more mature stages of public preview and GA (general availability).

We will use this blog to keep you updated on the progress and potential breaking changes, and we’ll give you a heads-up as we are approaching new milestones.
Have any suggestions for how to make the SDK better? We’d love to hear from you! Send us a PR or file an issue, and let’s talk!
Source: Azure

Loading data into Azure SQL Data Warehouse just got easier

Azure SQL Data Warehouse is a SQL-based, fully managed, petabyte-scale cloud solution for data warehousing. SQL Data Warehouse is highly elastic, enabling you to provision in minutes and scale capacity in seconds. You can scale compute and storage independently, allowing you to burst compute for complex analytical workloads or scale down your warehouse for archival scenarios, and pay based on what you use instead of being locked into predefined cluster configurations.

Since announcing general availability in July 2016, we have continued to work on helping customers get data faster into their Data Warehouse to generate insights faster and grow their businesses further. Azure SQL Data Warehouse solves the data loading scenario via PolyBase, which is a feature built into the SQL Engine. It effectively leverages the entire Massively Parallel Processing (MPP) architecture of Azure SQL Data Warehouse to provide the fastest loading mechanism from Azure Blob Storage into the Data Warehouse. We recently shared how you can use Azure Data Factory Copy Wizard to load 1TB data in under 15 mins into Azure SQL Data Warehouse, at over 1.2 GB per second throughput.

To understand just how this works, let’s take a high-level look at the SQL Data Warehouse architecture. A SQL Data Warehouse is composed of a control node, where users connect and submit queries, and compute nodes, where processing occurs. Traditional loading tools load individual rows through the control node; the rows are then routed to the appropriate compute node depending on how the data is to be distributed. This can cause slower performance because the control node must read each record as it is received. PolyBase instead uses the compute nodes to load the data in parallel, allowing for faster performance and quicker insights from your data.
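A minimal sketch of a PolyBase load, assuming a pipe-delimited export already sits in Azure Blob Storage (all object names, columns, and the storage path below are illustrative):

```sql
-- Point at the blob container holding the exported files.
CREATE EXTERNAL DATA SOURCE AzureBlobStore
WITH (TYPE = HADOOP,
      LOCATION = 'wasbs://exports@myaccount.blob.core.windows.net');

CREATE EXTERNAL FILE FORMAT PipeDelimitedText
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = '|'));

-- External table over the files; no data is moved yet.
CREATE EXTERNAL TABLE ext.FactSales
(
    ProductKey   INT,
    OrderDateKey INT,
    SalesAmount  MONEY
)
WITH (LOCATION = '/sales/',
      DATA_SOURCE = AzureBlobStore,
      FILE_FORMAT = PipeDelimitedText);

-- CTAS performs the load in parallel on the compute nodes.
CREATE TABLE dbo.FactSales
WITH (DISTRIBUTION = HASH(ProductKey),
      CLUSTERED COLUMNSTORE INDEX)
AS SELECT * FROM ext.FactSales;
```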

UTF-16 support for delimited text files

To make it easier to load data into Azure SQL Data Warehouse using PolyBase, we have expanded our delimited text file format to support UTF-16 encoded files.

Support for UTF-16 encoded files is important because this is the default file encoding for BCP.exe. We’ve often seen customers export data from their on-premises data warehouse to Azure Blob Storage in UTF-16 format. In the past, it was then necessary to run a script to re-encode the data into UTF-8, resulting in time-consuming processing and a duplication of data. Now, with UTF-16 supported, files can go directly from Azure Blob Storage into SQL Data Warehouse without encoding conversion.

How to import a UTF-16 text file

To import UTF-16 files into SQL DW with PolyBase, all you have to do is create a new file format with the encoding option set to ‘UTF16’. All of the additional format options, like field terminator, date format, and rejection values, are supported with both UTF-16 and UTF-8 encoding.

Below is an example of a pipe-delimited text file format that would read UTF-16 files.
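A file format definition along those lines (the format name is illustrative) looks like this:

```sql
CREATE EXTERNAL FILE FORMAT PipeDelimitedUtf16
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = '|',
                      ENCODING = 'UTF16'));
```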

Next steps

In this blog post we discussed PolyBase, why it is the optimal data loading tool for SQL Data Warehouse, and our expanded support for UTF-16 encoded file formats. This support is now available in all SQL Data Warehouse Azure regions worldwide. We encourage you to try it out if you are interested in moving your on-premises data warehouse into the cloud.

Learn more

What is Azure SQL Data Warehouse?

SQL Data Warehouse best practices

Load Data into SQL Data Warehouse

MSDN forum

Stack Overflow forum

Feature Requests

If you have any feature requests for Azure SQL Data Warehouse, we suggest connecting with the team via User Voice.
Source: Azure

Azure Analysis Services now available in North Central US and Brazil South

Last October we released the preview of Azure Analysis Services, which is built on the proven analytics engine in Microsoft SQL Server Analysis Services. With Azure Analysis Services you can host semantic data models in the cloud. Users in your organization can then connect to your data models using tools like Excel, Power BI, and many others to create reports and perform ad-hoc data analysis.

We are excited to share with you that the preview of Azure Analysis Services is now available in two additional regions: North Central US and Brazil South. This means that Azure Analysis Services is available in the following regions: Brazil South, Southeast Asia, North Europe, West Europe, West US, South Central US, North Central US, East US 2, and West Central US.

New to Azure Analysis Services? Find out how you can try Azure Analysis Services or learn how to create your first data model.
Source: Azure

Manage App Service, SQL Database, and more – Azure Management Libraries for .NET

One C# statement to create a Web App. One statement to create a SQL Server and another statement to create a SQL Database. One statement to create an Application Gateway, etc.

Beta 4 of the Azure Management Libraries for .NET is now available. Beta 4 adds support for the following Azure services and features:

✓ App Service (Web Apps)

✓ SQL Database

✓ Application Gateway

✓ Traffic Manager

✓ DNS

✓ CDN

✓ Redis Cache

 
 

You can download Beta 4 from:

https://github.com/Azure/azure-sdk-for-net/tree/Fluent

 

Last year, we announced a preview of the new, simplified Azure management libraries for .NET. Our goal is to improve the developer experience by providing a higher-level, object-oriented API, optimized for readability and writability. These libraries are built on the lower-level, request-response style auto-generated clients and can run side by side with the auto-generated clients. Thank you for trying the libraries and providing us with plenty of useful feedback.

Create a Web App

You can create a Web app instance by using a define() … create() method chain.

var webApp = azure.WebApps()
.Define(appName)
.WithNewResourceGroup(rgName)
.WithNewAppServicePlan(planName)
.WithRegion(Region.US_WEST)
.WithPricingTier(AppServicePricingTier.STANDARD_S1)
.Create();

Create a SQL Database

You can create a SQL server instance by using another define() … create() method chain.

var sqlServer = azure.SqlServers.Define(sqlServerName)
.WithRegion(Region.US_EAST)
.WithNewResourceGroup(rgName)
.WithAdministratorLogin(administratorLogin)
.WithAdministratorPassword(administratorPassword)
.WithNewFirewallRule(firewallRuleIpAddress)
.WithNewFirewallRule(firewallRuleStartIpAddress, firewallRuleEndIpAddress)
.Create();

Then, you can create a SQL database instance by using another define() … create() method chain.

var database = sqlServer.Databases.Define(databaseName)
.Create();

Create an Application Gateway

You can create an application gateway instance by using another define() … create() method chain.

var applicationGateway = azure.ApplicationGateways().Define("myFirstAppGateway")
.WithRegion(Region.US_EAST)
.WithExistingResourceGroup(resourceGroup)
// Request routing rule for HTTP from public 80 to public 8080
.DefineRequestRoutingRule("HTTP-80-to-8080")
.FromPublicFrontend()
.FromFrontendHttpPort(80)
.ToBackendHttpPort(8080)
.ToBackendIpAddress("11.1.1.1")
.ToBackendIpAddress("11.1.1.2")
.ToBackendIpAddress("11.1.1.3")
.ToBackendIpAddress("11.1.1.4")
.Attach()
.WithExistingPublicIpAddress(publicIpAddress)
.Create();

Sample code

You can find plenty of sample code that illustrates management scenarios in Azure Virtual Machines, Virtual Machine Scale Sets, Storage, Networking, Resource Manager, SQL Database, App Service (Web Apps), Key Vault, Redis, CDN and Batch.

Service
Management Scenario

Virtual Machines

Manage virtual machine
Manage availability set
List virtual machine images
Manage virtual machines using VM extensions
Create virtual machines from generalized image or specialized VHD
List virtual machine extension images

Virtual Machines – parallel execution

Create multiple virtual machines in parallel
Create multiple virtual machines with network in parallel
Create multiple virtual machines across regions in parallel

Virtual Machine Scale Sets

Manage virtual machine scale sets (behind an Internet facing load balancer)

Storage

Manage storage accounts

Networking

Manage virtual network
Manage network interface
Manage network security group
Manage IP address
Manage Internet facing load balancers
Manage internal load balancers

Networking – DNS

Host and manage domains

Traffic Manager

Manage traffic manager profiles

Application Gateway

Manage application gateways
Manage application gateways with backend pools

SQL Database

Manage SQL databases
Manage SQL databases in elastic pools
Manage firewalls for SQL databases
Manage SQL databases across regions

Redis Cache

Manage Redis Cache

App Service – Web Apps

Manage Web apps
Manage Web apps with custom domains
Configure deployment sources for Web apps
Manage staging and production slots for Web apps
Scale Web apps
Manage storage connections for Web apps
Manage data connections (such as SQL database and Redis cache) for Web apps

Resource Groups

Manage resource groups
Manage resources
Deploy resources with ARM templates
Deploy resources with ARM templates (with progress)

Key Vault

Manage key vaults

CDN

Manage CDNs

Batch

Manage batch accounts

Give it a try

You can run the samples above or go straight to our GitHub repo. Give it a try and let us know what you think (via e-mail or comments below), particularly:

Usability and effectiveness of the new management libraries for .NET.
Which Azure services would you like to see supported soon?
What additional scenarios should be illustrated as sample code?

Over the next few weeks, we will be adding support for more Azure services and applying finishing touches to the API.
Source: Azure