New Azure Storage JavaScript client library for browsers – Preview

Today we are announcing our newest library: Azure Storage Client Library for JavaScript. The demand for the Azure Storage Client Library for Node.js, as well as your feedback, has encouraged us to work on a browser-compatible JavaScript library to enable web development scenarios with Azure Storage. With that, we are now releasing the preview of Azure Storage JavaScript Client Library for Browsers.

Enables web development scenarios

The JavaScript Client Library for Azure Storage enables many web development scenarios using storage services like Blob, Table, Queue, and File, and is compatible with modern browsers. Whether it is a web-based gaming experience that stores state information in the Table service, a mobile app that uploads photos to a Blob account, or an entire website backed by dynamic data stored in Azure Storage, this library opens those scenarios up to the browser.

As part of this release, we have also reduced the footprint by packaging each of the service APIs in a separate JavaScript file. For instance, a developer who needs access to Blob storage only needs to include the following scripts:

<script type="text/javascript" src="azure-storage.common.js"></script>
<script type="text/javascript" src="azure-storage.blob.js"></script>

Full service coverage

The new JavaScript Client Library for Browsers supports all the storage features available in the latest REST API version 2016-05-31 since it is built with Browserify using the Azure Storage Client Library for Node.js. All the service features you would find in our Node.js library are supported. You can also use the existing API surface, and the Node.js Reference API documents to build your app!

Built with Browserify

Browsers today don’t support the require method, which is essential in every Node.js application. Hence, a JavaScript library written for Node.js won’t work in browsers as-is. One popular solution to this problem is Browserify, a tool that bundles your required dependencies into a single JS file for use in web applications. It is as simple as installing Browserify and running browserify node.js -o browser.js, and you are set. However, we have already done this for you: simply download the JavaScript Client Library.

Recommended development practices

We highly recommend using SAS tokens to authenticate with Azure Storage, since the JavaScript Client Library will expose the authentication token to the user in the browser. A SAS token with limited scope and lifetime is strongly recommended. In an ideal web application, the backend authenticates users when they log on and then provides a SAS token to the client for authorized access to the Storage account. This removes the need to authenticate using an account key. Check out the Azure Function sample in our GitHub repository that generates a SAS token upon an HTTP POST request.
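For example, a client could inspect the token it was handed and ask the backend for a fresh one before it lapses. This is a minimal sketch; the helper name and the decision to treat a missing expiry as expired are our own assumptions, not part of the library:

```javascript
// Hypothetical helper: inspect the signed-expiry ("se") parameter of a SAS
// token before handing it to the storage client, so the app can request a
// fresh token from its backend when the current one is about to lapse.
function sasExpiresWithin(sasToken, seconds, now) {
  now = now || new Date();
  // A SAS token is a URL query string, e.g. "sv=2016-05-31&se=2017-07-01T00%3A00%3A00Z&sig=..."
  var match = /(?:^|[?&])se=([^&]+)/.exec(sasToken);
  if (!match) {
    return true; // no expiry parameter found: treat as expired to be safe
  }
  var expiry = new Date(decodeURIComponent(match[1]));
  return (expiry.getTime() - now.getTime()) < seconds * 1000;
}
```

A page could call this on a timer and POST to its token-issuing endpoint whenever it returns true.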

Use of the stream APIs is highly recommended because the browser sandbox blocks users from accessing the local filesystem. This makes the local file APIs, like getBlobToLocalFile and createBlockBlobFromLocalFile, unusable in browsers. See the samples in the link below that use the createBlockBlobFromStream API instead.
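In a browser, the data to stream usually comes from a File or Blob object picked by the user. As a rough illustration of how such an object is consumed block by block (the helper and chunk size are illustrative, not library API):

```javascript
// Illustrative sketch: slice a file-like object (anything exposing .size and
// .slice, such as a browser File/Blob) into fixed-size chunks, the way a
// stream-based upload consumes it block by block.
function sliceIntoChunks(file, chunkSize) {
  var chunks = [];
  for (var offset = 0; offset < file.size; offset += chunkSize) {
    chunks.push(file.slice(offset, Math.min(offset + chunkSize, file.size)));
  }
  return chunks;
}
```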

Sample usage

Once you have a web app that can generate a limited-scope SAS token, the rest is easy! Download the JavaScript files from the repository on GitHub and include them in your code.

Here is a simple sample that can upload a blob from a given text:

1. Insert the following script tags in your HTML code. Make sure the JavaScript files are located in the same folder.

<script src="azure-storage.common.js"></script>
<script src="azure-storage.blob.js"></script>

2. Let’s now add a few items to the page to initiate the transfer. Add the following tags inside the BODY tag. Notice that the button calls the uploadBlobFromText method when clicked. We will define this method in the next step.

<input type="text" id="text" name="text" value="Hello World!" />
<button id="upload-button" onclick="uploadBlobFromText()">Upload</button>

3. So far, we have included the client library and added the HTML code to show the user a text input and a button to initiate the transfer. When the user clicks on the upload button, uploadBlobFromText will be called. Let’s define that now:

<script>
function uploadBlobFromText() {
     // your account and SAS information
     var sasKey = '....';
     var blobUri = 'http://<accountname>.blob.core.windows.net';
     var blobService = AzureStorage.createBlobServiceWithSas(blobUri, sasKey).withFilter(new AzureStorage.ExponentialRetryPolicyFilter());
     var text = document.getElementById('text');
     var btn = document.getElementById('upload-button');
     blobService.createBlockBlobFromText('mycontainer', 'myblob', text.value, function(error, result, response) {
         if (error) {
             alert('Upload failed, open browser console for more detailed info.');
             console.log(error);
         } else {
             alert('Uploaded successfully!');
         }
     });
}
</script>

Of course, it is not that common to upload blobs from text. See the following samples for uploading from a stream, as well as a sample for progress tracking.
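The progress-tracking samples ultimately boil down to reporting how much of the payload has been transferred. A self-contained sketch of that calculation (the function name is illustrative, not library API):

```javascript
// Illustrative: compute the percent complete an upload progress callback
// might display, clamped to 100 and rounded to one decimal place for UI use.
function percentComplete(bytesUploaded, totalBytes) {
  if (totalBytes <= 0) return 0;
  var pct = (bytesUploaded / totalBytes) * 100;
  return Math.min(100, Math.round(pct * 10) / 10);
}
```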

•    JavaScript Sample for Blob
•    JavaScript Sample for Queue
•    JavaScript Sample for Table
•    JavaScript Sample for File 

Share

Finally, join our Slack channel to share with us your scenarios, issues, or anything, really. We’ll be there to help!
Quelle: Azure

Launching online training and certification for Azure SQL Data Warehouse

Azure SQL Data Warehouse (SQL DW) is a SQL-based, fully managed, petabyte-scale cloud solution for data warehousing. SQL Data Warehouse is highly elastic, enabling you to provision in minutes and scale capacity in seconds. You can scale compute and storage independently, allowing you to burst compute for complex analytical workloads or scale down your warehouse for archival scenarios, and pay based on what you're using instead of being locked into predefined cluster configurations.

We are pleased to announce that Azure SQL Data Warehouse training is now available online via the edX training portal. In this computer science course, you will learn how to deploy, design, and load data using Microsoft's Azure SQL Data Warehouse, or SQL DW. You'll learn about data distribution, compressed in-memory indexes, PolyBase for Big Data, and elastic scale.

Course Syllabus

Module 1: Key Concepts of MPP (Massively Parallel Processing) Technology and SQL Data Warehouse
This module makes a case for deploying a data warehouse in the cloud, introduces massively parallel processing and explores the components of Azure SQL Data Warehouse.

Module 2: Provisioning a SQL Data Warehouse
This module introduces the tasks needed to provision Azure SQL Data Warehouse, the tools used to connect to and manage the data warehouse and key querying options.

Module 3: Designing Tables and Loading Data
This module covers data distribution in an MPP data warehouse, creating tables and loading data.

Module 4: Integrating SQL DW in a Big Data Solution
This module introduces Polybase to access big data, managing, protecting, and securing your Azure SQL Data Warehouse, and integrating your Azure SQL Data Warehouse into a big data solution.

Final Exam
The final exam accounts for 30% of your grade and will be combined with the weekly quizzes to determine your overall score. You must achieve an overall score of 70% or higher to pass this course and earn a certificate.
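Assuming the weekly quizzes make up the remaining 70% of the grade, the overall score works out as follows (a sketch for intuition, not official course tooling):

```javascript
// Illustrative: combine the final exam (30% of the grade) with the weekly
// quiz average (assumed to be the remaining 70%) and check against the
// 70% passing bar stated in the course description.
function overallScore(examPercent, quizAveragePercent) {
  return 0.3 * examPercent + 0.7 * quizAveragePercent;
}

function earnsCertificate(examPercent, quizAveragePercent) {
  return overallScore(examPercent, quizAveragePercent) >= 70;
}
```

For example, an 80% exam with a 70% quiz average yields 0.3 × 80 + 0.7 × 70 = 73, which passes.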

Note: To complete the hands-on elements in this course, you will require an Azure subscription. You can sign up for a free Azure trial subscription (a valid credit card is required for verification, but you will not be charged for Azure services).  Note that the free trial is not available in all regions. It is possible to complete the course and earn a certificate without completing the hands-on practices.

Exclusive free trial

We’re giving all our customers free access to Azure SQL Data Warehouse for a whole month!  More information on the SQL DW Free Trial.  All you need to do is sign up with your Azure Subscription details before 30th June 2017.

Azure Subscription

If you don’t have an Azure subscription, you can sign up for free. Provision the industry-leading elastic-scale data warehouse for yourself in minutes and experience how easy it is to go from ‘just data’ to ‘business insights’. Load your own data or try out the pre-loaded sample data set, and run queries with compute power of up to 1000 DWU (Data Warehouse Units) and 12 TB of storage to experience this fully managed cloud-based service for an entire month for free.

Learn more

What is Azure SQL Data Warehouse?

What is Azure Data Lake Store?

SQL Data Warehouse best practices

Load Data into SQL Data Warehouse

MSDN forum

Stack Overflow forum
Quelle: Azure

Announcing Microsoft Azure Storage Explorer 0.8.9

We just released Microsoft Azure Storage Explorer 0.8.9 last week. You can download it from http://storageexplorer.com/.

Recent new features in the past two releases:

Automatically download the latest version when it is available
Create, manage, and promote blob snapshots
Sign in to Sovereign Clouds such as Azure China, Azure Germany, and Azure US Government
Zoom In, Zoom Out, and Reset Zoom from View menu

Try it out and send us feedback using the links in the bottom left corner of the app.
Quelle: Azure

Comparing SELECT..INTO and CTAS use cases in Azure SQL Data Warehouse

The team recently introduced SELECT..INTO to the SQL language of Azure SQL Data Warehouse. SELECT..INTO enables you to create and populate a new table based on the result set of a SELECT statement. Users now have two options for creating and populating a table using a single statement. This post summarizes the usage scenarios for both CTAS and SELECT..INTO and the differences between the two approaches:

Look at the example of SELECT..INTO below:

SELECT *
INTO [dbo].[FactInternetSales_new]
FROM [dbo].[FactInternetSales]
;

The result of this query is a new round robin distributed clustered columnstore table called dbo.FactInternetSales_new. All done and dusted in three lines of code. Great!

Let’s now contrast this with the corresponding CTAS statement below:

CREATE TABLE [dbo].[FactInternetSales_new]
WITH
( DISTRIBUTION = HASH(Product_key)
, HEAP
)
AS
SELECT *
FROM [dbo].[FactInternetSales]
;

The result of this query is a new hash distributed heap table called dbo.FactInternetSales_new. Note that with CTAS you have full control of the distribution key and the organization of the table. However, the code is more verbose as a result. With SELECT..INTO that code is significantly reduced, and the syntax may also be more familiar.

That said, there are some important differences to be mindful of when using SELECT..INTO. There are no options to control the table organization or the distribution method: SELECT..INTO always creates a round robin distributed clustered columnstore table. It is also worth noting a small difference in behavior compared with SQL Server and SQL Database, where SELECT..INTO creates a heap table (their default table structure). In SQL Data Warehouse, the default table type is a clustered columnstore, so we follow the same pattern of creating the default table type.

Below is a summary table of the differences between CTAS and SELECT..INTO:

 
                    CTAS                                           SELECT..INTO
Distribution Key    Any (full control)                             ROUND_ROBIN
Table type          Any (full control)                             CLUSTERED COLUMNSTORE INDEX
Verbosity           Higher (WITH section required)                 Lower (fixed defaults, so no additional coding)
Familiarity         Lower (syntax newer to Microsoft customers)    Higher (very familiar syntax to Microsoft customers)

Despite these slight differences and variations, there are still several reasons for including SELECT..INTO in your code.

In my mind there are three primary reasons:

Large code migration projects
Target object is a round robin clustered columnstore index
Simple cloning of a table.

When customers migrate to SQL Data Warehouse, they are often migrating existing solutions to the platform. In these cases, the first order of business is to get the existing solution up and running on SQL Data Warehouse, and SELECT..INTO may well be good enough. The second scenario is the compact code scenario: when a round robin clustered columnstore table is the desired outcome, SELECT..INTO is much more compact syntactically. SELECT..INTO can also be used to create simple sandbox tables that mirror the definition of the source table. Even empty tables can be created by pairing the statement with WHERE 1 = 2 to ensure no rows are copied. This is a useful technique for creating empty tables when implementing partition-switching patterns.
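The empty-clone technique mentioned above looks like this (the target table name is illustrative):

```sql
SELECT *
INTO [dbo].[FactInternetSales_empty]
FROM [dbo].[FactInternetSales]
WHERE 1 = 2
;
```

The predicate is never true, so the new table inherits the source's column definitions but receives no rows.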

Finally, customers may not even realize they require SELECT..INTO support. Many customers use off-the-shelf ISV solutions that require support for SELECT..INTO. A good example might be a rollup Business Intelligence tool that generates its own summary tables using SELECT..INTO on the fly. In this case, customers may be issuing SELECT..INTO queries without even realizing it.

For more information please refer to the product documentation for CTAS where the main differences are captured.
Quelle: Azure

SharePoint Server 2016 in Azure infrastructure services

To take advantage of SharePoint’s collaboration features, Microsoft recommends SharePoint Online in Office 365. If that is not the best option for you right now, you should use SharePoint Server 2016. However, building a SharePoint Server 2016 farm in Microsoft Azure infrastructure services requires additional planning considerations and deployment steps.

For a defined path from evaluation to successful deployment, see SharePoint Server 2016 in Microsoft Azure. This new content set reduces the time it takes for you to design and deploy dev/test, staging, production, or disaster recovery SharePoint Server 2016 farms in Azure.

There are step-by-step instructions for two prescriptive dev/test environments:

1. A single-server farm running in Azure for demonstration, evaluation, or application testing.

2. An intranet farm running in Azure to experiment with client access and administration in a simulated Azure IaaS hybrid configuration.

When you are ready to begin planning the Azure environment for your SharePoint Server 2016 farm, see Designing a SharePoint Server 2016 farm in Azure. A table-based, step-by-step approach assures that you are collecting the right set of interrelated settings for the networking, storage, and compute elements of Azure infrastructure services.

When you are ready to deploy, see Deploying SharePoint Server 2016 with SQL Server AlwaysOn Availability Groups in Azure to build out this high availability configuration:

A table-based, phased approach assures that you are creating the Azure infrastructure with the correct settings, which you can adapt or expand for your business needs.

To assist you in creating the Azure infrastructure and configuring the servers of the high availability SharePoint Server 2016 farm, use the SharePoint Server 2016 High Availability Farm in Azure Deployment Kit, a ZIP file in the TechNet Gallery that contains:

Microsoft Visio and Microsoft PowerPoint files with the figures for the two dev/test environments and the high-availability deployment

All the PowerShell command blocks to create and configure the high availability SharePoint Server 2016 farm in Azure

A Microsoft Excel configuration workbook that generates the PowerShell commands to create the SharePoint Server 2016 high availability farm in Azure, based on your custom settings

Quelle: Azure

Improved troubleshooting in Azure Stream Analytics with diagnostic logs

We are announcing the much-awaited public preview of diagnostic logs for Azure Stream Analytics through integrations with Azure Monitoring. You can now examine late or malformed data that causes unexpected behaviors. This helps remediate errors caused by data that does not conform to the expectations of the query.

Diagnostic logs provide rich insights into all operations associated with a streaming job. They are turned off by default and can be enabled in the “Diagnostic logs” blade under “Monitoring”. These are different from Activity logs, which are always enabled and provide details on management operations performed.

Examples of data handling errors that diagnostic logs can help with include:

Data conversion and serialization errors in cases of schema mismatch.
Incompatible types including constraints such as allow null and duplicates.
Truncation of strings and issues with precision during conversion.
Expression evaluation errors such as divide by zero, overflow etc.

An example of non-conforming data being written to Azure storage is illustrated below:

{
    Diagnostic: "Encountered error trying to write 3 events: …",
    Timestamp: "7/25/2015 12:27:44Z",
    Source: "Output1",
    Output: "Output1",
    Error:
    {
        Type: "System.InvalidOperationException",
        Description: "The given value 'hello world' of type string from the data source cannot be converted to type decimal of the specified target column [Amount]."
    },
    EventData:
    {
        SomeValue: "hello world",
        Count: 1
    }
}

Errors are sampled by error type and source as shown above.
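Once these records land in storage, a consumer can bucket them by error type and source to see which failures dominate. A minimal illustrative sketch (the record shape mirrors the sample above; the function is not part of any Azure SDK):

```javascript
// Illustrative: group diagnostic log records by (error type, source) so the
// most frequent failure buckets surface first. Each record is assumed to
// carry the shape of the sample diagnostic record shown earlier.
function bucketErrors(records) {
  var buckets = {};
  records.forEach(function (r) {
    var key = r.Error.Type + " @ " + r.Source;
    // Use the sampled occurrence count when present, else count the record once.
    var count = (r.EventData && r.EventData.Count) ? r.EventData.Count : 1;
    buckets[key] = (buckets[key] || 0) + count;
  });
  return buckets;
}
```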

Immediate access to the actual data that causes errors enables you to either quickly remediate problems or ignore the non-conforming data to make progress.

Persisting event data and operational metadata (such as occurrence time and occurrence count) in Azure Storage artifacts enables easier diagnosis and faster troubleshooting of issues. This data can also be analyzed offline using Azure Log Analytics. Routing this data to Event Hubs makes it possible to set up a Stream Analytics job to monitor another Stream Analytics job!

Note that the usage of services such as Azure Storage, Event Hubs, and Log Analytics for analyzing non-conforming data will be charged based on the pricing model for those services.

We are excited for you to try out our diagnostic logs. Detailed steps on using this capability can be found in the documentation page.
Quelle: Azure

Azure brings big data, analytics, and visualization capabilities to U.S. Government

To further our commitment to providing the latest in cloud innovation for government customers, we’re excited to announce the general availability of HDInsight and Power BI Pro in Microsoft Cloud for Government. HDInsight and Power BI bring exciting new capabilities to Azure Government that enable organizations to manage, analyze, and visualize large quantities of data. HDInsight unlocks the ability to build data and machine learning applications that run on Apache Spark and Hadoop. Power BI allows for the aggregation of data and visualization with easy-to-operate dashboard functionality.

We are also announcing a preview of Cognitive Services in Azure Government. We have enabled scenarios such as audio and text translation into other languages as well as facial (gender and age) and emotion recognition with Computer Vision and Emotion. If you’re interested in participating in the Azure Government Cognitive Services preview, please contact azgovfeedback@microsoft.com for more information.

With these capabilities working today, we can take data and derive insight in minutes.  Here’s a video example of these capabilities working together. In this demo, we leveraged HDInsight, Power BI and Machine Learning along with our Cognitive Services (available in preview for Azure Government) to show how you can easily build a solution to translate and analyze text and visualize the results.

Some partners are leveraging these capabilities to provide real-time dashboards for their solutions, such as Prabal Acharyya, WW Director of IoT Analytics for OSIsoft Technologies. OSIsoft provides business solutions that connect sensor-based data, operations, and people to enable real-time intelligence for their customers. Prabal expanded on Power BI’s value, saying:

“Data scientists in U.S. Government spend inordinate amounts of time each day manually scrubbing terabytes of operational data for advanced analytics and business intelligence,” says Acharyya. “OSIsoft is pleased to partner with Microsoft to deliver PI Integrator for Microsoft Azure on Microsoft U.S. Government Cloud, with free and fluid access to streaming Power BI-ready data, context, and insights to build innovative Gov solutions.”

Azure HDInsight

HDInsight is the only fully managed cloud Hadoop offering that provides optimized open source analytic clusters for Spark, Hive, MapReduce, HBase, Storm, Kafka, and R Server, backed by a 99.9% SLA. Each of these big data technologies and ISV applications is easily deployable as a managed cluster with enterprise-level security and monitoring.

HDInsight brings Big Data to Azure Government and broadens the landscape for building powerful data analysis solutions. Examples include:

Deploy a Big Data analysis cluster in minutes. No upfront costs, get started immediately.
Enable streaming and processing of large data sets in real time using Kafka, Storm, and Spark for HDInsight.
Build Machine Learning capabilities with Spark and R Server
Build intelligent applications that leverage big data to deliver personalized experiences

If you’re looking to get started creating powerful solutions with HDInsight for Azure Government, log into the Azure Portal or sign up for a trial.

Power BI Pro for U.S. Government

Power BI brings your Big Data solutions to life with live dashboards, interactive reports, and compelling visualizations. Power BI connects to a broad range of data wherever it lives and enables anyone to visualize and analyze data with greater speed, efficiency, and understanding.

Power BI Pro for Microsoft Cloud for Government includes:

Power BI service is a cloud-based business analytics service that gives you a single view of your most critical data.
Power BI Desktop puts visual analytics at your fingertips with intuitive report authoring; drag-and-drop to place content exactly where you want it on the flexible and fluid canvas, and quickly discover patterns as you explore a single unified view of linked, interactive visualizations.
Power BI Mobile helps you stay connected to your data from anywhere, anytime; and get a 360° view of your organization data on the go – at the touch of your fingertips.

Want to get started? Sign up for Power BI Pro for Government.
Quelle: Azure

Azure Blueprint supports the UK Government’s Cloud Security Principles

Azure Government Engineering is pleased to announce the release of Azure Blueprint for the UK Government’s Cloud Security Principles. Blueprint empowers Azure customers to build the most secure cloud solutions on the most secure cloud platform.
 
Azure Blueprint for the UK Government enables UK public sector organizations to understand how solutions built on Azure implement the 14 individual Cloud Security Principles published by the National Cyber Security Centre, supporting workloads with information designated as UK OFFICIAL. The Azure Blueprint UK Government Customer Responsibilities Matrix outlines how Azure implements security controls designed to satisfy each security principle and assists customers in understanding how they may implement safeguards within their Azure solution to fulfill the requirements of each principle where they hold a responsibility.

In conjunction with this documentation release, a Blueprint compliance architecture ARM (Azure Resource Manager) template has been released on GitHub. This ARM template deploys a three-tiered network architecture which provides a baseline from which customers can build a secure environment that supports the UK Cloud Security Principles.
 
To access the Azure Blueprint UK Government Cloud documents please e-mail AzureBlueprint@microsoft.com. Additional information and Blueprint resources are available on the Azure Government Documentation site.
Quelle: Azure

Announcing real-time Geospatial Analytics in Azure Stream Analytics

We recently announced the general availability of Geospatial Functions in Azure Stream Analytics to enable real-time analytics on streaming geospatial data. This makes it possible to realize scenarios such as fleet monitoring, asset tracking, geofencing, phone tracking across cell sites, connected manufacturing, and ridesharing solutions with production-grade quality in a few lines of code.

The connected car landscape and the turning of the automobile into a real-time data exhaust opens new avenues of business for automation, and post-sale monetization opportunities in industries such as insurance and content providers. NASCAR has been a pioneer in using geospatial capabilities in Azure Stream Analytics.

“We use real-time geospatial analytics with Azure Stream Analytics for analyzing race telemetry during and after the race,” said NASCAR’s Managing Director of Technology Development, Betsy Grider.

The new capabilities provide native functions that can be used in Azure Stream Analytics to compute geospatial operations such as the identification of geospatial data as points, lines, and polygons, computation of overlap between polygons, intersections between paths, etc. The ability to join multiple streams with geospatial data can be used to answer complex questions on streaming data.
We’ve adopted the GeoJSON standard for dealing with geospatial data. The new functions include:

CreatePoint – Identifies a GeoJSON point.
CreateLineString – Identifies a GeoJSON line string.
CreatePolygon – Identifies a GeoJSON polygon.
ST_DISTANCE – Determines the distance between two points in meters.
ST_OVERLAPS – Determines if one polygon overlaps with another.
ST_INTERSECTS – Determines if two line strings intersect.
ST_WITHIN – Determines if one polygon is contained inside another.
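For intuition, here is a client-side sketch of what the point and distance primitives express: a GeoJSON point plus a great-circle distance in meters computed with the haversine formula. This is only an approximation for illustration; it is not the library’s actual implementation of ST_DISTANCE:

```javascript
// Illustrative client-side analogue of CreatePoint and ST_DISTANCE:
// build GeoJSON points and approximate the distance between them in
// meters with the haversine formula.
function createPoint(lat, lon) {
  return { type: "Point", coordinates: [lon, lat] }; // GeoJSON order is [longitude, latitude]
}

function distanceMeters(p1, p2) {
  var R = 6371000; // mean Earth radius in meters
  var toRad = function (deg) { return deg * Math.PI / 180; };
  var lat1 = toRad(p1.coordinates[1]), lat2 = toRad(p2.coordinates[1]);
  var dLat = lat2 - lat1;
  var dLon = toRad(p2.coordinates[0] - p1.coordinates[0]);
  var a = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
          Math.cos(lat1) * Math.cos(lat2) * Math.sin(dLon / 2) * Math.sin(dLon / 2);
  return 2 * R * Math.asin(Math.sqrt(a));
}
```

One degree of longitude at the equator comes out to roughly 111 km, which matches the "less than 10 km" style of comparison used in the queries below.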

These functions give you the ability to reason about geospatial data in motion using a declarative, SQL-like language. Simplified queries for common geospatial scenarios look as follows:

Generate an event when a gas station is less than 10 km from the car:

SELECT c.Location, s.Location
FROM Cars c
JOIN Station s ON ST_DISTANCE(c.Location, s.Location) < 10 * 1000

Generate an event when:

Fuel level in the car is lower than 10%
Gas stations have a promotion
The car is pointing towards the gas station

SELECT c.gas, c.Location, c.Course, s.Location, s.Promotion
FROM Cars c
JOIN Station s ON c.gas < 0.1 AND s.Promotion AND ST_OVERLAPS(c.Course, s.Location)

Generate an event when a building is within a possible flooding zone:

SELECT b.Polygon, f.Polygon
FROM Building b
JOIN Flooding f ON ST_OVERLAPS(b.Polygon, f.Polygon)

Generate an event when a storm is heading towards a car:

SELECT c.Location, s.Course
FROM Cars c
JOIN Storm s ON ST_OVERLAPS(c.Location, s.Course)

The integration with Power BI enables live visualizations of geospatial data on maps in real-time dashboards. It is also possible to use Geospatial functions for actualizing scenarios such as identifying and auctioning on hotspots and groupings and visualize data using heat maps on a Bing Maps canvas.

Live heat maps using machine learning and geospatial analytics can help unlock better business outcomes for ride-sharing and fleet management scenarios.

 

This video shows a fleet monitoring example built using the functionality detailed above.

Fleet monitoring with Geospatial functions in Azure Stream Analytics

The Geospatial Functions documentation page covers detailed documentation and usage examples. We are excited for you to try out geospatial functions using Azure Stream Analytics.
Quelle: Azure

What’s brewing in Visual Studio Team Services: March 2017 Digest

This post series provides the latest updates and news for Visual Studio Team Services and is a great way for Azure users to keep up-to-date with new features being released every three weeks. Visual Studio Team Services offers the best DevOps tooling to create an efficient continuous integration and release pipeline to Azure. With the rapidly expanding list of features in Team Services, teams can start to leverage it more efficiently for all areas of their Azure workflow, for apps written in any language and deployed to any OS.

Delivery Plans

We are excited to announce the preview of Delivery Plans! Delivery Plans help you drive alignment across teams by overlaying several backlogs onto your delivery schedule (iterations). Tailor plans to include the backlogs, teams, and work items you want to view. 100% interactive plans allow you to make adjustments as you go. Head over to the marketplace to install the new Delivery Plans extension. For more information, see our blog post.

Mobile Work Item Form Preview

We’re releasing a preview of our mobile-friendly work item form for Visual Studio Team Services! This mobile work item form brings an optimized look and feel that’s both modern and useful. See our blog post for more information.

Updated Package Management experience

We’ve updated the Package Management user experience to make it faster, address common user-reported issues, and make room for upcoming package lifecycle features. Learn more about the update, or turn it on using the toggle in the Packages hub.

Release Views in Package Management

We’ve added a new feature to Package Management called release views. Release views represent a subset of package-versions in your feed that you’ve promoted into that release view. Creating a release view and sharing it with your package’s consumers enables you to control which versions they take a dependency on. This is particularly useful in continuous integration scenarios where you’re frequently publishing updated package versions, but may not want to announce or support each published version.

By default, every feed has two release views: Prerelease and Release.

To promote a package-version into the release view:

Select the package
Click the Promote button
Select the view to promote to and select Promote

Check out the docs to get started.

Build editor preview

We’re offering a preview of a new design aimed at making it easier for you to create and edit build definitions. Click the switch to give it a try.

If you change your mind, you can toggle it off. However, eventually after we feel it’s ready for prime time, the preview editor will replace the current editor. So please give it a try and give us feedback.

The new editor has all the capabilities of the old editor along with several new capabilities and enhancements to existing features:

Search for a template

Search for the template you want and then apply it, or start with an empty process.

Quickly find and add a task right where you want it

Search for the task you want to use, and then after you’ve found it, you can add it after the currently selected task on the left side, or drag and drop it where you want it to go.

You can also drag and drop a task to move it, or drag and drop while holding the Ctrl key to copy the task.

Use process parameters to pass key arguments to your tasks

You can now use process parameters to make it easier for users of your build definition or template to specify the most important bits of data without having to go deep into your tasks.

For more details, see the post about the preview of our new build editor.

Pull Request: Improved support for Team Notifications

Working with pull requests that are assigned to teams is getting a lot easier. When a PR is created or updated, email alerts will now be sent to all members of all teams that are assigned to the PR.

This feature is in preview and requires an account admin to enable it from the Preview features panel (available under the profile menu).

After selecting For this account, switch on the Team expansion for notifications feature.

In a future release, we’ll be adding support for PRs assigned to Azure Active Directory (AAD) groups and teams containing AAD groups.

Pull Request: Actionable comments

In a PR with more than a few comments, it can be hard to keep track of all of the conversations. To help users better manage comments, we’ve simplified the process of resolving items that have been addressed with a number of enhancements:

In the header for every PR, you’ll now see a count of the comments that have been resolved.

When a comment has been addressed, you can resolve it with a single click.

If you have comments to add while you’re resolving, you can reply and resolve in a single gesture.

Automatic GitHub Pull Request Builds

For a while we’ve provided CI builds from your GitHub repo. Now we’re adding a new trigger so you can build your GitHub pull requests automatically. After the build is done, we report back with a comment in your GitHub pull request.

For security, we only build pull requests when both the source and target are within the same repo. We don’t build pull requests from a forked repo.

Extension of the Month: Azure Build and Release Tasks

This extension has really been trending over the last month and it’s not hard to see why. If you’re building and publishing your applications with Microsoft Azure you’ll definitely want to give this 4.5 star rated extension a look. It is a small gold mine of tasks to use in your Build and Release definitions.

Azure Web App Slots Swap: Swap two deployment slots of an Azure Web App
Azure Web App Start: Start an Azure Web App, or one of its slots
Azure Web App Stop: Stop an Azure Web App, or one of its slots
Azure SQL Execute Query: Execute a SQL query on an Azure SQL Database
Azure SQL Database Restore: Restore an Azure SQL Database to another Azure SQL Database on the same server using the latest point-in-time backup
Azure SQL Database Incremental Deployment: Deploy an Azure SQL Database using multiple DACPAC and performing incremental deployments based on current Data-Tier Application version
AzCopy: Copy blobs across Azure Storage accounts using AzCopy

Go to the Visual Studio Team Services Marketplace and install the extension.

There are many more updates, so I recommend taking a look at the full list of new features in the release notes for January 25th and February 15th.

Happy coding!
Quelle: Azure