Azure Data Factory V2: visual monitoring added to public preview

We are excited to announce the addition of visual monitoring capabilities for Azure Data Factory V2 (ADF v2). With this release, you can easily monitor your data factory v2 pipelines without writing a single line of code. In this first release, we are enabling you to visually monitor pipeline and activity runs. You can use a simple and intuitive list-based interface to monitor your runs and perform various operations on these lists, including filtering and sorting. This release also includes guided tours on how to use the visual monitoring features, as well as the ability to send us valuable feedback.

This is the first of many visual tools that we plan to enable for our customers in the coming months to visually author and monitor data factory v2 pipelines. Our ultimate goal with visual tools for ADF v2 is to increase productivity and efficiency for both new and advanced users with intuitive experiences.

You can get started by clicking the ‘Monitor & Manage’ tile in your provisioned v2 data factory blade.


Check out some of the exciting features enabled with visual monitoring in ADF v2

List view monitoring

Monitor pipeline and activity runs with a simple list view interface. All runs are displayed in the local browser time zone. You can change the time zone, and all the date-time fields will snap to the selected time zone.
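The time-zone snap is a plain wall-clock conversion of each run's UTC timestamp. As an illustrative sketch (the timestamp and zone below are made up, and the UI does this internally), the same transformation can be reproduced with GNU `date`:

```shell
# Convert a pipeline run's UTC start time into a chosen display time zone
# (hypothetical timestamp; any IANA zone name works the same way).
TZ="America/Los_Angeles" date -d "2017-10-16 14:30:00 UTC" +"%Y-%m-%d %H:%M %Z"
# prints: 2017-10-16 07:30 PDT
```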

Monitor pipeline runs

Below you can see the list view showcasing each pipeline run for your data factory v2 pipelines.

Monitor activity runs

Below you can see the list view showcasing the activity runs corresponding to each pipeline run. Click the ‘Activity Runs’ icon under the ‘Actions’ column to view the activity runs for a pipeline run.

Note: You need to click the ‘Refresh’ icon on top to refresh the list of pipeline and activity runs. Auto-refresh is currently not supported.

Rich ordering and filtering

Order pipeline runs in descending or ascending order by run start time, and filter pipeline runs by pipeline name, run start time, and run status.

Add/remove columns to list view

Right click the list view header and choose the columns that you want to appear in the list view.

Resize column widths in list view

Increase and decrease the column widths in the list view by hovering over a column header and dragging its border.

Guided tour

Click the ‘Information’ icon in the lower left, then click ‘Guided tour’ to get step-by-step instructions on how to visually monitor your pipeline and activity runs.

Feedback

Click on the ‘Feedback’ icon to give us feedback on various features or any issues that you may be facing.

Select data factory

Hover over the ‘Data Factory’ icon in the top left. Click the ‘Arrow’ icon to see a list of Azure subscriptions and data factories that you can monitor.


This is the first public release of ADF v2 visual monitoring features. We are continuously working to refresh the released bits with new features based on customer feedback. Get more information and detailed steps for using the ADF v2 visual monitoring features.

Get started building pipelines easily and quickly using Azure Data Factory. If you have any feature requests or want to provide feedback, please visit the Azure Data Factory forum.
Source: Azure

Windows Server, version 1709 is now available on Azure

What a great day! Back in June, we announced Windows Server was joining the Semi-Annual Channel release cadence to deliver innovation at a faster pace. Two weeks ago at Ignite, we announced Windows Server, version 1709, the first release in this new model, and today you can start using it!

Azure customers can deploy Windows Server, version 1709 based on the image in the Azure Marketplace. Software Assurance customers can download Windows Server, version 1709 from the Volume Licensing Service Center (VLSC) portal. If you run virtual machines in a hosted environment you can also check the images that your service provider made available.

Windows Server, version 1709 is only the first step in this new world of faster release cadences. The most important aspect of having new releases twice a year is that customer feedback will shape the product. You can try the next set of preview builds of Windows Server in the Semi-Annual Channel and provide feedback by joining the Windows Insiders program. You can also join the conversation in the Microsoft Tech Community, where professionals and experts share what they have learned and answer questions.

For more information, follow us @windowsserver or on Facebook.
Source: Azure

Azure Service Bus and Azure Event Hubs Geo-disaster recovery preview released

Two long-standing, well-known, and appreciated Azure core services, Azure Service Bus and Azure Event Hubs, have just released a preview of an upcoming generally available Geo-disaster recovery feature. With this feature, clients no longer need to manage Geo-disaster recovery scenarios in code; instead, they can rely on the services to synchronize metadata between two independent namespaces. At this time, data replication is not yet supported; it will be added at a later point (post-general availability).

Note that there is a significant difference between a disaster and an outage. A disaster typically causes a full or partial data center outage, for example a fire, flood, or earthquake. An outage is usually caused by more transient issues and is very short-lived. Disasters can take hours or even days to resolve, whereas outages are more in the timeframe of minutes.

Currently, both services require that you run a separate monitoring process to recover from disasters automatically. This means you would need to write a small application that monitors your namespace, for example by attempting a connection every 1-10 minutes. If the connection fails repeatedly, the application can trigger a failover. It is also worth noting that you can run multiple independent monitoring processes. Please find more information in the articles below.
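The polling logic described above is simple to sketch. The bash function below is a hypothetical illustration, not part of any Azure SDK: it runs a supplied health-check command repeatedly and reports a failover once a threshold of consecutive failures is reached. A real monitor would sleep 1-10 minutes between probes and call your actual failover API instead of printing.

```shell
#!/usr/bin/env bash
# monitor CHECK_CMD THRESHOLD MAX_CHECKS
# Runs CHECK_CMD up to MAX_CHECKS times; after THRESHOLD consecutive
# failures it prints "failover" and returns success.
monitor() {
  local check_cmd=$1 threshold=$2 max_checks=$3
  local failures=0 i=0
  while [ "$i" -lt "$max_checks" ]; do
    if $check_cmd; then
      failures=0                     # a successful probe resets the counter
    else
      failures=$((failures + 1))
    fi
    if [ "$failures" -ge "$threshold" ]; then
      echo "failover"                # a real monitor would trigger the failover here
      return 0
    fi
    i=$((i + 1))
  done
  echo "healthy"
}

# Demo: `false` always fails, so three consecutive failures trigger a failover.
monitor false 3 10
# prints: failover
```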

To set up disaster recovery, select two namespaces in independent regions, for example North Central US and South Central US, define a primary and a secondary namespace, and create a pairing between them. In case of a disaster, trigger the failover.

To learn more about the REST API for Service Bus, including code samples, please visit the documentation.
To learn more about the REST API for Event Hubs, including code samples, please visit the documentation.
Important information about the difference between a disaster and an outage can also be found in the documentation.

If you have feedback, please let us know!
Source: Azure

Meet the Azure Analysis Services team at PASS Summit 2017

Members of the Azure Analysis Services team will be presenting two sessions at this year's PASS Summit 2017 in Seattle, WA. Members will also be available in the SQL Clinic to answer your Analysis Services questions directly in a one-on-one setting. Sessions include the following:

Creating Enterprise Grade BI Models with Azure Analysis Services or SQL Server Analysis Services

Speakers: Bret Grinslade and Christian Wade

Level: 400

Microsoft Azure Analysis Services and SQL Server Analysis Services enable you to build comprehensive, enterprise-scale analytic solutions that deliver actionable insights through familiar data visualization tools such as Microsoft Power BI and Microsoft Excel. Analysis Services enables consistent data across reports and users of Power BI. This session will reveal new features for large, enterprise models in the areas of performance, scalability, model management, and monitoring. Learn how to use these new features to deliver tabular models of unprecedented scale with easy data loading and simplified user consumption.

Deliver Enterprise BI on Big Data

Speakers: Bret Grinslade and Josh Caplan

Level: 300

Learn how to deliver analytics at the speed of thought with Azure Analysis Services on top of a petabyte-scale SQL Data Warehouse, Azure Data Lake, or HDInsight implementation. This session will cover best practices for managing, processing, and query acceleration at scale, implementing change management for data governance, and designing for performance and security. These advanced techniques will be demonstrated through an actual implementation, including architecture, code, and data flows, along with tips and tricks.

Learn more about PASS Summit 2017. We hope to see you there.
Source: Azure

Updates on Intel Xeon Scalable Processors for Microsoft Azure Stack

Today we are announcing the validation of Azure Stack systems with the new family of Intel Xeon Scalable Processors, also known as “Purley”. With Azure Stack running on Intel Xeon Scalable Processors, customers can expect a boost in performance and capacity, though results will vary based on configuration. Intel Xeon Scalable Processors offer new I/O stack improvements, support up to 28 cores per CPU, deliver a 50% improvement in memory bandwidth, and support up to 1.5 TB of memory.

Availability dates of these new systems will vary by hardware vendor, including Cisco, Dell EMC, Hewlett Packard Enterprise, Huawei, Lenovo, and Wortmann/Terra, starting in November 2017 and continuing through February 2018. Details can be obtained by reaching out to those hardware vendors directly.

Even with these new systems, customers can confidently continue current Azure Stack purchases and deployments. Availability of Azure Stack systems with the Intel Xeon E5 v4 family, also known as “Broadwell”, will continue for up to 12 months, depending on the hardware vendor. The scaling architecture for Azure Stack accounts for multi-generational hardware in the same instance by enabling scale units that use different hardware. As a consequence, when this capability is added, customers can increment capacity with multi-generational hardware under the same cloud endpoint.
Source: Azure

Azure Event Grid now supports Event Hubs as a destination

Azure Event Grid was introduced in August 2017 as the eventing backplane for Azure and beyond. It underpins many of our goals, and much of our vision for Serverless cloud computing. One of the core concepts of Azure Event Grid is making the cloud broadly reactive, introducing concepts familiar from reactive frameworks and models. In doing this, we have made Azure Event Grid a push-based system that notifies you when events occur. While this push concept is central to the reactive architecture, there are times when receiving a webhook may present challenges, particularly at scale. For this reason, we have introduced Event Hubs as a new subscription endpoint type available for Azure Event Grid.

Today, this feature is available via the CLI by using the --endpoint-type parameter and specifying eventhub instead of the default webhook. When using this option, you specify the endpoint as the Azure Resource Manager path of the Event Hub you want to send events to. Assuming you already have an Event Hub, you can get this resource path by navigating to your Event Hub in the portal and copying the path out of the URL, or by listing the resource using the CLI commands shown below.

Set the variables below to make your life easier.

eventHub=<your Event Hub name>
eventHubNamespace=<your Event Hubs namespace here>
resourceGroup=<your resource group name>

Get a resource path for Event Hub namespace and store in variable.

eventHubResource=$(az resource list --resource-group $resourceGroup --resource-type Microsoft.EventHub/namespaces | jq -r ".[] | select(.name==\"$eventHubNamespace\") | .id")
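If you want to sanity-check the jq filter that selects the namespace before running it against Azure, you can exercise it locally on a sample of the JSON shape `az resource list` returns (the names, IDs, and file path below are made up):

```shell
# Fake, abridged `az resource list` output for local testing.
cat > /tmp/resources.json <<'EOF'
[
  {"name": "myehns",  "id": "/subscriptions/000/resourceGroups/rg/providers/Microsoft.EventHub/namespaces/myehns"},
  {"name": "otherns", "id": "/subscriptions/000/resourceGroups/rg/providers/Microsoft.EventHub/namespaces/otherns"}
]
EOF

eventHubNamespace=myehns
# Pick the namespace by name and emit its resource ID.
jq -r ".[] | select(.name==\"$eventHubNamespace\") | .id" /tmp/resources.json
# prints: /subscriptions/000/resourceGroups/rg/providers/Microsoft.EventHub/namespaces/myehns
```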

Now create the event subscription with the endpoint being your Event Hub.

az eventgrid resource event-subscription create --endpoint $eventHubResource/eventHubs/$eventHub --name myehsub --resource-group $resourceGroup --resource-type storageAccounts --provider-namespace Microsoft.Storage --resource-name mystorageaccount --endpoint-type eventhub

The output will show the JSON for the event subscription resource.

At this point you're ready to rock 'n' roll. Any time you create or delete blobs in this storage account, an event will be delivered to your Event Hub. To learn more and see this data, check out the articles Get started receiving messages with the Event Processor Host in .NET Standard and Receive events from Azure Event Hubs using Java.

Also, you can still use the other CLI parameters to create prefix and suffix filters for this event subscription. If you wanted to only receive events for creating blobs you would add the parameter below:

--included-event-types Microsoft.Storage.BlobCreated

If you also wanted to only receive notifications for .xml files you could add the parameter below:

--subject-ends-with .xml

You may have noticed that you didn't need to provide a connection string for the Event Hub. Because everything is within the Azure ecosystem and you're logged in to the Azure CLI, Azure Resource Manager uses role-based access control (RBAC) to verify that you're allowed to retrieve the Event Hub connection string, which is then used for service-to-service authentication, so there's no need to provide one yourself.

Happy Messaging!
Source: Azure

Azure Data Lake Tools for Visual Studio Code (VSCode) October Updates

If you are a data scientist looking for a lightweight code editor for U-SQL, try ADL Tools for VSCode for rapid development. If you are a developer looking for a modern, simple U-SQL development tool, try ADL Tools for VSCode. If you prefer to use Mac or Linux for your development, install ADL Tools for VSCode and get started with U-SQL development.

U-SQL is a programming framework built for scaling out big data queries and running them in a serverless cloud environment. In addition to being able to run SQL-like queries directly, the U-SQL framework makes it easy to plug in your R, Python or .NET algorithms and scale them out in the same easy, declarative style as SQL.

We are excited to share the recent release of Azure Data Lake Tools for VSCode, a cross-platform code editor extension that allows you to easily author and submit U-SQL scripts to Azure Data Lake Analytics (ADLA). We have greatly improved the getting started experience, enhanced the usability of the tools, and improved the integration with Azure Data Lake Storage (ADLS). The ADLS integration allows you to easily preview files, list storage paths, and download or upload files with exceptional performance.

To maintain the VS Code lightweight approach, we removed the .NET Core and Java dependencies from the extension. For Windows users, you can start using Azure Data Lake Tools for your U-SQL development right after you install the extension. For non-Windows users, install Azure Data Lake Tools, then follow the prompts to install .NET Core and Mono.

Key customer benefits

Added ADLS file downloading with status monitoring

Improved file upload of single or multiple files with status monitoring

Removed Java dependency and .NetCore dependency for Windows

Solidified tools performance and reliability through architectural improvements

Improved the getting started experience and made C# extension installation optional

New Features

Download File from ADLS with status monitoring
Upload single or multiple files to ADLS with status monitoring

How do I get started?

Get the latest ADL Tools by going to the VSCode Extension repository or VSCode Marketplace and searching for “Azure Data Lake Tools for VSCode”.

For more information about Azure Data Lake Tool for VSCode, use the following options:

Get more information on using Data Lake Tools for VSCode.
Watch the ADL Tools for VSCode User instructions video.
Learn more about how to get started on Data Lake Analytics.
Learn how to Develop U-SQL assemblies for Azure Data Lake Analytics jobs.

If you encounter any issues, please submit them on GitHub.

Want to make this extension even more awesome? Share your feedback.
Source: Azure

Microsoft Cosmos DB in Azure Storage Explorer – public preview

We are happy to announce the public preview of support for Cosmos DB in the Azure Storage Explorer (ASE). With this release, Cosmos DB databases can be explored and managed with the same consistent user experience that makes ASE a powerful developer tool for managing Azure storage. The extension allows you to manage Cosmos DB entities, manipulate data, and create and update stored procedures, triggers, and user-defined functions. Azure Storage Explorer not only offers a unified developer experience for inserting, querying, and managing your Azure Cosmos DB data, but also provides an editor with syntax highlighting and suggestions for authoring your Cosmos DB stored procedures. With this extension you can now browse Cosmos DB resources across both the DocumentDB and MongoDB APIs alongside the existing experiences for Azure blobs, tables, files, and queues in ASE.

Key customer benefits

Enables ASE to be a one-stop shop for managing Azure database and storage resources
Provides an easy-to-understand explorer experience for Cosmos DB data and structure
Offers flexible navigation across Cosmos DB accounts and hierarchies
Delivers better accessibility for file navigation and data management with reliable performance

Summary of key features

Open a Cosmos DB account in the Azure portal
Add resources to the Quick Access list
Search and refresh Cosmos DB resources
Connect directly to Cosmos DB through a connection string
Create and delete databases
Create and delete collections
Create, edit, delete, and filter documents
Create, edit, and delete stored procedures, triggers, and user-defined functions

Get started

Install Azure Storage Explorer: [Windows], [Mac], [Linux]

For more information about Cosmos DB in ASE, please see:

Azure Storage Explorer: Get started with Storage Explorer
User Manual: Manage Azure Cosmos DB in Azure Storage Explorer
Demo of Cosmos DB: Use Azure Cosmos DB in Azure Storage Explorer

Learn more about today's announcements on the Azure blog. Discover more Azure service updates. If you have questions, feedback, comments, or bug reports, please use the comments below or send a note to cosmosdbtooling@microsoft.com.
Source: Azure

Announcing new Azure Government capabilities for classified mission-critical workloads

Microsoft is partnering with the U.S. Government in the journey to the cloud, providing both infrastructure (IaaS) and platform (PaaS) offerings to enable digital transformation. Our government customers are responsible for the most sensitive data and the most critical applications in the country. We are committed to delivering the broadest array of services to meet government regulatory requirements and security needs. Azure Government is the mission-critical cloud, providing more than 7,000 Federal, State, and local customers the exclusivity, highest compliance and security, hybrid flexibility, and commercial-grade innovation they need to better meet citizen expectations.

Today at Microsoft Government Cloud Forum in Washington D.C., we’re announcing a number of important advances for Azure Government, the dedicated cloud for our U.S. Government customers and their partners. We’re expanding our support for highly-classified workloads, delivering advanced technologies like Blockchain and support for High Performance Computing, and increasing our available security capabilities with Azure Security Center. In addition, we’re launching Citrix VDI on Azure Government to help our government customers deliver user workloads from the cloud.

Introducing Azure Government Secret

At last year's Government Cloud Forum, Azure Government became the first government-only cloud to be awarded the Information Impact Level 5 DoD Provisional Authorization by the Defense Information Systems Agency. One year on, we're proud to be the only provider to deliver a physically isolated cloud that is DoD Impact Level 5-ready for infrastructure, platform, and productivity services, serving every branch of the military and the defense agencies with the greatest number of Level 5 services in the market.

Taking the next step forward in meeting the mission-critical and data needs of our U.S. Government customers, we are announcing expansion plans to make Azure Government Secret available to support government agencies and partners who have Secret classified data. Azure Government Secret will deliver multi-tenant cloud infrastructure and cloud capabilities to U.S. Federal Civilian, Department of Defense, Intelligence Community, and U.S. Government partners working within Secret enclaves. Customers with Secret requirements can expect to gain access to new technologies at scale, including services such as cognitive capabilities, artificial intelligence, and predictive analytics.

Blockchain for Azure Government

We view blockchain as a major technological advancement with the potential for significant impact in many industries, including the public sector, through its ability to enable verifiable and immutable cross-party computation. At its core, a blockchain is a data structure that’s used to create a digital transaction ledger that, instead of resting with a single provider, is shared among a distributed network of computers. Blockchain technologies deployed on Azure are applicable to many complex problems facing government today, including distribution of funds after natural disasters, registration of property ownership, and other issues involving tracking ownership of funds or assets through multiple transactions. Today we’re launching Blockchain for Azure Government, which will support a wide array of our Azure blockchain and distributed ledger marketplace solutions. These solutions automate the deployment and configuration of blockchain infrastructure across multiple organizations, allowing our customers to focus on government transformation and application development.

Unified security management with Azure Security Center

While cyber threats affect every organization and every individual, governments face unique challenges. The recent Executive Order on Strengthening the Cybersecurity of Federal Networks and Infrastructure, represents a key example of the increasing pressure on government agencies to increase their efforts around protecting highly sensitive data and systems. To help our customers address their security challenges, we’re bringing Azure Security Center to Azure Government. Security Center offers unified security management and advanced threat protection for hybrid cloud workloads, enabling government agencies to take on evolving security threats. Learn more about Security Center.

Expanding High Performance Computing in Azure Government

In today’s data-driven government, High Performance Computing (HPC) is increasingly being mainstreamed to apply to a broader range of problems. To address this demand, we’re extending our existing public sector HPC offerings, including the NC-series and Azure Batch, to include the H-series virtual machines. Azure H-series virtual machines, with InfiniBand and Linux RDMA technology, are designed to deliver cutting-edge performance for complex engineering and scientific workloads such as weather prediction and climate modeling, trajectory modeling, and other memory-intensive projects. By the end of the year, customers will be able to take advantage of this expanded offering in Azure Government.

New Virtual Desktop Infrastructure options in the cloud

The public sector is under a mandate to be more efficient and run fewer datacenters. To reduce on-premises infrastructure, many government customers are considering moving their Virtual Desktop Infrastructure (VDI) to the cloud. Today, we’re announcing new options with Azure Government, giving customers more flexibility in handling VDI requirements. With Citrix VDI on Azure Government, customers can now extend existing Citrix environments and deploy Windows 10 desktops into Azure Government from Citrix Cloud. Learn more about new VDI options.

We’re excited about all the new capabilities coming to Azure for the U.S. Government. To find out more about technology innovation and security for government customers, check out Azure Government. And for ongoing updates, follow the Azure Government blog.
Source: Azure

Last week in Azure for the week of October 9, 2017

1. IoT updates

Now you can monitor your Azure IoT solutions with Azure Monitor and Azure Resource Health. Azure Monitor provides highly granular and real-time monitoring data for any Azure resource. Check out this video on Channel 9 to get started with Azure Monitor. Azure Resource Health is your personalized dashboard showing how your resources are doing, based on a series of executed checks, such as verifying that your IoT hub is up and running.

Azure IoT Hub now supports X.509 CA certificates for device identity. The use of X.509 certificate authority (CA) certificates dramatically simplifies device identity creation and life-cycle management in the supply chain: registering an X.509 CA certificate once enables registration of any number of devices into Azure IoT Hub, whereas otherwise a unique certificate must be pre-registered for every device before it can connect. For more information, see Device Authentication using X.509 CA Certificates.
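The flow starts from an ordinary X.509 CA. As a hedged sketch (the file names and subject names below are invented, and the IoT Hub-specific upload and proof-of-possession steps are omitted), a CA and a device certificate whose common name carries the device ID can be produced with standard OpenSSL commands:

```shell
# 1. Create a private CA (in production this would be an HSM-protected root or intermediate).
openssl genrsa -out ca.key 2048
openssl req -new -x509 -key ca.key -out ca.crt -days 365 -subj "/CN=Example IoT Root CA"

# 2. Issue a device certificate; IoT Hub uses the CN as the device ID.
openssl genrsa -out device.key 2048
openssl req -new -key device.key -out device.csr -subj "/CN=example-device-001"
openssl x509 -req -in device.csr -CA ca.crt -CAkey ca.key -CAcreateserial \
  -out device.crt -days 365

# 3. Check that the device certificate chains to the CA.
openssl verify -CAfile ca.crt device.crt
# prints: device.crt: OK
```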

2. New previews

Azure Files share snapshots – Azure Files share snapshots enable you to store read-only versions of your file shares periodically. They also enable you to copy an older version of your content from anywhere for further modification and use.
Batch AI – Now you can provision clusters of GPUs or CPUs on demand to train your models in parallel and at scale in Azure. If you want to give it a try, you can run recipes using the Python quickstart or the Azure CLI 2.0 quickstart. Or, you can try out the Azure Batch AI training recipes in GitHub. For more information, see Azure Batch AI.

3. Azure Management Libraries for .NET v1.3

The latest release of the Azure Management Libraries for .NET adds support for availability zones (in preview), as well as support for Network Peering, Virtual Network Gateway and Azure Container Instances. You can find .NET sample code that addresses just about every Azure management scenario you can imagine on GitHub.

4. Cloud Platform Release Announcements

Every few weeks (usually on a Wednesday), we publish a consolidated list of updates for that period from the Cloud Platform team at Microsoft, which includes Azure, dev tools, and more. The Cloud Platform Release Announcements for October 11, 2017 post from last week covers everything that was released after Microsoft Ignite 2017, such as the public preview of Java support in Azure Functions and the general availability of Power BI Embedded.

5. Stuff to watch and listen to

Azure Friday: ILB ASE and Application Gateway – Christina Compy joins Scott Hanselman to talk about exposing your internet-isolated apps with an Application Gateway. This enables you to securely host multi-tier applications on an Internal Load Balancer (ILB) App Service Environment (ASE) and only expose the front-end applications that you want to expose.

Azure Friday: Jenkins Plugins for Kubernetes – Pui Chee "PC" Chan joins Scott Hanselman to discuss native support for Jenkins in Azure. Our plugins make it easy for you to build your project using a container agent and then automate deployment from Jenkins to an Azure Container Service Kubernetes cluster.

Azure Friday: Azure Instance Metadata Service – Hariharan Jayaraman joins Scott Hanselman to talk about the Azure Instance Metadata Service, which provides information about running virtual machine instances that you can use to manage and configure your virtual machines. Use the service to get information such as SKU, network configuration, and upcoming maintenance events.

The Azure Podcast: Episode 199 – Blockchain Update – Cale Teeter gives us the latest scoop on all the things Microsoft is doing in the Blockchain space. Plus, more coverage of the updates mentioned earlier in this post.

Cloud Tech 10 – 16th October 2017 – Batch AI, Durable Functions, File Share Snapshots and more! – In this rapid-fire video series from Mark Whitby, a Cloud Solution Architect at Microsoft UK, you get a weekly dose of how-to information in 10 minutes or less.

Azure Application Service Environments v2: Private PaaS Environments in the Cloud – An App Service Environment v2 is a fully isolated and dedicated environment for securely running Azure App Service apps at high scale, including Web Apps, Mobile Apps, and API Apps. It is essentially a deployment of the Azure App Service into a subnet of your network, so think of it as your private Platform-as-a-Service environment in the cloud.

Source: Azure