Azure Networking Fridays with the Azure Black Belt Team

You are invited to join us for Azure Networking Fridays!

This hour-long session will take place every other Friday this fall. It is open to all customers and partners who want to learn more about Azure Networking (including ExpressRoute and Virtual Networking) and how to plan and design their connectivity to the Microsoft Cloud.

There will be an open Q&A session at the end where customers can ask the experts. Content and partner speakers will vary for each session but the general agenda is as follows:

Azure Networking fundamentals (10 minutes)
Deep dive topic of the week (15-20 minutes)
Partner spotlight of the week (15-20 minutes)
Q&A

We’re kicking off the series Friday, September 16, 2016.

Join the Skype Meeting and make sure you don’t miss out on future sessions by adding the series to your Outlook calendar (download the ICS here).
Source: Azure

Microsoft Azure now available from UK datacenters

Today, I’m proud that Microsoft is the first global provider to deliver the complete cloud from datacenters in the UK. Microsoft Azure, along with Office 365, is now generally available from multiple UK datacenter locations, providing data residency to help enable the digital transformation of our customers in industries such as banking, government, public sector, and healthcare that require certain data to remain within the UK.

As one of the largest cloud operators in the world, we’ve invested billions in building a highly scalable, reliable, secure, and sustainable cloud infrastructure. With the introduction of new regions in the UK, Microsoft has now announced 34 Azure regions around the world with 28 generally available today – more than any other major cloud provider.

Customers choosing Microsoft benefit from the company’s experience in enterprise computing and hybrid cloud – offering everything from optimization of compute resources to higher-level services for advanced analytics, media services, the Internet of Things (IoT), and big data.

Microsoft Azure is a growing collection of integrated cloud and data services—analytics, computing, databases, mobile, media, networking, storage, and web—for moving faster, achieving more and saving money.

You can learn more about the customers adopting the Microsoft Cloud at Official Microsoft Blog.
Source: Azure

Improved Automatic Tuning boosts your Azure SQL Database performance

Azure SQL Database is the world’s first intelligent database service that learns and adapts with your application, enabling you to dynamically maximize performance with very little effort on your part.

Today we released an exciting update to Azure SQL Database Advisor that greatly reduces the time (from a week to a day) required to produce and implement index tuning recommendations. 

This brings us one step closer to our vision where developers no longer have to worry about physical database schema management, as the system will self-optimize to provide predictable and optimal performance for every database application.

About SQL Database Advisor

Database Advisor provides custom performance tuning recommendations for your databases using machine learning intelligence. It saves you time, automatically tuning your database performance, so you can focus your energy on building great applications.

Database Advisor continuously monitors your database usage and provides recommendations to improve performance (create/drop indexes, and more)
You can choose to have recommendations applied automatically (via the Automatic Tuning option)
Recommendations can also be applied or rolled back manually (via the Azure Portal or REST API)
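As a rough illustration of the manual path, the sketch below reads a database’s index recommendations through the generic ARM cmdlet. This is not the documented workflow from this post: the resource-group, server, and database names are placeholders, and the advisor resource path and API version may differ in your environment.

```powershell
# Illustrative sketch only -- names are placeholders, and the advisor
# resource path/API version may vary for your subscription. Reads the
# CreateIndex advisor's recommended actions via the generic ARM cmdlet.
$actions = Get-AzureRmResource `
    -ResourceGroupName "myResourceGroup" `
    -ResourceType "Microsoft.Sql/servers/databases/advisors/recommendedActions" `
    -ResourceName "myserver/mydatabase/CreateIndex" `
    -ApiVersion "2015-05-01-preview"

# Print each recommendation and its current state (e.g. Active, Executing)
$actions | ForEach-Object { "{0}: {1}" -f $_.Name, $_.Properties.state.currentValue }
```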

What’s new in this release?

The following summarizes the improvements in this release (before → now):

Time to produce new index recommendations (for a database with daily usage): ~7 days → ~18 hours
Delay before the T-SQL statement (CREATE INDEX or DROP INDEX) is executed: ~12 hours → starts within minutes
Time to react to regressions and revert “bad” tuning actions: ~12 hours → <=1 hour
Delay between implementing consecutive index recommendations: ~12 hours between indexes → starts within minutes
Total time to implement (for a database with 3 active recommendations): ~9 days → ~1 day

Automated Index Tuning is now even more powerful

All these improvements together make automated index tuning an even more attractive choice for managing the performance of your Azure SQL databases. With the new recommendation models and greatly improved underlying automation, Database Advisor will tirelessly work 24/7 to make your database applications run blazing fast at all times.

If you’re not using automated tuning yet, we strongly encourage you to give it a try – you’ll be pleasantly surprised with the results, as many of our other customers already were.

Summary

You can now run your production DB workload in SQL DB for a day, and Database Advisor will help you improve your database performance by providing custom tuning recommendations. You can also opt in to automated tuning mode, where the tuning recommendations will be auto-applied to your database for a completely hands-off tuning experience.

Now you can dedicate your energy and attention to building great database applications, while the SQL DB service keeps your databases running and performing great for you.

Next steps

If you’re new to Azure SQL Database, sign up now for a free trial and discover how the built-in intelligence of Azure SQL DB makes it easier and faster than ever to build amazing database applications.

If you’re already using Azure SQL Database, try SQL Database Advisor today and share your feedback with us using the built-in feedback mechanism on the Azure Portal, or in the comments section of this post. We’d love to hear back from you!

For more detailed information, check out the SQL Database Advisor online documentation.
Source: Azure

Azure SQL Data Warehouse: February 2016 Updates

Documentation

The service team has migrated 600+ reference topics from our similar on-premises APS/PDW product to MSDN. We have also taken your feedback and made a number of updates to our online documentation, and added the APPLIES TO marker on the 600+ reference topics that pertain to SQL DW. As you can see below, the CREATE DATABASE topic pertains only to Azure SQL Data Warehouse, whereas CREATE COLUMNSTORE INDEX applies to multiple products.

Source: Azure

The biggest PCI coverage in the industry just got bigger

We are pleased to announce that Microsoft has published our second Azure Payment Card Industry (PCI) Attestation of Compliance (AoC) for this year on the Microsoft Trust Center. Azure has the biggest PCI coverage in the industry, and we are rapidly creating new services and features that our PCI customers want to leverage in their compliant solutions. To keep up with the pace of Azure’s growth, we now undergo two PCI assessments each year.

The Core AoC was released in March and covers the Azure platform, along with a number of Azure Services. The Azure PCI DSS AoC Package contains both the Core and Add-on AoC. This second AoC, the Add-on, covers the following additional Azure services:

IoT Hub
Service Fabric
StorSimple
API Management
Operations Management Suite
Azure Automation
Log Analytics
Azure Backup
Azure Site Recovery
Microsoft Intune
Azure Container Service
Stream Analytics
Power BI

With the addition of the services above, Azure’s list of PCI-attested services now stands at 40! We will continue to increase our coverage of available services in the next round of assessments.
Source: Azure

What’s brewing in Visual Studio Team Services: September 2016 Digest

This post series will provide the latest updates and news for Visual Studio Team Services and will be a great way for Azure users to keep up-to-date with new features being released every three weeks. Visual Studio Team Services offers the best DevOps tooling to create an efficient continuous integration and release pipeline to Azure. With the rapidly expanding list of features in Team Services, teams can start to leverage it more efficiently for all areas of their Azure workflow, for apps written in any language and deployed to any OS.

New features released in August 2016

Fall is upon us, but the sun is always shining for Team Services users. In August, users were treated to a host of exciting Team Services updates: a major redesign of social coding with Git (a new pull requests UI, comments with markdown and emoji, auto-complete pull requests), a new clone-in-Git-Tower option, a host of build and CI enhancements, and updates to the Agile tools and testing.

New navigation – get around Team Services faster

A new Team Services navigation offers a more efficient way of browsing around the various hubs within Team Services, so you can get to where you want in fewer clicks than before.

Using Team Services with Jenkins gets easier

With the new improvements in the Jenkins tasks in Team Services, it has become even easier to perform common scenarios such as:

Use Jenkins to validate your Team Services pull requests
Use Jenkins continuous integration for your Team Services Git repository
Use Jenkins to test or deploy your Team Services build
Download Jenkins build artifacts for use in a Team Services test, build, or release

Copying/uploading files to deploy to Linux gets simpler too!

Updates in Team Services now make it easier to copy files over SSH during CI and CD. The following guides are great pointers on enabling deployment to Linux from Team Services/TFS.

Deploy an Azure Red Hat Linux VM running Apache Tomcat.
Deploy an Azure Ubuntu Linux VM running Apache Tomcat.

If you prefer uploading files via FTP, the new FTP upload task will make it easier to do so as part of a build.

Track requirements quality right on the Team Services dashboard

It is now easier to link your automated tests to requirements and use dashboard widgets to track the quality of each requirement.

Team Services build pool gets .NET Core agent and more

The hosted build pool has been updated with new Xamarin versions and a .NET Core agent.

New extensions for Team Services

See the extensions monthly roundup for a preview of Definition of Done, Personas and Product Vision extensions.

Inside Team Services – Summer Interns and Package Management

See what three awesome interns are doing to improve the Package management experience in Team Services.

Source: Azure

Microsoft enabling customers to "go to" MARS-E

We’re not referring to the planet (maybe one day?), but we are still excited to announce that Microsoft Azure is the first hyper-scale platform to enable Affordable Care Act (ACA) Administering Entities (AEs) to address the Minimum Acceptable Risk Standards for Exchanges (MARS-E) 2.0 security and privacy control requirements.

Azure provides controls and capabilities that can be used by customers to help manage MARS-E 2.0 control requirements, reaffirming Microsoft’s continued commitment to enable healthcare industry customers to meet their security, legal, and regulatory needs.

Microsoft Azure enables customers who process healthcare information to meet their obligations for protecting data in a manner that complies with MARS-E security requirements. Microsoft offers a comprehensive portfolio of authorizations, and achieving MARS-E compliance complements our existing FedRAMP, HIPAA/HITECH, and HITRUST-certified offerings, strongly positioning us to help our customers comply with the myriad of healthcare industry requirements.

MARS-E was originally published in 2012 and contains the information security guidance, requirements, and templates for AEs, including the state and federal Health Insurance Exchanges (HIX), or marketplaces, that facilitate the purchase of health insurance by consumers and small businesses. The exchanges handle Personally Identifiable Information (PII), Protected Health Information (PHI), or Federal Tax Information (FTI) of U.S. citizens. MARS-E provides guidance for state and federal HIXs and their contractors on the minimum-level security controls that must be implemented to protect the information and information systems that the Centers for Medicare and Medicaid Services (CMS) oversees. The new MARS-E 2.0 framework took effect in September 2015 and includes significant updates to security and privacy controls.

While we are still grounded here on Earth, healthcare industry companies can now build and operate on the Microsoft Azure platform with a layer of assurance that specifically enforces the type of data protection critical to the healthcare industry and its regulators. This milestone is another example of our commitment to being the leader in providing trusted cloud solutions. Visit the Microsoft Trust Center for more information.
Source: Azure

Azure SQL Data Warehouse general availability expanding to 18 regions worldwide

We are excited to announce the general availability of Azure SQL Data Warehouse in four additional regions – North Europe, Japan East, Brazil South, and Australia Southeast. These additional locations bring the product’s worldwide availability count to 18 regions – more than any other major cloud provider.

SQL Data Warehouse is your go-to SQL-based view across data, offering a fast, fully managed, petabyte-scale cloud solution. It is highly elastic, enabling you to provision in minutes and scale up to 60 times larger in seconds. You can scale compute and storage independently, ranging from burst to archival scenarios, and pay for what you use instead of being locked into a confined bundle. Plus, SQL Data Warehouse offers the unique option to pause compute, giving you even more freedom to manage your cloud costs.
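As a sketch of that elasticity, scaling and pausing can be driven from PowerShell. This assumes the AzureRM.Sql module and uses placeholder resource names; adapt them to your own deployment.

```powershell
# Sketch, assuming the AzureRM.Sql module and placeholder names.
# Scale the warehouse by requesting a different service objective (DWU level).
Set-AzureRmSqlDatabase -ResourceGroupName "myResourceGroup" `
    -ServerName "myserver" -DatabaseName "mydw" `
    -RequestedServiceObjectiveName "DW400"

# Pause compute when the warehouse is idle; storage remains available.
Suspend-AzureRmSqlDatabase -ResourceGroupName "myResourceGroup" `
    -ServerName "myserver" -DatabaseName "mydw"

# Resume compute when you need the warehouse again.
Resume-AzureRmSqlDatabase -ResourceGroupName "myResourceGroup" `
    -ServerName "myserver" -DatabaseName "mydw"
```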

With general availability, SQL Data Warehouse offers an availability SLA of 99.9% – the only public cloud data warehouse service that offers an availability SLA to customers. Geo-Backup support has also been added to enable geo-resiliency of your data, allowing a SQL Data Warehouse geo-backup to be restored to any region in Azure. With this feature enabled, backups remain available even in the case of a region-wide failure, keeping your data safe. See this blog post for more info on the capabilities and features of SQL Data Warehouse.

Getting started with SQL Data Warehouse is easy and you can provision a data warehouse within minutes. It easily integrates with business intelligence tools like Power BI and with Azure Machine Learning for predictive analytics. Begin today and experience the speed, scale, elasticity, security and ease of use of a cloud-based data warehouse for yourself.

Azure SQL Data Warehouse regional availability

Azure SQL Data Warehouse is generally available in the following regions: North Europe, Japan East, Brazil South, Australia Southeast, Central US, East US, East US 2, South Central US, West Central US, West US, West US 2, West Europe, East Asia, Southeast Asia, Central India, South India, Canada Central, Canada East.

Learn more about Azure services availability across regions on Azure’s regional information page.

Share your feedback

We would love to hear from you about what features you would like us to add. Please let us know on our feedback site what features you want most. Users who suggest or vote for feedback will receive periodic updates on their request and will be the first to know when the feature is released.

Learn more

Check out the many resources for learning more about SQL Data Warehouse, including:

What is Azure SQL Data Warehouse?
SQL Data Warehouse best practices
Video library
MSDN forum
Stack Overflow forum
Source: Azure

Azure Storage PowerShell v.1.7 – Hotfix to v1.4 breaking changes

Breaking changes were introduced in Azure PowerShell v1.4. These breaking changes are present in Azure PowerShell versions 1.4-1.6 and versions 2.0 and later. The following Azure Storage cmdlets were impacted:

Get-AzureRmStorageAccountKey: Accessing keys
New-AzureRmStorageAccountKey: Accessing keys
New-AzureRmStorageAccount: Specifying account type and endpoints
Get-AzureRmStorageAccount: Specifying account type and endpoints
Set-AzureRmStorageAccount: Specifying account type and endpoints

To minimize customer impact, we are releasing Azure PowerShell v1.7 – a hotfix that addresses all of the breaking changes except the change to the Endpoint properties of New-AzureRmStorageAccount, Get-AzureRmStorageAccount, and Set-AzureRmStorageAccount. This means no code change is required by customers where the hotfix is applicable. This hotfix will not be present in Azure PowerShell versions 2.0 and later, so please plan to update your use of the above cmdlets when you update to Azure PowerShell v2.0.

Below, you’ll find examples for how the above cmdlets work for different versions of Azure PowerShell and the action required:

Accessing keys with Get-AzureRmStorageAccountKey and New-AzureRmStorageAccountKey

V1.3.2 and earlier:

$key = (Get-AzureRmStorageAccountKey -ResourceGroupName $groupname -Name $accountname).Key1

$key = (Get-AzureRmStorageAccountKey -ResourceGroupName $groupname -Name $accountname).Key2

$key = (New-AzureRmStorageAccountKey -ResourceGroupName $groupname -Name $accountname -KeyName $keyname).StorageAccountKeys.Key1

$key = (New-AzureRmStorageAccountKey -ResourceGroupName $groupname -Name $accountname -KeyName $keyname).StorageAccountKeys.Key2

V1.4-V1.6 and V2.0 and later:

These cmdlets now return a list of keys, rather than an object with a property for each key.

# Replaces Key1
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName $groupname -Name $accountname)[0].Value

# Replaces Key2
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName $groupname -Name $accountname)[1].Value

# Replaces Key1
$key = (New-AzureRmStorageAccountKey -ResourceGroupName $groupname -Name $accountname -KeyName $keyname).Keys[0].Value

# Replaces Key2
$key = (New-AzureRmStorageAccountKey -ResourceGroupName $groupname -Name $accountname -KeyName $keyname).Keys[1].Value

V1.7 (Hotfix):

Both methods work.

$key = (Get-AzureRmStorageAccountKey -ResourceGroupName $groupname -Name $accountname).Key1

$key = (Get-AzureRmStorageAccountKey -ResourceGroupName $groupname -Name $accountname)[0].Value

$key = (New-AzureRmStorageAccountKey -ResourceGroupName $groupname -Name $accountname -KeyName $keyname).StorageAccountKeys.Key1

$key = (New-AzureRmStorageAccountKey -ResourceGroupName $groupname -Name $accountname -KeyName $keyname).Keys[0].Value
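If a script has to run against an unknown Azure PowerShell version, one defensive option is to probe the shape of the returned object rather than assume it. A minimal sketch (the helper function name is ours, not part of the module):

```powershell
# Sketch: fetch the primary storage key regardless of Azure PowerShell version,
# by checking which shape Get-AzureRmStorageAccountKey returned.
function Get-PrimaryStorageKey($groupname, $accountname) {
    $result = Get-AzureRmStorageAccountKey -ResourceGroupName $groupname -Name $accountname
    if ($result.PSObject.Properties['Key1']) {
        return $result.Key1        # v1.3.2 and earlier, or the v1.7 hotfix
    }
    return $result[0].Value        # v1.4-v1.6, and v2.0 and later
}
```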

Specifying account type in New-AzureRmStorageAccount, Get-AzureRmStorageAccount, and Set-AzureRmStorageAccount

V1.3.2 and earlier:

$AccountType = (Get-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname).AccountType

$AccountType = (New-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname).AccountType

$AccountType = (Set-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname).AccountType

V1.4-V1.6 and V2.0 and later:

The AccountType field in the output of these cmdlets has been renamed to Sku.Name.

$AccountType = (Get-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname).Sku.Name

$AccountType = (New-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname).Sku.Name

$AccountType = (Set-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname).Sku.Name

V1.7 (Hotfix):

Both methods work.

$AccountType = (Get-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname).AccountType

$AccountType = (New-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname).AccountType

$AccountType = (Set-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname).AccountType

$AccountType = (Get-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname).Sku.Name

$AccountType = (New-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname).Sku.Name

$AccountType = (Set-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname).Sku.Name
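The same defensive approach works for the account type: fall back from the old property to the new one. A minimal sketch (the helper function name is ours, not part of the module):

```powershell
# Sketch: read the account type across Azure PowerShell versions by falling
# back from the old AccountType property to the new Sku.Name.
function Get-StorageAccountType($groupname, $accountname) {
    $account = Get-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname
    if ($account.PSObject.Properties['AccountType']) {
        return $account.AccountType   # v1.3.2 and earlier, or the v1.7 hotfix
    }
    return $account.Sku.Name          # v1.4-v1.6, and v2.0 and later
}
```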

Specifying Endpoints in New-AzureRmStorageAccount, Get-AzureRmStorageAccount, and Set-AzureRmStorageAccount

V1.3.2 and earlier:

$blobEndpoint = (Get-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname).PrimaryEndpoints.Blob.AbsolutePath

$blobEndpoint = (New-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname).PrimaryEndpoints.Blob.AbsolutePath

$blobEndpoint = (Set-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname).PrimaryEndpoints.Blob.AbsolutePath

V1.4-V1.6 and V2.0 and later:

The output type of the blob/table/queue/file properties on PrimaryEndpoints and SecondaryEndpoints changed from Uri to String.

$blobEndpoint = (Get-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname).PrimaryEndpoints.Blob

$blobEndpoint = (New-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname).PrimaryEndpoints.Blob

$blobEndpoint = (Set-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname).PrimaryEndpoints.Blob

Note: Calling ToString() on these properties will continue to work. For example:

$blobEndpoint = (Get-AzureRmStorageAccount -ResourceGroupName $groupname -Name $accountname).PrimaryEndpoints.Blob.ToString()

V1.7 (Hotfix):

No hotfix was provided for this breaking change. The returned endpoint properties will remain strings, as changing them back to Uri would introduce an additional breaking change.

Next steps

Download Azure PowerShell v1.7 (hotfix).
View all Azure PowerShell releases.
See migration guide for Azure PowerShell 2.0.

Source: Azure

Exploring HockeyApp data in Application Insights: introducing the Bridge App

In a previous blog post, we announced that the data from any app in HockeyApp would soon be accessible through the Analytics and Continuous Export features in Application Insights. This functionality is now available through the new HockeyApp Bridge App! With it, you can query raw HockeyApp data and gain insights from it, as well as export it to your own data store for warehousing purposes. In this blog post, we’ll see how to take advantage of this new feature and answer common questions about how to instrument your applications going forward.

The HockeyApp Bridge App

HockeyApp is a great tool for instrumenting your mobile and desktop applications. It has powerful facilities for tracking distribution, adoption, crash reporting, feedback, and other data, along with a collection of dashboards through which this data can be explored. Sometimes, however, you need to access, analyze, and visualize your data in ways other than those presently exposed in HockeyApp. This is where the new HockeyApp Bridge application type in Application Insights comes in! Your HockeyApp data is available in its raw form to query and analyze using Analytics, and to export to your own data store via Continuous Export.

Creating a HockeyApp Bridge App in Application Insights

The HockeyApp Bridge App is the core feature that enables you to access your HockeyApp data in Application Insights through the Analytics and Continuous Export features. Data collected by HockeyApp after the creation of the Bridge App will be accessible from these features. All you need to do to set up a Bridge App is create a new Application Insights resource with the “HockeyApp Bridge Application” app type. You will need to provide an API key, which you can obtain from your HockeyApp settings; soon after creation, the Analytics and Continuous Export features will be available against your HockeyApp data.

Please see our documentation for a detailed walkthrough of setting up a HockeyApp Bridge App, as well as to learn more about the various ways to access your data.

Using the HockeyApp Bridge App

Let’s look at a simple practical example of using the HockeyApp Bridge App. In this case, we’ll look at telemetry generated by an iOS app created for the Xamarin Evolve 2016 conference (App Store link). The app was instrumented using the iOS HockeySDK.

First, let’s create a HockeyApp Bridge App per the instructions above:

Now, let’s press the Analytics button to open a new Application Insights Analytics window. Once loaded, let’s open a new tab, type in the following query, and press “go”:

customEvents
| summarize country_Count = count() by client_CountryOrRegion
| order by country_Count
| render piechart

And just like that, we now have a pie chart of the country of origin of our users!

This is a very simple scenario, but building on it with the extensive querying capabilities in Analytics, you can really begin to investigate and gain insights about your HockeyApp applications. In parallel, you can configure Continuous Export to store this data in your warehouse and later join it with other data sources. All of this and more is now possible with the new HockeyApp Bridge App in Application Insights!

FAQs

Over the last several blog posts around instrumenting mobile and desktop apps, we’ve received some common questions:

Can you summarize how to instrument my applications going forward?

Please refer to the simple flowchart below to determine how to instrument your applications and access your data going forward (click for a large version):

Should I use Application Insights or HockeyApp to instrument my mobile and desktop applications?

You should use HockeyApp to instrument mobile and desktop applications going forward. HockeyApp provides great capabilities for these types of apps including distribution, adoption, crash reporting, and feedback tracking.

What if I need more information than the dashboards in HockeyApp provide?

Should you need to access or analyze the raw data from HockeyApp beyond the pre-made dashboards available there, you should create a HockeyApp Bridge App as described in this blog post. You can then further interact with your data through the Analytics and Continuous Export features in Application Insights.

What SDKs should I use for my mobile and desktop applications?

While the Application Insights SDKs for mobile and desktop applications will continue working for the foreseeable future, you should use the HockeyApp SDKs going forward. More details about this are available in our previous blog post.

How do I track handled exceptions in my mobile and desktop applications?

Tracking of handled exceptions is not natively available for mobile and desktop apps. It is, however, easy to implement via the Custom Event mechanism: a helper method can be created as described in the HockeyApp KB repository.

Next steps

Instrumenting your mobile or desktop application is easy – use HockeyApp and the HockeySDKs, and create a HockeyApp Bridge App in Application Insights if you need to analyze or access your raw data. Continue using Application Insights and the Application Insights SDKs for all other application types.

As always, please share your ideas for new or improved features on the Application Insights UserVoice page, and for any questions visit the Application Insights Forum or HockeyApp Support!
Source: Azure