.NET: Manage Azure Container Service, Cosmos DB, Active Directory Graph and more

We released version 1.1 of the Azure Management Libraries for .NET. This release adds support for:

Cosmos DB
Azure Container Service and Registry
Active Directory Graph

https://github.com/azure/azure-sdk-for-net/tree/Fluent

Getting started

You can download version 1.1 from NuGet (Microsoft.Azure.Management.Fluent) or from the GitHub repository linked above.

Create a Cosmos DB with DocumentDB API

You can create a Cosmos DB account by using a define() … create() method chain.

var documentDBAccount = azure.DocumentDBAccounts.Define(docDBName)
    .WithRegion(Region.USEast)
    .WithNewResourceGroup(rgName)
    .WithKind(DatabaseAccountKind.GlobalDocumentDB)
    .WithSessionConsistency()
    .WithWriteReplication(Region.USWest)
    .WithReadReplication(Region.USCentral)
    .Create();

In addition, you can:

Create Cosmos DB with DocumentDB API and configure for high availability
Create Cosmos DB with DocumentDB API and configure with eventual consistency
Create Cosmos DB with DocumentDB API, configure for high availability and create a firewall to limit access from an approved set of IP addresses
Create Cosmos DB with MongoDB API and get connection string
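As a sketch of that last scenario, the same fluent pattern can create a MongoDB-flavored account and then read its connection strings. The method names used here (WithKind(DatabaseAccountKind.MongoDB), WithEventualConsistency(), ListConnectionStrings()) are assumptions extrapolated from the pattern above, so check the linked samples for the authoritative version:

```csharp
// Hypothetical sketch: create a Cosmos DB account with the MongoDB API
// and retrieve its connection strings. Verify method names against the samples.
var mongoAccount = azure.DocumentDBAccounts.Define(docDBName)
    .WithRegion(Region.USEast)
    .WithNewResourceGroup(rgName)
    .WithKind(DatabaseAccountKind.MongoDB)
    .WithEventualConsistency()
    .Create();

// Connection strings for use by MongoDB client drivers.
var connectionStrings = mongoAccount.ListConnectionStrings();
```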

Create an Azure Container Registry

You can create an Azure Container Registry by using a define() … create() method chain.

var azureRegistry = azure.ContainerRegistries.Define("acrdemo")
    .WithRegion(Region.USEast)
    .WithNewResourceGroup(rgName)
    .WithNewStorageAccount(saName)
    .WithRegistryNameAsAdminUser()
    .Create();

You can get Azure Container Registry credentials by using ListCredentials().

RegistryListCredentials acrCredentials = azureRegistry.ListCredentials();
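With the admin user enabled, those credentials can be used to log in to the registry from any Docker client. A sketch, where the registry name follows the "acrdemo" example above and the username and password placeholders stand for the values returned by ListCredentials():

```
# Log in to the registry with the credentials returned by ListCredentials().
# <username> and <password> are placeholders for the retrieved values.
docker login acrdemo.azurecr.io -u <username> -p <password>
```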

Create an Azure Container Service with Kubernetes Orchestration

You can create an Azure Container Service by using a define() … create() method chain.

var azureContainerService = azure.ContainerServices.Define(acsName)
    .WithRegion(Region.USEast)
    .WithNewResourceGroup(rgName)
    .WithKubernetesOrchestration()
    .WithServicePrincipal(servicePrincipalClientId, servicePrincipalSecret)
    .WithLinux()
    .WithRootUsername(rootUserName)
    .WithSshKey(sshPublicKey)
    .WithMasterNodeCount(ContainerServiceMasterProfileCount.MIN)
    .WithMasterLeafDomainLabel("dns-myK8S")
    .DefineAgentPool("agentpool")
        .WithVMCount(1)
        .WithVMSize(ContainerServiceVMSizeTypes.StandardD1V2)
        .WithLeafDomainLabel("dns-ap-myK8S")
        .Attach()
    .Create();
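After the service is created, you would typically fetch the cluster's Kubernetes credentials before deploying workloads. A hedged sketch using the Azure CLI of that era; the resource group and service names below are placeholders matching the variables used above:

```
# Download kubectl credentials for the ACS Kubernetes cluster (placeholders).
az acs kubernetes get-credentials --resource-group <rgName> --name <acsName>

# Verify that the master and agent nodes are reachable.
kubectl get nodes
```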

Create Service Principal with Subscription Access

You can create a service principal and assign it to a subscription with contributor role by using a define() … create() method chain.

var servicePrincipal = authenticated.ServicePrincipals.Define("spName")
    .WithExistingApplication(activeDirectoryApplication)
    // define credentials
    .DefinePasswordCredential("ServicePrincipalAzureSample")
        .WithPasswordValue("StrongPass!12")
        .Attach()
    // define certificate credentials
    .DefineCertificateCredential("spcert")
        .WithAsymmetricX509Certificate()
        .WithPublicKey(File.ReadAllBytes(certificate.CerPath))
        .WithDuration(TimeSpan.FromDays(7))
        // export credentials to a file
        .WithAuthFileToExport(new StreamWriter(new FileStream(authFilePath, FileMode.OpenOrCreate)))
        .WithPrivateKeyFile(certificate.PfxPath)
        .WithPrivateKeyPassword(certPassword)
        .Attach()
    .WithNewRoleInSubscription(role, subscriptionId)
    .Create();

Similarly, you can:

Manage service principals
Browse the graph (users, groups, and members) and manage roles
Manage passwords

Try it

You can get more samples from GitHub. Give it a try and let us know what you think by emailing us or commenting below.
Source: Azure

ISVs find their cloud footing on Azure

This post is authored by the ISV team.

According to Gartner, “By 2020, anything other than a cloud-only strategy for new IT initiatives will require justification at more than 30% of large-enterprise organizations.” With innovation shifting to public datacenters, pressure is on ISVs to develop their own cloud roadmap.

Moving to the cloud is a big step, but it might be easier than you think. The Microsoft Azure platform has an array of options that accelerate business transformation. Move to the cloud on your terms, and from there the sky’s the limit.

For example, Baker Hill, a technology solution provider to more than 600 banks and credit unions, needed to move more than 10 terabytes of data from its parent company’s datacenter in just 48 hours without using a transfer agent or touching anything in the originating datacenter. With help from Microsoft, Baker Hill migrated hundreds of databases with time to spare by using Azure ExpressRoute connected to Equinix’s high-speed network. And now that Baker Hill has met its migration deadline, the company is continuing to transform its offerings with Azure.

In another scenario, Brainshark, which provides its clients worldwide with a cloud-based sales readiness and training platform, needed to find a more elastic solution to handle an ever-expanding volume of video content. To eliminate storage and processing constraints, Brainshark moved to Azure. In addition to improving the end-user experience, the transition virtually eliminated maintenance costs. But that was just the first step. Next, the company created Brainshark Labs, an incubator for next-generation sales enablement solutions that include wearable technology, virtual reality, and artificial intelligence. For this next chapter of innovation, Brainshark integrated Azure Cognitive Services with HoloLens mixed-reality simulation technology to transform sales training and customer engagement.

These are just two of many success stories with Microsoft technologies. Are you ready to add yours?

Learn more about partnering with Microsoft.

Announcing the Solution Template for Jenkins on Azure

Have you been looking for the Microsoft Azure Marketplace image for Jenkins on Azure? We removed it because the Jenkins version used is outdated. I am excited to announce the replacement and share some updates from our team.

Solution template for Jenkins in Azure Marketplace

The solution template for Jenkins in Azure Marketplace is designed to configure a Jenkins instance following best practices with minimal Azure knowledge. You can now easily provision a fully configured Jenkins instance in minutes with a single click through the Azure portal and a handful of user inputs.

The template installs the latest stable Jenkins version on a Linux (Ubuntu 14.04 LTS) Virtual Machine along with the following tools and plugins configured to work with Azure:

Git for source control
Azure Credentials plugin for connecting securely
Azure VM Agents plugin for elastic build, test and continuous integration
Azure Storage plugin for storing artifacts
Azure CLI to deploy apps using scripts

You can find a 5-min quickstart that provides a step-by-step walkthrough on the new Jenkins Hub. And yes, we now have a central hub where you can get all Jenkins on Azure resources.

Azure Credentials plugin version 1.2

We updated the Azure Credentials plugin so that you can now retrieve an Azure service principal and use it in Azure CLI.

In the code snippet below, substitute 'my service principal' with your credential ID from your Jenkins instance.

withCredentials([azureServicePrincipal('my service principal')]) {
    sh 'az login --service-principal -u $AZURE_CLIENT_ID -p $AZURE_CLIENT_SECRET -t $AZURE_TENANT_ID'
}

This article on the Jenkins Hub shows you how to create a Jenkins pipeline that checks out the source code from a GitHub repo, runs Maven, and then uses the Azure CLI to deploy to Azure App Service.

As always, we would love to get your feedback via comments. You can also email Azure Jenkins Support to let us know what you think.

Azure Compute Reddit AMA – July 2017

The Azure Compute team will host a special Ask Me Anything session on /r/Azure, Friday, July 14, 2017 from 9:00 am to Noon PDT.

What's the AMA about?

We'll have folks from across the Azure Compute Engineering team available to answer any questions you have. You can ask us anything about our products, services, or even our team!

Why are you doing an AMA?

It’s a top priority for us to reach out and learn from our customers and the community. We want to know how you use Azure and Azure Compute, and how your experience has been. Your questions provide insights into how we can make the service better.

Who will be there?

We'll have PMs and Developers from across Azure Compute participating throughout the AMA.

Have any questions about the following topics? Bring them to the AMA.

Linux & Windows VMs
VM Scale Sets
Service Fabric, on Linux or Windows
Azure Container Service, using Kubernetes, DC/OS, or Docker Swarm
Azure Resource Manager
Azure Backup
Service Bus
Azure Batch
Azure Portal
And More!

Why should I ask questions here instead of other channels? Can I really ask anything?

With an AMA, you’ll get answers directly from the team and have a conversation with the people who build these products and services. Go ahead, ask us anything about our public products or the team. Please note, we cannot comment on unreleased features and future plans.

Join us! We're looking forward to having a conversation with you.

Petya ransomware prevention & detection in Azure Security Center

This blog post was authored by Tim Burrell, Principal Engineering Manager, Microsoft Threat Intelligence Center​.

Microsoft Malware Protection Center (MMPC) published a blog post yesterday detailing a new ransomware infection that appears to have begun in Ukraine and spread from there to other places in Europe and beyond. MMPC analysis showed this to be a more sophisticated variant of Ransom:Win32/Petya and all free Microsoft antimalware products were updated with signatures for this threat, including Windows Defender Antivirus.

This post summarizes measures that Azure customers can take to prevent and detect this threat through Azure Security Center. See here for basic information on enabling Azure Security Center.

Prevention

Azure Security Center scans virtual machines across an Azure subscription and makes a recommendation to deploy endpoint protection where an existing solution is not detected. This recommendation can be accessed via the Prevention section as shown below.

Drilling into the Compute pane (or the overview recommendations pane) shows more detail, including the Endpoint Protection installation recommendation being discussed here:

Clicking on this leads to a dialog allowing selection and installation of an endpoint protection solution, including Microsoft’s own antimalware solution:

These recommendations and associated mitigation steps are available to Azure Security Center Free tier customers.

Detection

Azure Security Center customers who have opted into the Standard tier can benefit from a new detection, recently added to alert on specific indicators of Petya ransomware running on an infected host; this is described in further detail below.

These alerts are accessed via the Detection pane highlighted below, and require the Azure Security Center Standard tier.

An alert for Petya ransomware will show up as shown below:

Drilling in provides more detail on the impacted VM and the suspicious process or command line that triggered the alert:

Note that although the detection alert relates to a specific host, because this ransomware attempts to propagate to other nearby machines, it is important to apply remediation steps on all hosts on the network, not just the host identified in the alert.

Please follow the remediation steps indicated in the Alert or in the Microsoft Malware Protection Center (MMPC) blog.

New troubleshooting and diagnostics for Azure Files Storage mounting errors on Windows

Azure File Storage offers fully managed file shares in the cloud using the Server Message Block (SMB) protocol, the predominant file share protocol for on-premises Windows use cases. Azure Files can be mounted from any client OS that implements the SMB versions supported by Azure Files. Today, we are introducing AzFileDiagnostics to help first-time Azure Files users ensure that their Windows client environment has the correct prerequisites. AzFileDiagnostics automates the detection of most of the symptoms mentioned in the troubleshooting Azure Files article and helps you set up your environment for optimal performance.

In general, mounting a file share on Windows is as simple as running a standard “net use” command. When you create a share, the Azure portal automatically generates a “net use” command and makes it available for copy and paste. You can simply click the “Connect” button, copy the command for mounting the file share, paste it on your client, and you have a drive with the file share mounted. What could go wrong? Well, as it turns out, the use of different clients, SMB versions, firewall rules, ISPs, or IT policies can all affect connectivity to Azure Files. The good news is that AzFileDiagnostics isolates and examines each source of possible issues and, in turn, provides you with advice or workarounds to correct the problem.
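For reference, the portal-generated mount command follows this shape; the account name, share name, and key below are placeholders, not real values:

```
:: Mount an Azure Files share as drive Z: from a Windows command prompt.
:: <account>, <share>, and <account-key> are placeholders.
net use Z: \\<account>.file.core.windows.net\<share> /u:AZURE\<account> <account-key>
```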

As an example, Azure Files supports SMB protocol versions 2.1 and 3.0. To ensure secure connectivity, Azure Files requires communication from another region or from on-premises to be encrypted, which means SMB 3.0 with channel encryption is required for those use cases. AzFileDiagnostics detects the SMB version on the client and automatically determines whether the client meets the necessary encryption requirement.

How to use AzFileDiagnostics

You can download AzFileDiagnostics from the Script Center today and simply run:

PowerShell Command:

AzFileDiagnostics.ps1 [-StorageAccountName <storage account name>] [-FileShareName <share name>] [-EnvironmentName <AzureCloud|AzureChinaCloud|AzureGermanCloud|AzureUSGovernment>]

Usage Examples:

AzFileDiagnostics.ps1

AzFileDiagnostics.ps1 -UncPath \\storageaccountname.file.core.windows.net\sharename

AzFileDiagnostics.ps1 -StorageAccountName storageaccountname -FileShareName sharename -EnvironmentName AzureCloud

In addition to diagnosing issues, it will present you with an option to mount the file share when the checks have successfully completed.

Learn more about Azure Files

Get started with Azure Files
Use Azure Files with Linux
Troubleshoot Azure Files on Windows
Troubleshoot Azure Files on Linux

Feedback

We hope that AzFileDiagnostics will make your getting-started experience smoother. We'd love to hear your feedback. If there are additional troubleshooting topics for Azure Files that you would like to see, please leave a comment below. In addition, if you have any feature requests, we are always listening on our User Voice. Thanks!

Identity now available in SQL Data Warehouse

Azure SQL Data Warehouse (SQL DW) is a SQL-based, fully managed, petabyte-scale cloud solution for data warehousing. SQL DW is highly elastic: you can provision in minutes and scale capacity in seconds. You can scale compute and storage independently, allowing you to burst compute for complex analytical workloads or scale down your warehouse for archival scenarios, and pay based on what you're using instead of being locked into predefined cluster configurations.

IDENTITY has been a long-standing customer ask for SQL Data Warehouse. We’re excited to announce that Azure SQL Data Warehouse now supports an IDENTITY column property, as well as SET IDENTITY_INSERT syntax and generating IDENTITY values on load. In data warehousing, IDENTITY functionality is particularly important because it makes the creation of surrogate keys easier.

Surrogate keys are fundamental to dimensional modelling because they often uniquely identify a row. Since they are typically integer values, they also tend to compress and compare with better performance. While UUIDs can often be used for similar purposes, they are harder to manage, don’t intrinsically contain temporal information, and perform poorly. For large data warehouses, the 4x size of UUIDs compared with a traditional 4-byte IDENTITY value really adds up. The previous method of assigning monotonically increasing surrogate keys involved a left outer join from a staging table, combined with adding the maximum existing id on the surrogate key column to a ROW_NUMBER value. This solution was clunky and invoked a costly broadcast data move.

We hope that by adding this feature, we’ve made data management in SQL DW easier and better for our customers.

Keep in mind, this IDENTITY property is not synonymous with uniqueness constraints which are often imposed on IDENTITY columns!

Next steps

Get started by creating IDENTITY columns in a table today. It’s as simple as:

CREATE TABLE dbo.T1
( C1 INT IDENTITY(1,1) NOT NULL
, C2 INT NULL
)
WITH
( DISTRIBUTION = HASH(C2)
, CLUSTERED COLUMNSTORE INDEX
)
;

Bear in mind that the IDENTITY property cannot be used in the following scenarios:

Where the column data type is not INT or BIGINT
Where the column is also the distribution key
Where the table is an external table
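As a quick illustration, IDENTITY values are generated automatically on insert, and the SET IDENTITY_INSERT syntax mentioned above lets you supply explicit values when needed. A minimal sketch against the table defined earlier:

```
-- Values for C1 are generated automatically (e.g., 1, 2, ...).
INSERT INTO dbo.T1 (C2) VALUES (10);
INSERT INTO dbo.T1 (C2) VALUES (20);

-- Supply an explicit surrogate key value when required;
-- an explicit column list is mandatory while IDENTITY_INSERT is ON.
SET IDENTITY_INSERT dbo.T1 ON;
INSERT INTO dbo.T1 (C1, C2) VALUES (100, 30);
SET IDENTITY_INSERT dbo.T1 OFF;
```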

Learn more about adding IDENTITY functionality to your tables today by visiting our documentation or our T-SQL syntax page.

Learn more

What is Azure SQL Data Warehouse?
SQL Data Warehouse best practices
Video library
MSDN forum
Stack Overflow forum


Event Hubs Capture (formerly Archive) is now Generally Available

Today we are announcing that Azure Event Hubs Capture, released in public preview in September 2016 as Azure Event Hubs Archive, is now generally available.

This capability adds an important dimension to Azure Event Hubs, which is a highly scalable data streaming platform and event ingestion service capable of receiving and processing millions of events per second. Event Hubs Capture makes it easy to send this data to persistent storage without using code or configuring other compute services. You can currently use it to push data directly from Event Hubs to Azure Storage as blobs. In the near future, we will also support Azure Data Lake Store. Other benefits include:

Simple setup: You can use either the Azure portal or an Azure Resource Manager template to configure Event Hubs to take advantage of Capture capability.

Reduced total cost of ownership: Event Hubs handles all the management, so there is minimal overhead involved in setting up and tracking your custom job processing mechanisms.

Integrated with your destination: Just choose your Azure Storage account, and soon Azure Data Lake Store, and Event Hubs Capture will automatically push the data into your repositories.

Near-real-time batch analytics: Event data is available within minutes of ingress into Event Hubs. This enables the most common near-real-time analytics scenarios without having to construct separate data pipelines.

Perfect on-ramp for Big Data: When the Capture feature is enabled, Event Hubs allows a single stream to support both real-time and batch-based pipelines, making it easy to compose your Big Data solutions with Event Hubs.

With the move to General Availability, beginning August 1, 2017, Event Hubs Capture will be charged at $0.10 per hour. For detailed pricing, please refer to Event Hubs pricing.

Next steps

We have a few additional resources that can jump-start your usage of Event Hubs. After learning more about this new feature and Event Hubs more generally, you can explore how to use templates to enable the Capture capability on your Event Hub. Lastly, we hope you’ll let us know what you think about new sinks and new serialization formats.

If you have any questions or suggestions, leave us a comment below.

Text Analytics API now supports analyzing sentiment in 16 languages

You can now analyze the sentiment of your text in 12 new languages. With this release, you can get a more complete view of your customers’ voice, with an understanding of how they feel about your product or service, an international event, or a news topic.

Sentiment analysis is now supported in Danish, Dutch, English, Finnish, French, German, Greek, Italian, Japanese, Norwegian, Polish, Portuguese, Russian, Spanish, Swedish, and Turkish. For details on the languages supported across all the capabilities, see the Text Analytics API documentation.
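To show what a multilingual request looks like, here is a minimal Python sketch that builds a sentiment request body. The documents array with id/language/text fields follows the shape documented for the Text Analytics API, while the sample texts and ids here are made up:

```python
import json

# Build a hypothetical sentiment request for the Text Analytics API.
# Each document carries an id, an ISO language code, and the text itself.
payload = {
    "documents": [
        {"id": "1", "language": "fi", "text": "Tämä tuote on erinomainen!"},
        {"id": "2", "language": "el", "text": "Η εξυπηρέτηση ήταν απογοητευτική."},
    ]
}

# The serialized body is what you would POST to the sentiment endpoint.
body = json.dumps(payload, ensure_ascii=False)
print(body)
```

Posting this body to the sentiment endpoint (with your subscription key in the Ocp-Apim-Subscription-Key header) returns a sentiment score per document; see the Text Analytics API documentation for the exact endpoint URL.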

Text Analytics is easy to get started with; try it yourself through the demo experience. Customers around the world are already using these capabilities, and it’s incredibly easy to do so. One such way is through Microsoft Flow, where you can start analyzing tweets and visualizing the analytics in Power BI with a few clicks. Check out the Flow template to see this in action.

The Text Analytics API is one of Microsoft’s Cognitive Services, which let you build apps with powerful algorithms using just a few lines of code. You can get started for free with a trial account today.

Creating enterprise grade BI models with Azure Analysis Services

In April we announced the general availability of Azure Analysis Services, which evolved from the proven analytics engine in Microsoft SQL Server Analysis Services. The success of any modern data-driven organization requires that information is available at the fingertips of every business user, not just IT professionals and data scientists, to guide their day-to-day decisions. Self-service BI tools have made huge strides in making data accessible to business users. However, most business users don’t have the expertise or desire to do the heavy lifting that is typically required, including finding the right sources of data, importing the raw data, transforming it into the right shape, and adding business logic and metrics, before they can explore the data to derive insights. With Azure Analysis Services, a BI professional can create a semantic model over the raw data and share it with business users so that all they need to do is connect to the model from any BI tool and immediately explore the data and gain insights. Azure Analysis Services uses a highly optimized in-memory engine to provide responses to user queries at the speed of thought.

In this video, Christian Wade demonstrates how you can leverage Azure Analysis Services to build enterprise-grade BI models. You will learn how to import Power BI Desktop files using the new web designer (coming soon) and how to use other tools such as SQL Server Data Tools (SSDT) and BISM Normalizer.

Learn more about Azure Analysis Services.