Announcing the public preview of PowerShell in Azure Cloud Shell

A few months ago, we started the journey to bring the PowerShell experience into Azure Cloud Shell. Today, that experience enters public preview alongside Bash in Azure Cloud Shell. With the addition of PowerShell in Cloud Shell, you now have the flexibility to choose the shell experience that works best for you.

Features of PowerShell in Cloud Shell

The PowerShell experience builds on the benefits of Azure Cloud Shell, such as:

Authenticated shell access to Azure from virtually anywhere.
Common tools and programming languages in a shell that’s updated and maintained by Microsoft.
Persist your files across sessions in attached Azure File storage.

The PowerShell experience adds:

Azure drive (Azure:) to discover and navigate all Azure resources as you would a file system. Azure drive also provides contextual capabilities such as:

Resource group scoping for Azure PowerShell cmdlets when you are within the context of a resource group path in the Azure drive (Azure:).
Context-sensitive command discovery using Get-AzureRmCommand, which lists only the commands applicable to items under the current path in the Azure drive (Azure:).

Rich PowerShell script editing using vim, which provides built-in syntax highlighting and IntelliSense for PowerShell files.
An extensible model for adding new commands (via modules and scripts) from the PowerShell Gallery, which are automatically persisted across your Cloud Shell sessions.
Interaction with virtual machines using PowerShell remoting, enabling management of guest VMs.
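As a sketch of what this looks like in practice (the subscription and resource group names below are hypothetical placeholders; Get-AzureRmCommand is the context-sensitive discovery cmdlet mentioned above):

```powershell
# Switch to the Azure drive and list your subscriptions
cd Azure:
dir

# Navigate into a subscription and resource group as if they were folders
# ("MySubscription" and "MyResourceGroup" are placeholder names)
cd ./MySubscription/ResourceGroups/MyResourceGroup

# Azure PowerShell cmdlets run here are scoped to MyResourceGroup.
# List only the commands applicable to items under the current path:
Get-AzureRmCommand
```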

Find more details about the features and tools incorporated into the PowerShell experience in Cloud Shell.

Azure PowerShell integration

To provide a streamlined Azure PowerShell experience, Cloud Shell:

Automatically authenticates access to all your account's subscriptions for Azure PowerShell.
Maintains the version of the Azure PowerShell modules, providing the latest and greatest experience in every Cloud Shell session.

Whether you're an experienced Azure user or new to the platform, Cloud Shell offers low-friction access to learn and use Azure PowerShell commands. Using Cloud Shell, you can easily automate and manage resources at scale from the comfort of the Azure portal.

Azure documentation integration

Azure PowerShell documentation is now fully interactive with the addition of the PowerShell Try It button. This enables an immersive learning experience for Azure PowerShell scenarios and samples. The integrated PowerShell environment in the Azure documentation uses the same PowerShell in Cloud Shell experience that is available from the Azure portal.

Try this experience today in the Azure PowerShell tutorials.

Azure mobile app integration

PowerShell in Azure Cloud Shell is also available on the Azure mobile app enabling you to take this experience with you, wherever you go. Saving in-progress work across devices is where it starts to get interesting. With the power of the Azure mobile app, you have access to any script in your CloudDrive, from virtually anywhere.

Try it today

Launch Cloud Shell from the top navigation bar of the Azure portal and select the PowerShell option from the shell drop-down list. Learn more about Azure Cloud Shell.

Thank you to our private preview users who helped shape the current experience by providing valuable feedback via issues and feature requests. We encourage you to continue your support by sharing your thoughts, experience, and input through Azure Cloud Shell UserVoice.
Source: Azure

Built-in security and operations management for Azure and hybrid environments

The growth of cloud infrastructure usage has been tremendous in the last couple of years. In my conversations with customers, many are looking for technologies to help with cloud security and cloud management. More customers are asking for management that is rooted in the cloud and designed for the new cloud paradigm. At Microsoft, we are your trusted partner for the enterprise today and in the future, and we are in a unique position: we both build a cloud platform and have a long history of delivering management and security services.

With Azure we are blurring the lines between the traditional categories of platform and management as we deliver an open cloud platform that has built-in security and operations management – and can still meet the needs of our largest enterprise customers. Our customers benefit from this approach with a simpler experience across the full security and operations management lifecycle. We also recognize the importance of building tools that manage and secure not just Azure but also your traditional workloads, and that’s why we are focused on delivering hybrid capabilities.

Today I’m excited to announce several new services and features across these areas:

Azure Cost Management by Cloudyn available for free. Azure Cost Management helps organizations manage and optimize cloud spend across Azure, AWS and Google Cloud Platform. Cost management has been one of the most popular requests from our customers and I’m excited to announce that it is now available for free to Azure customers and partners to manage Azure spend. Learn more about Azure Cost Management by Cloudyn.
Azure Security Center protection for hybrid workloads. Azure Security Center helps you protect workloads running in Azure from cyber threats and can now also be used to secure workloads running on-premises and in other clouds. Today we are releasing new capabilities to better detect and defend against advanced threats, automate and orchestrate security workflows, and streamline investigation of threats. Learn more about Azure Security Center updates.
Integration of management into the virtual machine experience in the Azure portal. This new experience simplifies the process of adding backup, site recovery, monitoring, update management and more to your existing virtual machines.
Update management, configuration management & change tracking included at no cost for Azure customers to help you manage missing updates and track configuration changes efficiently across Windows and Linux virtual machines in Azure, and across your hybrid environments. Python support has been added to the Automation service in addition to the existing PowerShell & Graphical authoring capabilities to make it easier to automate both Windows and Linux environments. Learn more about Azure Automation and configuration updates.
End to end monitoring from the application to the infrastructure. The new Azure monitor user experience centralizes the monitoring services together so that you can get visibility across infrastructure and applications. In addition, we have significantly optimized your experience for Azure Log Analytics, as well as with metrics exploration, application performance monitoring, and failure diagnostics in Application Insights. We have also integrated Azure alerts with IT Service Management tools and released new solutions for Container Monitoring. Learn more about Azure Monitoring updates.
Azure Policy to help you deliver governance and compliance. The new Azure Policy service, now in limited preview, helps you establish standards, guardrails, and continually monitor compliance to deliver enterprise-wide governance. Azure policies can be applied over your Azure resources, from a single subscription to a management group with control across your entire organization. Sign up for the Azure Policy limited preview.
PowerShell support in Azure Cloud Shell complements Bash as another authenticated, browser-based shell tool to streamline your Azure management experience. Learn more about PowerShell in Azure Cloud Shell.

The importance of securing and managing your cloud workloads

In this world where customers expect to do business with you 24×7 and threats are only getting more sophisticated, we recommend that at a minimum you turn on security, backup, and monitoring for your virtual machines. The Azure platform is designed to reduce your security and operations management burden by taking on the work of building, maintaining, and securing the datacenters, but as a customer you can partner with us to ensure that your Azure resources are secure and well-managed, with the right security and compliance controls in place.

I hope you will join me at Microsoft Ignite, either in person or virtually, to see these new features and updates in action. I’m excited to hear from you on how you are securing and managing your resources in the cloud and encourage you to continue sending us feedback. You can create a free account to get started exploring Azure security and operations management today.
Source: Azure

At Ignite, Microsoft is updating its Cognitive Services collection of intelligent APIs

Microsoft Cognitive Services enables developers to augment the next generation of applications with the ability to see, hear, speak, understand, and interpret needs using natural methods of communication. Think about the possibilities: being able to add vision and speech recognition, emotion and sentiment detection, language understanding, and search, to applications without having any data science expertise.

Today, we are excited to announce several service updates:

Text Analytics API is now generally available. Text Analytics is a cloud-based service that provides advanced natural language processing over raw text. It includes API functions such as sentiment analysis, key phrase extraction and language detection.
Bing Custom Search API will be generally available in October. Bing Custom Search lets you create a highly-customized targeted web search experience to deliver more relevant results from your targeted web space through a commercial grade service.
Bing Search APIs v7 will be generally available in October. Allowing you to bring the immense knowledge of the planet to your applications, the v7 update provides several improvements, such as faster results and improved performance for queries on the Bing Web Search API. New sorting and filtering options make it easier to find relevant results in news trending topics and image searches. Better error messages make it easier to troubleshoot and diagnose problem queries, and updated, modernized documentation makes it easy to bring the power of the Bing Search APIs to your applications.
We plan to make Language Understanding Intelligent Service and Microsoft Bot Framework, which contains everything you need to build and connect intelligent bots, generally available later this year.
We’re also adding new capabilities to our services:

The QnA Maker preview API now enables you to build, train, and publish a simple question-and-answer bot from product manuals.
We’re expanding the Face API, Computer Vision API, and Content Moderator to 7 additional regions: South Central US, West US 2, East US, Brazil, North Europe, Australia East, and East Asia.

Creating a highly targeted search for your users

As mentioned, we’re excited to announce that Bing Custom Search will be generally available in October!
With Bing Custom Search, you can create a highly-customized targeted web search experience, to deliver more relevant results from targeted web space through a commercial grade service.

Featuring a straightforward User Interface, Bing Custom Search enables you to create your own web search engine without a line of code. You can specify the slices of the web you want to draw from – or let cutting-edge AI technology help to identify them. Businesses of any size, hobbyists and entrepreneurs can design and deploy web search applications for any possible scenario.

For example, Amicus has recently released a platform that changes the way charitable aid is funded and delivered, showing donors where every dollar is spent and giving non-profits real-time tools to report and measure performance. This allows donors to fund ‘projects’ instead of blindly giving money to an organization, and non-profits to build project requests based on measurable outcomes. This transition presented a unique challenge: how to enable donors to research and learn about the projects and activities performed by non-profits? Amicus needed to help donors Learn, Find, and Fund projects that were of interest and relevant to them, a task that is difficult with traditional search engines.

With Bing Custom Search, part of Microsoft Cognitive Services, Amicus has been able to identify its own set of relevant web pages in advance: when users have a single concept of interest (like ‘water’, ‘education’ or ‘India’), Bing Custom Search is able to deliver highly relevant results in the context of global aid.

“This is exactly what our audience needs in order to learn about a broader range of important work performed by relief organizations, beyond those the donors currently know about. Bing Custom Search, part of Microsoft Cognitive Services, delivers a ‘Learn and Find’ experience in ways never before possible,” says Beth Katz, Chief Product Officer at Amicus.

Get Started with Bing Custom Search

To easily get started with Bing Custom Search, you can look at the service page, refer to the documentation and start building a great experience with the quick start guide.

Bringing text to life in your application

We are excited to announce that Text Analytics is now generally available in the Azure Portal, and now available in four additional regions: South Central US, East US, West Europe, and Southeast Asia.

Text Analytics API is a cloud-based service that provides advanced natural language processing over raw text. Text Analytics API has three main functions: sentiment analysis, key phrase extraction, and language detection.

Sentiment Analysis – Find out what customers think of your brand or topic by analyzing raw text for clues about positive or negative sentiment. This API returns a sentiment score between 0 and 1 for each document, where 1 is the most positive. Our models are pretrained using an extensive body of text and natural language technologies from Microsoft. 
Key Phrase Extraction – Automatically extract key phrases to quickly identify the main points. For example, for the input text ‘The food was delicious and there were wonderful staff’, the service returns the main talking points: ‘food’ and ‘wonderful staff’. 
Language Detection – For up to 120 languages, detect which language the input text is written in and report a single language code for every document submitted on the request.

Some customer scenarios where Text Analytics is used include customer feedback analytics, enriching search scenarios, and use in conjunction with LUIS and the Bot Framework (analyzing the sentiment of a conversation over time).

More and more customers are using Text Analytics API: Brainshark is a cloud-based sales training and readiness platform that helps sales people achieve mastery in the presentation of sales materials to clients, slashing the costs and resources needed for training and maximizing the effectiveness of sales engagements.

Brainshark is now creating a training platform that allows sales representatives to perfect their pitch through video and Cognitive Services. Utilizing Face API, Emotion API, and Text Analytics, it’s possible to analyze their pitch, and feed a Machine Learning model to provide feedback on their performance.

“Now, companies are simply pushing sales people into the field and they’re learning through experience— a ridiculously expensive way to train. Every deal lost due to lack of confidence costs the company real money. If we can minimize that and actually get sales people ready to sell, it’ll have a huge impact on productivity,” says Jim Ninivaggi, Senior Vice President, Business Development at Brainshark.

Get Started with Text Analytics API

One of the best ways to get started with the Text Analytics API is to look at our Quick Start guides. Here is a snippet from the C# quickstart that shows how to consume the API using the C# SDK. We also have quickstarts in Java, Node.js, Python, Ruby, and PHP.

Let’s say that I want to be able to explore the most important phrases, sentiment and language from feedback I receive from my customers with Text Analytics API.

// EXTRACTING LANGUAGE
LanguageBatchResult result = client.DetectLanguage(
    new BatchInput(
        new List<Input>()
        {
            new Input("1", "This is a document written in English."),
            new Input("2", "Este es un documento escrito en Español."),
            new Input("3", "这是一个用中文写的文件")
        }));

// Printing language results.
foreach (var document in result.Documents)
{
    Console.WriteLine("Document ID: {0}, Language: {1}",
        document.Id, document.DetectedLanguages[0].Name);
}

// GETTING KEY PHRASES
KeyPhraseBatchResult result2 = client.KeyPhrases(
    new MultiLanguageBatchInput(
        new List<MultiLanguageInput>()
        {
            new MultiLanguageInput("ja", "1", "猫は幸せ"),
            new MultiLanguageInput("de", "2",
                "Fahrt nach Stuttgart und dann zum Hotel zu Fuß."),
            new MultiLanguageInput("en", "3", "My cat is stiff as a rock."),
            new MultiLanguageInput("es", "4", "A mi me encanta el fútbol!")
        }));

// Printing key phrases
foreach (var document in result2.Documents)
{
    Console.WriteLine("Document ID: {0}", document.Id);
    Console.WriteLine("\t Key phrases:");
    foreach (string keyphrase in document.KeyPhrases)
    {
        Console.WriteLine("\t\t" + keyphrase);
    }
}

// SENTIMENT ANALYSIS
SentimentBatchResult result3 = client.Sentiment(
    new MultiLanguageBatchInput(
        new List<MultiLanguageInput>()
        {
            new MultiLanguageInput("en", "0", "I had the best day of my life."),
            new MultiLanguageInput("en", "1",
                "This was a waste of my time. The speaker put me to sleep."),
            new MultiLanguageInput("es", "2",
                "No tengo dinero ni nada que dar…"),
            new MultiLanguageInput("it", "3",
                "L'hotel veneziano era meraviglioso. È un bellissimo pezzo di architettura.")
        }));

// Printing sentiment results
foreach (var document in result3.Documents)
{
    Console.WriteLine("Document ID: {0}, Sentiment Score: {1:0.00}",
        document.Id, document.Score);
}

Don’t hesitate to refer to the API definitions for technical documentation for the APIs.

We also developed a tutorial that integrates Text Analytics into Power BI to extract the most important phrases and sentiment from customer feedback. You’ll see how we use a custom Power Query function and create a word cloud from these phrases.

Happy coding!
 
The Microsoft Cognitive Services Team
Source: Azure

Introducing Azure Availability Zones for resiliency and high availability

As part of our commitment to providing customers with a platform for their most demanding, mission-critical workloads, I’m excited to announce expanded capabilities for Microsoft’s global cloud infrastructure.

Starting today, customers can begin using Azure Availability Zones in preview to build highly available applications. Availability Zones increase Azure’s resiliency capabilities and broaden options for customers to choose the business continuity solution that is right for their organization. We've also designed Availability Zones to give customers great confidence in delivering services, backed by an industry-leading, financially backed 99.99% virtual machine uptime SLA when generally available.

Availability Zones are fault-isolated locations within an Azure region, providing redundant power, cooling, and networking. Availability Zones allow customers to run mission-critical applications with higher availability and fault tolerance to datacenter failures. With 42 announced regions worldwide (more than any other cloud provider) and backed by one of the largest networks on the planet, Azure offers the scale needed to bring applications closer to users and customers around the world. Availability Zones are now in preview in two regions, East US 2 in Virginia and West Europe in the Netherlands, with plans to offer preview to additional regions in the US, Europe, and Asia before the end of the year including our new France Central region in Paris.

With Azure’s geographic expansion, we invest in providing the best cloud experience possible including expanding and upgrading our global network. Today also marks the completion of the MAREA transatlantic subsea cable, the latest addition to our global network. MAREA is a joint project between Microsoft, Facebook and Telxius, and represents our latest infrastructure initiative to support customer demand and service innovation across the globe. MAREA is the highest-capacity cable to ever cross the Atlantic, and the first to connect Virginia and Spain. It will help support the growing demand for high speed, reliable connections to the U.S. and Europe, including our newest Azure regions coming to France, and beyond.

To learn more about Availability Zones and sign up for the Preview, visit: http://aka.ms/az.

Follow these links to find out more about the innovation in Microsoft’s global network and the MAREA transatlantic subsea cable.
Source: Azure

On-premises Azure Files access on Linux update and new troubleshooter

In April, we announced the availability of Linux on-premises and cross-region mounting of Azure Files for the first OS distribution, Ubuntu 17.04. Today, we are happy to share that more Linux distributions now include this functionality, allowing on-premises and cross-region mounting of Azure Files.

For security reasons, connections to Azure Files shares are blocked if the communication channel isn’t encrypted and the connection attempt is not made from the same datacenter where the Azure file share resides. Channel encryption is not available if the client OS doesn’t support SMB encryption, and until recently Linux did not. We collaborated with the Linux community to implement SMB encryption in the Linux CIFS kernel client; encryption support was introduced in the 4.11 kernel.

This functionality has already been backported to the following Linux distributions:

Ubuntu from 16.04 and above
SUSE Linux Enterprise Server 12 SP3 and above
CoreOS Stable
Debian 9 with jessie-backports kernel

This functionality is coming soon on the following Linux distributions:

CentOS 7.5 (mid-2018)
Next release of RHEL (one after 7.4)

This means that with the above versions you can share data access across VMs living in different environments, whether on-premises, in other Azure regions, or even in other clouds. You can point them at Azure Files and get an SMB/CIFS share mounted on your Linux VM, or mount the same share on Windows and share data across platforms.
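For illustration, mounting an Azure file share from an on-premises Linux machine on one of the kernels above looks roughly like this (the account, share, and key values are placeholders; vers=3.0 requests SMB 3.0, which carries the channel encryption discussed earlier):

```sh
# Install the CIFS utilities (package name shown for Ubuntu/Debian)
sudo apt-get install cifs-utils

# Mount the share over SMB 3.0 with encryption
# <account>, <share>, and <storage-account-key> are placeholders
sudo mkdir -p /mnt/azurefiles
sudo mount -t cifs //<account>.file.core.windows.net/<share> /mnt/azurefiles \
  -o vers=3.0,username=<account>,password=<storage-account-key>,dir_mode=0777,file_mode=0777,serverino
```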

New getting started and diagnostic tool for Linux

We are also making it easy for you to check if your Linux environment is correctly set up, and has all the pre-requisites to use SMB by releasing a brand new troubleshooter tool for Linux, similar to the Windows troubleshooter that we announced in June.

We published a troubleshooting guide for Linux some time ago. Today, we are releasing the AzFileDiagnostics troubleshooter tool so you can rest assured that your environment is correctly set up.

You can download AzFileDiagnostics from Script Center today and simply run the following commands. The tool will tell you what needs to be done to mount your file share, and will then mount the file share for you if you wish.

chmod +x AzFileDiagnostics.sh
./AzFileDiagnostics.sh

Summary and next steps

Azure is a first-class platform for Linux and open-source technology. Microsoft has been constantly contributing to the Linux open-source community to provide the finest experience for our Azure Linux customers, and we look forward to continuing to do so.

We are excited to see the tremendous adoption of Azure File Storage. You can try Azure File storage by getting started in under five minutes. Further information and detailed documentation links are provided below.

Use Azure File on Linux
Azure Files Storage: a frictionless cloud SMB file system for Windows and Linux
Inside Azure File Storage

We will continue to enhance the Azure File Storage based on your feedback. If you have any comments, requests, or issues, you can use the following channels to reach out to us:

Stack Overflow
MSDN
User Voice

Source: Azure

New offers in Azure Marketplace

Last month 31 great new cloud offerings were published to Azure Marketplace. Check ‘em out!

Cloudbreak for Hortonworks Data Platform: Cloudbreak for Hortonworks Data Platform simplifies the provisioning, management, and monitoring of HDP clusters in cloud environments.

CloudLanes Cloud Backup Accelerator & VTL Gateway: Superfast backups. Send data to the cloud at a much faster rate. Cloud backups and archives require your users to upload and download files and information to and from the cloud.

Workspot Cloud Desktops: VDI and DaaS on Azure: VDI is an infinitely scalable cloud service. Deploy thousands of Windows 10 desktops in a day.

Forscene Edgeserver: Forscene from Forbidden is professional video editing software for collaborative productions and remote workflows.

CloudLanes Cloud Video Accelerator & NFS Storage: CloudLanes Cloud Video Accelerator allows you to rapidly and securely ingest and move your on premise videos to any cloud, leveraging its superior economics and rich set of features to manage massive video archives.

Hanu Managed Services: Our offerings are categorized as standard and on-demand services to cater to the custom needs of every customer.

Optimiz Centos Linux Web Server: A ready-made, optimized CentOS and PHP 7 image for faster web serving – OS CentOS 7 – MariaDB – PHP 7 – PageSpeed.

RapidMiner Server 7.5: RapidMiner Server makes it easy to share, reuse, and operationalize the models and results created in RM Studio.

RapidMiner Server 7.6: Real data science, fast and simple. RapidMiner Server makes it easy to share, reuse, and operationalize the models and results created in RM Studio.

RSA NetWitness Broker 10.6.4: The RSA NetWitness Broker aggregates data captured by other devices and event sources. Brokers aggregate data from configured concentrators.

RSA NetWitness Concentrator 10.6.4: The RSA NetWitness Concentrator indexes metadata extracted from network or log data, and makes it available for enterprise-wide querying and real-time analytics while also facilitating reporting and alerting.

RSA NetWitness Event Stream Analysis 10.6.4: The RSA NetWitness ESA host provides advanced stream analytics such as correlation and complex event processing at high throughputs and low latency.

RSA NetWitness Log Decoder 10.6.4: The RSA NetWitness Log Decoder collects log events from hundreds of devices and event sources, and offers the same extensive features and functionality as its physical version.

RSA NetWitness Virtual Log Collector 10.6.4: The RSA NetWitness VLC is a host that will collect logs from currently supported event sources and protocols.

RSA NetWitness Archiver 10.6.4: The RSA NetWitness Archiver is a host that enables long-term log archiving by indexing and compressing log data and sending it to archiving storage.

SQL Beacon: SQL Beacon monitors SQL Servers across an entire organization from a single instance, allowing DBAs to totally forget about best practice checks they often don’t have time to carry out.

Delphix Dynamic Data Platform for Azure: The Delphix Dynamic Data Platform allows data to be securely delivered to every stakeholder, across cloud, hybrid, and on-premises environments, at the speed required to enable rapid development and delivery of applications and solutions.

OmniOS Community Edition: OmniOS Community Edition Association (OmniOSce) is a Swiss association, dedicated to the continued support and release of OmniOS for the benefit of all parties involved.

CIS Windows Server 2008 R2 Benchmark v3.0.1.5 – L1: This instance of Microsoft Windows Server 2008 R2 is hardened according to a CIS Benchmark.

CIS Windows Server 2008 R2 Benchmark v3.0.1.5 – L2:  This instance of Microsoft Windows Server 2008 R2 is hardened according to a CIS Benchmark. Launching an image hardened according to the trusted security configuration baselines prescribed by CIS will reduce cost, time, and risk to your organization. 

CIS Windows Server 2012 Benchmark v2.0.1.5 – L1: This instance of Microsoft Windows Server 2012 is hardened according to a CIS Benchmark. Launching an image hardened according to the trusted security configuration baselines prescribed by CIS will reduce cost, time, and risk to your organization.

CIS Windows Server 2012 Benchmark v2.0.1.5 – L2: This instance of Microsoft Windows Server 2012 is hardened according to a CIS Benchmark. Launching an image hardened according to the trusted security configuration baselines prescribed by CIS will reduce cost, time, and risk to your organization.

CIS Windows Server 2016 Benchmark v1.0.0.4 – L1: This instance of Microsoft Windows Server 2016 is hardened according to a CIS Benchmark. Launching an image hardened according to the trusted security configuration baselines prescribed by CIS will reduce cost, time, and risk to your organization.

Pilot Things IoT VPN: Build a worldwide IoT VPN in a few clicks, thus reducing complexity. 

FlashGrid Node for Oracle RAC (based on OL 7): The VM image contains FlashGrid software and can be used for creating Oracle Real Application Clusters nodes.

CERTIFY: CERTIFY's smart tools identify the standards requirements applicable to a medical device and automatically create the corresponding risks and technical requirements.

Solutions templates

Kyligence Analytics Platform 2.3: Kyligence Analytics Platform (KAP) is an enterprise-ready big data product, powered by Apache Kylin and Apache Hadoop.

Parity Ethereum Failover Kovan: This Parity instance is to be used by the Kovan network authorities to ensure high availability.

Versio™ Platform: Versio™ is an advanced, fully IP-enabled, integrated playout-in-the-cloud platform that is 100% software running on the Azure cloud.

Cloudbreak for Hortonworks Data Platform: Simplifies the provisioning, management, and monitoring of HDP clusters in cloud environments.

Veritas NetBackup™ 8.0: Provides unified data protection for enterprises with large-scale, complex, and heterogeneous multi-cloud environments that want to take their business further on the Microsoft Azure cloud platform.

– The Azure Marketplace Team
Source: Azure

Route IoT device messages to Azure Storage with Azure IoT Hub

Azure Storage containers are the newest custom endpoint type available with IoT Hub, joining Service Bus queues, topics, and Event Hubs as supported custom endpoint types for IoT Hub message routing. Storage was the most requested endpoint type for message routing because it makes it simple to build a cold-path analytics pipeline. The best part: it’s available everywhere.

Just as a reminder, IoT Hub can write messages to multiple endpoints, so with this feature improvement customers can both send the message through a hot-path analytics pipeline, as well as push the message to storage for cold-path analytics or long-term archival. Cold-path analytics are used to process data which requires more complex processing than simple windowing or thresholding, and cold-path analytics often uses data from devices over a longer period of time. With this feature update, you can set up your hot- and cold-path analytics easily in IoT Hub routes.

With an Azure Storage container as a custom endpoint, IoT Hub will write messages to a blob based on the batch frequency and block size specified by the customer. After either the batch size or the batch frequency is hit, whichever happens first, IoT Hub writes the enqueued messages to the storage container as a blob. You can also specify the naming convention you want to use for your blobs.
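For example, the default blob path in the IoT Hub documentation follows a pattern along these lines (treat the exact tokens as an illustration; they are configurable in the endpoint settings):

```
{iothub}/{partition}/{YYYY}/{MM}/{DD}/{HH}/{mm}
```

Under this scheme, a batch written from partition 0 of a hub named myhub would land in a blob whose path encodes the hub name, partition, and enqueue time.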

This feature was brought to you in part by the outpouring of feedback we received requesting the ability to route messages based on message body, and I want to send a huge THANK YOU to everyone who requested this functionality. As always, please continue to submit your suggestions through the Azure IoT User Voice forum or join the Azure IoT Advisors Yammer group.
Source: Azure

Azure Analysis Services now available in Azure Government

We are pleased to announce the general availability of Azure Analysis Services in the Microsoft Cloud for Government. Based on the proven analytics engine in SQL Server Analysis Services, Azure Analysis Services is an enterprise-grade OLAP engine and BI modeling platform, offered as a fully managed platform-as-a-service (PaaS). Azure Analysis Services enables developers and BI professionals to create BI Semantic Models that can power highly interactive and rich analytical experiences in BI tools and custom applications.

BI professionals can build and manage enterprise-scale data models with SQL Server Data Tools, Visual Studio, and SQL Server Management Studio. Users can easily connect to Azure Analysis Services with powerful data visualization tools such as Power BI and Excel. In addition, third-party BI tools, such as Tableau, are also supported.

BI professionals can use Power Query to import data from a variety of data sources including Azure SQL Database, Azure SQL Data Warehouse, and HDInsight, or on-premises data sources such as Microsoft SQL Server, Oracle, and Teradata. With support for large data models, Azure Analysis Services offers a robust approach to manage the security of the data model including Azure Active Directory identity management and row- and column-level security. Please use the following resources to learn more about Azure Analysis Services, get your questions answered, and give us feedback and suggestions about the product.

Overview
Documentation
Azure regions
MSDN forum
Ideas & suggestions

Join us at Microsoft Ignite from September 24 – 29, 2017, or at the SQL PASS Summit 2017 from October 31 – November 3, 2017 where you can hear directly from our engineers and product managers.
Source: Azure

September updates to the Azure Analysis Services web designer

In July, we released the Azure Analysis Services web designer. This browser-based experience allows developers to start creating and managing Azure Analysis Services (AAS) semantic models quickly and easily. While SQL Server Data Tools (SSDT) and SQL Server Management Studio (SSMS) are still the primary tools for development, this new experience is intended to make modeling fast and easy. It is great for getting started on a new model or for quick tasks such as adding a new measure to an existing model.

Today we are announcing the release of the September update which brings along with it some new features as well as several bug fixes. New features include:

Improved measure editing

We have redesigned the measure editor to allow you to make changes to multiple measures and then save them all in one transaction, instead of saving them one at a time.

Bulk renaming

Often, when you start creating a model, your table and column names match those in the underlying database and are not always user friendly. Now you can select all the columns and tables that you wish to rename and then select “edit multiple selection” under Name in the Properties pane.

This brings up the bulk rename dialog, where you can rename all the columns and save the changes in one transaction.

Auto arrange tables

Clicking “Arrange All” at the bottom of the table list arranges all the tables in the diagram at once, rather than one at a time. The layout can then be saved for future use.

You can try the Azure Analysis Services web designer today by opening it from a server in the Azure portal.

Submit your own ideas for features on our feedback forum. Learn more about Azure Analysis Services and the Azure Analysis Services web designer.
Source: Azure

Azure’s IoT Hub global expansion: Newly available in 8 regions

As part of Microsoft’s mission to enable more customers and organizations worldwide to achieve more, Azure IoT Hub is expanding to 4 countries across 3 continents, with availability now in UK South, UK West, Canada Central, Canada East, India Central, India South, East US 2, and Central US. These new regions give you more options for implementing IoT solutions in the geographic locations that work best for your mission, passions, creative aspirations, and business!
 
Azure IoT Hub is a fully managed service that enables reliable and secure bidirectional communications between millions of IoT devices and a solution back end.

Azure IoT Hub provides you with:

Secure communications by using per-device security credentials and access control
Multiple device-to-cloud and cloud-to-device hyper-scale communication options
Queryable storage of per-device state information and metadata
Easy device connectivity with device libraries for the most popular languages and platforms
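As an illustration of the per-device security credentials mentioned above, here is a hedged sketch of generating a device shared access signature (SAS) token following the publicly documented IoT Hub token format (`SharedAccessSignature sr={resource}&sig={signature}&se={expiry}`). The hub name, device name, and key below are made-up placeholder values.

```python
import base64
import hashlib
import hmac
import time
from urllib.parse import quote_plus

def generate_sas_token(resource_uri: str, device_key_b64: str,
                       ttl_seconds: int = 3600) -> str:
    """Build an IoT Hub-style device SAS token.

    The string to sign is the URL-encoded resource URI, a newline, and the
    expiry timestamp; it is signed with HMAC-SHA256 using the base64-decoded
    device key.
    """
    expiry = int(time.time()) + ttl_seconds
    sign_target = f"{quote_plus(resource_uri)}\n{expiry}"
    key = base64.b64decode(device_key_b64)
    signature = base64.b64encode(
        hmac.new(key, sign_target.encode("utf-8"), hashlib.sha256).digest()
    ).decode("utf-8")
    return (
        "SharedAccessSignature "
        f"sr={quote_plus(resource_uri)}"
        f"&sig={quote_plus(signature)}"
        f"&se={expiry}"
    )

# Placeholder hub/device names for illustration:
token = generate_sas_token(
    "myhub.azure-devices.net/devices/mydevice",
    base64.b64encode(b"example-device-key").decode(),
)
```

In practice the device libraries handle token generation for you; this sketch only shows the shape of the credential a device presents.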

IoT Hub is the bridge between your devices and your solution in the cloud, allowing you to store, analyze, and act on device data in real time.

To learn more, visit IoT Hub documentation.
Source: Azure