The Intelligent Data Lake

Advanced analytics and cognitive intelligence on petabyte-sized files and trillions of objects with Azure Data Lake

Today we are announcing the general availability of Azure Data Lake, ushering in a new era of productivity for your big data developers and scientists. Fundamentally different from today’s cluster-based solutions, the Azure Data Lake services enable you to securely store all your data centrally in a “no limits” data lake, and run on-demand analytics that instantly scales to your needs. Our state-of-the-art development environment and rich and extensible U-SQL language enable you to write, debug, and optimize massively parallel analytics programs in a fraction of the time of existing solutions.

Before Azure Data Lake

Traditional approaches to big data analytics constrain the productivity of your data developers and scientists due to time spent on infrastructure planning and on writing, debugging, and optimizing code with primitive tooling. They also lack rich built-in cognitive capabilities like key phrase extraction, sentiment analysis, image tagging, OCR, face detection, and emotion analysis. The underlying storage systems impose further challenges with artificial limits on file and account sizes, requiring you to build workarounds. Additionally, your developers either spend valuable time optimizing the system or you end up overpaying for unused cluster capacity. The friction in these existing systems is so high that it effectively prevents companies from realizing the business transformation that big data promises.

With Azure Data Lake

With thousands of customers, Azure Data Lake has become one of the fastest growing Azure services. You can get started on this new era of big data productivity and scalability with the general availability of Azure Data Lake Analytics and Azure Data Lake Store.

Azure Data Lake Store – the first cloud Data Lake for enterprises that is secure, massively scalable and built to the open HDFS standard.  With no limits to the size of data and the ability to run massively parallel analytics, you can now unlock value from all your unstructured, semi-structured and structured data.
Azure Data Lake Analytics – the first cloud analytics job service where you can easily develop and run massively parallel data transformation and processing programs in U-SQL, R, Python and .NET over petabytes of data. It has rich built-in cognitive capabilities such as image tagging, emotion detection, face detection, deriving meaning from text, and sentiment analysis with the ability to extend to any type of analytics. With Azure Data Lake Analytics, there is no infrastructure to manage, and you can process data on demand, scale instantly, and only pay per job.
Azure HDInsight – the only fully managed cloud Hadoop offering that provides optimized open source analytic clusters for Spark, Hive, MapReduce, HBase, Storm, Kafka and R Server, backed by a 99.9% SLA. Today, we are announcing the general availability of R Server for HDInsight to do advanced analytics and predictive modeling with R+Spark. Further, we are introducing the public preview of Kafka for HDInsight, the first managed cluster solution in the cloud for real-time ingestion with Kafka.

Data Lake Store – A No Limits Data Lake that powers Big Data Analytics

Petabyte-sized files and trillions of objects

With Azure Data Lake Store, your organization can now securely capture and analyze all data in a central location with no artificial constraints to limit the growth of your business. It can manage trillions of files, where a single file can be greater than a petabyte in size – a file size 200x larger than other cloud object stores allow. Without the limits that constrain other cloud offerings, Data Lake Store is ideal for managing any type of data, including massive datasets like high-resolution video, genomic and seismic datasets, medical data, and data from a wide variety of industries. Data Lake Store is an enterprise data lake that can power your analytics today and in the future.

“DS-IQ provides Dynamic Shopper Intelligence by curating data from large amounts of non-relational sources like weather, health, traffic, and economic trends so that we can give our customers actionable insights to drive the most effective marketing and service communications. Azure Data Lake was perfect for us because it could scale elastically, on-demand, to petabytes of data within minutes. This scalability and performance has impressed us, giving us confidence that it can handle the amounts of data we need to process today and, in the future, enable us to provide even more valuable, dynamic, context-aware experiences for our clients.” -William Wu, Chief Technology Officer at DS-IQ

Scalability for massively parallel analytics

Data Lake Store provides massive throughput to run analytic jobs with thousands of concurrent executors that read and write hundreds of terabytes of data efficiently. You are no longer forced to redesign your application or repartition your data, because Data Lake Store scales throughput to support any size of workload. Multiple services like Data Lake Analytics, HDInsight, or HDFS-compliant applications can efficiently analyze the same data simultaneously.

“Ecolab has an ambitious mission to find solutions to some of the world’s biggest challenges – clean water, safe food, abundant energy and healthy environments. Azure Data Lake has been deployed to our water division where we are collecting real-time data from IoT devices so we can help our customers understand how they can reduce, reuse, and recycle water, and at the same time address one of the world’s most pressing sustainability issues. We’ve been impressed with Azure Data Lake because it allows us to store any amount of data we require and also lets us use our existing skills to analyze the data. Today, we have groups who use open source technologies such as Spark in HDInsight to do analytics and other groups that use U-SQL, leveraging the extensibility of C# with the simplicity of SQL.” -Kevin Doyle, VP of IT, Global Industrial Solutions at Ecolab

Data Lake Analytics – An On-Demand Analytics Job Service to power intelligent action

Start in seconds, scale instantly, pay per job

Our on-demand service will have your data developers and scientists processing big data jobs to power intelligent action in seconds. There is no infrastructure to worry about because there are no servers, VMs, or clusters to wait for, manage, or tune. Instantly apply or adjust the analytic units (processing power) from one to hundreds or even thousands for each job. Pay only for the processing used per job, freeing valuable developer time from the capacity planning and optimization required in cluster-based systems, which can take weeks to months.

“Azure Data Lake is instrumental because it helps Insightcentr ingest IoT-scale telemetry from PCs in real-time and gives detailed analytics to our customers without us spending millions of dollars building out big data clusters from scratch. We saw Azure Data Lake as the fastest and most scalable way to bring our customers these valuable insights for their business.” -Anthony Stevens, CEO Australian start-up Devicedesk

Develop massively parallel analytic programs with simplicity

U-SQL is an easy-to-use, highly expressive, and extensible language that allows you to write code once and automatically have it be parallelized for the scale you need. Instead of writing low-level code dealing with clusters, nodes, mappers, and reducers, etc., a developer writes a simple logical description of how data should be transformed for their business using both declarative and imperative techniques as desired. The U-SQL data processing system automatically parallelizes the code – enabling developers to control the amount of resources devoted to parallel computation with the simplicity of a slider. The U-SQL language is highly extensible and can reuse existing libraries written in a variety of languages like .NET languages, R, or Python. You can massively parallelize the code to process petabytes of data for diverse workload categories such as ETL, machine learning, feature engineering, image tagging, emotion detection, face detection, deriving meaning from text, and sentiment analysis.
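The model described above, in which a developer writes only the logical transformation and the engine decides how to spread it across executors, can be illustrated with a toy sketch (plain Python; the `run_parallel` helper, the sample rows, and the thread-based execution are illustrative stand-ins, not U-SQL internals):

```python
from concurrent.futures import ThreadPoolExecutor

def run_parallel(rows, transform, parallelism):
    """Toy stand-in for an auto-parallelizing engine: the caller supplies only
    the logical per-row transform; partitioning and scheduling happen here."""
    # Split the input into one partition per worker (round-robin).
    chunks = [rows[i::parallelism] for i in range(parallelism)]
    with ThreadPoolExecutor(max_workers=parallelism) as pool:
        partial = pool.map(lambda chunk: [transform(r) for r in chunk], chunks)
    # Merge the partition outputs back into a single result set.
    return [row for chunk in partial for row in chunk]

# The "query": a purely logical description of the transformation,
# with no mention of clusters, nodes, mappers, or reducers.
uppercase_region = lambda row: {**row, "region": row["region"].upper()}

rows = [{"id": 1, "region": "west"}, {"id": 2, "region": "east"}]
result = run_parallel(rows, uppercase_region, parallelism=2)
```

Raising `parallelism` changes only how the work is scheduled, not what the transform says, which mirrors the slider-controlled resource model described above.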

“Azure Data Lake allows us to develop quickly on large sets of data with our current developer expertise. We have been able to leverage Azure Data Lake Analytics to capture and process large marketing audiences for our dynamic marketing platforms." -McPherson White, Director of Development at PureCars

Run Big Cognition at Petabyte Scale

Furthermore, we’ve incorporated the technology behind the Cognitive Services APIs directly inside U-SQL. Now you can process any amount of unstructured data, such as text and images, extract emotions, age, and all sorts of other cognitive features using Azure Data Lake, and perform query by content. You can join emotions from image content with any other type of data you have and do incredibly powerful analytics and intelligence over it. This is what we call ‘Big Cognition’. It’s not just extracting one piece of cognitive information at a time, not just understanding an emotion or whether there’s an object in an image; it’s about joining all the extracted cognitive data with other types of data, so you can do some really powerful analytics with it. We demonstrated this capability at Microsoft Ignite and PASS Summit with a Big Cognition demo in which we used U-SQL inside Azure Data Lake Analytics to process a million images and understand what’s inside them. You can watch this demo here and try it yourself using a sample project on GitHub.
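The joining idea can be sketched in miniature (plain Python with made-up records; a real pipeline would express this as a U-SQL job over Data Lake):

```python
# Hypothetical output of a cognitive extractor run over a batch of images.
image_emotions = [
    {"image": "store_cam_001.jpg", "store": "Seattle", "emotion": "happiness"},
    {"image": "store_cam_002.jpg", "store": "Portland", "emotion": "sadness"},
]

# Ordinary business data about the same stores (also made up).
store_sales = {"Seattle": 18200, "Portland": 9400}

# The "Big Cognition" step: join extracted cognitive features with other data.
joined = [dict(rec, daily_sales=store_sales[rec["store"]]) for rec in image_emotions]

# Analytics over the joined result: sales at stores where happiness was detected.
happy_sales = sum(r["daily_sales"] for r in joined if r["emotion"] == "happiness")
```

The point is the shape of the computation, not the toy data: cognitive extraction produces ordinary rows, so they can be joined and aggregated like any other dataset.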

Debug and optimize your Big Data programs with ease

With today’s tools, developers face serious challenges debugging distributed programs. Azure Data Lake makes debugging failures in distributed cloud programs as easy as debugging a program in your personal environment, using the powerful tools within Visual Studio. Developers no longer need to inspect thousands of logs on each machine searching for failures. When a U-SQL job fails, logs are automatically located, parsed, and filtered to the exact components involved in the failure, and made available as a visualization. Developers can even debug the specific parts of a failed U-SQL job on their own local workstation without wasting time and money resubmitting jobs to the cloud. Our service can detect and analyze common performance problems that big data developers encounter, such as imbalanced data partitioning, and offers suggestions to fix your programs using the intelligence we’ve gathered in the analysis of over a billion jobs in Microsoft’s data lake.

Developers do a lot of heavy lifting when optimizing big data systems and frequently overpay for unused cluster capacity. Developers must manually optimize their data transformations, requiring them to carefully investigate how their data is transformed step-by-step, often manually ordering steps to gain improvements. Understanding performance and scale bottlenecks is challenging and requires distributed computing and infrastructure experts. For example, to improve performance, developers must carefully account for the time & cost of data movement across a cluster and rewrite their queries or repartition their data. Data Lake’s execution environment actively analyzes your programs as they run and offers recommendations to improve performance and reduce cost. For example, if you requested 1000 AUs for your program and only 50 AUs were needed, the system would recommend that you only use 50 AUs resulting in a 20x cost savings.
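The closing recommendation is simple arithmetic, sketched here with a hypothetical per-AU-hour rate (the function and rate are illustrative, not Azure's actual billing formula, and the comparison assumes the job's runtime is unchanged):

```python
def job_cost(analytics_units, hours, rate_per_au_hour):
    """Cost of a job billed per analytic-unit-hour (the rate is hypothetical)."""
    return analytics_units * hours * rate_per_au_hour

# A one-hour job requested with 1000 AUs versus the 50 AUs it actually needed.
over_provisioned = job_cost(1000, 1, rate_per_au_hour=2.0)
right_sized = job_cost(50, 1, rate_per_au_hour=2.0)
savings_factor = over_provisioned / right_sized  # the 20x cited above
```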

“We ingest a massive amount of live data from mobile, web, IoT and retail transactions. Data Lake gives us the ability to easily and cost effectively store everything and analyse what we need to, when we need to. The simplicity of ramping up parallel processing on the U-SQL queries removes the technical complexities of fighting with the data and lets the teams focus on the business outcomes. We are now taking this a step further and exposing the powerful Data Lake tools directly to our clients in our software allowing them to more easily explore their data using these tools.” -David Inggs, CTO at Plexure

Today, we are also announcing the availability of this big data productivity environment in Visual Studio Code allowing users to have this type of productivity in a free cross-platform code editor that is available on Windows, Mac OS X, and Linux.

Azure HDInsight introduces fully managed Kafka for real-time analytics and R Server for advanced analytics

At Strata + Hadoop World New York, we announced new security, performance and ISV solutions that build on Azure HDInsight’s leadership for enterprise-ready cloud Hadoop. Today, we are announcing the public preview of Kafka for HDInsight. This service lets you ingest massive amounts of real-time data and analyze it through integration with Storm and Spark for HDInsight and Azure IoT Hub, to build end-to-end IoT, fraud detection, click-stream analysis, financial alerting, or social analytics solutions.

We are also announcing the general availability of R Server for HDInsight. Running Microsoft R Server as a service on top of Apache Spark, developers can achieve unprecedented scale and performance with code that combines the familiarity of the open source R language and Spark. Multi-threaded math libraries and transparent parallelization in R Server enable handling up to 1000x more data at up to 50x faster speeds than open source R—helping you train more accurate models for better predictions than previously possible. New in GA, RStudio Server Community Edition is included out of the box, making it easy for data scientists to get started quickly.

“Milliman is among the world’s largest providers of actuarial and related products and services, with offices in major cities around the globe. R Server for HDInsight offers our clients the ability to forecast risk over much larger datasets than ever before, improving the accuracy of predictions in a cost-efficient way. The familiarity of the R programming language to our users, as well as the ability to spin up Hadoop and Spark clusters within minutes, running at unprecedented scale and performance, is what really gets me excited about R Server for HDInsight.” -Paul Maher, Chief Technology Officer of the Life Technology Solutions practice at Milliman

Enterprise-grade Security, Auditing and Support

Enterprise-grade big data solutions must meet uptime guarantees and stringent security, governance, and compliance requirements, and integrate with your existing IT investments. Data Lake services (Store, Analytics, and HDInsight) guarantee an industry-leading 99.9% uptime SLA with 24/7 support. They are built with the highest levels of security for authentication, authorization, auditing, and encryption to give you peace of mind when storing and analyzing sensitive corporate data and intellectual property. Data is always encrypted: in motion using SSL, and at rest using service-managed or user-managed HSM-backed keys in Azure Key Vault. Capabilities such as single sign-on (SSO), multi-factor authentication, and seamless management of your on-premises identity and access management are built in through Azure Active Directory. You can authorize users and groups with fine-grained POSIX-based ACLs for all data in the Store, or with Apache Ranger in HDInsight for role-based access controls. Every access or configuration change is automatically audited for security and regulatory compliance requirements.

Supporting open source and open standards

Microsoft continues to collaborate with the open source community, as reflected by our contributions to Apache Hadoop, Spark, and Apache REEF, and our work with Jupyter notebooks. This is also the case with Azure Data Lake.

Azure Data Lake Analytics uses Apache YARN, a central part of Apache Hadoop, to govern resource management and deliver consistent operations. Microsoft has been a primary contributor to YARN, improving its performance, scale, and security.

“Hortonworks and Microsoft have partnered closely for the past 5 years to further the Hadoop platform for big data analytics, including contributions to YARN, Hive, and other Apache projects.  Azure Data Lake services, including Azure HDInsight and Azure Data Lake Store, demonstrate our shared commitment to make it easier for everyone to work with big data in an open and collaborative way.” -Shaun Connolly, Chief Strategy Officer at Hortonworks

Data Lake Store supports the open Apache Hadoop Distributed File System (HDFS) standard. Microsoft has also contributed improvements to HDFS such as OAuth 2.0 protocol support.

Leadership

Both industry analysts and customers recognize Microsoft’s capabilities in big data. Forrester recently recognized Microsoft Azure as a leader in their Big Data Hadoop Cloud Solutions. Forrester notes that leaders have the most comprehensive, scalable, and integrated platforms. Microsoft specifically was called out for having a cloud-first strategy that is paying off.

Getting started with Data Lake

Data Lake Analytics and Data Lake Store are generally available today, as is R Server for HDInsight. Kafka for HDInsight is in public preview. Try them today individually or as part of Cortana Intelligence Suite to transform your data into intelligent action.

Read the overview, pricing and getting started pages of Data Lake Analytics or attend the free course
Read the overview, pricing and getting started pages of Data Lake Store
Read the R Server, Kafka overview and pricing pages of HDInsight

@josephsirosh
Source: Azure

Free local development using the DocumentDB Emulator plus .NET Core support

Azure DocumentDB is a fully managed, globally distributed NoSQL database service backed by an enterprise-grade SLA that guarantees 99.99% availability. DocumentDB is a cloud-born database, perfect for the massive scale and low latency needs of modern applications, with guarantees of <10 ms read latency and <15 ms write latency at the 99th percentile. A single DocumentDB collection can elastically scale throughput to tens or hundreds of millions of requests per second, and storage can be replicated across multiple regions for limitless scale, with the click of a button. Along with the flexible data model and rich query capabilities, DocumentDB provides both tenant-controlled and automatic regional failover, transparent multi-homing APIs, and four well-defined consistency models for developers to choose from.

Due to its flexible schema, rich query capabilities, and availability of SDKs for multiple platforms, DocumentDB makes it easy to develop, evolve, and scale modern applications. At the Connect() conference this week, we announced the availability of new developer tools to make it even easier to build applications on DocumentDB.

We’re excited to introduce a public preview of the DocumentDB Emulator, which provides a local development experience for the Azure DocumentDB service. Using the DocumentDB Emulator, you can develop and test your application locally without an internet connection, without creating an Azure subscription, and without incurring any costs. This has long been the most requested feature on the user voice site, so we are thrilled to roll this out to everyone who voted for it.
We are also pleased to announce the availability of the DocumentDB .NET Core SDK, which lets you build fast, cross-platform .NET web applications and services.

About the DocumentDB Emulator

The DocumentDB Emulator provides a high-fidelity emulation of the DocumentDB service. It supports the same functionality as Azure DocumentDB, including support for creating and querying JSON documents, provisioning and scaling collections, and executing stored procedures and triggers. You can develop and test applications using the DocumentDB Emulator, and deploy them to Azure at global scale by making a single configuration change.
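That single configuration change typically amounts to swapping the endpoint and key. The emulator's local endpoint and master key are fixed and publicly documented, so they can be hard-coded for development; the environment-variable names below are hypothetical stand-ins for however your application carries its production settings:

```python
import os

# The emulator's endpoint and master key are fixed and publicly documented,
# so they are safe to hard-code for local development only.
EMULATOR = {
    "endpoint": "https://localhost:8081/",
    "key": "C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==",
}

def documentdb_config():
    """Use real account settings when present, otherwise fall back to the emulator.
    DOCDB_ENDPOINT and DOCDB_KEY are hypothetical variable names."""
    endpoint = os.environ.get("DOCDB_ENDPOINT")
    key = os.environ.get("DOCDB_KEY")
    if endpoint and key:
        return {"endpoint": endpoint, "key": key}
    return EMULATOR
```

Whichever pair this returns is what you would hand to your DocumentDB SDK of choice; the rest of the application code stays identical between local and cloud runs.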

You can use any supported DocumentDB SDK or the DocumentDB REST API to interact with the emulator, as well as existing tools such as the DocumentDB data migration tool and DocumentDB studio.  You can even migrate data between the DocumentDB emulator and the Azure DocumentDB service.

While we created a high-fidelity local emulation of the actual DocumentDB service, the implementation of the DocumentDB Emulator differs from that of the service. For example, the DocumentDB Emulator uses standard OS components such as the local file system for persistence and the HTTPS protocol stack for connectivity. This means that some functionality that relies on Azure infrastructure, such as global replication, single-digit-millisecond latency for reads and writes, and tunable consistency levels, is not available via the DocumentDB Emulator.

Get started now by downloading the DocumentDB Emulator to your Windows desktop.

About the DocumentDB .NET Core SDK

You can build fast, cross-platform web-apps and services that run on Windows, Mac, and Linux using the new DocumentDB .NET Core SDK. You can download the latest version of the .NET Core SDK via Nuget. You can find release notes and additional information in our DocumentDB .NET Core SDK documentation page.

Next Steps

In this blog post, we looked at some of the new developer tooling introduced in DocumentDB, including the DocumentDB Emulator and DocumentDB .NET Core SDK.

Get started coding now by downloading the DocumentDB Emulator to your desktop!
Get started building fast, cross-platform web apps with the new DocumentDB .NET Core SDK
Create a new DocumentDB account from the Azure Portal
Stay up-to-date on the latest DocumentDB news and features by following us on Twitter @DocumentDB or reach out to us on the developer forums on Stack Overflow

Source: Azure

Dive into Red Hat OpenShift Container Platform on Microsoft Azure

Join Microsoft in a joint webinar with Red Hat to explore how OpenShift can help you go to market faster.

Red Hat CCSP and Cloud Evangelist Nicholas Gerasimatos and Microsoft Azure Principal PM Boris Baryshnikov will demo how to deploy OpenShift in Azure. They’ll break down capabilities like source to image and running and deploying containerized applications so that you’re ready to get started right away.

This is a great way to learn about building, deploying, and managing containerized services and applications with RedHat OpenShift Container Platform on Microsoft Azure. You’ll get an overview of how OpenShift can help provide a secure, flexible, and easy-to-manage application infrastructure.

Plus, if you attend the webinar live on November 17, you can participate in a live Q&A with Nicholas and Boris to get answers to your specific questions. Register today!
Source: Azure

Azure Backup security capabilities for protecting cloud backups

More and more customers are being hit by security incidents that result in data loss, and the cost of a security breach keeps rising. Despite having security measures in place, organizations face cyber threats because of vulnerabilities exposed across multiple IT systems. All of this raises some pointed questions: Are your organization’s IT applications and data safe? What is the cost of recovering from the business impact of a cyber attack? If you have a backup strategy in place, are your cloud backups secure?

“Currently, there are over 120 separate ransomware families, and we’ve seen a 3500% increase in cybercriminal internet infrastructure for launching attacks since the beginning of the year,” points out a recent CRN Quarterly Ransomware Report. To mitigate the threat of such attacks, the FBI recommends that users regularly back up data and secure those backups in the cloud. This blog talks about the security features in Azure Backup that help secure hybrid backups.

Value proposition

Today’s malware attacks target production servers to either re-encrypt the data or remove it permanently. When production data is affected, network shares and backups can be affected as well, leading to data loss or data corruption. Hence, there is a strong need to protect both production and backup data against sophisticated attacks, and to have a strong security strategy in place to ensure data recoverability.

Azure Backup now provides security capabilities to protect cloud backups. These security features ensure that customers can secure their backups and recover data from cloud backups even if production and backup servers are compromised. They are built on three principles – Prevention, Alerting, and Recovery – to enable organizations to increase preparedness against attacks and equip them with a robust backup solution.

Features

Prevention: A new authentication layer has been added for critical operations like Delete Backup Data and Change Passphrase. These operations now require a Security PIN, available only to users with valid Azure credentials.
Alerting: Email notifications are sent for any critical operations that impact availability of backup data. These notifications enable users to detect attacks as soon as they occur.
Recovery: Azure Backup retains deleted backup data for 14 days, ensuring recovery using any old or recent recovery point. Also, a minimum number of recovery points is always maintained, so there are always sufficient points to recover from.
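The recovery principle above can be sketched as a simple retention rule (plain Python; the 14-day window comes from the text, everything else is a simplification of the actual service behavior):

```python
from datetime import date, timedelta

RETAIN_DELETED_DAYS = 14  # deleted backup data is kept this long (from the text)

def recoverable(deleted_on, today):
    """True while a deleted recovery point is still inside the retention window."""
    return today <= deleted_on + timedelta(days=RETAIN_DELETED_DAYS)

# Data deleted on Nov 1 can still be restored on Nov 10, but not on Nov 20.
print(recoverable(date(2016, 11, 1), date(2016, 11, 10)))
print(recoverable(date(2016, 11, 1), date(2016, 11, 20)))
```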

Getting started with security features

To start leveraging these features, navigate to your Recovery Services vault in the Azure portal and enable them. The video below explains how to get started by enabling these features and how to leverage them in Azure Backup.

Related links and additional content

Learn more about Azure Backup Security Features
Getting started with Recovery Services vault
Need help? Reach out to Azure Backup forum for support
Tell us how we can improve Azure Backup by contributing new ideas and voting up existing ones
Follow us on Twitter @AzureBackup for latest news and updates

Source: Azure

Announcing integration of Azure Backup into VM management blade

Today, we are excited to announce the ability to seamlessly back up virtual machines in Azure from the VM management blade using Azure Backup. Azure Backup already supports backup of classic and Resource Manager VMs (Windows or Linux) using a Recovery Services vault, running on Standard or Premium Storage. A couple of weeks back, we also announced backup of VMs encrypted using ADE (Azure Disk Encryption). With this announcement, we are bringing the backup experience closer to the VM management experience, giving you the ability to back up VMs directly from the VM management blade. This makes Azure a public cloud with a backup experience natively integrated into VM management.

Azure Virtual Machines provide a great value proposition for the different kinds of workloads that want to harness the power of the cloud, offering a range of VMs from basic capabilities to powerful GPUs to meet customer demands. Protecting VMs against accidental deletions and corruptions resulting from human error is a critical capability for enterprise customers, as well as for small and medium businesses deploying production workloads in the cloud. This integration makes meeting that requirement seamless with a simple two-step backup configuration.

Value proposition

Azure Backup’s cloud-first approach to backup puts the following cloud promises into action:

Freedom from infrastructure: No need to deploy any infrastructure to back up VMs
Cloud Economics: Customers can leverage highly available, scalable and resilient backup service at a cost-effective price
Infinite scale: Customers can protect multiple VMs in one go or one at a time using a Recovery Services vault
Pay as you go: Simple Backup pricing makes it easy to protect VMs and pay for what you use

Features

With the integration of Azure Backup into the VM management blade, customers can perform the following operations directly from the VM management blade:

Configure backup using a simple two-step configuration
Trigger an on-demand backup for VMs configured for backup
Restore a complete VM, all disks, or files and folders inside the VM (in preview for Windows VMs) from backup data
View recovery points corresponding to the configured backup schedule

Get started

To get started, select a virtual machine from the Virtual machines list view, then select Backup in the Settings menu.

Create or select a Recovery Services vault: A Recovery Services vault stores backups separately from your storage accounts to guard against accidental deletions.
Create or select a backup policy: A backup policy specifies the schedule on which backups run and how long backup data is retained.

By default, a vault and a policy are selected to make this experience even smoother. Customers have the flexibility to customize these as needed.

Related links and additional content

Learn more about Azure Backup 
Want more details? Check out Azure Backup documentation
Sign up for a free Azure trial subscription
Need help? Reach out to Azure Backup forum for support
Tell us how we can improve Azure Backup by contributing new ideas and voting up existing ones.
Follow us on Twitter @AzureBackup for the latest news and updates

Source: Azure

Azure Security Center now available in UK

We’re pleased to announce Azure Security Center is now available in the UK. Azure Security Center helps protect your Azure resources by providing visibility into security across all your subscriptions, helping you find and fix vulnerabilities, and alerting you if threats are detected.

Learn more about Azure Security Center and our approach to data security, or review the quick start.
Source: Azure

Azure N-Series: General availability on December 1

We’re excited today to announce general availability of the Azure N-Series of Azure Virtual Machines as of December 1, 2016. Azure N-Series virtual machines—powered by NVIDIA GPUs—provide customers and developers around the globe access to industry-leading accelerated computing and visualization experiences via the Azure public cloud. We’re additionally excited to announce availability at GA in East US, West Europe, and South East Asia, joining the current South Central US region.

We’ve had thousands of customers participate in the N-Series preview since we launched it back in August. We’ve received fantastic feedback, especially around the enhanced performance and the work Microsoft and NVIDIA have done together to make this a turnkey experience for cloud adopters. During the preview, customers exercised these unique capabilities on a wide range of potential breakthrough scenarios including artificial intelligence (AI), 3D visualization and interactivity, highly computational medical research, and beyond.

Azure NC virtual machines—GPU compute

Azure NC-based instances are powered by NVIDIA Tesla K80 GPUs and provide the compute power required to accelerate the most demanding high-performance computing (HPC) and AI workloads. Customers can now use these instances to run deep learning training jobs, HPC simulations, rendering, real-time data analytics, DNA sequencing, and many more CUDA (Compute Unified Device Architecture)–accelerated tasks. Additionally, customers have the option to utilize RDMA (Remote Direct Memory Access) over InfiniBand for scaling jobs across multiple instances. Using InfiniBand between instances provides close to bare-metal performance when scaling out to tens, hundreds, or even thousands of GPUs across hundreds of nodes—allowing customers to submit tightly coupled jobs like Microsoft Cognitive Toolkit (CNTK)–based training for natural language processing, image recognition, and object detection.
City of Hope is an independent research and treatment center for cancer, diabetes, and other life-threatening diseases, and is currently ranked as one of “America’s Best Hospitals.” City of Hope is using HPC to bring together the physical and computer sciences and mathematics to develop new and ground-breaking methods to model biological processes. Further, these methods are used to predict and analyze large-scale data from their high-throughput instrumentation. Using Azure NC24 instances, Dr. Vaidehi and her team are able to rapidly scale their GPU cluster footprint to study the dynamics of proteins in a few days—as opposed to a month using traditional CPU-based machines. The addition of K80 GPUs greatly accelerates the team’s research output, thereby making drug design much more efficient.

Algorithmia is an open marketplace for algorithms and algorithm development, making state-of-the-art algorithms accessible and discoverable by anyone. Algorithmia is the largest marketplace for algorithms in the world, with more than 30,000 developers leveraging more than 2,500 custom algorithms. The Algorithmia Marketplace includes research contributions from MIT, University of Washington, Carnegie Mellon University, University of California-Berkeley, Caltech, the University of Texas at Austin, University of Tokyo, and University of Toronto, among others. “With Azure’s new on-demand GPU instances, we’re able to provide teams and organizations with access to GPU-accelerated deep learning models and algorithms,” says Diego Oppenheimer, Algorithmia CEO. “The flexibility of the Azure infrastructure allows us to scale to meet our users’ needs.”

Azure NV virtual machines—GPU visualization

Azure NV-based instances are powered by NVIDIA Tesla M60 GPUs and provide NVIDIA GRID capabilities on demand.
Scientists, designers, and engineers can now use these new instances for running hardware-accelerated workstation applications, designing the next concept car, or creating the next blockbuster movie. These instances support applications that use both DirectX and OpenGL.

Frame is an enterprise cloud platform that allows any Windows software to be run in the cloud and delivered to any browser. Enterprises and educational institutions use Frame to deliver graphics-intensive 2D and 3D applications as software as a service (SaaS) to any device, anywhere. Azure N-Series virtual machines enable Frame to deliver a high-end, workstation-class experience to users on any connected device. For example, a designer can create a 3D CAD model from a laptop using a single NV6 instance and instantly switch to an NV24 quad-GPU instance, dramatically speeding up the time it takes to run a complex simulation.

Catering to some of the most demanding computer users on the planet, Frame knows that every extra ounce of performance translates to increased productivity and higher quality. The new Azure N-Series brings fast, cutting-edge NVIDIA GPUs, the latest CPUs, and more memory per instance than anything previously available in the cloud. And with on-demand access, users don’t have to wait for the next hardware refresh of their local workstation. They can even access this performance from a laptop or mobile device for a seamless and performant on-the-road experience.

“Azure N-Series powered by NVIDIA’s cutting-edge GPUs couldn’t come at a better time,” said Nikola Bozinovic, CEO, Frame. “We’re seeing incredible interest in virtualization of workstation-class workloads. The combination of Frame and the new N-Series instances on Azure provides our customers with today’s most advanced graphics platform in the cloud.”

Pricing for general availability of Azure N-Series takes effect on December 1, 2016, in supported regions in North America, Europe, and Asia.
Here is the current pricing.

              NC6             NC12            NC24            NC24r
Cores         6               12              24              24
GPU           1 x K80 GPU     2 x K80 GPUs    4 x K80 GPUs    4 x K80 GPUs
Memory        56 GB           112 GB          224 GB          224 GB
Disk          380 GB SSD      680 GB SSD     1.44 TB SSD     1.44 TB SSD
Network       Azure Network   Azure Network   Azure Network   InfiniBand

              NV6             NV12            NV24
Cores         6               12              24
GPU           1 x M60 GPU     2 x M60 GPUs    4 x M60 GPUs
Memory        56 GB           112 GB          224 GB
Disk          380 GB SSD      680 GB SSD     1.44 TB SSD
Network       Azure Network   Azure Network   Azure Network
NVIDIA GRID   Yes             Yes             Yes

With today’s announcements, we’re taking a major step forward in our mission to make every organization and individual more productive via access to high-performance and accelerated computing in the cloud.

-The Azure Big Compute Team

Learn more:

Microsoft and OpenAI partnership announcement
Azure Virtual Machines pricing page
VM documentation
Quelle: Azure

Choose Azure, like OpenAI did, to power your compute-intensive workloads and add intelligent interactions to all your apps

Today, Microsoft announced a new partnership with OpenAI, a nonprofit AI research organization co-founded by Elon Musk, Sam Altman, Greg Brockman, and Ilya Sutskever. Together, we hope to make significant contributions to advance the field of AI and make it more accessible to every developer and every organization. Read more about the partnership, and why OpenAI chose Microsoft Azure as its primary cloud platform, from Harry Shum, Microsoft Executive Vice President of AI and Research, and Sam Altman, co-founder of OpenAI.

Extending our commitment to AI, and to help developers and organizations run high performance workloads and build intelligent applications using the power of the cloud, here’s a drill down into a few of the Azure innovations we announced today:

Azure N-Series Virtual Machines will be generally available starting in December. Organizations like OpenAI, Esri, City of Hope and Frame are already using the industry-leading accelerated computing and visualization experiences offered by Azure N-Series VMs. These virtual machines powered by NVIDIA® GPUs are designed for the most intensive compute workloads, including deep learning, simulations, rendering and the training of neural networks. They also enable high-end visualization capabilities to allow for workstation and streaming scenarios by utilizing the NVIDIA GRID in Azure.
Azure Bot Service, the first public cloud bot service, is now available in preview. Azure Bot Service is powered by the Microsoft Bot Framework and serverless compute in Azure. Starting today you can build, connect, deploy and manage intelligent bots that interact naturally wherever your users are talking – from your app or website to text/SMS, Slack, Facebook Messenger, Skype, Teams, Kik, Office 365 mail and other popular services. Bots run on Azure Functions, a serverless environment, so that they scale based on demand and you pay only for the resources your bots consume.
Azure Functions offers serverless compute on Azure and is generally available today. Azure Functions lets developers implement code triggered by events occurring in Azure, in third-party services, or in on-premises systems. Developers can use Azure Functions to build HTTP-based API endpoints accessible by a wide range of applications and by mobile and IoT devices. Functions scale on demand, so you pay only for the resources you consume. Azure Functions uniquely offers integrated Visual Studio tooling support, out-of-the-box Azure and third-party service bindings, and continuous deployment to improve developer productivity.

These investments and innovations on Azure are designed to bring you the power of the most intelligent cloud so you can address the evolving needs of your customers.  We look forward to hearing your feedback on these latest releases.

To learn more about how to build intelligent apps, tune into Connect(); on Wednesday.
Quelle: Azure

App Service on Linux now supports Containers and ASP.NET Core

In September 2016, we announced the preview of Azure App Service on Linux, making it easier for PHP and Node.js developers to run their web applications natively on Linux, so they can work directly with .htaccess files and avoid using modified extensions or code.

App Service provides a fully managed experience for web and mobile developers so they can quickly create applications and services for their business without having to spend resources on day-to-day management of the web server and operating system. This includes streamlined deployment capabilities with deployment slots, custom domains, SSL configuration, continuous deployment, and horizontal and vertical scaling.

Today, we are building on the management ease and agility of App Service on Linux by enabling developers to bring their own Docker-formatted container images and by extending support to ASP.NET Core.

Bring your own Docker-formatted container

App Service provides default Linux containers for versions of Node.js, PHP, and ASP.NET Core that make it easy to get up and running quickly on the service. With our new container support, developers can now create customized containers based on the defaults. For example, developers could create a container with specific builds of Node.js and pm2 that differ from the default versions provided by the service. This enables developers to use new or experimental framework versions that are not available in the default containers. Developers can upload their containers to the Azure Container Registry, Docker Hub, or a private container registry.
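As an illustrative sketch only (the base image tag, pm2 version, paths, and port below are assumptions for the example, not service defaults), a customized container for the Node.js-plus-pm2 scenario might look like this:

```dockerfile
# Hypothetical custom image pinning Node.js and pm2 versions that
# differ from the defaults App Service provides.
FROM node:6.9
RUN npm install -g pm2@2.1

# Copy the app into the image and run it under pm2 in the foreground.
COPY . /home/site/wwwroot
WORKDIR /home/site/wwwroot
EXPOSE 8080
CMD ["pm2", "start", "server.js", "--no-daemon"]
```

After building, the image can be pushed to the Azure Container Registry, Docker Hub, or a private registry, and the web app configured to pull it from there.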

ASP.NET Core Support

ASP.NET Core is a lean and composable framework for building web and cloud applications. It is fully open source, available on GitHub, and runs on Windows, Mac, and Linux. App Service today introduces a new base container image to support ASP.NET Core, giving developers the flexibility to run ASP.NET Core on App Service on both Windows and Linux. The main advantage of the Linux option is that you can deploy to App Service using containers.


Getting started

To get started, sign in or start a free trial and create an App Service instance. More information is available in the App Service documentation. We would love to hear your feedback on this preview, so please visit https://feedback.azure.com/ to share it with our team.
Quelle: Azure

Microsoft Azure Announces Industry’s First Cloud Bot-as-a-Service

We are thrilled by the surprisingly large number of bots, tools, and channels, and by the overall vibrancy and excitement within the developer community, since our launch of the Bot Framework on GitHub in March. The democratization of AI and the advancement of conversational canvases are well underway.

For software developers, creating a conversational experience requires a shift in the way we design and build software. It turns out, it’s pretty difficult to do well. Conversation is inherently fluid and tangents are the norm. This isn’t the case for a traditional app, where the design center has (mostly) been about task completion accomplished through a graphical user interface (GUI).

In order to make it easier to get started creating a bot, today we are excited to announce a new service which is built using the Bot Framework and Azure Functions: Azure Bot Service.

Azure Bot Service is the first public cloud bot service powered by the Microsoft Bot Framework and serverless compute in Microsoft Azure. With this cloud service, you can build, connect, deploy and manage intelligent bots that interact naturally wherever your users are talking – from your app or website to text/SMS, Slack, Facebook Messenger, Skype, Teams, Kik, Office 365 mail and other popular services. Bots run on Azure Functions, a serverless environment, so they scale based on demand and you pay only for the resources your bots consume.

Accelerate development cycles

With the Azure Bot Service, developers can accelerate the building of intelligent bots that use Microsoft Cognitive Services, working in an integrated developer experience designed for bot development. You can get started quickly with out-of-the-box templates such as the basic bot, the Language Understanding Intelligent Service (LUIS) bot, the form bot, and the proactive bot. You can build bots in C# or Node.js directly in the browser and try them out with the companion Web Chat control, or you can use the IDE and code editor of your choice. Under the covers, the Azure Bot Service uses an Azure Resource Manager (ARM) template to create an Azure Function App for easy deployment, and it automatically registers your bot with the Microsoft Bot Framework, which provides a public bot directory to increase your bot’s exposure.
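Conceptually, the message loop behind a template like the basic bot reduces to a handler that takes an incoming Bot Framework activity and returns a reply activity. The self-contained sketch below (plain Node.js, no SDK; the function name and the echo behavior are illustrative, real bots use the Bot Framework SDK) shows that shape:

```javascript
// A self-contained sketch of what a basic echo bot's message handler does.
// Bot Framework activities are JSON objects with at least a "type" field;
// message activities additionally carry the user's "text".
function handleActivity(activity) {
  if (activity.type !== "message") {
    return null; // ignore non-message activities (typing, conversationUpdate, ...)
  }
  return {
    type: "message",
    text: "You said: " + activity.text
  };
}

var reply = handleActivity({ type: "message", text: "hello bot" });
console.log(reply.text); // prints "You said: hello bot"
```

When hosted on the Azure Bot Service, a handler like this sits behind an Azure Functions HTTP endpoint that the Bot Framework connector calls for each incoming activity.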

Enrich your bots

The Azure Bot Service includes built-in, configurable channels to improve your customer interactions and extend your reach to more customers. You can easily build bots that work from your apps or websites and across popular channels such as Slack, Facebook Messenger, Skype, Teams, Web Chat, email, GroupMe, Kik, Telegram, and Twilio. Through Direct Line support, you can interact with Microsoft Bot Framework features through a REST API, so you can bring the conversation experience to your app or website, keep control over your branded experience, and reach the many channels the Bot Framework supports.

Plug in Microsoft Cognitive Services to enable your bots to see, hear, interpret, and interact in more human ways. You can also enhance the capabilities of your bots with just a few lines of code using many other Azure services. For example, Azure Search makes it easy to add powerful and sophisticated search capabilities to your bots. Check out this GitHub sample to see it in action.
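For instance, a bot can call the Azure Search REST API over HTTP. The helper below only constructs the query URL; the service name, index name, and `api-version` value are illustrative assumptions that should be checked against the Azure Search documentation for your service:

```javascript
// Build an Azure Search REST query URL for a bot to call.
// "myservice" and "hotels" are hypothetical names; the api-version
// shown was current around this announcement and may differ for you.
function buildSearchUrl(service, index, query) {
  return "https://" + service + ".search.windows.net" +
    "/indexes/" + encodeURIComponent(index) + "/docs" +
    "?search=" + encodeURIComponent(query) +
    "&api-version=2016-09-01";
}

console.log(buildSearchUrl("myservice", "hotels", "ocean view"));
// https://myservice.search.windows.net/indexes/hotels/docs?search=ocean%20view&api-version=2016-09-01
```

The bot would issue a GET against this URL with an `api-key` header and render the matching documents back to the user as a reply.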

Boost operational efficiency

Azure Bot Service uses Azure Functions to give you the operational agility to run and scale your bots as they grow in popularity, and you pay only for what you use. Moreover, you don’t need to worry about provisioning or managing the servers that run your bots. Patching and infrastructure maintenance are handled for you, so you can focus on writing code.

Integrated continuous deployment support means that you can use your preferred DevOps tool chain, commit code to source control systems such as Visual Studio Team Services, GitHub, and Bitbucket, and then automatically publish the code to Azure to continuously improve your bot.

Get started

Get started today by signing up for an Azure Trial and creating your first bot! Please give your feedback at UserVoice and visit our documentation.
Quelle: Azure