November 2016 Leaderboard of Database Systems contributors on MSDN

Thank you for your positive feedback on our first leaderboard published last month. Congratulations to the Top-10 contributors on the November 2016 leaderboard!

Hilary Cotter continues in his Rank-1 position from last month. Five of this month’s Overall Top-10 also featured in last month’s Overall Top-10; the other five are new entrants.

The following continues to be the points hierarchy (in decreasing order of points):

For questions related to this leaderboard, please write to leaderboard-sql@microsoft.com

Happy Holidays everyone!
Source: Azure

Microsoft Azure Achieves HITRUST CSF Certification

We’re excited and proud to announce that Microsoft Azure is one of the first hyper-scale cloud platforms to become HITRUST CSF Certified. The HITRUST CSF is the most widely recognized security accreditation in the healthcare industry. It incorporates healthcare-specific security, privacy, and regulatory requirements from existing regulations and standards such as HIPAA/HITECH, PCI, ISO 27001, and MARS-E, as well as industry best practices. This certification provides a single framework, tailored to health organizations, for evaluating the Azure environment.

“HITRUST has been working with the industry to ensure the appropriate information protection requirements are met when sensitive health information is accessed or stored in a cloud environment. By taking the steps necessary to obtain HITRUST CSF Certified status, Microsoft Azure is distinguished as an organization that people can count on to keep their information safe,” said Ken Vander Wal, Chief Compliance Officer, HITRUST.

To empower every person and every organization on the planet to achieve more, we must ensure that large industries, like healthcare, are not blocked by regulatory, privacy, and security obstacles that could prevent them from moving their IT workloads and operations to the cloud. Achieving HITRUST CSF Certification is another example of Azure removing a hurdle so that a large and important part of our global society – the healthcare industry and its professionals – can confidently and securely leverage the services, efficiencies, and power of Microsoft Azure.

The following is a complete list of Azure Services included in this HITRUST certification:

Identity and Access Management

Azure Active Directory
Rights Management
Multi-Factor Authentication

Compute

Virtual Machines
Cloud Services
Batch

Networking

Application Gateway
VPN Gateway
ExpressRoute
Virtual Network
Load Balancer
Traffic Manager

Web & Mobile

Web Apps
Mobile Apps
Notification Hubs

Analytics

HDInsight

Management & Security

Azure Key Vault
Scheduler
Azure Management Portal
Azure Classic Portal

Media & CDN

Media Services

Data & Storage

Redis Cache – including Premium
SQL Database
SQL Data Warehouse
SQL Virtual Machines
Storage – Blob, Table, Queue, Files, and Disks (including Premium)

Hybrid Integration

Service Bus
Workflow

Source: Azure

End-to-end cloud experiences for developers at Node.js Interactive North America

Over 700 developers, DevOps engineers, and other Node.js enthusiasts met in Austin last month for Node.js Interactive North America. Microsoft is proud to have been a platinum sponsor, and I’m particularly thrilled to have keynoted the event, kicking off a great week of community engagement: meetups, workshops, sessions covering everything from debugging to robots, and content highlighting our work on technologies like TypeScript and Node-ChakraCore.

For those of us working with Node.js at Microsoft, this event is an important milestone in our own Node.js journey. Today we support Node.js broadly in Microsoft Azure, giving developers architectural choices for building applications: on infrastructure, with VM Scale Sets and Linux-based Node.js stacks; on container infrastructure, with Azure Container Service; on our PaaS, with Node.js support for web, IoT, mobile, and serverless applications; and through third-party Node.js IaaS and PaaS solutions in Azure.

Whatever choice you make, we add value to those investments by providing Node.js SDKs for multiple Azure services and client drivers for many of our data solutions, including MongoDB protocol support in DocumentDB. More importantly, we value developer productivity on whatever platform you choose, which is why we continue investing in great DevOps tooling and a redefined coding experience with Visual Studio Code, providing IntelliSense and debugging, powerful extensibility, and out-of-the-box Azure support that works everywhere.

In my keynote at Node.js Interactive North America, I gave attendees some perspective on our vision for Node.js and where we are going, walking participants through what those experiences look and feel like for Node.js developers in the cloud using solutions like Docker support in Azure App Service, and covering debugging use cases. (Even if you couldn’t attend the event, you can follow along on GitHub.)

Developers around the globe are adopting this powerful combination of open source, Node.js, and the cloud at a rapid pace. Enterprise adoption continues to grow, and Node.js is already popular in polyglot scenarios, including amongst Microsoft customers.

In fact, during my keynote I shared with the community our perspective on how and why organizations are adopting Node.js in the cloud, based on our experience as an open and flexible global cloud platform: over 60% of Marketplace solutions integrate open source, nearly 1 in 3 VMs run Linux, the number of customers running containers is quadrupling, and over a dozen Node.js solutions coexist in the Marketplace. All of this signals that we can expect the growth of Node.js in Azure to continue.

Our Node.js vision focuses on developer productivity, flexible and powerful cloud deployments and production operations & diagnostics that support the enterprise business needs. Some of our products like TypeScript, Visual Studio or Microsoft Azure are helping customers bring this vision to reality today, and we will continue investing in this portfolio as well as in the community and ecosystem to ensure that we can maintain a learning loop that empowers developers to do more with Node.js in the cloud.

For example, Microsoft is part of the Node.js Foundation (which, by the way, is looking for your input) and an active participant in the CTC, TSC, and a number of working groups, contributing in areas like TypeScript, Chakra, and more. We also learn a lot through our developer support team focused on open source experiences, including Node.js.

As we close a great year of Node.js momentum in the cloud, make sure you check out all the content from Node.js Interactive North America on YouTube, explore the end-to-end Node.js demo and get started with Node.js in Azure.
Source: Azure

Exercise your greatest power as a developer

To paraphrase Daniel Webster (American Statesman, 1782-1852), “If all my developer skills were taken from me with one exception, I would choose to keep the power of learning like a developer, for by it I would soon regain all the rest.”

To be a good developer is to be a perpetual learner; it is essential for survival. The problems you solve are always changing, and the programming languages, platforms, hardware, tools, and technologies you use to solve them are all moving targets. Even the agile development processes most developers follow are founded on the notion that you must continue learning to become more effective. After all, if you’re not learning something new, you’re either falling behind or getting left behind.

Once again, it’s that time of year when many office buildings get a bit quieter, parking and traffic get a little easier, and many production systems go into “hands-off” lockdown for fear of a breaking change ruining the holidays. This slow period provides an opportunity to step back from the stuff you work on every day and learn something new that perhaps you haven’t had a chance to try yet.

Fortunately, there are a lot of great resources available for you to learn new skills in Azure. Below are ten areas to explore that go beyond the familiar cloud workhorses (such as virtual machines and storage) and focus on capabilities related to IoT, containers, microservices, serverless computing, bots, artificial intelligence, and more. Each has a list of resources to give you a quick intro, and additional content to help you dive deeper.

If you’re new to Microsoft Azure, you may want to start with the Get Started Guide for Azure Developers.

1. Internet of Things (IoT)

Anything can be a connected device these days. Azure IoT Suite and the Azure IoT services make it easy for you to connect devices to the cloud, not only to collect the telemetry data they generate but also to act on that data in your apps. You can also get Azure IoT-certified starter kits for some DIY time building your own devices.

Quick:

Watch: Introduction to Azure IoT Suite and IoT Hub for developers
Read: Developer’s introduction to Azure IoT

More time:

Watch: Developer’s Guide to Connecting Devices to Azure IoT
Read: Azure IoT Developer Center
Do: Azure IoT Starter Kits
Learn: Getting Started with the Internet of Things (IoT)

2. Functions

Looking for a way to build microservices or get tasks done easily in your apps, such as processing data, integrating systems, or providing simple APIs? Azure Functions offers serverless compute for composing event-driven solutions. You write only the code that solves a specific need, without worrying about building out an entire application or the infrastructure required to run it.
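As a sketch of that model: with serverless compute, your code is just the event handler, while the platform owns triggering, scaling, and infrastructure. The event shape below is a hypothetical example for illustration, not an Azure Functions contract.

```python
# Illustrative sketch of the serverless model: you supply only the handler
# for one specific need; everything around it belongs to the platform.
# The "order placed" event shape here is hypothetical.

def handle_order_event(event):
    """Process a single 'order placed' event and return a result message."""
    total = sum(item["price"] * item["qty"] for item in event["items"])
    return {"order_id": event["order_id"], "total": total, "status": "processed"}
```

In Azure Functions, a handler like this would be wired to its trigger (queue, HTTP, timer) through configuration rather than application code.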

Quick:

Watch: Azure Functions and the Evolution of Web Jobs
Read: Azure Functions Overview

More time:

Watch: Developing Azure Functions
Read: What is Serverless Computing? Exploring Azure Functions
Do: Azure Functions Challenge
Learn: Using Azure Functions to Build Nanoservices

3. Cognitive Services

Artificial intelligence is no longer science fiction and can be used in your applications today. Cognitive Services is a growing collection of machine learning APIs, SDKs, and services you can use to make your applications more intelligent, engaging, and discoverable. Add smart features to your applications and bots, such as emotion and video detection; facial, speech, and vision recognition; and speech and language understanding.
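Because these services are REST APIs, calling them mostly means building a JSON request. A minimal sketch of the request side (the payload shape follows the Text Analytics batch format; the endpoint URL and subscription key you would POST it to are account-specific and omitted here):

```python
import json

# Build the JSON body for a batch sentiment-analysis request in the
# Text Analytics "documents" format. Sending it requires an endpoint
# and subscription key from your own Cognitive Services account.

def build_sentiment_request(texts, language="en"):
    """Return the JSON body for a batch sentiment-analysis request."""
    docs = [{"id": str(i + 1), "language": language, "text": t}
            for i, t in enumerate(texts)]
    return json.dumps({"documents": docs})
```

The body would be POSTed with your subscription key in the `Ocp-Apim-Subscription-Key` header; the response scores each document by `id`.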

Quick:

Watch: Get started with Microsoft Cognitive Services
Read: How Uber is using driver selfies to enhance security, powered by Microsoft Cognitive Services

More time:

Watch: Microsoft Cognitive Services: Give Your Apps a Human Side
Read: Face and Emotion Recognition in Xamarin.Forms with Microsoft Cognitive Services
Do: Getting started with the Text Analytics APIs to detect sentiment, key phrases, topics and language

4. Bot Service

Looking to improve customer engagement in a new or existing application? Azure Bot Service enables rapid, intelligent bot development, bringing together the power of the Microsoft Bot Framework and Azure Functions. Build, connect, deploy, and manage bots that interact naturally wherever your users are talking. Your bots scale based on demand, and you pay only for the resources you consume.

Quick:

Watch: Introducing the Azure Bot Service
Read: Azure Bot Service Overview

More time:

Watch: Introducing the Azure Bot Service
Read: Bot Framework – Solving Business Problems with the Microsoft Bot Framework
Do: Create Your First Bot
Learn: Getting Started with Bots

5. Container Service

If you have been building container-based applications and now need to get them into production, check out Azure Container Service. This open-source service supports popular container orchestration engines such as Kubernetes, Docker Swarm, and DC/OS, and removes much of the complexity of managing the clusters of virtual machines that run your containerized applications.

Quick:

Watch: Azure Container Service
Read: Azure Container Service 101

More time:

Watch: Building Applications Using the Azure Container Service
Read: Azure Container Service Introduction
Do: Deploy an Azure Container Service cluster
Learn: Deploying Dockerized Apps to the Azure Container Service

6. Logic Apps

Azure Logic Apps help you automate workflows and integrate applications and services. Nearly a hundred out-of-the-box connectors for all your favorite services make it easy to set up workflows and accomplish tasks between connected services. Using a visual designer in the Azure portal or Visual Studio, you can compose logic (which works great with Azure Functions) that acts based on events.

Quick:

Watch: Enterprise Integration with Azure Logic Apps
Read: What are Logic Apps?

More time:

Watch: Getting started with Azure Logic Apps
Read: Microsoft Azure – Enterprise Application Integration Using Azure Logic Apps
Do: Build a Logic App in a Free Sandbox Experience
Learn: Mastering Azure App Service, Part 1: Building Logic Apps

7. API Apps

Azure API Apps make it easy for you to build and consume cloud-hosted REST APIs. Azure provides a marketplace of APIs where you can publish your API, or find existing APIs to use in your applications. You can also generate cross-platform client SDKs for the hosted API using Swagger.

Quick:

Watch: Azure API Apps 101 with Guang Yang
Read: API Apps Overview

More time:

Watch: RESTful Web Services: ASP.NET and Azure API Apps
Do: Get started with API Apps, ASP.NET, and Swagger in Azure App Service tutorial
Learn: Mastering Azure App Service, Part 2: Building Azure API Apps

8. DocumentDB

Sometimes a traditional relational database is not the best choice for your data. DocumentDB is a fully managed, scalable NoSQL database service that features SQL queries over object data. You can also access DocumentDB using existing MongoDB drivers, which lets you use DocumentDB with apps written for MongoDB.
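Because DocumentDB speaks the MongoDB wire protocol, existing drivers connect with an ordinary MongoDB connection string. A minimal sketch of building one in Python (the account name and key are hypothetical placeholders, and the default port is an assumption to verify against the connection string shown in the Azure portal for your account):

```python
# Build a MongoDB-protocol connection string for a DocumentDB account.
# Account name and key are placeholders; the host pattern and port
# follow the MongoDB endpoint shown in the Azure portal.

def documentdb_mongo_uri(account, key, port=10255):
    """Return a connection string usable by standard MongoDB drivers."""
    host = "%s.documents.azure.com" % account
    return "mongodb://%s:%s@%s:%d/?ssl=true" % (account, key, host, port)
```

A driver such as PyMongo would then connect with `pymongo.MongoClient(uri)` and talk to the account exactly as it would to any MongoDB server.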

Quick:

Watch: DocumentDB: overview and offline development experience
Read: What is DocumentDB?

More time:

Watch: Delivering Applications at Scale with DocumentDB, Azure’s NoSQL Document Database
Read: An Overview of Microsoft Azure DocumentDB
Do: NoSQL tutorial: Build a DocumentDB C# console application
Learn: Azure DocumentDB: Planet-Scale NoSQL

9. Mobile Center

If you’re already working on a mobile app, you should learn more about mobile DevOps with Visual Studio Mobile Center, which brings together our mobile developer services, including HockeyApp and Xamarin Test Cloud. Currently in Preview, Visual Studio Mobile Center provides cloud-powered lifecycle services for mobile apps, including continuous integration, test automation, distribution, crash reporting, and application analytics. The Mobile Center SDK currently supports Android, iOS, Xamarin, and React Native apps with a roadmap to support more over the coming months.

Quick:

Read: Introducing Visual Studio Mobile Center (Preview)

More time:

Watch: Visual Studio Mobile Center with Thomas Dohmke
Read: Mobile DevOps – Exploring Visual Studio Mobile Center

10. Application Insights

Rich application metrics help you deliver and continuously improve applications for your customers. Application Insights is an extensible application performance management service that helps you detect, triage, and diagnose issues in web apps and services. You can integrate it into your DevOps pipeline and use it to monitor the usage and experience of your apps.

Quick:

Watch: Monitor Web Apps using Azure Application Insights
Read: General availability of Azure Application Insights

More time:

Watch: Advanced Analytics with Application Insights
Read: What is Application Insights?
Do: Interactive data analytics demo
Learn: Web and Data Application Development with Visual Studio 2017 and Azure (Module 1)

I hope you found an area that caught your interest and that you learn something new from the content provided.

Happy holidays!
Source: Azure

General Availability: Larger Block Blobs in Azure Storage

Azure Blob Storage is a massively scalable object storage solution capable of storing and serving tens to hundreds of petabytes of data per customer across a diverse set of data types, including media, documents, log files, scientific data, and much more. Many of our customers use Blobs to store very large data sets and have requested support for larger files. The introduction of larger Block Blobs increases the maximum blob size from 195 GB to 4.77 TB. The increased blob size better supports a diverse range of scenarios, from media companies storing and processing 4K and 8K video to cancer researchers sequencing DNA.

Azure Block Blobs have always been mutable, allowing a customer to insert, upload or delete blocks of data without needing to upload the entire blob. With the new larger block blob size, mutability offers even more significant performance and cost savings, especially for workloads where portions of a large object are frequently modified. For a deeper dive into the Block Blobs service including object mutability, please view this video from our last Build Conference. The REST API documentation for Put Block and Put Block List also covers object mutability. 

We have increased the maximum allowable block size from 4 MB to 100 MB, while maintaining support for up to 50,000 blocks committed to a single Blob. Range GETs continue to be supported on larger Block Blobs allowing high speed parallel downloads of the entire Blob, or just portions of the Blob. You can immediately begin taking advantage of this improvement in any existing Blob Storage or General Purpose Storage Account across all Azure regions. 
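The new limits multiply out to the larger maximum: 50,000 blocks × 100 MB ≈ 4.77 TB, where the old 4 MB blocks gave 50,000 × 4 MB ≈ 195 GB. As an illustrative sketch of how an uploader might plan blocks under these limits for Put Block and Put Block List (the helper below is a hypothetical example, not part of any Azure SDK):

```python
import base64

MAX_BLOCK_SIZE = 100 * 1024 * 1024   # new per-block limit: 100 MB
MAX_BLOCK_COUNT = 50000              # unchanged per-blob block limit

def plan_blocks(blob_size, block_size=MAX_BLOCK_SIZE):
    """Split blob_size bytes into (block_id, offset, length) tuples.

    Put Block requires base64-encoded block IDs of equal length within a
    blob, so a zero-padded sequence number is encoded as each ID.
    """
    if blob_size > MAX_BLOCK_SIZE * MAX_BLOCK_COUNT:
        raise ValueError("blob exceeds the 50,000-block maximum size")
    blocks = []
    offset = 0
    while offset < blob_size:
        length = min(block_size, blob_size - offset)
        seq = ("%08d" % len(blocks)).encode("ascii")
        block_id = base64.b64encode(seq).decode("ascii")
        blocks.append((block_id, offset, length))
        offset += length
    return blocks
```

Each (offset, length) range would be uploaded with Put Block under its block ID, and the ordered list of IDs committed with a single Put Block List.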

Larger Block Blobs are supported by the most recent release of the .NET Client Library (version 8.0.0), with support for Java, Node.js and AzCopy rolling out over the next few weeks. You can also directly use the REST API as always. Larger Block Blobs are supported by REST API version 2016-05-31 and later. There is nothing new to learn about the APIs, so you can start uploading larger Block Blobs right away.  

This size increase only applies to Block Blobs, and the maximum size of Append Blobs (195 GB) and Page Blobs (1 TB) remains unchanged. There are no billing changes. To get started using Azure Storage Blobs, please see our getting started documentation, or reference one of our code samples.
Source: Azure

Flashback 2016 – Highlights from Azure SQL Data Warehouse

Earlier this year we announced the general availability of Azure SQL Data Warehouse, a SQL-based, fully managed, petabyte-scale cloud solution for data warehousing. SQL Data Warehouse is highly elastic, enabling you to provision in minutes and scale capacity in seconds. You can scale compute and storage independently, allowing you to burst compute for complex analytical workloads or scale down your warehouse for archival scenarios, and pay based on what you use instead of being locked into predefined cluster configurations. Unlike other cloud data warehouse services, SQL Data Warehouse offers the unique option to pause compute, giving you even more freedom to better manage your cloud costs.

We are excited to see the customer momentum across industries, and we have continued to bring new features to further enhance the customer experience. Leading digital advertising company MediaBrix uses Azure SQL Data Warehouse to run analysis across billions of rows of data, drilling deep into 30 TB of data. SQL Data Warehouse not only helps MediaBrix gain fast insights, but it also hooks up to technologies like Azure Machine Learning to create predictive models, change data in real time, and deliver the right ads to the right people. Learn more about how MediaBrix gets answers from data with the Microsoft Azure platform.

As 2016 ends, here are some of the highlights from a memorable year.

General availability now across 23 regions worldwide

Since announcing general availability in 14 regions in July, we have extended to 9 additional regions, bringing the total to 23 – more than any other major cloud provider. Customers can now use Azure SQL Data Warehouse in the following regions: North Europe, North Central US, Central US, East US, East US 2, South Central US, West Central US, West US, West US 2, Canada Central, Canada East, West Europe, Germany Central, Germany Northeast, East Asia, Southeast Asia, Australia Southeast, Central India, South India, China East, China North, Japan East, and Brazil South.

Industry-leading performance for analytic queries

Azure SQL Data Warehouse is powered by SQL Server underneath, and with GA we went live on SQL Server 2016. The industry-leading SQL Server 2016 columnstore implementation is at the core of serving analytic queries in SQL Data Warehouse. We significantly improved data compression and segment elimination to reduce I/O when processing large numbers of rows. Batch-mode execution on top of columnstore speeds up queries by orders of magnitude. With SQL Server 2016, we added batch-mode support for common analytic operators, such as ORDER BY and windowing aggregates. In addition, we now support aggregate pushdown and string predicate pushdown to the scan node.

“We can tell customers who’s actually consuming their advertising. For example, we might say that to target women aged 24 to 35 who have children, they’ll need to do so between 6 and 8 AM on the East Coast, preferably in Pennsylvania or New Jersey. It’s mind-blowing to tell them that, because they’re not getting that level of intelligence from anybody else.” – Christopher Beach, Senior Vice President of Engineering, MediaBrix

Learn more about how an advertising company gets answers from data with the Microsoft Azure platform.

Fast loading with ADF and PolyBase

We recently shared how you can use the Azure Data Factory Copy Wizard to load 1 TB of data into Azure SQL Data Warehouse in under 15 minutes, at over 1.2 GB per second throughput. Azure Data Factory is a data movement service in the cloud enabling ingestion of data from multiple on-premises and cloud sources, and SQL Data Warehouse has deployed a single-click integration with Data Factory to make data movement even easier. Using the staging blob feature, you can achieve high load speeds from all types of data stores beyond Azure Blob storage, which PolyBase supports by default.

“Using Azure SQL Data Warehouse, we’re able to do near-real-time compute so they can see data from the last hour. In some cases, we can even bypass our default system and go into a true live system that shows how many people are in the room at that time.” – Tom Sheppard, Chief Executive Officer, Presence Orb

Learn more about real-time benefits with Azure SQL Data Warehouse.

Enhanced migration, monitoring, and SQL tooling experience

Azure SQL Data Warehouse has introduced updates to the Azure portal and SQL Server Management Studio (SSMS) to provide a seamless experience when loading, monitoring, and developing your SQL Data Warehouse. The updates include integrated support for loading from 20+ data stores on-premises and in the cloud, and a simple process to troubleshoot common issues. The updates also bring highly requested functionality to SSMS, further enhancing the experience for SQL users, such as enabling the Generate Scripts wizard for database users and user-defined functions. SQL Data Warehouse also introduced a new top-level resource blade that lets you quickly manage all your databases; you can use it to scan through your data warehouses for details like name, status, server, pricing tier, location, and subscription.

Accelerated lookup queries

SQL Data Warehouse now supports the creation of secondary B-tree indexes on columnstore tables. Most analytic queries aggregate large amounts of data and are served well by scanning the columnstore segments directly. However, there is often a need to look for a “needle in a haystack” – a query that looks up a single row or a small range of rows. Such lookup queries can see orders-of-magnitude (even 1000x) improvements in response time, potentially running in sub-second, if there is a B-tree index on the filter column.

Easy integration with Azure Active Directory authentication and other services within Azure

Azure AD provides an alternative to SQL authentication, enabling centralized identity and group management. It enables a single sign-on experience with SQL Data Warehouse for federated domains. Azure AD can be used to authenticate against a growing number of Azure and other Microsoft services, and helps customers prevent the proliferation of users and passwords.

“Switching from Amazon Redshift was not just about a direct comparison to Azure SQL Data Warehouse. The overall Azure offering provided a lot of motivation.” – Bill Sabo, Managing Director of Information Technology, Integral Analytics

Learn more about Integral Analytics’ switch to Azure from AWS.

HIPAA certification

To enable greater adoption within the health industry, Azure SQL Data Warehouse is HIPAA certified. The Health Insurance Portability and Accountability Act (HIPAA) is a US healthcare law that establishes requirements for the use, disclosure, and safeguarding of individually identifiable health information.

New products from partners for an easy experience

We have had great partners join us to help customers on their journey to experience and adopt the service, including building custom products to enhance the data migration and management experience. For example, Redgate, a long-time partner that delivers SQL Server tools, has created Data Platform Studio (DPS), which provides a simple and reliable way to migrate on-premises SQL Server databases to Azure SQL Data Warehouse.

Exclusive free trial

At the annual PASS Summit in October, we announced an exclusive SQL Data Warehouse free trial, enabling customers to experience this cloud-based service for free for a month.

It’s been a great year, and thanks to everyone who has joined us on the journey. We are excited for the next year and look forward to helping you solve your most important data warehousing challenges and bringing you even more compelling features and service enhancements. Until then, we wish you the best start to 2017.

Learn more

Check out the many resources for learning more about SQL Data Warehouse, including:

What is Azure SQL Data Warehouse?
SQL Data Warehouse best practices
Video library
MSDN forum
Stack Overflow forum
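The lookup-query speedup described above is the classic gap between an index seek and a full scan, which is easy to illustrate outside the engine. Here is a sketch in Python using binary search over a sorted key list as a stand-in for a B-tree index (an analogy only, not how SQL Data Warehouse implements its indexes):

```python
import bisect

def scan_lookup(rows, key):
    """Full scan: O(n), examine rows until the key is found."""
    for k, row in rows:
        if k == key:
            return row
    return None

def index_lookup(rows, keys, key):
    """Index seek: O(log n) binary search on the sorted key column."""
    i = bisect.bisect_left(keys, key)
    if i < len(keys) and keys[i] == key:
        return rows[i][1]
    return None

# rows kept sorted by key, as a B-tree would maintain them
rows = [(k, "row-%d" % k) for k in range(100000)]
keys = [k for k, _ in rows]
```

For 100,000 rows the seek touches roughly 17 keys instead of up to 100,000; that asymptotic gap is what makes sub-second point lookups possible on tables with billions of rows.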
Source: Azure

Project Bletchley – Corda Distributed Ledger available on Azure

We are very excited to announce the availability of R3’s Corda on Microsoft Azure, only a few weeks after its release to the open source community.  With the addition of Corda to Project Bletchley, we continue to expand our distributed ledger platform support on Azure to enable the next generation of distributed business applications.

Today, R3 published a virtual machine image in the Azure Marketplace to easily and quickly deploy a multi-member Corda demo network on a single VM.  This offering demonstrates the capabilities of Corda through real world scenarios, including an interest rate swap deal and Standard Initial Margin Model (SIMM) valuation. This is the first click-stop of many towards R3’s financial-grade distributed ledger platform offering on Azure.

As Richard G Brown, Chief Technology Officer at R3, describes it: “Corda is a distributed ledger platform designed from the ground up to record, manage and synchronize financial agreements between regulated financial institutions. It is heavily inspired by and captures the benefits of blockchain systems, without the design choices that make blockchains inappropriate for many banking scenarios. By making simple Corda demos available on the Azure Marketplace, R3 and Microsoft are making it easy for newcomers to experience Corda for themselves before joining the community.”

Some of Corda’s key design choices include:

Designed for Business: Recording and management of financial transactions, within existing legal and regulatory frameworks
Privacy First: Data sharing restricted to entitled participants of a transaction
Modular Consensus: Support for multiple consensus algorithms

For additional information, you can download the technical white paper.

Try out the Corda distributed ledger demo on Azure to gain insight for your business process development and let us know what you think!  Do not hesitate to leave a comment with questions, feedback, or additional requests as you begin.
Source: Azure

New survey shows hybrid is leading approach, security waning as blocker to cloud adoption

This post is authored by Julia White, Corporate Vice President, Azure + Security Marketing.

Earlier this fall we once again invited our community of IT professionals, developers, and technology decision makers to participate in the 2016 Future of Cloud Survey led by North Bridge Growth Equity Venture Partners and research analyst firm Wikibon, which analyzes trends in adoption, use, and challenges every year.

The results are in, and while many of the findings reiterate trends we’ve been seeing, there are also a few new trends emerging. As we turn the corner into 2017, I want to share some of my thoughts on the key findings. 

As we’ve noted before, the use of cloud technology by organizations of all sizes has hit mainstream levels of adoption. This research shows that 42 percent of organizations have a cloud-first or cloud-only strategy, and another 49 percent are using cloud in key aspects of their technology systems. This means over 90 percent of companies surveyed report they are using cloud in a meaningful way. It’s no longer a matter of “if we move to cloud,” but when and how. From this robust set of survey information, here’s what I found most interesting:

Hybrid is the logical path forward.

Hybrid cloud, meaning a combination of public cloud and on-premises systems, remains the most common approach for organizations. The Future of Cloud survey found that a hybrid model is still the predominant strategy at 47 percent, followed by purely public cloud use at 30 percent. In our own survey of 2,500 IT professionals, we found that hybrid isn’t just a short-term strategy: 9 in 10 (91 percent) of IT workers believe hybrid cloud will remain the approach for their organizations five years from now. Every organization has a unique set of existing systems and business policies, so using a mix of public cloud and on-premises technology simply makes sense. This means organizations must ensure hybrid cloud is efficient. Hybrid systems cannot be two separate infrastructures connected but running in parallel; there must be consistency in management, security, and development experience to make this feasible.

Blockers to cloud adoption are changing – vendor lock-in and privacy rising in concern, while companies are becoming more comfortable with cloud security.

While security continues to be the top concern with using cloud, we’re pleased to see overall concern has dropped significantly since 2015. Interestingly, in this survey 50 percent of respondents cite security as a benefit of using cloud, while 50 percent say security is a barrier. We should expect this to continue to tip in favor of cloud as a benefit moving forward. While security concerns are diminishing, privacy concerns are rising. In 2011, privacy didn’t even make the top list; it has now risen to the number 3 spot among cloud concerns. With new regulations such as GDPR, it makes sense that privacy concerns are on the rise, and this will likely continue to increase. This also means that the privacy policies and privacy track records of the global cloud providers should come under greater scrutiny – appropriately so.

For the first time, concerns over vendor lock-in rose to the second-highest spot. In this way, the cloud is no different than on-premises technology: customers need to know they can change course if and when needed. In a recent post on top cloud myths, I talked about why enterprises need multiple public cloud vendors, as well as the rise of multi-cloud management solutions for enterprises to manage these systems. Belief that one cloud vendor can meet all needs is simply out of touch with reality and smacks of vendor hubris. The balance for organizations is which mixture of cloud technology to tap into for their different needs. SaaS-based business systems are the most efficient, but are just as non-portable as their on-premises equivalents. Platform services provide greater development efficiency than infrastructure as a service (IaaS), but can be less portable. Fundamentally, the cloud doesn’t change vendor lock-in concerns or dynamics; the same trade-offs in technology choices remain.

SaaS starts the cloud journey, then IaaS and PaaS.

Often, adopting SaaS business applications is a company’s first step into using cloud for its digital transformation. We see many of our enterprise customers start with Office 365, then adopt Azure to run other existing business apps on IaaS, and then tap into the development efficiencies of Azure PaaS offerings to create new solutions. With this, it’s not too surprising that 7 in 10 companies said they are using SaaS in their organization, followed by infrastructure as a service (IaaS), with 58 percent of respondents deploying IaaS for compute. Platform as a service (PaaS) is deployed by 45 percent of respondents, but is expected to show the highest increase over the next two years, growing by 19 percent. Despite plenty of market focus on IaaS technology, I consistently hear from customers about the efficiencies of using SaaS and PaaS technologies. This has been a fundamental reason why, across the Microsoft cloud, we’ve put significant focus on these areas – across Office 365, Dynamics 365, PowerApps, Flow, Azure IoT, and Cortana Intelligence services, among others. 

New innovations continue to drive cloud adoption, DevOps model rises.

New technology innovations – from IoT and advanced analytics to containers and virtual reality – are creating new possibilities for companies’ digital transformation. Many of these technologies are only feasible using a cloud model, taking advantage of the scale, agility, and cost models the cloud provides. To this end, organizations report that their cloud investments are in these areas, with analytics as a top priority (58 percent); over half (52 percent) say containers are a priority, and 48 percent are investing in IoT. Virtual reality is an emerging area of innovation, with 16 percent listing it as a priority. Reinforcing the focus on rapid innovation, both mobile and open source are now twice as likely as last year to be cited as drivers for cloud computing.

With this focus on innovation, we’re also seeing a rise in DevOps. Over half (51 percent) have begun DevOps in small teams – up 37 percent year over year – and 30 percent of companies have begun DevOps in large teams company-wide, up 2x year over year. We’re also seeing this trend in our customer base, with many traditional IT organizations classifying themselves as DevOps in surveys and event registrations.

These are just a few of the top trends that popped for me in this research. I’d love to hear your thoughts on what 2017 will bring in cloud adoption and trends. Join us in the conversation on
Quelle: Azure

Microsoft Azure Storage Import/Export Announcements

The Azure Import/Export service enables transfer of data in and out of Azure Blob storage by shipping data on hard drives. You can use this service when you have a large amount of data that would take too long to upload or download over the network. Today we are happy to share some improvements we have made to this service.

Azure Import/Export in Azure portal

Today we are excited to announce the general availability of Azure Import/Export Service in the new Azure portal.

With Azure portal support, you can now:

Manage import and export jobs across all storage accounts from the same pane.
View the status of the overall job and of each drive individually, including percentage complete during data transfer.
Obtain a link to the error log if a warning or failure is encountered during an import or export job.
See reminder warnings if you miss updating tracking information, helping you avoid delays in job processing.

Storage Accounts – Classic, Azure Resource Manager (cool and hot access tier)

In addition to Classic storage accounts, you can now create import and export jobs targeting Azure Resource Manager accounts with Hot and Cool access tiers. New jobs for both Classic and Resource Manager storage accounts can be created from the Azure portal. Jobs already created in the classic portal can continue to be managed there.

Import/Export tool – multi-drive and multi-source support

We are also happy to announce improvements to the WAImportExport client tool. Download the latest version of the WAImportExport client tool from the Download Center, and learn how to use the WAImportExport client tool. You can use this tool to copy data to the hard drives you are going to ship to an Azure datacenter.

These enhancements significantly reduce the preparation work needed before shipping large amounts of data to an Azure datacenter for processing. You no longer need to shard the data yourself or figure out the optimal placement of data across multiple disks; you can now copy data from multiple source directories in a single command line.

This version of the tool provides the following enhancements while preparing disks for an import job:

Ability to copy data from multiple source directories in a single command.
Ability to restructure the destination data from various source directories to different destination virtual directories.
Ability to prepare multiple disks using a single command line.
Automatic spreading of data across multiple disks.
Central control of how each disk and its data are handled using comma-separated values (CSV) files.
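As a hypothetical illustration of the CSV-driven workflow (the column names, file names, and flags below are assumptions based on typical WAImportExport usage, not taken verbatim from this announcement), a dataset file might map several source directories to different destination virtual directories:

```
BasePath,DstBlobPathOrPrefix,BlobType,Disposition,MetadataFile,PropertiesFile
"C:\data\photos\","backups/photos/",BlockBlob,rename,"None",None
"D:\data\videos\","backups/videos/",BlockBlob,rename,"None",None
```

A single command could then reference that file, letting the tool spread the data automatically across the disks listed in a separate drive-set CSV:

```
WAImportExport.exe PrepImport /j:Journal1.jrn /id:session1 /sk:<StorageAccountKey> /InitialDriveSet:driveset.csv /DataSet:dataset.csv
```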

Import/Export tool v1 – platform agnostic copy

We are pleased to introduce a new flag, /skipwrite, in the WAImportExport V1 release of the tool.

With this new flag, disk preparation can be split into two stages: data copy using any OS or tool, and post-processing using the Import/Export tool v1. When the data is already pre-written to the disk, post-processing generates the journal file and drive manifest and optionally encrypts the disk.

This means the data copy can happen on any OS that supports the NTFS file system, such as Linux; only the post-processing needs to happen on a Windows machine.

You can also reuse your existing backup hard disks directly, eliminating the need to procure additional hardware for an Azure Import/Export job.
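As a sketch of the two-stage flow (the flags other than /skipwrite and the paths are assumptions based on typical V1 usage; the storage account key is a placeholder):

```
REM Stage 1: copy data to the NTFS-formatted disk on any OS or with any tool,
REM e.g. on Linux: rsync -a /backup/ /mnt/import-disk/

REM Stage 2: on a Windows machine, run post-processing only. /skipwrite tells
REM the tool the data is already on drive D:, so it only generates the journal
REM file and drive manifest instead of copying the data again.
WAImportExport.exe PrepImport /j:Journal1.jrn /id:session1 /sk:<StorageAccountKey> /t:D /srcdir:D:\data /dstdir:backups/ /skipwrite
```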

You can download the WAImportExport V1 version of the tool and refer to the WAImportExport V1 usage guide for further information.

Regional availability expansion

We have now expanded the Azure Import/Export service to Canada, US Gov, and China. With these additions, you can choose from one of 20 different regions close to you for shipping your drives. For more information on all the regions, please see Import/Export under Storage on the Azure Products available by region page.

Hard drive types

In addition to 3.5” internal SATA II/III disks, Azure Import/Export can now process your import or export jobs using 2.5” internal SATA drives and internal SSDs.

Questions?

For more information, go to Azure Import/Export Overview. Your feedback is important to us, so send all your feedback or any feature requests using the Import/Export User Voice. And don’t worry – if you need any assistance, Microsoft Support is there to help you along the way!
Quelle: Azure

Azure Analysis Services is now available in North Europe and West US

Azure Analysis Services is a new preview service in Microsoft Azure where you can host semantic data models. Users in your organization can then connect to your data models using tools like Excel, Power BI, and many others to create reports and perform ad-hoc data analysis.

We are excited to announce that Azure Analysis Services is now available in North Europe and West US.

Azure Analysis Services is now available in the following regions: North Europe, West Europe, West US, South Central US and West Central US.
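When connecting from a client tool, a server in any of these regions is addressed by a URI that embeds the region name. For example, a server in North Europe would be reached at an address of this form (the server name here is a placeholder):

```
asazure://northeurope.asazure.windows.net/yourservername
```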

Learn more about Azure Analysis Services or try creating your first data model.
Quelle: Azure