Backup Managed Disk VMs using Azure Backup

Last week we announced the general availability of Managed Disks. Managed Disks are Azure Resource Manager (ARM) resources that can be deployed via templates to create thousands of Managed Disks without worrying about creating storage accounts or specifying disk details. Backing up Managed Disk VMs to protect against accidental deletions and corruptions resulting from human error is a critical capability for customers of all sizes. With the Azure Backup service, you get key enterprise features such as backup, restore, policy-based management, backup alerts, job monitoring, and instant data recovery, all without deploying any infrastructure in your tenant environment. You can back up Managed Disk VMs directly from the VM management blade, and the user experience is consistent with backing up VMs attached to Standard or Premium unmanaged disks.

Value Proposition

Azure Backup’s cloud-first approach provides:

Freedom from infrastructure: no need to deploy any infrastructure to back up VMs.
Eliminate backup storage management with a bottomless Recovery Services vault.
Pay-as-you-go model with no egress costs for restores.
Self-service backup and restore.

Key features

Application-consistent backups for Windows Azure VMs and file-system-consistent backups for Linux Azure VMs, without the need to shut down the VM.
Policy-Based Management: Azure Backup allows you to specify the backup schedule as well as the retention policy for backups. The service handles periodic backups as well as the pruning of recovery points beyond the configured retention period.
Long Term Retention of backup data for years even beyond the lifecycle of the VM.
Full VM and Disk restore: If your VM is corrupted and needs replacement, or you simply want to make a copy of the VM, you can do so with a full VM or disk restore.
Instant Data Recovery: With Instant Data Recovery, you can restore individual files and folders within the VM instantly, without provisioning any additional infrastructure and at no additional cost. Instant restore provides a writeable snapshot of a recovery point that you can quickly mount and browse, recovering files and folders by simply copying them to a destination of your choice. These snapshots even allow you to open application files, such as SQL Server or MySQL files, directly from cloud recovery point snapshots as if they were present locally, and attach them to live application instances without having to copy them.
Role-Based Access: You can limit access to backup data in the Recovery Services vault using role-based access controls. Azure Backup supports the Backup Contributor, Backup Operator, and Backup Reader roles at the vault level.
Monitoring and Alerting: You can monitor your backup and restore jobs from the Recovery Services vault dashboard. In addition, you can configure email alerts for job failures.

Customers can back up data to a Recovery Services vault in all public Azure regions, including Canada, UK, and West US 2.

Getting started

To get started, enable backup with a few steps:

Select a virtual machine from the Virtual machines list view. Select Backup in the Settings menu.
Create or select a Recovery Services Vault:  The vault maintains backups in a separate storage account with its own lifecycle management. 
Create or select a Backup Policy

Watch the video below to instantly recover files from an Azure VM (Windows) backup.

Watch the video below to instantly recover files from an Azure VM (Linux) backup.

The instant restore capability will be available soon for users who are protecting their Linux VMs using Azure VM backup. If you are interested in being an early adopter and want to provide valuable feedback, please let us know at linuxazurebackupteam@service.microsoft.com. Watch the video below to know more.

Related links and additional content

Want more details? Check out the Azure Backup documentation and the Managed Disks blog
Learn more about Azure Backup
Need help? Reach out to the Azure Backup forum for support
Sign up for a free Azure trial subscription
Tell us how we can improve Azure Backup by contributing new ideas and voting up existing ones
Follow us on Twitter @AzureBackup for the latest news and updates

Quelle: Azure

Microsoft Enterprise Services Tips to using SQL Data Warehouse effectively

Azure SQL Data Warehouse is a SQL-based, fully managed, petabyte-scale cloud solution for data warehousing. With tight integration with other Azure products, SQL Data Warehouse represents the backbone of any major cloud data warehousing solution in Azure. With decoupled compute and storage, elastic scale-out functionality, and the ability to pause and resume to meet demand, SQL Data Warehouse gives customers the flexibility, cost savings, and performance they need, right when they need it.

Microsoft Enterprise Services (ES) is one of the largest consulting and support businesses in the world. Microsoft ES operates across more than 100 subsidiaries around the globe, in spaces such as cloud productivity, mobility solutions, adoption services, and more. With the speed and scale at which clients’ operations transform and grow, it is paramount that ES stays ahead of the curve to meet future demand.

The right data at the right time allows ES to make the best decisions about where to fund resources and how to best serve their customers. Their traditional data warehouse reporting stack took far too long to deliver reports and was far too inflexible to change models. Adding the costs of complexity, maintenance, and scaling to match growing data, their traditional on-premises data warehouse was producing less value than the cost of its upkeep and development, distracting from the core business value of delivering insights.

The move to a modern data warehouse solution was becoming readily apparent. ES analytics split their workload into data processing and data distribution. Core requirements for ES included easy scalability, high I/O, multi-terabyte storage, row-level data security, and support for more than 200 concurrent users. Working through the host of Azure service offerings, ES landed on a solution with Azure SQL Data Warehouse for their data processing and ad-hoc analysis workload and IaaS SQL Server 2016 Always On instances as their data distribution layer.

Implementation

At a high level, the implementation for ES takes audit and historical data from a variety of on-premises sources, which first lands in Azure Storage Blobs. PolyBase is then used to load the data quickly, in parallel, into Azure SQL Data Warehouse, where it is processed and transformed into dimension and fact tables. Afterwards, these dimension and fact tables are moved into Analysis Services and SQL Server IaaS instances to support fast, highly concurrent access by a variety of business users. Across this solution, Azure Data Factory acts as the orchestrating ELT framework, providing a single interface to control the data flow for the majority of the pipeline.
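A PolyBase load like the one described above can be sketched in T-SQL roughly as follows. This is a minimal sketch, not ES's actual pipeline; all object names (the storage account, credential, schemas, and columns) are hypothetical:

```sql
-- External data source pointing at the blob container where extracts land
-- (account name, container, and credential are hypothetical).
CREATE EXTERNAL DATA SOURCE AzureBlobStage
WITH (
    TYPE = HADOOP,
    LOCATION = 'wasbs://stage@myaccount.blob.core.windows.net',
    CREDENTIAL = BlobStorageCredential
);

CREATE EXTERNAL FILE FORMAT PipeDelimited
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = '|'));

-- External table over the staged files.
CREATE EXTERNAL TABLE ext.SalesAudit (
    SaleId     INT,
    CustomerId INT,
    Amount     DECIMAL(18,2)
)
WITH (LOCATION = '/sales/', DATA_SOURCE = AzureBlobStage,
      FILE_FORMAT = PipeDelimited);

-- CTAS pulls the files in parallel into an internal, hash-distributed table.
CREATE TABLE stg.SalesAudit
WITH (DISTRIBUTION = HASH(CustomerId), CLUSTERED COLUMNSTORE INDEX)
AS SELECT * FROM ext.SalesAudit;
```

The CTAS at the end is what makes the load parallel: every compute node reads its share of the files directly from blob storage.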

8 Tips for SQL Data Warehouse

Putting together a solution like this, while performant, is not a trivial task. Listed below is some guidance straight from the ES team on designing solutions with Azure SQL Data Warehouse:

1. Stage the data in Azure SQL DW:

One of the guiding principles of our warehouse has been to stage the data in its native form, i.e., the way it is available in the source. There are various reasons why we persist a copy of the source: performance, data quality, data persistence for validation, and so on. Because we stage the data, we are able to distribute it as per our needs and ensure we have minimal data skew. Rarely, if ever, have we used the round-robin mechanism to store data.

2. Arriving at a common distribution key:

The data staged into the Azure SQL Data Warehouse instance was ingested from a 3NF data source. This allowed us to slightly change the schema from the source and include the base table’s foreign key in all tables, which are distributed on the same key. During our fact load we join the tables on this set of keys, thereby minimizing, and in some cases eliminating, DMS (Data Movement Service) operations, which gives us an edge in terms of performance. As an example, the data in the source systems has one-to-many relationships between tables; however, in our SQL DW we have inserted a common distribution key across all tables, based on business logic, and this key gets loaded when the ELT runs.

However, we would recommend checking the data skew before going ahead with this approach, as the distribution key must be chosen based on the skew that one may see. When data skew is high or the joins are not compatible, we create an interim table that is distributed on the same key as the other join table. We use CTAS (CREATE TABLE AS SELECT) to accomplish this, which incurs one DMS operation to redistribute the keys but improves performance when there are complex joins.
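The interim-table pattern described above might look like this in T-SQL. A hedged sketch with hypothetical table and column names, assuming Orders is distributed on CustomerId while OrderLines arrived distributed on OrderId:

```sql
-- One-time CTAS re-distributes OrderLines onto the common key (one DMS
-- operation here), so the subsequent fact-load join is distribution-local.
CREATE TABLE stg.OrderLines_ByCustomer
WITH (DISTRIBUTION = HASH(CustomerId), CLUSTERED COLUMNSTORE INDEX)
AS
SELECT ol.*, o.CustomerId
FROM stg.OrderLines AS ol
JOIN stg.Orders     AS o  ON o.OrderId = ol.OrderId;

-- Both sides now hash on CustomerId: this join needs no further DMS work.
SELECT o.CustomerId, SUM(l.Amount) AS Total
FROM stg.Orders AS o
JOIN stg.OrderLines_ByCustomer AS l
  ON l.CustomerId = o.CustomerId AND l.OrderId = o.OrderId
GROUP BY o.CustomerId;
```

The cost of the one redistribution is paid once, instead of on every complex join against the table.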

3. Vertical Partitioning of Wide Tables:

We had a row size limitation of 32 KB in Azure SQL DW. Since we had several wide tables with 150+ columns, many of them varchar(4000), we came up with an approach of vertically partitioning the table on the same distribution key. This helped us overcome the 32 KB limit and at the same time provide the required performance when joining the two tables, as the distribution key was the same.

Note: SQL Data Warehouse now supports rows up to 1 MB wide.
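A vertical partition of a wide table along the shared distribution key could be sketched as follows (hypothetical table and column names; the point is that both halves hash on the same key):

```sql
-- Narrow "core" columns in one table...
CREATE TABLE dbo.Claim_Core
( ClaimId INT NOT NULL, CustomerId INT NOT NULL, Status VARCHAR(50) )
WITH (DISTRIBUTION = HASH(ClaimId), CLUSTERED COLUMNSTORE INDEX);

-- ...and the wide varchar(4000) columns in a second table on the same key.
CREATE TABLE dbo.Claim_Notes
( ClaimId INT NOT NULL, Notes1 VARCHAR(4000), Notes2 VARCHAR(4000) )
WITH (DISTRIBUTION = HASH(ClaimId), CLUSTERED COLUMNSTORE INDEX);

-- Both partitions hash on ClaimId, so rejoining them is distribution-local.
SELECT c.ClaimId, c.Status, n.Notes1
FROM dbo.Claim_Core  AS c
JOIN dbo.Claim_Notes AS n ON n.ClaimId = c.ClaimId;
```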

4. Use the right resource class:

In several cases we had complex facts that needed more resources (memory and CPU) to speed up the fact loads. Not just facts: we also had dimensions with complex business rules and Type 2 (slowly changing dimension) implementations. We designed our ELT in such a way that less complex facts and dimensions run under the smallrc resource class, providing more parallelism, whereas the more complex facts that need more resources run under the largerc resource class.
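Resource classes in SQL Data Warehouse are assigned via database role membership, so a split like the one above can be expressed roughly as follows (the login names are hypothetical; smallrc is already the default for every user):

```sql
-- Heavy fact loads connect as this login and get largerc's larger
-- per-query memory grant (at the cost of fewer concurrent slots).
EXEC sp_addrolemember 'largerc', 'elt_heavy_user';

-- To move a user back to the default smallrc, drop the membership:
-- EXEC sp_droprolemember 'largerc', 'elt_heavy_user';
```

The ELT then simply routes each load through the login whose resource class matches its complexity.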

5. Use the primary key as distribution column for master tables:

In the source from which we ingest data into SQL DW, we have many master tables that we use for lookups when building our facts. We distribute the master tables with a reasonable amount of data (more than 1 million rows) on the primary key, which is a unique integer. This has given us the advantage of even data distribution (minimal to no data skew), making our lookup queries very fast.
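Declaring that distribution is a one-line choice in the table DDL. A hypothetical master table following the tip above; since the key is unique, rows spread evenly across the distributions:

```sql
CREATE TABLE dbo.DimCustomer
(
    CustomerId   INT NOT NULL,   -- unique integer key from the source
    CustomerName NVARCHAR(200),
    Region       NVARCHAR(50)
)
WITH (DISTRIBUTION = HASH(CustomerId),  -- unique key => no data skew
      CLUSTERED COLUMNSTORE INDEX);
```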

6. Using Dynamic Scale up and Scale down for saving costs:

Our ELT pipelines in ADF are designed so that, prior to the scheduled ELT kickoff, we scale our instance up from 100 DWU to 600 DWU. This has led to huge cost savings. Our ELT runs for nearly 4-5 hours, during which DWU usage is capped at 600 DWU. At month end, when there is a need for faster processing and businesses need data sooner, we have the option of scaling to 1000 DWU. All of this is done as part of our ELTs, and no manual intervention is needed.
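Because scaling is itself a T-SQL statement (run against the logical server's master database), an ADF pipeline step can bracket the ELT window with it. A sketch with a hypothetical database name:

```sql
-- Scale up before the ELT window...
ALTER DATABASE MyDW MODIFY (SERVICE_OBJECTIVE = 'DW600');

-- ...ELT runs here...

-- ...then scale back down so compute is only paid for while needed.
ALTER DATABASE MyDW MODIFY (SERVICE_OBJECTIVE = 'DW100');
```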

7. Regular Maintenance:

In our case, we included updating statistics and index rebuilds as part of the ELT process. As soon as the dimension and fact loads complete, we check all clustered columnstore indexes (CCIs) where fragmentation is greater than 5% and rebuild the index. Similarly, for the key tables, we update statistics to ensure the best performance.
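The per-table maintenance step can be as simple as the following (hypothetical fact table name; in practice the tables to touch would be picked from the columnstore DMVs based on the fragmentation threshold mentioned above):

```sql
-- Rebuild the clustered columnstore index to compact open/fragmented
-- rowgroups left behind by the load.
ALTER INDEX ALL ON dbo.FactSales REBUILD;

-- Refresh statistics so the optimizer sees the post-load row counts.
UPDATE STATISTICS dbo.FactSales;
```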

8. Leverage SQL Server 2016 for data marts:

Azure SQL DW is the primary data processing engine, while SQL Server 2016 running on DS14 IaaS VMs is our primary channel for data distribution. This has enabled us to leverage the high concurrency provided by SQL Server while using the power of Azure SQL DW for processing. More than 1,000 users consume the data provisioned from Azure SQL DW.

Using Azure SQL Data Warehouse as part of their solution, Microsoft Enterprise Services was able to reduce run times by up to 40% and provide insights to business users with reduced latency.

If you have not already explored this fully managed, petabyte scale cloud data warehouse service, learn more at the links below.

Learn more

What is Azure SQL Data Warehouse?
SQL Data Warehouse best practices
Video library
MSDN forum
Stack Overflow forum
Quelle: Azure

Register today for SMB Live

This year’s SMB Live events, coming to 22 cities across the United States, include a day of new in-depth, hands-on marketing content and a day of 200- and 300-level technical training to modernize your organization’s technical skills and marketing capabilities. You’ll leave SMB Live with the tools you need for cloud business growth, prepared to help modernize your SMB customers to Azure Hybrid IT, Enterprise Mobility + Security, Microsoft Dynamics 365, and more!

Space at these SMB Live events is limited. Register today to secure your spot at the event in the city nearest you.

Register Today!

Day 1

Are you taking full advantage of the hybrid cloud opportunity with your customers? Is Azure coming up more and more in your customer conversations? Are your customers looking for a secure and efficient cloud solution? Join our experts at this free, one-day SMB Live session and see how you too can be that market leader.

What you will take away from this training:

Simple ways you can create a differentiated Cloud offering, including developing repeatable solutions to grow your profitability by learning how successful partners are:

building a Cloud Infrastructure or Azure Hybrid IT practice to protect customers, while increasing recurring revenues. 
leading with security solutions leveraging Enterprise Mobility + Security (EMS)

How to align your sales process to the new Cloud customer buying behaviors and proven ways to maximize your revenue from your existing customer base. 
See how new Microsoft business productivity tools can help you drive new and recurring profits. Hear what’s new with Windows 10 Enterprise Subscription E3 for CSP, Microsoft Dynamics 365 for Financials, SQL & Windows Server 2016 and Office 365 E5 (Skype Telephony).

What Organizations should attend:

Experienced Cloud partners who are interested in developing a new or current cloud practice beyond Office 365 with Microsoft Azure Hybrid IT solutions.

Who from the organization should attend Day 1?

Business leads, sales, marketing, and/or technical business decision-makers.

Register Today! 

Day 2

Learn how to differentiate yourself from your competitors using Azure hybrid cloud solutions and scalable workloads. This one-day, hands-on technical training will provide you with the tools and skills necessary to develop and manage your cloud practice, easily spec and estimate monthly Azure costs, and learn how powerful Azure migration offerings from other partners can help move your customers to the cloud.

What you will take away from this training:

Technical Readiness – How to develop and deploy core Azure workloads with hands-on training configuring Business Continuity solutions and Business Operations
Accurate Azure Price Forecasting – Learn how to present an Azure solution with the use of detailed pricing structures and scenarios.
Understanding Azure Solution Profitability – Review and understand the profitability potential with a view to near- and long-term customer relationships.
New and Existing Customer Assessment and Migration Resources – Resources to help you analyze your current customer base and begin migrations immediately to Azure.

What Organizations should attend:

Experienced Cloud partners who are interested in developing a new or current cloud practice beyond Office 365 with Microsoft Azure Hybrid IT solutions

Who from the organization should attend day 2?

Business and technical decision makers, and pre- and post-sales technical roles.

Register Today!
Quelle: Azure

Connect Excel to an Azure Analysis Services server

You can connect to Azure Analysis Services from Power BI, Excel, and many other third-party client tools. This blog will focus on how to connect to your Azure Analysis Services server from Microsoft Excel.

Before getting started, you’ll need:

A data model deployed at an Azure Analysis Services server – Creating your first data model in Azure Analysis Services.
Microsoft Excel – If you have the latest version of Excel 2016 from Office 365, you do not need to install any additional updates. For non-Office 365 or older versions of Excel, the MSOLAP.7 provider is required.

In Excel 2016, on the Data ribbon, click Get External Data > From Other Sources > From Analysis Services.

In the Data Connection Wizard, in Server name, enter the name of your Azure Analysis Services server. Then, in Logon credentials, select Use the following User Name and Password, and then type the organizational user name, for example nancy@adventureworks.com, and password.

In Select Database and Table, select the database and model or perspective, and then click Finish.

Select OK to create a PivotTable report.


A PivotTable will be created, and you will now see your field list on the side. You can drag and drop different fields to build out your PivotTable.

Learn more about Azure Analysis Services and how to have Faster PivotTables in Excel 2016
Quelle: Azure

Dear #MongoDB users, we welcome you to #Azure #DocumentDB

First and foremost, security is our priority 

Microsoft makes security a priority at every step, from code development to incident response. Azure code development adheres to Microsoft’s Security Development Lifecycle (SDL) – a software development process that helps developers build more secure software and address security compliance requirements while reducing development cost. Azure Security Center makes Azure the only public cloud platform to offer continuous security-health monitoring. Azure is ubiquitous, with a global footprint approaching 40 geographical regions and continuously expanding. With its worldwide presence, one of the differentiated capabilities Azure offers is the ability to easily build, deploy, and manage globally distributed data-driven applications that are secure.

Azure DocumentDB is Microsoft's multi-tenant, globally distributed database system designed to enable developers to build planet-scale applications. DocumentDB allows you to elastically scale both throughput and storage across any number of geographical regions. The service offers guaranteed low latency at P99, 99.99% high availability, predictable throughput, and multiple well-defined consistency models, all backed by comprehensive enterprise-level SLAs. By virtue of its schema-agnostic and write-optimized database engine, DocumentDB is capable of automatically indexing all the data it ingests by default and serving SQL, MongoDB, and JavaScript language-integrated queries in a scale-independent manner.

DocumentDB has a number of powerful security features built in. To secure data stored in an Azure DocumentDB database account, DocumentDB provides support for a secret-based authorization model that utilizes a strong hash-based message authentication code (HMAC). In addition to the secret-based authorization model, DocumentDB also supports policy-driven, IP-based access controls for inbound firewall support. This model is very similar to the firewall rules of a traditional database system and provides an additional level of security to the DocumentDB database account. With this model, you can now configure a DocumentDB database account to be accessible only from an approved set of machines and/or cloud services. Once this configuration is applied, all requests originating from machines outside this allowed list will be blocked by the server. Access to DocumentDB resources from these approved sets of machines and services still requires the caller to present a valid authorization token. All communication inside the DocumentDB cluster (e.g., replication traffic) uses SSL, and all communication from MongoDB clients (or any other clients) to the DocumentDB service always uses SSL. To learn more about securing access to your data in DocumentDB, see Securing Access to DocumentDB Data.

The list below maps current DocumentDB features to the security checklist that MongoDB recommends.

Enable Access Control and Enforce Authentication: Enabled by default. Only discovery/authentication commands such as IsMaster/GetLastError/WhatsMyUri are supported before authentication.
Configure Role-Based Access Control: Each database account has its own key, with support for read-only keys to limit access. No default user/account is present.
Encrypt Communication: Non-SSL communication is not allowed; all communication to the service is always over SSL. DocumentDB requires TLS 1.2, which is more secure than TLS 1.0 and SSL 3.
Encrypt and Protect Data: Encryption at rest.
Limit Network Exposure: IP filtering.
Audit System Activity: All APIs and system activities are audited, and we plan to expose this to customers through the portal shortly (today we already expose it to customers when they ask for it).
Run MongoDB with a Dedicated User: DocumentDB is a multi-tenant service, so no account has direct access to core operating system resources.
Run MongoDB with Secure Configuration Options: DocumentDB supports only the MongoDB wire protocol and does not enable HTTP/JSONP endpoints.

The capabilities offered by DocumentDB span beyond the traditional geographical disaster recovery (Geo-DR) offered by "single-site" databases. Single-site databases offering Geo-DR capability are a strict subset of globally distributed databases. With DocumentDB's turnkey global distribution, developers do not have to build their own replication scaffolding by employing either the Lambda pattern (for example, AWS DynamoDB replication) over the database log or by doing "double writes" across multiple regions. We do not recommend these approaches, since it is impossible to ensure their correctness and to provide sound SLAs.

DocumentDB enables you to have policy-based geo-fencing capabilities. Geo-fencing is an important capability that enforces data governance and compliance restrictions, and it may prevent associating a specific region with your account. Examples of geo-fencing include (but are not restricted to) scoping global distribution to the regions within a sovereign cloud (for example, China and Germany) or within a government taxation boundary (for example, Australia). The policies are controlled using the metadata of your Azure subscription.

For failover, you can specify an exact sequence of regional failovers if there is a multi-regional outage and you can associate the priority to various regions associated with the database account. DocumentDB will ensure that the automatic failover sequence occurs in the priority order you specified.

We are also working on encryption-at-rest and in-motion. Customers will be able to encrypt data in DocumentDB to align with best practices for protecting confidentiality and data integrity. Stay tuned for that.

Second, you don’t have to rewrite your Apps

Moving to DocumentDB doesn’t require you to rewrite your apps or throw away your existing tools. DocumentDB supports the protocol for MongoDB, which means DocumentDB databases can now be used as the data store for apps written for MongoDB. This also means that, by using existing drivers for MongoDB databases, your applications written for MongoDB can now communicate with DocumentDB and use DocumentDB databases instead of MongoDB databases. In many cases, you can switch from MongoDB to DocumentDB by simply changing a connection string. Using this functionality, you can easily build and run MongoDB database applications in the Azure cloud, leveraging DocumentDB's fully managed and scalable NoSQL databases while continuing to use familiar skills and tools for MongoDB. Furthermore, we only support SSL for MongoDB (not HTTP) for the benefit of all users. Other benefits that you can get right away (that you can’t get anywhere else) include:

No Server Management – DocumentDB is a fully managed service, which means you do not have to manage any infrastructure or Virtual Machines yourself. And DocumentDB is available in all Azure Regions, so your data will be available globally instantly.
Limitless Scale – You can scale throughput and storage independently and elastically. You can add capacity to serve millions of requests per second with ease.
Enterprise grade – DocumentDB supports multiple local replicas to deliver 99.99% availability and data protection in the face of both local and regional failures. You automatically get enterprise grade compliance certifications and security features.
MongoDB Compatibility – DocumentDB protocol support for MongoDB is designed for compatibility with MongoDB. You can use your existing code, applications, drivers, and tools to work with DocumentDB.

Third, we do it with love…

Modern developers rely on dozens of different technologies to build apps, and that number is constantly expanding. These apps are often mission-critical and demand the best tools and technologies, regardless of vendor. That’s why we work so hard to find elegant, creative, and simple ways to enable our customers to build any app, using any model, with any language (e.g., Node.js, Java, Python, JavaScript, .NET, .NET Core, SQL) against DocumentDB. And that’s why there are thousands of apps built on top of DocumentDB for everything from IoT, advertising, marketing, e-commerce, customer support, and games, to power grid surveillance. We are deeply committed to making your experience on DocumentDB simply stellar! We offer a platform that brings everything together to simplify the process of building distributed apps at planet scale. We agonize over the best way to give developers the best experience, making sure our service works seamlessly with all other services in Azure, such as Azure Search, Azure Stream Analytics, Power BI, Azure HDInsight, and more. We strive for nearly instantaneous, yet thoughtful, human responses to each inquiry about DocumentDB that you post online. For us, this is not going above and beyond, it’s how we do it. This is who we are.

Welcome to real planet-scale NoSQL revolution!

We’re thrilled you’re going to be helping us define our NoSQL product (which capabilities to add, which APIs to support, and how to integrate with other products and services) to make our service even better. DocumentDB powers the businesses of banking and capital markets, professional services and discrete manufacturers, startups and health solutions. It is used everywhere in the world, and we’re just getting started. We’ve created something that both customers and developers really love and something we are really proud of! The revolution that is leading thousands of developers to flock to Azure DocumentDB has just started, and it is driven by something much deeper than just our product features. Building a product that allows for significant improvements in how developers build modern applications requires a degree of thoughtfulness, craftsmanship and empathy towards developers and what they are going through. We understand that, because we ourselves are developers.

We want to enable developers to truly transform the world we are living in through the apps they are building, which is even more important than the individual features we are putting into DocumentDB. Developing applications is hard, developing distributed applications at planet scale that are fast, scalable, elastic, always available and yet simple – is even harder. Yet it is a fundamental pre-requisite in reaching people globally in our modern world. We spend limitless hours talking to customers every day and adapting DocumentDB to make the experience truly stellar and fluid. The agility, performance and cost-effectiveness of apps built on top of DocumentDB is not an accident. Even tiny details make big differences.

So what are the next steps you should take? Here are a few that come to mind:

First, go to the Create a DocumentDB account with protocol support for MongoDB tutorial to create a DocumentDB account.
Then, follow the Connect to a DocumentDB account with protocol support for MongoDB tutorial to learn how to get your account connection string information.
Afterwards, take a look at the Use MongoChef with a DocumentDB account with protocol support for MongoDB tutorial to learn how to create a connection between your DocumentDB database and MongoDB app in MongoChef.
When you feel inspired (and you will be!), explore DocumentDB with protocol support for MongoDB samples.

Sincerely,
@rimmanehme + your friends @DocumentDB
Quelle: Azure

Key updates to Azure Backup Server

Microsoft Azure Backup Server (MABS) is a cloud-first backup solution that protects data and workloads across the heterogeneous IT environments of enterprises. It is available as a free download with Azure Backup, without requiring a System Center license or a SQL Server license for the server database. The latest update released for Azure Backup Server ensures that customers can centrally monitor all their backup entities, perform agentless backups, secure data against cyber threats such as ransomware and machine compromise, and recover from them. Azure Backup Server now goes a step further and provides security-based mechanisms to safeguard all operations that impact the availability of cloud data.

If you are new to Azure Backup Server:

You can download Microsoft Azure Backup Server and start protecting your infrastructure today. It is available as a free download with Azure Backup, without requiring a System Center license or a SQL Server license for the server database.

Learn more about Azure Backup Server using these short videos to get started.

Key features

Azure Backup Server recently added the following enterprise grade features to strengthen security, provide a centralized view of backup entities, and support key workloads:

Central monitoring – Customers can now monitor their on-premises assets backed up by Azure Backup Server from the Azure portal. The Recovery Services vault provides a centralized view of backup management servers, protected servers, backup items, and their associations. This makes it simple to search for backup items, identify the Azure Backup Server they are associated with, view disk utilization, and see other details related to these entities.
Security features – Azure Backup recently announced security features, available as part of the latest update. These features are built on three principles – prevention, alerting, and recovery – to enable organizations to increase their preparedness against attacks and equip them with a robust backup solution.
VMware support – Azure Backup Server now also supports VMware backup. This capability provides agentless backups, seamless discovery, and auto-protection features.
Availability in new regions – Azure Backup is already available in multiple regions. Customers can now back up data to new regions as well, including Canada, UK, and West US 2.

Getting started

To start leveraging these features, use the links and videos below.

Central monitoring

To start using central monitoring for Azure Backup Server, create a Recovery Services vault, download the latest Azure Recovery Services Agent, and register Azure Backup Server to this vault. If your Azure Backup Server is already registered to a Recovery Services vault, you can start leveraging these features by upgrading Azure Backup Server to Update 1 and installing the latest Azure Recovery Services Agent (minimum version 2.0.9062).

Security features

The video below explains how to get started by enabling Security features and how to leverage them in Azure Backup Server.

VMware backups

Follow these four simple steps to protect VMware VMs using Azure Backup Server. The first video in the series is linked below.

Related links and additional content

Download Azure Backup Server update 1
Learn more about Azure Backup Security Features
Getting started with Recovery Services vault
Need help? Reach out to the Azure Backup forum for support
Tell us how we can improve Azure Backup by contributing new ideas and voting up existing ones
Follow us on Twitter @AzureBackup for the latest news and updates

Source: Azure

How to make a movie the secure way

At the RSA 2017 Conference this week, we’ll be presenting “Securing the Making of the Next Hollywood Blockbuster” (San Francisco | February 15, 2017 | 1:30 PM – 2:15 PM) with Academy Award®-winning studio New Regency. It’s a spellbinding adventure about their transition to secure movie production in the cloud, made possible by Microsoft Azure and a cadre of ISV partners who’ve ported their solutions to the platform.

New Regency, responsible for critically acclaimed feature films such as The Revenant and Birdman, and blockbuster titles including the recent Assassin’s Creed, knows what it’s like to be on the bleeding-edge of technology while at the same time telling vivid stories in full cinematic glory. They use technology to both help convey elements of the script to deliver on the producer’s vision, and to optimize workflows for efficiency and cost savings. In the middle of it all: securing the content that is the lifeblood of the business.

That content comes in many forms—memos, scripts, pictures, audio, email, contracts, videos, and more—all with an ever-growing association of metadata. It’s also stored in many places—on servers, workstations, mobile storage, archives, etc.—which presents a massive security challenge. Data flies far and wide, challenging all efforts to lock it down. On top of that, personal information such as health records, contract details, and paystubs has entered the mix, adding to an already strenuous data governance situation.

In this session, we’ll look at the end-to-end workflow designed by New Regency that leverages Azure and the combined wizardry of Avid, 5th Kind, Contractlogix, Docusign, MarkLogic, and SyncOnSet. Lulu Zezza, Physical Production Executive at New Regency and the driving force behind the project, noted that, “Moving to the cloud is the best way to implement security controls across so many different physical and logical environments, locations, and data types. We have people working all over the world, for different companies, using different systems, all contributing to the same production. In the past, it’s been like a free-for-all, with contractors getting access to things they shouldn’t, information being duplicated and stored in the wrong places, and sensitive content left out in the open.”

The new digital workflow enables a secure “script-to-screen” experience for the management of both production data and the crew’s personal HR information (to which new global privacy standards apply). Metadata captured from contracts, the script, and the camera is associated with filming days, scenes, and takes as they are recorded, and later with the final edit of the film, reducing the need for document sharing and film screenings. Plus, communications are kept protected and confidential. It’s a whole new way to make movies.

Join us at our session where you’ll hear about:

Architectural considerations for multi-domain cloud environments
Secure access and device management for BYOD users
Content protection and privacy in connected and disconnected networks
And behind-the-scenes glimpses of the making of The Revenant, Assassin’s Creed, A Cure for Wellness, and Unfinished Business!

Source: Azure

Safeguarding your cloud resources with Azure security services

While cloud security continues to be a top concern, we recently shared insights from a survey that show overall concern has dropped significantly since 2015. We’re now at a stage where half of organizations contend the cloud is more secure than their on-premises infrastructure. In conversations I have with our customers and partners, I hear increasingly about how using the cloud improves an organization’s security posture. As many organizations push forward on their digital transformation through increased use of cloud services, understanding the current state of cloud security is essential.

Maintaining a strong security posture for your cloud-based innovation is a shared responsibility between you and your cloud provider. With Microsoft Azure, securing cloud resources is a partnership between Microsoft and our customers, so it’s essential that you understand the comprehensive set of security controls and capabilities available to you on Azure. 

Microsoft Azure is built on a foundation of trust and security. With significant investments in security, compliance, privacy, and transparency, Azure provides a secure foundation to host your infrastructure, applications, and data in the cloud. Microsoft also provides built-in security controls and capabilities to further help you protect your data and applications on Azure. These can be classified broadly into four categories:

Manage and control user identity and access: Comprehensive identity management is the linchpin of any secure system. You must ensure that only authorized users can access your environments, data, and applications. Azure Active Directory serves as a central system for managing access across all your cloud services, including Azure, Office 365, and hundreds of popular SaaS and PaaS cloud services. Its federation capability means that you can use your on-premises identities and credentials to access those services, and Azure Multi-Factor Authentication adds a second layer of verification for a more secure sign-in experience.

Increase network and infrastructure security: Azure provides security-hardened infrastructure to interconnect Azure VMs and to connect to on-premises datacenters. Additionally, you can extend your on-premises network to the cloud using a secure site-to-site VPN or a dedicated Azure ExpressRoute connection. You can strengthen network security by configuring Network Security Groups, user-defined routing, IP forwarding, forced tunneling, endpoint ACLs, and Web Application Firewall as appropriate.

Encrypt communications and operation processes: Azure uses industry-standard protocols to encrypt data in transit as it travels between devices and Microsoft datacenters, and when it is stored in Azure Storage. You can also encrypt your virtual machine disks using Azure Disk Encryption. Azure Key Vault enables you to safeguard and control cryptographic keys and other secrets used by cloud apps and services. Azure Information Protection will help you classify, label, and protect your sensitive data.

Defend against threats: Microsoft enables actionable intelligence against increasingly sophisticated attacks using our network of global threat monitoring and insights. This threat intelligence is developed by analyzing a wide variety of signal sources and a massive scale of signals. (For example, customers authenticate with our services over 450 billion times every month, and we scan 200 billion emails for malware and phishing each month.) Our approach to protect the Azure platform includes intrusion detection, distributed denial-of-service (DDoS) attack prevention, penetration testing, behavioral analytics, anomaly detection, and machine learning. You can leverage additional services to develop a strong threat prevention, detection, and mitigation strategy.

Azure Active Directory Identity Protection helps you protect against and mitigate the risks from compromised identities. It offers a cloud-powered, adaptive, machine-learning-based identity protection system that can detect cyber-attacks, mitigate them in real time, and automatically suggest updates to your Azure AD configuration and conditional access policies. Services like Antimalware for Azure and Azure Security Center use advanced analytics not only to help detect threats but also to prevent them. Azure Security Center gives you a central view of the security state of all your Azure resources in real time, including recommendations for improving your security posture. You can use Operations Management Suite to extend threat prevention, detection, and quick response across Azure and other environments (on-premises, AWS). The Log Analytics service gives you real-time insights to readily analyze millions of records across all of your workloads, regardless of their physical location.

These are just a few examples of the broad set of security controls and services available to you with Azure. Over the past year, we have expanded the portfolio with many new security services and ongoing enhancements.

Microsoft is committed to continued innovation in helping you protect your data, applications, and identities in the cloud. Innovations we have delivered most recently include:

New capabilities and enhancements in Azure Security Center available for preview this month include Just In Time network access to VMs, automatic discovery and recommendations for application whitelisting, and expanded Security Baselines with more than 100 recommended configurations defined by Microsoft and industry partners. Our research team continues to monitor the threat landscape and innovate on detection algorithms. Some new threat detections available to customers include Brute Force detections, outbound DDoS and Botnet detections, as well as new behavioral analytics for Windows and Linux VMs.
Preview of Storage Service Encryption for File Storage. IT organizations can lift and shift their on-premises file shares to the cloud using Azure Files by simply pointing their applications to the Azure file share path. Azure Files now offers enhanced protection with the ability to encrypt data at rest.
Azure SQL Database Threat Detection is already available in preview. Last week the team announced that it will be generally available in April 2017. Azure SQL Database Threat Detection provides an additional layer of security intelligence built into the Azure SQL Database service that uses machine learning to continuously monitor, profile, and detect suspicious database activity to help customers detect and respond to potential threats.

With these tools, organizations are able to securely transition to the cloud while also complying with regulatory requirements. Read how Ricoh USA Inc. discovered that Azure exceeds the level of security it could previously offer its customers.

Azure has a vibrant partner ecosystem, so it’s also easy to bring your trusted cloud security vendor with you, enabling you to leverage your existing security solutions. Find partner security solutions in Azure Marketplace.

Microsoft Azure at RSA 2017

For those of you attending RSA Conference this week in San Francisco, we hope to connect with you at the show. You can:

See the keynote by Brad Smith, President and Chief Legal Officer at 8:35AM PST. You can stream it live if you’re not at RSA.
Attend our sessions:

A Vision for Shared, Central Intelligence to Ebb a Growing Flood of Alerts: SP03-T09
How to Go from Responding to Hunting with Sysinternals Sysmon: HTA-T09
Critical Hygiene for Preventing Major Breaches: CXO-F02
Advances in Cloud-Scale Machine Learning for Cyber-Defense: EXP-T11
Learnings from the Cloud: What to Watch When Watching for Breach: STR-W11

Visit Booth 3501 in the North Expo Hall and learn how Microsoft solutions work together to improve your organization’s security posture. See the complete Microsoft schedule for RSA 2017. Hope to see you in San Francisco!

Source: Azure

Azure Application Insights JavaScript SDK: reliability and performance improvements

Recently, we improved the robustness of web page monitoring in Application Insights and introduced the option to disable cookies. Transmission is now more reliable in the face of throttling and network issues, and when a page is about to unload.

With Azure Application Insights, you can monitor the performance and usage of your apps. With a small snippet of JavaScript, you can get timings of page loads and AJAX calls, counts and details of browser exceptions and AJAX failures, as well as user and session counts. To learn how to get started, visit our documentation.

New JavaScript SDK features

In addition to internal improvements we have fixed some bugs, cleaned up the SDK, and added a couple more features:

Add snippet.js to NPM package:

Snippet.js is for those who want to load Application Insights from a separate file instead of inline JavaScript, for example when using a Gulp pipeline. Contributed by RehanSaeed. Thanks, Muhammed Rehan!

Option to disable cookies

This was a frequent request from our users. You can now disable cookies, although some features will be lost: without cookies, every page view counts as a new user and session. For more configuration options, please see our Application Insights SDK JavaScript API.
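For illustration, a configuration sketch with cookies turned off (the instrumentation key is a placeholder, and the setting name isCookieUseDisabled is assumed from the v1 JavaScript SDK; verify it against the API reference for your SDK version):

```javascript
// Sketch of an Application Insights JS SDK configuration with cookies disabled.
// The setting name isCookieUseDisabled is assumed from the v1 SDK; check the
// API reference for your version.
var config = {
  instrumentationKey: "00000000-0000-0000-0000-000000000000", // placeholder key
  isCookieUseDisabled: true // no user/session cookies are written
};

// Trade-off: without cookies, every page view looks like a new user and a
// new session, so user and session counts will be inflated.
```

The portal-provided snippet would consume this object when initializing the SDK on the page.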

Enable dependency correlation headers

Dependency correlation headers are now turned off by default, but you can enable them manually. To correlate dependencies with server requests, set disableCorrelationHeaders to false in your configuration. If you opt in to this feature, you will be able to see the server requests that correlate with your client-side AJAX calls.
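A configuration sketch for opting in (the instrumentation key is a placeholder; disableCorrelationHeaders is the setting named above):

```javascript
// Sketch of opting in to dependency correlation headers, which are off by
// default in the SDK.
var config = {
  instrumentationKey: "00000000-0000-0000-0000-000000000000", // placeholder key
  disableCorrelationHeaders: false // false = attach correlation headers to AJAX calls
};
```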

JavaScript SDK Improvements

The following improvements elevate our JavaScript SDK without requiring any additional work during onboarding.

Security

The JS SDK has fully switched to HTTPS. We now use HTTPS to send all telemetry and to download the library. Also, for all secure sites, the SDK creates cookies with the Secure flag – see the Set-Cookie documentation for more details.

Transmission

One of the biggest goals for Application Insights JavaScript SDK is to provide as much functionality as possible, while not degrading the instrumented page in any way. This includes performance and user experience. Because transmission reliability is a crucial part of the SDK, we’ve recently made a few improvements in this area.

Telemetry is now sent more reliably when:

1. The Application Insights portal is imposing throttling, or is temporarily unable to accept data.

Retry and error-handling features improve transmission reliability. In the case of network issues, the SDK retries sending the telemetry data with an exponential backoff.

2. The user navigates to a new page within the same site, shortly after your code sends a telemetry event. Previously, events not yet sent were lost when the page was unloaded. Events are now kept in a session buffer that is preserved across pages in the same site.
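The exponential backoff described in case 1 can be sketched like this (sendFn, the retry count, and the delay values are hypothetical illustrations, not the SDK's internal implementation):

```javascript
// Illustrative exponential-backoff retry around a telemetry sender.
// sendFn is a hypothetical function returning a Promise that rejects on failure.
function sendWithBackoff(sendFn, maxRetries, baseDelayMs) {
  var attempt = 0;
  function tryOnce() {
    return sendFn().catch(function (err) {
      if (attempt >= maxRetries) { throw err; } // give up after maxRetries
      var delay = baseDelayMs * Math.pow(2, attempt); // 1x, 2x, 4x, ...
      attempt += 1;
      return new Promise(function (resolve) { setTimeout(resolve, delay); })
        .then(tryOnce);
    });
  }
  return tryOnce();
}
```

Each failed attempt doubles the wait before the next one, so a temporarily throttled endpoint is given progressively more time to recover.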

Beacon API

The JavaScript SDK also provides experimental support for the Beacon API. The Beacon API is designed to transmit telemetry when the browser is not busy with time-critical operations. All data is delivered even if the user navigates away or closes the browser tab.

The Beacon API only allows queuing data for transmission; the actual transmission is handled by the browser, outside of our control. Thus, the JavaScript SDK cannot receive confirmation from the Application Insights backend that all telemetry was received and processed correctly.

If the default JavaScript SDK configuration doesn’t fully work for you, and you suspect that not all events are tracked or that transmission is impacting the performance of your page, you can try the Beacon API. The feature is disabled by default, but you can enable it by setting ‘isBeaconApiDisabled’ to false – see config. If you decide to send your data using the Beacon API, the SDK will automatically turn off the Session Storage Buffer and Retry features.
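A configuration sketch for opting in (the instrumentation key is a placeholder; isBeaconApiDisabled is the setting named above):

```javascript
// Sketch of enabling the experimental Beacon API transmission path.
var config = {
  instrumentationKey: "00000000-0000-0000-0000-000000000000", // placeholder key
  isBeaconApiDisabled: false // false = queue telemetry via navigator.sendBeacon
};

// In the browser, the underlying contract is roughly:
//   navigator.sendBeacon(url, data) -> boolean (true if the data was queued)
// Delivery happens in the background, even as the page unloads, but no server
// acknowledgment is surfaced back to the SDK.
```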

Performance.now in Session Class

After working with the Power BI and Edge teams, we have moved to using performance.now in session calls. This improves the performance of our library’s date/time handling.

Feedback

If you have any questions or are experiencing any problems with the JavaScript SDK, feel free to open an issue on GitHub.

Special Thanks

Special thanks to Kamil Szostak for help preparing this post.
Source: Azure

What’s new in Azure Active Directory B2C

Over the past few weeks, we have introduced new features in Azure AD B2C, a cloud identity service for app developers. Azure AD B2C handles all your app’s identity management needs, including sign-up, sign-in, profile management and password reset. In this post, you’ll read about these features:

Single-page app (SPA) support
Usage reporting APIs
Friction-free consumer sign-up

Single-page app (SPA) support

A single-page app (SPA) is a web app that loads a single HTML page and dynamically updates the page as the consumer interacts with the app. It is written primarily in JavaScript, typically using a framework like AngularJS or Ember.js. Gmail and Outlook are two popular consumer-facing SPAs.

Since JavaScript code runs in a consumer’s browser, a SPA has different requirements for securing the frontend and calls to backend web APIs, compared to a traditional web app. To support this scenario, Azure AD B2C added the OAuth 2.0 implicit grant flow. Read more about using the OAuth 2.0 implicit grant flow or try out our samples:

A SPA, implemented with an ASP.NET Web API backend
A SPA, implemented with a Node.js Web API backend

Both samples use an open-source JavaScript SDK (hello.js). Note that the OAuth 2.0 implicit grant flow support is still in preview.
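To make the redirect step of the flow concrete, here is a hedged sketch of building the implicit grant authorize URL (the v2.0 endpoint shape is assumed, and the tenant, policy, client ID, and redirect URI below are placeholders; the samples above handle this for you via hello.js):

```javascript
// Sketch of constructing an OAuth 2.0 implicit grant authorize URL for an
// Azure AD B2C policy. All argument values in the usage below are placeholders.
function buildAuthorizeUrl(tenant, policy, clientId, redirectUri, nonce) {
  var base = "https://login.microsoftonline.com/" + tenant + "/oauth2/v2.0/authorize";
  var params = [
    "p=" + encodeURIComponent(policy),                        // the B2C policy to run
    "client_id=" + encodeURIComponent(clientId),
    "response_type=" + encodeURIComponent("id_token token"),  // implicit grant
    "redirect_uri=" + encodeURIComponent(redirectUri),
    "response_mode=fragment",                                 // tokens return in the URL fragment
    "scope=" + encodeURIComponent("openid " + clientId),
    "nonce=" + encodeURIComponent(nonce)                      // replay protection
  ];
  return base + "?" + params.join("&");
}
```

The browser navigates to this URL, the consumer signs in, and the tokens come back in the redirect URI’s fragment, where the SPA’s JavaScript reads them without a server round trip.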

Usage reporting APIs

A frequent request from developers is access to rich consumer activity reports on their Azure AD B2C tenants. We’ve now made those available to you programmatically, via REST-based Azure AD reporting APIs. You can easily pipe the data from these reports into business intelligence and analytics tools, such as Microsoft’s Power BI, for detailed analysis. With the current release, four activity reports are available:

tenantUserCount: Total number of consumers in your Azure AD B2C tenant (per day for the last 30 days). You can also get a breakdown by the number of local accounts (password-based accounts) and social accounts (Facebook, Google, etc.).
b2cAuthenticationCount: Total number of successful authentications (sign-up, sign-in, etc.) within a specified period.
b2cAuthenticationCountSummary: Daily count of successful authentications for the last 30 days.
b2cMfaRequestCountSummary: Daily count of multi-factor authentications for the last 30 days.

Get started using the steps outlined in this article.
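As a rough sketch, a report request could be constructed like this (the graph.windows.net host, the /reports/ path, and api-version=beta are assumptions based on the Azure AD Graph reporting endpoint of this era; confirm the exact shape in the linked article):

```javascript
// Sketch of constructing a request URL for a B2C activity report via the
// Azure AD reporting API. Host, path, and api-version are assumed -- verify
// them against the getting-started article before use.
function buildReportUrl(tenantDomain, reportName) {
  return "https://graph.windows.net/" + tenantDomain +
         "/reports/" + reportName + "?api-version=beta";
}

// The GET request would carry an OAuth bearer token for the tenant, e.g.
//   Authorization: Bearer <access_token>
var url = buildReportUrl("contoso.onmicrosoft.com", "tenantUserCount");
```

Swapping tenantUserCount for any of the other three report names above yields the corresponding report.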

Friction-free consumer sign-up

By default, Azure AD B2C verifies email addresses provided by consumers during the sign-up process. This is to ensure that valid, and not fake, accounts are in use on your app. However, some developers prefer to skip the upfront email verification step and do it themselves later. This friction-free sign-up experience makes sense for certain app types. We’ve added a way for you to do this on your “Sign-up policies” or “Sign-up or sign-in policies”. Learn more about disabling email verification during consumer sign-up.

Feedback

Keep your great feedback coming on UserVoice or Twitter (@azuread, @swaroop_kmurthy). If you have questions, get help on Stack Overflow (use the ‘azure-active-directory’ tag).
Source: Azure