Customer-provided keys with Azure Storage server-side encryption

Microsoft Azure Storage offers several options to encrypt data at rest. With client-side encryption you can encrypt data prior to uploading it to Azure Storage. You can also have Azure Storage manage encryption operations with server-side encryption, using either Microsoft-managed keys or customer-managed keys in Azure Key Vault. Today, we're introducing an enhancement to server-side encryption that supports granular encryption settings on your storage account, with keys hosted in any key store. Customer-provided keys (CPK) enable you to store and manage keys on-premises or in key stores other than Azure Key Vault, to meet corporate, contractual, and regulatory compliance requirements for data security.

Customer-provided keys allow you to pass an encryption key as part of a read or write operation to the storage service using the Blob APIs. Because the encryption key is defined at the object level, you can use multiple encryption keys within a single storage account. When you create a blob with a customer-provided key, the storage service persists the SHA-256 hash of the encryption key with the blob to validate future requests. When you retrieve the object, you must provide the same encryption key as part of the request. For example, if a blob is created with Put Blob using CPK, all subsequent write operations must provide the same encryption key. If a different key is provided, or if no key is provided, the operation fails with 400 Bad Request. Because the encryption key itself travels in the request, a secure connection must be established to transfer it. Here’s the process:

Figure 1: Customer-provided keys

Getting started

Customer-provided keys may be used with supported blob operations by adding the x-ms-encryption-* headers to the request.

Request Header
Description

x-ms-encryption-key
Required. A Base64-encoded AES-256 encryption key value.

x-ms-encryption-key-sha256
Required. The Base64-encoded SHA-256 hash of the encryption key.

x-ms-encryption-algorithm
Required. Specifies the algorithm to use when encrypting data using the given key. Must be AES256.

Request

PUT mycontainer/myblob.txt
x-ms-version: 2019-02-02
x-ms-encryption-key: MDEyMzQ1NjcwMTIzNDU2NzAxMjM0NTY3MDEyMzQ1Njc=
x-ms-encryption-key-sha256: 3QFFFpRA5+XANHqwwbT4yXDmrT/2JaLt/FKHjzhOdoE=
x-ms-encryption-algorithm: AES256
Content-Length: <length>
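As an illustration of how these header values relate, here is a minimal Python sketch that derives all three headers from a raw 32-byte AES key. The header names come from the table above; the helper function itself is our own and not part of any Azure SDK:

```python
import base64
import hashlib


def build_cpk_headers(key: bytes) -> dict:
    """Build the customer-provided key headers for a blob request.

    The service expects the raw AES-256 key and the SHA-256 hash of
    that key, both Base64-encoded.
    """
    if len(key) != 32:
        raise ValueError("AES-256 requires a 32-byte key")
    return {
        "x-ms-encryption-key": base64.b64encode(key).decode(),
        "x-ms-encryption-key-sha256": base64.b64encode(
            hashlib.sha256(key).digest()
        ).decode(),
        "x-ms-encryption-algorithm": "AES256",
    }


# The sample key in the request above is the ASCII string "01234567"
# repeated four times (32 bytes).
headers = build_cpk_headers(b"01234567" * 4)
print(headers["x-ms-encryption-key"])
# MDEyMzQ1NjcwMTIzNDU2NzAxMjM0NTY3MDEyMzQ1Njc=
```

In practice the key should come from a secure key store, never be hard-coded, and must only be sent over HTTPS.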

Key Management

Azure Storage does not store or manage customer-provided encryption keys. Keys are securely discarded as soon as possible after they’ve been used to encrypt or decrypt the blob data. If customer-provided keys are used on blobs with snapshots, each snapshot can be provisioned with a different encryption key, so you must keep track of each snapshot and its associated encryption key in order to pass the correct key with blob operations. If you need to rotate the key associated with an object, you can use the Copy Blob operation, passing the old and new keys as headers as shown below.

Request

PUT mycontainer/myblob.txt
x-ms-copy-source: https://myaccount.blob.core.windows.net/mycontainer/myblob.txt
x-ms-source-encryption-key: MDEyMzQ1NjcwMTIzNDU2NzAxMjM0NTY3MDEyMzQ1Njc=
x-ms-source-encryption-key-sha256: 3QFFFpRA5+XANHqwwbT4yXDmrT/2JaLt/FKHjzhOdoE=
x-ms-source-encryption-algorithm: AES256
x-ms-encryption-key: NzY1NDMyMTA3NjU0MzIxMDc2NTQzMjEwNzY1NDMyMTA=
x-ms-encryption-key-sha256: uYo4dwqNEIFWjJ5tWAlTJWSrfdY2QIH5UF9IHYNRqyo=
x-ms-encryption-algorithm: AES256
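The rotation request above can be assembled the same way. This hypothetical helper (again, our own sketch, not an SDK function) pairs the source headers, which decrypt the existing blob, with the destination headers, which encrypt the copy under the new key:

```python
import base64
import hashlib


def _b64(data: bytes) -> str:
    return base64.b64encode(data).decode()


def build_rotation_headers(old_key: bytes, new_key: bytes, source_url: str) -> dict:
    """Headers for re-encrypting a blob under a new key via Copy Blob."""
    return {
        "x-ms-copy-source": source_url,
        # Source headers: decrypt the existing blob with the old key.
        "x-ms-source-encryption-key": _b64(old_key),
        "x-ms-source-encryption-key-sha256": _b64(hashlib.sha256(old_key).digest()),
        "x-ms-source-encryption-algorithm": "AES256",
        # Destination headers: encrypt the copy with the new key.
        "x-ms-encryption-key": _b64(new_key),
        "x-ms-encryption-key-sha256": _b64(hashlib.sha256(new_key).digest()),
        "x-ms-encryption-algorithm": "AES256",
    }


headers = build_rotation_headers(
    old_key=b"01234567" * 4,
    new_key=b"76543210" * 4,
    source_url="https://myaccount.blob.core.windows.net/mycontainer/myblob.txt",
)
```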

Next Steps

This feature is available now on your storage account with the recent release of the Azure Storage services REST API (version 2019-02-02). You can also use the .NET and Java client libraries. There are no additional charges for customer-provided keys.

For more information on customer-provided keys, please visit our documentation page. For any further questions, or to discuss your specific scenario, send us an email at azurestoragefeedback@microsoft.com or post your ideas and suggestions about Azure Storage on our feedback forum.
Source: Azure


Measuring your return on investment of Azure as a compliance platform

Today we’re pleased to introduce the release of Microsoft Azure is Helping Organizations Manage Regulatory Challenges More Effectively, a new International Data Corporation (IDC) white paper based on original research by IDC and sponsored by Microsoft. IDC studied Azure customers who are using Azure as a platform to meet regulatory compliance needs, with a special focus on government, healthcare, and financial customers. Azure Policy was cited by customers as having an important impact on meeting compliance obligations.

IDC found that these customers are realizing significant benefits by leveraging Azure capabilities to make their regulatory and compliance efforts more effective. Significant findings of the research include:

•    Five-year return on investment (ROI) of 465 percent, worth an average of $4.29 million.
•    Six-month payback on investment.
•    47 percent reduction in unplanned downtime.
•    35 percent reduction in compliance-related penalties.
•    24 percent increase in productivity for regulatory compliance teams.

Research summary findings

“Study participants reported use of Azure as a compliance platform helped them carry out their day-to-day compliance responsibilities more effectively. Azure helped them better manage spikes in the workload, enabled faster access to (and analysis of) data during audits, and reduced exposure to risk based on the strong internal controls of Azure.”

Specific benefits outlined by study participants in the research included:

Better workload management and reduced risk: "We are able to stay on top of what we are doing, and we can now handle growth or spikes in the workload. Azure has lessened our exposure to risk because of its strong internal controls."
Increased audit efficiency: "Azure has absolutely helped with audits. For example, it allows us to have much better access to our data, and faster analysis of that data for our audits. Compliance teams save time as a result."
State-of-the-art security: "Azure has lessened compliance risk exposure because its … security systems are state of the art. There is less chance of any kind of data being compromised through intrusion."

About half of the organizations surveyed were using Azure Blueprints, which enable tenants to deploy a repeatable set of Azure resources that implements and adheres to common compliance standards, including ISO 27001, PCI DSS, and NIST SP 800-53. Benefits cited by customers from using Azure Blueprints included better visibility and remediation of threats and vulnerabilities, guidance documentation, and automation scripts for hosting web applications.

One customer said of Azure Blueprints in the research, "The architecture is already set up and is very sophisticated (for example, there are different app services, load balancers, and the database are all set up). We don't have to spend a lot of time on architecture. Other benefits are the resource manager, security management, logging and auditing, activity logs, and diagnostic logs. It’s a great resource for support of our ongoing compliance requirements."

Learn more about how to deploy Azure Blueprints today.  

Read more about the IDC findings by visiting the article.
Source: Azure

Azure Data Factory Mapping Data Flows are now generally available

In today’s data-driven world, big data processing is a critical task for every organization. To unlock transformational insights and embrace a data-driven culture, companies need tools to help them easily integrate and transform data at scale, without requiring specialized skills.

Today we’re announcing the general availability of the Mapping Data Flows feature of Azure Data Factory (ADF), our productive and trusted hybrid integration service. Data Factory now empowers users with a code-free, serverless environment that simplifies ETL in the cloud and scales to any data size, no infrastructure management required.

Built to handle all the complexities and scale challenges of big data integration, Mapping Data Flows allows users to quickly transform data at scale. Build resilient data pipelines in an accessible visual environment with our browser-based designer, and let ADF handle the complexities of Spark execution. Mapping Data Flows simplifies data processing, with built-in capabilities to handle unpredictable data schemas and to maintain resilience to changing input data. With Mapping Data Flows, customers like Nielsen are empowering their employees to turn data into insights, regardless of data complexity or the coding skills of their teams.

“Mapping Data Flows have been instrumental in enabling Nielsen’s analytics teams to perform data cleansing and preparation in a user-friendly and code-free environment, and allow us to deliver insights to our clients in a faster and more automated way.” – David Hudzinski, Director, Product, Nielsen

Accelerate time to insights by focusing on building your business logic without worrying about managing and maintaining server clusters or writing code to build pipelines. Easily perform ETL tasks like loading fact tables, maintaining slowly changing dimensions, aggregating semi-structured big data, matching data using fuzzy matching, and preparing data for modeling. With our intuitive visual interface, design your data transformation logic as easy-to-read graphs, and build libraries of transformation routines to easily turn raw data into business insights.

Work the way you want – code-first, or entirely code-free with Mapping Data Flows. Use built-in transformations to perform common actions like joining, aggregating, pivoting, and sorting. Customize these transformations with the expression builder, which includes auto-complete and comprehensive online help.

As you build your logical graphs, validate in real-time using ADF’s live data preview capability. Features like null counts, value distributions, and standard deviation provide immediate insights into your data.  
Finally, build pipelines and debug your new ETL process end-to-end using the drag and drop pipeline builder with interactive debugging.   
Build schedules for your pipelines and monitor your data flow executions from the ADF monitoring portal. Easily manage data availability SLAs with ADF’s rich availability monitoring and alerts, and leverage built-in CI/CD capabilities to save and manage your flows in a managed DataOps environment. And establish alerts and view execution plans to validate that your logic is performing as planned as you tune your data flows.

Mapping Data Flows is a game-changer for any organization looking to make data integration and transformation faster, easier, and accessible to everyone.

Learn more and get started today using ADF with Mapping Data Flows.
Source: Azure

Introducing the preview of direct-upload to Azure managed disks

We are excited to announce the preview of direct-upload to Azure managed disks. Today, there are two ways you can bring your on-premises VHD files to Azure as managed disks:

Stage the VHD in a storage account before converting it into a managed disk.
Attach an empty managed disk to a VM and copy the data.

Both of these approaches have disadvantages: the first requires an extra storage account to manage, while the second incurs the extra cost of a running virtual machine. Direct-upload addresses both issues and provides a simplified workflow by allowing you to copy your on-premises VHD directly into an empty managed disk. You can use it to upload to Standard HDD, Standard SSD, and Premium SSD managed disks of all the supported sizes.

If you are an independent software vendor (ISV) providing backup solutions for IaaS virtual machines in Azure, we recommend you leverage direct-upload to restore your customers’ backups to managed disks. It simplifies the restore process by doing away with storage account management. Azure Backup’s support for restoring large managed disks is itself powered by direct-upload.

For increased productivity, Azure Storage Explorer has also added support for managed disks. It exposes direct-upload via an easy-to-use graphical user interface (GUI), enabling you to migrate your local VHD files to managed disks in a few clicks. It also leverages direct-upload to let you copy and migrate your managed disks seamlessly to another Azure region. This cross-region copy is powered by AzCopy v10, which is designed to support large-scale data movement in Azure.

If you choose to use the Azure Compute REST API or SDKs, you must first create an empty managed disk by setting the createOption property to Upload and the uploadSizeBytes property to match the exact size of the VHD being uploaded.

REST API

{
  "location": "WestUS2",
  "properties": {
    "creationData": {
      "createOption": "Upload",
      "uploadSizeBytes": 10737418752
    }
  }
}
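The uploadSizeBytes value must match the VHD file's exact size, which for a fixed-format VHD is the virtual disk size plus a 512-byte footer. This small sketch (the helper is our own, shown under that fixed-VHD assumption) explains the 10737418752 used in the example:

```python
VHD_FOOTER_BYTES = 512  # a fixed-format VHD ends with a 512-byte footer


def upload_size_bytes(virtual_disk_size_bytes: int) -> int:
    """Exact value to pass as uploadSizeBytes for a fixed VHD."""
    return virtual_disk_size_bytes + VHD_FOOTER_BYTES


# A 10 GiB disk yields the value used in the REST API example above.
print(upload_size_bytes(10 * 1024**3))  # 10737418752
```

In practice, you can simply take the size of the VHD file on disk (for example, os.path.getsize in Python), since the file already includes the footer.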

Azure CLI

az disk create \
  -n mydiskname \
  -g resourcegroupname \
  -l westus2 \
  --for-upload \
  --upload-size-bytes 10737418752 \
  --sku standard_lrs

You must generate a writeable SAS for the disk, so you can reference it as the destination for your upload.

az disk grant-access \
  -n mydiskname \
  -g resourcegroupname \
  --access-level Write \
  --duration-in-seconds 86400

Use AzCopy v10 to upload your local VHD file to the empty managed disk by specifying the SAS URI you generated.

azcopy copy "c:\somewhere\mydisk.vhd" "SAS-URI" --blob-type PageBlob

After the upload is complete, and you no longer need to write any more data to the disk, revoke the SAS. Revoking the SAS will change the state of the managed disk and allow you to attach the disk to a virtual machine.

az disk revoke-access -n mydiskname -g resourcegroupname

Supported regions

All regions are supported via the Azure Compute REST API version 2019-03-01 and the latest versions of the Azure CLI, Azure PowerShell, the Azure .NET SDK, AzCopy v10, and Azure Storage Explorer.

Getting started

Upload a VHD to Azure using Azure PowerShell and AzCopy v10
Upload a VHD to Azure using Azure CLI and AzCopy v10
Upload, download, and cross-region copy managed disks using Azure Storage Explorer

Source: Azure

The key to a data-driven culture: Timely insights

A data-driven culture is critical for businesses to thrive in today’s environment. In fact, a brand-new Harvard Business Review Analytic Services survey found that companies that embrace a data-driven culture experience a 4x improvement in revenue performance and better customer satisfaction.

Foundational to this culture is the ability to deliver timely insights to everyone in your organization across all your data. At our core, that is exactly what we aim to deliver with Azure Analytics and Power BI, and our work is paying off in value for our customers. According to a recent commissioned Forrester Consulting Total Economic Impact™ study, Azure Analytics and Power BI deliver incredible value to customers with a 271 percent ROI, while increasing satisfaction by 60 percent.

Our position in the Leaders quadrant of Gartner’s 2019 Magic Quadrant for Analytics and Business Intelligence Platforms, coupled with our undisputed performance in analytics, provides you with the foundation you need to implement a data-driven culture.

So what are the three key attributes needed to establish a data-driven culture?

First, it is vital to get the best performance from your analytics solution across all your data, at the best possible price.

Second, it is critical that your data is accurate and trusted, with all the security and privacy rigor needed for today’s business environment.

Finally, a data-driven culture necessitates self-service tools that empower everyone in your organization to gain insights from your data.

Let’s take a deeper look into each one of these critical attributes.

Performance

When it comes to performance, Azure has you covered. An independent study by GigaOm found that Azure SQL Data Warehouse is up to 14x faster and costs 94 percent less than other cloud providers. This unmatched performance is why leading companies like Anheuser-Busch InBev adopt Azure.

“We leveraged the elasticity of SQL Data Warehouse to scale the instance up or down, so that we only pay for the resources when they’re in use, significantly lowering our costs. This architecture performs significantly better than the legacy on-premises solutions it replaced, and it also provides a single source of truth for all of the company’s data.” – Chetan Kundavaram, Global Director, Anheuser-Busch InBev

Security

Azure is the most secure cloud for analytics. This is according to Donald Farmer, a well-respected thought leader in the data industry, who recently stated, “Azure SQL Data Warehouse platform offers by far the most comprehensive set of compliance and security capabilities of any cloud data warehouse provider”. Since then, we announced Dynamic Data Masking and Data Discovery and Classification to automatically help protect and obfuscate sensitive data on-the-fly to further enhance your data security and privacy.

Insights for all

Only when everyone in your organization has access to timely insights can you achieve a truly data-driven culture. Companies drive results when they break down data silos and establish a shared context of their business based on trusted data. Customers that use Azure Analytics and Power BI do exactly that. According to the same Forrester study, customers stated:

“Azure Analytics has helped with a culture change at our company. We are expanding into other areas so that everyone can make informed business decisions.”  — Study interviewee

“Power BI was a huge success. We’ve added 25,000 users organically in three years.”  — Study interviewee

Only Azure Analytics and Power BI together can unlock the performance, security, and insights for your entire organization. We are uniquely positioned to empower you to develop the data-driven culture needed to thrive. We are excited to see customers like Reckitt Benckiser choose Azure for their analytics needs.

"Data is most powerful when it's accessible and understandable. With this Azure solution, our employees can query the data however they want versus being confined to the few rigid queries our previous system required. It’s very easy for them to use Power BI Pro to integrate new data sets to deliver enormous value. When you put BI solutions in the hands of your boots on the ground—your sales force, marketing managers, product managers—it delivers a huge impact to the business."  — Wilmer Peres, Information Services Director, Reckitt Benckiser

When you add it all up, Azure Analytics and Power BI are simply unmatched.

Get started today

To learn more about Azure’s insights for all advantage, get started today!

Gartner, Magic Quadrant for Analytics and Business Intelligence Platforms, 11 February 2019, Cindi Howson, James Richardson, Rita Sallam, Austin Kronz

Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, express or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
Source: Azure

Over 100 Azure services support PROTECTED Australian government data

Today Microsoft published an independent security assessment of 113 Microsoft Azure services for their suitability to handle official and PROTECTED Australian government information. This assessment, carried out under the Information Security Registered Assessor Program (IRAP), is now available for customers and partners to review and use as they plan for increasing the use of cloud in government.

This milestone significantly expands the ability of the Australian government to leverage Microsoft Azure to drive digital transformation. The expanded scope of this IRAP assessment includes cognitive services, machine learning, IoT, advanced cybersecurity, open source database management, and serverless and application development technologies. This enables the full range of innovation within Azure Australia to be utilized for government applications, further reinforcing our commitment to achieving the broadest range of accreditations and assurances to meet the needs of government customers.

This assurance is critical for customers such as the Victorian Government, using ICT shared services provider Cenitex in partnership with Canberra-based OOBE to deploy VicCloud Protect, a ground-breaking and highly secure service that enables its government customers to safely manage applications and data rated up to PROTECTED level.

“VicCloud Protect is a first for the Victorian Government and our customers can now confidently store their classified data in the cloud with peace of mind that the platform meets both the Australian Cyber Security Centre guidelines and the Victorian Protection Data Security Framework to handle Protected level information.” – Nigel Cadywould, Cenitex Service Delivery Director

This is just one of many examples of Australian governments and partners building on the secure foundations of Azure to build transformative solutions for government. Microsoft is one of the only global cloud providers to operate cloud regions in Canberra specifically designed and secured to meet the strict security compliance requirements of Australian government and national critical infrastructure, including:

Data center facilities within CDC, a datacenter provider based in Canberra that specializes in government and national critical infrastructure and meets the stringent sovereignty and transparent ownership controls required by the Australian government’s hosting policy.
Leading physical and personnel security within the Canberra facilities designed for the even higher requirements of handling secret government data.
Direct connection within the data center to the federal government’s intragovernment communications network (ICON) for enhanced security and performance.
Unmatched flexibility for colocation of critical systems in the same facilities as Microsoft Azure in Canberra and access to the ecosystem of solution providers deployed within CDC.

Microsoft delivers the Azure Australia Central regions in Canberra as the first and best home of Australian government data and applications. The assessment released today covers not just the Central regions, but all regions of Microsoft Azure in Australia, including Australia East (Sydney) and Australia Southeast (Melbourne). Also, as Microsoft has introduced further capacity and capabilities into the Australia Central regions, we have streamlined the process for customers to deploy services into our Canberra regions. Customers no longer need to manually request access to deploy services to the Australia Central regions and can now deploy directly from the portal.

Because the Australian Government has designed the IRAP program to follow a risk-based approach, each customer decides whether to operate that service at the PROTECTED level or lower. To assist customers with their authorization decision, Microsoft makes the IRAP assessment report and supporting documents available to customers and partners on an Australia-specific page of the Microsoft Service Trust Portal.

For government customers who want to get started building solutions for PROTECTED level data, we’ve published Australia PROTECTED Blueprint guidance with reference architectures for IaaS and PaaS web applications along with threat model and control implementation guidance. This Blueprint enables customers to more easily deploy Azure solutions suitable for processing, storage, and transmission of sensitive and official information classified up to and including PROTECTED.

Learn more about our latest IRAP assessment

Our IRAP assessment report and supporting documents are available on the Australia-specific page of the Microsoft Service Trust Portal
Find additional documents and configuration guidance for operating at PROTECTED on the Azure Australia Microsoft Docs page and the AU-PROTECTED Blueprints on the Service Trust Portal
Learn more about the Australia Central and Australia Central 2 regions and CDC Data Centres

Source: Azure

Azure Cost Management updates – September 2019

Whether you're a new student, thriving startup, or the largest enterprise, you have financial constraints and you need to know what you're spending, where, and how to plan for the future. Nobody wants a surprise when it comes to the bill, and this is where Azure Cost Management comes in!

We're always looking for ways to learn more about your challenges and how Cost Management can help you better understand where you're accruing costs in the cloud, identify and prevent bad spending patterns, and optimize costs to empower you to do more with less. Here are a few of the latest improvements and updates based on your feedback:

Reconcile invoiced charges with the new invoice details view
Automate reporting across subscriptions with management group exports
What's new in Cost Management Labs
Download charts to share views outside the Azure portal
New ways to save money with Azure!
Documentation updates

Let's dig into the details.

 

Reconcile invoiced charges with the new invoice details view

Have you ever had to compare your PDF invoice with raw cost and usage details? The process can be a bit daunting. Detailed usage data is critical for analysis and reporting, but can be overkill for invoice reconciliation. You need a summary of your usage with the same granularity as the invoice. This is exactly what you get with the new Invoice details view.

With the new Invoice details view, you can also view and filter by part number for Enterprise Agreement (EA) accounts and use publisher and charge type to identify Marketplace purchases. What would you like to see next?

 

Automate reporting across subscriptions with management group exports

You already know you can dig into your cost and usage data from the Azure portal. You may even know you can get rich reporting from the Cost Management Query API or get the full details, in all its glory, from the UsageDetails API. These are both great for ad-hoc queries, but you may be looking for a simpler solution. This is where Cost Management exports come in!

Cost Management exports automatically publish your cost and usage data to a storage account on a daily, weekly, or monthly basis. Up to this month, you've been able to schedule exports for billing accounts, subscriptions, and resource groups. Now, you can also schedule exports across subscriptions using management groups. If you manage pay-as-you-go (PAYG) subscriptions, this will be even more powerful because, for the first time, you'll be able to export all cost and usage data for your account from a single place.

If you do start using management groups, don't forget they also allow you to analyze and drill into costs and get notified before they go over predefined limits.

Learn more about exports in the Create and manage exported data tutorial.

 

What's new in Cost Management Labs

With Cost Management Labs, you get a sneak peek at what's coming in Azure Cost Management and can engage directly with us to share feedback and help us better understand how you use the service, so we can deliver more tuned and optimized experiences. Here are a few features you can see in Cost Management Labs:

Download charts as an image – This is now available in the public portal.
Open the desired view, then click the Export command at the top, select the PNG option, and click the Download charts button.
Dark theme support in cost analysis – This is now available in the public portal.
Support for the Azure portal dark theme was added to cost analysis in early August. We're making the final touches and expect this to be available from the full portal in early September.
New: Get started quicker with the cost analysis Home view
Cost Management offers five built-in views to get started with understanding and drilling into your costs. The Home view gives you quicker access to those views so you get to what you need faster!

Of course, that's not all! Every change in Cost Management is available in Cost Management Labs a week before it's in the full Azure portal, like the new Invoice details view and scheduling management group exports. We're eager to hear your thoughts and understand what you'd like to see next. What are you waiting for? Try Cost Management Labs today!

 

Save and share customized views in cost analysis

You built a custom view, saved it, and even shared it with your team. But now you need to share that view outside the portal. Whether you're presenting it as part of a larger PowerPoint deck or simply sending it over email, you can now download charts in cost analysis as images. You'll see a slightly redesigned Export menu, which offers a PNG option when viewing charts.

 

New ways to save money with Azure

Lots of cost optimization improvements have been introduced over the past month! Here are a few you might be interested in:

Lower your upfront reservation costs with monthly payment options.
SQL Data Warehouse reservations are now available in 18 more regions.
App Service Premium plan costs an average of 35 percent less. Consider switching from the Standard plan to get more for less.
Azure Archive Storage costs up to 50 percent less in some regions.
Data transfer to Azure CDN from Microsoft, when sourced from Azure services like Azure Storage and Media Services, is free starting October 2019.
Azure SQL Database instance pools (new preview) offer a new, cost-effective way to migrate smaller databases to the cloud.

 

Documentation updates

We added a clarification in the budgets tutorial about when to expect email alerts. In general, new cost and usage data is available in Cost Management within 8-12 hours, depending on the service. Budget alerts are processed within the next 4 hours. You can generally expect to receive budget alerts via email or action group within 12-16 hours. Keep in mind this time is based on when services emit usage data. Learn more about Cost Management data in the Understanding Cost Management data documentation.

Want to keep an eye on all documentation updates? Check out the Cost Management doc change history in the azure-docs repository on GitHub. If you see something missing, select Edit at the top of the document and submit a quick pull request.

 

What's next?

These are just a few of the big updates from last month. We're always listening and making constant improvements based on your feedback, so please keep the feedback coming!

Follow @AzureCostMgmt on Twitter and subscribe to the YouTube channel for updates, tips, and tricks! And, as always, share your ideas and vote up others in the Cost Management feedback forum.
Quelle: Azure

Azure Cosmos DB recommendations keep you on the right track

The tech world is fast-paced, and cloud services like Azure Cosmos DB get frequent updates with new features, capabilities, and improvements. It’s important—but also challenging—to keep up with the latest performance and security updates and assess whether they apply to your applications. To make it easier, we’ve introduced automatic and tailored recommendations for all Azure Cosmos DB users. A large spectrum of personalized recommendations now shows up in the Azure portal when you browse your Azure Cosmos DB accounts.

Some of the recommendations we’re currently dispatching cover the following topics:

SDK upgrades: When we detect the usage of an old version of our SDKs, we recommend upgrading to a newer version to benefit from our latest bug fixes and performance improvements.
Fixed to partitioned collections: To fully leverage Azure Cosmos DB’s massive scalability, we encourage users of legacy, fixed-size containers that are approaching their storage quota to migrate these containers to partitioned ones.
Query page size: We recommend using a query page size of -1 for users who currently define a specific value instead.
Composite indexes: Composite indexes can dramatically improve the performance and RU consumption of some queries, so we suggest using them whenever our telemetry detects queries that could benefit.
Incorrect SDK usage: We can detect certain incorrect usages of our SDKs, such as creating a client instance for each request instead of reusing a single instance throughout the application; we provide corresponding recommendations in these cases.
Lazy indexing: Azure Cosmos DB’s lazy indexing mode has a rather limited purpose and can affect the freshness of query results in some situations. We advise using the default consistent indexing mode instead.
Transient errors: On rare occasions, transient errors can happen when a database or collection gets created. SDKs usually retry operations whenever a transient error occurs, but if that’s not the case, we notify our users that they can safely retry the corresponding operation.
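The "client as singleton" recommendation above can be sketched as follows. This is a minimal, illustrative pattern, not the SDK's prescribed API: `CosmosClientStub` stands in for the real `azure.cosmos.CosmosClient` so the sketch runs without a live account, and the endpoint value is made up.

```python
# Sketch of the "reuse one client instance" recommendation.
# CosmosClientStub is a stand-in for azure.cosmos.CosmosClient so this
# runs without a live account; the endpoint below is illustrative.
from functools import lru_cache


class CosmosClientStub:
    """Stand-in for azure.cosmos.CosmosClient."""

    def __init__(self, endpoint, key):
        self.endpoint = endpoint
        self.key = key


@lru_cache(maxsize=1)
def get_client():
    # Constructed once at first use, then reused for every request.
    # The anti-pattern the recommendation flags is calling the
    # constructor inside each request handler instead.
    return CosmosClientStub("https://myaccount.documents.azure.com", "<key>")


def handle_request():
    client = get_client()  # always returns the same instance
    return client
```

Any memoization mechanism works here; the point is simply that the (expensive, connection-owning) client object is created once per process rather than once per request.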

Each of our recommendations includes a link that takes you directly to the relevant section of our documentation, so it’s easy for you to take action.
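As a concrete illustration of the composite-index recommendation, an indexing policy with a composite index might look like the following sketch. The `/city` and `/age` paths are made up for this example; with the Python SDK v4, this dict is passed as the `indexing_policy` argument when creating a container.

```python
# Hedged sketch: an indexing policy containing a composite index,
# expressed as the Python dict you would pass to create_container.
# The /city and /age paths are illustrative.
indexing_policy = {
    "indexingMode": "consistent",        # recommended over "lazy"
    "includedPaths": [{"path": "/*"}],
    "compositeIndexes": [
        [
            {"path": "/city", "order": "ascending"},
            {"path": "/age", "order": "descending"},
        ]
    ],
}

# With azure-cosmos v4 this would be used roughly as:
# database.create_container(id="users", partition_key=...,
#                           indexing_policy=indexing_policy)
```

A policy like this lets a query such as `SELECT * FROM c ORDER BY c.city ASC, c.age DESC` be served from the index instead of being sorted at query time, which is where the RU savings come from.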

3 ways to find your Azure Cosmos DB recommendations

1. Click the message at the top of the Azure Cosmos DB blade.

2. Head directly to the new “Notifications” section of your Azure Cosmos DB accounts.

3. Find them through Azure Advisor, which makes it easier for users who don’t routinely visit the Azure portal to receive our recommendations.

Over the coming weeks and months, we’ll expand the coverage of these notifications to include topics like partitioning, indexing, network security, and more. We also plan to surface general best practices to ensure you’re making the most out of Azure Cosmos DB.

Have ideas or suggestions for more recommendations? Email us or leave feedback using the smiley on the top-right corner of the Azure portal!
Quelle: Azure

Built-in Jupyter notebooks in Azure Cosmos DB are now available

Earlier this year, we announced a preview of built-in Jupyter notebooks for Azure Cosmos DB. These notebooks, running inside Azure Cosmos DB, are now available.

Cosmic notebooks are available for all data models and APIs including Cassandra, MongoDB, SQL (Core), Gremlin, and Spark to enhance the developer experience in Azure Cosmos DB. These notebooks are directly integrated into the Azure portal and your Cosmos accounts, making them convenient and easy to use. Developers, data scientists, engineers, and analysts can use the familiar Jupyter notebooks experience to:

Interactively run queries
Explore and analyze data
Visualize data
Build, train, and run machine learning and AI models

In this blog post, we’ll explore how notebooks make it easy for you to work with and visualize your Azure Cosmos DB data.

Easily query your data

With notebooks, we’ve included built-in commands to make it easy to query your data for ad-hoc or exploratory analysis. From the Portal, you can use the %%sql magic command to run a SQL query against any container in your account, no configuration needed. The results are returned immediately in the notebook.
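A cell using the `%%sql` magic might look like the following sketch; the database name, container name, and query are illustrative:

```
%%sql --database RetailDemo --container WebsiteData
SELECT TOP 10 c.id, c.Action, c.Price FROM c
```

The query results are rendered directly beneath the cell, with no connection setup required because the notebook already runs in the context of your Cosmos account.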

Improved developer productivity

We’ve also bundled in version 4 of our Azure Cosmos DB Python SDK for SQL API, which has our latest performance and usability improvements. The SDK can be used directly from notebooks without having to install any packages. You can perform any SDK operation including creating new databases, containers, importing data, and more.

Visualize your data

Azure Cosmos DB notebooks come with a built-in set of packages, including Pandas (a popular Python data analysis library), Matplotlib (a Python plotting library), and more. You can customize your environment by installing any package you need.

For example, to build interactive visualizations, we can install bokeh and use it to build an interactive chart of our data.

Users with geospatial data in Azure Cosmos DB can also use the built-in GeoPandas library, along with their visualization library of choice to more easily visualize their data.

Getting started

Follow our documentation to create a new Cosmos account with notebooks enabled or enable notebooks on an existing account.
Start with one of the notebooks included in the sample gallery in Azure Cosmos Explorer or Data Explorer.
Share your favorite notebooks with the community by sending them to the Azure Cosmos DB notebooks GitHub repo.
Tag your notebooks with #CosmosDB, #CosmicNotebooks, #PoweredByCosmos on social media. We will feature the best and most popular Cosmic notebooks globally!

Stay up-to-date on the latest Azure #CosmosDB news and features by following us on Twitter or LinkedIn. We’d love to hear your feedback and see your best notebooks built with Azure Cosmos DB!
Quelle: Azure