GEP uses Azure and SQL Database to expand global reach

GEP delivers software and services that enable procurement leaders around the world to maximize their impact on their businesses' operations, strategies, and financial performance. One of GEP's SaaS solutions for its customers is SMART by GEP®, a cloud-based, comprehensive procurement-software platform built on Azure from the ground up.

One critical motivation for GEP was the greater scalability, lower downtime, and reduced maintenance costs it could achieve with Azure SQL Database compared to running on-premises. GEP also needed a way to overcome regulatory barriers that kept it out of some global markets. For many of GEP's potential European customers, regulatory compliance would require data to be stored in their local geographic regions, but it would not have been practical for GEP to build out multiple datacenters. By moving to Microsoft Azure, GEP has been able to accommodate its rapid growth and its potential to expand into new markets.

To learn more about GEP's journey and how you can take advantage of Azure SQL Database to build SaaS applications, take a look at this newly published case study.
Source: Azure

Simpler Azure management libraries for .NET

One C# statement to authenticate. One statement to create a virtual machine. One statement to modify an existing virtual network, etc. No more guessing about what is required vs. optional vs. non-modifiable.

https://github.com/Azure/azure-sdk-for-net/tree/Fluent

We are announcing the first developer preview release of the new, simplified Azure management libraries for .NET. Our goal is to improve the developer experience by providing a higher-level, object-oriented API optimized for readability and writability. These libraries are built on the lower-level, request-response style auto-generated clients and can run side by side with them.

Azure Authentication

One statement to authenticate and choose a subscription. The Azure class is the simplest entry point for creating and interacting with Azure resources.

Azure azure = Azure.Authenticate(credFile).WithDefaultSubscription();

Create a Virtual Machine

You can create a virtual machine instance by using a Define() … Create() method chain.

Console.WriteLine("Creating a Windows VM");

var windowsVM = azure.VirtualMachines.Define("myWindowsVM")
    .WithRegion(Region.US_EAST)
    .WithNewResourceGroup(rgName)
    .WithNewPrimaryNetwork("10.0.0.0/28")
    .WithPrimaryPrivateIpAddressDynamic()
    .WithNewPrimaryPublicIpAddress("mywindowsvmdns")
    .WithPopularWindowsImage(KnownWindowsVirtualMachineImage.WINDOWS_SERVER_2012_R2_DATACENTER)
    .WithAdminUserName("tirekicker")
    .WithPassword(password)
    .WithSize(VirtualMachineSizeTypes.StandardD3V2)
    .Create();

Console.WriteLine("Created a Windows VM: " + windowsVM.Id);

Update a Virtual Machine

You can update a virtual machine instance by using an Update() … Apply() method chain.

windowsVM.Update()
    .WithNewDataDisk(10)
    .DefineNewDataDisk(dataDiskName)
        .WithSizeInGB(20)
        .WithCaching(CachingTypes.ReadWrite)
        .Attach()
    .Apply();

Management libraries unleash the power of IntelliSense in Visual Studio

Fluent interface-inspired method chains, in combination with IntelliSense, deliver a wizard-like developer experience by presenting required and optional methods in the right sequence. For example, once you choose a Windows virtual machine image, IntelliSense will prompt for an admin user name and then a password, and nothing else. This continues until you reach the minimum required to call Create().

As another example, if you were to choose a Linux virtual machine image, IntelliSense would prompt for a root user name and then SSH key.
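To make that flow concrete, here is a hedged sketch of what the Linux chain might look like, mirroring the Windows example above. The image constant, user name, and sshPublicKey variable are illustrative, and exact method names may differ in the developer preview:

```csharp
// Illustrative sketch only (preview APIs may differ): mirrors the Windows
// example, but IntelliSense guides you to a root user name and SSH key
// instead of an admin password.
var linuxVM = azure.VirtualMachines.Define("myLinuxVM")
    .WithRegion(Region.US_EAST)
    .WithNewResourceGroup(rgName)
    .WithNewPrimaryNetwork("10.0.0.0/28")
    .WithPrimaryPrivateIpAddressDynamic()
    .WithNewPrimaryPublicIpAddress("mylinuxvmdns")
    .WithPopularLinuxImage(KnownLinuxVirtualMachineImage.UBUNTU_SERVER_16_04_LTS)
    .WithRootUserName("tirekicker")
    .WithSsh(sshPublicKey)       // sshPublicKey is an assumed variable
    .WithSize(VirtualMachineSizeTypes.StandardD3V2)
    .Create();
```

The chain will not compile until the minimum required methods have been supplied, which is what makes the IntelliSense-driven "wizard" experience possible.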

Samples

You can find plenty of sample code that illustrates key management scenarios in Azure Virtual Machines, Virtual Machine Scale Sets, Storage, Networking, Resource Manager, Key Vault, and Batch:

Virtual Machines
Manage virtual machine
Manage availability set
List virtual machine images
Manage virtual machines using VM extensions
List virtual machine extension images

Virtual Machines – parallel execution
Create multiple virtual machines in parallel
Create multiple virtual machines with network in parallel

Virtual Machine Scale Sets
Manage virtual machine scale sets (behind an Internet-facing load balancer)

Storage
Manage storage accounts

Network
Manage virtual network
Manage network interface
Manage network security group
Manage IP address
Manage Internet-facing load balancers
Manage internal load balancers

Resource Groups
Manage resource groups
Manage resources
Deploy resources with ARM templates
Deploy resources with ARM templates (with progress)

Key Vault
Manage key vaults

Batch
Manage Batch accounts

Give it a try

This is a developer preview that supports major parts of Azure Virtual Machines, Virtual Machine Scale Sets, Storage, Networking, Resource Manager, Key Vault and Batch. You can run the samples above or go straight to our GitHub repo.

Give it a try and let us know what you think (via e-mail or comments below), particularly:

How usable and effective are the new management libraries for .NET?
Which Azure services would you like to see supported next?
What additional scenarios should be illustrated as sample code?

The next preview version of the Azure Management Libraries for .NET is a work in progress. We will be adding support for more Azure services and tweaking the API over the next few months.
Source: Azure

Temporal Tables are generally available in Azure SQL Database

Temporal Tables allow you to track the full history of data changes directly in Azure SQL Database, without the need for custom coding. With Temporal Tables you can see your data as of any point in time in the past and use declarative cleanup policy to control retention for the historical data.

When should you use Temporal Tables?

You may often find yourself asking fundamental questions: How did important information look yesterday? A month ago? A year ago? What changes have been made since the beginning of the year? What were the dominant trends during a specific period of time? Without proper support in the database, questions like these have never been easy to answer.
Temporal Tables are designed to improve your productivity when you develop applications that work with ever-changing data and when you want to derive important insights from the changes.
Use Temporal Tables to:

Support data auditing in your applications
Analyze trends or detect anomalies over time
Easily implement slowly changing dimension pattern
Perform fine-grained row repairs in case of accidental data errors made by humans or applications
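To make this concrete, here is a minimal T-SQL sketch (the table and column names are illustrative, not from the article) showing how a system-versioned temporal table is created and how a past state is queried:

```sql
-- Illustrative example; table and column names are made up.
CREATE TABLE dbo.Department
(
    DeptId    INT NOT NULL PRIMARY KEY CLUSTERED,
    DeptName  NVARCHAR(50) NOT NULL,
    ValidFrom DATETIME2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo   DATETIME2 GENERATED ALWAYS AS ROW END NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.DepartmentHistory));

-- See the data as it looked at a point in time in the past.
SELECT DeptId, DeptName
FROM dbo.Department
FOR SYSTEM_TIME AS OF '2016-09-01T00:00:00';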

Manage historical data with easy-to-use retention policy

Keeping a history of changes tends to increase database size, especially if historical data is retained for a long period of time. Hence, a retention policy for historical data is an important aspect of planning and managing the lifecycle of every temporal table. Temporal Tables in Azure SQL Database come with an extremely easy-to-use retention mechanism: applying a retention policy requires setting a single parameter during table creation or a table schema change, as shown in the following example.

ALTER TABLE [WebSiteClicks]
SET
(
    SYSTEM_VERSIONING = ON
    (
        HISTORY_TABLE = dbo.WebSiteClicks_History,
        HISTORY_RETENTION_PERIOD = 3 MONTHS
    )
);

You can alter retention policy at any moment and your change will be effective immediately.

Why should you consider Temporal Tables?

If you have requirements for tracking data changes, using Temporal Tables will give you multiple benefits over any custom solution. Temporal Tables will simplify every phase in the development lifecycle: object creation, schema evolution, data modification, point-in-time analysis and data aging.

Next steps

To learn how to integrate Temporal Tables into your application, read the following article with step-by-step instructions. To utilize temporal retention, check out the Manage temporal history with retention policy article on Azure.com.
Visit Channel 9 to hear a real customer story and watch a live presentation with a demo. For more information, check out the MSDN documentation.
Source: Azure

Azure Stream Analytics query testing now available in the new portal

Azure Stream Analytics is a fully managed service allowing you to gain insights and run analytics in near real-time on your big data streaming workloads. The service was first deployed more than 2 years ago, long before the “new” Azure management portal, http://portal.azure.com, even existed.

For the past few months we’ve been hard at work adding exciting new features to the service, as well as transitioning the management user interface from the old https://manage.windowsazure.com to the new portal.

Today we want to announce that we’ve just added the ability to test queries in the “new” portal without needing to start or stop a job. Here’s a quick look at how this works.

Setup

You can set up a Stream Analytics job by following this simple tutorial – How to create a Stream Analytics job.

Once you have created a new Stream Analytics job, you would typically Create Inputs and then Create Outputs. Or you can skip ahead to building the query, and once your query is working, go back and define the Inputs and Outputs to match those used in the query. Both ways work, giving you the flexibility to decide how you wish to work.

For the purposes of this blog post, I have defined a job with one data stream input, called StreamInput, and one output, called Output. You can see these in the query editor blade above.

Open the Query editor blade from the job details screen by clicking on the query in the “Query” lens. Or in our case the < > placeholder because there is no query yet.

You will be presented with the rich editor as before where you create your query. This blade has now been enhanced with a new pane on the left. This new pane shows the Inputs and Outputs used by the Query, and those defined for this job.

There is also one additional Input and Output shown which I did not define. These come from the new query template that we start off with. They will change, or even disappear altogether, as we edit the query. You can safely ignore them for now.

A key requirement, and a common ask from customers writing queries, is being able to test, and test often, to ensure that the output matches what is expected for given input data. Having to save the query after every edit, start the job, wait for incoming data, check the results, and then stop the job again each time you make a small change would be slow, and is sometimes not even possible. A way to test changes to a query quickly was needed.

I am happy to announce that with today’s latest release in the portal, you can now test the query without going through this stop/start process. Here’s how.

Sample data and testing queries

To test with sample input data, right click on any of your Inputs and choose to Upload sample data from file.

Once the upload completes, you can use the Test button to test the query against the sample data you have just provided.

The output of your query is displayed in the browser, with a link to Download results should you wish to save the test output for later use. You can now easily and iteratively modify your query, and test repeatedly to see how the output changes.

In the diagram above you can see how I have changed the query inline to add a second output, called HighAvgTempOutput, to which I am only writing a subset of the data being received.
With multiple outputs used in a query you can see the results for both outputs separately and easily toggle between them.
Once you are happy with the results in the browser, you can save your query, start your job, and sit back and watch the magic of Stream Analytics happen for you.
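As a sketch of the kind of query being tested here, a single job can write to two outputs with two SELECT … INTO statements. The field names (DeviceId, Temperature, EventTime) and the threshold below are assumptions for illustration, not taken from the sample data in the post:

```sql
-- Illustrative Stream Analytics query; field names are assumed.
SELECT DeviceId, AVG(Temperature) AS AvgTemperature
INTO [Output]
FROM [StreamInput] TIMESTAMP BY EventTime
GROUP BY DeviceId, TumblingWindow(minute, 5)

-- Write only the subset with a high average temperature.
SELECT DeviceId, AVG(Temperature) AS AvgTemperature
INTO [HighAvgTempOutput]
FROM [StreamInput] TIMESTAMP BY EventTime
GROUP BY DeviceId, TumblingWindow(minute, 5)
HAVING AVG(Temperature) > 75
```

Testing with uploaded sample data lets you iterate on a query like this, inspect both outputs in the browser, and only then start the job against live input.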

Feature Parity and the road ahead

With the long-awaited addition of sample data and query testing in the new portal, we are happy to announce that we have reached feature parity between the portals. Everything you could do before, and more, is now in the new portal. Going forward, all new development efforts will be concentrated on the new portal. The old portal will continue to work, and existing functionality will remain, until the end of the calendar year, when we plan to completely retire support for Stream Analytics in the old portal.
If you have not tried Stream Analytics in the new portal we encourage you to head over and give it a try.

Next Steps

We’re really excited to bring local testing to the new portal and take this final step to reaching feature parity across the two portals. We hope this makes your life much easier as you go about developing (and testing) your queries.

We invite you to provide feedback on our User Voice page about what you want added next to the service!

If you are new to either Microsoft Azure or Stream Analytics, try it out by signing up for a free Azure trial account and create your first Stream Analytics job.

If you need help or have questions, please reach out to us through the MSDN or Stack Overflow forums, or email the product team directly.
Source: Azure

Azure PowerShell 3.0.0: Highlights and breaking changes

Azure PowerShell is a set of PowerShell cmdlets which assist you in managing your assets in Azure using Azure Resource Manager (ARM) and Azure Service Management (RDFE).  Azure PowerShell 3.0.0 adds various improvements and fixes across multiple Azure resources; however, in accordance with semantic versioning, the introduction of a new major revision indicates breaking changes in a small subset of our cmdlets.  You can install the cmdlets via your favorite installation path indicated in the Azure PowerShell 3.0.0 release notes.

Resource improvements

ApiManagement

Enable support of Importing and Exporting SOAP based APIs (Wsdl Format)

Import-AzureRmApiManagementApi
Export-AzureRmApiManagementApi

Deprecated the cmdlet Set-AzureRmApiManagementVirtualNetworks; in its place, use the cmdlet Update-AzureRmApiManagementDeployment
Enabled support for ARM-based VNETs for configuring VPN via the cmdlet Update-AzureRmApiManagementDeployment
Introduced support for VpnType (None, External, Internal) to differentiate ApiManagement workloads for Internet and intranet
Fixed PowerShell issues


Batch

Added new cmdlet for reactivating tasks

Enable-AzureBatchTask

Added new parameter for application packages on job manager tasks and cloud tasks

New-AzureBatchTask -ApplicationPackageReferences

Added new parameters for job auto termination

New-AzureBatchJob -OnAllTasksComplete -OnTaskFailure
New-AzureBatchJob -ExitConditions

ExpressRoute

Added the service key as a new parameter in the return object when a provider lists all cross connections

Get-AzureCrossConnectionCommand

MachineLearning

Get-AzureRmMlWebService supports paginated response
Remind the user that the Get-AzureRmMlWebService "Name" parameter needs to be used with the "ResourceGroupName" parameter

Network

Added new cmdlet to get application gateway backend health

Get-AzureRmApplicationGatewayBackendHealth

Added support for creating the UltraPerformance SKU

New-AzureRmVirtualNetworkGateway -GatewaySku
New-AzureVirtualNetworkGateway -GatewaySku

RemoteApp

Added cmdlets to enable User Disk and Gold Image Migration feature

Export-AzureRemoteAppUserDisk
Export-AzureRemoteAppTemplateImage

SiteRecovery

New cmdlets have been added to support one-to-one mapping with service objects.

Get-AzureRmSiteRecoveryFabric
Get-AzureRmSiteRecoveryProtectableItem
Get-AzureRmSiteRecoveryProtectionContainerMapping
Get-AzureRmSiteRecoveryRecoveryPoint
Get-AzureRmSiteRecoveryReplicationProtectedItem
Get-AzureRmSiteRecoveryServicesProvider
New-AzureRmSiteRecoveryFabric
New-AzureRmSiteRecoveryProtectionContainerMapping
New-AzureRmSiteRecoveryReplicationProtectedItem
Remove-AzureRmSiteRecoveryFabric
Remove-AzureRmSiteRecoveryProtectionContainerMapping
Remove-AzureRmSiteRecoveryReplicationProtectedItem
Remove-AzureRmSiteRecoveryServicesProvider
Set-AzureRmSiteRecoveryReplicationProtectedItem
Start-AzureRmSiteRecoveryApplyRecoveryPoint
Update-AzureRmSiteRecoveryServicesProvider

The following cmdlets have been modified to support one-to-one mapping with service objects.

Edit-AzureRmSiteRecoveryRecoveryPlan
Get-AzureRmSiteRecoveryNetwork
Get-AzureRmSiteRecoveryNetworkMapping
Get-AzureRmSiteRecoveryProtectionContainer
Get-AzureRmSiteRecoveryStorageClassification
Get-AzureRmSiteRecoveryStorageClassificationMapping
Start-AzureRmSiteRecoveryCommitFailoverJob
Start-AzureRmSiteRecoveryPlannedFailoverJob
Start-AzureRmSiteRecoveryTestFailoverJob
Start-AzureRmSiteRecoveryUnplannedFailoverJob
Update-AzureRmSiteRecoveryProtectionDirection
Update-AzureRmSiteRecoveryRecoveryPlan

HUB support added to Set-AzureRmSiteRecoveryReplicationProtectedItem.
A deprecation warning has been introduced for cmdlets/parameter sets that do not comply with the SiteRecovery service object model.

Breaking changes

Data Lake Store

The following cmdlets were affected this release (PR 2965):

Get-AzureRmDataLakeStoreItemAcl (Get-AdlStoreItemAcl)

This cmdlet was removed and replaced with Get-AzureRmDataLakeStoreItemAclEntry (Get-AdlStoreItemAclEntry).
The old cmdlet returned a complex object representing the access control list (ACL). The new cmdlet returns a simple list of entries in the chosen path's ACL.

# Old
Get-AdlStoreItemAcl -Account myadlsaccount -Path /foo

# New
Get-AdlStoreItemAclEntry -Account myadlsaccount -Path /foo

Get-AzureRmDataLakeStoreItemAclEntry (Get-AdlStoreItemAclEntry)

This cmdlet replaces the old cmdlet Get-AzureRmDataLakeStoreItemAcl (Get-AdlStoreItemAcl).
This new cmdlet returns a simple list of entries in the chosen path's ACL, with type DataLakeStoreItemAce[].
The output of this cmdlet can be passed in to the -Acl parameter of the following cmdlets:

Remove-AzureRmDataLakeStoreItemAcl
Set-AzureRmDataLakeStoreItemAcl
Set-AzureRmDataLakeStoreItemAclEntry

# Old
Get-AdlStoreItemAcl -Account myadlsaccount -Path /foo

# New
Get-AdlStoreItemAclEntry -Account myadlsaccount -Path /foo

Remove-AzureRmDataLakeStoreItemAcl (Remove-AdlStoreItemAcl), Set-AzureRmDataLakeStoreItemAcl (Set-AdlStoreItemAcl), Set-AzureRmDataLakeStoreItemAclEntry (Set-AdlStoreItemAclEntry)

These cmdlets now accept DataLakeStoreItemAce[] for the -Acl parameter.
DataLakeStoreItemAce[] is returned by Get-AzureRmDataLakeStoreItemAclEntry (Get-AdlStoreItemAclEntry).

# Old
$acl = Get-AdlStoreItemAcl -Account myadlsaccount -Path /foo
Set-AdlStoreItemAcl -Account myadlsaccount -Path /foo -Acl $acl

# New
$aclEntries = Get-AdlStoreItemAclEntry -Account myadlsaccount -Path /foo
Set-AdlStoreItemAcl -Account myadlsaccount -Path /foo -Acl $aclEntries

ApiManagement

The following cmdlets were affected this release (PR 2971):

New-AzureRmApiManagementVirtualNetwork

The required parameters to reference a virtual network changed from SubnetName and VnetId to SubnetResourceId, in the format /subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.ClassicNetwork/virtualNetworks/{virtualNetworkName}/subnets/{subnetName}

# Old
$virtualNetwork = New-AzureRmApiManagementVirtualNetwork -Location <String> -SubnetName <String> -VnetId <Guid>

# New
$virtualNetwork = New-AzureRmApiManagementVirtualNetwork -Location <String> -SubnetResourceId <String>

Deprecating Cmdlet Set-AzureRmApiManagementVirtualNetworks

The cmdlet is being deprecated because there was more than one way to set the virtual network associated with an ApiManagement deployment.

# Old
$networksList = @()
$networksList += New-AzureRmApiManagementVirtualNetwork -Location $vnetLocation -VnetId $vnetId -SubnetName $subnetName
Set-AzureRmApiManagementVirtualNetworks -ResourceGroupName "ContosoGroup" -Name "ContosoApi" -VirtualNetworks $networksList

# New
$masterRegionVirtualNetwork = New-AzureRmApiManagementVirtualNetwork -Location <String> -SubnetResourceId <String>
Update-AzureRmApiManagementDeployment -ResourceGroupName "ContosoGroup" -Name "ContosoApi" -VirtualNetwork $masterRegionVirtualNetwork

Network

The following cmdlets were affected this release (PR 2982):

New-AzureRmVirtualNetworkGateway

The bool parameter -ActiveActive has been removed, and the switch parameter -EnableActiveActiveFeature has been added, for enabling the Active-Active feature on a newly created virtual network gateway.

# Old
# Sample of how the cmdlet was previously called
New-AzureRmVirtualNetworkGateway -ResourceGroupName $rgname -name $rname -Location $location -IpConfigurations $vnetIpConfig1,$vnetIpConfig2 -GatewayType Vpn -VpnType RouteBased -EnableBgp $false -GatewaySku HighPerformance -ActiveActive $true

# New
# Sample of how the cmdlet should now be called
New-AzureRmVirtualNetworkGateway -ResourceGroupName $rgname -name $rname -Location $location -IpConfigurations $vnetIpConfig1,$vnetIpConfig2 -GatewayType Vpn -VpnType RouteBased -EnableBgp $false -GatewaySku HighPerformance -EnableActiveActiveFeature

Set-AzureRmVirtualNetworkGateway

The bool parameter -ActiveActive has been removed, and two switch parameters, -EnableActiveActiveFeature and -DisableActiveActiveFeature, have been added, for enabling and disabling the Active-Active feature on a virtual network gateway.

# Old
# Sample of how the cmdlet was previously called
Set-AzureRmVirtualNetworkGateway -VirtualNetworkGateway $gw -ActiveActive $true
Set-AzureRmVirtualNetworkGateway -VirtualNetworkGateway $gw -ActiveActive $false

# New
# Sample of how the cmdlet should now be called
Set-AzureRmVirtualNetworkGateway -VirtualNetworkGateway $gw -EnableActiveActiveFeature
Set-AzureRmVirtualNetworkGateway -VirtualNetworkGateway $gw -DisableActiveActiveFeature
Source: Azure

Keeping up with Azure Government: September Highlights

I’m super excited to recap a busy month of service launches and releases. This has been an extra-busy month for us, in which we landed critical features that many of you have been waiting for. We believe these updates provide a richer, more robust customer experience, and we are excited to bring these products to the US Government market, including: SQL v12, Redis Cache, Virtual Machine Scale Sets, the A1–A7 VM series for ARM, the D/DS VM series for ARM, F-series VMs for ARM, Service Fabric, Storage Service Encryption, and Web Apps.

Specifically, major updates to the platform include:

Enhancements to Azure Resource Manager (ARM) in Azure Government, making it much easier to set up and configure your virtual machines and other services
SQL v12, which lets users focus on rapid app development and removes the need to worry about managing the VMs and infrastructure that support your SQL databases
Storage Service Encryption, which encrypts your data at rest to meet government security and compliance requirements
Bringing F-Series VMs to USGov Iowa

Other highlights from this month include:

Azure Government pricing is now in the pricing calculator allowing government consumers to better budget the use of Azure services.
Microsoft signed the CJIS security addendum for Oregon, adding the 23rd state to the list of states that Microsoft supports for CJIS, more than any other cloud provider, making Microsoft the cloud leader in CJIS

To stay up to date on all things Azure Government, be sure to subscribe to our RSS feed and to receive emails by clicking “Subscribe by Email!” on the Azure Government Blog. To experience the power of Azure Government for your organization, sign up for an Azure Government Trial.
Source: Azure

TIBCO DataSynapse comes to the Azure Marketplace

We are pleased to announce the launch of the TIBCO DataSynapse GridServer Engine – cloud edition in the Azure Marketplace. Available from today, this collaboration expands on the HPC burst work we have been doing with DataSynapse customers over the last few months to provide an easy, cost-effective deployment method.

Over the last 15 years TIBCO Software has been working to empower executives, developers and business users through the integration of applications and ecosystems to analyze data and create real-time solutions.

TIBCO DataSynapse GridServer is a service execution platform for dynamically scaling any application at any time across grid infrastructure. Because it improves productivity, performance, and uptime for existing applications, it is used heavily within financial services for parallelizing certain risk calculations. By bringing DataSynapse GridServer Engine 6.2.0 to Azure, Microsoft is extending the possibilities of this platform by providing infrastructure flexibility, increased scalability, and cost control.

The pay-as-you-go Marketplace image is intended for those customers that have an existing GridServer DataSynapse installation on-premises and wish to extend the environment beyond its current capacity into Azure.

This solution provides one or more virtual machines in Azure which are configured to connect to the director, using the IP and FQDN provided at the time the Marketplace offering is purchased. The solution assumes that the customer either is running one or more on-premises managers and already has a VPN connection set up between their Azure account and their managers, or is running their grid in Azure and already has a virtual network to which they intend to add the engine VM.

TIBCO has provided the option of installing the image on three VM families, A-series, D-series and Dv2. This was intended to give customers a range of price vs. performance and appeal to a number of different workloads.

"Microsoft Azure and TIBCO have worked closely together to deliver the perfect solution that combines Azure's compute resources with TIBCO's DataSynapse GridServer in a pay-as-you-go model. This enables our investment banking customers to tackle new regulations, such as Basel III's FRTB, and meet the large compute demands."

– Mike Kennedy, TIBCO Sr. Director of Engineering

We are excited to see the continued development of DataSynapse GridServer for use with critical compute workloads and enable the benefits of the scale and elasticity of the Azure cloud. Further information on this release can be found here.

 
Source: Azure

Azure Backup hosts Ask Me Anything session

The Azure Backup team will host a special Ask Me Anything session on /r/Azure on Thursday, October 20, 2016, from 9:00 AM to 1:00 PM PDT.

What's an AMA session?

We'll have folks from across the Azure Backup engineering team available to answer any questions you have. You can ask us anything about our products, services, or even our team!

Why are you doing an AMA?

We like reaching out to and learning from our customers and the community. We want to know how you use Azure and Azure Backup and how your experience has been. Your questions provide insights into how we can make the service better. We did this last year, were excited about the questions and feedback we received, and are doing it again.

Who will be there?

You, of course! We'll also have PMs and developers from the Azure Backup team participating throughout the day.
Have any questions about the following topics? Bring them to the AMA.

•    Backup of Azure IaaS VMs (both Classic and Resource Manager VMs)
•    Azure Backup Server
•    System Center Data Protection Manager (we announced VMware VM backup support a couple of months ago)
•    Microsoft Azure Recovery Services Agent

Why should I ask questions here instead of StackOverflow, MSDN or Twitter? Can I really ask anything?

An AMA is a great place to ask us anything. StackOverflow and MSDN have restrictions on which questions can be asked, while Twitter only allows 140 characters. With an AMA, you’ll get answers directly from the team and have a conversation with the people who build these products and services.

Here are some question ideas:
•    What is Azure Backup? What is the cloud-first approach to backup?
•    How should I choose between Azure Backup Server and System Center Data Protection Manager for my application backup?
•    What are the pros/cons of using VM backup versus File/Folder backup?
•    How does the “Protected Instances” billing model work?
•    Why should I pick cloud over tape for long term retention?
•    What can be protected in Recovery Services vault?

Go ahead, ask us anything about our public products or the team. Please note, we cannot comment on unreleased features and future plans.

Join us! We're looking forward to having a conversation with you!
Source: Azure

Microsoft Common Controls Hub provides uncommon convenience

In an effort to push the envelope in providing our customers transparency and a top-grade compliance experience, I’d like to announce a compliance tool newly available through the Microsoft Trust Center for Azure: the Common Controls Hub, powered by Unified Compliance. This customized Microsoft portal lets you compare control frameworks across a number of compliance mandates and privacy regimes, including ISO 27001, SOC 1 and 2, PCI, FedRAMP, the EU Model Clauses, hundreds of geography-specific requirements, and many others.

We’ve arranged for any Microsoft customer (Azure, Office 365, CRM, or others) to create a free account to access a Microsoft-curated library of complete standards guidance. You’ll see control descriptions and objectives, have the ability to map requirements from one framework to another, and gain a deeper understanding of any gaps in your own compliance activities.

Best of all, these frameworks are maintained for you! Researchers are constantly revising the source data based on updates to the standards, and ensuring default mappings stay relevant. You can build a custom controls list to help guide your own security and audit efforts, and once you have narrowed down the set of controls that are applicable to your environment, you can track your status against them.

The Microsoft Common Controls Hub is another step in providing the cloud industry’s highest levels of transparency and compliance with international standards. In addition to assessments and attestations against more than 45 different certifications, laws, and regulations, Azure remains committed to enabling our customers to achieve compliance with their own industries’ mandates and regional requirements.
Source: Azure

Azure IoT Gateway SDK integrates support for Azure Functions

At Microsoft, we believe the edge of the network plays a critical role in IoT, not only in IoT devices themselves but also in intermediate field gateways.  Earlier this year, we announced the open source Azure IoT Gateway SDK, our approach to accelerating the development of IoT edge scenarios such as supporting legacy devices, minimizing latency, conserving network bandwidth, and addressing security concerns.  Since then, we’ve been busy improving and enhancing our SDK completely out in the open.

Today, we are happy to announce an exciting new capability we’ve added to the IoT Gateway SDK: Support for Azure Functions.  With Azure Functions integration, developers can easily call cloud-based logic from their IoT gateway. Just write an Azure Function and you can quickly call it from a Function Module in the Azure IoT Gateway SDK.

For example: if something goes wrong in your field gateway environment, such as local devices that can’t connect or misbehave, and you want to upload diagnostic information to your Azure IoT solution for inspection by operations, our new Functions integration makes this simple. Just create an Azure Function that takes this data, stores it, and alerts operations – and then call it from your gateway running the Azure IoT Gateway SDK when you encounter a problem.

The Azure IoT Gateway SDK supports everything from low-level modules written in C to connect the broad variety of deployed devices, to high level modules for productivity such as our new Azure Functions support.  The best part of the Azure IoT Gateway SDK is how easy it is to chain these modules together to create reusable processing pipelines that suit your needs.
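Conceptually, chaining modules is declared in the gateway's JSON configuration: a list of modules to load, plus links that route messages from one module to another. The sketch below is illustrative only; the exact schema, key names, module paths, and Function arguments vary by SDK version and platform.

```json
{
  "modules": [
    { "name": "sensor", "path": "./modules/libsensor.so", "args": null },
    { "name": "azure_function", "path": "./modules/libazure_function.so",
      "args": {
        "hostname": "myfunctions.azurewebsites.net",
        "relativePath": "api/UploadDiagnostics"
      } }
  ],
  "links": [
    { "source": "sensor", "sink": "azure_function" }
  ]
}
```

Swapping or reordering entries in the links array is what makes these processing pipelines reusable without changing module code.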

We’ve already seen some great success stories from businesses benefitting from our approach to edge intelligence, and we’re looking forward to seeing what our customers and partners will create with this exciting new capability.
Source: Azure