Visual Studio 2017 in Review

The new version of Microsoft's development environment tames Visual Studio's high resource footprint through modularization. Various improvements such as "Live Unit Testing" make everyday programming easier.

Source: Heise Tech News

Protect Windows Server System State to the cloud with Azure Backup!

One of the key endeavors of Azure Backup's cloud-first approach is to empower enterprises to recover from security attacks, corruption, disasters, or data loss quickly, securely, and reliably. Restoring servers efficiently in the wake of evolving IT threats involves going beyond recovering data alone from backups. Our customers' operating systems and applications are configured with widely varying degrees of complexity. Restoring this dynamic configuration, captured in the form of the Windows Server System State, in addition to data and with minimal infrastructure, forms a critical component of disaster recovery.

Today we are extending the data backup capabilities of the Azure Backup agent to enable customers to perform comprehensive, secure, and reliable Windows Server recoveries. We are excited to preview the support for backing up Windows Server System State directly to Azure with Azure Backup.

Azure Backup will now integrate with the Windows Server Backup feature that is available natively on every Windows Server and provide seamless and secure backups of your Windows Server System State directly to Azure without the need to provision any on-premises infrastructure.

Value proposition

Comprehensive Protection for Active Directory, File Servers, and IIS Web Servers: Active Directory (AD) is the most critical database of any organization and therefore requires a backup strategy that allows for reliable recoveries in critical scenarios. The System State of a domain controller captures the Active Directory database and the files required for domain-controller synchronization, allowing for targeted Active Directory protection and restores.
On a file server, System State captures important file-cluster configurations and the policies that protect files from unauthorized access. Combined with file-folder backup, System State backup with the Azure Backup agent provides the ability to comprehensively recover file servers.
On an IIS web server, System State captures the IIS Metabase, which contains crucial configuration information about the server, its sites, and even files and folders, and is therefore the recommended option for restoring web servers.
Cost-Effective Offsite Backup for Disaster Recovery: The System State of most Windows Servers is less than 50 GB in size. At that size, for $5 a month plus pay-as-you-go Azure storage, Azure Backup eliminates all infrastructure and licensing costs and enables you to protect your Windows Server System State for reliable restores. There is no need to provision local hard drives or offsite storage, or to employ additional tools or servers to manage System State backups and ensure they are kept offsite. Azure Backup takes care of off-siting System State to Azure on a specified schedule!
Secure Backups: The enhanced security features built into Azure Backup and data-resilience offered by Azure ensure that your critical system state backups remain secure from malicious attacks, corruptions, and deletions.
Flexible Restores: With Azure Backup’s Restore-as-a-Service, you can restore System State files from Azure without any egress charges. Additionally, you can apply System State to your Windows Servers at your convenience using the native Windows Server Backup utility.
Single management pane in Azure: Azure Backup lets you configure, manage, and monitor your Windows Server System State backups alongside your other backups from the Recovery Services vault in the Azure portal.

Availability for Windows Server (Preview)

The support for backing up System State with the Azure Backup agent is available in preview for all Windows Server versions from Windows Server 2016 all the way down to Windows Server 2008 R2!

Getting started

Follow the four simple steps below to start protecting your Windows Servers using Azure Backup!

Create an Azure Recovery Services Vault.
Download the latest version of the Azure Backup Agent from the Azure Portal.  
Install and Register the Agent.
Start protecting Windows Server System State and other Files and Folders directly to Azure!

Related links and additional content

New to Azure Backup? Sign up for a free Azure trial subscription.
Need help? Reach out to the Azure Backup forum for support or browse the Azure Backup documentation.
Tell us how we can improve Azure Backup by contributing new ideas and voting up existing ones.
Follow us on Twitter @AzureBackup for the latest news and updates.
Connect with us at the Azure Tech Community.

Source: Azure

Higher database eDTU limits for Standard elastic pools in Azure SQL Database

Until now, the maximum eDTUs per database in a Standard elastic pool in Azure SQL Database were limited to 100. We are pleased to announce the public preview of an increase in this limit to as much as 3,000 eDTUs, with new performance steps starting at 200 eDTUs. These higher limits are especially well suited to databases with activity bursts that demand more CPU than Standard pools previously provided. For IO-intensive workloads, Premium pools continue to provide the best performance, with lower latency per IO and more IOPS per eDTU.

New choices in database eDTU limits for Standard pools

Learn More

To learn more about SQL Database elastic pools, please visit the SQL Database elastic pool webpage. For pricing information, please visit the SQL Database pricing webpage.

Source: Azure

IoT Hub message routing: now with routing on message body

IoT Hub message routing can now be done on the message body! Thanks to the flood of customer feedback requesting the ability to route messages based on the message body, the team prioritized the work, and it's now available for everyone to use.

Back in December, we released message routing for IoT Hub to simplify IoT solution development. Message routing allows customers to automatically route messages to different services based on customer-defined queries in IoT Hub itself, and we take care of all of the difficult implementation architecture for you. Message routing initially shipped with routing based on message headers; today, I am happy to announce that you can also route JSON messages based on the message body.

Message routing based on headers gives customers the ability to route messages to custom endpoints without the service cracking open the telemetry flowing through it, but it came with a limitation: customers had to add information to the headers that they weren't otherwise including, which limited its usefulness. Many customers wanted to route directly on the contents of the message body, because that's where the interesting information already was. Routing on the message body is intuitive and gives customers full control over message routing.

It’s super simple to route based on message body: just use $body in the route query to access the message body. For example, my device uses the C# device SDK to send messages using this example code:

DeviceClient deviceClient = DeviceClient.CreateFromConnectionString(deviceClientConnectionString);

string messageBody = @"{
    ""Weather"": {
        ""Temperature"": 50,
        ""Time"": ""2017-03-09T00:00:00.000Z"",
        ""PrevTemperatures"": [
            20,
            30,
            40
        ],
        ""IsEnabled"": true,
        ""Location"": {
            ""Street"": ""One Microsoft Way"",
            ""City"": ""Redmond"",
            ""State"": ""WA""
        },
        ""HistoricalData"": [
            {
                ""Month"": ""Feb"",
                ""Temperature"": 40
            },
            {
                ""Month"": ""Jan"",
                ""Temperature"": 30
            }
        ]
    }
}";

// Encode message body using UTF-8
byte[] messageBytes = Encoding.UTF8.GetBytes(messageBody);

using (var message = new Message(messageBytes))
{
    // Set message body type and content encoding.
    message.ContentEncoding = "utf-8";
    message.ContentType = "application/json";

    // Add other custom application properties.
    message.Properties["Status"] = "Active";

    await deviceClient.SendEventAsync(message);
}

There are a variety of ways to route messages based on the information provided in the example message. Here are some of the kinds of queries you might want to run:

Simple body reference

$body.Weather.Temperature = 50
$body.Weather.IsEnabled
$body.Weather.Location.State = 'WA'

Body array reference

$body.Weather.HistoricalData[0].Month = 'Feb'

Multiple body references

$body.Weather.Temperature >= $body.Weather.PrevTemperatures[0] + $body.Weather.PrevTemperatures[1]
$body.Weather.Temperature = 50 AND $body.Weather.IsEnabled

Combined with built-in functions

length($body.Weather.Location.State) = 2
lower($body.Weather.Location.State) = 'wa'

Combination with message header

$body.Weather.Temperature = 50 AND Status = 'Active'
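To make the path semantics of these queries concrete, here is a small Python sketch of our own (not IoT Hub code or any SDK); the `resolve_body_path` helper is a hypothetical illustration of how a `$body` path selects a value out of the example JSON payload above:

```python
import json

# The same payload the example device sends above.
payload = json.loads("""{
  "Weather": {
    "Temperature": 50,
    "Time": "2017-03-09T00:00:00.000Z",
    "PrevTemperatures": [20, 30, 40],
    "IsEnabled": true,
    "Location": {"Street": "One Microsoft Way", "City": "Redmond", "State": "WA"},
    "HistoricalData": [
      {"Month": "Feb", "Temperature": 40},
      {"Month": "Jan", "Temperature": 30}
    ]
  }
}""")

def resolve_body_path(body, path):
    """Resolve a dotted path like 'Weather.HistoricalData[0].Month'
    against a parsed JSON body (illustrative helper, not SDK code)."""
    current = body
    for part in path.split("."):
        if "[" in part:  # array reference, e.g. HistoricalData[0]
            name, index = part[:-1].split("[")
            current = current[name][int(index)]
        else:
            current = current[part]
    return current

# These mirror the sample queries shown above.
assert resolve_body_path(payload, "Weather.Temperature") == 50
assert resolve_body_path(payload, "Weather.IsEnabled") is True
assert resolve_body_path(payload, "Weather.HistoricalData[0].Month") == "Feb"
assert (resolve_body_path(payload, "Weather.Temperature")
        >= resolve_body_path(payload, "Weather.PrevTemperatures[0]")
        + resolve_body_path(payload, "Weather.PrevTemperatures[1]"))
```

The actual query language also supports comparisons, AND/OR, and built-in functions such as length() and lower(), as the examples show; the sketch only illustrates how a `$body` path navigates objects and arrays.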

In order for IoT Hub to know whether the message can be routed based on its body contents, the message must contain specific headers which describe the content and encoding of its body. In particular, messages must have both these headers for routing on message body to work:

Content type of "application/json"
Content encoding must match one of:

"utf-8"
"utf-16"
"utf-32"

If you are using the Azure IoT Device SDKs, it is pretty straightforward to set the message headers to the required properties. If you are using a third-party protocol library, you can use this table to see how the headers manifest in each of the protocols that IoT Hub supports:

                  AMQP                     HTTP                    MQTT
Content type      iothub-content-type      iothub-contenttype      RFC 2396-encoded($.ct)=RFC 2396-encoded(application/json)
Content encoding  iothub-content-encoding  iothub-contentencoding  RFC 2396-encoded($.ce)=RFC 2396-encoded(encoding)

The message body has to be well-formed JSON in order for IoT Hub to route based on the message body. Messages can still be routed based on message headers regardless of whether the content type/content encoding are present. Content type and content encoding are only required for IoT Hub to route based on the body of the message.
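The gate described above can be sketched in a few lines of Python. This is our own illustration of the check the service effectively performs, not IoT Hub source code; the `is_body_routable` function name is hypothetical:

```python
import codecs
import json

# Encodings accepted for body routing, per the requirements above.
ROUTABLE_ENCODINGS = {"utf-8", "utf-16", "utf-32"}

def is_body_routable(content_type, content_encoding, body_bytes):
    """Return the parsed JSON body if the message qualifies for
    body-based routing, or None if only header routing applies."""
    if content_type != "application/json":
        return None
    if content_encoding not in ROUTABLE_ENCODINGS:
        return None
    try:
        return json.loads(codecs.decode(body_bytes, content_encoding))
    except (UnicodeDecodeError, ValueError):
        return None  # not well-formed JSON: header routing still works

body = '{"Weather": {"Temperature": 50}}'.encode("utf-16")
parsed = is_body_routable("application/json", "utf-16", body)
assert parsed["Weather"]["Temperature"] == 50

# Wrong content type: body routing is unavailable, headers remain usable.
assert is_body_routable("text/plain", "utf-8", b"hello") is None
```

Note how a message that fails any of the checks is not rejected; it simply cannot be matched by `$body` queries.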

This feature was brought to you in part by the outpouring of feedback we got requesting the ability to route messages based on message body, and I want to send a huge THANK YOU to everyone who requested the functionality. As always, please continue to submit your suggestions through the Azure IoT User Voice forum or join the Azure IoT Advisors Yammer group.
Source: Azure

Elon Musk Says He'll Leave Presidential Councils If Trump Quits Paris Climate Accord

AP Photo/Evan Vucci

Tesla and SpaceX chief executive Elon Musk will step down from several Trump administration advisory councils if the president pulls the US out of the Paris climate agreement, he tweeted on Wednesday.

Musk sits on an economic advisory council as well as a manufacturing group. Tesla's stated mission is “to accelerate the world’s transition to sustainable energy.” He previously said serving on two of Trump's advisory councils would “serve the greater good.”

Musk wouldn't be the first tech leader to step down from Trump's advisory groups. Uber CEO Travis Kalanick resigned from Trump's economic advisory council in February after facing backlash from users and protests outside the ride-hail company's San Francisco headquarters.

BuzzFeed News reported in January that some Tesla customers had canceled their Model 3 orders over Musk's relationship with Trump.

Tesla did not immediately return a request for comment.

Source: BuzzFeed

Run massively parallel R jobs in Azure, now at a fraction of the price

We continue to add new capabilities to our lightweight R package, doAzureParallel, built on top of Azure Batch, which lets you easily use Azure's flexible compute resources right from your R session. Combined with the recently announced low-priority VMs on Azure Batch, you can now run your parallel R jobs at a fraction of the price. We have also included other commonly requested capabilities to enable you to do more, and to do it more easily, with doAzureParallel.

Using R with low priority VMs to reduce cost

Our second major release comes with full support for low-priority VMs, letting R users run their jobs on Azure’s surplus compute capacity at up to an 80% discount.

For data scientists, low-priority VMs are a great way to save costs when experimenting with and testing algorithms, such as parameter tuning (parameter sweeps) or comparing different models entirely. And Batch handles any pre-empted low-priority nodes by automatically rescheduling the job to another node.

You can also mix both on-demand nodes and low-priority nodes. Supplementing your regular nodes with low-priority nodes gives you a guaranteed baseline capacity and more compute power to finish your jobs faster. You can also spin up regular nodes using autoscale to replace any pre-empted low-priority nodes to maintain your capacity and to ensure that your job completes when you need it.
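As a sketch, a cluster configuration mixing dedicated and low-priority nodes might look like the following. The field names follow the doAzureParallel cluster.json format generated by generateClusterConfig, but the exact schema can vary by package version, and all values here are illustrative:

```json
{
  "name": "my-r-pool",
  "vmSize": "Standard_D2_v2",
  "maxTasksPerNode": 1,
  "poolSize": {
    "dedicatedNodes": { "min": 2, "max": 2 },
    "lowPriorityNodes": { "min": 0, "max": 10 },
    "autoscaleFormula": "QUEUE"
  }
}
```

Here the two dedicated nodes provide a guaranteed baseline, while up to ten low-priority nodes are added as work queues up, so a burst of pre-emptions slows the job rather than stopping it.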

Other new features

Aside from the scenarios that low-priority VMs enable, this new release includes additional tools and commonly requested features to help you do the following:

Parameter tuning & cross-validation with caret
Job management and monitoring to make it easier to run long-running R jobs
Leverage resource files to preload data onto your cluster
Additional utilities to help you read from and write to Azure Blob storage
ETL and data prep with Hadley Wickham's plyr

Getting started with doAzureParallel

doAzureParallel is extremely easy to use. With just a few lines of code, you can register Azure as your parallel backend, which can then be used by foreach, caret, plyr, and many other popular open-source packages.

Once you install the package, getting started takes just a few lines of code:

# 1. Generate your credentials config and fill it out with your Azure information
generateCredentialsConfig("credentials.json")

# 2. Set your credentials
setCredentials("credentials.json")

# 3. Generate your cluster config to customize your cluster
generateClusterConfig("cluster.json")

# 4. Create your cluster in Azure, passing it your cluster config file
cluster <- makeCluster("cluster.json")

# 5. Register the cluster as your parallel backend
registerDoAzureParallel(cluster)

# Now you are ready to use Azure as your parallel backend for foreach, caret, plyr, and many more

For more information, visit the doAzureParallel GitHub page for a full getting-started guide, samples, and documentation.

We look forward to you using these capabilities and hearing your feedback. Please contact us at razurebatch@microsoft.com with feedback, or feel free to contribute to our GitHub repository.

Additional information:

Download and get started with doAzureParallel
For questions related to using the doAzureParallel package, please see our docs, or feel free to reach out to razurebatch@microsoft.com
Please submit issues via GitHub

Additional resources:

See Azure Batch, the underlying Azure service used by the doAzureParallel package
More general purpose HPC on Azure
Learn more about low-priority VMs
Visit our previous blog post on doAzureParallel

Source: Azure

Use Amazon Cloud Directory Typed Links to Create and Search Relationships across Hierarchies

Starting today, you can create and search relationships across hierarchies in Amazon Cloud Directory by using typed links. With typed links, you can build directories that can be searched across hierarchies more efficiently by filtering your queries based on relationship type. Typed links also enable you to model different types of relationships between objects in different hierarchies and to use relationships to prevent objects from being deleted accidentally.
Source: aws.amazon.com