Upgrade classic Backup and Site Recovery vaults to ARM Recovery Services vaults

Today, we are pleased to offer a seamless upgrade of classic Backup and Site Recovery vaults to ARM-based Recovery Services vaults.

In May 2016, we announced the general availability of the Recovery Services (RS) vault, based on Azure Resource Manager, for Azure Backup and Azure Site Recovery. Since then, we have announced many new features for both services that are available only with Recovery Services vaults. The new upgrade feature allows customers on the older classic vaults of both Backup and ASR to upgrade seamlessly to an RS vault, with minimal downtime and no loss of data, recovery points, or configuration settings, and to take advantage of all the new features available with RS vaults.

New features available only in Recovery Services vaults

Backup

Enhanced capabilities to help secure your backup data: With Recovery Services vaults, Azure Backup provides security capabilities to protect cloud backups. These security features ensure that you can secure your backups and recover data using cloud backups even if production and backup servers are compromised. Learn more
Central monitoring for your hybrid IT environment: With Recovery Services vaults, you can now monitor not only your Azure IaaS VM backups but also your on-premises backups from a central portal. Learn more
Role-based access control: Recovery Services vaults are based on the Azure Resource Manager model and thus bring you the benefits of role-based access control (RBAC), restricting backup and restore access to a defined set of user roles. Learn more
Protect all configurations of Azure Virtual Machines: Recovery Services vaults can protect Resource Manager based V2 VMs including Premium Disks, Managed Disks and Encrypted VMs. This allows you to upgrade your classic V1 VMs to V2 VMs and retain your older recovery points for the V1 VMs as well as configure protection for the newly upgraded V2 VMs in the same vault. Learn more
Instant restore for IaaS VMs: Using Recovery Services vaults, you can now restore files and folders without having to restore the entire IaaS VM, enabling faster restores. This support is available for both Windows and Linux VMs. Learn more

Site Recovery

Azure Resource Manager support: You can protect your virtual machines and physical machines and fail them over into the Azure Resource Manager stack. You also get the benefits of RBAC, which restricts replication and recovery operations to a defined set of user roles.
Streamlined ‘Getting Started’ experience: Simplicity has been the design goal of Recovery Services vaults. The new Getting Started experience vastly simplifies setting up disaster recovery for your applications.
Exclude Disk: This feature allows you to exclude specific disks that do not contain important data from being replicated. You can save storage and network resources by not replicating unwanted churn.
Support for Premium and locally redundant storage (LRS): You can now protect applications that need higher IOPS by replicating into Premium storage.
Support for managed disks: You can attach managed disks to your machines after failover to Azure. Managed disks simplify disk management for Azure IaaS VMs by handling the storage accounts associated with the VM disks for you.

For more details, please refer to this blog.

No impact to ongoing replication or existing backups

The entire upgrade process has been designed to be quick, smooth and easy to perform.

There is no physical data movement between the old vault and the upgraded vault and the upgrade process only involves updating configuration settings.
Once started, the upgrade typically takes about 15-30 minutes. During periods of heavy load, this could take up to 1 hour.
During the upgrade process, replication and scheduled backups will continue to happen, so you will remain protected.
During the upgrade, you will not be able to perform management operations.

For Backup, these include new registrations, configuring new cloud backups and restoring IaaS VMs.
For Site Recovery, they include operations such as registering a new server, performing a test failover, executing a failover, and failing back.

Post upgrade, all your settings and configuration will be retained, including all backup and recovery points that you created from the classic vault.

How to upgrade?

There is significant demand for upgrading to the RS vault, and we expect a number of customers to sign up. To streamline this process, we will be releasing customers into the upgrade queue in batches. You can sign up for the upgrade using the following links:

Backup vaults: Sign up
Site Recovery vaults: Sign up

Once your subscription has been white-listed for upgrade, Microsoft will contact you to proceed with the upgrade.
Source: Azure

Introducing Google Cloud IoT Core: for securely connecting and managing IoT devices at scale

By Indranil Chakraborty, Product Manager, Google Cloud

Today we’re announcing a new fully managed Google Cloud Platform (GCP) service called Google Cloud IoT Core. Cloud IoT Core makes it easy for you to securely connect your globally distributed devices to GCP, centrally manage them, and build rich applications by integrating with our data analytics services. Furthermore, all data ingestion, scalability, availability and performance needs are automatically managed for you in GCP style.

When used as part of a broader Google Cloud IoT solution, Cloud IoT Core gives you access to new operational insights that can help your business react to, and optimize for, change in real time. This advantage has value across multiple industries; for example:

Utilities can monitor, analyze and predict consumer energy usage in real time
Transportation and logistics firms can proactively stage the right vehicles/vessels/aircraft in the right places at the right times
Oil and gas and manufacturing companies can enable intelligent scheduling of equipment maintenance to maximize production and minimize downtime

So, why is this the right time for Cloud IoT Core?

About all the things

Many enterprises that rely on industrial devices such as sensors, conveyor belts, farming equipment, medical equipment and pumps — particularly, globally distributed ones — are struggling to monitor and manage those devices for several reasons:

Operational cost and complexity: The overhead of managing the deployment, maintenance and upgrades for exponentially more devices is stifling. And even with a custom solution in place, the resource investments required for necessary IT infrastructure are significant.
Patchwork security: Ensuring world-class, end-to-end security for globally distributed devices is out of reach — or at least not a core competency — for most organizations.
Data fragmentation: Despite the fact that machine-generated data is now an important data source for making good business decisions, the massive amount of data generated by these devices is often stored in silos with a short expiration date, and hence never reaches downstream analytic systems (nor decision makers).

Cloud IoT Core is designed to help resolve these problems by removing risk, complexity and data silos from the device monitoring and management process. Instead, it offers you the ability to more securely connect and manage all your devices as a single global system. Through a single pane of glass you can ingest data generated by all those devices into a responsive data pipeline — and, when combined with other Cloud IoT services, analyze and react to that data in real time.

Key features and benefits

Several key Cloud IoT Core features help you meet these goals, including:

Fast and easy setup and management: Cloud IoT Core lets you connect up to millions of globally dispersed devices into a single system with smooth and even data ingestion ensured under any condition. Devices are registered to your service quickly and easily via the industry-standard MQTT protocol. For Android Things-based devices, firmware updates can be automatic.
Security out-of-the-box: Secure all device data via industry-standard security protocols. (Combine Cloud IoT Core with Android Things for device operating-system security, as well.) Apply Google Cloud IAM roles to devices to control user access in a fine-grained way.
Native integration with analytic services: Ingest all your IoT data so you can manage it as a single system and then easily connect it to our native analytic services (including Google Cloud Dataflow, Google BigQuery and Google Cloud Machine Learning Engine) and partner BI solutions (such as Looker, Qlik, Tableau and Zoomdata). Pinpoint potential problems and uncover solutions using interactive data visualizations, or build rich machine-learning models that reflect how your business works.
Auto-managed infrastructure: All this in the form of a fully-managed, pay-as-you-go GCP service, with no infrastructure for you to deploy, scale or manage.
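As a concrete illustration of the device-registration model above, the sketch below builds the MQTT client ID and topic paths a device would present to the Cloud IoT Core bridge. The resource-path formats follow Google's documented MQTT bridge conventions, but the project, region, registry, and device names here are hypothetical, and authentication (a JWT signed with the device's private key, sent as the MQTT password) is only noted in a comment.

```python
# Minimal sketch: the identifiers a device presents to the Cloud IoT Core
# MQTT bridge. Project, region, registry, and device names are hypothetical.
# In a real client, the MQTT password would be a JWT signed with the
# device's private key; that step is omitted here.

def mqtt_client_id(project_id, cloud_region, registry_id, device_id):
    """Full device resource path, used as the MQTT client ID."""
    return (
        f"projects/{project_id}/locations/{cloud_region}"
        f"/registries/{registry_id}/devices/{device_id}"
    )

def telemetry_topic(device_id):
    """Topic the device publishes telemetry events to."""
    return f"/devices/{device_id}/events"

def config_topic(device_id):
    """Topic the device subscribes to for configuration pushed from the cloud."""
    return f"/devices/{device_id}/config"

if __name__ == "__main__":
    client_id = mqtt_client_id("my-project", "us-central1", "my-registry", "bike-001")
    print(client_id)
    # → projects/my-project/locations/us-central1/registries/my-registry/devices/bike-001
    print(telemetry_topic("bike-001"))  # → /devices/bike-001/events
```

A production client would pass `client_id` to an MQTT library such as paho-mqtt when connecting to the bridge, then publish readings to the telemetry topic.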

“With Google Cloud IoT Core, we have been able to connect large fleets of bicycles to the cloud and quickly build a smart transportation fleet management tool that provides operators with a real-time view of bicycle utilization, distribution and performance metrics, and it forecasts demand for our customers.”
—  Jose L. Ugia, VP Engineering, Noa Technologies

Next steps
Cloud IoT Core is currently available as a private beta, and we’re launching with these hardware and software partners:

Cloud IoT Device Partners

Actions Semiconductor
Allwinner Technology
ARM
Marvell
Microchip
Intel
Mongoose OS
NXP
Realtek
Sierra Wireless
SOTEC

Cloud IoT Application Partners

Helium
Losant
Mnubo
Tellmeplus

When generally available, Cloud IoT Core will serve as an important, foundational tool for hardware partners and customers alike, offering scalability, flexibility and efficiency for a growing set of IoT use cases. In the meantime, we look forward to your feedback!

Source: Google Cloud Platform

General Availability: Azure Search parses JSON Blobs

Today, we are happy to announce general availability for JSON parsing with Azure Search’s Blob Storage indexer.

Azure Search has long supported indexers for a variety of data sources on Azure: DocumentDB, Azure SQL Database, Table storage, and Blob storage. Indexers allow Azure Search to automatically pull data (along with changes and deletions) into an Azure Search index without your writing any code. The Blob indexer in particular is interesting because it can crack open and index a multitude of file types: Office documents, PDFs, HTML files, and more.

With today’s announcement, we are releasing the ability for the Blob Storage indexer to parse JSON content stored in blobs. This capability is not currently configurable in the Azure Portal. Note that support for parsing multiple documents from JSON arrays remains in preview.

Indexing JSON objects

With JSON parsing enabled, the Blob Storage Indexer can index properties of JSON objects, like the example below, into separate fields in your search index.

{
    "text" : "A hopefully useful article explaining how to parse JSON blobs",
    "datePublished" : "2016-04-13",
    "tags" : [ "search", "storage", "howto" ]
}

To set up JSON parsing, create a datasource as usual:

POST https://[service name].search.windows.net/datasources?api-version=2016-09-01
Content-Type: application/json
api-key: [admin key]

{
    "name" : "my-blob-datasource",
    "type" : "azureblob",
    "credentials" : { "connectionString" : "DefaultEndpointsProtocol=https;AccountName=;AccountKey=;" },
    "container" : { "name" : "my-container", "query" : "optional, my-folder" }
}

Then, create an indexer (https://docs.microsoft.com/rest/api/searchservice/create-indexer) and set the parsingMode parameter to json.

POST https://[service name].search.windows.net/indexers?api-version=2016-09-01
Content-Type: application/json
api-key: [admin key]

{
    "name" : "my-json-indexer",
    "dataSourceName" : "my-blob-datasource",
    "targetIndexName" : "my-target-index",
    "schedule" : { "interval" : "PT2H" },
    "parameters" : { "configuration" : { "parsingMode" : "json" } }
}

Azure Search only supports primitive data types, string arrays, and GeoJSON points, which means that the Blob Storage indexer cannot index arbitrary JSON. However, it is possible to select parts of the JSON object and “lift” them to top-level fields of an Azure Search document. To learn more about this, visit our documentation on field mappings. 
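For instance, a field mapping can use a JSON Pointer path as its source to lift a nested property into a top-level index field. The indexer body below is a hypothetical sketch: the `/article/text` path and target field names are illustrative, not taken from the example above.

{
    "name" : "my-json-indexer",
    "dataSourceName" : "my-blob-datasource",
    "targetIndexName" : "my-target-index",
    "parameters" : { "configuration" : { "parsingMode" : "json" } },
    "fieldMappings" : [
        { "sourceFieldName" : "/article/text", "targetFieldName" : "text" },
        { "sourceFieldName" : "/article/datePublished", "targetFieldName" : "datePublished" }
    ]
}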

Learn More

To read more about Azure Search and its capabilities, visit our documentation. Please visit our pricing page to learn about the various tiers of service that fit your needs.
Source: Azure

The Latest Docker Certified Container and Plugins for March and April 2017

The Docker Certification Program provides a way for technology partners to validate and certify their software or plugin as a container for use on the Docker Enterprise Edition platform. Since the initial launch of the program in March, more Containers and Plugins have been certified and made available for download.
 
Certified Containers and Plugins are technologies that are built with best practices as Docker containers, tested and validated against the Docker Enterprise Edition platform and APIs, checked against security requirements, reviewed by Docker partner engineering, and cooperatively supported by both Docker and the partner. Docker Enterprise Edition and Certified Technology provide assurance and support to businesses for their critical application infrastructure.
Check out the latest Docker Certified technologies on the Docker Store:

Dynatrace provides monitoring of Docker applications and Docker clusters out of the box.
{code} by Dell EMC certified a number of REX-Ray volume plugins: REX-Ray for AWS EFS, REX-Ray for AWS EBS, REX-Ray for S3FS, REX-Ray for Isilon, REX-Ray for GCE, and REX-Ray for ScaleIO.
HPE OpsBridge Agent provides monitoring of Docker applications with HPE Operations Bridge.
CoScale Agent provides a lightweight solution for monitoring the performance of your Docker containers and microservices in production.
NexentaEdge Docker NFS Volume Plug-In for the Nexenta Scale-Out High Performance Multi-Service Solution with Cluster-Wide Deduplication and Compression.
Oracle: As announced at DockerCon, many Oracle products are now available on Docker Store including Oracle Coherence, Oracle WebLogic Server, Oracle Java 8 SE (Server JRE), and Oracle Instant Client.
VMware vSphere Volume Service for Docker enables running stateful container applications on VMware vSphere.
Weaveworks Network Plugin provides simple, resilient multi-host Docker networking.

Visit Docker Store regularly to browse and download the latest Certified Containers and Plugins. Interested in publishing? Sign up here to start posting to Docker Store.
 


Continue your Docker journey with these helpful links:

Try Docker Enterprise Edition for free
Browse the Docker Store for Certified Containers and Certified Plugins
Sign up to become a Docker Store Publisher

The post The Latest Docker Certified Container and Plugins for March and April 2017 appeared first on Docker Blog.
Source: https://blog.docker.com/feed/