Customer success stories with Azure Backup: Metori Capital Management

As part of our continued customer success story series, Metori Capital Management, an asset management company based in Paris, shared how it has secured its data and assets with Azure Backup over the years.

Customer background

Metori Capital is a pure-play asset management company specializing in systematic quantitative strategies, with approximately $500M of assets under management. Its day-to-day business is to collect market data and run proprietary quantitative models to identify an optimal allocation of wealth across a wide variety of financial instruments. It then sends hundreds of orders to the main futures markets across the world. In parallel, it monitors risks, reconciles thousands of trades, and conducts valuation of the funds it manages.

Metori Capital relies on sophisticated technology and proprietary algorithms to help it buy, sell, and manage assets in Epsilon. It needed to build an enterprise platform that could support these algorithms, meet compliance, and safeguard data. To keep pace with its goals, Metori also wanted the benefits of scalability and reliable business continuity.

Azure Backup

Metori Capital, which hosts a large part of its data in virtual machines, relies completely on Azure Backup for data protection and resiliency in the event of a disaster. Its business demands guaranteed operational efficiency and scalability, and the company has all of its processes fully automated and streamlined.

"For all those reasons, from our inception in 2016, we decided to have our production environment 100 percent Azure hosted. Also in our business, continuity of operations is an absolute prerequisite that imposes us to demonstrate that we might be only marginally affected by any hardware, network or server outage," — Loïc Guenin, Global head of Front Office Technologies at Metori Capital.

Azure Backup promises data availability even in a disaster, and it has helped Metori Capital through adverse events. Periodic business continuity and disaster recovery drills ensured security compliance at every step.

"This is where Azure Backup is an important service to us: We regularly conduct “fire drill” exercises against our IT infrastructure to assess our resiliency to different kinds of disrupting events. In the context of such exercises, thanks to Azure Backup, we proved that we could deploy a working clone of our production environment on a different infrastructure, in a matter of hours," — Loïc Guenin.

Summary

Metori Capital believes that Azure Backup provides a great return on investment by getting simple, secure, and cost-effective protection for business-critical workloads. Azure also provides Metori with enterprise capabilities it needs to stay connected to markets and investors, meet multiple compliance requirements, retain and safeguard data, and ensure business continuity – all while only paying for the services it consumes.

“Without Azure, it might have cost us 10 times more to build an environment that could cope with the future we expect,” — Nicolas Gaussel, CEO of Metori Capital Management.

Azure Backup provides a complete and dependable solution for Metori Capital’s critical workloads. It also provides the agility to back up their ever-growing data to Azure.

Learn more about Azure Backup on Azure.com

Attending Ignite? Add the session, Microsoft Azure Backup: Deep dive into Azure’s built-in data protection, to your schedule builder and attend on Thursday, November 7th, from 11:45am – 12:30pm.
Source: Azure

Advancing industrial IoT capabilities in Azure Time Series Insights

Late last year, we announced the preview of some of the foundational capabilities of our industrial IoT analytics platform, with scalable time series storage for trending decades of data, semantic model support to describe domain-specific metadata, and enhanced analytics APIs and UX. We are building on the power of this analytics platform with additional new capabilities that add richness and flexibility, and open up new scenarios for our enterprise IoT customers. Today, we are announcing the following new capabilities:

Warm and cold analytics support that builds on top of our existing preview and provides retention-based data routing between warm and cold stores. Customers can now perform interactive analytics over warm data as well as gain operational intelligence over decades of historical data stored in a customer-owned Azure Data Lake.
A flexible analytics platform that enables attaching a customer-owned Azure Data Lake to Azure Time Series Insights for data archival, thereby allowing customers to have ownership of their IoT data. Customers can connect to and interop across a variety of advanced analytics scenarios such as predictive maintenance and machine learning using familiar technologies including Apache Spark™, Databricks, Jupyter, etc.
Rich query APIs and user experience to support interpolation, new scalar and aggregate functions, categorical variables, scatter plots, and time shifting of time series signals for in-depth analysis.
Significant scale and performance improvements at all layers of the solution including ingestion, storage, query, and metadata/model to support customers’ IoT solution needs.
Azure Time Series Insights Power BI connector that enables customers to take the queries they do in Azure Time Series Insights directly into Power BI to get a unified view of their BI and time series analytics in a single pane of glass.

Azure Time Series Insights continues to provide a scalable pay-as-you-go pricing model, enabling customers to tune their usage to suit their business demands and let the Azure Time Series Insights analytics platform handle scaling the infrastructure to meet their growing needs.

A comprehensive analytics platform for Industrial IoT

We released a preview of our first wave of capabilities last December. Since then, we have seen great customer adoption and received feedback that has led to today's preview refresh.

Our customers span all major industrial IoT segments, including manufacturing, automotive, oil and gas, power and utility, smart buildings, and IoT consulting. These customers are telling us that IoT time series analytics offers more than just the potential to achieve operational excellence. IoT time series data, together with rich contextualization, helps them drive dynamic transformation, enabling their businesses to become more agile and data-driven than ever before.

To help maximize the value of time series data and drive this digital revolution, we're updating the Azure Time Series Insights offering to support comprehensive and rich analytics over multi-layered storage, open file format and flexibility to connect to other data services for connected data scenarios, enterprise grade scale and performance, enhanced user experience and SDK support, and out-of-box connectors to data services such as Power BI to enable end-to-end analytics scenarios.

Details of the new features in preview refresh

Comprehensive and rich analytics over multi-layered storage

The majority of industrial IoT customers work with IoT data across a variety of data access scenarios. To satisfy these requirements, Azure Time Series Insights provides scalable, multi-layered time series storage for warm and cold data analytics. When provisioning Azure Time Series Insights with the pay-as-you-go pricing option, a customer can configure Azure Storage as the cold store and enable a warm store, choosing a retention period for the warm store that is configurable at any time. Azure Time Series Insights automatically routes ingested data based on the configured retention period: for example, with a 30-day retention period, the most recent 30 days of streamed data are kept in the warm store. All data is, by default, also routed to the customer-owned Azure Data Lake for archiving and analytics. Queries within the configured retention period are always served from the warm store with no additional input from the user; queries outside the retention period are always served from the cold store. This allows customers to run high-volume, interactive, asset-based analytics over warm data for monitoring, dashboarding, and troubleshooting scenarios. Customers can continue to run asset-based analytics over decades of cold data stored in their Azure Data Lake for operational intelligence, including troubleshooting, golden batch analysis, and predictive analytics.
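The retention-based routing described above can be sketched in a few lines. This is an illustrative simulation, not the service's actual logic; the `route_event` helper and the store labels are assumptions for the example. The key idea is that the retention window alone decides which store serves a query, while everything is archived to the cold store regardless.

```python
from datetime import datetime, timedelta, timezone

WARM_RETENTION = timedelta(days=30)  # configurable retention period

def route_event(event_time: datetime, now: datetime) -> str:
    """Decide which store serves an event: events within the retention
    window are served from the warm store; older events come from the
    cold store. (All data is archived to the cold data lake regardless.)"""
    if now - event_time <= WARM_RETENTION:
        return "warm"   # interactive, low-latency analytics
    return "cold"       # decades of history in the customer-owned data lake

now = datetime(2019, 11, 1, tzinfo=timezone.utc)
print(route_event(now - timedelta(days=5), now))    # recent data -> warm
print(route_event(now - timedelta(days=400), now))  # historical data -> cold
```

Changing `WARM_RETENTION` at any time simply shifts the boundary between the two stores for subsequent queries, which mirrors how the configurable retention period behaves.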

Flexible analytics platform for integrating with first and third party data services

A critical and powerful capability unlocked by our cold store is data connectivity to other data solutions for end-to-end scenario coverage. As mentioned earlier, the cold store is a customer-owned Azure Data Lake and is the source of truth for all of the customer's IoT data and metadata. Data is stored in the open source Apache Parquet format for efficient compression, space utilization, query performance, and portability.

Azure Time Series Insights will provide out-of-box connectors for popular and familiar data services that our customers use, for example Apache Spark™ or Databricks for machine learning, and predictive analytics. This is a work in progress and will become available to customers shortly.

As part of this preview refresh, we are releasing the Azure Time Series Insights Power BI connector. This feature is available in the Azure Time Series Insights Explorer user experience through the ‘Export’ option, allowing customers to export the time series queries they create in our user experience directly into the Power BI desktop and view their time series charts alongside other BI analytics. This opens the door to a new class of scenarios for industrial IoT enterprises who have invested in Power BI. It provides a single pane of glass over analytics from various data sources including IoT time series, thereby unlocking significant business and operational intelligence.

Enhanced asset-based analytics API and user experience

Since our preview launch in December last year, we have worked with a number of key IoT enterprise customers to prioritize the set of requirements around query and user experience. The result is the following new capabilities we are announcing as part of our preview refresh today:

Interpolation to reconstruct time series signals from existing data
Discrete signal processing with categorical variables
Trigonometric functions
Scatter plots
Time shifting time series signals to understand data patterns
Model API improvements for hierarchy traversal, time series search, auto-complete, paths, and facets
Improved search and navigation efficiency and continuation token to support query at scale
Improved charting capabilities including support for step interpolation, minimum or maximum shadows, etc.
Updated model authoring and editing experience
Increased query concurrency to support up to 30 concurrent queries
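As an illustration of the interpolation capability listed above, the sketch below linearly reconstructs a time series signal at arbitrary query times from sparse samples. This is a generic example of the technique, not the Azure Time Series Insights query engine; the `interpolate` function name and the clamping behavior at the edges are assumptions.

```python
from bisect import bisect_left

def interpolate(samples, t):
    """Linearly interpolate a value at time t from (time, value) samples
    sorted by time; values outside the sampled range are clamped to the
    first or last sample."""
    times = [s[0] for s in samples]
    i = bisect_left(times, t)
    if i == 0:
        return samples[0][1]
    if i == len(samples):
        return samples[-1][1]
    (t0, v0), (t1, v1) = samples[i - 1], samples[i]
    return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

signal = [(0, 10.0), (10, 20.0), (20, 40.0)]
print(interpolate(signal, 5))   # midway between 10.0 and 20.0 -> 15.0
print(interpolate(signal, 15))  # midway between 20.0 and 40.0 -> 30.0
```

Step interpolation, also mentioned in the list, would simply return `v0` instead of the linear blend between the two bracketing samples.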

We have a number of new capabilities coming in this space over the coming months, including support for time-weighted averages, additional scalar and aggregate functions, dashboards, and more.

Azure Time Series Insights is committed to our customers’ success

We look forward to continuing to deliver on our commitment of simplifying IoT for our customers and empowering them to achieve more with their IoT data and solutions. For more information, please visit the Azure Time Series Insights product page and documentation. Also, try out the quickstart to begin using Azure Time Series Insights today.

Please provide feedback and suggestions on how we can improve the product and documentation by scrolling down to the bottom of each documentation page, where you can find a button for “product feedback” or sign in to your GitHub account and provide feedback. We value your input and would love to hear from you.
Source: Azure

Building retail solutions with Azure IoT Central

Azure IoT Central is an IoT app platform for solution builders that simplifies building and deploying scalable, affordable, enterprise-grade IoT applications. Across the retail industry, the use of connected devices to improve business performance continues to grow in popularity. New solutions are accelerating business model transformation by connecting manufacturers, supply chains, warehouses, and store shelves to owners, operators, and customers in exciting new ways. Today we’ll discuss our stance on IoT in the retail industry, as well as tell you about just a few of our partners building incredible solutions on Azure IoT Central.

Based on the recent IoT Signals report, our survey of over 3,000 enterprise decision makers across the world, the retail industry has the highest adoption rate of IoT-related solutions at 90 percent, higher than manufacturing, transportation, healthcare, or government. Right now, retail and wholesale companies see the top use cases for IoT within their supply chains (64 percent) and inventory optimization (59 percent); of course, leaders across all industries have concerns about security. Yet we know that retailers and companies along the value chain have a long way to go before reaping all the benefits that IoT will provide.

Updates to IoT Central

Today, we announced updates to Azure IoT Central to help solution builders move beyond proof of concept to building business-critical applications they can brand and sell directly or through Microsoft AppSource. IoT Central can help retail solution builders accelerate development, enabling them to get connected, stay connected, and transform their business by managing IoT solutions that deliver IoT data and insights to the business applications where decisions are made. For more information, please see our IoT Central blog.

We are supporting retail-specific solution builders with five IoT Central retail app templates for builders to brand, customize, and make their own, using extensibility via APIs, data connectors to business applications, repeatability and manageability of their investment through multitenancy, and seamless device connectivity. Get started today with any app template for free, along with starter materials like a sample operator dashboard and simulated devices that show you what’s possible. In early 2020, updated pricing will add predictability as you sell your solutions directly to customers or through Microsoft AppSource.

When you are ready to customize and extend, take a look at our rich documentation set, which supports the journey with overviews, tutorials, how-tos, and industry-relevant concepts.

Figure 1, Your brand, your SaaS: customize and extend one of these five retail app templates to make them your own

Innovative retail partners building their SaaS with IoT Central

Established industry leaders across the retail ecosystem are optimizing omnichannel solutions with IoT Central, delivering IoT insights and actions from the beginning of the supply chain through distribution and warehousing, and into the hands of consumers through storefronts or delivery. Learn about what Lenovo, QuEST Global, C.H. Robinson, Dynamics 365 Connected Store, and Footmarks Inc. are doing today.

Digital distribution center solution from Lenovo

In July, Lenovo introduced Lenovo Digital Distribution Center (built with IoT Central) and discussed many of the challenges faced by distribution centers globally, including staffing surges during peak times, labor costs, space constraints, and overall productivity.

Figure 2, illustration of the digital distribution workflow

Figure 3, Architecture diagram of Digital Distribution Center by Lenovo

Today we’ll introduce more solution builders developing solutions across three areas: connected logistics, store analytics, and smart inventory management.

Connected logistics solutions from C.H. Robinson and QuEST Global

The challenges facing global logistics and fleet management continue to grow as more retailers move to just-in-time shipping and warehousing. With the holiday shopping (and shipping) season fast approaching, global shipping and freight transportation provider C.H. Robinson is putting IoT Central to work during its busiest time of the year. Intel intelligent gateways and IoT tags managed by IoT Central bring new data and insights into the industry-leading Navisphere Vision. Jordan Kass, President of TMC, a division of C.H. Robinson responsible for Navisphere, told us: “Navisphere Vision provides global shippers supply chain visibility and a predictive analytics platform. To speed up our deployment, increase our capabilities, and evolve for the future, we are partnering and building new device connections with Azure IoT Central to empower one robust agnostic connection that allows for infinite scalability and speed. This enables us to further optimize and deliver better outcomes—such as improved savings, reliability, and visibility—during these high-stakes holiday shipping months.”

Figure 4, an example of a Navisphere Vision dashboard for IoT device insights, tracking temperature and humidity levels in shipping containers

Figure 5, Architecture diagram for Navisphere Vision by C.H. Robinson

Road safety is a global issue affecting billions of people around the world. QuEST Global's fleet management solution, Fleet Tracker, aims to reduce roadside issues, using CalAmp OBD2 dongles to deliver real-time location, driving pattern, speed, engine health, and geofencing data while simultaneously managing vehicles nearly anywhere in the world. Maxence Cacheux, Head of Strategic Partnerships at QuEST Global, told us, “We are delighted with the successful deployment of our fleet management solution built on IoT Central, which enhanced its speed, security, and scalability. Now we are planning for the future when our customers around the world have tens of thousands of connected devices, delivering business transforming insights and actions.”

Figure 6, an example of a dashboard from QuEST Global’s Fleet Tracker solution, delivering insights from connected vehicles

Figure 7, Architecture diagram for Fleet Tracker by QuEST Global

 

Store analytics from Dynamics 365

Dynamics 365 Connected Store empowers retailers with real-time observational data to improve in-store performance. From customer movement to the status of products and store devices, Dynamics 365 Connected Store will provide a better understanding of your retail space. Built with IoT Central, Dynamics 365 Connected Store empowers store managers and employees to provide optimal shopping experiences through triggered alerts based on real-time data from video cameras and IoT sensors. This new workflow can significantly improve in-store performance by protecting inventory, increasing profitability, and optimizing the shopping experience in real time.

Figure 8, An example of Dynamics 365 Connected Store dashboard enabling retail staff to visualize the flow of traffic throughout their grocery store using optical IoT sensors

Figure 9, Architecture diagram of Dynamics 365 Connected Store

Smart Inventory Management from Footmarks Inc.

Consumer packaged goods (CPG) manufacturers share many of the same challenges; one of them is getting hundreds or sometimes thousands of custom assets, like displays, to the correct retail locations and keeping them in the store for the right amount of time.

When displays don’t reach their pre-determined locations, retailers experience a significant loss in sales, a key impact for brands during important buying times. Around the country today, we know a significant portion of point-of-purchase (POP) display programs are not compliant, a problem that Footmarks Inc. is looking to solve through Smart Tracking, an asset tracking application built with IoT Central that delivers previously unavailable insights to CPGs.

CPGs can now track the location of their POP assets throughout the entire supply chain and into store execution. Gone are the days of mystery shopping and expensive store visits to get details on your assets. Shawn Englund, Footmarks Inc.’s CEO, is enthusiastic about the future, saying, “We are excited to be working with some of the world’s largest CPGs to solve the age-old issue of merchandising compliance. By adding Azure IoT Central we are able to gain even more insights throughout our CPG partner supply chains and provide actionable insights throughout each of their campaigns.”

Figure 10, An example of a Footmarks dashboard showing POP asset tracking along the supply chain.

Figure 11, Architecture diagram of Smart Tracking by Footmarks Inc.

Getting started

We are at the beginning of an incredible revolution that connects strategy, tools, and devices, and empowers retailers to turn insights into actions. Retail companies around the world are using IoT today to reinvent how they connect to customers, empower employees with the right information, deliver an intelligent supply chain, and inform new business models as individuals and organizations continue to connect billions of new devices to business applications. Here is how you can get started.

1. Start building today on IoT Central

2. Browse the growing list of retail applications in AppSource, devices in the Azure IoT Certified Device Catalog, or contact any of the solution builders discussed today:

Lenovo Digital Distribution Center, email Lenovo for information
C.H. Robinson Navisphere Vision
Footmarks Inc. Smart Tracking, email Footmarks for more information
QuEST Global Fleet Tracker
Dynamics 365 Connected Store, Learn more and request a preview

3. Connect with us at Microsoft Ignite, November 4-8

4. Visit us at NRF in New York City, the world’s largest retail trade show, January 12-14, where you can speak with experts and get hands-on
Source: Azure

Azure IoT Central: Democratizing IoT for all solution builders

For the last five years, our industry has buzzed with the promises of IoT. IoT has evolved from being a next-horizon term, to a common vernacular employed across industry conversations. In fact, earlier this year we surveyed 3,000 enterprise decision makers across the world and learned that 85% have developed at least one IoT project. Across four major industries (manufacturing, retail, healthcare, and government), more than 84% of enterprise decision makers consider IoT “critical” to success (read the full report here).

Despite this near consensus, the average maturity of production-level IoT projects remains extremely low. Over 90% of companies experience some failure in the proof of concept stage due to concerns and knowledge gaps around how to scale their solutions securely, reliably, and affordably. This finding is not surprising. Scaling a project not only increases the cost, it also introduces significant technical complexity—from knowing how to adapt an architecture as the number of connected devices grows to millions, to ensuring your security remains robust as your breachable footprint expands.

Having worked with thousands of IoT customers, our engineers have encountered these issues time and time again. We used these learnings to evolve Azure IoT Central and help solution builders avoid common pitfalls that prevent many projects from moving beyond the proof of concept stage. We explained these findings in our new report, “8 attributes of successful IoT solutions,” to help IoT solution builders ask the right questions upfront as they design their systems, and help them select the right technology platforms.

A fully managed IoT app platform

Azure IoT Central is our IoT app platform for solution builders (such as ISVs, SIs, and OEMs), to design, deploy, and manage enterprise-grade solutions that they can brand and sell to their customers, either directly or through Microsoft AppSource.

Azure IoT Central provides a complete and robust platform that handles the “plumbing” of IoT solutions. It is by no means an end-to-end solution: the value of IoT Central is brought to life when solution builders leverage it to connect and manage their devices, as well as to extend device insights into their line-of-business applications. This allows solution builders to spend their time and energy in their area of expertise, transforming their businesses through value-adding and brand-differentiating elements. With white labeling, solution builders can go to market with a solution that reflects their own brand. While many customers choose to design and build cloud solutions using individual Azure services (a Platform as a Service, or PaaS, approach), Azure IoT Central reduces the cost of building and maintaining such a solution by providing a fully managed platform.

Today we’re announcing several major updates to Azure IoT Central. We’re confident these updates will inspire builders to develop industry-leading solutions with the peace of mind that their applications rest on a secure, reliable, and scalable infrastructure–enabling them to connect and manage devices, generate insights, and bring those new insights into their existing applications.

 

New app templates for industry-focused enablement

Today we are releasing 11 new industry app templates, designed to illustrate the types of solutions our partners and customers can build across retail, healthcare, government, and energy.  

Innovative partners using Azure IoT across industries

From startups to established leaders, we are seeing solution builders across industries leverage Azure IoT Central to transform their industries.

One area where we're seeing solution builders use Azure IoT Central to design innovative solutions is healthcare. Every 20 seconds a limb is lost to diabetes. To tackle this issue, Seattle-based startup, Sensoria Health joined forces with leading orthopedic medical footwear manufacturer, Optima Molliter, to launch an IoT solution that enables continuous, remote monitoring of patients recovering from diabetic foot ulcers. Patients and physicians can leverage the Sensoria Core solution, a hub utilizing IoT and artificial intelligence (AI) based on telemetry from Optima footwear, to monitor real-time patient adherence to clinician recommendations via a mobile app.

Physicians can leverage the clinician dashboard, which provides a holistic view of their patient population, to manage patient interactions, understand patient adherence to recommendations over time, and to decide which patients are in most need of care at a given moment. By enabling real-time alerts, physicians can manage care escalation decisions to expedite the healing of foot wounds and reduce the risk of amputations. Azure IoT Central provided the IoT application infrastructure that allowed Sensoria to quickly build a globally available, secure, and scalable IoT solution. Furthermore, Azure IoT Central leverages Azure API for FHIR, enabling Sensoria Health to ensure healthcare interoperability and compliance standards are met when managing the health data provided by EMR systems and from Sensoria Core embedded microelectronic devices. Read the press release to learn more.

We're also seeing well-established solution builders like C.H. Robinson, an American Fortune 500 provider of multimodal transportation services and third-party logistics, take advantage of IoT Central. Using Intel intelligent gateways and IoT tags managed by Azure IoT Central, C.H. Robinson has quickly integrated IoT data and insights into its industry-leading Navisphere Vision product. The Navisphere solution is being used by leading retailers, including Microsoft’s own supply chain teams, to optimize logistics and costs as we prepare to deliver Surface and Xbox products ahead of the holiday season. Jordan Kass, President of TMC, a division of C.H. Robinson responsible for Navisphere, described the challenges facing the industry: “Today, Retailers of all sizes need to know where their products are and where they are going … . Building with IoT Central offered us speed, scale, and simplicity to connect devices like Intel’s gateways and IoT tags.”

Vattenfall, a Swedish energy company investing deeply in renewable energy, and Microsoft are collaborating on solutions using Azure IoT Central to address challenges in energy markets to match supply and demand for renewable energy. “The IoT Central app platform has expedited our product development, providing fast and seamless device connectivity and device management, and built-in data storage, rule creation, and dashboarding for our operators,” says, Sebastian Waldenström, Head of Vattenfall’s IoT and Energy Management.

While many of our partners have established industry expertise within verticals, we’ve also seen IoT Central used by solution builders with horizontal reach, such as Mesh Systems. Mesh Systems is a global expert in asset-tracking solutions, with customer applications spanning retail, logistics, banking, pest control, construction, and much more. “IoT Central helps us do what we do best–only now what used to take 3 months to build now takes 3 days,” said Doyle Baxter, strategic alliance manager at Mesh Systems.

These partners and others are on a journey building with IoT Central. Read more about building retail solutions with IoT Central here, and follow along in the coming months as we feature more partner success across other industry segments.

New capabilities for production-level solutions

Expanding the IoT Central portfolio with IoT Edge: Businesses can now run cloud intelligence directly on IoT devices at the edge, managed by Azure IoT Central. This new feature helps businesses connect and manage Azure IoT Edge devices, deploy edge software modules, publish insights, and take actions at scale, all from within Azure IoT Central.

Seamless device connectivity with IoT Plug and Play: Solution builders building with Azure IoT Central can select from a range of Azure IoT pre-certified Plug and Play devices and quickly connect them to the cloud. Customers can now build production-grade IoT solutions within days without having to write a single line of device code, drastically cutting down time to market and costs.

Range of actions within the platform: Azure IoT Central exposes various levels of extensibility from within the platform. A user can define rules on device data streams that trigger no-code (Microsoft Flow) or low-code (Azure Logic Apps) actions. A solution builder could also configure more complex actions, exchanging data with an external service via a webhook or an Azure Functions-based action.
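A rule of this kind can be thought of as a predicate over a telemetry stream that fires an action when matched. The sketch below is a generic illustration of that pattern, not IoT Central's implementation; the `make_threshold_rule` factory and the callback standing in for a webhook or Logic Apps flow are assumptions for the example.

```python
from typing import Callable

def make_threshold_rule(field: str, limit: float,
                        action: Callable[[dict], None]):
    """Return a rule that fires `action` whenever `field` in a telemetry
    message exceeds `limit`. In a real solution the action could post to
    a webhook or trigger a flow; here it is a plain Python callback."""
    def rule(telemetry: dict) -> bool:
        if telemetry.get(field, float("-inf")) > limit:
            action(telemetry)
            return True
        return False
    return rule

fired = []  # collect messages that triggered the action
rule = make_threshold_rule("temperature", 30.0, fired.append)
rule({"deviceId": "cooler-1", "temperature": 27.5})  # below limit: no action
rule({"deviceId": "cooler-1", "temperature": 31.2})  # above limit: fires
print(len(fired))  # 1
```

Swapping the callback for an HTTP POST is all it takes to turn this into the webhook-style action described above, which is why the rule definition itself can stay no-code or low-code.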

Extensibility through data export: Continuous data export from Azure IoT Central can be used to integrate data streams directly into Azure PaaS services: Azure Blob Storage for data retention, Azure Event Hubs and Azure Service Bus for building rich processing pipelines that bring IoT data and insights into business applications, or storage for Azure Machine Learning.

Public APIs to access features: Solution builders with extensibility needs beyond device data now have access to Central features through our public APIs. Users can develop robust IoT solutions that leverage IoT Central programmatically as the core for device modelling, provisioning, lifecycle management, operations (updating and commanding), and data querying.

Application repeatability: Today, solution builders can use application templates to export their investments and duplicate them for new customers, saving hours of time on configuration and customization.

Manageability and scale through multitenancy: We know that many solution builders need more than just repeatability; they also need manageability to truly scale their investments to customers. That is why, in the coming months, Azure IoT Central will support multitenancy: solution builders can build once and use a multitenancy interface to onboard, configure, and update many customers and organizations globally across regions, offering both device and data sovereignty without sacrificing manageability.

User access control through custom user roles: Organizational complexity varies across customer solution implementations. Custom user roles allow for clearly defined access control to the data as well as actions and configurations within the system. It gives users control over exactly what they need and nothing more.

Device and data scale: Azure IoT Central scales users' data processing pipelines and provides storage to support millions of devices. Solution builders can achieve device scale by seamlessly connecting devices with IoT Plug and Play integration and authoring IoT Central experiences for Plug and Play devices.

Pricing update: In early 2020, we're unveiling a new pricing tier that will make scaling solutions more affordable and will provide more flexibility for solution builders. IoT Central customers will soon be able to select between multiple pricing plans based on their specific message volume needs for their projects. Check back on our pricing page in the coming weeks for more details.

Azure IoT Central: your IoT application platform

Microsoft is investing $5 billion in Azure IoT over the next four years. Our goal is to simplify the journey in IoT, allowing solution builders to bring solutions to market faster, while staying focused on digital transformation.

Azure IoT Central offers a powerful example of how Microsoft continues to deliver on this commitment. By removing the complexity and overhead of setup, management burden, and operational costs, we can accelerate the creation of innovative solutions across all industries. Azure IoT Central provides organizations with the IoT application platform they need to create the next wave of innovation in IoT. And that means a more intelligent and connected world that empowers people and organizations to achieve more. To learn more, visit our product page.
Source: Azure

Leverage Azure premium file shares for high availability of data

This post was co-authored by Mike Emard, Principal Program Manager, Azure Storage.

SQL Server on Azure virtual machines brings cloud agility, elasticity, and scalability benefits to SQL Server workloads. A SQL virtual machine offers full control over the operating system, virtual machine size, storage subsystem, and the level of manageability needed for your workload. The preconfigured SQL Server images from Azure Marketplace come with free SQL Server manageability benefits like Automated Backup and Automated Patching. If you choose to self-install SQL Server on Azure virtual machines, you can register with the SQL virtual machine resource provider to get all the benefits available to SQL marketplace images, along with simplified license management.

Microsoft provides an availability SLA of 99.95 percent that covers only the virtual machine, not SQL Server. For SQL Server high availability on Azure virtual machines, you should host at least two virtual machine instances in an availability set (for 99.95 percent availability) or in different availability zones (for 99.99 percent availability) and configure a SQL Server high availability feature, such as Always On availability groups or a failover cluster instance.

Today, we are announcing a new option for SQL Server high availability: SQL Server failover cluster instances with Azure premium file shares. Premium file shares are solid-state drive-backed, consistently low-latency file shares that are fully supported for use with SQL Server failover cluster instances for SQL Server 2012 and above on Windows Server 2012 and above.

Azure premium file shares offer the following key advantages for SQL Server failover cluster instances:

Ease of management

File shares are fully managed by Azure.
Provisioning is very simple.
Resize capacity in seconds with zero downtime by setting a property of the share. Increasing your storage capacity as your database grows is simple and does not cause unavailability; there is no need to provision lots of extra storage up front.
Increase input/output operations per second (IOPS) in seconds with zero downtime by resizing your share. Increase the size of your premium share to get the IOPS your workload needs.
Seasonal workloads can temporarily increase IOPS and resize back down just as easily. Again, zero downtime!

Lower the work on your virtual machines

Input/output (I/O) is offloaded to your managed file share, so you may be able to use a smaller, less expensive virtual machine.

Burstable input or output capacity

Premium file shares (PFS) provide automated bursting of IOPS capacity up to a limit, based on a credit system. If your workload needs occasional bursts, you should leverage this free and fully automated I/O capacity. Follow the premium files provisioning and bursting documentation to learn more.

Zonal redundancy

Zonally redundant storage is available in some regions. You can deploy a SQL Server failover cluster instance with one virtual machine in one availability zone and another in a different zone to achieve 99.99 percent high availability for both compute and storage.

Premium file shares provide IOPS and throughput capacity that will meet the needs of many workloads. However, for I/O-intensive workloads, consider a SQL Server failover cluster instance with Storage Spaces Direct based on managed premium disks or ultra disks. You should check the IOPS activity of your current environment and verify that premium files will provide the IOPS you need before starting a migration. Use Windows Performance Monitor disk counters to monitor the total IOPS (disk transfers per second) and throughput (disk bytes per second) required for SQL Server data, log, and tempdb files. Many workloads have bursty I/O, so it is a good idea to check during heavy usage periods and note the maximum as well as the average IOPS. Premium file shares provide IOPS based on the size of the share. Premium files also provide complimentary bursting, where you can burst your I/O to triple the baseline amount for up to one hour.
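Because share size drives performance, the sizing exercise above can be sketched as a small calculator. The constants here (a 400 IOPS base plus 1 IOPS per provisioned GiB, a burst limit of triple the baseline with a 4,000 IOPS floor, and a 100,000 IOPS cap) are assumptions drawn from the provisioning model as documented at the time; verify them against the current premium files documentation before sizing a real share.

```python
# Hedged sketch: estimate premium file share IOPS from provisioned size.
# The constants are assumptions (see above), not authoritative limits.

def premium_share_iops(provisioned_gib: int) -> dict:
    """Approximate baseline and burst IOPS for a premium file share."""
    baseline = min(400 + provisioned_gib, 100_000)   # baseline scales 1 IOPS per GiB
    burst = min(max(4_000, 3 * baseline), 100_000)   # burst up to 3x baseline, 4,000 floor
    return {"baseline": baseline, "burst": burst}

# Example: a 1 TiB (1024 GiB) share
print(premium_share_iops(1024))  # {'baseline': 1424, 'burst': 4272}
```

Compare the numbers a calculation like this yields against the peak and average IOPS you observe in Performance Monitor before committing to a share size.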

Use the step-by-step guide for configuring a SQL Server failover cluster instance with Azure premium files to set one up today, and leverage the new technologies and innovations Azure provides to modernize SQL Server workloads. Please share your feedback through UserVoice. We look forward to hearing from you!
Source: Azure

Start building with Azure Cognitive Services for free

This post was co-authored by Tina Coll, Sr Product Marketing Manager, Azure Cognitive Services.

Innovate at no cost to you, with out-of-the-box AI services that are newly available to Azure free account users. Join the 1.3 million developers who have been using Cognitive Services to build AI-powered apps to date. With the broadest offering of AI services on the market, Azure Cognitive Services can unlock AI for more scenarios than other cloud providers. Give your apps, websites, and bots the ability to see, understand, and interpret people’s needs using natural methods of communication; all it takes is an API call. Businesses in various industries have transformed how they operate using the very same Cognitive Services now available to you with an Azure free account.

Get started with an Azure free account today, and learn more about Cognitive Services.

These examples are just a small handful of what you can make possible with these services:

Improve app security with face detection: With Face API, detect and compare human faces. See how Uber uses Face API to authenticate drivers.
Automatically extract text and detect languages: Easily and accurately detect the language of any text string, simplifying development processes and allowing you to quickly translate and serve localized content. Learn how Chevron applied Form Recognizer for robotic process automation, quickly extracting text from documents.
Personalize your business’ homepage: Use Personalizer to deliver the most relevant content and experiences to each user on your homepage.
Develop your own computer vision model in minutes: Use your own images to teach Custom Vision Service the concepts you want it to learn and build your own model. Find out how Minsur, the largest tin mine in the western hemisphere, harnesses Custom Vision for sustainable mining practices.
Create inclusive apps: With Computer Vision and Immersive Reader, your camera becomes an inclusive tool that turns pictures into spoken words for low vision users.
Build conversational experiences for your customers: Give your bot the ability to interact with your users with Azure Cognitive Services. See how LaLiga, the Spanish men’s soccer league, engages hundreds of millions of fans with its chatbot using LUIS, QnAMaker, and more.

It’s easy to get started

1. Create an Azure free account.

2. Visit the Azure portal to deploy services.

3. Find step-by-step guidance for deploying Cognitive Services.
Source: Azure

Introducing Corda Enterprise on Azure Blockchain Service

Providing our customers with choice and flexibility is central to our mission around blockchain in Azure. Today, we are pleased to announce that we're bringing managed Corda Enterprise to Azure Blockchain Service.

The road to Corda Enterprise on Azure as a managed service

In 2016, Microsoft and R3 worked together to bring Corda Enterprise to Azure as a virtual machine image in the Azure Marketplace.

In 2017, the relationship matured to a partnership, and in the subsequent years we’ve worked closely with customers, consortiums, and independent software vendors (ISVs) to help them bring Corda-based solutions to Azure. Working together with our customers and partners, we’ve seen the launch of multiple Corda consortiums on Azure, from Insurwave’s launch in 2018 to the recent September 2019 announcement of TradeIX’s launch of the Marco Polo Network on Azure.

As customers were building end to end solutions, one of the big requests was to make integrating Corda with enterprise data, systems, and Software as a Service (SaaS) easier. Earlier this year, we released the Corda Logic App and Flow Connectors that brought 30 years of Microsoft enterprise integration experience to Corda. With Flow and PowerApps, it also became possible for citizen developers to build low-code or no-code web and mobile apps for Corda.

However, the biggest request we had from customers was for Corda to be released as a managed service in Azure. Specifically, a Platform as a Service (PaaS) offering that would set up Corda nodes to connect with the appropriate Corda network, manage node health, and update both the nodes and the underlying software.

Today at CordaCon, we’re pleased to share that customers can now sign up for the preview of Corda Enterprise on Azure Blockchain Service.

Simple Corda node deployment

Corda on Azure Blockchain Service provides you with the ability to choose where to provision and host nodes, either on the Corda Network (Livenet, Testnet, UAT) or a private Corda network.

For the preview, Azure Blockchain Service supports the latest Corda Enterprise version (currently 4.x). In addition to provisioning the node, Azure Blockchain Service automatically connects the Corda node to the appropriate network based on your Azure Blockchain Service configuration. Because it is part of Azure Blockchain Service, you can configure and deploy a Corda node within the Azure portal or programmatically through REST APIs, the CLI, or PowerShell. This dramatically simplifies Corda node deployment and connection.

Managed Corda nodes and Corda Distributed Applications

In addition to provisioning and deploying Corda nodes, Azure Blockchain Service provides managed APIs to help you manage your Corda nodes and Corda Distributed Applications (CorDapps). With Corda node management, you’ll be able to control access to your node, scale the node up or down, and drive flow draining. With CorDapp management, you’ll be able to easily add, manage, and version your CorDapps on your node.

Integrated node and CorDapp health, monitoring, and logging

Corda on Azure Blockchain Service leverages Azure Monitor, making it easier to access Corda node and CorDapp health, monitoring, and logging information. With Azure Monitor, you’re able to customize alerts and actions based on logs and events. With all Corda and CorDapp logs at your fingertips, you’re able to create custom visualizations and dashboards based on the health and monitoring data.

Next steps

If you are building a solution on Corda Enterprise and are interested in joining the preview, please fill out the following form.

For those of you at CordaCon this week who would like to learn more, please come visit us at our booth or attend our Fully Managed Corda Enterprise with Azure Blockchain Service session on October 24th to speak with members of the Azure Blockchain team.
Source: Azure

Gain on OLTP price-performance with Azure SQL Virtual Machines

This post was co-authored by Jamie Reding, Senior Program Manager, Sadashivan Krishnamurthy, Principal Architect, and Bob Ward, Principal Architect.

Today, most applications run online transactional processing (OLTP) transactions. Online banking, purchasing a book online, booking an airline ticket, sending a text message, and telemarketing are examples of OLTP workloads. OLTP workloads involve inserting, updating, and/or deleting small amounts of data in a database, and they mainly deal with large numbers of transactions by large numbers of users. The majority of OLTP workloads are read-heavy, use diverse transactions, and utilize a wide range of data types.

Azure brings many price-performance advantages for your workloads with SQL Server on Azure Virtual Machines (VM), offering a wide range of Azure Virtual Machine series and Azure disk options. Memory-optimized VM series like the Intel-based Es_v3 series or the AMD-based Eas_v3 series offer a high virtual CPU (vCPU) to memory ratio at a very low cost. Constrained vCPU capable VM sizes reduce the cost of SQL Server licensing by constraining the vCPUs available to the VM, while maintaining the same memory, storage, and input/output (I/O) bandwidth. Premium Solid State Drives (SSDs) deliver high-performance, low-latency managed disks with the high IOPS and throughput capabilities needed for SQL Server data and log files. Standard SSDs, cost-effective storage options optimized for consistent performance, are an optimal destination for most SQL Server backup files.

In addition to the large IOPS capacity of premium disks, Azure Blobcache is a huge value for mission-critical OLTP workloads, as it brings significant additional high-performance I/O capacity to the Azure Virtual Machine for free. Blobcache is a multi-tier caching technology enabled by combining the VM RAM and local SSD. You can host SQL Server data files on premium SSD managed disks with read-only Blobcache and leverage extremely high-performance read I/Os that exceed the underlying disk’s capabilities. High-scale VMs come with very large Blobcache sizes that can host all the data files for most applications. Because all I/O activity from the Blobcache is free, you can boost application throughput with extremely high-performance reads and optimize price-performance by paying only for the writes. Considering that the majority of OLTP workloads today have a 10-to-1 read-to-write ratio, this is up to a 90 percent price-performance gain.
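The claim above is simple arithmetic, sketched below under the stated assumptions: a 10-to-1 read-to-write ratio and every read served free from the Blobcache.

```python
# Back-of-the-envelope check: with reads served free from Blobcache,
# only writes are billed, so the billed share of I/O is writes / total.
reads, writes = 10, 1
billed_fraction = writes / (reads + writes)
free_fraction = 1 - billed_fraction
print(f"billed I/O: {billed_fraction:.1%}, served free: {free_fraction:.1%}")
# billed I/O: 9.1%, served free: 90.9%
```

Roughly 90 percent of the I/O is served at no charge under these assumptions, which is where the price-performance figure comes from.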

Additionally, for workloads demanding very low I/O latency, Azure ultra disks deliver consistently low-latency disk storage at high throughput and high IOPS levels. Ultra disks maximize application throughput for workloads that were previously bottlenecked on I/O latency.

Based on the read-to-write ratio, transaction complexity, and scale pattern, you may choose TPC-E or TPC-C for performance measurements. In general, TPC-E represents the majority of today's OLTP workloads, as it includes complex transactions and a high read-to-write ratio. If you have write-intensive workloads running simple transactions, you can leverage the simplicity of the TPC-C benchmark for performance validation. For detailed testing of SQL Server performance on Azure Virtual Machines with a scaled-down TPC-E workload and the HammerDB TPC-C kit, please see this article.

Get started with SQL Server in Azure Virtual Machines

The Azure SQL Virtual Machine service offers full control over the VM, storage, and SQL Server configuration, and gives you full flexibility to deploy the most cost-efficient solution for your workload's specific requirements. You can create a SQL VM today with a performance-optimized storage configuration, enabled by the SQL VM resource provider, and boost the price-performance gain for your workload with performance best practices for SQL virtual machines.

Click here to start testing with free SQL Server Developer edition images on Azure Virtual Machines.

Get started on your Azure migration with the Data Migration Guide.
Source: Azure

October 2019 unified Azure SDK preview

Welcome back to another release of the unified Azure Data client libraries. For the most part, the API surface areas of the SDKs have been stabilized based on your feedback. Thank you to everyone who has been submitting issues on GitHub and keep the feedback coming.

Please grab the October preview libraries and try them out—throw demanding performance scenarios at them, integrate them with other services, try to debug an issue, or generally build your scenario and let us know what you find.

Our goal is to release these libraries before the end of the year, but we are driven by quality and feedback, and your participation is key.

Getting started

As we did for the last three releases, we have created four pages that unify all the key information you need to get started and give feedback. You can find them here:

.NET
Java
JavaScript and TypeScript
Python

For those of you who want to dive deep into the content, the release notes linked above and the changelogs they point to give more details on what has changed. Here we are calling out a few high-level items.

APIs locking down

The surface areas for the Azure Key Vault and Storage libraries are nearly API-complete based on the feedback you’ve given us so far. Thanks again to everyone who has sent feedback, and if anyone has been waiting to try things out and give feedback, now is the time.

Batch API support in Storage

You can now use batching APIs with the Storage SDKs to manipulate large numbers of items in parallel. In Java and .NET you will find a new batching library package in the release notes, while in JavaScript and Python the feature is in the core library.

Unified credentials

The Azure SDKs that depend on Azure Identity make getting credentials for services much easier.

Each library supports the concept of a DefaultAzureCredential and depending on where your code runs, it will select the right credential for logging in. For example, if you’re writing code and have signed into Visual Studio or performed an az login from the CLI, the client libraries can automatically pick up the sign-in token from those tools. When you move the code to a service environment, it will attempt to use a managed identity if one is available. See the language specific READMEs for Azure Identity for more.
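The fallback behavior can be pictured with a short sketch. This is not the SDK's implementation, just an illustration of the chain-of-credentials pattern it follows; the provider functions and the token value here are hypothetical.

```python
import os

# Each provider returns a token if its source is configured, else None.
def environment_credential():
    # Stand-in for a service principal configured via environment variables.
    return os.environ.get("AZURE_CLIENT_SECRET")

def cli_credential():
    # Stand-in for a token cached by `az login`; not signed in, in this sketch.
    return None

def first_available_token(providers):
    """Return the first token any provider yields, mirroring the fallback chain."""
    for provider in providers:
        token = provider()
        if token is not None:
            return token
    raise RuntimeError("no credential source is available")

os.environ["AZURE_CLIENT_SECRET"] = "example-token"  # simulate a configured environment
print(first_available_token([environment_credential, cli_credential]))  # example-token
```

The real DefaultAzureCredential works the same way: it walks an ordered list of credential sources and uses the first one that can produce a token for the environment the code is running in.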

Working with us and giving feedback

So far, the community has filed hundreds of issues against these new SDKs with feedback ranging from documentation issues to API surface area change requests to pointing out failure cases. Please keep that coming. We work in the open on GitHub and you can submit issues here:

API design Guidelines
.NET
Java
JavaScript and TypeScript
Python

In addition, we're excited to say we'll be attending Microsoft Ignite 2019, so please come and talk to us in person. Finally, please tweet at us at @AzureSdk.

Get started with Azure for free.

Source: Azure

Azure API for FHIR® moves to general availability

Today, Microsoft becomes the first cloud provider with a fully managed, first-party service to ingest, persist, and manage healthcare data in the native FHIR format. The Azure API for FHIR® is released today in general availability to all Azure customers.

The core mission in healthcare is to deliver better health outcomes, and the data standard fueling the future of that mission is FHIR. Fast Healthcare Interoperability Resources (FHIR) has revolutionized the industry in the last several years and is rapidly becoming established as the preferred standard for exchanging and managing healthcare information in electronic format. Microsoft understands the unique value FHIR offers for managing Protected Health Information (PHI) in the cloud, so we’re advancing Azure technology to give our health customers the ability to ingest, manage, and persist PHI data across the Azure environment in the native FHIR format.

With the Azure API for FHIR, a developer, researcher, device maker, or anyone working with health data is empowered with a turnkey platform to provision a cloud-based FHIR service in just minutes and begin securely managing PHI data in Azure. We’ve simplified FHIR through this new Platform-as-a-Service (PaaS) so customers can free up their operational resources and focus their development efforts on lighting up analytics, machine learning, and actionable intelligence across their health data.

Aridhia and Great Ormond Street Hospital (GOSH) in London, UK are leaders in the healthcare industry who are already leveraging FHIR in the Azure cloud to power their Digital Research Environment (DRE), serving both historic and current patient record data:

“We now have a unified API as a basis for designing, testing, and deploying the next generation of machine learning and digital services in the hospital for our young patients. This will also enable rapid and easier collaboration with our international pediatric hospital partners to share specialised tools to improve patient outcomes and experience," said Professor Neil Sebire, Chief Research Information Officer at GOSH.

“Partnering with Microsoft on the Azure API for FHIR allows us to scale out and accelerate our customers’ use of SMART on FHIR. The managed service is a great additional component in the Aridhia DRE platform, bringing research and innovation closer to clinical impact,” added Rodrigo Barnes, CTO at Aridhia.

Managed FHIR service in the cloud

Normalizing health data in the FHIR format allows you to leverage the power of an open source standard that evolves with the science of healthcare. The FHIR standard is designed precisely for health data flows, so it allows for data interoperability now and sets your ecosystem up for the future as the science of medicine evolves. Blending a variety of data sets through a FHIR service ushers in powerful opportunities for accelerated machine learning development. As you develop and implement research and efficiency models for your system, data output can be securely and easily exchanged with any application interface that works with the FHIR API.
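To make the exchange format concrete, here is a minimal FHIR Patient resource assembled with only the standard library. The field values are illustrative, not drawn from any real record.

```python
import json

# A minimal FHIR Patient resource; every FHIR payload carries a resourceType.
patient = {
    "resourceType": "Patient",
    "id": "example",
    "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
    "gender": "male",
    "birthDate": "1974-12-25",
}

payload = json.dumps(patient)
print(json.loads(payload)["resourceType"])  # Patient
```

Any client or service that speaks FHIR can exchange documents of this shape, which is what makes the format such a strong basis for interoperability.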

Using the Azure API for FHIR brings your team all the benefits of the cloud: paying only for what you use, delivering low latency and high performance, and providing on-demand, scalable machine learning tools with built-in controls for security and intelligence.

Key features of the Azure API for FHIR include:

•    Provision and start running an enterprise-grade, managed FHIR service in just a few minutes
•    Support for R3 and R4 of the FHIR Standard
•    Role Based Access Control (RBAC) – allowing you to manage access to your data at scale
•    Audit log tracking for access, creation, modification, and reads within each data store
•    Secure compliance in the cloud: ISO 27001:2013 certified, supports HIPAA and GDPR, and built on the HITRUST-certified Azure platform
•    Global Availability and Protection of your data with multi-region failover
•    SMART on FHIR functionality

Security for PHI data in the cloud

The cloud environment you choose to manage your Protected Health Information (PHI) matters. Microsoft runs on trust.

We’ve built the Azure API for FHIR so your data is isolated and protected with layered, in-depth defense and advanced threat protection according to the most stringent industry compliance standards. Azure covers 90+ compliance offerings, including International Organization for Standardization (ISO 27001:2013) certification and the Health Insurance Portability and Accountability Act (HIPAA). You can be confident that the Azure API for FHIR will enable persistence, security, and exchange of PHI data in a private and compliant pipeline.

“Humana is using Microsoft’s Azure API for FHIR to enable care team access to our members’ digital health records in a universal language that is guarded by always-on security. By providing access to members’ records, Humana can focus on supporting doctors, nurses, and clinicians and helping our members experience their best lives.” – Marc Willard, VP, Humana

“Using Azure API for FHIR allows us to focus on designing People Compatible™ solutions for healthcare organization of all sizes in this dynamic regulatory environment, with less worrying about security and scalability.” – Pawan Jindal, Founder & President, Darena Solutions

Building the foundations of artificial intelligence in healthcare

While we’re excited to light our cloud on FHIR, we’re even more excited about the foundations FHIR is forging for the future of machine learning and life sciences in healthcare.  We’re actively engaging with a broad set of customers who are pioneering new innovation with FHIR. Whether you’re improving operational efficiency across your ecosystem, need a new secure FHIR-based data store, or want to create richer datasets for research and innovation, the future of health data in the cloud is here, and it’s on FHIR.

Check out Azure API for FHIR and do more with your health data.
Source: Azure