Getting to know Looker – common use cases

As we welcome Looker’s team and their business intelligence (BI) and analytics technology to Google Cloud, we’re exploring all the platform can do. Looker helps you leverage the full potential of the data you’re collecting or have access to. You can model the data, analyze it, create visualizations, embed real-time dashboards, build data applications, and share the right data with the people who need it in your organization. In many ways, Looker is like an API for your data.

With the potential to gather so much data today, that part often seems easy. But putting that data to work for you is what really brings value. Get started with Looker’s unique approach to BI with some of these common use cases, and consider how you might apply them in your own organization. Once you know what you’re looking for and have some concrete numbers, you can decide which data can help you make decisions and set business goals. Check out these explainers to learn more about BI concepts and how you can work across business, IT, and operations teams to apply them appropriately.

Building beautiful dashboards. After all the work of getting data into BigQuery and applying analytics to it, dashboards are a great payoff. To build one that users will love, start by knowing what the dashboard is for and who its audience is within your organization. From there, get approval on the overall look of the dashboard, then start the actual creation. Ideally, your dashboard will be easy to scan and offer opportunities for users to drill down further if needed. You can also pay attention to details like flow and color to really entice users. Learn more.

Measuring customer profitability. Having a handle on customer profitability can help you understand whether customers are actually costing you money rather than making you money. The measure of customer profitability is more complex than lifetime value or the net margin of a transaction; it should include all touch points the customer has with your company, such as customer service interactions, fulfillment requirements, or overuse of services. You can use a step-by-step process to determine customer profitability: identify customer channels, segment customer groups, and dig into your collected data to understand the various costs related to customers. Once that process is in place, calculate that number regularly, then use the resulting information to adapt your business strategies and goals. Learn more.

Understanding customer segmentation. Customer segmentation refers to splitting customers into groups based on shared characteristics. These groups may have demographic, lifestyle, or behavioral differences that are useful to know. For example, you might segment customers using the recency, frequency, monetary (RFM) method to identify consumer habits and high-value shoppers. There are several typical customer segmentation models, ranging from simple to complex, and you may choose different models to understand your business and make product and budget decisions. Customer segmentation can also inform your marketing and promotional strategies and help you deliver the right content to users. With Looker, queries are made directly against your database, not by moving or extracting data to workbooks, cubes, .csv files, proprietary databases, or desktops. This key Looker differentiator promotes data integrity while keeping data movement to a minimum and access to sensitive information restricted. Learn more.

Using conversion funnels. Measuring customer or user experience can involve lots of gray areas, but a conversion funnel is a way to explore how site visitors progress as they move around your website. You can do an initial conversion funnel analysis as a baseline measurement, then find where you can optimize the site experience for customers. You may measure conversions, transactions, and leads, but you can also include average order value or gross margin. There are typically five steps in a site purchase funnel. Learn more.

Going beyond BI. Business intelligence is just the beginning: you can move from insight to action within Looker itself. For example, you can use the Action Hub to trigger workflows in other systems based on unified metrics within Looker, securely and reliably send governed data to other systems, or bring routine tasks into Looker to close the loop between insight and action. Learn more.

These use cases just scratch the surface. To find more inspiration about what’s possible, head over to the Looker blog to discover the BI topics most pertinent to your business, catch one of Looker’s breakout sessions at Google Cloud Next ‘20: OnAir, or explore this new page.
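The RFM method mentioned above is easy to prototype before you model it properly in your BI layer. The sketch below scores a hypothetical transaction table on recency, frequency, and monetary value using pandas; the column names and the tercile scoring scheme are illustrative assumptions, not Looker output.

```python
import pandas as pd

# Hypothetical transaction log: one row per purchase.
orders = pd.DataFrame({
    "customer_id": ["a", "a", "b", "c", "c", "c"],
    "order_date": pd.to_datetime(
        ["2020-06-01", "2020-07-01", "2020-03-15",
         "2020-06-20", "2020-06-25", "2020-07-02"]),
    "amount": [120.0, 80.0, 40.0, 200.0, 60.0, 90.0],
})

snapshot = pd.Timestamp("2020-07-05")  # "today" for the analysis

# Recency (days since last order), Frequency (order count),
# and Monetary (total spend), computed per customer.
rfm = orders.groupby("customer_id").agg(
    recency=("order_date", lambda d: (snapshot - d.max()).days),
    frequency=("order_date", "count"),
    monetary=("amount", "sum"),
)

# Rank each dimension into terciles (1 = worst, 3 = best);
# recency labels are inverted because smaller (more recent) is better.
rfm["r"] = pd.qcut(rfm["recency"], 3, labels=[3, 2, 1]).astype(int)
rfm["f"] = pd.qcut(rfm["frequency"].rank(method="first"), 3,
                   labels=[1, 2, 3]).astype(int)
rfm["m"] = pd.qcut(rfm["monetary"], 3, labels=[1, 2, 3]).astype(int)
rfm["rfm_score"] = rfm["r"] + rfm["f"] + rfm["m"]

print(rfm.sort_values("rfm_score", ascending=False))
```

High combined scores flag recent, frequent, high-spend shoppers; in practice you would compute the same aggregates in your database via LookML rather than extracting data.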
Source: Google Cloud Platform

Enabling SAP enterprises and mission critical workloads on Google Cloud

For the vast majority of businesses, the future they see today is very different from the one they saw before the global pandemic: different priorities, different resources, and different approaches to risks and rewards.

Thinking about the future can be especially challenging for our enterprise customers running SAP applications. Many kicked off 2020 with their plans for digital transformation in full swing, including long-awaited migrations from legacy SAP instances onto modern, cloud-native S/4HANA environments. Many other enterprises were at least considering similar initiatives, based on Google Cloud’s ability to enable, support, and drive new sources of innovation for customers pursuing SAP HANA migrations.

New concerns about business continuity

Almost all of these companies have something in common: They’re deeply concerned with recognizing and responding to the business technology challenges that emerged and continue to emerge from the pandemic, and which, in some cases, exposed them to significant and costly disruptions. Based on our conversations with customers running SAP environments, three of these challenges stand out as especially important and impactful:

Managing and minimizing infrastructure risk: Beginning in March of this year, IT departments learned what happens when their supply chains get stretched past the breaking point. As network and storage hardware suddenly grew scarce, so did access to the onsite IT staff required to install and maintain these systems. As a result, operating and maintaining an on-premises SAP environment today looks quite a bit riskier than it once did.

Adapting to new realities for elasticity and scale: The height of the pandemic featured extremes in scale and demand; some businesses shut down unused data center infrastructure, while others raced to keep up with massive, yet highly unpredictable, surges in demand.

Retooling for faster, smarter business analytics: Many enterprises were confronting market disruptions and business-model shifts even before the pandemic. But the past few months accelerated the evolution of new business models and competitive landscapes. Winning in this environment will be much easier for enterprises equipped with smarter, faster, better-integrated analytical capabilities.

Today, we’re sharing a number of ways we’re helping SAP customers address these challenges.

Announcing Mission Critical Support for SAP customers

To address the support requirements of businesses running SAP, Google Cloud launched Premium Support earlier this year. Premium Support provides third-party support for applications like SAP to meet their unique needs, including a specialized team of Technical Solution Engineers with the right expertise to support SAP customers running on Google Cloud. This global team of experts offers 24/7 support, brings deep SAP and Google Cloud knowledge to customer inquiries, and can assist with higher-quality and more timely issue resolution.

For customers who require an even greater level of support, Google Cloud is announcing Mission Critical Support to help SAP customers that cannot tolerate any type of downtime. When your business can’t afford to be down, our Mission Critical Support provides:

The fastest response time, with a five-minute Service Level Objective response. We work to thoroughly understand your environment ahead of time, so if an incident arises, we are prepared to act.
Proactive and preventative engagement with support experts who can navigate your issues quickly. Mission Critical Support teams know your architecture, which partners you work with, your workloads, and the areas of the world in which you operate, all of which informs their response.
A collaborative approach for continuous improvement of your environment.

Mission Critical Support involves a deep understanding of your SAP environment. Through an Assessment, Google Cloud’s Professional Services Organization (PSO) will evaluate your environment. Based on the assessment findings, you take corrective action during Remediation to ensure your SAP environment is ready. Onboarding entails Google Technical Solutions Engineers (TSEs) working with you to determine your Service Level Indicators. Customers on Mission Critical Support can file priority-zero (P0) cases. The ongoing benefit is Google Cloud’s partnership with you to deliver continuous improvement for your mission-critical SAP environment.

Enterprise-grade business continuity

FFF Enterprises, Inc. has been a trusted name in wholesale pharmaceutical distribution for decades, serving over 80% of US hospitals. After encountering outage issues running SAP on a legacy infrastructure and hosting provider, FFF Enterprises recognized the need for a change and transitioned their critical SAP S/4HANA environment to Google Cloud. This move has provided FFF with higher performance and more robust and dependable infrastructure than their legacy systems, at a comparable or lower cost.

"Migrating our SAP hosting service with Google Cloud and Managecore has been a game changer. It maximizes the reliability of our IT systems and puts us on a path toward advancing our digital transformation." —Jon Hahn, Chief Information Officer, FFF Enterprises

For SAP customers, our robust support offering is underpinned by highly reliable infrastructure capabilities. Google Cloud’s rugged infrastructure maintains business continuity for all of Alphabet’s applications at massive scale; we are ready to help keep your SAP environment up and running with these key capabilities:

Multilayer infrastructure high availability (HA): Google Cloud is highly available by design, with a redundant infrastructure of data centers around the world that contain zones designed to be independent from each other. Live Migration keeps virtual machine instances running through planned host system events, such as hypervisor or hardware updates. VM instances can also restart automatically in the event of unplanned downtime.

Fast, flexible, reliable disaster recovery (DR): Many SAP enterprises learned during the pandemic just how vulnerable their data centers can be to supply chain disruptions. Using Google Cloud for DR ensures that critical SAP workloads and data sources are protected.

Efficient and reliable data protection: Creating a backup strategy and choosing suitable services for backup are key to protecting your SAP systems. Google Cloud offers several native capabilities for automated, cost-effective SAP system backup. You can also use the backup options and interfaces offered by applications (for example, SAP HANA Backint) or managed backup solutions from third parties such as Actifio, Commvault, and Dell EMC.

Simple, cost-effective ways to scale fast while minimizing risk: For many SAP enterprises, managing risk comes down to ensuring that one or two business-critical apps are available and securely running at peak performance. A common solution here involves "lift and shift" projects that migrate applications to the cloud as quickly and simply as possible, while trading off some of the advantages of more complex migrations.

Learn more about how SAP customers can ensure support and business continuity with Google Cloud: read "How to run SAP on Google Cloud if high availability is high priority," and visit SAP on Google Cloud.
Source: Google Cloud Platform

Microsoft Azure IoT Connector for FHIR now in preview

Today, Microsoft released the preview of Azure IoT Connector for FHIR—a fully managed feature of the Azure API for FHIR. The connector empowers health teams with the technology for a scalable end-to-end pipeline to ingest, transform, and manage Protected Health Information (PHI) data from devices using the security of FHIR® APIs.

Telehealth and remote monitoring. It’s long been talked about in the delivery of healthcare, and while some areas of health have created targeted use cases in the last few years, the availability of scalable telehealth platforms that can span multiple devices and schemas has been a barrier. Yet in a matter of months, COVID-19 has accelerated the discussion. We have an urgent need for care teams to find secure and scalable ways to deliver remote monitoring platforms and to extend their services to patients in the home environment.

Unlike other services that can use generic video services and data transfer in virtual settings, telehealth visits and remote monitoring in healthcare require data pipelines that can securely manage Protected Health Information (PHI). To be truly effective, they must also be designed for interoperability with existing health software like electronic medical record platforms. When it comes to remote monitoring scenarios, privacy, security, and trusted data exchanges are must-haves. Microsoft is actively investing in FHIR-based health technology like the Azure IoT Connector for FHIR to ensure health customers have an ecosystem they trust.

FHIR to fuel the Internet of Medical Things

FHIR (Fast Healthcare Interoperability Resources) is now the interoperability standard for the secure and private exchange of health data. FHIR began as an open source framework for clinical data, but its growing adoption makes it an ideal technology for bringing together data from the expanding “Internet of Medical Things” (IoMT) and extending healthcare into remote monitoring scenarios.

Today, remote data capture often requires device-specific platforms, making it difficult to scale when new processes are added or when patients use multiple devices. Developers have to build their own secure pipelines from scratch. With the Azure IoT Connector for FHIR available as a feature on Microsoft’s cloud-based FHIR service, it’s now quick and easy for health developers to set up an ingestion pipeline designed for security to manage PHI from IoT devices. The Azure IoT Connector for FHIR focuses on biometric data at the ingestion layer, which means it can connect in either device-to-cloud or cloud-to-cloud workstreams. Health data can be sent to Event Hub, Azure IoT Hub, or Azure IoT Central, and is converted to FHIR resources, enabling care teams to view patient data captured from IoT devices in context with clinical records in FHIR.

The key features of the Azure IoT Connector for FHIR include:

Conversion of biometric data (such as blood glucose, heart rate, or pulse ox) from connected devices into FHIR resources.
Scalability and real-time data processing.
Seamless integration with Azure IoT solutions and Azure Stream Analytics.
Role-based Access Control (RBAC) allows for managing access to device data at scale in Azure API for FHIR.
Audit log tracking for data flow.
Helps with compliance in the cloud: ISO 27001:2013 certified, supports HIPAA and GDPR, and built on the HITRUST certified Azure platform.
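To make the conversion step concrete, here is a rough sketch of what a single heart-rate sample might look like once mapped to a FHIR R4 Observation resource. The helper function and field choices are illustrative assumptions; the actual connector performs this mapping through its own configurable templates.

```python
import json
from datetime import datetime, timezone


def device_reading_to_observation(patient_id: str, heart_rate: float,
                                  taken_at: datetime) -> dict:
    """Map a raw heart-rate sample to a FHIR R4 Observation resource.

    Illustrative sketch only: the real Azure IoT Connector for FHIR
    drives this conversion with its own mapping templates.
    """
    return {
        "resourceType": "Observation",
        "status": "final",
        "category": [{
            "coding": [{
                "system": "http://terminology.hl7.org/CodeSystem/observation-category",
                "code": "vital-signs",
            }]
        }],
        "code": {
            "coding": [{
                "system": "http://loinc.org",
                "code": "8867-4",  # LOINC code for heart rate
                "display": "Heart rate",
            }]
        },
        "subject": {"reference": f"Patient/{patient_id}"},
        "effectiveDateTime": taken_at.isoformat(),
        "valueQuantity": {
            "value": heart_rate,
            "unit": "beats/minute",
            "system": "http://unitsofmeasure.org",
            "code": "/min",
        },
    }


obs = device_reading_to_observation(
    "example-patient", 72.0,
    datetime(2020, 7, 1, 12, 0, tzinfo=timezone.utc))
print(json.dumps(obs, indent=2))
```

Because the output is a standard Observation, it lands alongside clinical records in the FHIR server and can be queried with the same APIs.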


Microsoft customers are already ushering in the next generation of healthcare

As the delivery of healthcare shifts outside the exam room, new FHIR-enabled technology is fueling IoT scenarios across the ecosystem of Microsoft’s customers.
Here are a few of the great solutions already underway:

Humana’s Conviva Care Centers transform care for chronic conditions with IoT and FHIR

Conviva Care Centers, Humana’s senior-focused primary care subsidiary, will be using the Azure IoT Connector for FHIR this fall as Humana accelerates remote monitoring programs for patients living with chronic conditions. Congestive heart failure patients who monitor their weight and blood pressure at home will be able to use a new platform that enables easy sharing of data with their care team. Data from in-home devices, like scales and blood pressure cuffs, can be transferred via Azure IoT Connector for FHIR, providing doctors and nurses real-time data managed in a highly secure and private pipeline and allowing for proactive virtual touchpoints. Humana’s flexible remote monitoring platform will not only ensure patients have the support they need between clinic visits, but will also accelerate the future of user-centric care.

“Using the Azure IoT Connector for FHIR will open up new remote care paths for patients living with chronic conditions. Being able to make decisions with data coming in real time from home devices will be the game changer for improving the quality and timeliness of patient care.” —Marc Willard, Senior Vice President of Digital Health and Analytics at Humana

Sensoria Health’s Motus Smart—powered by Sensoria—is the new gold standard for enabling diabetes rehabilitation with remote monitoring

Motus Smart, powered by Sensoria, is a cutting-edge device that provides remote patient monitoring, delivering quantified patient adherence and activity data to help manage patients with diabetic foot ulcers and reduce amputation risk. Sensoria was able to deploy the Azure IoT Connector for FHIR to enable highly secure data exchange from the Motus device to patients, their doctors, and others within their circle of care. Clinicians at the Rancho Los Amigos National Rehabilitation Center are using enterprise-class applications to see real-time data, proactively reach out to patients, and address any issues that might be impeding proper treatment.

Centene connected health data platform helps manage chronic diseases

Centene is using Azure IoT Connector for FHIR in an effort to better manage the ever-expanding personal biometric data resulting from the proliferation of wearables and other medical devices. The company is leveraging the connector to explore the use of near-real-time monitoring and alerting as part of its overall priority on improving the health of its members, enabling them to take better care of themselves, and supporting its care management staff with actionable insights to improve the health of the communities Centene serves. In the future, Centene intends to use the connector to monitor and manage chronic conditions such as congestive heart failure, diabetes, and high blood pressure. By leveraging Microsoft’s scalable, open platforms, Centene can make further progress toward improving outcomes for Centene Health Plan members.

Learn more and get started

We’re excited about the way our customers are embracing and delivering transformative care with FHIR technology. As we bring down the barriers of interoperability with new FHIR-based tools, the future vision of how we can evolve healthcare starts to unfold and it's inspiring.

Microsoft has expanded the tools in our FHIR ecosystem to include IoT pipelines, so our customers have easy-to-use, interconnected tools for responsibly managing patient health data. Whether you’re building clinical applications or analytics engines, or developing artificial intelligence (AI) with telehealth and remote monitoring, we want to make sure you have pipelines for PHI data designed with security in mind. Check out the Azure IoT Connector for FHIR and the Azure API for FHIR to get started today!

Read more about the Microsoft Cloud for Healthcare, which brings together our integrated capabilities, like our FHIR tools, with robust cloud capabilities specific to customers and partners in the healthcare industry. The Microsoft Cloud for Healthcare enriches patient engagement and connects health teams to help improve collaboration, decision-making, and operational efficiencies.


FHIR® is the registered trademark of HL7 and is used with the permission of HL7.
Source: Azure

Azure Time Series Insights Gen2: Leading the next generation of industrial IoT analytics platforms

The Internet of Things (IoT) is well established for helping businesses find real-time insights from their industrial assets, opening the path toward Industry 4.0. Answers to questions like “how are all of my assets performing right now?”, “how can I improve my manufacturing process and attainment?”, and “when will my assets need servicing?” used to be impossible to obtain, or required manual data collection that was always out of date.

Today, business leaders are taking advantage of IoT to see this information with the click of a button. Yet as larger volumes of data are collected from industrial assets, finding insights can become more and more difficult. It can start to require costly and time-consuming data wrangling and data analytics techniques performed by highly specialized staff.

This is where Azure Time Series Insights Gen2 comes in. This fully managed IoT analytics platform—generally available today—enables you to uncover hidden trends, spot anomalies, and conduct root-cause analysis in large volumes of industrial data with an intuitive and straightforward user experience. Simple yet powerful, Azure Time Series Insights Gen2 allows you to explore and analyze billions of contextualized events across millions of sensors.

Since Azure Time Series Insights Gen2 is a serverless offering, you don’t have to worry about managing complicated compute clusters yourself. Additionally, Azure Time Series Insights Gen2 provides a scalable, pay-as-you-go pricing model, enabling you to tune your usage to your business demands.

Azure Time Series Insights Gen2 is both a web experience and a platform. Knowledge workers can use the Time Series Explorer web experience to find insights from petabytes of IoT data in seconds through the simple, intuitive user interface. Developers can use the open and scalable platform to build solutions and custom user experiences with our rich APIs and JavaScript SDKs.

Azure Time Series Insights Gen2 is tailored for industrial IoT applications.

Driven by feedback from customers around the globe, here are key features that are now generally available and how they benefit industrial IoT customers.

Azure Time Series Insights Gen2 offers multi-layered storage

IoT customers work with IoT data in a variety of ways. The two most common scenarios we see are:

Highly interactive analytics over a short time span.
Advanced analysis of decades worth of historical data.

Azure Time Series Insights Gen2 covers both scenarios with retention-based data routing between a managed warm store and a bring-your-own cold store, including Azure Data Lake Storage. Warm store can be configured to retain up to 31 days of IoT data, allowing you to perform highly interactive, asset-centric analytics with low latency to monitor, trend, and troubleshoot your assets. Cold store, with its near-infinite retention, can be used to store decades’ worth of historical IoT data, ready to be used for operational intelligence and improved efficiencies.

Multi-layered storage.
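The retention-based routing described above can be pictured with a small sketch. This is an illustrative model, not the service’s implementation: every ingested event lands in the cold store, and events younger than the configured warm retention (at most 31 days) are additionally served from the warm store.

```python
from datetime import datetime, timedelta, timezone

WARM_RETENTION_DAYS = 31  # maximum configurable warm-store retention


def route_event(event_timestamp: datetime, now: datetime,
                warm_retention: timedelta = timedelta(days=WARM_RETENTION_DAYS)):
    """Return which stores serve an event, based on its age.

    Illustrative only: all data is persisted to the cold store (your
    Azure Data Lake); sufficiently fresh events are also kept warm
    for low-latency interactive queries.
    """
    stores = ["cold"]
    if now - event_timestamp <= warm_retention:
        stores.append("warm")
    return stores


now = datetime(2020, 7, 31, tzinfo=timezone.utc)
fresh = datetime(2020, 7, 30, tzinfo=timezone.utc)   # 1 day old
old = datetime(2020, 1, 1, tzinfo=timezone.utc)      # ~7 months old

print(route_event(fresh, now))  # ['cold', 'warm']
print(route_event(old, now))    # ['cold']
```

Queries over a recent window hit the warm store for low latency, while long-horizon historical analysis runs against the cold store.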

Enterprise scale to power the analytics needs of industrial customers

Azure Time Series Insights Gen2 powers the analytics needs of many industrial customers across all major segments, including manufacturing, power and utilities, oil and gas, automotive, smart buildings, and mining. These customers generate billions of events across millions of data points, with most struggling to keep pace with the vast amounts of data generated by their assets. Azure Time Series Insights Gen2 scales to accommodate high volumes of data quickly and efficiently. Alongside our scalable storage options, Azure Time Series Insights Gen2 supports one million time series instances (or tags) per environment with rich semantic modeling. This allows you to seamlessly explore highly contextualized data and correlate trends across your industrial assets to unlock insights and achieve operational excellence.

Azure Time Series Insights Gen2 supports one million tag instances.

Microsoft Power BI connector helps bring your data silos together

The ability to bring your data silos together is important for making data-driven decisions and driving digital transformation. Azure Time Series Insights Gen2 provides an out-of-the-box Power BI connector that connects your Azure Time Series Insights Gen2 queries to a Power BI workspace. You can easily view your time series and business intelligence data in a single pane of glass to make better decisions with a holistic view of your business posture.

Azure Time Series Insights Gen2 integrates with Power BI.

Contextualize raw telemetry with the Time Series Model

Traditionally, the data that's collected from IoT devices lacks contextual information, which makes it difficult to use for business purposes. The Time Series Model, within Azure Time Series Insights Gen2, allows you to contextualize raw telemetry by defining hierarchies, instance properties, and types. This makes your analysis of asset-centric data simple and more valuable to your organization.

It’s easy to get started with Time Series Model using Time Series Explorer to both author and curate your model. Alternatively, the Time Series Model can also be managed through our rich API surface.

The Time Series Model, within Azure Time Series Insights Gen2, allows you to contextualize raw telemetry.
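As a rough illustration of these three building blocks, the fragments below sketch a type (with a variable), a hierarchy, and an instance for a hypothetical temperature tag. The field names follow the general shape of the Time Series Model, but they are simplified assumptions, not verbatim API payloads.

```python
import json

# Hypothetical Time Series Model fragments for one telemetry tag.
# A type defines how to compute variables from raw events.
temperature_type = {
    "id": "type-temperature",
    "name": "TemperatureSensor",
    "variables": {
        "AvgTemp": {
            "kind": "numeric",
            "value": {"tsx": "$event.temperature.Double"},
            "aggregation": {"tsx": "avg($value)"},
        }
    },
}

# A hierarchy defines how instances are organized for navigation.
plant_hierarchy = {
    "id": "hier-plant",
    "name": "Plant",
    "source": {"instanceFieldNames": ["Site", "Line", "Machine"]},
}

# An instance ties a tag to its type, hierarchy, and contextual fields.
instance = {
    "timeSeriesId": ["sensor-0042"],
    "typeId": temperature_type["id"],
    "hierarchyIds": [plant_hierarchy["id"]],
    "instanceFields": {"Site": "Hamburg", "Line": "A", "Machine": "Press-3"},
}

print(json.dumps(instance, indent=2))
```

With this context attached, a query for "average temperature of Press-3 on Line A" becomes a navigation through the hierarchy rather than a hunt for an opaque tag name.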

Gain insights using Azure Time Series Insights Gen2 with Azure Digital Twins

Achieve even greater insights by integrating Azure Time Series Insights Gen2 and Azure Digital Twins. Azure Digital Twins allows you to fully model your physical environment and stream live IoT data for a complete view of your connected assets and environments. Understand how your assets, customers, and processes interact in both real and simulated environments.


Gain greater insights using Azure Time Series Insights Gen2 with Azure Digital Twins.

Open and flexible integration

Azure Time Series Insights Gen2 can be used with tools you know and love. Our cold store is backed by a customer-owned Azure Data Lake. Combining Azure Data Lake Storage with our native support for the open-source, highly efficient Apache Parquet format lets you dive into decades of historical IoT data.

In addition, Azure Time Series Insights Gen2 ships with a Power BI connector allowing customers to export the time series queries they create in Azure Time Series Insights Gen2 into Power BI and view their time series data alongside other business data. Other highly sought-after connectors for popular analytics platforms such as Apache Spark™, Databricks, and Synapse will become available over time.

Time Series Explorer—analytics tool for knowledge workers and developers

The first-class user experience of the Time Series Explorer lets you use interpolation, scalar and aggregate functions, categorical variables, scatter plots, and time shifting of time series signals to analyze the data.

Time Series Explorer features the following user experience capabilities:

Automatically refresh charts.
Reverse lookup instance placement within the hierarchy.
Select and chart multiple variables through a single operation.
View chart statistics.
Create marker annotations.
Duplicate time series instances in the well and change variables.
Change the line colors through the new color picker tool.
Use swim lanes to group related time series together.

New rich query APIs now give you the ability to use interpolation, new scalar and aggregate functions, and categorical variables outside of the Time Series Explorer.

Time Series Explorer features the following API capabilities:

Interpolate patterns from existing data to reconstruct time series signals.
Process discrete signals using categorical variables.
Apply trigonometric functions to identify patterns.
Calculate time weighted averages.
Leverage new APIs for hierarchy traversal, time series search, auto-complete, paths, and facets.
Query data at scale with improved search and navigation efficiency.
Leverage new conditional logic, such as IFF, which allows you to determine if an expression is true or false when selecting what data should be considered for computation. When used with categorical variables, you can create threshold monitors and map ranges of values to their categories.
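As a hypothetical example of how a categorical variable and conditional logic might combine, the payload below sketches an aggregate-series query that maps a numeric pressure reading into named threshold bands. The property names and TSX expression follow the general shape of the Gen2 Query APIs, but they are illustrative assumptions rather than verbatim documentation.

```python
import json

# Hypothetical query payload: bucket a pressure signal into bands using
# a categorical inline variable with iff() conditional logic.
query = {
    "aggregateSeries": {
        "timeSeriesId": ["sensor-0042"],
        "searchSpan": {"from": "2020-07-01T00:00:00Z",
                       "to": "2020-07-02T00:00:00Z"},
        "interval": "PT1H",  # one bucket per hour
        "inlineVariables": {
            "PressureBand": {
                "kind": "categorical",
                # iff() picks a discrete label per event based on a threshold.
                "value": {
                    "tsx": "iff($event.pressure.Double > 100, 'high', 'normal')"
                },
                "categories": [
                    {"label": "High", "values": ["high"]},
                    {"label": "Normal", "values": ["normal"]},
                ],
                "defaultCategory": {"label": "Unknown"},
            }
        },
        "projectedVariables": ["PressureBand"],
    }
}

print(json.dumps(query, indent=2))
```

The same pattern extends to threshold monitors: define the bands once as a categorical variable, then chart or alert on the category counts per interval.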

Customers are using Azure Time Series Insights to gain business insights in manufacturing, power and utilities, oil and gas, automotive, smart buildings, and mining.

Fonterra empowers employees with data

Founded in 2001, Fonterra is the world’s second largest dairy processor, responsible for approximately 30 percent of global dairy exports. Owned by over 10,000 New Zealand farmers, the co-operative operates in over 100 countries and processes approximately 22 billion liters of milk each year.

In 2018, Fonterra made the decision to fast-forward their digital transformation. After a lengthy review, Microsoft was chosen to upgrade their old system with a new, cutting-edge, cloud-based platform. Renamed the “New Historian,” the updated system promises to deliver on their goal of becoming a data-driven organization by giving their operators, leaders, data scientists, and business intelligence teams the power to use data more intelligently.

"Fonterra is embracing advanced technologies to transform into a data-driven organization. We selected Azure Time Series Insights to provide storage, contextualization, and analysis capabilities and replace our legacy on-premises historian. This will allow us to effectively consolidate our data to empower operators, leaders, data scientists, and business intelligence teams." —Tristan Hunter, General Manager of Automation and Operational Technology, Fonterra

ENGIE Digital supports thousands of assets

ENGIE Digital, a provider of renewable energy, delivers energy and provides energy-related services to millions of consumers in more than 50 countries. ENGIE Digital designs, builds, and runs unique solutions that help other ENGIE Digital business units by supporting their development and operations. ENGIE Digital uses an in-house operational platform to collect and process millions of IoT signals every second from thousands of wind, solar, biogas, and hydroelectric energy assets around the globe—often in real-time.

ENGIE Digital selected Azure Time Series Insights and Microsoft Azure IoT Edge to modernize its platform. With these updates, the platform now supports ENGIE Digital teams across hundreds of renewable energy sites worldwide.

“Azure Time Series Insights is a foolproof solution. Its scalability, resilience, performance, and cost-effectiveness mean we always have the latest data at hand.” —Sebastien Gauthier, Head of Darwin Delivery, ENGIE Digital, energy and energy-related service provider

ShookIOT leverages Azure Time Series Insights to deliver customer insights

Oil and gas industry veterans Dr. Dave Shook and Leanna Chan have spent twenty years consulting with clients in the oil and gas industry. Time and time again, they have seen oil and gas companies struggling to leverage the full value of their data.

Traditionally, companies store data in on-premises time-series database applications called historians: legacy operational technology (OT) tools that keep data siloed. This makes it difficult to connect with powerful information technology (IT) tools, such as cloud-based analytics. Additionally, collecting process data can be prohibitively expensive. Some process manufacturers store less than 75 percent of their data.

To address these challenges, the two entrepreneurs had a vision to fuse OT data with IT. They founded ShookIOT in Edmonton, Alberta, Canada in 2017. Their philosophy was to free data siloed on-premises and migrate it to the cloud: specifically, the ShookIOT Fusion Cloud Historian running on Microsoft Azure. Once in the cloud, customers such as Chevron could harness the full value of their data using tools like Azure Time Series Insights.

“After our customer’s data and contextual information is stored in Azure, we leverage tools like Azure Time Series Insights to view data trends and Power BI to create data visualizations.” —Dave Shook, Co-Founder and CEO, ShookIOT

“ShookIOT Fusion improves upon the traditional long-term data storage found at most sites, leverages the Microsoft Azure cloud platform, and accelerates all Azure analytics tools by providing operational and business data with context to users.” —Leanna Chan, Co-Founder and Chief Revenue Officer, ShookIOT

Gain insights from large volumes of data easily

Explore and analyze billions of contextualized events across millions of industrial sensors. Uncover hidden trends, spot anomalies, and conduct root-cause analysis in large volumes of data with an intuitive and straightforward user experience. We’re excited to see how you use Azure Time Series Insights Gen2 to drive your digital transformation.
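To make the kind of exploration described above concrete, the sketch below builds a request body in the shape used by the Time Series Insights Gen2 Query API's aggregateSeries operation, which buckets raw sensor events into interval aggregates. The time series ID ("turbine-042") and the event property name ("value") are placeholder assumptions; verify the exact payload fields against the current API version before use.

```python
import json

def build_aggregate_query(ts_id, from_iso, to_iso, interval="PT1H"):
    """Build an aggregateSeries request body (illustrative sketch of the
    TSI Gen2 Query API shape; field names should be checked against the
    current API documentation)."""
    return {
        "aggregateSeries": {
            "timeSeriesId": [ts_id],
            "searchSpan": {"from": from_iso, "to": to_iso},
            "interval": interval,  # ISO 8601 duration for each bucket
            "inlineVariables": {
                "AvgValue": {
                    "kind": "numeric",
                    "value": {"tsx": "$event.value"},       # assumed property name
                    "aggregation": {"tsx": "avg($value)"},  # average per bucket
                }
            },
            "projectedVariables": ["AvgValue"],
        }
    }

# Hypothetical wind-turbine sensor, aggregated hourly over one day
body = build_aggregate_query(
    "turbine-042", "2020-08-01T00:00:00Z", "2020-08-02T00:00:00Z"
)
print(json.dumps(body, indent=2))
```

A body like this would be POSTed to the environment's query endpoint; the same structure extends to multiple inline variables when you want several aggregates per interval.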

See the following resources to learn more:

Visit the Azure Time Series Insights Gen2 product page.
Read the Azure Time Series Insights documentation.
Read the Quickstart guide Explore the Azure Time Series Insights Preview demo environment.
Watch the Microsoft Build 2020 session Make your IoT data useful with an end-to-end analytics platform, Azure Time Series Insights.
View the Channel 9 IoT Show deep dive, Analyzing IoT Data using Azure Time Series Insights.
Watch the Channel 9 IoT Show, Using Azure Time Series Insights to create an industrial IoT analytics platform.

Source: Azure

Azure Data Factory Managed Virtual Network

Azure Data Factory is a fully managed, easy-to-use, serverless data integration and transformation solution for ingesting and transforming all your data. Choose from over 90 connectors to ingest data and build code-free or code-centric ETL/ELT processes.

Security is a key tenet of Azure Data Factory. Customers want to protect their data sources and ensure that data transmission happens in a secure network environment wherever possible. On public networks, a man-in-the-middle or spoofed-traffic attack could lead to data security problems and data exfiltration.

Now we are glad to announce the preview of Azure Data Factory Managed Virtual Network. This feature provides you with a more secure and manageable data integration solution. With it, you can provision the Azure Integration Runtime inside a Managed Virtual Network and leverage Private Endpoints to securely connect to supported data stores. Data traffic between the Azure Data Factory Managed Virtual Network and your data stores goes through Azure Private Link, which provides secure connectivity and eliminates your data's exposure to the internet. With the Managed Virtual Network and Private Endpoints, you can also offload the burden of managing the virtual network to Azure Data Factory and protect against data exfiltration.
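As a rough sketch of what the provisioning step looks like, the snippet below assembles the JSON body for an Azure Integration Runtime that references the factory's Managed Virtual Network, in the general shape of the Microsoft.DataFactory/factories/integrationRuntimes resource schema. The runtime name and the "default" virtual network reference are assumptions; compare against the template the Azure portal generates for your factory.

```python
import json

# Illustrative resource body (not a definitive schema): an Azure
# Integration Runtime placed inside the factory's Managed Virtual Network.
managed_ir = {
    "name": "ManagedVnetIntegrationRuntime",  # hypothetical name
    "properties": {
        "type": "Managed",
        # Reference to the factory's managed virtual network; "default"
        # is an assumed reference name.
        "managedVirtualNetwork": {
            "referenceName": "default",
            "type": "ManagedVirtualNetworkReference",
        },
        "typeProperties": {
            # Let Azure pick the region closest to the data ("AutoResolve")
            "computeProperties": {"location": "AutoResolve"},
        },
    },
}

print(json.dumps(managed_ir, indent=2))
```

The key detail is the managedVirtualNetwork reference: without it, the integration runtime is provisioned in the shared public network rather than the isolated managed one.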

High-level architecture

Azure Data Factory Managed Virtual Network terminology

Managed Virtual Network

The Managed Virtual Network is associated with an Azure Data Factory instance and managed by Azure Data Factory. When you provision an Azure Integration Runtime, you can choose to place it within the Managed Virtual Network.

Creating an Azure Integration Runtime within the Managed Virtual Network ensures that the data integration process is completely isolated and secure.

Managed Private Endpoints

Managed Private Endpoints are private endpoints created in the Azure Data Factory Managed Virtual Network that establish a private link to Azure resources. Azure Data Factory manages these private endpoints on your behalf.

A private endpoint uses a private IP address in the managed virtual network to effectively bring the service into it. Private endpoints are mapped to a specific resource in Azure, not the entire service, so customers can limit connectivity to specific resources approved by their organization.
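To illustrate that one-endpoint-per-resource mapping, here is a sketch of a managed private endpoint definition targeting a single storage account. The resource ID is a placeholder, and the groupId selects the sub-service being linked (for example, blob storage); check the managed private endpoint schema in the Azure Data Factory documentation for the authoritative field names.

```python
import json

# Illustrative managed private endpoint body. The privateLinkResourceId
# pins the endpoint to exactly one Azure resource (placeholders below),
# and groupId picks the sub-service of that resource to expose.
managed_private_endpoint = {
    "name": "myBlobPrivateEndpoint",  # hypothetical name
    "properties": {
        "privateLinkResourceId": (
            "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
            "/providers/Microsoft.Storage/storageAccounts/<account-name>"
        ),
        "groupId": "blob",  # link only the blob sub-service
    },
}

print(json.dumps(managed_private_endpoint, indent=2))
```

Because the endpoint is scoped to one resource ID and one group ID, traffic from the managed virtual network can reach only that approved resource, not the whole storage service.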

Next steps

Get more secure today by following the steps to set up a Managed Virtual Network.

Source: Azure