Access modeled data from Looker Studio, now in public preview

In April, we announced the private preview of our integration between Looker and Looker Studio (previously known as Data Studio). At Next in October, to further unify our business intelligence offerings under the Looker umbrella, we announced that Data Studio has been renamed Looker Studio. Both products are now part of the Looker family, with Looker Studio remaining free of charge. At Next we also announced that the integration between these two products is now available in public preview with additional functionality.

How does the integration work?

Customers using the Looker connector have access to governed data from Looker within Looker Studio. The Looker connector for Looker Studio makes both self-serve and governed BI available to users in the same environment. When connecting to Looker, Looker Studio customers can leverage its semantic data model, which simplifies complex data for end users with a curated catalog of business data, pre-defined business metrics, and built-in transformations. This keeps calculations and business logic consistent within a central model and promotes a single source of truth for the organization. Access to Looker-modeled data within Looker Studio reports allows people to use the same tool to create reports that rely on both ad-hoc and governed data. They can use LookML to create Looker data models by centrally defining and managing business rules and definitions in one Git version-controlled data model. Users can analyze and rapidly prototype ungoverned data (from spreadsheets, CSV files, or other cloud sources) within Looker Studio, and blend governed data from Looker with data from over 800 data sources in Looker Studio to rapidly generate new insights. They can turn their Looker-governed data into informative, highly customizable dashboards and reports in Looker Studio and collaborate in real time with teammates or people outside the company.
What’s new in the public preview version?

We are excited to offer this preview to a broader set of customers, many of whom have already asked for access to the Looker connector for Looker Studio. With this public preview, additional capabilities have been added to more fully represent the Looker model in Looker Studio:

Support for field hierarchies in the Looker Studio data panel keeps fields organized when working with large Explores. The data panel now shows a folder structure, with fields organized in the usual ways – by Views, Group Labels, and Dimension Groups.
Field descriptions are exposed in new ways, so users can quickly check the description information specified in the Looker model. Field descriptions are available within the data panel and within tables in the report.
Users will also see an option to “Open in Looker Studio” from Explores in Looker, enabling them to quickly create a Looker Studio report with a data source pointing back to that Explore.
To ensure users get the most current data from the underlying data source, refreshing data in Looker Studio now also refreshes the data in the Looker cache.

For this public preview, we’ve also implemented enhanced restrictions on Looker data sources in Looker Studio, so admins can rest easy about testing out the functionality:

We’ve disabled owner’s credentials for Looker data sources in Looker Studio, so every viewer needs to supply their own credentials, including for shared reports.
We’re currently disabling data download and email scheduling for these data sources in Looker Studio. We plan to integrate with these permissions in Looker in the near future.
Calculated fields are disabled, so end users cannot define their own custom metrics and dimensions in Looker Studio and must rely on the fields defined in the Looker Explore.
How do I access the preview?

This integration encompasses the connector along with changes to both Looker Studio and Looker to represent the Looker model and extend Looker governance in Looker Studio. There is much more to come as we continue our efforts to bring together a complete, unified platform balancing self-service and governed BI. We plan to continue adding functionality in Looker Studio to fully represent the Looker model, and we want to ensure Looker admins have insight into API activity coming from Looker Studio – similar to the way they might use System Activity in Looker today. In extending governance, we want to expand the circle of trust from Looker to Looker Studio, and we’ll be looking for customers to help us plan the best way forward.

This integration is compatible with Google Cloud-hosted instances running Looker version 22.16 or higher. To get access, an admin of a Looker instance can submit the sign-up form, providing an instance URL and specifying which organizational domain to enable. For more information on how to get started, go to the Looker Studio Help Center. For more information and a demo, watch the Next ‘22 sessions ANA202: Bringing together a complete, unified BI platform with Looker and Data Studio and ANA100: What’s new in Looker and Data Studio.
Source: Google Cloud Platform

How Telus Insights is using BigQuery to deliver on the potential of real-world big data

Editor’s note: Today, we’re hearing from TELUS Insights about how Google BigQuery has helped them deliver on-demand, real-world insights to customers.

Collecting reliable, de-identified data on population movement patterns and markets has never been easy, particularly for industries that operate in the physical world like transit and traffic management, finance, public health, and emergency response. Unlike online businesses, these metrics might be collected manually or limited by smaller sample sizes over a relatively short time. But imagine the positive impact this data could have if organizations had access to mass movement patterns and trends to solve complicated problems and mitigate pressing challenges such as traffic accidents, economic leakage, and more.

As one of Canada’s leading telecommunications providers, TELUS is in a unique position to provide powerful data insights about mass movement patterns. At TELUS, we recognize that the potential created by big data comes with a huge responsibility to our customers. We have always been committed to respecting our customers’ privacy and safeguarding their personal information, which is why we have implemented industry-leading Privacy by Design standards to ensure that their privacy is protected every step of the way. All the data used by TELUS Insights is fully de-identified, meaning it cannot be traced back to an individual. It is also aggregated into large data pools, ensuring privacy is fully protected at all times.

BigQuery checked all our boxes for building TELUS Insights

TELUS Insights is the result of our vision to help businesses of all sizes and governments at all levels make smarter decisions based on real-world facts. Using industry-leading privacy standards, we strongly de-identify our network mobility data and then aggregate it so no one can trace the data back to any individual.
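The aggregation step described above can be sketched in simplified form as small-count suppression: records are grouped into large pools, and any group below a minimum size is dropped entirely so results cannot be traced back to individuals. The threshold, field names, and region names below are hypothetical, and TELUS's actual pipeline is not public.

```python
from collections import Counter

# Illustrative sketch only: aggregate trips into large pools and suppress
# any origin/destination pair with fewer than a minimum number of records.
MIN_GROUP_SIZE = 20  # hypothetical suppression threshold

def aggregate_movements(records, min_group_size=MIN_GROUP_SIZE):
    """Count trips per (origin, destination) pair, dropping any pair
    with fewer than min_group_size trips."""
    counts = Counter((r["origin"], r["destination"]) for r in records)
    return {pair: n for pair, n in counts.items() if n >= min_group_size}

# Example: 25 trips Toronto->Muskoka survive aggregation; a pair seen
# only 3 times is suppressed entirely.
records = (
    [{"origin": "Toronto", "destination": "Muskoka"}] * 25
    + [{"origin": "Toronto", "destination": "Kawartha"}] * 3
)
print(aggregate_movements(records))
```

Real de-identification pipelines layer many more protections on top of this (generalization, noise, expert review), but the suppression idea is the same: no output group is small enough to single anyone out.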
We needed to build an architecture that would provide the performance necessary to run very complex queries, many of which were location-based and benefited from dedicated geospatial querying. TELUS is recognized as the fastest mobile operator and ranked first for network quality performance in Canada, and we wanted to deliver the same level of performance for our new data insights business.

We tested a number of products, from data appliances to an on-premises data lake, but it was BigQuery, Google Cloud’s serverless, highly scalable, and fully managed enterprise data warehouse, that eventually came out ahead of the pack. Not only did BigQuery deliver fast performance that enabled us to easily and quickly analyze large amounts of data at virtually any scale, it also offered support for geospatial queries, a key requirement for the TELUS Insights business.

Originally, the model for TELUS Insights was consultative in nature: we would meet with customers to understand their requirements, and our data science team would develop algorithms to provide the needed insights from the available data sets. However, performance from our data warehouse proved challenging: it would take us six weeks of query runtime to extract insights from a month of data. To better serve our customers, we began investigating the development of an API that, with simple inputs, would provide a consistent output so that customers could start using the data in a self-serve and secure manner. BigQuery proved able to meet our needs by combining high performance for complex queries, support for geospatial queries, and ease of implementing a customer-facing API.

High performance enabled new models of customer service

With support for ANSI SQL, our data scientists found the environment very easy to use. The performance boost was immediately apparent, with project queries taking a fraction of the time compared to previous experiences – and that was before performing any optimization.
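As an illustration of the kind of geospatial querying BigQuery supports, its GIS functions such as ST_GEOGPOINT and ST_DWITHIN let a single SQL statement aggregate records within a radius of a point of interest. The table and column names below are invented for illustration and are not TELUS's actual schema.

```python
# Hypothetical sketch: builds a BigQuery GIS query that sums aggregated,
# de-identified trip counts within a radius of a point. ST_GEOGPOINT and
# ST_DWITHIN are real BigQuery GIS functions; the dataset is made up.

def nearby_count_sql(table: str, lon: float, lat: float, radius_m: int) -> str:
    """Return SQL counting trips per region within radius_m meters of a point."""
    return f"""
    SELECT region, SUM(trip_count) AS trips
    FROM `{table}`
    WHERE ST_DWITHIN(location, ST_GEOGPOINT({lon}, {lat}), {radius_m})
    GROUP BY region
    """

# Example: trips within 5 km of downtown Toronto.
sql = nearby_count_sql("myproject.mobility.aggregated_trips", -79.38, 43.65, 5000)
print(sql)

# Executing it would use the official client (requires GCP credentials):
# from google.cloud import bigquery
# rows = bigquery.Client().query(sql).result()
```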
BigQuery’s high performance was also one of the main reasons we were able to successfully launch an API that can be consumed directly and securely by our customers. Our customers were no longer limited by the size of their queries and could now get their data back in minutes. In the original consulting model, customers were dependent on our team and had little direct control over their queries, but BigQuery has allowed us to put the power of our data directly in our customers’ hands while maintaining our commitment to privacy. Using BigQuery to power our data platform means we also benefit from the entire ecosystem of Google Cloud services and solutions, opening new doors and opportunities for us to deepen the value of our data through advanced analytics and AI-based techniques such as machine learning.

Cloud architecture enabled a quick pivot to meet COVID challenges

When the COVID-19 pandemic hit, we realized there was huge value in de-identified and aggregated network mobility data for health authorities and academic researchers working to reduce COVID-19 transmission without compromising the personal privacy of Canadians. As our TELUS Insights API was already in place, we were able to immediately shift focus and meet this public health need. Our API allowed us to provide supervised and guided access to our de-identified and aggregated data for government organizations and academic institutions, after which they were able to build their own algorithms specific to the needs of epidemiology. BigQuery also enabled us to build federated access environments where we could safelist these organizations and, with appropriate supervision, allow them to securely access the views they needed to build their reporting.

COVID-19 use case: The image above shows de-identified and aggregated mass movement patterns from the City of Toronto into outlying regions in May 2020, when stay-at-home orders were issued by the City and residents started traveling to cottage country.
Public health authorities were able to use this data to inform local hospitals of the surge in population in their surrounding geographic areas and to attempt to provision extra capacity at nearby hospitals, including equipment such as much-needed ventilators. Our traditional Hadoop environments could never have adapted to that changing set of requirements so quickly. With BigQuery, we were able to get the system up and running in under a month. That program, now called Data for Good, won two awards: the HPE International Association of Privacy Professionals’ Privacy Innovation of the Year award for 2020, and the Social Impact & Communications and Service Providers Google Cloud Customer Award for 2021. TELUS’ Data for Good program is supporting other areas of social good, in no small part because of the architectural benefits of having built on BigQuery and Google Cloud.

Ready to unleash the power of our data with Google Cloud

BigQuery is a key enabler of TELUS Insights, allowing us to shift from a slow, consultative approach to a more adaptive data-as-a-service model that makes our platform and valuable data more accessible to our customers. Moving to BigQuery led to major improvements in performance, reducing some of our initial queries from months of runtime to hours. Switching to a cloud-based solution with exceptionally high performance also made it easier for us to create an API to serve our commercial customers and enabled us to offer a key service to the community, in a time of crisis, with our Data for Good program. To learn more about TELUS Insights, or to book a consultation about our products and services, visit our website.

When we built our TELUS Insights platform, we worked with leading industry experts in de-identification. In addition, TELUS has taken a leadership role in de-identification and is a founding member of the Canadian Anonymization Network, whose mission is to help establish strong industry standards for de-identification.
The TELUS de-identification methodology and, in fact, our whole Insights service, has been tested through re-identification attacks [1][2], stress-tested, and, importantly, it has been Privacy by Design certified. Privacy by Design certification was achieved in early 2017 for our Custom Studies product, and in early 2018 for our GeoIntelligence product.
Source: Google Cloud Platform

Zero downtime migration for Azure Front Door—now in preview

In March of this year, we announced the general availability of two new Azure Front Door tiers. Azure Front Door Standard and Premium are our native, modern cloud content-delivery network (CDN) tiers, catering to both dynamic and static content delivery acceleration with built-in turnkey security and a simple, predictable pricing model. They have already been widely adopted by many of our customers. We also promised to provide a zero downtime migration tool to migrate from Azure Front Door (classic) and Azure CDN from Microsoft (classic) to the new Azure Front Door tiers.

Today, we are taking the next step in that journey, and we are excited to announce the preview of the Azure Front Door tier migration capability, along with some additional new features. The migration capability for Azure CDN from Microsoft (classic) is coming soon.

New features/capabilities on the new Front Door since general availability

Along with the migration feature, we have added more capabilities and integrations to the new Front Door tiers to provide you with a better cloud CDN solution and a more integrated Azure cloud experience.

Preview—Upgrade from Standard to Premium tier without downtime: To learn more about upgrading to Premium tier, see Azure Front Door Tier Upgrade. This capability is also supported during the migration from Azure Front Door (classic) to the new Front Door tier.
Preview—Managed identities integration: Azure Front Door now supports Managed Identities generated by Azure Active Directory to allow Front Door to easily and securely access other Azure AD–protected resources such as Azure Key Vault. This feature is in addition to the AAD Application access to Key Vault that is currently supported. To learn more about how to enable managed identities on Azure Front Door Standard and Premium, please read Set up managed identity with Front Door.
Integration with App Service: Front Door can now be deployed directly from the App Service resource with a few clicks. The previous deployment workflow only supported Azure Front Door (classic) and Azure CDN.
Pre-validated domain integration with Static Web Apps: Static Web App (SWA) customers who have already validated custom domains at the SWA level can now skip domain validation on their Azure Front Door. For more details, see Configure a custom domain on Azure Front Door using the Azure portal.
Terraform support for Azure Front Door Standard and Premium, enabling the automation of Azure Front Door Standard and Premium provisioning using Terraform. For more information, see Create a Front Door Standard/Premium profile using Terraform.
Azure Advisor integration provides suggestions for best practices and configurations, including expired certificates, certificates about to expire, autorotation failures for managed certificates, domains still pending validation after 24 hours, and use of the latest "secret" version.

Migration overview

Azure Front Door enables you to perform a zero-downtime migration from Azure Front Door (classic) to Azure Front Door Standard or Premium in just three simple steps. The migration will take a few minutes to complete depending on the complexity of your Azure Front Door (classic) instance, such as the number of domains, backend pools, routes, and other configurations.

If your Azure Front Door (classic) instance has custom domains with your own certificates, there are two extra steps: enabling a managed identity for the new Azure Front Door profile and granting that managed identity access to your key vault.

The classic instance will be migrated to the Standard or Premium tier by default based on the Azure Front Door (classic) WAF configurations. Upgrading from the Standard tier to Premium during the migration is also supported. If your Azure Front Door (classic) qualifies to migrate to Azure Front Door Standard, but the number of resources exceeds the standard quota limit, the Azure Front Door (classic) instances will be migrated to a Premium profile instead.

If you have Web Application Firewall (WAF) policies associated with the Front Door profile, the migration process will create copies of your WAF policies and configurations for the new Front Door profile tier. You can also use an existing WAF policy that matches the tier you're migrating to.

Azure Front Door tier migration is supported through the Azure portal. Azure PowerShell, Azure CLI, SDK, and REST API support is coming soon.

You’ll be charged the Azure Front Door Standard or Premium base fee from the moment the migration completes. Data transfer out from edge locations to clients, outbound data transfer from edge to origin, and requests are charged based on the traffic flow after migration. For more details about Azure Front Door Standard and Premium pricing, see our pricing for Azure Front Door.

Notable changes after migration

DevOps: Azure Front Door Standard and Premium uses a different resource provider namespace, Microsoft.Cdn, while Azure Front Door (classic) uses Microsoft.Network. After migration from classic to the Standard or Premium tier, you’ll need to change your DevOps scripts and infrastructure code to use the new namespace and the updated ARM templates, Bicep, PowerShell modules, Terraform, CLI commands, and APIs.
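As a simplified illustration of that namespace change, a script could scan an ARM template for resources still declared under the classic Microsoft.Network/frontDoors type, which must be redefined under Microsoft.Cdn after migration. The template below is a made-up minimal example, not a real deployment; consult the updated ARM reference for the actual resource schema.

```python
import json

# Hypothetical minimal ARM template with one classic Front Door resource,
# one migrated Standard/Premium profile, and an unrelated resource.
template = json.loads("""
{
  "resources": [
    {"type": "Microsoft.Network/frontDoors", "name": "my-classic-afd"},
    {"type": "Microsoft.Cdn/profiles", "name": "my-new-afd"},
    {"type": "Microsoft.Storage/storageAccounts", "name": "mystorage"}
  ]
}
""")

def classic_front_door_resources(tmpl: dict) -> list:
    """Return names of resources still using the classic Front Door type,
    i.e. those that need to be rewritten under the Microsoft.Cdn namespace."""
    return [
        r["name"]
        for r in tmpl.get("resources", [])
        if r["type"].startswith("Microsoft.Network/frontDoors")
    ]

print(classic_front_door_resources(template))  # resources needing migration
```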
Endpoint: The new Front Door endpoint is generated with a hash value to prevent domain takeover, in the format endpointname-hashvalue.z01.azurefd.net. The Azure Front Door (classic) endpoint name will continue to work after migration. However, we recommend replacing it with the newly created endpoint in Azure Front Door Standard and Premium. For more information, refer to Endpoint in Azure Front Door.
Diagnostic logs and metrics won’t be migrated. We recommend you enable diagnostic logs and monitoring metrics in your Azure Front Door Standard or Premium profile after migration. Azure Front Door Standard and Premium tier also offers built-in reports and health probe logs.

Get started

Get started with your Azure Front Door migration today!

To learn more about the service and various features, refer to the Azure Front Door documentation.

Learn more about Azure Front Door's tier migration capabilities

About Azure Front Door (classic) to Standard/Premium tier migration.
Mapping between Azure Front Door (classic) and Standard/Premium tier.
Migrate Azure Front Door (classic) to Standard/Premium tier in the Azure portal.

We’re looking forward to your feedback to drive a better experience for the general availability of the migration feature.
Source: Azure

Announcing the general availability of SQL Notebooks support in Amazon Redshift Query Editor

Amazon Redshift offers a new way to work on multiple SQL queries by organizing them in a single notebook with documentation, visualization, and collaboration features. The new SQL notebook interface in Amazon Redshift Query Editor v2 enables users such as data analysts and data scientists to run data analyses more efficiently by keeping related queries and information together for easy reuse.
Source: aws.amazon.com

The AWS Console mobile app adds support for AWS CloudShell

Users of the AWS Console mobile app can now access AWS CloudShell from the iOS and Android applications. The AWS Console mobile app offers AWS CloudShell in a mobile-friendly interface that lets users run scripts with the AWS Command Line Interface (AWS CLI) to interact with more than 250 AWS services on the go. Users also get an extended mobile keyboard when using AWS CloudShell in the AWS Console mobile app; it provides keys (e.g. Tab, Ctrl, Alt, Esc) that are available in the AWS CloudShell console on the desktop. The AWS Console mobile app currently offers AWS CloudShell in the following AWS Regions: US East (N. Virginia), US East (Ohio), US West (Oregon), Asia Pacific (Mumbai), Asia Pacific (Sydney), Asia Pacific (Tokyo), Canada (Central), Europe (Frankfurt), Europe (Ireland), Europe (London), and South America (São Paulo).
Source: aws.amazon.com

AWS Cloud Control API is now available in the AWS Middle East (UAE) Region

AWS Cloud Control API is now also available in the AWS Middle East (UAE) Region. Cloud Control API is a set of common application programming interfaces (APIs) designed to make it easy for developers to manage their cloud infrastructure in a uniform way and to adopt the latest AWS capabilities faster. With Cloud Control API, developers can manage the lifecycle of hundreds of AWS resources and more than a dozen third-party resources with five consistent APIs instead of using distinct service-specific APIs. Cloud Control API also lets AWS Partner Network (APN) partners automate the integration of their solutions with existing and future AWS capabilities and services through a one-time integration, instead of spending weeks on custom development work whenever new resources become available. HashiCorp Terraform, Pulumi, and Red Hat Ansible have integrated their solutions with AWS Cloud Control API.
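The five consistent operations mentioned above are CreateResource, GetResource, UpdateResource, DeleteResource, and ListResources. A rough sketch of how they look through boto3's "cloudcontrol" client follows; the bucket name is hypothetical, and the calls that would actually reach AWS are commented out because they require credentials.

```python
import json

# Illustrative sketch of the five uniform Cloud Control API operations.
# DesiredState is passed as a JSON document describing the resource.
desired_state = {"BucketName": "my-example-bucket"}  # hypothetical name

create_request = {
    "TypeName": "AWS::S3::Bucket",
    "DesiredState": json.dumps(desired_state),
}

# import boto3
# cc = boto3.client("cloudcontrol", region_name="me-central-1")  # UAE Region
# cc.create_resource(**create_request)                         # CreateResource
# cc.get_resource(TypeName="AWS::S3::Bucket",
#                 Identifier="my-example-bucket")               # GetResource
# cc.list_resources(TypeName="AWS::S3::Bucket")                 # ListResources
# cc.update_resource(TypeName="AWS::S3::Bucket",
#                    Identifier="my-example-bucket",
#                    PatchDocument='[{"op": "add", ...}]')      # UpdateResource
# cc.delete_resource(TypeName="AWS::S3::Bucket",
#                    Identifier="my-example-bucket")            # DeleteResource

print(create_request["TypeName"])
```

The same five calls work for any supported resource type; only TypeName and the state document change, which is what makes the API uniform across services.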
Source: aws.amazon.com