20+ Cloud Networking innovations unveiled at Google Cloud Next

Networking is the foundational fabric that allows organizations to thrive in a digital business world. Today at Next '22, we are announcing a series of innovations to our Google Cloud networking services, all designed to meet customers where they are with AI/ML-powered services and built-in security. We start with a planet-scale network that is continually expanding to reach more customers. At 35 regions, 106 zones, and 173 network edge locations across 200+ countries and territories, the Google Cloud Network offers services that allow customers to easily migrate, modernize, secure, and observe their workloads.

"As enterprises continue to migrate new and established workloads to public cloud, they are recognizing that network architectures, infrastructure, and operating models must be modernized. In a cloud context, the network truly is the digital nervous system, providing secure and ubiquitous connectivity for business resilience and digital experiences. With these latest enhancements and additions to its network and security portfolio, Google Cloud is responding to the need for simplified cloud migrations through network modernization, which is integral to the success of enterprise digital transformation," said Brad Casemore, Research VP, Datacenter and Multicloud Networks, IDC.

Let's take a closer look at all the enhancements we announced today, also covered in our Networking session MOD 205.

Simplify migrations

As customers migrate services to the cloud, they may face connectivity and security challenges. Private Service Connect connects services across VPC networks that are in different groups, teams, projects, or organizations, over an encrypted connection. Today, we are announcing the following Private Service Connect enhancements in Preview:

- L7 PSC provides consumer-controlled security, routing, and telemetry to help enable more flexible and consistent policy for all services
- Private Service Connect over interconnect provides support for on-prem traffic through Cloud Interconnect to PSC endpoints
- Private Service Connect for hybrid environments can enable producers and consumers to securely connect and access managed services from cloud or on-prem
- Integration with five new partner managed services from Confluent, Databricks, DataStax, Grafana, and Neo4j, enabling customers to easily consume data and analytics services

You can learn more about Private Service Connect and these enhancements here.

Customers with High Performance Computing (HPC) workloads are migrating to the cloud to leverage exponential gains in IOPS. Workloads such as scale-out analytics, AI/ML, and financial risk modeling and simulation demand the highest compute and network performance. We are introducing the preview of 200 Gbps networking for the new C3 virtual machine family, offering 2x the bandwidth of the C2 family, and line-rate encryption using the open-source PSP Security Protocol.

Accelerate modernization

When it comes to the network, modernization takes on many forms. For some customers, it's about application modernization, and for others, it's about modernizing with cloud and reaching more customers through content delivery networks (CDNs). Here are just a few of the ways we're helping Google Cloud customers modernize their network infrastructure.
Content Delivery Network

Earlier this year, we introduced Media CDN, which leverages the same infrastructure as YouTube to enable exceptional video-on-demand and live streaming experiences through caching presence across 1,300+ cities and 200+ countries and territories. Paramount Global is one of the world's largest producers of premium entertainment content, and has adopted Media CDN: "Streaming is one of the key growth areas for Paramount Global. When we migrated traffic onto Media CDN, we observed consistently superior performance and offload metrics. Partnering with Google Cloud enables us to provide our subscribers with the highest quality viewing experience," says Chris Xiques, SVP of Video Technology Group at Paramount Global.

Media CDN now supports the Live Stream API to ingest and package source content into HTTP Live Streaming (HLS) and DASH formats for optimized live streaming. We are enabling two new developer-friendly integrations in Preview for Media CDN: Dynamic Ad Insertion with Google Ad Manager, which provides customized video ad placements, and third-party Ad Insertion using our Video Stitcher API for personalized ad placement. With these options, content producers can introduce additional monetization and personalization opportunities to their streaming services.

For advanced customization, we are introducing the Preview of Network Actions for Media CDN, a fully managed serverless solution based on open-source WebAssembly that enables programmability for customers to deploy their own code directly in the request/response path at the edge. Using Network Actions, customers can unlock a wide variety of custom use cases such as security controls, cache offload, custom logs, and more.

Many customers are rethinking and modernizing their CDNs and migrating to cloud-based solutions to minimize costs and maximize end-to-end performance. AppLovin, which provides an industry-leading mobile app platform, is one such customer that migrated to Cloud CDN for improved performance. "AppLovin powers many of the world's most popular mobile apps and game studios. Partnering with Google Cloud has enabled us to expand our platform globally and reach more users quickly. We tripled our traffic in 90 days with millions of requests per second and saw a 50% reduction in latency with Google Cloud Load Balancing and Cloud CDN," says Omer Hasan, VP of Operations at AppLovin.

Today, we are adding dynamic compression to Cloud CDN to further accelerate applications by significantly reducing the size of responses transferred from the edge to a client. Dynamic compression accelerates page load times and reduces egress traffic for better performance and efficiency.

Container networking

Customers running network-intensive enterprise and telco workloads in container network functions (CNFs) can use a high-performance dataplane and multi-networking under the umbrella of Network Function Optimizer. Network Function Optimizer, in Preview, delivers enhanced networking capabilities that allow customers to connect multiple container network functions, apply labels for selection, and steer traffic to them. High-performance networking in the Google Distributed Cloud Edge platform leverages capabilities such as DPDK and SR-IOV for faster packet processing.

Protect with built-in security

Google Cloud offers a comprehensive network security solution to help protect your cloud infrastructure.
Cloud Firewall and Cloud Armor are two of those tools.

Expanding Cloud Firewall

Google Cloud Firewall helps customers achieve a zero-trust network posture via a fully distributed, cloud-native firewall service with advanced protection capabilities and granular controls. We are expanding our Cloud Firewall product line and introducing two new tiers: Cloud Firewall Essentials and Cloud Firewall Standard.

The new Cloud Firewall Standard, in Preview, offers expanded policy objects for firewall rules that can simplify configuration and micro-segmentation to help protect your cloud infrastructure and workloads. It includes the following types of objects, whose contents are built and auto-updated by Google: Google Cloud Threat Intelligence objects (with five types of curated lists, including known malicious IPs), domain name (FQDN) objects, and geolocation-based objects, which together offer robust and highly scalable protection.

Cloud Firewall Essentials is our current foundational tier of firewall capabilities. We recently introduced new configuration structures, Global and Regional Network Firewall Policies, which have built-in IAM controls, may be applied across VPCs, and support batch rule updates. In addition, we announced IAM-governed Tags, enabling scalable micro-segmentation policies that follow the workload. Both of these features are now generally available. And we have added Address Group objects, in Preview, to help simplify automation and infrastructure-as-code operations.

The combination of IAM-governed Tags in Cloud Firewall Essentials, the dynamic objects in Cloud Firewall Standard, Address Groups, and our existing hierarchical firewall rules helps customers run a flexible, least-privilege, self-service environment that enforces pinpoint policy with greater simplicity and fewer operational cycles. You can hear more about Cloud Firewall at Next session MOD107.

Cloud Armor named a Strong Performer

We've also extended the capabilities of another network security product, Google Cloud Armor, which helps protect web applications, services, and APIs from both DDoS attacks and web application exploit attempts. You can now configure the ML-based Adaptive Protection capability – which recently detected and protected a customer from the largest L7 DDoS attack to date (hear more in session SEC201) – to automatically deploy its proposed rules. We've also enhanced tuning for preconfigured WAF rules, adding field exclusion, signature opt-in, and expanded JSON content type support, all now in Preview. Preconfigured WAF rules using the latest ModSecurity Core Rule Set v3.3, covering the OWASP Top 10 web-app vulnerability risks, are now generally available.

And we are pleased to share that Google Cloud Armor was named a Strong Performer in The Forrester Wave™: Web Application Firewalls, Q3 2022 (report linked here). This is our debut in the WAF Wave, and it's encouraging to see third-party recognition for the product in this market segment.

Observe, detect, and recommend

Throughout the customer journey, observability is a key enabler of successful network migration, modernization, and security. Network Intelligence Center, our real-time observability platform, continues to expand its ability to help customers tame operational complexity. Here are several enhancements to Network Intelligence Center.
Network Analyzer, now generally available, automatically learns and monitors customers' network deployments, specifically to detect misconfigurations and drift in network topology, firewall rules, routes, load balancers, and connectivity to services and applications. Customers can set alerts on insights with log-based alerting, and programmatically access the data with the Recommender API (see the Python sketch at the end of this post).

Performance Dashboard now provides visibility into latency measurements for Google Cloud-to-internet traffic at per-project and global levels. This visibility helps customers plan the placement of their Google Cloud resources and overall network architecture.

Network Topology is enhanced with a new "top talkers" view so that customers can quickly identify and monitor their top contributors to egress, and optimize the architecture for performance and cost.

Firewall Insights launched new enhancements that provide IPv6 rule coverage and a custom insight refresh cycle to generate shadowed-rule insights for projects.

Innovating at all layers of the stack

From startups born in the cloud to enterprises migrating to the cloud, companies are leveraging the ubiquity of cloud everywhere as a catalyst to shape, expand, and accelerate their digital transformation. At Google Cloud, we are working side by side with customers to simplify their cloud journey with innovations at all layers of the networking and security stack to open new possibilities. Check out these Cloud Networking sessions from Google Cloud Next to learn more.
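To give a flavor of the programmatic access mentioned above, here is a minimal Python sketch that lists Network Analyzer insights through the Recommender API. This is an illustration, not an excerpt from the announcement: the project ID is a placeholder, and the insight type name is an assumption — consult the Network Analyzer documentation for the insight types that apply to your deployment.

```python
from google.cloud import recommender_v1

client = recommender_v1.RecommenderClient()

# Placeholder project and assumed insight type; check the Network Analyzer
# docs for the full list of supported insight types.
parent = (
    "projects/my-project/locations/global/"
    "insightTypes/google.networkanalyzer.vpcnetwork.ipAddressInsight"
)

# Each insight carries a severity, a description, and associated resources.
for insight in client.list_insights(parent=parent):
    print(insight.severity, insight.description)
```

The same client can feed log-based alerting pipelines or ticketing automation, so drift findings reach the right team without anyone polling the console.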

What's new in Google Cloud databases: More unified. More open. More intelligent.

Every organization is going through some form of digital transformation and serving their customers in new ways. Modern conveniences have taught consumers that their experience is paramount — no matter how big or small the company or how complex the problem. Powering these digital experiences are operational databases, the backbone of most applications. The quality of the customer experience is critically dependent on how reliable, scalable, performant, and secure these operational databases are.

At Google Cloud, our mission is to accelerate every organization's ability to digitally transform. A large part of that is helping our customers and partners innovate faster with a unified, open, and intelligent data cloud platform. At Google Cloud Next, we're excited to announce new Google Cloud databases capabilities that enable more opportunities for growth and innovation within your organization. The four key areas we've focused on are:

- Building a unified and integrated data cloud for transactional and analytical data
- Breaking free from legacy databases and our commitment to open ecosystems and standards
- Infusing AI and machine learning across data-driven workflows
- Empowering builders to be more productive and impactful

Unifying transactional and analytical data

Traditionally, data architectures have separated transactional and analytical workloads, including their underlying databases — and for good reason. Transactional databases are optimized for fast reads and writes, while analytical databases are optimized for aggregating large data sets. Because these systems are largely decoupled, many inefficiencies can arise. Enterprises struggle to piece together disparate data solutions, they spend valuable time managing complex data pipelines, and they expend a lot of effort replicating data between databases. Ultimately, they find it difficult to build intelligent, data-driven applications.

At Google Cloud, we're uniquely positioned to solve this problem because of how we've architected our data platform. Our transactional and analytical databases are built on a highly scalable distributed storage system, with disaggregated compute and storage, and high-performance Google-owned global networking. This combination allows us to provide tightly integrated data cloud services across Cloud Spanner, Cloud Bigtable, AlloyDB for PostgreSQL, and BigQuery.

We're excited to announce the Preview of Bigtable change streams for easy data replication. Bigtable is a highly performant, fully managed NoSQL database service that processes over 5 billion requests per second at peak and has more than 10 exabytes of data under management. With change streams, you can track writes, updates, and deletes to Bigtable databases and replicate them to downstream systems such as BigQuery. Change streams help support real-time analytics, event-based architectures, and multicloud operational database deployments. This capability joins recently launched Spanner change streams.

We also recently announced Datastream for BigQuery in Preview, which provides easy replication of data from operational database sources such as AlloyDB, PostgreSQL, MySQL, and Oracle, directly into BigQuery with a few simple clicks.
With a serverless, auto-scaling architecture, Datastream allows you to easily set up an Extract, Load, Transform (ELT) pipeline for low-latency data replication, enabling real-time insights in BigQuery.

Datastream enables real-time insights in BigQuery with just a few steps.

Greater freedom and flexibility with open source and open standards

In recent years, organizations have become unwilling to tolerate opaque costs, restrictive licensing, and vendor lock-in, and we're seeing them increasingly adopt open-source databases and open standards. In particular, PostgreSQL has emerged as a leading alternative to legacy, proprietary databases because of its rich functionality, ecosystem extensions, and enterprise readiness. To make sure we support your workloads, we offer three PostgreSQL options.

First, AlloyDB for PostgreSQL is a PostgreSQL-compatible database, currently in Preview, that delivers the performance, availability, scale, and functionality needed to support commercial-grade workloads. In our performance tests, AlloyDB is more than 4x faster than standard PostgreSQL for transactional workloads. We're excited to announce a major expansion of the AlloyDB partner ecosystem, with more than 30 partner solutions to support business intelligence, analytics, data governance, observability, and system integration. We also recently announced that our Database Migration Service supports migration of PostgreSQL databases to AlloyDB, in Preview. This service helps you migrate to AlloyDB from any PostgreSQL database — whether it's on-premises, self-managed on Google Cloud, or on another cloud — in an easy-to-use, secure, and serverless manner, and with minimal downtime.

The second PostgreSQL offering is Cloud SQL for PostgreSQL, a fully managed, up-to-date version of PostgreSQL for easy lift-and-shift migrations or new application development. We support the most popular PostgreSQL extensions and over 100 database flags, and you get the same experience as open-source PostgreSQL, with the strong management, availability, and security of Cloud SQL. It's no surprise that Cloud SQL is used by more than 90% of the top 100 Google Cloud customers. New customers can get started with a Cloud SQL free trial.

Finally, Spanner, our globally distributed relational database with strong external consistency and up to 99.999% availability, offers a PostgreSQL interface that lets you take advantage of familiar tools and skills from the PostgreSQL ecosystem. We're continuing to prioritize PostgreSQL compatibility for Spanner, and are excited to announce a key milestone — the Spanner PostgreSQL interface now supports its first group of PostgreSQL ecosystem drivers, starting with Java (JDBC) and Go (pgx). This support can reduce the cost of migrating apps to Spanner using off-the-shelf PostgreSQL drivers your developers already use. And to further democratize access to Spanner, we recently announced free trial instances.

Infusing AI and machine learning across data-driven workflows

AI and machine learning (ML) are critical to data-driven transformations, helping you get more value from your data. Among their many benefits, AI and ML tools can help recognize patterns, enhance and improve operational capabilities with new insights, and create compelling customer experiences. Most companies face significant hurdles not only trying to build ML models, but also integrating them into applications without extensive coding and specialized AI/ML skills.
Harnessing AI and ML in workflows of all kinds should be easy, especially within your data platform. At Google Cloud, we've invested in AI and ML technologies both for database system optimizations that make our services more intelligent, and for AI and ML service integrations. For database system optimizations, capabilities such as Cloud SQL cost recommenders and AlloyDB autopilot make it easier for database administrators and DevOps teams to manage performance and capacity for large fleets of databases.

In addition to infusing AI and ML into our databases, we've been focused on providing integration with Vertex AI, Google Cloud's machine learning platform, to enable model inferencing directly within the database transaction. We're excited to announce, in Preview, the integration of Vertex AI with Spanner. You can now use a simple SQL statement in Spanner to call a machine learning model in Vertex AI. With this integration, AlloyDB and now Spanner allow data scientists to build models easily in Vertex AI and developers to access these models using the SQL query language. For example, retailers need to detect fraudulent transactions during the checkout process and take appropriate action. With the Vertex AI integration, you can simply call the fraud detection ML model in the Spanner query using a function like ML_PREDICT (see the sketch at the end of this article).

Predict fraudulent transactions in retail checkout process using Vertex AI integration

Empowering builders to be more productive

Building, testing, and deploying apps is cumbersome. Plus, even after an app is built, maintaining it requires regular monitoring, performance tuning, scaling, and security patching — all of which distract developers from strategic initiatives. As a result, organizations can be slow to innovate and may fall behind their competition. That's why we prioritize the developer experience and are excited to share the latest advancements we're making in Firestore, Cloud SQL, and Spanner.

Developers love Firestore because of how fast they can build an app end to end. More than 4 million databases have been created in Firestore, and Firestore applications power more than 1 billion monthly active end users using Firebase Auth. But what happens when the application grows? We want to ensure developers can focus on productivity, even when their apps are experiencing hyper-growth. To achieve this, we've made three updates to Firestore, all aimed at supporting growth and reducing costs. For applications using Firestore as a backend-as-a-service, we've removed the limits on write throughput and concurrent active connections. Now, if your app becomes an overnight success, you can be confident that Firestore will scale smoothly. Additionally, we're rolling out the COUNT() function in Preview next week, which gives you the ability to perform cost-efficient, scalable count aggregations. This capability supports use cases like counting the number of friends a user has, or determining the number of documents in a collection. Finally, to help you efficiently manage storage costs, we've introduced time-to-live (TTL), which enables you to pre-specify when documents should expire, and rely on Firestore to automatically delete expired documents.

We're also making advancements to security and performance in Cloud SQL and Spanner. Now it can be easier to detect, diagnose, and prevent database performance problems with Cloud SQL Query Insights for MySQL (also available for PostgreSQL).
We recently introduced PostgreSQL System Insights in Preview, and are excited to announce two additional types of Cloud SQL recommenders. Security recommenders continuously detect security vulnerabilities and check for risky security configurations, such as a public IP address with broad access or unencrypted connections. Performance recommenders, meanwhile, help identify and resolve common misconfigurations that increase the risk of performance degradation or downtime.

We recently launched Query Insights for Spanner, which provides pre-built dashboards for quickly diagnosing query performance issues. In addition, lock and transaction insights for Spanner (coming Q4 2022) will help troubleshoot lock contention issues on Spanner that can slow down applications. You can easily correlate row ranges, columns, and sample transactions contending for locks, and debug high-latency transactions using granular metrics.

Pre-built dashboards to troubleshoot high latencies due to lock contentions in Spanner.

Tap into new possibilities

The future of data has endless possibilities, and we're excited to partner with you to help accelerate your data-driven business transformation. Tune into Next '22 for more details on the announcements, and get inspired by learning how companies like MLB, PLAID, Forbes, DaVita, Credit Karma, and Box are innovating with Google Cloud databases.
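As promised above, here is a minimal Python sketch of what calling a Vertex AI model from a Spanner query could look like. Everything here is illustrative: the project, instance, database, model, and table names are hypothetical, and the exact ML.PREDICT syntax (including the model DDL it requires) should be taken from the Spanner Vertex AI integration documentation.

```python
from google.cloud import spanner

# Hypothetical project, instance, and database names.
client = spanner.Client(project="my-project")
database = client.instance("my-instance").database("my-database")

# Illustrative query shape: "fraud_model" stands in for a Vertex AI model
# registered with the database, and "transactions" for an application table.
sql = """
SELECT transaction_id, is_fraud
FROM ML.PREDICT(
    MODEL fraud_model,
    (SELECT transaction_id, amount, merchant FROM transactions)
)
"""

with database.snapshot() as snapshot:
    for transaction_id, is_fraud in snapshot.execute_sql(sql):
        print(transaction_id, is_fraud)
```

The appeal of this pattern is that inference happens where the transaction data already lives, so the checkout path does not need a separate round trip from the application to a model-serving endpoint.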

Advancing digital sovereignty on Europe's terms

In September 2021, we unveiled "Cloud. On Europe's Terms," an ambitious commitment to deliver cloud services that provide the highest levels of digital sovereignty while enabling the next wave of growth and transformation for European organizations. We've since seen increasing demand from customers and policymakers for digital sovereignty solutions. Working closely with our European customers, partners, policymakers, and governments, today at Google Cloud Next we're pleased to share that we've delivered and are continuing to develop a broad portfolio of Sovereign Solutions that can support European customers' current and emerging sovereignty needs as they progress their digital transformation.

Google Cloud Sovereign Solutions come from these efforts to understand evolving sovereignty requirements. We've heard from numerous customers that they would prefer to work with local partners in their transformation journey, and that's why we have established partnerships including T-Systems in Germany, S3NS in France, Minsait in Spain, and Telecom Italia in Italy. Sovereign Controls by T-Systems is now generally available, and Local Controls by S3NS, the Thales-Google partnership, is now available in Preview. You can expect more region and market announcements in the coming months.

Google Cloud Sovereign Solutions

Our Sovereign Solutions are designed to support data, operational, and software sovereignty requirements, increasing customer control and transparency for sensitive data moving to the cloud. For example, Sovereign Solutions can help support compliance with European regulations such as GDPR and legal rulings such as Schrems II. Google Cloud Sovereign Solutions comprise Sovereign Controls, which can help organizations more easily manage vital data sovereignty goals, as well as Supervised Cloud and Hosted Cloud options to help address operational and software sovereignty concerns.

The range of Google Cloud Sovereign Solutions

The market is already making valuable use of our offerings: in September, employee communication platform Haiilo said it would rely on the Sovereign Cloud from T-Systems and Google in the future. Oliver Queck, vice president at T-Systems International, said that Sovereign Solutions can help drive digital transformation. "T-Systems and Google Cloud are building and delivering sovereign cloud services for European enterprises from the public and business sectors. Our common goal: support all organizations in migrating their workloads to the cloud – through more innovation, flexibility, performance, and data security."

Let's look at each of the product offerings:

Sovereign Controls

Sovereign Controls for Google Cloud

Today, customers can meet many data sovereignty requirements using Google Cloud controls, delivered directly by Google Cloud through Assured Workloads for EU or through our local partners.
These Sovereign Controls can help organizations:

- Create and maintain workloads with data residency controls in Europe for core customer content at rest, with processes that help limit personnel access to core customer content to EU persons located in the EU;
- Maintain comprehensive visibility and control over administrative access to the data and workloads;
- Encrypt data with keys that they (or someone appointed by them) control and manage outside of Google's infrastructure through our Cloud External Key Manager.

At Google Cloud we firmly believe that the control of encryption keys is the strongest and most effective technical measure against extraterritorial requests for data that can be offered to cloud customers today. To achieve sufficient control, keys must be kept outside of the cloud provider's infrastructure and coupled to a strong key access justification mechanism (a short sketch of creating such an externally backed key appears at the end of this article).

Queck added that Sovereign Controls can help balance data management and control requirements with the drive to innovate. "With Sovereign Controls by T-Systems, we have developed a cloud solution that allows you to securely host your sensitive data and implement supplementary data protection measures that can help meet the requirements of European data protection authorities without losing on scalability or elasticity. In other words, you retain full control over your data, software, and operations, and still benefit from all the advantages of the Google Cloud – especially the innovation power," he said.

Sovereign Controls for Google Workspace

Customers' sovereignty requirements also extend to the digital tools they use to collaborate and communicate. We recently announced Sovereign Controls for Google Workspace, which will provide digital sovereignty capabilities for organizations to control, limit, and monitor transfers of data to and from the EU starting at the end of 2022, with additional capabilities delivered throughout 2023. This commitment builds on existing Client-side encryption, Data regions, and Access Controls capabilities in Workspace.

Supervised Cloud

Our forthcoming Supervised Cloud offerings will be managed and operated by partners to support data sovereignty and operational sovereignty needs for specialized and highly sensitive data. We are in the process of designing and building these offerings, aligned with local regulations in France, as well as strong customer needs in Germany, with our respective partners S3NS and T-Systems.

Cyprien Falque, managing director of S3NS, a Thales and Google Cloud partnership, spoke about plans for their forthcoming solution and ways for customers to begin their journey to the cloud now: "S3NS' mission is to help public and private organizations in France benefit from the power of Google Cloud while protecting their sensitive data in compliance with the criteria of the French 'Trusted Cloud'. Our Local Controls offering is a first step and a first milestone this year, before a future solution in compliance with the French 'Trusted Cloud' criteria, which we are working on in parallel. Our objective is to be among the first to make such an offering available for certification based on hyperscale cloud technology," Falque said.

Hosted Cloud

Finally, there are some customers and workloads that have a strict need to support disconnected operations. To help meet these software sovereignty requirements, we will offer Hosted Cloud services, which are part of Google Cloud's Distributed Cloud offerings.
Google Distributed Cloud Hosted does not require connectivity to Google Cloud at any time to manage infrastructure, services, APIs, or tooling.

LuxConnect is a Luxembourg-based IT provider whose mission is to strengthen the country's IT infrastructure and to increase international Internet connectivity in order to drive progress and innovation. LuxConnect CEO Paul Konsbruck said that he is looking forward to the availability of Sovereign Cloud solutions that can complement existing IT services, including Hosted Cloud. "Digital sovereignty is becoming more and more important to us and to our customers across Europe as we work to support their digital transformation initiatives. The ability to keep data within strict boundaries and maintain local operations further enables efforts in Luxembourg to serve, for example, as 'data embassies' – protecting vital data and services that are essential to the smooth running of a country on their behalf, thereby reducing the potential impact of cyberattacks," he said. "We are closely following the efforts of Google Cloud to develop and offer Hosted Cloud solutions that can complement our data center, connectivity, and HPC offerings."

Expanding the Sovereign Solutions ISV ecosystem

It's also clear from our discussions with cloud customers and partners that the ability to meet sovereignty requirements must include support for the applications they use to power their businesses and drive innovation. To this end, today we are thrilled to announce that more than 20 Independent Software Vendors (ISVs) from Europe and around the world have joined the new Google Cloud Ready – Sovereign Solutions program with the intent of bringing their products to Google Cloud Sovereign Solution environments. These partners include Aiven, Broadcom (Symantec), Cloud Software Group (Citrix), Climate Engine, Commvault, Confluent, Datadog, Dataiku, Dell Technologies, Elastic, Fortinet, GitLab, Iron Mountain, LumApps, MongoDB, NetApp, OpenText, Palo Alto Networks, Pega Systems, Siemens, SUSE, Thales, Thought Machine, Veeam, and VMware.

Learn more

We'll continue to listen to our customers and key stakeholders across Europe who are setting policy and helping shape requirements for customer control of data. Our goal is to make Google Cloud the best possible place for sustainable digital transformation for European organizations on their terms — but also for others around the world, and there is much more to come.
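As referenced in the Sovereign Controls section above, here is a minimal Python sketch of creating a Cloud KMS key whose material lives outside Google's infrastructure (the EXTERNAL protection level used with Cloud External Key Manager). The project, location, key ring, and key names are placeholders, and this only sketches the Google Cloud side; the external key manager itself and the key access justification policy are configured separately.

```python
from google.cloud import kms

client = kms.KeyManagementServiceClient()

# Placeholder project, location, and key ring.
key_ring = client.key_ring_path("my-project", "europe-west3", "my-key-ring")

# The EXTERNAL protection level tells Cloud KMS that key material is held
# in an external key manager rather than inside Google's infrastructure.
key = {
    "purpose": kms.CryptoKey.CryptoKeyPurpose.ENCRYPT_DECRYPT,
    "version_template": {
        "protection_level": kms.ProtectionLevel.EXTERNAL,
        "algorithm": kms.CryptoKeyVersion.CryptoKeyVersionAlgorithm.EXTERNAL_SYMMETRIC_ENCRYPTION,
    },
}

created = client.create_crypto_key(
    request={
        "parent": key_ring,
        "crypto_key_id": "my-ekm-key",
        "crypto_key": key,
        # Versions are created later, each pointing at a key URI held in
        # the external key manager.
        "skip_initial_version_creation": True,
    }
)
print("Created key:", created.name)
```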

Introducing the next evolution of Looker, your unified business intelligence platform

As consumers, we all benefit from unprecedented access to data in everything we do, from finding answers on the web to navigating a new city to picking the best place to eat dinner. But at work it's not that easy. Instead of having answers to questions at our fingertips, getting those answers is a costly IT project away — and when we get the answers, they only raise new questions that we then need to get back into the IT queue to answer.

Just as Google's mission is to organize the world's information and make it universally accessible and useful, Looker aims to do the same for your business data, making it easy for users to get insights, and for you to build insight-powered applications. That vision is our north star for business intelligence at Google Cloud, which is why we acquired Looker in 2020, and why we have big plans for the next few years.

Today, we are unifying our business intelligence product family under the Looker umbrella. Looker is the name you'll hear us use when talking about our Google Cloud business intelligence products, as we bring together Looker, Data Studio, and core Google technologies like artificial intelligence (AI) and machine learning (ML). And starting today, Data Studio is now Looker Studio. With this complete enterprise business intelligence suite, we will help you go beyond dashboards and infuse your workflows and applications with the intelligence needed to help make data-driven decisions.

Expanding the power and reach of Looker

Looker Studio helps make it easy for everyone to do self-service analytics. It currently supports more than 800 data sources with a catalog surpassing 600 connectors, making it simple to explore data from different sources with just a few clicks, and without ever needing IT. In addition to the functionality customers already know and love, Looker Studio plans to evolve to include a complete user interface for working with data modeled in Looker. As a first step on this journey, we are happy to announce that access to Looker data models from Looker Studio is available in Preview today. This capability allows customers to explore trusted data via the Looker modeling layer. For the first time, customers can easily combine both self-service analytics from ad hoc data sources and trusted data that has already been vetted and modeled in Looker.

We love what our users have accomplished with Looker Studio, from tracking NBA MVP Award votes and the location of the International Space Station, to breaking down the pumpkin spice economy and the video gaming industry. To support these vast and diverse use cases, we will continue to make Looker Studio available at no cost. At the same time, many of our customers require enterprise-focused features to use Looker Studio as part of their data stack while also addressing their governance and compliance requirements. To meet this customer demand, we are pleased to announce the availability of Looker Studio Pro. Customers who upgrade to Looker Studio Pro will get new enterprise management features, team collaboration capabilities, and SLAs. This is only the first release, and we've developed a roadmap of capabilities our enterprise customers have been asking for, starting with Dataplex integration for data lineage and metadata visibility.

When Looker joined Google Cloud, one of the primary goals was to bring business intelligence closer to core Google Cloud technologies. As a major step on that journey, we are pleased to announce Looker (Google Cloud core) in Preview today.
This new version of Looker will be available in the Google Cloud console and is deeply integrated with core cloud infrastructure services, such as key security and management services.

Open data means open business intelligence, too

Insights everywhere doesn't just mean everywhere in Looker. Our customers want an open data cloud that breaks down silos, whether that's analytics, business intelligence, or machine learning. Just as we've already done with our BigQuery data warehouse, we are committed to integrating Looker with as many Google and partner products as our customers need. For example, we are deeply integrating Looker and Google Workspace so insights are available in the familiar productivity tools you use every day. This will provide easy access, via spreadsheets and other documents, to consistent, trusted answers from curated data sources across your organization. Looker integration with Google Sheets is in Preview now, with plans for full release in the first half of 2023.

To continue to meet our customers where they are in their open data cloud journey, we are working to connect other popular business intelligence offerings as well. That connection can allow visualization tools, including Tableau and Power BI, to easily analyze trusted data from Looker. Additionally, Looker continues to be an open platform, and we are expanding our partnerships from the modeling layer right into the user experience. To demonstrate that, we are also pleased to announce a new partnership with Sisu Data. Often, your data will tell you when something unusual has happened, but finding out why can take hours or days, even for skilled data scientists. Our partnership with Sisu Data will help automate finding root causes 80% faster than traditional approaches. This deep integration will enable a smooth experience for more customers.

Real data, real intelligence, real impact

Our customers use Looker for both internal business intelligence and to create embedded data products. Mercado Libre, a retailer in Latin America with a geographically diverse workforce, uses Looker and BigQuery to enable accelerated fulfillment. In the first half of 2022 alone, they were able to deliver 79% of shipments in less than 48 hours. Another customer, Wpromote, is one of Adweek's fastest growing digital marketing agencies. They rebuilt the infrastructure they use to run every aspect of their business using Looker and BigQuery. "We wanted to build a system with no ceiling, where the only limitation would be our vision and our ability to execute," Wpromote Chief Technology Officer Paul Dumais said. "Everything that you can do from the Looker UI can be done via API." (A short sketch of that API access appears at the end of this article.)

So much of our everyday lives revolves around data, and that's especially true when it comes to the way we work. Data provides us with a wealth of knowledge, but without the right tools, it can be overwhelming, expensive, and slow to work with. The new, unified Looker product family helps you make the most of your data, delivering a better experience for your teams and, ultimately, a better experience for your customers — today and long into the future. Over 10 million users already access the Looker family of products, including Looker Studio, each month. Join them in making insights more accessible across your organization.

To learn more about our other advances and announcements in the data cloud, you can read more here on the Google Cloud Blog.
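To illustrate the kind of API access Wpromote describes, here is a minimal sketch using the official Looker SDK for Python. It assumes API credentials are already configured in a looker.ini file or environment variables, and the Look ID is a placeholder, not a value from this post.

```python
import looker_sdk

# Reads the base URL and API credentials from looker.ini or environment
# variables (LOOKERSDK_BASE_URL, LOOKERSDK_CLIENT_ID, ...).
sdk = looker_sdk.init40()

# "42" is a placeholder Look ID; any saved Look in your instance works.
results = sdk.run_look(look_id="42", result_format="json")
print(results)
```

Because queries run through the same modeling layer as the UI, a script like this returns the same governed numbers a dashboard would, which is what makes embedding Looker results in other systems practical.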

Developers – Build, learn, and grow your career faster with Google Cloud

To help developers build faster, learn faster, and grow their careers faster with Google Cloud, we are excited to announce three developer initiatives today: a new Google Cloud Skills Boost annual subscription with expanded developer benefits; no-cost #GoogleClout challenges; and the Google Cloud Fly Cup Challenge happening during Google Cloud Next, in partnership with the Drone Racing League (DRL).

Training made easy, wherever you are

A recent survey noted that the number of individuals learning to code online increased from 60% in 2020 to 70% in 2021. Last year, we were the first cloud provider to launch a learning subscription to help address the developer skills shortage. We committed to equipping 40 million people with cloud skills through Google Cloud Skills Boost, which provides learners with online courses, skills development, and certifications. Google Cloud Skills Boost is available for learners globally, and offers training resources in languages including Arabic, Bahasa, Brazilian Portuguese, Chinese, English, French, German, Hebrew, Italian, Japanese, Korean, Polish, Spanish, Turkish, and Ukrainian. With Google Cloud Skills Boost, our goal is to make it easier for learners to get started and grow their cloud skills.

Introducing the new Google Cloud Skills Boost annual subscription

To build on this work, today we're enhancing the Google Cloud Skills Boost annual subscription by introducing Innovators Plus – a new suite of developer benefits, available under the existing $299/year subscription*. Built for developers who are looking to accelerate their professional skills and business growth, this package provides extensive training and certification benefits including live learning events, Google Cloud credits, and certification exam vouchers. Other leading cloud providers limit offers to learning content and labs, or have no developer-targeted offering at all. Through this new Google Cloud Skills Boost annual subscription, developers get the added access, rewards, and recognition they've been asking for as they level up their cloud skills. The expanded Google Cloud Skills Boost annual subscription with Innovators Plus offers a comprehensive range of benefits, including:

- Access to 700+ hands-on labs, skill badges, and courses
- $500 in Google Cloud credits
- A Google Cloud certification exam voucher
- A bonus $500 in Google Cloud credits after the first certification earned each year
- Live learning events led by Google Cloud experts
- Quarterly technical briefings hosted by Google Cloud executives

New Google Cloud learning challenges to build skills

In addition to the Google Cloud Skills Boost annual subscription, we've introduced other new learning challenges to make learning about Google Cloud technologies fun and rewarding.

Take your cloud skills to new heights with the Google Cloud Fly Cup Challenge

We recently announced the Google Cloud Fly Cup Challenge, created in partnership with the Drone Racing League (DRL) to celebrate the new era of tech-driven sports. Using DRL race data and Google Cloud analytics tools like BigQuery and Cloud SQL, developers of any skill level will be able to predict race outcomes and provide tips to DRL pilots to help enhance their season performance. Participants will compete for a chance to win an all-expenses-paid trip to the season finale of the DRL World Championship Race and be crowned the champion on stage.
Register here to join the race to become the DRL champion today.

Flex your #GoogleClout and win the hottest book in cloud

Just in time for Next, developers can demonstrate their cloud knowledge against participants worldwide in new #GoogleClout challenges — no-cost, 20-minute micro-competitions. Here's how it works:

- Work on the six new challenges in the #GoogleClout game by October 13
- (Optional) Share your scores on social media using the #GoogleClout hashtag
- If you complete all six challenges, you'll receive a special badge on your Developer Profile, plus an electronic copy of Priyanka Vergadia's bestselling book, "Visualizing Google Cloud"

Grow your career faster with Google Cloud

The demand for qualified experts is huge. But what exactly does that mean for you? In Skillsoft's 2022 list of the top 15 highest-paying IT certifications, Google Cloud certifications rank among the top for the fourth year in a row. By building your technical expertise on Google Cloud, you can grow your skills and your career faster.

Ready to get started?

- Visit Google Cloud Skills Boost to start your annual subscription and unlock Innovators Plus benefits.
- Register for Next and discover more about the Google Cloud Fly Cup and #GoogleClout challenges in the Developer Zone.

* Subject to eligibility limitations. Innovators Plus requires you to use a Google Account and a Developer Profile. For customers in the EEA, the UK, and Switzerland, Innovators Plus is restricted to business or professional use.

Google Cloud Next: top AI and ML sessions

Google Cloud Next starts this week, and features over a dozen sessions dedicated to helping organizations innovate with machine learning (ML) and inject artificial intelligence (AI) into their workflows. Whether you're a data scientist looking for cutting-edge ML tools, a developer aiming to more easily build AI-powered apps, or a non-technical worker who wants to leverage AI for greater productivity, here are some can't-miss AI and ML sessions to add to your playlist.

Developing ML models faster and turning data into action

For data scientists and ML experts, we're offering a variety of sessions to help accelerate the training and deployment of models to production, as well as to bridge the gap between data and AI. Top sessions include:

ANA204: What's next for data analysts and data scientists
Join this session to learn how Google Cloud can help your organization turn data into action, including overviews of the latest announcements and best practices for BigQuery and Vertex AI.

ANA207: Move from raw data to ML faster with BigQuery and Vertex AI
What does the end-to-end journey from raw data to AI look like on Google Cloud? In this session, learn how Vertex AI can help you decrease time to production, track data lineage, catalog ML models for production, support governance, and more — including step-by-step instructions for integrating your data warehouse, modeling, and MLOps with BigQuery and Vertex AI.

ANA103: How to accelerate machine learning development with BigQuery ML
Google Cloud's BigQuery ML accelerates the data-to-AI journey by letting practitioners build and execute ML models using standard SQL queries. Join this session to learn about the latest BigQuery ML innovations and how to apply them.

Building AI-powered apps

Developers building AI into their apps will also find lots to love, including the following:

ANA206: Maximize content relevance and personalization at scale with large language models
Accurately classifying content at scale across domains and languages ranks among the most challenging natural language problems. Featuring Erik Bursch, Senior Vice President of Digital Consumer Product and Engineering at Gannett, this session explores how Google Cloud can help identify content for ad targeting, taxonomize product listings, serve the most relevant content, and generate actionable insights.

BLD104: Power new voice-enabled interfaces and applications with Google Cloud's speech solutions
Featuring Ryan Wheeler, Vice President of Machine Learning at Toyota Connected North America, this session dives into the ways organizations use Google's automatic speech recognition (ASR) and speech synthesis products to unlock new use cases and interfaces.

Applying AI to core business processes

Employees without technical expertise are innovating with AI and ML as well, infusing them into business processes so they can get more done. To learn more, be sure to check out these sessions:

ANA109: Increase the speed and inclusivity of global communications with Google's zero-code translation tools
An estimated 500 billion words are translated daily, but most translation processes for enterprises are manual, time-consuming, and expensive.
Join this session — featuring Murali Nathan, Senior Director, Digital Experience and Transformation at Avery Dennison — to find out how Google Cloud's AI-powered translation services are addressing these challenges, helping businesses drive more inclusive consumer experiences, save millions of dollars, and localize messages across the world in minutes.

ANA111: Improve document efficiency with AI: Make document workflows faster, simpler, and pain-free with AI
Google's Document AI family of solutions helps organizations capture data at scale by extracting structured data from unstructured documents, reducing processing costs and improving business speed and efficiency. Featuring Andreas Vollmer, Managing Director, Head of Document Lifecycle at Commerzbank, this session investigates how Google is expanding the capabilities of our Document AI suite to solve document workflow challenges.

ANA108: Delight customers in every interaction with Contact Center AI
Google Cloud Contact Center AI brings the power of AI to large contact centers, helping them to deliver world-class customer experiences while reducing costs. Join this session — featuring Stephen Chauvin, Business Technology Executive, Voice & Chat Automation/Contact Center Delivery at KeyBank — to learn about the newest Contact Center AI capabilities and how they can help your business.

Register for Next '22.

Google Cloud Next for security: 6 essential sessions

Zero Trust. Securing the software supply chain. Policy controls. And what about Mandiant? We have so many cool and interesting security sessions coming this week at Google Cloud Next, it can be hard to know where to start. Fortunately, we've got you covered with these six must-watch security sessions to add to your playlist:

1. SEC101: What's next for security professionals?
Kick off your Next security journey with Mandiant's Sandra Joyce, Charles Schwab's Bashar Abouseido, and our own Sunil Potti on Google Cloud's vision for security. Here you'll get a first look at many of our latest innovations, and learn from other security industry leaders how Google Cloud can help secure your digital transformation — and bring cloud-first security capabilities everywhere you operate.

2. SEC104: Google + Mandiant: Transforming Security Operations and Incident Response
Take a deeper dive into the role that the Mandiant team will play as they join Google Cloud. A leader in dynamic cyber defense, threat intelligence, and incident response services, Mandiant shares our cybersecurity vision to help organizations improve their threat, incident, and exposure management.

3. SEC300: How Goldman Sachs bolstered their security posture through policy management and controls
Learn how Goldman Sachs thinks about and approaches building a strong security posture using Google Cloud controls for defense in depth. This session will also cover how other customers use Google Cloud tools to help manage and enforce their policies.

4. DEI102: Women in security
In this panel led by Sri Subramanian, our Head of Product for Cloud Identity and Access Management, you'll get to meet and hear from women in security leadership roles at organizations working with Google Cloud. These leaders play a critical role in driving innovation and security in their organizations and across the industry.

5. SEC100: From dependencies to deployment: How to secure your software supply chain
Software supply chain security is vitally important to the future of… well, everything. In this session, learn more about how Google Cloud can help secure your software supply chain with best practices and a comprehensive yet modular toolset covering dev, supply, CI/CD, and runtime protection.

6. MOD107: Secure your cloud infrastructure, the Google way
Google secures some of the largest web apps and services on earth. We've learned a few things along the way about how to secure cloud infrastructure — compute, data, and network. This session will focus on examples of how to better protect your infrastructure services from cyber threats with controls that are designed in and at your disposal, without compromising performance.

To explore the full catalog of breakout sessions and labs designed for security professionals, check out the Secure track in the Next '22 catalog.

Know before you go: Google Cloud Next

Google Cloud Next is one day away. Don't miss out on the latest news, product announcements, and predictions from Google that will shape the cloud of tomorrow. Sign up now if you haven't registered, so you can:

- Join the Opening Keynote with Google and Alphabet CEO Sundar Pichai and Google Cloud CEO Thomas Kurian tomorrow at 9 AM PDT – kicking off 24 hours of live broadcasts from New York, Sunnyvale, Tokyo, Bengaluru, and Munich.
- Browse the Google Cloud Next session catalog for curated content by track: Build for application developers; Analyze for data analysts and scientists; Design for data engineers; Modernize for enterprise architects and developers; Operate for DevOps, sysadmins, and operations; Secure for security professionals; Collaborate for business leaders and IT administrators; and Innovate for executive and technology business leaders. Once you're registered, you can create your playlists with live broadcasts and 125 on-demand sessions. You can also check out Curated by Google playlists to discover which sessions some of today's brightest minds are excited to attend.
- View Innovators Hive at Google Cloud Next, the event experience designed for developers and technical practitioners, on demand on Day 3 from Sunnyvale, Munich, Bengaluru, and Tokyo, where it is presented live in Japanese.
- Get ready to unlock Google Cloud training and certification offerings and other exciting opportunities that will be announced at Next '22.

The cloud event of the year starts tomorrow, and it's going to be big. Register now.

Streamline your models to production with the Vertex AI Model Registry

Machine learning (ML) is iterative in nature — model improvement is a necessity to drive the best business outcomes. Yet, with the proliferation of model artifacts, it can be difficult to ensure that only the best models make it into production. Data science teams may get access to new training data, expand the scope of use cases, implement better model architectures, or simply make adjustments as the world around your models is constantly changing. All of these scenarios require building new versions of models to be released into production. And with the addition of new versions, it matters to be able to manage, compare, and organize them. Moreover, without a central place to manage your models at scale, it's difficult to govern model deployment with appropriate gates on release and maintenance, in compliance with industry standards and regulations.

To address these challenges, today we are excited to announce the general availability (GA) of the Vertex AI Model Registry.

Fig. 1 – Vertex AI Model Registry – Landing page

With the Vertex AI Model Registry, you have a central place to manage and govern the deployment of all of your models, including BigQuery, AutoML, and custom models. You can use the Vertex AI Model Registry at no charge. The only cost that occurs when using the registry is if you deploy any of your models to endpoints or if you run a batch prediction. Vertex AI Model Registry offers key benefits to build a streamlined MLOps process:

- Version control and ML metadata tracking to guarantee reproducibility across different model versions over time.
- Integrated model evaluation to validate and understand new models using evaluation and explainability metrics.
- Simplified model validation to enhance model release.
- Easy deployment to streamline models to production.
- Unified model reporting to ensure model performance.

Version control and ML metadata tracking to guarantee model reproducibility

Vertex AI Model Registry allows you to simplify model versioning and track all model metadata to guarantee reproducibility over time. With the Vertex AI SDK, you can register custom models, all AutoML models (text, tabular, image, and video), and BQML models (a short SDK sketch appears below). You can also register models that you trained outside of Vertex AI by importing them into the registry.

Fig. 2 – Vertex AI Model Registry – Versioning view

In Vertex AI Model Registry, you can organize, label, evaluate, and version models. The registry gives you a wealth of model information at your fingertips, such as model version description, model type, and model deployment status. You can also associate additional information, such as the team who built a particular version or the application the model is serving. In the end, you can get a single picture of your models and all of their versions using the Model Registry console. You can drill down and get all the information about a specific model and its associated versions, so you can guarantee reproducibility across different model versions over time.

Integrated model evaluation to ensure model quality

Thanks to the integration with the new Vertex AI Model Evaluation service, you can now validate and understand your model versions using evaluation and explainability metrics. This integration allows you to quickly identify the best model version and audit the quality of the model before deploying it in production. For each model version, the Vertex AI Model Registry console shows classification, regression, and forecasting metrics depending on the type of model.

Fig. 3 – Vertex AI Model Registry – Model Evaluation view
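As referenced above, here is a minimal sketch of registering and deploying a model version with the Vertex AI SDK for Python. The project, bucket, display name, and serving container are placeholders rather than values from this post, and a real pipeline would typically set `parent_model` to add a version to an existing registry entry.

```python
from google.cloud import aiplatform

# Placeholder project and region.
aiplatform.init(project="my-project", location="us-central1")

# Register a model (or a new version of one) in the Model Registry.
model = aiplatform.Model.upload(
    display_name="fraud-classifier",       # placeholder name
    artifact_uri="gs://my-bucket/model/",  # placeholder artifact location
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-0:latest"
    ),
    version_aliases=["staging"],  # custom alias alongside the automatic default
)
print(model.resource_name, model.version_id)

# Deploy the registered version to an endpoint for online predictions.
endpoint = model.deploy(machine_type="n1-standard-4")
prediction = endpoint.predict(instances=[[0.1, 0.2, 0.3]])
print(prediction.predictions)
```

Downstream automation can then resolve a model by alias rather than by hard-coded version number, which is exactly the validation pattern the next section describes.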
Simplified model validation to improve model release

In an MLOps environment, automation is critical for ensuring that the correct model version is used consistently across all downstream systems. As you scale your deployments and expand the scope of your use cases, your team needs solid infrastructure for flagging that a particular model version is ready for production.

In Vertex AI Model Registry, aliases are uniquely named references to a specific model version. When you register a new model, the first version is automatically assigned the default alias. You can then create and assign custom aliases depending on how you organize your model lifecycle, for example to mark the stage of the review process (not started, in progress, under review, approved) or the status of the model lifecycle (experimental, staging, or production).

Fig. 4 – Vertex AI Model Registry – Aliases view

In this way, the Model Registry simplifies the entire model validation process by making it easy for downstream services, such as model deployment pipelines or model serving infrastructure, to automatically fetch the right model.

Easy deployment to streamline models to production

After a model has been trained, registered, and validated, it is ready to be deployed. With Vertex AI Model Registry, you can easily move all of your models (BigQuery models included) into production with point-and-click model deployment, thanks to the integration with Vertex AI Endpoints and Vertex AI Batch Predictions. In the Vertex AI Model Registry console, you select the approved model version, define the endpoint, specify the model deployment and model monitoring settings, and deploy. Once the model has been successfully deployed, its status is automatically updated in the models view and it is ready to generate both online and batch predictions.

Fig. 5 – Vertex AI Model Registry – Model Deployment

Unified model reporting to ensure model performance

A deployed model keeps performing well as long as the input data remains similar to the training data. Realistically, though, data changes over time and model performance degrades, which is why model retraining is so important. Typically, models are retrained at regular intervals, but ideally they should be continuously evaluated on new data before any retraining decision is made. With the integration of Vertex AI Model Evaluation, now in Preview, you define a test dataset and an evaluation configuration as inputs after you deploy your model; in turn, the service returns model performance and fairness metrics directly in the Vertex AI Model Registry console. Looking at those metrics, you can determine when the model needs to be retrained based on the data you record in production. These are important capabilities for model governance, ensuring that only the freshest, most accurate models are used to drive your business forward.

Fig. 6 – Vertex AI Model Registry – Model Evaluation comparison view
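Putting the alias and deployment pieces together, the following minimal sketch shows how a downstream service might resolve whichever version currently holds a "production" alias and deploy it for online predictions. The model ID, alias, machine type, and input instance are hypothetical placeholders.

```python
# A minimal sketch of fetching a model version by alias and deploying it.
# Model ID, alias, and endpoint settings are hypothetical placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

# "MODEL_ID@alias" resolves to whichever version currently holds the
# alias, so downstream code never hardcodes a version number.
model = aiplatform.Model("1234567890@production")

endpoint = model.deploy(
    machine_type="n1-standard-4",
    min_replica_count=1,
    max_replica_count=2,
)

# The deployed version can now serve online predictions.
prediction = endpoint.predict(instances=[[0.2, 1.5, 3.1]])
print(prediction.predictions)
```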
Conclusion

The Vertex AI Model Registry is a step forward for model management in Vertex AI. It provides a seamless user interface that shows all of the models that matter most to you, free of charge, with at-a-glance metadata to help you make business decisions. In addition to a central repository where you can manage the lifecycle of your ML models, it introduces new ways to work with models you've trained outside of Vertex AI, like your BQML models. It also provides model comparison functionality via the integration with our Model Evaluation service, which makes it easy to ensure that only the best and freshest models are deployed. Additionally, this one-stop view improves governance and communication across all stakeholders involved in the model training and deployment process. With all these benefits, you can confidently move your best models to production faster.

Want to learn more?

To learn more about the Vertex AI Model Registry, please visit our other resources:
- Vertex AI Model Registry Documentation
- BQML Model Registry Documentation
- Vertex AI Model Evaluation Documentation

Want to dive right in? Check out some of our Notebooks, where you can get hands-on practice:
- Get started with Vertex AI Model Registry
- Get started with Model Governance with Vertex AI Model Registry
- Deploy BigQuery ML Model on Vertex AI Model Registry and Make Predictions
- Get started with Vertex AI Model Evaluation

Special thanks to Ethan Bao, Shangjie Chen, Marton Balint, Phani Kolli, Andrew Ferlitch, Katie O'Leary, and the entire Vertex AI Model Registry team for their support and great feedback.
Source: Google Cloud Platform

Built with BigQuery: How Tinyclues and Google Cloud deliver the CDP capabilities that marketers need

Editor's note: This post is part of a series highlighting our awesome partners, and their solutions, that are Built with BigQuery.

What are Customer Data Platforms (CDPs) and why do we need them?

Today, customers use a wide array of devices when interacting with a brand. As an example, think about the last time you bought a shirt. You may start with a search on your phone as you take the subway to work, narrowing down the type of shirt during that 20-minute ride. Later, on your lunch break, you spend a few more minutes refining your search on your work laptop and find two shirt models of interest. Pressed for time, you add both to your shopping cart at an online retailer to review later. Finally, back home checking your physical mail, you stumble across a sales advertisement for the type of shirt you are looking for, available at your local brick-and-mortar store. The next day you visit that store during your lunch break and purchase the shirt. Many marketers face the challenge of creating a consistent 360-degree customer view that captures a customer lifecycle like the one above, including the online/offline journey and interactions with multiple data points across multiple data sources.

The evolution of customer data management reached a turning point in the late '90s with CRM software that sought to match current and potential customers with their interactions. Later, as a backbone of data-driven marketing, Data Management Platforms (DMPs) expanded the reach of data management to include second- and third-party datasets, including anonymous IDs. A Customer Data Platform (CDP) combines these two types of systems, creating a unified, persistent customer view across channels (mobile, web, etc.) that provides data visibility and granularity at the individual level.

A new approach to empowering marketing heroes

Tinyclues is a company that specializes in empowering marketers to drive sustainable engagement from their customers and generate additional revenue, without damaging customer equity. The company was founded in 2010 on a simple hunch: B2C marketing databases contain sufficient amounts of implicit information (data unrelated to explicit actions) to transform the way marketers interact with customers, and a new class of algorithms based on deep learning (sophisticated machine learning that mimics the way humans learn) holds the power to unlock this data's potential. Where other players in the space have historically relied, and continue to rely, on a handful of explicit past behaviors and more than a handful of assumptions, Tinyclues' predictive engine uses all of the customer data that marketers have available to formulate deeply precise models, down even to the SKU level. Tinyclues' algorithms are designed to detect changes in consumption patterns in real time and adapt predictions accordingly.

This technology allows marketers to find precisely the right audiences for any offer during any timeframe, increasing engagement with those offers and, ultimately, revenue. Marketers can also increase campaign volume while decreasing customer fatigue and opt-outs, knowing that audiences receive only the most relevant messages.
Tinyclues' technology also reduces the time spent building and planning campaigns by upwards of 80%, as valuable internal resources can be diverted away from manual audience-building. Google Cloud's data platform, spearheaded by BigQuery, provides a serverless, highly scalable, and cost-effective foundation on which to build this next generation of CDPs.

Tinyclues architecture

To enable this scalable solution for clients, Tinyclues receives purchase and interaction logs from clients, in addition to product and user tables. In most cases, this data is already in the client's BigQuery instance, in which case it can easily be shared with Tinyclues using BigQuery authorized views. In cases where the data is not in BigQuery, flat files are sent to Tinyclues via Google Cloud Storage (GCS) and ingested into the client's dataset via a lightweight Cloud Function. The orchestration of all pipelines is implemented with Cloud Composer (Google's managed Airflow). Data transformation is accomplished with simple SELECT statements in dbt (data build tool), wrapped inside an Airflow DAG that powers all data normalization and transformation (a minimal sketch of this pattern appears at the end of this section). Several other DAGs provide additional functionality, including:

- Indexing the product catalog on Elastic Cloud (the managed Elasticsearch service) on GCP to provide auto-complete search capabilities to Tinyclues' clients
- Exporting Tinyclues-powered audiences to the clients' activation channels, whether they use SFMC, Braze, Adobe, GMP, or Meta

Tinyclues AI/ML pipeline powered by Google Vertex AI

Tinyclues' ML training pipelines are used to train the models that calculate propensity scores. They are composed using Airflow DAGs and powered by TensorFlow and Vertex AI Pipelines. BigQuery is used natively, without data movement, to perform as much feature engineering as possible in place. Tinyclues uses the TFX library to run ML pipelines in Vertex AI, building on TensorFlow as its main deep learning framework of choice thanks to its maturity, open-source ecosystem, scalability, and support for complex data structures (ragged and sparse tensors). The training pipeline modularizes and standardizes functionality into easily manageable building blocks. These blocks are composed of TFX components: Tinyclues reuses most of the standard components and customizes others, such as a proprietary implementation of the Evaluator that computes both the standard ML metrics and business metrics like clicker overlap. The individual components are chained with the pipeline DSL to form a pipeline that is modular and easily orchestrated or updated as needed.

With the trained TensorFlow models available in GCS, Tinyclues exposes them in BigQuery ML (BQML) to enable its clients to score millions of users for their propensity to buy X or Y within minutes. This would not be possible without the power of BigQuery, which also frees Tinyclues from previously experienced scalability issues. As an illustration, Tinyclues needs to score thousands of topics across millions of users. This used to take north of 20 hours on its previous stack; it now takes less than 20 minutes, thanks to optimization work in its custom algorithm and the sheer power of BigQuery to scale to any workload.
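As a rough illustration of the orchestration pattern described above, here is a minimal sketch of an Airflow DAG that wraps dbt transformations. The DAG ID, paths, and schedule are hypothetical placeholders, not Tinyclues' actual code.

```python
# A minimal sketch of wrapping dbt runs in an Airflow DAG, in the spirit
# of the Cloud Composer setup described above. All names and paths are
# hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/home/airflow/gcs/dags/dbt"  # hypothetical project location

with DAG(
    dag_id="normalize_client_data",
    start_date=datetime(2022, 10, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Run the dbt models that normalize raw purchase and interaction
    # logs into the standardized tables the ML pipelines consume.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"dbt run --project-dir {DBT_DIR} --profiles-dir {DBT_DIR}",
    )

    # Validate the transformed tables before downstream DAGs use them.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"dbt test --project-dir {DBT_DIR} --profiles-dir {DBT_DIR}",
    )

    dbt_run >> dbt_test
```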
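To illustrate the BQML scoring path, the following minimal sketch imports a trained TensorFlow SavedModel into BigQuery ML and scores users in place with ML.PREDICT. The dataset, table, GCS path, and output column name are hypothetical assumptions.

```python
# A minimal sketch of exposing a trained TensorFlow SavedModel in BQML
# and scoring users without moving data out of BigQuery. Dataset, table,
# path, and output column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# Import the SavedModel from GCS into BigQuery ML.
client.query(
    """
    CREATE OR REPLACE MODEL `my-project.ml.propensity_model`
    OPTIONS (MODEL_TYPE = 'TENSORFLOW',
             MODEL_PATH = 'gs://my-bucket/models/propensity/*')
    """
).result()

# Score users in place with ML.PREDICT; the output column name follows
# the model's output tensor signature (hypothetical here).
rows = client.query(
    """
    SELECT user_id, propensity
    FROM ML.PREDICT(
        MODEL `my-project.ml.propensity_model`,
        (SELECT * FROM `my-project.ml.user_features`))
    """
).result()

for row in rows:
    print(row.user_id, row.propensity)
```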
Data gravity: breaking the paradigm by bringing the model to your data

BQML enables Tinyclues to call pre-trained TensorFlow models within a SQL environment, avoiding the export of data in and out of BigQuery by using BigQuery's already-provisioned serverless processing power. Using BQML removes the layers between the models and the data warehouse and allows the entire inference pipeline to be expressed as a series of SQL queries. Tinyclues no longer has to export data to load it into its models; instead, it brings the models to the data.

Avoiding the export of data in and out of BigQuery, as well as the provisioning and startup of machines, saves significant time. As an example, exporting an 11M-line campaign for a large client previously took 15 minutes or more to process. Deployed on BQML, it now takes minutes, with more than half of the processing time attributable to network transfers to the client's systems.

[Figure: Inference times in BQML compared to Tinyclues' legacy stack]

As the comparison shows, this BQML-enabled approach reduces the number of steps, leading to a 50% decrease in overall inference time and improving every step of the prediction.

The proof is in the pudding

Tinyclues has consistently delivered on its promises of increased autonomy for CRM teams, rapid audience building, superior performance against in-house segmentation, identification of untapped messaging and revenue opportunities, fatigue management, and more, working with partners like Tiffany & Co., Rakuten, and Samsung, among many others.

Conclusion

Google's data cloud provides a complete platform for building data-driven applications like the headless CDP solution developed by Tinyclues: from simplified data ingestion, processing, and storage to powerful analytics, AI, ML, and data-sharing capabilities, all integrated with the open, secure, and sustainable Google Cloud platform. With a diverse partner ecosystem, open-source tools, and APIs, Google Cloud can provide technology companies the portability and differentiators they need to serve the next generation of marketing customers.

To learn more about Tinyclues on Google Cloud, visit Tinyclues. Click here to learn more about Google Cloud's Built with BigQuery initiative. We thank the many Google Cloud team members who contributed to this ongoing data platform collaboration and review, especially Dr. Ali Arsanjani in Partner Engineering.
Source: Google Cloud Platform