How to use a machine learning model from a Google Sheet using BigQuery ML

Spreadsheets are everywhere! They are one of the most useful productivity tools available. They make organizing, calculating, and presenting data a breeze. Google Sheets is the spreadsheet application included in Google Workspace, which has over 2 billion users. Machine learning, or ML for short, has also become an essential business tool. Making predictions with data at low cost and high accuracy has transformed industries. The adoption of machine learning in business is estimated to be growing at over 40% a year.Doesn’t it make sense to bring the power of machine learning to all the data out there in spreadsheets? I definitely think so! Now we have the tools to make this happen. Let’s take a look in this blog post.The Big PictureBigQuery ML, built into BigQuery, enables users to create machine learning models using standard SQL queries. In this blog post, we’ll discuss how to create a time series forecasting model with BigQuery ML. The input for a time series model is a historical sequence of values, and the output is a sequence of future predicted values.I’ve picked this particular type of model because time series data is very common in spreadsheets. You can imagine a variety of scenarios that might have an ordered set of dates along with numeric values: sales, staffing, operational metrics, etc. For an in-depth look at a demand forecasting reference pattern using BigQuery ML, I recommend this blog post.What you’ll learn in this blog post will apply to any type of machine learning model. BigQuery ML supports a wide variety of model types, from neural networks, AutoML models, or even TensorFlow models. You don’t need to know how the model is built–you just need to bring your data and choose the appropriate options. BigQuery ML will build and host the model for you.To build the model, you need training data to learn patterns from. Fortunately, that data is right in your spreadsheet! 
If you use Connected Sheets, you can even access massive amounts of data directly from BigQuery, ensuring that you’re working with the most recent, secured data at all times.

To execute BigQuery ML queries from Sheets, we’ll use Apps Script, a cloud-based JavaScript platform for extending Google Workspace. The Apps Script code will extract input data from the spreadsheet, execute BigQuery ML queries for training and predicting, and update the spreadsheet with predictions from the model.

The Problem

As an example scenario, let’s forecast visits to an e-commerce site using Google Analytics data from BigQuery Public Datasets. The dataset consists of 12 months of traffic, content, and transaction data. Let’s look at a chart of hourly website visits:

This dataset contains some complex patterns that the robust forecasting capabilities in BigQuery ML can handle well. For example, we see a repeating pattern over each day, as well as over each week (daily and weekly seasonality). Also, there are some spikes that could potentially throw off a forecasting algorithm, but BigQuery ML provides automatic outlier detection to manage these events.

Using Data from BigQuery

Our example code will work with your data, wherever it comes from. If you do have access to business data in BigQuery, Connected Sheets is the recommended way to access it from Sheets. Let’s take a look.

From the Data menu, you’d choose to Connect to BigQuery. Select the project, dataset, and table, and your spreadsheet is now connected!

Connected Sheets allows you to unlock big data insights with features you’re already familiar with in Sheets, such as pivot tables, charts, and formulas, that can be automatically refreshed with new data from BigQuery. For our scenario, we’d want to create a model trained with a snapshot from a certain point in time.
That’s as easy as selecting the data and then selecting Extract.

Using a BigQuery ML Model from Sheets

Let’s now look into how we can execute BigQuery ML commands from the Sheets user interface. We’ll walk through a code sample from the Google Workspace ml-integration-samples to show how this can be done.

By opening up the Script Editor from your sheet (Tools > Script Editor), you’ll be able to write your Apps Script code and configure integrations. This sample enables two Advanced Services, BigQuery and Sheets, to support the use of these APIs in the script.

Let’s first look at the menu that’s added to Sheets, and see how we linked it to code in the project. The onOpen() function is triggered when the spreadsheet is opened. You can see that it creates several menu items, each linked to a function. For example, the “Train” menu item will invoke the train() function when it is selected.

Model Training

Let’s now explore how the training function works with any custom data provided. The user selects a range of data in the sheet, each row containing a date and the value to forecast. The training code extracts these values from the Range object. The rows are then inserted into a temporary BigQuery table. Finally, a query is executed to create a time series model with the data in the table.

Let’s now look at an example query that could be generated by the training code. It specifies a model type of “ARIMA” (for time series) and the timestamp and data columns from the temporary table. Additional options are available in BigQuery ML, such as holidays, but they are not included in the sample.

Forecasting

To make a forecast, you select a number of rows in the spreadsheet, and then select Forecast in the menu.
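The real code lives in the ml-integration-samples repository; as a rough sketch of the idea, the training and forecast queries might be assembled like this. Note that the function, dataset, table, and column names below are hypothetical, not taken from the actual sample.

```javascript
// Hypothetical sketch of query assembly, in the spirit of the Apps Script
// sample. All identifiers here are made up for illustration.

// Build a CREATE MODEL statement for a time series ("ARIMA") model over a
// temporary table holding the (timestamp, value) rows from the sheet.
function buildTrainingQuery(project, dataset, model, table) {
  const src = "`" + project + "." + dataset + "." + table + "`";
  const dst = "`" + project + "." + dataset + "." + model + "`";
  return [
    "CREATE OR REPLACE MODEL " + dst,
    "OPTIONS(model_type='ARIMA',",
    "        time_series_timestamp_col='ts',",
    "        time_series_data_col='data') AS",
    "SELECT ts, data FROM " + src,
  ].join("\n");
}

// Build an ML.FORECAST query; the horizon would be the number of rows the
// user selected in the sheet.
function buildForecastQuery(project, dataset, model, horizon) {
  const m = "`" + project + "." + dataset + "." + model + "`";
  return [
    "SELECT FORMAT_TIMESTAMP('%F %T %Z', forecast_timestamp) AS ts,",
    "       forecast_value",
    "FROM ML.FORECAST(MODEL " + m + ", STRUCT(" + horizon + " AS horizon))",
  ].join("\n");
}
```

In the actual sample, the generated SQL is executed through the BigQuery Advanced Service and the results are written back to the sheet.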
The script will invoke the ML.FORECAST() function on the model, specifying the horizon as the number of rows selected. Two fields are extracted from the forecast for each time period:

- The forecast timestamp, formatted as a string with the date, time, and time zone.
- The forecast value for that timestamp.

The query might look like this, assuming 3 rows are selected:

Now, let’s look at an actual forecast! Here, we see the last week of data in blue, along with a forecast for the last 3 days in red.

Putting it all together

By combining Sheets with BigQuery ML, you can bring the power of machine learning to any data in your spreadsheet. You just need enough high-quality data to extract meaningful patterns from.

The code sample shows how the training process can work with all types of data, as long as it has a date and a numeric value. Depending on your needs, the sample could be modified to work with different model types, use different options, or work with different spreadsheet layouts. Then, business users can simply use the menu to build new models and forecast with them.

A template sheet, pre-installed with the code sample, can help you get started. You can also follow the instructions in the code sample to install it yourself. If you want to get more hands-on experience with BigQuery ML, I recommend this introductory codelab. With Sheets and BigQuery ML, I hope you are able to solve more problems than ever before!

Related Article: How to build demand forecasting models with BigQuery ML. With BigQuery ML, you can train and deploy machine learning models using SQL. With the fully managed, scalable infrastructure of BigQuery…
Source: Google Cloud Platform

Introducing GKE Autopilot: a revolution in managed Kubernetes

In the years since Google invented Kubernetes, it has completely revolutionized IT operations, becoming the de facto standard for organizations looking for advanced container orchestration. Organizations that need the highest levels of reliability, security, and scalability for their applications choose Google Kubernetes Engine (GKE). In the second quarter of 2020 alone, more than 100,000 companies used our application modernization platforms and services, including GKE, to build and run their applications. Until now, though, Kubernetes has still involved a fair bit of manual assembly and tinkering to optimize it for your needs. Today, we’re introducing GKE Autopilot, a revolutionary mode of operations for managed Kubernetes that lets you focus on your software, while GKE Autopilot manages the infrastructure.

For many businesses, the flexibility and power that Kubernetes and GKE offer is ideal, giving them a high level of control over most aspects of their cluster configurations. For others, though, this level of control and choice can be overwhelming or unnecessary for their workloads’ requirements; they just want a simple way to build a more secure and consistent development platform. Autopilot can help, allowing businesses to embrace Kubernetes while simplifying operations by managing the cluster infrastructure, control plane, and nodes.

With its optimized, ready-for-production cluster, Autopilot offers a strong security posture and an ops-friendly configuration, reducing the need to learn the nitty-gritty details of cluster configuration. By managing the cluster infrastructure, Autopilot also helps reduce Day-2 operational and maintenance costs, while improving resource utilization. Autopilot is a hands-off, fully managed Kubernetes experience that allows you to focus more on your workloads and less on managing cluster infrastructure.
One GKE, two modes of operation

With the launch of Autopilot, GKE users can now choose from two different modes of operation, each with its own level of control over their GKE clusters and the relative responsibilities related to GKE. GKE already offers an industry-leading level of automation that makes setting up and operating a Kubernetes cluster easier and more cost effective than do-it-yourself and other managed offerings; Autopilot represents a significant leap forward. In addition to the fully managed control plane that GKE has always provided, the Autopilot mode of operation automatically applies industry best practices and can eliminate all node management operations, maximizing your cluster efficiency and helping to provide a stronger security posture.

GKE Autopilot

GKE has always been about simplifying Kubernetes, while still giving you control. Perhaps you still want to customize your Kubernetes cluster configurations or manually provision and manage the cluster’s node infrastructure. If so, you can continue to use GKE with the current mode of operation, referred to as Standard, which provides the same configuration flexibility that GKE offers today.

GKE Standard

Leave the management to GKE

Early access customers have found that choosing Autopilot has the potential to dramatically improve the performance, security, and resilience of their Kubernetes environments, while reducing the overall operational load required for managing Autopilot clusters. Here are some of the benefits they are excited about.

Optimize for production like a Kubernetes expert

With Autopilot, GKE creates clusters based on battle-tested and hardened best practices learned from Google SRE and engineering experience. These optimized configurations are ready for production, helping reduce the GKE learning curve. GKE also automatically provisions cluster infrastructure based on your workload specifications and can take care of managing and maintaining the node infrastructure.
“Reducing the complexity while getting the most out of Kubernetes is key for us and GKE Autopilot does exactly that!” – Mario Kleinsasser, team leader at Strabag International

Enjoy a stronger security posture from the get-go

GKE already does a lot to help secure your cluster, from hardening the lowest level of hardware through the virtualization, operating system, Kubernetes, and container layers. With Autopilot, GKE helps secure the cluster infrastructure based on years of experience running the GKE fleet. Autopilot implements GKE hardening guidelines and security best practices, utilizing unique GCP security features like Shielded GKE Nodes and Workload Identity. In addition, Autopilot blocks certain features deemed less safe, such as External IP Services and legacy authorization, disabling CAP_NET_RAW and restricting the use of specific cipher suites. By locking down individual Kubernetes nodes, Autopilot further reduces the cluster’s attack surface and minimizes ongoing security configuration mistakes.

Use Google as your SRE for both nodes and the control plane

Google SRE already handles cluster management for GKE; with Autopilot, Google SREs manage your nodes as well, including provisioning, maintenance, and lifecycle management. Because Autopilot nodes are locked down, sysadmin-level modifications that could make nodes unsupportable are prevented. Autopilot also supports maintenance windows and pod disruption budgets for maintenance flexibility. In addition to GKE’s SLA on hosts and the control plane, Autopilot also includes an SLA on Pods, a first.

“GKE Autopilot is the real serverless K8s platform that we’ve been waiting for. Developers can focus on their workloads, and leave the management of underlying infrastructure to Google SREs.” – Boris Simandoff, VP Engineering at Via Transportation, Inc.
Pay for the optimized resources you use

With Autopilot, we provision and scale the underlying compute infrastructure based on your workload specifications and dynamic load, helping to provide highly efficient resource optimization. Autopilot dynamically adjusts compute resources, so there’s no need to figure out what size and shape of nodes you should configure for your workloads. With Autopilot, you pay only for the pods you use, and you’re billed per second for vCPU, memory, and disk resource requests. No more worries about unused capacity!

Welcoming the GKE partner ecosystem

We designed Autopilot to be broadly compatible with how GKE has always worked, as well as with partner solutions. Out of the gate, Autopilot supports logging and monitoring from Datadog and CI/CD from GitLab. Both work just as they do in GKE today; no need to configure things differently or use sidecars. Our goal is full partner compatibility, and many more integrations are expected in the coming months.

Join the Kubernetes revolution

We’re proud of the dramatic efficiency that GKE brings to running complex, distributed applications, and GKE Autopilot represents the next big leap forward in terms of management and operations. Autopilot is generally available today[1]; we encourage you to see the difference that it brings to your Kubernetes environment. Get started today with the free tier. To learn more about GKE Autopilot, tune into this week’s episode of the Kubernetes Podcast with GKE Autopilot Product Manager Yochay Kiriaty.

Save the date: the Build the future with Google Kubernetes Engine online event is on March 11th. Join us to learn what’s new in the world of containers and Kubernetes at Google Cloud, get access to exclusive demos, and hear from experts. See you there!

1. You can currently access Autopilot from the command line interface, and we are gradually rolling it out to the Google Cloud Console for all GCP regions.
If you don’t see the Autopilot option in Cloud Console yet, use the CLI or try again later.

Related Article: Looking ahead as GKE, the original managed Kubernetes, turns 5. Happy birthday, GKE. As we look ahead, we wanted to share five ways we’re continuing our work to make GKE the best place to run Kubernetes.
Source: Google Cloud Platform

A look at the new Google Cloud Marketplace Private Catalog, now with Terraform support

In April 2019, we announced that Private Catalog was generally available to all Google Cloud Marketplace customers, giving cloud administrators, developers, and procurement managers within an organization a single, centralized home for apps and solutions that are approved for everyone’s internal use. Private Catalog makes teams more productive by making Google Cloud services, third-party solutions, and internal solutions easier to find and use, and it limits risk by letting admins grant clear visibility and deployment controls. Since making Private Catalog generally available, we’ve been working closely with multiple enterprise customers, and we’re pleased to announce enhancements to our admin experience, along with support for Terraform.

Deploy your infrastructure using Terraform

Engineers increasingly use Terraform as their preferred technology for building and deploying solutions that can scale across clouds, and Terraform support has been one of our most oft-requested features. With this update to Private Catalog, you can ingest and deploy Terraform configurations. Further, you can deploy your infrastructure using Terraform without having to install Terraform. Stay tuned for additional features this year to enhance the Terraform experience.

Easier-to-manage products

Previously, the products you created in Private Catalog had to live under a single catalog. This made it easy to manage your products at the catalog level, but made it difficult to include the same product in multiple catalogs, or to get a list of all the products in your organization. With the latest version of Private Catalog, products and catalogs are now separate; you can manage products outside of a catalog, bulk-add products to multiple catalogs, and gain a clear, concise view of all the products in your organization, along with which catalogs they’re shared to.
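As an illustration of the kind of Terraform configuration that might be ingested into Private Catalog, here is a minimal, hypothetical example; the resource name and settings are invented for this sketch, not taken from any Google sample.

```hcl
# Hypothetical example of a small Terraform config that an admin might
# upload to Private Catalog for end users to deploy.
resource "google_storage_bucket" "approved_bucket" {
  name     = "my-approved-example-bucket" # bucket names must be globally unique
  location = "US"
}
```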
Private Catalog also features a revamped Admin UX that makes it easy to view lists of products and catalogs.

Simple delegate management

Before this update, you had to add Private Catalog IAM permissions at the Organization level. We heard feedback that you wanted the flexibility to add permissions at different levels of the GCP resource hierarchy. Now you can add Catalog IAM permissions at the folder and project levels, which eliminates the need for a Cloud Admin at the Org level to manage Catalog Admin permissions. With this federated model, complex enterprises can easily manage separate catalogs and permissions anywhere in the GCP hierarchy and enable admins to manage individual catalogs at different folders or projects. No more losing track of who’s an Org Admin, or having to restrict Catalog Admins’ permissions at the top level. Of course, you can also choose to maintain the existing centralized model.

View shared catalogs with ease

Another customer pain point has been catalog sharing. Cloud admins share catalogs with a GCP Org, Folder, or Project to provide visibility of the solutions to their end users, but Private Catalog didn’t show the sharing metadata. With this release, sharing controls and information are displayed prominently in the Catalog list, with improved governance controls, and you can easily unshare a catalog.

Bulk-edit products

When you shared products across many catalogs, you traditionally had to recreate them in each catalog and manage each catalog separately. With this release, products are managed independently from catalogs, so you can easily bulk-assign them to catalogs and manage each product in a single place.

Begin your move to Private Catalog today

This is our largest update to Private Catalog since its launch in April. To take advantage of the many new capabilities, you will need to migrate, which will move your products into a single Google Cloud project.
We’re working hard to make the migration process painless and simple. For existing Private Catalog customers, when you’re ready to migrate, please fill out this short, six-question form. We will then contact you directly to schedule your migration and get you up and running on the newest version. New customers can visit the Private Catalog page to sign up or learn more about the feature here. Please note that all new Private Catalog customers will need to select a project when they first start up.

We hope the new functionality will help you ensure compliance and governance within your organization, and we welcome any feedback you may have. Try out Private Catalog on Google Cloud Marketplace today.

Related Article: Improve enterprise IT procurement with Private Catalog, now in beta. With Private Catalog, now in beta, you can control which GCP apps are available throughout your organization.
Source: Google Cloud Platform

AWS Control Tower now offers region selection

Today we’re announcing AWS Control Tower Region Selection, a new feature that gives you the ability to efficiently manage the geographic footprint of your AWS Control Tower resources. You can now select the additional regions you want to govern with AWS Control Tower, allowing you to address compliance and regulatory concerns while balancing the costs associated with expanding into additional regions.

Source: aws.amazon.com