A cloud built for developers — 2021 year in review

2021 was a seminal year for software developers. Every company accelerated its digital and online efforts while simultaneously moving to remote development. Innovation through developer productivity was top of mind for nearly every IT executive we spoke to. Many asked us about Alphabet's long track record of innovation. From Google Search to Waymo's driverless cars, is there a secret to developing the next big thing? The answer is simple: 10X thinking. Look for solutions that help customers drive 10X improvements through a series of smaller increments that compound into a large impact over time. At Google Cloud, we follow a similar philosophy to help our customers become innovative technology companies. In recent years, we've worked closely with partners, customers, and developers on services that help unlock 10X improvements in developer productivity. Six years ago, we introduced a managed Kubernetes service, Google Kubernetes Engine (GKE). This year, we added GKE Autopilot, which revolutionized Kubernetes management by eliminating all node management operations. Likewise, our Cloud Run serverless platform was the first service of its kind, allowing developers to go beyond running small bits of code and run full applications in a serverless environment. From September 2020 to September 2021, Cloud Run deployments more than quadrupled. More recently, we co-founded the Open Source Security Foundation and began working on secure continuous integration and delivery (CI/CD) services a year or so ahead of the cybersecurity threats that made headlines.

Here are the top developer challenges that customers asked us to solve in 2021:

- Driving distributed developer productivity
- Securing the software supply chain
- Simplifying running of cloud-native applications

Read on for more insights.

Driving distributed developer productivity

A critical prerequisite for innovation is time. Investments in developer productivity free developers to work on the important things.
Traditionally, developers have spent hours downloading and installing tools in their local environments and keeping them updated with the latest versions and dependencies. Cloud Shell Editor is a full remote development environment with a growing set of built-in security capabilities. It comes with developer tools pre-installed, including MySQL, Kubernetes, Docker, minikube, and Skaffold. Developers need just a web browser and an internet connection to be productive. Developers now have access to tutorials right from Cloud Shell Editor, and can try code samples directly in our documentation. Additionally, with support for buildpacks, developers can create container images directly from source code, without knowing anything about Docker or containers.

Securing the software supply chain

Software supply chain vulnerabilities had far-reaching consequences in 2021, with events such as SolarWinds, Mimecast/Microsoft Exchange, and Log4j affecting businesses, daily life, and entire governments. President Biden even issued an executive order to strengthen software supply-chain security standards. Solving the software supply chain problem requires players across industries to work together. This is why we co-founded the Open Source Security Foundation (OpenSSF). We also proposed SLSA, an industry-wide framework for maintaining the integrity of software artifacts throughout the software supply chain. Open source, with its complex dependency trees, remains a prime target for exploitation. In fact, an estimated 84% of commercial code bases have at least one open source vulnerability. Today, developers can use tools such as the Allstar GitHub app, open source Security Scorecards, and Open Source Insights to implement security best practices, determine a risk score for open source projects, and visualize a project's deep dependencies. And several of these same kinds of open-source innovations are available out of the box to Google Cloud customers.
Here are a few examples:

- Detailed recommendations to help mitigate the Apache Log4j vulnerability.
- The Java scanning feature of Google Cloud On-Demand Scanning, which helps developers identify Linux-based container images that use an impacted version of Log4j. On-Demand Scanning can be used at no charge until December 31, 2021.
- Cloud Build, our serverless CI/CD service, offers SLSA Level 1 compliance by default. This verifiable build provenance lets you trace a binary back to its source code to prevent tampering and prove that the code you're running is the code you think you're running. Cloud Build's new build integrity feature improves on this by automatically generating digital signatures, which can be validated before deployment by Binary Authorization.

Simplifying running cloud-native applications

Innovation is rarely a straight road; there are many wrong turns along the way. Developers need a cost-effective runtime and a way to run experiments and fail forward fast. That's why GKE Autopilot takes GKE, the most mature Kubernetes service on the market, and further simplifies Kubernetes operations by providing a managed control and data plane, an optimized out-of-the-box configuration, automated scalability, health checks and repairs, and pay-for-use pricing.

"With GKE Autopilot, we can do more with our business. We can continue developing and upgrading our products, rather than focusing on fine-tuning infrastructure." —Jun Sakata, Software Engineer, Site Reliability, Ubie

Simpler still is no cluster at all. Cloud Run gives developers the freedom to run services from code or container images with no cluster or VM to manage. At the same time, it provides a hypervisor-grade secure sandbox environment and several built-in DevOps capabilities, such as multi-version deployments, gradual rollouts and rollbacks, and GitHub and Cloud Build integrations. This is ideal for web and mobile application development.
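To make the "no cluster or VM to manage" point concrete, here is a minimal deployment sketch. It assumes an existing Google Cloud project with the Cloud Run API enabled and an authenticated gcloud CLI; the service name, image path, and region are illustrative placeholders. When deploying from source, Cloud Build uses buildpacks under the hood, so no Dockerfile is required.

```shell
# Deploy a Cloud Run service straight from source code in the current
# directory; buildpacks containerize the app automatically.
gcloud run deploy my-service \
  --source . \
  --region us-central1 \
  --allow-unauthenticated

# Alternatively, deploy a prebuilt container image:
gcloud run deploy my-service \
  --image gcr.io/my-project/my-image:latest \
  --region us-central1
```

The gradual rollouts mentioned above can then be driven with `gcloud run services update-traffic`, which shifts a chosen percentage of traffic to a new revision.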
In 2021, with additions like higher per-instance concurrency, new CPU allocation controls, and support for standard Docker images, the benefits of serverless can now be extended to a wider range of workloads, including legacy ones. Additionally, with newer cost controls, billing flexibility like committed use contracts, and features like always-on CPU, it's possible to run steady-state workloads cost-effectively in a serverless environment. Best of all, thanks to improvements like these, organizations using Cloud Run have reported a 40% reduction in developer recruiting costs. Cloud Run is also the first platform to give developers the option to optimize their carbon footprint. With the new self-service Region Picker, you can choose the data center region with the lowest gross carbon cost on which to run your Cloud Run workloads. Further, with just one click, Google Cloud Carbon Footprint gives you access to the energy-related emissions data for external carbon disclosures.

"With Cloud Run, we only need half the people to manage our systems as compared to before." —Google Cloud Platform Architect, Cosmetics

"Cloud Run is one of the easiest services on Google Cloud Platform you can deploy to. It's just super simple." —CTO, Healthcare SaaS

If you want to give Cloud Run and the associated Cloud Functions a try, check out the Easy as Pie Serverless Hackathon, which offers over $20,000 USD in cash prizes.

2022: More to come

2021 brought simplification and greater attention to developer productivity. It is essential that developers continue to operate at ever higher levels of the stack, without worrying about infrastructure, security, compliance, and integrations. This is the North Star for 2022. In 2022, look for Google Cloud to co-innovate with our ISV partners, developers, and SecOps teams to bring you the 10X innovation you need from the cloud that is built for developers.
Source: Google Cloud Platform

Google Cloud Data Analytics 2021: The year in review

As I look back on 2021, I'm proud to see a fast-growing number of companies use our data platform to unlock new insights, build new business models, and improve their employees' and their customers' experience. Data itself is just inactive information, useless without activation. The true power of data comes when it is used to build intelligent applications, help people make better decisions, increase automation, and ultimately change how value is created. This year, tens of thousands of customers unlocked their data advantage with Google Cloud's unified data platform: breaking down data silos, building internet-scale applications, building smart processes with AI, and building data meshes that span beyond the enterprise to turn data into an asset. These customers all used Google Cloud's unified data platform to remove barriers across data silos, accelerate existing analytic investments, and achieve business outcomes faster. I'm truly honored to share some of the most important moments from our partners, customers, and practitioners this year. Thank you for your trust and commitment, and for choosing Google Cloud as your innovation partner to break down silos and turn data into value.

One retailer using these solutions is Carrefour, one of the largest grocery retailers in France. Carrefour needed to ensure it had the right products, in front of the right shoppers, at the right store location. With Google Cloud, Carrefour developed an assortment recommendation tool that helped the chain support a more personalized selection at the store level, giving store directors the autonomy to influence inventory needs.
The tool also gives Carrefour headquarters visibility into the merchandising decisions made by each of its franchise stores.

Enabling the real-time enterprise

In 2021, more customers looked to shift to real-time data processing and insights with Google Cloud so that they could make decisions at the speed of their business and deliver excellent customer experiences. For example, Twitter's data platform ingests trillions of events, processes hundreds of petabytes of data, and runs tens of thousands of jobs on over a dozen clusters every day. With this expanded partnership, Twitter is adopting Google's Data Cloud, including BigQuery, Dataflow, Cloud Bigtable, and machine learning (ML) tools. These tools not only power the company's rapidly growing data ecosystem to enable faster data-informed decisions, but also enable deeper ML-driven product innovation. Another great example is the story of Verizon Media, which switched to Google Cloud from another provider to ingest 200TB daily, store 100PB in BigQuery, stream 300MB per second, and achieve a 90+% productivity improvement by combining Looker, BigQuery, and the rest of our data platform. Finally, another great journey is that of ATB Financial, which migrated the extensive SAP backbone that supports its 800,000+ customers to Google Cloud, and built a system on BigQuery for real-time data acquisition, enrichment, and AI-assisted and self-service analytics.

Going BIG with healthcare and life sciences

HCA Healthcare is using BigQuery to analyze data from its 32 million annual encounters and identify opportunities to improve clinical care.
I was particularly pleased about our partnership and how Sam Hazen, the company's CEO, explained that "next-generation care demands data science-informed decision support," and how he described our partnership and shared passion for "innovation and continual improvement" as foundational to our efforts. Moderna relies on data to respond to the disproportionate impact the pandemic has had on minority groups, using insights from Looker to increase diversity in its COVID-19 vaccine trials and improve representation. "Looker has a depth to it — it's not just a visualization that you look at. People can go deeper as they learn more," said Dave Johnson, VP of Informatics, Data Science, and AI at Moderna. We were also incredibly honored to work with the National Cancer Institute to support breast cancer research with fast and secure data sharing. Combining our AI Platform and BigQuery to work with large and heterogeneous data, the team successfully demonstrated that "researchers can inexpensively analyze large amounts of data, and do so faster than ever before."

Increasing enterprise agility

Niantic Labs built a globally scalable game for millions of users on Google Cloud. In this video, they share their experience scaling with Google Kubernetes Engine (GKE) and Spanner, and describe how their data science team works with BigQuery, Dataflow, and Pub/Sub for their data analytics needs. Finally, Telefónica partnered with Google Cloud to foster Spain's digital transformation and advance 5G mobile edge computing. As part of this partnership, Google Cloud will launch a new cloud region in Spain to assist the country's economic recovery amidst the COVID-19 crisis. Telefónica will also use Google Cloud services to boost its own digital capabilities, in areas such as machine learning, artificial intelligence (AI), data analytics, and application development, to continue to provide new services and tools to its global customer base.
2021 Data Cloud momentum, thanks to all our customers

Some of my favorite customer stories of this year describe how our customers inspired and pushed us to take our products and offerings to new heights. For me, this was most apparent at our inaugural Data Cloud Summit in late May, where we unveiled to our customers the latest product and feature announcements for everything having to do with data. We launched Dataplex, allowing customers to centrally manage, monitor, and govern data across data lakes, warehouses, and marts, all from a single viewpoint. We also announced Datastream, our serverless change data capture (CDC) and replication service, as well as Analytics Hub, a fully managed service built on BigQuery that allows our customers to create safe and governable data sharing ecosystems. We've made migrations to Cloud SQL easier and faster with the Database Migration Service. More than 85% of all migrations are underway in under an hour, with the majority of customers migrating their databases from other clouds.

At Google Cloud Next '21, we also devoted space to sharing our product improvements and iterations with our customers. Amidst many announcements, I was most proud to speak about Spark on Google Cloud, the world's first autoscaling, serverless Spark service for the Google Cloud data platform; BigQuery Omni, our cross-cloud analytics solution; Google Earth Engine on Google Cloud, a launch that brings together Google Earth Engine's 50+ petabyte catalog of satellite imagery and geospatial datasets for planetary-scale analysis; and the Spanner PostgreSQL interface, which allows enterprises to take advantage of Spanner's unmatched global scale, 99.999% availability, and strong consistency using skills and tools from the popular PostgreSQL ecosystem.
We also held the sixth edition of JOIN, Looker's annual user conference, which included three days of live educational content with over 15 customers and partners participating across 5 keynotes, 33 breakouts, 12 how-tos, 27 Data Circles of Success, and our popular hackathon. Content focused on activating users with data experiences, composable analytics, and our unified semantic model. All sessions from JOIN are now available on-demand.

Tapping into the data ecosystem

One of our customers' most recurring themes was an interest in expanding their data aperture and tapping into the data ecosystem around them. We addressed this feedback in three main ways. First, we activated an ecosystem for collective intelligence on BigQuery: more than 3,000 organizations have shared more than 250 petabytes of data, and Google Cloud has shared more than 150 public datasets that can be used across a myriad of use cases. Second, we leaned into this spirit of knowledge sharing by packaging over 30 architecture design patterns that include code, data models, and industry best practices. We've also increased industry domain expertise in areas such as retail, financial services, healthcare, and gaming, and are continuing to develop industry white papers, such as How to develop Global Multiplayer Games using Cloud Spanner, to reduce the time to value for customers. Third, we continue to support an open ecosystem of data partners, including Neo4j, Databricks, MongoDB, Informatica, Tableau, and C3.ai, giving customers the flexibility to build their data clouds without being locked into a single approach.

Going further than we imagined, together

We are incredibly grateful to our customers for choosing Google Cloud to write their data story, and we can't wait to see what you do next.
Learn more about how organizations are building their data clouds with Google Cloud solutions.
Source: Google Cloud Platform

Cloud CISO Perspectives: December 2021

This is our last Cloud CISO Perspectives of 2021. It's been an eventful year for the cybersecurity industry, both good and bad, and I welcome the opportunities and challenges we will continue to address together in 2022. In this final post, I'll share the latest updates from the Google Cybersecurity Action Team, new reports from Google's security research teams, and more information on Google Cloud's Log4j impact and assessment.

Update on the Log4j vulnerability

Google Cloud continues to actively follow the evolving security vulnerabilities in the open-source Apache Log4j utility, and we are providing regular updates to our security advisory page. Responding to these vulnerabilities can be especially stressful, even more so at the end of the year. We encourage everyone using vulnerable versions of Log4j, in any environment, to upgrade as soon as possible according to the guidance published by Apache. As the entire industry works through its response to Log4j, the Google Cybersecurity Action Team also continues to publish and update recommended actions for mitigating exposure to the Log4j vulnerabilities.

The state of open source software security

What recent events have taught us, and will continue to teach us into 2022, is that we owe our thanks to the volunteers and maintainers of open source software. More than ever, we need continued industry investment and commitment to support them. For years, Google has been focused on addressing this challenge. Our open source security team helped found the Open Source Security Foundation (OpenSSF). Over the past year, we have doubled down on our investments in open source software security: from tools to frameworks to funding maintainers of open source software projects to focus on security.
This past August, we committed $10 billion to advancing cybersecurity for organizations and governments globally, and a major part of that commitment is focused on securing the open source software ecosystem, including $100 million in investments in third-party organizations like the Linux Foundation and OpenSSF. One of the primary challenges facing defenders at this very moment is simply getting a handle on where Log4j dependencies exist within their organization's codebases. Our Supply-chain Levels for Software Artifacts (SLSA) project, which we open-sourced in partnership with the OpenSSF, is an end-to-end framework for managing supply chain integrity and security, and its implementation would greatly aid organizations in this kind of situation. Last week, Google's Open Source Insights team published an analysis of the impact of the Apache Log4j vulnerability, pulling together a list of 500 affected packages with some of the highest transitive usage so that maintainers and users helping with the patching effort can maximize their impact and unblock more of the community. Improvements such as these could qualify for financial rewards from the Secure Open Source Rewards program. You can explore your package dependencies and their vulnerabilities by using Open Source Insights. We all can do our part to support this critical function of our software ecosystem, and I look forward to seeing how organizations, governments, and individuals work together to make improvements in the coming year.

Google Cybersecurity Action Team highlights

Below I'll recap the latest updates, new services, and resources across our Google Cybersecurity Action Team, Google Cloud Security product teams, and Google security research efforts since our last post.
Security

Q4 Cloud Security Talks recap: We hosted our final Google Cloud Security Talks event of 2021, where our security teams focused on zero trust and covered everything from Google's history with BeyondCorp to our strategic thinking when it comes to applying zero trust principles to production environments. We also shared product updates across the portfolio and talked about how zero trust fits into our invisible security vision. Check out the recap in this blog post and watch the sessions on-demand.

Autonomic Security Operations: Our Autonomic Security Operations solution continues to resonate widely with organizations and security professionals as teams look for more ways to modernize their security operations. Dr. Anton Chuvakin and Iman Ghanizada from the Google Cybersecurity Action Team recently published a whitepaper on how organizations can work towards a 10x transformation of their SOC. Their first blog post in a series looks at what security teams can learn from Site Reliability Engineering (SRE) principles and philosophies to begin modernizing the SOC.

Compliance

Software-Defined Community Cloud: Our Google Cloud compliance team outlined a new concept for how the industry can address challenges with legacy community cloud implementations. Our Assured Workloads product implements a novel approach to help customers meet compliance and sovereignty requirements through a software-defined community cloud, which is designed to deliver the benefits of a community cloud in a more modern architecture.
Google Cloud's approach provides security and compliance assurances without the strict physical infrastructure constraints of legacy approaches.

Continuous Compliance: Following the Google Cybersecurity Action Team's launch of the Risk and Compliance as Code solution, our customer engineering teams shared some timely case studies on how Google Cloud customers are reaching continuous compliance, encompassing real-time attestation and notification. The key learning: the more familiar control owners become with our GCP capabilities, the more confident they feel about automating their controls.

Shared Fate

Secured Data Warehouse blueprint: Google Cloud customers can jump-start the migration and analysis of sensitive business data by using the new Google Cloud Secured Data Warehouse blueprint. This opinionated guidance consists of both documentation and deployable Terraform assets. It is built around BigQuery and incorporates Cloud DLP, Cloud Storage, Pub/Sub, Dataflow, Data Catalog, and CMEK to implement security best practices across data ingestion, storage, processing, classification, encryption, logging, monitoring, and governance.

Security Foundations Blueprint v2.5: We're excited to announce the next version of our Security Foundations Blueprint. New content provides further control for data residency and also supports Assured Workloads for enhanced native platform guardrails. We review the guide and corresponding blueprints regularly as we continue to update best practices to include new product capabilities.

Controls and Products

Network-based cloud threat detection with Cloud IDS: We announced the general availability of our Cloud IDS solution, which helps enterprises detect network-based threats and helps organizations meet compliance standards that call for the use of an intrusion detection system.
With general availability, Cloud IDS now has the following enhancements: service availability in all regions, detection signatures automatically updated daily, and new compliance support for customers' HIPAA compliance requirements and ISO 27001 certification.

New zero trust features in BeyondCorp Enterprise: The BeyondCorp Enterprise team released the Policy Troubleshooter feature in general availability. The tool helps administrators triage blocked access events and easily unblock users within an organization, which is essential as employees continue to work remotely or in hybrid arrangements and need ways to access corporate resources and information securely.

Keyless authentication from GitHub Actions: Following GitHub's introduction of OIDC tokens in GitHub Actions workflows, you can now authenticate from GitHub Actions to Google Cloud using Workload Identity Federation, removing the need to export a long-lived JSON service account key. New functionality like this is part of Google Cloud's ongoing efforts to make security invisible and our platform secure by default. Learn more in the blog post.

Threat Intelligence

Combating cybercrime at scale: In December, Google took action to disrupt Glupteba, a sophisticated botnet targeting Windows machines. This was also the first lawsuit against a blockchain-enabled botnet, whose operators protected it using blockchain technology. Google's Threat Analysis Group took steps to detect and track Glupteba's malicious activity over time, and we launched litigation that we believe will set a precedent and help deter future activity.
The details in TAG's analysis and our litigation demonstrate that crime on the internet is sophisticated, and at Google, we feel a responsibility as part of this ecosystem to play a part in disrupting this activity to help everyone on the internet be safer.

iMessage zero-click exploit: In a recent blog post, Google's Project Zero researchers show for the first time how an in-the-wild zero-click iMessage exploit works and how it is used by NSO.

Must-listen podcasts

Earlier this month, the Google Cloud Security podcast hit a major milestone: 46 episodes in its first year! Check out this post to see the top themes from our podcast throughout 2021, including episodes on zero trust security, cloud threat detection, how to make cloud migrations more secure, and data security in the cloud.

This wraps up Cloud CISO Perspectives for 2021! We'll be back in 2022 with continued updates from our Google Cybersecurity Action Team and more. If you'd like to have this Cloud CISO Perspectives post delivered to your inbox every month, click here to sign up.
Source: Google Cloud Platform

Google Cloud’s top AI blog posts from 2021

Artificial intelligence (AI) remained in the spotlight over the last year, as the gap continued to grow between organizations that merely possess data and those that can use it to leverage the power of AI to generate actionable insights or improve customer experiences. At Google Cloud, helping you turn AI investments into real-world results is one of our foremost goals, and we kept our foot on the gas in 2021, launching a variety of new solutions, research, and tutorials. But don't fret if you missed anything along the way. Whether you're a seasoned data scientist or someone looking to solve problems with AI for the first time, Google Cloud offers platforms, tools, and best practices for all levels of expertise. To close out the year, we've collected some of our top 2021 AI blog posts to help you kick off 2022 on the right foot.

Vertex AI: one platform for all your ML tools

Unveiled in May, Vertex AI was one of our most significant AI announcements of the year. A managed machine learning (ML) platform, Vertex AI helps your data teams build, deploy, and maintain ML models more quickly. Compared to competing platforms, it requires almost 80% fewer lines of code to train a model, helping your organization implement Machine Learning Operations (MLOps) at all levels of expertise. Whether you're newly adopting Vertex AI in 2022 or have been using it for months, here is a variety of articles to help you take full advantage of its powerful features:

- What is Vertex AI? Developer advocates share more
- AI Simplified: Managing ML data sets with Vertex AI
- Use Vertex AI Pipelines to build an AutoML classification end-to-end workflow
- Build a reinforcement learning recommendation application using Vertex AI
- Vertex AI Matching Engine: Blazing fast and massively scalable nearest neighbor search
- PyTorch on Google Cloud: How to train and tune PyTorch models on Vertex AI
- Announcing Vertex AI Pipelines general availability
- Vertex AI NAS: higher accuracy and lower latency for complex ML models
- Coca-Cola Bottlers Japan collects insights from 700,000 vending machines with Vertex AI
- Google demonstrates leading performance in latest MLPerf Benchmarks

CCAI: reimagining customer experiences with the power of AI

Google Cloud's Contact Center AI (CCAI) platform makes ML-powered language models more accessible and impactful by helping even companies with limited AI expertise uncover insights in customer and partner interactions, and deploy virtual agents that can chat naturally with customers. CCAI is built specifically to help call centers deliver excellent customer service on demand, even when human agents are unavailable, and we added powerful new features throughout the year, including Speaker ID, which lets customers authenticate themselves with just a few spoken words; Agent Assist, which provides human agents with continuous insight into customer intent during calls and chats; and CCAI Insights, which uses AI to mine raw contact center interactions for insights, regardless of whether the data originated with a virtual or human agent. To learn more about how customers are leveraging CCAI, don't miss our post about HSBC's use of Dialogflow, a part of CCAI, to ease the call burden on its policy experts.

DocAI: unlocking the value in unstructured data

Our Document AI (DocAI) platform eliminates the guesswork and manual labor involved in document processing, helping your teams better understand the data captured in documents and streamline workflows.
So your business can move even faster from implementation to value, we've introduced use case-specific additions to the platform, including Lending DocAI, Procurement DocAI, and Contract DocAI. To dive into how our customers are putting DocAI solutions to work, be sure to check out these articles:

- How Mr. Cooper is using AI to increase speed and accuracy for mortgage processing
- Going global: Workday uses Google Cloud AI to accelerate document processing

Industry solutions: smarter decision making, from retail to manufacturing

Delivering the right information to customers in the right context is crucial, so we're pleased to see the great results that retail customers like IKEA and Bazaarvoice have enjoyed with our Recommendations AI solutions, achieving 30% and 60% increases in click-through rates, respectively. We published several blogs in 2021 to help you do more with this solution, including:

- Recommendations AI data ingestion
- Recommendations AI modeling
- Serving predictions & evaluating Recommendations AI
- How to get better retail recommendations with Recommendations AI

In addition to our use case-oriented DocAI solutions and our Recommendations AI work with retailers, we've also been digging into other specific industries and challenges, highlighted by the following:

- Visual Inspection AI: a purpose-built solution for faster, more accurate quality control
- New research reveals what's needed for AI acceleration in manufacturing

Translation: customer connections without language barriers

Translation is one of the fastest growing AI use cases, and we rolled out a wide range of feature updates to help you connect with customers, highlighted by the ability to translate business documents across more than 100 languages.
If faster translation workflows are among your 2022 resolutions, don’t miss our post about best practices for translating websites with Translation API or our article about how the city of San Jose uses AI translation to ensure critical services reach the community.

We’re just getting started: training and recognition

At Google Cloud, we see AI continuing to impact business and continuing to become easier to implement and leverage, so the preceding 2021 highlights are just the tip of the iceberg. We’re looking forward to helping your organization do even more with AI in 2022, but in the meantime, here is a collection of blogs highlighting some of our additional research, recommendations, and plaudits, as well as training resources to help you do more with AI, faster:

- Forrester names Google Cloud a leader in AI Infrastructure
- Gartner names Google a leader in 2021 Magic Quadrant for Cloud AI Developer Services report
- Google Cloud AI leaders share tips for getting started with AI
- Why you need to explain machine learning models
- Free AI and machine learning training for fraud detection, chatbots, and more
- Grow your ML skills with free offer from Coursera
- 7 tips for trouble-free ML model training
- Your guide to all things AI & ML at Google Cloud Next
Source: Google Cloud Platform

Google Meet in 2021: A year of accelerated innovation

2021 has been a year of change for all of us as the pandemic continued to expedite technology’s role in keeping us connected through video. Google Meet played a significant role in enabling those connections, helping friends, families, and colleagues across the globe safely come together. As we saw in our recent global Economist Impact survey (October 2021), hybrid work has become a standard practice for many, underscoring the need for meaningful video meetings for the foreseeable future. Our survey also found that, over the course of the pandemic, a lack of physical connection has led to feelings of disconnectedness. That’s why this year we’ve worked hard to help people stay connected in a more human way, with a solution that’s flexible, immersive, inclusive, and secure by design. Here’s a look back at this year’s most requested and impactful Google Meet features that help address the challenges of hybrid work, learning, and life.

Fueling the flexible future of work

As many of us worked from home and others made big moves across the country, we delivered greater flexibility to support work-from-anywhere. We introduced new features that help people easily join or host meetings from their device of choice.

- A new, intuitive interface allows participants to get started quickly, display up to 49 people at once, and hide self-view to reduce meeting fatigue.
- A standalone progressive web app and updated mobile apps allow you to join from your device, whether you’re working from your desktop or connecting from your phone on a walking meeting.
- Recently increased meeting sizes of up to 500 attendees for select Education, Business, and Enterprise plans help ensure everyone has a spot in the meeting.
- Attendance reports that can keep track of attendees, whether for work, a class, or a new workshop you’re hosting.

The new, intuitive interface lets you hide your video tile to reduce meeting fatigue

Making meetings more immersive

As people returned to the office, many organizations needed to ensure an immersive meeting experience for in-person and remote attendees alike. To support this challenge, we invested in new Google Meet hardware and features that help bridge the gap.

- Companion mode, rolling out to users in January, allows you to join a meeting from a conference room on your personal device to actively participate in chats, polls, and Q&A, while leveraging in-room audio and video.
- New Google Meet Series One all-in-one video conferencing devices, Series One Desk 27 and Board 65, and Rally Bar and Rally Bar Mini room solutions from Logitech can turn almost any space into a video collaboration hub.
- Recently launched interoperability between Cisco Webex devices and Google Meet devices can broaden your calling network and create a cross-platform experience. Organizations can also get more from their Google Meet hardware, which can now be leveraged as digital signage displays for workplace announcements when not in use.

Series One Desk 27 is an all-in-one touch-enabled video conferencing device for the office or home office

Supporting collaboration equity

For meetings to be more equitable and inclusive, we want to help everyone on the team feel seen and heard. Regardless of where you’re dialing in from, your level of experience with video meetings, or the language you speak, you can have the opportunity to engage and participate.
- New virtual backgrounds, automatic light adjustment, and noise reduction help keep the focus on people and ensure that participants can be seen and heard, whether they’re joining from the office, a coffee shop, or the kitchen table.
- Hand raising improvements with automatic lowering give participants the opportunity to share their perspectives, while breakout rooms can help facilitate deeper engagement in small group discussions.
- Polls and Q&A can help keep the presentation on topic while helping to make sure the audience is engaged and questions are answered.
- Translated captions, rolling out soon, and live caption support for multiple spoken languages can help attendees who are deaf or hard of hearing, speak a different primary language, or are in a noisy location better understand and participate in meetings.

Translated captions can help ensure all attendees understand and participate in meetings

Greater control and protection for your conversations

As more organizations adopt cloud-based meeting solutions, security remains top of mind for IT decision makers. To address this concern, we launched new security features and host controls on top of our trusted global infrastructure to help better protect your data and conversations.

- With client-side encryption for Google Meet, currently in beta, companies are able to manage encryption keys in-house to help fulfill data sovereignty and compliance requirements.
- Meeting hosts and co-hosts can now choose which attendees can share screens and send chats, as well as mute the room when others are speaking and end the meeting for everyone.
- Organizations and individuals can record meetings with confidence and control view access with Google Drive for on-demand viewing.
- Meeting hosts can now reach a broader audience with live stream support for up to 100,000 viewers that can be shared with people in other trusted organizations.

As I look back on 2021, I’m so proud of the innovations our team launched and, more importantly, our customers’ perseverance in adapting to the new realities of the workplace. They’re leveraging our solution along with new workplace practices to bridge the gap and make hybrid work feel more human. We’re excited to see all the ways (and places) people will use Meet to connect, create, and collaborate in 2022.

The top three insights we learned from data analytics customers in 2021

As you know, we’re obsessed with learning from customers. That’s why we made it a point to sit down with a new Google Cloud customer every week this year to share their journey to our unified data cloud platform and what they’ve learned from the experience along the way. What began as a simple exercise of routinely keeping up with our customers quickly evolved into a complete video series that I have the pleasure of hosting every Tuesday. This year you learned from companies of all sizes, in all industries, and across the globe.

In this series, we learn that succeeding with data requires tough business decisions, a lot of creativity, and a specific vision. Perhaps for the first time, this series, aptly titled “Data Journeys,” spotlights our community voices from the data space in a genuine, grassroots, and unfiltered way. We’ve learned so much from each customer guest and their company’s unique challenges, trials, and triumphs – all things that make up a great journey or adventure. So, as we put the wrapping on another eventful year, we’re thrilled to tie a bow around our 2021 customer “Data Journeys” and share the top three lessons from their journeys with you.

Lesson 1: Customers migrating to the cloud aren’t simply looking for a provider that can successfully execute a lift-and-shift motion. Instead, they look to take advantage of their cloud migration as an opportunity to rethink, relearn, and rebuild.

When they were reaching the end of their contract with a different data warehousing provider with a fixed compute model, Keybank, a regional financial institution with over 1,000 branches managing over $145 billion in assets, decided to migrate to the cloud. I was fortunate to speak with Mike Onders, EVP Chief Data Officer, Divisional CIO, and Head of Enterprise Architecture for Keybank, who led the migration charge.
According to Onders, he decided Google Cloud Platform was the right fit for the dynamic, security-sensitive nature of the banking company’s business because Google’s data suite provided an elastic, fast, and consistent environment for data. When selecting Google Cloud, however, Keybank was clear they did not want to simply lift-and-shift their Teradata Warehouse, analytics users, and Hadoop Data Lake to Google and call it a day. Instead, they saw the migration as an opportunity to reinvent old processes and traditional ways of doing banking.

“We really spun up Python and Spark clusters to do some new fraud modeling, especially around smart holds and check deposits and how to use fraud algorithms to determine whether we should hold this money or release it to you when you deposit that check,” Onders said.

In addition to smart check hold modeling, Keybank has re-engineered how they conduct attrition modeling, next best offers/product recommendations, and credit risk predictions using Google Cloud’s intelligent data cloud platform. Now, they are eyeing even more innovation, particularly around how to use data to drive customer satisfaction and differentiate their services from their competitors. For more insights, watch Keybank’s episode on Data Journeys.

Lesson 2: Customers are passionate about applying Google’s technology, from predictive analytics to AI and automation, to ‘data for good’ initiatives.

TELUS is one of Canada’s largest telecommunications companies with over 9.5 million subscribers.
The powerhouse manages geolocated data generated by talk and text signals, networks, and cellular towers across the country. For some perspective on the sheer amount of data they handle, TELUS analyzed over 1.2 petabytes of data last year and expects that number to grow each year. By March 2020, it had become clear that COVID-19 was more than an outbreak and was growing into a global pandemic, and TELUS faced a choice: carry on business as usual, or serve their community at a larger scale. With Google Cloud by their side, the choice was a no-brainer.

TELUS realized they could leverage their data residing within Google Cloud’s data cloud in a privacy-preserving manner to launch a data-for-good initiative and support all Canadians during a time of need. In just three weeks, using tools like BigQuery and Data Studio, TELUS launched a new platform that empowered the Canadian government to make better strategic decisions about how to combat COVID within its borders – all while keeping individual subscribers’ data private. For their efforts, TELUS earned an HPE-IAPP Privacy Innovation Award and helped the government contain and minimize the spread of the virus.

After speaking with Michael Ames, Senior Director of Healthcare and Life Science at SADA, I realized their company was similarly focused on using data analytics technology for community good. SADA is a Google Cloud Premier Partner that offers consultation and implementation services. Ames’ day-to-day involves helping hospitals and healthcare providers migrate their data to the cloud. Hospitals and healthcare systems around the United States undoubtedly handle massive amounts of data, from symptom records to patient history to insurance and payment records, and more.
SADA realized that increasing the rate at which the right treatment is matched with the right person during their first visit would help hospitals run their business better, increase patient satisfaction and health, save both parties time and money, and ultimately maximize the funds that hospitals could reinvest in care for their community. But after conducting a study with the journal Nature, which took the top ten biggest drugs by revenue and analyzed how they affected patients to determine whether they achieved their intended purpose, Ames knew matching the right drug with the right patient was a major issue. Once SADA used data to create predictive models within Google’s intelligent data cloud stack, they were able to help doctors prescribe medications that better matched individual patient needs and health profiles, providing far more personalized medicine. Looking forward, SADA is excited to use a data-driven approach to improve other aspects of health care. “Take that idea of prescribing drugs and expand that to physical therapy treatments, behavioral therapy treatments, decisions we make about where and how people should live, whether we should build a park in a certain neighborhood, how the air quality is affecting health, and we start to get a sense of the need to bring data together in a way that will be… much more powerful in treating the health of people,” Ames said.

For more insights, watch TELUS’ episode and SADA’s episode on Data Journeys.
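TELUS hasn’t published the internals of its privacy-preserving platform, but a common building block for this kind of data-for-good reporting is minimum-count suppression: a region’s aggregate is only published once enough subscribers contribute to it that no individual can be singled out. The sketch below is purely illustrative; the function name, threshold, and synthetic records are all invented for the example.

```python
from collections import Counter

def aggregate_mobility(records, min_count=20):
    """Count subscribers per region, then suppress any region whose
    count falls below min_count so individuals cannot be singled out."""
    counts = Counter(region for _subscriber_id, region in records)
    return {region: n for region, n in counts.items() if n >= min_count}

# Synthetic (subscriber_id, region) pairs: a large bucket and a tiny one.
records = [(i, "downtown") for i in range(150)] + [(i, "suburb") for i in range(5)]
print(aggregate_mobility(records))  # the 5-record "suburb" bucket is suppressed
```

Real deployments layer on stronger guarantees (e.g., differential privacy), but the suppression threshold illustrates the basic trade-off between granularity and privacy.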
Lesson 3: Customers choose Google Cloud to address today’s needs, but also because they are gearing up with technology that can solve their unknown problems of tomorrow.

Delivery Hero is the largest food delivery network outside of China. The company operates in over 50 countries and processed over 663 million orders in just the first quarter of 2021. I was shocked to learn that those orders translate into 174 data sets, 5,000 tables, and upwards of 7 million queries a month. It was my pleasure to speak with Matteo Fava, Delivery Hero’s Senior Director, Global Data Products and Analytics, to understand how they manage their immense data volume.

A significant part of Delivery Hero’s data strategy involves making sure the right teams have access to the right data on the backend, and that their customers have a safe, efficient, and enjoyable experience receiving their food on the frontend. In fact, that was the main reason why they partnered with Google Cloud in the first place: to ensure business would run as usual, with no hiccups. As Fava’s team gained a deeper understanding of user behavior and delivery routes through Google Analytics and BigQuery, Delivery Hero discovered a puzzle they hadn’t anticipated: how to deliver food consistently despite significant cross-cultural differences between countries, or even between cities and neighborhoods within the same country. One such puzzle was how to optimize “horizontal” deliveries, such as delivering to standard homes, versus “vertical” deliveries, such as delivering to apartment buildings, while often keeping a 15-minute delivery promise to their customers. Delivery Hero ultimately used real-time insights into deliveries and predictive modeling to ensure their estimated time of food delivery took into account longer elevator wait times in dense cities like Hong Kong and the time it takes to enter and navigate condominium complexes in places like Dubai.
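Delivery Hero’s actual ETA models are proprietary and ML-driven; as a back-of-the-envelope illustration of the “vertical delivery” adjustment described above, one could add building-specific overhead on top of travel time. Everything in this sketch (the function name, building categories, and overhead values) is hypothetical.

```python
def estimated_delivery_minutes(travel_min, building_type,
                               elevator_wait_min=4.0, lobby_min=2.5):
    """Hypothetical ETA: travel time plus extra overhead for 'vertical'
    deliveries (apartments, condo complexes) versus standard houses."""
    overhead = {
        "house": 0.0,                              # walk to the door
        "apartment": elevator_wait_min,            # wait for the elevator
        "condo_complex": elevator_wait_min + lobby_min,  # gate/lobby + elevator
    }
    return travel_min + overhead.get(building_type, 0.0)

print(estimated_delivery_minutes(10, "house"))          # 10.0
print(estimated_delivery_minutes(10, "condo_complex"))  # 16.5
```

In a real system these overheads would be learned per city or even per building from historical delivery data rather than hard-coded.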
Delivery Hero knew it was one thing to have an efficient new platform for managing data, but another to trust that this platform can solve unexpected challenges that may not be apparent today. For more insights, watch Delivery Hero’s episode on Data Journeys.

Wrapping it up

We can’t wait to explore new customer data journeys next year and continue to share insights with the community. We hope you’ll follow the Data Journeys series by subscribing to the playlist, and if you have a great suggestion for a guest (or want to be a guest yourself), please let us know!

Reaching more customers with Contact Center AI: 2021 Wrap-up

2021 has been a high-stakes year for call centers, with many organizations forced to rapidly scale up their call center operations in response to ongoing pandemic disruptions. We’re proud that 2021 has also been an amazing year for Google Cloud’s Contact Center AI (CCAI), which has helped our customers adapt and thrive despite the challenging conditions. Beginning in January, we launched Dialogflow CX in GA. Agent Assist preview was released in May. Most recently, CCAI Insights GA was announced at Google Cloud NEXT in October. During NEXT, we shared lots of great content on how you can use CCAI to improve your customer experience with these breakout sessions:

- Using CCAI Insights to Better Understand Your Customers
- Customer Impact with Conversational AI
- Drive Results by Transforming the Customer Experience with AI-Powered Business Messages

But don’t just take our word for it. We also got a chance to hear how some companies are using CCAI to better reach their own customers, including The Home Depot, TELUS, and Love Holidays. We partnered with CDW to discuss transforming the contact center with AI and with Quantiphi on how to migrate from Dialogflow ES to CX. Our integration with Looker Block also makes CCAI Insights even more powerful by visualizing contact center metrics. Over the summer, we hosted a Dialogflow CX competition with more than 1,100 participants. Just last month, we showed how we’ve enabled businesses to use AI in their interactions using Google Business Messages. Looking ahead, we talked about the future in our article, “Reimagining your Customer Experience with Conversational AI.”

Amwell, a U.S.-based telehealth company that is launching CCAI, including the recently launched CCAI Insights, is among the enterprises harnessing AI to transform its call centers.
“With Contact Center AI, we aim to digitize our support for improved operational efficiency and elevated analytics capabilities, while enhancing the customer experience for patients, providers, and staff,” says Paul Johnson, SVP Client Services at Amwell. “Contact Center AI Insights will allow Amwell to better understand why our platform users are reaching out to support and how they feel about the overall experience – valuable insights for our support organization.”

As we recap the momentum of CCAI for 2021, it’s also a good time to review exactly how CCAI works.

What is CCAI?

As the volume of customer calls increases, it’s becoming even more important to make the most of human agents’ time to lower costs and improve customer experiences. CCAI enables you to do just that: it frees human agents to concentrate on more complex calls by providing them with real-time information to better handle those calls.

Single source of intelligence: Contact Center AI provides a consistent, high-quality conversational experience across all channels and platforms, both human and virtual. Because the “brains” of CCAI are centralized in the cloud, you can apply consistent intelligence across every application in the customer journey.

Ability to go off-script: Huge cost savings can be realized by having a virtual agent handle voice calls. This is easier said than done, however, because conversations are rarely completely linear; instead, they meander from topic to topic, which is difficult to handle programmatically in a fixed-path Interactive Voice Response (IVR) system. Contact Center AI has the ability to go “off script”: to let callers go down tangents or side paths to the main conversation while still tracking toward the main objective of the call. With CCAI, your virtual agents can answer complex questions and complete complicated tasks, including allowing for unexpected stops and starts, unusual word choices, or implied meanings.
Developers can define supplemental questions, and CCAI can easily retain the context, answer the supplemental question, and come back to the main flow.

Versatile fulfillment: CCAI can handle multiple use cases for the customer with the same virtual agent, which enables you to fully automate routine tasks and deflect calls. The same virtual agent can take a payment, update information like a phone number, give a customer information on their balance, and process information for other tasks, all within the same conversational flow.

How does CCAI work?

CCAI has four key components:

Conversation Core: This is the central AI brain that underpins CCAI and its ability to understand, talk, and interact. It enables and orchestrates high-quality conversational experiences at scale, making it possible for customers to have conversations with a virtual agent that are as good as conversations with a human agent.

- Understand – Speech-to-text speech recognition understands what customers are saying, regardless of how they phrase things, what vocabulary they use, what accent they have, and so on.
- Talk – Text-to-speech enables virtual agents to respond to customers in a natural, human-like manner that pushes the conversation along rather than frustrating them.
- Interact – Dialogflow identifies customer intent and determines the appropriate next step. You can build conversational flows in a point-and-click interface and generate automated ML models for human-like conversational experiences.

Virtual agents with Dialogflow: This component automates interactions with customers, using natural conversation to identify and address their issues. Virtual agents enable customers to get immediate help anytime, day or night.

Agent Assist: This component brings AI to human agents to increase the quality of their work while decreasing their average handling time.
Agent Assist shares initial context and provides real-time, turn-by-turn guidance to coach agents through business processes, as well as full call transcriptions that agents can edit and file quickly.

CCAI Insights: CCAI Insights aids your contact center management team in making better data-driven decisions by breaking down conversations using natural language processing and machine learning. This information allows your business to reduce manual analysis and focus on decisions like which conversations need your attention, where to deploy virtual agent automation to have the biggest impact, and how to address your customer needs.

How does CCAI create experiences for agents and customers?

When a user initiates a chat or voice call and the contact center provider connects them with CCAI, a virtual agent engages with the user, understands their intent, and fulfills the request by connecting to the backend. If necessary, the call can be handed off to a human agent, who sees the transcript of the interaction with the virtual agent, gets feedback from the knowledge base to respond to queries in real time, and receives a summary of the call at the end. Insights help you understand what happened during the virtual agent and live agent sessions. The result is improved customer experiences and CSAT scores, lower agent handling times, and more time for human agents to spend on more complicated customer issues.

And there you have it: a quick overview of CCAI and its progress in 2021. For more details, check out the documentation or our CCAI solutions page.

For more #GCPSketchnote, follow the GitHub repo. For similar cloud content follow me on Twitter @pvergadia and keep an eye out on thecloudgirl.dev.
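In production, the intent detection and routing described above are handled by Dialogflow’s ML models, not keyword rules. Purely as a caricature of the flow (virtual agent answers what it can; an unmatched turn triggers a human handoff that carries the transcript), here is a toy Python sketch; the `INTENTS` table and both functions are invented stand-ins, not the CCAI API.

```python
# Toy intent table: intent name -> trigger phrases (a stand-in for ML detection).
INTENTS = {
    "balance": ["balance", "how much"],
    "update_phone": ["phone number", "update my number"],
}

def detect_intent(utterance):
    """Keyword matcher standing in for Dialogflow's intent detection."""
    text = utterance.lower()
    for intent, phrases in INTENTS.items():
        if any(p in text for p in phrases):
            return intent
    return None  # no confident match

def handle_call(utterances):
    """Virtual agent handles matched turns; an unmatched turn hands off
    to a human agent along with the transcript so far."""
    transcript = []
    for u in utterances:
        intent = detect_intent(u)
        transcript.append((u, intent))
        if intent is None:
            return {"handoff": True, "transcript": transcript}
    return {"handoff": False, "transcript": transcript}

print(handle_call(["What's my balance?", "Also update my number please"]))
```

The point of the sketch is the handoff contract: whether the turn is resolved by the virtual agent or escalated, the human agent receives the full transcript and detected intents.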

Google Cloud enables the National Cancer Institute's Institute for Systems Biology-Cancer Gateway in the Cloud to support breast cancer research with fast and secure data sharing

Research organizations today recognize the challenge of sifting through siloed data sets, and analyzing and sharing this data with the global research community—all while staying secure and compliant within a range of national and international standards. It is precisely these constraints that led the U.S. National Cancer Institute (NCI) to create Cloud Resources, which are components of the NCI Cancer Research Data Commons that allow scientists to analyze cancer datasets in a cloud environment (vs. having to download data and use custom hardware). Included in these resources is the Institute for Systems Biology-Cancer Gateway in the Cloud (ISB-CGC). ISB-CGC relies on Google Cloud to securely host terabytes of genomic and proteomic data, and provide flexible and scalable analytics tools that can be integrated into research models. Complex computations that traditionally required days to complete are now executed in just minutes or hours. And ISB-CGC can now deliver open data, compute, and analytics resources to the global research community.Enabling faster time-to-discoverySpeed and scale can make all the difference when it comes to potentially life-saving research. Take breast cancer for example. It’s the world’s most prevalent cancer and according to the World Health Organization, more than two million women were diagnosed with it in 2020 alone. With such a large number of impacted women, each with unique biological features and personal paths through the disease, breast cancer research is particularly data intensive. And processing this on-premises is too slow, expensive, and burdensome to patients.By working with Google, NCI’s ISB has not only made data more useful to cancer researchers around the world, but also has fundamentally changed how cancer investigators conduct research. 
BigQuery, Google Cloud’s highly scalable multicloud data warehouse, underpins the cloud-based platform that connects researchers to a wide collection of cancer datasets, as well as the analytical and computational infrastructure to analyze that data quickly. “We are spreading the message of the cost-effectiveness of the cloud,” said Dr. Kawther Abdilleh, lead bioinformatics scientist at General Dynamics Information Technology, a partner of ISB. “With Google Cloud’s BigQuery, we’ve successfully demonstrated that researchers can inexpensively analyze large amounts of data, and do so faster than ever before.”

Integrating diverse tools and datasets

Traditionally, researchers have downloaded source data and performed analysis locally on their personal machines using programming languages like R and Python. As the volume and complexity of cancer data has grown, this approach has become unsustainable. Through the use of Google Cloud services, like Notebooks and BigQuery application programming interfaces (APIs), researchers can now use their desired methods to analyze data on the ISB-CGC platform, directly in the cloud, without the need to download data. For example, in their September 2020 paper on data integration and analysis in the cloud, Dr. Abdilleh and Dr. Boris Aguilar, senior research scientists at ISB, demonstrated how cloud-based data analysis can be used to identify novel biological associations between clinical and molecular features of breast cancer. “Google’s AI platform, for example, allows us to easily create notebooks to use R or Python in combination with BigQuery or machine learning to perform large-scale statistical analysis of genomic data, all in the cloud,” Aguilar wrote. “This type of analysis is particularly effective when the data is large and heterogenous, which is the case for cancer-related data.” Drs.
Abdilleh and Aguilar developed a set of BigQuery user-defined functions (UDFs) to perform statistical tests and gain a more holistic picture of breast cancer. Performing these statistical functions directly on the massive data stored in BigQuery, rather than in an on-premises computer program later in the analysis workflow, saved a significant amount of time. In fact, by using UDFs with BigQuery, analysis that typically required supercomputers and days of computation was completed in minutes. Drs. Abdilleh and Aguilar have now made their UDFs available for use by the broader research community via BigQuery, opening doors for fellow breast cancer researchers to build on this progress and make strides in their life-saving work.

Global access to critical cancer data

With so many lives and families impacted by cancer, and researchers worldwide diligently seeking answers, the need to accelerate and improve the means by which cancer research is conducted is critical. ISB-CGC’s success using Google Cloud as the foundation of its infrastructure and data cloud strategy has opened the door for the cancer research community to gain real-time, secure access to data that plays a significant role in the early detection of cancer. Read the case study for more detail on how Google Cloud is supporting breast cancer research.
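The published UDFs run as SQL inside BigQuery, next to the data. To give a flavor of the statistics involved, here is a pure-Python version of Welch’s t statistic, the kind of two-group comparison (say, a molecular feature measured in two clinical groups) that such a UDF can compute without exporting a single row; the sample values are made up.

```python
import math

def welch_t(xs, ys):
    """Welch's t statistic for two samples with possibly unequal variances:
    t = (m1 - m2) / sqrt(s1^2/n1 + s2^2/n2)."""
    def mean_var(v):
        m = sum(v) / len(v)
        var = sum((x - m) ** 2 for x in v) / (len(v) - 1)  # sample variance
        return m, var
    m1, v1 = mean_var(xs)
    m2, v2 = mean_var(ys)
    return (m1 - m2) / math.sqrt(v1 / len(xs) + v2 / len(ys))

# Hypothetical expression values for two patient groups.
print(round(welch_t([2.1, 2.5, 2.3, 2.7], [1.1, 1.4, 1.0, 1.3]), 2))  # 7.59
```

Expressed as a BigQuery UDF, the same formula runs inside a single query over millions of rows, which is where the "days to minutes" speedup described above comes from.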

2021 Gartner® Magic Quadrant™ for Cloud Database Management Systems recognizes Google as a Leader

We are thrilled that Gartner has positioned Google as a Leader for the second year in a row in the 2021 Gartner® Magic Quadrant™ for Cloud Database Management Systems (DBMS). We believe the report evaluated Google’s unified capabilities across both transactional and analytical use cases, and showcased innovation progress in areas like data management consistency, high-speed processing and ingestion, security, elasticity, advanced analytics, and more. With the recent announcement of Dataplex, organizations can centrally manage, monitor, and govern their data across data lakes, data warehouses, and data marts with consistent controls. Solutions like BigQuery ML provide a “built-in” approach for advanced analytics capabilities, and Analytics Hub offers the infrastructure customers need to share data analytics solutions securely and at scale in ways never before achieved. For example, over a seven-day period in April, more than 3,000 different organizations shared over 200 petabytes of data using BigQuery. Research shows 90% of organizations have a multicloud strategy, which is why we have invested in a cross-cloud data analytics solution for Google Cloud, AWS, and Azure with BigQuery Omni. Additionally, our progress with Anthos and our Distributed Cloud this past year further advances our ability to support multi- and hybrid-cloud scenarios. To gain a competitive advantage using data, organizations need a data platform that transcends transactional and analytical workloads and can be run with the highest level of reliability, availability, and security. Cloud Spanner, our globally distributed relational database, has redefined the scale, global consistency, and availability of Online Transaction Processing (OLTP) systems. Spanner processes over 1 billion requests per second at peak, and has been battle-tested with some of the most demanding applications, including Google services such as Search, YouTube, Gmail, Maps, and Payments.
What’s unique about our core services, Spanner and BigQuery, is that they leverage common infrastructure: our highly durable distributed file system (Colossus), our large-scale cluster management system (Borg), and Jupiter, our high-performance networking infrastructure. This shared foundation enables features such as federation between Spanner and BigQuery. We also remain focused on integration with Google Trends, Maps, Search, and Ads, and have deepened our industry domain expertise in areas such as retail, financial services, healthcare, and gaming. We’re continuing to develop industry white papers, such as How to develop Global Multiplayer Games using Cloud Spanner, and we’re proud of the work the team has done to create and share industry and horizontal architecture patterns, built with input from industry leaders, that serve as solution accelerators for customer use cases.

Innovation momentum continues with a unified and open data cloud

We continue to innovate across our data cloud portfolio, especially with the announcements we made at Google Cloud Next ’21. BigQuery Omni is now available for AWS and Azure, supporting cross-cloud analytics for customers. We’ve added capabilities for enterprise data management and governance with Dataplex, which is now generally available. We’ve made migrations to Cloud SQL easier and faster with the Database Migration Service: more than 85% of all migrations are underway in under an hour, with the majority of customers migrating their databases from other clouds. We are embracing openness with Spanner by adding a PostgreSQL interface, allowing enterprises to take advantage of Spanner’s unmatched global scale, 99.999% availability, and strong consistency using skills and tools from the popular PostgreSQL ecosystem. And we are automating data processing with Spark on Google Cloud, which lets developers spend less time on infrastructure management and more time on data science, modeling, and delivering business value.
Finally, we announced Google Earth Engine on Google Cloud, allowing customers to integrate Earth Engine with BigQuery, Google Cloud’s ML technologies, and the Google Maps Platform. With these innovations, enterprises like PayPal, Deutsche Bank, and Equifax use Google Cloud to address their end-to-end data lifecycle use cases. Organizations like Telefónica use Google Cloud to deliver new customer experiences; Telefónica has transformed every aspect of how it stores, shares, and analyzes data while doubling processing power and lowering costs. We continue to support an open ecosystem of data partners, including Informatica, Tableau, MongoDB, Neo4j, C3.ai, and Databricks, giving customers the flexibility to build their data clouds without being locked into a single approach.

We are honored to be named a Leader in the 2021 Gartner Magic Quadrant for Cloud Database Management Systems (DBMS), and look forward to continuing to innovate and partner with you on your digital transformation journey. Download the complimentary 2021 Gartner Magic Quadrant for Cloud Database Management Systems report, and learn more about how organizations are building their data clouds with Google Cloud solutions.

Gartner, Magic Quadrant for Cloud Database Management Systems, Henry Cook, Merv Adrian, Rick Greenwald, Adam Ronthal, Philip Russom, 14 December 2021. Gartner and Magic Quadrant are registered trademarks of Gartner, Inc. and/or its affiliates in the U.S. and internationally and are used herein with permission. All rights reserved. Gartner does not endorse any vendor, product, or service depicted in its research publications and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s Research & Advisory organization and should not be construed as statements of fact.
Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose. This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available upon request from Google Cloud.
Source: Google Cloud Platform

Optimize your system design using Architecture Framework Principles

To help our customers on their path to success with Google Cloud, we published the Google Cloud Architecture Framework, a set of canonical best practices for building and operating workloads that are secure, efficient, resilient, high performing, and cost effective. Today, we’re diving deeper into the Architecture Framework’s System Design pillar, including the four key principles of system design and recent improvements to our documentation. We’ll also introduce the new space of the Google Cloud Community dedicated to the Architecture Framework, created to help you achieve your goals with a global community of supportive and knowledgeable peers, Googlers, and product experts.

What is system design?

The System Design pillar is the foundational pillar of the Architecture Framework. It covers the Google Cloud products, features, and design principles that help you define the architecture, components, and data you need to satisfy your business and system requirements. The System Design concepts and recommendations also apply across the other five pillars of the Architecture Framework: Operational Excellence; Security, Privacy, and Compliance; Reliability; Cost Optimization; and Performance Optimization. You can evaluate the current state of your architecture against the guidance in the System Design pillar to identify potential gaps and areas for improvement.

System design core principles

A robust system design is secure, reliable, scalable, and independent, enabling you to apply changes atomically, minimize potential risks, and improve operational efficiency. To achieve a robust system design, we recommend you follow four core principles.

Document everything

When customers are either looking to move to the cloud or starting to build their applications, one of the major blockers to success we see is a lack of documentation. This is especially true when it comes to correctly visualizing current architecture deployments.
A properly documented cloud architecture establishes a common language and standards, enabling your cross-functional teams to communicate and collaborate effectively. It also provides the information needed to identify and guide future design decisions. Over time, your design decisions will grow and change, and the change history provides the context your teams need to align initiatives, avoid duplication, and measure performance changes effectively over time. Change logs are particularly invaluable when you’re onboarding a new cloud architect who is not yet familiar with your current system design, strategy, or history.

Simplify your design (use fully managed services)

When it comes to system design, simplicity is key. If your architecture is too complex to understand, your developers and operations teams can face complications during implementation or ongoing management. Wherever possible, we highly recommend using fully managed services to minimize the risk, time, and effort involved in managing and maintaining baseline systems. If you’re already running your workloads in production, testing managed service offerings can help simplify operational complexities. If you’re starting fresh, start simple, establish a minimum viable product (MVP), and resist the urge to over-engineer. You can identify corner use cases, iterate, and improve your systems incrementally over time.

Decouple your architecture

Decoupling is a technique for separating your applications and service components, such as a monolithic application stack, into smaller components that can operate independently. A decoupled component can therefore run its functions regardless of its various dependencies. With a decoupled architecture, you have increased flexibility to apply independent upgrades, enforce specific security controls, establish reliability goals, monitor health, and control granular performance and cost parameters.
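As a toy sketch of this principle, the snippet below separates a producer and a stateless processor that interact only through a message queue, so either side can be replaced, scaled, or restarted independently. All names and the in-process `queue.Queue` are illustrative assumptions; in a real decoupled system the queue would typically be a managed service such as Pub/Sub.

```python
# Minimal sketch of decoupling: two components that share no internal
# state and communicate only through a queue. Names are hypothetical.
import queue

def order_producer(q: queue.Queue, orders: list) -> None:
    """Publishes orders; knows nothing about who consumes them."""
    for order in orders:
        q.put(order)

def tax_processor(q: queue.Queue) -> list:
    """Stateless consumer: each message carries everything it needs,
    so any number of identical workers could drain the same queue."""
    results = []
    while not q.empty():
        order = q.get()
        results.append({**order, "total": round(order["amount"] * 1.1, 2)})
    return results

q = queue.Queue()
order_producer(q, [{"id": 1, "amount": 100.0}, {"id": 2, "amount": 40.0}])
processed = tax_processor(q)
print(processed)
```

Because the processor keeps no local state between messages, it also illustrates the statelessness principle discussed below: it can be restarted or horizontally scaled without losing progress, as the queue holds the only in-flight state.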
You can start decoupling early in your design phase or incorporate it as part of your system upgrades as you scale.

Utilize statelessness

To perform a task, stateful applications rely on various dependencies, such as locally cached data, and often require additional mechanisms to capture progress and sustain restarts. Stateless applications, on the other hand, can perform tasks without significant local dependencies by using shared storage or cached services. This enables your applications to scale up quickly with minimal boot dependencies, withstand hard restarts, reduce downtime, and maximize service performance for end users. The System Design pillar describes recommendations for making your applications stateless, and for using cloud-native features to better capture machine state for your stateful applications.

System design principles applied across other pillars

The core System Design principles can be applied across the other five pillars of the Architecture Framework, including Operational Excellence, Security, Reliability, Cost Optimization, and Performance Optimization. Here are a few examples of how this looks in practice.

Use fully managed and highly available operational tools to deploy and monitor your workloads, so you can minimize the operational overhead of maintaining and optimizing them.

Apply security controls at the component level. By decoupling and isolating components, you can apply fine-grained governance controls to effectively manage compliance and minimize the blast radius of potential security vulnerabilities.

Design for high availability and scalability. A decoupled architecture enables you to define and control granular reliability goals, so you can maximize the durability, scalability, and availability of your critical services while optimizing non-critical components as you go.

Define budgets and design for cost efficiency.
Cost usually becomes a significant factor as you define reliability goals, so it’s important to consider cost metrics early in the design of your applications. A decoupled architecture helps you enforce granular cost budgets and controls, improving operational efficiency and cost optimization.

Optimize your design for speed and performance. As you design for service availability within your cost budget, make sure you also consider performance metrics. Various operational tools provide insights into performance bottlenecks and highlight opportunities to improve performance efficiency.

These are just a few examples, but you can see how the System Design principles extend to many other use cases across the other five pillars of the Architecture Framework.

The Architecture Framework is now part of the Google Cloud Community

The Google Cloud Community is an innovative, trusted, and vibrant hub where Google Cloud users can ask questions and find answers, engage and build meaningful connections, share ideas that shape product roadmaps, and learn new skills and develop expertise.

Today, we’re announcing the launch of a new space in the Google Cloud Community dedicated to the Architecture Framework. In this space, you can:

Access canonical articles that provide practical guidance and address specific questions and challenges related to the System Design pillar. We’ll be releasing articles focused on the remaining five pillars in the coming months.

Engage in open discussion forums where members can ask questions and receive answers.

Participate in Community events, such as our “Ask Me Anything” series, where we’ll host a virtual webinar on a specific topic of the Architecture Framework and open it up for questions from the audience.
Together, the Google Cloud Community and the Architecture Framework provide a trusted space for you to achieve your goals alongside a global community of supportive and knowledgeable peers, Googlers, and product experts. Explore the new space today and, if you haven’t already, sign up to become a member so you can take full advantage of all the opportunities available.

What’s new for System Design 2.0?

Earlier this year, we released an updated version (2.0) of the Architecture Framework, and we’ve continued to enhance our catalog of best practices based on feedback from our global partner and customer base, as well as our team of Google product experts. Here’s what’s new in the System Design pillar:

Resource labels and tags best practices were added to simplify resource management.

The compute section is reorganized to focus on choosing, designing, operating, and scaling compute workloads.

The database section is reorganized into topics such as selection, migration, and operation of database workloads, and highlights best practices around workflow management.

The data analytics section now includes sections on data lifecycle, data processing, and transformation.

A new section on artificial intelligence (AI) and machine learning (ML) covers best practices for deploying and managing ML workloads.

As always, we welcome your feedback so we can continue to improve and support you on your path to success with Google Cloud. Special note and thank you to Andrew Biernat, Willie Turney, Lauren van der Vaart, Michelle Lynn, and Shylaja Nukala for helping host the Architecture Framework on the Google Cloud Community site.
And thank you to Minh “MC” Chung, Rachel Tsao, Sam Moss, Nitin Vashishtha, Pritesh Jani, Ravi Bhatt, Olivia Zhang, Zach Seils, Hamsa Buvaraghan, Maridi Makaraju, Gargi Singh, and Nahuel Lofeudo for helping make the System Design content a success!
Source: Google Cloud Platform