Apigee best practices for Contact Center AI

By now, you’ve probably interacted with a customer service chatbot at some point. However, many of those interactions may have left a lot to be desired. Modern consumers generally expect more than a simple bot that answers questions with predefined answers—they expect a virtual agent that can solve their problems. Google Cloud Contact Center AI (CCAI) can make it easier for organizations to efficiently support their end customers with natural interactions delivered through AI-powered conversation. In this guide, we’ll share seven Apigee best practices for building fast, effective chatbots with secure APIs using CCAI and Apigee API Management. This post assumes you have basic knowledge of CCAI and Apigee API Management.

Good conversation is challenging

One of the many challenges organizations face is how to provide a bot experience to customers when information resides in more places than ever. Creating an optimal virtual agent generally involves integrating with both new and legacy systems that are spread across a mix of on-premises and cloud environments, using REST APIs.

Dialogflow CX is the natural language processing module of CCAI that translates text or audio from a conversation into structured data. A powerful feature of Dialogflow CX is webhook fulfillment for connecting with backend systems. Once a virtual agent triggers a webhook, Dialogflow CX connects to backend APIs, consumes the responses, and stores the required information in its context. This integration allows virtual agents to have more informed and purposeful interactions with end users, such as verifying store hours, determining whether a particular item is in stock, or checking the status of an order.

Developing APIs for CCAI fulfillment is not a straightforward task. Common challenges include:

Complexity: You may need to access APIs that are not exposed externally, which can require significant collaboration and rules to enable access to existing data and systems. Without an API gateway that can translate the complexities of data systems in real time and forward them to a customer, this can easily lead to technical debt and inefficiency.

Increased customer frustration: Contact centers often act as one of the primary drivers of customer experience. Improving the speed of response can enhance experiences, but any friction or delays can be magnified. Caching and prefetching data are commonly used techniques for enabling faster virtual agent responses.

API orchestration: APIs generally require more than just exposing an endpoint, because they need to change often in response to customer needs. This flexibility can require API orchestration, where APIs are decoupled from rigid services and integrated into an interface tailored to the expected consumption patterns and security requirements of interacting with Dialogflow CX. Without an API platform, translating the complexities of data systems in real time and forwarding them to the caller is inefficient.

How Dialogflow and Apigee deliver better chatbot experiences

CCAI can be more effective when woven into the fabric of the business via APIs. The more functionality (and therefore more APIs) you add to the agent, the more critical it becomes to streamline the API onboarding process. You need to consolidate repetitive work, validate security postures, and identify and implement optimizations to ensure a great end-user experience. Apigee API Management can pave the way for faster and easier fulfillment.
Apigee is an intuitive platform for bot designers and architects to incorporate key business processes and insights into their workflow. More specifically, it enables Dialogflow to speak with your backend systems. You can use Apigee’s built-in policies to inspect Dialogflow requests, set responses, validate defined parameters, and trigger events in real time. For example, if a call meets a defined business criterion, Apigee can augment a “360-degree view” in a data warehouse like BigQuery, add a customer to a campaign list, or send an SMS/text alert—all without any material impact on routing time. By pairing CCAI with Apigee, you can leverage a greater portion of Google Cloud’s transformation toolset, reduce the time conversation architects need to integrate APIs, and create a more cohesive development environment for solving call center challenges.

Seven ways to get more out of Contact Center AI API development with Apigee

The following best practices apply to Apigee API development for Dialogflow CX fulfillment APIs:

1. Create a single common Apigee API proxy

Let’s assume we have a Dialogflow CX virtual agent that needs three fulfillment APIs, all fronted by Apigee:

get list of movies
add movie ticket to cart
order item in cart

Technically, you can create a separate Dialogflow CX webhook for each of these APIs, each pointing to its own API proxy. However, because Dialogflow has a proprietary request and response format, creating three separate API proxies for those fulfillment APIs results in three non-RESTful proxies that are difficult for any client other than Dialogflow CX virtual agents to consume. Instead, we recommend creating a common Apigee API proxy that is responsible for handling all the fulfillment APIs required by the agent. Dialogflow CX then has just one webhook, configured to send requests to the common Apigee API proxy. Each webhook call carries a webhook tag that uniquely identifies the correct fulfillment API.

2. Leverage Dialogflow policies as much as possible

Apigee provides two Dialogflow-specific policies: ParseDialogflowRequest and SetDialogflowResponse. It is highly recommended to use these policies whenever possible. Doing so not only adheres to the general best practice of choosing built-in policies over custom code, but also ensures that parsing and setting of Dialogflow requests and responses is standardized, hardened, and performant.

As a general rule:

ParseDialogflowRequest is required only once in an API proxy and should be placed in the PreFlow, after authentication has taken place.
SetDialogflowResponse may be used for each distinct fulfillment response (i.e., for each unique webhook tag). If SetDialogflowResponse does not meet all of your requirements, supplement or replace it with AssignMessage or JavaScript policies.

3. Use conditional flows for each webhook tag

Conditional flows should be used to separate the logic for the different fulfillment APIs. The easiest way to implement this is to place a ParseDialogflowRequest policy in the PreFlow. Once that policy has run, the flow variable google.dialogflow.<optional-prefix>.fulfillment.tag is populated with the value of the webhook tag. That variable can then be used to define the conditions under which a request enters a particular conditional flow.

Here is an example of a conditional flow using the same three fulfillment APIs from above.
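The sketch below shows one way such a ProxyEndpoint could be laid out. Treat it as a minimal illustration rather than a complete proxy: the policy names (VerifyAPIKey, PDR-ParseDialogflowRequest, the SC-* callouts and SDR-* response policies), the base path, and the tag values are assumptions, and the conditions assume no prefix was configured for the Dialogflow flow variables.

<ProxyEndpoint name="default">
  <HTTPProxyConnection>
    <!-- Single webhook endpoint shared by all fulfillment APIs -->
    <BasePath>/v1/dialogflow-fulfillment</BasePath>
  </HTTPProxyConnection>
  <PreFlow name="PreFlow">
    <Request>
      <!-- Authenticate first, then parse the Dialogflow CX webhook request -->
      <Step><Name>VerifyAPIKey</Name></Step>
      <Step><Name>PDR-ParseDialogflowRequest</Name></Step>
    </Request>
  </PreFlow>
  <Flows>
    <!-- One conditional flow per webhook tag -->
    <Flow name="get-movie-list">
      <Condition>google.dialogflow.fulfillment.tag = "get-movie-list"</Condition>
      <Request>
        <Step><Name>SC-GetMovieList</Name></Step>
      </Request>
      <Response>
        <Step><Name>SDR-MovieList</Name></Step>
      </Response>
    </Flow>
    <Flow name="add-ticket-to-cart">
      <Condition>google.dialogflow.fulfillment.tag = "add-ticket-to-cart"</Condition>
      <Request>
        <Step><Name>SC-AddTicketToCart</Name></Step>
      </Request>
      <Response>
        <Step><Name>SDR-CartStatus</Name></Step>
      </Response>
    </Flow>
    <Flow name="order-cart-item">
      <Condition>google.dialogflow.fulfillment.tag = "order-cart-item"</Condition>
      <Request>
        <Step><Name>SC-OrderCartItem</Name></Step>
      </Request>
      <Response>
        <Step><Name>SDR-OrderConfirmation</Name></Step>
      </Response>
    </Flow>
  </Flows>
</ProxyEndpoint>

Each SC-* step would reach the corresponding backend (directly, or via a chained resource proxy as described in the next section), and each SDR-* step maps the backend response back into the Dialogflow CX webhook format.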
4. Consider utilizing proxy chaining

Dialogflow CX webhooks have their own request and response format instead of following RESTful conventions such as GET for reads, POST for creates, and PUT for updates. This makes it difficult for conventional clients to consume an API proxy created for Dialogflow CX. Hence, we recommend using proxy chaining.

With proxy chaining, you can separate API proxies into two categories: Dialogflow proxies and resource proxies. Dialogflow proxies can be lightweight proxies limited to actions specific to the Dialogflow client. These might include:

Authenticating requests
Translating a Dialogflow CX request into a RESTful format
Sending a RESTful request to the resource proxy
Translating the response from the resource proxy back into the Dialogflow format

Any tasks that involve connecting to the backend and exchanging data should fall to your resource proxies. You should create resource proxies just like any other Apigee API proxy, without Dialogflow in mind. The focus should be on providing an elegant, RESTful interface that all types of clients can easily consume.

Proxy chaining provides a way to reuse proxies. However, it can incur some additional overhead as the call moves from one proxy to another. Another approach is to develop components that are expressly designed for reuse, using reusable shared flows. Shared flows combine policies and resources and can be abstracted into shared libraries, allowing you to capture functionality that can be consumed in multiple places. They also let security teams standardize their approach and rules for connectivity to trusted systems, assuring security compliance without compromising the rate of innovation. Proxies you want to chain in this way must be in the same organization and environment.

5. Improve performance with cache prefetching

When creating a chatbot or any other natural language understanding-enhanced application, response latency is an important metric — the time it takes for the bot to respond back to the user. Minimizing this latency helps retain user attention and avoids scenarios where the user is left wondering whether the bot is broken.

If a backend API that a Dialogflow virtual agent relies on has a long response time, it may be useful to prefetch the data and store it in Apigee’s cache to improve performance. You can include tokens and other meta-information, which can directly affect the time elapsed between customer input and a return prompt from Dialogflow. The Apigee cache is programmable, which enables greater flexibility and thus a better conversation experience. You can implement prefetching and caching using the Response Cache (or Populate Cache) policy combined with the Service Callout policy.
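As a rough illustration of that pattern, the snippet below pairs a Service Callout with Populate Cache to prefetch a slow backend response, and a Lookup Cache policy to read it on later webhook calls. The policy names, the cache key, the five-minute expiry, and the backend URL are all assumptions made for this sketch.

<!-- Prefetch: call the slow backend ahead of time, for example early in the conversation -->
<ServiceCallout name="SC-PrefetchMovieList">
  <Request variable="prefetchRequest"/>
  <Response>prefetchResponse</Response>
  <HTTPTargetConnection>
    <URL>https://movies-backend.example.com/v1/movies</URL>
  </HTTPTargetConnection>
</ServiceCallout>

<!-- Store the callout response in the Apigee cache for five minutes -->
<PopulateCache name="PC-MovieList">
  <CacheKey>
    <KeyFragment>movie-list</KeyFragment>
  </CacheKey>
  <Scope>Exclusive</Scope>
  <ExpirySettings>
    <TimeoutInSec>300</TimeoutInSec>
  </ExpirySettings>
  <Source>prefetchResponse.content</Source>
</PopulateCache>

<!-- On later webhook calls, read the cached payload instead of waiting on the backend -->
<LookupCache name="LC-MovieList">
  <CacheKey>
    <KeyFragment>movie-list</KeyFragment>
  </CacheKey>
  <Scope>Exclusive</Scope>
  <AssignTo>cachedMovieList</AssignTo>
</LookupCache>

A Response Cache policy is a simpler alternative when the cached data maps one-to-one to a proxy response; the Populate Cache and Lookup Cache pair gives more control over what is stored and when it is refreshed.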
6. Prefer responding with a single complex parameter instead of multiple scalar parameters

When responding to a virtual agent with the SetDialogflowResponse policy, you can return multiple values at once via the <Parameters> element, which accepts one or more child <Parameter> elements. If possible, it’s generally more effective to return a single parameter containing a JSON object instead of breaking the response into multiple parameters that each contain a single string or number. You can leverage this strategy via <JSONPath>.

This approach is recommended because:

Parameters are logically grouped.
Dialogflow CX can still easily access the composite parameters using dot notation.
The agent can use a null value for a single parameter to erase previous response parameters and delete the entire JSON object, instead of having to specify a null value for many individual parameters.

7. Consider responding with 200s on certain errors

If a webhook service encounters an error, Dialogflow CX recommends returning certain 4XX and 5XX status codes to notify the virtual agent that an error has occurred. Whenever Dialogflow CX receives these types of errors, it invokes the webhook.error event and continues execution without making the contents of the error response available to the agent.

However, there are scenarios where it is reasonable for the fulfillment API to provide feedback on an error, such as notifying the user that a movie is no longer available or that a certain cinema ticket is invalid. In these cases, consider responding with a 200 HTTP status code and providing context around whether the error was expected (e.g., 404) or unexpected (e.g., 5XX).

Get started

Apigee’s built-in policies, nuanced approach to security, shared flows, and caching mechanisms provide a smoother way to implement effective virtual agents that deliver speedy responses to your end customers. By applying these best practices, your Dialogflow engineers can spend more time innovating and building better conversation experiences rather than integrating backend systems. Try building a sample Contact Center AI workflow with Apigee or visit Integrating with Contact Center AI to find out more.
Source: Google Cloud Platform

QuintoAndar becomes largest housing platform in Latin America with help from Google Cloud

Stanford University classmates Gabriel Braga and André Penha knew the real estate market in Brazil was plagued by bureaucracy and steep fees, and they were sure they could build something better. They envisioned a digital marketplace that could connect potential tenants and homebuyers to landlords and sellers to streamline real estate transactions in Brazil. In 2012, they founded QuintoAndar, a housing marketplace that connects property owners, residents, brokers, and agents in Brazil. The company, which began with a small team of developers, now has the largest valuation of any proptech in Latin America, at $5.1B as of August 2021, after raising another $120M on top of the $300M in Series E funding it raised in May 2021.

Building a PWA at Google for Startups Accelerator: Brazil

QuintoAndar started out with four projects in two stacks for its main front-end products: Android and iOS mobile apps, and desktop and mobile websites. The brand wasn’t well known, so users were hesitant to install its apps. To meet its aggressive traffic and growth goals, QuintoAndar decided to participate in the Google for Startups Accelerator: Brazil program. Their Google mentors introduced them to the concept of Progressive Web Apps (PWAs), which use modern web capabilities to deliver an app-like user experience, and described a long-term strategy for QuintoAndar built on a PWA. QuintoAndar’s leadership could see that a PWA would allow them to evolve the product on multiple platforms by unifying the production and support of new features.

To help focus developers on a main stack and offer users a great web experience, the QuintoAndar team decided to go all-in on a PWA written in React, using Chrome as the browser and Chrome DevTools to develop and debug the app. They used Workbox to improve the offline experience and Google Material Design to unify the desktop-app cross-platform experience.

The PWA served as QuintoAndar’s main web digital channel, and three apps met the needs of three user categories: home buyers and renters; sellers or landlords; and real estate agents. Home buyers and renters used QuintoAndar’s main PWA to search for homes, schedule virtual or on-site visits, negotiate, and complete all the steps of the rental or sales process. Homeowners used the homeowners’ app to list properties for sale or rent, monitor visits, negotiate with potential buyers or tenants, and close deals. Real estate agents used the agents’ app to manage their schedules, book visits, contact clients, and manage deals.

QuintoAndar’s four years of focusing on its PWA helped the company shape its product and drive growth. Traffic increased to 30 times its initial rates. By 2021, with a larger engineering and product team and a well-known brand, QuintoAndar decided to invest in mobile app development to offer a better user experience. After researching mobile development options, the company built a native app with Flutter, and QuintoAndar’s app score went up from 3.9 to 4.5. The company continues to invest in both web and native mobile app platforms.

Leveraging Google Cloud to get results

Now, QuintoAndar has dedicated platform teams, such as design system, web performance, and native teams, that improve its tech stack and build developer tools for stream-aligned teams. On the stack side, they use Next.js for web and Flutter for native apps.
They also use YouTube, Google Maps Platform, Firebase, Cloud Firestore, Cloud Functions, and Analytics for real-time sync in features such as favorite lists and negotiations (the back-and-forth messages between the tenant/buyer and the homeowner/seller). When QuintoAndar launched, none of the players in the proptech market in Brazil showed the exact locations of their listings, which had a negative impact on user experience. QuintoAndar uses Google Maps to show the exact location of properties, which has pushed the market to change accordingly.

Looking forward to growth

QuintoAndar has grown steadily, and today the company employs over 4,000 people, with technical teams of more than 600. QuintoAndar is available in all five regions of Brazil and more than 60 Brazilian cities, and is expanding internationally, starting with Mexico. Taking its lead from Google mentors, the company has adopted guiding principles of innovation, keeping customers at the center of decision-making, working collaboratively, and delivering results.

If you want to learn more about how Google Cloud can help your startup, visit our page here to get more information about our program, and sign up for our communications to get a look at our community activities, digital events, special offers, and more.
Source: Google Cloud Platform

Understanding Google Cloud’s VMware Engine Migration Process and Performance

Google Cloud VMware Engine (GCVE) allows you to deploy a managed VMware environment within an enterprise cloud solution. We’ve put together a new white paper, “Google Cloud VMware Engine Performance Migration & Benchmarks,” to help our customers better understand the architecture, its performance, and the benefits. If you’re not familiar with Google Cloud VMware Engine yet, let’s talk a bit more about it.

Utilizing Google Cloud lets you access existing services and cloud capabilities; one of the services and solutions covered in this document is VMware Hybrid Cloud Extension, also known as HCX. HCX provides an easier transition from on-premises to the cloud, allowing systems administrators to quickly deploy a private cloud and scale their Virtual Machines as needed. The proposed reference solution is well suited for organizations looking to begin their cloud migration journey and understand the technical requirements of the process without having to be fully committed to a cloud strategy or a data center evacuation strategy.

Currently, many organizations are navigating their way through IT challenges and cloud solutions. Google Cloud VMware Engine provides an “easy on-ramp” to migrate your workloads into the cloud. You don’t have to move everything to the cloud at once, though, because GCVE gives you the option to scale your IT infrastructure from on-premises to the cloud at your discretion by leveraging HCX. HCX also lets you migrate a virtual machine from on-premises to the cloud via a VPN or internet connection without additional downtime and without users having to save their work and log off their machines. With GCVE, your teams can continue to work during business hours while your systems administrators migrate them to the cloud, without the downtime usually associated with virtual machine migration.

The ability to migrate a virtual machine from on-premises to the cloud raises another question: how fast can a targeted virtual machine be migrated to the cloud? Google analyzed this specific scenario, assessing the requirements to migrate an on-premises virtual machine to the cloud via a Virtual Private Network (VPN), and then analyzing how fast that connection was established and transmitted through HCX. The answer to that question—and more—is contained within our brand new white paper, “Google Cloud VMware Engine Performance Migration & Benchmarks,” which you can download now. And if you’re ready to get started with your migration efforts, sign up for a free discovery and assessment with our migration experts.
Source: Google Cloud Platform

How SingleStoreDB uses Google Cloud Marketplace to drive great customer experiences

Founded in 2012, SingleStoreDB is a distributed SQL, cloud-native database that offers ultra-fast, low-latency access to large datasets — simplifying the development of modern enterprise applications. And by combining transactional (OLTP) and analytical (OLAP) workloads, SingleStoreDB introduces new efficiencies into data architecture. We are a multicloud database, meaning that our customers can launch SingleStoreDB on any cloud infrastructure they choose. At the same time, our alignment with Google Cloud has been foundational to our success.

The following image showcases SingleStoreDB’s patented three-tier storage architecture — memory, local disk, and object storage. This architecture enables fast inserts, large volumes of data storage, and overall better scalability and TCO.

Let me tell you how Google Cloud Marketplace has helped us grow our business from a startup into one that 12 Fortune 50 companies and half of the top 10 banks in the U.S. rely upon today.

Filling a critical data niche to accelerate business value

First, let’s talk about why companies choose to use SingleStoreDB alongside Google Cloud and other IT solutions. SingleStoreDB on Google Cloud is ideal for data-intensive applications with these five key requirements:

Query latency — Execute and receive query results with millisecond latency
Concurrency — Support a large number of users or concurrent queries, without sacrificing latency
Query complexity — Handle both simple and complex queries
Data size — Effortlessly operate over large data sets
Data ingest speed — Load (or ingest) data at very high rates, from thousands to millions of rows per second

Every customer approaches IT differently. Many are taking either a multicloud or a hybrid approach to their infrastructure. By running on Google Cloud, SingleStoreDB helps customers manage workloads across all clouds and on-prem systems. We find that many of our customers use SingleStoreDB as a single-pane view into all their data and infrastructure, as well as for data-intensive applications. Because SingleStoreDB is an analytical and transactional database, it shines when handling workloads that require outstanding accuracy and speed.

Especially among our larger customers, companies increasingly run SingleStoreDB on Google Cloud thanks to Google’s global network backbone. Because Google Cloud owns its global fiber network, we tap into Google’s secure-by-design infrastructure to protect information, identities, applications, and devices. This, alongside the increasing popularity of Google Cloud AI/ML solutions, makes SingleStoreDB on Google Cloud a growing force.

Success on Google Cloud Marketplace

While the partnership between SingleStoreDB and Google Cloud solutions is paramount to our success, the availability of our solution on Google Cloud Marketplace amplifies customer satisfaction. Google Cloud Marketplace simply makes it easier for partners to sell to Google Cloud customers. Here are some of the advantages we’ve seen since launching on Google Cloud Marketplace:

Our customers can easily transact and deploy via Marketplace, providing that true Software-as-a-Service feel for SingleStoreDB.
Transacting through Marketplace counts toward our customers’ Google Cloud commit, which helps partners like us tap into budget that customers have already set aside for Google Cloud spend.
Billing and invoicing are intuitive and efficient, as everything is automated through Google Cloud Marketplace.
Our customers receive a single, clear invoice from Google covering SingleStoreDB, Google Cloud, and other partner solutions purchased from Marketplace.
Many of our customers have previously signed the Google Cloud User License Agreement (ULA), which further expedites procurement by speeding up legal reviews.

Transforming how we sell

All of these advantages translate to a powerful change in our sales strategy. We no longer have to get bogged down by numbers and contracts. Instead, we can focus on educating customers about how the use of our solutions on Google Cloud will deliver returns to their business. Co-selling with Google Cloud has benefited our business significantly and provides customers with the best possible experiences.

We have also seen surprising advantages stem from our success on Google Cloud Marketplace. Our product and engineering teams are much more engaged with Google Cloud technologies now, putting a heavy emphasis on new integrations. We’re actively exploring new Dataflow templates, native connectors for BigQuery and Data Studio, and other solutions.

What SingleStoreDB customers are saying

To give you a better idea of how our customers are benefiting, here are some recent comments about running SingleStoreDB on Google Cloud and transacting through Google Cloud Marketplace:

“Our goal has and will always be to build our platform in a way that makes it feel like an on-premise solution. Speed of data processing and delivering are critical to providing this within our solutions. Both Google Cloud and SingleStore have helped us achieve this.” — Benjamin Rowe, Cloud & Security Architect, Arcules

“The SingleStore solution on Google Cloud allows marketing teams using Factors.ai to make more informed data decisions by organizing data in one system and allowing for self-serve analytics.” — Praveen Das, Co-Founder, Factors.ai

At the end of the day, our customer experiences and the business impacts they achieve with our solutions are our most critical KPIs. By partnering with Google Cloud, we have unlimited potential to improve our services to power the next generation of data and analytics applications. Discover how SingleStore can transform your business on the Google Cloud Marketplace.
Source: Google Cloud Platform

Google Distributed Cloud adds AI, ML and Database Solutions to deliver customers even greater flexibility and choice

Organizations need a cloud platform that can securely scale from on-premises, to edge, to cloud while remaining open to change, choice, and customization. They must be able to run their applications wherever they need, on infrastructure optimized to process a very high volume of data with minimal delay, all while maintaining the satisfaction and stability of their ML-driven user experiences. At Google, we deeply understand these customer requirements, which is why we launched Google Distributed Cloud last year. With Google Distributed Cloud, we bring a portfolio of fully managed solutions that extend our infrastructure to the edge and into customers’ own data centers.

Today, Google Cloud customers love our artificial intelligence (AI) services for building, deploying, and scaling more effective AI models, and developers are successfully using our core machine learning (ML) services to build and train high-quality custom ML models. Our customers also use a variety of our managed database solutions because of their simplicity and reliability. However, some customers have highly sensitive workloads and want to use their own private, dedicated facilities. For these customers, we’re excited to announce that they will be able to run a selection of these same AI, ML, and database services in Google Distributed Cloud Hosted, inside their own data centers, within the next year. With this announcement, customers can take advantage of Anthos, a common management control plane that provides a consistent development and operations experience across hybrid environments. This same experience is now available for on-premises environments.

Our portfolio of AI, ML, and database products enables customers to quickly deploy services with out-of-the-box simplicity, delivering valuable insights from both unstructured and structured data. The integration of our Google Cloud AI and database solutions into the Google Distributed Cloud portfolio means the ability to harness real-time data insights like never before, thanks to proximity to where the data is being generated and consumed. This includes ensuring low latency to support applications that are mission critical to businesses, such as computer vision, which can be used on the factory floor to detect flaws in products or to index large amounts of video. The addition of these transformative capabilities allows customers to save money, innovate faster, and gain greater flexibility and choice.

With this integration, customers using Google Distributed Cloud Hosted will have access to some exciting AI features. One example is our Translation API, which can instantly translate text in more than one hundred languages. Translation API is a feature available in Vertex AI, our generally available managed ML platform that allows companies to accelerate the deployment and maintenance of AI models. With this announcement, customers who need to run highly sensitive workloads in an on-premises or edge environment can leverage the unique functionality of Translation API along with other Google Cloud pre-trained APIs in Vertex AI, such as Speech-to-Text and optical character recognition (OCR). These features were all trained on our planet-scale infrastructure to deliver the highest level of performance, and as always, all of our new AI products adhere to our AI Principles.
Additionally, by incorporating our managed database offerings into the Google Distributed Cloud portfolio, customers can process data locally to migrate or modernize applications, opening up more time for innovation and for creating value in their applications. This is especially true in industries like financial services and healthcare, where there are compliance requirements on where data can reside. With these new AI, ML, and database products available in our Google Distributed Cloud portfolio, customers still have full authority to maintain autonomy and control over their own data centers, yet can rely on Google for the latest technology innovations in cloud services.

For more information, please visit Google Distributed Cloud, and to learn more about Vertex AI specifically, join us at our Applied ML Summit.
Source: Google Cloud Platform

Cloud CISO Perspectives: May 2022

May was another big month for us, even as we get ready for more industry work and engagement at the RSA Security Conference in San Francisco. At our Security Summit and throughout the past month, we continued to launch new security products and features, and increased service and support for all our Google Cloud and Google Workspace customers.

Google Cloud’s Security Summit 2022

Our second annual Security Summit, held on May 17, was a great success. In the days leading up to the Summit, we discussed how we are working to bring Zero Trust policies to government agencies, and we revealed our partnership with AMD to further advance Confidential Computing, including an in-depth review focused on the implementation of the AMD secure processor in the third-generation AMD EPYC processor family. We also introduced the latest advancements in our portfolio of security solutions. These include:

Our new Assured Open Source Software service (Assured OSS), which enables enterprise and public sector users of open source software to incorporate the same OSS packages that Google uses into their own developer workflows.
Extending Autonomic Security Operations (ASO) to the U.S. public sector, a solution framework to modernize cybersecurity analytics and threat management that’s aligned with the Zero Trust and supply-chain security objectives of 2021’s cybersecurity Executive Order and the Office of Management and Budget memorandum.
Expanding our compliance with government software standards.
SAML support for Workload Identity Federation, so that customers can use a SAML-based identity provider to reduce their use of long-lived service account keys.

Advancing open source software security

We continued to partner with the Open Source Security Foundation (OpenSSF), the Linux Foundation, and other organizations at another industry open source security summit to further develop the initiatives discussed during January’s White House Summit on Open Source Security. We’re working towards the goal of making sure that every open source developer has effortless access to end-to-end security by default. As covered in our Security Summit, an important part of this effort is Assured OSS, which leverages Google’s extensive security experience and can help organizations reduce the need to develop, maintain, and operate complex processes to secure their open source dependencies. Assured OSS is expected to enter Preview in Q3 2022.

Also, as part of our commitment to improving software supply-chain security, the Open Source Insights project helps developers better understand the structure and security of the software they use. We introduced Open Source Insights data in BigQuery in May so that anyone can use Google Cloud BigQuery to explore and analyze the dependencies, advisories, ownership, license, and other metadata of open-source packages across supported ecosystems, and how this metadata has changed over time.

Why Confidential Computing and our partnership with AMD matters

I’d like to take a moment to share a bit more on the importance of Confidential Computing and our partnership with AMD. I’ve been talking a lot this year about why we as an industry need to evolve our understanding of shared responsibility into shared fate. The former assigns responsibilities to either the cloud provider or the cloud provider’s customer, but shared fate is a more resilient cybersecurity mindset.
It’s a closer partnership between cloud provider and customer that emphasizes secure-by-default configurations, secure blueprints and policy hierarchies, consistently available advanced security features, high-assurance attestation of controls, and insurance partnerships.

In our collaboration with AMD, we focused on how secure isolation has always been critical to our cloud infrastructure, and how Confidential Computing cryptographically reinforces that secure isolation. AMD’s firmware and product security teams, Google Project Zero, and the Google Cloud Security team collaborated for several months to analyze the technologies and firmware that AMD contributes to Google Cloud’s Confidential Computing services. Also in May, we expanded the availability of Confidential Computing to include N2D and C2D Virtual Machines, which run on third-generation AMD EPYC™ processors.

GCAT Highlights

Here are the latest updates, products, services, and resources from our cloud security teams this month:

Security

PSP protocol now open source: In order to better scale the security we offer our customers, we created a new cryptographic offload protocol for internal use that we open sourced in May. Intentionally designed to meet the requirements of large-scale data-center traffic, the PSP protocol is a TLS-like protocol that is transport-independent, enables per-connection security, and is offload-friendly.

Updating Siemplify SOAR: The future of security teams is heading towards “anywhere operations,” and the latest version of Siemplify SOAR can help get us there. It gives organizations the building blocks needed across cloud infrastructure, automation, collaboration, and analytics to accelerate processes for more timely responses and automated workflows. In turn, this can free up teams to focus on more strategic work.

Guardrails and governance for Terraform: The popular open-source Infrastructure-as-Code tool Terraform can increase agility and reduce errors by automating the deployment of infrastructure and services that are used together to deliver applications. Our new tool verifies Terraform configurations and can help reduce misconfigurations of Google Cloud resources that violate any of your organization’s policies.

Benchmarking Container-Optimized OS: As part of our security-first approach to safeguarding customer data while also making it more scalable, we want to make sure that our Container-Optimized OS is in line with industry-standard best practices. To this end, the Google Cloud Security team has released a new CIS benchmark that clarifies and codifies the security measures we have been using, and makes recommendations for hardening.

New reCAPTCHA Enterprise guidebook: Identifying when a fraudster is on the other end of the computer is a complex endeavor. Our new reCAPTCHA Enterprise guidebook helps organizations identify a broad range of online fraud and strengthen their website security.

Take the State of DevOps 2022 survey: The State of DevOps report by Google Cloud and the DORA research team is the largest and longest-running research of its kind, with input from more than 32,000 professionals worldwide. This year’s report will focus on how security practices and capabilities predict overall software delivery and operations performance, so be sure to share your thoughts with us.

Industry updates

Security improvements to Google Workspace: I wrote at the beginning of the year that data sovereignty is one of the major, driving megatrends shaping our industry today.
At the beginning of May we announced Sovereign Controls for Google Workspace, which can provide digital sovereignty capabilities for organizations in both the public and private sector to control, limit, and monitor transfers of data to and from the EU starting at the end of 2022, with additional capabilities delivered throughout 2023. This commitment builds on our existing Client-side encryption, Data regions, and Access Controls capabilities. We are also extending Chrome’s Security Insights to Google Cloud and Google Workspace products, as part of our efforts to consistently provide advanced features to our customers.

Can you hear the security now? Pindrop is joining forces with Google Cloud. If you’ve never heard of Pindrop, you’ve almost certainly encountered their technology, which is used to authenticate payments, place restaurant and shopping orders, and check financial accounts over the phone. Their technology also provides the backbone for anti-fraud efforts in voice-based controls. With Google Cloud, Pindrop can better detect deepfakes and robocalls, help banks authenticate transactions, and provide retailers with secure AI-powered call center support.

Compliance & Controls

Expanding public sector and government compliance: Google Cloud is committed to providing government agencies with the security capabilities they need to achieve their missions. In addition to our aforementioned Autonomic Security Operations and new Assured Open Source Software (OSS) service, we’re expanding Assured Workloads, which can help enable regulated workloads to run securely at scale on Google Cloud’s infrastructure. We are also pleased to announce that 14 new Google Cloud services support FedRAMP Moderate and three services are being added to support FedRAMP High, with more coming this summer. (You can read the full list of those services at the end of this blog.)

Next month we’ll recap highlights from the RSA Conference and much more. To have our Cloud CISO Perspectives post delivered every month to your inbox, sign up for our newsletter. We’ll be back next month with more security-related updates.
Source: Google Cloud Platform

How a robotics startup switched clouds and reduced its Kubernetes ops costs with GKE Autopilot

Don’t look now, but Brain Corp operates over 20,000 of its robots in factories, supermarkets, schools, and warehouses, taking on time-consuming assignments like cleaning floors, taking inventory, and restocking shelves. And BrainOS®, the AI software platform that powers these autonomous mobile robots, doesn’t just run in the robots themselves — it runs in the cloud. Specifically, Google Cloud. But that wasn’t always the case.

Brain Corp recently partnered with Google Cloud to migrate its robotics platform from Amazon EKS to Google Kubernetes Engine (GKE) Autopilot. Running thousands of robots in production comes with tons of operational challenges, and Brain Corp needed a way to reduce the day-to-day ops and security maintenance overhead that building a platform on Kubernetes (k8s) usually entails. Just by turning on Autopilot, they’ve offloaded all the work of keeping clusters highly available and patched with the latest security updates to Google Cloud Site Reliability Engineers (SREs) — a huge chunk of work. Brain Corp’s ops team can now focus on migrating additional robots to the new platform, not just “keeping the lights on” with their k8s clusters.

Making the switch to Google Cloud

When Brain Corp decided to migrate off of EKS, they set out to find a cloud that had the best technology, tools, and platform to easily integrate data and robotics. Brain Corp’s Cloud Development team began by trying to implement a proof-of-concept architecture to support their robots on Google Cloud and another cloud provider. It became clear that Google Cloud was the right choice when it took only a week to get the POC up and running, whereas on the other cloud provider it took a month.

During the POC, Brain Corp realized benefits beyond ease of use. Google’s focus on simple integration between its data products contributed significantly to moving from POC to production. Brain Corp was able to offload Kubernetes operational tasks to Google SREs using GKE Autopilot, which allowed them to focus on migrating robots to their new platform on Google Cloud.

Making the switch to GKE Autopilot

Alex Gartner, Cloud Infrastructure Lead at Brain Corp, says his team is responsible for “empowering developers to develop and deploy stuff quickly without having to think too hard about it.” On EKS, Brain Corp had dedicated infrastructure engineers who did nothing but manage k8s. Gartner was expecting to have his engineers do the same on standard GKE, but once he got a whiff of Autopilot, he quickly changed course. Because GKE Autopilot clusters are secure out of the box and supported by Google SREs, Brain Corp was able to reduce their operations cost and provide a better and more secure experience for their customers.

Another reason for switching to Autopilot was that it provided more guardrails for developer environments. In the past, Brain Corp development environments might experience cluster outages because of a small misconfiguration. “With Autopilot, we don’t need to read every line of the docs on how to provision a k8s cluster with high availability and function in a degraded situation,” Gartner said. He noted that without Autopilot they would have had to spend a whole month evaluating GKE failure scenarios to achieve the stability Autopilot provides by default. “Google SREs know their service better than we do so they’re able to think of failure scenarios we’ve never considered,” he said, much less replicate.
For example, Brain Corp engineers have no real way to simulate out-of-quota or out-of-capacity scenarios.

How has GKE Autopilot helped Brain Corp?

Since adopting Autopilot, the Cloud Infrastructure team at Brain Corp has received fewer pages in the middle of the night because of a cluster or service going down unexpectedly. The clusters are scaled and maintained by Google Cloud. By imposing high-level guardrails on the cluster that you can’t disable, Autopilot “provides a better blast shield by default,” Gartner said. It also makes collecting performance metrics and visualizing them in Grafana dashboards drastically easier, since it exports tons of k8s and performance metrics by default. “Now we don’t need to spend time gathering or thinking about how to collect that information,” he said.

Autopilot has also improved the developer experience for Brain Corp’s software engineers. They run a lot of background computing jobs and traditionally have not been able to easily fine-tune pod-level cost and compute requirements. Autopilot’s per-pod billing increases transparency, allowing devs to know exactly how much their jobs cost. They’ve also been able to easily orient compute requirements to the pods themselves. Billing at the app level instead of the cluster level makes chargeback easier than overprovisioning a cluster that five teams use and figuring out how to split the bill. “We don’t want to spend time optimizing k8s billing,” Gartner said. Cutting costs has been a huge advantage of switching to Autopilot. According to Gartner, there’s a “5-10% overhead you get billed for by just running a k8s node that we are not billed for anymore. We’re only paying for what our app actually uses.”

How can Autopilot improve?

GKE Autopilot launched last year and isn’t at full feature parity with GKE Standard yet. For example, certain scientific workloads require, or perform better with, specific CPU instruction sets. “GPU support is something we would love to see,” Gartner said. Even so, the benefits of GKE Autopilot over EKS far outweighed the limitations, and in the interim, Brain Corp can spin up GKE Standard clusters for specialized workloads.

With all the extra cycles that GKE Autopilot gives back to Brain Corp’s developers and engineers, they have lots of time to dream up new things that robots can do for us — watch this space. Curious about GKE and GKE Autopilot? Check out Google Cloud’s KubeCon talks available on-demand.
Source: Google Cloud Platform

Google Cloud simplifies customer verification and benefits processing with Document AI for Identity cards

If you’ve opened an account at a bank, applied for a government benefit, or provided a proof-of-age document on an ecommerce website, chances are you’ve had to share a physical or digital copy of a driver’s license or a passport as proof of your identity. For businesses and public sector organizations that need this information to provide services, processing images of identity documents has long been a time- and resource-intensive process. Solutions exist to help digitally capture the data, but they require extensive human intervention that impacts the speed and cost of processing and, ultimately, the time to serve customers.

The Google Cloud Document AI family of solutions has been designed to help solve some of the hardest problems of data capture at scale by extracting structured data from unstructured documents, helping reduce processing costs and improve business speed and efficiency. Today, we’re announcing the general availability of identity parsers that bring the power of Document AI to customer verification, KYC, and other identity-based workflows. With Document AI for Identity, businesses can leverage automation to extract information from identity documents with a high degree of accuracy, without having to bear the cost and turnaround time of manual processing by a service provider. Document AI for Identity leverages artificial intelligence to provide a set of pre-trained models that can parse identity documents, and it supports US driver’s licenses (generally available), US passports (generally available), French driver’s licenses (preview), and French national ID cards (preview), with more documents from around the world to be added over the coming months.

When our customers process high-volume workloads or complex workflows, they need a high degree of accuracy, since getting the first step wrong can derail the entire workflow. The introduction of special parsers for identity processing can help solve one of the most commonly required document processing needs that our financial services and public sector customers face. Along with the identity parsers, Google Cloud is also offering its “Human in the Loop” service, in which verification of a subset of identity documents can be automatically assigned to a pool of humans (internal or external) for manual review, based on confidence scores.

While there are multiple industries and applications that could benefit from Document AI for Identity, we’ve seen two main kinds of applications being adopted during the solution’s preview. The first is processing ID cards uploaded as unstructured images at scale, so that enterprises can have IDs on file. The second is performing advanced checks on identity documents to validate their authenticity and/or detect fraud. Google Cloud’s fraud detector API (currently in preview) can complement Document AI for Identity and apply an extra layer of checks to help validate the identity as a government-issued ID by looking for suspicious words, image manipulation, and other common issues with forged identity documents. With new versions of driver’s licenses being released frequently, Document AI for Identity uses specialized models and constantly updated training data to help make sure the parsers offer a high degree of accuracy. For all use cases, Document AI does not retain any customer data after completing the processing request (successfully or with an error).
Check out this demo and visit the Document AI for Identity landing page for more information on how Document AI can help solve your identity processing needs, and ask your Google Cloud account team to help you integrate Document AI for Identity into your workflows.
Source: Google Cloud Platform

“Take that leap of faith”: Meet the Googler helping customers create financial inclusion

What brought you to Google?

Honestly, the opportunity fell in my lap. I had just graduated from NYU with an MS in business ops and hospitality, and a staffing agency for Google reached out—they were looking for hospitality majors to support recruitment, hosting an engineer for the day as they walked through their interview process. I took the chance!

Can you tell us a little bit more about your role as senior program manager at Google Cloud?

I am the lead program manager across our program to grow Cloud with Black+ owned businesses. We created this program to help enable digital acceleration for institutions playing a crucial role in combating systemic racism, and to increase their presence in the financial services industry. In my role, I work directly with customers, bringing together their vision with our engineering and innovation, to help them see their future on cloud.

I’m proud to share that by aligning our mission with the right partners, the team has identified, integrated, and onboarded seven Black-owned financial services institutions onto Google Cloud. For example, we worked with First Independence, a Black-owned bank headquartered in Michigan that has been serving the local community—including small businesses—for 52 years. We partnered with a digital lending platform to help them digitize their loan process, allowing clients to quickly and easily apply for loans under the federal Paycheck Protection Program (PPP loans). Without the new tech infrastructure, many of their clients may have missed the opportunity to get this federal – and for many businesses, critical – support due to slow processes.

We started small and learned a lot along the way; now we want to expand to other industries. Helping one bank at a time creates a lasting impact. (You can read more about the banks here.)

How do you feel your background in hospitality supports your current role?

I like to think of myself as a problem solver. It’s very cliche, but I love really working with people and helping them figure out how to get to their end goal. In this particular role, it’s working with customers, specifically financial institutions, that didn’t trust putting financial information on the cloud. Once I started to engage with these customers, I was able to build trust with them through a larger goal of helping the community. Once we built that rapport, they felt more comfortable. I want to help our underbanked communities be financially secure, have financial literacy, and build generational wealth. We now have other industries that have heard about us and want to learn more about the program.

Why do you think that cloud is well positioned to help advance financial inclusion?

We all know about the wealth gap. We all know about the education gap. Cloud technology can help: cloud’s scale and flexibility could actually change the lives, generationally, of people that need help. We can really shift the focus to not just saying Black Lives Matter, saying a name, or wearing a t-shirt, but also empowering organizations to grow their impact and better serve their communities.

How would you describe your experience at Google?

To begin my career here as a TVC and wanting to be a full-timer; to getting a role right before a pandemic, then being promoted last year, I still can’t believe it. I couldn’t be at a company where I didn’t align with what they were putting out there. I’m not a fake-it-’til-you-make-it kind of person. I’m very honest and transparent. And I feel that Google, at its core, is a great company.
Do you have advice for other people who may want to align their passion with their profession?

Take that leap of faith. I followed a great manager to this role, took a chance, and am so glad I did.
Source: Google Cloud Platform

Google Cloud enables inclusive financial services with Black-owned business initiative

In June of 2020, Sundar Pichai outlined a number of commitments Google will make to racial equity, starting with the Black+ community in the United States. As part of this initiative, we formed a team at Google Cloud to help Black entrepreneurs accelerate the growth of their businesses with cloud technology.

Racial equity is inextricably linked to economic opportunity. According to McKinsey, advancing racial equity would create new opportunities for participation in the economy for underrepresented individuals, resulting in significant benefits to businesses, families, and communities across the country. Black-owned financial institutions play a vital role in closing the racial wealth gap by providing greater access to financial products and services to historically underrepresented and underserved communities. That is why we decided to focus our initial efforts on empowering Black entrepreneurs and Black businesses in the financial services industry.

Together with partners like Uncommon Impact Studio, World Wide Technology, and Zencore, we aim to bring data, technology, and marketing capabilities that are uniquely Google to Black-owned banks and fintechs. By implementing cloud technologies, seven Black-owned financial institutions have been able to accelerate their digital transformation, scale their businesses, and connect their products and services to the people who need them most. Let’s dive a little deeper into a few companies that are part of the initiative:

BetaBank: Improving access to capital for small businesses

BetaBank recently announced its FDIC application to become one of the first digitally native banks built from the ground up on Google Cloud. BetaBank founder Seke Ballard recognized early that the financial lending system was broken, and he identified technology as the key to removing bias from small business lending. Ballard created an AI algorithm to weigh risk and calculate the qualification of an SMB loan application with more accuracy and speed, and at a lower cost, than traditional banks.

BetaBank’s mission is to provide small business owners equitable access to financial services. Ballard and his team selected Google Cloud as the cloud infrastructure on which to build, run, and manage BetaBank. Google Cloud will provide a scalable, secure infrastructure to grow BetaBank’s business and networks, and the tools to support regulatory compliance, fraud prevention, and overall security.

OneUnited Bank: Delivering personalized customer experiences

OneUnited Bank is one of the first and largest Black-owned digital banks in the United States. OneUnited Bank worked with Google Cloud to implement Contact Center AI, a Google Cloud platform that enables companies to leverage AI to transform the performance of their call centers. The company also implemented Google Ads Search campaigns to connect with new customers.

“We recognize that there are things we can do that will 10x this company and there are ways that Google can help us.” — Jim Slocum, CIO, OneUnited

OneUnited paired Contact Center AI with its existing technology, and leveraged Dialogflow, Google Cloud’s conversational AI natural language understanding platform, to create a more personalized customer experience and scale their contact center interactions. The success of the deployment was revelatory to OneUnited as to what cloud and AI technologies can do for them and their customers.
First Independence Bank: Modern infrastructure for better community lending

First Independence Bank is the only Black-owned bank headquartered in Michigan and has been serving the local community in Detroit for over 52 years. To ensure the bank could compete in the future, its legacy systems needed a digital upgrade. In September 2021, First Independence Bank partnered with a digital lending platform for business banking to speed up its digital federal Paycheck Protection Program (PPP) loan application process as a convenience to its PPP loan applicants. As part of this partnership, First Independence Bank has committed to migrate onto Google Cloud to create a more efficient lending process for customers.

Data Capital Management: Harnessing the power of AI

Data Capital Management (DCM) is a digital investment research, services, and advisory firm whose CEO and co-founder Michael Beal knew early on the power that artificial intelligence (AI) and machine learning (ML) can bring to the fund management industry. DCM worked closely with Google Cloud engineers to enhance its current offerings of DCM AI models (“AI Traders”) that investors can leverage to manage their stock holdings and digital wallets. Training AI models requires massive amounts of data and compute power. As the firm’s operations grew, the opportunity to optimize performance with Google Cloud was a primary factor in the decision to migrate DCM’s DCM.ai investor portal and all supporting investment research, execution, and reporting features from their legacy provider to Google Cloud.

What’s next?

Building on our commitment to increase racial equity through technology, we are expanding this program beyond financial services to bring the full value of Google Cloud to other industries including education, entertainment, healthcare, and clean energy. If your company is interested in getting involved, please fill out this form.
Source: Google Cloud Platform