How SingleStoreDB uses Google Cloud Marketplace to drive great customer experiences

Founded in 2011, SingleStoreDB is a distributed SQL, cloud-native database that offers ultra-fast, low-latency access to large datasets, simplifying the development of modern enterprise applications. By combining transactional (OLTP) and analytical (OLAP) workloads, SingleStoreDB introduces new efficiencies into data architecture. We are a multicloud database, meaning that our customers can launch SingleStoreDB on any cloud infrastructure they choose. At the same time, our alignment with Google Cloud has been foundational to our success.

The following image showcases SingleStoreDB’s patented three-tier storage architecture — memory, local disk and object storage. This architecture enables fast inserts, large volumes of data storage, and better overall scalability and TCO.

Let me tell you how Google Cloud Marketplace has helped us grow from a startup into a business that 12 Fortune 50 companies and half of the top 10 banks in the U.S. rely upon today.

Filling a critical data niche to accelerate business value

First, let’s talk about why companies choose to use SingleStoreDB alongside Google Cloud and other IT solutions. SingleStoreDB on Google Cloud is ideal for data-intensive applications with these five key requirements:

- Query latency — Execute and receive query results with millisecond latencies
- Concurrency — Support a large number of users or concurrent queries without sacrificing latency
- Query complexity — Handle both simple and complex queries
- Data size — Effortlessly operate over large data sets
- Data ingest speed — Load (or ingest) data at very high rates, from thousands to millions of rows per second

Every customer approaches IT differently. Many are taking either a multicloud or hybrid approach to their infrastructure.
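The three-tier read path described above can be sketched conceptually. This is an illustrative toy, not SingleStoreDB's actual implementation: a lookup checks the hottest tier first and promotes colder hits into memory.

```python
# Conceptual three-tier read path (memory -> local disk -> object storage).
# Illustrative sketch only, not SingleStoreDB's real storage engine.

class TieredStore:
    def __init__(self, disk, object_storage):
        self.memory = {}                      # hottest tier: recently touched rows
        self.disk = disk                      # warm tier: local disk contents
        self.object_storage = object_storage  # cold tier: durable blob storage

    def get(self, key):
        """Return (tier_name, value), checking the fastest tier first."""
        for tier_name, tier in (("memory", self.memory),
                                ("disk", self.disk),
                                ("object", self.object_storage)):
            if key in tier:
                value = tier[key]
                self.memory[key] = value  # promote on access so reruns are fast
                return tier_name, value
        raise KeyError(key)

store = TieredStore(disk={"b": 2}, object_storage={"c": 3})
store.memory["a"] = 1
print(store.get("c"))  # first read comes from the cold tier
print(store.get("c"))  # second read is served from memory
```

The promotion step is what makes repeated access cheap: a row pays the object-storage cost once, then lives in the memory tier.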
By running on Google Cloud, SingleStoreDB helps customers manage workloads across all clouds and on-prem systems. We find that many of our customers use SingleStoreDB as a single-pane view into all their data and infrastructure, as well as for data-intensive applications. Because SingleStoreDB is an analytical and transactional database, it shines when handling workloads that require outstanding accuracy and speed.

Our larger customers in particular increasingly run SingleStoreDB on Google Cloud thanks to the Global Network Backbone. Because Google Cloud owns its global fiber network, we tap into Google’s secure-by-design infrastructure to protect information, identities, applications, and devices. This, alongside the increasing popularity of Google Cloud AI/ML solutions, makes SingleStore on Google Cloud a growing force.

Success on Google Cloud Marketplace

While the partnership between SingleStoreDB and Google Cloud solutions is paramount to our success, the availability of our solution on Google Cloud Marketplace amplifies customer satisfaction. Google Cloud Marketplace simply makes it easier for partners to sell to Google Cloud customers. Here are some of the advantages we’ve seen since launching on Google Cloud Marketplace:

- Our customers can easily transact and deploy via Marketplace, providing that true Software-as-a-Service feel for SingleStoreDB.
- Transacting through Marketplace counts toward our customers’ Google Cloud commit, which helps partners like us tap into budget that customers are already setting aside for Google Cloud spend.
- Billing and invoicing are intuitive and efficient, as everything is automated through Google Cloud Marketplace.
- Our customers receive a single, clear invoice from Google covering SingleStoreDB, Google Cloud, and other partner solutions purchased from Marketplace.
- Many of our customers have previously signed the Google Cloud User License Agreement (ULA), which further expedites procurement by speeding up legal reviews.

Transforming how we sell

All of these advantages translate into a powerful change in our sales strategy. We no longer get bogged down in numbers and contracts. Instead, we can focus on educating customers about how using our solutions on Google Cloud will deliver returns to their business. Co-selling with Google Cloud has benefited our business significantly and provides customers with the best possible experiences.

We have also seen surprising advantages stem from our success on Google Cloud Marketplace. Our product and engineering teams are much more engaged with Google Cloud technologies now, putting a heavy emphasis on new integrations. We’re actively exploring new Dataflow templates, native connectors for BigQuery and Data Studio, and other solutions.

What SingleStoreDB customers are saying

To give you a better idea of how our customers are benefiting, here are some recent comments about running SingleStoreDB on Google Cloud and transacting through Google Cloud Marketplace:

“Our goal has and will always be to build our platform in a way that makes it feel like an on-premise solution. Speed of data processing and delivering are critical to providing this within our solutions. Both Google Cloud and SingleStore have helped us achieve this.” — Benjamin Rowe, Cloud & Security Architect, Arcules

“The SingleStore solution on Google Cloud allows marketing teams using Factors.ai to make more informed data decisions by organizing data in one system and allowing for self-serve analytics.” — Praveen Das, Co-Founder, Factors.ai

At the end of the day, our customer experiences and the business impacts they achieve with our solutions are our most critical KPIs.
By partnering with Google Cloud, we have unlimited potential to improve our services and power the next generation of data and analytics applications. Discover how SingleStore can transform your business on Google Cloud Marketplace.
Source: Google Cloud Platform

Google Distributed Cloud adds AI, ML and Database Solutions to deliver customers even greater flexibility and choice

Organizations need a cloud platform that can securely scale from on-premises, to edge, to cloud while remaining open to change, choice, and customization. They must be able to run their applications wherever they need, on infrastructure optimized to process very high volumes of data with minimal delay, all while maintaining the satisfaction and stability of their ML-driven user experiences. At Google, we deeply understand these customer requirements, which is why we launched Google Distributed Cloud last year. With Google Distributed Cloud, we bring a portfolio of fully managed solutions that extend our infrastructure to the edge and into customers’ own data centers.

Today, Google Cloud customers love our artificial intelligence (AI) services for building, deploying, and scaling more effective AI models, and developers are successfully using our core machine learning (ML) services to build and train high-quality custom ML models. Our customers also use a variety of our managed database solutions because of their simplicity and reliability. However, some customers have highly sensitive workloads and want to run them in their own private, dedicated facilities. For these customers, we’re excited to announce that within the next year they will be able to run a selection of these same AI, ML, and database services in Google Distributed Cloud Hosted, inside their own data centers.

With this announcement, customers can take advantage of Anthos, a common management control plane that provides a consistent development and operations experience across hybrid environments. This same experience is now available for on-premises environments.

Our portfolio of AI, ML, and database products enables customers to quickly deploy services with out-of-the-box simplicity, delivering valuable insights from both unstructured and structured data.
The integration of our Google Cloud AI and database solutions into the Google Distributed Cloud portfolio means customers can harness real-time data insights like never before, thanks to proximity to where the data is generated and consumed. This includes ensuring low latency for mission-critical applications such as computer vision, which can be used on the factory floor to detect flaws in products or to index large amounts of video. The addition of these transformative capabilities allows customers to save money, innovate faster, and enjoy the greatest flexibility and choice.

With this integration, customers using Google Distributed Cloud Hosted will have access to some exciting AI features. One example is our Translation API, which can instantly translate text into more than one hundred languages. Translation API is a feature of Vertex AI, our generally available managed ML platform that allows companies to accelerate the deployment and maintenance of AI models. With this announcement, customers who need to run highly sensitive workloads in an on-premises or edge environment can leverage the unique functionality of Translation API along with other Google Cloud pre-trained APIs in Vertex AI, such as Speech-to-Text and optical character recognition (OCR). These features were all trained on our planet-scale infrastructure to deliver the highest level of performance, and as always, all of our new AI products adhere to our AI Principles.

Additionally, by incorporating our managed database offerings into the Google Distributed Cloud portfolio, customers can process data locally to migrate or modernize applications, opening up more time for innovation and for creating value in their applications. This is especially true in industries like financial services and healthcare, where there are compliance requirements on where data can reside.
With these new AI, ML, and database products available in our Google Distributed Cloud portfolio, customers retain full autonomy and control over their own data centers while relying on Google for the latest technology innovations in cloud services. For more information, please visit Google Distributed Cloud, and to learn more about Vertex AI specifically, join us at our Applied ML Summit.

Cloud CISO Perspectives: May 2022

May was another big month for us, even as we get ready for more industry work and engagement at the RSA Security Conference in San Francisco. At our Security Summit and throughout the past month, we continued to launch new security products and features, and increased service and support for all our Google Cloud and Google Workspace customers.

Google Cloud’s Security Summit 2022

Our second annual Security Summit, held on May 17, was a great success. In the days leading up to the Summit, we discussed how we are working to bring Zero Trust policies to government agencies, and we revealed our partnership with AMD to further advance Confidential Computing, including an in-depth review focused on the implementation of the AMD secure processor in the third-generation AMD EPYC processor family.

We also introduced the latest advancements in our portfolio of security solutions. These include:

- Our new Assured Open Source Software service (Assured OSS), which enables enterprise and public sector users of open source software to incorporate the same OSS packages that Google uses into their own developer workflows.
- Extending Autonomic Security Operations (ASO) to the U.S. public sector, a solution framework for modernizing cybersecurity analytics and threat management that’s aligned with the Zero Trust and supply-chain security objectives of 2021’s cybersecurity Executive Order and the Office of Management and Budget memorandum.
- Expanding our compliance with government software standards.
- SAML support for Workload Identity Federation, so that customers can use a SAML-based identity provider to reduce their use of long-lived service account keys.

Advancing open source software security

We continued to partner with the Open Source Security Foundation (OpenSSF), the Linux Foundation, and other organizations at another industry open source security summit to further develop the initiatives discussed during January’s White House Summit on Open Source Security.
We’re working toward the goal of making sure that every open source developer has effortless access to end-to-end security by default. As covered in our Security Summit, an important part of this effort is Assured OSS, which leverages Google’s extensive security experience and can help organizations reduce their need to develop, maintain, and operate complex processes to secure their open source dependencies. Assured OSS is expected to enter Preview in Q3 2022.

Also, as part of our commitment to improving software supply-chain security, the Open Source Insights project helps developers better understand the structure and security of the software they use. We introduced Open Source Insights data in BigQuery in May, so that anyone can use Google Cloud BigQuery to explore and analyze the dependencies, advisories, ownership, licenses, and other metadata of open source packages across supported ecosystems, and how this metadata has changed over time.

Why Confidential Computing and our partnership with AMD matter

I’d like to take a moment to share a bit more on the importance of Confidential Computing and our partnership with AMD. I’ve been talking a lot this year about why we as an industry need to evolve our understanding of shared responsibility into shared fate. The former assigns responsibilities to either the cloud provider or the cloud provider’s customer; shared fate is a more resilient cybersecurity mindset. It’s a closer partnership between cloud provider and customer that emphasizes secure-by-default configurations, secure blueprints and policy hierarchies, consistently available advanced security features, high-assurance attestation of controls, and insurance partnerships.

In our collaboration with AMD, we focused on how secure isolation has always been critical to our cloud infrastructure, and how Confidential Computing cryptographically reinforces that secure isolation.
AMD’s firmware and product security teams, Google Project Zero, and the Google Cloud Security team collaborated for several months to analyze the technologies and firmware that AMD contributes to Google Cloud’s Confidential Computing services. Also in May, we expanded the availability of Confidential Computing to include N2D and C2D Virtual Machines, which run on third-generation AMD EPYC™ processors.

GCAT Highlights

Here are the latest updates, products, services, and resources from our cloud security teams this month:

Security

- PSP protocol now open source: To better scale the security we offer our customers, we created a new cryptographic offload protocol for internal use that we open sourced in May. Intentionally designed to meet the requirements of large-scale data-center traffic, the PSP protocol is a TLS-like protocol that is transport-independent, enables per-connection security, and is offload-friendly.
- Updating Siemplify SOAR: The future of security teams is heading toward “anywhere operations,” and the latest version of Siemplify SOAR can help get us there. It gives organizations the building blocks needed across cloud infrastructure, automation, collaboration, and analytics to accelerate processes for more timely responses and automated workflows. In turn, this can free up teams to focus on more strategic work.
- Guardrails and governance for Terraform: Terraform, the popular open-source Infrastructure-as-Code tool, can increase agility and reduce errors by automating the deployment of infrastructure and services that are used together to deliver applications. Our new tool verifies Terraform configurations and can help reduce misconfigurations of Google Cloud resources that violate any of your organization’s policies.
- Benchmarking Container-Optimized OS: As part of our security-first approach to safeguarding customer data while also making it more scalable, we want to make sure that our Container-Optimized OS is in line with industry-standard best practices.
To this end, the Google Cloud Security team has released a new CIS benchmark that clarifies and codifies the security measures we have been using, and makes recommendations for hardening.
- New reCAPTCHA Enterprise guidebook: Identifying when a fraudster is on the other end of the computer is a complex endeavor. Our new reCAPTCHA Enterprise guidebook helps organizations identify a broad range of online fraud and strengthen their website security.
- Take the State of DevOps 2022 survey: The State of DevOps report by Google Cloud and the DORA research team is the largest and longest-running research of its kind, with input from more than 32,000 professionals worldwide. This year’s report will focus on how security practices and capabilities predict overall software delivery and operations performance, so be sure to share your thoughts with us.

Industry updates

- Security improvements to Google Workspace: I wrote at the beginning of the year that data sovereignty is one of the major, driving megatrends shaping our industry today. At the beginning of May we announced Sovereign Controls for Google Workspace, which can provide digital sovereignty capabilities for organizations in both the public and private sector to control, limit, and monitor transfers of data to and from the EU starting at the end of 2022, with additional capabilities delivered throughout 2023. This commitment builds on our existing Client-side encryption, Data regions, and Access Controls capabilities. We are also extending Chrome’s Security Insights to Google Cloud and Google Workspace products, as part of our efforts to consistently provide advanced features to our customers.
- Can you hear the security now? Pindrop is joining forces with Google Cloud. If you’ve never heard of Pindrop, you’ve almost certainly encountered their technology, which is used to authenticate payments, place restaurant and shopping orders, and check financial accounts over the phone.
Their technology also provides the backbone for anti-fraud efforts in voice-based controls. With Google Cloud, Pindrop will be better able to detect deepfakes and robocalls, help banks authenticate transactions, and provide retailers with secure AI-powered call center support.

Compliance & Controls

- Expanding public sector and government compliance: Google Cloud is committed to providing government agencies with the security capabilities they need to achieve their missions. In addition to our aforementioned Autonomic Security Operations and new Assured Open Source Software (Assured OSS) service, we’re expanding Assured Workloads, which can help enable regulated workloads to run securely at scale on Google Cloud’s infrastructure. We are also pleased to announce that 14 new Google Cloud services support FedRAMP Moderate and three services are being added to support FedRAMP High, with more coming this summer. (You can read the full list of those services at the end of this blog.)

Next month we’ll recap highlights from the RSA Conference and much more. To have our Cloud CISO Perspectives post delivered every month to your inbox, sign up for our newsletter. We’ll be back next month with more security-related updates.

How a robotics startup switched clouds and reduced its Kubernetes ops costs with GKE Autopilot

Don’t look now, but Brain Corp operates over 20,000 of its robots in factories, supermarkets, schools, and warehouses, taking on time-consuming assignments like cleaning floors, taking inventory, and restocking shelves. And BrainOS®, the AI software platform that powers these autonomous mobile robots, doesn’t just run in the robots themselves — it runs in the cloud. Specifically, Google Cloud.

But that wasn’t always the case. Brain Corp recently partnered with Google Cloud to migrate its robotics platform from Amazon EKS to Google Kubernetes Engine (GKE) Autopilot. Running thousands of robots in production comes with tons of operational challenges, and Brain Corp needed a way to reduce the day-to-day ops and security maintenance overhead that building a platform on Kubernetes (k8s) usually entails. Just by turning on Autopilot, they’ve offloaded all the work of keeping clusters highly available and patched with the latest security updates to Google Cloud Site Reliability Engineers (SREs) — a huge chunk of work. Brain Corp’s ops team can now focus on migrating additional robots to the new platform, not just “keeping the lights on” with their k8s clusters.

Making the switch to Google Cloud

When Brain Corp decided to migrate off of EKS, they set out to find a cloud with the best technology, tools, and platform for easily integrating data and robotics. Brain Corp’s Cloud Development team began by trying to implement a proof-of-concept architecture to support their robots on Google Cloud and another cloud provider. It became clear that Google Cloud was the right choice when it took only a week to get the POC up and running on Google Cloud, whereas on the other cloud provider it took a month. During the POC, Brain Corp realized benefits beyond ease of use. Google’s focus on simple integration between its data products contributed significantly to moving from POC to production.
Brain Corp was able to offload Kubernetes operational tasks to Google SREs using GKE Autopilot, which allowed them to focus on migrating robots to their new platform on Google Cloud.

Making the switch to GKE Autopilot

Alex Gartner, Cloud Infrastructure Lead at Brain Corp, says his team is responsible for “empowering developers to develop and deploy stuff quickly without having to think too hard about it.” On EKS, Brain Corp had dedicated infrastructure engineers who did nothing but manage k8s. Gartner was expecting to have his engineers do the same on standard GKE, but once he got a whiff of Autopilot, he quickly changed course. Because GKE Autopilot clusters are secure out of the box and supported by Google SREs, Brain Corp was able to reduce their operations cost and provide a better and more secure experience for their customers.

Another reason for switching to Autopilot was that it provided more guardrails for developer environments. In the past, Brain Corp development environments might experience cluster outages because of a small misconfiguration. “With Autopilot, we don’t need to read every line of the docs on how to provision a k8s cluster with high availability and function in a degraded situation,” Gartner said. He noted that without Autopilot, they would have had to spend a whole month evaluating GKE failure scenarios to achieve the stability Autopilot provides by default. “Google SREs know their service better than we do, so they’re able to think of failure scenarios we’ve never considered,” he said, much less replicate. For example, Brain Corp engineers have no real way to simulate out-of-quota or out-of-capacity scenarios.

How has GKE Autopilot helped Brain Corp?

Since adopting Autopilot, the Cloud Infrastructure team at Brain Corp has received fewer pages in the middle of the night because of a cluster or service going down unexpectedly. The clusters are scaled and maintained by Google Cloud.
By imposing high-level guardrails on the cluster that you can’t disable, Autopilot “provides a better blast shield by default,” Gartner said. It also makes collecting performance metrics and visualizing them in Grafana dashboards drastically easier, since it exports tons of k8s and performance metrics by default. “Now we don’t need to spend time gathering or thinking about how to collect that information,” he said.

Autopilot has also improved the developer experience for Brain Corp’s software engineers. They run a lot of background computing jobs and traditionally have not been able to easily fine-tune pod-level cost and compute requirements. Autopilot’s per-pod billing increases transparency, allowing devs to know exactly how much their jobs cost. They’ve also been able to easily orient compute requirements to the pods themselves. Billing at the app level instead of the cluster level makes chargeback easier than overprovisioning a cluster that five teams use and then figuring out how to split the bill. “We don’t want to spend time optimizing k8s billing,” Gartner said.

Cutting costs has been a huge advantage of switching to Autopilot. According to Gartner, there’s a “5-10% overhead you get billed for by just running a k8s node that we are not billed for anymore. We’re only paying for what our app actually uses.”

How can Autopilot improve?

GKE Autopilot launched last year and isn’t at full feature parity with GKE yet. For example, certain scientific workloads require, or perform better using, specific CPU instruction sets. “GPU support is something we would love to see,” Gartner said. Even so, the benefits of GKE Autopilot over EKS far outweighed the limitations, and in the interim, Brain Corp can spin up GKE Standard clusters for specialized workloads.

With all the extra cycles that GKE Autopilot gives back to Brain Corp’s developers and engineers, they have lots of time to dream up new things that robots can do for us — watch this space.

Curious about GKE and GKE Autopilot?
Check out Google Cloud’s KubeCon talks available on-demand.
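The per-pod billing model described above lends itself to a simple chargeback calculation: because each pod is billed for the resources it requests, cost can be summed per team directly. The sketch below uses made-up illustrative rates, not actual Autopilot prices.

```python
# Hypothetical per-pod chargeback sketch. With app-level (per-pod) billing,
# each pod's cost follows from its resource requests, so team totals fall
# out naturally. Rates here are illustrative only, not real Autopilot pricing.
from collections import defaultdict

CPU_RATE_PER_VCPU_HOUR = 0.04   # assumed rate (USD), for illustration
MEM_RATE_PER_GIB_HOUR = 0.005   # assumed rate (USD), for illustration

def pod_cost(vcpu, mem_gib, hours):
    """Cost of one pod, derived from its CPU and memory requests."""
    return hours * (vcpu * CPU_RATE_PER_VCPU_HOUR + mem_gib * MEM_RATE_PER_GIB_HOUR)

def chargeback(pods):
    """Sum cost per team from (team, vcpu, mem_gib, hours) records."""
    totals = defaultdict(float)
    for team, vcpu, mem_gib, hours in pods:
        totals[team] += pod_cost(vcpu, mem_gib, hours)
    return dict(totals)

pods = [
    ("vision", 2.0, 4.0, 720),   # pod running for a 720-hour month
    ("vision", 0.5, 1.0, 720),
    ("billing", 1.0, 2.0, 360),
]
print(chargeback(pods))
```

Contrast this with cluster-level billing, where the same teams would have to agree on how to split the cost of overprovisioned nodes they share.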

Google Cloud simplifies customer verification and benefits processing with Document AI for Identity cards

If you’ve opened an account at a bank, applied for a government benefit, or provided a proof-of-age document on an ecommerce website, chances are you’ve had to share a physical or digital copy of a driver’s license or a passport as proof of your identity. For businesses and public sector organizations that need this information to provide services, processing images of identity documents has long been a time- and resource-intensive process. Solutions exist to help digitally capture the data, but they still require extensive human intervention, which impacts the speed and cost of processing and ultimately the time to serve customers.

The Google Cloud Document AI family of solutions is designed to help solve some of the hardest problems of data capture at scale by extracting structured data from unstructured documents, helping to reduce processing costs and improve business speed and efficiency. Today, we’re announcing the general availability of identity parsers that bring the power of Document AI to customer verification, KYC, and other identity-based workflows. With Document AI for Identity, businesses can leverage automation to extract information from identity documents with a high degree of accuracy, without having to bear the cost and turnaround time of manual tasks by a service provider.

Document AI for Identity leverages artificial intelligence to provide a set of pre-trained models that can parse identity documents. It supports US driver’s licenses (generally available), US passports (generally available), French driver’s licenses (preview), and French National ID cards (preview), with more documents from around the world to be added over the coming months.

When our customers process high-volume workloads or complex workflows, they need a high degree of accuracy, since getting the first step wrong can derail the entire workflow.
The introduction of dedicated parsers for identity processing can help solve one of the most commonly required document processing needs that our financial services and public sector customers face.

Along with the identity parsers, Google Cloud is also offering its “Human in the Loop” service, in which verification of a subset of identity documents can be automatically assigned to a pool of human reviewers (internal or external) for manual review, based on confidence scores.

While there are multiple industries and applications that could benefit from Document AI for Identity, we’ve seen two main kinds of applications adopted during the solution’s preview. The first is processing ID cards uploaded as unstructured images at scale, so that enterprises can have IDs on file. The second is performing advanced checks on identity documents to validate their authenticity and/or detect fraud. Google Cloud’s fraud detector API (currently in preview) can complement Document AI for Identity and apply an extra layer of verification to help validate the identity as a government-issued ID by checking for suspicious words, image manipulation, and other common issues with forged identity documents.

With new versions of driver’s licenses being released frequently, Document AI for Identity uses specialized models and constantly updated training data to help make sure the parsers offer a high degree of accuracy. For all use cases, Document AI does not retain any customer data after completing the processing request (successfully or with an error).

Check out this demo and visit the Document AI for Identity landing page for more information on how Document AI can help solve your identity processing needs, and ask your Google Cloud account team to help you integrate Document AI for Identity into your workflows.
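Confidence-based human-in-the-loop routing, as described above, can be sketched in a few lines: fields whose model confidence falls below a threshold are queued for manual review, and the rest are accepted automatically. The field names, scores, and threshold here are hypothetical; a real Document AI response exposes per-entity confidence values in the same spirit.

```python
# Sketch of confidence-based routing for parsed ID fields. This is an
# illustration of the pattern, not the actual Document AI client API:
# field names, confidence values, and the threshold are all made up.

REVIEW_THRESHOLD = 0.85  # assumed review policy, not a Document AI default

def route_fields(parsed_fields):
    """Split extracted fields into auto-accepted vs needs-human-review."""
    accepted, needs_review = {}, {}
    for name, (value, confidence) in parsed_fields.items():
        if confidence >= REVIEW_THRESHOLD:
            accepted[name] = value
        else:
            needs_review[name] = value  # queue for a human reviewer
    return accepted, needs_review

parsed = {
    "family_name": ("DOE", 0.98),
    "document_id": ("D1234567", 0.91),
    "date_of_birth": ("1990-01-31", 0.62),  # low confidence: route to a human
}
accepted, needs_review = route_fields(parsed)
print(sorted(accepted), sorted(needs_review))
```

Tuning the threshold trades review cost against error rate: a higher threshold sends more fields to humans but lets fewer model mistakes through.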

“Take that leap of faith”: Meet the Googler helping customers create financial inclusion

What brought you to Google?

Honestly, the opportunity fell into my lap. I had just graduated from NYU with an MS in business ops and hospitality, and a staffing agency for Google reached out — they were looking for hospitality majors to support recruitment, hosting an engineer for the day as they walked through their interview process. I took the chance!

Can you tell us a little bit more about your role as senior program manager at Google Cloud?

I am the lead program manager for our program to grow Cloud with Black+-owned businesses. We created this program to help enable digital acceleration for institutions playing a crucial role in combating systemic racism, and to increase their presence in the financial services industry. In my role, I work directly with customers, bringing together their vision with our engineering and innovation to help them see their future on cloud.

I’m proud to share that by aligning our mission with the right partners, the team has identified, integrated, and onboarded seven Black-owned financial services institutions onto Google Cloud. For example, we worked with First Independence, a Black-owned bank headquartered in Michigan that has been serving the local community — including small businesses — for 52 years. We partnered with a digital lending platform to help them digitize their loan process, allowing clients to quickly and easily apply for loans under the federal Paycheck Protection Program (PPP loans). Without the new tech infrastructure, many of their clients might have missed the opportunity to get this federal – and for many businesses, critical – support due to slow processes.

We started small and learned a lot along the way; now we want to expand to other industries. Helping one bank at a time creates a lasting impact. (You can read more about the banks here.)

How do you feel your background in hospitality supports your current role?

I like to think of myself as a problem solver.
It’s very cliché, but I really love working with people and helping them figure out how to get to their end goal. In this particular role, it’s working with customers, specifically financial institutions, that didn’t trust putting financial information on the cloud. Once I started to engage with these customers, I was able to build trust with them through a larger goal of helping the community. Once we built that rapport, they felt more comfortable. I want to help our underbanked communities be financially secure, have financial literacy, and build generational wealth. We now have other industries that have heard about us and want to learn more about the program.

Why do you think that cloud is well positioned to help advance financial inclusion?

We all know about the wealth gap. We all know about the education gap. Cloud technology can help: Cloud’s scale and flexibility could actually change, generationally, the lives of people who need help. We can really shift the focus to not just saying Black Lives Matter, saying a name, or wearing a t-shirt, but also empowering organizations to grow their impact and better serve their communities.

How would you describe your experience at Google?

From beginning my career here as a TVC and wanting to be a full-timer, to getting a role right before a pandemic, to being promoted last year: I still can’t believe it. I couldn’t be at a company if I didn’t align with what they were putting out there. I’m not a fake-it-’til-you-make-it kind of person. I’m very honest and transparent. And I feel that Google, at its core, is a great company.

Do you have advice for other people who may want to align their passion with their profession?

Take that leap of faith. I followed a great manager to this role, took a chance, and am so glad I did.
Source: Google Cloud Platform

Google Cloud enables inclusive financial services with Black-owned business initiative

In June of 2020, Sundar Pichai outlined a number of commitments Google will make to racial equity, starting with the Black+ community in the United States. As part of this initiative, we formed a team at Google Cloud to help Black entrepreneurs accelerate the growth of their businesses with cloud technology.

Racial equity is inextricably linked to economic opportunity. According to McKinsey, advancing racial equity would create new opportunities for participation in the economy for underrepresented individuals, resulting in significant benefits to businesses, families, and communities across the country. Black-owned financial institutions play a vital role in closing the racial wealth gap by providing greater access to financial products and services to historically underrepresented and underserved communities. That is why we decided to focus our initial efforts on empowering Black entrepreneurs and Black businesses in the financial services industry.

Together with partners like Uncommon Impact Studio, World Wide Technology, and Zencore, we aim to bring data, technology, and marketing capabilities that are uniquely Google to Black-owned banks and fintechs. By implementing cloud technologies, seven Black-owned financial institutions have been able to accelerate their digital transformation, scale their businesses, and connect their products and services to the people who need them most. Let's dive a little deeper into a few companies that are part of the initiative.

BetaBank: Improving access to capital for small businesses

BetaBank recently announced its FDIC application to become one of the first digitally native banks built from the ground up on Google Cloud. BetaBank founder Seke Ballard recognized early that the financial lending system was broken, and he identified technology as the key to removing bias from small business lending. Ballard created an AI algorithm to weigh risk and calculate the qualification of an SMB loan application with more accuracy and speed, and at a lower cost, than traditional banks.

BetaBank's mission is to provide small business owners equitable access to financial services. Ballard and his team selected Google Cloud as the cloud infrastructure on which to build, run, and manage BetaBank. Google Cloud will provide a scalable, secure infrastructure to grow BetaBank's business and networks, and the tools to support regulatory compliance, fraud prevention, and overall security.

OneUnited Bank: Delivering personalized customer experiences

OneUnited Bank is one of the first and largest Black-owned digital banks in the United States. OneUnited Bank worked with Google Cloud to implement Contact Center AI, a Google Cloud platform that enables companies to leverage AI to transform the performance of their call centers. The company also implemented Google Ads Search campaigns to connect with new customers.

"We recognize that there are things we can do that will 10x this company and there are ways that Google can help us." - Jim Slocum, CIO, OneUnited Bank

OneUnited paired Contact Center AI with its existing technology, and leveraged Dialogflow, Google Cloud's conversational AI natural language understanding platform, to create a more personalized customer experience and scale its contact center interactions. The success of the deployment was revelatory to OneUnited as to what cloud and AI technologies can do for the company and its customers.

First Independence Bank: Modern infrastructure for better community lending

First Independence Bank is the only Black-owned bank headquartered in Michigan and has been serving the local community in Detroit for over 52 years. To ensure the bank could compete in the future, its legacy systems needed a digital upgrade. In September 2021, First Independence Bank partnered with a digital lending platform for business banking to speed up its digital federal Paycheck Protection Program (PPP) loan application process as a convenience to its PPP loan applicants. As part of this partnership, First Independence Bank has committed to migrate onto Google Cloud to create a more efficient lending process for customers.

Data Capital Management: Harnessing the power of AI

Data Capital Management (DCM) is a digital investment research, services, and advisory firm whose CEO and co-founder Michael Beal knew early on the power that artificial intelligence (AI) and machine learning (ML) can bring to the fund management industry. DCM worked closely with Google Cloud engineers to enhance its current offerings of DCM AI models ("AI Traders") that investors can leverage to manage their stock holdings and digital wallets. Training AI models requires massive amounts of data and compute power. As the firm's operations grew, the opportunity to optimize performance with Google Cloud was a primary factor in the decision to migrate DCM's DCM.ai investor portal and all supporting investment research, execution, and reporting features from their legacy provider to Google Cloud.

What's next?

Building on our commitment to increase racial equity through technology, we are expanding this program beyond financial services to bring the full value of Google Cloud to other industries, including education, entertainment, healthcare, and clean energy. If your company is interested in getting involved, please fill out this form.

Achieving cloud-native network automation at a global scale with Nephio

In 2007, in order to meet the ever-increasing traffic demands of YouTube, Google started building what is now the Google Global Cache program. Over the past 15 years, we have added thousands of edge caching locations around the world, with widely varying hosting conditions: some in customer data centers, some in remote locations with limited connectivity. Google manages the software and hardware lifecycles of all these systems remotely. Although the fleet size and serving corpus have grown by several orders of magnitude during this time, the operations team overseeing it has remained relatively small and agile. How did we do it?

We started with a set of automation tools for software deployment (remotely executing commands), a set of tools for auditing and repairs (if this condition occurs, run that command), and a third set of tools for configuration management. As the fleet grew and was deployed in more varied environments, we discovered and fixed more edge cases in our automation tools. Soon, the system started reaching its scaling limits, and we built a new, more uniform and more scalable system in its place. We learned a few key lessons in the process:

- Intent-driven, continuously reconciling systems are more robust at scale than imperative, fire-and-forget tools.
- Distributed actuation of intent is a must for large-scale edge deployments. Triggering all actions from a centralized location is not reliable and does not scale.
- Uniformity in systems is easier to maintain. Being able to manage deployment, repairs, and configuration using common components and common workflows (in other words, files checked into a repository with presubmit validation, review, version control, and rollback capability) reduces cognitive load for the operations team and allows more rapid response with fewer human errors.

This pattern repeats time and time again across many large distributed systems at Google, and we believe these tenets are key as network function vendors and communication service providers look to adopt cloud-based network technologies. For example, in a 5G deployment involving hundreds of locations (or many hundreds of thousands, in the case of RAN), with containerized software components, the industry needs better tools to handle deployment and operations at scale. Working with the community to address these issues, we hope to drive a common Kubernetes-based, cloud-native network automation architecture, while also providing extension points for vendors to innovate and adapt to their specific requirements.

That's why Google Cloud founded the Nephio project in April 2022. The Nephio community launched with 24 founding organizations and has doubled in size since launching. In addition to the founding members, new participating organizations include Vodafone, Verizon, Telefonica, Deutsche Telekom, KT, HPE, Red Hat, Wind River, Tech Mahindra, and others. Over 150 developers across the globe participated in the community kickoff meeting hosted by the Linux Foundation on May 17, 2022.

Google Cloud is collaborating with communication service providers, network function vendors, and cloud providers in Nephio by:

- Working with the community to refine the cloud-native automation architecture, and to define a common data model based on the Kubernetes Resource Model (KRM) and the Configuration as Data (CaD) approach. This new model needs to support cloud infrastructure, network function deployment, and management of user journeys.
- Contributing to the development of an open, fully functional reference implementation of this architecture.
- Open sourcing several key building blocks, such as kpt, Porch, and Config Sync. We are also planning to open source controllers, Google Cloud infrastructure CRDs, additional sample NF CRDs, and operators to jumpstart the Nephio project.

Google Cloud will also integrate Nephio with our Google Distributed Cloud Edge platform, combining the advantages of a fully managed hardware platform with Nephio-powered deployment and management of network functions for our customers.

The Nephio community is complementary to many existing open source communities and standards. Nephio is working closely with adjacent communities in CNCF, LF Networking, and LF Edge to provide an end-to-end automation framework for telecommunication networks.

By working with the community in this open manner, we believe that, together, we can advance the state of the art of network automation, improving the deployment and management of network functions on cloud-native infrastructure. We welcome the industry to join us in this effort. For more information, please visit the Nephio website at www.nephio.org. And please register to join us online or in person at the Nephio developer summit on June 22 and 23.
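The first lesson above, that intent-driven, continuously reconciling systems beat imperative fire-and-forget tools at scale, can be illustrated with a minimal sketch. This is not Nephio or Google code; the `Intent`, `reconcile`, and "upgrade" names are illustrative assumptions:

```python
# A minimal sketch of an intent-driven reconciliation loop: declare desired
# state, observe actual state, and emit only the actions needed to converge.
# Because the loop runs continuously, transient failures are retried for free,
# unlike a one-shot imperative command that fails silently.

from dataclasses import dataclass

@dataclass(frozen=True)
class Intent:
    """Declared desired state for one edge node (e.g. target software version)."""
    node: str
    version: str

def reconcile(intent, observed):
    """Return the action needed to move observed state toward the intent,
    or None if the node has already converged."""
    current = observed.get(intent.node)
    if current == intent.version:
        return None                            # already converged; do nothing
    return ("upgrade", intent.node, intent.version)

# One pass over a small fleet: only divergent nodes produce actions.
intents = [Intent("edge-a", "v2"), Intent("edge-b", "v2")]
observed = {"edge-a": "v2", "edge-b": "v1"}
actions = [a for i in intents if (a := reconcile(i, observed))]
print(actions)  # [('upgrade', 'edge-b', 'v2')]
```

Running this loop repeatedly against fresh observations is what makes the pattern robust: the same code handles first-time rollout, drift, and recovery after failure.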

The Retirement Tracker simplifies and socializes early retirement on Google Cloud

A lot of people talk about retirement, but far fewer people have the information and tools to plan for it properly. Just how much money you need to live comfortably once you stop working can be the million-dollar (or more!) question. Although there is no shortage of retirement calculators, many only provide a limited one-time analysis and require detailed personal information that may be sold to third parties. We developed The Retirement Tracker with one idea: to empower individuals to take control over their retirement planning with tools to easily plan, track, and even socialize their early retirement.

With The Retirement Tracker, people can aggregate their financial accounts, including savings, 401(k)s, and stock portfolios, in one safe, convenient retirement app. The Retirement Tracker analyzes real-time data from these accounts to track net worth and automatically update retirement targets. A small part of this information, such as stock transactions, can even be shared among people's self-created investment groups to encourage information sharing and friendly competition.

Scaling up for early retirement on Google Cloud

When building The Retirement Tracker, we needed a technology partner that would enable us to securely and effectively scale while saving time and administrative costs. That's why we started working with Google Cloud and joined the Google for Startups Cloud Program. Google Cloud gives us a highly secure-by-design infrastructure, valuable cloud credits to obtain products from an expansive technology platform, access to dedicated startup experts, and the potential to join the Google Cloud Marketplace.

Even though we are a small team, we innovate quickly and easily on Google Workspace using Gmail, Google Docs, Sheets, Calendar, and Meet. We also store and protect all sensitive company documents on Google Cloud and post our "Restimators" investment video series on YouTube.
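The aggregation idea described above, folding balances from several linked accounts into one net-worth view and a progress-to-target figure, can be sketched in plain Python. Everything here (the `Account` helper, the balances, the retirement target) is a made-up illustration, not The Retirement Tracker's actual code or APIs:

```python
# Illustrative sketch: combine the latest synced balances from multiple
# linked accounts into a single net-worth figure, then express it as a
# fraction of a retirement target. All figures are invented examples.

from dataclasses import dataclass

@dataclass
class Account:
    name: str
    balance: float  # latest synced balance in USD

def net_worth(accounts):
    """Total across all linked accounts."""
    return sum(a.balance for a in accounts)

def progress(accounts, target):
    """Fraction of the retirement target covered by current net worth."""
    return net_worth(accounts) / target

accounts = [
    Account("401k", 120_000.0),
    Account("brokerage", 45_000.0),
    Account("savings", 15_000.0),
]
print(net_worth(accounts))                       # 180000.0
print(round(progress(accounts, 1_000_000), 3))   # 0.18
```

In the real app, the balances would arrive from linked institutions rather than literals, and the target would be recomputed as accounts update; the arithmetic of the dashboard view is this simple fold.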
More recently, we've adopted Firebase to scale and manage our infrastructure while accelerating the development of The Retirement Tracker. In just days, we implemented Plaid authentication and authorization protocols, enabling customers to quickly and securely connect details about their investment and savings accounts to The Retirement Tracker. This could have taken us months if we had had to build these security capabilities from scratch.

Firebase now delivers a seamless customer experience by aggregating and displaying near-real-time data from multiple financial accounts on a single dashboard. On the back end, Firebase automatically queries read-only tokens, securely accesses account balance changes, and encrypts sensitive data in the cloud. Firebase also makes it easy for customers to administer internal investment groups and selectively socialize information such as stock purchases and sales, without revealing transaction quantities or prices. Customers create these small, invite-only groups to help family and friends improve their retirement portfolios with friendly competition and strategic crowdsourcing. Customers can also participate in additional investment discussions hosted by The Retirement Tracker on Discord.

Building a sustainable financial future

Since we started using Google and Google Cloud solutions, everything is easier to build and scale. We constantly refine the customer experience with new features and services, while leaving our IT and cloud infrastructure in the hands of Google Cloud experts. Demand for our app is growing fast as we prepare to move The Retirement Tracker out of beta in 2022. Moving forward, we're excited to continue to grow in the Google for Startups Cloud Program and to work with our dedicated Google team to improve the observability and reliability of The Retirement Tracker so it can handle the volume of users we're anticipating in 2023 and beyond.

To help us do so, we're exploring additional Google Cloud solutions such as Looker, BigQuery, and Cloud Spanner. These solutions will enable us to rapidly expand our services and offer customers a variety of new benefits from using The Retirement Tracker. Our participation in the Google for Startups Cloud Program has been instrumental to our success; our Startup Success Manager has worked with our team to identify programs we could apply to in order to strengthen our relationship even further.

With Google Cloud, we're making early retirement easier and more accessible on one convenient, highly secure mobile app. We can't wait to see what we accomplish next as we drive innovation and financial inclusion by empowering people to plan, track, and socialize retirement planning that can be at once so important and so difficult for so many people.

If you want to learn more about how Google Cloud can help your startup, visit our page here to get more information about our program, and sign up for our communications to get a look at our community activities, digital events, special offers, and more.

Snap Inc. adopts Google Cloud TPU for deep learning recommendation models

While many people still think of academic research when it comes to deep learning, Snap Inc. has been applying deep learning models to improve its recommendation engines on a daily basis. Using Google's Cloud Tensor Processing Units (TPUs), Snap has accelerated its pace of innovation and model improvement to enhance the user experience. Snap's blog, Training Large-Scale Recommendation Models with TPUs, tells the story of how the Snap ad ranking team leveraged Google's leading-edge TPUs to train deep learning models quickly and efficiently. But there's a lot more to the story than the how, and that's what we're sharing here.

Faster leads to better

Snap's ad ranking team is charged with training the models that make sure the right ad is served to the right Snapchatter at the right time. With more than 300 million daily users and millions of ads to rank, training models quickly and efficiently is a large part of a Snap ML engineer's daily workload. It's simple, really: the more models Snap's engineers can train, the more likely they are to find the models that perform better, and the less it costs to do so. Better ad recommendation models translate to more relevant ads for users, driving greater engagement and improving conversion rates for advertisers.

Over the past decade, there has been tremendous evolution in the hardware accelerators used to train large ML models like those Snap uses for ad ranking: from general-purpose multicore central processing units (CPUs) to graphics processing units (GPUs) to TPUs. TPUs are Google's custom-developed application-specific integrated circuits (ASICs) used to accelerate ML workloads, designed from the ground up to minimize time to accuracy when training large models. Models that previously took weeks to train on other hardware platforms can now be trained in hours on TPUs, a product of Google's leadership and experience in machine learning (dig into the technology in Snap's blog).

Benchmarking success

Snap wanted to understand for itself what kind of improvements in training speed it might see using TPUs. So the Snap team benchmarked model training using TPUs versus both GPUs and CPUs, and the results were impressive. GPUs underperformed TPUs in terms of both throughput and cost, with a 67 percent reduction in throughput and a 52 percent increase in costs when using GPUs. Similarly, TPU-based training drastically outperformed CPU-based training for Snap's most common models. For example, for their standard ad recommendation model, TPUs slashed processing costs by as much as 74 percent while increasing throughput by as much as 250 percent, all at the same level of accuracy.

Because the TPU embedding API is a native, optimized solution for embedding-based operations, it performs embedding computations and lookups more efficiently. This is particularly valuable for recommenders, which have additional requirements such as fast embedding lookups and high memory bandwidth.

Benefits across the board

For Snap's ad ranking team, those improvements translate into tangible workflow advantages. It's not unusual for Snap to have a month's worth of data that includes all the logs of users who were shown particular ads and a record of whether they interacted with an ad or not. That means it has millions of data points to process, and Snap wants to model them as quickly as possible so it can make better recommendations going forward. It's an iterative process, and the faster Snap can get the results from one experiment, the faster its engineers can spin up another with even better results; they'd much prefer to do that in hours rather than days. Increased efficiency and velocity benefit Snapchatters, too.
The better the models are, the more likely they are to correctly predict the likelihood that a given user will interact with a particular ad, improving the user experience and boosting engagement. Improved engagement leads to higher conversion rates and greater advertiser value, and given the volume of ads and users Snap deals with, even a one percent improvement has real monetary impact.

Working at the leading edge

Snap is working hard to improve its recommendation quality, with the goal of delivering greater value to advertisers and a better experience for Snapchatters. That includes going all-in on leading-edge solutions like Google TPUs that allow its talented ML engineers to shine. Now that you know the whole story, see how Snap got there with the help of Google: Training Large-Scale Recommendation Models with TPUs.
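As a closing aside, the GPU-versus-TPU percentages quoted in the benchmarking section can be folded into a single cost-per-trained-example figure. A minimal sketch of the arithmetic, using the numbers reported above (the helper name is our own, not Snap's):

```python
# Sketch: translate a cost change and a throughput change into relative cost
# per trained example. Cost per example scales with cost and inversely with
# throughput, so the two reported percentages compound.

def cost_per_example_ratio(cost_increase_pct, throughput_drop_pct):
    """Relative cost per example of platform B vs. platform A, given B's
    percentage cost increase and throughput reduction relative to A."""
    cost_factor = 1 + cost_increase_pct / 100          # e.g. +52% -> 1.52
    throughput_factor = 1 - throughput_drop_pct / 100  # e.g. -67% -> 0.33
    return cost_factor / throughput_factor

# GPU vs. TPU in Snap's benchmark: 52% higher cost at 67% lower throughput.
gpu_vs_tpu = cost_per_example_ratio(52, 67)
print(f"GPU cost per example is about {gpu_vs_tpu:.1f}x TPU")
```

With the reported figures, the ratio comes out to roughly 4.6x: the two percentages multiply rather than add, which is why the workflow impact is larger than either number suggests on its own.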