At Google I/O, generative AI gets to work

Over the past decade, artificial intelligence has evolved from experimental prototypes and early successes to mainstream enterprise use. And the recent advancements in generative AI have begun to change the way we create, connect, and collaborate. As Google CEO Sundar Pichai said in his keynote, every business and organization is thinking about how to drive transformation. That's why we're focused on making it easy and scalable for others to innovate with AI.

In March, we announced exciting new products that infuse generative AI into our Google Cloud offerings, empowering developers to responsibly build with enterprise-level safety, security, and privacy. They include Gen App Builder, which lets developers quickly and easily create generative chat and enterprise search applications, and Generative AI support in Vertex AI, which expands our machine learning development platform with access to foundation models from Google and others to quickly build, customize, and deploy models. We also introduced our vision for Google Workspace, and delivered generative AI features to trusted testers in Gmail and Google Docs that help people write.

Last month, we introduced Security AI Workbench, an industry-first extensible platform powered by our new LLM security model, Sec-PaLM, which incorporates Google's unique visibility into the evolving threat landscape and is fine-tuned for cybersecurity operations.

Today at Google I/O, we are excited to share the next steps not only in our own AI journey, but also in those of our customers and partners. We've already seen a number of organizations begin to develop with and deploy our generative AI offerings. These organizations have been able to move their ideas from experimentation to enterprise-ready applications with the training models, security, compute infrastructure, and cost controls needed to provide their customers with transformative experiences.
Our open ecosystem, which provides opportunities for every kind of partner, continues to grow as well. And we are also pleased to share new services and capabilities across Google Cloud and Workspace, including Duet AI, our AI-powered collaborator, to enable more users and developers to start seeing the impact AI can have on their organization.

Customers bringing ideas to life with generative AI

Leading companies in a variety of industries, including eDreams ODIGEO, GitLab, and Oxbotica, are using our generative AI technologies to create engaging content, synthesize and organize information, automate business processes, and build amazing customer experiences. A few examples we showcased today include:

Adore Me, a New York-based intimate apparel brand, is creating production-worthy copy with generative AI features in Docs and Gmail. This is accelerating projects and processes in ways that even surprised the company.

Canva, the visual communication platform, uses Google Cloud's rich generative AI capabilities in language translation to better support its non-English-speaking users. Users can now easily translate presentations, posters, social media posts, and more into over a hundred languages. The company is also testing ways that Google's PaLM technology can turn short video clips into longer, more compelling stories. The result will be a more seamless design experience while growing the Canva brand.

Character.AI, a leading conversational AI platform, selected Google Cloud as its preferred cloud infrastructure provider because we offer the speed, security, and flexibility required to meet the needs of its rapidly growing community of creators. We are enabling Character.AI to train and serve LLMs faster and more efficiently, and enhancing the customer experience by inspiring imagination, discovery, and understanding.
Deutsche Bank is testing Google's generative AI and large language models (LLMs) at scale to provide new insights to financial analysts, driving operational efficiencies and execution velocity. There is an opportunity to significantly reduce the time it takes to perform banking operations and financial analysts' tasks, empowering employees by increasing their productivity while helping to safeguard customer data privacy, data integrity, and system security.

Instacart is always looking for opportunities to adopt the latest technological innovations. By joining the Workspace Labs program, its teams have access to the new features and can discover how generative AI will make an impact for them.

Orange is exploring a next-generation contact center with Google Cloud. With customers in 26 countries, the global telecommunications firm is testing generative AI to transcribe calls, summarize the exchange between customers and service representatives, and suggest possible follow-up actions to the agent based on the discussion. This experiment has the potential to dramatically improve both the efficiency and quality of customer interactions. Orange is working closely with Google to help ensure data protection, and to make sure that systematic employee review of generative AI output, as well as transparency, can be implemented.

Replit is developing a collaborative software development platform powered by AI. Developers using Replit's Ghostwriter coding AI already have 30% of their code written by generative AI today. With real-time debugging of code output and context awareness of a program's files, Ghostwriter frees up developers' time for the more challenging and creative aspects of programming.

Uber is creating generative AI for customer-service chatbots and agent-assist capabilities, which handle a range of common service issues with human-like interactions, with the aim of achieving greater customer satisfaction and cost efficiency.
Additionally, Uber is working on using our synthetic data systems (a technique for improving the quality of LLMs) in areas like product development, fraud detection, and employee productivity.

Wendy's is working with Google Cloud on a groundbreaking AI solution, Wendy's FreshAI, designed to revolutionize the quick-service restaurant industry. The technology is transforming Wendy's drive-thru food ordering experience with Google Cloud's generative AI and LLMs, with the ability to discern the billions of possible order combinations on the Wendy's menu. In June, Wendy's plans to launch its first pilot of the technology in a Columbus, Ohio-area restaurant, before expanding to more drive-thru locations.

Leading companies build with generative AI on Google Cloud

Partnering creates a strong ecosystem of real-world options for customers

At Google Cloud, we are dedicated to being the most open hyperscale cloud provider, and that includes our AI ecosystem. Today, we are excited to expand upon the partnerships announced earlier this year for every layer of the AI stack: chipmakers, companies building foundation models and AI platforms, technology partners enabling companies to develop and deploy machine learning (ML) models, app builders solving customer use cases with generative AI, and global services and consulting firms that help enterprise customers implement all of this technology at scale. We announced new or expanded partnerships with SaaS companies like Box, Dialpad, Jasper, Salesforce, and UKG, and with consultancies including Accenture, BCG, Cognizant, Deloitte, and KPMG. Together with our previous announcements with companies like AI21 Labs, Aible, Anthropic, Anyscale, Bending Spoons, Cohere, Faraday, Glean, Gretel, Labelbox, Midjourney, Osmo, Replit, Snorkel AI, Tabnine, Weights & Biases, and many more, they provide a wide range of options for businesses and governments looking to bring generative AI into their organizations.
Introducing new generative AI capabilities for Google Cloud

To help cloud users of all skill levels solve their everyday work challenges, we're excited to announce Duet AI for Google Cloud, a new generative AI-powered collaborator. Duet AI serves as your expert pair programmer and assists cloud users with contextual code completion, offering suggestions tuned to your code base, generating entire functions in real time, and assisting you with code reviews and inspections. It can fundamentally transform the way cloud users of all skill sets build new experiences, and it is embedded across Google Cloud interfaces: within the integrated development environment (IDE), the Google Cloud Console, and even chat.

For developers looking to create generative AI applications more simply and efficiently, we are also introducing new foundation models and capabilities across our Google Cloud AI products. And to continue to enable and inspire more customers and partners, we are opening up generative AI support in Vertex AI and expanding access to many of these new innovations to more organizations.

New foundation models are now available in Vertex AI. Codey, our code generation foundation model, helps accelerate software development with code generation, code completion, and code chat. Imagen, our text-to-image foundation model, lets customers generate and customize studio-grade images. And Chirp, our state-of-the-art speech model, allows customers to more deeply engage with their customers and constituents inclusively in their native languages with captioning and voice assistance. Each can be accessed via APIs, tuned through our intuitive Generative AI Studio, and features enterprise-grade security and reliability, including encryption, access control, content moderation, and recitation capabilities that let organizations see the sources behind model outputs.
Text Embeddings API is a new API endpoint that lets developers build recommendation engines, classifiers, question-answering systems, similarity matching, and other sophisticated applications based on semantic understanding of text or images. Reinforcement Learning from Human Feedback (RLHF) allows organizations to incorporate human feedback to deeply customize and improve model performance.

Underpinning all of these innovations is our AI-optimized infrastructure. We provide the widest choice of compute options among leading cloud providers, and we are excited to continue to build them out with the introduction of new A3 Virtual Machines based on NVIDIA's H100 GPU. These VMs, alongside the recently announced G2 VMs, offer a comprehensive range of GPU power for training and serving AI models.

Extending generative AI across Google Workspace

Earlier this year, we shared our vision for bringing generative AI to Workspace, and gave many users early access to features that helped them write in Gmail and Google Docs. Today, we are excited to announce Duet AI for Google Workspace, which brings together our powerful generative AI features and lets users collaborate with AI so they can get more done every day. We're delivering the following features to trusted testers via Workspace Labs:

In Gmail, we're adding the ability to draft responses that consider the context of your existing email thread, and we're making the experience available on mobile.

In Google Slides and Meet, we're enabling you to easily generate images from text descriptions. Custom images in slides can help bring your story to life, and in Meet they can be used to create custom backgrounds.

In Google Sheets, we're automating data classification and the creation of custom plans, helping you analyze and organize data faster than ever.
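Returning to the Text Embeddings API mentioned earlier: once documents and queries are represented as embedding vectors, similarity matching reduces to simple vector math, typically cosine similarity. The following minimal sketch illustrates that final step only; the 4-dimensional vectors and document names are made up for illustration, whereas real API embeddings have hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional "embeddings" standing in for real API output.
docs = {
    "refund policy": [0.9, 0.1, 0.0, 0.2],
    "shipping times": [0.1, 0.8, 0.3, 0.0],
}
# Hypothetical embedding of the query "how do I get my money back?"
query = [0.85, 0.15, 0.05, 0.1]

# Rank documents by semantic similarity to the query embedding.
best = max(docs, key=lambda name: cosine_similarity(query, docs[name]))
print(best)  # -> refund policy
```

The same pattern underlies the recommendation, classification, and question-answering use cases the API targets: everything is compared in embedding space rather than by keyword overlap.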
Moving the industry forward, responsibly

Customers continue to amaze us with their ideas and creativity, and we look forward to continuing to help them discover their own paths forward with generative AI. While the potential for impact on business is great, we remain committed to taking a responsible approach, guided by our AI Principles. As we gather more feedback from our customers and users, we will continue to bring new innovations to market, with a goal to enable organizations of every size and industry to increase efficiency, connect with customers in new ways, and unlock entirely new revenue streams.
Source: Google Cloud Platform

Dialing up the impact of digital natives in the MENA region with Google Cloud

Entrepreneurs with the passion to drive positive impact have been selecting the Middle East and North Africa (MENA) region as the launchpad for their businesses since 2015, based on insights from Google Cloud's digital natives unit. Today, the region is taking center stage thanks to the thousands of startups, digital natives, and web3 companies thriving there. With more than 5,500 technology startups in the region, a favorable business climate, ample access to venture capital, and digital transformation a top priority on government agendas, digital natives in the region have been on the rise.

Forbes recently announced the Top 50 most-funded startups in the MENA region. Collectively, these companies raised a whopping USD 3.2 billion in 2022, with startups in the region continuing to attract significant funding in comparison to other regions. The list also highlighted that UAE-based companies were the most represented, raising USD 964 million in total funding for that year, followed by the Kingdom of Saudi Arabia (KSA), where USD 946.7 million was raised, and Egypt in third place with USD 508.5 million.

On the list, UAE-based fintech Tabby ranked second with USD 275 million in funds, and Sary, a Saudi-based online marketplace, came in seventh after securing USD 112 million. Breadfast, an on-demand supermarket and household essentials provider based in Egypt, also secured USD 26 million in funds during the same year.

Tech-enabled success for digital natives in the MENA region

The common factor between companies such as Tabby, Sary, and Breadfast is that they are all fully tech-enabled businesses running on Google Cloud.
These three companies leverage Google Cloud's scalable, secure, and reliable platform and innovative cloud solutions to create seamless experiences every day for their customers across KSA, the United Arab Emirates (UAE), Egypt, Kuwait, and Pakistan.

Tabby provides "buy now, pay later" solutions via an online application that has been built on Google Cloud from day one. Tabby has successfully grown a customer base of 2.5 million active shoppers in the region since its start in 2019, supported by the scalability of Google Cloud, which enables uninterrupted and secure financial services for customers. With an online retail boom on the horizon for the MENA region, Tabby is poised for a growth trajectory as the volume of active e-shoppers continues to rise and more markets become activated in the region's digital economy. Tabby's development team is able to stay several strides ahead of market demand by developing a seamless and innovative product that can accommodate an average of 10 million shoppers per day. By running the entire IT infrastructure on Google Cloud, the team can dedicate its time and resources to what is important to the business, providing a product that caters to customer and market requirements, rather than exhaust resources on time-consuming tasks such as the daily management of IT assets.

Tabby also believes in the power of big data and turns to Google Cloud's data analytics solutions such as BigQuery to roll out new monetary policies for customers. Before a new credit policy is introduced to shoppers, Tabby tests its viability on BigQuery and analyzes different implementation scenarios in real time to gauge its effectiveness. This helps the team roll out policies that have been proven to be effective with shoppers.

Throughout the year, the MENA region experiences peaks in shopping cycles connected with local festivities such as the holy month of Ramadan, White Friday, and Christmas.
It is around these high-peak shopping periods that Tabby's application experiences significant spikes, as the team manages 140 million requests per day in comparison to 80 million requests on a regular day. Nonetheless, with the support of Google Cloud's scalable infrastructure, Tabby holds a record of zero downtime during peak periods, and can scale operations successfully with low latency, ultimately locking in excellent service for customers.

"From the first day Tabby went live in 2019 to date, we have experienced zero downtime in our systems during high-traffic periods because of Google Cloud's scalable and flexible infrastructure. We are able to support 2.5 million shoppers across the Middle East because we run on a robust and reliable infrastructure. Scalability is key for the team at Tabby. We are able to build new products very quickly on Google Cloud in comparison to other cloud providers."

A report by eCommerce DB revealed that Saudi Arabia is the 27th-largest market globally for e-commerce, with a projected revenue of USD 11,977 million by the end of 2023. Mordor Intelligence also revealed that the Saudi e-commerce market is expected to show a compound annual growth rate (CAGR 2023-2027) of 13.9%, resulting in a projected market volume of USD 20,155.8 million by 2027. Enter Sary, a Saudi-based B2B marketplace that connects businesses of all sizes to millions of shoppers in Saudi Arabia, Egypt, and Pakistan via mobile and web applications.
Sary is not a typical marketplace; it aims to support local businesses and empower homegrown names to reach customers at scale via its platform in the countries where it operates. Sary is home to 70,000 businesses from all walks of life, and as the company set out to expand its footprint, it was time to move away from an unsophisticated cloud setup to a more advanced and robust cloud provider offering the security and scalability to support its plans to tap into new markets.

Sary attributes a big part of its success to running a robust infrastructure on Google Cloud, having witnessed an 84% increase in operational system throughput since migrating its entire IT infrastructure. This means that businesses relying on the platform as their main marketplace are able to process orders at scale without any downtime or system interruptions, and to generate positive revenue streams. Sary also leverages Google Kubernetes Engine (GKE) to automatically scale system bandwidth based on the volume of traffic the website or application receives. This solution helps the company manage IT costs effectively while still delivering an uncompromised service to customers.

"The support we receive every day from the Google Cloud team has been phenomenal. They have been with us every step of the way. We are able to free up time to focus on what is important, and that is to deliver business value to our customers who depend on Sary for their success."

Egypt is another country that has risen as a strategic player in the MENA digital natives scene over recent years. The 2022 Egypt Venture Investment Report revealed that the startup ecosystem observed a 168% year-on-year increase in capital investments, reaching a new all-time high of USD 491 million. Breadfast is one of the companies disrupting the scene in Egypt, an early adopter of operating a cloud-native supply chain before the arrival of rapid online grocery delivery companies in the country.
Now a household name, Breadfast is a cloud-native, on-demand supermarket and household essentials provider that delivers to over 200,000 homes in Cairo. The team at Breadfast built a fully tech-enabled business across all operational touchpoints, comprising manufacturing facilities, supply fulfillment points, 30 dark stores, 15 specialized coffee outlets, and last-mile delivery.

Running a tech-driven business generates additional costs that can be optimized when working with a cloud provider. Ever since Breadfast migrated its entire IT infrastructure to Google Cloud in 2022, the company has become more profitable: operating costs were reduced by 35% while system throughput improved with the support of Google Cloud's scalable and secure infrastructure. To fulfill its brand promise of product delivery within 60 minutes anywhere in Cairo, Breadfast also turns to Google Cloud's resilient infrastructure, which delivers efficient operational throughput to ensure no interruptions affect server vitality or impact order-processing timelines. Breadfast has increased system uptime to 99.5% since it migrated to Google Cloud, and was able to deliver six million orders across the city within a span of 30 minutes in 2022.

"In our line of business, time is of the essence. Two minutes of downtime in our systems takes 12 hours to fix on the ground, which can have a downward impact on our customers. We decided to migrate our IT infrastructure to Google Cloud as our trusted cloud provider because of its resilience, and operational uptime is now at 99.5% ever since we made the move. This enabled Breadfast to deliver millions of orders in 2022."

Build your business with Google Cloud

Google Cloud has opened up its secure and scalable infrastructure to businesses in the Middle East and North Africa region, where artificial intelligence (AI) and machine learning (ML) are embedded in cloud solutions that bring meaning to data and can help automate almost everything.
Google Cloud also provides digital natives with the freedom to run applications where they need them with open, hybrid, and multicloud solutions. This way, an application is built once and can run anywhere, even on-premises.

With no configuration required, digital natives can access limitless data effortlessly with Google Cloud solutions such as BigQuery and Looker. These data analytics solutions serve as a single source of truth, relying on AI and ML to provide a deep understanding of customer data. Powered by a data-driven understanding of customers, businesses today can preempt customer trends and bring them the right products and solutions based on their needs. Businesses can also accurately track granular information, such as whether a driver delivered an order on time, or which item needs to be restocked in a warehouse.

Google Cloud provides data loss prevention solutions that help digital natives encrypt critical data like customer information and financial records. Businesses can also discover, classify, and protect their most sensitive data, and detect customer churn or fraudulent activity using machine learning capabilities embedded in BigQuery.

To help entrepreneurs in the MENA region supercharge business growth, Google Cloud runs the Google for Startups Cloud Program, which offers access to startup experts, cloud cost coverage of up to USD 100,000 for each of the first two years, technical training, business support, and Google-wide offers. Sign up here for the program.

Note: All customer metrics featured in this blog post were derived from direct customer interviews with Google Cloud.
Source: Google Cloud Platform

Committed use discounts for RHEL and RHEL for SAP now available on Compute Engine

Optimizing your costs is a major priority for us here at Google Cloud. We are pleased to announce the general availability of committed use discounts ("CUDs") for Red Hat Enterprise Linux and Red Hat Enterprise Linux for SAP. If you run consistent and predictable workloads on Compute Engine, you can utilize CUDs to save on Red Hat Enterprise Linux subscription costs by as much as 24% compared to on-demand (or "PAYG") prices.

"Red Hat Enterprise Linux on Google Cloud provides a consistent foundation for hybrid cloud environments and a reliable, high-performance operating environment for applications and cloud infrastructure. The introduction of committed use discounts for Red Hat Enterprise Linux for Google Cloud makes it even easier for customers to deploy on the world's leading enterprise Linux platform to unlock greater business value in the cloud." — Gunnar Hellekson, Vice President and General Manager, Red Hat Enterprise Linux Business Unit, Red Hat

What are committed use discounts for Red Hat Enterprise Linux?

Red Hat Enterprise Linux and Red Hat Enterprise Linux for SAP committed use discounts (collectively referred to as "Red Hat Enterprise Linux CUDs") are resource-based commitments available for purchase in one- or three-year terms. When you purchase Red Hat Enterprise Linux CUDs, you are committing to paying the monthly Red Hat Enterprise Linux subscription fees for the duration you've selected, for the number of licenses you specify, regardless of your actual usage. In exchange, you can save as much as 24% on Red Hat Enterprise Linux subscription costs compared to on-demand rates. Because you are billed monthly regardless of actual Red Hat Enterprise Linux usage, CUDs are ideal for predictable, steady-state usage, to maximize your savings and make for easier budget planning.
How do committed use discounts work for Red Hat Enterprise Linux?

Red Hat Enterprise Linux CUDs are project- and region-specific, similar to the other software license CUDs available today. This means you will need to purchase Red Hat Enterprise Linux CUDs in the same region and project as the instances consuming these subscriptions. After you purchase Red Hat Enterprise Linux CUDs, discounts automatically apply to any running virtual machine (VM) instances within the selected project in the specified region. If you have multiple projects under the same billing account, commitments can also be shared across projects by turning on billing account sharing.

When commitments expire, your running VMs continue to run at on-demand rates. It is important to note that after you purchase a commitment, you cannot edit or cancel it. You must pay the agreed-upon monthly amount for the duration of the commitment. Refer to Purchasing commitments for licenses for more information.

How much can I save by using committed use discounts for Red Hat Enterprise Linux?

By purchasing Red Hat Enterprise Linux CUDs, you can save as much as 20% on one-year commitments and up to 24% on three-year commitments compared to the current on-demand prices. However, it is important to remember that with CUDs, you will be charged monthly subscription fees regardless of your actual Red Hat Enterprise Linux usage. Therefore, to maximize the discounts you can receive from CUDs, we recommend purchasing CUDs for steady and predictable workloads. Here is a helpful comparison between the maximum discounts possible using CUDs and the corresponding on-demand prices:

Prices as of this article's publish date. Hourly costs are approximate. Calculations are derived from the full CUD prices (as of this article's publish date), assuming VMs running 730 hours per month, 12 months per year.
Discounts are compared to current on-demand pricing and rounded to the nearest whole number.

Based on our research, CUDs are a good fit for many Red Hat Enterprise Linux VMs, the majority of which run 24/7 workloads. When evaluating whether or not purchasing a Red Hat Enterprise Linux CUD is a good choice for you, consider the following: based on list prices for a one-year term, Red Hat Enterprise Linux CUDs can help you save on subscription costs if you utilize a Red Hat Enterprise Linux instance for ~80% or more of the time within the one-year CUD term. For a three-year Red Hat Enterprise Linux CUD, you start saving when a Red Hat Enterprise Linux instance runs for ~76% or more of the time. Additionally, remember that Red Hat Enterprise Linux CUDs automatically apply to all running VM instances within the same region and project. (However, one Red Hat Enterprise Linux CUD can only be applied to one VM instance at a time.)

*Savings are estimates only. This analysis assumes only one Red Hat Enterprise Linux (large) instance running under the CUD project and region.

What if I need to upgrade my Red Hat Enterprise Linux version after purchasing a commitment?

Red Hat Enterprise Linux CUDs are version-agnostic and are not affected when you perform operating system (OS) upgrades or downgrades. For example, if you purchased a commitment for Red Hat Enterprise Linux 7, you may upgrade to Red Hat Enterprise Linux 8 and continue to use the same commitment without any action on your end. Additionally, commitments are not affected by future changes to on-demand prices for Compute Engine resources.

How can I purchase committed use discounts for Red Hat Enterprise Linux?

The easiest way to purchase Red Hat Enterprise Linux CUDs is through the Google Cloud console:

1. In the Google Cloud console, go to the Committed Use Discounts page.
2. Click Purchase commitment to purchase a new commitment.
3. Click New license committed use discount to purchase a new license commitment.
4. Name your commitment and choose the region where you want it to apply.
5. Choose the duration of the commitment, either 1 or 3 years.
6. Choose a License family.
7. Choose the License type and the Number of licenses.
8. Click Purchase.

You can also purchase Red Hat Enterprise Linux commitments using the Google Cloud CLI or the Compute Engine API. For more information, refer to Purchasing commitments for licenses. We hope this helps you find the most cost-optimal plan for your Red Hat Enterprise Linux deployment needs.
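As a footnote on the ~80% and ~76% break-even figures above: they follow directly from the discount rates. With a CUD you pay the full term at the discounted rate regardless of usage, while on-demand you pay only for hours used, so the two costs cross where utilization equals one minus the discount. A quick sketch of the arithmetic, using the 20% (one-year) and 24% (three-year) maximum discounts:

```python
def break_even_utilization(discount: float) -> float:
    """Fraction of the term an instance must run before a CUD beats on-demand.

    CUD cost over the term:       (1 - discount) * on_demand_rate * term
    On-demand cost over the term: utilization * on_demand_rate * term
    The costs are equal when utilization == 1 - discount.
    """
    return 1.0 - discount

one_year = break_even_utilization(0.20)    # 20% max discount
three_year = break_even_utilization(0.24)  # 24% max discount
print(f"1-year break-even: {one_year:.0%}, 3-year break-even: {three_year:.0%}")
# -> 1-year break-even: 80%, 3-year break-even: 76%
```

This reproduces the article's guidance: a one-year commitment pays off at roughly 80% utilization and a three-year commitment at roughly 76%, assuming list prices and maximum discounts.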
Source: Google Cloud Platform

Introducing BigQuery Partner Center — a new way to discover and leverage integrated partner solutions

At Google, we are committed to building the most open and extensible Data Cloud. We want to provide our customers with more flexibility, interoperability and agility when building analytics solutions using BigQuery and tightly integrated partner products. We have therefore significantly expanded our Data Cloud partner ecosystem, and are increasing our investment in technology partners in a number of new areas.At the Google Data Cloud & AI Summit in March, we introduced BigQuery Partner Center, a new user interface in the Google Cloud console that enables our customers to easily discover, try, purchase and use a diverse range of partner products that have been validated through the Google Cloud Ready – BigQuery program. Google Cloud Ready – BigQuery is a program whereby Google Cloud engineering teams evaluate and validate BigQuery partner integrations and connectors using a series of tests and benchmarks based on standards set by Google. Customers can be assured of the quality of the integrations when using these partner products with BigQuery. 
These validated partners and solutions are now accessible directly from BigQuery Partner Center.

Navigating in BigQuery Partner Center

Customers can start exploring BigQuery Partner Center by launching the BigQuery Cloud Console.

A video demo of how to discover and install a free trial of Confluent Cloud from the BigQuery Partner Center.

Discover: In the Partner Center, you can find a list of validated partners organized in the following categories: BI, ML, and Advanced Analytics; Connectors & Development Tools; Data Governance, Master Data Management; Data Quality & Observability; and ETL & Data Integration.

Try: You will have the option to try out a product by signing up for a free trial version offered by the partner.

Buy: If you choose to purchase any of the partner products, you can do it directly from Google Cloud Marketplace by clicking on the Marketplace hyperlink tag.

Here's an overview of how you can discover and use some of BigQuery's partner solutions.

Confluent Cloud is now available in BigQuery Partner Center to help customers easily connect and create Confluent streaming data pipelines into BigQuery, and extract real-time insights to support proactive decision-making while offloading the operational burdens associated with managing open-source Kafka.

Fivetran offers a trial experience through the BigQuery Partner Center, which allows customers to replicate data from key applications, event streams, and file stores to BigQuery continuously. Moreover, customers can actively monitor their connectors' performance and health using logs and metrics provided through Google Cloud Monitoring.

Neo4j provides an integration through BigQuery Partner Center that allows users to extend SQL analysis with graph-native data science and machine learning by working seamlessly between BigQuery and Neo4j Graph Data Science, whether using BigQuery SQL or notebooks.
Data science teams can now improve and enrich existing analysis and ML with the graph-native data science capabilities in Neo4j by running in-memory graph analysis directly from BigQuery.

Expanding the partner ecosystem through Google Cloud Ready – BigQuery

We are also excited to share that, since we introduced the Google Cloud Ready – BigQuery initiative last year, we have recognized over 50 technology partners that have successfully met a core set of integration requirements with BigQuery. To unlock more use cases that are critical to customers' data-driven transformation journeys, Google Cloud engineering teams worked closely with partners across many categories to test compatibility, tune functionality, and optimize integrations to ensure our customers have the best experience when using these partner products with BigQuery.

For example, in the Data Quality & Observability category, we have most recently validated products from Anomalo, Datadog, Dynatrace, Monte Carlo, and New Relic to enable better detection and remediation of data quality issues.

In the Reverse ETL & Master Data Management category, we worked with Hightouch and Tamr to expand data management use cases for data cleansing, preparation, enrichment, and data synchronization from BigQuery back to SaaS-based applications.

Data Governance and Security partners like Immuta and Privacera provide enhanced data access controls and management capabilities for BigQuery, while Carto offers advanced geospatial and location intelligence capabilities that are well integrated with BigQuery.

We also continue to expand partnerships in key categories such as Advanced Analytics and Data Integration with industry-leading partners like Starburst, Sisense, Hex, and Hevo Data to ensure our customers have flexibility and options in choosing the right partner products to meet their business needs.

With the general availability of BigQuery Partner Center, customers can now conveniently discover, try out, and install a growing
list of Google Cloud Ready – BigQuery validated partners directly from the BigQuery Cloud Console.

Getting started

To explore the new Partner Center, launch the BigQuery Cloud Console. To see a full list of partner solutions and connectors that have been validated to work well with BigQuery, visit here. To learn more about the Google Cloud Ready – BigQuery validation program, visit our documentation page. If you are a partner interested in becoming “Google Cloud Ready” for BigQuery, please fill out this intake form. If you have any questions, feel free to contact us.
Source: Google Cloud Platform

Bringing our world-class expertise together under Google Cloud Consulting

Every day, we see how much our customers value Google Cloud experts working alongside their teams to drive innovation. We also know that being connected to the right services and partners at the right time accelerates customer success. Last year, we expanded our custom AI solution practice and launched our Global Delivery Center to deliver deep product expertise at global scale. Today, we're excited to announce the next step in our journey to bring all our services together with the launch of Google Cloud Consulting and our unified services portfolio at cloud.google.com/consulting.

The Google Cloud Consulting portfolio brings together offerings across multiple specializations into a single place, spanning services from learning to technical account management to professional services and customer success. Through this single portfolio, you'll have access to detailed descriptions of the various services, with examples of how you can leverage them to solve specific business challenges. This makes it easy to identify the right package of services for your business and ensures you get the most out of your investment.

At Google Cloud, we always work closely with our ecosystem of partners to deliver innovation and value to our customers, and Google Cloud Consulting further reinforces our commitment to being partner-first. By bringing together capabilities across the customer lifecycle — from onboarding to enablement to co-delivery and assurance — this unified portfolio makes it simpler for partners to work with Google Cloud Consulting and helps drive the best outcomes for customers.

“Our partnership with Google Cloud Consulting is helping us to grow our Google Cloud practice globally and accelerate our customers’ adoption of the platform.
We are pushing the bounds of innovation together as the AI wave approaches,” said Ankur Kashyap, SVP and Global Head of Google Ecosystem Unit, HCLTech.

Broadcom, a provider of enterprise security solutions, recently worked with Google Cloud Consulting to migrate its infrastructure from Amazon Web Services (AWS), and found the combination of technology and expertise critical for success. “Google’s deep technical skills and its data, security and AI offerings have accelerated our transformation towards becoming a software-led company,” said Andy Nallappan, Vice President, CTO and CSO, Broadcom.

Kroger, the American retailer, worked with Google Cloud Consulting and Deloitte to accelerate its technical objectives. “Google Cloud Consulting and Deloitte brought us a technology architecture and application framework that we could implement in record time. We’re already seeing results across our stores, with associate tasks being optimized and overall productivity increasing,” said Jim Clendenen, VP, Enterprise Retail Systems, Kroger.

Whether you're just getting started in the cloud or seeking new ways to innovate, our portfolio of offerings is built to help you:
- Leverage Google Cloud professional service engineers and consultants to kickstart your cloud journey, from testing, planning, and executing migrations to optimizing your operations.
- Work alongside our partners to provide expertise and assurance services.
- Benefit from access to cutting-edge tools, including best-in-class Artificial Intelligence and Machine Learning (AI/ML) solutions, data resources, and security services that you can use to build robust data platforms and protect your business from security threats.
- Receive bespoke, hands-on guidance from our Technical Account Managers, who build familiarity with your applications, systems, and business goals, and proactively advise on and accelerate your digital transformation.
- Train and certify your teams in Google Cloud with a range of learning services that can boost your long-term self-sufficiency and help you foster a culture of innovation.

These end-to-end capabilities are designed to meet you wherever you are in your cloud journey, so you can both build your business in the cloud and make digital breakthroughs safely. At Google Cloud, we're committed to providing technology and services to help you grow and succeed. From developing innovative solutions, to pioneering with generative AI, to securely managing your data in the cloud, to transforming user experiences, we're with you at all the key moments of your cloud journey.

As we look forward to 2023, we'll continue to expand the service catalog, focus on making it even easier to find and transact these services, and further streamline the experience of engaging with our services. Click here to see Google Cloud Consulting's full portfolio of offerings.
Source: Google Cloud Platform

Accelerate time to value with Google’s Data Cloud for your industry

Many data analytics practitioners today are looking for ways to accelerate new scenarios and use cases that enable business outcomes and competitive advantage. As many enterprises look at rationalizing their data investments and modernizing their data analytics platform strategies, migrating to a new cloud-first data platform like Google's Data Cloud can be perceived as a risky and daunting task — not to mention the expense of the transition, from redesigning and remodeling legacy data models in traditional data warehouse platforms to refactoring analytics dashboards and reporting for end users. The time and cost of this transition are not trivial. Many enterprises are looking for ways to deliver innovation at cloud speed without the time and costs of traditional replatforming, where millions can be spent on this type of transition. When access to all data within the enterprise and beyond is the future, it's a big problem if you can't leverage all of your data for its insights, at cloud scale, because you're stuck in technologies and approaches that weren't designed to match your unique industry requirements.

So, what is out there to address these challenges? Google's Data Cloud for industries combines pre-built industry content, ecosystem integrations, and solution frameworks to accelerate your time to value. Google has developed a set of solutions and frameworks to address these issues as part of its latest offering, Google Cloud Cortex Framework, which is part of Google's Data Cloud. Customers like Camanchaca have accelerated build time for analytical models by 6x, integrated Cortex content for improved supply chain and sustainability insights, and saved 12,000 hours by deploying 60 data models in less than 6 months.

Accelerating time to value with Google Cloud Cortex Framework

Cortex Framework provides accelerators to simplify your cloud transition and data analytics journey in your industry.
This blog explores some essentials you need to know about Cortex and how you can adopt and leverage its content to rapidly onramp your enterprise data from key applications such as SAP and Salesforce, along with data from Google, third-party, public, and community data sets. Cortex is available today, and it allows enterprises to accelerate time to value by providing endorsed connectors delivered by Google and our partners, reference architectures, ready-to-use data models and templates for BigQuery, Vertex AI examples, and an application layer that includes microservices templates for data sharing with BigQuery, all of which developers can easily deploy, enhance, and make their own depending on the scope of their data analytics project or use case. Cortex content helps you get there faster, with less time and complexity to implement.

Let's now explore some details of Cortex and how you can best take advantage of it with Google's Data Cloud. First, Cortex is both a framework for data analytics and a set of deployable accelerators; the image below provides an overview of the essentials of Cortex Framework, focusing on key areas of endorsed connectors, reference architectures, deployment templates, and innovative solution accelerators delivered by Google and our partners. We'll explore each of these focus areas in greater depth below.

Why Cortex Framework?

Leading connectors: First, Cortex provides leading connectors delivered by Google and our partners. These connectors have been tested and validated to provide interoperability with Cortex data models in BigQuery, Google's cloud-scale enterprise data warehouse. By taking the guesswork out of selecting which tooling integrates Cortex with BigQuery, we remove the time, effort, and cost of evaluating the various tools available in the market.
Deployment accelerators: Cortex provides a set of predefined deployable templates and content for enterprise use cases with SAP and Salesforce, including BigQuery data models, Looker dashboards, Vertex AI examples, and microservices templates for synchronous and asynchronous data sharing with surrounding applications. These accelerators are available free of charge today via Cortex Foundation and can easily be deployed in hours. The figure below provides an overview of Cortex Foundation and the focus areas for templates and content available today.

Reference architectures: Cortex provides reference architectures for integrating with leading enterprise applications such as SAP and Salesforce, as well as Google and third-party data sets and data providers. These include blueprints for integration and deployment with BigQuery, based on best practices for integrating with Google's Data Cloud and partner solutions and drawn from real-world deployments. Examples include best practices and reference architectures for Change Data Capture (CDC) processing and for BigQuery architecture and deployment. The image below shows an example of reference architectures based on Cortex published best practices and options for CDC processing with Salesforce. You can take advantage of reference architectures such as this one today and benefit from these best practices to reduce the time, effort, and cost of implementation, based on what has worked in real-world customer deployments.

Innovative solutions: Cortex Foundation includes support for various use cases and insights across a variety of data sources.
For example, Cortex Demand Sensing is a solution accelerator that leverages Google Cloud Cortex Framework to deliver accelerated value to Consumer Packaged Goods (CPG) customers looking to infuse innovation into their Supply Chain Management and Demand Forecasting processes. An accurate forecast is critical to reducing costs and maximizing profitability. One gap for many CPG organizations is a near-term forecast that leverages all of the available information from various internal and external data sources to predict near-term changes in demand. As an enhanced view of demand materializes, CPG companies also need to manage and match demand and supply to identify near-term changes in demand and their root cause, and then shape supply and demand to improve SLAs and increase profitability. Our approach to Demand Sensing, shown below, integrates SAP ERP and other data sets (e.g., weather trends, the demand plan) together with our Data Cloud solutions like BigQuery, Vertex AI, and Looker to deliver extended insights to demand planners, improve the accuracy of demand predictions, and help defer cost and drive new revenue opportunities.

The ecosystem advantage

Building an ecosystem means connections with a diverse set of partners that accelerate your time to value. Google Cloud is excited to announce a range of new partner innovations that bring you more choice and optionality. Over 900 partners put their trust in BigQuery and Vertex AI to power their business as part of the “Built with” Google Cloud initiative. These partners build their business on top of our data platform, enabling them to scale both their technology and their business at high performance. In addition, more than 50 data platform partners offer fully validated integrations through our Google Cloud Ready – BigQuery initiative.
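As a loose illustration of the demand-sensing idea described above, a near-term forecast can blend a baseline demand plan with an external signal such as weather. This is a toy sketch only; the names, numbers, and uplift factors are hypothetical, and Cortex Demand Sensing itself relies on BigQuery and Vertex AI models rather than this simple arithmetic:

```python
# Toy demand-sensing sketch: adjust a baseline demand plan with an
# external weather signal. Purely illustrative; all values are made up.

def sense_demand(baseline_plan, weather_uplift):
    """Apply a multiplicative weather uplift factor to each day's baseline units."""
    return [round(units * weather_uplift.get(day, 1.0), 1)
            for day, units in enumerate(baseline_plan)]

baseline = [100, 100, 120, 90]   # planned units per day from the demand plan
uplift = {2: 1.25, 3: 0.9}       # e.g. hot weather on day 2, rain on day 3
sensed = sense_demand(baseline, uplift)
print(sensed)  # [100.0, 100.0, 150.0, 81.0]
```

The point of the sketch is only that the "sensed" near-term forecast departs from the plan exactly where external signals say demand will shift, which is the information a demand planner acts on.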
A look ahead

Our solutions roadmap targets expansion of Cortex Foundation templates and content to additional solutions in sales and marketing and supply chain, along with more use cases and models for finance. You will also see significant expansion of predefined BigQuery data models and content for Google Ads, Google Marketing Platform, and other cross-media platforms and applications, plus improvements to the deployment experience and expansion into analytical accelerators that span data sets and industries. If you would like to connect with us to learn more about what we are working on and our roadmap, we're happy to engage with you! Please feel free to contact us at cortex-framework@google.com to learn more about the work we are doing and how we might help with your specific use cases or project. We'd love to hear from you!

Ready to start your journey?

With Cortex Framework, you can benefit from our open-source Data Foundation solutions content and packaged industry solutions content available on Google Cloud Marketplace and Looker. The Cortex content is available free of charge, so you can easily get started on your Google Data Cloud journey today!

Learn more about Google Cloud Cortex Framework and how you can accelerate business outcomes with less risk, complexity, and cost. Cortex will help you get there faster with your enterprise data sources and establish a cloud-first data foundation with Google's Data Cloud. Join the Data Cloud Summit to learn how customers like Richemont & Cartier use Cortex Framework to speed up time to value.
Source: Google Cloud Platform

Pub/Sub schema evolution is now GA

Pub/Sub schemas are designed to allow safe, structured communication between publishers and subscribers. In particular, the use of schemas guarantees that any published message adheres to a schema and encoding, which the subscriber can rely on when reading the data.

Schemas tend to evolve over time. For example, a retailer capturing web events and sending them to Pub/Sub for downstream analytics in BigQuery may need to add fields to its schema and have them propagated through Pub/Sub. Until now, Pub/Sub has not allowed the schema associated with a topic to be altered; instead, customers had to create new topics. That limitation changes today: the Pub/Sub team is excited to introduce schema evolution, designed to allow the safe and convenient update of schemas with zero downtime for publishers or subscribers.

Schema revisions

A new revision of a schema can now be created by updating an existing schema. Most often, schema updates only involve adding or removing optional fields, which is considered a compatible change. All versions of the schema are available on the schema details page. You can delete one or more schema revisions from a schema, although you cannot delete a revision if it is the schema's only one. You can also quickly compare two revisions using the view-diff functionality.

Topic changes

Currently, you can attach an existing schema, or create a new one, to be associated with a topic, so that all messages published to the topic are validated against the schema by Pub/Sub. With schema evolution, you can now update a topic to specify a range of schema revisions against which Pub/Sub will try to validate messages, starting with the last revision and working towards the first.
If the first revision is not specified, any revision up to and including the last revision is allowed; if the last revision is not specified, any revision from the first revision onwards is allowed.

Schema evolution example

Let's take a look at a typical way schema evolution may be used. You have a topic T that has a schema S associated with it. Publishers publish to the topic and subscribers subscribe to a subscription on the topic.

Now you wish to add a new field to the schema, and you want publishers to start including that field in messages. As the topic and schema owner, you may not have control over updates to all of the subscribers, nor the schedule on which they get updated. You may also not be able to update all of your publishers simultaneously to publish messages with the new schema. You want to update the schema and allow publishers and subscribers to be updated at their own pace to take advantage of the new field. With schema evolution, you can perform the following steps to ensure a zero-downtime update that adds the new field:

1. Create a new schema revision that adds the field.
2. Ensure the new revision is included in the range of revisions accepted by the topic.
3. Update publishers to publish with the new schema revision.
4. Update subscribers to accept messages with the new schema revision.

Steps 3 and 4 can be interchanged, since all schema updates ensure backwards and forwards compatibility. Once your migration to the new schema revision is complete, you may choose to update the topic to exclude the original revision, ensuring that publishers only use the new schema.

These steps work for both protocol buffer and Avro schemas. However, some extra care needs to be taken when using Avro schemas. Your subscriber likely has a version of the schema compiled into it (the “reader” schema), but messages must be parsed with the schema that was used to encode them (the “writer” schema). Avro defines the rules for translating from the writer schema to the reader schema.
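A rough mental model of the topic-side validation described above: the topic accepts a range of revisions, and Pub/Sub tries to validate each message from the newest accepted revision back to the oldest. This toy sketch reduces a "schema revision" to a set of field names purely for illustration; the real service validates the full Avro or protocol buffer encoding:

```python
# Toy model of revision-range validation. Each "revision" here is just a
# set of field names; real Pub/Sub validates the actual Avro/protobuf
# encoding against each revision in the topic's accepted range.

def validates(message_fields, revision):
    """In this toy model, a message matches a revision if its fields match exactly."""
    return set(message_fields) == revision

def accepted(message_fields, revisions, first=None, last=None):
    """Try revisions from last to first, mirroring Pub/Sub's validation order."""
    lo = 0 if first is None else first
    hi = len(revisions) - 1 if last is None else last
    for idx in range(hi, lo - 1, -1):  # newest accepted revision first
        if validates(message_fields, revisions[idx]):
            return idx
    return None  # no accepted revision validates the message

# Revision 0 lacks "referrer"; revision 1 adds it.
revisions = [{"user_id", "url"}, {"user_id", "url", "referrer"}]
print(accepted({"user_id", "url", "referrer"}, revisions))  # 1 (new schema)
print(accepted({"user_id", "url"}, revisions))              # 0 (old schema)
print(accepted({"user_id"}, revisions, first=1))            # None (rejected)
```

The last call shows what happens after you narrow the accepted range to exclude the original revision: messages that only match the old schema are rejected.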
Pub/Sub only allows schema revisions where both the new schema and the old schema could be used as either the reader or the writer schema. However, you may still need to fetch the writer schema from Pub/Sub, using the attributes passed in to identify it, and then parse using both the reader and writer schemas. Our documentation provides examples of the best way to do this.

BigQuery subscriptions

Pub/Sub schema evolution is also powerful when combined with BigQuery subscriptions, which allow you to write messages published to Pub/Sub directly to BigQuery. When using the topic schema to write data, Pub/Sub ensures that at least one of the revisions associated with the topic is compatible with the BigQuery table. If you want to update your messages to add a new field that should be written to BigQuery, do the following:

1. Add the OPTIONAL field to the BigQuery table schema.
2. Add the field to your Pub/Sub schema.
3. Ensure the new revision is included in the range of revisions accepted by the topic.
4. Start publishing messages with the new schema revision.

With these simple steps, you can evolve the data written to BigQuery as your needs change.

Quotas and limits

The schema evolution feature comes with the following limits:
- Up to 20 revisions per schema name are allowed at any time.
- Individual schema revisions do not count against the maximum of 10,000 schemas per project.

Additional resources

Please check out the additional resources available to explore this feature further:
- Documentation
- Client libraries
- Samples
- Quotas
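To see why the reader/writer distinction matters, here is a minimal plain-Python stand-in for Avro's schema-resolution behavior (not the Avro library itself): the reader schema lists fields with defaults, records written with an older writer schema get the default filled in, and writer-only fields the reader does not know about are dropped. Field names are hypothetical:

```python
# Minimal stand-in for Avro schema resolution. A reader schema maps each
# field name to a default; resolving a decoded writer record fills in
# defaults for fields the writer lacked and ignores fields the reader
# does not declare.

READER_SCHEMA = {"user_id": None, "url": None, "referrer": ""}  # field -> default

def resolve(record, reader_schema=READER_SCHEMA):
    """Project a decoded writer record onto the reader schema."""
    return {field: record.get(field, default)
            for field, default in reader_schema.items()}

old_record = {"user_id": "u1", "url": "/home"}                    # older writer schema
new_record = {"user_id": "u2", "url": "/buy", "referrer": "ads"}  # newer writer schema

print(resolve(old_record))  # {'user_id': 'u1', 'url': '/home', 'referrer': ''}
print(resolve(new_record))  # {'user_id': 'u2', 'url': '/buy', 'referrer': 'ads'}
```

This is why compatible revisions are restricted to adding or removing fields with defaults: either side of the pair can then act as reader or writer without data loss the subscriber cannot handle.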
Source: Google Cloud Platform

Coop reduces food waste by forecasting with Google’s AI and Data Cloud

Although Coop has a rich history spanning nearly 160 years, the machine learning (ML) team supporting its modern operations is quite young. Its story began in 2018 with one simple mission: to leverage ML-powered forecasting to help inform business decisions, such as demand planning based on supply chain seasonality and expected customer demand. The end goal? With insight not only into current data but also into projections of what could happen in the future, the business can optimize operations to keep customers happy, save costs, and support its sustainability goals (more on that later!).

Coop's initial forecasting environment was a single on-premises workstation running open-source frameworks such as PyTorch and TensorFlow. Fine-tuning and scaling models to a larger number of CPUs or GPUs was cumbersome. In other words, the infrastructure couldn't keep up with their ideas. So when the question arose of how to solve these challenges and operationalize the produced outcomes beyond those local machines, Coop leveraged the company's wider migration to Google Cloud to find a solution that could stand the test of time.

Setting up new grounds for innovation

Over a two-day workshop with the Google Cloud team, Coop kicked things off by ingesting data from its vast data pipelines and SAP systems into BigQuery. At the same time, Coop's ML team implemented queues to accumulate incoming new information and sorted out what kind of information it was. The team was relieved not to have to worry about setting up infrastructure and new instances.

Next, the Coop team turned to Vertex AI Workbench to further develop its data science workflow, finding it surprisingly fast to get started. The goal was to train forecasting models to support Coop's distribution centers so they could optimize their stock of fresh produce based on accurate numbers.
Achieving higher accuracy, faster, to better meet customer demand

During the proof-of-concept (POC) phase, Coop's ML team pitted two custom-built models (a single Extreme Gradient Boosting model and a Temporal Fusion Transformer in PyTorch) against an AutoML-powered Vertex AI Forecast model, which the team ultimately operationalized on Vertex AI. The team established that using Vertex AI Forecast was faster and more accurate than training a model manually on a custom virtual machine (VM). On the POC test set, the team reached 14.5 WAPE (Weighted Average Percentage Error), meaning Vertex AI Forecast provided a 43% performance improvement relative to the models trained in-house on a custom VM.

After a successful POC and several internal tests, Coop is building a small-scale pilot (to be put live in production for one distribution center) that will conclude with the Coop ML team streaming the forecasting insights back to SAP, where processes such as placing orders with importers and distributors take place. Upon successful completion and evaluation of the small-scale pilot in production in the next few months, Coop could scale it out to full-blown production across distribution centers throughout Switzerland. The architecture diagram below illustrates the steps involved in both stages. The vision, of course, is to leverage Google's data and AI services, including forecasting and post-forecasting optimization, to support all of Coop's distribution centers in Switzerland in the near future.

Leveraging Google Cloud to increase relative forecasting accuracy by 43% over custom models trained by the Coop team can significantly affect the retailer's supply chain.
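WAPE, the metric quoted above, weights absolute forecast errors by actual demand: total absolute error divided by total actual volume. A small illustrative computation (the numbers are hypothetical, not Coop's data):

```python
# Weighted Average Percentage Error: sum of absolute errors divided by
# the sum of actual demand, expressed as a percentage. Lower is better.

def wape(actuals, forecasts):
    abs_error = sum(abs(a - f) for a, f in zip(actuals, forecasts))
    return 100 * abs_error / sum(actuals)

actual = [120, 80, 100, 60]    # units sold per day (hypothetical)
forecast = [110, 90, 95, 66]   # model predictions (hypothetical)
print(round(wape(actual, forecast), 1))  # errors 10+10+5+6 = 31; 31/360 -> 8.6
```

Unlike a plain average of per-day percentage errors, WAPE cannot be dominated by low-volume days, which is why it is common for retail demand forecasting.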
By taking this POC to pilot, and possibly to production, the Coop ML team hopes to improve its forecasting model to better support wider company goals, such as reducing food waste.

Driving sustainability by reducing food waste

Coop believes that sustainability must be a key component of its business activity. With the aim of becoming a zero-waste company, its sustainability strategy feeds into all corporate divisions, from how it selects suppliers of organic, animal-friendly, and fair-trade products to efforts to reduce energy, CO2 emissions, waste materials, and water usage in its supply chains. Achieving these goals boils down to an optimal control problem in a Bayesian framework: Coop must carry out quantile inference to determine the spread of its predictive distributions. For example, is it expecting to sell between 35 and 40 tomatoes on a given day, or is its confidence interval between 20 and 400? Reducing this uncertainty with more specific and accurate numbers means Coop can order the precise number of units for distribution centers, ensuring customers can always find the products they need. At the same time, it prevents ordering in excess, which reduces food waste.

Pushing the envelope of what can be achieved company-wide

Having challenged its in-house models against the Vertex AI Forecast model in the POC, Coop is in the process of rolling out a production pilot to one distribution center in the coming months, and possibly to all distribution centers across Switzerland thereafter. One of the most rewarding realizations along the way was that the ML team behind the project could use different Google Cloud tools, such as Google Kubernetes Engine, BigQuery, and Vertex AI, to create its own ML platform. Beyond using pre-trained Vertex AI models, the team can automate and create data science workflows quickly, so it isn't always dependent on infrastructure teams. Next, Coop's ML team aims to use BigQuery as a pre-stage for Vertex AI.
This will allow the entire data streaming process to flow more efficiently, serving data to any part of Vertex AI when needed. “The two tools integrate seamlessly, so we look forward to trying that combination for our forecasting use cases and potentially new use cases, too. We are also exploring the possibility of deploying different types of natural language processing-based solutions to other data science departments within Coop that are relying heavily on TensorFlow models,” says Martin Mendelin, Head of AI/ML Analytics, Coop. “By creating and customizing our own ML platform on Google Cloud, we’re creating a standard for other teams to follow, with the flexibility to work with open-source programs but in a stable, reliable environment where their ingenuity can flourish,” Mendelin adds. “The Google team went above and beyond with its expertise and customer focus to help us make this a reality. We’re confident that this will be a nice differentiator for our business.”
Source: Google Cloud Platform

Introducing time-bound Session Length defaults to improve your security posture

Google Cloud provides many layers of security for protecting your users and data. Session length is a configuration parameter that administrators can set to control how long users can access Google Cloud without having to reauthenticate. Managing session length is foundational to cloud security: it ensures that access to Google Cloud services is time-bound after a successful authentication. Google Cloud session management provides flexible options for setting up session controls based on your organization's security policy needs. To further improve security for our customers, we are rolling out a recommended default 16-hour session length to existing Google Cloud customers.

Many apps and services can access sensitive data or perform sensitive actions. It's important that only specific users can access that information and functionality, and only for a period of time. By requiring periodic reauthentication, you make it more difficult for unauthorized people to obtain that data if they gain access to credentials or devices.

Enhancing your security with Google Cloud session controls

There are two tiers of session management for Google Cloud: one for managing user connections to Google services (e.g., Gmail on the web), and another for managing user connections to Google Cloud services (e.g., the Google Cloud console). This blog outlines the session control updates for Google Cloud services. Google Cloud customers can quickly set up session length controls by selecting the default recommended reauthentication frequency. For existing customers who have session length configured to Never Expire, we are updating the session length to 16 hours.

Google Cloud session control: Reauthentication policy

This new default session length helps our customers maintain awareness of their security posture. It ensures that customers have not mistakenly granted an infinite session length to users or apps using OAuth user scopes.
After a time-bound session expires, users need to reauthenticate with their login credentials to continue their access. The session length changes affect the following services and apps:
- Google Cloud Console
- gcloud command-line tool
- Any other app that requires Google Cloud scopes

The session control settings can be customized for specific organizations, and the policies apply to all users within that organization. When choosing a session length, admins have the following options:
- Choose from a range of predefined session lengths, or set a custom session length between 1 and 24 hours. This is a timed session length that expires based on the configured duration, regardless of the user's activity.
- Configure whether users can reauthenticate with just their password, or are required to use a Security Key.

How to get started

The session length will be on by default at 16 hours for existing customers and can be enabled at the Organizational Unit (OU) level. Here are the steps for admins and users to get started:
- Admins: Find the session length controls at Admin console > Security > Access and data control > Google Cloud session control. Visit the Help Center to learn more about how to set session length for Google Cloud services.
- End users: If a session ends, simply log in to your account again using the familiar Google login flow.

Sample use cases

Third-party SAML identity providers and session length controls: If your organization uses a third-party SAML-based identity provider (IdP), cloud sessions will still expire, but the user may be transparently reauthenticated (i.e., without actually being asked to present their credentials) if their session with the IdP is valid at that time. This is expected behavior, as Google will redirect the user to the IdP and accept a valid assertion from the IdP.
To ensure that users are required to reauthenticate at the correct frequency, evaluate the configuration options on your IdP and review the Help Center article Set up SSO via a third-party identity provider.

Trusted applications and session length controls

Some apps are not designed to gracefully handle the reauthentication scenario, which can cause confusing app behavior or stack traces. Other apps are deployed for server-to-server use cases with user credentials instead of the recommended service account credentials, in which case there is no user to periodically reauthenticate. If you have specific apps like this that you do not want affected by session length reauthentication, your org admin can add them to the trusted list for your organization. This exempts those apps from session length constraints while keeping session controls in place for all other apps and users within the organization.

General availability and rollout plan

- Available to all Google Cloud customers
- Gradual rollout starting on March 15, 2023

Helpful links

- Help Center: Set session length for Google Cloud services
- Help Center: Control which third-party & internal apps access Google Workspace data
- Help Center: Use a security key for 2-Step Verification
- Creating and managing organizations
- Using OAuth 2.0 for Server to Server Applications

Related article: Introducing IAM Deny, a simple way to harden your security posture at scale. IAM Deny is our latest capability for Google Cloud IAM and can help create more effective security guardrails.
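The trusted-list exemption described above amounts to a simple allowlist check keyed on an app's identity. The sketch below models that policy decision in Python; it is purely illustrative, and the OAuth client IDs and function names are hypothetical, not part of any Google API.

```python
# Hypothetical allowlist of OAuth client IDs an org admin has marked trusted.
TRUSTED_APP_IDS = {"123456789.apps.googleusercontent.com"}

def session_policy_applies(client_id: str) -> bool:
    """Trusted apps are exempt from the timed-session constraint;
    every other app remains subject to periodic reauthentication."""
    return client_id not in TRUSTED_APP_IDS

print(session_policy_applies("other-app.apps.googleusercontent.com"))   # enforced
print(session_policy_applies("123456789.apps.googleusercontent.com"))   # exempt
```

In practice this decision is made by Google's session management, not your own code; the point is that exemption is per-app while the policy stays in force for everyone else.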
Source: Google Cloud Platform

Google Cloud and MongoDB expand partnership to support startups

Scale your startup from ideation to growth with MongoDB Atlas on Google Cloud. By providing an integrated set of database and data services and a unified developer experience, MongoDB Atlas on Google Cloud lets companies at all stages build applications that are highly available, performant at global scale, and compliant with the most demanding security and privacy standards.

Today we're excited to announce that we're expanding our partnership to also support startups together. In addition to the technology, each company has dedicated programs to help startups scale faster with financial, business, and technical support.

Harness the power of our partnership for startups

There are two key ways in which we believe our partnership can help startups scale more quickly, safely, and successfully:

1. Our technologies

MongoDB Atlas allows you to run our fully managed developer data platform on Google Cloud in just a few clicks. Set up, scale, and operate MongoDB Atlas anywhere in the world with the versatility, security, and high availability you need. Run MongoDB Atlas on Google Cloud to gain true multi-cloud capabilities, best-in-class automation, workload intelligence, and proven practices with the most modern developer data platform available. With the pay-as-you-go option on the Google Cloud Marketplace, you only pay for the Atlas resources you use, with no upfront commitment required.

Got global customers? Google Cloud is wherever they are, and MongoDB Atlas makes it easy to distribute your data for low-latency performance and global compliance needs. Selling to a tough enterprise crowd? Data in MongoDB Atlas is protected from the start with preconfigured security features for authentication, authorization, and encryption, and is stored in the same zero-trust, shared-risk model that Google itself depends on.
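Connecting an application to Atlas typically starts from a standard SRV connection string, which drivers such as PyMongo accept directly. The sketch below composes one in Python; the cluster host, database, and credentials are placeholders, and percent-encoding the password is required whenever it contains reserved characters.

```python
from urllib.parse import quote_plus

def atlas_uri(user: str, password: str, host: str, db: str) -> str:
    """Build a standard MongoDB Atlas SRV connection string.

    Credentials are percent-encoded so characters like '/' or '!'
    in a password don't break the URI.
    """
    return (f"mongodb+srv://{quote_plus(user)}:{quote_plus(password)}"
            f"@{host}/{db}?retryWrites=true&w=majority")

# Placeholder host and credentials, for illustration only.
uri = atlas_uri("appuser", "s3cret/!", "cluster0.example.mongodb.net", "prod")
print(uri)
# A driver would then connect with, e.g., pymongo.MongoClient(uri).
```

The `retryWrites=true&w=majority` options are the defaults Atlas suggests in its generated connection strings, giving retryable writes and majority write acknowledgment.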
As partners, Google Cloud and MongoDB co-engineer streamlined integrations between MongoDB Atlas and many Google Cloud services to make it easier to deploy apps (Dataflow, GKE, Cloud Run), pull in data from other sources (Apigee), run in flexible multi-cloud environments (Anthos), deploy the MEAN stack easily, provision with Terraform, and analyze data (BigQuery, Vertex AI).

2. Our dedicated startup programs

The Google for Startups Cloud program provides:

- Credits: If you're early in your startup journey and not yet backed by equity funding, you'll have access to $2,000 of Google Cloud credits. If you are, your first year of Cloud and Firebase usage is covered with credits up to $100,000. Plus, in year two, get 20% of Google Cloud and Firebase usage covered, up to an additional $100,000 in credits.*
- Google-wide discounts: Free Google Workspace Business Plus for new signups, and monthly credits on Google Maps Platform for 12 months for new signups.
- Training: Google Cloud Skills Boost credits giving access to online courses and hands-on labs.
- Technical support: Get timely help 24/7 through Enhanced Support by applying Google Cloud credits.
- Business support and networking: Access to a Startup Success Manager, our global Google Cloud Startup Community, and co-marketing opportunities for select startups.

The MongoDB for Startups program provides credits for MongoDB Atlas, dedicated onboarding support, a wide range of hands-on training available on demand, a complimentary technical advisor session, and co-marketing opportunities to help you amplify your business.
- Credits: Free credits for MongoDB Atlas, including usage of the core Atlas database, plus extended data services for full-text search, data visualization, real-time analytics, building event-driven applications, and more to supercharge your data infrastructure.
- Dedicated onboarding support: Bespoke onboarding resources tailored to help you successfully adopt and scale MongoDB Atlas.
- Hands-on training: Free on-demand access to MongoDB's library of training with 150+ hands-on labs.
- Expert technical advice: A dedicated one-on-one session with our technical experts for personalized recommendations on scaling and optimization.
- Go-to-market opportunities: Engage with MongoDB's diverse community of startups and developers through networking events, and work with MongoDB on co-marketing initiatives to amplify your startup's growth and promote the innovative tech you are building.

Startups finding success with the Google Cloud and MongoDB Atlas startup programs

Many startups have found the integrations and interoperability between Google Cloud and MongoDB Atlas to be a powerful combination.

Thunkable, a no-code app development platform, has found quick success (3 million users) with a team of just four to six engineers. "The engineering team has always been focused on building the product," said Thunkable engineer Jose Dominguez. "So not having to worry about the database was a great win for us. It allowed us to iterate very fast….
As we scale, supporting more enterprise customers, we don't have to worry about database management issues."

Phonic, a software company that applies intelligent analytics to qualitative research in order to break down barriers between qualitative and quantitative data, uses Google Cloud for distributed file storage, App Engine for autoscaling, and MongoDB Atlas to support its need for flexible databases that can adjust to frequent schema changes.

Next steps

To apply to join the Google for Startups Cloud program and the MongoDB Atlas Startup program, and to learn more about the benefits each offers, visit our partnership page. Companies enrolled in both startup programs will have exclusive access to joint events, technical support, bespoke offers, and much more.
Source: Google Cloud Platform