A cloud built for developers — join our Innovators community!

Ten years from now, where will your company be? We believe that to be a leader in your industry, you need to be a technology leader, too. And no one plays a bigger role in that than developers. Today's Google Cloud Next '21 keynote, "A Cloud Built For Developers," focused on two areas to help developers support your organizational transformation: (1) making it easier for you to get your job done; and (2) investing in the developer community to learn and grow from each other.

Making it easier for you to get your job done

From transformational infrastructure like Tau VMs to innovations in data, AI, and security, every feature we release starts with making it easier for developers everywhere to do what they love. Whether that's by natively embedding key security or sustainability features into Google Cloud or featuring partner solutions right within our console, we're focused on one thing: giving you the best developer experience of any cloud provider. Here's how we're making it easier to get your job done.

Providing the industry's best Kubernetes experience

Kubernetes deployments can involve a fair bit of manual configuration, but with Google Cloud, you get the most automated and secure Kubernetes experience available. GKE Autopilot makes it much easier for you to use Kubernetes by provisioning and managing the entire cluster's underlying infrastructure, including the control plane, node pools, and worker nodes, letting you focus on higher-level applications and services. With GKE Autopilot, you get a sophisticated cluster that uses the best practices brought to you by the team that developed Kubernetes itself. You're always up to date and get the same results as the experts, without having to be an expert yourself. No other cloud provider offers anything like this.

Helping you build secure, modern applications right from your laptop

With Google Cloud, we make it easy to create modern applications.

Easy to write – with Cloud Shell Editor, a context-aware remote development environment, you can develop and manage applications securely from any browser. There's nothing to download or install locally. It comes with an integrated debugger, source control, and an API explorer, and if you want to test locally on your laptop, it includes local emulators for Kubernetes and serverless APIs. You can also run command-line instructions directly in Google Cloud documentation, so you get a better learning experience within the context of where you're already working.

Easy to deploy – Cloud Build lets you build, test, and deploy on our serverless CI/CD platform across multiple environments such as VMs, serverless, Kubernetes, or Firebase. Cloud Run combines the best of serverless and containers, letting you deploy containers into production in seconds because there is no cluster to configure. And with Cloud Build fully integrated with Cloud Run, you can build and deploy your app with just one command. We also recently launched Cloud Deploy to make scaling deployment pipelines across your organization seamless. When it comes to secure deployment, Binary Authorization ensures only trusted container images are deployed on GKE or Cloud Run.

Easy to extend – Cloud Functions is a scalable "pay-as-you-go" service that allows you to run your code without any server management (see the sketch after this list). You can extend your application and scale up or down based on load.

Easy to operate – with Google Cloud's operations suite, we make it easy to monitor, troubleshoot, and improve application performance in production. You get one integrated view of alerts, events, metrics, and logs, so you don't have to jump between multiple tools while troubleshooting issues.
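To make the "easy to extend" point concrete, here is a minimal sketch of an HTTP-triggered function in Python using the open source Functions Framework; the function name and greeting logic are illustrative, not from the announcement.

```python
import functions_framework


@functions_framework.http
def hello(request):
    """A tiny HTTP handler; the platform scales instances with load."""
    name = request.args.get("name", "world")
    return f"Hello, {name}!"
```

Deployed as a Cloud Function, a handler like this scales up and down with traffic and bills only for the compute it actually uses.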
Keeping you ahead of tomorrow's threats

Recent cyber threats have stemmed from malicious actors compromising the software supply chain. To address these growing threats, we have been building security natively into the developer toolchain, anticipating and preventing issues ahead of time, not when you are most at risk.

Cloud Build offers the first and only cloud CI/CD service with verifiable build provenance built in. This provenance lets you trace a binary back to the source code to prevent tampering and prove that the code you're running is really the code you think you're running. And with new build integrity capabilities, another Google Cloud first, you can prevent anyone in your organization from deploying code that has not been built by your legitimate build system.

Anthos Service Mesh uses a zero-trust security model: once your application is deployed, you can manage authentication, authorization, and encryption between services, with little to no change to your applications. Both Anthos Service Mesh and Cloud Build Hybrid are available across Google Cloud and your on-premises environment. They work with VPC Service Controls and VPC Peering to automate developer security for your enterprise and allow you to build and manage your applications with assurance.

We also believe that by helping create industry standards, we can provide safer and simpler services for developers. This is why we co-founded the Open Source Security Foundation with other technology leaders to create security standards in open source. We also proposed an industry-wide standard called SLSA ("salsa"), or Supply-chain Levels for Software Artifacts. It's a security framework that provides common criteria for increasing levels of software security through automation and cryptographic signing at each stage of the software supply chain. Cloud Build now offers SLSA Level 1 compliance by default. No other cloud provider protects your software supply chain to this level. Because we started working on software supply-chain security long before it was in the headlines, you benefit from our years of investment in security.
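As a rough illustration of what build provenance enables, the sketch below parses an in-toto/SLSA-style provenance statement and rejects artifacts whose recorded builder is not the expected build system. The field layout follows the public SLSA provenance format, and the trusted builder ID shown is a placeholder, not an official value.

```python
import json

# Placeholder: substitute the builder ID your build system actually records.
TRUSTED_BUILDER = "https://cloudbuild.googleapis.com/GoogleHostedWorker"


def provenance_is_trusted(path: str) -> bool:
    """Return True only if the provenance names the trusted builder."""
    with open(path) as f:
        statement = json.load(f)
    builder_id = statement.get("predicate", {}).get("builder", {}).get("id", "")
    return builder_id == TRUSTED_BUILDER


if __name__ == "__main__":
    print(provenance_is_trusted("provenance.json"))
```

A deploy-time policy like Binary Authorization applies the same idea automatically: artifacts without acceptable provenance or attestations simply never reach production.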
Encouraging you to redefine what's possible

In the world of data and AI, we're bringing you the industry's leading data cloud products, designed for optimal performance, reliability, and immense scale. They are backed by new AI capabilities that help you more easily redefine older, less efficient ways of operating and demonstrate what only developers can make possible. One example is Contract DocAI, a new capability within DocAI that automates data capture to reduce document processing costs. It's based on AI templates and fully managed services that help you quickly build smarter applications and create more meaningful insights for your organization. This is just one of the many ways you can use data and AI to redefine what's possible for your users.

Making it easier to build more sustainably

We're also making it easier for developers to do the right thing, even when it's hard. For too long, doing the right thing for the environment came at a high financial cost for organizations. By wrapping carbon reduction goals into your IT initiatives from the beginning, you become the transformation leader executing against board-level sustainability mandates.

Many cloud providers have a vision for a sustainable future, and many aim to match their electricity consumption with 100% renewable energy by 2025 or 2030. Google achieved 100% renewable energy matching in 2017, and we're the only hyperscale cloud to do this today. Our data centers are twice as energy efficient as the average data center, and we're embedding tools right into our platform to help you build more sustainably. Here are just a few of our recent releases to support you:

Google Cloud Carbon Footprint – gives you access to the energy-related emissions data you need for external carbon disclosures in just one click.

Region Picker – lets you choose the data center region with the lowest gross carbon cost (see the sketch at the end of this section). On an annualized basis, we match all your usage with renewable energy, so your net impact is zero no matter which region you pick. This tool helps you become carbon-free, not just carbon-neutral.

Google Earth Engine and its integration with Google Cloud – with over 700 datasets and more than 50 petabytes of data today, Earth Engine gives developers access to the world's largest catalog of satellite image data and tools for driving sustainable impact.

"Sustainable IT – Decoded" masterclass – learn from some of the top global sustainability experts how to build more sustainably.

We're excited to share this new portfolio of sustainability tools and resources with you to help make the greener choice easy, and to make every day safer and more sustainable with Google.
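To illustrate the idea behind the Region Picker, here is a toy sketch that selects the region with the highest carbon-free-energy share from a candidate list. The region names are real, but the percentages are invented placeholders, not Google's published figures.

```python
# Illustrative only: the carbon-free-energy shares below are invented
# placeholders, not published numbers for these regions.
REGION_CFE = {
    "us-central1": 0.90,
    "europe-west1": 0.75,
    "asia-southeast1": 0.30,
}


def greenest_region(candidates):
    """Pick the candidate region with the highest carbon-free-energy share."""
    return max(candidates, key=lambda region: REGION_CFE.get(region, 0.0))


print(greenest_region(["us-central1", "asia-southeast1"]))  # -> us-central1
```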
Investing in our developer community

Community building is one of the most effective ways to support developers, which is why we created Google Cloud Innovators. This new community program is designed for developers and technical practitioners using Google Cloud, and we welcome everyone, from enterprise developers and data scientists to student developers and hobbyists. Through this program, you get deeper access to early technology previews, the latest roadmaps and content, and front-line engineers. You'll get exclusive invitations to all our cloud developer community events and opportunities to partner with us to design and improve Google Cloud services. We'll also recognize your expertise as community influencers by promoting your contributions and working closely with you to solve the world's toughest problems.

At the top of the Innovators program is the invite-only tier of top community thought leaders, our Champion Innovators. These individuals go above and beyond to inspire and encourage other developers in the community and help set the pace of innovation in the Google Cloud ecosystem. You can read about them in our Champion Innovators Directory.

We also plan to integrate the Innovators program much more deeply into the Google Cloud user experience, so consider this just the beginning! We look forward to giving you the tools and resources you need to make it easier to do the things you care about most. Join Innovators today!

What’s New and What’s Next with Google Cloud Databases

If there's one thing the past 18 months have taught us, it's that the ability to adapt to change is paramount. To be successful in this digital, disruptive era, development teams need to innovate and iterate on the customer experience faster than ever before. Businesses need to be able to experiment rapidly, frequently, and economically, not only with products and services, but also with business models, processes, and strategies.

At Google Cloud, we've been focused on building options that meet you where you are and that create a path for innovation and growth in the future. In the newly published Gartner® Solutions Scorecard for Google Cloud Platform Operational Databases, July 2021, Google Cloud met 100% of the required criteria outlined by Gartner and received an overall score of 90 out of 100.

Just a few months ago, we announced Spanner granular instance sizing, which lets customers start using Spanner at one-tenth the cost of regular instances, or approximately $65 per month. This week, we announced a PostgreSQL interface for Spanner in preview. With this new interface, enterprises can use skills and tools from the popular PostgreSQL ecosystem to take advantage of Spanner's unmatched global scale, 99.999% availability, and strong consistency. The interface supports Spanner's rich feature set using the most popular PostgreSQL data types and SQL features, reducing the barrier to entry for building transformational applications with Spanner. Developer teams can use the tools and skills they are familiar with, with the assurance that the schemas and queries they build against the PostgreSQL interface can be easily ported to another PostgreSQL environment if needed, giving them flexibility and peace of mind. (Complete this form to request access to the preview.)

"At Wayfair, we are focused on delivering the best online shopping experience for all things home to our customers," said Phil Portnoy, Associate Director at Wayfair. "To make this possible, we moved to managed database services in the cloud and are using PostgreSQL because it's what our developers know and love. We are excited about Cloud Spanner's new PostgreSQL interface because it makes Spanner more accessible to our application teams that need its unmatched scalability, strong consistency and 99.999% availability without the huge extra investment in specialized skills."
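Because the interface speaks PostgreSQL, standard drivers can talk to it. Below is a minimal sketch using psycopg2; it assumes a local proxy for the Spanner PostgreSQL interface listening on port 5432 and a hypothetical orders database, so treat all connection details and table names as placeholders.

```python
import psycopg2

# Assumes a local proxy for the Spanner PostgreSQL interface on port 5432;
# host, port, database, and table names are all placeholders.
conn = psycopg2.connect(host="localhost", port=5432, dbname="orders-db")
with conn, conn.cursor() as cur:
    cur.execute("SELECT order_id, total FROM orders WHERE total > %s", (100,))
    for order_id, total in cur.fetchall():
        print(order_id, total)
conn.close()
```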
Another big announcement we are making during Next is the unveiling of Google Distributed Cloud, a portfolio of fully managed hardware and software solutions that extend Google Cloud's infrastructure and services to the edge and into a customer's own data center. With Google Distributed Cloud, customers will be able to deploy managed databases, machine learning, data analytics, and container management services from Google Cloud, as well as well-known third-party services. Google Distributed Cloud will help customers reduce operational overhead, modernize applications, and process data locally to harness real-time insights across deployments.

That covers the biggest announcements coming out of Next. We have also been focused on additional innovations, including simplifying migrations to the cloud, removing barriers for our transformative cloud-native databases, and meeting developers with the tools they love.

Simplifying migrations to the cloud with enterprise capabilities

Over the past year, we've been focused on simplifying migrations to managed services on Google Cloud. We've built enterprise capabilities and removed major blockers for Cloud SQL, our fully managed relational database service for MySQL, PostgreSQL, and SQL Server workloads. Cloud SQL offers easy integration with existing apps and Google Cloud services like Google Kubernetes Engine and BigQuery. More than 650,000 GKE pods securely connect to Cloud SQL, and BigQuery users query more than 125 petabytes of data in Cloud SQL in an average month using our federated querying capabilities (see the sketch at the end of this section).

In addition, we've introduced Cloud SQL capabilities such as cross-region replicas, point-in-time recovery (PITR), customer-managed encryption keys (CMEK), VPC Service Controls, Cloud IAM, and Active Directory support. Cloud SQL maintenance downtime is now on average 80% shorter than it was 12 months ago, and shorter than that of close competitors. We've also been staying current with support for the latest release versions, including PostgreSQL 13, with PostgreSQL 14 following shortly. And we recently introduced Cloud SQL cost recommenders with Active Assist, which empower developers to better manage costs across their Cloud SQL databases.

We've made migrations to Cloud SQL easier and faster with the Database Migration Service. More than 85% of all migrations are underway in under an hour, with the majority of customers migrating their databases from other clouds. Database Migration Service currently supports MySQL and PostgreSQL databases, with support for SQL Server migrations coming soon. Finally, we announced Datastream, a serverless change data capture (CDC) and replication service that provides access to streaming, low-latency data from MySQL and Oracle databases, with destinations in Google Cloud such as Spanner and BigQuery. You can learn more about migrating to managed services on Google Cloud in this white paper.
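As a sketch of the federated querying mentioned above, the snippet below runs a BigQuery EXTERNAL_QUERY against a Cloud SQL connection from the Python client; the connection ID and payments table are hypothetical and would need to be created in advance.

```python
from google.cloud import bigquery

client = bigquery.Client()
# 'us.my_cloudsql_connection' is a hypothetical BigQuery connection resource
# that points at a Cloud SQL instance; the payments table is invented too.
sql = """
SELECT *
FROM EXTERNAL_QUERY(
  'us.my_cloudsql_connection',
  'SELECT customer_id, SUM(amount) AS total FROM payments GROUP BY customer_id'
)
"""
for row in client.query(sql).result():
    print(row["customer_id"], row["total"])
```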
Transformative databases for always-on applications

With global reach and unlimited scale, we offer best-in-class databases for always-on applications. Cloud Spanner, the globally distributed, highly available relational database, now processes over 1 billion requests per second at peak and has been battle-tested with hundreds of applications and petabytes of data around the world. Cloud Bigtable, the highly performant NoSQL database, has more than 10 exabytes of data under management, and this number continues to grow. And Firestore, our scalable document database for mobile, web, and server development, has built an active and thriving developer community and supports more than 750 million monthly active end users through Firebase Auth.

We've been making investments in key features and enhancements, including CMEK for Spanner and Bigtable, Data Access audit logging in Bigtable and Firestore, and a 99.999% availability SLA for Bigtable, which joins Spanner and Firestore in meeting this level of service. All these features help customers meet stringent regulatory requirements, especially financial services organizations. And with capabilities like the soon-to-be-released Bigtable autoscaling, you will be able to automatically add or remove capacity in response to the changing demands of your workloads, so you only pay for the capacity you need. This reduces management overhead, letting teams spend more time on strategic work and less time managing infrastructure. Bigtable nodes now also support twice as much storage, from 2.5 TB to a maximum of 5 TB per node for SSD, and from 8 TB to a maximum of 16 TB per node for HDD. This is especially cost-effective for batch workloads that operate on large amounts of data.

Meet developers with the tools they love

Developers play a key role in building applications, so we've also been focused on improving their speed of innovation and offering enhanced features and capabilities across the portfolio. Developers love Firestore because it serves both as a document database and as a backend-as-a-service. It's therefore no surprise that Firestore has built a thriving developer community, with more than 250,000 monthly active developers using it for rapid app development.

We announced Key Visualizer for Bigtable, Spanner, and Firestore, allowing developers to quickly and visually identify performance issues. This feature generates visual reports that break down usage based on the row keys accessed in tables. We also launched Cloud SQL Insights earlier this year, which brings industry-leading database observability to developers at no extra charge, helping them detect, diagnose, and prevent query performance problems for Cloud SQL databases. As mentioned earlier, we announced Spanner granular instance sizing, and this week we announced a PostgreSQL interface for Spanner in preview, giving developer teams the flexibility and peace of mind of portable schemas and queries.

For modernizing Oracle database operations, we announced El Carro, an open source tool that implements the Kubernetes operator pattern to deliver automation for provisioning and ongoing operations like backups, patching, and high availability for databases running in hybrid and multicloud environments. And it does so using the same declarative syntax that DevOps teams use to manage applications. Finally, we announced Sqlcommenter, an open source object-relational mapping (ORM) auto-instrumentation library, earlier this year and donated it to OpenTelemetry, working closely with APM vendors such as Datadog, Dynatrace, and Splunk. This extends the vision of OpenTelemetry to databases and enables application-focused database observability with open standards.
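To ground the Firestore discussion above, here is a minimal document write and read-back with the Python client; the collection and field names are illustrative.

```python
from google.cloud import firestore

db = firestore.Client()
# Write a document, then read it back (names are illustrative).
profile = db.collection("profiles").document("alice")
profile.set({"plan": "pro", "visits": 1})
print(profile.get().to_dict())  # -> {'plan': 'pro', 'visits': 1}
```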
In conclusion

The pace of change has only accelerated, and databases play a critical role in empowering teams to build applications faster. Google Cloud databases provide a groundbreaking platform for innovation based on decades of our own first-hand experience shaping the digital world. To get started with Google Cloud databases, learn more here, and explore all our Next breakout sessions, demos, hands-on labs, and partner insights here.

Google Cloud expands CCAI and DocAI solutions to accelerate time to value

Virtually all companies face two broad challenges: marshaling data for smarter decision making, and delivering more personalized and convenient experiences for customers. Artificial intelligence (AI) can help, but the path from AI investment to business outcome can be difficult to chart. To fast-track time to value, today we're pleased to announce new additions to two of our core AI-powered solutions:

Contact Center AI (CCAI) Insights, now generally available, provides out-of-the-box and custom modeling techniques to make it easier for contact center teams to understand customer interaction data. CCAI Insights extends the impact of Google Cloud's CCAI solution, which lets businesses deliver rich, conversational customer experiences via capabilities such as AI-powered virtual agents and Agent Assist.

Contract DocAI, now in preview, adds to Google Cloud's Document AI solution, a scalable cloud-based AI platform that helps businesses efficiently scan, analyze, and understand documents. Contract DocAI brings new features purpose-built for the most important and complicated documents of all: contracts. It lets users extract insights from the unstructured text in contracts, helping to accelerate contract lifecycles and reduce the cost of contract processing.

These announcements build on the momentum we've been seeing with our AI solutions in delivering business value to our customers. According to a 2020 commissioned study conducted by Forrester Consulting, "New Technology: The Projected Total Economic Impact™ Of Google Cloud Contact Center AI," CCAI can help customers save millions by efficiently directing customers between self-service resources and human agents, reducing average call times, and decreasing manual data entry. Call center teams using Google Cloud's CCAI for Chat have been able to manage up to 28% more conversations concurrently, for example, all while responding 15% faster to customer inquiries and increasing customer satisfaction by 10%. Likewise, leveraging DocAI, our customers are able to improve document processing and better serve customers, such as mortgage company Mr. Cooper, which increased the efficiency of mortgage document operations by 400%, improving the experience for both team members and their more than 3 million customers.

With today's announcements, we are pleased to continue meeting customers where they are and putting ready-to-use AI solutions directly in their hands, fast-tracking the path from AI investment to transformative business outcomes.

Deepen customer understanding with CCAI Insights

CCAI Insights helps teams offer better customer experiences by using AI to mine raw contact center interaction data for actionable information, regardless of whether that data originated with a virtual or human agent.
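For orientation, here is a minimal sketch of listing analyzed conversations with the CCAI Insights Python client. The project and location are placeholders, and the method surface shown reflects the published client library rather than anything in this announcement, so treat it as an assumption.

```python
from google.cloud import contact_center_insights_v1 as insights

client = insights.ContactCenterInsightsClient()
# Project and location are placeholders.
parent = "projects/my-project/locations/us-central1"
for conversation in client.list_conversations(parent=parent):
    print(conversation.name)
```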
CCAI Insights provides a number of out-of-the-box analytics on customer conversations, including:

Smart Highlighters: Automatically highlight important conversation moments, such as when an agent authenticates, a customer confirms their issue has been resolved, or an agent puts a customer on hold.

Cloud Natural Language Processing (NLP) integration: New customer and agent sentiment functions within CCAI score each portion of a conversation so teams can understand positive or negative sentiment, and entity extraction identifies and labels entities within a conversation by types such as date, person, contact information, organization, location, events, products, and media, so teams can use the data to improve scripts, train agents, and raise resolution rates.

Beyond out-of-the-box capabilities, CCAI Insights also helps teams categorize their conversations with:

Custom Highlighters: Define rules, keywords, and natural-language training phrases to help teams understand things like when customers mention a competitor or a recent promotional offering.

Topic Modeling: Leverages advanced NLP technologies like BERT so teams can create an unsupervised model of their data to define the taxonomy of conversation drivers.

[Image: Reviewing trends, searching, and filtering data to find conversations in the CCAI Insights console]

As part of Google Cloud's CCAI solution, CCAI Insights seamlessly hands off calls and chats handled by Dialogflow (CX or ES) and Agent Assist, letting it pull in advanced data from either. CCAI Insights also brings together the best of Google, with native integrations to BigQuery and Looker so customers can create more powerful and accurate data visualizations.

TELUS, a Canadian communications technology company, sees great potential for CCAI Insights to empower its analyst teams to provide value back to the business.

"With CCAI Insights, TELUS is looking at processing 20 million voice calls for analytics. This will help agents resolve customer inquiries faster and with less effort, which will lead to significant savings via agent effort reduction in the first year of production. Our goal is also to save our customers tens of thousands of hours by using AI to guide them to the channel that best supports their needs and allows for quicker resolutions, such as self-serve and digital applications, making it easier to perform transactions online. This leads to customers spending less time with customer care and improved customer experiences." —Mike Kellner, Director, AI Data & Analytics, TELUS

Accelerate contract lifecycle management with Contract DocAI

As part of Google Cloud's DocAI solutions, Contract DocAI leverages a variety of AI technologies, including NLP, knowledge graph technology, and optical character recognition (OCR), to accurately parse contracts at scale for key terms, such as those involving start and end dates, renewal conditions, parties involved, contract type, venue, or service level agreements. By automatically discerning important terms and the relationships among them, Contract DocAI also helps human reviewers work more efficiently, leading to faster and less expensive contract processing, while providing new semantic lenses to categorize and analyze contract content.
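To sketch what calling a Contract DocAI processor might look like, the snippet below uses the Document AI v1 Python client; the project, location, and processor IDs are placeholders for a contract processor assumed to have been created in advance.

```python
from google.cloud import documentai_v1 as documentai

client = documentai.DocumentProcessorServiceClient()
# Placeholders: assumes a Contract DocAI processor was created beforehand.
name = client.processor_path("my-project", "us", "my-contract-processor")

with open("contract.pdf", "rb") as f:
    raw = documentai.RawDocument(content=f.read(), mime_type="application/pdf")

result = client.process_document(
    request=documentai.ProcessRequest(name=name, raw_document=raw)
)
for entity in result.document.entities:
    # Depending on the processor, entities cover terms such as parties,
    # renewal dates, or governing law.
    print(entity.type_, "->", entity.mention_text)
```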
Ironclad, a leading provider of contract management services, was among the early adopters of Contract DocAI.

"With Contract DocAI, we built Smart Import for existing contracts, which has helped customers gain more access, visibility, and insights for all their contracts. Our customers can now upload contracts 75% faster while saving up to 40% on costs. Smart Import also unlocks contract data that was previously inaccessible and enables other parts of the business to make better, faster decisions. With the power of Google Cloud's Contract DocAI, we've been able to focus on unlocking greater value for our customers, instead of having to spend time, effort, and money pursuing AI innovation ourselves." —Cai GoGwilt, CTO, Ironclad

Accelerate business results with AI

All of these new additions will help transform businesses by making the power of AI more accessible and more focused on achieving business outcomes. To learn more about CCAI Insights, click here, and to learn more about Contract DocAI, click here.

For an even deeper dive, don't miss these sessions at Google Cloud Next: "Using CCAI Insights to better understand your customers," in which Dinesh Mahtani, Director of Data Analytics at TELUS, discusses how to leverage insights from customer interaction data to deliver better products and services; and "Google Cloud and Ironclad partner to accelerate document workflows," in which Ironclad CTO Cai GoGwilt explores how AI can turn contracting into a business hub instead of a process blocker.

Introducing Intelligent Products Essentials: helping manufacturers build AI-powered smart products, faster

Expectations for both consumer and commercial products have changed. Consumers want products that evolve with their needs, adapt to their preferences, and stay up to date over time. Manufacturers, in turn, need to create products that provide engaging customer experiences, not only to compete better in the marketplace but also to open new monetization opportunities.

However, embedding intelligence into new and existing products is challenging. Updating hardware is costly, and existing connected products often lack the capability to add new features. Furthermore, manufacturers do not have sufficient customer insights, due to product telemetry and customer data silos, and may lack the AI expertise to quickly develop and deploy these features.

That's why today we're launching Intelligent Products Essentials, a solution that allows manufacturers to rapidly deliver products that adapt to their owners, update features over the air using AI at the edge, and provide customer insights using analytics in the cloud. The solution is designed to assist manufacturers in their product development journeys, whether developing a new product or enhancing existing ones. With Intelligent Products Essentials, manufacturers can:

Personalize customer experiences: Provide a compelling ownership experience that evolves over the lifetime of the product, for example, a chatbot that contextualizes responses based on product status and customer profile.

Manage and update products over the air: Deploy updates to products in the field, gather performance insights, and evolve capabilities over time with monetization opportunities.

Predict parts and service issues: Detect operating thresholds and anomalies, and predict failures to proactively recommend service using AI, reducing warranty claims, decreasing parts shortages, and increasing customer satisfaction.

To help manufacturers quickly deploy these use cases and many more, Intelligent Products Essentials provides the following:

Edge connections: Connect and ingest raw or time-series product telemetry from various device platforms using IoT Core or Pub/Sub (see the sketch after this list), and enable over-the-air deployment and management of firmware and of machine learning models with Vertex AI at the edge.

Ownership App Template: Easily build connected-product companion apps that work on smartphones, tablets, and computers, using a pre-built API and accompanying sample app that can incorporate product or device registration and identity management, and provide application behavior analytics using Firebase.

Product fleet management: Manage, update, and analyze fleets of connected products via APIs, Google Kubernetes Engine, and Looker.

AI services: Create new features or capabilities for your products using AI and machine learning products such as Dialogflow, Vision AI, and AutoML, all from Vertex AI.

Enterprise data integration: Integrate data sources such as Enterprise Asset Management (EAM), Enterprise Resource Planning (ERP), and Customer Relationship Management (CRM) systems using Dataflow and BigQuery.

Intelligent Products Essentials helps manufacturers build new features across consumer, industrial, enterprise, and transportation products. Manufacturers can implement the solution in-house or work with one of our certified solution integration partners like Quantiphi and SoftServe.
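As a sketch of the "edge connections" ingestion path, here is a device publishing a telemetry reading to Pub/Sub with the Python client; the project, topic, and payload shown are hypothetical.

```python
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# Project and topic names are hypothetical.
topic = publisher.topic_path("my-project", "device-telemetry")

payload = json.dumps({"device_id": "washer-42", "temp_c": 21.5}).encode("utf-8")
future = publisher.publish(topic, payload, device_id="washer-42")
print("published message", future.result())  # .result() returns the message ID
```

Downstream, the same messages can feed Dataflow and BigQuery for the fleet analytics described above.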
"The focus on intelligent products that Google Cloud is deploying provides a digital option for manufacturers and users. At its heart, systems like Intelligent Products Essentials are all about decision making. IDC sees faster and more effective decision-making as the fundamental reason for the drive to digitize products and processes. It's how you can make faster and more effective decisions to meet heightened customer expectations, generate faster cash flow, and better revenue realization," said Kevin Prouty, Group Vice President at IDC. "Digital offerings like Google's Intelligent Products Essentials potentially go the last mile with the ability to connect the digital thread all the way through to the final user."

Customers adopting Intelligent Products Essentials

GE Appliances, a Haier company, is enhancing its appliances with new AI-powered intelligent features to enable:

Intelligent cooking: Help cook the perfect meal to personal preferences, regardless of your expertise and abilities in the kitchen.

Frictionless service: Build smart appliances that know when they need maintenance and make it simple to take action or schedule service.

Integrated digital lifestyle: Make appliances useful at every step by integrating them with digital lifestyle services, for example, automating appliance behaviors according to customer calendars, such as preheating the oven or scheduling the dishwasher to run in the late evening.

"Intelligent Products Essentials enhances our smart appliances ecosystem, offering richer consumer habit insights. This enables us to develop and offer new features and experiences that integrate with their digital lifestyle." —Shawn Stover, Vice President, Smart Home Solutions at GE Appliances

Serial 1, Powered by Harley-Davidson, is using Intelligent Products Essentials to manage and update its next-generation eBicycles and to personalize its customers' digital ownership experiences.

"At Serial 1, we are dedicated to creating the easiest and most intuitive way to experience the fun, freedom, and adventure of riding a pedal-assist electric bicycle. Connectivity is a key component of delivering that mission, and working together to integrate Intelligent Products Essentials into our eBicycles will ensure that our customers enjoy the best possible user experience." —Jason Huntsman, President, Serial 1

Magic Leap, an augmented reality pioneer with industry-leading hardware and software, is building field service solutions with Intelligent Products Essentials, with the goal of connecting manufacturers, dealers, and customers for more proactive and intelligent service.

"We look forward to using Intelligent Products Essentials to rapidly integrate manufacturers' product data with dealer service partners in our field service solution. We're excited to partner with Google Cloud as we continue to push the boundaries of physical interaction with the digital world." —Walter Delph, Chief Business Officer, Magic Leap

Intelligent Products Essentials is available today. To learn more, visit our website.

Introducing Google Distributed Cloud—in your data center, at the edge, and in the cloud

Now more than ever, organizations are looking to accelerate their cloud adoption. They want easier development, faster innovation, and efficient scale, all while reducing their technology risk. However, some of their workloads cannot move to the public cloud entirely or right away, due to factors such as industry- or region-specific compliance and data sovereignty needs, low-latency or local data-processing requirements, or the need to run close to other services.

To ensure these workloads can still take advantage of what the cloud has to offer, today at Google Cloud Next '21 we are announcing Google Distributed Cloud, a portfolio of hardware and software solutions that extend our infrastructure to the edge and into your data centers. Depending on your organization's needs, you can run Google Distributed Cloud across multiple locations, including:

Google's network edge – Allowing customers to leverage more than 140 Google network edge locations around the world.

Operator edge – Enabling customers to take advantage of an operator's edge network and benefit from 5G/LTE services offered by our leading communication service provider (CSP) partners. The operator edge is optimized to support low-latency use cases, running edge applications with stringent latency and bandwidth requirements.

Customer edge – Supporting customer-owned edge or remote locations such as retail stores, factory floors, or branch offices, which require localized compute and processing directly in the edge locations.

Customer data centers – Supporting customer-owned data centers and colocation facilities to address strict data security and privacy requirements, and to modernize on-premises deployments while meeting regulatory compliance.

Google Distributed Cloud is built on Anthos, an open-source-based platform that unifies the management of infrastructure and applications across on-premises, edge, and multiple public clouds, all while offering consistent operation at scale. Google Distributed Cloud taps into our planet-scale infrastructure, which delivers the highest levels of performance, availability, and security, while Anthos running on Google-managed hardware at the customer or edge location provides a services platform on which to run applications securely and remotely.

Using Google Distributed Cloud, customers can migrate or modernize applications and process data locally with Google Cloud services, including databases, machine learning, data analytics, and container management. Customers can also leverage third-party services from leading vendors in their own dedicated environment. At launch, a diverse portfolio of partners, including Cisco, Dell, HPE, and NetApp, will support the service. Our first products under this portfolio are Google Distributed Cloud Edge and Google Distributed Cloud Hosted.

Google Distributed Cloud Edge

Available in preview today, Google Distributed Cloud Edge is a fully managed product that brings Google Cloud's infrastructure and services closer to where your data is being generated and consumed. Google Distributed Cloud Edge empowers you to run 5G core and radio access network (RAN) functions at the edge, alongside enterprise applications, to support mission-critical use cases such as computer vision and Google AI edge inferencing. Google Distributed Cloud Edge is ideal for running local data processing and low-latency edge compute workloads, modernizing on-premises environments, and deploying private 5G/LTE solutions across a variety of industries.
With Google Distributed Cloud Edge, retailers can provision applications at a Google network location, which allows in-store teams to focus on customers rather than sorting out IT. Manufacturers can save time and money by using video for visual inspections on factory floors, and CSPs can offer high-speed bandwidth with private 5G and localized compute to their customers. Google Distributed Cloud Edge builds on our telecommunication solutions and empowers CSPs to run workloads on Intel and NVIDIA technologies to deliver new 5G and edge use cases. It also allows ISV and network-function partners, application developers, and data scientists to deliver innovation and scale quickly and efficiently.

"CSPs are looking for faster ways to deploy cloud-native network architecture that can bring flexibility and agility to their edge solutions," said Dan Rodriguez, VP & GM, Network Platforms Group at Intel. "Google Distributed Cloud Edge will help accelerate the delivery of 5G telco cloud and services at the edge, leveraging Intel® Smart Edge Open, Intel's FlexRAN® reference software, and Intel® Xeon® Scalable processors."

"We are excited to partner with Google Cloud on Google Distributed Cloud Edge," said Ronnie Vasishta, Senior Vice President of Telecom at NVIDIA. "This builds on our continued partnership to deliver GPU-accelerated computing and networking solutions that help the telecommunications industry and enterprises harness data and AI-on-5G or AI at the edge to unlock new business opportunities."

We are also committed to delivering cloud capabilities to our partners' 5G networks and beyond. Google Distributed Cloud Edge furthers our previously announced global, strategic partnerships with both Ericsson and Nokia to bring new solutions built on a cloud-native 5G core and develop the network edge as a business services platform for enterprises.

"This announcement builds on our ongoing partnership with Google Cloud to develop Nokia cloud-native 5G core and Nokia radio solutions for Google's edge computing platform. By extending this relationship into Google Distributed Cloud Edge, we will increase customer choice and flexibility, ultimately helping our global customer base with multiple cloud-based solutions to deliver 5G services on the network edge," said Nishant Batra, Nokia Chief Strategy and Technology Officer.

In addition to network modernization, we are focused on building an edge ecosystem to help CSPs move beyond connectivity services and monetize the edge. Together, 5G and edge provide a powerful combination to help enterprises continue to digitize their business while leveraging third-party services from our trusted partner ecosystem in their dedicated environment.

"The announcement of Google Distributed Cloud supports Ericsson's vision of the network becoming a platform of innovation, enabling companies across the ecosystem to deliver the applications of the future the way they need to, unlocking the full potential of 5G and edge," said Rishi Bhaskar, Vice President and Head of Hyperscale Cloud Providers for Ericsson North America.

Google Distributed Cloud Hosted

Designed to run sensitive workloads, Google Distributed Cloud Hosted builds on the digital sovereignty vision we outlined last year, and supports public-sector customers and commercial entities that have strict data residency, security, or privacy requirements.
Google Distributed Cloud Hosted provides a safe and secure way to modernize an on-premises deployment, whether you operate it yourself or host through a designated, trusted partner. It does not require connectivity to Google Cloud at any time to manage infrastructure, services, APIs, or tooling, and it uses a local control plane provided by Anthos for operations. Google Distributed Cloud Hosted will be available in preview in the first half of 2022.

To address the needs of customers and governments across Europe, we are also developing trusted partner offerings as part of our "Cloud. On Europe's Terms" initiative. These partners will provide governments and enterprises the highest levels of digital sovereignty without compromising on functionality or pace of innovation. Each of these partnerships leverages different technologies. Last week, we announced a "Trusted Cloud" partnership with Thales. For Google Distributed Cloud Hosted, two of our initial partnerships are with T-Systems in Germany and OVHcloud in France.

T-Systems is building a sovereign cloud offering in partnership with Google Cloud for private and public-sector organizations based in Germany, which will become available in mid-2022.

"T-Systems and Google Cloud share a common goal of developing cloud-based solutions for European governments and enterprises that meet their digital sovereignty, sustainability, and economic objectives," said Frank Strecker, Senior Vice President Global Cloud Computing & Big Data and Edge, T-Systems. "Together we will offer a sovereign cloud solution for customers in Germany that gives them peace of mind to meet their rapidly evolving data, operational, and software sovereignty requirements."

We've also announced a strategic partnership with OVHcloud to accelerate French and European organizations' ability to digitally transform and reimagine their businesses.

"We have seen how Google Cloud listens to their customers, partners, and policymakers in Europe and heard the need for even greater control and autonomy," said Sylvain Rouri, CSO at OVHcloud. "Together, we are building a sovereign cloud services portfolio that provides clients with full control over their data, software, and operations whilst leveraging the full power of Google Cloud and meeting the requirements of the General Data Protection Regulation."

A cloud infrastructure built for your evolving needs

In addition to building a true distributed cloud, our work on core Google Cloud infrastructure (compute, network, and storage capabilities) continues unabated. This year, Google Cloud expanded its global infrastructure by opening four new regions in Warsaw, Delhi, Melbourne, and Toronto. We now have 28 regions around the world and the largest, lowest-latency network among hyperscale cloud providers. We also announced future availability in Berlin-Brandenburg; Columbus, OH; Israel; Madrid; Milan; Paris; Santiago; Saudi Arabia; and Turin. Combined with over 140 network-edge locations worldwide, these Google Cloud regions deliver the services, capacity, and performance you need to ensure a terrific experience for your users.

We also continued to advance our network by investing in subsea cables that improve access to Google services. With the addition of Firmina, which will connect the East Coast of the United States to three locations in South America, we now have investments in 19 subsea cables.
All of this allows us to provide transformative infrastructure to businesses that build on Google Cloud.

Service-centric networking

As our global network grows in reach, we're building out service-centric networking capabilities to simplify everything from connectivity to observability. For organizations with interconnects, VPNs, and SD-WANs, Network Connectivity Center provides a centralized management model, with monitoring and visualization through our Network Intelligence Center. And with Private Service Connect, partners and customers such as Bloomberg, MongoDB, and Elastic can easily connect services without having to configure the underlying network. Enterprises with workloads both on-premises and in the cloud can leverage hybrid load balancing to securely optimize application delivery. To help you detect and prevent malicious bot attacks, we recently integrated reCAPTCHA Enterprise with Cloud Armor. Together with Cloud IDS, the Google network edge is fortified with best-in-class security.

Industry-leading compute

One of the reasons people choose Google Cloud is for access to the latest high-performance compute services. For example, Compute Engine can be configured with Tau VMs, which are optimized for scale-out workloads. Tau VMs offer 42% higher price-performance compared to general-purpose virtual machines from any of the leading public cloud vendors. Today, we are also excited to announce our new Compute Engine Spot VMs in preview. Spot VMs offer customers better guaranteed minimum savings and more pricing predictability than spot instances from any other leading cloud provider. With fewer restrictions on excess compute capacity, Spot VMs are the future of preemptible instances for Google Cloud. Both Tau VMs and Spot VMs are supported with Google Kubernetes Engine (GKE).
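For a rough sense of what requesting a Spot VM on a Tau machine type might look like with the Compute Engine Python client, see the sketch below. Spot provisioning was in preview at the time of this announcement, so the scheduling fields shown are an assumption about the API surface, and the project, zone, and resource names are placeholders.

```python
from google.cloud import compute_v1

# All identifiers below are placeholders; the SPOT provisioning fields were
# in preview when announced, so availability may vary by client version.
instance = compute_v1.Instance(
    name="spot-worker-1",
    machine_type="zones/us-central1-a/machineTypes/t2d-standard-4",  # Tau family
    scheduling=compute_v1.Scheduling(
        provisioning_model="SPOT",
        instance_termination_action="STOP",
    ),
    disks=[
        compute_v1.AttachedDisk(
            boot=True,
            auto_delete=True,
            initialize_params=compute_v1.AttachedDiskInitializeParams(
                source_image="projects/debian-cloud/global/images/family/debian-11"
            ),
        )
    ],
    network_interfaces=[compute_v1.NetworkInterface(network="global/networks/default")],
)
operation = compute_v1.InstancesClient().insert(
    project="my-project", zone="us-central1-a", instance_resource=instance
)
operation.result()  # wait for the insert operation to complete
```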
Reliable and secure storage

Over the last year, we've been focused on making our storage easier to use, more performant, and the best choice for enterprises. We recently announced extensions to our popular Cloud Storage offering and introduced two new services: Filestore Enterprise and Backup for GKE. Together, these new capabilities make it easier for you to protect your data out of the box across a wide variety of applications and use cases. For a deeper dive into these storage announcements, watch our on-demand webinar, listen to our developer podcast, and be sure to attend our storage-focused sessions at Next '21.

Delivering a flexible cloud strategy

Our goal is to make your journey to the cloud easy. With transformative capabilities to help you innovate faster and save money, we follow an open approach to give you the greatest flexibility and choice as your organization evolves. Join us at the "Driving Transformation with Google Distributed Cloud" spotlight session and watch the interactive live demo of Google's latest investments in application modernization and infrastructure in a distributed cloud environment.

Announcing new tools to measure—and reduce—your environmental impact

Google Cloud is proud to support our customers with the cleanest cloud in the industry. For the past four years, we've matched 100% of our electricity use with renewable energy purchases, and we were the first company of our size to commit to going even further by running on carbon-free energy 24/7 by 2030. As we work toward 24/7 carbon-free energy, we help you take immediate action to decarbonize your digital applications and infrastructure. We're also working with our customers across every industry to develop new solutions for the unique climate change challenges that organizations face. Today, we're excited to expand our portfolio of carbon-free solutions and announce new partnerships that will help every company build a more sustainable future.

First, we're launching Carbon Footprint, a new product that provides customers with the gross carbon emissions associated with their Google Cloud Platform usage. Now available to every GCP user for free in the Cloud Console, this tool helps you measure, track, and report on the gross carbon emissions associated with the electricity of your cloud usage. Of course, the net operational emissions associated with your Google Cloud usage remain zero.

With growing requirements for Environmental, Social, and Governance (ESG) reporting, companies are looking for ways to show their employees, boards, and customers their progress against climate targets. Using Carbon Footprint, you have access to the gross energy-related emissions data you need for internal carbon inventories and external carbon disclosures, with one click. Built in collaboration with customers like Atos, Etsy, HSBC, L'Oréal, Salesforce, Thoughtworks, and Twitter, our Carbon Footprint reporting introduces a new standard of transparency to support you in meeting your climate goals. You can monitor your gross cloud emissions over time, by project, by product, and by region, giving IT teams and developers metrics that can help them reduce their carbon footprint. Our detailed calculation methodology is published so that auditors and reporting teams can verify that their cloud emissions data meets GHG Protocol guidance.

"The power of knowledge combined with the power of technology innovation plays a vital role in proactively responding to the climate crisis we are facing. With Google Carbon Footprint reporting, Atos feeds emissions data into our Decarbonization Data Platform, demonstrating potential emissions reductions from the Google Cloud Platform to our customers. This reporting opens up new levels of emissions transparency, trajectory planning, and data insight to support our customers in meeting, and potentially accelerating toward, their climate goals." —Nourdine Bihmane, Head of Decarbonization Business Line, Atos

"The capability to measure and understand the environmental footprint of our public cloud usage is among the key axes of our sustainable tech roadmap. With Google Cloud Carbon Footprint, we are now able to directly follow the impact of our sustainable infrastructure approach and architecture principles." —Hervé Dumas, Sustainability IT Director, L'Oréal

While digital infrastructure emissions are just one part of your environmental footprint, accurately accounting for IT carbon emissions is necessary to measure progress against the carbon reduction targets required to avert the worst consequences of climate change.
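Once emissions data is available for reporting, it can be sliced with standard tools. The sketch below queries a hypothetical BigQuery export of Carbon Footprint data; the dataset, table, and column names here are assumptions for illustration, not the product's documented schema.

```python
from google.cloud import bigquery

client = bigquery.Client()
# Dataset, table, and column names are assumptions, not a documented schema.
sql = """
SELECT usage_month,
       project.id AS project_id,
       SUM(carbon_footprint_total_kgCO2e.location_based) AS kg_co2e
FROM `my-project.carbon_dataset.carbon_footprint`
GROUP BY usage_month, project_id
ORDER BY usage_month
"""
for row in client.query(sql).result():
    print(row["usage_month"], row["project_id"], row["kg_co2e"])
```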
To help you account for emissions beyond our cloud and across your organization, we're excited to partner with Salesforce Sustainability Cloud, integrating our Google Cloud Platform emissions data into their carbon accounting platform.

"As we face unprecedented climate challenges, companies across the globe need to embed sustainability into the core of their business in order to meet growing customer and stakeholder expectations and reduce their environmental impact. Together, Google Cloud and Salesforce Sustainability Cloud can help our joint customers accelerate their path to net zero, leveraging data-driven insights and visualizations to track and reduce their carbon emissions to drive sustainable change." —Ari Alexander, GM of Salesforce Sustainability Cloud

From information to action

With the gross energy-related emissions of your Google Cloud usage now available, we're committed to providing tools that not only measure your carbon footprint but help you reduce it. We recently launched low-carbon region icons to help you choose cleaner regions for your Google Cloud resources. New users who see the icons are over 50% more likely to choose clean regions over others, ensuring their applications emit less carbon over time.

For current Google Cloud users, we're pleased to announce that the Active Assist Recommender will include a new sustainability impact category, extending its original core pillars of cost, performance, security, and manageability. Starting with the Unattended Project Recommender, you'll soon be able to estimate the gross carbon emissions you'll save by removing your idle resources. Unattended Project Recommender uses machine learning to identify, with a high degree of confidence, projects that are likely abandoned based on API and networking activity, billing, usage of cloud services, and other signals, and it provides actionable recommendations on how to remediate those abandoned projects. By deleting these projects, you can not only reduce costs and mitigate security risks but also reduce your carbon emissions. In August, Active Assist analyzed the aggregate data from all customers across our platform, and over 600,000 gross kgCO2e was associated with projects it recommended for cleanup or reclamation. If customers deleted these projects, they would significantly reduce future gross carbon emissions. Check out this blog to learn more about Active Assist.

[Image: As this feature rolls out, users will see a recommendation card in the Carbon Footprint dashboard; they can investigate the associated projects and choose to delete them to reduce emissions]

Solutions for climate resilience

Many of our customers face difficult questions about how their business impacts the natural environment today, and how it will be affected by climate change in the future. Answering these questions requires rich datasets about the planet, better analytics tools, and smarter models to predict potential outcomes. For over a decade, Google Earth Engine has supported scientists and developers with hyperscale computing power and the world's largest catalog of satellite image data. Today, we are delighted to announce the preview of Earth Engine as part of Google Cloud Platform. Now, you can access Earth Engine and combine it with other geospatial-enabled products like BigQuery. By extending Earth Engine's powerful platform to enterprises through Google Cloud, we are bringing the best of Google together.
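For a flavor of what Earth Engine makes possible, here is a small example with the Earth Engine Python API that computes a mean summer NDVI around a point of interest from Sentinel-2 imagery. It assumes your account has Earth Engine access; the location and dates are arbitrary choices for illustration.

```python
import ee

ee.Initialize()  # assumes your account already has Earth Engine access

point = ee.Geometry.Point(2.35, 48.86)  # arbitrary location (Paris)
composite = (
    ee.ImageCollection("COPERNICUS/S2_SR")
    .filterBounds(point)
    .filterDate("2021-06-01", "2021-09-01")
    .median()
)
# NDVI from the near-infrared (B8) and red (B4) bands.
ndvi = composite.normalizedDifference(["B8", "B4"]).rename("ndvi")
stats = ndvi.reduceRegion(ee.Reducer.mean(), point.buffer(500), 10)
print(stats.getInfo())
```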
Over the past year, we've worked with a number of organizations to use Earth Engine technology with tools like BigQuery and the Cloud AI Platform to develop new solutions for responsible commodity sourcing, sustainable land management, and carbon emissions reduction. Earth Engine enables companies to track, monitor, and predict changes in the Earth's surface due to extreme weather events or human-caused activities, helping them save on operational costs, mitigate and better manage risks, and become more resilient to climate change threats. This new offering wraps the unique data, insights, and functionality of Earth Engine in a fully managed, enterprise-grade experience.

As we work with our customers to accelerate their sustainability initiatives, earth observation data is proving critical to effectively plan for the long-term impacts of climate change. To extend our geospatial and sustainability use cases, we're also expanding our partnerships with CARTO, Climate Engine, Geotab, NGIS, and Planet to bring their data and core applications to Google Cloud. These partners will each make their existing platforms and datasets available globally on Google Cloud, giving you low-latency, reliable access to critical data and applications that will inform your sustainability initiatives. By integrating water availability, agricultural data, weather risks, and extensive daily satellite imagery into Earth Engine and BigQuery, you can achieve more ambitious goals for the sustainability of your business and our planet.

Committing to help you meet your climate goals

With each of these tools, we're working to reduce the barriers you face in adopting more sustainable technology practices. We understand that building more sustainable applications and infrastructure is not easy. You face competing priorities, technical challenges, and the perception that climate action is costly. It doesn't have to be this way. Today, we are making a sustainability pledge to you: teams across Google Cloud are committing to eliminating the barriers you face in building a more sustainable digital future for your organization, and we will help you take action today to realize your climate goals. We'll do this in a number of ways:

In digital transformation projects and workshops, sustainability teams will always have a seat at the planning table, so we can work together on using cloud technology to build a more sustainable future.

We're putting low-carbon signals natively into our products to help developers choose more sustainable options early in their application development.

We'll ensure carbon impact is measured consistently with other key performance indicators. Leveraging the social cost of carbon, the ROI models and value assessments you conduct with Google Cloud will project your emissions impact too.

We'll be transparent about our carbon impact by publishing third-party-reviewed reports and methodologies, so you can trust the data for your own reports and disclosures.

We'll continue to work with the industry on best practices, including educational resources like Sustainable IT – Decoded, a new masterclass created in partnership with Intel that shares the expertise of sustainability thought leaders.

For the next decade, we need to work together to avert the worst consequences of climate change. We've made tremendous progress in building technology that helps everyone do more for the planet, and we're excited to see what you do with it.
Visit this page to learn more about Google Cloud’s sustainability efforts.

Related article: IT leaders are choosing a sustainable future. A recent Google-commissioned study by IDG highlights the importance of sustainable solutions when selecting a cloud provider.
Source: Google Cloud Platform

What’s new at Next

Google Cloud Next ‘21 has begun! We’ll be updating this blog post throughout Next with the latest announcements about Google Cloud products, solutions, and partnerships. Just bookmark this page to get all your Next news in one handy place.

Cloud Everywhere

Introducing Google Distributed Cloud: in your data center, at the edge, and in the cloud
Google Distributed Cloud is a portfolio of fully managed hardware and software solutions that extend Google Cloud’s infrastructure and services to the edge and into data centers. Enabled by Anthos, it’s ideal for running local data processing and low-latency edge compute workloads, modernizing on-premises environments, running sensitive workloads that must meet sovereignty requirements, and deploying private 5G/LTE solutions for customers. Read the full announcement here.

Google Cloud Cortex Framework
The new Google Cloud Cortex Framework is a foundation of endorsed solution reference templates and content that helps customers accelerate business outcomes with less risk, complexity, and cost. It allows customers to kickstart insights and reduce time-to-value with reference architectures, packaged services, and deployment accelerators, and to deploy templatized solutions from Google Cloud and our trusted partners for specific use cases and business scenarios in a faster, more cost-effective way. In our first release, customers can take advantage of a rich data foundation of building blocks and templates for SAP environments. Read the announcement.

Customer Breakthrough in Industry

Contact Center AI Insights [GA]
Contact Center AI (CCAI) Insights extends the impact of Google Cloud’s CCAI solution, which lets businesses deliver rich, conversational customer experiences via capabilities such as AI-powered virtual agents and Agent Assist. CCAI Insights builds on these capabilities with out-of-the-box and custom modeling techniques, making it easier for teams to use AI to mine raw contact center interaction data for actionable information, regardless of whether that data originated with a virtual or human agent. To learn more about CCAI Insights, click here.

Contract DocAI [Preview]
Contract DocAI adds to Google Cloud’s DocAI solutions, a scalable cloud-based AI platform that helps businesses efficiently scan, analyze, and understand documents. Contract DocAI brings new features purpose-built for the most important and complicated documents of all: contracts. By automatically discerning important terms and the relationships among them, Contract DocAI helps human reviewers work more efficiently, leading to faster and less expensive contract processing, while providing new semantic lenses to categorize and analyze contract content. To learn more about Contract DocAI, click here.

Leading with Data

Vertex AI Workbench
We’re pleased to announce the public preview of Vertex AI Workbench, a natural evolution of Google Cloud’s notebook offerings that responds to our customers’ ever-changing needs for standardized, integrated ML tooling. Vertex AI Workbench is a single environment for data scientists to complete all of their ML work, from experimentation, to deployment, to managing and monitoring models. It is a Jupyter-based, fully managed, scalable, enterprise-ready compute infrastructure with security controls and user management capabilities. With Vertex AI Workbench, data analysts, data scientists, and all data and AI practitioners can analyze all their data from BigQuery, Dataproc, Spark, Looker, and Vertex AI in one interface.
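To give a feel for that single-interface workflow, here is a minimal, hedged sketch of pulling BigQuery data into a DataFrame from a Workbench notebook. The client-library calls are real; the project, dataset, and table names are made up for illustration.

```python
# In a Vertex AI Workbench notebook, query BigQuery straight into a DataFrame.
# (Project/dataset/table below are hypothetical; the client calls are real.)
from google.cloud import bigquery

client = bigquery.Client()
df = client.query("""
    SELECT region, SUM(sales) AS total_sales
    FROM `my-project.analytics.orders`   -- hypothetical table
    GROUP BY region
""").to_dataframe()
print(df.head())
```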
Vertex AI Workbench also facilitates model training at scale, with fewer lines of code and easy connectivity to our MLOps services, improving model survivability at the hand-off point to ML engineers.

Bringing the Google Magic

Data Center Transformation Specialization coming soon!
The Partner Advantage program will be launching a new Data Center Transformation Specialization, scheduled for Q1 2022. This Specialization is designed for service partners who have demonstrated success with complex data center transformations of enterprise workloads from private clouds, public clouds, and on-premises environments to Google Cloud. Stay tuned for the formal announcement. To understand more about the area that partners who achieve this Specialization will support, please read this Data Center Transformation with Google white paper and sign up to learn more.
Source: Google Cloud Platform

And the winners are… Announcing the first-ever Google Cloud Customer Awards

Today, on the eve of Next ‘21, we’re thrilled to announce the winners of our inaugural Google Cloud Customer Awards, recognizing the most innovative, technically advanced, and transformative cloud deployments from around the globe built on our platform.

These awards come at a particularly challenging time for businesses. Consumers’ buying habits are shifting. Organizations are concerned about the resiliency of their supply chains and networks in the face of ransomware and other cyber attacks. New regulatory, privacy, and sovereignty legislation is reshaping businesses’ future plans. And, of course, ensuring sustainable business practices to protect the environment continues to be a priority. Throughout it all, companies are leaning on IT departments to help them navigate these issues.

Judging by the nominations, Google Cloud customers are up for the challenge. Senior Google experts independently judged and scored hundreds of customer implementations from around the world against set criteria, including technical complexity, transformation, and innovation in the cloud, all represented as quantifiable metrics. Awards were given to the strongest nominations globally by industry (e.g., healthcare and life sciences, manufacturing, gaming, government, and others), and for social impact; diversity, equity, and inclusion (DEI); and cross-industry achievements. In every category, IT departments delivered real business value as they helped their organizations transform.

The Google Cloud Customer Awards are part of Google Cloud’s recognition program, which also includes the Google Cloud Partner Awards. We’ll showcase all the winners tomorrow as part of the Next ‘21 show opener. Congratulations to the winners and to everyone who took part!

Diversity, Equity, and Inclusion
Pearson’s contributions to cloud technology solutions in the education industry have been outstanding in many ways this year, particularly in how it has leveraged AI and machine learning to transform its internal data and analytics capabilities. In addition, the company’s DEI programs have positively impacted employees’ personal and professional development. Pearson’s Women in Technology program actively tackled issues people were struggling with while working remotely during the COVID-19 pandemic, bringing a sense of connection, wellbeing, and fulfillment to Pearson employees. This included launching a new skills initiative with proven results.

Social Impact
This diverse group of customers displayed exciting examples of technical innovation and transformation related to social impact. From democratizing data, to driving more accessible healthcare, to creating people-powered news reporting, the common theme among these customers was using technology to create more openness and transparency.

Communications Service Providers
Communications service providers’ (CSPs) entries were largely focused on using technology to provide exceptional customer service. Contact Center AI and other Google Cloud solutions are helping CSPs deliver high-quality products while achieving key business outcomes like reduced average call handle time. With tens of millions of customer conversations each year, Google’s cloud technology is helping CSPs create a world-class experience every time.

Financial Services
A staggering 15 of our top 62 scorers were from financial services, a clear indicator that this sector excels at innovative thinking, technical execution, and transformation.
Companies in this category used cloud technology to drive operational resilience, regulatory compliance, and better customer experiences, relying on an open cloud to support their customers worldwide.

Gaming
In the gaming industry, it’s all about record-breaking, era-defining, constantly updating player experiences. Google Cloud is helping to empower ambitious game developers to deliver their games globally at scale.

Healthcare and Life Sciences
In this highly regulated industry, customers are enabling essential services and continuous patient care, all in the face of the COVID-19 pandemic. Google Cloud is proud to partner with healthcare customers, who are leveraging the cloud to create agility during a trying year.

Manufacturing
The manufacturing sector is engaged in some very exciting transformations. In some cases, decades-old companies are moving away from traditional IT implementations and reinventing themselves as purveyors of smart products with dynamic supply chains. They’re turning to Workspace, advanced analytics, and Google Cloud’s connected devices, products, and solutions to optimize their businesses, starting on the manufacturing floor.

Media and Entertainment
This year’s media and entertainment entries centered on audience engagement. Customers told us how they are using the cloud to reduce latency and unlock new experiences for their users. This includes using machine learning to run solutions such as Speech-to-Text, and engaging audiences with exciting breakthroughs in data analytics. Engagement within their own teams and organizations was also key to their success.

Retail
Google Cloud has always worked with retailers to solve their most challenging problems. But no one could have imagined more challenging problems than those retailers faced this year with the pandemic and multiple supply chain disruptions. Ecommerce revenue has become more vital than ever before, and brick-and-mortar retailers are making substantial investments in technology, training, and enablement to modernize their systems. Meanwhile, digital natives and many other retailers are becoming even more data-driven.

Government
Collaboration and communication in the cloud were clear themes for our public sector customers. From employment, to tourism, to health and vaccination efforts, these organizations strive to help citizens in all aspects of their lives through digital services. Working together to build user-facing apps, data solutions, and new online working environments at speed, government customers have led an inspiring effort during unprecedented times.

Education
In 2020 alone, more than 93,000 Americans died of drug overdoses. One of the biggest hurdles to solving this crisis is a data problem that hinders the work of those trying to bring life-saving resources to the areas and individuals who need them. Google Cloud and partner Maven Wave enabled an interdisciplinary team of developers, UX designers, community organizations, and clinical and managerial partners at Dell Medical School and the Steve Hicks School of Social Work at The University of Texas at Austin to create a unique platform that is helping to save lives.

Cross-Industry
COVID-19 has been, and continues to be, one of the greatest challenges faced by all industries. The ability to make decisions faster, ensure stability and business continuity, and work effectively as teams has never been more important. Customers in the cross-industry category continued to innovate, grow, and build future-proof solutions using the cloud.
Google Cloud customers are doing amazing things, and we can’t wait to see what they do next! Register for Next to hear from some of our customers. To find out why many of the world’s leading companies are choosing Google Cloud to help them innovate faster, make smarter decisions, and collaborate from anywhere, visit our customer pages.

Related article: Announcing the winners of our Google Cloud 2020 Partner Awards.
Source: Google Cloud Platform

Using the Cloud Spanner Emulator in CI/CD pipelines

In past posts, we have seen how to provision and manage Cloud Spanner in production environments and how to use the Cloud Spanner Emulator in your development workflow with a sample Node.js app called OmegaTrade. We’ve covered:

- the Cloud Spanner emulator’s various deployment models,
- running the emulator locally with OmegaTrade, and
- running the emulator remotely with OmegaTrade.

In this post, we will cover how to use the Cloud Spanner emulator in your Continuous Integration and Continuous Delivery/Deployment (CI/CD) pipelines, with sample steps for the following tools:

- Cloud Build
- Jenkins
- GitHub Actions
- CircleCI

As a refresher, OmegaTrade is a stock ticker visualization application that stores stock prices in Cloud Spanner and renders visualizations using Google Charts. It consists of a frontend service that renders the visualizations and a backend service that writes to and reads from the Cloud Spanner instance. For simplicity, we focus on the backend service of the OmegaTrade application in this blog post and demonstrate how to use the Cloud Spanner emulator in CI/CD pipelines for that service. If you’d prefer to deploy the OmegaTrade backend manually, you can learn how by reading this blog post, focusing on the section “Manual deployment steps.”

CI/CD approach for deploying OmegaTrade

The steps for all of the CI/CD tools we discuss are similar: the code is stored in a public GitHub repository, Dockerfiles are used to create the application’s Docker image, Google Container Registry (GCR) serves as the image repository, and gcloud commands deploy the application to Cloud Run. Please note that all of the CI/CD tools require access to a GCP service account (SA) as well as integration with your GitHub repository.

The main branch holds the application code deployed to the dev environment, where we will run against the Cloud Spanner emulator (since it uses the emulator, we call it a dev environment). The prod branch holds the application code deployed to the prod environment, which uses an actual Cloud Spanner instance. We have two Dockerfiles: one for dev (dockerfile.local.emulator), which bundles the Cloud Spanner emulator together with the application layers for testing purposes, and one for prod (dockerfile.prod), which contains only the application code layer. A rough sketch of such a dev image appears just below.

Cloud Build

To set up CI/CD with Cloud Build, first ensure the Cloud Build API is enabled in your Cloud Console. The next step is to connect our GitHub repository with Cloud Build for automated deployment of the application to Cloud Run: from the Connect Repository option in the Cloud Build Triggers screen, select GitHub (Cloud Build GitHub App) as the source, authenticate, and select your GitHub account and repository. Behind the scenes, Cloud Build installs the Cloud Build app in your GitHub repository (you can find it under Repo Settings ➞ Integrations). This app monitors your repository and triggers pipeline processing upon any commit or push to whichever branch you specify in the trigger; Cloud Build also allows folder-specific triggers. For this blog, we are going to create two triggers, one dedicated to dev deployment and another to prod.
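Since the dev trigger builds the emulator-bundled image, here is the promised sketch of what a dockerfile.local.emulator along these lines could look like. This is an illustration under stated assumptions, not the repository’s actual file: the two base images are real, but the emulator binary path, the entry file, and the start command are assumptions.

```dockerfile
# Sketch of a dev image that bundles the Cloud Spanner emulator with the
# Node.js backend for testing. Paths and commands below are assumptions.
FROM gcr.io/cloud-spanner-emulator/emulator AS emulator

FROM node:14-slim
# Copy the emulator gateway binary out of the official image (layout assumed).
COPY --from=emulator /gateway_main /emulator/gateway_main
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
# Point the Spanner client library at the in-container emulator.
ENV SPANNER_EMULATOR_HOST=localhost:9010
EXPOSE 8080
# Start the emulator in the background, then the backend (entry file assumed).
CMD ["sh", "-c", "/emulator/gateway_main --hostname 0.0.0.0 & node index.js"]
```

The prod Dockerfile would keep only the Node.js layers and drop the emulator stage, with the service configured to reach a real Cloud Spanner instance instead of SPANNER_EMULATOR_HOST.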
The dev deployment will communicate with the Spanner emulator, whereas the prod one will communicate with an actual Cloud Spanner instance.

Create Google Cloud Build triggers

We are using a cloudbuild.yaml build config file for the deployment. A build config file defines the fields Cloud Build needs to perform your tasks; it can be written as .yaml or .json and holds the instructions for our CI/CD pipeline. Our build config is fully parameterized, which means we use the same file for both the frontend and backend deployments and provide the values when creating triggers in the GCP UI. The substitution_option: ‘ALLOW_LOOSE’ setting allows Cloud Build triggers to run despite any missing substitution variable; we use it because backend deployment needs some extra values, and with this option Cloud Build won’t return errors when they are absent.

To set up the backend trigger, follow these steps:

- From Cloud Build ➞ Create Trigger, enter a unique name and description for the trigger.
- Select “Push to a branch” as the Event.
- Choose the GitHub repo under Source.
- Enter ^main$ under Branch.
- Enter backend/** in the Included files filter (this ensures the trigger fires only when something changes in that folder).
- Enter cloudbuild.yaml under Build Configuration.

Similarly, create one for prod deployment with ^prod$ under Branch, substituting the rest of the fields with actual Cloud Spanner instance values.

Here is a video walkthrough of the configuration steps for Cloud Build.

Jenkins

Prerequisites:
- basic knowledge of Jenkins
- Jenkins must be installed and set up

To set up Jenkins CI/CD, we use the Jenkins multibranch pipeline, which allows creating and triggering pipelines based on branches and pull requests. For this blog post, we create two branches: dev and prod. The dev app code will communicate with the Spanner emulator hosted on Google Compute Engine, whereas the prod one will communicate with an actual Cloud Spanner instance.

Create service accounts in GCP

After installing Jenkins on GKE, we need to create a service account (SA) in GCP and give it the correct permissions. For this blog, we’ve granted a wide admin scope, but in your environment you may want to follow the principle of least privilege. Once done, create an SA key; we’re going to need it later.

Now let’s create a secret, kaniko-secret, in Kubernetes with the kaniko service account key. kaniko is a tool that builds container images from a Dockerfile inside a container or Kubernetes cluster. We’re using Cloud Shell for this, so before applying the commands we have to upload key.json there.

Connect Jenkins with GitHub

Integrating Jenkins with GitHub keeps your project up to date: Jenkins builds are scheduled automatically whenever there is a new push or commit to the branch, pulling the code and data files from GitHub. To do this, add a webhook from your repo’s settings: enter the Jenkins URL, append /github-webhook/, and choose application/json as the content type. Then select which events should trigger the webhook; for instance, you can choose the third option, “Let me select individual events,” and allow Push and Pull Requests.
Upon saving, you will see a green check, which means the first (test) delivery succeeded.

Create a GitHub personal access token

A GitHub personal access token is a good alternative to password authentication when using the GitHub API or the command line. Generate one from your GitHub profile under Settings ➞ Developer Settings ➞ Personal Access Tokens, with all the permissions you want to grant this token, and copy the token value.

Add credentials

Credentials allow Jenkins to interact with third-party services or tools, such as a Docker container registry, in an authorized manner. Jenkins stores credentials in encrypted form on the master Jenkins instance (encrypted by the Jenkins instance ID), and credentials are only handled in pipeline projects via their credential IDs. Here we add the kaniko-executor-sa key (created above) to Jenkins, which allows Jenkins to communicate with GCR. From the admin option on the top left, go to Credentials ➞ Stores from parent (on the right) ➞ Global credentials (unrestricted) ➞ Add Credentials, select Secret File under Kind, and upload the file. Also create one Secret with username and password, where the username is your GitHub username and the password is the access token you copied above.

Install plugins

We’ll install some plugins that make the multibranch pipeline experience smoother. From Manage Plugins under Manage Jenkins, install:
- Pipeline: Multibranch build strategy extension
- Blue Ocean

Create the multibranch pipeline

To create a multibranch pipeline, go to New Item ➞ Multibranch Pipeline. Enter the item name, display name, and description. Choose GitHub under Branch Source, select the GitHub personal access token you created before, enter the GitHub repo name, and hit Validate; it should show “Credentials OK.” Under Behaviors, select options based on your trigger requirements; the defaults tend to help improve discoverability. From Build Strategies, choose “Trigger builds for included regions.”

Build strategies

This feature is based on the Pipeline: Multibranch build strategy extension plugin; without it, you won’t be able to create GitHub folder-based triggers. If you keep the code for more than one service in a single repo, a repo-based trigger is not recommended, because all the pipelines would be triggered by any change in the repo. With this plugin, you can specify the folders from which a build should occur, so a given pipeline is only triggered when there is a change in its specific folder.

Under Build Configuration, enter the Jenkinsfile path. Leave the rest of the options as they are and hit Save. As you save, Jenkins scans the repository and displays logs; once that is done, Jenkins starts building and deploying the app to Cloud Run.

Here is a video walkthrough of the configuration steps for Jenkins.

GitHub Actions

With GitHub Actions, we create workflows. Here we’ll create two workflows, one dedicated to dev and another to prod. The dev deployment will communicate with the Spanner emulator, whereas the prod one will talk to an actual Cloud Spanner instance. First, we have to set up some secrets in GitHub, which GitHub Actions will use during deployments. Go to Settings ➞ Secrets; we need to create two secrets here.
One secret, GCP_SA_EMAIL, holds the GCP service account email used for deployments; the other, GCP_SA_KEY, holds the associated service account key. To generate the key from the GCP Console, go to Service Accounts, choose your SA, and from the three vertical dots select Create Key.

From the GitHub repo, go to Actions, where you will find pre-created workflow templates. Choose any of them, clear the code, paste in the workflow code, and name the file. As you commit it, a .github/workflows folder appears in the GitHub repository, and under Actions you can watch your pipeline run.

The GitHub Actions workflows are branch- and folder-specific: we use the push keyword for the branch and working-directory for the folder, and a commit must satisfy both to run a specific workflow. We use previously published GitHub Actions to configure the Google Cloud SDK in the GitHub Actions environment, passing the service account email and key for authorization. Once authorization is done, we build, tag, and push Docker images to GCR and deploy the application to Cloud Run. The workflow pipeline runs on GitHub-hosted infrastructure with the latest Ubuntu version as the underlying OS.

Here is a video walkthrough of the configuration steps for GitHub Actions.

CircleCI

For CircleCI integration, you need a CircleCI account, and you need to add your GitHub repository as a project in CircleCI. Once this is done, CircleCI takes care of deploying your app to Cloud Run. When you add a project, CircleCI asks you to write and commit a config.yml to your GitHub repository. Templates are available for different languages and frameworks, but you can ignore them for now and follow the CircleCI pipeline code: just paste it and commit it.

This pipeline has two jobs: one for dockerizing and deploying the app in the dev environment, and another for the prod environment. The dev job runs when there is a change or commit on the main branch, whereas the prod job runs when there is a change or commit on the prod branch. To allow CircleCI to deploy apps to GCP, we add the GCP service account to the CircleCI environment variables, which are passed each time the pipeline runs: from Projects ➞ Your Project ➞ Project Settings (top right) ➞ Environment Variables, click Add Environment Variable and add the service account key. Once this is done, CircleCI will deploy your app to Cloud Run.

Below is a video walkthrough of the configuration steps for CircleCI.

Conclusion

In this blog post, we have shown how to deploy the backend service of the OmegaTrade app with the Cloud Spanner emulator in four different CI/CD pipelines: Cloud Build, Jenkins, GitHub Actions, and CircleCI. We’ve also briefly covered deploying to production. Note that if you’re setting up a development environment, you may wish to consider using the emulator for your testing and validation needs; this blog post has more details on how to set that up (locally or remotely). To learn more about Cloud Spanner, visit the product page.
To learn more about the Cloud Spanner emulator, take a look at the official documentation.

Related article: Deploying the Cloud Spanner Emulator remotely. Learn how to deploy the Cloud Spanner emulator remotely to GCE and Cloud Run, both manually and via Terraform.
Source: Google Cloud Platform

How Cherre transforms real estate data with Cloud SQL for PostgreSQL

Editor’s note: Today we are hearing from Ben Hizak, co-founder of Cherre. He shares Cherre’s approach to consolidating multiple datasets into one data warehouse to power BI and analytics for their customers, and explains how the company uses Google Cloud and Cloud SQL to bring real estate datasets together and help customers make better data-driven decisions.

Cherre’s quest to transform real estate investing and underwriting into a science requires us to connect thousands of datasets and make them accessible for investment, management, and underwriting decisions. By delivering actionable insights based on fresh data, our clients can answer questions they could never answer before. Each of our clients is interested in a different combination of public and paid datasets, which is then combined with that client’s private data. All of this has to be done in a secure, scalable, and repeatable manner. We process this data in BigQuery and then store it in Cloud SQL for PostgreSQL, a single source of truth that our clients can use to train machine learning models, consume via an API running on Google Kubernetes Engine (GKE), or visualize using BI tools like Looker.

[Figure: a sample combination of public and paid datasets displayed in Looker]

Before Cherre, this kind of consolidated data work was simply not available in the real estate industry. Datasets were never in one place, and each dataset contained different properties with different attributes. For example, data on transactions from one vendor and data on securitized debt from another lived in completely different worlds prior to Cherre. And private data vendors never before let raw data leave their systems; building the trust that now lets those vendors put their data onto our platform has been a unique honor and responsibility.

Our clients are large organizations, such as real estate funds, banks, and insurance companies, each managing billions of dollars. For them, the difference between a right decision and a wrong decision can be the difference between a successful business and a failed one, and their need for speed is insatiable. We enable our clients to make better decisions faster by putting clean, connected data at their fingertips. Cherre customers can evaluate opportunities and trends faster and more accurately, and efficiently manage their portfolios, while saving millions of dollars in manual data collection and analytics costs.

Consolidating data to turn investing into a science

Consolidating thousands of datasets and millions of data points requires both technical capabilities and human sensitivities. We serve as a center point for two sets of stakeholders: large institutions that require the highest levels of security, and providers of proprietary data that stake their reputations on our ability to deliver their data on infrastructure comparable to or better than what they provide to their own clients. Our reputation relies on the strength of our technical infrastructure.

Why Postgres

When choosing a database to back our API, Postgres was the natural solution. It is feature-rich, fast, and has a strong community. Postgres also has strong geospatial capabilities, which is an absolute requirement for clients in our industry.

Postgres and GraphQL combine well to serve different use cases in a “wide data” scenario

To understand our stack, it helps to understand our data. Most data engineers work in a “thin and fast” scenario: a small number of fields have to be processed very rapidly.
Our case is the opposite. Real estate data is “wide and slow(er)”: our input consists of many thousands of tables, which we reduce to something more manageable, just over 100 tables and thousands of columns for certain clients, which they then consume via our API and visualize in Looker. Some of our clients even run their calculations directly on our infrastructure.

REST doesn’t cut it

Every client asks different questions of their data. While some might care about specifics of a certain building, such as the number of tenants, average occupancy, or length of occupancy, others are more interested in general information, such as fire safety codes. Our clients expect to be able to filter on any column and traverse datasets without having to understand how they join together. In the classic world of REST APIs, clients would have to pull all the objects and know how to assemble them on the client side. That would be antithetical to our purpose of making data ingestion as simple as possible: data wrangling is our job. We had to look for a different approach, which required a very different architecture.

GraphQL, a more modern type of API, comes in handy. Not only does it prevent over-fetching and under-fetching of data, it also joins tables and traverses relationships (one-to-many, many-to-one, many-to-many) in a way that is seamless for the client. Real estate data can be difficult to wrangle: it comes from thousands of authorities, in various formats and qualities, and it is built on various objects, often lacking a consistent key or identical data point, which makes for complex and very subtle joins. Cherre wanted to take on that burden rather than pass it to our clients, and this is where GraphQL shines. Employing GraphQL lets Cherre focus on the minutiae so that our clients can focus on the big picture.

GraphQL allows each of our clients to query only the objects and fields they care about, and have them joined, filtered, and aggregated on the server side, yielding a response that is easy to consume. For example, a one-to-many join can return buildings with their apartments (units) while filtering on both entities; a many-to-one query works because each apartment belongs to exactly one building; and a one-to-many query can pull apartments based on the building they belong to (say, buildings that were built after 2010 and have a doorman). Hedged sketches of these query shapes appear a little further below.

Disposable tables: because you can’t always get it right the first time

Since Cherre works with thousands of datasets and tens of thousands of input fields, the chance that we will misunderstand or miscalculate a field is so high as to be a certainty. To preempt this, we wanted to be able to rebuild our output from scratch; we were looking for something like “disposable infrastructure,” but for datasets. We decided that Cherre would not operate on existing data and would instead always recompute what we need from scratch. This approach is often called immutable tables: if there’s a mistake, you fix the code and rerun, and the mistake is corrected. To achieve this, Cherre builds immutable tables that live for a short time. We then funnel some or all of the client traffic to the new table, keeping the old table in case we need to do a zero-downtime rollback, but ultimately deleting it to conserve space.
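As promised, here are sketches of the query shapes described earlier. The original post illustrated them with screenshots of live queries; these stand-ins are assumptions only (the schema, field names such as buildings, units, year_built, and has_doorman, and the Hasura-style filter arguments are not Cherre’s actual API).

```graphql
# One-to-many: buildings with their units, filtering on both entities
# (buildings built after 2010 with a doorman; units with 2+ bedrooms).
query BuildingsWithLargeUnits {
  buildings(where: { year_built: { _gt: 2010 }, has_doorman: { _eq: true } }) {
    address
    year_built
    units(where: { bedrooms: { _gte: 2 } }) {
      unit_number
      bedrooms
    }
  }
}

# Many-to-one: each unit belongs to exactly one building.
query UnitsWithTheirBuilding {
  units {
    unit_number
    building {
      address
    }
  }
}
```

With those query shapes in mind, back to the disposable-table lifecycle.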
We delete data with full confidence that we can rebuild it from scratch by running the appropriate version of the code. Immutable tables are more costly and take longer to compute, but they allow Cherre to iterate quickly with the confidence that nothing is ever lost, and they let us compare two versions of our output.

Why Cloud SQL

As you can see, our database needs are not trivial, which is why we chose Cloud SQL for PostgreSQL. We need to control our costs, which we do using Cloud SQL committed-use discounts (CUDs) and a cost monitoring tool from DoiT International. We also need high availability, read replicas, and backup management; Cloud SQL does all of that for us so we don’t have to spend development time on it, and it saves us the time we would otherwise spend patching and upgrading the software. We use Google Cloud projects to implement robust information security, so having a database that sits well within the Google projects system lets us manage permissions and security consistently across the organization. And we leverage Config Connector, which allows us to provision Google Cloud resources through Kubernetes: since we want infrastructure-as-code for everything and we are already using Google Kubernetes Engine (GKE), it was natural to use Config Connector for our operations.

Looking ahead

We’re extremely proud of what we’ve built, but there’s always room for improvement. As technology continues to evolve and as we continue to develop, we’re seeking out new opportunities to become more efficient. We’re also looking to leverage deeper database optimization, which is much easier to do now thanks to Query Insights. More information is becoming available every day, and the buyers, sellers, and other real estate professionals who can make smarter decisions based on that data will gain an edge in the marketplace. We’re committed to making the connections they need to get ahead.

Read more about Cloud SQL, Cloud SQL Insights, and Looker in our documentation, or take the Cloud SQL Insights code lab.
Source: Google Cloud Platform