Announcing open innovations for a new era of systems design

We’re at a pivotal moment in systems design. Demand for computing is growing at an insatiable rate. At the same time, the slowing of Moore’s law means that improvements in CPU performance, power consumption, and memory and storage cost efficiency have all plateaued. These headwinds are further exacerbated by new challenges in reliability and security. At Google, we’ve responded to these challenges and opportunities with system design innovations across the stack: from new custom-silicon accelerators (e.g., TPU, VCU, and IPU), to new hardware and data center infrastructure, all the way to new distributed systems and cloud solutions. But this is only the beginning. There are many more opportunities for advancement, including closely coupled accelerators for core data center functions to minimize the so-called “data center tax.” As server and data center infrastructure diverges from decades-old traditional designs to become more modular, heterogeneous, disaggregated, and software-defined, distributed systems are also entering a new epoch, one defined by optimizations for the “killer microsecond” and novel programming models optimized for low latency and accelerators.

At Google, we believe that these new opportunities and challenges are best addressed together, across the industry. Today, at the Open Compute Project (OCP) Global Summit, we are demonstrating our support of open hardware ecosystems, presenting in more than 40 talks, and announcing several key contributions:

Server design: We will share Google’s vision for a “multi-brained” server of the future, transforming traditional server designs into more modular, disaggregated distributed systems spanning host computing, accelerators, memory expansion trays, infrastructure processing units, and more. We are sharing the work we are doing with all our OCP partners on the varied innovations needed to make this a reality: modular hardware with DC-MHS, standardized management with OpenBMC and Redfish, a standardized root of trust, and standardized interfaces including CXL, NVMe, and beyond.

Trusted computing: The root of trust is an essential part of future systems. Google has a tradition of making contributions for transparent and best-in-class security, including our OpenTitan discrete security solutions on consumer devices. We are looking ahead to future innovations in confidential computing and varied use cases that require chip-level attestation at the level of a package or System on a Chip (SoC). Together with other industry leaders AMD, Microsoft, and NVIDIA, we are contributing Caliptra, a reusable IP block for root-of-trust measurement, to OCP. In the coming months we will roll out initial code for the community to collectively harden together.

Reliable computing: To address the challenges of reliability at scale, we’ve formed a new server-component resilience workstream at OCP, along with AMD, Arm, Intel, Meta, Microsoft, and NVIDIA. Through this workstream, we’ll develop consistent metrics on silent data errors and corruptions for the broader industry to track. We’ll also contribute test execution frameworks and suites, and provide access to test environments with faulty devices. This will enable the broader community, across industry and academia, to take a systems approach to addressing silicon faults and silent data errors.

Sustainability: Finally, we’re announcing our support for a new initiative within OCP that makes environmental sustainability a key tenet across the ecosystem. Google has been a leader in environmental sustainability for many years. We have been carbon neutral since 2007, powered by 100% renewable energy since 2017, and have an ambitious goal to achieve net-zero emissions across all of our operations and value chain by 2030. In turn, as the cleanest cloud in the industry, we have helped customers track and reduce their carbon footprint and achieve significant energy savings. We’re excited to share these best practices with OCP and work with the broader community to standardize sustainability measurement and optimization in this important area.

As the industry body focused on system integration (e.g., compute, memory, storage, management, power, and cooling), the OCP Foundation is uniquely positioned to facilitate the industry-wide codesign we need. Google is active in OCP, serving in leadership roles, incubating new initiatives, and supporting numerous contributions. These announcements are the latest example of our history of fostering open, standards-based ecosystems. Open ecosystems enable a diverse product marketplace, agility in time-to-market, and the opportunity to be strategic about innovation. Google’s open source leadership is multidimensional: driving industry standardization and adoption, making strong and varied community contributions to grow the ecosystem, and providing broad policy and organizational leadership alongside the sharing of best practices.

The four initiatives we are announcing today, in combination with the Google-led talks at the OCP Summit, provide a small glimpse into the exciting new era of systems ahead. We look forward to working with the broader OCP community and other industry organizations to build a vibrant open hardware ecosystem to support even more innovation in this space. Please join us in this exciting journey.
Source: Google Cloud Platform

Unifying data and AI to bring unstructured data analytics to BigQuery

Over one-third of organizations believe that data analytics and machine learning have the most potential to significantly alter the way they run their business over the next three to five years. However, only 26% of organizations are data-driven. One of the biggest reasons for this gap is that a major portion of the data generated today is unstructured, which includes images, documents, and videos. Unstructured data is estimated to account for roughly 80% of all data, and it has so far remained largely untapped by organizations.

One of the goals of Google’s data cloud is to help customers realize value from data of all types and formats. Earlier this year, we announced BigLake, which unifies data lakes and warehouses under a single management framework, enabling you to analyze, search, secure, govern, and share unstructured data using BigQuery. At Next ‘22, we announced the preview of object tables, a new table type in BigQuery that provides a structured record interface for unstructured data stored in Google Cloud Storage. This enables you to directly run analytics and machine learning on images, audio, documents, and other file types using existing frameworks like SQL and remote functions natively in BigQuery itself. Object tables also extend our best practices for securing, sharing, and governing structured data to unstructured data, without needing to learn or deploy new tools.

Directly process unstructured data using BigQuery ML

Object tables contain metadata such as URI (Uniform Resource Identifier), content type, and size that can be queried just like other BigQuery tables. You can then derive inferences from unstructured data using machine learning models with BigQuery ML. As part of the preview, you can import open source TensorFlow Hub image models, or your own custom models, to annotate images. Soon, we plan to enable this for audio, video, text, and many other formats, along with pre-trained models that enable out-of-the-box analysis.
Check out this video to learn more and watch a demo.

```sql
# Create an object table
CREATE EXTERNAL TABLE my_dataset.object_table
WITH CONNECTION us.my_connection
OPTIONS(uris=["gs://mybucket/images/*.jpg"],
        object_metadata="SIMPLE", metadata_cache_mode="AUTOMATIC");

# Generate inferences with BQML
SELECT * FROM ML.PREDICT(
  MODEL my_dataset.vision_model,
  (SELECT ML.DECODE_IMAGE(data) AS img FROM my_dataset.object_table)
);
```

By analyzing unstructured data natively in BigQuery, businesses can:

- Eliminate manual effort, as pre-processing steps such as tuning image sizes to model requirements are automated
- Leverage the simple and familiar SQL interface to quickly gain insights
- Save costs by utilizing existing BigQuery slots without needing to provision new forms of compute

Adswerve is a leading Google Marketing, Analytics, and Cloud partner on a mission to humanize data. Twiddy & Co., a vacation rental company in North Carolina, is an Adswerve client. By combining structured and unstructured data, Twiddy and Adswerve used BigQuery ML to analyze images of rental listings and predict click-through rates, enabling data-driven photo editorial decisions. “Twiddy now has the capability to use advanced image analysis to stay competitive in an ever changing landscape of vacation rental providers – and can do this using their in-house SQL skills,” said Pat Grady, Technology Evangelist, Adswerve.

Process unstructured data using remote functions

Customers today use remote functions (UDFs) to process structured data with languages and libraries that are not supported in BigQuery. We are extending this capability to process unstructured data using object tables. Object tables provide signed URLs that allow remote UDFs running on Cloud Functions or Cloud Run to process the object table content.
This is particularly useful for running Google’s pre-trained AI models, including Vision AI, Speech-to-Text, and Document AI, or open source libraries such as Apache Tika, or for deploying your own custom models where performance SLAs are important. Here’s an example of an object table created over PDF files that are parsed using an open source library running as a remote UDF.

```sql
SELECT uri, extract_title(samples.parse_tika(signed_url)) AS title
FROM EXTERNAL_OBJECT_TRANSFORM(TABLE pdf_files_object_table,
                               ["SIGNED_URL"]);
```

Extending more BigQuery capabilities to unstructured data

Business intelligence – The results of analyzing unstructured data, whether directly in BigQuery ML or via UDFs, can be combined with your structured data to build unified reports using Looker Studio (at no charge), Looker, or any of your preferred BI solutions. This allows you to gain more comprehensive business insights. For example, online retailers can analyze product return rates by correlating them with images of defective products. Similarly, digital advertisers can correlate ad performance with various attributes of ad creatives to make more informed decisions.

BigQuery search index – Customers are increasingly using the search functionality of BigQuery to power search use cases. These capabilities now extend to unstructured data analytics as well. Whether you use BigQuery ML to produce inferences on images or use remote UDFs with Document AI to extract data from documents, the results can now be search-indexed and used to support search access patterns.
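To make the remote-UDF flow above more concrete: BigQuery invokes the endpoint behind a remote function with a JSON body whose `calls` array holds one argument list per row, and it expects a `replies` array of the same length back. The sketch below is a minimal, hypothetical stand-in for a PDF-parsing handler (a real one would fetch each signed URL and run an actual parser; here we just derive a placeholder title from the URL):

```python
import json

def handle_remote_call(request_body: str) -> str:
    """Toy remote-UDF handler: BigQuery sends {"calls": [[arg, ...], ...]}
    and expects {"replies": [result, ...]} with one reply per call."""
    payload = json.loads(request_body)
    replies = []
    for call in payload["calls"]:
        signed_url = call[0]  # first UDF argument for each row
        # A real handler would fetch signed_url and parse the PDF here;
        # we just turn the file name into a placeholder "title".
        replies.append(signed_url.rsplit("/", 1)[-1].removesuffix(".pdf"))
    return json.dumps({"replies": replies})

# Example payload shaped like what BigQuery would send for two rows:
body = json.dumps({"calls": [["https://example/signed/a.pdf"],
                             ["https://example/signed/b.pdf"]]})
print(handle_remote_call(body))
```

Deployed on Cloud Functions or Cloud Run, a handler with this request/reply shape is what a `CREATE FUNCTION ... REMOTE` definition points at.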
Here’s an example of a search index on data parsed from PDF files:

```sql
CREATE SEARCH INDEX my_index ON pdf_text_extract(ALL COLUMNS);

SELECT * FROM pdf_text_extract WHERE SEARCH(pdf_text, "Google");
```

Security and governance – We are extending BigQuery’s row-level security capabilities to help you secure objects in Google Cloud Storage. By securing specific rows in an object table, you can restrict the ability of end users to retrieve the signed URLs of the corresponding URIs in the table. This is a shared-responsibility security model: administrators need to ensure that end users don’t have direct access to Google Cloud Storage, and that signed URLs from object tables are the only access mechanism.

Here’s an example of a policy for PII images that are secured so that they must first pass through a blur pipeline:

```sql
CREATE ROW ACCESS POLICY pii_data ON object_table_images
GRANT TO ("group:admin@example.com")
FILTER USING (ARRAY_LENGTH(metadata)=1 AND
              metadata[OFFSET(0)].name="face_detected")
```

Soon, Dataplex will support object tables, allowing you to automatically create object tables in BigQuery and to manage and govern unstructured data at scale.

Data sharing – You can now use Analytics Hub to share unstructured data with partners, customers, and suppliers without compromising on security and governance. Subscribers can consume the rows of object tables that are shared with them, and use signed URLs to access the unstructured data objects.

Getting started

Submit this form to try these new capabilities that unlock the power of your unstructured data in BigQuery.
Watch this demo to learn more about these new capabilities.

Special thanks to engineering leaders Amir Hormati, Justin Levandoski, and Yuri Volobuev for contributing to this post.
Source: Google Cloud Platform

EVO2CLOUD – Vodafone’s SAP migration from on-prem to Google Cloud

Editor’s note: Vodafone is migrating its SAP system, the backbone of its financial, procurement, and HR services, to Google Cloud. Vodafone’s SAP system has been running on-prem for 15 years, during which time it has grown significantly in size, making this one of the largest and most complex SAP migrations in EMEA. By integrating its cloud-hosted SAP system with its data ocean running on Google Cloud, Vodafone aims to improve operational efficiency and drive innovation.

Vodafone: from telco to tech-co

Vodafone, a leading telecommunications company in Europe and Africa, is accelerating its digital transformation from a telco to a tech-co that provides connectivity and digital services such as 5G, IoT, TV, and hosting platforms. Vodafone is partnering with Google Cloud to enable various elements of this transformation, from building one of the industry’s largest data oceans on Google Cloud to driving value from data insights and deploying AI/ML models.

One of Vodafone’s core initiatives is ‘EVO2CLOUD’, a strategic program to migrate its SAP workloads to Google Cloud. Vodafone uses SAP for its financial, procurement, and HR services; it’s the backbone of its internal and external operations. High availability and reliability are fundamental requirements to ensure smooth operation with minimal downtime. Moreover, hosting SAP on Google Cloud is a foundation for digital innovation and maintaining cybersecurity.

EVO2CLOUD: enabling SAP on Google Cloud

When complete, EVO2CLOUD will be one of the largest SAP-to-Google Cloud migrations. Over the course of two to three years, EVO2CLOUD will transform a broad SAP ecosystem of more than 100 applications that have been running on-prem for the past 15 years into a leaner, more agile, and scalable deployment that is cloud-first and data-led.
With EVO2CLOUD, Vodafone aims to improve operational efficiency, increase its Net Promoter Score (NPS), and maximize business value by incorporating SAP into its cloud and data ecosystem, introducing data analytics capabilities to the organization, and enabling future innovations. As such, EVO2CLOUD is providing standardized SAP solutions and facilitating the transition to a data-centric model that leverages real-time, reliable data to drive data-based corporate decision making.

SAP’s operating model on Google Cloud

Vodafone foresees a step change in its operating model, where it can leverage on-demand, highly performant, memory-optimized M1 and M2 infrastructure at a low cost. Thanks to infrastructure as code, this improved operating model will provide increased capacity, high availability, flexibility, and consistent enforcement of security rules. Vodafone is also reshaping its security architecture and leveraging the latest technologies to ensure privacy, data protection, and resilient threat detection mechanisms. Furthermore, it expects to increase its release frequency from bi-annual rollouts to weekly release cycles, increasing agility and delivering features faster. In short, Vodafone wants to build agility and flexibility into all that it does, from design all the way through delivery and operations, and DevSecOps will need to be an integral part of its operating model.

Leveraging data to drive innovation

Before migrating to Google Cloud, it was difficult for Vodafone to extract and make use of its SAP data. Now, with the transition to the cloud and with Google Cloud tools, it can expand how it uses its data for analytics and process mining. This includes operations and monitoring opportunities to map data with other external sources, e.g., combining HR data from SAP with other non-SAP data, resulting in data enrichment and additional business value.
Vodafone is continuing to explore opportunities with Google Cloud to identify even more ways to leverage its data.

Why Google Cloud and what’s next

Vodafone is not just rebuilding its system on Google Cloud; it sees this project as the first step in a three-phase transformation:

1. Redesigning the SAP environment and migrating to Google Cloud to make it ready for integration with Vodafone’s data ocean.
2. Integrating SAP with Vodafone’s data ocean, which sits on Google BigQuery.
3. Leveraging cloud-based data analytics tools to optimize data usage, processes, and how Vodafone operates its business.

Moving to Google Cloud is in line with Vodafone’s data-centric strategy, which aims to introduce enhanced data analytics and artificial intelligence features and to serve Vodafone’s employees and customers closer to real time.

Transformation and change management

The migration to Google Cloud is underway, with Vodafone, Google Cloud, SAP, and Accenture working together as one team to make this transformation a success. “An innovative and strategic initiative, co-shaped with a truly integrated partnership. A daily collaboration among four parties, Vodafone, Google, SAP and Accenture are executing the cloud transformation of a complex SAP estate within a compressed timeframe, for rapid benefits realization and accelerated innovations in the cloud.” – Antonio Leomanni, EVO2CLOUD program lead, Accenture

Vodafone recently celebrated the pilot’s go-live, an important milestone in this program. Change management has been fundamental to this transformation, incorporating learning and enablement, financial governance, lifecycle management, security, architecture reviews, and innovation.
By focusing on these disciplines, Vodafone and Google Cloud are ensuring the success of this transformation and strengthening their partnership.

Conclusion

The SAP migration aligns with Vodafone’s data strategy by enabling a step change toward operational efficiency and innovation through the integration of SAP with Vodafone’s data ocean. The keys to the success of this ongoing migration are:

- Clear migration requirements and objectives: infrastructure availability, security, and resilience
- Strong change management
- Application of the right technologies and tools

To learn more about how Google Cloud is advancing the telecommunications industry, visit us here.
Source: Google Cloud Platform

Fortress Vault on Google Cloud: Bringing private data to NFTs

Over the past two years, the general population has become more acquainted with cryptocurrencies and the first iterations of NFTs, which were among the earliest use cases for blockchain technology. This public awareness and participation has led to growing interest in, and demand for, Web3 technology at the enterprise level. But building trust in a new wave of technology, especially in large organizations, doesn’t happen overnight. That is why it’s critical for Web3 technologists to bring the broader benefits, use cases, and core capabilities of blockchain to the forefront of the conversation. If businesses don’t understand how this new technology can help them, how can they prioritize it among competing tech plans and resources? And without baseline protocols that account for privacy, confidential data, and IP, how can they future-proof a business?

Answering these questions and delivering trustworthy infrastructure is exactly why Scott Purcell and I founded Fortress Web3 Technologies: to bring about the next wave of Web3 utility. The company’s goal is to provide infrastructure that eliminates barriers to Web3 adoption with RESTful APIs and widgetized services that enable businesses to quickly launch and scale their Web3 initiatives. Our tools include embeddable wallets for NFTs and fungible rewards tokens; NFT minting engines; and core financial services, including payments, compliance, and crypto liquidity via our wholly-owned financial institution, Fortress Trust. Being overseen by a chartered, regulated entity ensures privacy, compliance, and business continuity.

Fortress chose Google Cloud to help usher in this new-wave technology because no other cloud provider is better suited to helping regulated industries get up to scale on our Web3 infrastructure and blockchain technology.
I’ll get into more specifics below, but at the highest level: IPFS (the current standard for distributed storage) is going to face major resistance in industries that are heavily regulated or deal in ownership rights. By leveraging Google Cloud, which holds critical certifications such as HIPAA, Department of Defense, ISO, and Motion Picture, we’re striking the appropriate balance between decentralization and centralization, using the best of both technologies. The Fortress Vault on Google Cloud is a huge and necessary step forward as the first NFT-database solution to protect intellectual property, confidential documents, and other electronic records. It is the first technology that marries privately stored content with the accessibility, privacy, portability, and provenance that blockchain provides.

Understanding Non-Fungible Tokens (NFTs)

An NFT is not an expensive JPEG. From a technical point of view, an NFT is a unique key stored in a distributed and trustless ledger we call a blockchain. This blockchain token is uniquely identifiable from any other token and acts as a digital key to authenticate ownership and unlock data held in a database. While different blockchains have adopted different standards, Ethereum’s standards are a good proxy for the overall concepts. Going back to the primitives: if you read the EIP-721 proposal, metadata is explicitly optional. While today’s NFT hype has indeed leveraged that technology to monetize and distribute digital art, the potential of blockchain lies in the ability to digitally represent ownership of a wide variety of asset classes on a decentralized ledger. Unique, non-fungible tokens are not a new concept. We use them every day in technical systems for things like authentication, database keys, idempotency, and much more.
Now, thanks to blockchain technology, you can take those tokens out of their walled gardens and onto an open platform, which can lead to transformational utility and applications. Take real estate, for example. Instead of a paper title documenting you as the owner of your home, imagine that the title is tokenized as an NFT on a blockchain. Any platform could cryptographically verify the authenticity of that form of title, along with its provenance, in real time, and confirm that you’re the rightful owner of that property. But perhaps you don’t want the title of your property visible to others, nor the associated permits, tax documents, architectural drawings, contractor lists, and other documents. Maybe you just want banks, insurance companies, and others to be able to confirm that you are indeed the owner without revealing the details of those records. The NFT metadata records immutable, public-facing provenance, while the underlying data remains private and protected using Fortress Vault on Google Cloud. Apply that same utility to other sensitive information, such as medical records, intellectual property, estate documents, corporate contracts, and other confidential information, and it’s easy to see why enterprises are just now exploring how to hold traditional assets as NFTs.

Fortress Vault: Intellectual Property, Confidential Documents, and Other Electronic Records

What NFTs and Web3 have been lacking is the ability to make the tokenized data accessible exclusively by the owner, and only the owner. NFTs are a digital key to unlock everything from music and event tickets, to real estate deeds and healthcare records, to estate documents, to everything in the world that’s digital. This is why we created the Fortress Vault.
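The real-estate example above can be sketched as a toy model (every name here is a hypothetical illustration, not Fortress’s actual design): a public ledger maps token IDs to owner addresses, so anyone can confirm ownership, while the private record is released only to a verified owner.

```python
from dataclasses import dataclass, field

@dataclass
class ToyVault:
    """Toy model only: a public ledger maps token IDs to owner addresses;
    the private documents stay off-chain in a vaulted store."""
    ledger: dict = field(default_factory=dict)   # token_id -> owner address (public)
    records: dict = field(default_factory=dict)  # token_id -> private document (vaulted)

    def verify_owner(self, token_id: str, address: str) -> bool:
        # Anyone (a bank, an insurer) can run this check against the
        # public ledger without ever seeing the private record.
        return self.ledger.get(token_id) == address

    def fetch_record(self, token_id: str, address: str):
        # Only the verified owner gets the underlying data.
        if not self.verify_owner(token_id, address):
            raise PermissionError("address does not own this token")
        return self.records[token_id]

vault = ToyVault()
vault.ledger["deed-42"] = "0xALICE"
vault.records["deed-42"] = {"parcel": "12 Elm St", "title": "..."}

print(vault.verify_owner("deed-42", "0xALICE"))  # True: ownership is publicly checkable
print(vault.fetch_record("deed-42", "0xALICE"))  # private record, owner only
```

In the real system the ledger lookup would be a blockchain query and the record fetch a signed, authenticated request to the Vault; the separation of public provenance from private data is the point of the pattern.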
When building it, we had to make a fundamental decision: either go with a distributed, permissionless storage protocol like IPFS, Filecoin, or other blockchain-based database offerings, or work with an industry-leading cloud platform that understands data integrity and is establishing itself as the leader in the space. Ultimately, we chose Google Cloud for its industry-leading object storage, professional management, fault tolerance, and myriad certifications for architecture and data integrity. Some of the challenges faced when vaulting a vast variety and quantity of digital content at scale include:

- Balancing data availability against the cost of storage
- Data redundancy
- Long-term archival needs
- Business continuity
- Flexibility to meet the current and future needs of the rapidly evolving Web3 industry

Google Cloud is the clear leader across all of these pain points. The object lifecycle management of Google Cloud Storage enables efficient transitions between storage classes when data matures to a certain point or is updated with newer files. Content in the Fortress Vault can range from on-demand data to long-term holdings, such as estate planning documents that won’t be accessed for 30 years. When storing NFT data, robust disaster recovery is table stakes. We quickly gravitated to the automatic redundancy options and multi-region storage buckets that let us customize where we store our data without massive devops and management overhead. By leveraging Google Cloud, we can offer industry-leading retention, redundancy, and integrity for our customers’ NFT content. Working with a leader in data storage was key to making this a reality. Additionally, Google Cloud shares our vision of bringing every industry forward into the world of Web3.
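As an illustration of the lifecycle transitions described above (the thresholds here are hypothetical examples, not Fortress’s actual policy), a Cloud Storage bucket lifecycle configuration ages content into colder, cheaper storage classes automatically. The configuration is just structured JSON; expressed as a Python dict:

```python
# Hypothetical lifecycle policy: age rarely-accessed vault content into
# colder (cheaper) Cloud Storage classes. This dict mirrors the shape of
# the "lifecycle" field accepted by the Cloud Storage buckets API.
lifecycle = {
    "rule": [
        {   # After 30 days without being replaced, move to Nearline.
            "action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
            "condition": {"age": 30},
        },
        {   # After a year, move to Archive for long-term holdings
            # (e.g., estate documents untouched for decades).
            "action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
            "condition": {"age": 365},
        },
    ]
}

print(len(lifecycle["rule"]))  # 2
```

Once attached to a bucket, rules like these run without any application code, which is what removes the devops overhead mentioned above.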
We are both focused on building the critical infrastructure that allows everyone from Web3-native companies to Fortune 500 brands to navigate the strategic shift to blockchain technology.

Why Web3 Matters

“Web3” is shorthand for the “third wave” of the internet and the technological innovation that brought us here. Web 1, the earliest internet, democratized reading and access to information, opening the doors to mass communication. Web 2 expanded on that with the ability to read and “write”: it democratized publishing by letting people directly engage in producing information through blogs, social media, gaming, and contributions to collective knowledge. Web 3 expands our technological capabilities even further with the ability to read, write, and “own.” With blockchain, we can now establish clear provenance, with visibility into the origination of ownership of any tokenized asset, and we can see the chain of ownership. We can rely on this next-generation technology to track, authenticate, protect, and keep a ledger of our assets.

With the Fortress Vault on Google Cloud, we can ensure the integrity of non-public data while making it accessible via NFTs. This is a game changer for Web3 adoption, particularly in industries like music, event ticketing, gaming, finance, transportation, real estate, and healthcare. Every industry can benefit from the ability to tokenize assets on blockchain technology without leaving the trusted safety of Google Cloud data storage. The market for NFTs is everyone. And the Fortress Vault on Google Cloud is the technology evolution that makes it possible for Web3 innovators to confidently build, launch, and scale their initiatives across every industry imaginable.
Source: Google Cloud Platform

Reliable peering to access Google Cloud

Peering is often seen as a complex and nuanced topic, particularly for some of our cloud customers. Today we’d like to demystify peering’s inner workings and share how a peering policy update that requires local redundancy helps improve reliability for our users and customers. Redundancy is a well-understood and well-documented way to improve reliability. We have talked previously about how our significant investments in infrastructure and peering enable our internet content to reach users, and about how we are making our peering more secure.

Google Cloud on the internet

Every day, Google Cloud customers collaborate with colleagues using Workspace, leverage Google Cloud CDN to serve content to users worldwide, or deploy a global Cloud Load Balancer to take advantage of our anycast IPs. Each use case has one thing in common: these and many other Google products rely on peering to connect Google’s global network to ISPs worldwide to reach their destination, users like you and me.

Peering delivers internet traffic

Peering is the physical fiber interconnection between networks, such as between Google and your Internet Service Provider (ISP), or between Google and cloud customers, at various facilities all around the world. Its purpose is to exchange public internet traffic between networks while optimizing for cost and performance. Google has built out our network to over 100 facilities worldwide to peer with networks both large and small. This is how Google provides a great experience for all of our users and reduces costs for ISPs, and it is one of several ways our cloud customers can connect to the Google network. One other common way enterprises connect to Google Cloud, often confused with peering, is Dedicated Interconnect, which offers private connectivity between your on-premises environment and Google Cloud.
Think of peering like part of a city water system, where the pipes are the fiber optic cables and the water is the bits of data coming to your phone, computer, or data center. Just as your city’s water system needs to interconnect with your house’s plumbing, Google’s global network needs to interconnect with your neighborhood ISP to deliver all types of Google traffic. The water flowing out of your sink faucet is analogous to being able to use Google services on your home Wi-Fi.

Peering infrastructure

Thousands of networks, including Google, peer with each other all over the world every day. Networks that peer mutually agree on locations and capacity to address traffic demand, cost, and performance. Since there are so many networks worldwide, it is not practical for every network to peer with every other, so most networks retain some form of IP transit that allows their users to reach the entirety of the internet. Essentially, IP transit is a paid service that lets a network ‘transit’ another, well-connected network to reach the entirety of the internet. Transit also acts as a failover path when a peering connection is unavailable, and it plays an important role in ensuring the universal reachability of every endpoint on the internet. One potential downside to transit is that traffic may traverse an indirect and costly path to reach an end user, which can decrease performance compared to peering. Google’s preference is to deliver all traffic over the most optimal peering paths to maximize performance.

When peering goes down

With any type of physical infrastructure, components can malfunction or need to be taken out of service for maintenance. The same is true for the infrastructure that supports peering. Downtime can sometimes last days or weeks, depending on the cause and the time to repair. During downtime, internet traffic to and from Google gets rerouted to failover paths.
Sometimes these paths are another peering location in the same city; sometimes traffic is rerouted hundreds or thousands of miles away to peering in a different city or even country, and in some cases to an IP transit connection if no other peering connection is available. Much of this depends upon how and where a network is peered with Google. The further the traffic is physically rerouted from the intended peering connection, and if any IP transit connections are in the traffic path, the higher the likelihood of increased latency, packet loss, or jitter, all of which can translate into a frustrating or poor user experience.

A deep and diverse peering footprint

Over many years we have built our peering with ISPs and cloud customers to be both physically redundant and locationally diverse to ensure an optimal user experience for all Google services. This translates to a deep and diverse peering interconnection footprint with networks and customers around the world. As Google Cloud services like Premium Network Tier, Cloud VPN, and Workspace use peering to reach their end users, this type of planning helps avoid the user experience issues mentioned above.

A more stable and predictable peering interconnect

To help achieve our goal of a reliable experience for all Google users, we have recently updated our peering policy to require physical redundancy on all Google private peering connections within the same metropolitan area. This update will allow Google and ISPs to continue to exchange traffic locally during peering infrastructure outages and maintenance under most circumstances. For our customers and users this means more predictable traffic flows, consistent and stable latency, and a higher effective availability of peering, providing an overall more predictable experience with Google services while still offering cost savings to ISPs.
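The failover ordering described above can be modeled as a simple preference ranking: local peering first, remote peering next, IP transit as a last resort. The sketch below is purely illustrative, assuming made-up path kinds and latency numbers; it is not Google's actual routing or BGP policy.

```python
# Illustrative model of peering failover preference; not Google's real
# routing logic. Path kinds and latency figures are hypothetical.
PREFERENCE = {"local_peering": 0, "remote_peering": 1, "transit": 2}

def pick_egress_path(paths):
    """Return the most preferred path that is currently up, or None.

    Each path is a dict: {"kind": ..., "up": bool, "latency_ms": float}.
    Lower preference rank wins; ties are broken by lower latency.
    """
    candidates = [p for p in paths if p["up"]]
    if not candidates:
        return None
    return min(candidates, key=lambda p: (PREFERENCE[p["kind"]], p["latency_ms"]))
```

With a redundant link in the same metro, losing the primary still keeps traffic on a local peering path; without one, traffic falls back to remote peering or transit, with correspondingly higher latency and loss risk.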
There are a multitude of factors that can influence the performance of an application on the internet; however, this change is designed so that outages and maintenance on our peering infrastructure will be less noticeable and less impactful. You can read more details about the change on our peering page.

Fig A – Two examples of metropolitan area peering redundancy. A redundant peering link (green) in the same metropolitan area helps keep traffic local during peering infrastructure maintenance or outages.

Working with our peering partners and customers

We are working closely with our existing and new Google Cloud customers and ISP peers to ensure we build out locally redundant peering interconnects. We also know that many networks face challenges building this configuration, so we are identifying ways to work with them. We encourage Google Cloud customers and any interested ISPs to contact us to review their redundancy topology with Google, and to also review our best peering practices. To learn more about peering and to request peering with Google, please visit our Peering website.
Source: Google Cloud Platform

Manage storage costs by automatically deleting expired data using Firestore Time-To-Live (TTL)

We are thrilled to announce that we have added time-to-live (TTL) support for both Firestore Native and Datastore mode!

Use Firestore TTL policies to remove out-of-date data from the database. You can think of it as a managed deletion mechanism built into Firestore. Once documents or entities are considered expired, they become eligible for deletion. As with direct DELETE operations, external services (e.g., Cloud Functions triggers) are notified of each deletion event.

Common use cases

Garbage collection. TTL can be handy if your documents have a well-defined lifecycle.
Supporting time-relevant features natively. You can rely on TTLs if you want to build features that depend on ephemeral data.
Security and privacy compliance. Some regulations require that data be retained no longer than a certain period. Because you can configure different expiration times at the document level, TTL can help you meet requirements from varying sources.

Example walkthrough

Sounds like a good candidate for your application? Let’s walk through an example to see how it works from end to end. The example below uses documents and collections, but it works similarly for entities and kinds.

Assume you have a database that saves lots of documents in the collection Chats, and some of them will become useless at some point in the future. First, you need to decide on a field to use as the TTL field, and that field must contain a timestamp value. For example, you can choose to designate the expireAt field as the TTL field, even if your documents don’t contain values for this field yet. There are two ways of configuring TTL policies:

Use the gcloud CLI. You can find some sample commands to view and modify TTL policies.
Use the Google Cloud Console. You can navigate to the Firestore Time-to-live configuration page to configure a new policy.

Now that you have configured TTL policies, the documents should be updated with the TTL field if they aren’t already.
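As a minimal sketch of the client-side logic, the snippet below computes an expireAt timestamp for a new document and checks whether a document has passed its expiration. The field name expireAt follows the walkthrough; the helper functions themselves are illustrative, not part of the Firestore SDK, and Firestore performs the actual deletion for you.

```python
from datetime import datetime, timedelta, timezone

# Illustrative helpers only: the TTL field name "expireAt" follows the
# walkthrough, and Firestore itself performs the eventual deletion.
def with_expiry(doc, ttl, now=None):
    """Return a copy of the document carrying an expireAt timestamp ttl from now."""
    now = now or datetime.now(timezone.utc)
    return {**doc, "expireAt": now + ttl}

def is_expired(doc, now=None):
    """A document whose expireAt timestamp has passed is eligible for deletion."""
    now = now or datetime.now(timezone.utc)
    ts = doc.get("expireAt")
    return ts is not None and ts <= now
```

A chat message written with with_expiry(msg, timedelta(days=7)) would become eligible for deletion one week later, with no cleanup job to write.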
In this case it is expireAt that serves as the TTL field. That’s everything you need to do. Once a document expires, it’s eligible for deletion, and Firestore will perform the deletion on your behalf. Want to learn more? Check out the documentation, and happy databasing!

Special thanks to Minh Nguyen, Lead Product Manager for Firestore, and Joseph Batchik, Software Engineer for Firestore, for contributing to this post.

Best kept security secrets: How Cloud EKM can help resolve the cloud trust paradox

Whether driven by government policy, industry regulation, or geopolitical considerations, the evolution of cloud computing has led organizations to want even more control over their data and more transparency from their cloud services. At Google Cloud, one of the best tools for achieving that level of control and transparency is a bit of technological magic we call Cloud External Key Manager (EKM).

Cloud EKM can help you protect your cloud data at rest with encryption keys that are stored and managed in a third-party key management system outside Google Cloud’s infrastructure, and ultimately outside Google’s control. This can help you achieve full separation between your encryption keys and your data stored in the cloud. Cloud EKM works with symmetric and asymmetric encryption keys, and offers organization policies that allow fine-grained control over what types of keys are used. Through Key Access Justifications (KAJ), it also offers customers a way to control each use of a key.

At their core, many cloud security and cloud computing discussions are about the kinds of trust that Cloud EKM specifically, and encryption more broadly, can help create. While the concept of digital trust is much bigger than cybersecurity and its tripartite components of security, privacy, and compliance, one of the most crucial themes of cloud computing is the cloud trust paradox: in order to trust the cloud more, you must be able to trust it less. External control of keys and their use can help reduce concerns over unauthorized access to sensitive data.

How it works

As described in our Cloud EKM documentation, you can use keys that you manage within a supported external key management partner to protect data within Google Cloud. You can protect data at rest in services that support CMEK, or by calling the Cloud Key Management Service API directly. Cloud EKM provides several benefits:

Key provenance: You control the location and distribution of your externally-managed keys.
Externally-managed keys are never cached or stored within Google Cloud, and Google cannot see them. Instead, Cloud EKM communicates directly with the external key management system for each request.

Access control: You manage access to your externally-managed keys. Before you can use an externally-managed key to encrypt or decrypt data in Google Cloud, you must grant the Google Cloud project access to use the key. You can revoke this access at any time.

Centralized key management: You can manage your keys and access policies from a single location and user interface, whether the data they protect resides in the cloud or on your premises. The system that manages the keys is entirely outside Google’s control.

In all cases, the key resides on the external system and is never sent to Google. Here’s how it works:

1. Create or use an existing key in a supported external key management partner system. This key has a unique URI.
2. Grant your Google Cloud project access to use the key, in the external key management partner system.
3. Create a Cloud EKM key in your Google Cloud project, using the URI for the externally-managed key.

The Cloud EKM key and the external key management partner key work together to protect your data. The external key is never exposed to Google and cannot be accessed by Google employees. Furthermore, Cloud EKM can be combined with Key Access Justifications (KAJ) to establish cryptographic control over data access. KAJ with Cloud EKM can give customers the ability to deny Google Cloud administrators access to their data at rest for any reason, even in situations typically exempted from customer control, such as outages or responses to third-party data requests. KAJ does this by providing customers a clear reason why data is being decrypted, which they can use to programmatically decide whether to permit decryption and thus allow access to their data.
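To make that flow concrete, here is a minimal, hypothetical sketch. The ExternalKeyManager class stands in for a partner key management system: the raw key never leaves it, and every decryption request must carry a justification the key owner has approved. The toy XOR cipher and the justification strings are illustrative stand-ins, not the real Cloud EKM protocol or any partner's API.

```python
import secrets

class ExternalKeyManager:
    """Hypothetical stand-in for a partner KMS; not a real Cloud EKM client.

    The raw key stays inside this object, mirroring how an external key is
    never sent to Google. Unwrapping requires an approved justification,
    mirroring how KAJ lets customers programmatically permit or deny access.
    """

    def __init__(self, allowed_justifications):
        self._key = secrets.token_bytes(32)  # never leaves the external system
        self._allowed = set(allowed_justifications)

    def _xor(self, data):
        # Toy XOR stream "cipher" for illustration only; a real partner KMS
        # would use an authenticated cipher such as AES-GCM.
        stream = (self._key * (len(data) // len(self._key) + 1))[: len(data)]
        return bytes(a ^ b for a, b in zip(data, stream))

    def wrap(self, plaintext):
        return self._xor(plaintext)

    def unwrap(self, ciphertext, justification):
        # The KAJ-style policy check: deny decryption for unapproved reasons.
        if justification not in self._allowed:
            raise PermissionError("access denied for justification: " + justification)
        return self._xor(ciphertext)
```

In this model, the cloud side holds only wrapped bytes; a decryption attempt with an unapproved justification, say during a third-party data request, fails at the external system, so the data stays sealed.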
Previously, we’ve discussed three patterns where keeping the keys off the cloud may in fact be truly necessary, or may outweigh the benefits of cloud-based key management. Here’s a brief summary of those three scenarios where Cloud EKM can help solve these Hold Your Own Key dilemmas.

Scenario 1: The last data to go to the cloud

As organizations complete their digital transformations by migrating data processing workloads to the cloud, there is often a pool of data that cannot be moved to the cloud. Perhaps it’s the most sensitive data, the most regulated data, or the data with the toughest internal security control requirements. Finance, healthcare, manufacturing, and other heavily-regulated organizations face myriad risk, compliance, and policy reasons that may make it challenging to send some of their data to a public cloud provider. However, the organization may be willing to migrate this data set to the cloud as long as it is encrypted and they have sole possession of the encryption keys.

Scenario 2: Regional regulations and concerns

Regional requirements are playing a larger role in how organizations migrate to and operate workloads in the public cloud. Some organizations are already facing situations where they are based in one country and want to use a cloud provider based in a different country, but they aren’t comfortable giving, or legally allowed to give, the provider access to encryption keys for their stored data. Here the situations are more varied, and can include an organization’s desire to stay ahead of evolving regulatory demands or industry-specific mandates. Ultimately, this scenario allows organizations to utilize Google Cloud while keeping their encryption keys in the location of their choice, and under their physical and administrative control.

Scenario 3: Centralized encryption key control

The focus here is on operational efficiency.
Keeping all the keys within one system to cover multiple cloud and on-premises environments can help reduce overhead and attack surface, thus helping to improve security. As Gartner researchers concluded in their report, “Develop an Enterprisewide Encryption Key Management Strategy or Lose the Data,”1 organizations are motivated to reduce the number of key management tools. “By minimizing the number of third-party encryption solutions being deployed within an environment, organizations can focus on establishing a cryptographic center of excellence,” Gartner researchers said.

Given that few organizations are 100% cloud-based today for workloads that require encryption, keeping keys on-premises can streamline key management. Centralizing key management can give the cloud user a central location to enforce policies around access to keys and access to data at rest, while a single set of keys can help reduce management complexity. A properly implemented system with adequate security and redundancy outweighs the need to have multiple systems.

Do I need Cloud EKM?

Whether protecting highly sensitive data, retaining key control to address geopolitical and regional concerns, or supporting hybrid and multicloud architectures, Cloud EKM is best suited for those Google Cloud customers who must keep their encryption keys off of the cloud and always under their full control. To learn more about Cloud EKM, please review these resources:

Our research explaining why Google Cloud users can benefit from Cloud EKM
The most recent updates to Cloud EKM
Take a deeper dive into the cloud trust paradox

1. Gartner, Develop an Enterprisewide Encryption Key Management Strategy or Lose the Data, David Mahdi, Brian Lowans, March 2022.

All 123 things we announced at Google Cloud Next ‘22

We loved hosting Google Cloud Next ‘22 this week in cities around the world and are excited to share our favorite moments and announcements. We kicked off our 24-hour livestream broadcast with an opening keynote in New York City, then moved west to share our “Top 10 Cloud Predictions” developer keynote from the Google Cloud headquarters in Sunnyvale, California. Next ‘22 then crossed the Pacific to Tokyo, Japan, then down to Bengaluru, India, and finished out in Munich, Germany. Thank you to the thousands of developers who joined our global Innovators Hive events, and be sure to check out all the breakout sessions.

There’s too much goodness for anyone to catch it all, so here are some video highlights before you dig in below for all the details.

Missed the opening keynote? Here’s a 13-minute highlight reel.
Curious about our Top 10 Predictions? See the two-minute speed round.
Want to go deeper? Here’s a full recap of all 123 (!) announcements we made this week, all in one place.

Open Infrastructure Cloud

Enterprise architects and developers have a big job: to help your company innovate faster, while at the same time working with resources that may not be growing as fast. In the session What’s next for enterprise architects and developers, Sachin Gupta, VP & GM, Infrastructure at Google Cloud, took us on a tour of new enhancements to the infrastructure portfolio that can help accelerate your innovation and improve your TCO. Here’s the complete list.

Google Cloud regions

We introduced five new Google Cloud regions:
1. Austria
2. Greece
3. Norway
4. South Africa
5. Sweden

Compute

6. The new C3 Compute Engine virtual machine family is powered by 4th Generation Intel® Xeon® Scalable processors and features Google’s custom Intel Infrastructure Processing Unit (IPU).
7. Cloud TPU v4 Pods, Google’s custom ML infrastructure for training large-scale, state-of-the-art ML models with high price-performance and carbon efficiency, are now GA.
8. The A2 Ultra GPU, now GA, is powered by the NVIDIA A100 Tensor Core GPU with 80 GB of GPU memory, and delivers 25% higher throughput on inference and 2x higher performance on HPC simulations than the original A2 machine shapes.
9. New Flexible Committed Use Discounts (Flex CUDs) can help you save up to 46% off on-demand Compute Engine pricing, in exchange for a one- or three-year commitment.
10. Batch, now GA, is a fully managed service that helps you run batch jobs easily, reliably, and at scale.

Networking

11. L7 for Private Service Connect provides consumer-controlled security, routing, and telemetry.
12. Private Service Connect over interconnect provides support for on-prem traffic through Cloud Interconnects to Private Service Connect endpoints.
13. Private Service Connect for hybrid environments lets producers and consumers securely connect and access managed services from cloud or on-prem.
14. There are five new managed services partners for Private Service Connect: Confluent, Databricks, Datastax, Grafana, and Neo4J.
15. The new C3 virtual machine family features 200 Gbps networking, offering 2x the bandwidth of the C2 family, and line-rate encryption.
16. Network Function Optimizer delivers enhanced networking capabilities that allow customers to connect multiple container network functions, apply labels for selection, and steer traffic to them.
17. Dynamic compression for Cloud CDN is GA, and reduces the size of responses transferred from the edge to a client, accelerating page load times and reducing egress traffic.
18. Media CDN supports the Live Stream API to ingest and package source content into HTTP Live Streaming and DASH formats for optimized live streaming.
19. Dynamic Ad Insertion with Google Ad Manager for Media CDN provides customized video ad placements.
20. Media CDN’s new third-party ad insertion capability is based on the Video Stitcher API.
21. Network Actions for Media CDN, in private Preview, is a fully managed serverless solution based on open-source WebAssembly, and unlocks custom use cases such as security controls, cache offload, custom logs, and more.
22. Cloud Firewall Standard is a new tier for Cloud Firewall that offers expanded policies via objects for firewall rules that simplify configuration and micro-segmentation.
23. Cloud Firewall Essentials, Cloud Firewall’s new foundational tier, includes recent support for Global and Regional Network Firewall Policies, and IAM-governed Tags. Learn more about the Cloud Firewall family in MOD107.
24. Google Cloud Armor now supports ML-based Adaptive Protection to automatically deploy its proposed rules.
25. Cloud Armor gained enhanced tuning for preconfigured WAF rules, adding field exclusion, signature opt-in, and expanded JSON content type support.
26. Preconfigured WAF rules for OWASP Top 10 web-app vulnerability risks are now GA.
27. Google Cloud Armor was named a Strong Performer in The Forrester Wave™: Web Application Firewalls, Q3 2022.
28. Network Analyzer for Network Intelligence Center automatically monitors VPC network configurations and detects misconfigurations and suboptimal configurations, and is now GA.
29. Network Intelligence Center is integrated with the Recommender API.
30. The Performance Dashboard for Network Intelligence Center provides visibility into the performance of the entire Google Cloud network and into your project’s resources.
31. Network Intelligence Center Network Topology has been enhanced with a new “top talkers” view.
32. Firewall Insights for Network Intelligence Center includes enhancements such as IPv6 rule coverage and a custom insight refresh cycle to generate shadowed rule insights for projects.

You can learn more about all our networking announcements in our blog post, and in the session Simplify and secure your network for all workloads.

Hybrid and multicloud

33. A new user interface in Anthos provides simplified cluster configuration.
34. New fleet management capabilities in Anthos let you manage growing fleets of container clusters across clouds, on-premises, and at the edge, and for different use cases.
35. Anthos clusters in retail edge environments now support virtual machines (GA). Learn more about Anthos in session MOD208.
36. Google Distributed Cloud Edge GPU-Optimized Config is now GA in server-rack form factors powered by 12 NVIDIA T4 GPUs. Learn more in session MOD207.

Developer productivity

37. A new Google Cloud Skills Boost annual subscription includes Innovators Plus developer benefits for $299/year.
38. Cloud Deploy supports continuous deployment directly to Cloud Run, with one-click approvals and rollbacks, enterprise security and audit, and built-in delivery metrics. Learn more in session BLD203.
39. New Cloud Run Integrations tie in Google Cloud services with a single click, for example, configuring domains with a Load Balancer or connecting to a Redis cache.
40. Cloud Run customized health checks offer user-defined, container-level HTTP and TCP startup probes.
41. A new workshop helps you discover how to unlock efficiency and innovation with GKE Autopilot.
42. Google has joined the Eclipse Adoptium Working Group, a consortium of leaders in the Java community working to promote a higher-quality, developer-centric standard for Java distributions, and will contribute to and commercially support the Adoptium Temurin JDK.

Developer security

43. Software Delivery Shield is a solution for improving the security of all the code, people, systems, and processes that contribute to the development and delivery of your software supply chain. Learn more in the blog, and at session SEC100.
44. The new Cloud Workstations provides fully managed development environments on Google Cloud, and is a part of Software Delivery Shield.
45. A partnership with JetBrains offers fully managed JetBrains IDEs as part of Cloud Workstations.
46. Source Protect for Cloud Code gives developers real-time security feedback as they work in their IDEs, identifying issues such as vulnerable dependencies and license reporting.
47. Assured Open Source Software now includes 250 packages across Java and Python that are all regularly scanned, analyzed, and fuzz-tested for vulnerabilities.
48. Container Analysis now includes on-push vulnerability scanning of Maven and Go containers and non-containerized Maven packages.
49. Container Analysis can now automatically generate a Software Bill of Materials (SBOM).
50. Cloud Build officially supports SLSA Level 3 builds.
51. There’s authenticated and non-falsifiable build provenance in Cloud Build for both containerized applications and non-containerized Maven and Python packages.
52. Cloud Build can display security insights for built applications.
53. Google Kubernetes Engine’s (GKE) security posture dashboard provides detailed assessments, severity ratings, and advice on the security posture of your clusters and workloads, including insights into OS vulnerabilities and workload configurations.

Management and migration tools

54. Migration Center brings assessment, planning, migration, and modernization tooling into a centralized location.
55. Dual Run for Google Cloud allows customers to simultaneously run mainframe workloads on existing mainframes and on Google Cloud, to perform real-time testing before promoting the new Google Cloud environment as their system of record. Learn more here.
56. Workload Manager, now in Preview for SAP workloads, is a Compute Engine service that provides automated analysis of enterprise systems on Google Cloud.
57. Google Cloud Carbon Footprint, now GA, provides granular emissions data for cloud workloads and transparency into the energy scores of Google Cloud regions.
58. Active Assist carbon emissions estimates are GA, making it simpler to remove unattended projects.
Data CloudGoogle’s Data Cloud can transform your decision making and turn data into action by operationalizing data analytics and AI. In the session What’s next for data analysts and data scientists, June Yang, VP, Cloud AI and Industry Solutions and Sudhir Sampatrao Hasbe, Sr. Director, Product Management, discussed the latest data analytics and AI innovations. And in What’s next for data engineers, Andi Gutmans, VP & GM for Databases talked about product innovations across Spanner, AlloyDB, Cloud SQL and BigQuery. Here’s the breakdown. AI and ML59. Translation Hub is a fully managed, self-serve AI Agent that lets localization managers and other employees translate content into 135 languages at the click of a button, helping promote more inclusive, impactful communication while also cutting costs and hyperscaling content.60. DocAI Warehouse leverages AI to help businesses store, organize, search, govern, and manage documents and their extracted data and metadata. 61. The new Document AI Workbench feature lets users extract data from any document by creating business-specific custom document parsers.62. The Vertex AI Vision service can make powerful computer vision and image recognition AI more accessible to data practitioners.63. OpenXLA Project is a consortium that includes Amazon Web Services, AMD, Arm, Google, Intel, Meta, NVIDIA, and more, and whose projects will accelerate machine learning by addressing incompatibilities between frameworks and hardware. Data analytics64. BigQuery now lets you analyze unstructured and streaming data, such as raw documents and PDFs, video and audio, even call center logs—as much as 90% of all data is considered unstructured.65. BigLake now supports a trio of popular data formats: Apache Iceberg, the Linux Foundation’s Delta Lake, and — coming soon — Apache Hudi.66. 
The integration of BigQuery with Apache Spark lets data practitioners create procedures in BigQuery with Spark that can integrate with their SQL pipelines, greatly speeding up and enhancing processing times. 67. System Insights now includes Cloud SQL security and performance recommenders.68. Updates to Dataplex will automate common processes related to data quality and data lineage, cutting down on manual work to clean up data and enhancing accuracy overall.69. We expanded integrations to our data cloud products with several popular enterprise data platforms, including Collibra, Databricks, Elastic, Fivetran, MongoDB, Sisu Data, Reltio, and Striim.Business intelligence70. Google Cloud’s business intelligence family is now consolidated under the Looker umbrella, and Data Studio is now Looker Studio, available at no cost.71. You can now access Looker data models from Looker Studio. 72. A new Looker Studio Pro offers new enterprise management features, team collaboration capabilities, and SLAs. 73. Looker (Google Cloud core) is available in the Google Cloud console and is integrated with core cloud infrastructure services, such as key security and management services. 74. Enhancements to Looker, BigQuery, and Microsoft Power BI make it easier for Tableau and Microsoft customers to analyze trusted data from Looker and simply connect it with BigQuery.75. Looker will be integrated into many of your favorite Google Workspace programs — starting with Google Sheets — combining our productivity and intelligence tools in one place. 76. A new partnership with Sisu Data provides easy access to Sisu capabilities from inside Looker and BigQuery, finding root causes 80% faster than traditional approaches. Learn more about our business intelligence innovations here or in the session, Bringing together a complete, unified BI platform with Looker and Data Studio. Databases77. 
Cloud Bigtable change streams allows you to track writes, updates, and deletes to Bigtable databases and replicate them to downstream systems such as BigQuery. 78. The AlloyDB partner ecosystem now includes more than 30 partner solutions to support business intelligence, analytics, data governance, observability, and system integration.79. A Spanner PostgreSQL interface now supports its first group of PostgreSQL ecosystem drivers, starting with Java (JDBC) and Go (pgx). 80. By integrating Vertex AI with Spanner, you can use a simple SQL statement in Spanner to call a machine learning model in Vertex AI.81. For applications using a Firestore backend, we’ve removed the limits for write throughput and concurrent active connections. 82. The new Firestore COUNT() function lets you perform cost-efficient, scalable, count aggregations. 83. Support for Time-to-live (TTL) in Firestore lets you pre-specify when documents should expire, and rely on Firestore to automatically delete expired documents.Trusted CloudThe changing threat landscape requires a ground-up security transformation. Sandra Joyce, EVP, Intelligence & Government Affairs at Mandiant, and Sunil Potti, VP & GM, Cloud Security for Google Cloud, discussed Google Cloud’s security vision and our latest product innovations in the session What’s next for security professionals, including:84. The new Chronicle Security Operations is a modern, cloud-native suite that can better enable cybersecurity teams to detect, investigate, and respond to threats. 85. Confidential Space allows multiple parties to securely collaborate, boosted by a trust guarantee that their data stays protected from their partners.86. A Google Cloud portfolio of solutions helps customers address their digital sovereignty concerns.87. We’ll be integrating the groundbreaking technology of Foreseeti Security, a startup focused on attack simulation and risk quantification, into Security Command Center in Q4. 
It can help you apply targeted remediations before attackers can take advantage of high-risk vulnerabilities.88. To help organizations better manage risks in their online channels, reCAPTCHA Enterprise and Signifyd will partner to bring to market a joint anti-fraud and abuse solution. 89. Palo Alto Networks customers can now pair Prisma Access with BeyondCorp Enterprise Essentials to help secure private and SaaS app access while mitigating internet threats across managed and unmanaged devices with a secure enterprise browsing experience. 90. We packaged our best practices and implementation experience for customers in Zero Trust Advisory solutions, and our Cybersecurity Action Team and select partners can help guide you through the Zero Trust journey with exploratory workshops, architecture reviews, customized recommendations, and implementation support. Collaboration CloudIn the session What’s next in productivity and hybrid work, Aparna Pappu, VP & GM for Google Workspace and Ilyn Brown, VP, Product Management for Google Workspace made a wealth of announcements focused on helping organizations thrive in a hybrid world. This included our investments in immersive connections, our approach to bringing people closer together through our communication products, smart canvas, our next-generation collaboration experience, and enhancing our cloud-first security model to help people work safer.Read about all of these Workspace announcements here.Google Meet, Chat, and Voice91. Adaptive framing gives everyone a chance to be seen in the conference room when collaborating with remote colleagues, using AI-powered cameras from Huddly and Logitech.92. Meeting room check-ins lets participants know who is in the room by displaying their names alongside the room.93. Companion mode mobile gives in-room attendees the ability to fully participate by raising their hand, chatting, or asking questions from their phone while leveraging the in-room audio and video. 94. 
Assigning conference rooms to breakout rooms helps manage the logistics for in-room attendees during breakout discussions. 95. Automatic video framing centers participants in their video tile before joining a meeting and lets them manually reframe at any time. 96. Auto transcriptions removes the burden of taking notes (English, with French, German, Portuguese and Spanish coming in 2023).97. Speaker spotlight in Slides collapses the boundary between the story and storyteller in a hybrid world by placing the speaker’s video directly within their content. 98. Custom emojis and inline threaded conversations enable people to express themselves more authentically and go deeper on specific conversations. 99. Broadcast-only spaces in Google Chat make it easier for leaders to make broad announcements and maintain connections across their organizations. 100. SIP Link in Google Voice allows companies to assign and manage phone numbers provided by their telecommunication provider alongside Google-provisioned numbers. 101. APIs for Meet and Chat give developers programmatic access to common functions like creating and starting meetings or initiating messages directly from a third-party app. Asana and LumApps will be the first partners to leverage these in their apps. 102. Meet add-on SDK enables developers to embed their app directly into the Meet experience, with Figma being one of the first add-on partners. 103. Chat and AppSheet integration enable people to create and interact with custom AppSheet apps right within Chat. Our low-code and no-code platform lets anyone without coding experience build mobile and web applications quickly.Smart canvas104. Custom building blocks in Google Docs enable users to build their own reusable components that can be easily accessed with the @ menu. 105. 
Variables in Google Docs enable users to define common data elements in a Doc, such as a client name or contract number, and update them throughout the document by changing the value in one place.
106. Smart chips and a new timeline view in Google Sheets extend the power of smart canvas, allowing people to easily pull in people, files, and calendar details.
107. Smart chip data extraction lets users quickly populate spreadsheets with important information from chips they use across Workspace.
108. Smart chips for third-party apps let users view and engage with rich third-party data in the flow of work rather than switching tabs or context, with AODocs, Asana, Atlassian, Figma, LumApps, Miro, Tableau, and ZenDesk coming soon.

Work safer

109. We’re extending client-side encryption (CSE) to Gmail and Google Calendar, allowing Enterprise Plus and Education Plus/Standard customers to have complete control over access to their data to address a broad range of data sovereignty and compliance requirements.
110. Data Loss Prevention (DLP) checks in Google Chat let admins create custom policies to prevent sensitive information from leaking, scan content in real time, and apply corrective action fast.
111. Trust rules in Drive, currently in beta, allow for more granular control of internal and external sharing, providing admins more flexibility in establishing collaboration boundaries.

Customers and partners

We wouldn’t be here today without our customers and partners. Throughout the keynotes and breakout sessions, they joined us on stage to discuss how they are using Google Cloud technologies to transform how organizations do business. The following organizations announced new or expanded relationships with Google Cloud:

112. Coinbase, a leading crypto exchange, will move to Google Cloud’s infrastructure, and certain Google Cloud customers will now be able to pay for cloud services with cryptocurrencies using Coinbase Commerce.
113.
The Australian Securities Exchange (ASX) has selected Google Cloud as its preferred cloud partner to build its data product innovation strategy.
114. Multinational insurance company Prudential plc and Google Cloud announced a strategic partnership to enhance overall health and financial inclusion for communities across Asia and Africa.
115. Rite Aid will rely on Google Cloud to help it realize its vision of a modern pharmacy, building new, personalized experiences that allow Rite Aid pharmacists to spend more time engaging with customers.
116. Snap and Google Cloud will expand their ten-year partnership to power the next phase of Snap’s growth, with a focus on infrastructure, big data and analytics solutions, and AI/ML.
117. Toyota announced the launch of Speech On-Device for select vehicles powered by Google Cloud technology, which will enable voice requests to be served directly by vehicles’ multimedia system processors, without the need for internet connectivity.
118. Together, T-Mobile and Google Cloud will help to improve the wireless provider’s customer experience services using our expertise in data analytics, AI, and ML, and our extensive portfolio of leading 5G and edge computing solutions.
119. African e-commerce company Twiga Foods is working with Google Cloud to run an efficient food value chain that connects farmers directly with vendors.
120. Wayfair has completed a full migration of its data center applications and services to the cloud, with Google Cloud as the foundation of its overall cloud strategy.
121. Paramount Global, one of the world’s largest producers of premium entertainment content, is using Media CDN, citing consistently superior performance and offload metrics.
122. Accenture and Google Cloud expanded their global partnership, creating a new, dedicated Google Cloud professional services group to accelerate customers’ consumption of Google Cloud services.
123.
Lufthansa Group announced that Google Cloud helped it cut its CO2 emissions by an estimated 7,400 tons per year, the equivalent of 18 Boeing 777 roundtrip flights between Zurich and New York City or 370 rotations between London and Zurich.

OK, that was a lot of announcements. Thank you to all of the teams at Google as well as our customers and partners who are building together with us. As Thomas Kurian shared at the end of his opening keynote, “We’re excited to develop technology today to help you create a better tomorrow.” We’re already making plans for next year: watch this space to stay in the loop about Google Cloud Next ‘23 and other events in your area.
Source: Google Cloud Platform

At Next ’22, introducing new capabilities for secure transformations

Organizations large and small are realizing that digital transformation and the changing threat landscape require a ground-up effort to transform security. At Google Cloud, we continue to invest in our vision of invisible security, where advanced capabilities are engineered into our platforms, operations are simplified, and stronger security outcomes can be achieved. We made five major security announcements yesterday at Google Cloud Next:

Introducing Chronicle Security Operations, to help detect, investigate, and respond to cyberthreats with the speed, scale, and intelligence of Google
Introducing Confidential Space, to help unlock the value of secure data collaboration
Advancing digital sovereignty on Europe’s terms, to address growing demand for cloud solutions with high levels of control, transparency, and sovereignty
Introducing Software Delivery Shield, to help improve software supply chain security
New and expanded Google Cloud partnerships with leaders across the security ecosystem

Today at Next ‘22, we’re introducing additional new security products, partnerships, and solutions across security analytics, anti-fraud measures, device security, Zero Trust, and open source software security to help our customers around the world address their most pressing security challenges.

Our Assured Open Source Software service, which we announced earlier this year, is now available in Preview. Assured OSS enables enterprise and public sector users of open source software to easily incorporate the same trusted OSS packages that Google uses into their own developer workflows. You can sign up for the Preview of Assured OSS here.

Security teams must continually measure and manage risk in their cloud environments. Earlier this year we acquired Foreseeti, a startup focused on attack simulation and risk quantification.
We’re excited to announce that the integration of Foreseeti’s groundbreaking technology, which can help teams understand their exposure and prioritize contextualized vulnerability findings, will be coming to Security Command Center in Preview in Q4. Security Command Center will use Foreseeti’s advanced attack path simulations to help you apply targeted remediations before attackers can take advantage of high-risk vulnerabilities.

To help organizations better manage risks in their online channels, reCAPTCHA Enterprise and Signifyd will partner to bring to market a joint anti-fraud and abuse solution. This solution will combine the behavioral analysis capabilities of reCAPTCHA Enterprise with the anti-fraud capabilities of Signifyd to help enterprises reduce abuse, account takeovers, and payment fraud.

We continue to invest in new initiatives with our BeyondCorp Alliance partners. Palo Alto Networks customers can now pair Prisma Access with BeyondCorp Enterprise Essentials to help secure private and SaaS app access while mitigating internet threats across managed and unmanaged devices with a secure enterprise browsing experience.

We now package best practices and implementation experience for our customers in Zero Trust Advisory solutions. Our Cybersecurity Action Team and select partners can help guide you through the Zero Trust journey with exploratory workshops, architecture reviews, customized recommendations, and implementation support.

Google Cloud Armor, which was instrumental in stopping the largest Layer 7 DDoS attack to date, was named a Strong Performer in The Forrester Wave™: Web Application Firewalls, Q3 2022. This is our debut in the WAF Wave, and it’s encouraging to see the recognition for the product in this market segment.

Google Workspace has received several security updates and advances.
They bring data loss prevention (DLP) to Google Chat to help prevent sensitive information leaks, new Trust rules for Google Drive for more granular control of internal and external sharing, and client-side encryption (CSE) in Gmail and Google Calendar to help address a broad range of data sovereignty and compliance requirements. You can learn more in our Workspace blog.

Learn more at Google Cloud Next

Our growing team at Google Cloud Security remains focused on delivering solutions that can make governments and enterprises safer with Google, in our trusted cloud and through products that bring our security capabilities to on-premises environments and other clouds. Learn more about these announcements and capabilities by attending the Security sessions at Google Cloud Next all this week and on-demand soon after.

The future of sustainable flying is data-driven for Lufthansa Group

In the past few years, the airline industry has announced its commitment to achieving net-zero carbon emissions in the next 30 years. Meeting this target will largely depend on the industry’s ability to access sustainable fuel, acquire the latest carbon-friendly aircraft technology, and develop optimization strategies for efficient operations in the air and on the ground. Adding to the complexity are the daily challenges the industry faces, such as unpredictable resource availability, volatile weather conditions, and economic instability, all while trying to meet passenger expectations.

The Lufthansa Group recognized that this increasingly complex environment required a new approach to data management. The airline partnered with Google Cloud to develop a platform that facilitates better planning and steering of the airline’s daily flight operations. The efficiencies gained through the deployment of Google Cloud have led to measurable CO2 reductions through more efficient aircraft deployment. AI-enabled scenario planning and increased visibility into weather patterns, routing options, aircraft fuel efficiency, and aircraft usage have played a significant role in the airline’s success.

Missed connections: The need for centralized data access grows

Sustainability is a growing priority in the day-to-day operations at Lufthansa Group. The airline works with Google Cloud to run an Operations Decision Support Suite (OPSD). OPSD is a cloud-based operational planning tool integrating data from the core systems running aircraft rotation, passenger management, crew management, and technical fleet management. It was initially introduced at the Lufthansa Group subsidiary Swiss International Air Lines and is now being rolled out across the entire Group.
Taking advantage of Google BigQuery and Vertex AI for analytics and modeling, OPSD derives predictive intelligence from the data, suggesting scenarios for the airlines’ operations control center team so they can make optimal decisions at any given time and help the company make headway on its sustainability goals, including carbon emissions targets.

“Since our collaboration with Google Cloud, we have elevated our passenger experience by improving our hub steering and avoiding missed connections,” explains Christian Most, project lead for the Lufthansa Group’s OPSD. “By this, we have an impact also on our sustainability by increasing the efficiency of our operations. [This] is a technology where you can align multiple target functions into one and weigh these different input factors based on the situation, and then get the best solution.”

Lufthansa Group considered several technology partners for the optimization project but selected Google Cloud because the OPSD team needed a solution that could work across its many operating units. Google Cloud offered to come on board as a strategic partner rather than just a service provider to ensure Lufthansa Group builds a platform that can meet these cross-functional needs, according to Most.

Smart aircraft assignments lead to less fuel burn

Lufthansa Group works with and deploys tools that accelerate its ambitious carbon-reduction targets. Google Cloud was already well established in this space, having launched a sustainability partnership program in October 2021 to develop new technologies that can deliver the massive datasets and cloud-native solutions customers need to accelerate their sustainability initiatives.

In February 2021, the OPSD and Google Cloud team began working on several pilot projects after a yearlong delay due to the pandemic.
They identified flight rotation management as one of the key areas of focus to improve fuel efficiency. Flight rotation involves choosing aircraft that will increase efficiencies for a particular flight based on a variety of factors, including the aircraft’s passenger capacity, weight, maintenance schedules, and fuel burn.

The amount of fuel consumed during flight varies from plane to plane, depending mainly on the trajectory of the aircraft and the type of engine. This is why it’s important to have clear visibility into aircraft availability: the difference in fuel efficiency between similar aircraft can be as much as 10%. Cleaning and maintenance schedules are also important considerations. The cleanliness of an aircraft can make an additional difference of 1% when it comes to fuel efficiency. Also, some flight systems might be calibrated differently than others, which can lead to more drag, the amount of resistance an aircraft encounters in flight.

The OPSD collates data about aircraft maintenance, passenger bookings, routes, and cargo so the team can allocate the most efficient aircraft for each flight, which helps the company reduce its absolute emissions. For example, an Airbus A321neo (the “neo” stands for “new engine option”) would be 15% to 20% more efficient than a standard Airbus A321 on a four-hour trip to the Canary Islands. How does the team know this? The scenario-planning capabilities enabled by Google Cloud help controllers determine which aircraft is ideal for a particular route, considering all available live data. The efficiencies gained from these decisions, even in small numbers, add up to significant savings.

“When we uncover an opportunity to improve sustainability, this is where the magic happens,” says Most. “If you have multiple factors to weigh against, you will see at the end a benefit for operations, for the customer, and for your costs.
Improving sustainability always comes with many advantages for other factors.”

Vertex AI delivers more accurate weather reports

Most of us know firsthand the impact that weather can have on travel, whether we’ve been stuck at an airport after a canceled flight or missed a connection due to weather-related delays. Lufthansa Group and its frontrunner SWISS are trying to improve the traveler experience in the most efficient way possible by using data and Vertex AI to predict individual connecting times and to plan more effectively for inclement weather.

For example, the company is able to integrate historical data from weather agencies across Switzerland to predict the duration of a phenomenon called Bise, a cold northeasterly wind that blows through the Swiss Mittelland. In Zurich, a strong Bise can cause flight delays and cancellations because air traffic control has to reduce the arrival capacity by up to 30%. To make matters worse, the departure capacity is also heavily reduced, as the flight paths of arriving and departing aircraft intersect. Using Vertex AI, SWISS can model scenarios based on the weather data to gain a more accurate view of delays and plan accordingly.

“Now we can predict how long the Bise will last, how strong the winds are, the capacity limitations we expect for this timeframe, and thus be able to accurately predict the impact on our flight operations, which we didn’t have in the past,” Most said.

In the event of a cancellation, the system also provides a range of efficient rebooking options via relevant hubs throughout the Lufthansa Group and its partners.
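To see why a 30% arrival-capacity cut cascades into delays, a back-of-the-envelope sketch helps. The function and every number below are invented for illustration only; the actual SWISS scenario models built on Vertex AI weigh far more factors than this:

```python
# Rough illustration: arrivals that exceed the reduced capacity during a
# Bise event pile up as delays or diversions. All figures are hypothetical.

def excess_arrivals(demand_per_hour, capacity_per_hour, reduction, hours):
    """Arrivals over the event that the reduced capacity cannot absorb."""
    reduced_capacity = capacity_per_hour * (1 - reduction)
    return max(0.0, demand_per_hour - reduced_capacity) * hours

# 40 scheduled arrivals/hour against a nominal capacity of 40/hour,
# with a 30% Bise-driven reduction sustained for 3 hours:
backlog = excess_arrivals(40, 40, 0.30, 3)
print(backlog)  # arrivals that must be delayed, rerouted, or cancelled
```

Under these toy numbers, roughly 36 arrivals cannot be accommodated during the three-hour event, which is why accurately predicting the duration and strength of a Bise matters so much for planning.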
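Most’s earlier point about flight rotation, aligning multiple target functions into one and weighing the input factors based on the situation, can be sketched in miniature. The fleet data, factor names, and weights below are all hypothetical, chosen only to illustrate the idea; the real OPSD optimizes over live operational data with many more inputs:

```python
# Toy multi-factor aircraft assignment: pick the fleet member with the
# lowest weighted penalty score for a given flight. Numbers are invented.

def assignment_score(aircraft, weights):
    """Weighted sum of normalized penalty factors; lower is better."""
    return sum(weights[factor] * aircraft[factor] for factor in weights)

fleet = [
    # relative fuel burn, extra drag from calibration, unused-seat penalty
    {"tail": "HB-AAA", "fuel_burn": 1.00, "drag": 0.01, "capacity_gap": 0.05},
    {"tail": "HB-BBB", "fuel_burn": 0.85, "drag": 0.00, "capacity_gap": 0.10},
]

# The weights encode the "situation"; here fuel burn dominates the decision.
weights = {"fuel_burn": 0.7, "drag": 0.2, "capacity_gap": 0.1}

best = min(fleet, key=lambda a: assignment_score(a, weights))
print(best["tail"])  # the more fuel-efficient airframe wins under these weights
```

Shifting the weights, say, toward capacity on a fully booked route, can flip the choice, which mirrors how the controllers’ optimal aircraft changes with the live situation.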
Looking ahead, the company plans to extend the capabilities of Google Cloud to other stakeholders, including airports, air traffic controllers, and other providers, to drive additional efficiencies. “It’s called the industrial cloud, where we all work together and share information in order to optimize our system jointly rather than optimizing yourself, which can lead to bottlenecks in the system,” Most explained. “That’s the ultimate vision we are pursuing.”

The sustainability journey continues

Within 18 months of working with Google Cloud on these initiatives, SWISS has already:

Cut CO2 emissions by an estimated 7,400 tons per year, the equivalent of 18 Boeing 777 roundtrip flights between Zurich and New York City or 370 rotations between London and Zurich.
Optimized at least half the flights in the SWISS network.
Saved 5.2 million Swiss francs this year by streamlining its four key operational domains: aircraft rotation, passenger management, crew management, and technical fleet management.
Increased agility and flexibility to respond to unexpected events via scenario-planning capabilities.

These achievements have been so significant, in fact, that Lufthansa Group recently was recognized with a Google Cloud Sustainability Customer Award. And because of the strength of the continued collaboration, SWISS and the Lufthansa Group are able to explore new possibilities for even more sustainability progress. “It’s very easy to work with Google Cloud; it’s very honest,” says Most. “They have so many ideas about what else we could do with the technology, so it’s a lot of fun.”