From data ingestion to insight prediction: Google Cloud smart analytics accelerates your business transformation

A growing number of businesses each year are bringing their most valued asset, their data, to Google Cloud for smart analytics. Every day, customers upload petabytes of new data into BigQuery, our exabyte-scale, serverless data warehouse, and the volume of data analyzed has grown by over 300 percent in just the last year. Large enterprises and small start-ups alike trust Google Cloud to store, analyze and find insights in their data—and we want to bring them the tools they need to make data-driven insights actionable across their organizations.

Today, we’re announcing a number of new capabilities to our data analytics offerings. We’re introducing radically simple ways to move data into Google Cloud—and to clean, categorize, and understand it. We’re providing significant enhancements to our data warehousing infrastructure, and making it even easier for enterprises to seamlessly adopt BigQuery. We’re also expanding the ways we’re bringing machine learning to our analytics platform so that businesses can easily adopt predictive analytics with greater accuracy.

Here’s an overview of what’s new:

Simplifying data migration and integration

Cloud Data Fusion (beta)
BigQuery DTS SaaS application connectors (beta)
Data warehouse migration service to BigQuery (beta)
Cloud Dataflow SQL (public alpha, coming soon)
Dataflow FlexRS (beta)

Accelerating time to insights

BigQuery BI Engine (beta)
Connected sheets (beta, coming soon)

Turning data into predictions

BigQuery ML (GA, coming soon), with additional models supported
AutoML Tables (beta)

Enhancing data discovery and governance

Cloud Data Catalog (beta, coming soon)

Simplifying data migration and integration

Before you can analyze your data, you first need to move and unify it in the cloud.
Today, we’re announcing several new ways we’re making it easier to bring together data from on-premises systems, different applications, and other clouds to Google Cloud Platform (GCP).

Introducing Cloud Data Fusion: blend and transform data from disparate sources in one location

Many large organizations have massive amounts of data locked up in siloed systems and need a way to get a full or transformed view of their data to drive their use cases. Cloud Data Fusion, in beta, addresses this challenge.

Cloud Data Fusion is a fully managed, cloud-native data integration service with a broad library of open-source transformations and more than a hundred out-of-the-box connectors for a wide array of systems and data formats. This means anyone can easily ingest and integrate data from various sources and transform that data, for example, blending or joining it with other data sources, before using BigQuery to analyze it. Data Fusion’s control center allows you to explore and manage all your datasets and data pipelines in one location. It’s as simple as dragging and dropping data pipelines into the control center—no coding necessary.

“Data Fusion lowers the barrier to entry for big data work by providing an intuitive visual interface and pipeline abstraction,” says Robert Medeiros, R&D Architect, TELUS Digital. “This increased accessibility, combined with a growing collection of pre-built ‘connectors’ and transformations, translates to rapid results and in many cases allows data analysts and scientists to ‘self-serve’ without needing help from those with deep cloud or software engineering expertise.”

BigQuery DTS now supports over 100 SaaS application integrations through partner connectors

The BigQuery Data Transfer Service automates data movement from SaaS applications to Google BigQuery on a scheduled, managed basis. Your analytics team can lay the foundation for a data warehouse without writing a single line of code.
In addition to Google’s first-party apps, BigQuery Data Transfer Service now supports more than 100 popular SaaS applications, including Salesforce, Marketo, Workday, Stripe, and many more.

Data warehouse migration service: simplify migration to Google Cloud

A large number of enterprises need to modernize their data warehouse infrastructure and are now looking for easier ways to migrate those data warehouses to BigQuery. We have built a data warehouse migration service to automate migrating data and schema to BigQuery from Teradata and Amazon Redshift, as well as data loading from Amazon S3. This service will significantly reduce migration time. You can find the documentation for this process here, and our recently announced data warehousing migration offer makes it even easier for enterprises to move from traditional data warehouses to BigQuery.

Cloud Dataflow SQL and Dataflow FlexRS: launch data pipelines with SQL and schedule jobs more flexibly

Data analysts rely on data pipelines to drive analytics, yet are often dependent on data engineers to build those pipelines. Cloud Dataflow SQL, coming soon in public alpha, makes it possible for data analysts to build their own Dataflow pipelines using familiar SQL, and it automatically detects whether batch or stream data processing is needed.

Dataflow SQL uses the same SQL dialect used in BigQuery. This allows data analysts to use Dataflow SQL from within the BigQuery UI to join Cloud Pub/Sub streams with files or tables from across your data infrastructure, and then to directly query the merged data in real time. This means you can generate real-time insights and create a dashboard to visualize the results. To receive a release notification for Dataflow SQL’s public alpha, please fill out this form.

Today, we’re also announcing Dataflow Flexible Resource Scheduling (FlexRS), in beta, which offers cost benefits for batch processing jobs through scheduling flexibility, enabling overnight jobs.
If you’re processing non-time-sensitive data, you can benefit from preemptible resource pricing.

Accelerating time-to-insights and fostering data collaboration at scale, without compromising security

Once businesses have ingested their most important data into BigQuery, we help them share their data in easy-to-understand ways so users across an entire organization can take advantage of those same insights.

BigQuery BI Engine: bring business intelligence directly to your data

Data analysts and business users often use business intelligence (BI) reports and dashboards to analyze data from a data warehouse. Today, we’re introducing BigQuery BI Engine in beta, an extraordinarily fast, in-memory analysis service for BigQuery. With BigQuery BI Engine, users can analyze complex data sets interactively, with sub-second query response times and high concurrency. Today, BigQuery BI Engine is available through Google Data Studio for interactive reporting and dashboarding, and in the coming months, technology partners like Looker and Tableau will be able to leverage BI Engine as well.

“With BigQuery BI Engine behind the scenes, we’re able to gain deep insights very quickly in Data Studio,” says Rolf Seegelken, Senior Data Analyst, Zalando. “The performance of even our most computationally intensive dashboards has sped up to the point where response times are now less than a second. Nothing beats ‘instant’ in today’s age, to keep our teams engaged in the data!”

Connected sheets: access the power of BigQuery through a spreadsheet interface

A wide range of business users rely on spreadsheets as an indispensable tool for data analysis. Today we’re announcing connected sheets, a new type of spreadsheet that combines the simplicity of a spreadsheet interface with the power of BigQuery. That means no row limits with a connected sheet—it works with the full dataset from BigQuery, whether that’s millions or even billions of rows of data.
It also means you don’t need to learn SQL—you simply use regular Sheets functionality, including formulas, pivot tables, and charts, to do the analysis. With a few clicks, you can visualize data as a dashboard in Sheets and securely share it with anyone in your organization.

“Connected sheets are helping us democratize data,” says Nikunj Shanti, Chief Product Officer at AirAsia. “Analysts and business users are able to create pivots or charts, leveraging their existing skills on massive datasets, without needing SQL. This direct access to the underlying data in BigQuery provides access to the most granular data available for analysis. It’s a game changer for AirAsia.”

Sign up to learn more about the beta of connected sheets, which will become available in the next few months. You can read more about this new integration today in our G Suite blog post.

Connected sheets and BigQuery BI Engine are complemented by a broad range of updates to BigQuery. These include a new, updated BigQuery interface, now in GA, as well as the general availability of BigQuery GIS, enabling seamless analysis of spatial data in BigQuery, the only cloud data warehouse to support rich GIS functionality out of the box.

Bringing data and AI together—and making it accessible to anyone

Predictive insights are increasingly becoming an important way businesses can anticipate needs like estimating customer demand or scheduling routine maintenance. Data warehouses often store the most valuable data sets for the enterprise, but unlocking these insights has traditionally been the domain of machine learning experts—a skill not shared by most data analysts or business users.
We’ve changed that with BigQuery ML.

BigQuery ML generally available (coming soon), with expanded machine learning models

Last year, we announced BigQuery ML, enabling data analysts to build and deploy machine learning models on massive datasets directly inside BigQuery using familiar SQL. We’re also continuing to expand BigQuery ML functionality to address even more business needs. We’ve made new models available, like k-means clustering (in beta) and matrix factorization (in alpha), to build customer segmentations and product recommendations. Customers can now also build and directly import TensorFlow deep neural network models (in alpha) through BigQuery ML.

“Geotab is providing new smart city solutions leveraging aggregate data from over 1 million connected vehicles. We’re able to use BigQuery GIS to understand traffic flow patterns, and BigQuery ML helped us derive insight into predicting hazardous driving areas in cities based on inclement weather,” explains Neil Cawse, CEO of Geotab.

AutoML Tables: apply machine learning to tabular data without writing a single line of code

Not everyone who can benefit from machine learning insights is a SQL expert. To make it even easier to apply ML on structured data stored in BigQuery and Cloud Storage, we’re excited to announce AutoML Tables, in beta.
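To make the BigQuery ML additions described above concrete, a customer segmentation model can be built and queried entirely in SQL. The sketch below is illustrative only: the dataset, table, and column names (mydataset.customer_stats and its fields) are hypothetical, but the CREATE MODEL / ML.PREDICT pattern is BigQuery ML’s standard interface.

```sql
-- Illustrative sketch; table and column names are hypothetical.
-- Train a k-means model that groups customers into four segments.
CREATE OR REPLACE MODEL `mydataset.customer_segments`
OPTIONS (model_type = 'kmeans', num_clusters = 4) AS
SELECT
  total_spend,
  order_count,
  days_since_last_order
FROM `mydataset.customer_stats`;

-- Assign each customer to its nearest segment (centroid_id).
SELECT centroid_id, customer_id
FROM ML.PREDICT(
  MODEL `mydataset.customer_segments`,
  (SELECT customer_id, total_spend, order_count, days_since_last_order
   FROM `mydataset.customer_stats`));
```

Because the model lives in the data warehouse, the training data never leaves BigQuery, and any column not used as a model feature (here, customer_id) passes through ML.PREDICT alongside the assigned cluster.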
AutoML Tables lets your entire team of data scientists, analysts, and developers automatically build and deploy state-of-the-art machine learning models on structured data in just a few clicks, reducing the total time required from weeks to days—without writing a single line of code. You can read more on AutoML Tables in this blog post, or learn how retailers can apply it to their unique business challenges here.

Operate with trust on an enterprise-ready data platform

The variety, volume, and velocity of data from disparate systems, business processes, and other sources mean that many organizations increasingly grapple with data access, discovery, management, security, and governance. Finding and validating datasets can often be a complex, manual process, and increasing regulatory and compliance requirements have made it all the more important.

Data Catalog: data discovery and governance, simplified

To help organizations quickly discover, manage, and understand their data assets, we’re introducing Data Catalog in beta, a fully managed and scalable metadata management service. Data Catalog offers a simple and easy-to-use search interface for data discovery, powered by the same Google search technology that supports Gmail and Drive, and offers a flexible and powerful cataloging system for capturing technical and business metadata.
For security and data governance, it integrates with Cloud DLP, so you can discover and catalog sensitive data assets, and with Cloud IAM, where we honor source access control lists (ACLs), simplifying access management.

After deploying Data Catalog with his team, David Parfett, Director of Data Architecture at Sky, explains, “With the increasing amount of data assets in our organization, we are confident that Data Catalog will allow us to quickly and easily discover our data assets across GCP and scale in line with our growing business.”

We’re also working with strategic partners like Collibra, Informatica, Tableau, and Looker to build integrations with Data Catalog, allowing customers to have a unified data discovery experience for hybrid cloud scenarios, using their platform of choice.

“Our relationship with Google Cloud has accelerated in recent months, and this partnership is the next step in our shared commitment to providing a foundation for data governance that sets organizations up to succeed,” said Jim Cushman, Chief Product Officer at Collibra. “We’re excited to continue building this partnership, with a mutual goal of integrating our technologies and making it easier for enterprise organizations to understand and use the data that is vital to their business.”

To learn more, and to request access to Data Catalog, fill out this form.

Looking forward

From Fortune 500 enterprises to start-ups, more and more businesses continue to look to the cloud to help them store, manage, and generate insights from their data. And we’ll continue to develop new, transformative tools to help them do just that. For more information about data analytics on Google Cloud, visit our website.
Source: Google Cloud Platform

From Manufacturing to Climate Analytics: DockerCon speakers on real-world use cases

DockerCon brings industry leaders and experts of the container world to one event where they share their knowledge, experience and guidance. This year is no different. For the next few weeks, we’re going to highlight a few of our amazing speakers and the talks they will be leading.

In this third highlight, we have several speakers who will be sharing their real-world Docker use cases and learnings. These are the folks who have already put things in place and are here to share and inspire. Interested in transforming legacy applications? Or maybe large-scale data analytics is your focus. Maybe you’re a software vendor – or have plans to be – and want to learn about containerizing your application. To learn more, register now to attend the sessions featuring real Docker users like you.
In case you missed them, check out our previous speaker highlights:

Storage, service mesh and networking experts
Security experts

Transforming a 15+ Year Old Semiconductor Manufacturing Environment
More on Jeanie’s session here.
 

Jeanie Schwenk
Engineer, Scrum Master and Agile Project Manager at Jireh Semiconductor

What is your breakout about?
I was just starting to look at Docker at this time last year. Our company’s transformation journey hasn’t been a straightforward one, and we haven’t reached our final destination, yet, but we are well on our way. We started with numerous legacy software applications running on HP 9000 servers, which were new in 2001. Now, one by one, we’re moving over 230 applications that run the semiconductor factory into Docker containers.
Why should people go to your session?
You’re not alone in the problems you face in transitioning legacy applications and legacy development processes. The chance to share and learn together – our roadblocks, solutions, and ideas – will help all of us have more impact.
What are you looking forward to the most at this DockerCon?
For me personally, the networking and learning opportunities are the draw. Being able to share our story and encourage others facing similar challenges is a rare opportunity.

CMD and Conquer: Containerizing the Monolith
More on Tony and Nelson’s session here.

Tony Lee
Software Engineer at Splunk

Nelson Wang
Sr Software Engineer at Splunk

What is your breakout about?
It is about Splunk’s journey to containerize its own on-prem software with tons of pre- and post-installation configurations. We’re going to discuss the strategy behind Splunk’s official Docker image, and how we captured some of the paradigms behind software containerization for the benefit of both external customers and internal engineering efforts.
 
Why should people attend your session?
“Migrating monolithic systems into a system of microservices is a simple, straightforward task” – No one ever.
 
Any journey that involves fundamentally changing the design of a software’s architecture will never be without some pain. Oftentimes, it is a costly endeavor, and the full benefits cannot be reaped until the very end. At Splunk, the Infrastructure team took on the onerous task of figuring out how to make this transition more harmonious, bridging the gap between two different ideologies.
What are you looking forward to the most at this DockerCon?
Speaking and learning about how others use containers in their daily operation.
What is your favorite Docker Hub repo?
Obviously our own Splunk Docker Hub repo. Besides that, we enjoy using containerized nginx, MongoDB, and Redis.

Towards Reproducible Climate Research
More on Aparna’s session here.
 

Aparna Radhakrishnan
Software Development Engineer at Engility

What is your breakout about?
We will draw a path towards reproducible research using Docker containers for massive data publishing and analytics. The scale and magnitude of computing and data have increased significantly in the last decade, making data delivery to the world a herculean research problem in itself. In the case of NOAA (the National Oceanic and Atmospheric Administration), this world of evolving climate responses creates an avalanche of effects in various sectors – agriculture, health, GDP, etc. We will discuss some of the pioneering efforts from collaborators at other laboratories and organizations (such as ESGF, Google, NASA JPL, Columbia University, PMEL, etc.) in the area of Docker containers for computing and analysis on and off the cloud.
Why should people go to your session?
Although climate research is an entirely different theme, Docker is in use even here. Anyone who wonders what Docker is capable of should attend. Whether it’s inspiration-driven research or industry-driven research, Docker is relevant. I am not a Docker guru, but I could do it and so can you!
What are you looking forward to the most at this DockerCon?
Women speakers and the city of San Francisco!

Modern Orchestrated IT for Enterprise CMS: A Case Study for Wiley Education Services
Learn more about Jesse and Blaine’s session here.

Blaine Helmick
Senior Manager, System Engineering at Wiley

Jesse Antoszyk
Solutions Architect at BoxBoat

What is your breakout about?
Jesse: We’re going to be talking about making Wiley Education Services successful running their CMS on Docker and Swarm! The journey and the end result are pretty compelling.
Blaine: Jesse teed this up perfectly. It’s all about the “hero’s journey” where we triumph over the mediocrity of single-purpose virtual machines.
Why should people go to your session?
Blaine: Would it be too absurd to abstract some Shakespeare?
“To containerize, or not to containerize that is the question: Whether ‘tis nobler in the mind to suffer the downtime and broken code of outrageous software, or to take countermeasures against a sea of denial of services attacks And by mitigating the traffic end them? To server 500: to fail; No more; We Docker.”
Ok, maybe that’s a little over the top.
Jesse: The journey to containerization extends beyond tech; it’s a shift in thinking. Implement containers, and implement CI/CD, but don’t forget to adopt the DevOps mentality to support them. Doing these things will allow you to focus on creating better products faster. Anyone interested in learning how we went from zero to Docker, the challenges we faced, and the solutions we came up with will get a lot out of this session. We’re going to give a peek into the organizational and technical aspects of containerizing CMS at WES.
What are you looking forward to the most at this DockerCon?
Jesse: Hard to pick. I’m really stoked for most of the black belt talks. Speaking at DockerCon is pretty exciting as well!
Blaine: Last year I walked out quoting the “Pets versus Crops” metaphor – I switched it up from “Pets versus Cattle” to be more animal-friendly. I’m excited just thinking about what I’ll walk out of DCSF ’19 quoting.

Thank you to all our presenters and see you at DockerCon!


Call to Action

Register for DockerCon 2019, April 29 – May 2 in San Francisco – Save $250 by registering before April 16!
Sign up and attend these additional events, running in conjunction with DockerCon:

Women@DockerCon Summit, Monday, April 29th
Open Source Summit, Thursday, May 2nd
Official Docker Training and Certification
Workshops

The post From Manufacturing to Climate Analytics: DockerCon speakers on real-world use cases appeared first on Docker Blog.
Source: https://blog.docker.com/feed/

The Masters drives innovation with hybrid cloud and AI

The Masters golf tournament may be traditional on the course, but it’s driving innovation with its infrastructure and online fan experiences behind the scenes.
“The Masters is surprisingly modern when it comes to its infrastructure in that it operates a hybrid cloud strategy,” reports Forbes. “For the vast majority of the year its technology needs are fairly modest, but during tournament week, the Masters official applications and website experience a huge spike in demand from millions of fans around the world.”
The solution is the hybrid cloud. IBM has partnered with Augusta National for more than 20 years to shape the Masters digital experience and engage patrons online. With the IBM Cloud as the digital foundation of the Masters, IBM brings a hybrid cloud environment that allows the Masters to quickly scale, manage various sources of data across multiple locations, and use artificial intelligence (AI) to enhance fan engagement.
Three-minute player highlight reels
New this year, “Round in Three Minutes” lets fans at home and away from the fairway experience highlights from each player. The online experience uses IBM Watson on the IBM Cloud to rate and curate a three-minute highlight reel for each player’s round. The feature in the Masters app evaluates the excitement level of each moment, including facial expressions, gestures and the roar of the crowd.
“For the first time ever in golf, we will capture virtually every shot of every player during every competitive round,” said Augusta National Golf Club Chairman Fred Ridley, as reported by GOLF.
IBM analyzed approximately 4,000 shots at the 2018 Masters and estimates it will track 5,000 holes and around 20,000 shots this year, shares SportTechie.
The Masters and IBM
Beyond the beauty of the golf greens, the powerful IBM Cloud hybrid environment, game-changing AI and enterprise-grade security are helping the Masters to scale, innovate and deliver for fans.

 
Learn more about the IBM technology enabling fan experiences at the Masters.
The post The Masters drives innovation with hybrid cloud and AI appeared first on Cloud computing news.
Source: Thoughts on Cloud

News to build on: 122+ announcements from Google Cloud Next ‘19

We hope you enjoyed Next ’19 as much as we did! The past few days brought our Google Cloud community together to learn about lots of new technologies and see how customers and partners are pushing their ideas and businesses forward with the cloud. It was a lot to digest, but we’ve boiled it down here into all the announcements from the week across infrastructure, application development, data management, analytics and AI, productivity, partnerships, and more.Infrastructure1. We announced two new regions in Seoul, South Korea and Salt Lake City, Utah to expand our global footprint and to support our growing customers around the world.Hybrid Cloud2. Anthos (the new name for Cloud Services Platform) is now generally available on Google Kubernetes Engine (GKE) and GKE On-Prem, so you can deploy, run and manage your applications on-premises or in the cloud. Coming soon, we’ll extend that flexibility to third-party clouds like AWS and Azure. And Anthos is launching with the support of more than 30 hardware, software and system integration partners so you can get up and running fast.3. With Anthos Migrate, powered by Velostrata’s migration technology, you can auto-migrate VMs from on-premises or other clouds directly into containers in GKE with minimal effort.4. Anthos Config Management lets you create multi-cluster policies out of the box that set and enforce role-based access controls, resource quotas, and namespaces—all from a single source of truth.Serverless5. Cloud Run, our fully managed serverless execution environment, offers serverless agility for containerized apps.6. Cloud Run on GKE brings the serverless developer experience and workload portability to your GKE cluster.7. Knative, the open API and runtime environment, brings a serverless developer experience and workload portability to your existing Kubernetes cluster anywhere.8. 
We’re also making new investments in our Cloud Functions and App Engine platforms with new second generation runtimes, a new open-sourced Functions Framework, and additional core capabilities, including connectivity to private GCP resources.DevOps/SRE9. The new Cloud Code makes it easy to develop and deploy cloud-native applications on Kubernetes, by extending your favorite local Integrated Development Environments (IDE) IntelliJ and Visual Studio Code.API Management10. Apigee hybrid (beta) is a new deployment option for the Apigee API management platform that lets you host your runtime anywhere—in your data center or the public cloud of your choice.11. Apigee security reporting (beta) offers visibility into the security status of your APIs.12. Now you can consume a variety of Google Cloud services directly from the Apigee API Management platform, including Cloud Functions (secured by IAM), Cloud Data Loss Prevention (templates support), Cloud ML Engine, and BigQuery. See the full list of extensions here.Data ManagementDatabases13. Coming soon to Google Cloud: bring your existing SQL Server workloads to GCP and run them in a fully managed database service.14. CloudSQL for PostgreSQL now supports version 11, with useful new features like partitioning improvements, stored procedures, and more parallelism.15. Cloud Bigtable multi-region replication is now generally available, giving you the flexibility to make your data available across a region or worldwide as demanded by your app.Storage16. A new low-cost archive class for Cloud Storage will offer the same consistent API as other classes of Cloud Storage and millisecond latency to access your content.17. Cloud Filestore, our managed file storage system, is now generally available for high-performance storage needs.18. Regional Persistent Disks will be generally available next week, providing active-active disk replication across two zones in the same region.19. 
Bucket Policy Only is now in beta for Google Cloud Storage, so you can enforce Cloud IAM policies at the bucket level for consistent and uniform access control for your Cloud Storage buckets.20. V4 signatures are now available in beta for Google Cloud Storage to provide improved security and let you access multiple object stores using the same application code. In addition to HMAC keys, V4 signed requests are also supported for Google RSA keys.21. Cloud IAM roles are now available for Transfer Service, allowing security and IT administrators to use Cloud IAM permissions for creating, reading, updating, and deleting transfer jobs.Networking22. Traffic Director delivers configuration and traffic control intelligence to sidecar service proxies, providing global resiliency for your services by allowing you to deploy application instances in multiple Google Cloud regions.23. High Availability VPN, soon in beta, lets you connect your on-premises deployment to GCP Virtual Private Cloud (VPC) with an industry-leading SLA of 99.99% service availability at general availability.24. 100 Gbps Cloud Interconnect connects your hybrid and multi-cloud deployments.25. Private Google Access from on-premises to the cloud is now generally available, allowing you to securely use Google services like Cloud Storage and BigQuery as well as third-party SaaS through Cloud Interconnect or VPN.26. With Network Service Tiers, Google Cloud customers can customize their network for performance or price on a per-workload basis by selecting Premium or Standard Tier.Security and identitySecurity27. Access Approval (beta) is a first-of-its-kind capability that allows you to explicitly approve access to your data or configurations on GCP before it happens.28. Data Loss Prevention (DLP) user interface (beta) lets you run DLP scans with just a few clicks—no code required, and no hardware or VMs to manage.29. 
Virtual Private Cloud (VPC) Service Controls (GA) go beyond your VPC and let you define a security perimeter around specific GCP resources such as Cloud Storage buckets, Bigtable instances, and BigQuery datasets to help mitigate data exfiltration risks.30. Cloud Security Command Center, a comprehensive security management and data risk platform for GCP,  is now generally available,31. Event Threat Detection in Cloud Security Command Center leverages Google-proprietary intelligence models to quickly detect damaging threats such as malware, crypto mining, and outgoing DDoS attacks. Sign up for the beta program.32. Security Health Analytics in Cloud Security Command Center automatically scans your GCP infrastructure to help surface configuration issues with public storage buckets, open firewall ports, stale encryption keys, deactivated security logging, and much more. Sign up for the alpha program.33. Cloud Security Scanner detects vulnerabilities such as cross-site-scripting (XSS), use of clear-text passwords, and outdated libraries in your GCP applications and displays results in Cloud Cloud Security Command Center. It’s GA for App Engine and now available in beta for GKE and Compute Engine.34. Security partner integrations with Capsule8, Cavirin, Chef, McAfee, Redlock, Stackrox, Tenable.io, and Twistlock consolidate findings and speed up response. Find them on GCP Marketplace.35. Stackdriver Incident Response and Management (coming soon to beta) in Cloud Security Command Center helps you respond to threats and remediate findings.36. Container Registry vulnerability scanning (GA) identifies package vulnerabilities for Ubuntu, Debian, and Alpine Linux, so you can find vulnerabilities before your containers are deployed.37. Binary Authorization (GA) is a deploy-time security control that integrates with your CI/CD system, gating images that do not meet your requirements from being deployed.38. 
GKE Sandbox (beta), based on the open-source gVisor project, provides additional isolation for multi-tenant workloads, helping to prevent container escapes, and increasing workload security.39. Managed SSL Certificates for GKE (beta) give you full lifecycle management (provisioning, deployment, renewal and deletion) of your GKE ingress certificates.40. Shielded VMs (GA) provide verifiable integrity of your Compute Engine VM instances so you can be confident they haven’t been compromised.41. Policy Intelligence (alpha) uses ML to help you understand and manage your policies and reduce risk.42. With Phishing Protection (beta), you can quickly report unsafe URLs to Google Safe Browsing and view status in Cloud Security Command Center.43. reCAPTCHA Enterprise (beta) helps you defend your website against fraudulent activity like scraping, credential stuffing, and automated account creation and help prevent costly exploits from automated software.Identity and access management44. Context-aware access enhancements, including the launch of BeyondCorp Alliance, to help you define and enforce granular access to apps and infrastructure based on a user’s identity and the context of their request.45. Android phone’s built-in security key—the strongest defense against phishing—is now available on your phone.46. Cloud Identity enhancements, including single sign-on to thousands of additional apps and integration with human resource management systems (HRMS).47. General availability of Identity Platform, which you can use to add identity management functionality to your own apps and services.Smart AnalyticsData analytics48. Data Fusion (beta) is a fully managed and cloud-native data integration service that helps you easily ingest and integrate data from various sources into BigQuery.49. BigQuery DTS now supports 100+ SaaS apps, enabling you to lay the foundation for a data warehouse without writing a single line of code.50. 
50. Cloud Dataflow SQL (public alpha) lets you build pipelines using familiar Standard SQL for unified batch and stream data processing.
51. Dataflow Flexible Resource Scheduling (FlexRS), in beta, helps you flexibly schedule batch processing jobs for cost savings.
52. Cloud Dataproc autoscaling (beta) removes the user burden associated with provisioning and decommissioning Hadoop and Spark clusters on Google Cloud Platform, providing the same serverless convenience you find in the rest of our data analytics platform.
53. Dataproc Presto job type (beta) helps you write simpler ad hoc Presto queries against disparate data sources like Cloud Storage and Hive metastore. Both queries and scripts now run as part of the native Dataproc API.
54. Dataproc Kerberos TLC (beta) enables Hadoop secure mode on Dataproc through thorough API support for Kerberos. This integration gives you cross-realm trust, RPC and SSL encryption, and KDC administrator configuration capabilities.
55. BigQuery BI Engine, in beta, is an in-memory analysis service that lets you interact with large or complex data almost immediately, including visual analysis with partner tools.
56. Connected sheets are a new type of spreadsheet that combines the simplicity of a spreadsheet interface with the power of BigQuery. With a few clicks, you can access BigQuery data in Sheets and securely share it with anyone in your organization.
57. BigQuery ML is now generally available, with new model types you can call with SQL queries.
58. BigQuery: k-means clustering ML (beta) helps you establish groupings of data points based on axes or attributes that you specify, straight from Standard SQL in BigQuery.
59. BigQuery: import TensorFlow models (alpha) lets you import your TensorFlow models and call them straight from BigQuery to create classifier and predictive models.
60. BigQuery: TensorFlow DNN classifier helps you classify your data based on a large number of features or signals.
You can train and deploy a DNN model of your choosing straight from BigQuery’s Standard SQL interface.
61. BigQuery: TensorFlow DNN regressor lets you design a regression in TensorFlow and then call it to generate a trend line for your data in BigQuery.
62. Cloud Data Catalog (beta), a fully managed metadata discovery and management platform, helps organizations quickly discover, manage, secure, and understand their data assets.
63. Cloud Composer (generally available) helps you orchestrate your workloads across multiple clouds with a managed Apache Airflow service.

AI and machine learning
64. AI Platform, in beta, helps teams prepare, build, run, and manage ML projects via the same shared interface.
65. AutoML Natural Language custom entity extraction and sentiment analysis (beta) lets you identify and isolate custom fields from input text, and train and serve industry-specific sentiment analysis models on your unstructured data.
66. AutoML Tables (beta) helps you turn your structured data into predictive insights. You can ingest your data for modeling from BigQuery, Cloud Storage, and other sources.
67. AutoML Vision object detection (beta) now helps you detect multiple objects in images, providing bounding boxes to identify object locations.
68. AutoML Vision Edge (beta) helps you deploy fast, high-accuracy models at the edge and trigger real-time actions based on local data.
69. AutoML Video Intelligence (beta) lets you upload your own video footage and custom tags in order to train models specific to your business needs for tagging and retrieving video with custom attributes.
70. Document Understanding AI, in beta, offers a scalable, serverless platform to automatically classify, extract, and digitize data within your scanned or digital documents.
71. Vision Product Search, now generally available, lets you build visual search functionality into mobile apps so customers can photograph an item and get a list of similar products from a retailer’s catalog.
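As a rough sketch of the BigQuery ML workflow described in the items above, a k-means model can be created and queried entirely from Standard SQL. The snippet below composes those statements in Python (the `mydataset.rides` table, `mydataset.ride_clusters` model name, and feature columns are hypothetical placeholders, not from the announcement); the actual submission via the BigQuery client is shown only in comments.

```python
# Sketch of BigQuery ML k-means from Standard SQL.
# All dataset, table, model, and column names here are hypothetical.

def kmeans_training_sql(model: str, source_table: str, num_clusters: int = 4) -> str:
    """Compose the CREATE MODEL statement for a k-means clustering model."""
    return (
        f"CREATE OR REPLACE MODEL `{model}` "
        f"OPTIONS(model_type='kmeans', num_clusters={num_clusters}) AS "
        f"SELECT pickup_lat, pickup_lon, trip_distance FROM `{source_table}`"
    )

def kmeans_predict_sql(model: str, source_table: str) -> str:
    """Compose the ML.PREDICT query that assigns each row to a centroid."""
    return (
        f"SELECT * FROM ML.PREDICT(MODEL `{model}`, "
        f"(SELECT pickup_lat, pickup_lon, trip_distance FROM `{source_table}`))"
    )

if __name__ == "__main__":
    train = kmeans_training_sql("mydataset.ride_clusters", "mydataset.rides")
    predict = kmeans_predict_sql("mydataset.ride_clusters", "mydataset.rides")
    print(train)
    print(predict)
    # To actually run these, submit them with the BigQuery client, e.g.:
    #   from google.cloud import bigquery
    #   bigquery.Client().query(train).result()
```

The point of the SQL-only interface is that no training pipeline or model-serving infrastructure is needed: both statements run as ordinary BigQuery queries.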
72. Cloud Vision API: bundled enhancements (beta) let you perform batch prediction, and document text detection now supports online annotation of PDFs, as well as files that contain a mix of scanned (raster) and rendered text.
73. Cloud Natural Language API: bundled enhancements (beta) now include support for Russian and Japanese, as well as built-in entity extraction for receipts and invoices.
74. Our new V3 Translation API lets you define the vocabulary and terminology you want to override within translations, and easily integrate your brand-specific terms into your translation workflows.
75. Video Intelligence API: bundled enhancements (beta) let content creators search for tagged aspects of their video footage. The API now supports optical character recognition (generally available), object tracking (also generally available), and a new streaming video annotation capability (in beta).
76. Recommendations AI, in beta, helps retailers provide personalized 1:1 recommendations to drive customer engagement and growth.
77. Contact Center AI is now in beta, helping businesses build modern, intuitive customer care experiences with the help of Cloud AI.

Windows workloads on GCP
78. For your Microsoft workloads, in addition to purchasing on-demand licenses from Google Cloud, you now have the flexibility to bring your existing licenses to GCP.
79. Velostrata 4.2, our streaming migration tool, will soon give you the ability to tag Microsoft workloads that require sole tenancy and to automatically apply existing licenses.
80. Coming soon, you’ll be able to use Managed Service for Microsoft Active Directory (AD), a highly available, hardened Google Cloud service running actual Microsoft AD, to manage your cloud-based AD-dependent workloads, automate AD server maintenance and security configuration, and extend your on-premises AD domain to the cloud.
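To make the terminology-override idea in the V3 Translation API item concrete, here is a minimal sketch of composing a `translate_text` request that points at a glossary of brand terms. The project ID, location, glossary ID, and sample text are hypothetical placeholders; the client call itself is shown only in comments.

```python
# Sketch of a V3 Translation API request that enforces brand terminology
# via a glossary. All IDs below are hypothetical placeholders.

def build_translate_request(project_id: str, glossary_id: str, text: str) -> dict:
    """Compose a translate_text request body with a glossary override."""
    parent = f"projects/{project_id}/locations/us-central1"
    return {
        "parent": parent,
        "contents": [text],
        "mime_type": "text/plain",
        "source_language_code": "en",
        "target_language_code": "de",
        # The glossary holds the brand-specific term pairs to keep fixed.
        "glossary_config": {"glossary": f"{parent}/glossaries/{glossary_id}"},
    }

if __name__ == "__main__":
    request = build_translate_request("my-project", "brand-terms", "Acme Widget Pro")
    print(request["glossary_config"]["glossary"])
    # To actually translate, submit the request with the v3 client, e.g.:
    #   from google.cloud import translate_v3
    #   translate_v3.TranslationServiceClient().translate_text(request=request)
```

The glossary resource would be created once per workflow and then referenced from every translation request that should respect those terms.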
81. We’ve expanded Cloud SQL, our fully managed relational database service, to support Microsoft SQL Server, and we’ll be extending Anthos for hybrid deployments to Microsoft environments.

Productivity & Collaboration

G Suite
82. Google Assistant is integrating with Calendar, available in beta, to help you know when and where your next meeting is and stay on top of scheduling changes.
83. G Suite Add-ons, coming soon to beta, offer a way for people to access their favorite workplace apps in the G Suite side panel to complete tasks, instead of toggling between multiple apps and tabs.
84. Third-party Cloud Search, now generally available for eligible customers, can help employees search for, and find, digital assets and people in their company.
85. Drive metadata, available in beta, lets G Suite admins and their delegates create metadata categories and taxonomies to make content more discoverable in search.
86. Hangouts Meet updates include automatic live captions (generally available), the ability to make live streams “public” (coming soon), and support for up to 250 people in a single meeting (coming soon).
87. Google Voice for G Suite, generally available, gives businesses a phone number that works from anywhere, on any device, and can also transcribe voicemails and block spam calls with the help of Google AI.
88. Hangouts Chat in Gmail, available in beta, lets team communications be accessed in one place on your desktop: the lower-left section of Gmail, which also highlights people, rooms, and bots.
89. Office editing in Google Docs, Sheets, and Slides, generally available, lets you work on Office files straight from G Suite without having to worry about converting file types.
90. Visitor sharing in Google Drive, available in beta, provides a simple way for you to invite others outside your organization to collaborate on files in G Suite using PIN codes.
91. Currents (the new name for the enterprise version of Google+), available in beta, helps employees share ideas and engage in meaningful discussions with others across their organization, regardless of title or geography.
92. Access Transparency, generally available for G Suite Enterprise customers, provides granular visibility into data that’s accessed by Google Cloud employees for support purposes.
93. We enhanced our data regions to provide coverage for backups.
94. Advanced phishing and malware protection, available in beta, helps admins protect against anomalous attachments and inbound emails spoofing your domain in Google Groups.
95. Updates to the security center and alert center for G Suite provide integrated remediation so admins can take action against threats.

Chrome Enterprise
96. Chrome Browser Cloud Management lives within the Google Admin console and allows you to manage browsers in your Windows, Mac, and Linux environments from a single location. You can see your enrolled browsers and set and apply policies across them from the same place. We’ve opened up Chrome Browser Cloud Management to all enterprises, even those not yet using other Google products.

Customers
97. Hot off the presses: our 2019 Customer Voices book offers perspectives from 40 Google Cloud customers across 7 major industries.
98. Australia Post detailed how it delivers online and in person for customers with the help of Google Cloud.
99. Baker Hughes is using Google Cloud to build advanced analytics products that solve complex industrial problems.
100. Colgate-Palmolive shared how it is using G Suite, and now GCP, to transform its business, taking advantage of data analytics and migrating its SAP workloads to Google Cloud.
101. Kohl’s described how it is moving most of its apps to the cloud in the next three years.
102. McKesson, a Fortune 6 company, shared how it aims to deliver more value to its customers and the healthcare industry through common platforms and resources.
103. Procter & Gamble shared how it is using Google Cloud to store, analyze, and activate its data.
104. Scotiabank is migrating 40 percent of its applications globally to Google Cloud, and shared how it’s using data for advanced analytics that help it meet customer needs and detect fraud.
105. Unilever used Google Cloud AI tools such as translation, visual analytics, and natural language processing (NLP) to generate insights faster and gain a deeper understanding of customer needs.
106. UPS described how it uses analytics on Google Cloud to gather and analyze more than a billion data points every day.
107. Viacom shared why it chose Google Cloud to perform automated content tagging, discovery, and intelligence for more than 65 petabytes of content.
108. Whirlpool is using G Suite to completely transform the way its workforce collaborates.

Partnerships
109. Partners such as Cisco, Dell EMC, HPE, and Lenovo have committed to delivering Anthos on their own hyperconverged infrastructure for their customers. By validating Anthos on their solution stacks, our mutual customers can choose hardware based on their storage, memory, and performance needs.
110. Intel announced it will publish a production design for developers, OEMs, and system integrators to offer Intel Verified hardware and channel marketing programs to accelerate Anthos deployment for enterprise customers.
111. VMware and Google Cloud announced a collaboration on SD-WAN and Service Mesh integrations with support for Anthos.
112. Our strategic open-source partnerships with Confluent, MongoDB, Elastic, Neo4j, Redis Labs, InfluxData, and DataStax tightly integrate their open source-centric technologies into GCP, providing a seamless user experience across management, billing, and support.
113. Accenture announced an expanded strategic collaboration with new enterprise solutions in customer experience transformation.
114. Deloitte announced transformative solutions for the healthcare, finance, and retail sectors.
115. Atos and CloudBees announced a partnership to provide customers with a complete DevOps solution running on GCP.
116. Salesforce is bringing Contact Center AI to its Salesforce Service Cloud and Dialogflow Enterprise Edition to the Salesforce Einstein Platform.
117. A new integration between G Suite and Dropbox lets you create, save, and share G Suite files, like Google Docs, Sheets, and Slides, right from Dropbox.
118. DocuSign introduced three new innovations to expand its integration with G Suite.
119. We made a number of partner announcements around AI and machine learning, including Avaya, Genesys, Mitel, NVIDIA, Taulia, and UiPath.
120. We announced that 21 of our partners achieved specializations in our three newest specialization areas, with many more to come.
121. Our list of qualified MSPs is growing, and we introduced an MSP Initiative badge for qualified partners at Next ‘19, making it easier for our joint customers to discover partners who can help them accelerate their Google Cloud journey.
122. We were thrilled to announce our 2018 partner award winners. You can find the full list here.

Add to this list our 123rd announcement: Google Cloud Next ‘20 will be happening April 6-8, 2020, back at Moscone in San Francisco. We hope to see you there!
Source: Google Cloud Platform