A little light reading: Transit trends, video datasets and more stories from around Google

At Google Cloud, we love to share how we’re shaping our cloud computing technology. Beyond the cloud blog, though, we know there are lots of fascinating stories from around Google. Here’s a reading list of stories that grabbed our attention recently.

How stuffed is your bus? See transit trends from Google Maps

This post contains some fun graphics and data about the relative crowdedness of various bus and subway lines around the world (fun for us to look at, though perhaps not so much fun for those on the crowded subway cars). The trends are pulled from the aggregated, anonymized feedback data that Google Maps users can opt to give after they’ve used transit mode. One line of the Buenos Aires subway came in first for most crowded worldwide. You can also see breakdowns of the most crowded lines for certain cities.

And take a look at how ML now helps predict transit delays

For more on the topic of transportation trends, check out this blog post on how Google Maps now forecasts bus delays in hundreds of cities using machine learning. (Again, not pleasant for those waiting for the buses in question, but fascinating for ML enthusiasts.) Though some city transit agencies provide public delay data, not all do. This new prediction capability depends on an ML model that combines real-time car traffic forecasts with data on bus routes and stops. To build the model, teams extracted training data from sequences of bus positions over time, based on transit agency feeds, then aligned those positions to car traffic speeds along the bus’s path.

Get a sense of scale with YouTube-8M Segments

The new YouTube-8M Segments is an extension of the large-scale YouTube-8M dataset, a video classification dataset with, you guessed it, more than 8 million videos. The dataset comes with precomputed audio and visual features from billions of video frames and audio segments. The new release adds human-verified labels at the five-second-segment level within a subset of the YouTube-8M videos.
The idea behind the release is to speed up research into temporal localization—allowing better search within videos—with the aim of improving video tag predictions and enabling uses like capturing special video moments. Human-labeled annotations provide a baseline that helps researchers evaluate their algorithms more accurately without having to label every segment in a video. There’s an accompanying Kaggle competition and ICCV workshop as well.

Brush off your mail merge skills

Mail merge—combining a data source with a master template document—has been around since the dawn of word processing. It creates customized copies of the master doc, each filled in with a unique data record from the data source—for example, adding customer addresses to a form letter. With the launch of the Google Docs API, it’s now easy to do mail merge in the cloud and build custom mail merge apps.

Automate all the things—even at home

If you want your home technology to run as smoothly as your work technology, you’ve got a lot of interesting device options these days. This post covers some tips on connecting IoT devices to Google Assistant and using Actions to create routines and tasks. You’ll see how to control devices with voice commands as well as through a visual interface, and get some detail on the back-end integrations you can set up.

What thought-provoking stories have you read lately? Let us know.
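As an aside, the mail merge pattern mentioned above—one customized copy of a master template per data record—is easy to sketch locally. This is a simulation of the concept only (the template text and field names here are invented), not a call to the Google Docs API, which performs the equivalent substitution on real cloud documents:

```python
# A minimal, local simulation of mail merge: substitute each record
# from a data source into a master template to produce customized
# copies. Template text and field names are illustrative placeholders.

TEMPLATE = "Dear {name},\n\nYour order will be shipped to {address}.\n"

def mail_merge(template, records):
    """Return one customized copy of the template per data record."""
    return [template.format(**record) for record in records]

records = [
    {"name": "Alice", "address": "1 Main St, Springfield"},
    {"name": "Bob", "address": "2 Oak Ave, Shelbyville"},
]

for letter in mail_merge(TEMPLATE, records):
    print(letter)
```

In a real cloud-based merge, the substitution step would be handled by document-editing API requests rather than local string formatting, but the shape of the task is the same.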
Source: Google Cloud Platform

Accelerate your OpenShift Network Performance on OpenStack with Kuryr

Kuryr and OpenShift on OpenStack

Many cloud applications are moving to containers while many others are running on virtual instances, leading to the need for containers and VMs to coexist in the same infrastructure. Red Hat OpenShift running on top of OpenStack covers this use case as an on-prem solution, providing in […]
The post Accelerate your OpenShift Network Performance on OpenStack with Kuryr appeared first on Red Hat OpenShift Blog.
Source: OpenShift

Happy birthday Knative! Celebrating one year of portable serverless computing

Today marks the one-year anniversary of Knative, an open-source project initiated by Google that helps developers build, deploy and manage modern serverless workloads. What started as a Google-led project now has a rich ecosystem with partners from around the world, and together, we’ve had an amazing year! Here are just a few notable stats and milestones that Knative has achieved this year:

- Seven releases since launch
- A thriving, growing ecosystem: over 3,700 pull requests from 400+ contributors associated with over 80 different companies, including industry leaders like IBM, Red Hat, SAP, TriggerMesh and Pivotal
- Addition of non-Google contributors at the approver, lead, and steering committee level
- 20% monthly growth in contributions

With all this momentum for the project, we thought now would be a good time to reflect on why we initially created Knative, the project’s ecosystem, and how it relates to Google Cloud’s serverless vision.

Why we created Knative

Serverless computing provides developers with a number of benefits: the ability to run applications without having to worry about managing the underlying infrastructure, to execute code only when needed, to autoscale workloads from zero to N depending on traffic, and many more. But while traditional serverless offerings provide the velocity that developers love, they lack flexibility. Serverless traditionally requires developers to use specific languages and proprietary tools. It also locks developers into a cloud provider and prevents them from easily moving their workloads to other platforms. In other words, most serverless offerings force developers to choose between the velocity and simple developer experience of serverless, and the flexibility and portability of containers. We asked ourselves: what if we could offer the best of both worlds?

Kubernetes has become the de facto standard for running containers.
Even with all that Kubernetes offers, many platform providers and operators were implementing their own platforms to solve common needs like building code, scaling workloads, and connecting services with events. Not only was this a duplicative effort for everyone, it led to vendor lock-in and proprietary systems for developers. And thus, Knative was born.

What is Knative?

Knative offers a set of components that standardize mundane but difficult tasks such as building applications from source code into container images, routing and managing traffic during deployment, autoscaling workloads, and binding running services to a growing ecosystem of event sources.

Idiomatic developer experience

Knative provides an idiomatic developer experience: developers can use any language or framework, such as Django, Ruby on Rails, Spring and many more; common development patterns such as GitOps, DockerOps, or ManualOps; and easily plug into existing build and CI/CD toolchains.

A growing Knative ecosystem

When we first announced Knative, it included three main components: build, eventing, and serving, all of which have received significant investment and adoption from the community. Recently the build component was spun out of Knative into a new project, Tekton, which focuses on solving a much broader set of continuous integration use cases than Knative originally intended. But perhaps the biggest indicator of Knative’s momentum is the increase in commercial Knative-based products on the market. Our own Cloud Run is based on Knative, and several members of the community also have products based on Knative, including IBM, Red Hat, SAP, TriggerMesh and Pivotal.

“We are excited to be partnering with Google on the Knative project. Knative enables us to build new innovative managed services in the cloud, easily, without having to recreate the essential building blocks.
Knative is a game-changer, finally making serverless workload portability a reality.” – Sebastien Goasguen, Co-Founder, TriggerMesh

“Red Hat has been working alongside the community and innovators like Google on Knative since its inception. By adding the Knative APIs to Red Hat OpenShift, our enterprise Kubernetes platform, developers have the ability to build portable serverless applications. We look forward to enabling more serverless workloads with Red Hat OpenShift Serverless based on Knative as the project nears general availability. This has the potential to improve the general ease of Kubernetes for developers, helping teams to run modern applications across hybrid architectures.” – William Markito Oliveira, Senior Principal Product Manager, Red Hat

To learn more about Knative and the community, look out for an upcoming interview with Evan Anderson, Google Cloud engineer and a Knative technical lead, on the SAP Customer Experience Labs podcast.

Knative: the basis of Google Cloud Run

At Google Cloud Next 2019, we announced Cloud Run, our newest serverless compute platform that lets you run stateless, request-driven containers without having to worry about the underlying infrastructure—no more configuring, provisioning, patching and managing servers. Cloud Run autoscales your application from zero to N depending on traffic, and you only pay for the resources you use. Cloud Run is available both as a fully managed offering and as an add-on in Google Kubernetes Engine (GKE). We believe Cloud Run is the best way to use Knative. With Cloud Run, you choose how to run your serverless workloads: fully managed on Google Cloud or on GKE. You can even choose to move your workloads on-premises, running on your own Kubernetes cluster, or to a third-party cloud. Knative makes it easy to start with Cloud Run and later move to Cloud Run on GKE, or start in your own Kubernetes cluster and migrate to Cloud Run in the future.
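That portability rests on the fact that a Knative workload is described by one small Service manifest, whatever cluster it runs on. A minimal sketch is below—the service name and container image are placeholders, and the exact API version has evolved across Knative releases:

```yaml
apiVersion: serving.knative.dev/v1    # API version varies by Knative release
kind: Service
metadata:
  name: hello                         # placeholder service name
spec:
  template:
    spec:
      containers:
        - image: gcr.io/my-project/hello  # placeholder container image
```

Because the same manifest shape is understood by any Knative-enabled cluster, moving a workload between platforms is largely a matter of pointing your deployment at a different endpoint.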
Because it uses Knative as the underlying platform, you can move your workloads freely across platforms while significantly reducing switching costs. Customers such as Percy.io use both Cloud Run and Cloud Run on GKE, and love that they get the same experience and UI wherever they need it.

“We first started running our workloads on Cloud Run as fully managed on Google Cloud, but then wanted to leverage some of the benefits of Google Kubernetes Engine (GKE), so we decided to move some services to Cloud Run on GKE. The fact that we can seamlessly move from one platform to another by just changing the endpoint is amazing, and that they both have the same UI and interface makes it extremely easy to manage.” – David Jones, Director of Engineering, Percy.io

Get started with Knative today!

Knative brings portability to your serverless workloads and a simple, easy developer experience to your Kubernetes platform. It is truly the best of both worlds. If you operate your own Kubernetes environment, check out Knative today. If you’re a developer, check out Cloud Run as an easy way to experience the benefits of Knative. Get started with your free trial on Google Cloud—we can’t wait to see what you will build.
Source: Google Cloud Platform