Docker Official Images are now Multi-platform

This past week, Docker rolled out a big update to our Official Images to make them multi-platform aware. Now, when you run `docker run hello-world`, Docker CE and EE will pull and run the correct hello-world image, whether that’s for x86-64 Linux, Windows, ARM, IBM Z mainframes or any other system where Docker runs. With Docker rapidly adding support for additional operating systems (like Windows) and CPU architectures (like IBM Z), this is an important UX improvement.
Docker Official Images are a curated set of container images that include:

Base operating system images like Ubuntu, BusyBox and Debian
Ready-to-use build and runtime images for popular programming languages like Go, Python and Java
Easy-to-use images for data stores such as PostgreSQL, Neo4j and Redis
Pre-packaged software images to run WordPress, Ghost and Redmine and many other popular open source projects

The official images have always been available for x86-64 Linux. Images for non-x86 Linux architectures have also been available, but they had to be fetched either from a different namespace (`docker pull s390x/golang` on IBM Z mainframes) or with a different tag (`docker pull golang:nanoserver` on Windows). This was not the seamless, portable experience that we wanted for users of Docker’s new multi-arch and multi-OS orchestration features.
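To illustrate the difference, here is a small sketch of what pulling the Go image looked like before and after this change (image names are taken from the examples above; output omitted):

```
# Before: pick an architecture- or OS-specific image by hand.
docker pull s390x/golang        # IBM Z mainframe
docker pull golang:nanoserver   # Windows Nano Server

# Now: the same command works everywhere; the daemon resolves the right
# entry in the manifest list for its own OS and CPU architecture.
docker pull golang
```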
Luckily, the Docker registry and distribution protocol have supported multi-platform images since Docker 1.10, using a technology called manifest lists. A manifest list can take the place of a single-architecture image manifest in a registry (for example, for `golang`) and contains a list of (“platform”, “manifest reference”) tuples. If a registry responds to a `docker pull` command with a manifest list instead of an image manifest, Docker examines the manifest list and then pulls the correct list entry for the platform that it happens to be running on.
The distribution protocol is backwards compatible, and manifest lists are only served to clients that indicate support in the `Accept` header. For clients that don’t support manifest lists, registries will fall back to the x86-64 Linux image manifest. Manifest lists are fully supported by Docker Content Trust to ensure that multi-platform image content is cryptographically signed and verified.
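As a rough sketch of that content negotiation, a client can ask the registry for a manifest list explicitly by setting the `Accept` header; a client that omits the manifest-list media type receives a single-platform manifest instead. (The bearer token below is a placeholder you would obtain from the registry’s auth endpoint.)

```
# Illustrative only: request the manifest for library/golang and indicate
# support for manifest lists via the Accept header.
curl -s \
  -H "Authorization: Bearer $TOKEN" \
  -H "Accept: application/vnd.docker.distribution.manifest.list.v2+json" \
  https://registry-1.docker.io/v2/library/golang/manifests/latest
```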
Manifest lists have been rolled out for Linux images for most CPU architectures, and Windows support is on its way. If your favorite CPU architecture or OS isn’t covered yet, you can continue to use a CPU- or OS-specific tag or image when pulling. Fetching images by digest is also unaffected by this update.
If you’re interested in building multi-arch images, check out Phil Estes’ manifest-list tool and keep track of the PR to add a manifest command to the Docker CLI.
Manifest lists and multi-arch Docker images have been in the works for a long time. We’re excited that these features are now making it simpler to pull and use Docker Official Repo images seamlessly on the many platforms where Docker is available.
Resources:

Phil Estes’ and Utz Bacher’s posts on Official Images going multi-arch
Official Repo documentation
Details on multi-arch official images
Official Repo GitHub org
Manifest-list specification

 


The post Docker Official Images are now Multi-platform appeared first on Docker Blog.
Source: https://blog.docker.com/feed/

5 ways to unlock the value of video with the help of IBM Watson Media

Understanding video content is a significant challenge for media companies.
The biggest obstacle? Data within a video is largely unstructured and requires complex analysis. In a crowded landscape, it’s becoming essential for media and entertainment companies to extract new insights from video to meet consumers’ and advertisers’ needs.
AI technology can give streaming services a competitive edge. IBM Watson Media, a new set of packaged services available through IBM Cloud that companies can scale and use across their video assets, provides a valuable new resource for clients who need to solve key industry problems.
Here are a few ways cognitive capabilities can increase impact and efficiency across all aspects of streaming video:
1. Finding a needle in a haystack: Content search and discovery
One of the biggest resource drains for production teams is manually scanning stockpiles of footage to find relevant content. By tapping into rich metadata, IBM Watson Media expedites this process to make video more searchable, thereby enabling editors to discover and use archival assets faster.
2. Personalized recommendations: More detailed data for better content matches
In today’s streaming world, it’s essential to deliver the right content to the right viewer at the right time. To meet consumer demand for relevant programming, streaming services must provide highly specific content recommendations.
With deep insights into video content through enhanced metadata, IBM Watson Media provides media companies with a better understanding of what’s inside a video. This enables streaming services to improve their recommendation engines for viewers by analyzing this detailed data to find better matches. With increased personalization, streaming services can optimize the viewer experience and reduce churn.
3. Intelligent closed captioning: What’s love got to do with it?
Media companies rely on speech-to-text technology to deliver a near real-time transcript of commentary. However, closed captioning can be inaccurate, especially during sporting events that require an understanding of specific terminology.
Cognitive capabilities solve those challenges by unlocking what’s inside a video. For instance, IBM Watson understands the difference between romantic “love” and “love” as a score in tennis. This enables Watson to provide more accurate, intelligent captioning to live-streamed events such as the US Open, since it understands the context of the video.
4. Don’t let video go awry: AI helps media companies comply
Service providers and advertisers that encounter regulations around explicit content or product placement may use AI technology as a resource to support their compliance efforts.
Rather than manually digging through footage to flag violence or objectionable language, production teams can use AI to help them meet their compliance obligations. By sourcing rich metadata to understand elements within a video, IBM Watson Media can help identify specific content that should be screened for approval.
5. Take another look: Highlight clipping
IBM Watson Media can identify the most exciting parts of video footage in near real time. This functionality can be crucial for action-packed sporting events such as the US Open, where IBM Watson Media helped video editors quickly package and distribute highlight reels.
Cognitive technology can automatically identify exciting moments based on player movement, match data and crowd noise. By streamlining the process to create highlight reels, IBM Watson Media helps ensure fans won’t miss the action.
In a competitive landscape, media companies need AI technology to help solve pressing industry challenges. By bringing the cognitive power of Watson to video, IBM Watson Media empowers companies to unlock new value from their content.
Learn more about new Watson-powered video services.
The post 5 ways to unlock the value of video with the help of IBM Watson Media appeared first on Cloud computing news.
Source: Thoughts on Cloud

HashiCorp and Google expand collaboration, easing secret and infrastructure management

By Maya Kaczorowski, Security and Privacy Product Manager; and Emily Ye, Software Engineer

Open source technology encourages collaboration and innovation to address real world problems, including projects supported by Google Cloud. As part of our broad engagement with the open source community, we’ve been working with HashiCorp since 2013 to enable customers who use HashiCorp tools to make optimal use of Google Cloud Platform (GCP) services and features.

A longstanding, productive collaboration 
Google and HashiCorp have dedicated engineering teams focused on enhancing and expanding GCP support in HashiCorp products. We’re focused on technical and shared go-to-market efforts around HashiCorp products in several critical areas of infrastructure.

Cloud provisioning: The Google Cloud provider for HashiCorp Terraform allows management of a broad array of GCP resource types, with Bigtable and BigQuery being the most recent additions. Today, HashiCorp also announced support for GCP in the Terraform Module Registry to give users easy access to templates for setting up and running their GCP-based infrastructure. We plan to continue to broaden the number of GCP services that can be provisioned with Terraform, allowing Terraform users to adopt a familiar workflow across multiple cloud and on-premises environments. Using Terraform to move workloads to GCP simplifies the cloud adoption process for Google customers that use Terraform today in cross-cloud environments. 
Cloud security and secret management: We’re working to enhance the integration between HashiCorp Vault and GCP, including Vault authentication backends for IAM and signed VM metadata. This is in addition to work being done by HashiCorp for Kubernetes authentication. 

Using HashiCorp Vault with Google Cloud and Kubernetes 
Applications often require access to small pieces of sensitive data at build or run time, referred to as secrets. HashiCorp Vault is a popular open source tool for secret management, which allows a developer to store, manage and control access to tokens, passwords, certificates, API keys and other secrets. Vault has many options for authentication, known as authentication backends. These allow developers to use many kinds of credentials to access Vault, including tokens, or usernames and passwords.

As of today, developers on Google Cloud have two authentication backends they can use to validate a service’s identity to their instance of Vault: 

GCP IAM service accounts: a new Google Cloud Platform IAM authentication backend for Vault allows you to use an existing IAM identity to authenticate to Vault. 
Google Compute Engine instance identity tokens: announced today, this uses an instance’s signed metadata token to authenticate to Vault. 

With these authentication backends, it’s easier for a particular service running on Google Cloud to get access to a secret stored in Vault that it needs at build or run time.
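
As a rough sketch of what that flow can look like with the Compute Engine instance identity backend, consider the commands below. The role name, audience value, and secret path are assumptions for illustration; consult the Vault GCP auth backend documentation for the exact API.

```
# Illustrative only. A Compute Engine instance fetches its signed identity
# token from the metadata server (the audience value is a placeholder)...
JWT=$(curl -s -H "Metadata-Flavor: Google" \
  "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/identity?audience=vault/my-role&format=full")

# ...and exchanges it for a Vault token under a pre-configured role.
vault write auth/gcp/login role="my-role" jwt="$JWT"

# With the returned Vault token, the service reads the secrets it needs.
vault read secret/my-service/db-password
```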

Fleetsmith is a secure cloud-based solution for managing a company’s Mac computers that fully integrates with G Suite. They’ve been testing out the new Compute Engine metadata backend, and are currently using Vault on GCP for PKI and secret management. Learn more about how Fleetsmith did this in their blog post.

“Fleetsmith and Google have shared values when it comes to security, and we built our product on Google Cloud Platform in part due to Google’s high bar for security. We’re excited about this new integration because it strengthens the security model for us as Google Cloud customers using Vault.” 

— Jesse Endahl, CPO and CSO, Fleetsmith 

If you’re using Vault for managing secrets in Kubernetes specifically, HashiCorp today announced a new Kubernetes authentication backend. This uses Kubernetes pod service accounts to authenticate to Vault, providing an alternative to storing secrets directly in `etcd`.
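
A minimal sketch of that flow, assuming a Vault role named `my-app` has already been configured for the Kubernetes backend (the role name and mount path are illustrative):

```
# Illustrative only: a pod presents its service account token to Vault's
# Kubernetes auth backend and receives a Vault token in return.
JWT=$(cat /var/run/secrets/kubernetes.io/serviceaccount/token)
vault write auth/kubernetes/login role="my-app" jwt="$JWT"
```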

Running HashiCorp Vault on Google Cloud 

You may already be running your own instance of HashiCorp Vault. You can run Vault on either Compute Engine or Google Container Engine, and then use one of the new authentication backends to authenticate to it.
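
For a very rough idea of what a test setup on a Compute Engine VM might look like, here is a minimal sketch for local experimentation only; the storage backend, addresses, and TLS settings are placeholders, and the solution brief mentioned below covers production-grade guidance.

```
# Illustrative only: run a single Vault server with local file storage.
cat > vault-config.hcl <<'EOF'
storage "file" {
  path = "/var/lib/vault/data"
}

listener "tcp" {
  address     = "0.0.0.0:8200"
  tls_disable = 1   # for experimentation only; enable TLS in production
}
EOF

vault server -config=vault-config.hcl &
export VAULT_ADDR="http://127.0.0.1:8200"
vault operator init     # prints unseal keys and an initial root token
vault operator unseal   # repeat with the required quorum of unseal keys
# (older Vault releases use `vault init` and `vault unseal`)
```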

WePay, an online payment service provider, uses HashiCorp Vault on GCP:

 “Managing usernames, passwords and certificates is a challenge in a microservice world, where we have to securely manage many secrets for hundreds of microservices. WePay chose to use HashiCorp Vault to store secrets because it provides us with rotation, tight control and out-of-the-box audit logging for our secrets and other sensitive data. WePay runs Vault server infrastructure on Google Compute Engine for secret storage, key management and service to service authentication, for use by our microservice architecture based on Google Container Engine.”  

— Akshath Kumar, Site Reliability Engineer, WePay 
eBay also uses HashiCorp Vault on GCP:

“As a strong contributor and supporter of free open source software with vital projects such as regressr and datameta, eBay is a user of Hashicorp’s software products, including vaultproject.io on the Google Cloud Platform.”  

— Mitch Wyle, Director of Applied Science and Engineering, eBay 

Today, we’re publishing a solution on how best to set up and run HashiCorp Vault on Compute Engine. For best practices, read the solution brief “Using Vault on Compute Engine for Secret Management”.

Using HashiCorp Terraform to manage your resources on Google Cloud 
When you’re testing new code or software, you might want to spin up a test environment to simulate your application. HashiCorp Terraform is an infrastructure management and deployment tool that allows you to programmatically configure infrastructure across a variety of providers, including cloud providers like Google Cloud.

Using Terraform on Google Cloud, you can programmatically manage projects, IAM policies, Compute Engine resources, BigQuery datasets and more. To get started with Terraform for Google Cloud, check out the Terraform Google Cloud provider documentation, take a look at our tutorial for managing GCP projects with Terraform, which you can follow on our community page, or watch our Terraform for Google Cloud demo.
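
As a small, illustrative sketch of that workflow (the project ID and bucket name are placeholders, and resource arguments can vary across provider versions), here is what provisioning a single GCP resource with Terraform might look like:

```
# Illustrative only: write a minimal Terraform config using the Google
# provider and apply it with the Terraform CLI.
cat > main.tf <<'EOF'
provider "google" {
  project = "my-project-id"     # placeholder project ID
  region  = "us-central1"
}

resource "google_storage_bucket" "example" {
  name     = "my-unique-bucket-name"   # bucket names are globally unique
  location = "US"
}
EOF

terraform init    # download the Google provider plugin
terraform plan    # preview the changes Terraform would make
terraform apply   # create the bucket
```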

Google has released a number of Terraform modules that make working with Google Cloud even easier. These modules let you quickly compose your architectures as code and reuse architectural patterns for resources like load balancing, managed instance groups, NAT gateways and SQL databases. The modules can be found on the Terraform Module Registry.
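
Module usage follows Terraform’s standard `module` block; a rough sketch is below (the source address is a placeholder, so substitute a module actually published on the Terraform Module Registry):

```
# Illustrative only: reference a registry module from a Terraform config.
cat >> main.tf <<'EOF'
module "example" {
  source  = "ORG/MODULE_NAME/google"   # placeholder registry address
  version = "1.0.0"                    # pin the module version
}
EOF

terraform init   # also downloads any modules referenced in the config
```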

Get involved 
We’re always excited about new contributors to the open source projects we support. If you’d like to contribute, please get involved in projects like Kubernetes and Istio, as well as Vault and Terraform. The community is what makes these projects successful. To learn more about the open source projects we support, see Open Source at Google.
Source: Google Cloud Platform