Docker Talks Live Stream Monthly Recap

It’s time for a roundup of Docker Talks, this time from the month of August. As you may remember, Chad Metcalf (@metcalfc) and I (@pmckee) started the weekly live-streaming video series to connect with you, our extended family of developers, and to help you succeed in your Docker journey.

In August, we held four sessions covering how to set up your local development environment with Node.js, the Visual Studio Code remote debugging extension, the Awesome Compose project, and common questions people have when starting with Docker. Below, I’ve put together the list of live streams for the month for your viewing and learning pleasure.

We live stream on our YouTube channel every Wednesday at 10 a.m. Pacific Time. You’ll find all of the past streams there and you can subscribe to get notifications. See you on the next live stream.

Docker Talks Live! Setting up your local development environment with Node.js. Chad and I explore how to set up your local development environment with Node.js and debugging inside of containers. (Streamed live Aug. 5)

Docker Live! Debugging Node.js with VSCode Docker Extension. I talk about the Visual Studio Code remote debugging extension, do some light debugging of Node.js, inspect some containers, and more. (Streamed live Aug. 12)

Docker Live! Awesome Compose and ECS. Chad dives into the Awesome Compose project, a repository containing a curated list of Compose application samples, and ECS, which lets users complete the journey from a local Docker Compose application to AWS. (Streamed live Aug. 19)

Docker Live! Getting Started Q&A. Chad and I go over Docker basics and common questions people have when starting with Docker. What’s an image? What’s a container? Can you use Docker on Windows? And so on. (Streamed live Aug. 26)

The post Docker Talks Live Stream Monthly Recap appeared first on Docker Blog.


Better together: orchestrating your Data Fusion pipelines with Cloud Composer

The data analytics world relies on ETL and ELT pipelines to derive meaningful insights from data. Data engineers and ETL developers are often required to build dozens of interdependent pipelines as part of their data platform, but orchestrating, managing, and monitoring all these pipelines can be quite a challenge.

That’s why we’re pleased to announce that you can now orchestrate and manage your Data Fusion pipelines in Cloud Composer using a rich set of Cloud Data Fusion operators. The new operators let you easily manage your Cloud Data Fusion pipelines from Cloud Composer without having to write lots of code. By populating an operator with just a few parameters, you can deploy, start, and stop your pipelines, saving time while ensuring accuracy and efficiency in your workflows.

Managing your data pipelines

Data Fusion is Google Cloud’s fully managed, cloud-native data integration service, built on the open source CDAP platform. Data Fusion helps users build and manage ETL and ELT data pipelines through an intuitive graphical user interface. By removing the coding barrier, data analysts and business users can now join developers in being able to manage their data.

Managing all your Data Fusion pipelines can be a challenge. Determining how and when to trigger your pipelines, for example, is not as simple as it sounds. In some cases, you may want to schedule a pipeline to run periodically, only to realize that your workflows have dependencies on other systems, processes, and pipelines. You may often need to wait to run a pipeline until some other condition has been satisfied, such as receiving a Pub/Sub message, data arriving in a bucket, or an upstream pipeline producing the data your pipeline depends on.

This is where Cloud Composer comes in.
Google’s Cloud Composer, built on open source Apache Airflow, is our fully managed orchestration service that lets you manage pipelines throughout your data platform. Cloud Composer workflows are configured by building directed acyclic graphs (DAGs) in Python. While a DAG describes the collection of tasks in a given workflow, it’s the operators that determine what actually gets done by each task. You can think of an operator as a template, and these new Data Fusion operators let you easily deploy, start, and stop your ETL/ELT Data Fusion pipelines by providing just a few parameters.

Let’s look at a use case where Composer triggers a Data Fusion pipeline once a file arrives in a Cloud Storage bucket. The steps are carried out as a series of tasks in Composer: once an operator is instantiated, it becomes a single task in a workflow. We will use the CloudDataFusionStartPipelineOperator to start the Data Fusion pipeline.

Using these operators simplifies the DAG. Instead of writing Python code to call the Data Fusion or CDAP API, we provide the operator with details of the pipeline, reducing complexity and improving reliability in the Cloud Composer workflow.

Getting started: orchestrating pipelines

So how does orchestrating pipelines with these operators work in practice? Here’s an example of starting one pipeline; the principles can easily be extended to start, stop, and deploy all your Data Fusion pipelines from Cloud Composer.

Assuming there’s a Data Fusion instance with a deployed pipeline ready to go, let’s create a Composer workflow that checks for the existence of a file in a Cloud Storage bucket. (Note: If you’re not familiar with Cloud Composer DAGs, you may want to start with this Airflow tutorial.)
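As a rough sketch of that first step, the file-existence check can be expressed as a sensor task. This fragment assumes the apache-airflow-providers-google package is installed in the Composer environment; the bucket and object names are placeholders:

```python
from airflow.providers.google.cloud.sensors.gcs import GCSObjectExistenceSensor

# This sensor task polls Cloud Storage and succeeds only once the
# named object exists in the bucket (placeholders shown below).
gcs_sensor = GCSObjectExistenceSensor(
    task_id="gcs_sensor",
    bucket="my-source-bucket",     # placeholder bucket name
    object="incoming/data.csv",    # placeholder object path
)
```

The sensor blocks the downstream tasks in the DAG until the object arrives, which is exactly the "wait for a condition" behavior described above.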
We’ll also add one of the new Data Fusion operators to the Cloud Composer DAG so that we can trigger the pipeline when this file arrives, passing in the new file name as a runtime argument. We can then start our Cloud Composer workflow and see it in action.

1. Check for the existence of an object in the Cloud Storage bucket. Add the GCSObjectExistenceSensor to your DAG. Once this task is started, it will wait for an object to be uploaded to your Cloud Storage bucket.

2. Start the Data Fusion pipeline. Use the CloudDataFusionStartPipelineOperator to start a deployed pipeline in Data Fusion. This task is considered complete once the pipeline has started successfully in Data Fusion. You can check the Airflow documentation for further details on the parameters required by this operator.

3. Set the order of the task flow using the bit-shift operator. When we start this DAG, the gcs_sensor task will run first, and only when this task has completed successfully will the start_pipeline task execute.

4. Upload your DAG to the Cloud Composer DAG bucket and start the workflow. Now that your DAG is complete, click the link to the DAGs folder from the Cloud Composer landing page and upload your DAG. Click the Airflow web server link to launch the Airflow UI, then trigger the DAG by clicking the run button.

5. The tasks are now running. Once a file is uploaded to our source bucket, the Data Fusion pipeline will be triggered.

Operate and orchestrate

Now that you no longer have to write lines of Python code and maintain tests that call the Data Fusion API, you have more time to focus on the rest of your workflow. These Data Fusion operators are a great addition to the suite of operators already available for Google Cloud. Cloud Composer and Airflow also support operators for BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Datastore, Cloud Storage, and Cloud Pub/Sub, allowing greater integration across your entire data platform.
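Putting the steps above together, a complete DAG file might look like the sketch below. It is a workflow-configuration sketch, not a definitive implementation: it assumes the apache-airflow-providers-google package is available in the Composer environment, and the DAG id, bucket, object path, region, instance name, pipeline name, and runtime-argument key are all placeholder values:

```python
from datetime import datetime

from airflow import models
from airflow.providers.google.cloud.sensors.gcs import GCSObjectExistenceSensor
from airflow.providers.google.cloud.operators.datafusion import (
    CloudDataFusionStartPipelineOperator,
)

with models.DAG(
    "datafusion_trigger_example",        # placeholder DAG id
    start_date=datetime(2020, 8, 1),
    schedule_interval=None,              # run when triggered, not on a schedule
) as dag:
    # Step 1: wait for the file to arrive in the source bucket.
    gcs_sensor = GCSObjectExistenceSensor(
        task_id="gcs_sensor",
        bucket="my-source-bucket",       # placeholder bucket name
        object="incoming/data.csv",      # placeholder object path
    )

    # Step 2: start the deployed Data Fusion pipeline, passing the
    # file name as a runtime argument (placeholder argument key).
    start_pipeline = CloudDataFusionStartPipelineOperator(
        task_id="start_pipeline",
        location="us-west1",             # placeholder region
        instance_name="my-instance",     # placeholder Data Fusion instance
        pipeline_name="my_pipeline",     # placeholder deployed pipeline
        runtime_args={"input.file": "incoming/data.csv"},
    )

    # Step 3: the bit-shift operator sets task order; the sensor must
    # succeed before the pipeline is started.
    gcs_sensor >> start_pipeline
```

The `>>` chaining is what makes the gcs_sensor task run first and gates start_pipeline on its success, as described in step 3.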
Using the new Data Fusion operators is a straightforward way to build simpler, easier-to-read DAGs in Cloud Composer. By reducing complexity and removing the coding barrier, managing ETL and ELT pipelines becomes more accessible to members of your organization. Check out the Airflow documentation to learn more about these new operators.
Source: Google Cloud Platform

An application modernization bonanza — What happened at Next OnAir

Week seven of Google Cloud Next ‘20: OnAir was all about application modernization, of both your existing workloads and the ones you will build tomorrow. We kicked things off with not one, not two, but three keynotes: the first, by Eyal Manor (GM & VP, Engineering), Pali Bhat (VP, Product & Design), and Chen Goldberg (Engineering Director), covers all things Anthos. A second keynote, by Bhat and Aparna Sinha (Director, Product Management), shows you how Google Cloud can help bring your application development and delivery processes to the next level. Finally, in the developer keynote, VP of Developer Relations Amr Awadallah tackles the question of whether you can have both innovation and stability in enterprise IT. And then the party really got started.

Key announcements from app modernization week

Anthos just keeps getting better and more full-featured. With Anthos’ new hybrid AI capabilities, you can now run Google Cloud AI services on-premises, near your most sensitive data. We announced Anthos attached clusters, which let you manage AWS EKS and Azure AKS clusters with the Anthos control plane, and the beta availability of Anthos on bare metal, a low-overhead alternative for running Anthos at the edge. In addition, Google Cloud application development tools are now more tightly integrated with Anthos than ever before, and the Anthos Identity Service lets you extend your existing identity solutions to Anthos workloads. Finally, it’s easier than ever to migrate workloads into Anthos, even when they run Windows, or run on Cloud Foundry. Read this blog for more details.

We showered new features on developers this week too. First, we added support for Cloud Run to our Cloud Code IDE plugins, and made it easier to incorporate changes in your local development environment with the help of a new Cloud Run emulator in Cloud Code.
New Google Cloud Buildpacks in Cloud Code help you start writing new applications quickly, while Events for Cloud Run lets you connect Cloud Run services with events from a variety of sources. New Workflows, in beta, let you integrate custom actions, Google APIs, and third-party APIs into your code, and the beta Artifact Registry can help you secure your software supply chain by letting you manage and secure artifacts. Finally, Cloud Run now supports traffic splitting and gradual rollouts, and we’ve made it easier to set up continuous deployment directly from the Cloud Run user interface. For all this developer news (and more!), check out this post.

Are you ready to modernize your applications, but don’t know where to start? The new Google Cloud Application Modernization Program, or Google CAMP, can help you get to the future faster. It combines data-driven assessment and benchmarking, a full suite of developer tooling and compute platforms, and proven best practices and recommendations from Google and the DevOps Research and Assessment (DORA) team. Click here to learn more about how Google CAMP can give you a leg up on your modernization project.

Break out the knowledge

This is quite the week for breakout sessions: not counting solutions keynotes and highlight reels, we debuted 53 new sessions this week! Take your pick from sessions on application modernization and containers, application development, operations and SRE, cost management, security, and serverless. Click here for a full list and to add them to your watchlist.

Watch demos

A conference isn’t complete without demos, and Next OnAir app modernization week was no different. We debuted three new interactive demos, plus four video demos, to educate you about trends in application development and to get you up to speed on Google Cloud’s latest products and features.
To wit: go on an app modernization journey and watch how Anthos balances security with agility, reliability with efficiency, and portability with consistency. Watch cloud-native app development in action, or see how Cloud Code makes it easy to manage Kubernetes config and debug a service on a Kubernetes cluster. Check out the complete list of interactive demo content from the week.

Don’t just take our word for it

Perhaps the most important part of Next OnAir is hearing from your peers at companies that have adopted Google Cloud. Texas retailer H-E-B takes to the OnAir airwaves to talk about how it modernized with containers and Anthos. Lowe’s talks software development, CI/CD, and monitoring; and MTX talks about how Google Cloud has helped state agencies through the pandemic. Etsy shares best practices for cost management using billing data. Game maker Niantic talks about custom metrics and SRE, and Shopify opines on observability. And you’ll be on the edge of your seat listening to Major League Baseball talk about Anthos on bare metal. Check out all these customer stories and others in the full app modernization week session guide.

Looking ahead: AI

Seven weeks of Next OnAir down, two to go. Up next: artificial intelligence. Join us on Tuesday, September 1, when Principal Software Engineer Ting Lu and VP of Product Management Rajen Sheth take to the stage to talk about generating value with Cloud AI. Of course, we’ll also bring you live technical talks and learning opportunities, aligned with each week’s content. Click “Learn” on the Explore page to find each week’s schedule. Haven’t yet registered for Google Cloud Next ‘20: OnAir? Get started at
Source: Google Cloud Platform