Docker Talks Live Stream Monthly Recap

Here at Docker, we have a deep love for developers, and with more and more of the community working remotely, we thought it would be a great time to start live streaming and connecting with the community virtually.

To that end, Chad Metcalf (@metcalfc) and I (@pmckee) have started to live stream every Wednesday at 10am Pacific Time on YouTube. You can find all of the past streams and subscribe to get notifications when we go live on our YouTube channel.

Every week we will cover a new topic focusing on developers and developer productivity using the Docker platform. We will have guest speakers, demo a bunch of code and answer any questions that you might have. 

Below I’ve compiled a list of past live streams that you can watch at your leisure and we look forward to seeing you on the next live stream.

Docker AWS – A match made in heaven

Cloud container runtimes are complex, and the learning curve can be steep for some developers. Not all development teams have DevOps teams to partner with, which shifts the burden of understanding runtime environments, CLIs, and cloud configuration to the development team. But one thing is for sure: developers love the simplicity of Docker and Compose.

In this live stream, follow along as Chad Metcalf (@metcalfc) uses the new ECS context along with Docker Compose commands to run an application locally and then without changes deploy directly to ECS.
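At a high level, the workflow in the stream looks something like the sketch below. The context name is a placeholder, and the commands assume Docker Desktop's ECS integration with AWS credentials already configured:

```shell
# Create a Docker context backed by Amazon ECS
# (prompts for an AWS profile or credentials)
docker context create ecs myecscontext

# Run the application locally against the default context
docker compose up

# Switch to the ECS context and deploy the same Compose file,
# unchanged, to Amazon ECS
docker context use myecscontext
docker compose up
```

The key idea is that the Compose file itself doesn't change; only the active context does.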

Running Docker containers in Azure ACI

Back in June, we announced our partnership with Microsoft to help developers seamlessly move their code and applications from their desktops running Docker to the cloud running on Azure Container Instances (ACI).

Developers can now easily switch between their local Docker context to an ACI context and run a single Docker container or a service composed of a group of multiple containers defined with a Docker Compose file. All this is done without setting up infrastructure and takes advantage of features such as mounting Azure Storage and GitHub repositories as volumes.

In this live stream, Chad Metcalf (@metcalfc) walks us through the new Azure ACI integration using Docker Context and Compose. He starts out by running a single container locally and then switching context to ACI and deploying that same container in the cloud. Then he moves on to show the same developer workflow but now using Docker Compose and ACI to deploy a multi-container application.
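The same context-switching workflow applies here. A sketch of the ACI flow, assuming Docker Desktop's ACI integration and an Azure account (the context name is a placeholder):

```shell
# Log in to Azure and create an ACI-backed context
docker login azure
docker context create aci myacicontext

# Switch to the ACI context and run a single container in Azure
docker context use myacicontext
docker run -d -p 80:80 nginx

# Or deploy a multi-container application from a Compose file
docker compose up
```

As with ECS, the commands are the same ones you use locally; the context determines where the containers actually run.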

Getting Started Q&A

At Docker, our mission is to help developers become more productive. The Docker platform is essential to every developer who is using containers and deploying to the cloud, whether that’s on-prem or on a public cloud.

In this live stream, we answer the top questions developers have when getting started with Docker. We talk about running containers locally, setting up Docker Compose files, building images and a little bit of networking.
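If you want to try the basics we cover there yourself, the core commands look like this (a sketch; the image and tag names are placeholders):

```shell
# Run a container locally, mapping container port 80 to host port 8080
docker run -d -p 8080:80 nginx

# Build an image from the Dockerfile in the current directory
docker build -t my-app:latest .

# Start all of the services defined in docker-compose.yml
docker-compose up -d
```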

If you have a question that you would like us to answer, please feel free to fill out this form and we would be happy to answer these questions on the next Q&A live stream. Or feel free to join us live and drop your questions in the chat box.

VSCode Docker Extension

VSCode is a developer favorite and Microsoft has created a fantastic plug-in to help developers manage the development lifecycle using Docker. In this live stream, Chad Metcalf (@metcalfc) walks Peter McKee (@pmckee) through the major features of the VSCode Docker Extension and answers your questions. We cover the new context features and show how to start, stop and connect to containers using the VSCode Docker Extension.

Resources

For more on how to use Docker and to sign up for a free account, check out the resources below:

Create a Free Docker Account
Download Docker
Docker Overview
Getting started tutorial
The post Docker Talks Live Stream Monthly Recap appeared first on Docker Blog.
Quelle: https://blog.docker.com/feed/

How To Setup Your Local Node.js Development Environment Using Docker – Part 2

In part I of this series, we took a look at creating Docker images and running Containers for Node.js applications. We also took a look at setting up a database in a container and how volumes and network play a part in setting up your local development environment.

In this article we’ll take a look at creating and running a development image where we can compile, add modules and debug our application all inside of a container. This helps speed up the developer setup time when moving to a new application or project. 

We’ll also take a quick look at using Docker Compose to help streamline the processes of setting up and running a full microservices application locally on your development machine.

Fork the Code Repository

The first thing we want to do is download the code to our local development machine. Let’s do this using the following git command:

git clone git@github.com:pmckeetx/memphis.git

Now that we have the code local, let’s take a look at the project structure. Open the code in your favorite IDE and expand the root level directories. You’ll see the following file structure.

├── docker-compose.yml
├── notes-service
│   ├── config
│   ├── node_modules
│   ├── nodemon.json
│   ├── package-lock.json
│   ├── package.json
│   └── server.js
├── reading-list-service
│   ├── config
│   ├── node_modules
│   ├── nodemon.json
│   ├── package-lock.json
│   ├── package.json
│   └── server.js
├── users-service
│   ├── Dockerfile
│   ├── config
│   ├── node_modules
│   ├── nodemon.json
│   ├── package-lock.json
│   ├── package.json
│   └── server.js
└── yoda-ui
    ├── README.md
    ├── node_modules
    ├── package.json
    ├── public
    ├── src
    └── yarn.lock

The application is made up of a couple of simple microservices and a front-end written in React.js. It uses MongoDB as its data store.

In part I of this series, we created a couple of Dockerfiles for our services and also took a look at running them in containers and connecting them to an instance of MongoDb running in a container. 

Local development in Containers

There are many ways to use Docker and containers for local development, and a lot depends on your application’s structure. We’ll start with the very basics and then progress to more complicated setups.

Using a Development Image

One of the easiest ways to start using containers in your development workflow is to use a development image. A development image is an image that has all the tools that you need to develop and compile your application with.

In this article we are using Node.js, so our image should have Node.js installed, as well as npm or yarn. Let’s create a development image that we can use to run our Node.js application in.

Development Dockerfile

Create a local directory on your development machine that we can use as a working directory to save our Dockerfile and any other files that we’ll need for our development image.

$ mkdir -p ~/projects/dev-image

Create a Dockerfile in this folder and add the following commands.

FROM node:12.18.3
RUN apt-get update && apt-get install -y \
    nano \
    vim

We start off by using the node:12.18.3 official image. I’ve found that this image is fine for creating a development image. I like to add a couple of text editors to the image in case I want to quickly edit a file while inside the container.

We did not add an ENTRYPOINT or CMD to the Dockerfile because we will rely on the base image’s ENTRYPOINT and we will override the CMD when we start the image.

Let’s build our image.

$ docker build -t node-dev-image .

And now we can run it.

$ docker run -it --rm --name dev -v $(pwd):/code node-dev-image bash

You will be presented with a bash command prompt. Now, inside the container we can create a JavaScript file and run it with Node.js.

Run the following commands to test our image.

$ cat <<EOF > index.js
console.log('Hello from inside our container')
EOF
$ node index.js

Nice. It appears that we have a working development image. We can now do everything that we would do in our normal bash terminal.

If you ran the above Docker command inside of the notes-service directory, then you will have access to the code inside of the container. 

You can start the notes-service by simply navigating to the /code directory and running npm run start.
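Putting it together, a typical session inside the dev container might look like this (a sketch, assuming you started the container from the notes-service directory as described above):

```shell
cd /code          # the bind-mounted project directory
npm install       # install dependencies inside the container
npm run start     # start the notes-service
```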

Using Compose to Develop locally

The notes-service project uses MongoDB as its data store. If you remember from Part I of this series, we had to start the Mongo container manually and connect it to the same network that our notes-service is running on. We also had to create a couple of volumes so we could persist our data across restarts of our application and MongoDB.

In this section, we’ll create a Compose file to start our notes-service and MongoDB with one command. We’ll also set up the Compose file to start the notes-service in debug mode so that we can connect a debugger to the running Node process.

Open the notes-service in your favorite IDE or text editor and create a new file named docker-compose.dev.yml. Copy and paste the configuration below into the file.

version: '3.8'
services:
  notes:
    build:
      context: .
    ports:
      - 8080:8080
      - 9229:9229
    environment:
      - SERVER_PORT=8080
      - DATABASE_CONNECTIONSTRING=mongodb://mongo:27017/notes
    volumes:
      - ./:/code
    command: npm run debug

  mongo:
    image: mongo:4.2.8
    ports:
      - 27017:27017
    volumes:
      - mongodb:/data/db
      - mongodb_config:/data/configdb
volumes:
  mongodb:
  mongodb_config:

This Compose file is super convenient because now we do not have to type all the parameters to pass to the docker run command. We can do that declaratively in the Compose file.
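To see how much typing that saves, here is roughly what starting the notes service alone would take with docker run (a sketch; the image and network names are illustrative):

```shell
# Roughly what the Compose file replaces for the notes service alone
docker build -t notes-service .
docker run --rm -d \
  --name notes \
  --network notes-net \
  -p 8080:8080 -p 9229:9229 \
  -e SERVER_PORT=8080 \
  -e DATABASE_CONNECTIONSTRING=mongodb://mongo:27017/notes \
  -v "$(pwd)":/code \
  notes-service npm run debug
```

And you would still need a second docker run for Mongo, plus commands to create the network and volumes.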

We are exposing port 9229 so that we can attach a debugger. We are also mapping our local source code into the running container so that we can make changes in our text editor and have those changes picked up in the container.
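For the debugger on port 9229 to be reachable, the notes-service’s package.json needs a debug script that starts Node with the inspector bound to all interfaces. Something along these lines (hypothetical; the actual script comes from the project’s nodemon setup in Part I):

```json
{
  "scripts": {
    "start": "node server.js",
    "debug": "nodemon --inspect=0.0.0.0:9229 server.js"
  }
}
```

Binding the inspector to 0.0.0.0 rather than the default localhost is what makes it accessible from outside the container.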

One other really cool feature of using a Compose file is that we have service resolution set up to use the service names. So we are now able to use “mongo” in our connection string. We use “mongo” because that is what we named our mongo service in the Compose file.
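For instance, the notes-service can pick up that connection string from the environment variable set in the Compose file. A minimal sketch (the variable name matches the Compose file above; the localhost fallback and function name are illustrative):

```javascript
// Sketch: resolve the MongoDB connection string from the environment.
// DATABASE_CONNECTIONSTRING is set in the Compose file and points at the
// "mongo" service by name; Compose's built-in DNS resolves that name.
// The localhost fallback is for running outside of Compose.
function getConnectionString(env = process.env) {
  return env.DATABASE_CONNECTIONSTRING || 'mongodb://localhost:27017/notes'
}

console.log(getConnectionString({ DATABASE_CONNECTIONSTRING: 'mongodb://mongo:27017/notes' }))
// → mongodb://mongo:27017/notes
```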

Let’s start our application and confirm that it is running properly.

$ docker-compose -f docker-compose.dev.yml up --build

We pass the --build flag so Docker will build our image and then start it.

If all goes well, you should see something similar to the following:

Now let’s test our API endpoint. Run the following curl command:

$ curl --request GET --url http://localhost:8080/services/m/notes

You should receive the following response:

{"code":"success","meta":{"total":0,"count":0},"payload":[]}

Connecting a Debugger

We’ll use the debugger that comes with the Chrome browser. Open Chrome on your machine and then type the following into the address bar.

about:inspect

The following screen will open.

Click the “Open dedicated DevTools for Node” link. This will open the DevTools that are connected to the running node.js process inside our container.

Let’s change the source code and then set a breakpoint. 

Add the following code to the server.js file on line 19 and save the file. 

server.use( '/foo', (req, res) => {
  return res.json({ "foo": "bar" })
})

If you take a look at the terminal where our compose application is running, you’ll see that nodemon noticed the changes and reloaded our application.

Navigate back to the Chrome DevTools and set a breakpoint on line 20 and then run the following curl command to trigger the breakpoint.

$ curl --request GET --url http://localhost:8080/foo

Boom! You should see the code break on line 20, and now you can use the debugger just as you normally would. You can inspect and watch variables, set conditional breakpoints, view stack traces, and a bunch of other stuff.

Conclusion

In this article we took a look at creating a general development image that we can use pretty much like our normal command line. We also set up our compose file to map our source code into the running container and exposed the debugging port.

Resources

Getting Started with Docker: https://www.docker.com/get-started
Best practices for writing Dockerfiles: https://docs.docker.com/develop/develop-images/dockerfile_best-practices/
Speed up your development flow with these Dockerfile best practices: https://www.docker.com/blog/speed-up-your-development-flow-with-these-dockerfile-best-practices/
Docker Desktop: https://docs.docker.com/desktop/
Docker Compose: https://docs.docker.com/compose/
Project skeleton samples: https://github.com/docker/awesome-compose

The post How To Setup Your Local Node.js Development Environment Using Docker – Part 2 appeared first on Docker Blog.
Quelle: https://blog.docker.com/feed/

GitOps — Evolution of DevOps

medium.com – In easy words, GitOps is a smart and intelligent way to do Continuous Deployment of cloud-native applications, as it continuously monitors and matches the production environment with the development code …
Quelle: news.kubernauts.io

GitOps

gitops.tech – Since its inception in 2017 by Weaveworks, GitOps has caused quite some fuss on Twitter and KubeCon. This site aggregates the essence of GitOps to help clear up the confusion about the topic. We’re w…
Quelle: news.kubernauts.io