Getting started with Kafka on OpenShift

medium.com – “It’s a distributed streaming platform” is the first definition you can find on Google. It is used for a broad range of applications; here, we use it as a message streaming service. It is a combination …
Source: news.kubernauts.io

Supporting federal agency compliance during the pandemic (and beyond)

If digital transformation was only a trend a few years ago, it’s now quickly becoming a reality for many federal government agencies. The COVID-19 pandemic has pushed all kinds of agencies to reconsider the timelines and impact of their digital initiatives, whether this means moving core technology infrastructure to the cloud, rolling out more modern productivity tools for government employees, or using artificial intelligence to better deliver citizen services.

At Google Cloud, we continue to help federal agencies of all sizes tackle their trickiest problems as they rapidly transform and digitize. At the same time—building on our FedRAMP High authorization announcement from last year—we’re committed to pursuing the latest government certifications, such as the Department of Defense’s (DoD) upcoming Cybersecurity Maturity Model Certification (CMMC), to ensure federal agencies and the vendors that work with them are fully compliant.

Applying intelligent automation to assist the U.S. Patent and Trademark Office

Recently, Accenture Federal Services (AFS) was awarded a position on the U.S. Patent and Trademark Office (USPTO) Intelligent Automation and Innovation Support Services (IAISS) blanket purchase agreement (BPA), a multi-contract vehicle. The five-year BPA includes piloting, testing, and implementing advanced technologies, including intelligent automation, artificial intelligence (AI), microservices, machine learning, natural language processing, robotic process automation, and blockchain. The goal of IAISS is to transform business processes and enhance mission delivery, and it’s expected to be a model for the federal government nationwide.

AFS and Google Cloud previously worked with the USPTO to help the agency’s more than 9,000 patent examiners rapidly perform more thorough searches by augmenting their on-premises search tools with Google’s AI. The new solution—created by merging Google’s machine learning models with Accenture’s design, prototyping, and data science capabilities—helps extend examiners’ expertise during the patent search process.

Supporting secure cloud management at the Defense Innovation Unit

We also recently announced that Google Cloud was chosen by the Defense Innovation Unit (DIU)—an organization within the Department of Defense (DoD) focused on scaling commercial technology across the DoD—to build a secure cloud management solution to detect, protect against, and respond to cyber threats worldwide.

The multi-cloud solution will be built on Anthos, Google Cloud’s app modernization platform, allowing DIU to prototype web services and applications across Google Cloud, Amazon Web Services, and Microsoft Azure—while being centrally managed from the Google Cloud Console. The solution will provide real-time network monitoring, access control, and full audit trails, enabling DIU to maintain its strict cloud security posture without compromising speed and reliability. As a pioneer in zero-trust security and deploying innovative approaches to protect and secure networks, we’re looking forward to partnering with DIU on this critical initiative.

Supporting Cybersecurity Maturity Model Certification (CMMC) readiness

Finally, while COVID-19 has driven a lot of how federal agencies are working day-to-day, the need for strong cybersecurity protections is as important as ever. At Google Cloud, meeting the highest standards for cybersecurity in the ever-evolving threat and regulatory landscape is one of our primary goals.

In January of this year, the DoD published the Cybersecurity Maturity Model Certification (CMMC), a new standard designed to ensure cyber hygiene throughout the DoD supply chain. While the CMMC standard is not yet operational, the CMMC Advisory Board has advised cloud providers to conduct gap analysis against NIST SP 800-53, NIST SP 800-171, and preliminary versions of CMMC requirements. We’ve contracted with a third-party assessor to perform preliminary analyses of Google Cloud against the underlying CMMC controls, and we’re confident we’ll be able to meet the currently proposed controls—and to provide our customers with the right guidance to empower them in their CMMC journeys.

For questions about Google’s existing compliance offerings, FedRAMP, or the CMMC, please contact Google Cloud sales. You can also visit our Compliance Resource Center and Government and Public Sector Compliance page to learn more about how we support your specific compliance needs. And to read more about our work with the public sector, including how we’re helping support agencies through the pandemic, visit our website.
Source: Google Cloud Platform

Grow your cloud career with high-growth jobs and skill badges

Cloud computing and data skills are especially in demand, as organizations are increasingly turning to digital solutions to transform the way they work and do business. The World Economic Forum predicts there will be close to a 30 percent increase in demand for data, AI, engineering, and cloud computing roles by 2022. Since April, Google Cloud learners have more than doubled year-over-year1. Of those who have started learning with us in 2020, many are looking to upskill or reskill into stable, well-paying career paths.

To help our expanding community of learners ramp up quickly in their cloud careers, Google Cloud is unveiling a new Grow your cloud career webpage where you can find information on in-demand cloud career paths and free upskilling and reskilling resources. You can earn your first Google Cloud skill badges for your resume, which signify to employers that you have hands-on Google Cloud experience. We also have a special no-cost learning section for small business leaders to help you build your first website and transform your business with data and AI.

If you’re not sure which cloud role is right for you, we recommend exploring these three high-growth career paths.

Data Analyst

By 2025, an estimated 463 exabytes of data is expected to be generated every day. From online purchases, to personal health trackers, to smart factories, and more, the world generates massive amounts of data, but without Data Analysts this data is meaningless. Data Analysts interpret and gather insights from data, enabling better decision making. Their work is instrumental across several industries and for many business functions, including product development, supply chain management, and customer experience. You don’t need a technical background to get started in this role, but you will need to develop foundational skills in SQL (Structured Query Language), data visualization, and data warehousing.

Cloud Engineer

With more than 88 percent of organizations now using cloud and planning to increase their usage, it’s no wonder that the Cloud Engineer role was one of the top in-demand job roles in the U.S. in 2019. Cloud Engineers play a critical role in setting up their company’s infrastructure, deploying applications, and monitoring cloud systems and operations. If you have education or experience in IT, the Cloud Engineer role may be the most natural path for you. It will give you a broad foundation in cloud and expose you to several different functions. Although working in cloud will require a shift in mindset for most with a traditional IT background, particularly in terms of automated infrastructure, scale, and agile workflows, there are several transferable IT skills that will continue to serve you well in this role.

Cloud Application Developer

For those with a software development background, expanding your skills into cloud development is a must. Cloud offers developers several benefits, including scalability, better security, cost efficiencies, and ease of deployment. As a Cloud Developer, you are responsible for designing, building, testing, deploying, and monitoring highly scalable and reliable cloud-native applications. To upskill into this role, you will need to gain a deep understanding of cloud platforms, databases, and systems integration.

If you’re ready to jumpstart your cloud career, visit our Grow your cloud career page where you can start upskilling and earning Google Cloud recognized skill badges for the Data Analyst, Cloud Engineer, or Cloud Developer job roles—get started at no cost here.

1. According to internal data.
Source: Google Cloud Platform

How To Setup Your Local Node.js Development Environment Using Docker

Docker is the de facto toolset for building modern applications and setting up a CI/CD pipeline, helping you build, ship, and run your applications in containers on-prem and in the cloud.

Whether you’re running on simple compute instances such as AWS EC2 or Azure VMs, or on something a little fancier like a hosted Kubernetes service such as AWS EKS or Azure AKS, Docker’s toolset is your new BFF.

But what about your local development environment? Setting up local dev environments can be frustrating, to say the least.

Remember the last time you joined a new development team?

You needed to configure your local machine, install development tools, pull repositories, fight through out-of-date onboarding docs and READMEs, and get everything running and working locally without knowing a thing about the code and its architecture. Oh, and don’t forget about databases, caching layers, and message queues. These are notoriously hard to set up and develop on locally.

I’ve never worked at a place where we didn’t expect at least a week or more of onboarding for new developers.

So what are we to do? Well, there is no silver bullet, and these things are hard to do (that’s why you get paid the big bucks), but with the help of Docker and its toolset, we can make things a whole lot easier.

In Part I of this tutorial we’ll walk through setting up a local development environment for a relatively complex application that uses React for its front end, Node and Express for a couple of microservices, and MongoDB for our datastore. We’ll use Docker to build our images and Docker Compose to make everything a whole lot easier.

If you have any questions or comments, or just want to connect, you can reach me in our Community Slack or on Twitter at @pmckee.

Let’s get started.

Prerequisites

To complete this tutorial, you will need:

- Docker installed on your development machine. You can download and install Docker Desktop from the links below:
  - Docker Desktop for Mac
  - Docker Desktop for Windows
- Git installed on your development machine.
- An IDE or text editor to use for editing files. I would recommend VSCode.

Fork the Code Repository

The first thing we want to do is download the code to our local development machine. Let’s do this using the following git command:

git clone git@github.com:pmckeetx/memphis.git

Now that we have the code local, let’s take a look at the project structure. Open the code in your favorite IDE and expand the root level directories. You’ll see the following file structure.

├── docker-compose.yml
├── notes-service
│   ├── config
│   ├── node_modules
│   ├── nodemon.json
│   ├── package-lock.json
│   ├── package.json
│   └── server.js
├── reading-list-service
│   ├── config
│   ├── node_modules
│   ├── nodemon.json
│   ├── package-lock.json
│   ├── package.json
│   └── server.js
├── users-service
│   ├── Dockerfile
│   ├── config
│   ├── node_modules
│   ├── nodemon.json
│   ├── package-lock.json
│   ├── package.json
│   └── server.js
└── yoda-ui
    ├── README.md
    ├── node_modules
    ├── package.json
    ├── public
    ├── src
    └── yarn.lock

The application is made up of a couple of simple microservices and a front-end written in React.js. It uses MongoDB as its datastore.

Typically at this point, we would start a local version of MongoDB or start looking through the project to find out where our applications will be looking for MongoDB.

Then we would start each of our microservices independently and then finally start the UI and hope that the default configuration just works.

This can be very complicated and frustrating, especially if our microservices are using different versions of Node.js and are configured differently.

So let’s walk through making this process easier by dockerizing our application and putting our database into a container. 

Dockerizing Applications

Docker is a great way to provide consistent development environments. It will allow us to run each of our services and UI in a container. We’ll also set up things so that we can develop locally and start our dependencies with one docker command.

The first thing we want to do is dockerize each of our applications. Let’s start with the microservices because they are all written in Node.js and we’ll be able to use the same Dockerfile.

Create Dockerfiles

Create a Dockerfile in the notes-service directory and add the following commands.
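A minimal Dockerfile for a Node.js service along these lines might look like the sketch below. The base image tag and start script are assumptions; adjust them to match the service:

# Assumed Node.js base image; pin whatever version your service targets
FROM node:14-alpine
# Work out of a dedicated directory inside the image
WORKDIR /code
# Copy the dependency manifests first so installs are cached between builds
COPY package.json package-lock.json ./
# Install dependencies from the lockfile
RUN npm ci
# Copy in the rest of the application source
COPY . .
# Start the service (assumed npm start script; the port is configured via SERVER_PORT later)
CMD ["npm", "run", "start"]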

This is a very basic Dockerfile to use with Node.js. If you are not familiar with the commands, you can start with our getting started guide. Also take a look at our reference documentation.

Building Docker Images

Now that we’ve created our Dockerfile, let’s build our image. Make sure you’re still located in the notes-service directory and run the following command:

docker build -t notes-service .

Now that we have our image built, let’s run it as a container and test that it’s working.

docker run --rm -p 8081:8081 --name notes notes-service

Looks like we have an issue connecting to MongoDB. Two things are broken at this point: we didn’t provide a connection string to the application, and we don’t have MongoDB running locally.

At this point we could provide a connection string to a shared instance of our database, but we want to be able to manage our database locally, without worrying about corrupting data our colleagues might be using for development.

Local Database and Containers

Instead of downloading, installing, and configuring MongoDB and then running the database service ourselves, we can use the Docker Official Image for MongoDB and run it in a container.

Before we run MongoDB in a container, we want to create a couple of volumes that Docker can manage to store our persistent data and configuration. I like to use the managed volumes that Docker provides instead of using bind mounts. You can read all about volumes in our documentation.

Let’s create our volumes now. We’ll create one for the data and one for configuration of MongoDB.

docker volume create mongodb
docker volume create mongodb_config

Now we’ll create a network that our application and database will use to talk with each other. This is called a user-defined bridge network and gives us a nice DNS lookup service that we can use when creating our connection string.

docker network create mongodb

Now we can run MongoDB in a container and attach it to the volumes and network we created above. Docker will pull the image from Docker Hub and run it for you locally.

docker run -it --rm -d -v mongodb:/data/db -v mongodb_config:/data/configdb -p 27017:27017 --network mongodb --name mongodb mongo

Okay, now that we have MongoDB running, we also need to set a couple of environment variables so our application knows what port to listen on and what connection string to use to access the database. We’ll do this right in the docker run command.

docker run -it --rm -d --network mongodb --name notes -p 8081:8081 -e SERVER_PORT=8081 -e DATABASE_CONNECTIONSTRING=mongodb://mongodb:27017/yoda_notes notes-service

Let’s test that our application is connected to the database and is able to add a note.

curl --request POST --url http://localhost:8081/services/m/notes --header 'content-type: application/json' --data '{ "name": "this is a note", "text": "this is a note that I wanted to take while I was working on writing a blog post.", "owner": "peter" }'

You should receive the following JSON back from our service.

{"code":"success","payload":{"_id":"5efd0a1552cd422b59d4f994","name":"this is a note","text":"this is a note that I wanted to take while I was working on writing a blog post.","owner":"peter","createDate":"2020-07-01T22:11:33.256Z"}}

Conclusion

Awesome! We’ve completed the first steps in Dockerizing our local development environment for Node.js.

In Part II of the series, we’ll take a look at how we can use Docker Compose to simplify the process we just went through.
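As a rough preview of where Part II is headed, a Compose file for this stack might look something like the sketch below. It is only an approximation based on the docker commands above, and the repo’s actual docker-compose.yml may differ, but it shows how the volumes, ports, network, and environment variables collapse into a single declarative file:

version: "3.8"
services:
  notes:
    # Build the notes service from the Dockerfile we created above
    build: ./notes-service
    ports:
      - "8081:8081"
    environment:
      - SERVER_PORT=8081
      - DATABASE_CONNECTIONSTRING=mongodb://mongodb:27017/yoda_notes
    depends_on:
      - mongodb
  mongodb:
    # Docker Official Image, matching the docker run command above
    image: mongo
    ports:
      - "27017:27017"
    volumes:
      - mongodb:/data/db
      - mongodb_config:/data/configdb
volumes:
  mongodb:
  mongodb_config:

Compose puts the services on a shared network automatically, so the mongodb hostname in the connection string resolves just like it did with our user-defined bridge network, and starting the whole stack becomes a single docker-compose up.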

In the meantime, you can read more about networking, volumes, and Dockerfile best practices via the links below:

- Docker Networking
- Volumes
- Best practices for writing Dockerfiles
Source: https://blog.docker.com/feed/