Docker + Arm Virtual Meetup Recap: Building Multi-arch Apps with Buildx

Docker support for cross-platform applications is better than ever. At this month’s Docker Virtual Meetup, we featured Docker Architect Elton Stoneman showing how to build and run truly cross-platform apps using Docker’s buildx functionality. 
With Docker Desktop, you can now describe all the compilation and packaging steps for your app in a single Dockerfile, and use it to build an image that will run on Linux, Windows, Intel and Arm – 32-bit and 64-bit. In the video, Elton covers the Docker runtime and its understanding of OS and CPU architecture, together with the concept of multi-architecture images and manifests.
The key takeaways from the meetup on using buildx:

Everything should be multi-platform
Always use multi-stage Dockerfiles 
buildx is experimental but solid (based on BuildKit)
Alternatively use docker manifest — also experimental
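A minimal multi-stage Dockerfile along the lines of those takeaways might look like this. It is a sketch only: the Go application, paths, and base images are hypothetical, and for brevity it covers Linux targets (BuildKit automatically supplies the `BUILDPLATFORM`, `TARGETOS`, and `TARGETARCH` build arguments):

```dockerfile
# Build stage: runs on the build host's platform, cross-compiles for the target
FROM --platform=$BUILDPLATFORM golang:1.13 AS builder
ARG TARGETOS
ARG TARGETARCH
WORKDIR /src
COPY . .
# CGO disabled so the binary is fully static and portable across distros
RUN GOOS=$TARGETOS GOARCH=$TARGETARCH CGO_ENABLED=0 go build -o /out/app .

# Runtime stage: a small image for whichever platform buildx is targeting
FROM alpine:3.10
COPY --from=builder /out/app /usr/local/bin/app
ENTRYPOINT ["app"]
```

Because the compile stage always runs on the build host's own platform, cross-compiling this way is much faster than emulating the target CPU for the whole build.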

Not a Docker Desktop user? Jason Andrews, a Solutions Director at Arm, posted this great article on how to set up buildx using Docker Community Engine on Linux. 
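Whichever engine you use, the buildx workflow is roughly the same. A sketch, assuming a placeholder image name of `myrepo/myapp` (buildx is experimental, so it may need to be enabled explicitly):

```shell
# Enable the experimental CLI features that expose buildx
export DOCKER_CLI_EXPERIMENTAL=enabled

# Create a builder instance that can target multiple platforms, and select it
docker buildx create --name multiarch --use
docker buildx inspect --bootstrap

# Build for several platforms in one pass; --push uploads the images
# and the manifest list that ties them together
docker buildx build \
  --platform linux/amd64,linux/arm64,linux/arm/v7 \
  --tag myrepo/myapp:latest \
  --push .
```

After the push, `docker pull myrepo/myapp:latest` on any of those platforms resolves to the right image automatically via the manifest list.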
Check out the full meetup on Docker’s YouTube Channel:

You can also access the demo repo here. The sample code for this meetup is from Elton’s latest book, Learn Docker in a Month of Lunches, an accessible task-focused guide to Docker on Linux, Windows, or Mac systems. In it, you’ll learn practical Docker skills to help you tackle the challenges of modern IT, from cloud migration and microservices to handling legacy systems. There’s no excessive theory or niche-use cases — just a quick-and-easy guide to the essentials of Docker you’ll use every day (use the code webdoc40 for 40% off).
To get started building multi-arch apps today:

Download Docker Desktop 
Read about Building Multi-Arch Images for Arm and x86 with Docker Desktop
Watch the DockerCon session on Developing Containers for Arm


The post Docker + Arm Virtual Meetup Recap: Building Multi-arch Apps with Buildx appeared first on Docker Blog.
Source: https://blog.docker.com/feed/

New in Docker Hub: Personal Access Tokens

The Hub token list view.
On the heels of our recent update on image tag details, the Docker Hub team is excited to share the availability of personal access tokens (PATs) as an alternative way to authenticate into Docker Hub.
Already available as part of Docker Trusted Registry, personal access tokens can now be used as a substitute for your password in Docker Hub, especially for integrating your Hub account with other tools. You’ll be able to leverage these tokens for authenticating your Hub account from the Docker CLI – either from Docker Desktop or Docker Engine: 
docker login --username <username>
When you’re prompted for a password, enter your token instead.
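For scripted integrations, the token can also be piped in on stdin so it never appears in your shell history or process list. A sketch, where the username and token file are placeholders:

```shell
# Read the personal access token from a file and pass it on stdin;
# --password-stdin avoids exposing the token as a command-line argument
cat hub_token.txt | docker login --username myusername --password-stdin
```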
The advantage of using tokens is the ability to create and manage multiple tokens at once so you can generate different tokens for each integration – and revoke them independently at any time.
Create and Manage Personal Access Tokens in Docker Hub 
Personal access tokens are created and managed in your Account Settings.
From here, you can:

Create new access tokens
Modify existing tokens
Delete access tokens

Creating an access token in Docker Hub.
Note that the actual token is only shown once, at the time of creation. You will need to copy the token and save it in a credential manager, or use it immediately. If you lose a token, you will need to delete the lost token and create a new one. 
The Next Step for Tokens
Personal access tokens open a new set of ways to authenticate into your Docker Hub account. Their introduction also serves as a foundational building block for more advanced access control capabilities, including multi-factor authentication and team-based access controls – both areas that we’re working on at the moment. We’re excited to share this and many other updates that are coming to Docker Hub over the next few months. Give access tokens a try and let us know what you think!
To learn more about personal access tokens for Docker Hub:

Read more about Docker Hub
Explore the Docker Hub documentation 
Get started with Docker by creating your Hub account



How Wiley Education Services Empowers Students with Docker Enterprise

We sat down recently with our customer, Wiley Education Services, to find out how Docker Enterprise helps them connect with and empower higher education students. Wiley Education Services (WES) is a division of Wiley Publishing that delivers online services to over 60 higher education institutions.
We spoke with Blaine Helmick, Senior Manager of Systems Engineering about innovation and technology in education. Read on to learn more about Wiley, or watch the short video interview with Blaine:

On Wiley’s Mission…
Our mission at Wiley Education Services is empowering people, to connect people to their futures. We serve over 60 higher education partners around the world, and our role is to connect you to our higher education partners when you’re looking for a degree and you’re frankly looking to change your life. 
On the Innovation at a 200 Year Old Company… 
Wiley has been around for over 200 years. One of the really amazing things about being in an organization that’s been around that long is that you have to have a culture of innovation at your core. 
Technology like Docker has really empowered our business because it allows us to innovate, and it allows us to experiment. That’s critical because experimentation and being allowed to fail is what allows us to innovate and learn. 
An Architecture that Wouldn’t Scale…
Our business first looked at Docker right about when I joined in late 2016. Our marketing solution for higher education partners was running on two VMs in AWS, including the database, and it just wasn’t scalable. We just didn’t have the ability that we needed to quickly adapt and evolve to meet the needs of our higher education partners. If we wanted to stay competitive we were going to have to make some architectural changes. 
And a Memorable Outage…
A memorable experience at Wiley for me was when we had a huge outage on the servers that ran our core service. We were down for a good three days while we essentially rebuilt our infrastructure. It led to an epiphany about the analysis we had done of our infrastructure. Our partner had already recommended Docker, but it hadn’t coalesced yet. That was the moment it did. 
On How Docker Helps Wiley Achieve Their Vision
Docker helps us make good on the mission at Wiley by reducing the amount of time that it takes for us to get one of our websites out in the marketplace to help and empower students. If we can make the connection to students faster and deliver new information that helps connect those students to our higher education partners, that’s incredibly important. Closing that gap is what empowers us and ultimately empowers those students to succeed.
On Innovation without the Cost…
Docker has been eye-opening for me. One key benefit that we got immediately with Docker Enterprise was speed — how quickly we can create and publish containers, test them, and bring them back down if they don’t work without the long cycles we needed before. That’s huge, because it lets us innovate quickly.
The second thing that it brought us was cost savings. The ability to bring up new tech rapidly without necessarily having to make huge investments has been tremendous. 
The Future at Wiley Education Services
Now we’re looking to bring Docker throughout the organization so that we’re not just taking what we have and putting it out into Docker containers, but also introducing things like microservices, or being able to take multiple datacenters from around the world and orchestrate our containers so we have full redundancy.
But it isn’t just about redundancy. It’s about meeting our students — and customers — where they are. So having the capability that Docker and Kubernetes provides will let us take our marketing sites and move them throughout the world so we’re reaching students where they are. That’s what allows us to reach those students faster. 
To learn more about Wiley’s story and how Docker can help you innovate:

Watch Wiley’s DockerCon presentation
Read the Digital Transformation Imperative eBook



How InterSystems Builds an Enterprise Database at Scale with Docker Enterprise

We sat down recently with InterSystems, our partner and customer, to talk about how they deliver an enterprise database at scale to their customers. InterSystems’s software powers mission-critical applications at hospitals, banks, government agencies and other organizations.
We spoke with Joe Carroll, Product Specialist, and Todd Winey, Director of Partner Programs at InterSystems about how containerization and Docker are helping transform their business.
Here’s what they told us. You can also catch the highlights in this 2 minute video:

On InterSystems and Enterprise Databases…
Joe Carroll: InterSystems is a 41 year old database and data platform company. We’ve been in data storage for a very long time and our customers tend to be traditional enterprises — healthcare, finance, shipping and logistics as well as government agencies. Anywhere that there’s mission critical data we tend to be around. Our customers have really important systems that impact people’s lives, and the mission critical nature of that data characterizes who our customers are and who we are.
On Digital Transformation in Established Industries…
Todd Winey: Many of those organizations and industries have been traditionally seen as laggards in terms of their technology adoption in the past, so the speed with which they’re moving to digital transformation is a key theme for everyone involved. Our customers are really seeing the benefits of being able to adopt technology faster and with a higher degree of confidence.
Our goal is to support them on that journey with our software. To do that, we’ve had to transform our business at InterSystems.
Why Docker…
Todd: From a software delivery standpoint, Docker has provided some truly amazing capabilities. Quality development was probably one of the biggest bottlenecks we faced, along with creating the processes we needed to ensure good quality software was going out the door.
And so rather than solve it by trying to throw more people at the problem, Docker’s platform allows us to do a lot of automation in that process so that we’re getting orders of magnitude improvements without additional staff.
On Scaling Software Testing for an Enterprise Database…
Joe: The InterSystems IRIS platform is the latest iteration of our database platform, and it is built container- and cloud-first. We modernized the database with Docker Enterprise. 
The scalability that Docker Enterprise gives us in terms of our testing infrastructure allows us to go from tens of tests a day to thousands of tests every night, and eventually tens of thousands of tests every night. We’re able to provide higher quality software for our customers because of this testing infrastructure. 
We’re able to test at scale, and do that quickly without sacrificing quality. And when we kick off our testing suite we can be confident the software will run on whatever cloud provider we need, so we can ensure portability of our application. 
On How Docker has Helped…
Todd: Adopting Docker Enterprise really provided three key benefits. We’re getting quarterly releases out the door, whereas before we were looking at maybe one major release, and we’re really starting to push the envelope on continuous delivery. So for customers who are ready to take software with a new build on a daily basis, we can meet those demands.
Staffing of the software quality development process is much easier. We’re able to do more with less, which is what every business wants and what every business wants out of a digital transformation.
We’ve always delivered our software across a large number of operating systems. Docker allows us to treat cloud as one more operating system, so we can provide flexibility to our customers to run our software confidently where they want.
The Future for InterSystems
InterSystems expects to continue improving its software release cycle, delivering software updates daily to its enterprise database.
To learn more about InterSystems and how Docker helps enterprises build better software:

Watch the InterSystems DockerCon 2019 session
See what’s new in Docker Enterprise 3.0



Powering Docker App: Next Steps for Cloud Native Application Bundles (CNAB)

Last year at DockerCon and Microsoft Connect, we announced the Cloud Native Application Bundle (CNAB) specification in partnership with Microsoft, HashiCorp, and Bitnami. Since then the CNAB community has grown to include Pivotal, Intel, DataDog, and others, and we are all happy to announce that the CNAB core specification has reached 1.0.
We are also announcing the formation of the CNAB project under the Joint Development Foundation, a part of the Linux Foundation that’s chartered with driving adoption of open source and standards. The CNAB specification is available at cnab.io. Docker is working hard with our partners and friends in the open source community to improve software development and operations for everyone.
Docker’s Implementation of CNAB — Docker App
Docker was one of the first to implement the CNAB specification with Docker App, our reference implementation available on GitHub. Docker App can be used to both build CNAB bundles for Docker Compose (which can then be used with any other CNAB client), and also to install, upgrade, and uninstall any other CNAB bundle.
It also forms the underpinnings of application templates in Docker Desktop Enterprise. With Docker App, we are making CNAB-compliant applications as easy to use as Docker images; you will get the same benefits of immutability and a simple user experience for building, sharing and running containers now applied to multi-service applications. 
Docker’s contribution to CNAB stems from our desire to build an application ecosystem like we did for the container ecosystem. And CNAB is the building block for this — it’s a packaging format. Going forward, Docker recognizes that a single container is not enough to express an application. This is why Docker’s strategy is to help grow this application ecosystem and make modern applications a core part of our platform. This means building, managing, and securing all of your applications from traditional applications to cutting-edge microservices — and deploying them anywhere.
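As a rough sketch of the Docker App workflow described above (command names follow the docker-app CLI at the time of writing; the app, registry, and environment names are placeholders):

```shell
# Scaffold a .dockerapp package from an existing Compose file
docker app init myapp --compose-file docker-compose.yml

# Push the application to a registry as a CNAB bundle
docker app push myapp --tag myrepo/myapp:0.1.0

# Install the bundle into a target environment; any other CNAB-aware
# client could consume the same bundle
docker app install myrepo/myapp:0.1.0 --name myapp-prod
```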
What Comes Next
Docker is excited to see the momentum building with CNAB from our new partners, Pivotal, DataDog, and Intel, as well as Microsoft, Bitnami, and HashiCorp, who started the journey with us. There is still a lot more to do, including ensuring that both Docker App – our reference implementation of the spec – and all other CNAB tools become Core 1.0 compliant. We will also continue to provide community leadership and help drive further industry adoption of CNAB. Stay tuned for more exciting announcements from Docker and the CNAB community.


Getting Started
Docker welcomes you to get involved with the community. Here are some resources to get started:

CNAB explainer video presented by Docker and Microsoft. 
CNAB 1.0 Core Specification
Join the #cnab channel in the CNCF’s Slack. The CNAB community has an open meeting every other Wednesday at 09:00AM US Pacific.
Video from DockerCon19: Deploying Distributed Applications with Docker App and CNAB
Docker App: Make your Docker Compose applications reusable, and share them on Docker Hub 
Deis Labs has a blog series on CNAB: Part 1, and Part 2.


Introducing Docker Hub’s New & Improved Tag User Experience

One of Docker’s core missions is delivering choice and flexibility across different application languages and frameworks, operating systems, and infrastructure. When it comes to modern applications, the choice of infrastructure is not just whether the application is run on-premises, on virtual machines or bare metal, or in the cloud. It can also be a choice of which architecture – x86, Arm, or GPU. 
Today, we’re happy to share some updates in Docker Hub that make it easier to access multi-architecture images and scanning results through the Tag UX. 
Navigating to Image Tags
In this example, we’re looking at a listing for a Docker Official Image that supports x86, PowerPC and IBMz as listed in the labels. When you land on the image page on Docker Hub, you can quickly identify if an image supports multiple architectures in the labels underneath the image name. For further details, you can click on ‘Tags’:

In this section, you can now view the different architectures separately to easily identify the right image for the architecture you need, complete with image size and operating system information:

If you click on the digest for a particular architecture, you will now also be able to see the actual source of the image – the layer-by-layer details that make up the image. 
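The same per-architecture detail is also visible from the CLI via the (experimental) manifest command. For example, inspecting a public multi-arch tag lists one manifest entry per supported platform, each with its own digest, OS, and architecture:

```shell
# List the platform-specific images behind a multi-arch tag
docker manifest inspect ubuntu:latest
```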
If vulnerability scanning was completed on that image, you’ll also be able to quickly identify whether the image has any known vulnerabilities and, if so, whether they are rated as critical, major or minor. The scanning results are based on a binary-level scan of the image against the CVE database. 
In this second example, you’ll see that layers 2 to 6 have passed vulnerability scanning via the green checkboxes, but the very first layer of the image has issues:

When you click on the first row, you’ll see that the image contains multiple components and that multiple components have known vulnerabilities ranging from minor to critical. To explore further, click on the caret to expand and view all of the found vulnerabilities:

Each vulnerability is linked directly to the CVE so that you can learn more about the CVE and its implications. That will allow you to decide if it needs to be fixed or not. 
Get Started with Docker and Explore Docker Hub
Docker Hub is a great place to start learning more about Docker containers. Create your own account and get access to a guided tutorial on using Docker Hub and the free Docker Desktop. With your account, you also get free public repositories and one free private repository to start sharing and collaborating on content.


To learn more about Docker Hub:

Create your Docker Account
Read the Docker Hub Quickstart
Find out how to integrate Docker Hub and Docker Trusted Registry


Top 12 Questions from the Docker Enterprise 3.0 Webinar Series

Earlier in August, we hosted a series of virtual events to introduce Docker Enterprise 3.0. Thousands of you registered and joined us, and many of you asked great questions. This blog contains the top questions and answers from the event series.

Docker Enterprise in the Cloud, On-Prem, with Kubernetes
Q: Can Docker Enterprise be used on AWS and other cloud providers?
A: Yes! Docker Enterprise, including the Docker Universal Control Plane (UCP) and Docker Trusted Registry (DTR), can be deployed to any of the leading cloud environments, including AWS, Azure and GCP. With Docker Enterprise 3.0, we also launched the Docker Cluster CLI plugin for use with Docker Certified Infrastructure. The plugin (now supporting AWS and Azure) allows for simple installation and upgrading of Docker Enterprise on selected cloud providers.
Q: Is Docker Cluster only available in the public cloud, or is it possible to add local machines or VMs?
A: Additional support for VMware vSphere environments is coming shortly. If you have other platforms that need to be supported, please engage with your account team to provide that feedback!
Q: Does Docker Kubernetes Service (DKS) work with both on-premises and other Kubernetes environments such as EKS, AKS, GKE?
A: Docker Kubernetes Service is an integrated and certified Kubernetes distribution that is included in the Docker Enterprise platform – both in Docker Desktop Enterprise and in our Universal Control Plane (UCP). As a conformant Kubernetes distribution, there is an inherent compatibility between Docker Kubernetes Service and other cloud-based Kubernetes environments. However, if you are using EKS/AKS/GKE, you will not need to install another Kubernetes distribution for your runtime environment and that means you will not need UCP. You will still benefit from other elements of the Docker platform including Docker Desktop Enterprise for local developer environments and Docker Hub and Docker Trusted Registry for collaborating with trusted content. 
Q: My organization is new to containers and Kubernetes – what’s the recommended path to get started?
A: Most of our customers are new to these technologies. Docker Enterprise Solutions offer an easy on-ramp for customers to deploy and operationalize Docker Enterprise (including Docker Kubernetes Service) within their environments. Solutions include Docker Enterprise platform subscriptions and professional services that leverage a prescriptive methodology developed over time working with hundreds of enterprise customers. You can learn more at docker.com/solutions/docker-enterprise-solutions.
Docker Enterprise 3.0 Security
Q: Is there a way to determine that an image in Docker Hub is validated and has the appropriate security settings?
A: Official and Verified Publisher images provide a first-level screen, validating that they came from a trusted source – either under Docker’s direct oversight or from validated third-party vendors. Certified Images in Docker Hub must additionally pass a security test. For an even higher level of assurance, we recommend scanning images for known vulnerabilities once they are added to your own private registry.
Q: What if a vulnerability is discovered after you have deployed it to production? Will you be alerted?
A: Yes! Docker Enterprise tracks the layers that have been scanned. If a new vulnerability is detected, you will be alerted on previously scanned images and, using UCP, have the ability to track where those images are deployed. 
We track vulnerabilities closely – each of the Official Images can be traced back to GitHub. The vulnerability scanning capabilities compare the layers in the image against the CVE database. If there is an older release of software contained in the image, vulnerability scanning (binary-level scanning) will pick that up and flag it. Then you can rebuild that image with the latest patch.
Docker Desktop, Docker App and CNAB
Q: How is Docker Desktop Enterprise different from the community version?
A: The key differences are in two areas: developer productivity and IT manageability. To improve productivity, Docker Desktop Enterprise includes an application designer interface that makes it easy to build container-based applications using pre-defined templates. When it comes to improving manageability, Docker Desktop Enterprise can be deployed via IT’s choice of endpoint management tools, with optional lockable settings. You can see a full list of enhancements here.
Q: I want our developers to work with Docker on Windows desktops, but the production environments are Linux. Can they develop for Linux in Docker Desktop for Windows?
A: Yes! Docker Desktop for Windows already exists today for native .NET and Linux based development. We also recently introduced the Tech Preview to support WSL2 – an improved Linux experience within Windows! You can learn more about WSL2 here.
Q: Is CNAB a viable solution for deploying edge architecture applications?
A: Yes. The CNAB specification is designed to support multiple configuration formats, making it future-proof and inclusive of things like Helm charts and Kubernetes object YAML files. That allows you to support both existing tech stacks and future tech stacks.
Q: What application frameworks does Docker App support?
A: There is no restriction on application frameworks for Docker App. Today, Docker App supports the packaging of multiple Docker Compose files into a single bundle. These Compose files can be mapped to monolithic or n-tier applications or microservices – there is no dependency on the application architecture. As Docker App expands to support Helm charts and Kubernetes YAML, this will further embrace other configuration formats.
Q: Is there an easy way in Docker App to convert any custom development app to a container ready app?
A: We have some tooling to assist with this. Using Assemble and Templates adds your code to a “scaffold” for containers. Someone does have to create the template – but it is possible to reuse templates that others have created.
Q: Does Docker App depend on the underlying infrastructure in terms of virtual machines, bare metal, etc.?
A: The underlying infrastructure generally doesn’t matter. Docker Enterprise runs on VMs or bare metal, and the parameterized fields within a Docker App (like the port setting) can be adjusted at deployment time. 
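As an illustration of such a parameterized field, a Compose file packaged by Docker App can reference a variable that is filled in at install time. The service and variable names below are made up:

```yaml
# docker-compose.yml inside a .dockerapp package; ${port} comes from the
# app's parameters and has a default that can be overridden per
# deployment (e.g. docker app install myapp --set port=8080)
version: "3.7"
services:
  web:
    image: myrepo/web:latest
    ports:
      - "${port}:80"
```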
Build, Share and Run Anywhere
The questions attendees asked made it clear that developers and ops teams alike value choice and flexibility. They want to be able to build, share and run applications anywhere, and have the peace of mind that applications are secure.
You can learn more by catching the on-demand 5 part webinar series.


To learn more about Docker Enterprise 3.0:

Check out what’s new in 3.0
Learn about Docker Desktop Enterprise


Why you Have to Fail Fearlessly to Succeed: The Citizens Bank Story of Innovation with Docker

We had the chance recently to sit down with the Citizens Bank mortgage division and ask them how they’ve incorporated innovation into a regulated and traditional business that is still very much paper-based.
The most important lesson they’ve learned: you have to be willing to “fail fearlessly,” but to do that, you also have to minimize the consequences and cost of failure so you can constantly try new ideas. With Docker Enterprise, the team has been able to take ideas from concept to production in as little as a day.
Here’s what they told us. You can also catch the highlights in this 2 minute video:

On focus: 
Matt Rider, CIO Mortgage Division: Our focus is changing the mortgage technology experience at the front end with the borrower and on the back end for the loan officers and the processors. How do we bring those two together? How do we reduce the aggravation that comes with obtaining a mortgage?
On founding an “innovation team” . . .
Matt: When I came here I recognized that we were never going to achieve our vision if we kept doing things the same way. We wanted to reduce the aggravation that comes with obtaining a mortgage. But you can’t change when you’re supporting what’s in front of you, dealing with production issues and keeping the lights on. You need a separate entity that’s going to look forward, has the funding it needs, and most importantly is not afraid to fail. The innovation team was key to that.
Sharon Frazier, SVP Innovation: The innovation team was formed because we knew we wanted to disrupt ourselves. We knew we wanted to start greenfield. We created a cross-functional team because we were basically building a product to deliver to ourselves. In essence, we were acting like a startup.
On the importance of failing. . . 
Matt: You have to fail. You have to be able to take on new risks and try new ways of doing things, so how the organization and leadership reacts matters. You have to empower teams to fail and ask what we can do differently next time. 
Don Bauer, Senior DevOps Manager: Docker has allowed us to fail fearlessly. We can test new things easily and quickly and if they work, awesome. But if they don’t, we didn’t spend weeks or months on it. We might have spent a couple of hours or days.
On Docker . . .
Matt: We didn’t have a platform for innovation. And that’s when Docker came on our radar. We did our due diligence and our research and we realized that was the pivotal piece that was going to set us free.
Sharon: Docker is the building block for our new platform. It allows our developers to be self-sufficient. When they want to create a new service or new component within the application, they can self-serve through the delivery platform.
Don: The Docker platform has made it really easy for us to tackle every part of the pipeline all the way from our development environments through production. It has really helped us with being able to tackle new problems and challenges every day.
On results…
Mike Noe, Senior DevOps Engineer: In November 2016 we started our innovation team and had about a dozen containers and maybe three or four services running. Since then we’ve grown to over 3,000 containers across our entire platform and over 1,000 services. That includes our test and staging clusters as well as our ops and production cluster.
Don: The way we’ve developed applications has changed drastically with Docker. We’re no longer building big monoliths and trying to cram everything into one package that we’re going to have a hard time maintaining. We’re moving to single flow and building smaller but single purpose services. We couldn’t do that without Docker and we couldn’t manage those services without Docker.
Matt: Docker has definitely helped us innovate. It has definitely helped us to accelerate ideas that we’ve had and move from idea to operate in a matter of hours in some instances. So Docker has given us a lot of capabilities there that will distinguish us in the mortgage industry.


To learn more:

Read about Citizens Bank and Docker Enterprise
Watch the Citizens Bank DockerCon presentation
Start a free trial of Docker Enterprise


5 Things That Happen When You Get Locked In to an Application Platform

The Consequences of Application Platform Lock-in 
If you’ve worked in IT for a few years, you’ve seen it happen. You select an application framework, operating system, database platform, or other infrastructure because it meets the checklist, the price is right, or sometimes because of internal politics. You quickly discover that it doesn’t play well with other solutions or across platforms — except of course it’s “easy and seamless” when used with offerings from the same vendor.

But try telling your developers that they can’t use their favorite framework, development toolset, or have to use a specific operating system for everything they do. If developers feel like they don’t have flexibility, they quickly adopt their own tools, creating a second wave of shadow IT.
And it doesn’t just affect developers. IT operations and security get bogged down in managing multiple systems and software sprawl. The business suffers because efficiency and innovation lag when teams get caught up in fighting fires.
Below are 5 things that can go wrong when you get locked in to an application platform:
#1 Other Platforms Become Inaccessible
Will the platform you pick work with any combination of public and private clouds? Will you get cornered into using a specific operating system for anything tied to their platform? When an infrastructure vendor pushes you to use their other platforms because they’re “well-integrated,” think carefully about whether you’re willing to limit your choices; doing so will likely cost more and leave you with unhappy developers.
#2 Your Best Developers Find Other Opportunities
Developers want to work with the best frameworks and tools for the task at hand. Node.js and .NET Core may be popular with developers, but there is a wide range of tools out there. The 2019 Stack Overflow Developer Survey results make it clear that developers have diverse preferences and work best when they have the most choice and flexibility.
#3 Application Development Will Slow Down
If developers are forced into a particular development framework, innovation and creativity can be hindered. Good developers rarely have trouble finding work, so your best talent may leave. Even if most of your developers stick around, they’ll spend more time testing applications across platforms. What works on one machine won’t necessarily work on another if developers are busy finding work-arounds to use the tools they prefer.
#4 Operations Teams Will Spend More Time Fighting Fires
Keeping the lights on already consumes 70 to 80 percent of IT budgets. Using application platforms and tools from a single vendor may seem like it will save time, but the reality can be quite different. Other solutions come into the picture (usually whether you want them to or not!), creating silos of infrastructure that IT ops teams need to look after.
#5 The Business May Not Be Able to Pick the Best Technology
If you’re stuck primarily with one platform and framework but a new, promising tool isn’t compatible, you can either adopt another platform or settle for second-rate tooling. Platform lock-in means the business can end up forced to make technology choices that ultimately don’t serve the best interests of the company.
Platform Lock-in is Bad for Your Business
At Docker, we believe in simplicity and choice. We simplified containers and offer the broadest choice for developers and operators. You should be able to build, share and run applications across any combination of clouds, operating systems, languages and frameworks. That’s why Docker Enterprise works with every cloud provider, runs on all major operating systems, and supports Kubernetes for orchestration across that hybrid architecture. It’s no accident that containerd, the runtime engine developed by Docker, is the industry standard.
And developers love the flexibility Docker gives them. They even ranked Docker as the #1 “Most Loved Platform”, #2 “Most Wanted Platform” and #3 “Platform In Use” in the same 2019 Stack Overflow survey.
Start a Free Trial


To learn more about how Docker gives you simplicity and choice:

Try a free, hosted trial or catch our Docker Enterprise 3.0 Launch Webinar Series
Learn about Docker for Developers
Learn more about the Docker Kubernetes Service

The post 5 Things That Happen When You Get Locked In to an Application Platform appeared first on Docker Blog.

Don’t Pick an Ops Platform Your Devs Won’t Use

In all of the excitement and buzz around Kubernetes, one important factor in the conversation that seems to be glossed over is how and where containerized applications are built. Going back to Docker’s roots, it was developers who were the first to adopt Docker containers. Docker solved their local development issues and made it easier and faster to get applications out the door.
Fast forward five years, and developers are more important than ever. They build the modern apps, and modernize the existing apps, that are the backbone of organizations. If you’re in IT operations and selecting application platforms, one of the biggest mistakes you can make is deciding in isolation, without development buy-in.
Avoiding Shadow IT, Round 2
In the early days of public cloud, developers started going around IT to get fast access to computing resources, creating the first round of “Shadow IT”.  Today, most large enterprises have embraced cloud applications and infrastructure, and work collaboratively across application development and operations teams to serve their needs.
But there’s a risk we’ll invite the same thing to happen again by making a container platform decision that doesn’t involve your developers. Here are 3 reasons to include developers in your platform decisions.
1. Intuitive Tooling = Developer Productivity
Developers are resourceful, but they want tools and solutions that are simple and “just work”; it makes their job easier. They want familiar tools, where they can invoke commands they know well. This translates into greater productivity and more energy being put towards innovation. Some enterprises even measure developer onboarding time as a performance indicator. 
Millions of developers use Docker Desktop already because it works with standard Docker and Kubernetes CLIs, without new commands to learn or workflows to master. They can integrate directly with their IDE and CI tools without having to relearn how to build apps. 
With Docker Desktop Enterprise, developers can create a local, certified Kubernetes environment with a single click. We make it even simpler to build containerized applications for those without prior Docker knowledge with the new Application Designer GUI and customizable application templates.
Ensuring your application platform works consistently with your developers’ tools will ensure tighter integration between the groups. 
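As a rough sketch of what that familiar workflow looks like in practice (the image name, port, and manifest path below are illustrative, not taken from any specific product demo):

```shell
# Build and run locally with the standard Docker CLI --
# no new commands for developers to learn.
docker build -t myorg/webapp:dev .
docker run -d -p 8080:80 myorg/webapp:dev

# Docker Desktop's bundled Kubernetes is just another kubectl
# context, so the same manifests can be exercised locally
# before they ever reach a shared cluster.
kubectl config use-context docker-desktop
kubectl apply -f k8s/deployment.yaml
```

Because the local cluster speaks the same APIs as production, the manifests tested here move to staging unchanged.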
2. Platform for Today & Tomorrow
For many enterprises, the technology stacks of yesterday are not the ones in use today, and the technology stacks of tomorrow will likely differ from those in use now. In the search to improve developer productivity, development teams also explore new application stacks that make their job easier – new languages, new frameworks and architectures, new operating systems. New platforms enter the mix this way. 
Put another way, your development teams want to pick and choose the right tools for the job and not be forced into using a single operating system or language. If you can give them choice and flexibility to use the platforms that best suit their need, you can avoid the second wave of “shadow IT.”  This will also help operations teams who wish to have flexibility in where applications are deployed – whether that is on-premises, virtual or bare metal, or in one or more public clouds.
The Docker Platform provides a fully agnostic approach to containerization – supporting any language or framework and any infrastructure. With the new Docker Application packaging framework, we also look to extend beyond containers and support applications that may include cloud-based services and serverless functions down the road. 
3. Intrinsic Security
Everyone knows security is important, but it’s often seen as a hindrance to developer productivity. Operations teams can help developers build secure applications by providing guardrails and governance models that are built into the development lifecycle.
One of the best ways to do this is by providing pre-approved application templates to your developers. These are the scaffolding of modern applications that have security intrinsically built-in. They leverage approved technology stacks that are pre-screened for known vulnerabilities, ensuring that all patches are in place. Docker Desktop Enterprise and Docker Trusted Registry combine to provide these capabilities so that your developers can ship production-ready code faster. 
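A minimal sketch of what a pre-approved template might look like in use (the registry host, image names, and tags are hypothetical; the key idea is that the base image has already been screened and patched by the platform team):

```shell
# Generate a Dockerfile from the vetted template. Developers only
# fill in their app-specific layers; the base is pre-approved.
cat > Dockerfile <<'EOF'
# Base image pre-screened for known vulnerabilities
FROM registry.example.com/approved/aspnet-runtime:6.0-patched
WORKDIR /app
COPY ./publish/ .
ENTRYPOINT ["dotnet", "MyApp.dll"]
EOF

# Build on the approved base and push to the trusted registry,
# where scanning and promotion policies gate production use.
docker build -t registry.example.com/team/myapp:1.0 .
docker push registry.example.com/team/myapp:1.0
```

The guardrail lives in the `FROM` line: as long as templates pin approved bases, patching the base image propagates to every app built from it on the next rebuild.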
The Proof is in the Results
Application platform investments can often stall or fail to see much adoption. Working closely with your application architects and development teams ensures that your investments in Kubernetes and a container platform will not go to waste. Our experience with over 800 enterprise organizations that rely on Docker Enterprise demonstrates that organizations that bring Dev and Ops together collaborate more effectively and drive more value than those that do not.


Start a Free Trial
To learn more about building, sharing, and running modern applications: 

Try a free, hosted trial or catch our Docker Enterprise 3.0 Launch Webinar Series
If you’re attending VMworld in San Francisco, stop by booth #1969 and learn more about how we’re enabling modern application delivery
Learn more about the Docker Kubernetes Service

The post Don’t Pick an Ops Platform Your Devs Won’t Use appeared first on Docker Blog.