Open Technology Summit focuses on contributors

The Open Technology Summit, now in its fifth year, has become an annual state of the union for the established and budding open source projects that IBM supports.
The conclusion drawn at Sunday’s OTS during IBM InterConnect in Las Vegas is that the state of open tech is strong and getting stronger.
The event brought together leaders from some of today’s top open source projects: OpenStack, Cloud Foundry, the Linux Foundation, the JS Foundation and the Apache Software Foundation, plus the IBM leaders who support these projects.
“The open source community is only as good as the people who are contributing,” Willie Tejada, IBM Chief Developer Advocate, told the capacity crowd.

“We’ve been systematically building an open innovation platform — cloud, etc.” @angelluisdiaz https://t.co/HHMqWmi3v4 pic.twitter.com/945FkRbkZg
— IBM Cloud (@IBMcloud) March 20, 2017

Judging by the success stories shared on stage, contributor quality appears to be quite high. In short, the open source community is thriving.
Finding success in the open
The Linux Foundation has become one of the great success stories in open source, thanks largely to the huge number of contributors it has attracted. In his talk, the organization’s executive director, Jim Zemlin, told the crowd that across its various projects, contributors add a staggering 10,800 lines of code, remove 5,300 lines of code and modify 1,875 lines of code per day.
Zemlin called open source “the new norm” for software and application development.

“Open source is now the new norm for software development.” – @jzemlin #IBMOTS https://t.co/y3V3IGfcTK pic.twitter.com/83k9yLdJdf
— IBM Cloud (@IBMcloud) March 20, 2017

Cloud Foundry Foundation executive director Abby Kearns stressed her organization’s commitment to bringing forward greater diversity among its community.
“When I think about innovation, I think about diversity,” said Kearns, who took over as executive director four months ago. “We have the potential to change our industry, our countries and the world.”
Like Cloud Foundry, the OpenStack community has seen tremendous growth in its user community, thanks to increased integration and cooperation with other open source communities. OpenStack Foundation executive director Jonathan Bryce and Lauren Sell, vice president of marketing and community services, shared their community’s progress:

“In 2014, there was 323 developers contributing to OpenStack. In 2016, we had 531.” @jbryce #IBMOTS #ibminterconnect pic.twitter.com/6PxYzrVxsL
— IBM WebSphere (@IBMWebSphere) March 20, 2017

The community, which aims to create a single platform for bare metal servers, virtual machines and containers, has seen 5 million cores deployed on it. Contributors have jumped from 323 in 2014 to 531 in 2016.
Sell echoed several of the other speakers when she noted that we’re living in a “multi-cloud world” and that open technologies are enabling it.
IBM: Contributors, collaborators, solution providers
While it’s well known that IBM has helped start and lead many of the open source communities it supports, the company also offers a robust set of unique capabilities around these technologies, and it is constantly working to expand those offerings.
For example, IBM Cloud Platform Vice President and CTO Jason McGee previewed the announcement that Kubernetes is now available on IBM Bluemix Container Service.
“This service lets us bring together the power of that project and all of the amazing technology in the engine with Docker and the orchestration layer with Kubernetes and combine it with the power of cloud-based delivery,” McGee said.
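The combination McGee describes (Docker images built with the engine, orchestrated by Kubernetes) typically starts with a deployment manifest. The sketch below is purely illustrative; the image name, replica count and ports are hypothetical, not part of IBM’s announcement.

```yaml
# Illustrative Kubernetes Deployment manifest (all names are hypothetical).
# On a managed service such as the Bluemix Container Service, a manifest like
# this would be applied with `kubectl apply -f deployment.yaml` once the
# cluster is provisioned.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: hello-app
spec:
  replicas: 3                  # Kubernetes keeps three copies of the container running
  selector:
    matchLabels:
      app: hello-app
  template:
    metadata:
      labels:
        app: hello-app
    spec:
      containers:
      - name: hello-app
        image: registry.example.com/hello-app:1.0   # a Docker image built with the engine
        ports:
        - containerPort: 8080
```

The orchestration layer then takes over: if a container or node fails, Kubernetes reschedules pods to keep the declared replica count.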
David Kenny, senior vice president, IBM Watson and Cloud Platform, also spoke about “the power of the community to move the technology faster and to consume it and learn from it.”
“We’re very much committed as IBM to be participants,” he said. “Certainly IBM Cloud and IBM Watson are two pretty big initiatives at IBM these days, and both of those have come together around the belief that open source is a key part of our platform.”

“IBMCloud and Watson have come together around the belief that #opensource is a key part of our platform.” – @davidwkenny #IBMOTS pic.twitter.com/gU9DCzMsoC
— Kevin J. Allen (@KevJosephAllen) March 20, 2017

Moving forward as a community
Looking toward the future of open tech, it was clear that its success will depend on the next generation of contributors.
Tejada went so far as to call the open source movement a religion. “The most important piece is to understand the core premises of the religion.” He identified those as:

Embrace the new face of development
Acknowledge and adapt to the new methodologies of application development
Seize the opportunity to do more with less at an accelerated rate

For more on IBM’s work in open technology, visit developerWorks Open.
Source: Thoughts on Cloud

The 4 Biggest Questions About Docker from VMworld 2016

Simply incredible. We spent last week at VMworld speaking with thousands of enterprise security, infrastructure and virtualization pros. It was humbling to witness all of the curiosity and excitement around Docker at the show, and how clearly Docker made a strong impression on the attendees.

This curiosity around Docker and its use within enterprise environments is the reason I’m writing this blog. We noticed that many of the same questions arose again and again, and we figured we should share them with you as you start your journey toward adopting Docker containers alongside VMs.
Here are the most commonly asked questions from the conference.

What is Docker? Or even a container? Is it a lightweight VM? Can I use it with vSphere? What value do they provide?

 

Containers are really about applications, not servers. That’s why they aren’t VMs. @docker #VMworld
— Karen Lopez (@datachick) August 29, 2016
 
A Docker container is a standard unit in which application code, binaries and libraries are packaged and isolated. The Docker Engine is the runtime installed on your infrastructure of choice; it executes the commands that build and deploy containers. Many containers can be connected together to form a single application, or one container can include the entire codebase. Docker provides an abstraction layer between the application itself and the underlying compute infrastructure, making the application completely portable to any other endpoint running Docker.
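To make the packaging idea concrete, here is a minimal, hypothetical Dockerfile — the build recipe the Docker Engine turns into an image. The base image, file names and port are illustrative assumptions, not taken from any specific app.

```dockerfile
# Hypothetical build recipe for a small Python web app.
FROM python:3-slim                    # base image: OS libraries and runtime start here
COPY requirements.txt .
RUN pip install -r requirements.txt   # dependencies are baked into the image
COPY app.py .
EXPOSE 5000
CMD ["python", "app.py"]              # the process the container runs when started

# Build and run (requires a Docker Engine on the host):
#   docker build -t my-app .
#   docker run -d -p 5000:5000 my-app
```

The resulting image carries everything the app needs, which is exactly what makes it portable to any endpoint running Docker.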
Docker containers are not VMs, nor even lightweight VMs, as their architecture is different. The image below displays the key differences between Docker containers and VMs: Docker containers share the OS kernel of the host, whereas each VM has a full copy of an OS inside it.

This does not mean these two models are mutually exclusive. Docker containers run anywhere a Docker Engine is installed, and Docker Engine runs on bare metal, in VMs (vSphere, Hyper-V) and in clouds (AWS, Google, Azure and more). This also means that Docker containers are portable from any one of these environments to another without having to recode the application. Additionally, many users add containers into an existing virtual infrastructure to increase the density of workloads possible per VM.

There are several reasons why Docker containers are being adopted within the enterprise:

Security – Docker containers are isolated from one another, even when running on the same host and sharing the same OS. This makes them ideal for enterprise teams that, for example, leverage bare metal servers and need to comply with industry security regulations. And with the Docker Datacenter platform, enterprise teams receive on-premises tools chock full of security features.
Portability across infrastructure and app environments – Docker containers can run anywhere the Docker Engine is installed. This gives teams the ability to move their applications across different environments without having to tweak the code. For example, teams can easily move from vSphere to other environments like Azure and AWS.
Resource optimization – Docker containers can be deployed within VMs, and in fact vSphere is a great place to run them. Running multiple containers within each VM reduces the overall VM footprint and decreases the maintenance costs associated with legacy apps. And with fewer VMs, companies can spend less on vSphere, including reduced hypervisor licensing costs.

 

Are you currently using @docker containers & VMs together? #VMworld
— Docker (@docker) August 21, 2016

Speed – Docker containers help streamline the application lifecycle, helping developers build applications more quickly and IT ops teams react faster to changing business needs. Containers spin up on average in ⅜ of a second, compared to VMs, which take several seconds or minutes. This sub-second spin-up time allows teams to onboard developers more quickly and deploy to production more frequently.

Does Docker support Windows Server?

Will @Docker like containers ever catch on in Windows? http://t.co/jMHaVVVMFo #VMworld
— Keith Townsend (@CTOAdvisor) August 26, 2014

Today, Docker Engine runs on all major Linux distros, including Ubuntu, CentOS, RHEL, openSUSE and more. Support for Windows Server is the most popular question, as most companies have a mix of Windows- and Linux-based applications. I’m pleased to say that very soon, Docker Engine will run on Windows Server 2016. This means the same Docker container technology and workflow can be applied to both Linux and Windows Server workloads. For example, going forward, admins can have an application with a back-end Windows piece (e.g., Microsoft SQL Server) and a Linux-based web front end, all part of the same app, running in vSphere VMs, on bare metal or in the cloud (boom)!
Windows Server 2016 and Docker is available as a tech preview to try here.
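Under the preview, the same Dockerfile workflow applies to Windows workloads; essentially only the base image changes. The sketch below is a hypothetical example, with the base image name as it appeared in the 2016 preview era and an IIS setup chosen purely for illustration.

```dockerfile
# Hypothetical Windows-based Dockerfile (base image name from the 2016 preview era).
FROM microsoft/windowsservercore
RUN powershell -Command Install-WindowsFeature Web-Server   # example: add IIS
COPY site/ C:/inetpub/wwwroot/
CMD ["powershell", "-Command", "Start-Service W3SVC; while ($true) { Start-Sleep -Seconds 3600 }"]

# Built and run with the same commands as on Linux (requires a Windows Docker host):
#   docker build -t my-win-app .
#   docker run -d -p 80:80 my-win-app
```

The point is the shared workflow: the same `docker build`, `docker push` and `docker run` commands, and the same image format, across both OS families.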

Docker sells commercial solutions built specifically with enterprise teams in mind

 

And here are the @Docker Commercial Management tools: #Cloud #VMworld pic.twitter.com/CxYKBVX8pL
— Arjan Timmerman (@Arjantim) August 29, 2016

Our commercial management platform, Docker Datacenter, is what enterprise teams leverage across the entire application lifecycle. Developers use it to quickly create, update and deploy apps, while IT ops uses the platform to secure the application environment, comply with industry regulations and deploy applications to production more frequently. In addition, teams are able to reduce the overall application-related costs to the business.
As mentioned, Docker Datacenter is our enterprise solution. Sold as a monthly or annual subscription, Docker Datacenter (DDC) delivers an on-premises Containers as a Service environment that IT ops teams use to manage and secure the environment and devs use to create applications in a self-service manner. The tool provides an image registry, orchestration/management plane and commercial support from the Docker Customer Success team. This support also includes validated configurations of operating systems and support for previous versions of the Docker engine.
Oh, and Docker Datacenter has got the GUIs
 
lots of options with @Docker – CLI, API, and GUI for deploying #VMworld #tfdx
— Tim Smith (@tsmith_co) August 29, 2016

Many VMware customers are accustomed to managing VMs in their vCenter GUI. So they were happy to learn that yes, there are Docker tools to help manage images and containers, and they come complete with a GUI. Well, there are a couple, actually. And just as VMware users use tools built by VMware, for VMware, we recommend Docker users use tools built by Docker, for Docker.
With Docker Datacenter, IT Operations teams have the ability to manage, orchestrate and scale their Dockerized apps across their environment. The tool is chock full of enterprise features including:

Ability to deploy containers onto nodes directly from within the UCP GUI
Manage nodes, images and applications
Scale instances horizontally for times of peak application usage
Role-based access controls to control who can access what
Integration with LDAP/AD to quickly create teams and organizations

Here is a quick look at the Docker Datacenter management dashboard.

Docker Datacenter also provides the capability to store, manage and secure your images. Key features include:

Ability to sign images and ensure images are not tampered with
Ability to manage images, repositories, tags
Quickly update/patch apps and push new images to DTR
Integration with Universal Control Plane for quick deployment

How Docker Datacenter is priced, and what we mean when we say Docker “node”

The Docker Datacenter subscription is licensed by the number of Docker Engines you require. A node is anything (a VM, bare metal server or cloud instance) with the Docker Engine installed on it. A good way to estimate how many engines you require is to count the existing VMs, bare metal servers or cloud instances you want to begin Dockerizing. Docker Datacenter is available on a monthly or annual subscription basis, with the option of business-day or business-critical support to align with your application service levels. Check out our pricing page to learn more.
For any virtualization gurus looking to learn more about Docker and how Docker containers and VMs can be used together, I highly recommend giving the ebook “Docker for the Virtualization Admin” a read.
Additional Resources

Read the eBook: Docker for The Virtualization Admin
Learn more about Docker Datacenter
See a demo of Docker Datacenter
Hear from Docker Datacenter Customers

 


The post The 4 Biggest Questions About Docker from VMworld 2016 appeared first on Docker Blog.
Source: https://blog.docker.com/feed/