4 Reasons I’m Excited to Attend DockerCon 2023

DockerCon is back! DockerCon is back! DockerCon is back! I can’t tell you how excited this makes me. Now, you may recognize that I work at Docker, but I haven’t always. Although I’ve worked at Docker for about 18 months, I’ve been in the Docker space for a long time. In fact, my first presentation about Docker was all the way back in December 2015.

Since then, I’ve helped organize, run, and speak at many meetups, and I was recognized as a Docker Captain in March 2017. I even received the inaugural Community Leader of the Year award for the North America region in 2018. As I look back throughout my career, many of my fondest memories can be attributed to my time at DockerCon. This will be my sixth in-person DockerCon, and here are four reasons I’m happy to be back in person this year.

Let’s go!

Michael Irwin at DockerCon EU 2018 in Barcelona.

#1 — Developer-focused content

We’ve all been to many “developer-focused” conferences, only to find that most of the sessions are sponsored, the keynotes are relatively boring, and there really isn’t much focus on developers. DockerCon has been different. I remember going to DockerCons and learning everything about Docker’s latest features, figuring out how to scale our container efforts across my team and the organization, deepening my understanding of various cloud-native design patterns and architectures, and helping my team be as productive as possible. Especially earlier in my career, that experience helped me become the developer I am today.

As I help plan DockerCon this year, I’ll admit we want to deliver all of the same things as in years past, just updated. We want to help each and every developer better their craft and better deliver results for their customers… whoever they might be.

A selfie before my “Containers for Beginners” talk at DockerCon 2019 in San Francisco.

#2 — The hallway track

Honestly, this is probably one of my favorite parts of DockerCon. The Hallway Track is a special track in which attendees can network and learn from each other. If you want to learn about something, simply make a request! If you want to teach others, submit a session! Then, small groups get together and just chat. These hallway moments have truly been some of the best parts of DockerCon, both for learning and for teaching. There’s simply no better way to learn than from others who have walked the same journey.

The hallway track offers many chances to learn and connect.

#3 — Reconnecting with and making new friends

During my time as a Docker Captain from 2017 to 2022 (I had to semi-retire when I joined Docker), DockerCon was such a fun time to get together and spend time with my fellow Captains. In many ways, it felt like a family reunion. We learned together, taught each other, and provided insight and direction to the Docker product and executive teams.

Although connecting with old friends was great, I also made new friends every year. Many of those came from the Hallway Track, but random conversations at meals, the conference party, and other one-off moments have given me friendships and contacts I still rely on to this day. Whenever I’m stuck on a problem, there’s a good chance I can reach out to someone I met at DockerCon.

Docker Captains gathered at DockerCon EU 2017 in Copenhagen.

Group selfie taken during a pre-conference bike ride at DockerCon 2019 in San Francisco.

#4 — Fun all around!

I may or may not be known for roaming around the DockerCon EU 2017 vendor hall in an inflatable dinosaur suit, or for using that same suit to open my “Containers for Beginners” talk at DockerCon 2019. Why? To be completely honest, because it’s fun! And while a conference isn’t only about having fun, it’s certainly a lot easier to be part of a community when you’re enjoying yourself. DockerCon is not afraid to have a little bit of fun.

Me wearing a dino suit at the Docker booth at DockerCon EU 2017 in Copenhagen.

These are just some of the reasons I’m excited to have DockerCon back in person this year, and I’m sure there are tons more! We’d love to hear what makes you excited. Tweet why you’re excited with #DockerCon, and we just might highlight you.

Learn more at the DockerCon 2023 website and register by August 31 to take advantage of early bird pricing. 

Learn more

Register for DockerCon 2023.

Get the latest release of Docker Desktop.

Vote on what’s next! Check out our public roadmap.

Have questions? The Docker community is here to help.

New to Docker? Get started.

Source: https://blog.docker.com/feed/

Why Are There More Than 100 Million Pull Requests for AI/ML Images on Docker Hub?

A quick look at well-known AI/ML-related images on Docker Hub shows more than 100 million pull requests. What is driving this level of demand in the AI/ML space? The same things that drive developers to use Docker for any project: accelerating development, streamlining collaboration, and ensuring consistency within projects.

In this article, we’ll look more closely at how Docker provides a powerful tool for AI/ML development.

As we interact with more development teams who use Docker as part of their AI/ML efforts, we are learning about new and exciting use cases and hearing first-hand how using Docker has helped simplify the process of sharing AI/ML solutions with their teams and other AI/ML practitioners.

Why is Docker the deployment choice for millions of developers when working with AI/ML?

AI/ML development involves managing complex dependencies, libraries, and configurations, which can be challenging and time-consuming. Although these complexities are not limited to AI/ML development, with AI/ML, they can be more taxing on developers. Docker, however, has been helping developers address such issues for 10 years now.

Consistency across environments

Docker allows you to create a containerized environment that includes all the dependencies required for your AI/ML project, including libraries, tools, and frameworks. This environment can be easily shared and replicated across different machines and operating systems, ensuring consistency and reproducibility. Docker images can also be version-controlled and shared via container registries such as Docker Hub, thus enabling seamless collaboration and continuous integration and delivery.
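
To make this concrete, here is a minimal sketch using the Docker SDK for Python (the `docker` package): a Dockerfile declares the full environment, which is then built and run through the SDK. The image tag and the pinned library versions are illustrative placeholders, not recommendations.

```python
# Minimal sketch: build and run a self-contained ML environment with the
# Docker SDK for Python (pip install docker). The image tag and the pinned
# library versions are illustrative placeholders.
import io
import docker

# Everything the project needs is declared in one Dockerfile, so the
# resulting image behaves the same on any machine that can run it.
dockerfile = b"""
FROM python:3.11-slim
RUN pip install --no-cache-dir numpy==1.26.4 scikit-learn==1.4.2
CMD ["python", "-c", "import numpy, sklearn; print(numpy.__version__, sklearn.__version__)"]
"""

client = docker.from_env()
image, _ = client.images.build(fileobj=io.BytesIO(dockerfile), tag="ml-env:1.0")

# Anyone who obtains ml-env:1.0 runs exactly this environment.
print(client.containers.run("ml-env:1.0", remove=True).decode())
```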

Scalability

Docker provides a lightweight and efficient way to scale AI/ML applications. With Docker, you can run multiple containers on the same machine or across different machines in a cluster, enabling horizontal scaling. This approach can help you handle large datasets, run multiple experiments in parallel, and increase the overall performance of your applications.
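
As a rough sketch of that idea on a single host, the snippet below uses the Docker SDK for Python to launch several containers from one image so experiments run in parallel; the image name and hyperparameter values are hypothetical placeholders.

```python
# Sketch: horizontal scaling on one machine by running several experiment
# containers in parallel. "ml-env:1.0" is a hypothetical image.
import docker

client = docker.from_env()

learning_rates = ["0.001", "0.01", "0.1"]
containers = [
    client.containers.run(
        "ml-env:1.0",
        command=["python", "-c", f"print('training with lr={lr}')"],
        environment={"LEARNING_RATE": lr},  # experiment-specific setting
        detach=True,                        # run all experiments concurrently
    )
    for lr in learning_rates
]

# Wait for every experiment to finish and collect its output.
for container in containers:
    container.wait()
    print(container.logs().decode())
    container.remove()
```

Across multiple machines, the same image would typically be handed to an orchestrator such as Docker Swarm or Kubernetes instead.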

Portability

Docker provides portability, allowing you to run your AI/ML applications on any platform that supports Docker, including local machines, cloud-based infrastructures, and edge devices. Docker images can be built once and deployed anywhere, eliminating compatibility issues and reducing the need for complex configurations. This can help you streamline the deployment process and focus on the development of your models.
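
For example, the same project definition can be built for different CPU architectures. The sketch below, again with the Docker SDK for Python, assumes a Dockerfile in the current directory and a Docker engine able to build for non-native platforms (for instance via QEMU emulation); the tags are placeholders.

```python
# Sketch: build one project for several target platforms with the Docker SDK
# for Python. Assumes ./Dockerfile exists and the daemon can build for
# non-native architectures (e.g. QEMU/binfmt emulation is set up).
import docker

client = docker.from_env()

for platform, tag in [("linux/amd64", "ml-env:1.0-amd64"),
                      ("linux/arm64", "ml-env:1.0-arm64")]:
    image, _ = client.images.build(path=".", tag=tag, platform=platform)
    print(f"built {tag} for {platform}: {image.id}")
```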

Reproducibility

Docker enables reproducibility by providing a way to package the entire AI/ML application and its dependencies into a container. This container can be easily shared and replicated, ensuring that experiments are reproducible, regardless of the environment they are run in. Docker provides a way to specify the exact versions of dependencies and configurations needed to reproduce results, which can help validate experiments and ensure reliability and repeatability.
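
One simple way to pin an environment is to record the image’s immutable digest after a successful run and reuse it later. Here is a minimal sketch with the Docker SDK for Python, using a public base image purely for illustration.

```python
# Sketch: pin an experiment's environment to an immutable image digest so a
# later re-run uses byte-for-byte the same image, even if the tag moves.
import docker

client = docker.from_env()

# Pull a tagged image once and record its content-addressed digest.
image = client.images.pull("python", tag="3.11-slim")
pinned_ref = image.attrs["RepoDigests"][0]   # e.g. "python@sha256:..."
print("record this alongside your results:", pinned_ref)

# Later, reproduce the run from exactly that digest rather than the tag.
print(client.containers.run(pinned_ref, ["python", "--version"], remove=True).decode())
```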

Easy collaboration

Docker makes it easy to collaborate on AI/ML projects with team members or colleagues. Docker images or containers can be easily shared and distributed, ensuring that everyone has access to the same environment and dependencies. This collaboration can help streamline the development process and reduce the time and effort required to set up development environments.
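
In practice, this is often just a tag and a push to a shared registry. A minimal sketch with the Docker SDK for Python follows; the local image and the "example-team/ml-env" repository are hypothetical placeholders, and pushing assumes you are already logged in to Docker Hub.

```python
# Sketch: share a local image with teammates via Docker Hub. The image name
# and repository are hypothetical; run `docker login` beforehand.
import docker

client = docker.from_env()

image = client.images.get("ml-env:1.0")        # hypothetical local image
image.tag("example-team/ml-env", tag="1.0")    # retag under a shared repo

for line in client.images.push("example-team/ml-env", tag="1.0",
                               stream=True, decode=True):
    print(line)

# Teammates can then pull and run the identical environment:
#   client.images.pull("example-team/ml-env", tag="1.0")
```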

Conclusion

Docker is a powerful tool for AI/ML development, offering consistency, scalability, portability, reproducibility, and easy collaboration. By using Docker to package and distribute AI/ML applications and their dependencies, developers can simplify the development process and focus on building and improving their models.

Check out the Accelerated AI/ML Development page to learn more about how Docker fits into the AI/ML development process.

If you have an interesting use case or story about Docker in your AI/ML workflow, we would love to hear from you and maybe even share your story.

Learn more

Get the latest release of Docker Desktop.

Vote on what’s next! Check out our public roadmap.

Have questions? The Docker community is here to help.

New to Docker? Get started.

Source: https://blog.docker.com/feed/

Amazon Location Service now supports publishing device position updates to EventBridge

Amazon Location Service now supports publishing position updates for tracked devices to Amazon EventBridge. Customers can use these position updates to deliver features tailored to the physical location of their tracked devices. Developers can build applications that display device movement on a map or capture movement data in long-term storage. That data can then be used, for example, for insights into asset movement, predictive analytics, or regulatory compliance.
Source: aws.amazon.com
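
As a rough sketch of how such an integration might be wired up with boto3, the snippet below creates an EventBridge rule and routes matching events to a Lambda function. The event source ("aws.geo"), the detail-type string, and the target ARN are assumptions and placeholders; consult the Amazon Location Service documentation for the exact values.

```python
# Sketch: route Amazon Location Service device position updates from
# EventBridge to a Lambda function with boto3. The event source, detail-type,
# and Lambda ARN below are assumed/placeholder values.
import json
import boto3

events = boto3.client("events")

events.put_rule(
    Name="device-position-updates",
    EventPattern=json.dumps({
        "source": ["aws.geo"],                              # assumed source
        "detail-type": ["Location Device Position Event"],  # assumed detail-type
    }),
    State="ENABLED",
)

events.put_targets(
    Rule="device-position-updates",
    Targets=[{
        "Id": "position-consumer",
        # Placeholder ARN; the function also needs a resource-based permission
        # allowing events.amazonaws.com to invoke it.
        "Arn": "arn:aws:lambda:eu-west-1:123456789012:function:handle-position",
    }],
)
```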

AWS Elemental MediaLive now publishes 1-second metrics

AWS Elemental MediaLive now stores channel metrics in Amazon CloudWatch at 1-second intervals, so you can track rapidly changing activity. These metrics can be retrieved, using the console or the CloudWatch API, at 1-second intervals for up to 3 hours after the data points are created.
Source: aws.amazon.com
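
For illustration, here is a hedged boto3 sketch of querying a channel metric at 1-second resolution through the CloudWatch GetMetricData API; the namespace, metric name, and dimensions are assumptions and placeholders, so check the MediaLive documentation for the metrics your channel actually emits.

```python
# Sketch: read a MediaLive channel metric at 1-second granularity with the
# CloudWatch GetMetricData API. Namespace, metric name, and dimensions are
# assumed/placeholder values.
from datetime import datetime, timedelta, timezone
import boto3

cloudwatch = boto3.client("cloudwatch")
end = datetime.now(timezone.utc)
start = end - timedelta(minutes=5)

response = cloudwatch.get_metric_data(
    MetricDataQueries=[{
        "Id": "network_in",
        "MetricStat": {
            "Metric": {
                "Namespace": "AWS/MediaLive",                  # assumed namespace
                "MetricName": "NetworkIn",                     # assumed metric name
                "Dimensions": [
                    {"Name": "ChannelId", "Value": "1234567"}, # placeholder channel
                    {"Name": "Pipeline", "Value": "0"},
                ],
            },
            "Period": 1,      # 1-second granularity
            "Stat": "Average",
        },
    }],
    StartTime=start,
    EndTime=end,
)

for ts, value in zip(response["MetricDataResults"][0]["Timestamps"],
                     response["MetricDataResults"][0]["Values"]):
    print(ts.isoformat(), value)
```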

Amazon Omics receives FedRAMP Moderate authorization

Amazon Omics has received Moderate authorization under the Federal Risk and Authorization Management Program (FedRAMP) in the AWS US East/West Regions. You can use Amazon Omics to store and process your data in AWS up to the Moderate impact level.
Source: aws.amazon.com