How CISOs need to adapt their mental models for cloud security

Many security leaders head into the cloud armed mostly with the tools, practices, skills, and ultimately the mental models for how security works that they developed on premises. This leads to cost and efficiency problems that can be solved by mapping those existing mental models to their cloud equivalents.

When comparing on-premises cybersecurity mental models with their cloud counterparts, a helpful place to start is the kinds of threats each one is attempting to block, detect, or investigate. Traditional on-premises threats focused on stealing data from databases, file storage, and other corporate resources. The most common defenses of these resources rely on layers of network, endpoint, and sometimes application security controls. The proverbial "crown jewels" of corporate data were not made accessible to the outside world through an API or stored in publicly accessible storage buckets. Other threats aimed to disrupt operations or deploy malware for various purposes, ranging from outright data theft to holding data for ransom.

Some threats are aimed specifically at the cloud. Bad actors are always trying to take advantage of the cloud's ubiquitous nature; one common cloud-centered attack vector is constantly scanning IP address space for open storage buckets or internet-exposed compute resources.

As Gartner points out, securing the cloud requires significant changes in strategy from the approach we take to protect on-premises data centers. Processes, tools, and architectures need to be designed using cloud-native approaches to protect critical cloud deployments. And when you are in the early stages of cloud adoption, it's critical to understand the division of security responsibilities between your cloud service provider and your organization so that you are less vulnerable to attacks targeting cloud resources.

Successful cloud security transformations can help better prepare CISOs for the threats of today, tomorrow, and beyond, but they require more than just a blueprint and a set of projects. CISOs and cybersecurity team leaders need to envision a new set of mental models for thinking about security, one that requires mapping current security knowledge to cloud realities.

To set the groundwork for this discussion, the cloud security transformation can start with a meaningful definition of what "cloud native" means. Cloud native is really an architecture that takes full advantage of the distributed, scalable, and flexible nature of the public cloud. (To be fair, the term implies that you need to be born in the cloud to be a native, but we're not trying to be elitist about it. Perhaps a better term would be "cloud-focused," or doing security "the cloudy way.")

However we define it, adopting cloud is a way to maximize your focus on writing code, creating business value, and keeping your customers happy while taking advantage of the cloud's inherent properties, including security. One sure way to import legacy mistakes, some predating cloud by decades, would be to merely lift and shift your current security tools and practices into the public cloud environment.

Going cloud native means abstracting away many layers of infrastructure, whether network servers, security appliances, or operating systems. It's about using modern tools built for the cloud and built in the cloud.
Another way to think about it: You'll worry less about all of those layers because you're going to build code on top of them that helps you move more quickly. Abandoning legacy security hardware maintenance requirements is part of the win here. To put it another way, security will follow in the footsteps of IT, which has been transformed by the SRE and DevOps revolutions. You can extend this thinking to cloud-native security, where some of your familiar tools combine with solutions provided by cloud service providers to take advantage of cloud-native architecture and secure what's built and launched in the cloud.

We've covered the differences between threats targeting on-prem infrastructure and those targeting cloud infrastructure; here are other vital areas to re-evaluate as part of a cloud security mental model.

Network security

Some organizations practice network security in the cloud as if it were a rented data center. Many traditional practices that worked reasonably well on premises for decades, along with many traditional network architectures, are either not applicable in the cloud or not optimal for cloud computing. However, concepts like the demilitarized zone (DMZ) can be adapted to today's cloud environments. For example, a more modern approach to the DMZ would use microsegmentation and govern access by identity in context. Making sure that the right identity, in the right context, has access to the correct resource gives you strong control; even if you get it wrong, microsegmentation can limit the blast radius of a breach.

Becoming cloud native also drives the adoption of new approaches to enterprise network security, such as BeyondProd. It benefits organizations because it moves them away from traditional network perimeter security and toward a focus on who and what can access their services, rather than where requests for access originated.

Although the network security changes driven by cloud adoption can be enormous and transformational, not all areas shift in the same way.

Endpoint security

In the cloud, the concept of a security endpoint changes. Think of it this way: A virtual server is a server. But what about a container? What about microservices and SaaS? With the software-as-a-service model, there's no real endpoint at all. All along your cloud security path, you need to know what happens where.

Here is a helpful mental-model translation: An API can be seen as a sort of endpoint, and some of the security thinking developed for endpoints applies to cloud APIs as well. Thinking about securing access, permissions, and privileged access carries over, but the concept of endpoint operating system maintenance does not. Even with automated service agents on virtual machines in the cloud, insecure agents may increase risk because they operate at cloud scale. Case in point: This major Microsoft Azure cross-tenant vulnerability highlighted a new type of risk that wasn't even on the radar of many of its customers.

In light of this, across the spectrum of endpoint security approaches, some disappear (such as patching operating systems for SaaS and PaaS), some survive (such as the need to secure privileged access), and others are transformed.

Detection and response

A move to the cloud brings changes to the threats you'll face and to how you detect and respond to them. This means that using on-prem detection technology and approaches as a foundation for future development may not work well.
Copying all of your on-premises detection tools and their threat detection content won't reduce risk in the way most cloud-first organizations need. Moving to the cloud provides the opportunity to rethink how you can achieve your security goals of confidentiality, integrity, availability, and reliability using the new opportunities created by cloud process and technology. Cloud is distributed, often immutable, API-driven, automatically scalable, and centered on the identity layer, and it often contains ephemeral workloads created for a particular task. All of these properties affect how you handle threat detection for the cloud environment and necessitate new detection methods and mechanisms.

There are six key domains where threats in the cloud can best be detected: identity, API, managed services, network, compute, and Kubernetes. Together they provide the coverage needed for network, identity, compute, and container infrastructure, along with specific detection mechanisms for API access logs and network traffic captures. As with endpoint security, some approaches become less important (such as network IDS on encrypted links), others grow in importance (such as detecting access anomalies), while others transform (such as detecting threats from the provider backplane).

Data security

The cloud is changing data security in significant ways, including new ways of looking at data loss prevention, data encryption, data governance, and data access. Cloud adoption sets you on a path to what we at Google call "autonomic data security." Autonomic data security means security is integrated throughout the data lifecycle and improves over time. At the same time, it makes things easier on users, freeing them from having to define and redefine myriad rules about who can do what, when, and with which data. It lets you keep pace with constantly evolving cyberthreats and business changes, so you can keep your IT assets more secure and make your business decisions faster.

Similar to the other categories, some data security approaches wane in importance or disappear (such as manual data classification at cloud scale), some retain their importance unchanged from on-prem to cloud, while others transform (such as pervasive encryption with effective and secure key management).

Identity and access management

The context for identity and access management (IAM) in the cloud is obviously different from your on-premises data center. In the cloud, every person and service has its own identity, and you want to be able to control access. Within the cloud, IAM gives you fine-grained access control and visibility for centrally managing cloud resources. Your administrators authorize who can act on specific resources, giving you full control and visibility to manage cloud resources centrally. What's more, if you have complex organizational structures, hundreds of workgroups, and a multitude of projects, IAM gives you a unified view of security policy across your entire organization.

With identity and access management tools, you're able to grant access to cloud resources at fine-grained levels, well beyond project-level access. You can create more granular access control policies based on attributes like device security status, IP address, resource type, and date and time. These policies help ensure that the appropriate security controls are in place when granting access to cloud resources.
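To make the idea concrete, here is a minimal, hypothetical sketch of attribute-based access in code: it grants time-bounded read access to a Cloud Storage bucket through an IAM Conditions binding, using the google-cloud-storage Python client. The bucket name, principal, and expiry date are placeholders rather than recommendations, and date/time is only one of the attributes mentioned above.

```python
# Minimal sketch: grant read access to a bucket only until a fixed expiry date,
# using an IAM Conditions binding. Bucket, member, and date are placeholders.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("example-sensitive-data")  # hypothetical bucket name

# Conditional role bindings require IAM policy version 3.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.version = 3

policy.bindings.append(
    {
        "role": "roles/storage.objectViewer",
        "members": {"user:analyst@example.com"},  # hypothetical principal
        "condition": {
            "title": "time-bounded-read-access",
            "description": "Read access expires at the start of 2026.",
            "expression": "request.time < timestamp('2026-01-01T00:00:00Z')",
        },
    }
)

bucket.set_iam_policy(policy)
```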
The concept of Zero Trust is strongly in play here. It's the idea that implicit trust in any single component of a complex, interconnected system can create significant security risk. Instead, trust needs to be established through multiple mechanisms and continuously verified. To protect a cloud-native environment, a zero trust security framework requires all users to be authenticated, authorized, and validated for security configuration and posture before being granted, or keeping, access to cloud-based applications and data. This means that IAM mental models from on-premises security mostly survive, but much of the underlying technology changes dramatically, and the importance of IAM in security grows significantly as well.

Shared fate for greater trust in cloud security

Clearly, cloud is much more than "someone else's computer." That's why trust is such a critical component of your relationship with your chosen cloud service providers. Many cloud service providers acknowledge shared responsibility, meaning that they supply the underlying infrastructure but leave you responsible for many seemingly inscrutable security tasks. With Google Cloud, we operate in a shared fate model for risk management in conjunction with our customers. We believe that it's our responsibility to be active partners as our customers deploy securely on our platform, not delineators of where our responsibility ends. We stand with you from day one, helping you implement best practices for safely migrating to and operating in a trusted cloud.

Get ready to go cloud native

We offer several great resources to help you prepare for cloud migration and to guide you as you review your current security approaches for signs of on-prem thinking. Listen to our podcast series, where Phil Venables, Vice President and CISO at Google Cloud, and Nick Godfrey, Director, Financial Services Security & Compliance and member of the Office of the CISO at Google Cloud, join me in a discussion on preparing for cloud migration (Podcast 1, Podcast 2). You can also deepen your cloud-native skills by earning a Professional Cloud Security Engineer certification from Google.
Source: Google Cloud Platform

Docker Captain Take 5 — James Spurin

Docker Captains are select members of the community who are both experts in their field and passionate about sharing their Docker knowledge with others. "Docker Captains Take 5" is a regular blog series where we get a closer look at our Captains and ask them the same broad set of questions, ranging from what their best Docker tip is to whether they prefer cats or dogs (personally, we like whales and turtles over here). Today, we're interviewing James Spurin, who recently joined the Captains program. He is a DevOps Consultant and Course/Content Creator at DiveInto and is based in Hertfordshire, United Kingdom. Check out James' socials on LinkedIn and Twitter!

How/when did you first discover Docker?

I’m part of the earlier ISP generation, so my early career involved working at Demon Internet, one of the first internet providers in the UK back in 1998-2000.

Back then, it was cool to host and serve your personal ISP services on your own managed system (generally hidden in a cupboard at home and served via a cable modem to the world) for the likes of Web/DNS/Email and other services.

Whilst times have changed, and I’ve moved to more appropriate cloud-based solutions for essential services and hosting, I’ve always been passionate about cosplaying with systems administration. A friend with the same passion recommended linuxserver.io to me. It’s a great resource that manages and maintains a fleet of common Docker images.

I transitioned many of the services I was manually running to Docker, either using their images or their Dockerfiles as a reference for learning how to create my own Docker images.

If you’re looking for a great way of starting with Docker, I highly recommend looking at the resources available on linuxserver.io.

The advice we would share with new starters back in my early ISP career days was to create and self-host an ISP in a box.

In essence, we’d combine a Web Server (using Apache at the time), Email (using Exim), and a DNS server (using Bind), alongside a custom domain name, to make it available on the internet. It provided a great learning opportunity for understanding how these protocols work.

Today my advice would be to try this out, but also with Docker in the mix!

What is your favorite Docker command?

My favorite Docker command would be docker buildx. With the growth of the Arm architecture, docker buildx is an excellent resource that I rely on tremendously. As a content creator, I leverage Docker extensively for creating lab environments that anyone can utilize with their own resources. See my "Dive Into Ansible" repository for an example that utilizes docker-compose and has had over 250k pulls.

Just a few years ago, building images for arm alongside AMD64 could have been considered a niche in my area. Only a tiny percentage of my students were using a Raspberry Pi for personal computing.

These days, however, especially with the growth of Apple Silicon, cross-built images are much more of a necessity when providing community container images. As a result, buildx is one of my favorite CLI plugins, and I consider it an essential milestone in a successful Docker project.
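For anyone who hasn't tried it yet, a typical multi-architecture build with buildx looks roughly like the following sketch (the builder name, image tag, and platform list are placeholders, and --push assumes you are already logged in to a registry):

```sh
# One-off: create and select a builder that supports multi-platform builds
docker buildx create --name multiarch --use

# Build for both architectures and push the resulting multi-arch image
docker buildx build \
  --platform linux/amd64,linux/arm64 \
  -t yourname/yourimage:latest \
  --push .
```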

What is your top tip for working with Docker that others may not know?

Consider Dockerfiles (or automated image builds) and guided instructions a standard part of your projects from day one. Your users will thank you, and your likelihood of attracting open source contributors will grow.

Take, for example, the Python programming language. When browsing GitHub/Gitlab for Python projects, it’s common to see a requirements.txt file with dependencies related to the project.

The expectation is then for the consumer to install dependencies via pip. An experienced developer may use a virtual environment, whereas a less experienced developer may install dependencies straight into their running system (risking cross-contamination between projects).

Whilst Python 3+ is the standard for most common Python projects, there may be nuances between the locally installed Python version and the one used within a codebase. We should also consider that some dependencies require compilation, which presents another obstacle to general usage, especially if developer compilation tools aren't available.

By providing a Dockerfile that utilizes a trusted Python image, and offering automated prebuilt images using the likes of Docker Hub in conjunction with GitHub/GitLab (to trigger automated builds), individuals can get involved and run projects with a single command in a matter of minutes. Such efforts also provide great reuse opportunities with Kubernetes, CI/CD pipelines, and automated testing.
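As a rough illustration of the kind of Dockerfile this tip suggests, here is a minimal sketch for a hypothetical Python project that keeps its dependencies in requirements.txt and its entry point in app.py (both file names are assumptions, not taken from a specific project):

```dockerfile
# Minimal sketch for a hypothetical Python project (app.py + requirements.txt).
FROM python:3.11-slim

WORKDIR /app

# Copy and install dependencies first so this layer is cached
# when only the application code changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the project source.
COPY . .

CMD ["python", "app.py"]
```

With something like this in place, anyone can build and run the project with a couple of commands, and the same image can be reused in CI/CD pipelines or Kubernetes as described above.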

What’s the coolest Docker demo you have done/seen?

The Flappy Moby efforts that took place at KubeCon Valencia. I liked it so much that I captured it at the time and created a video!

The project was novel; after all, who doesn't love these types of games? It was a fantastic showpiece at the event. As a content creator and someone who has worked on creating games to demonstrate and teach technical concepts, I was also very appreciative of the effort that went into the graphical elements to bring it to life.

Seeing Docker Desktop extensions in action inspired my own Docker Desktop extension journey and follow-ups. When I returned from KubeCon, I created a Docker Desktop extension that instantly provides an Ansible-based lab with six nodes and web terminals. Check out the related video of how this extension was made!

What have you worked on in the past six months that you’re particularly proud of?

I created a free Kubernetes Introduction course, available on YouTube and Udemy, which is receiving an incredible number of views and positive feedback. This was a very personal project for me, focused on community giveback.

When I first started learning Kubernetes there were areas that I found frustrating. Learning resources in this space often show theoretical overviews of core Kubernetes architecture but lack hands-on demonstrations. I made this course to ensure that anyone could get a good understanding of Kubernetes alongside hands-on use of the essential components in just one hour.

The course also provided me with a unique opportunity to share perspectives on overlooked areas relating to Docker Inc. For example, I cover Docker's positive contributions to Cloud Native, donating containerd and runC to the Cloud Native Computing Foundation and the Open Container Initiative, respectively.

It was a pleasure to work on a project that covered many of my favorite passions in one go, including Kubernetes, Docker, Cloud Native, content, and community.

What do you anticipate will be Docker’s biggest announcement this year?

I've already mentioned this above, but it's Docker Desktop extensions for me. When considered alongside Docker Desktop (now native for Windows, Mac, and Linux), we have a Docker Desktop environment and extensions platform that can provide a consistent development resource on all major OS platforms.

What are some personal goals for the next year with respect to the Docker community?

My aims are focused on community, and I'm already working on content that will heavily emphasize Docker in conjunction with Kubernetes (there's so much opportunity to do more with the Docker Desktop Kubernetes installation). As the tagline in the Docker Slack announcement channel says… Docker, Docker, Docker!!!

What was your favorite thing about DockerCon 2022?

Community. While watching the various talks and discussions, I was active in the chat rooms.

The participants were highly engaged, and I made some great connections with individuals who were chatting there at the time.

There were also some very unexpected moments. For example, Justin Cormack and Ajeet Singh Raina were using some interesting vintage microphones that kicked off some good chat room and post-event discussions.

Looking to the distant future, what is the technology that you’re most excited about and that you think holds a lot of promise?

A technology that has blown my mind is Dall-E 2, an AI solution that can automatically create images based on textual information. If you haven’t heard of this, you must check this video out.

It’s possible at the moment to try out Dall-E Mini. Whilst this isn’t as powerful as Dall-E 2, it can be fun to use.

For example, this is a unique image created by AI using the input of "Docker". Considering that this technology is not reusing existing images but has learnt the concept of "Docker" to create this, it is truly remarkable.

Rapid fire questions…

What new skill have you mastered during the pandemic?

Coffee is a personal passion and a fuel that I both depend upon and enjoy! The Aeropress is a cheap, simple, and effective device with many opportunities. I’ve explored how to make a fantastic Aeropress coffee, and I think I’ve nailed it! For those interested, check out some feeds from the Aeropress Barista Championships.

Cats or Dogs?

Cats. I have two, one named Whisper Benedict and the other named Florence Rhosyn. Whisper is a British Blue, and Flo is a British Blue and White. At the time, we only intended to get one cat, but the lady at the cattery offered us Flo at a discount, and we couldn’t resist.

The lady at the cattery was a breeder of British Blues and British Whites, and the Dad from the Blues had snuck in with the Mum of the Whites; alas, you can guess what happened. This gives Flo her very unique mottled colors.

The two of them are extraordinary characters. Although Whisper is the brawn of the two and would be assumed to be the Alpha cat, he’s an absolute softie and doesn’t mind anybody picking him up.

On the other hand, what Flo lacks in physique, she makes up with brains and agility.

Both my children, Lily (11) and Anwen (4), can hold Flo, and nothing will happen. They've all grown up together, and it's as if she knows that they are children. However, should an adult try to pick her up, they're not getting away unscathed. Flo also seems to have this uncanny ability to know when we're intending to take her to the vets, even without a carry basket in sight!

Despite their characteristics, we wouldn’t have our furry family members any other way.

Salty, sour, or sweet?

Sweet!

Beach or mountains?

Beaches (with some favouritism towards Skiathos) please!

Your most often used emoji?

🚀
Source: https://blog.docker.com/feed/

The AWS Console Mobile App now supports the Cost Explorer service

Users of the AWS Console Mobile App can now use AWS Cost Explorer in both the iOS and Android applications. The Console Mobile App offers a secure, on-the-go solution for visualizing, understanding, and managing AWS costs and usage over time. Customers can analyze total costs and usage across all regions and services for the past eight weeks, identify trends, pinpoint cost drivers, and detect anomalies.
Source: aws.amazon.com