Advance your future with learning sessions at the Government and Education Summit

Mark your calendars. The Google Cloud Government and Education Summit is less than two weeks away – November 3-4. Reserve your spot for the online learning event at no cost.

Organizations around the world are in great need of more workers who have robust IT skills and cloud expertise. The needs in the US alone are staggering. The U.S. Bureau of Labor Statistics projects that the number of people employed in information technology positions will grow by 13 percent over the next 10 years – faster than any other occupation – with 667,000 new jobs created to fill demands in cloud computing, data science, and cybersecurity. And those are just the new positions. It's estimated that there are currently 1.3 million open jobs in data analytics, IT support, project management, and UX design.

Whether you currently work in technology and want to see what's coming next in the cloud, are thinking about a future career in cloud technology, or are an educator teaching our next generation of technologists, Google Cloud's Government and Education Summit offers free, interactive virtual learning sessions on cloud topics to hone your skills and help you become part of the cloud workforce.

Spend a Day Learning with Google

Day 2 (November 4) of the Summit is devoted to interactive learning sessions led by Googlers and peers. There are three tracks to choose from. Attend all the sessions in your chosen track, or mix and match sessions based on your needs so you can get the most out of your day of learning with Google.

Track 1: Beginners and Non-Technical Learners

Everyone needs to start somewhere on their journey to working in the cloud. If you are a student, 18 or older, thinking about a future working in cloud technology, or someone ready to make a career move, the sessions in Track 1 are for you. Sessions include the following and more:

- Becoming a Cloud Digital Leader – Learn about Cloud Digital Leader, the newest training and certification program from Google Cloud. The program takes you through the core Google Cloud products and services and how they can be used to achieve desired business goals. No prerequisites required.
- Pathway to Proficiency: Getting Started with Google Cloud – Learn about the Google Cloud Skills Boost platform, the gateway to Google Cloud's individual learning programs, and how to get free access for 30 days. Tour the curriculum to earn skill badges and get started on the path toward developing cloud-ready skills.
- Hands-On Lab: Introduction to SQL for BigQuery – SQL is a standard language for data operations that allows you to ask questions and get insights from structured datasets. In this hands-on lab, we introduce you to SQL and prepare you for the many labs and quests you can experience in Qwiklabs to further your education on data science topics (see the sketch after this list).
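If you're curious what that hands-on lab's topic looks like in practice, here's a minimal sketch of asking a question of a BigQuery public dataset with SQL. It assumes the bq CLI is available (for example, in Cloud Shell) and a project is set; the query itself is illustrative, not taken from the lab:

```bash
# Find the five most common first names in the USA names public dataset.
bq query --use_legacy_sql=false '
SELECT name, SUM(number) AS total
FROM `bigquery-public-data.usa_names.usa_1910_2013`
GROUP BY name
ORDER BY total DESC
LIMIT 5;'
```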
Track 2: Technical Learners

Are you already working in technology but want to hone your skills? Are you looking to expand your career options in technology by adding proficiency in cloud topics? If so, this is the track for you. Track 2 is packed with learning opportunities, including:

- Build a Virtual Agent in Minutes – Learn to create a virtual agent with Google Cloud Dialogflow and understand the next steps to deploy Contact Center AI (CCAI) to take your support to the next level.
- Build a Cloud Center of Excellence and Enable Adoption – Learn best practices and tips for successfully building a cloud center of excellence, including building your team.
- Managing Storage for Education – Learn about Google Cloud storage options and best practices for consuming storage services, moving data across multiple types of storage, and managing storage limits.

Track 3: Educators

If you are a technology teacher or faculty member who would like to integrate Google Cloud curriculum into your courses or tap into Google Cloud for your research, the sessions in this track are designed for you. Learn how to help your students advance their cloud knowledge regardless of their skill levels. Sessions in Track 3 feature eight programs, including:

- Cloud Curriculum in the Classroom – Get your students ready for careers in cloud computing by learning about the types of cloud curriculum available to faculty for classroom use.
- Connecting Your Students with Peer Communities – Join this session to learn about student programming like Developer Student Clubs (DSCs), which help student developers learn globally and work with their communities to solve real-life problems. You'll also learn about student hackathons and more.
- Funding Research Opportunities – Learn how Google Cloud research credits can advance your research by giving you access to computing power that will make the next big thing possible.

Get in the Game with Cloud Hero

To close out a day of learning with Google, join us for Cloud Hero, a gamified learning experience. You'll get hands-on learning about cloud infrastructure and have a chance to show your skills by completing online labs that help you practice Google Cloud skills in real time. Register to let us know if you want to attend this special session. No prior experience is required.

* * *

Google welcomes all learners to the Google Cloud Government and Education Summit. Register for the Summit so that you can watch the sessions live or on demand. For additional opportunities to learn with Google, sign up for the Skills Challenge and get 30 days of unlimited access to Google Cloud Skills Boost, the destination for Google Cloud's individual learning programs. If you are interested in careers in fields like IT support and program management, Grow with Google offers certifications in these highly sought-after disciplines. If you are a public sector technology leader or an employer in any field, you can connect with skilled cloud candidates and grow your talent pipeline by becoming a participating employer with Connect with Google.
Source: Google Cloud Platform

BigQuery Omni now available for AWS and Azure, for cross cloud data analytics

2021 has been a year punctuated with new realities. As enterprises now interact mainly online, data and analytics teams need to better understand their data by collaborating across organizational boundaries. Industry research shows that 90% of organizations have a multicloud strategy, which adds complexity to data integration, orchestration, and governance. While building and running enterprise solutions in the cloud, our customers constantly manage analytics across cloud providers. These providers unintentionally create data silos that cause friction for data analysts.

This month we announced the availability of BigQuery Omni, a multicloud analytics service that lets data teams break down data silos by using BigQuery to securely and cost-effectively analyze data across clouds. For the first time, customers will be able to perform cross-cloud analytics from a single pane of glass across Google Cloud, Amazon Web Services (AWS), and Microsoft Azure. BigQuery Omni will be available to all customers on AWS and for select customers on Microsoft Azure during Q4.

BigQuery Omni enables secure connections to your S3 data in AWS or your Blob Storage data in Azure. Data analysts can query that data directly through the familiar BigQuery user interface, bringing the power of BigQuery to where the data resides. Here are a few ways BigQuery Omni addresses the new reality customers face with multicloud environments:

- Multicloud is here to stay: Enterprises are not consolidating; they are expanding and proliferating their data stacks across clouds. For financial, strategic, and policy reasons, customers need data residing in multiple clouds. Multicloud support has become table-stakes functionality for data platforms.
- Multicloud data platforms provide value across clouds: Almost unanimously, our preview customers echoed that the key to game-changing analytics was more functionality and integration across clouds. For instance, customers wanted to join player and ad engagement data to better understand campaign effectiveness, or join online purchase data with in-store checkouts to understand how to optimize the supply chain. Other scenarios included joining inventory and ad analytics data to drive marketing campaigns, and joining service and subscription data to understand enterprise efficiency. Data analysts need the ability to join data across clouds simply and cost-effectively.
- Multicloud should work seamlessly: A single pane of glass over all data stores lets a data analyst drive business impact without learning new skills or worrying about where the data is stored. Because BigQuery Omni is built using the same APIs as BigQuery, where data is stored (AWS, Azure, or Google Cloud) becomes an implementation detail.
- Consistent security patterns are crucial for enterprises to scale: As more data assets are created, providing the correct level of access can be challenging. Security teams need control over all data access, with as much granularity as possible, to ensure trust and data synchronization.
- Data quality unlocks innovation: Building a full cross-cloud stack is only valuable if the end user has the right data to make a decision. Multiple copies of data, inconsistent data, or out-of-date data all drive poor decisions for analysts.
In addition, not every organization has the resources to build and maintain expensive pipelines.

BigQuery customer Johnson & Johnson was an early adopter of BigQuery Omni on AWS. "We found that BigQuery Omni was significantly faster than other similar applications. We could write back the query results to other cloud storages easily, and multi-user and parallel queries had no performance issues in Omni. How we see Omni is that it can be a single pane of glass using which we can connect to various clouds and access the data using SQL-like queries," said Nitin Doeger, Data Engineering and Enablement manager at Johnson & Johnson.

Another early adopter, from the media and entertainment industry, had data hosted in multiple cloud environments and used BigQuery Omni to build cross-cloud analytics that correlate advertising with in-game purchases. They needed to optimize campaign spend and improve targeted ad personalization while lowering the cost per click for ads, but their campaign data was siloed across cloud environments in AWS, Microsoft Azure, and Google Cloud. In addition, the data wasn't synchronized across all environments, and moving data introduced complexity, risk, and cost. Using BigQuery Omni, they were able to analyze CRM data in S3 while keeping the data synchronized. The result was a marketing attribution solution that optimizes campaign spend and ultimately helped improve campaign efficiency while reducing cost and improving data accessibility across teams.

In 2022, new capabilities will include cross-cloud transfer and authorized external tables to help data analysts drive governed, cross-cloud scenarios and workflows, all from the BigQuery interface. Cross-cloud transfer helps move the data you need to finish your analysis in Google Cloud and find insights leveraging unique capabilities of BigQuery ML, Looker, and Dataflow. Authorized external tables will provide consistent and fine-grained governance with row-level and column-level security for your data. Together, these capabilities will unlock simplified and secure access across clouds for all your analytics needs.

To get started with BigQuery Omni, simply create a connection to your data stores and start running queries against your existing data, wherever it resides.
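To make that "create a connection and start running queries" step concrete, here is a minimal sketch of defining and querying an S3-backed table. It assumes an AWS connection named my_aws_connection already exists in the aws-us-east-1 BigQuery Omni location with access to the bucket, and that the dataset was created in that same location; all names are hypothetical:

```bash
# Define an external table over Parquet files in S3, using an existing
# BigQuery connection to AWS.
bq query --use_legacy_sql=false '
CREATE EXTERNAL TABLE omni_demo.campaign_events
WITH CONNECTION `aws-us-east-1.my_aws_connection`
OPTIONS (
  format = "PARQUET",
  uris = ["s3://my-campaign-bucket/events/*"]
);'

# Query the S3 data with standard SQL, just like any other BigQuery table.
bq query --use_legacy_sql=false '
SELECT campaign_id, COUNT(*) AS clicks
FROM omni_demo.campaign_events
GROUP BY campaign_id
ORDER BY clicks DESC
LIMIT 10;'
```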
Watch the multicloud session at Next '21 for more details. BigQuery Omni makes cross-cloud analytics possible! We are excited about what the future holds and look forward to hearing about your cross-cloud data analytics scenarios. Share your questions with us on the Google Cloud Community – we look forward to hearing from you.

Related article: Turn data into value with a unified and open data cloud – At Google Cloud Next we announced Google Earth Engine with BigQuery, Spark on Google Cloud, and Vertex AI Workbench.

Source: Google Cloud Platform

9 things I freakin’ love about Google Cloud identity and environments

I've been at Google Cloud just a few weeks, following years of experience as an AWS Hero and building on other clouds. So last week's Google Cloud Next—my first!—was a bit of a culture shock. On the GCP podcast, I used the word "intentionality" to describe what I'm seeing: a thoughtful, holistic approach that informs so much of how Google Cloud is put together. Not just in the headline-grabbing new announcements like Google Distributed Cloud, but in the everyday things too. Things like IAM and project setup.

Step 1 of any cloud project is to provision access to an environment, and that's why I always found it so frustrating in my past life when I had to deal with outdated or clunky stuff like:

- Homebrewed, sketchy SSO tooling and config files
- No centralized identity—I was a different person in every cloud account
- Mysterious logouts, redirects, and missing project context within the cloud console
- Account organization features that were "bolted-on" rather than designed the right way from the beginning

In contrast, I recently shared a Twitter thread about how shockingly right Google Cloud gets identity and environments. It's probably my favorite thing about Google Cloud so far, and so in this post I want to expand on what I've learned. If you're searching for a better way to access and organize your cloud, let me make you one of today's lucky 10,000.

Nine things to love about Google Cloud identity and environments

1. You are YOU!

Every user is just a Google account (personal or corporate) that works across projects. For beginners, this lowers the barrier to entry and makes cloud feel like an extension of things you already know. For experts, it reduces the friction of having to juggle a bunch of unrelated identities. I love that you can permit any Google account into a cloud project as a collaborator—even a contributor from outside your organization!

2. No non-IAM root accounts

Google Cloud has been designed from the ground up to avoid the chicken/egg problem of requiring a manually configured superuser that sits outside the rest of the identity management infrastructure. In the Google world, humans use Google accounts, and services use IAM-based service accounts—it's as straightforward as that. (Even non-Google services can be IAM—yay, workload identity federation!)

3. Project discovery for humans

Project, folder, and organization discovery are baked into the console, like browsing a file system scoped to your access level. This hardly even feels like a feature, it's so subtle and yet absolutely fundamental. But once you see it, you can't imagine going back to a world where environments exist in a vacuum with no contextual awareness of each other. The hierarchical organization model also means that project-per-application-per-environment best practices are the path of least resistance; if anything, I've erred on the side of setting up *too many* logical groupings. It's just too much fun to play with projects and folders!

4. Billing that protects you from yourself

The project context gives you a logical container for the cost of the resources contained within it. My favorite part of this is that your billing entity is managed separately from the project itself. So you can delete a project and feel sure that all associated resources are gone and no longer racking up charges … without also trashing your ability to pay for future projects you might spin up.
(Related: the free tier does not charge you money unless you click a big button that basically says "YES, IT'S OK TO CHARGE ME MONEY." This guarantee and the familiarity of Google Accounts for access are the main reasons I now recommend Google Cloud to beginners in my network who are looking for a safe place to learn and explore cloud.)

5. Organizational structure != billing structure

For organizations, billing is decoupled from the organization root. So permissions inheritance is a separate design decision from chargeback, as it should be. This keeps your Google Cloud footprint from converging toward Conway's Law.

6. SSO that just works

Want to use the CLI? You get SSO out of the box with your Google Account—no corporate organization required, and no manual shuffling with config files and access keys. Or, better yet, you can use Cloud Shell to run gcloud commands right in your browser, even (especially?) on the docs pages. (Random trivia: I think Cloud Shell is the only native cloud service that has the same name across AWS, Azure, and Google Cloud—but Google's version has been around the longest and, as far as I can tell, is the most fully featured.)

7. One group to rule them all

Remember how user entities are just Google accounts? Guess what: you can use Google Groups to manage group access to IAM roles! That's right: one set of users with permissions across docs, email, and cloud. It's one reason why Google Workspace makes sense as a core piece of Google Cloud; it really does function like just another cloud service from an identity standpoint.

8. Never lose your place

In other clouds, I've experienced a problem I call the Timeout of Doom: when your console session expires, you're left on a generic error screen and it's up to you to figure out how to rebuild your context from scratch—starting with remembering what account you used in the first place. Imagine my delight to realize that reaching your Google Cloud console is as easy as bookmarking a single URL: console.cloud.google.com works and remembers who you are (or, at least, suggests the set of people you might be)—no mystery logouts or redirects.

9. Progressive complexity FTW

In my experience it's been common for cloud providers to design most of their account features for organizations: if you're an independent developer, you get more exposure to dangerous bills, less access to helpful SSO features, and generally must fend for yourself in a world that wasn't really created with you in mind.

I love that Google Cloud has found a way to work with enterprises while still maintaining its roots as a cloud that developers love to use. Sign in with your personal Google account, attach it to an organization when and if you're ready, and in the meantime you get the same thoughtfulness around SSO and billing as the giant shop down the street.

I'm not going to tell you my experience has been seamless; there are footguns here (every Google Workspace integration creates a new project?), and I'm still learning. But it's that "intentionality" thing again. The Google Cloud identity and environment experience feels like it was designed, not just accreted; there's an elegant simplicity to it that makes cloud feel fresh and exciting to me all over again. I can't wait to see what's next.

In the meantime, I highly encourage you to do what I did and spin up a free trial to try things out for yourself; a couple of gcloud commands below will get you started.
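Here's a minimal sketch of points 6 and 7 in action; the project ID and group address are hypothetical, and the commands assume the gcloud CLI is installed (or that you're in Cloud Shell):

```bash
# SSO from the CLI: authenticate with your Google account in the browser.
gcloud auth login

# Grant an entire Google Group a role on a project - one set of users
# shared across Workspace (docs, email) and Cloud.
gcloud projects add-iam-policy-binding my-sample-project \
  --member="group:platform-team@example.com" \
  --role="roles/viewer"
```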
Then hit me up on Twitter with your favorite Google Cloud identity or environment feature!

Related article: 13 best practices for user account, authentication, and password management, 2021 edition – Google Cloud offers our best practices to ensure you have a safe, scalable, usable account authentication system.
Source: Google Cloud Platform

Screaming In the Cloud with Corey Quinn and Docker CEO Scott Johnston

On August 31st, Docker announced updates to our product subscriptions — Docker Personal, Pro, Team and Business. Our CEO Scott Johnston recently joined Corey Quinn on an episode of Screaming in the Cloud to go over all the details and discuss how the changes have been received by businesses and the broader developer community. 

The episode with Scott is titled “Heresy in the Church of Docker Desktop with Scott Johnston.” It’s a play on the title of a talk Corey once gave (“Heresy in the Church of Docker”) after he met Scott when they both worked at Puppet.

There’s a substantial discussion around Docker Desktop. Scott describes it as a unique hybrid — one that’s based on upstream open-source technologies (Docker Engine, Docker Compose, BuildKit, etc.), while also being a commercial product that’s engineered for the native environments of Mac and Windows, and soon Linux. He also recalls life before Docker Desktop when developers had to contend with complex setup, maintenance, and “tricky stuff that can go wrong” — all of which Docker Desktop handles so that developers can simply focus on building great apps.
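As a small illustration of that last point (a sketch, not something from the episode): on a machine where Docker Desktop is installed, the toolchain is already wired up, so trying out a containerized app is a couple of commands rather than an environment-setup project:

```bash
# Run an off-the-shelf web server in the background, publishing host port 8080
# to container port 80, then check on it and clean up.
docker run -d --name web -p 8080:80 nginx
docker ps
docker stop web && docker rm web
```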

Scott and Corey also discuss the why behind the new subscription tiers, and Docker Business in particular. A key factor was large organizations that use Docker Desktop at scale — as in hundreds or thousands of developers — requesting capabilities to help them manage those large developer environments. Another factor was the need to balance continued investment in Docker Desktop, which gives organizations increased productivity, flexibility, and security, with sustainably scaling the Docker business while still providing a generous free experience through the Docker Personal subscription.

According to Scott, the response from businesses to the updated subscriptions has been overwhelmingly positive. Not only have there turned out to be far more Docker Desktop users inside organizations than previously thought, but many companies have already proactively purchased a Docker subscription. The positive momentum is allowing Docker to accelerate items in the company’s roadmap for developers, such as Docker Desktop for Linux.

You can listen to Episode 264 of “Screaming in the Cloud,” titled “Heresy in the Church of Docker Desktop with Scott Johnston,” here.

Considering an Alternative to Docker Desktop?

Read this blog recapping Docker Captain Bret Fisher‘s video where he reminded his audience of the many things — some of them complex and subtle — that Docker Desktop does that make it such a valuable developer tool.
Source: https://blog.docker.com/feed/

A closer look at locations in Eventarc

New locations in Eventarc

Back in August, we announced more Eventarc locations (17 new regions, as well as 6 new dual-region and multi-region locations, to be precise). This takes the total number of locations in Eventarc to more than 30. You can see the full list on the Eventarc locations page or by running gcloud eventarc locations list.

What does location mean in Eventarc?

An Eventarc location usually refers to the single region that the Eventarc trigger gets created in. However, depending on the trigger type, the location can be more than a single region:

- Pub/Sub triggers only support single-region locations.
- Cloud Storage triggers support single-region, dual-region, and multi-region locations.
- Cloud Audit Logs triggers support single-region locations and the special global region.

Before looking into trigger location in more detail, let's look at the other locations relevant in Eventarc.

What other locations are relevant in Eventarc?

Triggers connect event sources to event targets. Each event source, event target, and trigger has its own location. Sometimes these locations have to match, and sometimes they can be different. For example, a trigger can connect Cloud Storage events from a bucket in the europe-west1 region to a Cloud Run service in the us-central1 region, with the trigger itself located in the europe-west1 region.

In many cases, you don't have control over the location of the event source. In the example above, the Cloud Storage bucket is in the europe-west1 region. That's the location you need to work with, and it has implications for the trigger location (which I'll get to later).

The location of the event target is the region of the service where you want the events to go. You get to choose this from one of the supported regions when you deploy your Cloud Run service. You typically want this to be in the same region as your event source for latency and data locality reasons (but this is not strictly a requirement). In the example above, the event source (bucket) is in europe-west1 but the event target (Cloud Run service) is in us-central1, as specified by the --destination-run-region flag.

The location of the trigger is dictated by the event source location, but the trigger type also comes into play. It is specified by the --location flag. Let's take a look at the trigger location for each trigger type in more detail.

Location in Pub/Sub triggers

In a Pub/Sub trigger, you connect a Pub/Sub topic to an event target. Pub/Sub topics are global and not tied to a single region. However, when you create a Pub/Sub trigger, you need to specify a region for it (because Eventarc triggers need to live in a region) with the --location flag. By specifying a location, Eventarc automatically configures the geofencing feature in Pub/Sub so that events only persist in the specified location. As I noted above, you typically want to (but are not required to) choose the same region for the trigger and the Cloud Run service for lower latency and data locality. You can also use regional Pub/Sub service endpoints to publish to the topic to ensure that all of the data stays in a single region.

Location in Cloud Storage triggers

In a Cloud Storage trigger, you connect a Cloud Storage bucket to an event target. A Cloud Storage bucket can be in a single-region (e.g. europe-west1), dual-region (e.g. eur4), or multi-region (e.g. eu) location. The location of the bucket dictates the location of the trigger, and they have to match.
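As a sketch of such a trigger (reconstructed with the gcloud CLI for the europe-west1 example above; the trigger, bucket, service, and service account names are hypothetical):

```bash
# Cloud Storage trigger: --location must match the bucket's location
# (europe-west1 here), while the Cloud Run service can live elsewhere.
gcloud eventarc triggers create storage-events-trigger \
  --location=europe-west1 \
  --destination-run-service=my-service \
  --destination-run-region=us-central1 \
  --event-filters="type=google.cloud.storage.object.v1.finalized" \
  --event-filters="bucket=my-europe-west1-bucket" \
  --service-account=my-sa@my-project.iam.gserviceaccount.com
```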
The earlier trigger example was for a bucket in the europe-west1 single-region location. A trigger for a bucket in the eu multi-region location looks similar; the important thing is that the --location flag matches the bucket's location (--location=eu in that case). If the bucket location and the trigger location do not match, trigger creation fails with an error.

Location in Cloud Audit Logs triggers

In a Cloud Audit Logs trigger, you connect any event source that emits Audit Logs to an event target. The location of the event source dictates the trigger location. This is typically a single region, but there is a special global region that's necessary in some cases.

For example, if you want to read Cloud Storage events from a bucket in the europe-west1 region with an Audit Logs trigger, you create the trigger in that same region. Note that this will match all buckets in the europe-west1 region, as there's no filter by bucket in Audit Logs. On the other hand, if you want to match a dual-region or multi-region bucket such as eu, you create the trigger with the global location, because Audit Logs triggers only support a single region or the global region. Note that a global trigger will match all buckets in all regions.

As you can see from this example, if you want to read Cloud Storage events, the native Cloud Storage trigger is a much better option, but this example illustrates a typical case in which a global Audit Logs trigger is necessary.
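A global Cloud Audit Logs trigger for Cloud Storage events might look like the following sketch (again, all names are hypothetical):

```bash
# Audit Logs trigger in the global location: matches storage.objects.create
# events from buckets in all regions.
gcloud eventarc triggers create auditlog-storage-trigger \
  --location=global \
  --destination-run-service=my-service \
  --destination-run-region=us-central1 \
  --event-filters="type=google.cloud.audit.log.v1.written" \
  --event-filters="serviceName=storage.googleapis.com" \
  --event-filters="methodName=storage.objects.create" \
  --service-account=my-sa@my-project.iam.gserviceaccount.com
```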
That wraps up this closer look at locations in Eventarc. Feel free to reach out to me on Twitter @meteatamel with any questions or feedback.

Related article: Introducing the new Cloud Storage trigger in Eventarc – Learn how to use the new Cloud Storage trigger of Eventarc.

Source: Google Cloud Platform