Google Public Sector announces continuity-of-operations offering for government entities under cyberattack

Cyberattacks that target our government are all too common these days. From SolarWinds, to hacks against widely used email servers, to attacks against the defense industrial base, we know that cyberattacks against the public and private sectors continue to be an issue. Our latest VirusTotal malware trends report illustrates this point as well, with findings that governmental domains are among the top categories used by attackers in 2022 to distribute malicious content.

Given the external environment, government agencies in particular need reliable continuity plans in the event of an attack. In fact, two policy directives were recently issued to ensure that government entities can continue to operate. The first is Presidential Policy Directive 40, which advocates that critical services be sustained in the event of an emergency—such as a natural disaster, a pandemic like COVID-19, or a major cybersecurity or ransomware attack—and under which every U.S. government agency is expected to have a Continuity of Operations Plan (COOP) in place. More recently, the U.S. Cybersecurity & Infrastructure Security Agency (CISA) also emphasized the need for COOP in its incident response playbook to strengthen cybersecurity for federal and civilian agencies.

Google Workspace is postured to help government organizations with their business and collaboration continuity needs, ensuring agency teams continue to work effectively and securely in the event of an attack, with critical productivity tools like email, storage, document sharing, and more. As federal, state, and local agencies consider what they may do in case a breach threatens their operations, continuous access to email, chat, and videoconferencing systems throughout a catastrophic situation is a top priority. Because Google has pioneered the zero-trust approach in security with its BeyondCorp implementation, and has both FedRAMP High and Department of Defense Impact Level 4 certifications, Google Workspace can offer federal agencies peace of mind.

“Federal and state and local agencies are aiming to increase security while maintaining trust and availability, a task that is unachievable without continuity,” says Aaron Walker, research manager for World Wide Government Trust and Resiliency Strategies at IDC. “Google Workspace will help agencies utilize zero trust principles to ensure availability of document, email, and collaboration tools as incidents, breaches, and attacks occur.”

Making it easy to deploy secure alternatives

Google Workspace allows access to communications and collaboration tools that organizations need during and after an incident to keep work going. If one communication system goes down during an emergency, Google Workspace can keep collaboration and communication running smoothly. For provisioned users, Google Workspace can be operational immediately, allowing agency personnel to stay connected, access documents, and collaborate securely across Google’s email and collaboration platform to assist customers in times of need. Specifically, using Google Workspace, agency personnel can access their emails, documents, or current calendars from their primary providers—and access the agency’s Active Directory.

To help agencies implement this strategy and roll out Google Workspace, Google Public Sector is offering workshops for government agencies to help them determine the best strategies for resilient communications and collaboration. To learn more or to sign up for a workshop, email psworkspace@google.com or visit Google Workspace.
Source: Google Cloud Platform

Why should game companies choose Cloud Spanner to power their games?

Organizations globally use Cloud Spanner because of its unlimited scale, strong consistency, and up to 99.999% availability. In particular, game companies like Embark Studios use Spanner to help scale and solve the many challenges of a distributed system, such as concurrency and load balancing. According to a recent PwC report, the global gaming industry is one of the sectors that has experienced significant growth in the last couple of years, and the industry (excluding esports) is poised to be worth $321 billion by the year 2026. In the past three years alone, the industry has grown by half a billion players, totaling 2.7 billion players globally. It’s no surprise that game players seek an enriching experience that includes playing a variety of games, communicating with each other, and participating in out-of-game activities like watching game franchise movies, buying memorabilia, and more. Player expectations are rising and evolving: we see demand for cutting-edge graphics, an immersive in-game storyline, multiplayer game interactions across the globe, and, not to mention, 100% uptime. The right database is essential to satisfy these requirements, and there are many factors for game developers to take into consideration when choosing one.

The importance of choosing the right database

Creating a highly interactive, multiplayer game is not easy. It is in fact one of the most fascinating distributed systems problems. The game needs to ensure high uptime, low latency, the ability to scale on demand, and consistent experiences across geographic locations. It also needs to support sophisticated workflows, special events and tournaments, in-game purchases, in-game experiences – and the list goes on. And whether you’re creating the next blockbuster or indie favorite, you need to consider a plethora of design challenges and constraints, like capacity forecasting for millions of players demanding real-time experiences. In the midst of it all is the key question: What database will solve the many challenges of a distributed system?

For example, updating a live table that is currently serving existing users requires a lot of caution and careful planning. Is the table serving end-user traffic? Is the database serving relatively high QPS, and is the change going to prevent users from accessing the table? How important is “zero downtime” for your game and your brand? Do you need to relax some constraints, likely around serving slightly stale reads? Do you need to worry about consistency across all your nodes? Do you need to scale out the database? And so on. For a traditional database, the onus is on the game developers to design around all the constraints above, taking focus away from the game itself. Often, in order to work around the limitations of the database, developers end up designing and building game application logic that is incredibly complicated.

Another dimension that game companies have to take into account is the total cost of ownership (TCO). How much do you need to spend on operating and maintaining the database? If it is a self-maintained database, you have to take into account the database administrator, operational, and maintenance costs – not to mention the cost of patching and keeping security credentials up to date. No matter your size, it’s incredibly important as a company to optimize your resources, both human and capital, to allow you to focus on your game rather than infrastructure maintenance.
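To make the live-table question above concrete: Spanner applies schema changes online, as a background operation, while the table keeps serving reads and writes. The snippet below is a minimal sketch using the google-cloud-spanner Python client; the project, instance, database, and table names are hypothetical, not taken from any customer setup.

```python
# Minimal sketch: an online schema change on a table that is serving traffic.
# Project, instance, database, and table names are hypothetical.
from google.cloud import spanner

client = spanner.Client(project="my-game-project")
database = client.instance("game-instance").database("game-db")

# update_ddl() starts a long-running operation; Spanner keeps serving
# reads and writes on the Players table while the change is applied.
operation = database.update_ddl(
    ["ALTER TABLE Players ADD COLUMN Level INT64"]
)
operation.result()  # Block until the schema change completes.
print("Schema change applied with no downtime window.")
```

With a traditional database, the same change might require a maintenance window or a carefully choreographed dual-write migration; here, the operational questions above largely reduce to waiting on one background operation.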
Having a robust, secure, and scalable infrastructure provided to you as a service means faster, easier, and better optimized game development cycles. Choosing the right database is one of the key success drivers behind a bestselling game.

Why Spanner is the best choice for powering game development backends

That’s where Spanner comes in as the database of choice for your game development needs. Here are the reasons why you should choose Spanner:

Spanner can solve many of the challenges of distributed transactions. As a fully managed, globally distributed, and strongly consistent relational database with unlimited scale, Spanner is built for the cloud to combine the benefits of a relational database structure with non-relational horizontal scale. Spanner serves over 2 billion requests per second at peak without any downtime or maintenance windows. For game developers who often run smaller workloads or support a smaller user base initially and then expect to scale seamlessly, we’ve introduced granular instance sizing and committed use discounts. You can also get started at no cost with the Spanner free trial instance!

Spanner easily handles large unexpected workloads, allowing you to focus on your game. One of the biggest pain points that game developers face is building a distributed system that can respond to sudden changes in workload. A major concern for any game company is scalability: if their game goes viral globally, their infrastructure must be able to respond to sudden, incredible demands, while maintaining a consistent gamer experience and operational stability. Since Spanner automatically handles replication, sharding, and transaction processing, your game can quickly scale as needed to meet any unforeseen or spiky usage patterns. By using Spanner, you can truly enjoy all the benefits of relational semantics (Google Standard SQL and PostgreSQL-dialect databases) with unlimited horizontal scale, and allow your teams to focus on what matters most: the game itself.

Spanner can power gamers globally and provide them with a consistent experience. Spanner’s ability to provide clients with the strictest concurrency-control guarantees for transactions allows game companies to keep design patterns simple. Indeed, by using Spanner, you can focus your time and energy on solving the hard problems of game development rather than maintaining the database. “With Spanner, the latency is so low between the regions that we can maintain a triple-continent database. That means our players won’t have to choose between playing with their American friends or their European friends. They can just switch regions whenever they want,” said Luna Duclos, a lead backend engineer at Embark Studios.

Spanner’s use cases in the gaming vertical are endless. Spanner can accommodate a wide range of database management functions, including:

- Authenticating and authorizing players’ profiles.
- Creating various stateful ingestion workflows from game clients, servers, and other applications.
- Managing users, inventory, and profiles.
- Implementing in-game purchases and leaderboards.
- Storing and quickly serving players’ data.

Below is a typical example architecture of a popular game company that uses Google Cloud to develop one of the world’s most popular online games.

Figure 1

Three Spanner features are especially important to game companies:

High availability: Spanner has incredibly high availability and latency guarantees that make building reliable applications on top of Spanner simpler.
Spanner provides transactional write support, assuring high robustness guarantees and industry-leading 99.999% availability for multi-regional instances. This high availability means that designing around the SLAs isn’t needed.

Backup and recovery management: Spanner’s backup management is easy to configure and offers point-in-time restoration to provide simplified business continuity. The backups are engineered to have no impact on serving traffic. Spanner also provides point-in-time recovery to make data fixing and the recovery of individual rows, tables, and whole databases simple, which is exceptionally useful when dealing with thousands or millions of transactions per day.

No cost to get started: The Spanner free trial instance lets you learn and explore Spanner at no cost for 90 days. Once you decide Spanner is the right database, you can scale as needed with no limits, and only pay for the compute capacity and storage that you use. In addition, you don’t need to overprovision for your game launch, and you can use the Autoscaler to manage your Spanner utilization.

As your game grows over time to support more players, Spanner will continue to be the database of choice for prototyping, scaling, and adjusting data models to maintain effective, low-overhead engineering. We can’t wait to see how your business uses Spanner to unlock your game’s full potential!

Get started with Cloud Spanner

To get started with Spanner, create a database, or try it out with a Spanner Qwiklab. You can learn more about how to develop global multiplayer games using Cloud Spanner and about best practices for using Cloud Spanner as a gaming database. We would like to thank Aalok Muley and Sneha Shah for their help on this blog post.
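To give a feel for day-to-day application code, here is a minimal sketch of a strongly consistent read-modify-write on a player record, again using the google-cloud-spanner Python client. The schema, identifiers, and project names are hypothetical illustrations, not a prescribed design.

```python
# Minimal sketch: a strongly consistent read-modify-write on a player record.
# Project, instance, database, table, and column names are hypothetical.
from google.cloud import spanner

client = spanner.Client(project="my-game-project")
database = client.instance("game-instance").database("game-db")

def award_coins(transaction, player_id, coins):
    # Read and update inside a single transaction; Spanner provides the
    # strictest concurrency-control guarantees across all replicas.
    row = transaction.execute_sql(
        "SELECT Coins FROM Players WHERE PlayerId = @pid",
        params={"pid": player_id},
        param_types={"pid": spanner.param_types.STRING},
    ).one()
    transaction.update(
        table="Players",
        columns=("PlayerId", "Coins"),
        values=[(player_id, row[0] + coins)],
    )

# run_in_transaction retries the function automatically on transient aborts.
database.run_in_transaction(award_coins, "player-123", 50)

# A snapshot read, e.g. for a leaderboard, never blocks writers.
with database.snapshot() as snapshot:
    for player_id, coins in snapshot.execute_sql(
        "SELECT PlayerId, Coins FROM Players ORDER BY Coins DESC LIMIT 10"
    ):
        print(player_id, coins)
```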
Source: Google Cloud Platform

CISO Survival Guide: How financial services organizations can more securely move to the cloud

It’s not just children and adults who face excitement and nervousness on the first day of school. The first day in the cloud can be daunting for financial services organizations, too. Chief Information Security Officers must lead the cloud security component of their organization’s digital transformation, a complicated task beset by many questions that the members of our Google Cybersecurity Action Team can help answer. We want to help you move into the brave new world of digital transformation and build engaged, robust cybersecurity teams as you go, because there is no “one size fits all” approach to cloud security.

We’ve worked with many financial services organizations in the middle of their transformations. Some want to revolutionize how their organizations achieve their cybersecurity goals. Others want to have minimal viable security controls for Day 1 launches. Each organization has its own operational and technological needs, its own funding sources, and its own risk appetites, all of which can fundamentally influence security strategy.

We’re here to offer our real-world knowledge and experiences from Google’s Office of the Cloud CISO to help you move boldly – and more securely – to the cloud. We do this as part of our commitment to operate in a shared fate model that helps our customers achieve the best possible security outcomes. We strongly believe that secure organizations make for a more secure world.

First come the questions, so many questions

Many times, we go into customer organizations as they are on the cusp of moving to the cloud and hear questions such as:

- I’ve never done this before, what do I need to worry about first?
- How do we make sure we don’t move our technical and cyber debt to the cloud?
- What are the key threats that I need to pay attention to?
- What on-premises baggage am I going to be left with?
- How do I organize my team to best address the things that we need to focus on?

What becomes apparent from these conversations is that technology and security leaders use moving to the cloud as an opportunity to transform their businesses. This is an excellent plan. However, just because technical and cyber debt were not created intentionally does not mean that they can be wished away. It takes a concerted effort to reduce risk by building on solid fundamentals and leveraging the advantages of the cloud to pay down that debt. These areas of concern and the strategies for addressing them can be categorized around your organization and its operations, technology, and people – and your CISO leadership.

Teach your organization to think cloud

Recently, security teams have been organizing around security compliance models such as the NIST cybersecurity framework. While this provides a foundation to discuss security disciplines and general security posture, it doesn’t necessarily provide the best way to organize your security team for optimal impact. In addition, most of these frameworks were developed before cloud was widely adopted in regulated industries. We now have more specialized knowledge and tools to more effectively serve specialized cases and verticals. As use of the cloud becomes more prevalent, frameworks need to evolve and adapt to new threats and a new operating environment with rapid business changes and agile IT.

Fundamentally, digital transformation is about organizational change management. A key component of preparing for digital transformation is guiding the people in your organization to evolve beyond on-premises mindsets to adopt new ones.
In our discussion of how CISOs need to adapt their mental models for cloud security, we noted that security during and after a digital transformation should focus on how network and endpoint security, detection and response, data security, and identity and access management (IAM) function in the cloud — and how taking advantage of those differences can help you build a more resilient security posture.

The right questions can drive security changes

One key question to ask yourself when making strategic and tactical decisions is: Why am I implementing this security control? Digital transformation provides an excellent opportunity to re-examine your team (because culture comes first in cloud transformation) and lead the way to changes that address your organization’s go-forward strategies when it comes to firewalls, antivirus software, applications, data protection, your overall security and risk postures, and your backup plans. Changing technical controls first rarely leads to success.

Your organization needs to have a clear vision and set objectives to determine how to most effectively achieve its security goals. Most of the time this means that CISOs and their teams have to reach outside their comfort zone and work with technology, business, and other partners to achieve success. If your organization goes down the path of “it’s always been done this way on-premises,” your cloud transformation is more apt to be inefficient and ultimately block the business from achieving agility and security.

At the September conference Measuring Cyber Risk in the Financial Services Sector, hosted by MIT and the Federal Reserve Board, an audience member posed an important question to the panel: Why do cyber insurers ask if I have file integrity monitoring installed?

This kind of question from cyber insurers is indicative of the mindset that should evolve with the digital transformation process. We want to be open to new opportunities to rethink practices and architecture. File integrity in a vacuum means very little to the overall risk reduction of your organization. Depending on their objective, cyber insurers could have asked a different set of questions, such as: How do you ensure that critical payment data is not altered in the transaction flow? And how do you ensure that software running in production is authorized and not altered? Both questions could be answered with file integrity monitoring. However, answering a question on a cyber insurer’s questionnaire provides little to no value. It’s a check-the-box exercise that doesn’t provide a measurable security benefit. Cloud provides the same opportunities to rethink standard controls and generate better security and business outcomes.

As you begin implementing security in the cloud, keep in mind what your organization’s ideal security posture should be and come to an agreement with stakeholders (including business and IT leaders) about how you can set and achieve your goals. The first steps offer an invaluable “pressure test” for your organization – and take comfort in the fact that very few CISOs get it right on the first try. That’s why you should be adaptable, be open to change, and work to minimize organizational strife as much as possible.

We will continue this discussion in the next blog, focused on the realities of starting the operational transformation. To learn more now, check out our podcast on CISO frustrations, successes, and lessons learned, and our guidance report on cloud security transformations.
Review the Google Cybersecurity Action Team site for additional papers and other guidance.
Source: Google Cloud Platform

Veterans Day: Q&A with Terradepth about mapping the ocean floor, the final frontier on Earth, with Google Cloud

Editor’s note: November 11th is Veterans Day—a day for us to recognize, celebrate, and honor military veterans of the United States Armed Forces. Today, Shannon Sullivan, Director, SCS Public Sector at Google, has a conversation with Terradepth and its founder, Joe Wolfel, to learn how they are using Google Cloud to grow their business and serve their customers.

Shannon: Hi Joe. It’s great to talk with you about Terradepth and your pioneering work in ocean floor mapping. Tell us about the business and what led you to found the company.

Joe: When we founded Terradepth in 2018, we were inspired by the opportunity to make a positive difference to the planet. Judson (the other Terradepth founder) and I previously served in the U.S. Navy SEALs, so much of our time in the military was spent surrounded by the sea. When we moved into business, we were struck by how little we know about this environment and the lack of data that exists to support fast decisions if your business involves subsea activity. We were determined to expand people’s knowledge of the oceans.

Joe: We’re one of the world’s only vertically integrated ocean data companies. We help people collect, manage, and deliver ocean data to support faster decision making. Our cloud platform makes it easy to use, share, and disseminate data. This helps to make up for the fact that people really know very little about the ocean floor—only 23% has been mapped so far. Because we use unmanned systems in the ocean, our technologies can go deeper, farther, and for longer periods of time, which opens up entirely new areas.

Shannon: That’s fascinating. Can you tell me a bit more about your customers? Where do they work and what challenges do they face?

Joe: Our customers are anyone who cares about the ocean, from the public to professional hydrographic surveyors who supply data to offshore wind companies, submarine telecommunications, and others. We’re collecting a repository of scalable, cost-efficient data that will inform every industry with a connection to our oceans. It can also help tackle the critical environmental and social issues facing the planet. This includes predicting atmospheric weather patterns, building underwater energy and telecom infrastructures, and protecting the future of our coastal communities.

Joe: We’ve designed our Absolute Ocean data platform to be simple, so that most anyone can use it to explore the ocean, but also sophisticated enough for a professional hydrographic surveyor responsible for annotating, analyzing, and delivering data.

Terradepth founder Joe Wolfel

Shannon: Ok, so it sounds like the choice of platform has been critical to delivering on your vision. What led you to choose Google Cloud for your business?

Joe: The reason we chose Google Cloud to host Absolute Ocean is that we know just how powerful and useful Google Earth is, and we wanted to try to do something similar for the ocean floor. When it comes to our infrastructure, Google Kubernetes Engine (GKE) is our orchestration platform, and we use Google Cloud SQL as our relational database. GKE is especially important for the growth of the business. It has a modern, flexible software architecture that doesn’t really exist yet in the maritime sector, where most software solutions are PC-based. By pushing the boundaries and moving to the cloud, we can give our clients the flexibility to use the platform in the ways they need.

Shannon: That’s great.
We hear this a lot from start-ups in sectors that haven’t yet adopted the cloud, especially the point about giving the customer greater choice. Can you also tell us a bit more about the Google for Startups Cloud Program? How helpful was that?

Joe: We had a great experience. Leveraging the financial support and credits from the program is ideal because it enables us to make the most of limited resources when we’re starting out. We were also impressed by the support for developing on Google Cloud. Whenever we had questions, the Google startup experts answered them quickly and helped keep us on track.

Shannon: This is good to hear. Thinking again about the maritime sector and ocean mapping, can you tell us more about how Google Cloud helps you solve challenges facing this industry?

Joe: It’s helped us take a big leap forward. People might be surprised, but most maritime data is still transported by mailing hard disks back and forth or emailing PDFs. With Google Cloud we have a modern solution for capturing and managing data that enables individuals and businesses to make smart, fast decisions with respect to the ocean. Best of all, we didn’t have to build our infrastructure; it was already there. We just had to add on top of what Google Cloud already provides.

Joe: This is critical from a vertical-integration standpoint. We’re collecting ocean data, but we’re also offering the platform infrastructure. Google Cloud massively reduces the effort we need to build the Absolute Ocean platform, so we can stay focused on continually adding to the data from subsea surveys.

Shannon: The other thing I’d like to focus on is your journey as start-up founders. You’ve already been able to accomplish a lot, and what you’re doing is game changing for businesses, communities, and environmental stewardship. I’m curious: are there things that you’re particularly excited about or proud of right now?

Joe: The team at Terradepth is what inspires me the most. We have an outstanding group of people working incredibly hard to widen our understanding of the last unexplored frontier on the planet. The oceans contain 98.5 percent of the Earth’s habitable volume, yet we know hardly anything about them. They also support a highly valuable economy that comprises everything from marine fishing to shipbuilding and maritime freight transport. This is estimated to be worth $3 trillion by 2030.

Joe: That’s one of the reasons I enjoy working at Terradepth. There is a broader, larger mission that could literally change the way we interact with the ocean.

Shannon: Shifting away from technology, and focusing again on Veterans Day, can you tell me more about your journey from serving as Navy SEALs to founding Terradepth?

Joe: I think there’s a lot in common between serving in the military and founding a business. Being a Navy SEAL gives you the confidence to problem solve and figure things out in unknown situations. You also acquire lots of stable, simple mental models that allow you to calculate, make good decisions, and accept risk. Other things that translate from combat situations include how to lead a team and how to take care of people. There’s a certain amount of stress tolerance too. When things get really bad, which they can in a start-up, you can always sit back and think, “Well, I’ve seen a lot worse in combat.”

Joe: The veteran network has also made a big difference, including our personal contacts and people who are in similar situations running their own companies.
Although we haven’t used them ourselves, The Honor Foundation, The Station Foundation, and The Commit Foundation do great work helping veterans transition to the civilian workplace. I also had the unique opportunity to work for McChrystal Group, a consultancy led by Stanley McChrystal, a retired four-star army general.

Shannon: Going back to everything that you said, are there any other challenges that you had to overcome as a veteran entrepreneur?

Joe: In the military, you acquire a community and a culture that starts day one of your assessment and training. But when you launch your own business, you’ve got to create that culture by yourself. You can’t lift and shift the SEAL culture and just apply it to a private company. People tried that, and it didn’t work. You’ve got to be able to adapt and think critically about what you can retain from your military training to make your business team successful.

Terradepth team members

Shannon: You touched on the importance of culture in the military and the civilian workplace. Could you expand on that?

Joe: The one thing the military really taught me, specifically in combat, is how to tackle highly complex or chaotic situations. The way to succeed is to get multiple diverse, but informed, perspectives weighing in on a problem.

Joe: We’ve taken that iterative thinking around how to attack complex problems and transferred it to Terradepth. We’ve created a cloud-based ocean data platform that enables different perspectives to collaborate on common problem sets and get to solutions, instead of passing data back and forth between different stakeholders.

Shannon: What other advice do you have for veterans looking to start their own business or their own company in the technology space?

Joe: Be confident, but not overconfident. You’re not going to have deep subject matter expertise in technology, so don’t let that lead you into making expensive mistakes. The flip side is that as a veteran you have a lot of the tools for success, including leadership and decision-making.

Joe: This really matters when pitching to investors. Most are used to seeing younger people with a wealth of subject matter expertise. We had to look hard for the right investors who were prepared to back generalists, such as us, with other strong skills. We played on our personality strengths, experience at building teams, and an ability to lead people through a series of challenges.

Joe: One last piece of advice. Your learning doesn’t end when you leave the military. You’ve got to take responsibility for discovering more about your industry. This will make you credible with your investors, your customers, and your team as it expands. It won’t guarantee success, but it will certainly increase your chances!

If you want to learn more about how Google Cloud can help your startup, visit our page here to get more information about our program, and sign up for our communications to get a look at our community activities, digital events, special offers, and more.
Source: Google Cloud Platform

Announcing Private Marketplace, now in preview

Google Cloud Marketplace is home to a wide variety of useful products for organizations across industries. The great selection it provides can create challenges, however, for cloud administrators who want to ensure that employees within their organization can easily view IT-vetted and approved applications. For example, when faced with such a broad range of products, employees might deploy an incompatible product version by accident, or they might not know your IT team prefers a specific product, leading them to adopt software your team hasn’t reviewed and approved.

To help avoid these issues and make it easier for your teams to quickly access the software and tools they need to do their jobs, we’re proud to introduce Private Marketplace, in preview today. The new Private Marketplace feature allows IT and cloud administrators to create a private, curated version of Google Cloud Marketplace that’s accessible to employees within their organization. With your own Private Marketplace, you can:

- Curate product collections for your org: Have your employees enter the Google Cloud Marketplace through a customized collection by default, surfacing the products that your IT team has selected first. This ensures that they can identify pre-approved products quickly and with certainty.
- Prevent redundant products: When organizations — especially larger ones — use common products across teams and business units, it may be easier to negotiate deeper discounts or take advantage of volume savings promotions. Aligning on common products can also reduce complexity and simplify knowledge sharing.
- Reduce “shadow IT”: When teams and business units buy products that your central IT or governance team doesn’t have visibility into, it can lead to non-compliant products and security risks. Setting up a Private Marketplace allows you to enforce controls by only showing approved products and encouraging their use. Improve visibility further by activating the Request Procurement workflow.

Best of all, setting up Private Marketplace is easy. And if you have teams or users that typically use different products from those offered in Google Cloud Marketplace, you can create multiple collections for each of them so they only see what’s most relevant.

How to set up Private Marketplace

Private Marketplace is easy to set up in just a few simple steps. And while it isn’t a replacement for identity and access management (IAM) or organization policy, Private Marketplace can transform how your organization scales compliant product discovery:

1. As an Organization Administrator or Organization Governance Administrator, navigate to Google Cloud Marketplace > Marketplace Governance > Private Marketplace.
2. Click Create collection.
3. Enter the name and a brief description of the collection you are setting up.
4. In Add products, click + Add and paste the URLs of the product(s) you want to include.
5. Click Share, select the organization, folder(s), or project(s) you want to share this collection with, and click Save.
6. Return to Marketplace Governance, and toggle the switch next to “Make your Private Marketplace visible in the Google Cloud Marketplace.”

And that’s it. The users in the organization, folder(s), or project(s) you set will be able to access your Private Marketplace. And they’ll still have access to the wider Marketplace from there if they want to discover products that aren’t yet included in your collection.
Learn more about this new preview feature set in the Create a Private Marketplace and Create and Share a Collection documentation.
Source: Google Cloud Platform

Upskill for in-demand cloud roles with no-cost training on Coursera

Cloud technology has experienced accelerated adoption in recent years, with continued growth expected into 2023.[1] This means that the need for organizations to attract and retain professionals with cloud skills continues to grow in parallel.[2]

Keep your cloud career growing, at pace with digital transformation

In partnership with Google Cloud, Coursera is offering no-cost access to some of our most popular cloud training to help you hone your skills and stand out in the job market. Whether you’re looking to enhance your technical competencies, advance your career, acquire more hands-on experience, or earn learning credentials to validate your knowledge, we have resources available to support your journey.

Future-proof your career with select no-cost training and earn certificates

Claim one choice from a variety of popular Google Cloud Projects, Professional Certificates, Specializations, and courses, available to claim until December 31st, 2022. The Google Cloud training included in this promotion spans a variety of roles, like machine learning engineering; data engineering; and cloud engineering, architecture, and security. Training content is available for both technical and non-technical roles, from foundational to advanced knowledge and experience levels. The training descriptions include any prerequisite knowledge you should have before getting started. The time requirements for completion also vary, so we’ve summarized them below to help you make your choice and pick the level of commitment that is right for you. When you finish the training on Coursera, you will earn a certificate that you can share with your network on social media and your resume.

Types of Google Cloud training available on Coursera

Here is a rundown of the different types of training available on Coursera included in this offer, in order of time required to complete:

- Projects: Approximately 30-90 minutes to complete. Learn new skills in an interactive environment by using software and tools in a cloud workspace, with no download required.
- Courses: Approximately 4-19 hours to complete. Courses typically include a series of introductory lessons, step-by-step hands-on exercises, Google knowledge resources, and knowledge checks.
- Specializations: Approximately 2-6 months to complete. Specializations are a series of courses that help you master a skill, and include a hands-on project.
- Professional Certificates: Approximately 1-9 months to complete. Professional Certificates include hands-on projects and courses, and upon completion you will earn a Professional Certificate. These can help you prepare for the relevant Google Cloud certification exam.

Here is a look at some of our most popular training for in-demand cloud roles

Work through training at your own pace, and upskill for the role you’re in, or the one you’re looking to grow into. Popular training for in-demand roles includes:

For those in non-technical roles, working closely with cloud technology
Professional Certificate – Cloud Digital Leader
This is a foundational-level series of four courses designed to give you knowledge about cloud technology and data, and digital transformation. It helps increase confidence in contributing to cloud-related business initiatives and discussions. If you’re in a tech-adjacent role such as sales, HR, or operations, you will benefit from this training.
For Application Developers
Specialization – Developing applications with Google Cloud
This Specialization is built for application developers who want to learn how to design, develop, and deploy applications that seamlessly integrate managed services from Google Cloud. It includes a variety of learning formats, including labs, presentations, and demos. Labs can be completed in your preferred language: Node.js, Java, or Python. You’ll learn practical skills that are ready for immediate use in real IT environments.

For experienced ML and AI Engineers
Professional Certificate – ML Engineer
Prepare for Google Cloud Certification with the Machine Learning Engineer Professional Certificate. This is an intermediate-level training recommended for participants who have data engineering or programming experience, and who want to learn how to apply machine learning in practice and be successful in a machine learning role. There are 9 courses in this Professional Certificate, and completion time is about 7 months at the suggested pace of 5 hours per week.

For beginners with Google Cloud in technical roles
Course – Google Cloud Fundamentals for AWS Professionals
This course introduces key concepts and terminology through a combination of videos and hands-on labs that can be completed in approximately 9 hours. You’ll learn about the components of the Google network infrastructure and the differences between infrastructure as a service and platform as a service; how to organize projects and interact with Google Cloud; and how to jump into Google Cloud Compute Engine with a focus on virtual networking.

For beginners in Data Engineering
Project – Introduction to SQL for BigQuery and Cloud SQL
This is a self-paced lab that takes place in the Google Cloud console, giving you interactive practice running structured queries on BigQuery and Cloud SQL. This is a beginner-level project that takes about an hour to complete.

As the year comes to a close, it’s a great time to prioritize growing your cloud skills. Check out our no-cost Google Cloud training offers on Coursera, available until December 31, 2022.

1. According to Forbes: The Top 5 Cloud Computing Trends in 2023
2. According to Forbes: From Crisis to Opportunity: Tackling the U.S. Cloud Skills Gap
Source: Google Cloud Platform

Samsung upskills their Big Data Center teams to transform business

Samsung Electronics, one of the world’s largest manufacturers of smartphones, TVs, home appliances, and electronic devices, recently launched its Big Data Center (BDC) to improve the use of big data in development, marketing, and product sales. Samsung also focused on maximizing the value of its existing data by creating a data hub platform to harness and streamline big data resources, enabling a boost in its internal analysis and forecasting capabilities. These initiatives all support Samsung’s long-term transformation goal to become a data-driven organization where data drives actionable decisions.

Due to COVID-19 impacts, BDC employees had limited in-person training and learning opportunities, which also affected their ability to drive data center optimization. This prompted Samsung to creatively determine the best, most expedient route to ensure their teams could attend world-class training. Samsung did this whilst staying focused on their primary objective of maximizing the high capacity of the BDC and upskilling employees in the most cutting-edge technologies and trends.

Hundreds of employees work together in the BDC to design innovative, enterprise-level data utilization environments. This team spans a diverse set of technical expertise, including back-end developers, cloud architects, data engineers, data scientists, machine learning engineers, privacy and security experts, strategists, and planners. Samsung preferred a tailored approach to the desired curriculum and educational experience in order to satisfy these diverse learning needs. With significant experience and expertise in delivering learning across multiple skill sets and delivery formats, Google Cloud Customer Experience services guided Samsung to a successful learning process alongside data center optimization.

Google Cloud Learning services and Customer Care services were engaged to meet Samsung’s learning and data center optimization objectives. Initially, the Learning services team proposed a project to nurture Google Cloud champions within the BDC team by utilizing the on-demand training program, including Google Cloud Skills Boost hands-on training and Google Cloud courses available on Coursera. The selected courses were intentionally designed to cover a wide range of product solutions, including AI Platform, BigQuery, Cloud Composer, Cloud Dataflow, Compute Engine, and Pub/Sub.

In addition, Samsung leveraged Google Cloud’s enterprise-level support service, known as Premium Support, to further extend their technical capabilities while meeting the learning needs of BDC employees. Samsung had previously chosen to expand their Technical Account Management capability with the addition of Value-Add Services, ensuring that multiple Technical Account Managers (TAMs) were engaged. With proactive collaboration between BDC employees and the dedicated team of TAMs, who arrived with customer-aware, knowledge-driven insight and guidance, each phase of the prescribed learning program delivered the desired tailored development and implementation. Premium Support services also ensured that the learning program suited both the immediate needs of BDC employees and the larger enterprise-wide initiatives to foster and drive digital transformation.

The learning program produced successful results along two dynamic paths.
First, Samsung satisfied their employees’ thirst for continuous learning by building on existing skills with on-demand access and a customized learning curriculum. Additionally, to reinforce the advantages gained from the cloud learning program, the TAM team organized a Google Kaggle Hackathon (GKH), enabling learning participants to gain and demonstrate new proficiencies with tools such as Vertex AI, BigQuery, and BigQuery ML in a competition format. Second, Samsung extended their data center productivity and capabilities to optimize the capacity of the BDC with Premium Support services. With tailored guidance from their Premium Support TAM team, Samsung effectively cultivated technical and digital transformation across their business.

The BDC employees welcomed the diverse, interactive opportunities to expand their education, and many have directly or indirectly improved their work performance by completing the courses.

“Employees are highly satisfied by providing high-quality and diverse educational opportunities. Thanks to this, we won the Samsung Culture Index at the end of the year.” —Wooseung Jang, Corporate EVP, Head of Big Data Center, Samsung Electronics

The flexibility of the on-demand Google Cloud Learning services and Premium Support services ensured that the BDC team participated in an innovative learning program in a manner that did not disrupt broader business operations and efficiencies. In fact, the BDC has already experienced a significant increase in monthly recurring revenue stemming from the more robust operational efficiencies and the resulting employee innovations. Samsung successfully launched their enterprise data hub, with plans to expand and create a Big Data Center in America in the near future.

Their educational program is also gaining momentum, with a fivefold increase in participants in one year. Currently, over 100 BDC employees participate in Google Cloud training through on-demand courses offered on Coursera, with plans to expand this program to hundreds of employees in data analytics and machine learning next year. By ensuring their employees’ skill sets are optimized, Samsung has gained reassurance that their Big Data Center will remain optimized for continued innovation and performance at the highest levels.

To learn more about how Google Cloud Customer Experience services can support your organization’s talent transformation journey, visit:

- Customer Care Premium Support to empower business innovation with expert-led technical guidance and support
- Google Cloud Training & Certification to expand and diversify your team’s cloud education
Source: Google Cloud Platform

Announcing MongoDB connector for Apigee Integration

MongoDB is a developer-friendly application data platform that makes it easy for developers to access a wide variety of data using a unified language interface, simplifying the data handling process. MongoDB Atlas, MongoDB’s fully managed cloud database, enhances MongoDB’s capabilities even further with full-text search and real-time analytics, as well as event-driven and mobile experiences.

Google Cloud’s Apigee is an industry-leading, full-lifecycle API management platform that provides businesses control over and visibility into the APIs that connect applications and data across the enterprise and across clouds. MongoDB and Apigee have already partnered to provide a solution that eases and secures access to siloed data for internal developers or partners. Today, we are further simplifying this solution by announcing a new connector between Apigee and MongoDB.

How is it simpler?

It can be complex to connect data and applications. Developers need to create and maintain custom transactional code between cloud apps to create the connection between the data source and application:

- This code is often the first to break.
- It is not cost-effective, as it is not reusable.

Last year, Google Cloud announced Apigee Integration, a solution that helps enterprises easily connect their existing data and applications and surface them as accessible APIs that can power new experiences, expand digital ecosystems, and protect access to critical assets. Apigee provides a secure facade between the frontend application and the data source to speed up the development process, using standard interfaces and a simplified developer experience.

Apigee Integration now includes an out-of-the-box MongoDB connector. With this connector, developers can perform CRUD operations on a MongoDB database. The need to set up programming modules and expose them through a RESTful interface is eliminated. The connection to MongoDB Atlas can be set up directly in the Apigee UI, with support for advanced MongoDB connection settings. Because the connector is part of Apigee Integration, it also provides the ability to transform the data using the transformation engine from Google Cloud. You can easily design your transformation logic using a drag-and-drop interface, and manage variables in different formats (JSON, strings, arrays, and more) as well as conditional flows.

A concrete example

A healthcare company needs to share datasets with external partners. They chose MongoDB Atlas because it is fully managed and for its dynamic schema, which is ideal for building modern applications. Their partners can only consume the data through an API. For security reasons, they are not able to access the database directly.

Figure 1 shows how simple it is to implement a “plug and play” approach for this scenario, with built-in security at the edge of Google Cloud to prevent attacks using Cloud Armor and Apigee, as well as fine-grained governance for the partners.

Figure 1: High-level architecture that illustrates how to expose MongoDB Atlas through an Apigee platform without code

Figure 2 shows how easily the MongoDB connector can be deployed in the Integration designer, without maintaining any infrastructure. Business logic, like sensitive data approval, can be added to the connector before the data is returned to the partner.
In this example:

- The flow is triggered by an API call exposed by Apigee.
- The MongoDB connector retrieves the data.
- If the DataClass retrieved is A, an approval is requested in the UI.
- If the DataClass retrieved is B, only the necessary fields are sent back to the consumer, using the filtering capabilities.

Figure 2: Designer example to call the MongoDB connector from Apigee Integration and implement an approval workflow and data mapping

Developers Save Time in a Secure Environment

With this new integration between Apigee and MongoDB Atlas, developers now have a simpler experience for accessing relevant data. Instead of wasting time building transactional code, they can focus on implementing business scenarios in a secure and scalable environment.

Next Steps

- Introduction to Apigee X.
- Learn more about MongoDB Atlas.
- Learn about Apigee connectors.
- Learn how to set up an Apigee MongoDB connector.
- Extend your data to new uses with MongoDB and Apigee – blog.

We thank the many Google Cloud and MongoDB team members who contributed to this collaboration.
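As a closing illustration, here is what the partner-facing side of this pattern might look like. This is a hypothetical sketch: the endpoint URL, API key header, query parameter, and response fields are invented for illustration and are not part of the actual connector; the point is that partners never talk to MongoDB Atlas directly, only to the Apigee facade.

```python
# Hypothetical sketch: a partner consuming MongoDB Atlas data through an
# Apigee-managed API facade. URL, headers, and field names are illustrative.
import requests

APIGEE_ENDPOINT = "https://api.example-healthcare.net/v1/datasets/records"

response = requests.get(
    APIGEE_ENDPOINT,
    params={"cohort": "trial-42"},
    headers={"x-apikey": "PARTNER_API_KEY"},  # key issued to the partner app
    timeout=30,
)
response.raise_for_status()

for record in response.json().get("records", []):
    # DataClass B records arrive pre-filtered: only approved fields remain.
    print(record["recordId"], record["dataClass"])
```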
Source: Google Cloud Platform

Access modeled data from Looker Studio, now in public preview

In April, we announced the private preview of our integration between Looker and Looker Studio (previously known as Data Studio). At Next in October, to further unify our business intelligence under the Looker umbrella, we announced that Data Studio has been renamed Looker Studio. The products are now both part of the Looker family, with Looker Studio remaining free of charge. At Next we also announced that the integration between these two products is now available in public preview, with additional functionality.

How does the integration work?

Customers using the Looker connector will have access to governed data from Looker within Looker Studio. The Looker connector for Looker Studio makes both self-serve and governed BI available to users in the same tool/environment. When connecting to Looker, Looker Studio customers are able to leverage its semantic data model, which enables complex data to be simplified for end users with a curated catalog of business data, pre-defined business metrics, and built-in transformations. This helps users make calculations and business logic consistent within a central model and promotes a single source of truth for their organization.

Access to Looker-modeled data within Looker Studio reports allows people to use the same tool to create reports that rely on both ad-hoc and governed data. They can use LookML to create Looker data models by centrally defining and managing business rules and definitions in one Git version-controlled data model. Users can analyze and rapidly prototype ungoverned data (from spreadsheets, CSV files, or other cloud sources) within Looker Studio and blend governed data from Looker with data available from over 800 data sources in Looker Studio to rapidly generate new insights. They can turn their Looker-governed data into informative, highly customizable dashboards and reports in Looker Studio and collaborate in real time to build dashboards with teammates or people outside the company.

What’s new in the public preview version?

We are excited that we are now able to offer this preview to a broader reach of customers, many of whom have already asked for access to the Looker connector for Looker Studio. Additionally, with this public preview, additional capabilities have been added to more fully represent the Looker model in Looker Studio:

- We are providing support for field hierarchies in the Looker Studio data panel, to keep fields organized when working with large Explores. The data panel will now show a folder structure, and you will be able to see your fields organized in the usual ways – for Views, Group Labels, and Dimension Groups.
- We are providing greater visibility by exposing field descriptions in new ways, enabling users to quickly check the description information specified in the Looker model. Field descriptions will be available within the data panel and within tables in the report.
- Users will also see an option to “Open in Looker Studio” from Explores in Looker, enabling them to quickly create a Looker Studio report with a data source pointing back to that Explore.
- To ensure users are getting the most current data from the underlying data source, refreshing data in Looker Studio now also refreshes the data in the Looker cache.
Specifically, for this public preview, we’ve implemented enhanced restrictions on Looker data sources in Looker Studio, so admins can rest easy about testing out the functionality:

- We’ve disabled owner’s credentials for Looker data sources in Looker Studio, so each and every viewer needs to supply their own credentials, including for shared reports.
- We’re also currently disabling data download and email scheduling for these data sources in Looker Studio. We’re planning to integrate with these permissions in Looker in the near future.
- Calculated fields are disabled, so end users cannot define their own custom metrics and dimensions in Looker Studio, and need to rely on the fields defined in the Looker Explore.

How do I access the preview?

This integration encompasses the connector along with changes made to both Looker Studio and Looker to represent the Looker model and extend Looker governance in Looker Studio. There is much more to come as we continue our efforts to bring together a complete, unified platform balancing self-service and governed BI. We’re planning to continue adding functionality in Looker Studio to fully represent the Looker model, and we want to ensure Looker admins have insight into API activity coming from Looker Studio – similar to the way they might use System Activity in Looker today. In extending governance, we want to expand the circle of trust from Looker to Looker Studio, and we’ll be looking for customers to help us plan the best way forward.

This integration is compatible with Google Cloud-hosted instances running Looker version 22.16 or higher. To get access, an admin of a Looker instance can submit the sign-up form, providing an instance URL and specifying which organizational domain to enable. For more information on how to get started, go to the Looker Studio Help Center. For more information and a demo, watch the Next ‘22 session ANA202: Bringing together a complete, unified BI platform with Looker and Data Studio and the keynote ANA100: What’s new in Looker and Data Studio.
Source: Google Cloud Platform

How Telus Insights is using BigQuery to deliver on the potential of real-world big data

Editor’s note: Today, we’re hearing from TELUS Insights about how Google BigQuery has helped them deliver on-demand, real-world insights to customers.

Collecting reliable, de-identified data on population movement patterns and markets has never been easy, particularly for industries that operate in the physical world like transit and traffic management, finance, public health, and emergency response. Unlike online businesses, these metrics might be collected manually or limited by smaller sample sizes during a relatively short time. But imagine the positive impact this data could have if organizations had access to mass movement patterns and trends to solve complicated problems and mitigate pressing challenges such as traffic accidents, economic leakage, and more.

As one of Canada’s leading telecommunications providers, TELUS is in a unique position to provide powerful data insights about mass movement patterns. At TELUS, we recognize that the potential created by big data comes with a huge responsibility to our customers. We have always been committed to respecting our customers’ privacy and safeguarding their personal information, which is why we have implemented industry-leading Privacy by Design standards to ensure that their privacy is protected every step of the way. All the data used by TELUS Insights is fully de-identified, meaning it cannot be traced back to an individual. It is also aggregated into large data pools, ensuring privacy is fully protected at all times.

BigQuery checked all our boxes for building TELUS Insights

TELUS Insights is the result of our vision to help businesses of all sizes and governments at all levels make smarter decisions based on real-world facts. Using industry-leading privacy standards, we can strongly de-identify our network mobility data and then aggregate it so no one can trace data back to any individual. We needed to build an architecture that would provide the performance necessary to run very complex queries, many of which were location-based and benefited from dedicated geospatial querying. TELUS is recognized as the fastest mobile operator and ranked first for network quality performance in Canada, and we wanted to deliver the same level of performance for our new data insights business.

We tested out a number of products, from data appliances to an on-premises data lake, but it was BigQuery, Google Cloud’s serverless, highly scalable, and fully managed enterprise data warehouse, that eventually came out ahead of the pack. Not only did BigQuery deliver fast performance that enabled us to easily and quickly analyze large amounts of data at virtually unlimited scale, it also offered support for geospatial queries, a key requirement for the TELUS Insights business.

Originally, the model for TELUS Insights was consultative in nature: we would meet with customers to understand their requirements, and our data science team would develop algorithms to provide the needed insights from the available data sets. However, performance from our data warehouse proved challenging. It would take us six weeks of query runtime to extract insights from a month of data. To best serve our customers, we began investigating the development of an API that, with simple inputs, would provide a consistent output so that customers could start using the data in a self-serve and secure manner.
BigQuery proved itself able to meet our needs by combining high performance for complex queries, support for geospatial queries, and ease of implementing a customer-facing API.

High performance enabled new models of customer service

With support for ANSI SQL, our data scientists found the environment very easy to use. The performance boost was immediately apparent, with project queries taking a fraction of the time compared to previous experiences, and that was before performing any optimization. BigQuery’s high performance was also one of the main reasons we were able to successfully launch an API that our customers can consume directly and securely. Our customers are no longer limited in the size of their queries and now get their data back in minutes. In the original consulting model, customers were dependent on our team and had little direct control over their queries, but BigQuery has allowed us to put the power of our data directly in our customers’ hands while maintaining our commitment to privacy.

Using BigQuery to power our data platform means we also benefit from the entire ecosystem of Google Cloud services and solutions, opening new doors and opportunities for us to deepen the value of our data through advanced analytics and AI-based techniques such as machine learning.

Cloud architecture enabled a quick pivot to meet COVID challenges

When the COVID-19 pandemic hit, we realized there was huge value in de-identified and aggregated network mobility data for health authorities and academic researchers working to reduce COVID-19 transmission without compromising the personal privacy of Canadians. As our TELUS Insights API was already in place, we were able to immediately shift focus and meet this public health need. Our API allowed us to give government organizations and academic institutions supervised, guided access to our de-identified and aggregated data, after which they were able to build their own algorithms specific to the needs of epidemiology. BigQuery also enabled us to build federated access environments where we could safelist these organizations and, with appropriate supervision, allow them to securely access the views they needed to build their reporting.

COVID-19 use case (figure caption): De-identified and aggregated mass movement patterns from the City of Toronto into outlying regions in May 2020, when stay-at-home orders were issued by the City and residents started traveling to cottage country. Public health authorities were able to use this data to inform local hospitals of the surge in population in their surrounding geographic area and to attempt to provision extra capacity at nearby hospitals, including much-needed equipment such as ventilators.

Our traditional Hadoop environments could never have adapted to that changing set of requirements so quickly. With BigQuery, we were able to get the system up and running in under a month. That program, now called Data for Good, won two awards: the HPE International Association of Privacy Professionals’ Privacy Innovation of the Year award for 2020 and the Social Impact in Communications and Service Providers Google Cloud Customer Award for 2021.
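The federated access environments described above correspond to a standard BigQuery pattern: authorized views, where a view in a partner-facing dataset is granted read access to the source dataset, so safelisted organizations can query the view without ever touching the raw tables. Below is a minimal sketch using the google-cloud-bigquery client; the project, dataset, and view names are hypothetical, and this is our illustration of the pattern rather than TELUS’ exact setup.

```python
# Sketch of BigQuery "authorized views": a view in a shared dataset is
# granted read access to the source dataset, so safelisted partners can
# query the view without access to the underlying raw tables. Names are
# hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

# 1. Create a view that exposes only aggregated, de-identified columns.
view = bigquery.Table("my-project.partner_views.regional_mobility_v")
view.view_query = """
    SELECT region_id, event_date, COUNT(*) AS device_count
    FROM `my-project.telus_insights.movement_events`
    GROUP BY region_id, event_date
"""
view = client.create_table(view, exists_ok=True)

# 2. Authorize the view against the source dataset so the view can read
#    the raw table even though partners cannot query it directly.
source = client.get_dataset("my-project.telus_insights")
entries = list(source.access_entries)
entries.append(
    bigquery.AccessEntry(None, "view", view.reference.to_api_repr())
)
source.access_entries = entries
client.update_dataset(source, ["access_entries"])
```

Partners are then granted read access only on the partner_views dataset, which keeps the raw, event-level data out of reach while supervised reporting continues.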
TELUS’ Data for Good program is supporting other areas of social good, in no small part because of the architectural benefits of having built on BigQuery and Google Cloud.

Ready to unleash the power of our data with Google Cloud

BigQuery is a key enabler of TELUS Insights, letting us shift from a slow, consultative approach to a more adaptive data-as-a-service model that makes our platform and valuable data more accessible to our customers. Moving to BigQuery led to major improvements in performance, reducing some of our initial queries from months of runtime to hours. Switching to a cloud-based solution with exceptionally high performance also made it easier for us to create an API to serve our commercial customers and enabled us to offer a key service to the community, in a time of crisis, with our Data for Good program. To learn more about TELUS Insights, or to book a consultation about our products and services, visit our website.

When we built our TELUS Insights platform, we worked with leading industry experts in de-identification. TELUS has also taken a leadership role in de-identification and is a founding member of the Canadian Anonymization Network, whose mission is to help establish strong industry standards for de-identification. The TELUS de-identification methodology, and in fact our whole Insights service, has been tested through re-identification attacks, stress-tested, and, importantly, certified under Privacy by Design. Privacy by Design certification was achieved in early 2017 for our Custom Studies product and in early 2018 for our GeoIntelligence product.
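Aggregation into large data pools is one of the safeguards described above. As a purely illustrative sketch, and not TELUS’ actual de-identification methodology, a common way to enforce this in SQL is a minimum group-size threshold that suppresses any result describing too few devices:

```python
# Illustrative aggregation guardrail (our sketch, not TELUS' actual
# methodology): suppress any group smaller than a minimum size so query
# results only ever describe large pools of devices. Table, column, and
# threshold values are hypothetical.
from google.cloud import bigquery

MIN_GROUP_SIZE = 20  # hypothetical suppression threshold

client = bigquery.Client()
sql = f"""
SELECT region_id, event_date, COUNT(DISTINCT device_hash) AS devices
FROM `my-project.telus_insights.movement_events`
GROUP BY region_id, event_date
HAVING COUNT(DISTINCT device_hash) >= {MIN_GROUP_SIZE}
"""
for row in client.query(sql).result():
    print(f"{row.region_id} {row.event_date}: {row.devices} devices")
```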
Source: Google Cloud Platform