New geospatial data comes to BigQuery public datasets with CARTO collaboration

At Google Cloud, we host many public datasets, including weather, traffic, housing and other data, in BigQuery, our enterprise data warehousing platform. You can use this public data to experiment with data analytics and join it with your own data to find insights. We’re pleased to announce a new collaboration with CARTO to bring valuable location-based geospatial datasets to the BigQuery public datasets program. Spatial data requires a community effort, and we’re excited to open up new possibilities for you to access, analyze and visualize GIS data.

This collaboration makes it easier for users to access data and do geospatial analysis with CARTO Data Observatory 2.0, a location intelligence platform that’s powered by BigQuery. The first available dataset is the U.S. Census Bureau American Community Survey (ACS), one of the most valuable public datasets in the world. Much like the decennial census, it provides demographic, population, and housing data at an incredibly high spatial resolution. Unlike the census, though, this data is collected, aggregated and updated every year, which makes it a powerful tool to support business, non-governmental, or academic initiatives.

For example, a short SQL query can retrieve the median income in Brooklyn in 2010 and 2017, calculate the difference, and join the result to a census block groups dataset for visualization on a map (a sketch of this kind of query appears at the end of this post). To see this in action, the CARTO team made a short Google Colab Python notebook that runs that SQL query in BigQuery and visualizes the result with CARTOframes. If you want to run it on your own, just open the Google Colab and authenticate with a Google account that has access to BigQuery. After running this query, you can see that a few Brooklyn neighborhoods stand out right away.

You can start using this ACS dataset in your BigQuery analyses or join your geo data with public datasets using any of the filters or predicates available in BigQuery GIS. Three additional public datasets will be available in the coming weeks, with many more to follow:

Bureau of Labor Statistics (BLS) economic data: The Bureau of Labor Statistics is the U.S. government’s authoritative source on economic and employment data. The department provides extremely detailed data on the strength of the U.S. labor market, aggregated at various time periods and geographies. CARTO applies its technology to make this data easier to understand and use.

TIGER/Line U.S. Coastlines, clipped by CARTO: Each year, the U.S. Census Bureau publishes detailed boundary files that describe the political and statistical boundaries in the U.S. Because these files define political and statistical boundaries rather than the physical shoreline, they do not always cleanly align with the boundary between the shore and the ocean. CARTO applies its expertise to clip the boundaries to follow the coastline more accurately, letting you better connect your data with the $7.9 trillion economy of the U.S. coastline.

Who’s on First: An open-source gazetteer (essentially a long list) of places around the globe, Who’s on First is a combination of original works and existing open datasets that results in a massive, flexible, and incredibly detailed dictionary of places. Each place in the dataset has a stable identifier and some number of descriptive properties about that location.
The dataset is carefully structured and updated, so you can depend on it to support a variety of projects.

Using CARTO Data Observatory 2.0 and BigQuery GIS

CARTO’s Data Observatory 2.0, the latest version of their spatial data repository, helps GIS professionals and data scientists save time by simplifying access to public data and easing data joins for spatial analysis through a common geography base. Importing and wrangling geospatial datasets can present challenges, like needing to validate file formats or geometries. With CARTO’s team maintaining these datasets as well-kept references in BigQuery, it gets a lot easier to use them in either CARTO or BigQuery. Plus, the CARTO team takes advantage of BigQuery’s native GIS functionality in its own technology stack.

“We chose BigQuery to power Data Observatory because it allows us to carry out geospatial analysis at scale for a wide range of use cases,” says Javier de la Torre, founder and chief strategy officer at CARTO. “And we like that Google Cloud hosts these datasets and covers the storage costs on behalf of customers. Finally, we love that public datasets can be referenced in analyses with the same ease and performance as a customer’s own internal data. No loading, no copying—just use the data and enjoy.”

Here’s a look at how CARTO incorporates Google Cloud into its architecture. Read more about CARTO’s spatial data infrastructure, powered by BigQuery and other Google Cloud services.

We’re excited to make these new datasets available and bring new possibilities to your geospatial analytics projects. To get started, check out the BigQuery GIS documentation and start integrating these new datasets from the CARTO Data Observatory or our Google Cloud datasets marketplace.
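For readers who want to try the Brooklyn example themselves, here is a hedged, minimal sketch of that kind of query using the BigQuery Python client. The table and column names are assumptions based on the public census_bureau_acs dataset, and Brooklyn is approximated as Kings County, NY (FIPS 36047); verify both against the current schema before relying on it.

```python
# Minimal sketch, not the post's exact query. Table and column names are
# assumed from the public census_bureau_acs dataset; verify before use.
from google.cloud import bigquery

client = bigquery.Client()
sql = """
SELECT
  acs17.geo_id,
  acs17.median_income - acs10.median_income AS median_income_change
FROM `bigquery-public-data.census_bureau_acs.blockgroup_2017_5yr` AS acs17
JOIN `bigquery-public-data.census_bureau_acs.blockgroup_2010_5yr` AS acs10
  USING (geo_id)
-- Block group geo_ids begin with the state+county FIPS code; Brooklyn is
-- Kings County, NY (36047).
WHERE acs17.geo_id LIKE '36047%'
"""
df = client.query(sql).to_dataframe()
print(df.head())
```

Joining the result to a block group geometry table and handing the dataframe to CARTOframes, as the Colab notebook does, turns it into a map.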
Quelle: Google Cloud Platform

Make your voice heard in the Global Knowledge 2020 IT Skills and Salary Survey

IT professionals are always looking to expand their skills and get certified on new technologies. This truth was particularly clear in 2019, as 85% of global IT professionals reported holding at least one certification, with over half of those earned in the past 12 months. We’re especially proud to note that the Google Cloud Professional Cloud Architect certification was ranked as the top-paying IT certification in North America, Europe, the Middle East, and Africa this year. These are just a few of the findings from the Global Knowledge 2019 IT Skills and Salary Report, a comprehensive look at the IT industry that shares valuable insights on certification compensation and benefits, all broken down by region.

If you’re an IT professional, we invite you to participate in the 2020 version of the survey and weigh in on industry salaries, the value of certifications, and which ones are in demand. This is your opportunity to make your voice heard in the largest worldwide study of the IT landscape, and provide information that will help shape organizational strategy and the future of the cloud industry.

The survey runs through November 8, 2019 and takes about 10 to 15 minutes to complete. Don’t miss this opportunity to share your experience. Take the survey today.
Quelle: Google Cloud Platform

Knock, knock, who's there? It's Guidion, bringing timely service calls with APIs

Editor’s note: Today we hear from Aditya Bhargava, Solutions Architect and agile coach at Guidion, on how the company is using Google Cloud’s Apigee API Management Platform to transform the Dutch solar panel and communications equipment installation landscape. Read more to learn how Guidion uses APIs to deliver technical services in and around the house, bringing happiness to seven customers every minute.

Guidion is a Dutch field service management company. We install consumer solar panels and broadband internet for our partners via a pool of about 2,000 freelance expert technicians. We provide our partners with a fully digitized, B2B cloud platform that we use to schedule and manage installations. The installation work we do isn’t the innovation—it’s the way we deliver our services that’s the game changer.

Streamlining the service economy with APIs

Most people can think of more than one occasion when they needed a technical service like internet or cable television installed at their homes. This usually involves phoning the company to place the order, then waiting for a call back, email, or text message from the provider with the pre-ordained “installation window.” Often this painstaking process requires taking a day off to wait around for the technician to arrive. Customers don’t usually have an easy way to reschedule if the service window isn’t convenient, and they often have no way to get updates on the day of the appointment about when—or if—the technician will arrive. This can obviously lead to frustration.

Guidion has reimagined service delivery, putting our partners’ customers first with our on-demand installation platform. Once our partners notify us of an installation via our online platform, the customer is provided a link to schedule the installation at their convenience, at a fixed appointment time that works best for them. Our technicians rely on our app to notify us of their availability, accept jobs, and if necessary, communicate directly with customers.

Using Apigee to satisfy partner requirements

About four years ago, we migrated from our legacy system to the Field Service Lightning and FinancialForce platforms from Salesforce to run our business. They do a great job for us, but we needed to find a way to adapt to how individual partners want to communicate with us without also migrating our legacy APIs. Since we already had strong existing custom APIs that we didn’t want to adapt to the many partner-specific requirements, we started to look for a way to handle those kinds of translations while receiving the same API call via the same API proxy from different partners. We wanted to be able to handle the translations based on which partner is calling the API and then push the request back to Salesforce.

That’s where the Apigee platform comes in. We were motivated to adopt an end-to-end API management platform because we didn’t want to have to develop a tool in-house to do SOAP-to-REST API translations (though we do offer REST endpoints that send requests to the same Salesforce custom APIs—partners can choose which route they take to integrate with our services). We chose Apigee to implement the SOAP endpoints, but also to enable us to do much more.

Discovering out-of-the-box Apigee functionality

With Apigee we have a standard way of having all our partners communicate with our Salesforce platform. The Apigee developer portal allows us to expose our endpoints and make partner onboarding very easy. The sketch below illustrates the per-partner translation idea.
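To make that idea concrete, here is a conceptual sketch in plain Python of what a gateway layer does when the same API call arrives from different partners. This is illustrative only: Guidion implements this with Apigee proxies and key value maps, not with this code, and the partner names and fields below are made up.

```python
# Conceptual sketch of per-partner request translation at an API gateway.
# Partner IDs, formats, and field names here are hypothetical.
import json
import xml.etree.ElementTree as ET

# Per-partner settings, analogous to an Apigee key value map entry.
PARTNER_CONFIG = {
    "partner-a": {"format": "soap", "status_field": "OrderStatus"},
    "partner-b": {"format": "rest", "status_field": "status"},
}

def translate_request(partner_id, raw_body):
    """Normalize a partner-specific request into the single canonical
    payload the backend (Salesforce, in Guidion's case) expects."""
    config = PARTNER_CONFIG[partner_id]
    if config["format"] == "soap":
        # Pull the field out of a (simplified) XML/SOAP body.
        root = ET.fromstring(raw_body)
        value = root.findtext(".//" + config["status_field"])
    else:
        value = json.loads(raw_body)[config["status_field"]]
    return {"partner": partner_id, "status": value}

print(translate_request("partner-b", '{"status": "scheduled"}'))
print(translate_request("partner-a",
                        "<Envelope><OrderStatus>scheduled</OrderStatus></Envelope>"))
```

In Apigee itself, the lookup half of this is typically handled by a KeyValueMapOperations policy in the proxy flow, which is why onboarding a partner can amount to adding an entry to the map, as described below.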
We also made a switch within Apigee that allows partners to make a choice about how they use our new system. We can either quickly and easily turn the request over to Salesforce to manage in its own REST API schema, or we can send it through Apigee as a pass-through endpoint that hands off to the old legacy system. Apigee also gives us the possibility of doing login monitoring, analytics, debugging, whitelisting, and certificate-based authentication. These are all functionalities that we got out of the box from Apigee, which we really appreciated. It meant that we didn’t have to invest any time in making or buying additional solutions.

Our partners are large enterprises that need us to adapt to their requirements, so we need to keep all of the partner variations within the Salesforce platform. If we didn’t have Apigee, it would take us twice the time to implement partner-specific requirements, not to mention create a lot of additional resource-intensive maintenance.

Cascading benefits to the business

Another huge benefit stems from the fact that new partner onboarding is now handled by the business side rather than the technical side of Guidion. When a new partner or an existing partner needs a new service, the only thing our team members need to do is log in to Apigee as an operations administrator and fill in the key value map. Once the right information is in there, the partner is onboarded with no IT assistance required. Instead of waiting days for the IT team to get to it, the business is self-servicing partner onboarding and able to react in real time to customer needs.

In the past, when we were using SOAP endpoints, integrations were considered a tough job. Not anymore. I think of Apigee as a restaurant. The menu is the Swagger documentation, and the waiter is the API that takes your order to the kitchen. The kitchen is the server that prepares your order and delivers it back to you. Having Apigee makes our integrations as easy as eating out.

To learn more about API management on Google Cloud, visit the Apigee page.
Quelle: Google Cloud Platform

Grafana and BigQuery: Together at last

Editor’s note: We’re hearing today from DoiT International, a Google Cloud Premier and MSP partner, and two-time International Partner of the Year. They recently built a Grafana plugin for BigQuery, making it easier to visualize your data. Read on for details.

At DoiT International, we see data problems of all shapes and sizes. From complexity analysis to large-scale system design, there are a variety of tools that can help solve our clients’ technology and analytical needs. But sometimes a tool seems so necessary that we create and share it ourselves. Which is why we built the Grafana plugin for BigQuery.

We love BigQuery for its unparalleled capability to execute queries very fast over very large datasets, and often encourage our customers to use it. We also see how much our customers love using Grafana to visualize their time-series data for monitoring, alerts, analysis, or some combination thereof. The two seem like a natural match, yet until recently, there wasn’t a way to bring them together.

Fortunately, Aviv Laufer, senior cloud engineer at DoiT International, found a way. Already familiar with the BigQuery API, he dug into the Grafana documentation, had a working prototype within a few weeks, and released a beta version shortly thereafter. After about a month, we’d solved the major bugs, become production-ready, and have been fielding feature requests from the community ever since.

Monitoring big data operations

Hundreds of companies are already taking advantage of the plugin so they can use both tools to their fullest extent. King, for instance, is using it to monitor the company’s big data operations. The mobile game developer, which famously brought the world Candy Crush Saga back in 2012, runs its data warehouse entirely in Google Cloud and uses BigQuery’s flat-rate subscription model. As King’s usage grew to support hundreds of projects, they were having trouble measuring slot utilization at the reservation or project level. They needed a better way to assess their usage patterns and query efficiency than scraping metrics from the Stackdriver API and consolidating those into yet another project to analyze with Grafana.

Since King was already piloting an alpha of the flat-rate usage export into BigQuery, and was familiar with using Grafana with Stackdriver, the plugin let them tap into the best of both worlds. For example, a short standard SQL query obtains slot usage by project (see the sketch after this section). With the Grafana plugin, King was able to visualize the results of this query and get a clear picture of the activity across their more than 1,000 projects. Different projects use different amounts of slots, and the more dominant colors in the visualization indicate which projects are using more slots than the others.

Grafana visualization in dark mode

Another short query allows King to monitor their global slot usage, giving them a clear window into a 24-hour period:

Grafana visualization in dark mode

Using the plugin to visualize BigQuery monitoring its own usage is just the beginning of how King may use the plugin in the future. King is now displaying BigQuery utilization on digital signage across all its offices to help the company interpret its usage data, ask new questions about it, and find ways to write queries (and manage its data warehouse) more efficiently.
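As a hedged sketch of the kind of per-project slot-usage query King’s example describes, run here through the BigQuery Python client for testing: the export table name and its columns are hypothetical placeholders for wherever your slot-usage data lands, and the same SQL could be pasted into the Grafana query editor.

```python
# Hedged sketch: average slots per project over the last day. The table
# `my-admin-project.monitoring.slot_usage` and its columns are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()
sql = """
SELECT
  project_id,
  TIMESTAMP_TRUNC(usage_time, HOUR) AS hour,
  -- slot-milliseconds per hour divided by ms-per-hour = average slots held
  SUM(slot_ms) / (1000 * 60 * 60) AS avg_slots
FROM `my-admin-project.monitoring.slot_usage`
WHERE usage_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
GROUP BY project_id, hour
ORDER BY hour, avg_slots DESC
"""
for row in client.query(sql):
    print(row.project_id, row.hour, round(row.avg_slots, 1))
```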
Visualizing billing

Another company benefiting from the one-two punch of BigQuery and Grafana is Travix, a global online travel company with operations in 39 countries. Travix is also a heavy user of BigQuery and Grafana, and when the plugin came out, they jumped on the opportunity to streamline their workflow.

One of the critical areas Travix needs to monitor is SKUs. By exporting their billing information into BigQuery, Travix can analyze all their billing data. With a quick query and the Grafana plugin, Travix can see their top 10 GCP products and the associated costs over a given time frame (a sketch of such a query appears at the end of this post). This lets them monitor how much they spend on Google Cloud, perform their own cost optimizations, and drill down into the costs of individual applications.

Grafana visualization in light mode

Travix is also using the plugin to measure their network traffic at 15-minute intervals. By defining events from Cloudflare logs as ingress and egress, they can see what their network traffic patterns are like and monitor for any new trends or anomalies.

Grafana visualization in light mode

Travix also analyzes their access logs for slowly increasing response times, which would be invisible when looking only at shorter periods of time.

Using the Grafana plugin

Using BigQuery and Grafana together can apply to countless applications: dashboards analyzing logs, billing data, sales metrics, traffic analysis, tracking digital marketing campaigns, and probably many more we haven’t thought of yet. Getting started is as easy as downloading the plugin from the Grafana website, or cloning the open-source GitHub repository. We welcome your feedback on this plugin via GitHub, and we respond quickly to bugs and feature requests. We look forward to seeing what you can do!
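As a companion to the Travix billing example above, here is a minimal sketch of a top-10-services cost query against the standard GCP billing export. The table name is a placeholder for your own export table; service.description, cost, and usage_start_time are standard billing export columns.

```python
# Sketch of a Travix-style "top 10 products by cost" query. Replace the
# table name with your own billing export table.
from google.cloud import bigquery

client = bigquery.Client()
sql = """
SELECT
  service.description AS service,
  ROUND(SUM(cost), 2) AS total_cost
FROM `my-billing-project.billing.gcp_billing_export_v1_XXXXXX`
WHERE usage_start_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY service
ORDER BY total_cost DESC
LIMIT 10
"""
for row in client.query(sql):
    print(row.service, row.total_cost)
```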
Quelle: Google Cloud Platform

Improve your connectivity to Google Cloud with enhanced hybrid connectivity options

Whatever the requirement—from enterprise-readiness fundamentals like reliability, performance, and security, to innovations for enabling microservices architectures or hybrid and multi-cloud deployments—the Google Cloud networking portfolio has something to offer. At NEXT ‘19 in San Francisco, we announced the betas of 100 Gbps Dedicated Interconnect and High Availability (HA) VPN. Today, we’re excited to announce that both are generally available.

With HA VPN, you can connect your on-premises deployment to a Google Cloud Platform (GCP) Virtual Private Cloud (VPC) with an industry-leading SLA of 99.99%, suited to your mission-critical workloads. Follow these migration steps to easily create redundant VPNs from your classic VPN deployments.

100 Gbps Dedicated Interconnect, meanwhile, provides 10X the capacity of our previous Interconnect offering, and multiple links can be combined into Link Aggregation Groups to deliver massive amounts of bandwidth. With 100 Gbps Dedicated Interconnect, you can scale your connection capacity to meet your particular requirements.

Google connects customers around the world to Google Cloud through service providers via Partner Interconnect. You can now find the optimum connectivity pathway to Google Cloud from any on-net building or data center worldwide with the new Cloud Pathfinder App for Google Cloud, provided by Cloudscene; learn more from their launch blog.

Let’s connect

100 Gbps Dedicated Interconnect, HA VPN, and Cloud Pathfinder for Google Cloud are just the latest examples of how you can connect your business to Google Cloud. Let us know how you plan to use these new networking features and what capabilities you’d like to see in the future. You can learn more about GCP’s cloud networking portfolio online and reach us at gcp-networking@google.com.
Quelle: Google Cloud Platform

Stay in control of your security with new product enhancements in Google Cloud

When it comes to securing your cloud infrastructure, there is no shortage of challenges. You want to retain the visibility and control you had on-premises, while taking advantage of all the benefits the cloud can provide. The adoption of cloud-based services, for example, makes it easier for your development teams to quickly build and push services into production. However, this can unintentionally create shadow IT, where you don’t know what services are running or whether they’re secure.

Today, we’re excited to announce the beta of Security Health Analytics, a security product that integrates into Cloud Security Command Center (Cloud SCC). Security Health Analytics helps you identify misconfigurations and compliance violations in your Google Cloud Platform (GCP) resources and take action. In this blog, we’ll look at how Security Health Analytics can help you stay in control of your Google Cloud security, including a real-world example from a customer, AirAsia.

Staying in control of security in Google Cloud: AirAsia

AirAsia is the largest low-cost carrier in Asia as measured by passengers, and serves more than 150 destinations across 23 markets. Skytrax has named it the world’s best low-cost airline for 11 years running. As a company with a reputation for getting customers where they need to go without breaking the bank, AirAsia has several security practices in place to ensure that their budget goes to keeping their customers’ travel costs low, and not to recovering from security breaches.

AirAsia’s large IT operation requires the ability to provision virtual machines (VMs) and spawn containers in Google Kubernetes Engine (GKE). The company also uses App Engine to build applications in Google Cloud. They chose Google Cloud because it offers far more flexibility, agility, and cost-effectiveness than other computing methods. While running these critical workloads in Google Cloud, AirAsia uses Security Health Analytics to see whether their resources are configured properly and compliant with CIS benchmarks.

“Being able to go to the new Security Health Analytics dashboard eliminates the guesswork of what we have running and if it is secure,” says Muhammad Faeez Bin Azmi, Information Security and Automation Solution Architect. “Now anyone on our team, even non-security professionals, can go to this dashboard and see a list of the misconfigured assets and compliance violations across all of our GCP resources. We can also see the severity of misconfigurations, which helps us prioritize our response.”

To see what this looks like, below is an example Security Health Analytics Vulnerabilities dashboard showing potential security issues—called findings. When you click on a finding, you get a step-by-step remediation plan for how to solve the particular issue, such as an open firewall (shown below) or overly privileged access to a storage bucket, and a link that takes you directly to the impacted resource.

Faeez adds, “Security Health Analytics has really helped us reduce the amount of time we spend trying to figure out what’s wrong with our resources. It’s allowed us to use our time more effectively to identify and resolve more security issues than we could before.”

Also new to Security Health Analytics is support for CIS benchmarks: Security Health Analytics is now fully certified by the Center for Internet Security (CIS) to monitor Google Cloud Platform Foundation benchmarks—recommendations for keeping your GCP resources secure and compliant.
For example, the screenshot below shows how Security Health Analytics actively monitors for assets that violate CIS recommendation 5.1 (securing public storage buckets), which can help you identify and remediate storage buckets that are accessible to the public and prevent a data breach before it occurs.

If you’re new to GCP and want to give these features a try, start your free GCP trial, enable Cloud SCC, and then turn on Security Health Analytics. If you’re an existing customer, simply enable Security Health Analytics from Security Sources in Cloud SCC. For more information on Security Health Analytics, read our documentation.
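If you would rather pull findings programmatically than read them off the dashboard, here is a minimal sketch assuming the google-cloud-securitycenter Python client. The organization ID is a placeholder, the request-dict call style matches recent client versions and may differ in older ones, and the category filter is one example of a Security Health Analytics finding type.

```python
# Minimal sketch: list findings for publicly accessible buckets.
# ORG_ID is a placeholder; the call signature varies across client versions.
from google.cloud import securitycenter

client = securitycenter.SecurityCenterClient()
# "-" is a wildcard that lists findings across all sources in the org.
parent = "organizations/ORG_ID/sources/-"
response = client.list_findings(
    request={"parent": parent, "filter": 'category="PUBLIC_BUCKET_ACL"'}
)
for result in response:
    finding = result.finding
    print(finding.category, finding.resource_name, finding.state)
```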
Quelle: Google Cloud Platform

How the cloud can drive economic growth in APAC (and everywhere)

Public cloud adoption in the Asia Pacific (APAC) region continues to outstrip the pace of growth in North America and Europe, according to BCG’s “Ascent to the Cloud: How Six Key APAC Economies Can Lift-off” report. BCG’s report examines the public cloud’s economic impact in six key APAC markets: Australia, India, Indonesia, Japan, Singapore, and South Korea. Although public cloud adoption in these markets is still emerging compared to the U.S. and Western Europe, the growth rate is much faster (25% in APAC versus less than 20% in the U.S. and Western Europe) and there is great potential for further development.

The cloud is not just a digital transformation story; it’s also an economic one. BCG finds that cloud adoption is expected to contribute about $450 billion of GDP across the six markets between 2019 and 2023. The direct effects of this economic boost have the potential to produce approximately 425,000 jobs in the covered economies, and to influence about 1.2 million additional jobs through second-order effects of public cloud deployment in key industries that drive the economy. Greater acceleration of cloud adoption and a supportive policy environment could increase the contribution to $580 billion, with 770,000 direct jobs created and as many as 2.1 million jobs influenced.

As part of its research, BCG identified six key benefits APAC businesses are experiencing as they embrace the cloud—with broad applications for businesses worldwide. Here’s more on what BCG’s research found.

1. The cloud enhances team productivity

Because the cloud creates a standardized environment with scalable back-end systems and functions, and provides access to proven tools that IT teams can use to develop systems, many businesses find that moving to the cloud results in improved IT efficiencies. This means they can focus more on high-value tasks like customer targeting, content development, and bringing new products to market. Better collaboration tools such as G Suite create administrative and communication efficiencies, while advanced applications such as artificial intelligence or machine learning enable faster, clearer insights that enhance the overall productivity of the organization.

L&T Financial Services provides quick access to financial services for rural communities in India. It relies on G Suite to help staff work together efficiently. Employees can interact with each other in real time using Hangouts Meet, and information sharing is more seamless and secure through Drive. BigQuery also helps L&T Financial Services generate behavior scorecards to track the credit quality of its micro-loan customers.

“Cloud is the technology that enables us to achieve scale and reach,” says Sunil Prabhune, Chief Executive-Rural Finance, and Group Head-Digital, IT and Analytics, L&T Financial Services. “Today there are countless data points available about rural consumers which enable us to personalize our products to serve them better. With access to faster compute power, we can also on-board consumers more efficiently. Our rural businesses have clocked a disbursement CAGR of 60% over the past three years.”
2. The cloud can reduce time to market

The public cloud allows users to take new products and services to market quickly, helping organizations develop a fail-fast approach that alerts them to problems immediately and makes a fast turnaround possible when something needs to be fixed.

The mobile game maker Netmarble, for example, uses advanced public cloud-based tools, including analytics and machine learning, to support new game development, manage infrastructure, and infuse business intelligence throughout its operations. The company also uses productivity tools for real-time collaboration across front and back offices.

“The public cloud aligns with our vision for innovation and is as committed as we are to building better player services with advanced artificial intelligence and reliable, scalable cloud infrastructure,” says Duke Kim, SVP, Head – Netmarble AI Revolution Center, Netmarble.

3. A better security and compliance environment can be found in the cloud

The top public cloud providers spend billions of dollars every year on cybersecurity—far more than most businesses can spend on their own. As a result, security has increasingly become a key incentive for using the public cloud.

Recognizing that gaining and maintaining trust would be key to customer and partner adoption of its new products and services, Bank Rakyat Indonesia (Bank BRI) decided to pursue ISO 27001 certification in 2018. In fact, it was the first bank in ASEAN (the Association of Southeast Asian Nations) to be certified as information security compliant. Now, fintechs, insurance companies, and financial institutions that lack the talent or the financial resources to do quality credit scoring and fraud detection on their own are turning to Bank BRI. Bank BRI also packages data through more than 50 monetized open APIs for more than 70 ecosystem partners wanting to do credit scoring, business assessments, and risk management.

4. The cloud helps businesses launch new products and services faster and more efficiently

Many businesses find that the compute infrastructure they gain by moving to the cloud allows them to introduce new products or services, as well as to internationalize new digital products and services. With the public cloud, they are better able to expand their business models.

Australia Post recently expanded into parcel delivery and is growing its digital business to include retail, travel, and financial services and solutions. Using BigQuery, Australia Post has visibility into every stage of the mail delivery process and has reduced the time taken to perform analytics. Operations managers can now see what’s happening in sorting facilities in real time, helping to identify flow blockages almost instantly. Previously, these types of insights would only be available at the end of the day, but now they’re delivered within 15 seconds—that’s 300 times faster.

“With near real-time data analytics, we can free up valuable resources, act quicker and provide better service to the millions of Australians that rely on us every day,” says Australia Post CIO John Cox.

5. The cloud enables enhanced customer engagement and experiences

For many businesses, moving to the cloud means access to advanced tools such as big data analytics and machine learning that can help them improve customer experiences.
To win over new customers, many feel the need to excel over their competitors when it comes to engaging their clientele and offering a positive experience, and are turning to the cloud to do it.

DeNA leverages public cloud-based ML to improve the experience for new players of its mobile game Gyakuten Othellonia. To help beginners learn how to play the complex game competitively, and most importantly, enjoy the game, DeNA used AI to create a deck recommendation system for beginners and a smart AI player that matches the gamer’s level of skill.

“Using the public cloud, we have been able to leverage Google Cloud’s expertise in AI to build and serve several components in our game,” says Kenshin Yamada, Director of AI Dept, DeNA Co., Ltd. “The cloud’s open and serverless technologies also enabled us to host our AI models without worrying about scalability of infrastructure or portability of code.”

6. The cloud can reduce costs

The cloud offers the potential for substantial and meaningful cost reductions when businesses embrace transforming their architecture and consolidating their IT management functions. As a result, many find they’re able to achieve cost efficiencies by operating with smaller, fully autonomous, agile IT teams that can focus on the business rather than on managing IT infrastructure.

Before moving to the cloud, AirAsia ran its IT apps and services in an on-premises infrastructure that required extensive maintenance, diverting technology team members away from projects that would add value to the business. In addition, the infrastructure could not scale quickly and cost-effectively to support its data-first transformation to a digital airline. By moving to the cloud, AirAsia found the business agility it needed, as well as a forecast 5% to 10% reduction in operating costs. It’s now looking at adopting machine learning to drive further cost efficiencies by optimizing pricing for a range of services and predicting demand for items like additional baggage, seats, and meals.

Building better businesses with the cloud in APAC—and beyond

With its scalable infrastructure and flexible, pay-as-you-go delivery of computing services, the cloud has become an increasingly essential digital transformation driver for APAC. By embracing the public cloud, these businesses are finding they can fuel growth through increased productivity, enhanced customer experiences, decreased costs, and reduced time to market. Many organizations also find significant benefit in the public cloud’s ability to provide security at a scale that often surpasses what even large companies can afford. To learn more about BCG’s findings, download a copy of the report.
Quelle: Google Cloud Platform

Compute Engine or Kubernetes Engine? New trainings teach you the basics of architecting on Google Cloud

Google Cloud wants you to be able to use the cloud on your terms, and we provide a range of computing architectures to meet you where you are. In practice, this often means choosing between Compute Engine and Google Kubernetes Engine (GKE). But which one will best serve your needs?

If you’re used to managing virtual machines (VMs) in your on-premises environment or other clouds, and want a similar experience in Google Cloud, then Compute Engine is for you. It offers scale, performance, and value so you can easily launch large compute clusters on Google’s infrastructure. Compute Engine also lets you build predefined VMs or tailor custom machine types to your specific needs.

If you’re working with containers, and need to coordinate more than one in your solution, then GKE—our managed, production-ready environment for deploying containerized applications—is your best choice. It uses our latest innovations in developer productivity, resource efficiency, automated operations, and open source flexibility to help you accelerate your time to production.

Of course, your cloud architecture will look very different depending on whether you build it with VMs (Compute Engine) or containers (GKE). That’s why we now offer two architecting training paths, available on-demand or in a classroom setting:

Architecting with Google Compute Engine
Architecting with Google Kubernetes Engine

Architecting with Google Compute Engine takes you from introductory to advanced concepts in five courses. You’ll learn all the basics of the Google Cloud Platform (GCP) console and how to create virtual machines using Compute Engine. Then, you’ll dive into core services, such as Identity and Access Management (IAM), database services, billing resources, and Stackdriver services. Next, you’ll gain an understanding of how to configure load balancers and autoscaling for VM instances. The course will also teach you to automate the deployment of GCP services, leverage managed services for data processing, and design highly reliable and secure GCP deployments.

Over four courses, Architecting with Google Kubernetes Engine teaches you the basics of the GCP console, and then goes deeper into deploying and managing containerized applications using GKE. You’ll learn all the tools of GKE networking, and how to give your Kubernetes workloads persistent storage, while gaining an understanding of security, logging, monitoring, GCP managed storage, and database services.

Ready to learn more about architecting with GCP? Join us on Friday, October 25 at 9:00 AM PST for a special webinar, Architecting with Google Compute Engine: Building your cloud infrastructure. In this webinar, we’ll give you an overview of the different Compute Engine services and demonstrate some of those services in GCP. By attending the webinar, you’ll also get one month of access to this training on Coursera at no charge. Click here to register today.
Quelle: Google Cloud Platform

Best practices for password management, 2019 edition

It is hard to imagine life today without passwords. They come in many forms, from your email credentials to your debit card PIN, and they’re all secrets you use to help prove your identity. But traditional password best practices are no match for today’s sophisticated, and often automated, cybersecurity threats. With all-too-frequent news of massive data breaches, leaked passwords, and phishing attacks, internet users must adapt to protect their valuable information.

While passwords are far from perfect, they aren’t going away in the foreseeable future. Google’s automatic protections prevent the vast majority of account takeover attacks—even when an attacker knows the username and password—but there are also measures that users and IT professionals can take to further enhance account security. In the spirit of October being National Cybersecurity Awareness Month, we’ve released two new whitepapers to help you navigate password security.

Modern password security for users provides pragmatic and human-centric advice to help end users improve their authentication security habits. We go in-depth with tips on improving the security of the passwords you use today, advice on how to answer security questions, and explanations of why certain practices should be avoided.

Modern password security for system designers is the first paper’s technical counterpart, outlining the latest advice on password interfaces and data handling. It provides technical guidance on how to handle UTF-8 characters, advice on sessions, and best practices for building a secure authentication system that can stand up to modern threats.

Our aim is to promote an open and secure internet where users are equipped to protect their personal information and online systems are designed to prevent credential loss, even if those systems are compromised. We hope these whitepapers—available in PDF form at the links above—help you in your quest to better protect your environment.
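The system designers’ paper covers this ground in detail; as a generic illustration of the kind of practice it addresses (not an excerpt from the papers), here is a Python sketch of Unicode-normalized, salted, memory-hard password hashing:

```python
# Generic illustration, not taken from the whitepapers: normalize UTF-8
# input so visually identical passwords hash identically, then use a salted,
# memory-hard KDF (scrypt) rather than a fast hash.
import hashlib
import hmac
import os
import unicodedata

def hash_password(password):
    normalized = unicodedata.normalize("NFKC", password).encode("utf-8")
    salt = os.urandom(16)
    digest = hashlib.scrypt(normalized, salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password, salt, expected):
    normalized = unicodedata.normalize("NFKC", password).encode("utf-8")
    digest = hashlib.scrypt(normalized, salt=salt, n=2**14, r=8, p=1)
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(digest, expected)

salt, stored = hash_password("pässwörd")
print(verify_password("pässwörd", salt, stored))  # True
```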
Quelle: Google Cloud Platform

S4 Agtech picks Google Cloud to transform agricultural risk management

Editor’s note: Today we’re hearing from S4 Agtech, a risk management solutions company for agriculture that is based in Buenos Aires, Argentina; São Paulo, Brazil; and St. Louis, Missouri. S4 integrates multiple sources of agricultural data with its machine learning and other algorithms to determine agronomic and financial risk for farmers, seed developers, insurance and financial companies, traders and governments. With those tools, customers can make the best decisions for planting and planning, and transfer climate risk away to the financial markets. Read on for details on how the company is using Google Cloud Platform (GCP) to bring real-time data insights to users.

Like countless other industries, farming is going digital and undergoing big changes, driven by access to more actionable information. The agriculture business can now gather and analyze georeferenced data from satellites, combined with data from IoT sensors in fields, crop rotation and yield histories, weather patterns, seed genotypes and soil composition, to help increase the quantity and quality of crops. This is essential for businesses in the agriculture industry, but it’s also critical to addressing growing food shortages around the world.

At S4, we create technology to de-risk crop production. We provide customers seeking agricultural risk management solutions with the tools to make better, data-driven decisions for their crop planning, based on machine learning and proprietary algorithms. We interpret plant evolution on a global scale with predictive modeling and analytics, and offer super-efficient risk-transferring solutions. Our multi-cloud platform includes a petabyte-scale database, an open source stack, and—after 50 proof-of-concept evaluations—BigQuery for our data warehouse and the Cloud SQL database service to handle OLTP queries to our PostgreSQL database. These PoCs included, among others, Microsoft Azure Data Lake Analytics, IBM Netezza, Postgres/PostGIS running on IBM bare-metal servers with SATA SSDs and on Google’s Compute Engine with NVMe disks, and on-premises memSQL, CitusData and Yandex ClickHouse.

Weeding out risk in an uncertain market

According to recent research, extreme climate events like drought, heat waves, and heavy precipitation are responsible for 18-43% of the global variation in crop yields for maize, spring wheat, rice, and soybeans, and the trend is clear for other crops as well. Such variation poses risks of food shortages as well as large financial risks to farmers, insurers, and regions dependent on successful crop yields, and it creates vast humanitarian difficulties.

Our mission at S4 is to help de-risk crop production by matching the right data with analytics tools so farmers and other participants in the agricultural value chain can plan better, resulting in more reliable food supplies. In a nutshell, we create indices out of biological assets. These indices measure yield losses on crops caused by the effects of weather and other factors, and are then used as underlying assets for products, such as swap/derivative contracts and parametric insurance policies, that transfer risk to the financial markets. We enable insurers and lenders to buy and sell agricultural risks through the futures market. Our other products also help farmers and seed and fertilizer companies provide customized genotype recommendations and fertilization requirements.
This helps to optimize planting by geography, resources, and crop species; monitor phenological, pest, and humidity evolution throughout the crop season; and estimate yields. Local communities benefit from S4’s technology, as the ability to manage weather risks allows farmers to stabilize their cash flows, invest more to produce more with fewer risks, and develop in a more sustainable manner.

Growing data sources, reducing costs, accelerating performance

With the volume of diverse data sources and analytical complexity both growing at a very fast pace, we decided that using a major cloud services provider with a broad roadmap and global partnerships would benefit S4’s future evolution. At the same time, we wanted to bring our services to users faster and cut costs by consolidating our on-premises technology stack. When we started evaluating providers, our leading criteria included a powerful geospatial database and data analytics tools along with excellent support, all at a competitive price. GCP prevailed in nearly all criteria categories among the 50 companies we measured.

Our previous platform architecture included a hybrid relational database that used Compute Engine for virtual machines and Cloud Storage for database backup. The RDBMS was slow, and maintaining our own data warehouse was complex and expensive. We wanted to use machine learning and neural networks, but couldn’t do so easily and affordably. The complexity of that system meant that products or services requiring small changes or additions to the data model translated into expensive expansions of infrastructure or project time. Agronomical and product teams also couldn’t test these changes by themselves; they always required significant intervention from the IT team, which led to further delays.

We added GCP services like BigQuery as S4’s cloud data warehouse, and we use BigQuery GIS for geospatial analysis, Cloud Dataflow for simplified stream and batch data processing, and Cloud SQL for queries to the S4 database platform, all of which have made a huge impact on our services and bottom line. Database and analytics costs have decreased by 40%, and customers are receiving our analytical results 25% faster. In addition, we’ve eliminated the time-consuming downloading of images, reducing storage and processing costs by 80%, because we no longer need expensive tool licenses, and we have greatly reduced classification processing times.

Our customers working in the agriculture industry are also benefiting from this infrastructure change. They are now able to speed up their data analytics using our GCP-based platform.

“S4 products and technologies unlock the full potential of satellite imagery for crop prescriptions, monitoring and yield estimates,” says Nicolás Loria, Manager of Marketing Services, Southern Cone, Corteva Agriscience. “We’ve worked with S4 for the last three crop seasons (and are starting year number four), as its team capabilities, data integration capacities, and analytics insights have allowed Corteva to deliver an entirely new solution. Thanks to S4’s customized 360° approach and fast response and delivery times, we have safely outsourced our remote crop analytic technical needs.”

This image is one example of the detailed data we’re able to provide to our customers so they can better map crop land and plan as efficiently as possible. The image on the left shows automatic crop classification methods, while the image on the right shows manual methods with operator-assisted supervision.
The results we get from these automated classifications using Google Earth Engine and BigQuery GIS are much faster and less expensive to produce, and they correlate strongly with what actually happens in the field.

Crop classification using satellite data. Yellow=soy; light green=fallow; dark green=corn; red=pastures; orange=non-cultivable areas.

This new architecture has also allowed us to scale our models and databases with almost no limits, at a fraction of the cost of the previous models. We’ve saved a lot of time executing processes and reduced the work our internal teams need to do for certain tasks, like preparing images, converting them, validating results, and more. Using Google Earth Engine has decreased the execution time of daily tasks by anywhere from 50% to 90%, going from an average of 30 minutes to between four and 15 minutes, depending on the task.

In addition to saving money and time, we are able to focus on innovation with the GCP performance and features we’re using. We’re able to seamlessly add satellite data to analytics using both public datasets and our own private data, and deliver GIS data management, analytics, crop classification and monitoring in real time. We can do semi-automatic crop classification and classification using spectral signatures with Google Earth Engine (a sketch of this kind of computation appears below). Later this year, we’ll be using neural networks for pattern recognition and machine learning in new applications to improve crop yields and fine-tune risk models. And using GCP and Google Earth Engine infrastructure means we can run models for customers in South America and around the world, since Google Earth Engine has global satellite imagery available.
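As a hedged sketch of the kind of Earth Engine computation that feeds classification like this (not S4’s actual pipeline; the region is an arbitrary rectangle and the collection and band choices are just one reasonable setup):

```python
# Sketch only: a Sentinel-2 median composite and an NDVI statistic over a
# region, the sort of input a spectral-signature crop classifier consumes.
# Requires an Earth Engine-enabled account; the coordinates are illustrative.
import ee

ee.Initialize()

region = ee.Geometry.Rectangle([-60.5, -34.0, -60.0, -33.5])
composite = (
    ee.ImageCollection("COPERNICUS/S2")
    .filterBounds(region)
    .filterDate("2019-01-01", "2019-03-31")  # a Southern Hemisphere growing season
    .median()
)
# NDVI from the Sentinel-2 near-infrared (B8) and red (B4) bands.
ndvi = composite.normalizedDifference(["B8", "B4"]).rename("NDVI")
print(ndvi.reduceRegion(reducer=ee.Reducer.mean(), geometry=region, scale=10).getInfo())
```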
We’ve heard from our customer Indigo Argentina that they’re able to bring customers data insights faster. “We are working with S4 on the development of two different applications for satellite crop monitoring and yield assessment,” says Carlos Becco, CEO, Indigo Argentina. “S4’s technology allowed us to manage and analyze multiple sources and layers of information in real time, letting us uncover valuable insights in Indigo’s own microbiome technologies, and at a very competitive cost.”

Analytical products and app development thrive with GCP

With GCP, we are updating and improving algorithms that we built manually, using machine learning processes to develop drought indices for upcoming crop seasons. Algorithms can recognize specific phases of crop phenology (e.g., bud burst, flowering, fruiting, leaf fall) and correlate them with photosynthetic activity, light, water, temperature, radiation, and plant genetics factors. Other analytical products like crop monitoring, pre-planting recommendations, financial scoring, and yield estimation can now do a lot more for users by offering multiple layers and datasets, faster image processing, and real-time access via APIs.

We also replaced our bare-metal S4 app deployment with the App Engine serverless application platform. It provides tighter integration between the S4 platform and our BigQuery data warehouse for integration with marketplaces and third-party solutions. We get all of these Google Cloud features with all the benefits of managed cloud services, from multiversioning and security to automatic backups and high availability.

At S4, we trust technology to decode plant growth and help protect farmers and their communities from climate change. With growing food shortages due to increasing populations and intensifying weather, data and analytics can have a huge impact in lowering financial risks and improving agricultural yields. It’s one sector where cloud, database, analytics, and other technologies are combining to improve business outcomes and affect the lives of billions of people. Learn more about S4’s work and learn more about data analytics on Google Cloud.
Quelle: Google Cloud Platform