easyJet: Transforming how customers search for flights with the help of Google Cloud

With a growing fleet of 325 aircraft covering more than 1,000 routes across 158 airports, easyJet is one of Europe’s most popular airlines. The airline serves an average of 90 million passengers each year, so a helpful mobile experience for its customers is a top priority.

Travelers today are inherently mobile-first, so finding new ways to make it easier for them to search and book flights is key. To do exactly that, easyJet partnered with technology company Travelport to develop Speak Now, a new feature on easyJet’s mobile app that interprets voice searches to deliver accurate and relevant flight information to travelers. Powered by Dialogflow, Google Cloud’s natural language understanding tool for building conversational experiences, Speak Now lets customers ask questions to determine exactly what they’re looking for—from destinations, to dates and times, to airports they want to fly from.

How do we create conversational experiences across devices and platforms for enterprises?

It’s clear that the rise in voice search is changing the way we go about our daily lives. Twenty-seven percent of the global online population already uses voice search on mobile, and the rapid adoption of this technology is reshaping entire industries. It’s no surprise that easyJet looked to adopt this technology to positively transform experiences for its customers.

Dialogflow, a core component of Google Cloud Contact Center AI, makes it easy to build accurate, flexible conversation interfaces that allow users to ask questions and accomplish tasks in everyday language. It understands the nuances of human language and translates end-user text or audio during a conversation into structured data that apps and services can understand. Daniel Young, Head of Digital Experience at easyJet, commented: “We picked Dialogflow because of its strengths and the ease with which a powerful conversational agent can be built.
Speak Now is a great example of how we’re using cloud technologies and AI to make the experience of buying and managing travel continually better for everyone. This is the latest in a series of innovative features that will make booking travel as easy as it can possibly be, giving easyJet customers a helpful digital experience.”

Consumers already rely on voice assistants to play their favorite music, add items to a shopping list, and order taxis. Speak Now is a great example of how voice assistants can now make the customer experience better and more intuitive for travel.

To learn more about Dialogflow, visit our website. Speak Now will be available in English at the end of September on iOS.
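At its core, a conversational agent like the one Dialogflow powers maps a free-form utterance to an intent plus structured parameters (destination, date, departure airport). A toy illustration of that idea in pure Python; this is not the Dialogflow API, and the pattern and names are made up for the sketch:

```python
import re

# Toy intent matcher illustrating the idea behind a voice flight search:
# map an utterance to an intent name plus structured parameters.
FLIGHT_PATTERN = re.compile(
    r"(?:fly|flight|flights)\s+(?:from\s+(?P<origin>\w+)\s+)?"
    r"to\s+(?P<destination>\w+)(?:\s+on\s+(?P<date>\w+))?",
    re.IGNORECASE,
)

def detect_intent(utterance: str) -> dict:
    """Return an intent name and any extracted parameters for a flight query."""
    match = FLIGHT_PATTERN.search(utterance)
    if not match:
        return {"intent": "fallback", "params": {}}
    # Keep only the parameters that were actually present in the utterance.
    params = {k: v for k, v in match.groupdict().items() if v}
    return {"intent": "flight.search", "params": params}
```

A production agent handles far messier language than a regular expression can, which is exactly the gap a trained natural language understanding model closes.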
Source: Google Cloud Platform

Best practices for Cloud Storage cost optimization

Whether you’re part of a multi-billion-dollar conglomerate trying to review sales from H1, or you’re just trying to upload a video of your cat playing the piano, you need somewhere for that data to reside. We often hear from our customers that they’re using Cloud Storage, the unified object store from Google Cloud Platform (GCP), as a medium for this type of general storage. Its robust API integration with our multitude of services makes it a natural starting point to outline some of our pro tips, based on what we as Technical Account Managers (TAMs) have seen in the field, working side by side with our customers. Part of our responsibility is to offer direction to our customers on making decisions that can reduce costs and help get the most out of their GCP investments.

While storing an object in the cloud is in itself an easy task, making sure you have the soundest approach for your situation requires a bit more forethought. One of the benefits of having a scalable, limitless storage service is that, much like an infinitely scalable attic in your house, there are going to be some boxes and items (or buckets and objects) that you really can’t justify holding onto. These items incur a cost over time, and whether you need them for business purposes or are just holding onto them on the off chance that they might someday be useful (like those wooden nunchucks you love), the first step is creating a practice around how to identify the usefulness of an object or bucket to your business. So let’s get the broom and dustpan, and get to work!

Cleaning up your storage when you’re moving to cloud

There are multiple factors to consider when looking into cost optimization. The trick is to ensure that there are no performance impacts and that we aren’t throwing out anything that may need to be retained for future purposes, whether for compliance, legal, or simply business value reasons.
With data emerging as a top business commodity, you’ll want to use appropriate storage classes for the near term as well as for longitudinal analysis. There are a multitude of storage classes to choose from, all with varying costs, durability, and resiliency. There are rarely one-size-fits-all approaches to anything when it comes to cloud architecture, but there are some recurring themes we have noticed as we work alongside our customers. These lessons apply to any environment, whether you’re storing images or building advanced machine learning models.

The natural starting point is to first understand “What costs me money?” when using Cloud Storage. The pricing page is incredibly useful, but we’ll get into more detail in this post. When analyzing customer Cloud Storage use, we consider these needs:

Performance
Retention
Access patterns

There can be many additional use cases with cost implications, but we’ll focus on recommendations around these themes. Here are more details on each.

Retention considerations and tips

The first thing to consider when looking at a data type is its retention period. Asking yourself questions like “Why is this object valuable?” and “For how long will this be valuable?” is critical to help determine the appropriate lifecycle policy. Setting a lifecycle policy lets you tag specific objects or buckets and creates an automatic rule that will delete, or even transform the storage class of, that particular object or bucket type. Think of this as your own personal butler who systematically ensures that your attic is organized and clean—except instead of costing money, this butler saves you money. We see customers use lifecycle policies in a multitude of ways with great success. A great application is compliance in legal discovery. Depending on your industry and data type, certain laws regulate which data types must be retained and the period for which they must be retained.
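Lifecycle rules are expressed as a small JSON configuration that can be applied with `gsutil`. A minimal sketch that moves objects to Nearline after 30 days and deletes them after a year; the day counts here are illustrative, and the right values depend on your retention requirements:

```python
import json

def lifecycle_config(to_nearline_after_days: int, delete_after_days: int) -> dict:
    """Build a Cloud Storage lifecycle configuration that transitions
    objects to Nearline once they cool off and deletes them after the
    retention period has passed."""
    return {
        "rule": [
            {
                "action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
                "condition": {"age": to_nearline_after_days},
            },
            {
                "action": {"type": "Delete"},
                "condition": {"age": delete_after_days},
            },
        ]
    }

# Write the config so it can be applied with:
#   gsutil lifecycle set lifecycle.json gs://your-bucket
config = lifecycle_config(to_nearline_after_days=30, delete_after_days=365)
print(json.dumps(config, indent=2))
```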
Using a Cloud Storage lifecycle policy, you can instantly tag an object for deletion once it has met the minimum threshold for legal compliance needs, ensuring you aren’t charged for retaining it longer than needed and you don’t have to remember which data expires when. To make this simpler, Cloud Storage has a bucket lock feature to minimize the opportunity for accidental deletion. If you’re concerned with FINRA, SEC, and CFTC regulations, this is a particularly useful feature. Bucket lock may also help you address certain healthcare industry retention regulations.

Within Cloud Storage, you can also set policies to transform an object’s storage class to a different one. This is particularly useful for data that will be accessed relatively frequently for a short period of time, but then won’t be needed for frequent access in the long term. You might want to retain these particular objects for a longer period for legal or security purposes, or even general long-term business value. A great way to put this into practice is within a lab environment. Once you complete an experiment, you’ll likely want to analyze the results quite a bit in the near term, but in the long term you won’t access that data very frequently. Having a policy set up to convert this storage to the Nearline or Coldline storage class after a month is a great way to save on long-term data costs.

Access pattern considerations and tips

The ability to transform objects into lower-cost storage classes is a powerful tool, but one that must be used with caution. While long-term storage is cheaper to maintain for an object that is accessed at a lower frequency, there will be additional charges if you suddenly need frequent access to data or metadata that has been moved to a “colder” storage option. There are also cost implications when looking to remove that data from a particular storage class. For instance, there’s currently a minimum storage duration of 30 days for an object in Nearline storage.
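Whether a colder class pays off depends on how often you read the data back. A back-of-the-envelope comparison makes the trade-off concrete; the per-GB prices below are illustrative placeholders, not GCP list prices, so check the pricing page for real numbers:

```python
def annual_cost(storage_gb: float, price_per_gb_month: float,
                reads_per_year: int, gb_per_read: float,
                retrieval_price_per_gb: float) -> float:
    """Rough annual cost: 12 months of storage plus retrieval charges.
    Ignores operation charges and network egress for simplicity."""
    storage = storage_gb * price_per_gb_month * 12
    retrieval = reads_per_year * gb_per_read * retrieval_price_per_gb
    return storage + retrieval

# Quarterly-report scenario: 500 GB read in full four times a year.
regional = annual_cost(500, 0.020, 4, 500, 0.00)  # no retrieval fee
nearline = annual_cost(500, 0.010, 4, 500, 0.01)  # cheaper storage, retrieval fee
```

With these placeholder prices, Nearline stays cheaper despite the retrieval fee, because the data is only read four times a year; read it nightly instead and the comparison flips.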
If you need to access that data with increased frequency, you can make a copy in a regional storage class instead to avoid increased access charges. When considering opportunities for long-term cost savings, you should also think about whether your data will need to be accessed in the long term, and how frequently it will need to be accessed if it does become valuable again. For example, if you are a CFO looking at a quarterly report on cloud expenses and only need to pull that information every three months, you might not need to worry about the increased charges accrued for retrieving that data, because it will still be cheaper than maintaining the storage in a regional bucket year round. Some retrieval costs on longer-term storage classes can be substantial and should be carefully reviewed when making storage class decisions. See the pricing page for the relative differences in cost.

Performance considerations and tips

“Where is this data going to be accessed from?” is a major question to ask when you’re weighing performance and trying to establish the best storage class for your particular use case. Locality can directly influence how fast content is pushed to and retrieved from your selected storage location. For instance, a “hot object” with global utilization (such as a frequently accessed database behind your employee time-tracking application) would fit well in a multi-regional location, which enables an object to be stored in multiple locations. This can bring the content closer to your end users as well as enhance your overall availability. Another example is a gaming application with a broad geo-distribution of users.
This brings the content closer to the user for a better experience (less lag) and ensures that your last saved file is distributed across several locations, so you don’t lose your hard-earned loot in the event of a regional outage. One thing to keep in mind when considering this option is that storage in multi-regional locations allows for better performance and higher availability, but comes at a premium and could increase network egress charges, depending on your application’s design. This is an important factor to consider during the application design phase.

Another option when you’re thinking about performance is buckets in regional locations, a good choice if your region is relatively close to your end users. You can select a specific region for your data to reside in and get guaranteed redundancy within that region. This location type is typically a safe bet when you have a team working in a particular area and accessing a dataset with relatively high frequency. It’s the most commonly used storage location type we see, as it handles most workloads’ needs quite well: fast to access, redundant within the region, and affordable overall as an object store.

Overall, for something as simple-sounding as a bucket, there are vast possibilities, all with varying cost and performance implications. As you can see, there are many ways to fine-tune your company’s storage needs to help save some space and some cash in a well-thought-out, automated way. GCP provides many features to help ensure you’re getting the most out of your GCP investment, with plenty more coming soon. Find more in these Next ‘19 sessions about optimizing your GCP costs.
Source: Google Cloud Platform

How Google Cloud’s AI has boosted Netmarble’s team collaboration, game development and consumer reach

In less than two decades, Netmarble has become one of the world’s largest mobile-gaming companies, with more than 35 titles available in 120 countries and hit MMORPG games like Blade & Soul Revolution, Lineage 2: Revolution and, most recently, BTS World. We began collaborating with Netmarble in 2017, at first to aid their migration to Google Cloud Platform (GCP), but more recently to help them leverage cloud tools and services to solve business challenges faced by many companies in the gaming industry. Most recently, we’ve worked with Netmarble’s AI Center, which manages all of the company’s AI initiatives. By applying AI to their infrastructure and operations, they’ve seen a wide variety of benefits, from faster team collaboration, to more intuitive game development, to increased reach in various regions.

In this blog post, we’ll share three examples of how Netmarble and its AI Center team have worked to infuse Cloud AI into all aspects of their business, improving development, game services and operations, marketing, and player experiences overall.

ML for game services operation: churn factor analysis, churn prediction and in-game anomaly detection

For gaming companies like Netmarble with a substantial online and mobile audience, it’s not enough just to attract players; they need to retain them as well. This makes understanding why players stop playing a game—what’s known as churn—critically important.

Taking it a step further, Netmarble produces a churn prediction report, categorizing players into those who are likely to leave, those who are likely to remain, and those who should be managed. Based on this report, the Netmarble team can decide each day what actions to take for each respective user group. “The churn report has been an invaluable resource, because we hadn’t previously had access to that type of information,” said Kim.
“Our next goal will be to get even more nuanced with the report, in hopes of answering tough questions like: how likely are we to lose a particular player? We’ll also look into what preventative measures we can take to retain users who have been categorized as ‘very likely’ to abandon a game.”

One way to retain users is to continuously add new content, but this can have unintended consequences, such as increasing the number of bugs the QA team must address. By applying machine learning to automated testing, Netmarble can quickly find any bugs—even after a high-volume launch day.

When games launch successfully, millions of users access them, including many fraudulent users (such as hackers and bots). That’s why Netmarble uses Google Cloud AI Platform to build ML models for fraudulent user detection. Learning the growth and consumption patterns of in-game users means anomalous behavior can be quickly identified, analyzed, and aggregated into a report for further assessment.

ML in marketing: from multi-market promotions to managing ad fraud

Game marketing can be complex, with many functions to think about, such as lead generation, digital communications and game launch promotions. Additionally, Netmarble must craft strategies and battle ad fraud not in one market alone, but in both Western and Asian markets. To address these challenges, Netmarble turned to BigQuery—a serverless, highly scalable, and cost-effective cloud data warehouse—to build its Return on Advertising Spend (ROAS) prediction. This tool helps Netmarble predict when marketing spend in various regions will be recouped. Its lifetime value (LTV) prediction solution can evaluate the quality of the user cohorts that marketers want to target.
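At its simplest, a ROAS payback prediction comes down to finding when cumulative revenue from a user cohort overtakes what was spent acquiring it. A minimal sketch of that calculation; the revenue curve and spend figures are made up for illustration, and Netmarble’s actual prediction runs on models built over BigQuery rather than a hand-written loop:

```python
from typing import List, Optional

def payback_month(ad_spend: float, monthly_revenue: List[float]) -> Optional[int]:
    """Return the 1-indexed month in which cumulative cohort revenue first
    covers the acquisition spend, or None if it never does within the
    observed window."""
    cumulative = 0.0
    for month, revenue in enumerate(monthly_revenue, start=1):
        cumulative += revenue
        if cumulative >= ad_spend:
            return month
    return None

# A cohort acquired for $10,000 with a decaying monthly revenue curve:
months = [3000, 2500, 2100, 1800, 1500, 1300]
break_even = payback_month(10000, months)
```

An LTV model extends the observed curve with a predicted tail, which is what lets marketers judge a cohort’s quality long before the real revenue has arrived.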
To cope with a variety of ad fraud challenges, Netmarble built its ad fraud detection system to classify heterogeneous traffic through machine learning as well as through general rule-based logic as part of its detection and mediation process. This way, the company can test new media and channels while preventing invalid clicks and conversions. “In order to effectively reach the right audiences and make sure they have the best touchpoints with our games, we need to have very nuanced marketing procedures in place,” Duke Kim, SVP, Head of Netmarble’s AI Center, recently shared with us. “Google Cloud AI Platform gives us the agility and technological prowess to quickly and cost-efficiently build out internal solutions that best meet these requirements.”

ML in game development: AI agents, balance checks and animation

Just as the types of games Netmarble makes have evolved over the past 19 years, so has the way it approaches game development. Now, the team’s next big focus is to create an AI agent that will help provide the best game experience for each player. Through this agent, which will be released soon, Netmarble intends to deliver a customized experience, with specific levels, tasks or challenges tailored to a player’s skill level, that will ultimately help increase retention. The agent will offer players personalized user experiences and check how the user perceives a particular game. It will even be able to play on the user’s behalf in the event of an issue like a sudden internet disconnection.

Netmarble is also looking to AI for voice and animation, which can be applied to in-game cut-scenes, as well as using AI to animate the faces of its in-game non-player characters (NPCs). ML scripts prompt the NPC’s voice, which then drives matching mouth movements. “I never even dreamed of some of the functionalities that AI can now bring into a game,” Kim said.
“We’ve only scratched the surface of AI’s benefits for games; I’m beyond excited about what lies ahead one, five and even ten years in the future. The best part is that Google’s Cloud AI technologies have been so easy to infuse into our games, typically taking only a month. I have no doubt we’ll be able to integrate the latest AI quickly moving forward, with Google as a collaborative partner.”

It’s been fantastic to have such a close-knit relationship with Netmarble and the entire AI Center team for the past three years. We look forward to helping them continue to reach their business goals and customers in the years to come. To learn more about game development on Google Cloud, visit our website, and to find out more about deployed AI business use cases, read this latest blog.
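The rule-based side of an ad fraud pass like the one described above can be surprisingly simple: flag traffic whose click rate exceeds anything a human could plausibly produce. A toy sketch of that idea; the threshold and field names are illustrative, and in a real pipeline rules like this sit alongside trained ML classifiers:

```python
from datetime import datetime, timedelta

def flag_fraudulent_clicks(clicks, max_per_minute=10):
    """Flag device IDs that produce more than max_per_minute clicks
    within any one-minute window. `clicks` is a list of
    (device_id, timestamp) tuples."""
    by_device = {}
    for device_id, ts in clicks:
        by_device.setdefault(device_id, []).append(ts)
    flagged = set()
    for device_id, stamps in by_device.items():
        stamps.sort()
        for start in stamps:
            # Count clicks in the one-minute window starting at `start`.
            window = [t for t in stamps if start <= t < start + timedelta(minutes=1)]
            if len(window) > max_per_minute:
                flagged.add(device_id)
                break
    return flagged
```

Flagged traffic would then be excluded from conversion counts before any spend is attributed to the media channel that delivered it.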
Source: Google Cloud Platform