How we’re supporting retailers across the globe during COVID-19

The shift to digital has been well under way for years in retail, but today, retailers have a new sense of urgency to digitally transform as the industry responds to COVID-19. The pandemic has dramatically impacted the retail industry at large, exposing gaps in omnichannel capabilities, business continuity plans, and supply chain responsiveness. Today’s environment is uncharted territory for most retailers. Government-imposed lockdowns, social distancing guidelines, rapid changes in consumer behavior, and uneven demand for different product categories have introduced challenges to retailers around the globe. For retailers, crisis planning typically covers events in which a set of stores or warehouses is down due to natural disasters or power outages. However, closing down all stores or, conversely, supporting demand surging to 20 times normal is a new dynamic that many retailers hadn’t contemplated previously. The pandemic has also had a polarizing effect on retailers: grocery and mass merchandisers are experiencing unprecedented surges, while retailers in other categories, like fashion and beauty, are experiencing declines in many product categories. As we look to the future, we know that recovery will take time and will vary by sub-segment. To help retailers tackle these challenges, we’re sharing a number of industry-tailored solutions to support our customers and partners during this time.

G Suite collaboration tools to assist workforce enablement and optimization

As some retailers, particularly those in grocery and mass merchandise, experience a rapid rise in hiring to fulfill unprecedented demand, they’re realizing how critical it is to build and maintain collaboration with their employees. G Suite offers video conferencing, chat, email, and shared documents, allowing teams to efficiently work together remotely and in real time.
As remote work and video conferencing continue to be the norm for many retailers, supermarkets like Schnucks—a family-owned supermarket chain with 100 stores in Missouri, Illinois, Indiana, Iowa, and Wisconsin—are using Google Meet to help keep dispatch running smoothly and as a help desk for in-store clerks. And in the UK, DFS Furniture Company Ltd has been able to transition its entire workforce to working from home using Google Meet.

Rapid Response Virtual Agent program to quickly respond to customers

Our newly launched Rapid Response Virtual Agent solution allows retailers to stand up new chatbot-based services within two weeks to help respond to their customers more quickly and efficiently, especially as it relates to critical information around COVID-19. This new chatbot can help with store hours inquiries, inventory questions, pick-up options, and more, offloading an immense number of calls from human agents so they can focus on more complex service needs. Retailers like Albertsons saw call volume to their stores increase to five times the norm during COVID-19. To get customers faster responses, Albertsons enlisted the Rapid Response Virtual Agent to manage inbound call volumes and address customers’ more basic questions, such as hours of operation, pickup and delivery options, and order status.

Accelerated migration and PSO migration factory solutions to reduce operational overhead

Rapid changes in customer demand are causing significant capital and expense constraints. Accelerating the migration of IT systems to Google Cloud can help retailers quickly cut fixed costs, reduce their operational overhead, and set up the right infrastructure to map to their changing business needs—while ensuring business continuity during unexpected business disruptions. Lush, the UK-based beauty retailer, migrated its global ecommerce sites to Google Cloud in 2017 to help run its online channels smoothly, especially during peak seasons like Boxing Day.
Migrating to the cloud has allowed Lush to control costs and, in turn, develop innovative projects that will help drive its business forward, especially in light of COVID-19.

Capacity Management and Black Friday/Cyber Monday (BFCM) Assistance solutions to quickly rightsize cloud deployments

Buying behavior has changed drastically, with atypical demand for some retail sub-segments and extreme declines for others. This, paired with sudden shifts from in-store to online, has strained omnichannel capabilities. Through early capacity planning, reliability testing, and operational war rooms, we can help retailers quickly rightsize cloud deployments to reflect the changing needs of their businesses. We’ve also activated our special peak season support protocols for retailers seeing ecommerce traffic surges.

Ecommerce modernization to assist the shift from offline to online

As customer expectations shift during this time, providing a top-tier digital experience has become increasingly important. Having a flexible and agile ecommerce platform is crucial to enable retail teams to quickly introduce new shopping experiences to keep up with ever-changing demands. Google Cloud can help untangle legacy ecommerce systems, introducing a containerized architecture that provides flexibility and agility for businesses. Retailers like Ulta Beauty, whose stores are temporarily closed in response to government-imposed lockdowns, are leaning into their online channels to stay connected with shoppers, providing them with a remote digital experience. While shoppers would normally roam the aisles of a store to find exactly what they’re looking for, Ulta Beauty’s Virtual Beauty Advisor tool, built on Google Cloud, is proving particularly useful online, providing consumers with data-driven product recommendations.
Google Cloud’s demand forecasting and modeling capabilities to help respond to significant consumer behavior changes

For retailers, shifts in consumer spending present both a near-term need for a responsive supply chain and a longer-term need to predict how customer behavior changes will impact demand. With Google Cloud’s AI/ML capabilities, in partnership with o9 Solutions, retailers can develop custom models based on their own data and signals to accurately forecast future demand for products at any given location—reducing revenue lost to stockouts and cutting the costs of excessive discounting and markdowns, inventory holding, and spoilage. Retailers can also use Google Cloud’s broad range of public datasets, including weather, traffic, and more, to better forecast demand down to the store level.

Google Cloud/Looker solutions for a 360-degree view of the customer

With substantial disruption in business due to COVID-19, it’s imperative for retailers to rely even more on data that is real-time and reliable. Google Cloud and Looker provide pre-built data models and analytics packages (called “Blocks”) that are specific to retail needs. With these pre-built resources, Looker can help quickly deliver solutions that go beyond traditional business intelligence offerings such as reports and dashboards. By bringing multiple datasets from across an organization together, retailers can create data experiences that help optimize in-store operations, increase retail margins, and improve customer lifetime value. Looker is also fully integrated with Google Cloud for Marketing solutions, allowing retailers to make informed marketing decisions in real time. Through this integration, they’re able to bring all of their marketing data from Google together for analysis to see how changes on Google Ads, YouTube, and Google Analytics affect one another.
Retailers across all sub-segments can discover real-time insights and immediately implement changes within ongoing marketing programs.

Now more than ever, we’re committed to bringing forward the technologies the retail industry needs to adapt to this new era. Our goal is to eliminate the stress tied to keeping the lights on in IT and instead allow retailers to focus on what matters most: their customers and employees. Read more about our work with the retail industry here.
Source: Google Cloud Platform

How Google Cloud helped scale out one person’s AI service—and his life

Editor’s note: This is a post by Kaz Sato from Google based on an interview with Sato (@sato_neet), an individual developer. It’s confusing that we have similar names, but we are not the same person.

AI Gahaku (AI master painter): built by Sato with Firebase, Cloud Run, and Google Colab. One million users are enjoying the tool every day.

When Sato (@sato_neet) quit college in Tokyo 10 years ago, he didn’t know he had Asperger syndrome. After spending some time unemployed, he tried a couple of different career paths, including attending nursing school and learning to become a baker. When he realized that Asperger’s could be the reason he wasn’t able to fit well in those environments, he tried something else entirely: artificial intelligence (AI).

Sato started learning AI two years ago. He had taken some basic programming classes in college, but wanted to learn Python and JavaScript to create something fun with emerging technology and share it with the community. He also conquered the basics of deep learning with TensorFlow and Colaboratory. “As I have been earning so little money these days, it was very helpful for me that TensorFlow and Colab are freely available,” Sato explains. “I could get a great learning environment at no cost.”

Developing AI Gahaku

In March 2020, Sato released AI Gahaku (“AI master painter”), which he has been developing alone. It generates classical-painting-style portraits based on portrait photos that you upload to the site.

A classical-painting-style portrait generated with AI Gahaku.

The site uses a pix2pix-based ML model for the style transformation. Pix2pix is a kind of conditional generative adversarial network (cGAN) model that’s designed to generate a realistic image from a specified image as a condition. (Check out the pix2pix example with TensorFlow to try it for yourself.)

A pix2pix-generated image.
The left image works as the condition, and the model generates the image at right.

In the case of AI Gahaku, Sato trained a pix2pix-based model that takes the uploaded photo as the condition and generates a realistic classical-painting portrait. The site instantly created a buzz when he shared it on Twitter, first in Japan and then in the US and other countries. Now AI Gahaku is being enjoyed by one million users worldwide, every day.

The number of AI Gahaku users spiked from 0 to 1 million in 10 days.

Sato has also released another fun project called pixel-me, a tool for generating 8-bit-style portraits with the same pix2pix technology—the difference is that he used pixelated images for training this model.

An 8-bit-style portrait generated with pixel-me.

From 0 to 1,000,000 users in 10 days

When he was building the sites, Sato relied on the Google Cloud Platform Free Tier—specifically Firebase, Cloud Run, and Colab. This allowed him to develop both AI Gahaku and pixel-me while keeping costs low.

The systems architecture of AI Gahaku.

“In addition to those free tiers, the $300 free trial program helped me learn how to use Google Cloud tools,” Sato says. “And, since the UX design clearly says which resource is free and which is not, I was very comfortable using it.”

Although AI Gahaku was built by one person, it scales automatically, thanks to Cloud Run’s serverless autoscaling. Because Sato packaged the pix2pix-based ML model as a container and deployed it to Cloud Run, he doesn’t have to manually start up or shut down server instances based on traffic load. Instead, each instance can start up in seconds as the number of requests grows—if there’s a sudden traffic spike, tens or hundreds of instances start up almost instantly, while staying under a controlled budget.
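The post doesn’t show Sato’s actual container setup, but a minimal Cloud Run container for a Python model server might look like the following sketch (file and module names are hypothetical; Cloud Run injects the PORT environment variable):

```dockerfile
# A slim Python base image keeps the container small so new instances start fast.
FROM python:3.8-slim
WORKDIR /app

# Install dependencies first so Docker can cache this layer across rebuilds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code and the trained pix2pix model weights.
COPY . .

# Cloud Run routes requests to the port named in $PORT (defaults to 8080).
CMD exec gunicorn --bind :$PORT --workers 1 --threads 8 main:app
```

Because each container is stateless and self-contained, Cloud Run can spin up additional copies whenever traffic grows and scale back to zero when it stops.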
This all means that Sato didn’t have to change anything in the system architecture as he watched the traffic load spike from zero to one million users per day within just 10 days of the release. Now? The Cloud Run backend of AI Gahaku is using the maximum of 200 containers. “I’m so surprised that Cloud Run and Firebase are naturally scalable as serverless environments,” Sato says. “The site is keeping a fast and steady response time for millions of users, without any design changes for handling the boom in global traffic.”

Scalable AI for everyone

Under the current load, the operational cost of AI Gahaku is around $20 USD per day, Sato says. But he doesn’t have any plans to monetize the site. “I’m just not interested in those things, like starting up a business and extending the site. I just want to keep creating something truly interesting to me,” Sato explains. “I like Google Cloud serverless services because the platform allows me to explore those fun ideas easily, without worrying much about the initial cost, scalability, and ongoing operation.”

Sato continues, “In the last week, I got so many responses and great feedback from all over the world. It has been the most valuable and meaningful time in my entire life. I really thank all the users, donors, and people who made this happen.”

We tend to think of Google Cloud’s scalable AI features as something businesses deploy to help them scale and become more efficient. But Sato’s story shows that sometimes this technology can help scale out your creativity, your reach, and your connections with others. Do you have an idea to explore? Check out the Google Cloud Platform Free Tier page to learn more and try out the technology for yourself.
Source: Google Cloud Platform

Manage and find data with Blob Index for Azure Storage—now in preview

 

Blob Index—a managed secondary index that lets you store multi-dimensional object attributes to describe your data objects in Azure Blob storage—is now available in preview. Built on top of blob storage, Blob Index offers consistent reliability, availability, and performance for all your workloads. Blob Index provides native object management and filtering capabilities, allowing you to categorize and find data based on attribute tags set on the data.

Manage and find data with Blob Index

As datasets get larger, finding specific related objects in a sea of data can be difficult and frustrating. Previously, clients used the ListBlobs API to retrieve up to 5,000 lexicographically ordered records at a time, parse through the list, and repeat until they found the blobs they wanted. Some users also resorted to managing a separate lookup table to find specific objects. These separate tables can get out of sync—increasing cost, complexity, and frustration. Customers should not have to worry about data organization or index table management, and should instead focus on building powerful applications to grow their business.

Blob Index alleviates the data management and querying problem with support for all blob types (Block Blob, Append Blob, and Page Blob). Blob Index is exposed through a familiar blob storage endpoint and APIs, allowing you to easily store and access both your data and classification indices on the same service to reduce application complexity.

To populate the blob index, you define key-value tag attributes on your data, either on new data during upload or on existing data already in your storage account. These blob index tags are stored alongside your underlying blob data. The blob indexing engine then automatically reads the new tags, indexes them, and exposes them in a user-queryable blob index. Using the Azure portal, REST APIs, or SDKs, you can then issue a FindBlobsByTags API call specifying a set of criteria. Blob storage will return a filtered result set consisting only of the blobs that meet the match criteria.
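As a sketch of the tagging side, assuming the azure-storage-blob Python SDK (v12.4 or later, which added blob index tag support)—the connection string, container, and blob names below are hypothetical:

```python
import os

def tag_filter(**tags):
    """Build a blob index filter expression: tag names in double quotes,
    values in single quotes, clauses joined with AND."""
    return " AND ".join(f"\"{k}\"='{v}'" for k, v in tags.items())

# The calls below need a real storage account, so they are skipped
# unless a connection string is provided via the environment.
conn = os.environ.get("AZURE_STORAGE_CONNECTION_STRING")
if conn:
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(conn)
    blob = service.get_blob_client("videofiles", "B2.mp4")  # hypothetical names

    # Tags can be set during upload...
    blob.upload_blob(
        b"...",
        tags={"Status": "Unprocessed", "Quality": "8K", "Source": "RAW"},
        overwrite=True,
    )
    # ...or applied to data already in the account.
    blob.set_blob_tags({"Status": "Processed", "Quality": "8K", "Source": "RAW"})
```

The `tag_filter` helper produces expressions of the form `"Status"='Unprocessed'`, which is the syntax FindBlobsByTags expects.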

The below scenario is an example of how Blob Index works:

In a storage account container with a million blobs, a user uploads a new blob “B2” with the following blob index tags: < Status = Unprocessed, Quality = 8K, Source = RAW >.
The blob and its blob index tags are persisted to the storage account and the account indexing engine exposes the new blob index shortly after.
Later on, an encoding application wants to find all unprocessed media files that are at least 4K resolution quality. It issues a FindBlobsByTags API call to find all blobs that match the following criteria: < Status == Unprocessed AND Quality >= 4K AND Source == RAW >.
The blob index quickly returns just blob “B2,” the sole blob out of one million blobs that matches the specified criteria. The encoding application can quickly start its processing job, saving idle compute time and money.
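The query side of this scenario might look like the following hedged sketch with the same azure-storage-blob Python SDK (v12.4+), where FindBlobsByTags is exposed as `find_blobs_by_tags`; the account details are hypothetical:

```python
import os

# The scenario's criteria as a filter expression. Tag values compare as
# strings, so the lexicographic comparison "8K" >= "4K" holds and B2 matches.
criteria = "\"Status\"='Unprocessed' AND \"Quality\">='4K' AND \"Source\"='RAW'"

conn = os.environ.get("AZURE_STORAGE_CONNECTION_STRING")
if conn:
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(conn)
    # Only matching blobs come back; no client-side listing and filtering.
    for match in service.find_blobs_by_tags(criteria):
        print(match.name)
```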

 

Platform feature integrations with Blob Index

Blob Index not only helps you categorize, manage, and find your blob data but also provides integrations with other Blob service features, such as Lifecycle management.

Using the new blobIndexMatch as a filter, you can move data to cooler tiers or delete data based on the tags applied to your blobs. This allows you to be more granular in your rules and only move or delete blobs that match your specified criteria.

The following sample lifecycle management policy applies to block blobs in the “videofiles” container and tiers objects to archive storage after one day, but only if the blobs match the blob index tags Status = ‘Processed’ and Source = ‘RAW’.
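The sample policy itself did not survive on this page; based on the lifecycle management policy JSON schema, a policy matching that description might look like this (the rule name is hypothetical):

```json
{
  "rules": [
    {
      "enabled": true,
      "name": "archive-processed-raw-video",
      "type": "Lifecycle",
      "definition": {
        "actions": {
          "baseBlob": {
            "tierToArchive": { "daysAfterModificationGreaterThan": 1 }
          }
        },
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "videofiles/" ],
          "blobIndexMatch": [
            { "name": "Status", "op": "==", "value": "Processed" },
            { "name": "Source", "op": "==", "value": "RAW" }
          ]
        }
      }
    }
  ]
}
```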

Lifecycle management integration with Blob Index is just the beginning. We will be adding more integrations with other blob platform features soon!

Conditional blob operations with Blob Index tags

In REST versions 2019-10-10 and higher, most blob service APIs now support a new conditional header, x-ms-if-tags, so that the operation will succeed only if the specified blob index tag condition is met. If the condition is not met, the operation will fail, leaving the blob unmodified. This functionality can help ensure that data operations occur only on explicitly tagged blobs, and can protect against inadvertent deletion or modification by multi-threaded applications.
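As a hedged sketch, the azure-storage-blob Python SDK surfaces x-ms-if-tags as the `if_tags_match_condition` keyword argument on operations such as `delete_blob`; the container and blob names below are hypothetical:

```python
import os

# The condition uses the same expression syntax as FindBlobsByTags:
# tag names in double quotes, values in single quotes.
condition = "\"Status\"='Processed'"

conn = os.environ.get("AZURE_STORAGE_CONNECTION_STRING")
if conn:
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(conn)
    blob = service.get_blob_client("videofiles", "clip.mp4")  # hypothetical
    # Succeeds only if the blob currently carries Status = 'Processed';
    # otherwise the service responds 412 (ConditionNotMet) and nothing changes.
    blob.delete_blob(if_tags_match_condition=condition)
```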

How to get started

To enroll in the Blob Index preview, submit a request to register this feature for your subscription by running the following PowerShell or CLI commands:

Register by using PowerShell

Register-AzProviderFeature -FeatureName BlobIndex -ProviderNamespace Microsoft.Storage

Register-AzResourceProvider -ProviderNamespace Microsoft.Storage

Register by using Azure CLI

az feature register --namespace Microsoft.Storage --name BlobIndex

az provider register --namespace 'Microsoft.Storage'

After your request is approved, any existing or new General-purpose v2 (GPv2) storage accounts in France Central and France South can leverage Blob Index’s capabilities. As with most previews, we recommend not using this feature for production workloads until it reaches general availability.

Build it, use it, and tell us about it!

Once you’re registered and approved for the preview, you can start leveraging all that Blob Index has to offer by setting tags on new or existing data, finding data based on tags, and setting rich lifecycle management policies with tag filters. For more information, please see Manage and find data on Azure Blob Storage with Blob Index.

Note that customers are charged for the total number of blob index tags within a storage account, averaged over the month. Requests to SetBlobTags, GetBlobTags, and FindBlobsByTags are charged in accordance with their respective operation types. There is no cost for the indexing engine. See Block Blob pricing to learn more.

We will continue to improve our feature capabilities and look forward to hearing your feedback regarding Blob Index or other features via email at BlobIndexPreview@microsoft.com. As a reminder, we love hearing all of your ideas and suggestions about Azure Storage, which you can post at the Azure Storage feedback forum.
Source: Azure