3co reinvents the digital shopping experience with augmented reality on Google Cloud

Giving people an experience as close as possible to “try before you buy” is essential for retailers. With the move to online shopping further accelerated by the COVID-19 pandemic, many people are now comfortable buying items online that they previously only considered purchasing in stores. The problem for shoppers is that it can still be difficult to get a hands-on feel for an item, given the limitations of even today’s most advanced augmented reality (AR) technologies. And while retailers continue to invest heavily in creating the most lifelike digital experiences possible, the results often come up short for shoppers who have more digital buying options than ever. To make AR experiences more convincing for shoppers, and for anyone wanting richer, more immersive experiences in entertainment and other industries, the depiction of real-world physical objects in digital spaces needs to continue to improve and evolve.

As avid plant lovers, we knew the experience of viewing and buying plants online was severely lacking. That prompted our initial exploration into rethinking what’s possible with AR: we built a direct-to-consumer app for buying plants in AR. However, during our time in the Techstars program, we quickly realized that improving how people see and experience plants online was just a fraction of a much bigger, multi-billion-dollar opportunity for us. Since 2018, 3co has been laser-focused (quite literally) on scaling 3D tech for all of e-commerce.

An automated 3D scanning system for photorealistic 3D modeling of retail products, designed by 3co and powered by Google Cloud.

Closing the gap between imagination and reality with Google Cloud

With that in mind, 3co began developing the breakthroughs needed in 3D computer vision. Our advanced artificial intelligence (AI) stack is designed to give companies an all-in-one 3D commerce platform to easily and cost-effectively create realistic 3D models of physical objects and stage them in virtual showrooms.

When building our AR platform, we quickly understood that engineering 3D simulations with sub-perceptual precision requires an enormous amount of compute power. Fortunately, the problems are parallelizable. But it simply isn’t possible to model the complex real world in 3D with superhuman precision on conventional laptops or desktops.

As part of the Google for Startups Cloud Program, Startup Success Managers helped 3co plug into the full power of Google’s industry-leading compute capabilities. For several projects, we selected a scalable Compute Engine configuration powerful enough to solve even the most complex 3D graphics optimizations at scale. Today, with the A2 virtual machine, 3co leverages NVIDIA Ampere A100 Tensor Core GPUs to create more lifelike 3D renderings over ten times faster. And this is just the beginning.

We’re also proud to have deployed a customized streaming GUI on top of Google’s monstrous machines, which allows our colleagues across the world (including in Amsterdam and Miami) to plug and play with the latest 3D models on a world-class industrial GPU. I would highly recommend that companies solving very hard AI or 3D challenges with a distributed team consider adopting cloud resources in the same way. It was a delight to see Blender render gigabyte-sized 3D models faster than ever before in my life.

GUI for 3D modeling, streamed from Google Cloud computers by 3co, which unlocked previously impossible collaborative workflows on gigabyte-sized 3D models.
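The post doesn’t include 3co’s provisioning setup, but for orientation, here is a minimal, hypothetical sketch of how an A2 VM with a bundled A100 GPU might be created with the google-cloud-compute Python client. The project, zone, image, and disk size are placeholder values, not 3co’s actual configuration.

```python
# Sketch only (not 3co's tooling): provision an a2-highgpu-1g VM, which comes
# with one NVIDIA A100 attached, using the google-cloud-compute client library.
from google.cloud import compute_v1


def create_a100_vm(project: str, zone: str, name: str) -> None:
    instance = compute_v1.Instance()
    instance.name = name
    # A2 machine types bundle A100 GPUs; no separate accelerator config is needed.
    instance.machine_type = f"zones/{zone}/machineTypes/a2-highgpu-1g"

    boot_disk = compute_v1.AttachedDisk(
        boot=True,
        auto_delete=True,
        initialize_params=compute_v1.AttachedDiskInitializeParams(
            source_image="projects/debian-cloud/global/images/family/debian-12",  # placeholder image
            disk_size_gb=200,
        ),
    )
    instance.disks = [boot_disk]
    instance.network_interfaces = [
        compute_v1.NetworkInterface(network="global/networks/default")
    ]
    # GPU VMs cannot live-migrate, so terminate on host maintenance.
    instance.scheduling = compute_v1.Scheduling(on_host_maintenance="TERMINATE")

    op = compute_v1.InstancesClient().insert(
        project=project, zone=zone, instance_resource=instance
    )
    op.result()  # block until the create operation completes
```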
Equally critical, with our technology, 3D artists in retail, media and entertainment, and other industries under pressure to deliver more (and more immersive) AR experiences can cut the cost and time of generating photorealistic 3D models by as much as tenfold. We know this from our own work: we’ve seen the computing cost of generating the highest-quality 3D experiences drop significantly, even though we run an advanced Compute Engine configuration loaded with powerful GPUs, high-end CPUs, and massive amounts of RAM. If the goal is to scale industry-leading compute power quickly for a global customer base, Google Cloud is the proper solution.

Storage is another key but often overlooked part of the Google Cloud ecosystem, and it’s critical for 3co. We need the high throughput, low latency, and instant scalability delivered by local SSDs to support the massive amounts of data we generate, store, and stream. The local SSDs complement our A2 Compute Engine instances and are physically attached to the servers hosting the virtual machine instances. This local configuration supports extremely high input/output operations per second (IOPS) with very low latency compared to persistent disks. To top it off, Cloud Logging delivers real-time log management at exabyte scale, ingesting analytic events that are streamed to data lakes with Pub/Sub, so we can know, while enjoying the beach here in Miami, Florida, that everything is going smoothly in the cloud.

Building the 3co AI stack with TensorFlow

Building one of the world’s most advanced 3D computer vision solutions would not have been possible without TensorFlow and its comprehensive ecosystem of tools, libraries, and community resources. Since the launch of TensorFlow in 2015, I’ve personally built dozens of deep learning systems using this battle-hardened open source machine learning framework from Google. Through TensorFlow on Google Cloud, 3co is able to scale its compute power for the creation of truly photorealistic digital models of physical objects, down to microscopic computation of material textures and deep representations of surface light transport from all angles.

Most recently, 3co has been making massive progress on top of the TensorFlow implementation of Neural Radiance Fields (“NeRF”, Mildenhall et al. 2020). We are humbled to note that this breakthrough AI in TensorFlow truly is disruptive for the 3D modeling industry: we anticipate the next decade in 3D modeling will be increasingly shaped and colored by similar neural networks (I believe the key insight of the original NeRF authors is to force a neural network to learn a physics-based model of light transport). For our contribution, 3co is now (1) adapting NeRF-like neural networks to optimally leverage sensor data from various leading devices for 3D computer vision, and (2) forcing these neural networks to learn industry-standard 3D modeling data structures that can instantly plug and play on the leading 3D platforms. As Isaac Newton said, “If I have seen further, it is by standing on the shoulders of giants.” That is, tech giants. In several ways, TensorFlow is the go-to solution both for prototyping and for large-scale deployment of AI in general.
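The post doesn’t share 3co’s model code, but the core NeRF idea mentioned above can be sketched in a few lines of TensorFlow: a coordinate network with positional encoding maps a 3D position and viewing direction to a volume density and a color, which a ray-marching volume renderer (omitted here) then integrates along camera rays. The layer sizes and frequency counts below are illustrative defaults in the spirit of the original paper, not 3co’s architecture.

```python
import tensorflow as tf


def positional_encoding(x: tf.Tensor, num_freqs: int) -> tf.Tensor:
    """Encode coordinates with sin/cos features so the MLP can fit high-frequency detail."""
    freqs = 2.0 ** tf.range(num_freqs, dtype=tf.float32)        # (num_freqs,)
    angles = x[..., None] * freqs                                # (..., dims, num_freqs)
    feats = tf.concat([tf.sin(angles), tf.cos(angles)], axis=-1)
    return tf.reshape(feats, tf.concat([tf.shape(x)[:-1], [-1]], axis=0))


def make_nerf_mlp(hidden: int = 256, pos_freqs: int = 10, dir_freqs: int = 4) -> tf.keras.Model:
    """NeRF-style network: encoded position and view direction -> volume density and RGB color."""
    pos_in = tf.keras.Input(shape=(3 * 2 * pos_freqs,))  # encoded 3D sample point along a camera ray
    dir_in = tf.keras.Input(shape=(3 * 2 * dir_freqs,))  # encoded viewing direction
    h = pos_in
    for _ in range(4):
        h = tf.keras.layers.Dense(hidden, activation="relu")(h)
    sigma = tf.keras.layers.Dense(1, activation="softplus", name="density")(h)  # view-independent
    h = tf.keras.layers.Dense(hidden // 2, activation="relu")(
        tf.keras.layers.Concatenate()([h, dir_in])
    )
    rgb = tf.keras.layers.Dense(3, activation="sigmoid", name="color")(h)       # view-dependent
    return tf.keras.Model(inputs=[pos_in, dir_in], outputs=[sigma, rgb])


# Example: query the field at random encoded samples (volume rendering along rays is omitted).
model = make_nerf_mlp()
points = positional_encoding(tf.random.uniform([1024, 3]), num_freqs=10)
dirs = positional_encoding(tf.random.uniform([1024, 3]), num_freqs=4)
density, color = model([points, dirs])
```

Training such a field against posed photographs, together with the ray-marching renderer, is what the paper and its TensorFlow implementation add on top of this skeleton.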
Under the hood, TensorFlow uses a sophisticated compiler (XLA) to optimize how computations are allocated on the underlying hardware. 3co achieved a 10x speed-up in neural network training time (for inverse rendering optimization) by compiling its computations with TensorFlow XLA. Unlike some alternatives (e.g., PyTorch, JAX), TensorFlow can also compile models for deployment across device architectures (iOS, Android, JavaScript) through TensorFlow Lite and TensorFlow.js. This ability is important because 3co is committed to delivering 3D computer vision wherever it is needed, with maximum speed and accuracy. Through TensorFlow on Google Cloud, 3co has been able to speed up experimental validation of patent-pending 3D computer vision systems that can run the same TensorFlow code across smartphones, LIDAR scanners, AR glasses, and much more.

3co is developing an operating system for 3D computer vision powered by TensorFlow, in order to unify development of a single codebase for AI across the most common sensors and processors.

TensorFlow also enables 3co’s neural networks to train faster, through an easy API for distributed training across many computers. Distributed deep learning was the focus of my master’s thesis in 2013 (inspired by work from Jeff Dean, Andrew Ng, and Google Brain), so you can imagine how excited I was to see Google optimize these industry-leading capabilities for the open source community over the following years. Parallelization of deep learning has consistently proven essential for creating this advanced AI, and 3co is no exception to this rule. Faster AI training also means faster conclusion of R&D experiments. As Sam Altman says, “The number one predictor of success for a very young startup: rate of iteration.” From day one, TensorFlow was built to speed up Google’s AI computing challenges at the biggest scale, but it also “just works” at the earliest stages of exploration.

Through TensorFlow on Google Cloud, 3co is steadily improving our capabilities for autonomous photorealistic 3D modeling. Simple and flexible architectures for fast experimentation let us move quickly from concept to code, and from code to state-of-the-art deployed ML models. Through TensorFlow, Google has given 3co a powerful tool to better serve customers with modern AI and computer vision.

In the future, 3co has big plans involving Google Cloud Tensor Processing Unit (TPU) supercomputers, so we plan to achieve even greater speed and cost optimization. Running TensorFlow on Cloud TPUs requires a little extra work from the AI developer, but Google is making it increasingly easy to plug and play on these gargantuan computing architectures. They truly are world-class servers for AI. I remember being as excited as a little boy in a candy store reading the research published back in 2017 on Google’s TPUs, which represented the climax of R&D for literally dozens of super smart computer engineers. Since then, several generations of TPUs have been deployed internally at Google for many kinds of applications (e.g., Google Translate), and they have been made increasingly useful and accessible. Startups like 3co, and our customers, can benefit enormously here. Through the use of advanced processors like TPUs, 3co expects to parallelize its AI to perform photorealistic 3D modeling of real scenes in real time. Imagine the possibilities for commerce, gaming, entertainment, design, and architecture that this ability could unlock.
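To make the two levers described above concrete, XLA compilation and distributed training are both exposed as small switches in TensorFlow’s Keras API. The toy model and data below are placeholders standing in for an inverse-rendering workload; only the jit_compile flag and the distribution strategy are the point of the sketch.

```python
import tensorflow as tf

# Train across all GPUs attached to the VM; on Cloud TPU one would swap in TPUStrategy.
strategy = tf.distribute.MirroredStrategy()

with strategy.scope():
    # Placeholder model standing in for an inverse-rendering network.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(63,)),
        tf.keras.layers.Dense(256, activation="relu"),
        tf.keras.layers.Dense(256, activation="relu"),
        tf.keras.layers.Dense(4),
    ])
    model.compile(
        optimizer="adam",
        loss="mse",
        jit_compile=True,   # compile the training step with XLA
    )

# Placeholder data standing in for encoded ray samples and supervision targets.
xs = tf.random.normal([4096, 63])
ys = tf.random.normal([4096, 4])
dataset = tf.data.Dataset.from_tensor_slices((xs, ys)).batch(256)

model.fit(dataset, epochs=1)
```

On an A2 VM, MirroredStrategy shards each batch across the attached GPUs; the same script also runs unchanged on a single-GPU workstation.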
Scaling 3D commerce with Google Cloud and credits

3co’s participation in the Google for Startups Cloud Program (facilitated via Techstars, whom we also can’t thank enough) has been instrumental to our success in closing the gap between imagination and reality. It’s a mission we’ve been working on for years and will continue to hone for many years to come. And this success is thanks to the Google for Startups Success team: they are truly amazing. They just care about you. If you’re a startup founder, just reach out to them: they really do wonders. We especially want to highlight the Google Cloud research credits, which gave 3co access to vastly greater amounts of compute power. We are so grateful to Google Cloud for enabling 3co to scale its 3D computer vision services to customers worldwide. I love that 3co is empowered by Google to help many people see the world in a new light.

If you want to learn more about how Google Cloud can help your startup, visit our page here to get more information about our program, and sign up for our communications to get a look at our community activities, digital events, special offers, and more.
Source: Google Cloud Platform

Security through collaboration: Building a more secure future with Confidential Computing

At Google Cloud, we believe that the protection of our customers’ sensitive data is paramount, and encryption is a powerful mechanism to help achieve this goal. For years, we have supported encryption in transit when our customers ingest their data to bring it to the cloud. We’ve also long supported encryption at rest, for all customer content stored in Google Cloud. To complete the full data protection lifecycle, we can protect customer data when it’s processed through our Confidential Computing portfolio.

Confidential Computing products from Google Cloud protect data in use by performing computation in a hardware-isolated environment that is encrypted with keys managed by the processor and unavailable to the operator. These isolated environments help prevent unauthorized access or modification of applications and data while in use, thereby increasing the security assurances for organizations that manage sensitive and regulated data in public cloud infrastructure. Secure isolation has always been a critical component of our cloud infrastructure; with Confidential Computing, this isolation is cryptographically reinforced. Google Cloud’s Confidential Computing products leverage security components in AMD EPYC™ processors, including AMD Secure Encrypted Virtualization (SEV) technology.

Building trust in Confidential Computing through industry collaboration

Part of our mission to bring Confidential Computing technology to more cloud workloads and services is to make sure that the hardware and software used to build these technologies are continuously reviewed and tested. We evaluate different attack vectors to help ensure Google Cloud Confidential Computing environments are protected against a broad range of attacks. As part of this evaluation, we recognize that the secure use of our services, and of the Internet ecosystem as a whole, depends on interactions with applications, hardware, software, and services that Google doesn’t own or operate.

The Google Cloud Security team, Google Project Zero, and the AMD firmware and product security teams collaborated for several months to conduct a detailed review of the technology and firmware that power AMD Confidential Computing technology. This review covered both Secure Encrypted Virtualization (SEV) capable CPUs and the next generation of Secure Nested Paging (SEV-SNP) capable CPUs, which protect confidential VMs against the hypervisor itself. The goal of this review was to work together to analyze the firmware and technologies AMD uses to help build Google Cloud’s Confidential Computing services, and to further build trust in these technologies.

This in-depth review focused on the implementation of the AMD secure processor in the third-generation AMD EPYC processor family delivering SEV-SNP. SNP further improves the posture of Confidential Computing using technology that removes the hypervisor from the trust boundary of the guest, allowing customers to treat the cloud service provider as another untrusted party. The review covered several AMD secure processor components and evaluated multiple different attack vectors. The collective group reviewed the design and source code implementation of SEV, wrote custom test code, and ran hardware security tests, attempting to identify any potential vulnerabilities that could affect this environment.

PCIe hardware pentesting using an IO screamer.

Working on this review, the security teams identified and confirmed potential issues of varying severity.
AMD was diligent in fixing all applicable issues and now offers updated firmware through its OEM channels. Google Cloud’s AMD-based Confidential Computing solutions now include all the mitigations implemented during the security review.

“At Google, we believe that investing in security research outside of our own platforms is a critical step in keeping organizations across the broader ecosystem safe,” said Royal Hansen, vice president of Security Engineering at Google. “At the end of the day, we all benefit from a secure ecosystem that organizations rely on for their technology needs, and that is why we’re incredibly appreciative of our strong collaboration with AMD on these efforts.”

“Together, AMD and Google Cloud are continuing to advance Confidential Computing, helping enterprises to move sensitive workloads to the cloud with high levels of privacy and security, without compromising performance,” said Mark Papermaster, AMD’s executive vice president and chief technology officer. “Continuously investing in the security of these technologies through collaboration with the industry is critical to providing customer transformation through Confidential Computing. We’re thankful to have partnered with Google Cloud and the Google Security teams to advance our security technology and help shape future Confidential Computing innovations to come.”

Reviewing trusted execution environments for security is difficult given the closed-source firmware and proprietary hardware components. This is why research and collaborations such as this are critical to improving the security of foundational components that support the broader Internet ecosystem. AMD and Google believe that transparency helps provide further assurance to customers adopting Confidential Computing, and to that end AMD is working toward a model of open source security firmware.

With the analysis now complete and the vulnerabilities addressed, the AMD and Google security teams agree that the AMD firmware which enables Confidential Computing solutions meets an elevated security bar for customers, as the firmware design updates mitigate several bug classes and offer a way to recover from vulnerabilities. More importantly, the review also found that Confidential VMs are protected against a broad range of attacks described in the review.

Google Cloud’s Confidential Computing portfolio

Google Cloud Confidential VMs, Dataproc Confidential Compute, and Confidential GKE Nodes have enabled high levels of security and privacy to address our customers’ data protection needs without compromising usability, performance, or scale. Our mission is to make this technology ubiquitous across the cloud. Confidential VMs run on hosts with AMD EPYC processors, which feature AMD Secure Encrypted Virtualization (SEV). Incorporating SEV into Confidential VMs provides benefits and features including:

Isolation: Memory encryption keys are generated by the AMD Secure Processor during VM creation and reside solely within the AMD Secure Processor. Other VM encryption keys, such as those for disk encryption, can be generated and managed by an external key manager or in Google Cloud HSM. Neither set of keys is accessible to Google Cloud, offering strong isolation.

Attestation: Confidential VMs use Virtual Trusted Platform Module (vTPM) attestation. Every time a Confidential VM boots, a launch attestation report event is generated and posted to the customer’s Cloud Logging, which gives administrators the opportunity to act as necessary.

Performance: Confidential Computing offers high performance for demanding computational tasks. Enabling Confidential VMs has little or no impact on most workloads.
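The post doesn’t walk through setup, but for illustration, enabling these protections when creating a VM comes down to a single configuration flag. Below is a minimal, hypothetical sketch using the google-cloud-compute Python client; the project, zone, machine type, and image are placeholder values.

```python
# Sketch: create a VM with Confidential Computing (AMD SEV) enabled.
from google.cloud import compute_v1


def create_confidential_vm(project: str, zone: str, name: str) -> None:
    instance = compute_v1.Instance()
    instance.name = name
    # Confidential VMs run on AMD EPYC-based N2D machine types.
    instance.machine_type = f"zones/{zone}/machineTypes/n2d-standard-4"
    instance.min_cpu_platform = "AMD Milan"
    instance.confidential_instance_config = compute_v1.ConfidentialInstanceConfig(
        enable_confidential_compute=True
    )
    # Confidential VMs cannot live-migrate; terminate on host maintenance instead.
    instance.scheduling = compute_v1.Scheduling(on_host_maintenance="TERMINATE")

    instance.disks = [
        compute_v1.AttachedDisk(
            boot=True,
            auto_delete=True,
            initialize_params=compute_v1.AttachedDiskInitializeParams(
                # Placeholder: use any image that supports Confidential VM.
                source_image="projects/ubuntu-os-cloud/global/images/family/ubuntu-2204-lts",
            ),
        )
    ]
    instance.network_interfaces = [
        compute_v1.NetworkInterface(network="global/networks/default")
    ]

    op = compute_v1.InstancesClient().insert(
        project=project, zone=zone, instance_resource=instance
    )
    op.result()  # wait for the create operation to finish
```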
The future of Confidential Computing and secure platforms

While there are no absolutes in computer security, collaborative research efforts help uncover security vulnerabilities that can emerge in complex environments and help protect Confidential Computing solutions from threats today and into the future. Ultimately, this helps us increase levels of trust for customers. We believe Confidential Computing is an industry-wide effort that is critical for securing sensitive workloads in the cloud, and we are grateful to AMD for their continued collaboration on this journey. To read the full security review, visit this page.

Acknowledgments

We thank the many Google security team members who contributed to this ongoing security collaboration and review, including James Forshaw, Jann Horn, and Mark Brand. We are grateful for the open collaboration with AMD engineers, and wish to thank David Kaplan, Richard Relph, and Nathan Nadarajah for their commitment to product security. We would also like to thank AMD leadership: Ab Nacef, Prabhu Jayanna, Hugo Romero, Andrej Zdravkovic, and Mark Papermaster for their support of this joint effort.
Source: Google Cloud Platform

Manage Red Hat workloads seamlessly on Azure

Every year, Red Hat Summit features inspirational and actionable content, industry-shaping news, and innovative practices from customers and partners. From hybrid cloud, containers, and cloud-native app platforms to management, automation, and more, speakers from around the world, across industries and sectors, join to share how they're using open tools to build better solutions for themselves and their customers. Microsoft is proud to sponsor and participate in Red Hat Summit 2022, which brings together communities who are passionate about open source in the enterprise.

Business is changing, and keeping up with fluctuations in markets and customer demands is not easy. Modernization is essential. Technologies like containers, Kubernetes, and hybrid cloud architectures are key components that provide the scalability, innovation, and flexibility you need to maintain a competitive edge, grow market share, and increase margins. Microsoft and Red Hat offer you the tools to reduce complexity and simplify your environment, innovate faster, deliver high-quality customer experiences, and expand and scale your infrastructure in any direction so you can be a disruptor in your industry.

Today, we’re announcing multiple enhancements to our Red Hat on Azure offerings that help customers accelerate their digital transformation with the power of the cloud. These include the broad availability of Red Hat Ansible Automation Platform on Azure and Red Hat OpenShift support for Azure Arc-enabled SQL Managed Instance.

Detailed updates include:

Red Hat Ansible Automation Platform on Azure is now available to customers in North America, with global availability coming soon. Ansible Automation Platform 2.2 features are available to customers in tech preview. Red Hat Ansible Automation Platform on Azure enables IT organizations to quickly automate and scale in the cloud, with the flexibility to deliver any application, anywhere, without additional overhead or complexity. Go from zero to automation in minutes by deploying the managed application directly from the Azure Marketplace.

Azure Arc-enabled SQL Managed Instance is now supported on Red Hat OpenShift. For Red Hat Enterprise Linux customers who need to run their data workloads outside Azure, in their own datacenters or multicloud environments, we bring trusted Azure SQL and open-source database services to meet them where they are. This database service unifies management and delivers mission-critical performance and high availability/disaster recovery at scale. With an evergreen SQL that has no end of support, customers can realize the best of Azure SQL on OpenShift, in any environment. Customers can enjoy fully automated updates and patches to innovate faster and be more secure.

"Red Hat has been a strategic partner in our Azure Arc partner ecosystem in lighting up the next-gen Azure data services to run anywhere. With this support, organizations can run Azure Arc-enabled SQL Managed Instance across any environment without worrying about the infrastructure underneath. The combination of RedHat OpenShift and Azure Arc-enabled SQL Managed Instance allows customers to use the platform they know and trust to accelerate innovation with faster time to market with enterprise-grade support."—Peter Carlin, CVP Azure Database Platform

Red Hat Enterprise Linux (RHEL) 9 will be available on Azure beginning May 24. With demand for edge computing continuing to grow, RHEL 9 incorporates key enhancements specifically designed to address evolving IT needs at the edge. Edge management helps teams more securely manage and scale Red Hat Enterprise Linux on distributed devices from a single interface. RHEL 9 will also include support for Red Hat Update Infrastructure 4, allowing for automatic updates.
Azure Hybrid Benefit for Linux 3.0 will be broadly available from May 24. Through Azure Hybrid Benefit for Linux 3.0, customers can migrate their on-premises RHEL servers to Azure and bi-directionally convert existing RHEL pay-as-you-go (PAYG) VMs on Azure to bring-your-own-subscription (BYOS) billing, resulting in cost savings. The latest iteration adds support for custom images. Read more about Azure Hybrid Benefit for Linux for additional information.
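For context on what the PAYG-to-BYOS conversion involves, the sketch below shows the underlying operation (updating a VM's license type) with the Azure SDK for Python. The subscription, resource group, and VM name are placeholders, and the same change can be made through the Azure CLI or portal.

```python
# Sketch: switch an existing RHEL pay-as-you-go VM to BYOS billing by updating its
# license type, the operation behind Azure Hybrid Benefit for Linux.
# Assumes azure-identity and azure-mgmt-compute are installed and the caller is
# authorized on the subscription; all names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient
from azure.mgmt.compute.models import VirtualMachineUpdate

subscription_id = "<subscription-id>"
resource_group = "my-resource-group"
vm_name = "my-rhel-vm"

client = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

# "RHEL_BYOS" applies the customer's own Red Hat subscription; setting the license
# type back to "None" returns the VM to pay-as-you-go billing (the other direction
# of the bi-directional conversion).
poller = client.virtual_machines.begin_update(
    resource_group,
    vm_name,
    VirtualMachineUpdate(license_type="RHEL_BYOS"),
)
poller.result()  # wait for the update to complete
```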

Learn more

Visit the Microsoft Red Hat on Azure page to learn more about our offerings and join us at Red Hat Summit.
Source: Azure