Google’s scalable supercomputers for machine learning, Cloud TPU Pods, are now publicly available in beta

To accelerate the largest-scale machine learning (ML) applications deployed today and enable rapid development of the ML applications of tomorrow, Google created custom silicon chips called Tensor Processing Units (TPUs). When assembled into multi-rack ML supercomputers called Cloud TPU Pods, these TPUs can complete ML workloads in minutes or hours that previously took days or weeks on other systems. Today, for the first time, Google Cloud TPU v2 Pods and Cloud TPU v3 Pods are publicly available in beta to help ML researchers, engineers, and data scientists iterate faster and train more capable machine learning models.

[Image: A full Cloud TPU v3 Pod]

Delivering business value

Google Cloud is committed to providing a full spectrum of ML accelerators, including both Cloud GPUs and Cloud TPUs. Cloud TPUs offer highly competitive performance and cost, often training cutting-edge deep learning models faster while delivering significant savings. If your ML team is building complex models and training on large data sets, we recommend that you evaluate Cloud TPUs whenever you require:

- Shorter time to insights: iterate faster while training large ML models
- Higher accuracy: train more accurate models using larger datasets (millions of labeled examples; terabytes or petabytes of data)
- Frequent model updates: retrain a model daily or weekly as new data comes in
- Rapid prototyping: start quickly with our optimized, open-source reference models in image segmentation, object detection, language processing, and other major application domains

While some custom silicon chips can only perform a single function, TPUs are fully programmable, which means that Cloud TPU Pods can accelerate a wide range of state-of-the-art ML workloads, including many of the most popular deep learning models.
For example, a Cloud TPU v3 Pod can train ResNet-50 (image classification) from scratch on the ImageNet dataset in just two minutes, or BERT (NLP) in just 76 minutes.

Cloud TPU customers see significant speed-ups in workloads spanning visual product search, financial modeling, energy production, and other areas. In a recent case study, Recursion Pharmaceuticals iteratively tests the viability of synthesized molecules to treat rare illnesses. Training that took over 24 hours on their on-premises cluster completed in only 15 minutes on a Cloud TPU Pod.

What's in a Cloud TPU Pod

A single Cloud TPU Pod can include more than 1,000 individual TPU chips, which are connected by an ultra-fast, two-dimensional toroidal mesh network, as illustrated below. The TPU software stack uses this mesh network to enable many racks of machines to be programmed as a single, giant ML supercomputer via a variety of flexible, high-level APIs.

[Image: 2D toroidal mesh network]

The latest-generation Cloud TPU v3 Pods are liquid-cooled for maximum performance, and each one delivers more than 100 petaFLOPs of computing power.
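The defining property of a 2D toroidal mesh is that every edge wraps around to the opposite side, so every chip has exactly four neighbors and no chip sits on a "border." A minimal sketch of that wraparound neighbor rule (this is purely illustrative; it is not Google's actual topology code):

```python
# Hypothetical sketch: neighbors of a chip at (x, y) on a 2D torus.
# Modular arithmetic implements the wraparound edges of the mesh.
def torus_neighbors(x, y, width, height):
    """Return the four neighbors of chip (x, y) on a width x height torus."""
    return [
        ((x - 1) % width, y),   # left (wraps to the right edge)
        ((x + 1) % width, y),   # right (wraps to the left edge)
        (x, (y - 1) % height),  # up (wraps to the bottom edge)
        (x, (y + 1) % height),  # down (wraps to the top edge)
    ]

# On a 4x4 torus, the corner chip (0, 0) is not special; it wraps around:
print(torus_neighbors(0, 0, 4, 4))  # [(3, 0), (1, 0), (0, 3), (0, 1)]
```

Because there are no border chips, the maximum hop count between any two chips is halved relative to a plain grid, which helps the collective communication operations that large-scale training relies on.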
In terms of raw mathematical operations per second, a Cloud TPU v3 Pod is comparable with a top-5 supercomputer worldwide (though it operates at lower numerical precision).

It's also possible to use smaller sections of Cloud TPU Pods called "slices." We often see ML teams develop their initial models on individual Cloud TPU devices (which are generally available) and then expand to progressively larger Cloud TPU Pod slices via both data parallelism and model parallelism to achieve greater training speed and model scale. You can learn more about the underlying architecture of TPUs in this blog post or this interactive website, and you can learn more about individual Cloud TPU devices and Cloud TPU Pod slices here.

Getting started

It's easy and fun to try out a Cloud TPU in your browser right now via this interactive Colab that enables you to apply a pre-trained Mask R-CNN image segmentation model to an image of your choice. You can learn more about image segmentation on Cloud TPUs in this recent blog post.

Next, we recommend working through our Cloud TPU Quickstart and then experimenting with one of the optimized and open-source Cloud TPU reference models listed below. We carefully optimized these models to save you time and effort, and they demonstrate a variety of Cloud TPU best practices.
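The data parallelism mentioned above is the simpler of the two scaling strategies: each worker computes gradients on its own shard of the batch, and the gradients are averaged (an "all-reduce") before a single shared weight update. A toy, framework-free sketch of that idea (purely illustrative; real Cloud TPU training uses high-level TensorFlow APIs rather than code like this):

```python
# Hypothetical sketch of data-parallel training: each worker computes a
# gradient on its own data shard; gradients are averaged before one
# shared update, so all workers keep identical weights.
def parallel_gradient_step(w, shards, grad_fn, lr=0.1):
    grads = [grad_fn(w, shard) for shard in shards]  # per-worker compute
    avg_grad = sum(grads) / len(grads)               # "all-reduce" average
    return w - lr * avg_grad                         # shared weight update

# Toy objective: mean squared distance from w to points split across shards.
grad_fn = lambda w, xs: sum(2 * (w - x) for x in xs) / len(xs)
shards = [[1.0, 2.0], [3.0, 4.0]]  # two workers, two points each
w = 0.0
for _ in range(100):
    w = parallel_gradient_step(w, shards, grad_fn)
print(round(w, 4))  # converges to the mean of all points, 2.5
```

Because the averaged gradient equals the gradient over the full dataset, adding workers grows the effective batch without changing the result of each step, which is what lets a model move from a single device to a larger Pod slice.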
Benchmarking one of our official reference models on a public dataset on larger and larger pod slices is a great way to get a sense of Cloud TPU performance at scale.

Image classification
- ResNet (tutorial, code, blog post)
- AmoebaNet-D (tutorial, code)
- Inception (tutorial, code)

Mobile image classification
- MnasNet (tutorial, code, blog post)
- MobileNet (code)

Object detection
- RetinaNet (tutorial, code, blog post)
- TensorFlow Object Detection API (blog post, tutorial)

Image segmentation
- Mask R-CNN (tutorial, code, blog post, interactive Colab)
- DeepLab (tutorial, code, blog post, interactive Colab)

Natural language processing
- BERT (code, interactive Colab)
- Transformer (tutorial, Tensor2Tensor docs)
- Mesh TensorFlow (paper, code)
- QANet (code)
- Transformer-XL (code)

Speech recognition
- ASR Transformer (tutorial)
- Lingvo (code)

Generative adversarial networks
- Compare GAN library, including a reimplementation of BigGAN (blog post, paper, code)
- DCGAN (code)

After you work with one of the above reference models on Cloud TPU, our performance guide, profiling tools guide, and troubleshooting guide can give you in-depth technical information to help you create and optimize machine learning models on your own using high-level TensorFlow APIs. Once you're ready to request a Cloud TPU Pod or Cloud TPU Pod slices to accelerate your own ML workloads, please contact a Google Cloud sales representative.
Quelle: Google Cloud Platform

Build with Azure IoT Central and IoT Plug and Play

We’ve made it our mission to provide powerful yet simple-to-use IoT offerings across cloud and edge, so that our partners and customers can quickly move from idea to pilot to production without needing deep expertise. Azure IoT Central and IoT Plug and Play are at the center of our quest to simplify the IoT journey so that any customer, no matter where they’re starting from, can quickly and easily create trusted, connected solutions.
Quelle: Azure

Host multiplayer Minecraft: Education Edition on Azure Virtual Machines

 

The creative nature of Minecraft has made it one of the premier educational tools for the modern classroom. Teachers around the world have designed, modified, and explored collaborative Minecraft projects for all subjects, and with Minecraft: Education Edition it has become even easier for teachers to spin up multiplayer servers right from their own machines and lead their classes in collaborative building and problem solving.

We're excited to announce that the new Minecraft: Education Edition virtual machine (preview) is available on the Azure Marketplace. This release allows teachers to run multiplayer Minecraft: Education Edition sessions with the scalability, performance, and security of Azure. Students need only log in with their school-issued email address to join the learning! And institutions that have a Minecraft: Education Edition license through select Microsoft 365 Education plans pay only for what they use on the virtual machine itself.

Running Minecraft: Education Edition on Azure can give teachers more flexibility and control over the learning experience. Many teachers don't have a personal or organization-issued device that can host large multiplayer sessions, or they aren't able to leave the multiplayer instance open for students to connect from home, making the environment accessible only during class hours. This is just the first step in pairing educational experiences like Minecraft: Education Edition with the Azure cloud, and we invite you to share your thoughts and feedback.

Azure provides a $200 credit and a free tier of services (including virtual machines) for educators and IT administrators through the Azure Free Account. Students can also receive a $100 credit and a free tier of services, without requiring a credit card, through academic verification with Azure for Students.

Learn more about Minecraft: Education Edition, Microsoft 365 Education, and Azure in Education to try out this new offering in your classroom. Or, share this information with your local schools!

In just a couple steps, teachers can start a multiplayer Minecraft: Education Edition server and students can connect from anywhere, anytime.

If you couldn’t visit us at Microsoft Build 2019 to try out the multiplayer experience for yourself, check out the DIY site for more details.

We are excited to launch a pilot with a few current Minecraft: Education Edition teachers to try out this new Azure VM experience.

If you are a teacher or school IT leader interested in partnering with us to improve this Minecraft: Education Edition experience, please message me (Sarah Guthals on LinkedIn). We look forward to continuing partnerships in education!
Quelle: Azure

Privacy: Google Maps gets an Incognito Mode

Chrome users have been able to browse in Incognito Mode for years, meaning that visited websites and cookies are not stored. The feature is now coming to the Maps mapping service, and later to Search as well. In addition, all users can now use AR walking directions. (Google I/O 2019, Google)
Quelle: Golem