AWS Config adds support for AWS Service Catalog.

You can now use AWS Config to record configuration changes to AWS Service Catalog, a service that lets customers organize, govern, and provision cloud resources on AWS. With AWS Config, you can track changes to the configuration of an AWS Service Catalog portfolio, such as adding or removing launch constraints, tags, and the accounts the portfolio is shared with.
Source: aws.amazon.com

Azure IoT automatic device management helps you deploy firmware updates at scale

Automatic device management in Azure IoT Hub automates many of the repetitive and complex tasks of managing large device fleets over the entirety of their lifecycles. Since the feature shipped in June 2018, there has been a lot of interest in the firmware update use case. This blog article highlights some of the ways you can kickstart your own implementation.

Update the Azure IoT DevKit firmware over-the-air using automatic device management

The Azure IoT DevKit over-the-air (OTA) firmware update project is a great implementation of automatic device management. With automatic device management, you can target a set of devices based on their properties, define a desired configuration, and let IoT Hub update devices whenever they come into scope. This is performed using an automatic device configuration, which will also allow you to summarize completion and compliance, handle merging and conflicts, and roll out configurations in a phased approach. The Azure IoT DevKit implementation defines an automatic device configuration that specifies a collection of device twin desired properties related to the firmware version and image. It also specifies a set of useful metrics that are important for monitoring a deployment across a device fleet. The target condition can be specified based on device twin tags or device twin reported properties. The latter is particularly useful as it allows devices to self-report any prerequisites for the update.
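The targeting mechanics described above can be sketched in a few lines. The following is a toy, self-contained illustration of how a configuration's target condition selects devices by their reported properties; the field names and the one-operator condition grammar are simplified stand-ins, not the actual IoT Hub schema or query language.

```python
# Toy sketch of automatic device configuration targeting (hypothetical
# shapes, not the real IoT Hub SDK or schema). A configuration carries
# desired firmware properties plus a target condition; devices self-report
# readiness in their reported properties.

configuration = {
    "id": "firmware-1-0-2",
    "content": {  # desired properties pushed to matching device twins
        "properties.desired.firmware": {
            "fwVersion": "1.0.2",
            "fwPackageURI": "https://example.com/firmware/1.0.2.bin",
        }
    },
    # Target devices that self-report they can take an OTA update.
    "targetCondition": "properties.reported.otaCapable = true",
}

def matches(twin: dict, condition: str) -> bool:
    """Tiny evaluator for conditions of the form 'dotted.path = literal'."""
    path, _, literal = (part.strip() for part in condition.partition("="))
    value = twin
    for key in path.split("."):
        value = value.get(key, {})
    return str(value).lower() == literal.lower()

fleet = [
    {"deviceId": "devkit-01", "properties": {"reported": {"otaCapable": True}}},
    {"deviceId": "devkit-02", "properties": {"reported": {"otaCapable": False}}},
]

in_scope = [d["deviceId"] for d in fleet
            if matches(d, configuration["targetCondition"])]
print(in_scope)
```

Because devkit-02 reports it cannot take the update, only devkit-01 comes into scope; this is the self-reporting-prerequisites pattern the paragraph describes.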

OTA with Mongoose OS, an open source IoT Firmware Development Framework

In October 2018, our partner Cesanta announced support for automatic device management in Mongoose OS. Mongoose OS is an open source IoT Firmware Development Framework that is cross-platform and supports a variety of microcontrollers from top semiconductor companies. Mongoose OS provides reliable OTA updates, built-in flash encryption, and crypto chip support. It allows developers to get started quickly and easily with ready-to-go starter kits, solutions, and libraries, and the option to code in either C or JavaScript.

“Mongoose OS is designed to simplify IoT firmware development for microcontrollers by helping developers to concentrate only on the specific device logic while taking care of all the heavy lifting: security, networking, device control and remote management, including over-the-air updates. By working with Microsoft Azure IoT, Mongoose OS streamlines connected product development and provides a ready-to-go integration,” says CTO and Co-Founder at Cesanta Sergey Lyubka.

Firmware update deployment for operators using Azure IoT Remote Monitoring

Most recently, we released support for automatic device management in Azure IoT Remote Monitoring. Expanding on the firmware update implementation for the Azure IoT DevKit, this solution accelerator shows how automatic device management can be utilized by an operator role, in particular how a group of devices can be targeted for deployment and how the deployment can be monitored through metrics.

More resources

Learn more about automatic device management in IoT Hub.
Learn more about the Azure IoT DevKit.
Learn more about Mongoose OS.
Learn more about automatic device management in Azure IoT Remote Monitoring.

Quelle: Azure

Create alerts to proactively monitor your data factory pipelines

Data integration is complex and helps organizations combine data and business processes in hybrid data environments. The increase in volume, variety, and velocity of data has led to delays in monitoring and reacting to issues. Organizations want to reduce the risk of data integration activity failures and the impact they cause to other downstream processes. Manual approaches to monitoring data integration projects are inefficient and time consuming. As a result, organizations want automated processes to monitor and manage data integration projects, removing inefficiencies and catching issues before they affect the entire system. Organizations can now improve operational productivity by creating alerts on data integration events (success/failure) and proactively monitoring with Azure Data Factory.

To get started, simply navigate to the Monitor tab in your data factory, select Alerts & Metrics, and then select New Alert Rule.

Select the target data factory metric for which you want to be alerted.

Then, configure the alert logic. You can specify various filters such as activity name, pipeline name, activity type, and failure type for the raised alerts. You can also specify the alert logic conditions and the evaluation criteria.

Finally, configure how you want to be alerted. Different mechanisms such as email, SMS, voice, and push notifications are supported.
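The three portal steps can be summarized as a single rule definition. The sketch below is illustrative only: the field names mirror the steps described above, not the exact Azure Monitor alert-rule schema, and the metric, pipeline, and failure-type values are hypothetical.

```python
# Illustrative alert-rule definition mirroring the portal steps above.
# Field names and values are for illustration, not the exact Azure Monitor
# schema.

alert_rule = {
    "name": "PipelineFailureAlert",
    "targetMetric": "Failed pipeline runs",          # step 1: pick a metric
    "dimensions": {                                  # step 2: filters
        "pipelineName": "CopySalesData",             # hypothetical pipeline
        "failureType": "UserError",
    },
    "condition": {"operator": "GreaterThan", "threshold": 0},
    "evaluation": {"frequencyMinutes": 5, "windowMinutes": 15},
    "actions": ["email", "sms", "push"],             # step 3: notifications
}

def should_fire(failed_runs: int, rule: dict) -> bool:
    """Evaluate the alert logic for one evaluation window."""
    return failed_runs > rule["condition"]["threshold"]

print(should_fire(2, alert_rule))
```

With a threshold of 0, a single failed pipeline run in the evaluation window triggers the configured notifications.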

Creating alerts ensures 24/7 monitoring of your data integration projects and makes sure that you are notified of issues before they potentially corrupt your data or affect downstream processes. This helps your organization be more agile and increases confidence in your overall data integration processes, ultimately increasing overall productivity and helping you deliver on your SLAs. Learn more about creating alerts in Azure Data Factory.

Our goal is to continue adding features to improve the usability of Data Factory tools. Get started building pipelines easily and quickly using Azure Data Factory. If you have any feature requests or want to provide feedback, please visit the Azure Data Factory forum.
Quelle: Azure

Our 2019 Resolution: Help you transform your 2008 server applications with Azure!

This blog post was co-authored by Erin Chapple, CVP, Microsoft Windows Server, and Rohan Kumar, CVP, Microsoft Data.

The beginning of a new year is always a time to reflect on our plans. At Microsoft, with the end of support for 2008 servers looming, we’ve been thinking about how we can help you with your server refresh journey. How can we enable you to take advantage of all the cutting-edge innovations available in Azure?

And as we take stock, we believe that the 3 reasons why Azure is the best place to transform your 2008 server applications are:

Security: With security threats becoming more and more sophisticated, increasing your organization’s security policies should be top of mind. The good news is that Azure is the most trusted cloud in the market with more certifications than any other public cloud.
Innovation: We have an optimized, low-risk path to help you embrace Azure. And once you are there, you can continue to innovate with fully-managed services such as Azure SQL Database, Azure Cosmos DB and Azure AI.
Cost savings: By taking advantage of Azure Hybrid Benefit and Extended Security Updates, you can save significantly. For example, moving a hundred 2008 servers to Azure can save you more than $300K over 3 years compared to the cost of running them on-premises (check out our Azure TCO calculator to do your own modeling). And you don’t need to think past Azure for Windows Server and SQL Server: AWS is 5x more expensive.

Get started now – we’re here to help!

The end of support for SQL Server 2008/R2 is now less than six months away on July 9th, 2019, and support ends for Windows Server 2008/R2 on January 14th, 2020. Windows 7, Office 2010, and Exchange Server are also ending their extended support soon. Microsoft and our partners are here to help you every step of the way. Here are 3 steps to help you make the shift to a modern estate:

Step 1: Assess your environment

The first step is to get a complete inventory of your 2008 server environment. Rank each workload by strategic value to your organization. Answer questions like: How would this workload benefit from running in the cloud? What needs to remain on-premises? What is your strategy for upgrading each server to current versions? By establishing your priorities and objectives at the start, you can ensure a more successful migration.

For detailed guidance, visit the Azure Database Migration Guide, the Windows Server Migration Guide, the Microsoft SQL Server Docs and the Windows Server Docs.

Step 2: Know your options

Microsoft offers a wide range of solutions to modernize your 2008 server applications on your terms:

Azure SQL Database Managed Instance: Get full engine compatibility with existing SQL Server deployments (starting with SQL Server 2008), while enabling PaaS capabilities (automatic patching and version updates, automated backups, high-availability) and AI-based features that drastically reduce management overhead and TCO.

Windows Server and SQL Server in Azure Virtual Machines: Get the flexibility of virtualization for a wide range of computing solutions—development and testing, running applications and extending your datacenter.

Windows Server and SQL Server on-premises: Bring innovative security and compliance features, industry-leading performance, mission-critical availability, advanced analytics built-in and new deployment options such as containers. Refresh your hardware and software infrastructure with Windows Server 2016 and 2019 and SQL Server 2017 and 2019.

Step 3: Make the move

Build your cloud migration plan using four widely adopted strategies: rehost, refactor, rearchitect, and rebuild applications. Choose the right mix for your business, considering the new database and OS options. Join our upcoming webinars and events to learn more:

January 15th to March 21st: SQL Server and Azure Data Services Roadshow
January 24th: Webinar: Transform Your Business with a Modern Data Estate
January 29th: Webinar: Transform Windows Server 2008 Apps and Infrastructure

When you add it all up, Microsoft has the most comprehensive and compelling Cloud, Data and AI platform on the planet. Only Microsoft offers expansive programs that deliver unprecedented value for your existing investments and we are here to help on your refresh journey. We are excited to see how you and your organization continue to innovate and transform your world!
Quelle: Azure

AWS Device Farm now supports Appium Node.js and Appium Ruby.

You can now run Appium tests written in Ruby or Node.js against your native, hybrid, and browser-based applications on AWS Device Farm. Device Farm supports tests written in all popular JavaScript and Ruby frameworks, such as Mocha and RSpec. You can also specify your project's dependencies as well as the exact commands to run during testing, ensuring that your tests execute exactly as they do in your local environment.
AWS Device Farm is an app testing service that lets you run automated tests and interact with your Android, iOS, and web applications on real devices. Device Farm supports running automated tests written in most popular test frameworks, such as Espresso, XCTest, Appium Python, and Appium Java. Starting today, you can use Device Farm to run your tests written in Appium Node.js and Appium Ruby on real devices. These frameworks let you customize every step of the test process through a simple configuration file.
To learn more about how to use Appium Node.js for Android on AWS Device Farm, please read our documentation.
For more information about AWS Device Farm, visit the product page.
Source: aws.amazon.com
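The "dependencies plus exact commands in a simple configuration file" idea can be sketched as phased command lists. The phase names below are inspired by Device Farm's custom test environment concept, but the structure is illustrative, not the exact spec-file schema.

```python
# Illustrative sketch of a Device Farm-style test spec: project dependencies
# and the exact commands to run, grouped into phases that execute in a fixed
# order. Keys and commands are illustrative, not the exact schema.

test_spec = {
    "phases": {
        "install": ["npm install"],                   # project dependencies
        "pre_test": ["npm run lint"],
        "test": ["npx mocha spec/ --reporter spec"],  # exact test command
        "post_test": ["npm run report"],
    }
}

def planned_commands(spec: dict) -> list:
    """Flatten the phases in their fixed execution order."""
    order = ["install", "pre_test", "test", "post_test"]
    return [cmd for phase in order for cmd in spec["phases"].get(phase, [])]

for cmd in planned_commands(test_spec):
    print(cmd)
```

Because the same commands run locally and in the device cloud, test behavior stays consistent between the two environments.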

AI is the new normal: Recap of 2018

The year 2018 was a banner year for Azure AI, as over a million Azure developers, customers, and partners engaged in the conversation on digital transformation. The next generation of AI capabilities is now infused across Microsoft products and services, including AI capabilities for Power BI.

Here are the top 10 Azure AI highlights from 2018, across AI Services, tools and frameworks, and infrastructure at a glance:

AI services

1. Azure Machine Learning (AML) service with new automated machine learning capabilities.

2. Historical milestones in Cognitive Services including unified Speech service.

3. Microsoft is first to enable Cognitive Services in containers.

4. Cognitive Search and basketball

5. Bot Framework v4 SDK, offering broader language support (C#, Python, Java, and JavaScript) and extensibility models.

AI tools and frameworks

6. Data science features in Visual Studio Code.

7. Open Neural Network Exchange (ONNX) runtime is now open source.

8. ML.NET and AI Platform for Windows developers.

AI infrastructure

9. Azure Databricks

10. Project Brainwave, integrated with AML.

With so many exciting developments, why are these moments the highlights? Read on as this blog explains the importance of each.

AI services

These services span pre-built AI capabilities such as Azure Cognitive Services and Cognitive Search, Conversational AI with Azure Bot Service, and custom AI development with Azure Machine Learning (AML).

1. Azure Machine Learning

At Microsoft Connect, the Azure Machine Learning (AML) service with new automated machine learning (automated ML) capabilities became available. With AML, data scientists and developers can quickly and easily build, train, and deploy machine learning models anywhere, from the intelligent cloud to the intelligent edge. Once a model is developed, organizations can deploy and manage their models in the cloud and on the edge, including on IoT devices, with integrated CI/CD tooling.

To learn more, read our announcement blog, “Announcing the general availability of Azure Machine Learning service.”

Few people know the story behind how automated ML came to be. It all started in a gene-editing lab in 2016.

Dr. Nicolo Fusi, a machine learning researcher at Microsoft, encountered a problem while working on CRISPR.ML, a project applying machine learning to the CRISPR gene-editing technology. He tried to use machine learning to predict the best way to edit a gene, but his model contained thousands of hyperparameters, making it too difficult and time consuming to optimize with existing methods. Then Dr. Fusi had a breakthrough idea: why not apply the same approach and algorithms used for recommending movies and products to the problem of model optimization? The result is a recommendation system for machine learning pipelines. The approach combines ideas from collaborative filtering and Bayesian optimization to identify promising machine learning pipelines, allowing data scientists and developers to automate model selection and hyperparameter tuning.
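The collaborative-filtering intuition can be shown with a toy: treat observed pipeline accuracies on datasets like user ratings of movies, factorize the sparse matrix, and predict scores for pipelines that were never tried. This pure-Python sketch is a teaching illustration only, not the production algorithm (which additionally uses Bayesian optimization), and all numbers are made up.

```python
# Toy illustration of the collaborative-filtering idea behind automated ML:
# factorize a sparse "pipeline x dataset" performance matrix so scores for
# untried pipelines can be predicted. All values are invented for the demo.
import random

random.seed(0)

# Observed accuracy of pipeline i on dataset j (None = never tried).
R = [[0.81, 0.62, None],
     [0.79, None, 0.71],
     [None, 0.58, 0.66]]

K, steps, lr = 2, 2000, 0.05  # latent dims, SGD epochs, learning rate
P = [[random.random() for _ in range(K)] for _ in range(len(R))]
D = [[random.random() for _ in range(K)] for _ in range(len(R[0]))]

def pred(i, j):
    """Predicted score = dot product of pipeline and dataset factors."""
    return sum(P[i][k] * D[j][k] for k in range(K))

for _ in range(steps):
    for i, row in enumerate(R):
        for j, r in enumerate(row):
            if r is None:
                continue
            err = r - pred(i, j)
            for k in range(K):  # gradient step on both factor matrices
                P[i][k] += lr * err * D[j][k]
                D[j][k] += lr * err * P[i][k]

# Recommend: predict how pipeline 0 would do on the dataset it never saw.
print(round(pred(0, 2), 2))
```

The factorization fits the observed cells closely and fills in the missing ones, which is exactly the "recommend a pipeline" step; automated ML then spends its compute budget on the most promising candidates.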

In this interview, Dr. Fusi gives you an inside look at how automated ML empowers decision-making and takes the tedium out of data science.

Check out the white paper, “Probabilistic Matrix Factorization for Automated Machine Learning,” to learn more.

2. New milestones for Azure Cognitive Services

Azure Cognitive Services is a collection of APIs that lets developers easily add vision, speech, language, and search capabilities to applications and machines. To date, more than 1.2 million developers use Cognitive Services.

At the Build 2018 conference, Microsoft unveiled the next wave of innovation for Cognitive Services:

New Services:

A unified Speech service, enabling developers to perform Speech to Text (speech transcription), Text to Speech (speech synthesis), and Speech Translation for providing real-time speech translation capabilities all through a single API.
A Custom Vision Service that makes it effortless to train an image recognition system by simply dragging and dropping a collection of images.
The preview of the Speech devices SDK as well as the new Speech client SDK.
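To make "a single API" concrete, here is what a text-to-speech request to the unified Speech service looks like when assembled by hand. The endpoint and header shapes follow the Speech service's REST conventions, but treat the region, voice name, and output format as assumptions; nothing is actually sent.

```python
# Sketch of a unified Speech service text-to-speech request. Endpoint shape,
# headers, and voice name follow Speech service conventions but are
# assumptions here; this only builds the payload, it does not call the API.

region = "westus"            # assumed service region
voice = "en-US-JessaNeural"  # assumed voice name

tts_endpoint = f"https://{region}.tts.speech.microsoft.com/cognitiveservices/v1"
headers = {
    "Content-Type": "application/ssml+xml",
    "X-Microsoft-OutputFormat": "riff-16khz-16bit-mono-pcm",
    "Authorization": "Bearer <access token from the token endpoint>",
}
# The request body is SSML: the same markup drives voice, language, prosody.
ssml = (
    "<speak version='1.0' xml:lang='en-US'>"
    f"<voice name='{voice}'>Hello from the unified Speech service.</voice>"
    "</speak>"
)
print(tts_endpoint)
```

Speech to Text and Speech Translation follow the same pattern: one subscription and one family of endpoints, differing mainly in path and payload rather than requiring separate services.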

Enhancements to existing services:

Updates to Video Indexer, which now automatically detects known brands in speech and visual text and can be trained to recognize custom brands.
Updates to Bing Custom Search, Custom Decision Service, and Cognitive Services Labs, with previews of emerging Cognitive Services technologies. We also announced support for the customization of neural machine translation.

For more details, read “Microsoft Empowers developers with new cognitive services capabilities."

3. Microsoft is the first company to deliver Cognitive Services in containers

In November Azure Cognitive Services containers became available in preview, making Azure the first platform with pre-built Cognitive Services that span the cloud and the edge.

To learn more, please read the technical blog, “Getting started with Azure Cognitive Services in containers."

4. Azure Cognitive Search and Basketball

Azure Cognitive Search, an AI-first approach to content understanding, became available in preview. Cognitive Search expands Azure Search with built-in cognitive skills to extract knowledge. This knowledge is then organized and stored in a search index, enabling new ways to explore the data.

Check out how the National Basketball Association (NBA) used Cognitive Search, Cognitive Services, and custom models to power rich data exploration in the //BUILD 2018 keynote.

Read “Announcing Cognitive Search: Azure Search + cognitive capabilities" for more details.

5. Bot Framework v4 SDK

With the general availability of the Bot Framework v4 SDK announced in September, developers can take advantage of broader language support: C# and JavaScript are generally available, while Python and Java are in preview. They can also take advantage of better extensibility to harness a vibrant ecosystem of pluggable components such as dialog management and machine translation. The Bot Framework also includes an emulator and a set of CLI tools to streamline the creation and management of bots and their language-understanding services. Today the service has over 340,000 users and growing.

To learn more, check out Conversational AI Updates.

AI tools and frameworks

These tools and frameworks include Visual Studio tools for AI, Azure Notebooks, Data Science VMs, Azure Machine Learning Studio, ONNX, and the AI Toolkit for Azure IoT Edge.

6. Data science features in Visual Studio Code

As of November, data science features are available in the Python extension for Visual Studio Code! With these features, developers can work with data interactively in Visual Studio Code. Whether for exploring data or for incorporating machine learning models into applications, this makes Visual Studio Code an exciting new option for those who prefer an editor for data science tasks.

Visual Studio Tools for AI provides additional details for taking advantage of these new features.

7. ONNX Runtime is now open source

ONNX Runtime is now open source. ONNX is an open format for representing machine learning models that enables developers and data scientists to use the frameworks and tools that work best for them, including PyTorch, TensorFlow, scikit-learn, and more. ONNX Runtime is the first inference engine that fully supports the ONNX specification. Users typically see a two-times improvement in performance.

At Microsoft, teams are using ONNX Runtime to improve the scoring latency and efficiency of their models. For models the teams converted to ONNX, average performance improved by two times compared to scoring in previous solutions. Leading hardware companies such as Qualcomm, Intel and NVIDIA are actively integrating their custom accelerators into ONNX Runtime.
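The core idea, a framework-neutral model format that any compliant runtime can execute, can be shown with a miniature stand-in. This is a teaching sketch, not ONNX or ONNX Runtime: the "model" is a plain dict describing a graph of operators, and the three-op "runtime" executes it node by node, which is conceptually what an inference session does.

```python
# Toy illustration of the idea behind ONNX: a model serialized as a
# framework-neutral graph of operators that any compliant runtime can
# execute. This mini "runtime" is a teaching sketch, not ONNX itself.

MODEL = {  # a linear layer followed by ReLU, expressed as an operator graph
    "inputs": ["x"],
    "nodes": [
        {"op": "MatMul", "in": ["x", "W"], "out": "h"},
        {"op": "Add",    "in": ["h", "b"], "out": "z"},
        {"op": "Relu",   "in": ["z"],      "out": "y"},
    ],
    "initializers": {"W": [[2.0], [1.0]], "b": [0.5]},  # trained weights
    "outputs": ["y"],
}

OPS = {  # the operator set a runtime must implement
    "MatMul": lambda x, w: [sum(a * b for a, b in zip(x, col))
                            for col in zip(*w)],
    "Add":    lambda v, b: [a + c for a, c in zip(v, b)],
    "Relu":   lambda v: [max(0.0, a) for a in v],
}

def run(model, feeds):
    """Execute the graph nodes in order, like an inference session."""
    env = dict(model["initializers"], **feeds)
    for node in model["nodes"]:
        env[node["out"]] = OPS[node["op"]](*(env[n] for n in node["in"]))
    return [env[name] for name in model["outputs"]]

print(run(MODEL, {"x": [1.0, 3.0]}))  # → [[5.5]]
```

Because the model is just data, a PyTorch-trained network and a TensorFlow-trained one reduce to the same graph description, and hardware vendors can accelerate the operator set without caring which framework produced the model.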

More details are available in the blog post, "ONNX Runtime is now open source.”

8. ML.NET and AI Platform for Windows Developers

Developers can access ML.NET, a new open-source, cross-platform machine learning framework. The technology behind AI features in Office and Windows has been released as a project on GitHub.

In addition, the AI Platform for Windows developers allows ONNX models to run natively on Windows-based devices.

Check out this blog post and video, “How Three Lines of Code and Windows Machine Learning Empower .NET Developers to Run AI Locally on Windows 10 Devices” for a helpful example of how to use these platforms.

AI infrastructure

This category covers Azure Data Services, compute services including Azure Kubernetes Services (AKS), and AI Silicon support including GPUs and FPGAs.

9. Azure Databricks

Azure Databricks, a fast, easy, and collaborative Apache® Spark™-based analytics platform optimized for Azure became widely available. Today, organizations benefit from Azure Databricks' native integration with other services like Azure Blob Storage, Azure Data Factory, Azure SQL Data Warehouse, and Azure Cosmos DB. This platform enables new analytics solutions that support modern data warehousing, advanced analytics, and real-time analytics scenarios.

To learn more, read “Ignite 2018 – Making AI real for your business with Azure Data.”

10. Project Brainwave, integrated with Azure Machine Learning

Microsoft showcased the preview of Project Brainwave, integrated with Azure Machine Learning. This service brings hardware-accelerated real-time inference for AI to Azure. The Project Brainwave architecture is deployed on a type of computer chip from Intel called a field programmable gate array (FPGA), which makes real-time AI calculations at a competitive cost and with the industry's lowest lag time.

In addition, customers got a sneak peek of Project Brainwave coming to the edge. This means customers can take advantage of this computing speed in their businesses and facilities, even if their systems aren't connected to a network or the Internet.

Read “Real-time AI: Microsoft announces a preview of Project Brainwave" for more details.

AI is the new normal

AI catalyzes digital transformation. Microsoft believes in making AI accessible so that developers, data scientists and enterprises can build systems that augment human ingenuity to tackle meaningful challenges.

AI is the new normal. Microsoft has more than 20 years of AI research applied to our products and services. Everyone can now access this AI through simple, yet powerful productivity tools such as Excel and Power BI.

In continual support of bringing AI to all, Microsoft introduced new AI capabilities for Power BI. These features enable all Power BI users to discover hidden, actionable insights in their data and drive better business outcomes with easy-to-use AI. No code needed to get started. Here are a few highlights:

Integration of Azure Cognitive Services.
Key driver analysis helps users understand what influences key business metrics.
Create machine learning models directly in Power BI using automated ML.
Seamless integration of Azure Machine Learning within Power BI.

Moving forward into 2019

Many thanks to you, our customers, MVPs, developers, and partners, for being a part of Microsoft’s journey to empower businesses to build globally scalable AI applications. A new year is on the way, and the possibilities are endless. We can’t wait to share what we have in store for you in 2019 and to see what you will build with Azure this upcoming year. Happy New Year from the Azure AI team!
Quelle: Azure

Building Google's Game of the Year with Cloud Text-to-Speech and App Engine

At the end of every year, we take a look at Google Search trends, culminating in our annual Year in Search film. This year, we decided to also build Game of the Year, the first quiz game based on Google Search trends. We thought it would be fun to bring the trends to life, and we wanted to experiment a bit with our own technology. You can see here what the game is all about.

To build the game, we used Google Cloud technologies and WaveNet, which is a deep neural network that generates raw audio waveforms. Here’s how we did it.

Bringing the game to life with Cloud Text-to-Speech, WaveNet and SSML

Months before we built anything for production, our designers, writers, and developers here on the Brand Studio team worked on varying game ideas and prototypes centered around the year’s Search trends data. A key feature of these early prototypes was Cloud Text-to-Speech. From the beginning, we wanted to take advantage of its ability to personalize any statement with a user’s name on the fly using a natural-sounding voice. This feature let us develop our “host,” a delightful feature and core part of the game.

From a practical perspective, using Cloud Text-to-Speech also significantly reduced production overhead. We could change copy easily without needing a voice actor to re-record every time we added or changed a question or answer. It also allows us to easily scale if we decide to add new questions to the game or translate it to other languages.

As part of our early prototypes, we also played with several WaveNet voices. Its ability to sound out everything from awkward brand names to difficult-to-pronounce celebrity names was uncanny—and especially important given that some of 2018’s Search trends aren’t exactly standard words you find in the dictionary. We also explored Speech Synthesis Markup Language (SSML), which lets you tailor WaveNet’s speech by modifying inflection, emphasis, timing, and other very granular speech parameters.

We used SSML mostly in our initial demos to make even more natural-sounding speech. Because our final product underwent frequent content updates, we couldn’t take advantage of SSML as much as we would have liked by launch time. Fortunately, we found the default speech synthesis to be pretty impressive as is. We were pleasantly surprised when the WaveNet model pronounced certain strings like “Givenchy” (jzhiv-on-shee) as intended. Other interpretations did not quite work as we had hoped (see: Go…o.o.o.o.o.o.o.o.o.o…al), but were humorous enough to keep in the final build.

Finding the right audio balance

Our initial prototypes showcased all of the possible accents, languages, and genders available in Cloud Text-to-Speech. In some iterations we used the voice primarily as a source of comic relief in between questions, such as by ribbing the player for getting a wrong answer, or incorporating terrible puns after some questions. While fun to listen to, we realized we needed to strike the right balance between humorous audio commentary and unobtrusive gameplay. In the end, it felt more natural to have the host read the questions and answer selections like an actual game show host would, and to develop the host’s “character” via clever writing. Limiting the host to speaking only the written questions and facts also meant that those not using the audio experience wouldn’t miss any of the fun dialogue or receive a lesser game experience.

The amount of dialogue was also important in calculating the necessary API quota. Exceeding the quota causes the host to remain silent on subsequent play-throughs of the game that day, as the API returns an appropriate “quota exceeded” error. We worked with the Cloud Text-to-Speech team to estimate queries per minute and characters per minute based on expected traffic and the length and frequency of each spoken phrase.

In order to avoid issues in the event that the game did exceed our quota, we wrote in a simple check to disable the host’s voice and talking animation if any client or server errors were returned by the API. This allows the game to continue seamlessly for users with the music and sound effects only.

Though we ended up narrowing down the host’s voice to only two options (one male and one female), which are randomized at the start, users can customize those voices in-game by changing the speed and pitch on the intro page, as shown below. We decided to limit those ranges to avoid unintended audio-timing bugs that appeared with extreme changes to the voice speed—for example, the host talking too slowly to finish speaking before the next line of dialogue begins. We hope that users find this balance of audio features as delightful as we do!

Building the game at scale

We built the game on App Engine to take advantage of Google Cloud’s ability to quickly scale based on traffic, its developer-friendly environment, access management, easy deployment and versioning, and API management tools. The game is a single-page Angular app, which is statically served and front-end-cached to reduce latency, and integrates the Cloud Text-to-Speech API, Matter.js for physics, Hammer.js for touch gestures, and Tween.js for animation. To easily scale and maintain the content, we used an internally built content management system to store and edit the questions, answers, fun facts, and images used throughout the game.

The Cloud Text-to-Speech API integrated seamlessly into the game’s build, creating a smooth, natural audio experience across all supported platforms. Knowing how easily we can include this technology in our applications opens a lot of doors to enhance future projects in delightfully unexpected ways. We’re equally excited to see what other developers come up with using this awesome piece of technology.

Give Game of the Year a shot and find out how well you know the trends of 2018.
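The graceful-degradation check described above can be sketched in a few lines. The status codes and function names below are hypothetical stand-ins; no real Text-to-Speech call is made, and the 429 response simply simulates a quota-exceeded error.

```python
# Sketch of the graceful-degradation check described above: if the
# text-to-speech call returns any client or server error (including quota
# exceeded), mute the host and let music and sound effects carry the game.
# The synthesize function is a hypothetical stand-in; no real API is called.

def synthesize_speech(text: str) -> tuple:
    """Stand-in for a Cloud Text-to-Speech call (hypothetical)."""
    return 429, b""  # simulate a 'quota exceeded' response

class GameAudio:
    def __init__(self):
        self.host_voice_enabled = True

    def speak(self, line: str) -> bytes:
        status, audio = synthesize_speech(line)
        if status >= 400:                    # any client or server error
            self.host_voice_enabled = False  # also hides talking animation
            return b""                       # game continues with music/SFX
        return audio

audio = GameAudio()
audio.speak("Welcome to Game of the Year!")
print(audio.host_voice_enabled)  # → False
```

Failing silent rather than failing loud keeps the gameplay loop intact for every player, whether or not the voice feature is available that day.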
Quelle: Google Cloud Platform

Azure.Source – Volume 65

Now generally available

Announcing the general availability of Azure Data Box Disk

Azure Data Box Disk, an SSD-based solution for offline data transfer to Azure, is now generally available in the US, EU, Canada, and Australia, with more countries/regions to be added over time. Each disk is an 8 TB SSD that can copy data at up to USB 3.1 speeds and supports the SATA II and III interfaces. The disks are encrypted using 128-bit AES encryption and can be locked with your custom passkeys. In addition, check out the end of this post for an announcement about the public preview of Blob Storage on Azure Data Box. When this feature is enabled, you will be able to copy data to Blob Storage on Data Box using blob service REST APIs.

New year, newly available IoT Hub Device Provisioning Service features

The following Azure IoT Hub Device Provisioning Service features are now generally available: Symmetric key attestation support; Re-provisioning support; Enrollment-level allocation rules; and Custom allocation logic. The IoT Hub Device Provisioning Service is a helper service for IoT Hub that enables zero-touch, just-in-time provisioning to the right IoT hub without requiring human intervention, enabling you to provision millions of devices in a secure and scalable manner. All features are available in all provisioning service regions, through the Azure portal, and the SDKs will support these new features by the end of January 2019 (with the exception of the Python SDK).

News and updates

Cognitive Services Speech SDK 1.2 – December update – Python, Node.js/NPM and other improvements

Developers can now access the latest improvements to the Cognitive Services Speech Service, including a new Python API and more. This post covers what’s new: the Python API for the Speech Service, Node.js support, Linux support, a lightweight SDK for greater performance, control of server connectivity and connection status, and audio file buffering for unlimited audio session length. Support for ProGuard during Android APK generation is also now available.

New Azure Migrate and Azure Site Recovery enhancements for cloud migration

This post covers some of the new features added to Microsoft Azure Migrate and Azure Site Recovery that will help you in your lift-and-shift migration journey to Azure. Azure Migrate enables you to discover your on-premises environment and plan your migration to Azure. Based on popular demand, we enabled Azure Migrate in two new geographies, Azure Government and Europe, and we will enable support for other Azure geographies in the future. Azure Site Recovery (ASR) helps you migrate your on-premises virtual machines (VMs) to IaaS VMs in Azure (the lift-and-shift migration), and now includes: support for physical servers with UEFI boot type, Linux disk support, and migration from anywhere (public or private clouds).

Additional updates for migration support and Azure Site Recovery:

Support for SQL to Azure SQL Database online migrations
Support for MySQL to Azure Database for MySQL online migrations
Support for PostgreSQL to Azure Database for PostgreSQL online migrations
Azure Site Recovery – Update Rollup 32

Streamlined development experience with Azure Blockchain Workbench 1.6.0

Azure Blockchain Workbench 1.6.0 is now available and includes new features such as application versioning, updated messaging, and streamlined smart contract development. You can deploy a new instance of Workbench through the Azure portal or upgrade existing deployments to 1.6.0 using an upgrade script. Be advised that this release does include some breaking changes, so check the blog post for details. In addition, information for the latest updates is available from within the Workbench UI.

New smart device security research: Consumers call on manufacturers to do more

To better understand how the allure of smart device experiences stacks up against the concern for security, and whom consumers hold responsible for securing smart devices, we partnered with Greenberg Strategy, a consumer research firm, to poll more than 3,000 people in the US, UK, and Germany. The research showed that more than 90% of people expect manufacturers to do more for device security, and most people will avoid brands that have public breaches. Security is the top consideration for consumers thinking of buying a device, and consumers are willing to pay more for highly secured devices. See the blog post for a detailed infographic that outlines the details of the research. Note that devices built with Azure Sphere always maintain their security health through a combination of secured hardware, a secured OS, and cloud security that provides automated software updates.

Multi-modal topic inferencing from videos

Azure Video Indexer is a cloud application built on Azure Media Analytics, Azure Search, and Cognitive Services. It enables you to extract insights from your videos using Video Indexer models. Multi-modal topic inferencing in Video Indexer is a new capability that can intuitively index media content using a cross-channel model to automatically infer topics. The model does so by projecting the video concepts onto three different ontologies – IPTC, Wikipedia, and the Video Indexer hierarchical topic ontology. Video Indexer’s topic model empowers media users to categorize their content using an intuitive methodology and optimize their content discovery. Multi-modality is a key ingredient for recognizing high-level concepts in video.

The January release of Azure Data Studio

Azure Data Studio (formerly known as SQL Operations Studio) is a new cross-platform desktop environment for data professionals using the family of on-premises and cloud data platforms (such as SQL Server, Azure SQL DB, and Azure SQL Data Warehouse) on Windows, macOS, and Linux. The January release includes: Azure Active Directory Authentication support; Data-Tier Application Wizard support; IDERA SQL DM Performance Insights (Preview); Updates to the SQL Server 2019 Preview extension; SQL Server Profiler improvements; results streaming for large queries (Preview); User setup installation support; and various bug fixes.

Additional updates

Azure Sphere: Update to the 18.11 release
Azure Sphere – Anatomy of a secured MCU
Final reminder: OMS portal moving to the Azure portal
Additional compute levels added to vCore-based Azure SQL databases and elastic pools

Technical content

To infinity and beyond: The definitive guide to scaling 10k VMs on Azure

Every platform has limits: workstations and physical servers have resource boundaries, APIs may be rate-limited, and even the perceived endlessness of the virtual public cloud enforces limitations that protect the platform from overuse or misuse. Sometimes, however, a scenario takes a platform to its extreme, those limits become real, and thought must be put into overcoming them. Solving this challenge must not only take into account the limitations and thresholds applied near the edge of the cloud platform’s capabilities, but also optimize cost, performance, and usability. Buzz is a scaling platform that uses Azure Virtual Machine Scale Sets (VMSS) to scale beyond the limits of a single set and enables hyper-scale stress tests, DDoS simulators, and HPC use cases. Buzz orchestrates a number of Azure components to manage high-scale clusters of VMs running and performing the same actions, such as generating load on an endpoint.
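The core trick of scaling beyond a single scale set can be sketched as a simple capacity partition: split the desired VM count across as many sets as the per-set cap requires. The 1,000-instance cap below is an assumption for illustration (actual VMSS limits vary by image type and configuration):

```python
def partition_capacity(total_vms: int, max_per_set: int = 1000) -> list[int]:
    """Split a desired VM count into per-scale-set capacities.

    Azure caps the number of instances in a single scale set, so a
    platform like Buzz must spread a large deployment over many sets.
    """
    if total_vms < 0 or max_per_set <= 0:
        raise ValueError("total_vms must be >= 0 and max_per_set > 0")
    full_sets, remainder = divmod(total_vms, max_per_set)
    sizes = [max_per_set] * full_sets
    if remainder:
        sizes.append(remainder)
    return sizes

# 10,000 VMs with a 1,000-instance cap -> ten sets of 1,000 each
print(partition_capacity(10_000))  # → [1000, 1000, 1000, 1000, 1000, 1000, 1000, 1000, 1000, 1000]
```

An orchestrator would then create one scale set per entry and fan out the same workload definition to each.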

Teradata to Azure SQL Data Warehouse migration guide

With the increasing benefits of cloud-based data warehouses, there has been a surge in the number of customers migrating from their traditional on-premises data warehouses to the cloud. Teradata is a relational database management system and is one of the legacy on-premises systems from which customers are looking to migrate. This post introduces a technical white paper that gives insight into how to approach a Teradata to Azure SQL Data Warehouse migration. It is broken into sections that detail the migration phases: the preparation required for data migration (including schema migration), migration of the business logic, the data migration approach itself, and the testing strategy.

5 Microsoft Learn Modules for Getting Started with Azure

In this quick read, Ari Bornstein shares his top five recommendations for getting up to speed with Azure Services to help you navigate through fundamentals, storing data, deploying to the cloud, administering containers, and using serverless APIs.

Performance troubleshooting using new Azure Database for PostgreSQL features

At Ignite 2018, Microsoft’s Azure Database for PostgreSQL announced the preview of Query Store (QS), Query Performance Insight (QPI), and Performance Recommendations (PR) to help ease performance troubleshooting, in response to customer feedback. This post builds on a previous post (Performance best practices for using Azure Database for PostgreSQL) to show how you can use these recently announced features to troubleshoot some common scenarios.

Questions on data residency and compliance in Microsoft Azure? We got answers!

Transparency and control are essential to establishing and maintaining trust in cloud technology, while restricted and regulated industries have additional requirements for risk management and to ensure ongoing compliance. To address this, Microsoft provides an industry-leading security and compliance portfolio. See this post for a link to the white paper, Achieving Compliant Data Residency and Security with Azure. This paper provides guidance about the security, data residency, data flows, and compliance aspects of Azure. It is designed to help you ensure that your data on Microsoft Azure is handled in a way that meets data protection, regulatory, and sovereignty requirements.

Best practices for alerting on metrics with Azure Database for MariaDB monitoring

Whether you are a developer, a database analyst, a site reliability engineer, or a DevOps professional at your company, monitoring databases is an important part of maintaining the reliability, availability, and performance of your MariaDB server. This post provides guidance and best practices for alerting on the most commonly monitored metrics for MariaDB and areas you can consider improving based on these various metrics.

Azure shows

The Azure Podcast | Episode 261 – Outage Communications

Kendall, Cale, and Evan talk to Sami Kubba, a Senior PM Lead in the Azure CXP org, about how they handle communications of outages and other issues in Azure. It's a great insight into what goes on behind the scenes to maintain full transparency into the workings of Azure.

Global real-time multi-user apps with Azure Cosmos DB | Azure Friday

Chris Anderson joins Donovan Brown to discuss how to use Azure Cosmos DB and other great Azure services to build a highly-scalable, real-time, collaborative application. You'll see techniques for using the Azure Cosmos DB change feed in both Azure Functions and SignalR applications. We also briefly touch on how custom authentication works with Azure Functions.
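A minimal sketch of the change-feed-to-SignalR pattern described here, with the transformation shown as a standalone function. The `itemChanged` target name and document shape are hypothetical; in the real pattern, an Azure Functions Cosmos DB trigger would receive the batch of changed documents and a SignalR output binding would push each message to connected clients:

```python
def handle_change_feed(documents: list[dict]) -> list[dict]:
    """Turn a batch of changed Cosmos DB documents into broadcast messages.

    Each changed document becomes one message naming the client-side
    handler ("target") and carrying the document as its payload.
    """
    messages = []
    for doc in documents:
        messages.append({
            "target": "itemChanged",  # hypothetical client-side handler name
            "arguments": [{"id": doc.get("id"), "data": doc}],
        })
    return messages

# A sample change-feed batch with one updated document:
sample = [{"id": "42", "score": 7}]
print(handle_change_feed(sample))
```

Keeping the transformation separate from the bindings makes it easy to unit-test the fan-out logic without a live Cosmos DB account or SignalR service.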

What’s New? A Single Key for Cognitive Services | AI Show

In this video we will talk about the work we are doing to simplify the use of Cognitive Services in your applications. We now have a single key, which eliminates having to reference and manage many keys per service for a single application.

Azure IoT Microsoft Professional Program | Internet of Things Show

Accelerate your career in one of the fastest-growing cloud technology fields: IoT. This program will teach you the device programming, data analytics, machine learning, and solution design skills you need for a successful career in IoT. Learn the skills necessary to start or progress a career working on a team that implements IoT solutions.

Consensus in Private Blockchains | Block Talk

This episode provides a review of consensus algorithms used primarily for consortium-based deployments. These include the popular Proof of Authority, Proof of Work, and a variant of BFT. The core concepts of the algorithms are introduced, along with a demonstration of using the popular Geth client to provision a PoA-based network and of how the consensus algorithm can be chosen at blockchain creation time, demonstrating pluggable consensus.

TWC9: Unlimited Free Private GitHub Repos, Python in Azure App Service, CES Highlights and more

This Week on Channel 9, Christina Warren reports on the latest developer news.

How to add logic to your Testing in Production sites with PowerShell | Azure Tips and Tricks

Learn how to add additional logic by using PowerShell to automatically distribute the load between your production and deployment slot sites with the Testing in Production feature.
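The slot-routing behavior described here boils down to a weighted coin flip per request: a configured percentage of traffic goes to the deployment slot, the rest to production. A minimal sketch of that decision logic (not the Azure PowerShell/ARM API itself, and the slot names are illustrative):

```python
import random

def pick_slot(routing_percentage: float, rng: random.Random) -> str:
    """Decide whether a single request goes to the staging slot.

    Testing in Production routes a configured percentage of traffic to
    a deployment slot; this mimics that weighted choice per request.
    """
    if not 0 <= routing_percentage <= 100:
        raise ValueError("percentage must be between 0 and 100")
    return "staging" if rng.random() * 100 < routing_percentage else "production"

# With 10% routing, roughly one request in ten lands on the slot:
rng = random.Random(0)
picks = [pick_slot(10, rng) for _ in range(1000)]
print(picks.count("staging"))
```

Automating this with a script then amounts to adjusting the percentage over time, for example ramping it up as the deployment slot proves healthy.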

How to work with connectors in Azure Logic Apps | Azure Tips and Tricks

Learn how to work with connectors in Azure Logic Apps. Azure Logic Apps has a collection of connectors that you can use to integrate with third-party services, such as the Twitter connector.

Learn about Serverless technology in Azure Government

Steve Michelotti, Principal Program Manager on the Azure Government team, sits down with Yujin Hong, Program Manager on the Azure Government Engineering team, to discuss serverless computing in Azure Government.

Azure DevOps Podcast | Aaron Palermo on Cybersecurity and SDP – Episode 018

Jeffrey Palermo interviews his older brother, Aaron Palermo. Aaron is a DevOps engineer, solution architect, and all-around cybersecurity expert. This episode is jam-packed with incredibly useful information applicable to software developers — but also anybody who has a Wi-Fi network. Stay tuned to hear how an SDP replaces a VPN, Aaron’s recommendations on how people can fully protect themselves online, which state-of-the-art multi-factor authentication people should be using, how to keep your data safe and protect against Wi-Fi vulnerabilities, and more.

Events

CES 2019: Microsoft partners, customers showcase breakthrough innovation with Azure IoT, AI, and Mixed Reality

We are continuing to see great momentum for Azure IoT and Azure AI for connected devices and experiences, and new partners and customers choosing Azure IoT and Azure AI to accelerate their business. From connected home products to connected car experiences, check out this post for a few examples from CES 2019 in Las Vegas. Then take a look at a couple of examples that demonstrate innovation for immersive, secured digital experiences.

CES 2019: The rise of AI in automotive

CES 2019 was the perfect venue to demonstrate how our customers and partners are enhancing their connected vehicle, autonomous vehicle, and smart mobility strategies using the power of the Microsoft intelligent cloud, intelligent edge, and AI capabilities. This post covers just a few examples of the innovative work that is happening today. As AI takes on more and more roles across the automotive ecosystem, it is inspiring to imagine the transformational possibilities that lie ahead for our industry.

Four ways to take your apps further with cloud, data, and AI solutions with Microsoft

Companies today demand the latest innovations for every solution they deliver. How can you make sure your infrastructure and data estate keep up with the demands of your business? Read this post for four tips on transforming your business with a modern data estate. Then register to attend a free webinar on Thursday, January 24, to learn more about the new features and products that can help you optimize value and overcome challenges in modernizing your data estate.

Customers, partners, and industries

Implement predictive analytics for manufacturing with Symphony Industrial AI

Symphony Industrial AI has a mission: to bring the promise of Industrial IoT and AI to reality by delivering real value to their customers through predictive operations solutions. Two of Symphony's solutions are specially tailored to the process manufacturing sector (chemicals, refining, pulp and paper, metals and mining, and oil and gas). Check this post to learn about both: Asset 360 AI and Process 360 AI.

Gain insight into your Azure Cosmos DB data with QlikView and Qlik Sense

Connecting data from various sources in a unified view can produce valuable insights that are otherwise invisible to the human eye and brain. As Azure Cosmos DB allows for collecting data from various sources in various formats, the ability to mix and match this data becomes even more important for empowering your business with additional knowledge and intelligence. This is what Qlik’s analytics and visualization products, QlikView and Qlik Sense, have been able to do for years, and now they support Azure Cosmos DB as a first-class data source. Qlik Sense and QlikView are data visualization tools that combine the data from different sources into a single view.

Microsoft Azure-powered Opti platform helps Atlanta prevent flooding

The City of Atlanta Department of Watershed Management will use the Opti platform to prevent flooding by making a retention pond at a local park more efficient. Microsoft CityNext partners with Opti to prevent flooding in Atlanta and other cities, and is helping cities around the world become more competitive, sustainable, and prosperous. With partners like Opti, Microsoft is working with cities to engage their citizens, empower city employees, optimize city operations and infrastructure, and transform to accelerate innovation and opportunity. The CityNext portfolio organizes solution categories across five broad functional areas: Digital Cities, Educated Cities, Healthier Cities, Safer Cities, and Sustainable Cities.

3 ways AI can help retailers stay relevant

Microsoft recently partnered with Blue Yonder, a JDA company, to survey retailers everywhere on how they are adapting to the rapidly evolving retail market by using new technologies. The findings show that as retailers face new challenges in customer loyalty, online competition, and changing consumer expectations, they are more committed than ever to investing in technologies like the cloud and artificial intelligence (AI). Check out this post for three ways AI can help retailers survive in an unpredictable market. If you’re attending NRF this week, drop by the Microsoft booth to visit with JDA and learn more about price optimization solutions.

How Microsoft AI empowers transformation in your industry

AI presents incredible opportunities for organizations to change the way they do business. With 1,000 researchers—including winners of the Turing Award and Fields Medal—in 11 labs, Microsoft has established itself as a leader in AI through its dogged focus on innovation, empowerment, and ethics. Now, the groundbreaking capabilities of AI can move beyond the lab to make a positive impact on every enterprise, every industry. As Microsoft continues to research AI and incorporate its capabilities into the technologies of everyday life, it also remains committed to an ethical future. Microsoft has identified six principles—fairness, reliability and safety, privacy and security, inclusivity, transparency, and accountability—to guide the development and use of artificial intelligence so technology reflects the diversity of those who use it. In the end, it’s less about what AI can do than what people can do with AI. Visit this post to download the white paper, Microsoft’s vision for AI in the enterprise.

A Cloud Guru’s Azure This Week – 11 January 2019

This time on Azure This Week, Lars Klint talks about the definitive guide to scaling 10k VMs on Azure, the Teradata to Azure SQL Data Warehouse migration guide, and using QlikView and Qlik Sense with Azure Cosmos DB.

Source: Azure