Announcing the general availability of Azure Lab Services

Today, we are very excited to announce the general availability of Azure Lab Services – your computer labs in the cloud.

With Azure Lab Services, you can easily set up and provide on-demand access to preconfigured virtual machines (VMs) to teach a class, train professionals, run hackathons or hands-on labs, and more. Simply input what you need in a lab and let the service roll it out to your audience. Your users go to a single place to access all their VMs across multiple labs, and connect from there to learn, explore, and innovate.

Since our preview announcement, many customers have used the service to conduct classes, training sessions, boot camps, hands-on labs, and more. For classroom or professional training, you can provide students with a lab of virtual machines configured with exactly what you need for the class, and give each student a set number of hours to use the VMs for homework or personal projects. You can run a hackathon or a hands-on lab at conferences or events and scale up to hundreds of virtual machines for your attendees. You can also create an invite-only private lab of virtual machines with your prerelease software installed, giving preview customers access to early trials, or set up interactive sales demos.

Top three reasons customers use Azure Lab Services

Automatic management of Azure infrastructure and scale

Azure Lab Services is a managed service, which means that provisioning and management of a lab’s underlying infrastructure is handled automatically by the service. You can just focus on preparing the right lab experience for your users. Let the service handle the rest and roll out your lab’s virtual machines to your audience. Scale your lab to hundreds of virtual machines with a single click.

Simple experience for your lab users

Users who are invited to your lab get immediate access to the resources you give them inside your labs. They just need to sign in to see the full list of virtual machines they have access to across multiple labs. They can click on a single button to connect to the virtual machines and start working. Users don’t need Azure subscriptions to use the service.

Cost optimization and tracking 

Keep your budget in check by controlling exactly how many hours your lab users can use the virtual machines. Set up schedules in the lab to allow users to use the virtual machines only during designated time slots, or set up recurring auto-shutdown and start times. Keep track of individual users' usage and set limits.
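As a rough illustration of the kind of budgeting these controls enable, the sketch below estimates monthly lab cost from a per-user quota and scheduled class hours. The quota model and hourly rate are hypothetical placeholders, not Azure Lab Services pricing; see the pricing page for real figures.

```python
def estimate_lab_cost(num_users, scheduled_hours, quota_hours, hourly_rate):
    """Estimate monthly lab cost: scheduled class time plus per-user quota hours.

    All parameters are illustrative; actual Azure Lab Services billing is
    described on the official pricing page.
    """
    total_hours = num_users * (scheduled_hours + quota_hours)
    return total_hours * hourly_rate

# 30 students, 12 scheduled class hours, a 10-hour personal quota each,
# at a hypothetical rate of $0.42 per VM-hour
cost = estimate_lab_cost(30, 12, 10, 0.42)
print(f"Estimated monthly cost: ${cost:.2f}")
```

Capping the quota hours in the policy directly bounds the worst-case bill, which is the point of the usage limits described above.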

Get started now

Try Azure Lab Services today! Get started by creating a lab account for your organization or team. All labs are managed under a lab account. You can give permissions to people in your organization to create labs in your lab account.

To learn more, visit the Azure Lab Services documentation. Ask any questions you have on Stack Overflow. Last of all, don’t forget to subscribe to our Service Updates and view other Azure Lab Services posts on the Azure blog to get the latest news.

General availability pricing

Azure Lab Services GA pricing goes into effect on May 1, 2019. Until then, you will continue to be billed based on the preview pricing. Please see the Azure Lab Services pricing page for complete details.

What’s next

We continue to listen to our customers to prioritize and ship new features and updates. Several key features will be enabled in the coming months:

Ability to reuse and share custom virtual machine images across labs
Feature to enable connections between a lab and on-premises resources
Ability to create GPU virtual machines inside the labs

We always welcome any feedback and suggestions. You can make suggestions or vote on priorities on our UserVoice feedback forum.
Source: Azure

Latest enhancements now available for Cognitive Services' Computer Vision

This blog was co-authored by Lei Zhang, Principal Research Manager, Computer Vision

You can now extract more insights and unlock new workflows from your images with the latest enhancements to Cognitive Services’ Computer Vision service.

1. Enrich insights with expanded tagging vocabulary

Computer Vision has more than doubled the types of objects, situations, and actions it can recognize per image.

[Images: example tagging results, before and after the vocabulary expansion]

2. Automate cropping with new object detection feature

Easily automate cropping and conduct basic counting of what you need from an image with the new object detection feature. Detect thousands of real-life or man-made objects in images. Each object is now highlighted by a bounding box denoting its location in the image.
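To make the cropping workflow concrete, here's a minimal sketch that turns one detection result into a padded crop rectangle. The `rectangle` shape with pixel `x`, `y`, `w`, `h` fields follows the Computer Vision Detect Objects response format, but the sample response below is hand-written for illustration, not live API output:

```python
def crop_box(rect, padding=10, image_w=1280, image_h=720):
    """Convert a detected bounding box into a padded (left, top, right, bottom)
    crop rectangle, clamped to the image bounds."""
    left = max(rect["x"] - padding, 0)
    top = max(rect["y"] - padding, 0)
    right = min(rect["x"] + rect["w"] + padding, image_w)
    bottom = min(rect["y"] + rect["h"] + padding, image_h)
    return (left, top, right, bottom)

# Hand-written sample in the shape of a Detect Objects response
detection = {"objects": [{"rectangle": {"x": 100, "y": 50, "w": 200, "h": 150},
                          "object": "dog", "confidence": 0.94}]}
for obj in detection["objects"]:
    print(obj["object"], crop_box(obj["rectangle"]))
```

The returned tuple can be passed straight to an image library's crop call, which is what makes the bounding boxes useful for automated cropping at scale.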

3. Monitor brand presence with new brand detection feature

You can now track logo placement of thousands of global brands from the consumer electronics, retail, manufacturing, and entertainment industries.

With these enhancements, you can:

Do at-scale image and video-frame indexing, making your media content searchable. If you’re in media, entertainment, advertising, or stock photography, rich image and video metadata can unlock productivity for your business.
Derive insights from social media and advertising campaigns by understanding the content of images and videos and detecting logos of interest at scale. Businesses like digital agencies have found this capability useful for tracking the effectiveness of advertising campaigns. For example, if your business launches an influencer campaign, you can apply Custom Vision to automatically generate brand inclusion metrics pulling from influencer-generated images and videos.

In some cases, you may need to further customize the image recognition capabilities beyond what the enhanced Computer Vision service now provides by adding specific tagging vocabulary or object types that are relevant to your use case. Custom Vision service allows you to easily customize and deploy your model without requiring machine-learning expertise.

See it in action through the Computer Vision demo. If you’re ready to start building to unlock these insights, visit our documentation pages for image tagging, object detection, and brand detection.
Source: Azure

Running Cognitive Services on Azure IoT Edge

This blog post is co-authored by Emmanuel Bertrand, Senior Program Manager, Azure IoT.

We recently announced Azure Cognitive Services in containers for Computer Vision, Face, Text Analytics, and Language Understanding. You can read more about Azure Cognitive Services containers in this blog, “Bringing AI to the edge.”

Today, we are happy to announce support for running the Azure Cognitive Services containers for Text Analytics and Language Understanding on edge devices with Azure IoT Edge. This means that all your workloads can run locally, where your data is generated, while keeping the simplicity of the cloud to manage them remotely, securely, and at scale.

Whether you lack a reliable internet connection, want to save on bandwidth costs, have very low latency requirements, or are dealing with sensitive data that must be analyzed on-site, Azure IoT Edge with the Cognitive Services containers gives you consistency with the cloud: you run your analysis on-site while keeping a single pane of glass to operate all your sites.

These container images are directly available to try as IoT Edge modules on the Azure Marketplace:

Key Phrase Extraction extracts key talking points and highlights in English, German, Spanish, or Japanese text.
Language Detection detects the natural language of text with a total of 120 languages supported.
Sentiment Analysis detects the level of positive or negative sentiment for input text using a confidence score across a variety of languages.
Language Understanding applies custom machine learning intelligence to a user’s conversational and natural language text to predict overall meaning and pull out relevant and detailed information.

Please note, the Face and Recognize Text containers are still gated behind a preview and are not yet available via the marketplace. However, you can deploy them manually by first signing up for the preview to get access.

In this blog, we describe how to provision the Language Detection container on your edge device and how to manage it through Azure IoT Hub.

Set up an IoT Edge device and its IoT Hub

Follow the first steps in this quick-start for setting up your IoT Edge device and your IoT Hub.

It first walks you through creating an IoT Hub and then registering an IoT Edge device to your IoT Hub. Here is a screenshot of a newly created edge device called “LanguageDetection” under the IoT Hub called “CSContainers”. Select the device, copy its primary connection string, and save it for later.

Next, it guides you through setting up the IoT Edge device. If you don’t have a physical edge device, it is recommended to deploy the Ubuntu Server 16.04 LTS and Azure IoT Edge runtime virtual machine (VM) which is available on the Azure Marketplace. It is an Azure Virtual Machine that comes with IoT Edge pre-installed.

The last step is to connect your IoT Edge device to your IoT Hub using the connection string you saved earlier. To do that, edit the device configuration file at /etc/iotedge/config.yaml and update the connection string. After the connection string is updated, restart the edge device with sudo systemctl restart iotedge.
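The relevant portion of /etc/iotedge/config.yaml looks roughly like this. This is a sketch of the manual-provisioning section only; the hub and device names reuse the example values from above, and you should paste your own device connection string:

```yaml
# /etc/iotedge/config.yaml -- manual provisioning with a device connection string
provisioning:
  source: "manual"
  device_connection_string: "HostName=CSContainers.azure-devices.net;DeviceId=LanguageDetection;SharedAccessKey=<key>"
```

Save the file, then restart the daemon with sudo systemctl restart iotedge for the change to take effect.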

Provisioning a Cognitive Service (Language Detection IoT Edge module)

The images are directly available as IoT Edge modules from the Azure Marketplace.

Here we’re using the Language Detection image as an example; the other images work the same way. To download the image, search for it and select Get it now. This will take you to the Azure portal's “Target Devices for IoT Edge Module” page. Select the subscription that contains your IoT Hub, select Find Device and choose your IoT Edge device, then click the Select and Create buttons.

Configuring your Cognitive Service

Now you’re almost ready to deploy the Cognitive Service to your IoT Edge device. To run the container, you need a valid API key and billing endpoint, which you pass as environment variables in the module details.

Go to the Azure portal and open the Cognitive Services blade. If you don’t have a Cognitive Service that matches the container, in this case a Text Analytics service, then select Add and create one. Once you have a Cognitive Service, get the endpoint and API key; you’ll need these to fire up the container:

The endpoint is strictly used for billing only, no customer data ever flows that way. Copy your billing endpoint value to the “billing” environment variable and copy your API key value to the “apikey” environment variable.
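In the module's container create options, those values are passed as environment variables. A sketch of the relevant fragment is below; the `Eula`, `Billing`, and `ApiKey` variable names follow the Cognitive Services container documentation, and the endpoint URL and key are placeholders to substitute with your own values:

```json
{
  "Env": [
    "Eula=accept",
    "Billing=https://<your-region>.api.cognitive.microsoft.com/text/analytics/v2.0",
    "ApiKey=<your Text Analytics API key>"
  ]
}
```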

Deploy the container

All required info is now filled in and you only need to complete the IoT Edge deployment. Select Next and then Submit. Verify that the deployment is happening properly by refreshing the IoT Edge device details section.

Trying it out

To try things out, we’ll make an HTTP call to the IoT Edge device that has the Cognitive Service container running.

For that, we’ll first need to make sure that port 5000 on the edge device is open. If you’re using the pre-built Ubuntu with IoT Edge Azure VM as your edge device, go to the VM details, then Settings and Networking, and add an inbound security rule to open port 5000. Also copy the public IP address of your device.

Now you should be able to query the Cognitive Service running on your IoT Edge device from any machine with a browser. Open your favorite browser and go to http://your-iot-edge-device-ip-address:5000.

Now, select Service API Description or jump directly to http://your-iot-edge-device-ip-address:5000/swagger. This will give you a detailed description of the API.

Select Try it out and then Execute; you can change the input value as you like.

The result will show up further down on the page and should look something like the following image:
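You can also skip the browser and call the container programmatically. Below is a minimal sketch using only the standard library; the `/text/analytics/v2.0/languages` route mirrors the cloud Text Analytics API that the container implements, and the device address is a placeholder you must replace:

```python
import json
import urllib.request

def build_payload(texts):
    """Build the request body shape used by the language detection endpoint."""
    return {"documents": [{"id": str(i + 1), "text": t} for i, t in enumerate(texts)]}

def detect_languages(device_ip, texts):
    """POST the documents to the container running on the edge device."""
    url = f"http://{device_ip}:5000/text/analytics/v2.0/languages"
    req = urllib.request.Request(
        url,
        data=json.dumps(build_payload(texts)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example (requires the container to be reachable at your device's IP):
# detect_languages("<your-iot-edge-device-ip>", ["Hello world", "Bonjour tout le monde"])
```

Because the container exposes the same surface as the cloud API, existing Text Analytics client code can typically be pointed at the edge device by swapping the endpoint URL.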

Next steps

You are now up and running! You are running the Cognitive Services on your own IoT Edge device, remotely managed via your central IoT Hub. You can use this setup to manage millions of devices in a secure way.

You can play around with the various Cognitive Services already available in the Azure Marketplace and try out various scenarios. Have fun!
Source: Azure

Announcing Azure Integration Service Environment for Logic Apps

A new way to integrate with resources in your virtual network

With every service, we strive to significantly improve the development experience. We’re always looking for common pain points that everyone building software in the cloud deals with. And once we find those pain points, we build best-in-class software to address the need.

In critical business scenarios, you need to have the confidence that your data is flowing between all the moving parts. The core Logic Apps offering is a great, multi-faceted service for integrating between data sources and services, but sometimes it is necessary to have a dedicated service to ensure that your integration processes are as performant as they can be. That’s why we developed the Integration Service Environment (ISE), a fully isolated integration environment.

What is an Integration Service Environment?

An Integration Service Environment is a fully isolated and dedicated environment for all enterprise-scale integration needs. When you create a new Integration Service Environment, it is injected into your Azure virtual network, which allows you to deploy Logic Apps as a service in your VNet.

Direct, secure access to your virtual network resources. Enables Logic Apps to have secure, direct access to private resources, such as virtual machines, servers, and other services in your virtual network, including Azure services with service endpoints and on-premises resources via an ExpressRoute or site-to-site VPN.
Consistent, highly reliable performance. Eliminates the noisy-neighbor issue, removing the fear of intermittent slowdowns that can impact business-critical processes, with a dedicated runtime in which only your Logic Apps execute.
Isolated, private storage. Sensitive data subject to regulation is kept private and secure, opening new integration opportunities.
Predictable pricing. Provides a fixed monthly cost for Logic Apps. Each Integration Service Environment includes free usage of one Standard Integration Account and one Enterprise connector. If your Logic Apps exceed 50 million action executions per month, the Integration Service Environment could provide better value.
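A quick way to sanity-check that break-even point is to compare consumption-based cost against a fixed monthly fee. The rates below are hypothetical placeholders, not Logic Apps pricing; consult the pricing page for real numbers:

```python
def ise_is_cheaper(monthly_action_executions, per_action_rate, ise_fixed_monthly_cost):
    """Return True when a fixed ISE fee beats pay-per-action consumption cost.

    Both rates are illustrative placeholders; see the Logic Apps pricing page.
    """
    consumption_cost = monthly_action_executions * per_action_rate
    return consumption_cost > ise_fixed_monthly_cost

# e.g. 50 million actions at a hypothetical $0.000025 each vs. a $1,000/month fixed fee
print(ise_is_cheaper(50_000_000, 0.000025, 1000))
```

Past the crossover volume, every additional action execution widens the gap in the ISE's favor, which is why high-volume workloads are the natural fit.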

Integration Service Environments are available in every region that Logic Apps is currently available in, with the exception of the following locations:

West Central US
Brazil South
Canada East

The Integration Service Environment is great for customers who require a highly reliable, private integration service for all their data and services. You can try the public preview by signing up for an Azure account. If you’re an existing customer, you can find out how to get started by visiting our documentation, “Connect to Azure virtual networks from Azure Logic Apps by using an integration service environment.”
Source: Azure

Instantly restore your Azure Virtual Machines using Azure Backup

Today, we are delighted to share the release of the Azure Backup Instant Restore capability for Azure Virtual Machines (VMs). Instant Restore helps Azure Backup customers quickly recover VMs from the snapshots stored along with the disks. In addition, users get complete flexibility in configuring the snapshot retention range at the backup-policy level, depending on the requirements and criticality of the associated virtual machines, giving users more granular control over their resources.

Key benefits

Instant recovery point: Snapshots taken as a part of the backup job are stored along with the disk and are available for recovery instantly. This eliminates the wait time for snapshots to copy to the vault before a restore can be triggered.
In-place restore capability: With Instant Restore, users can also perform an in-place restore, overwriting the data in the original disk rather than creating a copy of the disk at an alternate location. This is particularly useful when you need to roll back a patch: once the snapshot phase is done, you can use the local snapshot to restore if the patch goes bad.
Flexibility to choose the retention range for snapshots at the backup-policy level: Depending on the operational recovery requirements of VMs, the user has the flexibility to configure the snapshot retention range at the VM backup-policy level. The snapshot retention range applies to all VMs associated with the policy and can be between one and five days, with two days being the default.
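Programmatically, the retention surfaces as a property on the VM backup policy. The sketch below validates and sets it on a policy document; the `instantRpRetentionRangeInDays` property name is assumed from the Azure Backup IaaS VM policy schema, so verify it against your actual policy JSON before relying on it:

```python
def set_snapshot_retention(policy, days):
    """Set the instant-restore snapshot retention (1-5 days) on a policy dict.

    The `instantRpRetentionRangeInDays` property name is an assumption based
    on the Azure Backup IaaS VM policy schema; check your real policy JSON.
    """
    if not 1 <= days <= 5:
        raise ValueError("Snapshot retention must be between 1 and 5 days")
    policy.setdefault("properties", {})["instantRpRetentionRangeInDays"] = days
    return policy

policy = {"name": "DefaultPolicy", "properties": {}}
print(set_snapshot_retention(policy, 5))
```

The same 1-to-5-day bound applies regardless of how the policy is edited, so client-side validation like this simply fails fast before a round trip to the service.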

In addition, users get Azure Backup support for Standard SSD disks and for disks up to 4 TB in size.

How to change the snapshot retention period

We are enabling this experience starting today and rolling it out region by region. You can check the availability in your region today.

Portal:

Users can change the snapshot retention to any value between one and five days from the default value of two days.

Next steps

Learn more about the Instant Restore capability.
Learn more about Azure Backup.
Want more details? Check out the Azure Backup documentation.
Need help? Reach out to the Azure Backup forum for support.
Tell us how we can improve Azure Backup by contributing new ideas and voting up existing ones.
Follow us on Twitter @AzureBackup for the latest news and updates.

Source: Azure

Azure Stack IaaS – part two

This blog post was co-authored by David Armour, Principal Program Manager, Azure Stack.

Start with what you already have

Every organization has a unique journey to the cloud. This journey is based on the organization’s history, business specifics, culture, and maybe most importantly, their starting point. While it can be hard for some to say goodbye to their current virtualization environment and way of doing things, the journey to the cloud provides many options, features, functionalities, and opportunities to improve existing governance, operations, and implement new ones. The journey to the cloud can also provide the opportunity to redesign applications and take advantage of the cloud architecture. Additionally, Microsoft Azure gives you the option to host your virtual machines (VMs) in the public cloud or in your own facility with Azure Stack.

In most cases, this journey starts with a lift and shift of the existing servers, either virtual machines or physical servers. Because Azure Stack at its core is an infrastructure-as-a-service (IaaS) platform, the right way to think about this first phase of the journey is as a lift and optimize process. Moving the servers should be the first step towards enabling modern operations across your workloads. That could mean something as little as selecting the right size for your VMs so that you “pay for what you use,” enabling self-service by doing it yourself, automating deployments, or even building on the success of others.

What to think about when migrating

The Azure migration center provides a good model to help you start the assessment, ensure the right stakeholders are involved, and create the proper framing for your migration.

As you start this assessment, there are several factors you can use to identify the best-suited platform for your workload, whether that is Azure or Azure Stack:

Cost
Connectivity requirements
Potential regulations and data gravity requirements
High availability and regional requirements

After you complete the assessment and planning, you will need to select the right tool for the migration.

Our partner ecosystem includes ISVs that have built solutions ranging from simple migrations to “as a service” solutions. There are also Microsoft migration options that require manual steps to implement but offer a potentially lower cost.

Partner options

Azure Stack has ISV solutions for every stage of application migration, from envisioning and discovery to modernization by leveraging PaaS capabilities. Each has its own capabilities and improves the process in its own way.

Carbonite – Offers server migration, backup, high availability of Windows Servers, and enterprise protection for Microsoft Windows endpoints.
Cloudbase – Offers a migration-as-a-service solution called Coriolis which integrates with Azure Migrate and uses it for the initial assessment, as well as the VM-size mapping.

Coriolis will be available as a trial version in the Azure Stack Marketplace, offering free VM migrations to validate the process and make sure it is the right solution. 

Commvault – Complements migration, management, protection, and activation of data on Microsoft Azure Stack and other hybrid cloud infrastructure solutions. Commvault helps enterprises increase agility, reduce costs, and discover valuable insights.

Commvault is available in the Azure Stack Marketplace and it offers a 60-day free trial that can be upgraded in place to a full version.   

Corent – Offers a migration-as-a-service solution. See below for published case studies:

“Migrating a BFSI (Banking, Financial Services, and Insurance) Application to Microsoft Azure Stack using SurPaaS MaaS”

“Calligo delivers successful migration to Azure Stack using Corent Technology SurPaaS platform”

Corent Technology is offering readers of this blog a free PoC covering scanning and migration of up to five VMs. Email sales@corenttech.com with #AzureStackBlog to set up your free PoC.

ZeroDown – Provides business continuity and high availability across multiple stamps, even during a migration process.

ZeroDown isn't strictly a migration tool; instead, it offers fault tolerance and high availability for your solution, including across multiple Azure Stack stamps. See our demo of an application running across two Azure Stack stamps.

It is also available in the Azure Stack Marketplace and offers a 30-day free trial.

Microsoft migration options

The Storage Migration Service makes it easier to migrate servers to target VMs in Azure Stack. You can use the graphical tool to inventory data on servers and then transfer that data and configuration to VMs already deployed on Azure Stack. The service works without apps or users having to change anything. Depending on the assessment, some of these workloads might go to Azure IaaS or Azure Files.

Use the Storage Migration Service when you have one or many servers that you want to migrate to Azure Stack virtual machines. Storage Migration Service is designed to help by doing the following:

Inventory multiple servers and their data.
Rapidly transfer files, file shares, and security configuration from the source servers.
Optionally take over the identity of the source servers, also known as cutting over, so that users and apps don’t have to change anything to access existing data.
Manage one or multiple migrations from the Windows Admin Center user interface.

Typically, your migration journey will use a mixture of tools, so you will need to understand the options available in order to select the right tool for each workload.

Learn more

Use Storage Migration Service to migrate a server
Install Windows Admin Center

In this blog series

We hope you come back to read future posts in this blog series. Here are some of our planned upcoming topics:

Azure Stack at its core is an Infrastructure-as-a-Service (IaaS) platform
Fundamentals of IaaS
Start with what you already have
Do it yourself
Pay for what you use
It takes a team
If you do it often, automate it
Protect your stuff
Build on the success of others
Journey to PaaS

Source: Azure

Azure Marketplace new offers – Volume 32

We continue to expand the Azure Marketplace ecosystem. From January 16 to January 31, 2019, 70 new offers successfully met the onboarding criteria and went live. See details of the new offers below:

Virtual machines

Admin Password Manager for Enterprise: Admin Password Manager for Enterprise simplifies password management while helping customers implement recommended defenses against possible cyberattacks.

Appiyo BPM – Simple lightweight process engine: Appiyo's compute engine helps you implement simple business processes and API integration scenarios. It can be used for enterprise linkage to conversational bots, and for IoT scenarios and document management.

Attendize Open-source ticket selling system: Attendize offers a wide array of ticket and event management features, including mobile-friendly event pages, attendee management, data export, real-time event statistics, and support for multiple currencies.

AVReporter Azure: The AVReporter energy management software contains desktop, web, and mobile interfaces; reports and graphical elements; alerts; ready-to-use dashboards; and more.

Celebrus Enterprise Customer Data Platform: Celebrus captures a complete picture of customer behavior and experience, creating events and profiles in real time for 1-to-1 personalization and streaming analytics.

DNS Safety Filter: This solution is a DNS server with extensive filtering capabilities. It allows you to filter access to domain names by categories and block access to specified domains, and it provides access policies for different groups of machines in your network.

EDLIGO: EDLIGO is a fully integrated solution that delivers real-time insights for data-driven decisions in education. It features easy-to-use dashboards and advanced predictive, causal, and prescriptive analytics.

IOTA Full Node: A full node is a program that fully validates transactions. Setting up your own full node saves you from relying on third parties, giving you more financial control.

Jamcracker CSB Service Provider Version 7.0: Jamcracker CSB, a purpose-built appliance for service providers, is a cloud brokerage solution for Software-as-a-Service and Infrastructure-as-a-Service products. It automates order management, provisioning, and billing.

JFrog Artifactory VM: This virtual machine comes with Java 8, JFrog Artifactory, and Nginx.

Joomla on Windows Server 2016: Joomla is a free and open-source content management system for building websites and powerful online apps. With Joomla, you can connect your sites to databases like MySQL, MySQLi, or PostgreSQL to manage content and delivery.

Joomla on Windows Server 2019: Joomla is a free and open-source content management system for building websites and powerful online apps. With Joomla, you can connect your sites to databases like MySQL, MySQLi, or PostgreSQL to manage content and delivery.

MediaWiki on Windows Server 2016: MediaWiki is a powerful, free, and open-source wiki engine written in the PHP programming language. MediaWiki software is fully customizable, with more than 1,800 extensions available for enabling features.

MediaWiki on Windows Server 2019: MediaWiki is a powerful, free, and open-source wiki engine written in the PHP programming language. MediaWiki software is fully customizable, with more than 1,800 extensions available for enabling features.

NVIDIA Quadro Virtual Workstation – Ubuntu 18.04: Spin up a GPU-accelerated virtual workstation in minutes, without having to manage endpoints or back-end infrastructure. NVIDIA Tesla GPUs in the cloud power high-performance simulation, rendering, and design.

NVIDIA Quadro Virtual Workstation – WinServer 2016: Spin up a GPU-accelerated virtual workstation in minutes, without having to manage endpoints or back-end infrastructure. NVIDIA Tesla GPUs in the cloud power high-performance simulation, rendering, and design.

WordPress on Windows Server 2016: WordPress is a free and open-source website management system for blogs, applications, business sites, portfolios, and more. It's written in the PHP programming language.

WordPress on Windows Server 2019: WordPress is a free and open-source website management system for blogs, applications, business sites, portfolios, and more. It's written in the PHP programming language.

Web applications

Avere vFXT for Azure ARM Template: The Avere vFXT provides scalability, flexibility, and easy access to cloud-based or file-based storage locations for users tasked with managing critical high-performance computing (HPC) workloads.

Cisco CSR 1000V DMVPN Transit VNET: A transit VNet is a common strategy to connect geographically dispersed VNets and remote networks. It simplifies network management and minimizes the number of connections required to connect VNets and remote networks.

Citrix SD-WAN Standard Edition 10.2: Citrix SD-WAN Standard Edition for Azure logically bonds network links into a single, secure, logical virtual path. Organizations can leverage broadband, MPLS, 4G/LTE, satellite, and other connections.

CloudMigrator: Use CloudMigrator to securely migrate email, contacts, calendars, and files from enterprise sources to Microsoft Office 365. CloudMigrator is highly configurable, allowing you to complete the most complex and demanding migrations with ease.

Customer-Facing Anti-Phishing: Segasec specializes in helping organizations mitigate the risk of their customers becoming victims of online fraud and phishing scams. Segasec's solution requires zero onboarding and no integration, so companies can start immediately.

Discovery Hub® with SQL MI and AAS: Discovery Hub Application Server for Azure is a high-performance data management platform that accelerates your time to data insights.

FortiWeb Web Application Firewall – HA: Whether to simply meet compliance standards or to protect mission-critical hosted applications, FortiWeb's web application firewalls (WAFs) provide advanced features and AI-based machine learning detection engines.

HPC Azure Cluster Management Service: This self-hosted service can help you manage your HPC clusters on Azure. The service provides cluster diagnostics (including benchmark and MPI diagnostics), monitoring, and management features.

JFrog Artifactory Enterprise ARM Template: JFrog Artifactory Enterprise delivers end-to-end automation and management of your binaries and artifacts. It is a scalable, universal binary repository manager that integrates with your DevOps tools and platforms.

MSPControl: The powerful MSPControl platform gives users simple multi-tenant point-and-click control over Windows Server applications, including Microsoft IIS, Microsoft SQL Server, and Microsoft Exchange.

TimeXtender Discovery Hub with Managed Instance: Discovery Hub supports core analytics, the modern data warehouse, artificial intelligence, and the internet of things (IoT). This offering deploys the Discovery Hub application server and Azure SQL Managed Instance.

Waves MultiNode: Waves MultiNode is a ready-to-use blockchain software solution for those who want to launch their own private blockchain network with a specific business logic, or to start development of a decentralized application.

Waves Node: Waves Node is a ready-to-use blockchain software solution for those who want to help maintain the Waves network without any hardware or specialist experience.

Container solutions

Entitystream Custodian: Custodian is a full-stack master data management solution that enables its users to connect data from different parts of their organization without the need for complex data management projects.

EntityStream Mars: Mars is a microservice that allows you to present it with two records and have it index, standardize, and compare them so you can understand how similar they are in real-life scenarios.

Jsonnet Container Image: Jsonnet is a data templating language for application and tool developers. It’s based on JSON.

Kubeapps AppRepository Controller Container Image: Kubeapps AppRepository Controller is one of the main components of Kubeapps, a web-based application deployment and management tool for Kubernetes clusters. This controller monitors resources.

Kubeapps Chart Repo Container Image: Kubeapps Chart Repo is one of the main components of Kubeapps, a web-based application deployment and management tool for Kubernetes clusters. It scans a chart repository and populates its metadata.

Kubeapps Chartsvc Container Image: Kubeapps Chartsvc is one of the main components of Kubeapps, a web-based application deployment and management tool for Kubernetes clusters. This service reads metadata about the repositories.

Kubeapps Tiller Proxy Container Image: Kubeapps Tiller Proxy is one of the main components of Kubeapps, a web-based application deployment and management tool for Kubernetes clusters. This proxy provides a secure way to authenticate users.

Redis Enterprise Software: With Redis Enterprise Software, a dataset can grow beyond the largest node in the cluster and be processed by any number of cores.

Scribendi Accelerator: The Scribendi Accelerator is an advanced grammatical error correction tool designed to support professional editors during the editing process and increase their productivity.

TensorFlow ResNet Container Image: TensorFlow ResNet is a client utility for use with TensorFlow Serving and ResNet models.

Consulting services

Accelerate Advanced Analytics: 4-week Assessment: This engagement by Applied Cloud Systems involves an assessment of your analytics capabilities, a period of incubation in Azure, and a path forward. This is a four-week assessment for small to medium-sized organizations.

Accelerate Advanced Analytics: 8-week Assessment: This engagement by Applied Cloud Systems involves an assessment of your analytics capabilities, a period of incubation in Azure, and a path forward. This is an eight-week assessment for small to medium-sized organizations.

Accelerate Azure: 4-week Assessment: Applied Cloud Systems will provide direction and velocity for Azure adoption, innovation, and operations in this four-week assessment intended for small to medium-sized organizations.

Accelerate Azure: 8-week Assessment: Applied Cloud Systems will provide direction and velocity for Azure adoption, innovation, and operations in this eight-week assessment intended for small to medium-sized organizations.

Accelerate DevOps: 4-week Assessment: This four-week assessment by Applied Cloud Systems will guide your organization through application modernization and delivery leveraging Azure DevOps.

Accelerate DevOps: 8-week Assessment: This eight-week assessment by Applied Cloud Systems will guide your organization through application modernization and delivery leveraging Azure DevOps.

AgileIdentity: Azure AD: 3-week Implementation: Easily manage identities across thousands of apps and platforms with Azure Active Directory and Azure AD Premium consulting by Agile IT.

Application Containerization: 3-wk Assessment: For enterprises looking to host their applications on Azure Container Services, DXC Technology uses tools and expertise to determine the technical feasibility, business suitability, and transformation path for each app.

Application Containerization Quickstart: 4-wk POC: For enterprises looking to improve the elasticity of their applications, DXC Technology provides this service to containerize applications and deploy them to Azure Container Services.

Application Services Quickstart: 4-wk POC: For enterprises looking to host their applications on Microsoft Azure, DXC Technology provides a comprehensive migration solution.

Azure4DevOps: 3 Day DevOps Maturity Assessment: This assessment by Testhouse will identify your DevOps maturity and provide a roadmap for moving your IT organization toward the highest level.

Azure DevOps4Dynamics: 3-Day Maturity Assessment: Assess your IT organization’s DevOps maturity and generate an improvement roadmap to ensure you can effectively manage the application lifecycle of your enterprise software investment on Azure.

Azure DevOps Migration: 10 Day Engagement: Readify’s Azure DevOps engagement offers you the opportunity to work with our experts to evaluate current barriers and begin to implement new tools, services, and practices in Azure DevOps.

Azure Disaster Recovery: 1-Day Workshop: In this session, one of InsITe’s skilled engineers will walk you through Azure Backup and Azure Site Recovery services, demonstrating how you can immediately begin leveraging active datacenter replication.

Azure Integration Services: 1-wk POC: This limited implementation by VNB Consulting is designed to evaluate if Azure Integration Services is right for your organization’s hybrid or cloud integration platform initiative.

Azure Migration: 1-day Assessment: CHISW Development LTD's assessment will cover an environment review, premigration requirements, migration, post-migration requirements and enhancements, and a security check.

Azure Security: 3-Day Workshop: This workshop by Catapult Systems is designed to help customers understand the basic security requirements and services available in Azure.

Azure Site Recovery-1 Server: 1-day Implementation: Forsyte will take the time to understand your disaster recovery needs, define an Azure Site Recovery strategy, configure a server, configure the target environment in Azure, create a replication, and initiate the server replication.

BizTalk Health Check/Upgrade Azure Assessment 1-Wk: The BizTalk Health Check by TwoConnect is the first step toward considering an Azure migration and achieving optimal performance for your BizTalk system.

BizTalk Support & Azure Managed Services 3-Day PoC: TwoConnect’s award-winning BizTalk support and Azure Managed Services team will focus on supporting, maintaining, and adapting your integration solutions to ensure the seamless continuity of your business operations.

BizTalk to Azure Migration: 2-day Assessment: This assessment by VNB Consulting will be held at your facility or conducted remotely, and it will involve an evaluation of your BizTalk environment that results in a detailed plan for a BizTalk-to-Azure migration.

Cloud Migration – Transformation: 1-Hour Briefing: Learn how Wintellisys Inc.'s capabilities, approach, and methodologies can help customers accelerate their journey to the cloud. Get an overview of the migration suite and a demonstration of a few of the key components.

Disaster Recovery: 2-Week Implementation: Catapult Systems will discover, assess, and define the architecture and recovery process for two on-premises workloads to Azure.

EDI on Azure LogicApps: 2-day POC: This limited implementation by VNB Consulting is designed to evaluate if Azure Integration Services (Azure Logic Apps and Integration Account) is right for your organization’s EDI-in-the-cloud initiative.

Govern Azure: 8-week Assessment: Applied Cloud Systems' assessment will include all aspects of the governance process, including architecture, acceptable usage, security, monitoring, and cost management.

TFS-Azure DevOps Migration: 4-week Implementation: Canarys will help you each step of the way on your journey, from acquiring the licenses required to use Azure DevOps to migrating projects from Team Foundation Server to Azure DevOps (formerly VSTS).

TIC Modernization 1-Hour Briefing: Practical Solutions Inc. will discuss how any federal agency can remove barriers to the cloud and modern technology adoption. We will also show how agencies can meet the OMB Trusted Internet Connections (TIC) initiative.

Website / App Migration: 3 Day Assessment: BUI will analyze your IIS/Linux installation and identify which sites can be migrated to Microsoft Azure, highlighting any elements that cannot be migrated or are unsupported on the platform.

Windows Server & SQL 2008 EOS Migration- 6 Wk Imp.: This implementation by ANS Group is a packaged migration service that combines assessing, planning, and transitioning stand-alone services into Microsoft Azure.

Source: Azure

Announcing Azure Monitor AIOps Alerts with Dynamic Thresholds

We are happy to announce that Metric Alerts with Dynamic Thresholds is now available in public preview. Dynamic Thresholds are a significant enhancement to Azure Monitor Metric Alerts. With Dynamic Thresholds you no longer need to manually identify and set thresholds for alerts. The alert rule leverages advanced machine learning (ML) capabilities to learn metrics’ historical behavior, while identifying patterns and anomalies that indicate possible service issues.

Metric Alerts with Dynamic Thresholds are supported through a simple Azure portal experience, as well as through an Azure Resource Manager (ARM) API that supports Azure workload operations at scale by letting you configure alert rules in a fully automated manner.

Why and when should I apply Dynamic Thresholds to my metrics alerts?

Smart metric pattern recognition – A big pain point with setting static thresholds is that you need to identify patterns on your own and create an alert rule for each pattern. With Dynamic Thresholds, we use a unique ML technology to identify the patterns and come up with a single alert rule that has the right thresholds and accounts for seasonality patterns such as hourly, daily, or weekly. Take HTTP request rate as an example: a metric like this often shows definite seasonality. Instead of setting two or more different alert rules for weekdays and weekends, you can now have Azure Monitor analyze your data and come up with a single alert rule with Dynamic Thresholds that changes between weekdays and weekends.

Scalable alerting – Wouldn’t it be great if you could automatically apply an alert rule on CPU usage to any virtual machine (VM) or application that you create? With Dynamic Thresholds, you can create a single alert rule that is then applied automatically to any resource you create. You don’t need to provide thresholds: the alert rule identifies the baseline for each resource and defines the thresholds automatically for you. With Dynamic Thresholds, you now have a scalable approach that saves a significant amount of time on the management and creation of alert rules.

Domain knowledge – Setting a threshold often requires a lot of domain knowledge. Dynamic Thresholds eliminates that need with its ML algorithms. Further, we have optimized the algorithms for common use cases, such as CPU usage for a VM or request duration for an application, so you can have full confidence that the alert will capture anomalies while still reducing noise for you.

Intuitive configuration – Dynamic Thresholds let you set up metric alert rules using high-level concepts, alleviating the need for extensive domain knowledge about the metric. In the UI or the ARM API, you only need to select the sensitivity to deviations (low, medium, high) and the boundaries (lower, higher, or both thresholds) based on the business impact of the alert.

Dynamic Thresholds also allow you to configure the minimum number of deviations required within a certain time window for the system to raise an alert. The default is four deviations in a 20-minute window. You can tune what you are alerted on by changing the failing periods and the time window.
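For operations at scale, a rule like this can be defined through the ARM API. The following is a minimal sketch of such an alert rule resource for a VM’s CPU metric; the scope ID is a placeholder, and the exact property names should be verified against the current metricAlerts API version:

```json
{
  "type": "Microsoft.Insights/metricAlerts",
  "apiVersion": "2018-03-01",
  "name": "cpu-dynamic-alert",
  "location": "global",
  "properties": {
    "severity": 3,
    "enabled": true,
    "scopes": [ "<resource-id-of-the-vm>" ],
    "evaluationFrequency": "PT5M",
    "windowSize": "PT5M",
    "criteria": {
      "odata.type": "Microsoft.Azure.Monitor.MultipleResourceMultipleMetricCriteria",
      "allOf": [
        {
          "criterionType": "DynamicThresholdCriterion",
          "name": "cpu",
          "metricName": "Percentage CPU",
          "operator": "GreaterOrLessThan",
          "alertSensitivity": "Medium",
          "failingPeriods": {
            "numberOfEvaluationPeriods": 4,
            "minFailingPeriodsToAlert": 4
          },
          "timeAggregation": "Average"
        }
      ]
    },
    "actions": []
  }
}
```

Note that no threshold value appears anywhere: only the sensitivity, boundaries (via the operator), and failing periods are specified.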

Metric Alerts with Dynamic Thresholds are currently available for free during the public preview. To see the pricing that will take effect at general availability, visit our pricing page. To get started, please refer to the documentation, “Metric Alerts with Dynamic Thresholds in Azure Monitor (Public Preview).” We would love to hear your feedback! If you have any questions or suggestions, please reach out to us at azurealertsfeedback@microsoft.com.

Please note, Dynamic Threshold based alerts are available for all Azure Monitor based metric sources listed in the documentation, “Supported resources for metric alerts in Azure Monitor.”
Source: Azure

Improving the TypeScript support in Azure Functions

TypeScript is becoming increasingly popular in the JavaScript community. Since Azure Functions runs Node.js, and TypeScript compiles to JavaScript, motivated users could already get TypeScript code up and running in Azure Functions. However, the experience wasn’t seamless, and things like our default folder structure made getting started a bit tricky. Today we’re pleased to announce a set of tooling improvements that address these issues. Azure Functions users can now easily develop with TypeScript when building their event-driven applications!

For those unfamiliar, TypeScript is a superset of JavaScript which provides optional static typing, classes, and interfaces. These features allow you to catch bugs earlier in the development process, leading to more robust software engineering. TypeScript also enables you to leverage modern JavaScript syntax, since it is compatible with ECMAScript 2015.

With this set of changes to the Azure Functions Core Tools and the Azure Functions Extension for Visual Studio Code, Azure Functions now supports TypeScript out of the box! Included with these changes are a set of templates for TypeScript, type definitions, and npm scripts. Read on to learn more details about the new experience.

Templates for TypeScript

In the latest versions of the Azure Functions Core Tools and the Azure Functions Extension for VS Code, you’re given the option to use TypeScript when creating functions. More precisely, when creating a new function app, you will now see the option to choose TypeScript during language stack selection. This opts you into default package.json and tsconfig.json files, setting your app up to be TypeScript compatible. After this, when creating a function, you will be able to select from a number of TypeScript-specific function templates. Each template represents one possible trigger, and there is a TypeScript equivalent for each template supported in JavaScript.

The best part of this new flow is that to transpile and run TypeScript functions, you don’t have to take any actions that are unique to Functions. For example, when you hit F5 to start debugging in Visual Studio Code, VS Code will automatically run the required installation tasks, transpile the TypeScript code, and start the Azure Functions host. This local development experience is best in class, and is exactly how you would start debugging any other app in VS Code.

Learn more about how to get your TypeScript functions up and running in our documentation.

Type definitions for Azure Functions

The @azure/functions package on npm contains type definitions for Azure Functions. Have you ever wondered what an Azure Functions object is shaped like? Or what the context object that is passed into every JavaScript function looks like? This package helps! To get the most out of TypeScript, it should be imported in every .ts function. JavaScript purists can benefit too – including this package in your code gives you a richer IntelliSense experience. Check out the @azure/functions package on npm to learn more!
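To illustrate, here is a minimal sketch of a typed HTTP-triggered function. The Context and HttpRequest interfaces below are simplified stand-ins for the real types exported by @azure/functions, declared inline so the sketch is self-contained:

```typescript
// Simplified stand-ins for the Context and HttpRequest types that
// @azure/functions exports (declared inline to keep this sketch self-contained).
interface HttpRequest {
  query: { [key: string]: string };
  body?: { name?: string };
}
interface Context {
  log: (msg: string) => void;
  res?: { status: number; body: string };
}

// An HTTP-triggered function: reads a name from the query string or body
// and writes the HTTP response onto the context.
const httpTrigger = async function (context: Context, req: HttpRequest): Promise<void> {
  const name = req.query.name || (req.body && req.body.name) || "world";
  context.log(`Processing request for ${name}`);
  context.res = { status: 200, body: `Hello, ${name}!` };
};

export default httpTrigger;
```

In a real function app you would import Context and HttpRequest from @azure/functions instead of declaring them yourself, and the types would then match exactly what the Functions host passes in.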

Npm scripts

Included by default in the TypeScript function apps is a package.json file including a few simple npm scripts. These scripts allow Azure Functions to fit directly into your typical development flow by calling specific Azure Functions Core Tools commands. For instance, ‘npm start’ will automatically run ‘func start’, meaning that after creating a function app you don’t have to treat it differently than any other Node.js project.
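A package.json along these lines ships with the TypeScript templates (the exact script set may vary between template versions):

```json
{
  "scripts": {
    "build": "tsc",
    "watch": "tsc --watch",
    "prestart": "npm run build",
    "start": "func start"
  }
}
```

Because prestart runs the TypeScript build, ‘npm start’ transpiles the code and then launches the local Functions host in one step.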

To see these in action, check out our example repo!

Try it yourself!

With either the Azure Functions Core Tools or the Azure Functions Extension for VS Code, you can try out the improved experience for TypeScript in Azure Functions on your local machine, even if you don’t have an Azure account.

Next steps

Get started with Azure Functions in VS Code.
Get started with Azure Functions in your CLI with the Azure Functions Core Tools.
Check out a sample TypeScript Function App.
Take a look at the Azure Functions JavaScript Developer Guide for additional details.
Sign up for an Azure free account if you don’t have one, and deploy your serverless apps to the cloud.

As always, feel free to reach out to the team with any feedback on our GitHub or Twitter. Happy coding!
Source: Azure

Announcing the general availability of Java support in Azure Functions

Azure Functions provides a productive programming model based on triggers and bindings for accelerated development and serverless hosting of event-driven applications. It enables developers to build apps using the programming languages and tools of their choice, with an end-to-end developer experience that spans from building and debugging locally, to deploying and monitoring in the cloud. Today, we’re pleased to announce the general availability of Java support in Azure Functions 2.0!

Ever since we first released the preview of Java in Functions, an increasing number of users and organizations have leveraged the capability to build and host their serverless applications in Azure. With the help of input from a great community of preview users, we’ve steadily improved the feature by adding support for easier authoring experiences and a more robust hosting platform.

What’s in the release?

With this release, Functions is now ready to support Java workloads in production, backed by our 99.95 percent SLA for both the Consumption Plan and the App Service Plan. You can build your functions based on Java SE 8 LTS and the Functions 2.0 runtime, while being able to use the platform (Windows, Mac, or Linux) and tools of your choice. This enables a wide range of options for you to build and run your Java apps in the 50+ regions offered by Azure around the world.

Powerful programming model

Using the unique programming model of Functions, you can easily connect them to cloud scale data sources such as Azure Storage and Cosmos DB, and messaging services such as Service Bus, Event Hubs, and Event Grid. Triggers and bindings enable you to invoke your function based on an HTTP request, or schedule an event in one of the aforementioned source systems. You can also retrieve information or write back to these sources as part of the function logic, without having to worry about the underlying Java SDK.
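As a sketch of what this annotation-driven model looks like in Java, here is a minimal HTTP-triggered function. The annotations below are simplified stand-ins for the real ones in com.microsoft.azure.functions.annotation, declared inline so the example is self-contained:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.util.Map;

// Simplified stand-ins for the azure-functions-java-library annotations.
// In a real function app these come from com.microsoft.azure.functions.annotation.
@Retention(RetentionPolicy.RUNTIME) @interface FunctionName { String value(); }
@Retention(RetentionPolicy.RUNTIME) @interface HttpTrigger { String name(); }

class HelloFunction {
    // The runtime invokes this method when the HTTP trigger fires,
    // passing in the parsed query parameters.
    @FunctionName("hello")
    public String run(@HttpTrigger(name = "req") Map<String, String> query) {
        String name = query.getOrDefault("name", "world");
        return "Hello, " + name;
    }
}
```

In a real function app the method would receive an HttpRequestMessage and return an HttpResponseMessage built via createResponseBuilder; the shape above only illustrates how triggers are declared through annotations rather than wired up by hand.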

Easier development and monitoring

Using the Azure Functions Maven plugin you can create, build, and deploy your Functions from any Maven-enabled project. The open source Functions 2.0 runtime will enable you to run and debug your functions locally on any platform. For a complete DevOps experience, you can leverage the integration with Azure Pipelines or setup a Jenkins Pipeline to build your Java project and deploy it to Azure.

What is even more exciting is that popular IDEs and editors such as Eclipse, IntelliJ, and Visual Studio Code can be used to develop and debug your Java Functions.

One of the added benefits of building your serverless applications with Functions is that you automatically get access to rich monitoring experiences thanks to the Azure Application Insights integration for telemetry, querying, and distributed tracing.

Enterprise-grade serverless

Azure Functions also makes it easy to build apps that meet your enterprise requirements. Leverage features like App Service Authentication / Authorization to restrict access to your app, and protect secrets using managed identities and Azure Key Vault. Azure boasts a wide range of compliance certifications, making it a fantastic host for your serverless Java functions.

Next steps

To get started, take a closer look at what the experience of building event-driven Java apps with Azure Functions looks like by following the links below:

Build your first serverless Java function using the instructions in our tutorial.
Find the complete Azure Functions Java developer reference.
Follow upcoming features and design discussion on our GitHub repository.
Learn about all the great things you can do with Java on Azure.

With so much being released now and coming soon, we’d sincerely love to hear your feedback. You can reach the team on Twitter and on GitHub. We also actively monitor Stack Overflow and UserVoice, so feel free to ask questions or leave your suggestions. We look forward to hearing from you!
Source: Azure