Adventure awaits: Azure Trivia is back!

You know that feeling—the one that rushes in right after you’ve cracked an achingly tough problem or a tangled string of code. It’s the pure adrenaline, joy, and satisfaction you get from reaching the next level—a new challenge in and of itself.

This year's #AzureTrivia celebrates your love of problem solving, growth, and adventure. You'll be taken on an exciting, mystical journey where you'll not only pick up new skills but also test your technical prowess to unlock new lands and win some sweet, sweet swag.

Check out @Azure every Monday on Twitter for a new Azure-related question to tackle. Get it right and you’ll be entered to win a weekly prize. Just select your answer from multiple choices and tweet out the correct answer no later than 11:59 PM Pacific Time on Thursday each week. (Be sure to use the handy auto-generated tweet, including #AzureTrivia and the original image, for official entry to win.)

Pro tip: Come back every week for a brand-new problem to solve and the chance to win!

If you love playing #AzureTrivia and want a competitive edge, check out these free Azure learning resources to kick your problem-solving skills up a notch!

FAQs

How do I enter?

It’s easy! Follow these three simple steps by Thursday at 11:59 PM Pacific Time:

Visit twitter.com/azure
Find this week’s #AzureTrivia question (hint: it’ll usually be our pinned tweet) and click on the correct answer.
Tweet the auto-generated confirmation message from your own account. Make sure your #answer, #AzureTrivia, and the original question image link are included in your tweet.

What if I get the answer wrong? Can I still win this week?

No, unfortunately only correct answers are valid entries. But you can try again next week!

Am I eligible to win?

You need to be:

A legal resident of the 50 U.S. states (or the District of Columbia)
18 years of age or older (if you're under 18, you need the consent of a parent or legal guardian)

You cannot be:

An employee or a family member of an employee of Microsoft Corporation and its subsidiaries, affiliates, advertising agencies, and sweepstakes parties

How many winners are there each week?

We’ll randomly select 10 lucky winners each week.

When do you announce the answer?

We’ll announce the answer on Friday morning each week. Check the @Azure feed to find out if you got it right.

How do I find out if I won?

Check your mentions on Fridays! We’ll reply to the week’s winners every Friday. Be sure to DM us within 72 hours so we can confirm that you’re eligible and get your address to send you your prize!

What could I win?

You could win some sweet #AzureTrivia Adventure swag and other goodies, including a 3D puzzle, a custom #AzureTrivia beanie, #AzureTrivia stickers, a VR headset, and a cinema light board.

Can I enter every week to win?

Yeah, go for it!

If I’ve already won once, can I win again?

You sure can. That’s what we call an #overachiever, and we’re impressed.

What if I go back and answer a past week’s question? Am I entered to win this week?

Glad you love answering trivia questions! But no, you are not entered to win this week’s sweepstakes if you answer a past question. You must answer the current week’s question to be entered to win this week.

Can I enter to win more than once per week?

No, you can only enter to win once per week. You can use #AzureTrivia as much as you want, but multiple tweets within the weekly entry period will not increase your chances of winning.

*No purchase necessary. Game ends June 13, 2019 at 11:59 PM Pacific Time. For full details, see the Official Rules.   
Source: Azure

Ansible solution now available in the Azure Marketplace

Last year, we shared the release of a great developer experience through Ansible in Azure Cloud Shell and the Ansible extension for Visual Studio Code. Today we're excited to expand our support of Ansible on Azure with a fully configured Ansible solution in the Azure Marketplace.

To help you automate provisioning and orchestrate your infrastructure on Azure, we have delivered a suite of Ansible cloud modules. Using the Ansible cloud modules for Azure requires authenticating with the Azure API. This Ansible solution enables teams to use Ansible with managed identities for Azure resources, formerly known as Managed Service Identity (MSI). This means you can use the identity to authenticate to any service that supports Azure Active Directory authentication, without storing any credentials in your code or environment variables.
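As a minimal sketch of the setup this enables (the resource group and VM names below are hypothetical placeholders), you might enable a system-assigned managed identity on the VM that hosts Ansible and then sign in with it from inside the VM using the Azure CLI:

```shell
# Enable a system-assigned managed identity on the VM hosting Ansible
# (resource group and VM names are hypothetical placeholders).
az vm identity assign --resource-group my-rg --name my-ansible-vm

# From inside the VM, the Azure CLI and the Ansible azure_rm modules can
# then authenticate with that identity -- no credentials stored in code
# or environment variables.
az login --identity
```

The Marketplace solution performs this configuration for you; the commands above only illustrate what is happening under the hood.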

This solution template also lets you select any Ansible release later than version 2.5.0 (such as 2.7.5) according to your needs; by default, the latest release is used. It also enables the Azure CLI. These features give you a consistently hosted instance of Ansible for cloud configuration, management, and production scenarios.

You can search for Ansible in the Azure portal, or go to the Azure Marketplace and select Get it now, to create a hosted instance of Ansible. You can also find a three-minute quickstart that provides a step-by-step walkthrough.

Ansible solution and Ansible extension – Develop locally, run in Azure

It's common to write Ansible playbooks in a local environment but run them on a fully configured Ansible host. We added secure shell (SSH) support to the Ansible extension so that Ansible developers can copy individual playbooks or the whole workspace and run them on a remote Ansible host. Now developers can use the Ansible extension in Visual Studio Code, a free code editor for macOS, Linux, and Windows, to develop Ansible playbooks on any platform and run them on a fully configured Ansible host created by the Ansible solution.

To connect to the Ansible host from your local environment, you first need to grant the virtual machine access to the subscription.
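A hedged sketch of granting that access with the Azure CLI (the resource names, role, and subscription ID below are placeholders, not the solution's prescribed values):

```shell
# Look up the principal ID of the Ansible host VM's managed identity
# (resource group and VM names are hypothetical placeholders).
principalId=$(az vm show \
  --resource-group my-rg \
  --name my-ansible-vm \
  --query identity.principalId \
  --output tsv)

# Grant that identity access to the subscription so playbooks run on the
# host can manage resources in it. The subscription ID is a placeholder,
# and a narrower role or scope may be preferable in practice.
az role assignment create \
  --assignee "$principalId" \
  --role Contributor \
  --scope /subscriptions/00000000-0000-0000-0000-000000000000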

We are excited about the improved developer experience we are creating for Ansible on Azure. Go ahead and try the Ansible solution. For more information, visit the Ansible on Azure developer hub.
Source: Azure

QnA Maker simplifies knowledge base management for your Q&A bot

This post was co-authored by the QnA Maker Team.

With Microsoft Bot Framework, you can build chatbots and conversational applications in a variety of ways, whether you're developing a bot from scratch with the open source Bot Framework, creating your own branded assistant with the Virtual Assistant solution accelerator, or building a Q&A bot in minutes with QnA Maker. QnA Maker is a web-based service that makes it easy to power a question-and-answer application or chatbot from semi-structured content like FAQ documents and product manuals. With QnA Maker, developers can build, train, and publish question-and-answer bots in minutes.

Today, we are excited to announce the launch of a highly requested feature: Active Learning in QnA Maker. Active Learning identifies and recommends question variations for any question and lets you add them to your knowledge base. Your knowledge base content won't change unless you choose to add or edit the suggestions.

How it works

Active Learning is triggered based on the scores of the top N answers returned by QnA Maker for any given query. If the score differences lie within a small range, the query is considered a possible "suggestion" for each of those answers. The exact score-difference logic is a function of the square root of the confidence score of the top answer.

All the suggestions are then clustered by similarity, and the top suggestions for alternate questions are displayed based on how frequently end users issue the particular queries. Active learning therefore gives the best suggestions when the endpoint receives a reasonable quantity and variety of user queries.

QnA Maker learns new question variations in two possible ways.

Implicit feedback – The ranker understands when a user question has multiple answers with very close scores and treats that as implicit feedback.
Explicit feedback – When multiple answers with little variation in scores are returned from the knowledge base, the client application can ask the user which question they intended. When the user selects one, that explicit feedback is sent to QnA Maker with the Train API.

Either method provides the ranker with similar queries that are clustered. When similar queries are clustered, QnA Maker suggests the user-based questions to the knowledge base designer to accept or reject.
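As an illustrative sketch of the explicit-feedback call (the hostname, knowledge base ID, endpoint key, and qnaId below are placeholders, and the exact route and body should be confirmed against the QnA Maker API reference):

```shell
# Send the user's selection back to QnA Maker as explicit feedback via
# the Train API. All identifiers here are hypothetical placeholders.
curl -X POST "https://my-qnamaker.azurewebsites.net/qnamaker/knowledgebases/<kb-id>/train" \
  -H "Authorization: EndpointKey <endpoint-key>" \
  -H "Content-Type: application/json" \
  -d '{
        "feedbackRecords": [
          {
            "userId": "user1",
            "userQuestion": "How do I reset my password?",
            "qnaId": 1
          }
        ]
      }'
```

Each feedback record pairs the raw user question with the QnA pair the user confirmed, which is what the clustering step later mines for suggestions.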

How to turn on active learning

Active Learning is disabled by default. Follow the steps below to enable it.

1. To turn active learning on, go to your Service Settings in the QnA Maker portal, in the top-right corner.


2. Find the QnA Maker service then toggle Active Learning.

Once Active Learning is enabled, the knowledge base suggests new questions at regular intervals based on user-submitted questions. You can disable Active Learning by toggling the setting again.

How to add Active Learning suggestions to the knowledge base

1. In order to see the suggested questions, on the Edit knowledge base page, select Show Suggestions.

2. Filter the knowledge base with question and answer pairs to only show suggestions by selecting Filter by Suggestions.

3. Each question section with suggestions shows the new questions with a check mark to accept the question or an x mark to reject the suggestions. Click on the checkmark to add the question.

You can accept or reject all suggestions at once by selecting Add all or Reject all.

4. Select Save and Train to save the changes to the knowledge base.

Active Learning works best with higher traffic on the bot: the greater the number of end-user queries, the better the quality and quantity of suggestions.

QnA Maker active learning Dialog

The QnA Maker active learning Dialog does the following:

Gets the top N matches from the QnA service for every query above the configured threshold.
If the top result's confidence score is significantly higher than the rest, shows only the top answer.
If the top N results have similar confidence scores, prompts the user to choose which question they meant.
Once the user selects the question that matches their intent, shows the answer for that question.
The selection also sends feedback to the QnA Maker service via the Train API.

Migrating knowledge bases from the old preview portal 

You may recall that at //Build in May 2018, we announced the general availability (GA) of QnA Maker with a new architecture built on Azure. As a result, knowledge bases created with the QnA Maker free preview will need to be migrated to QnA Maker GA, as the preview will be deprecated on January 31, 2019. Learn how to migrate existing knowledge bases in the documentation, "Migrate a knowledge base using export-import."

Below is a screenshot of the old QnA Maker preview portal for reference:

For more information about the changes in QnA Maker GA, see the QnA Maker GA announcement blog post, “Announcing General Availability of QnAMaker.”

QnA Maker GA highlights:

New architecture. The data and runtime components of the QnA Maker stack are hosted in the user's Azure subscription. Learn more in the documentation, "What is QnA Maker?"
No more throttling. Pay for the services hosted instead of per transaction. See pricing information.
Data privacy and compliance. The QnA data is hosted within your Azure compliance boundary.
Brand-new portal experience to create and manage your knowledge base. Check out the new portal.
Scale as you go. Scale different parts of the stack according to your needs. See upgrading your QnA Maker service.

Source: Azure

Disaster Recovery support for Linux on VMware

Over the last five years, a gradual shift has been observed toward open source environments, which offer a number of advantages over boxed proprietary software. Lower cost, flexibility, security, performance, and community support for open source operating systems, primarily Linux distros, have largely been driving this shift across organizations. Microsoft has embraced this industry trend and has continuously worked hand in hand with providers to contribute to and strengthen the community. All major Linux platform providers have also delivered frequent release upgrades, assuring developers of continued support. With the increasing adoption of Linux worldwide, a large number of organizations are moving their mission-critical workloads to Linux-based server machines.

Azure Site Recovery (ASR) has long supported all major Linux server versions on VMware and/or physical machines for disaster recovery. Over the last six months, it has kept a keen focus on extending support to the latest OS releases from providers such as Red Hat Enterprise Linux (RHEL), CentOS, Ubuntu, Debian, SUSE, and Oracle.

ASR added support for RHEL 7.5 in July 2018, RHEL 6.10 in August 2018, and RHEL 7.6 in January 2019.
Support for SUSE Linux Enterprise Server 12 (up to SP3) was added in July 2018, following the success and wide usage of the SP2 and SP3 releases for critical workloads.
ASR started supporting CentOS 6.10 in August 2018.
Oracle Enterprise Linux versions 6.8 through 7.5 and UEK Release 5 were added in November 2018, followed by OEL versions 6.10 and 7.6 in January 2019.

In addition to the release updates above, Linux file systems and partitioning methods have also been enhanced. ASR has been tracking these enhancements and their industry adoption on VMware and physical Linux machines.

In 2018, a large number of implementations moved to the GUID Partition Table (GPT), which allows a nearly unlimited number of partitions. It also stores multiple copies of boot data, which makes the system more robust. ASR started supporting the GPT partition style in legacy BIOS compatibility mode in August 2018.
Custom usage of Linux has also produced a variety of system structures. Specific scenarios include having /boot on a disk partition (or on LVM volumes), and having the /(root), /boot, /usr, /usr/local, /var, and /etc directories on separate file systems and partitions that are not on the same system disk. ASR added support for these customizations in November 2018.

The timeline below captures the Linux support ASR has extended for VMware and physical machines since July 2018.


Related Links and additional content

ASR Update Rollup 27
ASR Update Rollup 28
ASR Update Rollup 31
ASR Update Rollup 32
Learn about the supported operating systems for replicating VMware virtual machines.
Get started by configuring disaster recovery for VMware machines.
Need help? Reach out to the ASR forum for support.
Tell us how we can improve ASR by contributing new ideas and voting up existing ones.

Source: Azure

Azure Security Center can detect emerging vulnerabilities in Linux

Recently a new flaw was discovered in PolKit, a component that controls system-wide privileges in Unix-like operating systems. This vulnerability potentially allows an unprivileged account to gain root permissions. In this blog post, we will focus on the recent vulnerability and demonstrate how an attacker can easily abuse and weaponize it. In addition, we will present how Azure Security Center can help you detect threats, and provide recommendations for mitigation steps.

The PolKit vulnerability

PolKit (previously known as PolicyKit) is a component that provides a centralized way to define and handle policies, and controls system-wide privileges in Unix-like operating systems. The vulnerability, CVE-2018-19788, was caused by improper validation of permission requests. It allows a non-privileged user with a user ID greater than the maximum integer value to execute arbitrary code in root context.

The vulnerability exists in PolKit versions earlier than 0.115, which come pre-installed with some of the most popular Linux distributions. A patch has been released, but it requires a manual install with the relevant package manager.
You can check whether your machine is vulnerable by running the command "pkttyagent --version" and verifying the installed PolKit version.

How an attacker can exploit this vulnerability to gain access to your environment

We are going to demonstrate a simple exploitation inspired by a previously published proof of concept (POC). The exploitation shows how an attacker could leverage this vulnerability to achieve privilege escalation and access restricted files. For this demonstration, we will use one of the most popular Linux distributions today.

First, we verify that we are on a vulnerable machine by checking the PolKit version. Then, we verify that the user ID is greater than the maximum integer value.
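These two checks can be scripted; the following is a minimal sketch for a typical Linux shell (2147483646 is the maximum allowed user ID cited in the mitigation guidance):

```shell
# Check the two preconditions for CVE-2018-19788:
# 1) the installed PolKit version (vulnerable if earlier than 0.115)
if command -v pkttyagent >/dev/null 2>&1; then
  pkttyagent --version
fi

# 2) whether the current user ID exceeds the integer range that
#    PolKit mishandles
uid=$(id -u)
if [ "$uid" -gt 2147483646 ]; then
  echo "uid $uid is out of range: exploitable precondition met"
else
  echo "uid $uid is in the normal range"
fi
```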

Now that we know we are on a vulnerable machine, we can leverage this flaw by using another pre-installed tool, systemctl, which uses PolKit as its permission policy enforcer and has the ability to execute arbitrary code. If you take a closer look at CVE-2018-19788, you will find that systemctl is impacted by the vulnerability. Systemctl is one of the systemd utilities; systemd is the system manager that is becoming the new foundation for many Linux distributions.

Using systemctl, we can create a new service in order to execute our malicious command in root context. Because of the flaw in PolKit, we can bypass the permission checks and run systemctl operations. Let's take a look at how we can do that.

Bash script content:

#!/bin/bash
# Write a new systemd unit whose ExecStart copies the contents of the
# protected sudoers file into a world-readable location.
cat <<EOF >> /tmp/polKitVuln.service
[Unit]
Description=Abusing PolKit Vulnerability

[Service]
ExecStart=/bin/bash -c 'cat /etc/sudoers > /tmp/sudoersList.txt'
Restart=on-failure
RuntimeDirectoryMode=0755

[Install]
WantedBy=multi-user.target
Alias=polKitVuln.service
EOF

# Register and start the service; the PolKit flaw lets these calls
# succeed without root privileges.
systemctl enable /tmp/polKitVuln.service
systemctl start polKitVuln.service

First, we define a new service, writing the required unit definition to "/tmp/polKitVuln.service". The ExecStart directive contains our command: it reads the sudoers file and copies its content to a shared folder that can be accessed by unprivileged users. The sudoers file is one of the most important files in the system, as it contains the user and group privilege information for the machine. In the last part of the script, we make the actual call to the systemctl tool to create and start our new service.

Execute the script:

Notice the errors regarding PolKit failing to handle the uid field. Since the sudoers file has been copied by the exploit, we can now read its content.

With this vulnerability, attackers can bypass permission checks and gain root access to your environment. In another blog post, "How Azure Security Center helps detect attacks against your Linux machines," we showed how attackers can exploit hosts to install crypto miners or attack other resources.

Protect against and respond to threats with Azure Security Center

Azure Security Center can help detect threats, such as the PolKit vulnerability, and help you quickly mitigate these risks. Azure Security Center consolidates your security alerts into a single dashboard, making it easier for you to see the threats in your environment and prioritize your response to threats. Each alert gives you a detailed description of the incident as well as steps on how to remediate the issue.

While investigating the impact on hosts monitored by Azure Security Center, we were able to determine how frequently machines come under attack and, using behavioral detection techniques, inform customers when they have been attacked. Below is the security alert, based on our earlier activity, as it appears in Security Center.

In addition, Azure Security Center provides a set of steps that enable customers to quickly remediate the problem:

System administrators should not allow negative user IDs or user IDs greater than 2147483646.

Verify the minimum and maximum user ID values in "/etc/login.defs."

Upgrade your PolicyKit package with your package manager as soon as possible.
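A hedged sketch of these remediation steps on the command line (package names vary by distribution, so check yours before running the upgrade):

```shell
# Inspect the allowed user ID range; on most distributions these values
# live in /etc/login.defs.
grep -E '^UID_(MIN|MAX)' /etc/login.defs

# Upgrade PolicyKit with the distribution's package manager.
# Package names differ: policykit-1 on Debian/Ubuntu, polkit on RHEL/CentOS.
sudo apt-get install --only-upgrade policykit-1   # Debian/Ubuntu
# sudo yum update polkit                          # RHEL/CentOS
```

After upgrading, re-run "pkttyagent --version" to confirm the installed version is 0.115 or later.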

Get started with Azure Security Center

Start using Azure Security Center’s Standard Tier for free today.
Source: Azure

Azure Marketplace new offers – Volume 30

We continue to expand the Azure Marketplace ecosystem. From December 16 to December 31, 2018, 46 new offers successfully met the onboarding criteria and went live. See details of the new offers below:

Virtual machines

A10 vThunder ADC for Microsoft Azure: A10 Networks' vThunder for Microsoft Azure is purpose-built for high performance, flexibility, and easy-to-deploy application delivery and server load balancing. It's optimized to run natively within Azure.

Appzillon Consumer Banking: Appzillon Consumer Banking from i-exceed offers users a convenient and simplified omnichannel approach to digital banking and ensures that user journeys are delightful and engaging.

IIS on Windows Server 2016: Each version of Microsoft Windows Server brings a new version of Internet Information Services (IIS). With the recent release of Windows Server 2016 comes IIS version 10, also known as version 10.0.17763.

MySQL Server 8.0 On Windows 2016: Quickly deploy MySQL server into your Azure tenant. MySQL provides you with a suite of tools to develop and manage MySQL-based business-critical applications on Windows.

RStudio Server Pro for Azure: RStudio Server Pro lets you access RStudio from anywhere using a web browser and delivers the productivity, security, centralized resource management, and metrics that professional data science teams need to develop in the R programming language.

SkyLIGHT PVX: Get the SkyLIGHT PVX virtual appliance by Accedian here in the Azure Marketplace.

Teradata Unity (IntelliSphere): Teradata Unity (IntelliSphere) is key to Teradata's workload routing and synchronization, providing active-active database availability with near-zero recovery time objectives/recovery point objectives. This is a bring-your-own-license offering.

Toad Intelligence Central 4.3: Improve productivity, collaboration, and data provisioning. Share all Toad artifacts – including entity relationship diagrams, query files, automation scripts, and SQL files – with other Toad users.

WhereScape® RED automation software: WhereScape RED is an integrated development environment that provides teams with the automation to streamline workflows, eliminate coding by hand, and cut the time to develop, deploy, and operate data infrastructure.

WorkflowGen: Automate complex human-centric processes by leveraging the Azure ecosystem. WorkflowGen can enhance your software or application offerings with a high-performance competitive process automation component.

Web applications

Azure Sandbox as a Service: The Sandbox enables your organization to provide access to cloud subscriptions on-demand that are isolated from your production environment. It's a great service for implementing proofs of concept, DevTest environments, or even hackathons.

eDocStore: The eDocStore is a centralized, scalable, structured file storage solution with enterprise content management (ECM)-grade interoperability based on open OASIS standard Content Management Interoperability Services (CMIS) v1.1.

HashiCorp Consul Enterprise: The world is moving from static infrastructure to dynamic infrastructure. HashiCorp Consul is a distributed service networking layer to connect, secure, and configure applications across dynamic distributed infrastructure.

HashiCorp Vault Enterprise: HashiCorp Vault protects sensitive data. It's designed to help security teams secure, store, and tightly control access to tokens, passwords, certificates, and encryption keys.

Identity Suite: To meet the strict compliance requirements of the GDPR and of highly regulated industries, Omada offers a solution that governs access to privacy data and other sensitive information.

PowerERM HRMS APP: The PowerERM is a company-wide employee relationship management software package used to coordinate all employee functions from the time of hiring to separation, including recruitment and training functionalities.

RSA® Authentication Manager 8.4: RSA Authentication Manager is the on-premises platform behind RSA SecurID Access that allows for centralized management of the environment, which includes authentication methods, users, applications, and agents across physical sites.

Secure File Exchange: Use this solution template to simply and securely exchange files with your teammates, partners, and customers. All files are encrypted with a password in your storage account. You and the recipient get notifications with detailed download instructions.

SentinelDB: SentinelDB is a cloud-based, privacy-by-design database that complies with GDPR and HIPAA regulations. It offers per-record encryption, a blockchain-backed audit trail, horizontal scalability, and zero maintenance for customers.

Teradata Unity with IntelliSphere: Teradata Unity with IntelliSphere is key enabling technology for synchronization and workload routing between Teradata Vantage systems.

WordPress Multi-Tier: This solution uses a virtual machine for the application front end and Azure Database for MariaDB services for the application data. The Azure Database service provides automatic patching, automatic backups, and built-in monitoring and security.

Container solutions

Nginx Secured Ubuntu Container with Antivirus: Deploy an enterprise-ready container for Nginx on Ubuntu. Nginx can be deployed to serve dynamic HTTP content on a network using FastCGI, SCGI handlers for scripts, WSGI application servers, or Phusion Passenger modules.

Node 8 Secured Alpine Container with Antivirus: Deploy an enterprise-ready container for Node 8 on Alpine. Node.js is an open-source, cross-platform JavaScript runtime environment for developing tools and applications.

Sestek SR REST Server: You can use this REST API for speech recognition. Supported languages include Turkish, English, Spanish, French, Azerbaijani, Arabic, German, Russian, Urdu, Flemish, and Dutch. Note that this image contains only Turkish, English, and Arabic.

Consulting services

Azure Blockchain Workbench: 8-Week Implementation: Azure Blockchain Workbench by Akvelon provides the infrastructure scaffolding for building end-to-end blockchain applications on top of the Microsoft Azure platform.

Azure Cloud Migration: 2 Day Assessment: Take advantage of FMT Consultants' thorough Azure migration assessment. Gain valuable insights for what it will take to migrate your on-premises applications.

Azure Consulting: 2 Day Workshop: BUI’s Azure Discovery Workshop follows a defined and repeatable process to share the value of Microsoft Azure. BUI will showcase different Azure capabilities, considering security, identity, and more.

Azure Data Services Accelerator 4wk implementation: ANS accelerates your path to achieving your defined business outcome, as will be uncovered during the preassessment solution stage, utilizing services such as Azure Data Lake and Microsoft Power BI.

Azure eDiscovery/Compliance 2 day workshop: This workshop by Controle is for legal and technical users and is conducted at the client's site. Participants will gain an understanding of eDiscovery capabilities available through Azure.

Azure Managed Svc + ANS Glass 10wk implementation: Migrate to Azure and utilize new Azure services, such as IoT and AI, with the expertise of ANS cloud experts, underpinned by the governance, automation, and financial insights of ANS Glass.

Azure Migration Assessment: 1 Day Assessment: Are you considering moving your infrastructure to Microsoft Azure? This one-day assessment by Executech will outline the time, effort, and cost required to make the shift and move to the cloud.

Cloud Backup Service: 10 Weeks Implementation: BUI's Cloud Backup managed services offering is an affordable, fully managed Cloud Backup-as-a-Service (CBaaS) solution. We eliminate your need for costly storage, backup management, and maintenance.

Cloud Data Platform: 3-Day Workshop: In this workshop, Siili Solutions Oyj will bring together design, data, and tech, outlining the benefits of Azure and unlocking the value of data through Microsoft Power BI visualization.

Cloud Discovery: 5 Days Workshop: BUI's Cloud Discovery offering is an affordable and structured engagement, consisting of workshops to determine whether your people, environment, and systems are cloud-ready.

Cloud Executive Readiness: 3 Days Workshop: Meylah's Cloud Executive Readiness workshop will identify and review the platforms, processes, and resources required for a successful transition to the cloud.

Cloud Migration Discovery: 6-Wk Assessment: Discover how the scale, flexibility, agility, and consumption-based pricing of cloud services can be used as part of a program to re-platform and re-architect your business systems.

Cloud Readiness Assessment: 4 Week Assessment: Crimson Line’s assessment will provide a detailed report, giving you insight into project costs and complexities to enable you to make accurate cloud decisions.

Data Platform Modernisation – 4 week Assessment: This four-week architecture assessment by Ascent Technology helps management, line-of-business, and IT teams design their data platform deployment and identify potential areas of improvement.

Disaster Recovery Accelerator – 6wk Implementation: ANS, in partnership with Microsoft, has developed a disaster recovery solution, the Azure Disaster Recovery Accelerator, designed to minimize disruption in the event of a failure or disaster.

DR / ASR Assessment Plan: 5-day assessment: Crimson Line will install software to assess a client’s on-premises environment. The outcome will provide a detailed Azure Site Recovery deployment planning report that can be used as part of the Crimson Line DRaaS offering.

Dynamic Customer Profile Creation: 4-week PoC: This pilot is intended to help the client create a single enriched view of customers using transaction and interaction data by leveraging Azure data services and TheDataTeam's Cadence framework.

FREE 1-Hr Briefing: Azure Cloud Migration: In this briefing, FMT Consultants will provide you with a report of our discussion, budgetary pricing for an assessment, and the next steps to kick off your cloud migration.

Microsoft Azure Readiness: 5-day Workshop: A team of experts at Opsgility will help guide your leadership and learning teams by creating and delivering a detailed learning plan and courses tailored to your organizational learning goals.

Oracle Migration to Azure: 1-Week Assessment: Pythian can help you migrate from Oracle to the technologies for your needs, including Azure SQL Database, Azure SQL Data Warehouse, and SQL Server (cloud and on-premises.) This offer is for customers in Canada.

Oracle Migration to Azure: 1-Week Assessment (UK): Pythian can help you migrate from Oracle to the technologies for your needs, including Azure SQL Database, Azure SQL Data Warehouse, and SQL Server (cloud and on-premises.) This offer is for U.K. customers.

Oracle Migration to Azure: 1-Week Assessment (US): Pythian can help you migrate from Oracle to the technologies for your needs, including Azure SQL Database, Azure SQL Data Warehouse, and SQL Server (cloud and on-premises.) This offer is for U.S. customers.

Source: Azure

Development, source control, and CI/CD for Azure Stream Analytics jobs

Do you know how to develop and source control your Microsoft Azure Stream Analytics (ASA) jobs? Do you know how to set up automated processes to build, test, and deploy these jobs to multiple environments? Stream Analytics Visual Studio tools, together with Azure Pipelines, provide an integrated environment that helps you accomplish all these scenarios. This article shows you how, and points you to the right places to get started using these tools.

In the past it was difficult to use Azure Data Lake Store Gen1 as the output sink for ASA jobs and to set up the related automated CI/CD process, because the OAuth model did not allow automated authentication for this kind of storage. The tools released in January 2019 support Managed Identities for the Azure Data Lake Storage Gen1 output sink and now enable this important scenario.

This article covers the end-to-end development and CI/CD process using Stream Analytics Visual Studio tools, Stream Analytics CI.CD NuGet package, and Azure Pipelines. Currently Visual Studio 2019, 2017, and 2015 are all supported. If you haven’t tried the tools, follow the installation instructions to get started!

Job development

Let’s get started by creating a job. Stream Analytics Visual Studio tools let you manage your jobs as projects. Each project consists of an ASA query script, a job configuration, and several input and output configurations. Query editing is efficient thanks to IntelliSense features such as error markers, auto completion, and syntax highlighting. For more details about how to author a job, see the quick start guide with Visual Studio.

If you have existing jobs and want to develop them in Visual Studio or add source control, just export them to local projects first. You can do this from the server explorer context menu for a given ASA job. This feature can also be used to easily copy a job across regions without authoring everything from scratch.

Developing in Visual Studio also provides you with the best native authoring and debugging experience when you are writing JavaScript user-defined functions in cloud jobs or C# user-defined functions in Edge jobs.

Source control

When created as projects, the query and other artifacts sit on the local disk of your development computer. You can use Azure DevOps (formerly Visual Studio Team Services) for version control, or commit code directly to any repository you want. This lets you save different versions of the .asaql query, as well as the inputs, outputs, and job configurations, and easily revert to previous versions when needed.

Testing locally

During development, use local testing to iteratively run and fix code with local sample data or live streaming input data. Running locally starts the query in seconds and makes the testing cycle much shorter.

Testing in the cloud

Once the query works well on your local machine, it’s time to submit to the cloud for performance and scalability testing. Select “Submit to Azure” in the query editor to upload the query and start the job running. You can then view the job metrics and job flow diagram from within Visual Studio.

Set up CI/CD pipelines

When your query testing is complete, the next step is to set up your CI/CD pipelines for production environments. ASA jobs are deployed as Azure resources using Azure Resource Manager (ARM) templates. Each job is defined by an ARM template definition file and a parameter file.

There are two ways to generate the two files mentioned above:

In Visual Studio, right-click your project name and select “Build.”
On an arbitrary build machine, install the Stream Analytics CI.CD NuGet package and run the “build” command (only supported on Windows at this time). This is needed for an automated build process.

Performing a “build” generates the two files under the “bin” folder and lets you save them wherever you want.

The default values in the parameter file come from the input/output job configuration files in your Visual Studio project. To deploy to multiple environments, use a simple PowerShell script to replace the values in the parameter file, generating a different version of the file for each target environment. In this way you can deploy into dev, test, and eventually production environments.
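The article describes this substitution with a PowerShell script, but the idea is the same in any language. Here is a minimal sketch in Python, assuming the usual ARM parameter-document shape; the parameter names and values shown are hypothetical:

```python
import json

def render_parameters(base_params: dict, overrides: dict) -> dict:
    """Return a copy of an ARM parameter document with selected values replaced."""
    rendered = json.loads(json.dumps(base_params))  # cheap deep copy
    for name, value in overrides.items():
        rendered["parameters"].setdefault(name, {})["value"] = value
    return rendered

# Hypothetical parameter document, shaped like the one a build produces.
dev_params = {
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "StreamAnalyticsJobName": {"value": "asa-job-dev"},
        "Input_iothub_serviceEndpoint": {"value": "dev-hub.azure-devices.net"},
    },
}

# Generate the production variant without touching the dev document.
prod_params = render_parameters(dev_params, {
    "StreamAnalyticsJobName": "asa-job-prod",
    "Input_iothub_serviceEndpoint": "prod-hub.azure-devices.net",
})
print(json.dumps(prod_params, indent=2))
```

A script like this can run as a step in the release pipeline, emitting one parameter file per target environment before the ARM deployment task runs.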

As stated above, the Stream Analytics CI.CD NuGet package can be used independently or in CI/CD systems such as Azure Pipelines to automate the build and test process of your Stream Analytics Visual Studio project. Check out “Continuously integrate and develop with Stream Analytics tools” and the “Deploy Stream Analytics jobs with CI/CD using Azure Pipelines” tutorial for more details.

Providing feedback and ideas

The Azure Stream Analytics team is highly committed to listening to your feedback. We welcome you to join the conversation and make your voice heard via our UserVoice. For tools feedback, you can also reach out to ASAToolsFeedback@microsoft.com.

Did you know we have more than ten new features in public preview? Sign up for our preview programs to try them out. Also, follow us on Twitter @AzureStreaming to stay updated on the latest features.
Source: Azure

Read Replicas for Azure Database for PostgreSQL now in Preview

This blog is co-authored by Parikshit Savjani, Senior Program Manager and Rachel Agyemang, Program Manager, Microsoft Azure.

We are excited to announce Public Preview availability of Read Replicas for Azure Database for PostgreSQL.

Azure Database for PostgreSQL now supports continuous, asynchronous replication of data from one Azure Database for PostgreSQL server (the “master”) to up to five Azure Database for PostgreSQL servers (the “read replicas”) in the same region. This allows read-heavy workloads to scale out horizontally and be balanced across replica servers according to users' preferences. Replica servers are read-only except for writes replicated from data changes on the master. Stopping replication to a replica server causes it to become a standalone server that accepts reads and writes.

Key features associated with this functionality are:

Turn-key addition and deletion of replicas.
Support for up to five read replicas in the same region.
The ability to stop replication to any replica to make it a stand-alone, read-write server.
The ability to monitor replication performance using two metrics, Replica Lag and Max lag across Replicas.

For more information and instructions on how to create and manage read replicas, see the following articles:

Read replicas in Azure Database for PostgreSQL
How to create and manage read replicas in Azure Database for PostgreSQL using the Azure portal

Below, we share some application patterns and reference architectures that our customers and partners have implemented using read replicas to scale out their workloads.

Microservices pattern with read-scale replicas

In this architecture pattern, the application is broken into multiple microservices: data-modification APIs connect to the master server, while reporting APIs connect to the read replicas. The data-modification APIs are prefixed with “Set-” and the reporting APIs with “Get-”. A load balancer routes traffic based on the API prefix.
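The prefix-based routing can be sketched as follows. This is a hypothetical illustration in Python; the server names and the round-robin balancing strategy are assumptions for the example, not part of the service:

```python
import itertools

# Hypothetical connection strings for a master and its read replicas.
MASTER_DSN = "host=app-pg-master.postgres.database.azure.com dbname=app"
REPLICA_DSNS = [
    "host=app-pg-replica1.postgres.database.azure.com dbname=app",
    "host=app-pg-replica2.postgres.database.azure.com dbname=app",
]
_replicas = itertools.cycle(REPLICA_DSNS)  # simple round-robin balancing

def route(api_name: str) -> str:
    """Pick a server for an API call based on its name prefix."""
    if api_name.startswith("Set-"):
        return MASTER_DSN        # data modifications go to the master
    if api_name.startswith("Get-"):
        return next(_replicas)   # reads are spread across the replicas
    raise ValueError(f"unknown API prefix: {api_name}")

print(route("Set-CustomerAddress"))  # master
print(route("Get-SalesReport"))      # one of the replicas
```

In production this decision would typically live in the load balancer or API gateway configuration rather than in application code, but the routing rule is the same.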

BI Reporting

For BI reporting workloads, data from disparate sources is processed every few minutes and loaded into the master server. The master server is dedicated to loading and processing; it is not directly exposed to BI users for reporting or analytics, which ensures predictable performance. The reporting workload is scaled out across multiple read replicas to handle high user concurrency with low latency.

IoT scenario

In the IoT scenario, high-speed streaming data is first loaded into the master server, which acts as the persistence layer and handles high-speed data ingestion. The read replicas are used for reporting and downstream data processing to take data-driven actions.

We hope that you enjoy working with the latest features and functionality available in Azure Database for PostgreSQL. Be sure to share your impressions via UserVoice for PostgreSQL.

Additional resources for Azure Database for PostgreSQL

Service information
Documentation
Feedback forum

Source: Azure

Microsoft joins the SciKit-learn Consortium

As part of our ongoing commitment to open and interoperable artificial intelligence, Microsoft has joined the SciKit-learn consortium as a platinum member and released tools to enable increased usage of SciKit-learn pipelines.

Initially launched in 2007 by members of the Python scientific community, SciKit-learn has attracted a large community of active developers who have turned it into a first class, open source library used by many companies and individuals around the world for scenarios ranging from fraud detection to process optimization. Following SciKit-learn’s remarkable success, the SciKit-learn consortium was launched in September 2018 by Inria, the French national institute for research in computer science, to foster growth and sustainability of the library, employing central contributors to maintain high standards and develop new features. We are extremely supportive of what the SciKit-learn community has accomplished so far and want to see it continue to thrive and expand. By joining the newly formed SciKit-learn consortium, we will support central contributors to ensure that SciKit-learn remains a high-quality project while also tackling new features in conjunction with the fabulous community of users and developers.

In addition to supporting SciKit-learn development, we are committed to helping SciKit-learn users in training and production scenarios through our own services and open source projects. We released support for using SciKit-learn in inference scenarios through the high-performance, cross-platform ONNX Runtime. The SKlearn-ONNX converter exports common SciKit-learn pipelines directly to the ONNX-ML standard format. In doing so, these models can now be used on Linux, Windows, or Mac with ONNX Runtime for improved performance and portability.

We also provide strong support for SciKit-learn training in Azure Machine Learning. Using the service, you can take existing SciKit-learn training scripts and scale up your training to the cloud, automatically iterate through hyperparameters to tune your model, and log experiments with minimal effort. Furthermore, the automated machine learning capability can automatically generate the best SciKit-learn pipeline according to your training data and problem scenario.
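For readers unfamiliar with the pipelines mentioned above, here is a minimal sketch of a SciKit-learn pipeline, assuming scikit-learn is installed locally. This is plain SciKit-learn code, not Azure Machine Learning-specific; the service wraps training scripts like this one rather than replacing them:

```python
# A minimal SciKit-learn pipeline: feature scaling followed by a classifier.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipeline = Pipeline([
    ("scale", StandardScaler()),                 # standardize features
    ("model", LogisticRegression(max_iter=200)), # then fit the classifier
])
pipeline.fit(X_train, y_train)
accuracy = pipeline.score(X_test, y_test)
print(f"test accuracy: {accuracy:.2f}")
```

Pipelines like this are also the unit that the SKlearn-ONNX converter exports, since the whole chain of preprocessing and model steps is captured as a single object.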

At Microsoft we believe that bringing AI advances to all developers, on any platform, using any language, in an open and interoperable AI ecosystem, will help make AI more accessible and valuable to all. We are excited to be part of the SciKit-learn consortium and to support a fantastic community of SciKit-learn developers and users.
Source: Azure