Building a Data Lake with Cloudera and Azure Data Lake – Roadshow

Today we are announcing the Cloudera + Microsoft Roadshow to showcase the partnership and integration between Cloudera Enterprise Data Hub and Azure Data Lake Store (ADLS). Linux and open source software (OSS) solutions have been some of the fastest growing workloads in Azure, and big data/analytics is popular among our customers. Microsoft works closely with open source partners such as Cloudera and many others to build solutions across infrastructure-as-a-service (IaaS) and platform-as-a-service (PaaS), making things as simple as possible so customers can focus on solving their data challenges. This roadshow builds on the latest Cloudera announcements and will start in the following cities:

Chicago, IL – June 8
Detroit, MI – June 9
Dallas, TX – June 13
Houston, TX – June 14

Join Cloudera and Microsoft, where you will get a hands-on lab experience building a Data Lake and learn:

How organizations are finding success with Cloudera Enterprise on Azure
How to deploy and configure your Cloudera cluster on Microsoft Azure Data Lake Store
How to ensure your data is secure and governed in the cloud
How to get started recognizing value from your Azure data

Click on the city to register!  Space is limited so register early!
Source: Azure

Building an Azure Analysis Services Model for Azure Blobs — Part 1

Azure Analysis Services recently added support for the 1400 compatibility level for Tabular models, as announced in the article “1400 compatibility level in Azure Analysis Services.” The modern Get Data experience in Tabular 1400 is a true game changer. Where previously Tabular models in the cloud would primarily interface with Azure SQL Database or Azure SQL Data Warehouse, you now have more options to bring in data directly from a source. There are pros and cons to either approach. Perhaps most importantly, Azure SQL Data Warehouse can help to ensure data integrity at cloud scale. It is hard to overstate the importance of accurate and trustworthy data for BI solutions. On the other hand, the data warehouse increases the complexity of the data infrastructure and adds latencies to data updates. If data integrity can be ensured without requiring a data warehouse, direct data import into Tabular 1400 can help to avoid the extra complexity and latencies. For an example, see the blog article “Building an Azure Analysis Services Model on Top of Azure Blob Storage” on the Analysis Services team blog.

To learn more, please read the full blog post "Building an Azure Analysis Services Model on Top of Azure Blob Storage—Part 1."

New to Azure Analysis Services? Find out how you can try Azure Analysis Services or learn how to create your first data model.
Source: Azure

DockerCon Hands-on Labs now online

One of the more popular activities at DockerCon is our Hands-on Labs, where you can learn to use the Docker tools you see announced on stage or talked about in the breakout sessions. This year we had eight labs for people to work through, ranging in length from 20 minutes to an hour.

We’ve now moved these labs into the Docker Labs Repo so that everyone can use them. The Docker Labs Repo is where we put a bunch of learning content for people who want to learn Docker, from beginner tutorials to advanced security and networking labs.
Here are the new labs:
Continuous Integration With Docker Cloud
In this lab, you will learn how to configure a continuous integration (CI) pipeline for a web application using Docker Cloud’s automated build features.
Docker Swarm Orchestration Beginner and Advanced
In this lab, you will play around with the container orchestration features of Docker. You will deploy a simple application to a single host and learn how that works. Then, you will configure Docker Swarm Mode, and learn to deploy the same simple application across multiple hosts. You will then see how to scale the application and move the workload across different hosts easily.
Securing Apps with Docker EE Advanced / Docker Trusted Registry
In this lab, you will integrate Docker EE Advanced into your development pipeline. You will build your application from a Dockerfile and push your image to the Docker Trusted Registry (DTR). DTR will scan your image for vulnerabilities so they can be fixed before your application is deployed.
Docker Networking
In this lab you will learn about key Docker Networking concepts. You will get your hands dirty by working through examples of a few basic networking concepts, learn about Bridge and Overlay networking, and finally learn about the Swarm Routing Mesh.
Windows Docker Containers 101
Docker runs natively on Windows 10 and Windows Server 2016. In this lab you’ll learn how to package Windows applications as Docker images and run them as Docker containers. You’ll learn how to create a cluster of Docker servers in swarm mode, and deploy an application as a highly-available service.
Modernize .NET Apps – for Devs
You can run full .NET Framework apps in Docker using the Windows Server Core base image from Microsoft. That image is a headless version of Windows Server 2016, so it has no UI but it has all the other roles and features available. Building on top of that there are also Microsoft images for IIS and ASP.NET, which are already configured to run ASP.NET and ASP.NET 3.5 apps in IIS.
This lab steps through porting an ASP.NET WebForms app to run in a Docker container on Windows Server 2016. With the app running in Docker, you can easily modernize it – and in the lab you’ll add new features quickly and safely by making use of the Docker platform.
Modernize .NET Apps – for Ops
You’ll already have a process for deploying ASP.NET apps, but it probably involves a lot of manual steps. Work like copying application content between servers, running interactive setup programs, modifying configuration items, and running manual smoke tests all adds time and risk to deployments.
In Docker, the process of packaging applications is completely automated, and the platform supports automatic update and rollback for application deployments. You can build Docker images from your existing application artifacts, and run ASP.NET apps in containers without going back to source code.
This lab is aimed at ops and system admins. It steps through packaging an ASP.NET WebForms app to run in a Docker container on Windows 10 or Windows Server 2016. It starts with an MSI and ends by showing you how to run and update the application as a highly-available service on Docker swarm.
So check out these labs, or head on over to the Docker Labs repo and check out the other great content we have there. And if that doesn’t satisfy your desire for hands-on learning, come to DockerCon Europe in October, where we’ll have yet more labs for you to try out the very latest in Docker tech.


More Resources

Check out the Docker Labs repo for this and many more tutorials
Register for an upcoming Docker Webinar
Attend an upcoming Docker event near you

Source: https://blog.docker.com/feed/

Announcing New CloudWatch Metrics for VPN Tunnels

Today we are announcing the availability of CloudWatch Metrics for Amazon Virtual Private Network (VPN) connections. Each VPN connection now collects and publishes a variety of tunnel metrics to Amazon CloudWatch. These metrics allow you to monitor tunnel health and activity, and to create automated actions.
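As a sketch of how you might query one of these metrics, the following builds a CloudWatch request for tunnel state. The namespace and metric names follow the AWS/VPN documentation, but the VPN connection ID is a placeholder and the query is only assembled here, not sent:

```python
from datetime import datetime, timedelta

# Sketch of a CloudWatch query for VPN tunnel state. Namespace and
# metric names are from the AWS/VPN namespace documentation;
# "vpn-1234567890abcdef0" is a placeholder connection ID.
query = {
    "Namespace": "AWS/VPN",
    "MetricName": "TunnelState",          # 1 = tunnel up, 0 = tunnel down
    "Dimensions": [{"Name": "VpnId", "Value": "vpn-1234567890abcdef0"}],
    "StartTime": datetime.utcnow() - timedelta(hours=1),
    "EndTime": datetime.utcnow(),
    "Period": 300,                        # one datapoint per 5 minutes
    "Statistics": ["Maximum"],
}

# You would then pass this to boto3, e.g.:
#   boto3.client("cloudwatch").get_metric_statistics(**query)
```

The same shape works for the data-volume metrics (TunnelDataIn, TunnelDataOut) by swapping the MetricName.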
Source: aws.amazon.com

WannaCry & Co.: How to protect yourself

After WannaCry is before the next ransomware attack. What those at risk should do now, how to protect yourself against copycats, and what options remain if the ransomware has already struck.

Source: Heise Tech News

How to use Azure Functions with IoT Hub message routing

I get a lot of asks for new routing endpoints in Azure IoT Hub, and one of the more common asks is to be able to route directly to Azure Functions. Having the power of serverless compute at your fingertips is very powerful and allows you to do all sorts of amazing things with your IoT data.
 
(Quick refresher: back in December 2016 we released message routing in IoT Hub. Message routing lets customers set up automatic routing of events to different systems, and we take care of all of the difficult implementation architecture for you. Today you can configure your IoT Hub to route messages to your backend processing services via Service Bus queues, topics, and Event Hubs as custom endpoints for routing rules.)
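To make the idea concrete, here is a small Python sketch of how a routing rule directs messages. This is an illustration only, not the actual IoT Hub implementation, and the "criticalqueue" endpoint and "level" property are hypothetical names:

```python
# Illustrative sketch of IoT Hub-style message routing (not the real
# implementation). A route pairs a condition with a custom endpoint;
# messages matching no route fall through to the built-in endpoint.

def route_message(message, routes, default_endpoint="events"):
    """Return the name of the endpoint a message should be delivered to."""
    for condition, endpoint in routes:
        if condition(message):
            return endpoint
    return default_endpoint

# Hypothetical rule: messages whose "level" property is "critical"
# go to a Service Bus queue registered as the "criticalqueue" endpoint.
routes = [
    (lambda m: m.get("properties", {}).get("level") == "critical",
     "criticalqueue"),
]

alert = {"properties": {"level": "critical"}, "body": "engine overheating"}
telemetry = {"properties": {"level": "normal"}, "body": "temp=72"}
```

Here `route_message(alert, routes)` lands on "criticalqueue", while the telemetry message falls through to the default "events" endpoint.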

One quick note: if you want to trigger an Azure Function on every message sent to IoT Hub, you can do that already! Just use the Event Hubs trigger and specify IoT Hub's built-in Event Hub-compatible endpoint as the trigger in the function. You can get the IoT Hub built-in endpoint information in the portal under Endpoints > Events:

Here’s where you enter that information when setting up your Function:
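In code terms, that trigger configuration ends up in the function's function.json as an Event Hubs trigger binding. A rough sketch, where the event hub name and connection setting are placeholders for your own hub and app settings:

```json
{
  "bindings": [
    {
      "type": "eventHubTrigger",
      "direction": "in",
      "name": "iotHubMessage",
      "eventHubName": "my-iot-hub",
      "connection": "IoTHubEventHubCompatibleConnection",
      "consumerGroup": "$Default"
    }
  ]
}
```

The "connection" value refers to an app setting holding the Event Hub-compatible connection string you copied from the portal.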

If you’re looking to do more than that, read on.

I have good news and I have bad news. The bad news first: this blog post is not announcing support for Functions as a custom endpoint in IoT Hub (it's on the backlog). The good news is that it's really easy to use an intermediate service to trigger your Azure Function to fire!

Let's take the scenario described in the walkthrough, Process IoT Hub device-to-cloud messages using routes. In the article, a device occasionally sends a critical alert message that requires different processing from the telemetry messages, which comprise the bulk of the traffic through IoT Hub. The article routes messages to a Service Bus queue added to the IoT hub as a custom endpoint. When I demo the message routing feature to customers, I use a Logic app to read from the queue and further process messages, but we can just as easily use an Azure Function to run some custom code. I'm going to assume you've already run through the walkthrough and have created a route to a Service Bus queue, but if you want a quick refresher on how to do that you can jump straight to documentation here. This post will be waiting when you get back!

First, create a Function App in the Azure Portal:

Next, create a Function to read data off your queue. From the quickstart page, click on “Create your own custom function” and select the template “ServiceBusQueueTrigger-CSharp”:

Follow the steps to add your Service Bus queue connection information to the function, and you’re done setting up the trigger. Now you can use the power of Azure Functions to trigger your custom message processing code whenever there's a new message on the queue.
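As a sketch of what that custom processing code might look like, here is a hypothetical handler for the routed messages. In a real Azure Function the trigger binding hands you the message body; the "level" property and the actions taken are assumptions for illustration:

```python
import json

# Hypothetical processing logic for messages arriving on the Service Bus
# queue. The real Azure Function would receive raw_body via its trigger
# binding; only the decision logic is sketched here.

def process_queue_message(raw_body):
    """Parse a routed device-to-cloud message and decide on an action."""
    message = json.loads(raw_body)
    if message.get("level") == "critical":
        # e.g. page an operator, open a ticket, write to an alerts table
        return "alert-sent"
    # anything else routed here is logged and dropped
    return "ignored"
```

Calling `process_queue_message('{"level": "critical"}')` would take the alert path; any other payload is ignored.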
 
Service Bus is billed per million operations and it doesn't add an appreciable amount to the cost of your IoT solution. For example, if I send all messages from my 1 unit S1 SKU IoT hub (400k messages/day) through to a function in this manner, I pay less than $0.05 USD for the intermediate queue if I use a Basic SKU queue. I’m not breaking the bank there.
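The arithmetic behind that estimate can be sketched as follows. The $0.05-per-million-operations rate for a Basic SKU queue and the assumption of two operations per message (one send, one receive) are mine, not quoted pricing:

```python
# Back-of-the-envelope check of the quoted queue cost (assumed rates).
MESSAGES_PER_DAY = 400_000        # 1 unit of the S1 IoT Hub SKU
OPS_PER_MESSAGE = 2               # assumption: one send + one receive
PRICE_PER_MILLION_OPS = 0.05      # USD, Basic SKU (assumed rate)

daily_ops = MESSAGES_PER_DAY * OPS_PER_MESSAGE
daily_cost = daily_ops / 1_000_000 * PRICE_PER_MILLION_OPS
# daily_cost is roughly $0.04/day, i.e. under five cents a day
```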
 
That should tide you over until we have first-class support for routing to Azure Functions in IoT Hub. In the meantime, you can read more about message routes in the developer guide. As always, please continue to submit your suggestions through the Azure IoT User Voice forum or join the Azure IoT Advisors Yammer group.
Source: Azure