HDInsight tools for IntelliJ & Eclipse December Updates

We are pleased to announce the December updates of HDInsight Tools for IntelliJ & Eclipse. The HDInsight Tools for IntelliJ & Eclipse serve the open source community and will be of interest to HDInsight Spark developers. The tools run smoothly on Linux, Mac, and Windows. This release focuses on users’ feedback to ensure a smooth user experience in project creation and job submission. The release also covers a couple of new features, including Spark 2.0 support, local run, and a refined Job View & Job Graph.

Spark 2.0 Support

The HDInsight Tools for IntelliJ & Eclipse are now fully compatible with Spark 2.0. This lets you enjoy the cool features of Spark 2.0, including improved API usability, SQL 2003 support, performance improvements, structured streaming, R UDF support, and operational improvements.
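For example, a minimal Spark 2.0 application built around the new unified SparkSession entry point (the kind of job you would submit from IntelliJ or Eclipse) might look like the sketch below; the input path points at the Gutenberg sample data that HDInsight clusters typically ship with, so adjust it to your own storage account:

```scala
import org.apache.spark.sql.SparkSession

// Spark 2.0 replaces separate SQLContext/HiveContext usage with SparkSession.
object WordCountSample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("WordCountSample")
      .getOrCreate()

    // Provides the encoders needed by the typed Dataset API below.
    import spark.implicits._

    // Sample path on an HDInsight cluster; replace with your own data.
    val lines = spark.read.textFile("wasb:///example/data/gutenberg/davinci.txt")

    // Typed Dataset word count: split lines, drop empties, count per word.
    val counts = lines
      .flatMap(_.split("\\s+"))
      .filter(_.nonEmpty)
      .groupByKey(identity)
      .count()

    counts.show(10)
    spark.stop()
  }
}
```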

Local Run – Use the HDInsight Tools for IntelliJ with the Hortonworks Sandbox

With this feature, the HDInsight Tools for IntelliJ can submit Spark jobs to generic Hadoop clusters in addition to HDInsight clusters. Using the Hortonworks Sandbox allows you to work with Hadoop locally in your development environment. Once you have developed a solution and want to deploy it at scale, you can then move to an HDInsight cluster.

Connect to local sandbox for local run and debug
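To illustrate local run, a minimal sketch of a job set up to execute on the development machine could look like the following; the explicit local master and the input path are assumptions for local debugging and would be removed or made configurable when submitting to a cluster:

```scala
import org.apache.spark.sql.SparkSession

object LocalRunSample {
  def main(args: Array[String]): Unit = {
    // Bind the session to a local master for local run/debug. When the job
    // is submitted to an HDInsight cluster, the cluster supplies the master.
    val spark = SparkSession.builder()
      .appName("LocalRunSample")
      .master("local[*]") // use all available local cores
      .getOrCreate()

    // "data/sample.txt" is a placeholder path on the development machine.
    val lines = spark.read.textFile("data/sample.txt")
    println(s"Line count: ${lines.count()}")

    spark.stop()
  }
}
```

Because the application logic is unchanged between the two modes, you can iterate locally and then switch to cluster submission once the job behaves as expected.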

Job View & Job Graph

The updated Job View provides a slick UI for viewing your job list, a job summary, and the details of a selected job. The Job Graph also lets you view the execution details, task summary, and executors view for a job.

Job List and Job Summary

Job Graph

Task Summary

Executors View

Installation

Users can get the latest bits by going to the IntelliJ plugin repository and searching for “HDInsight.” IntelliJ will also prompt you to update if you have already installed the plugin.

For more information, check out the following:

IntelliJ HDInsight Spark Local Run: Use HDInsight Tools for IntelliJ with Hortonworks Sandbox
IntelliJ Remote Debug: Use HDInsight Tools in Azure Toolkit for IntelliJ to debug Spark applications remotely on HDInsight Spark Linux cluster

Create Spark Applications:

IntelliJ User Guide: Use HDInsight Tools in Azure Toolkit for IntelliJ to create Spark applications for HDInsight Spark Linux cluster
Video: Introducing HDInsight Tools for IntelliJ for Spark Development
Eclipse User Guide: Use HDInsight Tools in Azure Toolkit for Eclipse to create Spark applications
Video: Use HDInsight Tool for Eclipse to create Spark applications

Learn more about today’s announcements on the Azure blog and Big Data blog.

Discover more Azure service updates.

 

If you have questions, feedback, comments, or bug reports, please use the comments below or send a note to hdivstool@microsoft.com.
Source: Azure

How cloud brokerage matches the right cloud to your workload

I’ve had a lot of conversations this past year with clients who have wanted to deploy new or move existing workloads into the cloud.
Asking a few fundamental questions about the application’s operational requirements can turn a simple case of deploying code and data to a cloud environment into something rather more complex.
For example, one client wanted to keep legacy data storage costs to a minimum, but was concerned about potential data access costs. Another wanted to move an application with lots of complex, time-dependent and business-critical interfaces. Yet another was looking for some very stringent service-level agreements (SLAs) from their provider.
While a cloud solution can accommodate each of these examples, a different approach was needed in each case. A lower cost for data storage and access could be achieved by deploying to a cloud that was not the client’s de facto choice. Complex interfaces could be accommodated through IBM Bluemix Private Cloud Local. Stringent SLAs could be addressed using a managed cloud service, such as IBM Cloud Managed Services. Each of these cases highlighted the need to be flexible and showed that hybrid cloud is a reality for many organizations.
These examples were single applications and environments, where fast decisions were not needed. They allowed for a short comparison study to be carried out, but there are companies that need to make fast, daily cloud hosting decisions on a large scale. In those cases, some form of intelligent automation is a must, and organizations look to cloud brokerage tools.
The IBM Cloud Matrix is sophisticated brokerage software that enables organizations to gain insight into where their workloads would best be placed. Cloud Matrix will take the required characteristics of a given system and use cognitive processes to determine a score for each proposed hosting environment, be it a traditional data center, a private cloud or a public provider, including IBM Cloud, AWS and Azure among others.
Cloud Matrix can use an organization’s cloud pricing model, too, to ensure that results match up to the correct pricing. Scores for hosting the desired workload in each provider’s environment are presented, and the organization can then make an informed decision based on suitability for the workload’s requirements, along with the costs of building and running the environment in a vendor’s cloud. The report can be repeated as needed.
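To make the idea concrete, a brokerage-style scoring pass over candidate environments might look like the hypothetical sketch below; the criteria, weights, ratings, and provider names are illustrative assumptions only and do not reflect Cloud Matrix’s actual model or API:

```scala
// Hypothetical illustration of brokerage-style weighted scoring;
// not Cloud Matrix's actual algorithm.
case class Criterion(name: String, weight: Double)

case class HostingOption(name: String, ratings: Map[String, Double]) {
  // Weighted sum of how well this environment satisfies each criterion (0-10).
  def score(criteria: Seq[Criterion]): Double =
    criteria.map(c => c.weight * ratings.getOrElse(c.name, 0.0)).sum
}

object BrokerageSketch {
  def main(args: Array[String]): Unit = {
    // Workload requirements, weighted by importance (weights sum to 1.0).
    val criteria = Seq(
      Criterion("storageCost", 0.40),
      Criterion("slaStrength", 0.35),
      Criterion("dataAccessCost", 0.25)
    )

    // Illustrative ratings for each candidate environment.
    val options = Seq(
      HostingOption("PublicCloudA",
        Map("storageCost" -> 8.0, "slaStrength" -> 6.0, "dataAccessCost" -> 5.0)),
      HostingOption("PrivateCloud",
        Map("storageCost" -> 5.0, "slaStrength" -> 9.0, "dataAccessCost" -> 8.0)),
      HostingOption("TraditionalDC",
        Map("storageCost" -> 4.0, "slaStrength" -> 7.0, "dataAccessCost" -> 9.0))
    )

    // Rank candidates by score, highest first.
    options
      .map(o => (o.name, o.score(criteria)))
      .sortBy(-_._2)
      .foreach { case (name, s) => println(f"$name%-15s $s%.2f") }
  }
}
```

Running a sketch like this prints the candidates ranked by score, which mirrors how a scored report lets an organization compare hosting environments side by side before committing to one.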
For organizations that run particularly cost-sensitive workloads, brokerage tools backed with automated orchestration can enable fast movement between providers to ensure that the lowest hosting cost is maintained. This use case highlights the need for open standards in cloud, too.
Determining the right run-time environment for a given workload is important if deployment to the cloud is going to be a success. Find out more about brokerage and IBM Cloud Matrix or contact us to arrange a meeting with an IBM Cloud Advisor.
Source: Thoughts on Cloud