Docker: (a few) Best Practices

As Docker continues to evolve, it is important to stay up to date with best practices. We joined JFrog to go over the challenges of Dockerization, Dockerfile tips, and configuration tweaks for production.
You may think that running your application in Docker is no different from running it on the host. It isn’t quite that simple; however, with the tips provided here, you can avoid spending hours of debugging on your own.
Check out this video, featuring Mirantis’ Director of Engineering, Mike Scherbakov, to learn:

How to make sure that your application terminates gracefully inside a Docker container, flushing its caches, etc.
Why your Docker container is full of zombie processes you never saw on a host Linux system, and what to do to get rid of them (a minimal command-line sketch of these first two points follows this list)
Peculiarities of proxying network traffic to your application from outside the host
How to keep your containers running when you upgrade Docker, or change parameters and need to restart the daemon.
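As a hedged illustration of those first two points (myapp:latest is a placeholder image, and the flags below require a reasonably recent Docker release): --init runs a minimal init process as PID 1 that reaps zombies and forwards signals to your application, while --stop-timeout controls how long Docker waits after SIGTERM, giving the application a chance to flush its caches, before it sends SIGKILL.
$ docker run -d --init --name myapp --stop-timeout 30 myapp:latest
$ docker stop myapp   # sends SIGTERM, waits up to 30 seconds, then SIGKILL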


Slides: goo.gl/cn3UzA
The post Docker: (a few) Best Practices appeared first on Mirantis | Pure Play Open Cloud.
Source: Mirantis

Deploying CloudForms in Microsoft Azure

In this article we will deploy the CloudForms appliance in the Azure cloud. Red Hat distributes CloudForms as an appliance; for Microsoft Hyper-V and Azure, the appliance ships as a Virtual Hard Disk (VHD) formatted as a dynamic disk. Azure, unfortunately, does not support dynamic disks, so in order to import the CloudForms appliance into Azure, we need to convert the appliance VHD to a fixed disk.
The converted VHD will have a fixed size of around 40GB. To avoid uploading all 40GB when the actual data is closer to 2GB, we will use several tools. You can of course use PowerShell with the Azure cmdlets, but if you work on Linux like me, Microsoft provides a tool written in Go that works great for uploading disks to Azure. In addition, Microsoft provides a cross-platform command line (the Azure CLI) with functionality similar to PowerShell.
Convert VHD from Dynamic to Fixed
The first question you might have is: why provide a dynamic disk at all? Red Hat doesn’t want you to have to download a 40GB image, so it provides a dynamic disk instead. In the next steps we will take that image, convert it to a fixed disk and upload it to Azure.
First, we need to convert the image to a raw image, which we can do using qemu-img from the QEMU tools. We then compute the appropriate size for the new disk image, resize it, and convert it back to a fixed-size VHD. I’ve written a quick script that does all of this. You can get it here.
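For reference, here is a minimal sketch of the conversion, not the actual script (it assumes qemu-img is installed and uses placeholder file names). Azure requires the virtual size of a fixed VHD to be a whole number of megabytes, so we round up before converting back:
$ qemu-img convert -f vpc -O raw cfme-azure-5.7.0.17-1.vhd cfme.raw
$ SIZE=$(qemu-img info -f raw --output json cfme.raw | grep virtual-size | tr -dc '0-9')
$ MB=$((1024 * 1024))
$ qemu-img resize cfme.raw $(( ((SIZE + MB - 1) / MB) * MB ))
$ qemu-img convert -f raw -O vpc -o subformat=fixed cfme.raw cfme-azure-5.7.0.17-1-fixed.vhd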
Upload and Run CloudForms
In order to upload the disk image to Azure and run it, you need the Microsoft Azure VHD utilities and the Azure CLI tools I mentioned previously. The VHD utility is written in Go, so you may need to install Go as well. The steps I took in my environment are at the end of this post.
The command to upload the disk image is:
$ ./azure-vhd-utils upload --localvhdpath /home/cfme-azure-5.7.0.17-1.vhd --stgaccountname <storage account> --stgaccountkey <storage key> --containername templates --blobname cfme-azure-5.7.0.17-1.vhd --parallelism 8
You need to substitute your own values for the storage account and storage key. Once the upload completes, you can deploy the CloudForms appliance in Azure. To do this, we will use the Azure CLI.
The following command registers an image in Azure from the VHD you just uploaded; the CloudForms VM is then created from that image.
$ azure vm image create cfme-azure-5.7.0.17-1 --blob-url https://premiumsadpdhlose2disk.blob.core.windows.net/templates/cfme-azure-5.7.0.17-1.vhd --os Linux /home/cfme-azure-5.7.0.17-1.vhd
Note that you can also use the Azure portal UI to create the CloudForms VM once the image is uploaded.
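As a hedged sketch, creating the VM from that image with the same classic Azure CLI would look something like the following (the DNS name, user name, password, size and location are placeholders, and option names can differ between CLI versions, so check azure vm create --help):
$ azure vm create cfme-azure-demo cfme-azure-5.7.0.17-1 azureuser '<password>' --ssh 22 --vm-size Standard_D2 --location "East US"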
Configure CloudForms in Azure
Once the CloudForms appliance is deployed, you can access it using a username/password or an SSH key, depending on what you chose when creating the VM in Azure.
That’s it! You can now configure the CloudForms appliance just as you would normally.
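For example, a hedged sketch (the user name and public IP are placeholders, and it assumes you allowed SSH access when creating the VM): connect to the appliance over SSH and run its text-based configuration console.
$ ssh azureuser@<appliance-public-ip>
$ sudo appliance_console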
Summary
In this article we explored how to deploy the CloudForms appliance in the Azure cloud. CloudForms provides a single-pane for administering various cloud platforms. Having a CloudForms appliance deployed in Azure gives you more responsive management over Azure resources.
Happy Clouding in Azure!

Note: As I mentioned, the Azure VHD tools are written in Go so you need to first install Go. I installed version 1.7.4.
$ gunzip go1.7.4.linux-amd64.tar.gz
$ tar xvf go1.7.4.linux-amd64.tar
$ cd go
Add the Go toolchain to your PATH, then set up a Go workspace:
$ export PATH=$PATH:$(pwd)/bin
$ mkdir $HOME/work
$ export GOPATH=$HOME/work
$ export PATH=$PATH:$GOPATH/bin
Then install the VHD tools
$ go get github.com/Microsoft/azure-vhd-utils
Similarly, the Azure CLI and the Azure SDK for Python have a few prerequisites you need to install first. (These may already be on your system, but they are listed here for completeness.)
$ sudo dnf install python
$ sudo dnf install python-pip
$ sudo dnf install python-devel
$ sudo dnf install openssl-devel
$ sudo dnf install npm
Then you can install the Azure CLI
$ sudo npm install azure-cli -g
$ sudo pip install --upgrade pip
$ sudo pip install azure==2.0.0rc5
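Finally, before running the upload and image commands above, authenticate the CLI against your subscription; a minimal, hedged example (the subscription name is a placeholder):
$ azure login
$ azure account set "<your subscription>"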
Source: CloudForms

OpenStack Maintenance Engineer

Mirantis is looking for an engineer with expertise in Linux, Puppet and Python to join us as an OpenStack Maintenance Engineer. In this role, you'll maintain already shipped releases of Mirantis OpenStack by creating updates to Mirantis OpenStack components that improve security, performance, data consistency, usability and other aspects of production OpenStack environments. You will work closely with the Development, QA, Services and Support teams to provide the best user experience for Mirantis OpenStack customers.

Job responsibilities:

Develop Puppet manifests to deploy maintenance updates to Mirantis OpenStack clusters
Maintain deployment manifests of already shipped releases of Mirantis OpenStack
Investigate and troubleshoot technical issues
Develop and backport patches for Mirantis OpenStack components
Analyze upstream stable branches and consume upstream fixes in Mirantis OpenStack maintenance updates
Work closely with development engineering and assist support and services engineers

Requirements:

5+ years of experience in the IT industry
3+ years of experience as a deployment engineer
Strong knowledge of Puppet
Good system administration and automation skills in Linux
Experience with HA and relevant tools, such as Corosync, Pacemaker, keepalived and HAProxy
Experience with git, gerrit or other distributed version control and review systems
Ability to identify and troubleshoot issues quickly in a Linux-based environment
Good written communication skills in English

Would be a plus:

Spoken English
Software development experience with Python
Experience working with Ansible and/or Chef
Experience with OpenStack and cloud computing
Experience with common messaging platforms, such as RabbitMQ
Understanding of software development and release management processes
Linux networking experience
Virtualization experience

The post OpenStack Maintenance Engineer appeared first on Mirantis | Pure Play Open Cloud.
Source: Mirantis

Bundle solutions with WebSphere Liberty

Some things are just meant to go together. Peanut butter and jelly, milk and cookies, Sherlock Holmes and Dr. Watson, macaroni and cheese—the list could go on and on.
If you’re looking to pair your Java EE application server with something that’s also fast and dynamic, look no further than WebSphere Liberty. WebSphere Liberty is a highly composable, flexible profile of WebSphere Application Server (WAS). It’s ideal for developers and ready for production, both on premises or in the cloud.
Many IBM products use a Java EE application server to deliver their unique functionality. Sometimes it’s a standalone web-based tool. Other times, the core functionality of the product is delivered through a Java EE application. WebSphere Liberty provides the underlying capability for more than 200 products.
Most of these IBM products embed WebSphere Liberty, meaning users of these products might not be aware that Liberty is running underneath. Even service to Liberty itself is seamlessly provided by the product that embeds it. As of 2017, we have 108 products that embed Liberty, and we continue to add Liberty to 10 to 15 products monthly. Users of these products don’t use the Java EE application server in WebSphere Liberty directly, but Java EE underpins the products’ capabilities.
For example, IBM BlueworksLive is a cloud-based IBM product that helps you model and improve your business processes. It’s a Java EE application that runs entirely within WebSphere Liberty. Another example is Watson Care Manager for personalized care plans, automated care management workflows and integrated patient engagement capabilities. Watson Care Manager had its roots on-premises. When it was moved to the cloud, WebSphere Liberty gave Watson Care Manager configuration mechanisms that enabled the rollout and management of large numbers of application servers by DevOps organizations.
WebSphere Liberty’s architecture provides many significant advantages: simple deployment and installation, and a low-overhead Java runtime environment that’s well suited for hosting cloud applications, including microservices. To get the full benefit of WebSphere Liberty, you can easily package it with other IBM products and further enhance your capabilities. Consider WebSphere Liberty the Batman to your Robin, or the Calvin to your Hobbes.
Learn more about the partner that awaits you, and how together you can reach new heights. For more information on WebSphere Liberty and key products to bundle with this solution, click here.
The post Bundle solutions with WebSphere Liberty appeared first on news.
Source: Thoughts on Cloud

How businesses can get the cognitive edge

The buzz in the computing industry is all about cognitive.
My clients are at various stages of understanding its implications. They want to know what cognitive truly is and how it solves real business challenges.
So what is cognitive?  As many observers have noted, it’s not a specific product. Instead, it’s an era that includes multiple vendors and various technologies. The move to this new era is driven by changes in the data landscape. Cognitive computing is vital to turning zettabytes of data into meaningful information. It enables computers to understand, reason and learn without a person programming everything to achieve answers.
For a business, cognitive’s implications are enormous. It is the “disruptive enabler.”
IBM has identified five areas where a business can benefit now if it starts building a cognitive business:

Drive deeper engagement. Help clients behind the scenes for better customer experience.
Scale expertise. Companies spend lots of money training employees. This can be scaled more effectively.
Put learning in every product. Build products that adapt to each customer’s needs.
Change operations. Streamline your supply chain to help margins.
Transform how discovery is done. From pharmaceuticals to financial industries, research will be the foundation of how many companies work in the future.

In our survey of companies on a cognitive journey, the results were profound. For advanced users, the gains in customer engagement and the ability to respond faster to market needs were nearly doubled compared to beginners. Improvements to productivity and efficiency were just as significant.
For example, Mueller, a privately held company that employs 700 people across four manufacturing and distribution centers in the south-central United States, has implemented cognitive systems to assist with revenue forecasting, supply chain management, marketing, employee health and safety, and talent management. Within 12 months, one solution returned 113 percent on investment, creating a net annual benefit of more than $780,000, and reducing scrap metal waste by 20 to 30 percent. Another solution reduced the time spent creating reports by more than 90 percent, while a third solution resulted in a 90 percent improvement in the time to value in data processing.
USAA, a financial services company, provides banking and insurance services to 10.4 million past and present members of the US armed forces and their immediate family members. To better service these customers, USAA has implemented an innovative cognitive computing solution that uses IBM Watson. The solution enables transitioning military members to visit usaa.com or use a mobile browser to ask questions specific to leaving the military, such as “Can I be in the reserve and collect veterans compensation benefits?” or “How do I make the most of the Post-9/11 GI Bill?” As a result, USAA can provide customers comprehensive answers to complex questions in a non-judgmental environment.
WellPoint (now part of Anthem), one of the largest health benefits companies in the United States, delivers numerous health benefit solutions through its networks nationwide. For complex decisions, patients can often wait weeks for the clinical review to occur, and a lack of available evidence or ability to process in a timely fashion can delay treatment or lead to errors. To address this business challenge, WellPoint implemented a cognitive computing solution powered by IBM Watson that provides decision support for the pre-authorization process. Providing these decision support capabilities and reducing paperwork gives clinicians the chance to spend more time with patients.
These are the competitive business advantages an enterprise needs: the capabilities to transform business processes, impact business outcomes and engage customers in the new era ahead.
This fundamental change in computing needs leading-edge providers to drive it. We agree with research analysts at Gartner who said in their latest report that the IBM approach to cognitive computing is “likely to be one of the most attractive platforms in the future.”
If your business hasn’t done so already, now’s the time to start your cognitive journey.
Learn more about cognitive capabilities with IBM Watson.
The post How businesses can get the cognitive edge appeared first on news.
Source: Thoughts on Cloud

3 imperatives for self-service in a multicloud environment

The advent of cloud-based platform services has dramatically expanded the options available to developers. While many developers have flocked to cloud-based development tools, others continue to use on-premises development environments.
What has become clear is that there is no single development platform or cloud deployment model that fits every situation. With the advent of the hybrid cloud, developers have never had more options. IT management can get caught in a difficult situation of having to please multiple groups of constituents. It must provide departmental developers with the options they need to quickly develop new solutions while maintaining security, governance and cost control.
While self-service is fairly straightforward in a single cloud environment, it can be much more complicated across multiple clouds and cloud services. Administrators must decide which teams should have access to which services. For example, a team that routinely handles personal information might be restricted to on-premises services. Alternatively, teams working on developing mobile applications might receive access to a variety of public cloud services. Each environment needs its own self-service interface and environment. The challenge is to provide an overall self-service interface across cloud development tools.
But enabling self-service is more than simply providing an interface to access the right image from a public cloud service. Increasingly, we are moving to a world where companies use microservices and a variety of application and data services to help developers create new applications for rapidly changing markets. Creating new applications from highly distributed services requires coordination among a variety of elements: basic cloud compute, storage, complex application services, data services, security and governance.
Below are three imperatives that businesses can achieve by implementing a multicloud self-service environment.
1. A consistent way to evaluate options
Users need to have a way to evaluate their options and choose the best cloud environment that meets their technical and financial requirements. A self-service portal will expose the options that are appropriate for that developer based on the type of data that is involved, their workload requirements and cost restrictions. The developer can read a brief description of each service, assess the tools that are available on that service and decide if it will fulfill their needs.
2. Balance control while allowing choice
Executive management should make sure they retain overall visibility and control of costs and governance. At the same time, developers want the freedom to choose the platforms that meet their immediate needs. By giving developers choice, organizations avoid the problem of “shadow IT.”
3. Allow DevOps teams to focus on creativity and coding
Business leaders don’t want DevOps teams investing time in setting up environments, selecting tools and tearing environments down. Instead, teams should be focused on rapidly improving applications, responding to feedback and creating new services. A self-service, multicloud environment with automation allows teams to quickly spin up tested images so that they can focus on coding.
There’s only one way to approach the complexities of a multicloud environment pragmatically: create a self-service portal designed with well-defined APIs. This self-service portal must include rules that assist developers in selecting the most appropriate service. All of these services need to be managed through a carefully vetted catalog so that only approved services are used.
A self-service portal provides a predictable and safe environment so that a business can innovate at the pace of change. To be successful, it must be easy for developers to use while providing the right safeguards to protect the integrity of the business. The portal brings together the tools for developers in context with the deployment models needed to support scalability and protection.
To learn about IBM Cloud Automation Manager, visit ibm.biz/tryICAM. The first version of IBM Cloud Automation Manager is now available on IBM Bluemix and supports IBM Cloud and other public cloud offerings.
The post 3 imperatives for self-service in a multicloud environment appeared first on news.
Source: Thoughts on Cloud

CurrentCare offers the benefits of assisted living at home with Watson IoT

Forget cameras, microphones, and wearable devices. There’s a better way to monitor the well-being of loved ones who might need assistance. Instead of intruding on their privacy, their houses can check that they’re all right using passive sensors.
Inspired by a number of energy monitoring projects, including one to reduce energy poverty for residents in social housing, Current Cost, a manufacturer of real-time displays for monitoring domestic electricity usage, has taken a new direction and developed a connected-home offering.
Current Cost realized that its energy monitoring product family could be enhanced with additional sensors to provide CurrentCare, a solution for ambient assisted living.
CurrentCare is a spinoff company that, with IBM as its technology partner, passively monitors elderly and vulnerable people in their homes. With sensors, CurrentCare’s telecare solution alerts caregivers and family members when something out of the ordinary is happening.
What appliances can tell you
For example, consider someone who has an electric kettle and makes a cup of tea first thing in the morning. If the kettle is being monitored, it’s obvious, when it goes from zero watts to 3,000 watts in the morning, that someone’s heating water for their tea.
It is also obvious when something hasn’t happened by a certain time. In many cases, it would be extremely unusual, and maybe something is very wrong, if a habitual morning tea drinker hasn’t had at least one cup of tea by 10 in the morning.
Configuring CurrentCare
The CurrentCare solution uses low-cost sensors on key appliances, door open/close sensors, temperature and carbon monoxide monitors, room-level motion detectors, pressure mats in or near the bed, and a toilet flush sensor.
The sensors are easy to install and can be individually configured in a person’s home for their specific needs. Data is sent from the home over broadband (a GSM option will also be available soon) to a cloud-based analytical system, hosted by IBM.
Users can configure alerting rules, customizing them to the individual, to determine what to do if something unusual happens.
How the system works
CurrentCare’s data analysis service is hosted in an IBM cloud data center, and makes use of the secure, scalable and reliable IBM Watson Internet of Things (IoT) connectivity platform, which receives data sent from the CurrentCare equipment in patients’ homes. The data is then processed by an application running in the IBM Bluemix application platform.
Here, the triggering rules for the sensors in each house are applied, determining which alerts are sent to which party. For example:

If the front door opens between midnight and 4 a.m.: call me and send the friendly neighbor a text.
If the toilet hasn’t flushed by 10 a.m.: send me a text.
If the room temperature drops below 68 degrees: turn on the heating.
If the refrigerator fails: e-mail me.
The CurrentCare portal dashboard works with any browser, including tablets and smartphones. This means caregivers can access data and activity charts wherever they are.
CurrentCare minds the house
At the heart of it all, the CurrentCare system is a home monitoring system. This means it can also help homeowners while they’re away, alerting them to factors such as whether anyone has entered the yard or opened the door. They’ll know if their pet sitter has come and whether the mail carrier delivered the package they were expecting. They can also be alerted to unusual activity. It’s a great way to keep an eye on things.
For more about the CurrentCare system and the way it works, visit CurrentCare.
To learn more about this topic, read this post on the IBM Internet of Things blog.
The post CurrentCare offers the benefits of assisted living at home with Watson IoT appeared first on news.
Source: Thoughts on Cloud

3 ways the new IBM Cloud APM boosts hybrid cloud performance

Looking to optimize monitoring in your software as a service (SaaS) environments? We recently announced a new release of IBM Application Performance Management (APM).
You might have noticed the name change from IBM Performance Management on Cloud to IBM Cloud APM. While not a major change, it does signify our commitment to helping you manage your hybrid cloud environments, whether your workloads are running in Bluemix, AWS, Azure, a private cloud environment or everything in between.
Let me explain the key new features and why we added them.
Custom page builder
We previewed this capability at InterConnect and received very positive feedback. You can quickly and easily create dashboards tailored to your specific needs using real-time as well as historical data. Now you can try it out for yourself. With the technology preview included as part of this release, you can create custom dashboards based on Linux and WebSphere Application Server monitoring.
Hybrid monitoring
Applications span many different types of environments. You need to be able to manage them all. It’s much easier to do that when you can manage them all from one tool. IBM Cloud APM monitors your Bluemix Liberty, Node.js, Python and Ruby workloads. It can also bring in metrics from your mainframe components that are monitored using IBM OMEGAMON.
Expanded coverage
We’re continually adding more monitoring coverage to help ensure that you have no blind spots in managing the performance and availability of your applications. New monitoring capabilities in this release include Active Directory Federation Services, Cassandra, Microsoft Office 365, NetApp, RabbitMQ and Siebel.
Ready to get started? Visit here to try out Cloud APM’s new features in a live demo environment.
Does your application use z? IBM Cloud APM gives you full visibility, including into your z/OS resources.
The post 3 ways the new IBM Cloud APM boosts hybrid cloud performance appeared first on news.
Source: Thoughts on Cloud