What’s New in OpenShift 3.4 – Cluster Management

By any measure you choose, Kubernetes is one of the most popular and comprehensive open source projects in the container space. It is a unique community of individuals and corporations coming together to solve some of the most complex problems in containerizing cloud-native and stateful application services. Red Hat continues to be a thought leader in the community and a company that offers robust, enterprise-ready solutions based on the platform. Today we announce OpenShift Container Platform 3.4! Let Red Hat help you get on a successful path with your container project today.
Source: OpenShift

P.L.M. Industries moves logistics beyond human error with cloud

According to a Federal Bureau of Investigation report, industry analysts estimate that companies lose more than $30 billion a year to cargo theft. Loss and theft are constant risks, whether freight moves by air, by train, or over the road with truckers.
In the first quarter of 2016, FreightWatch International recorded 221 cargo thefts totaling just under US$25 million, an average loss of roughly US$113,000 per load.
P.L.M. Industries, our start-up technology company located in Southern California, is set to disrupt the logistics tracking industry and prevent loss with the Smart Pro Tracking System.
Cracking the code for end-to-end visibility in logistics
Our system was designed to eliminate the biggest drains on bottom-line profit margins: lost and stolen freight. It’s the first and only system that provides complete, end-to-end visibility from manufacturer to user and shipper to consignee.
Specifically, it provides a view into the full logistics chain, covering trailers, shipping containers, luggage, high-value shipments, chemicals, you name it.

Working with IBM Cloud and the Bluemix Garage
After founding our company in 2014, we began developing devices for our system. We worked with the engineering firm IDNEO Technologies to design the user interface (UI). Then we came to IBM to integrate our devices with the cloud. We’re now hosting the Smart Pro Tracking System in the IBM Bluemix environment.
IDNEO Technologies has a very strong working relationship with another provider, so initially we thought we would go in that direction. We started the process of engagement, but only got as far as exchanging emails. It was disappointing. In a moment of frustration, we turned to the IBM website and started a chat, which led us to where we are today.
When we came to IBM, we were focused on the hardware of the system, not necessarily the UI. But when we participated in a Bluemix Garage Design Thinking Workshop in San Francisco, we looked at the user experience and realized we needed to make use of the expertise of IBM to create not just the back-end functionalities of our system, but also an enhanced UI.
The workshop was great. We didn’t just cover product design. We covered the business aspects as well, to make sure that what we were building was in line with market needs.
Disrupting logistics tracking
There’s nothing out there that enables pieces of freight to communicate with one another the way the Smart Pro Tracking System does. There’s also nothing out there that provides end-to-end visibility. To stop theft and loss from occurring, shippers need that visibility.
Our system sends real-time alerts when an adverse situation arises during the shipping process. For example, the system can stop batch separations from happening: items that are grouped together to go to one place will actually all arrive that way. If, at any time during transit, any one of those items on the truck or trailer is taken or separated from its batch, the customer gets an alert so the error can be corrected.
Revolutionizing the way things are shipped
Our plans include incorporating the IBM Watson Analytics Service from the IBM Watson Developer Cloud portfolio so we can analyze different data points, such as weather and traffic accidents relative to where freight is. We could also incorporate other information, including a history of how workers have performed in the past and vulnerable shipping routes. Then the system can help us plan the best way to meet the customer goals for productivity and service.
With our devices and the IBM Cloud and Watson, we can revolutionize and change the way that things are shipped all over the world.
Everything we require is in the Bluemix platform, such as web analytics and Watson logistic analytics. Having our system on the Bluemix platform and IBM support behind us give us the confidence to say to any customer, “Yes we can do what you asked.”
Source: Thoughts on Cloud

PeroxyChem builds a whole new IT infrastructure in less than five months

What sounds like a problem to some companies represents an opportunity for others.
When PeroxyChem was divested from its parent company in 2014, the chemical manufacturer was given just one year to create a new IT infrastructure and department while maintaining day-to-day business functions.
40 percent management, 60 percent innovation
When Jim Curley took over as chief information officer (CIO), PeroxyChem had 11 months to create a new IT environment and migrate all necessary data and applications, SAP and non-SAP. For Curley, setting up a new infrastructure wasn’t just about maintaining the status quo and avoiding downtime. It was also about making the right decision for the future.
“We went in knowing we didn’t want to re-implement anything new or change the application landscape,” he said, “but we did want a cloud infrastructure because we needed transparency on the cost implications for future acquisitions or divestitures.”
Curley’s vision for the new IT department included one critical goal: IT personnel would spend only 40 percent of their time on IT management tasks that “keep the lights on,” and the remaining 60 percent on strategic projects to propel the business forward. Additionally, the company did not have the time or resources required to hire and train new personnel to only manage the day-to-day operations of the environment.
A new, more strategic IT environment
With six months remaining, PeroxyChem selected IBM to set up and host the company’s new environment with a managed cloud infrastructure. By collaborating with IBM, PeroxyChem completed the project with weeks to spare.
PeroxyChem now has a flexible environment that can scale to support its high growth requirements and workload peaks. The deployment includes service level agreements (SLAs) for high input/output requirements and standardization that helps reduce the complexity of the SAP landscape.
By adopting an IBM managed cloud hosting solution, Curley and his team achieved their goal of spending less time on maintenance and more time on innovation.
“Certainly, that wouldn’t be the case if we had not gone with the cloud and outsourced what we did,” he said. “That has translated to our being able to start up our IT business steering committee and have, on average, five business-related projects going on at any given point in time. We’re hitting the dates we give for project completions because we’re not having to do that production support and maintenance work.”
Learn more about how a managed cloud hosting solution for SAP (and non-SAP) applications can free up your resources for a more strategic approach to IT with IBM Cloud for SAP Applications.
Source: Thoughts on Cloud

Writing RPM macro for OpenStack

An RPM macro is a short string, always prefixed by % and generally surrounded by curly brackets ({}), which RPM expands to a different, usually longer string.
Some macros can take arguments and some can be quite complex.
In RHEL, CentOS, and Fedora, macros are provided by the rpm package and by redhat-rpm-config.
In RDO, OpenStack macros are provided by openstack-macros, which comes from the upstream rpm-packaging project.
You can find the list of all macros under the /usr/lib/rpm/macros.d/ directory.
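
A macro definition in these files is simply the macro name followed by its body. For example, the stock definition of %_bindir in rpm's standard macros file looks roughly like this (the exact file and spacing vary by distribution):

```
%_bindir  %{_exec_prefix}/bin
```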

To see the list of all available macros on your system:

$ rpm --showrc

For example: %{_bindir} is an RPM macro that expands to the binary directory where executables are usually stored.

To evaluate an rpm macro:

$ rpm --eval %{_bindir}
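
On a typical RHEL, CentOS, or Fedora system, this prints the standard binary directory:

/usr/bin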

%py_build is a commonly used RPM macro in RDO OpenStack packages; it expands to the python setup.py build invocation.

$ rpm --eval %py_build
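
In a spec file, %py_build is normally invoked from the %build section. A minimal sketch (this fragment is illustrative, not taken from a specific RDO spec):

```
%build
# Expands to the python setup.py build invocation with the
# distribution's standard flags
%py_build
```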

Motivation behind writing a new RPM macro for OpenStack packages
Currently, Tempest provides an external test plugin interface that enables anyone to integrate an external test suite as part of a Tempest run. Each service’s Tempest plugin has an entry point defined in setup.cfg, through which Tempest discovers and lists the available plugins.
For example:

```
tempest.test_plugins =
    heat_tests = heat_integrationtests.plugin:HeatTempestPlugin
```
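
To verify which plugins Tempest has discovered through these entry points, you can run (assuming a reasonably recent Tempest):

$ tempest list-plugins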

In RDO OpenStack service RPM packages, in-tree Tempest plugins are shipped in the openstack-{service}-tests subpackage, but the Tempest plugin entry point is installed by the main openstack-{service} package.
So if you have a working OpenStack environment with Tempest installed but without the -tests subpackages, running Tempest commands fails with errors such as “No module named heat_integrationtests.plugin”, and you end up installing a large number of packages to work around it. The root cause is that the Tempest plugin entry point is installed by the main OpenStack package while the files it points to are not present.

To fix this, we decided to separate the Tempest plugin entry point from the main package and move it into the openstack-{service}-tests subpackage during the rpmbuild process, by creating a fake Tempest plugin entry point for each RDO service package. Since this is a large but uniform change affecting all OpenStack service packages, I created the %py2_entrypoint macro, which is available starting with the OpenStack Ocata release.
Here is the macro definition of %py2_entrypoint:
```
# Create a fake tempest plugin entry point which will
# reside under %{python2_sitelib}/%{service}_tests.egg-info.
# Usage: %py2_entrypoint %{modulename} %{service}
# where service is the name of the openstack service and
# modulename is the Python module name.
# It should be used in the %install section, and the generated
# %{python2_sitelib}/%{service}_tests.egg-info must go in the
# %files section of the tempest plugin subpackage.
# In most cases %{service} is the same as %{modulename},
# but for neutron plugins they differ: the service name is
# neutron-lbaas while the module name is neutron_lbaas.
%py2_entrypoint() \
egg_path=%{buildroot}%{python2_sitelib}/%{1}-*.egg-info \
tempest_egg_path=%{buildroot}%{python2_sitelib}/%{1}_tests.egg-info \
mkdir $tempest_egg_path \
grep "tempest\|Tempest" %{1}.egg-info/entry_points.txt >$tempest_egg_path/entry_points.txt \
sed -i "/tempest\|Tempest/d" $egg_path/entry_points.txt \
cp -r $egg_path/PKG-INFO $tempest_egg_path \
sed -i "s/%{2}/%{1}_tests/g" $tempest_egg_path/PKG-INFO \
%nil
```
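A hypothetical spec-file fragment showing how the macro is consumed; the %{modulename} and %{service} definitions and the subpackage name are illustrative, not copied from a real RDO spec:

```
%install
%{__python2} setup.py install --skip-build --root %{buildroot}
# Move the tempest plugin entry point out of the main egg-info
# into %{modulename}_tests.egg-info
%py2_entrypoint %{modulename} %{service}

%files -n openstack-%{service}-tests
%{python2_sitelib}/%{modulename}_tests.egg-info
```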
Here is the list of tempest-plugin-entrypoint reviews.

Some lessons learned from the macro above:

1. You can write macros in shell script or in Lua.

2. %define defines a macro inside a spec file; alternatively, you can place the macro directly in /usr/lib/rpm/macros.d/macros.openstack-rdo so that it is picked up during the rpmbuild process.

3. Use %nil to mark the end of the macro.

4. Use %{1} to %{6} to consume the arguments passed to a macro (see the sketch after this list).
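
A quick way to experiment with parametrized macros without writing a spec file is rpm's --define option. A minimal sketch, where the greet macro is made up for illustration:

$ rpm --define 'greet() Hello %{1}!' --eval '%{greet world}'
Hello world!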

The above is a temporary solution. We are working upstream to split the Tempest plugins out of the OpenStack projects into new repos for easier management and packaging in the Pike release: https://review.openstack.org/#/c/405416/.

Thanks to Daniel, Alan, Haikel, and many others on the RDO channel for getting the work done.
It was a great learning experience.
Source: RDO

Why analytics is a key tool for retail

Success in retail is contingent on offering the customer the right product in the right environment. Matching the two can be hard, since no two customers are the same, and external forces as diverse as catwalk trends and the weather affect demand and purchasing decisions.
In the digital age, consumers interact with stores and service providers through multiple channels: in-store, online and mobile, as well as via phone and social media. At each point, retailers gain insight into individual customer behaviors through data. That data creates a rich picture of an individual’s purchasing habits. Tapping into this rich data source is becoming increasingly key to predicting future customer behavior, driving loyalty and increasing revenues.
Predictive analytics software, such as IBM Predictive Customer Analytics, can take customer data from multiple cloud or on-premises data sources to create truly personalized interactions, thereby boosting purchases, wallet share and loyalty. The role of IBM Predictive Customer Analytics is to take customer data, apply predictive analytics and deliver the best action to front-line systems so retail businesses can use the data to deliver an exceptional customer experience.
For example, take a customer who has been using a department store’s mobile app to search for high-end TVs. Say the customer also has a loyalty card and has historically reacted positively to double-points offers. A few days later, the customer walks into one of the company’s stores, and GPS alerts the company’s marketing system to her presence. Seeded by predictive customer analytics, the marketing system uses the customer’s search history and preference for double-points offers to send a personalized offer while the customer is in the store. This helps the customer make a buying decision and increases overall satisfaction and loyalty. The store is more likely to make a sale.
It’s also important to recognize changes in buying patterns and taste, or a groundswell for the next big retail phenomenon. Analysis of social media can give early insights into such changes or trends, but noticing patterns in such a vast array of unstructured data is difficult without help. IBM Social Media Insight for Retailing is a cloud solution that helps merchandisers by analyzing a range of social media sources and internal data.
For example, a cycling retailer may use the software to monitor cycling-related topics and notice that cycling clothing for children is an increasingly popular theme. The merchandiser then looks for social media insights about its own children’s cycle clothing range and discovers that customers feel let down by a limited selection. This may prompt the retailer to launch a range of cycling clothing aimed at children, backed by a social marketing campaign, with the knowledge that there is consumer demand. Without powerful analytics tools looking at social media, this market opportunity might otherwise be missed, especially if previous sales were low.
The ability to analyze and understand customer behaviors and market trends is key for success in highly competitive retail markets.
IBM offers a range of analytics tools which help retail companies gain insights from customer behavior and retail data so they can make smart merchandising decisions that boost revenue growth.
Learn more about IBM Cloud retail solutions.
Source: Thoughts on Cloud