Machine learning: A key weapon in the cybersecurity battle

Since the dawn of the internet, companies have been fighting to stay ahead of cybercriminals. Artificial intelligence (AI) and machine learning have made this job easier by automating complex processes for detecting attacks and reacting to breaches. However, cybercriminals are also using this technology for their own malicious purposes.
More and more hackers are exploiting cognitive technologies to spy, take control of Internet of Things (IoT) devices and carry out malicious activities. CSO magazine called 2018 “the year of the AI-powered cyberattack”. For example, smart malware bots are now using AI to collect data from thousands of breached devices and can learn from that information to make future attacks more difficult to prevent and detect.
As hackers weaponize AI, cybersecurity professionals must fight fire with fire by using cognitive technology to identify and prevent attacks.
Sophisticated phishing at scale
Neural networks, modeled after the human brain, can be used to automate “spear phishing”, the creation of phishing emails or tweets that are highly personal and target specific users. According to research presented at Black Hat, automated spear phishing achieved a 30 to 66 percent success rate, which is 5 to 14 percentage points higher than large-scale traditional phishing campaigns and comparable to manual spear phishing campaigns.
Automation enables attackers to run spear phishing campaigns at an alarmingly large scale. However, companies are using the capabilities of AI as a countermeasure.
According to a recent Ponemon study, 52 percent of companies are looking to add in-house AI talent to help them boost their cybersecurity efforts, and 60 percent said AI could provide deeper security than purely human efforts. That’s why new security solutions such as IBM QRadar use machine learning to automate the threat detection process, helping cyber incident investigation and response efforts get started as much as 50 times faster than before.
CAPTCHA and authentication concerns
Another area in which AI tools are already helping cybercriminals do their dirty work is in breaking complex codes, whether it’s CAPTCHA or usernames and passwords. Using processes such as optical character recognition, the software can identify and learn from millions of images, eventually gaining the ability to recognize and solve a CAPTCHA. Similarly, hackers are applying the same optical character recognition combined with the ability to automate login requests to test stolen usernames and passwords across multiple sites.
Fighting back against such large-scale attacks requires leaning on these same AI technologies. One way to do this is to use learning-enabled technology to understand what is normal for a system, then flag unusual incidents for human review. Security professionals need AI-based monitoring solutions to provide automated help and identify which alerts pose a real and immediate risk.
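The “learn what’s normal, then flag the unusual” idea can be illustrated with a deliberately minimal sketch. A real monitoring solution uses far richer models; the z-score approach, 3-sigma threshold and login counts below are illustrative assumptions only:

```python
import statistics

def flag_anomalies(baseline, observed, threshold=3.0):
    """Flag observations that deviate sharply from a learned baseline.

    baseline: historical event counts considered "normal" for the system
    observed: new event counts to score
    Returns observations whose z-score against the baseline exceeds threshold.
    """
    mean = statistics.mean(baseline)
    stdev = statistics.pstdev(baseline)
    return [x for x in observed if stdev and abs(x - mean) / stdev > threshold]

# Typical login volumes per hour, then a credential-stuffing-style spike
normal_hours = [40, 42, 38, 41, 39, 43, 40, 41]
incoming = [41, 39, 400, 42]
print(flag_anomalies(normal_hours, incoming))  # → [400]
```

The flagged spike would then be routed to a human analyst for review, which is exactly the division of labor described above: automation filters, people decide.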
Malware
Smart malware, which “learns” how to become less detectable, is also posing a significant threat. Defeating normal malware is typically done by “capturing” the malware and reverse engineering it to figure out how it works. However, in smart malware, it is more difficult to analyze how the neural network makes decisions on who to attack.
While reverse-engineering smart malware remains challenging, neural networks have been successful at recognizing malicious domains created by a domain generation algorithm (DGA), which creates pseudo-random domain names. A smart DGA keeps changing to stay ahead of attempts to thwart it, but, likewise, a smart neural network will continue to learn the strategies deployed by hackers and how to defeat them.
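A production DGA detector would use a trained neural network as described above. Purely as an illustrative stand-in, the sketch below uses the Shannon entropy of a domain label as a crude signal that a name was pseudo-randomly generated; the 3.5-bit threshold and example domains are assumptions, not a real classifier:

```python
import math
from collections import Counter

def shannon_entropy(s):
    """Bits of entropy per character in the string s."""
    counts = Counter(s)
    total = len(s)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def looks_generated(domain, threshold=3.5):
    # Pseudo-random DGA labels tend to have high character entropy;
    # dictionary-based brand names score much lower.
    label = domain.split(".")[0]
    return shannon_entropy(label) > threshold

print(looks_generated("ibm.com"))                   # → False
print(looks_generated("xk2j9qv7mzp4w8rt1c6b.com"))  # → True
```

A smart DGA can defeat a static heuristic like this one, which is precisely why the article argues for learning systems that keep adapting alongside the attacker.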
Fight security threats before they happen
One of the most powerful aspects of security enabled by AI and machine learning is the ability to uncover patterns and learn from unstructured data. As a result, these tools can provide security professionals with the means to combat attacks, as well as insights into emerging threats and recommendations on how to defend against impending incidents. Additionally, machine learning can help locate vulnerabilities that may be difficult for human security teams to find.
Cybercriminals are already using AI to launch larger-scale, more sophisticated attacks. Here’s the good news: companies can fight back by using these same technologies. If your organization has been considering implementing AI but hasn’t yet put a plan in place, the time is now, and the business case has arrived. Cognitive technologies such as neural networks and automated security monitoring solutions can help bring your business’s defenses into the cyber age and give you the most cutting-edge weapons to defend against emerging threats.
Discover the ways that IBM Cloud Private for Data can enable security by supporting the development and deployment of AI and machine learning capabilities.
The post Machine learning: A key weapon in the cybersecurity battle appeared first on Cloud computing news.
Source: Thoughts on Cloud

The Masters drives innovation with hybrid cloud and AI

The Masters golf tournament may be traditional on the course, but it’s driving innovation with its infrastructure and online fan experiences behind the scenes.
“The Masters is surprisingly modern when it comes to its infrastructure in that it operates a hybrid cloud strategy,” reports Forbes. “For the vast majority of the year its technology needs are fairly modest, but during tournament week, the Masters official applications and website experience a huge spike in demand from millions of fans around the world.”
The solution is the hybrid cloud. IBM has partnered with the Augusta National for more than 20 years to shape the Masters digital experience and engage patrons online. With the IBM Cloud as the digital foundation of the Masters, IBM brings a hybrid cloud environment that allows the Masters to quickly scale and manage various sources of data across multiple locations and use artificial intelligence (AI) to enhance fan engagement.
Three-minute player highlight reels
New this year, “Round in Three Minutes” lets fans at home and away from the fairway experience highlights from each player. The online experience uses IBM Watson on the IBM Cloud to rate and curate a three-minute highlight reel for each player’s round. The feature in the Masters app evaluates the excitement level of each moment, including facial expressions, gestures and the roar of the crowd.
“For the first time ever in golf, we will capture virtually every shot of every player during every competitive round,” said Augusta National Golf Club Chairman Fred Ridley, as reported by GOLF.
IBM analyzed approximately 4,000 shots at the 2018 Masters and estimates it will track 5,000 holes and around 20,000 shots this year, shares SportTechie.
The Masters and IBM
Beyond the beauty of the golf greens, the powerful IBM Cloud hybrid environment, game-changing AI and enterprise-grade security are helping the Masters to scale, innovate and deliver for fans.

 
Learn more about the IBM technology enabling fan experiences at the Masters.
The post The Masters drives innovation with hybrid cloud and AI appeared first on Cloud computing news.
Source: Thoughts on Cloud

Medical Confidence directory on IBM Cloud helps speed up Canadian healthcare

According to the Fraser Institute, waiting for treatment has become a defining characteristic of Canadian healthcare.
In the 2018 report, “Waiting Your Turn: Wait Times for Health Care in Canada”, specialist physicians surveyed report a median wait time of almost 20 weeks between referral from a general practitioner and receipt of treatment, more than twice as long as in 1993, when it was just over nine weeks.
Canadians have some of the longest wait times for specialist appointments in the world. This means rising costs and delayed recovery.
Medical Confidence is using the IBM Cloud ecosystem to offer a service that empowers individuals to overcome the obstacles and delays commonly experienced in the Canadian healthcare system.
Healthcare in Canada
The Canadian public healthcare system is fraught with challenges. Multiple siloed systems are hard for patients to navigate. Where previously general practitioners and specialists would work together in a hospital setting, today they practice in separate offices, making communication and collaboration difficult. General practitioners tend to refer patients to the same group of specialists without necessarily taking into account the specialist’s area of expertise or wait times for consultation and treatment. For example, they might refer someone who needs hip surgery to a surgeon who’s a shoulder specialist.
A patient could wait eight months or more only to find out that they didn’t complete the required testing in advance of the appointment or that the specialist really can’t treat their condition. Then they have to make a new appointment, or they’re referred to a specialist for a second time and the waiting period begins all over again. General practitioners often do not have the resources or time to provide the support patients need to navigate the system, so patients are left on their own.
A healthcare navigator
Medical Confidence acts as a patient’s healthcare navigator. The company created a directory with information derived from a large number of sources. The directory is constantly updated and currently includes close to 14,000 specialists. It can be sorted based on specialization, sub-specialization, gender, certifications, languages spoken and many other criteria.
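A directory sortable on many criteria amounts to a predicate match over specialist records. The sketch below is purely hypothetical: the field names, entries and `search_directory` helper are invented for illustration and are not Medical Confidence’s actual schema or system:

```python
def search_directory(specialists, **criteria):
    """Return specialists matching every supplied criterion."""
    def matches(s):
        return all(s.get(k) == v for k, v in criteria.items())
    return [s for s in specialists if matches(s)]

# Invented sample records for illustration only
directory = [
    {"name": "Dr. A", "specialization": "orthopedics",
     "subspecialty": "hip", "language": "French"},
    {"name": "Dr. B", "specialization": "orthopedics",
     "subspecialty": "shoulder", "language": "English"},
]

hits = search_directory(directory, specialization="orthopedics", subspecialty="hip")
print([s["name"] for s in hits])  # → ['Dr. A']
```

The point of the example is the referral problem described earlier: matching on sub-specialization up front avoids sending a hip patient to a shoulder surgeon.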
 

 
Patients may access the Medical Confidence directory through their disability insurance, their employer or directly from the company. The first step is a medical assessment by a nurse. Next is a search in the Medical Confidence system to identify the most appropriate candidate specialist or specialists. The nurse will guide the patient through the process of getting a general practitioner referral, ensuring it is received by the specialist. Then the appointment is scheduled, and the nurse ensures the necessary diagnostics can be ordered and the results are in the hands of the specialist for the first appointment.
Ultimately, patients arrive well prepared for their appointments. Afterward, the nurse gets the clinical notes and reviews those with the patient to make sure the patient understands the recommendations and next steps.
Medical Confidence and IBM
Medical Confidence uses a variety of best-of-breed tools in its proprietary system, which runs on the IBM Cloud.
One of the reasons we chose IBM Cloud is the ecosystem that’s available to us. We can use both IBM products and open source products, which offers flexibility and room to grow.
Medical Confidence started with a focus and strong expertise in big data, integrating IBM Watson and predictive analytics with evidence-based health measurements. Members of the team have also worked closely with Canadian universities and colleges to pilot new healthcare navigation prototypes that were later updated to be production ready. Medical Confidence’s patent-pending algorithms are based on the largest Canadian physician specialist network. The company uses these analyses to optimize and streamline the selection of candidate physician specialists from its network, whose profiles are shared with the patient and their general practitioner.
Then there’s security. We looked at quite a few cloud providers and found IBM security to be one of the better ones. All of our medical data must reside in Canada, and IBM has cloud data centers located in Toronto and Montreal that provide us with the security we require, as well as direct recovery capabilities between the two locations.
Also, the IBM sales and marketing team understands and responds to our stringent requirements.
Benefits to the whole system
Patients find that having their own personal coach is a big asset. Since they are seeing the correct specialist from the start and are more actively engaged in their treatment, many are recovering sooner.
We get great feedback from general practitioners and specialists, too. General practitioners like that we find appropriate specialists with reasonable wait times, draft referrals, keep an audit trail, and encourage collaboration between them and their patients.
Specialists are now receiving referrals that they know fall within their area of practice and can be confident their patients are well prepared for appointments. Organizations benefit from reduced health benefits costs and lower absenteeism. Employees who are engaged and more productive improve a company’s bottom line.
One Medical Confidence client shared that they are, on average, saving six months in the duration of a disability claim.
Even the Canadian public healthcare system is benefiting, and ultimately the taxpayer who funds it, because we’re driving out the inefficiencies and inherent delays.
Read the case study for more details.
The post Medical Confidence directory on IBM Cloud helps speed up Canadian healthcare appeared first on Cloud computing news.
Source: Thoughts on Cloud

Clarifying cloud personalization: 4 common misconceptions

You wouldn’t purchase a car without some forethought, and you shouldn’t rush into any cloud environment without careful consideration. Your organization undoubtedly has widely varying IT needs, so relying on a one-size-fits-all cloud computing solution would be similar to driving off the lot with a sports car when you really need an SUV.
Cloud personalization can help identify an organization’s unique needs for development, security and IT management. You can work closely with a provider to personalize your cloud environment to help ensure these needs are met, so you can deliver products and services more effectively.
Amid the long-brewing debate over the advantages of customized versus standardized environments, some falsehoods have, pardon the pun, clouded the picture.
Here are four myths about cloud personalization and some explanation to help clear the haze.
Myth 1: It’s expensive.
This line of thought posits that, because a custom solution calls for a level of specialized attention not seen with a standard cloud offering, all personalized clouds must be expensive. That’s not necessarily true.
If a business must suddenly expand its storage to take on a new project, a standard cloud solution could actually prove more costly. Standardized cloud services sell storage in terabyte units, but storage needs can change rapidly as companies grow and mature. In a recent survey from Enterprise Storage Forum, more than half of respondents said their storage needs have grown by 1 to 99 terabytes in the last two years.
In a flat-priced model, customers have to pay for a certain number of units even if they don’t use them all. A custom cloud provider, however, understands specific storage needs and can charge according to actual use, which helps organizations save money in the long run.
Myth 2: Management is out of your hands.
Another common, misguided belief is that the workings of a custom cloud can’t be much different from a standard one. Surely, the thinking goes, a provider will want to control oversight entirely.
The truth is that it depends. A provider will let an organization determine how much of a hand it has in managing a personalized cloud. The level of involvement depends on strategy. An organization can outsource just about every phase of management, or it can oversee certain access and operational aspects. This relationship is defined in the beginning, when the organization and the provider determine who will be responsible for the initial set up and configuration. The agreement can be changed at any time.
Myth 3: What you see is what you get.
Any given organization’s architecture configuration and scalability requirements will differ from other organizations, even within the same industry. But there’s no way a provider can meet the many specific configuration needs of its many clients and also be able to scale on short notice, right?
They can. Providers typically don’t want to pigeonhole their clients and can indeed be flexible with infrastructure, whether the organization relies on established systems or operates through applications and mobile technology.
Myth 4: Security is the same for all users.
This myth also boils down to a general misunderstanding that all types of cloud are essentially the same. Many observers assume that if a standard cloud offers one level of security to clients, a personalized cloud can’t be any different. A user surely can’t request security configuration changes midstream.
But security expectations absolutely can be changed in a customized cloud. If an organization suddenly must account for Payment Card Industry Data Security Standard (PCI DSS) compliance, for instance, a cloud provider can increase security configurations in relatively little time. The provider can likewise open the window of operations to auditors as necessary to review compliance.
The Cybersecurity Insiders 2018 Cloud Security Report found that IT professionals’ top cloud security concern was misconfiguration of platforms. Cloud personalization can help overcome this worry. It enables IT teams to work closely with vendors to ensure their organization’s specific compliance, security and workflow needs are met.
An understanding of your IT, flexibility and specialized attention are all hallmarks of cloud personalization. How else would the cloud be personalized if a provider can’t meet your unique business needs?
Discover other top myths surrounding cloud development in the IBM DevOps playbook.
 
The post Clarifying cloud personalization: 4 common misconceptions appeared first on Cloud computing news.
Source: Thoughts on Cloud

Why the multitenant cloud still matters

Multitenant cloud architectures have fallen out of favor with some enterprises. With the low cost of cloud computing, why bother sharing resources with someone else when you can reserve your own?
But there are many reasons to go multitenant, provided you use it for the right workload.
Shared resource benefits
A multitenant cloud is like an apartment building. Each tenant has its own secure space, while all tenants share common resources, such as power, maintenance and other services. Each tenant has the privacy to do as they please, while basic operational and infrastructure costs are shared by all.
A single-tenant environment, meanwhile, is more like an office building that is fully leased by one company. All spaces (instances) are in the same place, so there can be a high degree of interaction between them, but the overall cost is higher because the entire ecosystem is paid for by a single owner.
One of the major knocks against multitenancy is its perceived lack of security. How can you trust the sanctity of your own workload with all of these unknown actors sharing your cloud? This is something of a misunderstanding. Much like in an apartment building, all tenants in the cloud have the key to their own instance. If someone decodes your key, they can get to your data. But this is true whether you are a single tenant, one of many, or on a public, private or hybrid cloud. What is unlikely to happen is that someone accesses your data by hacking another tenant’s instance.
Also, as noted recently by New Relic, cloud providers have a vested interest in maintaining the highest security standards, particularly across a shared infrastructure. In a single-tenant or private deployment, the primary responsibility for security falls to the client.
Resource consumption considerations
Multitenancy requires active management of resource consumption. The cloud is highly scalable, but it cannot adjust workloads in real time.
If a few tenants in your cloud suddenly experience a dramatic spike in traffic, as in an e-commerce scenario, you may temporarily see some degradation of service as the cloud attempts to rebalance the load. This is what many cloud experts refer to as the “noisy neighbor” scenario.
But exactly what types of workloads should be hosted on multitenant cloud solutions, and what should remain single? This is a very nuanced question. Generally speaking, the more important functions are likely to be more suited to single tenant, while a multitenant cloud solution will be reserved for larger, less critical applications.
Single-tenant solutions tend to be more stable and predictable than multitenant cloud solutions. As a result, many organizations use them in support of infrastructure as a service (IaaS) and platform as a service (PaaS) deployments. Since these architectures tend to serve multiple functions, usually over a lengthy period of time, they are more amenable to deep integration with established environments in a private cloud setting.
SaaS solutions advantages
When it comes to software as a service (SaaS), however, multitenancy has many advantages. Digital Guardian notes that, besides the cost advantages, multitenant cloud solutions can be configured in a wide variety of ways so that organizations can customize a given application’s performance without having to alter any of its code. This also makes it easier to maintain the app and implement upgrades on an ongoing basis since the vendor doesn’t have to deal with multiple iterations of its product.
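The idea of customizing a shared application per tenant through configuration rather than code can be sketched as a defaults-plus-overrides lookup. The tenant names and settings below are invented for illustration; real SaaS platforms layer this far more elaborately:

```python
# One shared codebase: default behavior for every tenant,
# with per-tenant differences captured as data, not code changes.
DEFAULTS = {"theme": "standard", "retention_days": 30, "sso": False}

TENANT_OVERRIDES = {
    "acme":   {"retention_days": 365, "sso": True},
    "globex": {"theme": "dark"},
}

def config_for(tenant):
    """Merge a tenant's overrides onto the shared defaults."""
    cfg = dict(DEFAULTS)
    cfg.update(TENANT_OVERRIDES.get(tenant, {}))
    return cfg

print(config_for("acme")["retention_days"])  # → 365
print(config_for("unknown")["theme"])        # → standard
```

Because every tenant runs the same code, the vendor upgrades one deployment and every tenant gets it, which is the maintenance advantage described above.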
For the cloud provider, offering SaaS products on multitenant architectures makes even more sense, according to ITProPortal. For one thing, it’s easier to bring on new users when everyone is using the same version of the software. In many cases, this process can be fully automated, right down to domain and subdomain configuration. Setting the default data and configuring the application can also be automated. At the same time, the multitenant cloud helps maximize resource consumption, giving the provider the highest ROI for its infrastructure investment.
Multitenancy is not appropriate for all use cases, but neither should it be ignored simply because it features shared infrastructure. In today’s environment, you have the luxury of selecting the right architecture for your workload, not the other way around.
Learn more about moving the right IT assets to a multitenant cloud architecture so you can start innovating sooner.
The post Why the multitenant cloud still matters appeared first on Cloud computing news.
Source: Thoughts on Cloud

What you need to know about compliance audits

Compliance audits are like doctor’s appointments. Nobody likes them, but virtually everybody needs them. Once your company gets beyond a certain size, it’s inevitable that you will be engaging in activities or collecting data that makes you subject to various regulations, whether it’s a hospital subject to HIPAA or a company collecting emails subject to the GDPR.
And eventually, you’re going to have to prove that you’re following those regulations successfully. That’s where the compliance audit comes in.
What is a compliance audit?
A compliance audit is quite literally an audit to see how closely you’re following the rules and regulations to which your company is subject, but it’s also more than that.  It’s about making sure you follow YOUR OWN rules.
Many companies think that all they need is to follow “best practices”, but that’s a fallacy, and for more than one reason.
First, where best practices DO exist, they are the absolute minimum that must be done to be effective. They’re essentially an excuse to stop thinking about how to solve a problem. And as the absolute minimum, there’s one group who absolutely LOVES them: hackers. They know what these “best practices” are, and they’ve had decades to learn how to get around them.
But there’s another important reason to go beyond this notion of “best practices”, and that’s that they simply do not exist. The technology world moves fast and any activity that’s been around long enough to be considered a “best practice” has been around long enough to be outdated.
In a world where data is money and the average data breach costs $3.6 million, a compliance audit is meant to ensure that you are following all of the security and legal controls necessary for your business, and not just blindly playing it by ear and hoping for the best.
How to do a compliance audit
Whether you hire a vendor or decide to do a compliance audit yourself, the process is essentially the same.
Step 1:  Determine what you’re trying to accomplish
The first thing you need to ask yourself is the simplest:  Why are you doing this? Do you have an audit due? Have you been compromised?
What keeps you up at night?
Ultimately you will be judged on your adherence to your particular regulatory scheme. In some cases you can choose a scheme to which you want to prove you’re being held, such as NIST or FedRAMP. In others, your line of business will dictate that for you, such as HIPAA for medical institutions, PCI for companies that accept credit cards, or GDPR for companies storing personally identifiable information.
When making your decision, make sure that you are being realistic. It may sound like a great idea to shoot for the ultra-secure FedRAMP High, but do you really want to spend a year and a million or so dollars to do that when you’re not actually providing a product to the United States Federal Government?
Step 2:  Decide what needs to be done
The next step is to determine the roadmap of your audit. How you proceed from here depends on whether you’re doing the audit yourself or hiring an outside vendor.
If you’re hiring an outside vendor, they will most likely provide you with a questionnaire that will enable them to get started without wasting time in your first meetings.  
If you’re performing the audit yourself (perhaps to ensure you’ll pass the third-party audit), you’ll likely download the information detailing what you’ll need to check. For example, NIST compliance requires you to satisfy 600-700 different security controls. FedRAMP Moderate consists of 325 controls in 16 categories and 8 major areas.
Step 3:  Establish appropriate permissions
The whole point of this exercise is making sure that your systems are secure, so presumably the auditors will need permission to access various areas of your infrastructure, such as the network, servers, and so on. Make sure to establish these permissions in such a way that they can be removed later, when the audit is over.
Step 4:  Perform the actual assessment
This, of course, is the meat of the process, where auditors document information such as:

How many nodes do you have?
What is the networking situation?
What about antivirus protection? How is it kept up-to-date?

Auditors should also look at process, asking questions such as:

Do you have an incident response plan? Is it up-to-date?
Do you store event logs? Do you go through those stored logs?

After answering these questions, you’ll get hit with one of the most important:

Can you prove it?

Having a procedure in place to review event logs for anomalies is useless unless you can show that your team does actually review event logs for anomalies.
It’s in this “proof” step that companies most often fail a compliance audit.
Step 5:  Develop the gap analysis between what should be and what actually is
The whole point of doing a compliance audit is to identify places where you’re falling short and document them so you can correct the problem. At the end of this process, you should have a full gap analysis report, as well as one other crucial piece of information: the remediation plan.
A gap analysis that tells you you have problems but doesn’t provide the means for correcting them is only half the story.
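Conceptually, the gap analysis reduces to comparing three sets: what the framework requires, what you have implemented, and what you can prove. A minimal sketch follows; the control IDs borrow NIST SP 800-53 naming (AC-2, AU-6, IR-8), while the `gap_analysis` helper and status labels are invented for illustration:

```python
def gap_analysis(required, implemented, evidence):
    """Compare required controls against what is implemented AND provable."""
    report = {}
    for control in required:
        if control not in implemented:
            report[control] = "not implemented"
        elif control not in evidence:
            # The "can you prove it?" failure mode described above
            report[control] = "implemented but no proof"
    return report

required = ["AC-2", "AU-6", "IR-8"]   # account mgmt, audit review, IR plan
implemented = {"AC-2", "AU-6"}
evidence = {"AC-2"}                   # only AC-2 has documented proof

print(gap_analysis(required, implemented, evidence))
```

Each entry in the resulting report should then map to an item in the remediation plan, so the gap analysis tells the whole story rather than half of it.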
What about hiring a compliance auditor?
While you certainly could perform your own compliance audit, it’s usually not in your best interests to do so, for several reasons:

Most companies don’t have compliance experts on staff
Staff members who take on this burden are operating from an “insider” perspective and are likely to just assume things are being done properly without digging deeper
The compliance auditor is always the most hated person in the room
While you don’t need to have a third party perform an audit, if you want the audit to be taken seriously — for example, if you’re trying to prove to your board that you need money for remediation — it’s better to have a third-party audit.

If you’re hiring a vendor to perform your compliance audit, make sure that their goal is to understand what you’re doing. A good auditor will establish a relationship with you to help you meet your goals, not just take your money and tell you what you did wrong.
If you’d like to learn more about performing a compliance audit for your company, be sure to Contact Us for more information.
The post What you need to know about compliance audits appeared first on Mirantis | Pure Play Open Cloud.
Source: Mirantis

New IBM Aspera updates help media and entertainment companies push boundaries

Across the world today, there’s no denying that the technologies underlying the media and entertainment industry are undergoing some massive transformations.
The availability and volume of video content is expanding exponentially as the line between streaming and file-based content blurs and multicloud becomes the reality of modern IT infrastructures.
For more than 15 years, IBM Aspera has pioneered the data transport technologies supporting this transformation. This year at NAB Show, Aspera will continue to push boundaries by showcasing innovations designed to help clients continue to succeed in an evolving media landscape.
IBM Aspera highlights updates at NAB Show 2019
Here are three key updates to look for at NAB Show 2019:
1. New capabilities for cloud-based workflows.
Organizations across media and entertainment are migrating to the cloud for increased scalability and flexibility.
To provide the additional scale and efficiency that clients need for high-volume cloud-based workflows, IBM is adding new automation functionality to Aspera on Cloud for early access customers. An easy-to-use graphical workflow designer tool enables users to quickly build and configure event-driven transfer workflows. When combined with new Aspera on Cloud analytics features, these automation capabilities will serve as a powerful solution for managing cloud-based workflows.
2. Expanded streaming accessibility.
At NAB Show, Aspera will continue to push the boundaries of high-quality live and near-live video streaming.
A new beta of Aspera Streaming for Video will demonstrate capabilities that enable full bi-directional communication and flexible substitution for TCP (Transmission Control Protocol) across an even wider variety of deployment environments to support additional streaming use cases.
The team will also show off an easy-to-use web application for Aspera Streaming for Video that provides auto-discovery and full visibility of devices.
3. Enhanced performance.
In addition to the new product features, IBM Aspera is also continuing to innovate on its core technology. We recently added a new encryption module built around industry-standard OpenSSL AES-GCM encryption to the Aspera High-Speed Transfer platform. For compute-intensive operations, this update can improve encrypted transfer rates by up to 200 percent, while also significantly reducing CPU load.
Aspera multi-session transfer technology overcomes the inherent network speed limitations that organizations often encounter in cloud environments. By using parallel transfer processes to move very large files, Aspera can achieve multi-Gbps (billions of bits per second) transfer speeds to and from all of the leading cloud providers, including IBM, AWS, Azure and Google.
These improvements have brought impressive results. In a recent independent performance analysis of IBM Cloud Object Storage with Aspera multi-session transfer and Amazon S3 with Transfer Acceleration, the IBM technology completed a 20GB file transfer up to 12 times faster.

Connect with IBM Aspera
NAB Show 2019 will clearly be a big event for IBM Aspera partners and clients. The team will continue to push boundaries and bring new capabilities to organizations around the world.
Are you attending NAB Show 2019 in Las Vegas? Be sure to stop by the IBM Aspera booth for a technical demonstration or one of the many daily presentations in the booth theater.
If you aren’t able to attend the show, schedule a meeting with IBM Aspera to learn more.
The post New IBM Aspera updates help media and entertainment companies push boundaries appeared first on Cloud computing news.
Source: Thoughts on Cloud

Kubernetes Operator Hands-on Workshop at Red Hat Summit on May 6th Announced

For those of you who have been following the rise of Operators across the Kubernetes ecosystem and were wondering what all the excitement was about, here's your chance to get up to speed quickly and get some hands-on experience building them. This day-long hands-on workshop is co-located with Red Hat Summit in Boston, […]
The post Kubernetes Operator Hands-on Workshop at Red Hat Summit on May 6th Announced appeared first on Red Hat OpenShift Blog.
Source: OpenShift

Installing OpenShift 4 from Start to Finish

You have probably heard about all the great engineering work going on to get the next release of OpenShift 4 ready for prime time. OpenShift 4 marks an incredible advancement for enterprise Kubernetes, as it includes great new features such as over-the-air updates and integration with the Operator Hub. One of […]
The post Installing OpenShift 4 from Start to Finish appeared first on Red Hat OpenShift Blog.
Source: OpenShift

Transforming government with cloud and AI

This may be the best time to work in — and with — government.
Given the daunting societal challenges we face, from extreme weather and aging infrastructure to cyber threats and economic development, working in government means the chance to do something about them.
With the emergence of cloud, artificial intelligence (AI), Internet of Things (IoT), blockchain and other new technologies, the ability for leaders and change-makers in government agencies to directly impact public challenges has never been greater.
Writing the next chapter in government cloud adoption
Over the last 10 years, IBM has worked with agencies at all levels of government to make tremendous strides in applying cloud, AI, blockchain and other new technologies.
Cloud and AI, though, are still in their infancy. According to an IBM-commissioned study by McKinsey & Company, less than 20 percent of enterprise workloads are utilizing cloud technology.
We’re entering what we call “chapter 2” of the cloud and AI era. Cloud should no longer be thought of as a “location.” Rather, it’s a set of capabilities that spans all of an organization’s environments and, when done right, can help lower costs and deliver innovation and flexibility.
Importantly, “chapter 2” is about bringing cloud, AI, IoT, blockchain and other digital technologies to complex, mission-critical applications. It’s about using all your data and managing applications in a multicloud environment.
Introducing the IBM Government Cloud Virtual Summit
To help your agency succeed in this next chapter, we’ve created the Government Cloud Virtual Summit. Whether your agency is just getting started with cloud and AI or well along, this free, online conference will help you accelerate.
At the Summit you’ll hear government and industry leaders talk about how they’re using new technologies to address a range of mission issues, including rebuilding trust in the aftermath of a cybersecurity breach, responding to natural disasters and improving public safety.
We’ll also look at technology challenges that impact almost every agency, such as finding and attracting talent, adopting modern development practices like agile and DevOps, modernizing legacy applications, securing your cloud environments, managing cloud sprawl and so much more.
This Government Cloud Summit is for leaders and change-makers who want to adopt cloud, AI and other digital technologies faster, do it right, learn where to start and positively impact their mission.
Register now. We can’t wait to see you.
The post Transforming government with cloud and AI appeared first on Cloud computing news.
Source: Thoughts on Cloud