New OpenShift on OpenStack Reference Architecture

Large IT organizations are increasingly looking to develop innovative software applications in hybrid and multi-cloud architectures. Many of these applications have to be developed and deployed in an on-premises private cloud for various reasons (e.g., security and compliance, data affinity, performance). This private cloud should be simple, agile, flexible, secure, cost efficient, […]
Source: OpenShift

Helping move healthcare organizations to Azure

Today’s healthcare organizations are expected to be agile, reduce costs, and direct capital toward revenue-generating activities that improve patient outcomes. The cloud is a key part of the answer, but implementing a new solution in the cloud also requires new skills, especially around governance, HIPAA compliance, and security practices. Many healthcare organizations look to an experienced partner to help them migrate solutions from on-premises to the cloud while building in the right set of structures to seamlessly handle known and future challenges.

The Azure platform offers a wealth of services for partners to enhance, extend, and build industry solutions. Here we describe how one Microsoft partner uses Azure to solve a unique problem.

Wanted: Governance and compliance expertise

For organizations that have moved to the cloud, a lack of governance and understanding of the way cloud services work can lead to wasted spending, unpredictable cloud service bills, and cloud vendor lock-in. The rapid growth of cloud infrastructure also creates a dizzying array of possibilities that can keep a team uncertain of the correct path and second-guessing its choices, which can cause delay and increase the risk of failure.

Healthcare CIOs increasingly rely on cloud platforms, but they run into new problems. Preventing these difficulties requires a staff fully enabled with the right skills for compliance, privacy, and security. Health IT professionals need guidance on how to move an on-premises healthcare infrastructure to a cloud platform and ensure that HIPAA compliance, policies, safeguards, and resources are in place.

Here are the major areas that require thought and planning:

Privacy and compliance concerns: Protecting patient data is a persistent concern, along with implementation uncertainty and risk. Concerns about HIPAA compliance and about integrating cloud and legacy systems are among the major obstacles that have kept healthcare IT on-premises.
Budget constraints and cost optimization: Cloud service bills are often highly detailed and complicated, making it difficult to determine which application, department, or resource is the source of a cost overrun.
Technical hurdles: Healthcare IT professionals may not have the skills or resources to leverage cloud services, for example to extend an on-premises datacenter to a hybrid cloud.
Training: Retaining and enabling IT staff is a key challenge, and education on any new solution is critical to success. Everyone should have easy-to-understand resources appropriate to their role, whether IT leader, administrator, developer, or database administrator.
Gaps in capabilities: Even with an on-premises solution, many organizations rely on specialized services from vendors. Planning should include those partners, as well as the specialized areas that the vendors don’t currently address.

Solution

Burwood Group is a Microsoft partner that specializes in moving healthcare organizations to Azure. If a client has a secure on-premises network, Burwood will build a secure cloud network and apply the same regulatory controls used for on-premises installations. They also educate technology teams on endpoint security and serverless security, with an emphasis on HIPAA compliance in the cloud.

The consulting firm offers extensive training. For example, through a one-day class, they provide the basic education needed for a successful implementation in Azure, with an emphasis on healthcare requirements in the cloud. This workshop includes hands-on lab exercises and is 100 percent focused on pertinent, practical, and actionable knowledge.

Benefits

Standardization: Nothing is left to guesswork; instead, consistency is instilled across the cloud team. Through education, Burwood introduces the healthcare datacenter in Azure.
Flexibility: IT teams may need to work with multiple cloud architectures for healthcare, as care is increasingly managed across settings with more interoperability across applications and business entities. Understanding cloud best practices builds expertise that is independent of any one application or vendor.
Control: When it comes to cloud governance for healthcare, organizations need to control cloud sprawl. As personnel enter or leave an organization, permissions must be carefully granted or revoked to prevent security breaches. Burwood provides education that answers questions such as: What is going into and out of Azure? Who has rights to resources in Azure?
Service catalog: Burwood seeks to keep users informed of new services through a service catalog. Users are instructed about the following:

Handling cloud service requests and change management.
Expanding the current service catalog with an emphasis on Azure for healthcare IT.
Potential items that users can request through the service catalog in Azure.

Indexing: All resources in the cloud must be tagged with metadata such as cost center and creation date (see the tagging sketch after this list).
IP awareness: Users are instructed to be careful with public IP address assignments, which can create vulnerabilities.
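
As a rough illustration of the tagging discipline above, resource tags can be applied from the Azure CLI; the resource group name and tag values here are hypothetical placeholders, not taken from the original post:

    # Hypothetical example: tag an existing resource group with governance
    # metadata so cloud spend can be traced back to a department.
    az group update \
        --name rg-cardiology-prod \
        --set tags.costCenter=cardiology tags.createdDate=2019-07-01

    # Later, find every resource carrying a given cost-center tag.
    az resource list --tag costCenter=cardiology --output table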

Services

The company is proficient in both healthcare and Azure technology. These are a few of the Azure services used to create custom solutions:

Azure portal
Azure Resource Manager
Azure role-based access control (RBAC), illustrated in the sketch after this list
Azure Active Directory
Azure Load Balancer
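
As one hedged example of how RBAC supports the control theme above, role assignments can be scripted so access is granted at the narrowest useful scope and cleanly revoked when someone leaves; the subscription, resource group, and user names below are placeholders:

    # Hypothetical example: grant an analyst read-only access to a single
    # resource group, rather than to the whole subscription.
    az role assignment create \
        --assignee analyst@contoso.com \
        --role Reader \
        --scope /subscriptions/<subscription-id>/resourceGroups/rg-cardiology-prod

    # Revoke the assignment when the person leaves the organization.
    az role assignment delete \
        --assignee analyst@contoso.com \
        --role Reader \
        --scope /subscriptions/<subscription-id>/resourceGroups/rg-cardiology-prod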

Next steps

To learn more about other industry solutions, go to the Azure for healthcare page. For details about consulting and the one-day Azure University for healthcare workshop, go to the Azure Marketplace listing for Burwood Group and select Contact me.
Source: Azure

Investing in Google infrastructure, investing in Nevada

Today we’re announcing new infrastructure investments in the state of Nevada: a new Google data center and Google Cloud region. These investments will expand our footprint in the southwestern U.S., creating more jobs in the area, improving connectivity and speed for users of Google services and Google Cloud customers, and ensuring that Nevada will become one of the world’s onramps to the internet.

Google infrastructure

Infrastructure is a key area of investment for us because it underpins all of the work that we do and supports all of our products. Data centers are the engines of the internet, and as demand for online content and cloud services continues to increase, our data centers are growing too. They support all of our products, including Search, Ads, Maps, YouTube, and Google Cloud. In total, we invested $47 billion in capex between 2016 and 2018, which includes investments in our infrastructure. We announced earlier this year that we’ll invest another $13 billion in the U.S. alone, including this investment in Nevada.

Globally, Google operates data centers in sixteen locations, and Google Cloud customers are served by 20 cloud regions and 61 availability zones around the world. Once complete, our new site in Nevada will be part of this worldwide network of data centers.

Economic growth and technology in the Silver State

The Las Vegas metro area is home to over two million people and a booming entertainment and gaming industry. Whether you’re a Gmail user, a global retailer, or one of the world’s largest entertainment corporations, fast access to online content and cloud services is critical to keeping your day running smoothly.

Caesars Entertainment Corporation is among the world’s largest hotel operators, with 40,000 rooms around the globe. The data analytics team at Caesars leverages Google Cloud’s BigQuery serverless data warehouse and TensorFlow machine learning framework to aggregate data and derive meaningful insights from it. With these insights, Caesars Entertainment has improved the results of its marketing and hospitality initiatives. “Caesars Entertainment selected Google Cloud because we depend on highly reliable performance as well as scalability for our data analytics initiatives,” said Gene Lee, SVP and Chief Analytics Officer for Caesars Entertainment. “The addition of a Google Cloud region in Las Vegas, combined with the sophisticated capabilities of BigQuery and TensorFlow, should enable Caesars to further differentiate the gaming, hospitality and entertainment experiences we are able to offer to individual guests.”

A new Google data center

At our groundbreaking event in Henderson today, we marked the start of construction on a new Google data center. Today’s celebration was attended by Senator Catherine Cortez Masto, Senator Jacky Rosen, Representative Susie Lee, and Governor Steve Sisolak, who spoke about how Google continues to invest in the state by bringing tech jobs to the area, giving local nonprofits access to over $1 million in funding, and providing additional support for small and large businesses in the state. When it comes online in 2020, the new data center will enhance our ability to provide the fastest and most reliable services for all our users and customers. We’re creating more jobs, serving more customers in the area, and creating economic opportunity by supporting local nonprofits.

A cloud for Nevada

When it launches, the new Google Cloud region in Las Vegas will give organizations in the western U.S., and those doing business in Nevada, faster access to Google Cloud Platform products and tools that will help supercharge their businesses. The region will have three availability zones and will support our portfolio of key GCP products, delivering simple, reliable, and secure infrastructure and lightning-fast data analytics and ML/AI capabilities.

And we aren’t stopping there: we’ll launch our Salt Lake City cloud region in early 2020, for a total of seven Google Cloud regions in the continental United States. These new regions will enable Google Cloud customers to distribute their workloads across up to four regions in the west (Los Angeles, Oregon, Salt Lake City, and Las Vegas), providing greater connectivity than ever before. Contact sales to learn more about cloud region availability and to get started on GCP today.

A new home in the Southwest

We believe it’s important to invest in the communities that we call home. In this spirit, today we announced the Google.org Impact Challenge Nevada, a $1,000,000 commitment to Silver State nonprofits with bold and innovative ideas to create economic opportunity in their communities. Beginning today, local nonprofits can submit their proposals to a panel of local judges who will select five winners to receive $175,000 grants and training from Google.org to jumpstart their ideas. Additionally, Nevadans will have a chance to vote for their favorite idea among the five winners, and the “People’s Choice Winner” will receive an additional $125,000 in funding.

Google is proud to call Nevada its newest home, and we’ll continue to invest in communities throughout the state. Thank you for welcoming Google into your communities; we look forward to building out our infrastructure in Nevada and welcoming Google Cloud customers to our Las Vegas region soon.
Source: Google Cloud Platform

Introducing advanced security options for Cloud Dataproc, now generally available

Google Cloud Platform (GCP) offers security and governance products that help you meet your policy, regulatory, and business objectives, and the controls and capabilities we offer are always expanding. We’re pleased to announce that we’ve expanded the security capabilities of Cloud Dataproc, our fully managed Hadoop and Spark service, by making Kerberos and Hadoop secure mode security configurations generally available. Cloud Dataproc’s new security configurations give you the best of two worlds: access to modern, best-in-class security features and infrastructure, and the familiar controls you’ve already developed for your Hadoop and Spark environments.

Moving on-prem Hadoop clusters securely

With Kerberos and Hadoop secure mode, you can migrate your existing Hadoop security controls directly into the cloud without having to change your security policies and procedures. You can now enable new capabilities in Cloud Dataproc, including:

Connecting Cloud Dataproc back to Microsoft Active Directory
Encrypting data in flight between nodes in a cluster
Supporting multi-tenant clusters

A common customer setup for Kerberos on Cloud Dataproc works as follows:

Each GCP user is associated with a cloud identity. This authentication mechanism gives users the ability to SSH into a cluster, run jobs via the API, and create cloud resources (for example, a Cloud Dataproc cluster).
To use a Kerberized Hadoop application, a user has to obtain a Kerberos principal. Microsoft Active Directory provides a cross-realm trust, so that its users and groups map into Cloud Dataproc Kerberos principals. Note: this setup requires Active Directory to be the source of truth for user identities; Cloud Identity is only a synchronized copy.
When the Hadoop application needs to obtain data from Cloud Storage, the Cloud Storage connector is invoked. The connector allows Hadoop to access Cloud Storage data at the block level as if it were a native part of Hadoop, and it relies on a service account to authenticate against Cloud Storage.
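
As a minimal sketch of that flow, assuming a hypothetical Active Directory realm, user, and bucket, a user on a Kerberized cluster first authenticates against the realm and then reads Cloud Storage through the connector:

    # Authenticate as an Active Directory-backed Kerberos principal
    # (the realm and user name are placeholders).
    kinit alice@CORP.EXAMPLE.COM

    # Hadoop now accepts the user's Kerberos ticket; the Cloud Storage
    # connector authenticates to the bucket as the cluster's service account.
    hadoop fs -ls gs://my-example-datalake/raw/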
Standing on the shoulders of GCP security

Kerberos and Hadoop secure mode provide parity with legacy Hadoop security platforms, making it easy to port your existing procedures and policies. However, you may find that even though you maintain existing security practices, the overall security posture of your Hadoop and Spark environments greatly improves with the migration to GCP. This is because Cloud Dataproc and GCP take advantage of the same secure-by-design infrastructure, built-in protection, and global network that Google uses to protect your information, identities, applications, and devices. In addition, GCP and Cloud Dataproc offer further security features that help protect your data. Some of the most commonly used GCP-specific security features used with Cloud Dataproc include:

Default at-rest encryption: GCP encrypts customer data stored at rest by default, with no additional action required from you. We offer a continuum of encryption key management options, including a CMEK feature that lets you create, use, and revoke the key encryption key (KEK).
Stackdriver Monitoring: provides visibility into the performance, uptime, and overall health of cloud-powered applications. Stackdriver collects and ingests metrics, events, and metadata from Cloud Dataproc clusters to bring you insights via dashboards and charts.
OS Login: allows you to use Compute Engine IAM roles to manage SSH access to Cloud Dataproc instances. This is an alternative to manually managing instance access by adding and removing SSH keys in metadata.
VPC Service Controls: allow you to define a security perimeter around Cloud Dataproc and the data stored in Cloud Storage buckets. Datasets can be constrained within a VPC to help mitigate data exfiltration risks. With VPC Service Controls, you can keep sensitive data private and still take advantage of the fully managed storage and data processing capabilities of GCP.

These features and many others are certified by third-party auditors. Cloud Dataproc certifications include the most widely recognized, internationally accepted independent security standards, including ISO standards for security controls, cloud security, and privacy, as well as SOC 1, 2, and 3. These certifications help us meet the demands of industry standards such as HIPAA and PCI, and we continue to expand our list of certifications globally to assist our customers with their compliance obligations.

End-to-end authorization with GCP Token Broker

As a typical cloud best practice, we recommend that the GCP service accounts associated with the virtual machines (or cloud infrastructure) access datasets on behalf of a user. Many Cloud Dataproc customers choose to provision small autoscaling clusters for each Cloud Dataproc user; this way, there is a clear audit log showing who was on which cluster when it accessed a Cloud Storage dataset. However, we also hear that many enterprise customers would prefer to use multi-tenant clusters and have strict compliance requirements that dictate that access to GCP resources (Cloud Storage, BigQuery, Cloud Bigtable, etc.) must be attributable to the individual user who initiated the request. In addition, to meet compliance requirements, this should be done in a way that ensures no long-lived credentials are stored on client machines or worker nodes.

To meet these customer goals, Google Cloud created the open source GCP Token Broker. The GCP Token Broker enables end-to-end Kerberos security and Cloud IAM integration for Hadoop workloads on GCP. You can use this open source software to bridge the gap between Kerberos and Cloud IAM, allowing users to log in with Kerberos and access GCP resources. For more on how the GCP Token Broker extends the functionality of the generally available Kerberos and Hadoop secure mode in Cloud Dataproc, check out the joint Google and Cloudera session from Google Cloud Next ’19: Building and Securing Data Lakes.

Getting started with secure mode

To get started with Kerberos and Hadoop secure mode, check “Enable Kerberos and Hadoop secure mode” in the Cloud Dataproc console. To securely exchange a secret key and administrator password, you first need to create those files outside of the console and encrypt them using Cloud Key Management Service.
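
As a hedged sketch of that setup, assuming placeholder project, bucket, key ring, and cluster names (the Kerberos flag names reflect the gcloud reference at the time and should be verified against your SDK version):

    # Encrypt the Kerberos root-principal password with a Cloud KMS key
    # and stage the ciphertext in Cloud Storage (all names are placeholders).
    gcloud kms encrypt \
        --location=global --keyring=my-keyring --key=my-key \
        --plaintext-file=root-password.txt \
        --ciphertext-file=root-password.encrypted
    gsutil cp root-password.encrypted gs://my-secrets-bucket/

    # Create a cluster with Kerberos and Hadoop secure mode enabled.
    gcloud dataproc clusters create secure-cluster \
        --region=us-central1 \
        --kerberos-root-principal-password-uri=gs://my-secrets-bucket/root-password.encrypted \
        --kerberos-kms-key=projects/my-project/locations/global/keyRings/my-keyring/cryptoKeys/my-key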
By default, Cloud Dataproc turns on all the features of Hadoop secure mode, including in-flight encryption, and auto-generates a self-signed certificate for the encryption (you can also upload your own). Any default setting can be overwritten using a cluster property. For example, if you want a multi-tenant Cloud Dataproc cluster but don’t have compliance requirements that warrant the performance penalty of in-transit encryption within a VPC, you can disable the in-transit encryption by setting Cloud Dataproc cluster properties (a hedged example follows), either from gcloud or on the cluster properties page. A cross-realm trust option is also available if you want to rely on an external directory such as Microsoft Active Directory. For complete instructions on setting up different types of security configurations, check out the Cloud Dataproc security configuration documentation.
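
The exact properties shown in the original post aren’t recoverable here, but one plausible, hypothetical combination uses standard Hadoop and HDFS settings passed through the Dataproc --properties flag (verify the property names against the Cloud Dataproc security configuration docs):

    # Hypothetical example: keep Kerberos authentication but relax in-transit
    # encryption inside the VPC via standard Hadoop/HDFS properties.
    gcloud dataproc clusters create multi-tenant-cluster \
        --region=us-central1 \
        --kerberos-root-principal-password-uri=gs://my-secrets-bucket/root-password.encrypted \
        --kerberos-kms-key=projects/my-project/locations/global/keyRings/my-keyring/cryptoKeys/my-key \
        --properties="hdfs:dfs.encrypt.data.transfer=false,core:hadoop.rpc.protection=authentication"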
Source: Google Cloud Platform