An Introduction to Convert2RHEL: Now officially supported to convert RHEL-like systems to RHEL

Convert2RHEL is now an officially supported component of Red Hat Enterprise Linux (RHEL). It enables the conversion of select RHEL derivative distributions into a supportable RHEL system while retaining existing applications and configurations. This is the culmination of work by multiple teams within Red Hat to provide solutions and guidance to our customers and the community at large.
Source: CloudForms

Introducing SAP Integration with Cloud Data Fusion

Businesses today have a growing demand for data analysis and insight-based action. More often than not, the valuable data driving these actions sits in mission-critical operational systems. SAP is the leading provider of ERP software on the market today, and Google Cloud is introducing integration with SAP to help unlock the value of SAP data quickly and easily. Google Cloud's native data integration platform, Cloud Data Fusion, now offers the capability to seamlessly get data out of SAP Business Suite, SAP ERP and S/4HANA. Cloud Data Fusion is a fully managed, cloud-native data integration and ingestion service that helps ETL developers, data engineers and business analysts efficiently build and manage ETL/ELT pipelines that accelerate the building of data warehouses, data marts and data lakes on BigQuery, or operational reporting systems on Cloud SQL, Spanner or other systems.

To simplify the unlocking of SAP data, today we're announcing the public launch of the SAP Table Batch Source. With this capability, you can now use Cloud Data Fusion to easily integrate SAP application data and gain invaluable insights via Looker. You can also leverage the best-in-class machine learning products on Google Cloud to help you gain insight into your business by combining SAP data with other datasets. Examples include running machine learning on IoT data joined with ERP transactional data for predictive maintenance, application-to-application integration between SAP and Cloud SQL-based applications, fraud detection, spend analytics, and demand forecasting.

Let's take a closer look at the benefits of the SAP Table Batch Source in Cloud Data Fusion:

Developer Productivity

As Cloud Data Fusion is a complete, visual environment, users can use the Pipeline Studio to quickly design pipelines that read from SAP ECC or S/4HANA. With Data Fusion's prebuilt transformations, you can easily join data from SAP and non-SAP systems, and perform complex transformations like data cleansing, aggregations, data preparation and lookups to rapidly get insights from the data.

Time to Value

In traditional approaches, users are forced to define models on data warehousing systems. In Cloud Data Fusion, this is performed automatically for you when using BigQuery: after you design and execute a data pipeline that writes to BigQuery, Data Fusion auto-generates the schema in BigQuery for you. As users don't need to pre-build models, you get insight into your data faster, which results in improved productivity for your organization.

Performance and Scalability

Cloud Data Fusion scales horizontally to execute pipelines. Users can leverage ephemeral or dedicated clusters to run the pipelines. The SAP Table Batch Source plugin automatically tunes the data pipelines for optimal performance when it extracts data from your SAP systems, based on both SAP application server resources and Cloud Data Fusion runtime resources. If parallelism is misconfigured, a failsafe mechanism in the plugin prevents any issues in your source system.

How does the SAP Table Batch Source work?

Transfer full table data from SAP to BigQuery or other systems

In the Pipeline Studio, you can add multiple SAP source tables to a data pipeline and then join them with joiner transformations. As the join is executed in the Cloud Data Fusion processing layer, there is no additional impact on the SAP system. For example, to create a Customer Master data mart, you can join all relevant tables from SAP using the plugin, and then build complex pipelines for that data in Cloud Data Fusion's Pipeline Studio.
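To illustrate the kind of downstream analysis this enables, here is a minimal sketch, assuming a hypothetical project and hypothetical datasets: an sap_mart.KNA1 customer master table loaded by the Data Fusion pipeline, and a non-SAP web_shop.orders table, joined in BigQuery with the Python client library.

    # Minimal sketch: join an SAP-derived customer master table with a
    # non-SAP orders table in BigQuery. Project, dataset and table names
    # are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-analytics-project")  # hypothetical project

    query = """
    SELECT k.KUNNR AS customer_id,
           k.NAME1 AS customer_name,
           SUM(o.order_value) AS total_order_value
    FROM `my-analytics-project.sap_mart.KNA1` AS k      -- SAP customer master, loaded by Data Fusion
    JOIN `my-analytics-project.web_shop.orders` AS o     -- non-SAP operational data
      ON o.customer_id = k.KUNNR
    GROUP BY customer_id, customer_name
    ORDER BY total_order_value DESC
    LIMIT 20
    """

    # Run the query and print the top customers by order value.
    for row in client.query(query).result():
        print(row.customer_id, row.customer_name, row.total_order_value)

Because Data Fusion auto-generates the BigQuery schema when the pipeline writes its output, no manual modeling is needed before running a query like this.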
Extract table records in parallel

To extract records in parallel, you can configure the SAP Table Batch Source plugin using the Number of Splits to Generate property. If this property is left blank, the system determines the appropriate value for optimal performance.

Extract records based on conditions

The SAP Table Batch Source plugin allows you to specify filter conditions by using the Filter Options property. You specify the conditions in OpenSQL syntax, and the plugin applies them as a SQL WHERE clause to filter the tables. Records can be extracted based on conditions such as certain columns having a defined set of values or a range of values. You can also specify complex conditions that combine multiple conditions with AND or OR clauses (e.g. TIMESTAMP >= '20210130100000' AND TIMESTAMP <= '20210226000000').

Limit the number of records to be extracted

Users can also limit the number of records extracted from the specified table by using the Number of Rows to Fetch property. This is particularly useful in development and testing scenarios. A sketch of supplying these plugin properties as runtime arguments when starting a pipeline run appears at the end of this post.

Maximizing the returns on data

With Google Cloud Platform, you can already scale and process huge amounts of social, operational, transactional and IoT data to extract value and gain rapid insights. Cloud Data Fusion provides many connectors to existing enterprise applications and data warehouses. With the native capability to unlock SAP data into BigQuery through Cloud Data Fusion, you can now go a step further and drive rapid, intelligent decision making.

Ready to try out the SAP Table Batch connector? Create a new instance of Data Fusion and deploy the SAP plugin from the Hub. Please refer to the SAP Table Batch Source user guide for additional details. To learn more about how leading companies are powering innovation with our data solutions, including data integration, check out Google Cloud's Data Cloud Summit on May 26th.
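As referenced above, here is a hedged sketch of starting a run of a deployed pipeline through the Data Fusion instance's CDAP REST API and passing the plugin properties as runtime arguments. The instance endpoint, pipeline name and macro names (splits, filter, rowLimit) are placeholders, and the sketch assumes the deployed pipeline exposes matching macros in its SAP Table Batch Source properties.

    # Hedged sketch: trigger a deployed Data Fusion batch pipeline and pass
    # SAP Table Batch Source properties as runtime arguments. Endpoint,
    # pipeline name and macro names are hypothetical; the pipeline must
    # expose matching macros for these runtime arguments to take effect.
    import google.auth
    import google.auth.transport.requests
    import requests

    ENDPOINT = "https://my-instance-example.datafusion.googleusercontent.com/api"  # placeholder
    PIPELINE = "sap_kna1_to_bigquery"  # hypothetical pipeline name

    # Obtain an OAuth access token for calling the instance's API.
    credentials, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/cloud-platform"])
    credentials.refresh(google.auth.transport.requests.Request())

    runtime_args = {
        "splits": "8",          # Number of Splits to Generate
        "filter": "TIMESTAMP >= '20210130100000' AND TIMESTAMP <= '20210226000000'",  # Filter Options
        "rowLimit": "100000",   # Number of Rows to Fetch
    }

    # Batch data pipelines run as the DataPipelineWorkflow program in CDAP.
    resp = requests.post(
        f"{ENDPOINT}/v3/namespaces/default/apps/{PIPELINE}/workflows/DataPipelineWorkflow/start",
        headers={"Authorization": f"Bearer {credentials.token}"},
        json=runtime_args,
    )
    resp.raise_for_status()
    print("Pipeline run started:", resp.status_code)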
Source: Google Cloud Platform

Introducing Cloud CISO perspectives

Since I joined Google Cloud as Chief Information Security Officer three short months ago, I've seen firsthand the unique point of view we have to improve security for our customers and society at large through the cloud. I started in this new role as the security industry was rattled by a major breach impacting the software supply chain, and I was reminded of one of the reasons I joined Google: the opportunity to push the industry forward in addressing challenging security issues and helping lay the foundation for a more secure future. Today, I'm excited to begin a new blog series that we will use to share our perspectives on the biggest announcements and trends in cybersecurity from Google Cloud and from across the industry, whether it's conference highlights, new research or achievements from our Google-wide security teams. My hope is that this series serves as your one-stop shop to learn about our most important security updates and why they matter, straight from a CISO's perspective.

Thoughts from around the industry

Global Supply Chains in the Era of COVID-19 – Last month, I participated in a Council on Foreign Relations panel about the supply chain risks brought on by the COVID-19 pandemic. One of the biggest takeaways is the need for organizations and governments to discuss the ongoing, steady-state risk management of supply chains as they exist today, such as risk mapping across a global supply chain. Just as physical supply chains have to prepare for natural risks, every supply chain has a digital element that could be disrupted and requires thinking through cyber prevention measures.

IDC Multicloud Paper – We supported IDC in their work to investigate how multicloud can help regulated organizations mitigate the risks of using a single cloud vendor. The paper looks at the different approaches to multi-vendor and hybrid clouds taken by European organizations and how these strategies can help organizations address concentration risk and vendor lock-in, improve their compliance posture and demonstrate an exit strategy.

Operational resilience is a key area of focus for financial services firms, and regulators around the world have been evolving their guidance on the use of outsourcers, including cloud service providers, in this context. We've worked closely with our FSI customers in this area and as a result produced a new paper on how migrating to cloud can help ensure the operational resilience required by customers, shareholders and regulators.

#ShareTheMicInCyber – We celebrated an important industry effort called #ShareTheMicInCyber for Women's History Month, co-founded by one of our very own Googlers, Camille Stewart. The benefits of DEI apply in all domains, but especially cyber, where we've learned firsthand that diverse security teams are more innovative, produce better products and enhance our ability to defend against cyber threats.

Google security corner

Spectre proof-of-concept – Google's security team published results from recent research on the exploitability of Spectre against web users. The research presented a proof-of-concept (PoC) written in JavaScript that could leak information from a browser's memory. There is immense value in sharing these types of findings with the security community.
Additionally, the team's work highlighted protections available to web authors and best practices for enabling them in web applications, based on our experience across Google.

Open Source Security – We continue to see tremendous activity and support for the work of the Open Source Security Foundation that Google helped establish. Membership is open to all to help drive security on many critical projects. Learn how to get involved here. We also welcomed the announcement of sigstore, a new project in the Linux Foundation that aims to improve software supply chain integrity and verification.

Google Cloud security highlights

Our Cloud security teams have been busy this quarter. We hit major milestones with product announcements like BeyondCorp Enterprise and the Risk Protection Program, and launched our new Google Cloud Security podcast. Here are some of my biggest takeaways:

BeyondCorp Enterprise – Earlier this year, we announced our comprehensive zero-trust offering, BeyondCorp Enterprise, which brings our modern, proven BeyondCorp technology to organizations so they can get started on their own zero-trust journey.

Trusted Cloud – We outlined our vision to deliver a truly trusted cloud built on three pillars: transparency and sovereignty, zero trust, and shared fate.

Risk Protection Program – Google Cloud announced a partnership with two leading insurers to provide specialized cybersecurity insurance coverage for Google Cloud customers who adhere to specific security best practices and provide automated documentation of their security posture through our platform.

Active Assist account security recommendations – Active Assist provides recommendations for our users on how to optimize their cloud deployments. We launched a new "Account security" recommender that automatically detects when a user with elevated permissions, such as a Project Owner, is not using strong authentication. Such users will see a notification prompting them to enable their phone as a phishing-resistant second factor, helping to further protect their account.

New Security Best Practices documentation – We released two new comprehensive papers: A CISO's Guide to Cloud Security Transformation, and an update to our Google Cloud security foundations guide.

Over the next few months, we'll be busy working on a number of new papers on cloud risk management for Risk and Compliance Officers and Heads of IT Audit, as well as some pieces on reimagining the Security Operations Center of the future. Thanks for checking out our first post in a series of many. I look forward to sharing more CISO perspectives with you soon.
Source: Google Cloud Platform