Adobe: Premiere Pro update with AI remix for music
Adobe's video editing program Premiere Pro is getting an update that adapts music clips to the length of the video content. (Adobe, graphics software)
Quelle: Golem
In 2016, hackers made off with almost 120,000 bitcoins. Now the US Justice Department has seized them, and they are worth billions of US dollars. (Bitcoin, technology)
Quelle: Golem
With the release of Vivaldi 5.1, the browser gains new convenience features. The Android version of the browser is also getting an update. (Vivaldi, browser)
Quelle: Golem
Setting up and maintaining your own server doesn't have to be complicated, as Yunohost shows. (Server, Debian)
Quelle: Golem
What if there were a way to vastly reduce lag in online gaming? With edge computing, that very well may be possible. In this post, learn how edge computing can solve the challenges of modern online gaming.
Quelle: CloudForms
Red Hat's 2022 Global Tech Outlook report came with some interesting insights, including that AI/ML, edge, and serverless computing are top-priority emerging technologies for the year ahead. Learn more.
Quelle: CloudForms
In our conversations with technology leaders about data-driven transformation using Google Data Cloud, the industry's leading unified data and AI solution, one important topic is incorporating continuous intelligence to move from answering questions such as "What has happened?" to questions like "What is happening?" and "What might happen?". At the core of this evolution is the need for underlying data processing that not only provides powerful real-time capabilities for events happening close to origination, but also brings together existing data sources under one unified data platform, enabling organizations to draw insights and take action holistically.

Dataflow, Google's cloud-native data processing and streaming analytics platform, is a key component of any modern data and AI architecture and data transformation journey, along with BigQuery, Google's internet-scale warehouse with built-in streaming, BI engine, and ML; Pub/Sub, a global no-ops event delivery service; and Looker, a modern BI and embedded analytics platform.

For many of the leaders we engage with, a key evaluation factor is the potential economic value of Dataflow to their organization, particularly when making the case to other stakeholders. So we commissioned Forrester Consulting to conduct a comprehensive study of the impact Dataflow has had on organizations, based on interviews with actual customers.

Today we're excited to share that commissioned study, the Total Economic Impact™ of Google Cloud Dataflow, which allows data leaders to understand and quantify the benefits of Dataflow and the use cases it enables. Forrester interviewed Dataflow customers to evaluate the benefits, costs, and risks of investing in Dataflow across an organization. Based on those interviews, Forrester identified major financial benefits across four areas: business growth, infrastructure cost savings, data engineer productivity, and administration efficiency. Forrester found that customers adopting Dataflow can achieve a 55% boost in developer productivity and a 50% reduction in infrastructure costs, and projects up to a 171% return on investment (ROI) with a payback period of less than six months. Customers can now use the figures in the report to compute their own ROI and payback period.

"Dataflow is integral to accelerating time-to-market, decreasing time-to-production, reducing time to figure out how to use data for use cases, focusing time on value-add tasks, streamlining ingestion, and reducing total cost of ownership." – Lead technical architect, CPG

Let's take a deeper look at the ways Forrester found that Dataflow can help you achieve your goals and unlock your business potential.

Benefit #1: Increase data engineer productivity by 55%

Developers can choose among a variety of programming languages to define and execute data workflows. Dataflow also seamlessly integrates with other Google Cloud Platform and open source technologies to maximize value and applicability to a wide variety of use cases. Dataflow streamlined workflows with code reusability, dynamic templates, and the simplicity of a managed service. Engineers trusted pipelines to run correctly and adhere to governance, and avoided the laborious issue-monitoring and remediation tasks that were common in legacy environments, such as poor performance, lack of availability, and failed jobs.
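To make the programming model concrete, here is a minimal sketch of a batch pipeline written with the Apache Beam Python SDK, the open source model that Dataflow executes. The bucket paths and the "user_id,amount" CSV schema are hypothetical, chosen purely for illustration:

```python
# A minimal Apache Beam batch pipeline sketch. Dataflow runs Beam pipelines;
# the gs:// paths and the CSV schema below are hypothetical examples.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(line: str):
    """Split a 'user_id,amount' CSV line into a (key, value) pair."""
    user_id, amount = line.split(",")
    return user_id, float(amount)


def run():
    # Pass --runner=DataflowRunner --project=... --region=... --temp_location=...
    # on the command line to execute on Dataflow instead of locally.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://my-bucket/events.csv")
            | "Parse" >> beam.Map(parse_event)
            | "SumPerUser" >> beam.CombinePerKey(sum)
            | "Format" >> beam.MapTuple(lambda user, total: f"{user},{total}")
            | "Write" >> beam.io.WriteToText("gs://my-bucket/output/totals")
        )


if __name__ == "__main__":
    run()
```

Because the runner is a managed service, a job like this needs no cluster provisioning; Dataflow handles scaling and execution, which is the operational simplicity the interviewed engineers describe.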
Teams valued the language flexibility and open source base.

"Dataflow provided us with ETL replacement that opened limitless potential use cases and enabled us to do smarter data enhancement while data remains in motion." – Director of data projects, financial services

Benefit #2: Reduce infrastructure costs by up to 50% for batch and streaming workloads

Dataflow's serverless autoscaling and discrete control of job needs, scheduling, and regions eliminated overhead and optimized technology spending. Consolidating global data processing solutions onto Dataflow further eliminated excess costs while ensuring performance, resilience, and governance across environments. Dataflow's unified streaming and batch data platform gives organizations the flexibility to define either workload in the same programming model, run it on the same infrastructure, and manage it from a single operational management tool.

"Our costs with our cloud data platform using Dataflow are just a fraction of the costs we faced before. Now we only pay for cloud infrastructure consumption because the open source base helps us avoid licensing costs. We spend about $120,000 per year with Dataflow, but we'd be spending millions with our old technologies." – Lead technical architect, CPG

Benefit #3: Increase top-line revenue by improving customer experience and retention, with a payback time of < 6 months

Streaming analytics is an essential capability in today's digital world for gaining real-time actionable insights. Likewise, organizations must also have flexible, high-performance batch environments to analyze historical data for building machine learning models, business intelligence, and advanced analytics. Dataflow enabled real-time streaming use cases, improved data enrichment, encouraged data exploration, improved performance and resiliency, reduced errors, increased trust, and eliminated barriers to scale. As a result, organizations provided customers with more accurate, relevant, and in-the-moment data-backed services and insights, boosting customer experience, creating new revenue streams, and improving acquisition, retention, and enrichment.

"It's already been proven that we are getting more business [with Dataflow] because we can turn around results faster for customers." – VP of technology, financial services technology

"When we provide data to our customers and partners with Dataflow, we are much more confident in those numbers and can provide accurate data within a minute. Our customers and partners have taken note and commented on this. It's reduced complaints and prevented churn." – Senior software engineer, media
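To illustrate the unified model from Benefit #2 applied to the real-time use cases above, here is a hedged sketch of a streaming counterpart to the earlier batch pipeline. The Pub/Sub topic names are hypothetical; note that only the source, sink, and windowing change, while the programming model stays the same:

```python
# A streaming variant of the same Beam model: read an unbounded Pub/Sub
# source, window it, and aggregate per key. Topic names are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions
from apache_beam.transforms.window import FixedWindows


def run():
    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True  # unbounded pipeline
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")
            | "Decode" >> beam.Map(lambda msg: msg.decode("utf-8"))
            # Key each 'user_id,...' event by user for per-user counting.
            | "KeyByUser" >> beam.Map(lambda line: (line.split(",")[0], 1))
            | "Window" >> beam.WindowInto(FixedWindows(60))  # 1-minute windows
            | "CountPerUser" >> beam.CombinePerKey(sum)
            | "Encode" >> beam.MapTuple(lambda user, n: f"{user},{n}".encode("utf-8"))
            | "WritePubSub" >> beam.io.WriteToPubSub(
                topic="projects/my-project/topics/counts")
        )


if __name__ == "__main__":
    run()
```

Per-user counts emitted every minute are the kind of "accurate data within a minute" the media customer quoted above is referring to.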
Other benefits

Eliminated administrative overhead and toil: As a cloud-native managed service, all administration tasks such as provisioning, scaling, and updates are handled automatically by Google Cloud. Teams no longer need to manage servers and related software for legacy data processing solutions. Admins also streamlined processes for setting up data sources, adding pipelines, and enforcing governance.

Saved business operations costs for support teams and data end users: Dataflow improved the speed, quality, reliability, and ease of access to data for general business users, saving time and empowering users to drive better data-backed outcomes. It also reduced support inquiry volume while automating manual job creation.

What's next?

Download the Forrester Total Economic Impact study today to dive deep into the economic impact Dataflow can deliver for your organization. We would love to partner with you to explore the potential Dataflow can unlock in your teams. Please reach out to our sales team to start a conversation about your data transformation with Google Cloud.
Quelle: Google Cloud Platform
If there's one thing we learned talking to Kubernetes users, it's that optimizing for reliability, performance, and cost efficiency is hard, especially at scale.

That is why, not long ago, we released GKE cost optimization insights in preview, a tab within the Google Cloud Console that helps you discover optimization opportunities at scale, across your Google Kubernetes Engine clusters and workloads, automatically and with minimal friction. The functionality lets you see, over a selected period of time, the current state of your clusters by exposing the actually used, requested, and allocated resources. For workloads running on your clusters, it shows your actual used and requested resources, as well as the set limits, so you can make granular, workload-level right-sizing optimizations.

GKE cost optimization insights proved popular with users right out of the gate. For example, Arne Claus, Site Reliability Engineer at hotel search platform provider Trivago, says: "The new GKE cost optimization insights view helped us to identify cost optimization opportunities at the cluster and workload level and take immediate action. In the first weeks of use, the Trivago team spotted and improved the cost/performance balance of several clusters."

Today, we're graduating GKE cost optimization insights from Preview to GA. The feature has undergone multiple improvements that we believe will help you with your day-to-day optimization routines. For instance, we've made it easier to spot under-provisioned workloads that could be at risk of instability due to insufficient resource requests.

Now that you have insights into optimization opportunities, let's recap which capabilities help the most with reliability, performance, and cost efficiency in GKE, and what resources are available for your teams to get up to speed with GKE cost optimization. In public cloud managed Kubernetes services, there are four major pitfalls that lead to non-optimized usage of Kubernetes clusters:

Culture – Many teams that embrace the public cloud have never worked with a pay-as-you-go service like GKE before, so they're unfamiliar with how resource allocation and app deployment processes can affect their costs. The new GKE cost optimization insights can help teams better understand such an environment and improve business value by providing insights into balancing cost, reliability, and performance needs.

Bin packing – The more efficiently you pack apps into nodes, the more you save. You can pack apps into nodes efficiently by ensuring you're requesting the right amount of resources based on your actual utilization. GKE cost optimization insights helps you identify bin-packing gaps by looking at the gray bar in the cluster view.

App right-sizing – You need to configure appropriate resource requests and workload autoscale targets for the objects deployed in your cluster. The more precisely you set resource amounts for your pods, the more reliably your apps will run and, in the majority of cases, the more space you will open up in the cluster. With GKE cost optimization insights, you can visualize right-sizing information by looking at the green bar in both the cluster and workload views. (A toy right-sizing calculation is sketched after this list.)

Demand-based downscaling – To save money during low-demand periods such as nighttime, your clusters should be able to scale down with demand. However, in some cases you can't scale them down, because there are workloads that cannot be evicted or because a cluster has been misconfigured.

GKE cost optimization insights help you better understand and visualize these pitfalls. To solve them, or make them non-issues right from the beginning, there are de facto solutions available from Google Cloud. For example, you can use the new GKE cost optimization insights to help with monitoring and with the cultural shift toward FinOps. If you don't want to deal with bin packing, you can use the Autopilot mode of operation; setting up node auto-provisioning along with the optimize-utilization profile can also help optimize bin packing. To help with app right-sizing and demand-based downscaling, you can take advantage of the GKE Pod autoscalers: in addition to the classic Horizontal Pod Autoscaler, we also provide a Vertical Pod Autoscaler and a Multidimensional Pod Autoscaler.

We've written extensively about GKE features such as Autopilot, optimized VM types, node auto-provisioning, Pod autoscalers, and others in our GKE best practices to lessen overprovisioning. This is a great place to learn how to solve for your newly discovered optimization opportunities. If you want a deeper dive into technical details, check out these best practices for running cost-optimized Kubernetes applications on GKE, an exhaustive list of GKE best practices. And finally, for the visual learner, there's the GKE cost optimization video series on YouTube, where our experts walk you through key concepts of cost optimization step by step.
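To make the right-sizing idea concrete, here is a toy sketch of the common heuristic of setting a request near a high percentile of observed usage plus some headroom. This is not the algorithm behind GKE cost optimization insights, and all sample values are hypothetical:

```python
# Toy right-sizing heuristic: request ~= p95 of observed usage + headroom.
# Illustrative only; not the algorithm GKE cost optimization insights uses.

def recommend_cpu_request(samples_mcpu, headroom=1.15):
    """Suggest a CPU request (in millicores) from observed usage samples."""
    ordered = sorted(samples_mcpu)
    p95 = ordered[int(0.95 * (len(ordered) - 1))]  # 95th-percentile usage
    return int(p95 * headroom)                     # add safety headroom

# A workload that requests 1000m but rarely uses more than ~225m
# (hypothetical per-pod CPU usage samples in millicores):
observed_mcpu = [180, 190, 200, 210, 220, 205, 195, 215, 225, 210]
current_request_mcpu = 1000
recommended = recommend_cpu_request(observed_mcpu)
print(f"current request: {current_request_mcpu}m, recommended: {recommended}m")
```

Running this prints a recommendation of about 253m against the 1000m request; a gap like that between requested and actually used resources is exactly what the gray and green bars in the insights view surface per cluster and per workload.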
Quelle: Google Cloud Platform
We pride ourselves on listening to our customers and then building products and partnerships that meet customer needs and enable every application to migrate to Azure. We recognize that migrating Virtual Desktop, Virtual Server, High Performance Compute, Analytics, and many other critical applications requires copying tens of terabytes to several petabytes of file data stored on file servers, NAS appliances, and Object Storage to Azure. Automated, intuitive, and scalable solutions are required to migrate file data between heterogeneous platforms and eliminate the inherent complexity and risk of these projects. Our customers have told us that copying unstructured and semi-structured file data to Azure Blob Storage, Azure Files, and Azure NetApp Files needs to be fast and easy so you can focus on innovating with Azure services.
Today we are announcing the Azure File Migration Program, which gives customers and partners in our Solution Integrator and Service Provider ecosystem access to industry-leading file migration solutions from Komprise and Data Dynamics at no cost. These solutions help easily, safely, and securely migrate file and object data to Azure Storage.
Azure Migrate offers a very powerful set of no-cost (or low-cost) tools to help you migrate virtual machines, websites, databases, and virtual desktops for critical applications. You can modernize legacy applications by migrating them from servers to containers and build a cloud-native environment. Our new program complements Azure Migrate and provides the means to migrate applications and workloads that include large volumes of unstructured file data.
This program offers free software licensing, an onboarding session, and access to the migration solution provider's support organization. You can review a detailed comparison of the solutions, read the Getting Started Guides for Data Dynamics and Komprise, and watch videos showcasing their functionality. After choosing the solution that best fits your needs, you simply select the appropriate Azure-sponsored offer from the Azure Marketplace.
We plan to expand this program to include additional migration ISVs and target storage platforms so we can support any and every storage migration scenario. Subscribe to this blog for updates as we expand the program.
Learn more about the Azure File Migration Program
To learn more about this program, please visit our Tech Community Blog where Principal Program Manager Karl Rautenstrauch has written a post to help you move forward and take advantage of this great offer! You can also learn more about migrating application workloads to Azure by visiting the Azure Migration and Modernization Center.
Quelle: Azure
The source code of government software remains off-limits to agency employees. This is due in part to the "unattractive working conditions" for IT experts. (politics/law, internet)
Quelle: Golem