Gartner: How AI is devaluing C-level management
The IT industry spends a fortune on C-level managers. With AI, their decisions are becoming faster and better. Time to finally slim down there. (Work, AI)
Source: Golem
AWS Lambda increases the file descriptor limit from 1,024 to 4,096, a 4x increase, for functions running on Lambda Managed Instances (LMI). This capability enables customers to run I/O-intensive workloads such as high-concurrency web services and file-heavy data processing pipelines without running into file descriptor limits.

LMI enables you to run Lambda functions on managed Amazon EC2 instances with built-in routing, load balancing, and auto-scaling, giving you access to specialized compute configurations including the latest-generation processors and high-bandwidth networking, with no operational overhead. Customers use Lambda functions to build a wide range of serverless applications such as event-driven workloads, web applications, and AI-driven workflows. These applications rely on file descriptors for operations such as opening files, establishing network socket connections to external services and databases, and managing concurrent I/O streams for data processing. Each open file, network socket, or internal resource consumes one file descriptor.

Previously, Lambda supported a maximum of 1,024 file descriptors. However, LMI allows multiple requests to be processed simultaneously, which often requires a higher number of file descriptors. With this launch, AWS Lambda increases the limit to 4,096, allowing customers to run I/O-intensive workloads, maintain larger connection pools, and effectively utilize multi-concurrency for functions running on LMI.

This feature is available in all AWS Regions where AWS Lambda Managed Instances is generally available. To get started, visit the AWS Lambda Managed Instances documentation.
Source: aws.amazon.com
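As a minimal sketch of what the new headroom means in practice, a function can inspect its own descriptor limits at runtime. The handler below uses only the Python standard library; the handler name and the Linux-specific /proc/self/fd lookup are illustrative, not part of the announcement.

```python
import os
import resource

def handler(event, context):
    # Soft and hard limits on open file descriptors for this process;
    # on Lambda Managed Instances this should now report 4096.
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)

    # Count descriptors currently in use (Linux-specific): each entry
    # in /proc/self/fd is an open file, socket, pipe, or similar resource.
    in_use = len(os.listdir("/proc/self/fd"))

    return {"fd_soft_limit": soft, "fd_hard_limit": hard, "fd_in_use": in_use}
```

Logging these values under load is a quick way to tell whether a connection pool or a concurrent I/O fan-out is approaching the limit.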
Starting today, customers can deploy their Graviton-based and GPU-accelerated workloads on Amazon Elastic Container Service (Amazon ECS) Managed Instances in a Federal Information Processing Standard (FIPS) compliant mode in the AWS GovCloud (US) Regions. FIPS is a U.S. and Canadian government standard that specifies the security requirements for cryptographic modules that protect sensitive information.
In the AWS GovCloud (US) Regions, Amazon ECS Managed Instances automatically enable FIPS compliance by default. ECS Managed Instances communicate through FIPS-compliant endpoints, use appropriately configured cryptographic modules, and boot the underlying kernel in FIPS mode. Customers with federal compliance requirements can run workloads with FIPS-validated cryptographic modules across a broad range of instance types, including Graviton-based, GPU-accelerated, network-optimized, and burstable performance instances.
To learn more about FIPS, refer to FIPS on AWS and AWS Fargate Federal Information Processing Standard (FIPS-140). To get started with ECS Managed Instances, use the AWS Console, Amazon ECS MCP Server, ECS Express Mode, or your favorite infrastructure-as-code tooling to enable it in a new or existing Amazon ECS cluster. You will be charged for the management of compute provisioned, in addition to your regular Amazon EC2 costs. To learn more about ECS Managed Instances, visit the feature page, documentation, and AWS News launch blog.
Source: aws.amazon.com
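Since Managed Instances in the GovCloud (US) Regions boot the kernel in FIPS mode, a task can verify this at runtime through the standard Linux kernel flag. A minimal sketch, assuming a Linux container with /proc mounted as usual; everything beyond the flag path is illustrative:

```python
from pathlib import Path

def kernel_fips_enabled() -> bool:
    # On Linux, /proc/sys/crypto/fips_enabled reads "1" when the kernel
    # was booted in FIPS mode, and "0" (or the file is absent) otherwise.
    flag = Path("/proc/sys/crypto/fips_enabled")
    try:
        return flag.read_text().strip() == "1"
    except FileNotFoundError:
        return False

if __name__ == "__main__":
    print("FIPS kernel mode:", kernel_fips_enabled())
```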
Today, AWS announces the ability to remotely connect from the Kiro and Cursor IDEs to Amazon SageMaker Studio. This new capability allows data scientists, ML engineers, and developers to leverage their Kiro and Cursor setup – including its spec-driven development, conversational coding, and automated feature generation capabilities – while accessing the scalable compute resources of Amazon SageMaker Studio. By connecting Kiro or Cursor to SageMaker Studio using the AWS Toolkit extension, you can eliminate context switching between your local IDE and cloud infrastructure, maintaining your existing agentic development workflows within a single environment for all your AWS analytics and AI/ML services.

SageMaker Studio offers a broad set of fully managed cloud interactive development environments (IDEs), including JupyterLab, Code Editor based on Code-OSS (Open-Source Software), and VS Code as a remote IDE. Starting today, you can also use your customized local Kiro or Cursor setup – complete with specs, steering files, and hooks – while accessing your compute resources and data on Amazon SageMaker. You can authenticate using the AWS Toolkit extension in Kiro or Cursor, or through SageMaker Studio's web interface. Once authenticated, connect to any of your SageMaker Studio development environments in a few simple clicks. You maintain the same security boundaries as SageMaker Studio's web-based environments while developing AI models and analyzing data in the local IDE of your choice – Kiro or Cursor. To learn more, refer to the SageMaker user guide.
Source: aws.amazon.com
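The announcement routes authentication through the AWS Toolkit extension or the Studio web interface. As a rough illustration of what programmatic Studio sign-in looks like, the long-standing CreatePresignedDomainUrl API returns a time-limited login URL for a Studio user profile; the domain ID and profile name below are placeholders, and this call is not the Kiro/Cursor integration itself:

```python
import boto3

# Placeholder identifiers -- substitute your own Studio domain and user profile.
DOMAIN_ID = "d-xxxxxxxxxxxx"
USER_PROFILE = "my-user-profile"

sagemaker = boto3.client("sagemaker")

# Returns a time-limited sign-in URL for SageMaker Studio. The Kiro and
# Cursor connection described above is handled by the AWS Toolkit
# extension, not by this API call.
response = sagemaker.create_presigned_domain_url(
    DomainId=DOMAIN_ID,
    UserProfileName=USER_PROFILE,
    ExpiresInSeconds=300,  # validity window for the generated URL
)
print(response["AuthorizedUrl"])
```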
Amazon EC2 High Memory U7i-8TB instances (u7i-8tb.112xlarge) and U7i-12TB instances (u7i-12tb.224xlarge) are now available in AWS Europe (Milan). U7i instances are part of the AWS 7th generation and are powered by custom fourth-generation Intel Xeon Scalable processors (Sapphire Rapids). U7i-8tb instances offer 8 TiB of DDR5 memory and U7i-12tb instances offer 12 TiB of DDR5 memory, enabling customers to scale transaction processing throughput in a fast-growing data environment.
U7i-8tb instances deliver 448 vCPUs; U7i-12tb instances deliver 896 vCPUs. Both instances support up to 100 Gbps of Amazon EBS bandwidth for faster data loading and backups, 100 Gbps of network bandwidth, and ENA Express. U7i instances are ideal for customers using mission-critical in-memory databases like SAP HANA, Oracle, and SQL Server.
To learn more about U7i instances, visit the High Memory instances page.
Source: aws.amazon.com
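The published sizes can be confirmed programmatically with the standard EC2 DescribeInstanceTypes API. A short sketch; the choice of eu-south-1 (Milan) reflects the region in the announcement:

```python
import boto3

# Milan, the region where these sizes just became available.
ec2 = boto3.client("ec2", region_name="eu-south-1")

resp = ec2.describe_instance_types(
    InstanceTypes=["u7i-8tb.112xlarge", "u7i-12tb.224xlarge"]
)
for it in resp["InstanceTypes"]:
    vcpus = it["VCpuInfo"]["DefaultVCpus"]
    mem_tib = it["MemoryInfo"]["SizeInMiB"] / (1024 * 1024)
    print(f'{it["InstanceType"]}: {vcpus} vCPUs, {mem_tib:.0f} TiB memory')
```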
Europe's first commercial robotaxi service is set to launch in Zagreb. Rimac subsidiary Verne is partnering with Uber and Pony.ai for the effort. (Mobility, Electric cars)
Source: Golem
Following devastating attacks on Trivy, LiteLLM, and other tools, TeamPCP plans to use its mass-harvested credentials for ransomware attacks. (Cybercrime, Encryption)
Source: Golem
Over the scraping of millions of Spotify songs, music giants are now demanding damages in the hundreds of millions. (Spotify, Copyright)
Source: Golem
The God of War series has cast a central character: Sonya Walger takes on the role of Freya. (God of War, Amazon)
Source: Golem
In 2023, Apple unveiled the Mac Pro with its cheese-grater design and M2 Ultra. It is Apple's last classic tower computer. (Mac Pro, Apple)
Source: Golem