Amazon Bedrock now offers Claude Mythos Preview (Gated Research Preview)

Amazon Bedrock, the platform for building generative AI applications and agents at production scale, now offers Claude Mythos Preview in gated research preview as part of Project Glasswing. Claude Mythos Preview is Anthropic’s most advanced AI model to date, representing a fundamentally new model class with state-of-the-art capabilities across cybersecurity, software coding, and complex reasoning tasks.

The model can identify sophisticated security vulnerabilities in software and demonstrate exploitability, comprehending large codebases and delivering actionable findings with less manual guidance than previous AI models. This enables security teams to accelerate defensive cybersecurity work, find and fix security vulnerabilities in the world’s most critical software, and address these issues before threats emerge.

Claude Mythos Preview signals an upcoming wave of AI models with powerful cybersecurity capabilities. Anthropic and AWS are taking a deliberately cautious approach to release, prioritizing internet-critical companies and open-source maintainers whose software and digital services impact hundreds of millions of users. This approach gives defenders the opportunity to strengthen their codebases and share what they learn so the whole industry can benefit.

Claude Mythos Preview is available in gated preview in the US East (N. Virginia) Region through Amazon Bedrock. Access is limited to an initial allow-list of organizations. If your organization has been allow-listed, your AWS account team will reach out directly. For AWS CISO Amy Herzog’s perspective on this launch and what it means for the future of cybersecurity, read Building AI Defenses at Scale: Before the Threats Emerge.
Source: aws.amazon.com

Amazon SageMaker adds serverless workflows to Identity Center domains

Amazon SageMaker Unified Studio now supports Serverless Workflows in Identity Center domains. With this launch, customers using Identity Center domains can orchestrate data processing tasks with Apache Airflow (powered by Amazon Managed Workflows for Apache Airflow) without provisioning or managing Airflow infrastructure. Serverless Workflows were previously available only in IAM-based domains.
Serverless Workflows automatically provision compute resources when a workflow runs and release them when it completes, so you only pay for actual workflow run time. Each workflow runs with its own execution role and isolated worker, providing workflow-level security and preventing cross-workflow interference. With Serverless Workflows, Identity Center domain customers also get access to the Visual Workflow experience with support for around 200 operators, including built-in integration with AWS services such as Amazon S3, Amazon Redshift, Amazon EMR, AWS Glue, and Amazon SageMaker AI.
Serverless Workflows in Identity Center domains are available in all AWS Regions where SageMaker Unified Studio is supported. To learn more, visit the Serverless Workflows documentation.
Source: aws.amazon.com

AWS Lambda expands response streaming support to all commercial AWS Regions

AWS Lambda response streaming is now available in all commercial AWS Regions, bringing full regional parity for this capability. Customers in newly supported Regions can use the InvokeWithResponseStream API to progressively stream response payloads back to clients as data becomes available.
Response streaming enables functions to send partial responses to clients incrementally rather than buffering the entire response before transmission. This reduces time-to-first-byte (TTFB) latency and is well suited for latency-sensitive workloads such as LLM-based applications, as well as web and mobile applications where users benefit from seeing responses appear incrementally. Response streaming supports payloads up to a default maximum of 200 MB.
With this expansion, customers in all commercial Regions can stream responses using the InvokeWithResponseStream API through a supported AWS SDK, or through Amazon API Gateway REST APIs with response streaming enabled. Response streaming supports Node.js managed runtimes as well as custom runtimes.
Streaming responses incur an additional cost for network transfer of the response payload. You are billed based on the number of bytes generated and streamed out of your Lambda function beyond the first 6 MB. To get started with Lambda response streaming, visit the AWS Lambda documentation.
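On the client side, the InvokeWithResponseStream API returns an event stream that can be consumed chunk by chunk. The sketch below, using boto3, separates the stream-parsing logic (which follows the PayloadChunk / InvokeComplete event shape boto3 returns) from the AWS call itself; the function name is a placeholder, not part of the announcement.

```python
def iter_payload_chunks(event_stream):
    """Yield decoded text chunks from a Lambda response event stream.

    Accepts any iterable of event dicts shaped like the events boto3's
    invoke_with_response_stream returns: 'PayloadChunk' events carry raw
    bytes; a final 'InvokeComplete' event marks the end of the stream.
    """
    for event in event_stream:
        if "PayloadChunk" in event:
            yield event["PayloadChunk"]["Payload"].decode("utf-8")
        elif "InvokeComplete" in event:
            break


def stream_invoke(function_name, payload):
    """Invoke a streaming-enabled Lambda function and print the response
    as it arrives. Requires boto3 and AWS credentials; function_name is
    a placeholder for your own function.
    """
    import boto3  # imported here so the parsing helper above stays dependency-free

    client = boto3.client("lambda")
    response = client.invoke_with_response_stream(
        FunctionName=function_name,
        Payload=payload,
    )
    for chunk in iter_payload_chunks(response["EventStream"]):
        print(chunk, end="", flush=True)
```

Keeping the chunk iteration in its own generator makes the time-to-first-byte benefit concrete: the caller can render each chunk the moment it arrives instead of waiting for the full payload.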
Source: aws.amazon.com

AWS Certificate Manager now supports native certificate search

AWS Certificate Manager (ACM) now provides a search bar in the console that customers can use to find certificates using one or more certificate parameters such as domain name, certificate ARN, or certificate validity. For example, ACM users who manage multiple certificates can search for certificates with specific domains that are due to expire soon. To get started, use the new SearchCertificates API, or navigate to the ACM console and use the search bar to search by one or more certificate parameters. This feature is available in all Public AWS, AWS China, and AWS GovCloud regions. To learn more about this feature, please refer to Search Certificates. You can learn more about ACM and get started here.
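The new console search bar and SearchCertificates API are the native route. As a hedged illustration of the "specific domains, expiring soon" use case, the sketch below does the equivalent filtering client-side over ListCertificates summaries; it assumes the summaries include the NotAfter expiry field, and the domain suffix and 30-day window are illustrative choices, not part of the announcement.

```python
from datetime import datetime, timedelta, timezone


def filter_expiring(summaries, domain_suffix, within_days, now=None):
    """Return certificate summaries whose domain ends with domain_suffix
    and whose NotAfter expiry falls within the next `within_days` days.

    `summaries` mirrors the CertificateSummary dicts returned by ACM's
    ListCertificates API (DomainName, CertificateArn, NotAfter, ...).
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now + timedelta(days=within_days)
    return [
        s for s in summaries
        if s["DomainName"].endswith(domain_suffix)
        and now <= s["NotAfter"] <= cutoff
    ]


def find_expiring_certificates(domain_suffix, within_days=30):
    """Paginate ListCertificates and filter client-side.
    Requires boto3 and AWS credentials.
    """
    import boto3

    acm = boto3.client("acm")
    summaries = []
    for page in acm.get_paginator("list_certificates").paginate():
        summaries.extend(page["CertificateSummaryList"])
    return filter_expiring(summaries, domain_suffix, within_days)
```

With the server-side SearchCertificates API, ACM performs this kind of matching for you instead of requiring full pagination on the client.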
Source: aws.amazon.com

Announcing Amazon S3 Files, making S3 buckets accessible as file systems

S3 Files delivers a shared file system that connects any AWS compute resource directly with your data in Amazon S3. With S3 Files, Amazon S3 is the first and only cloud object store that provides fully-featured, high-performance file system access to your data. It provides full file system semantics and low-latency performance, without your data ever leaving S3. That means file-based applications, agents, and teams can now access and work with your S3 data as a file system using the tools they already depend on.

Built using Amazon EFS, S3 Files gives you the performance and simplicity of a file system with the scalability, durability, and cost-effectiveness of S3. You no longer need to duplicate your data or cycle it between object storage and file system storage. S3 Files maintains a view of the objects in your bucket and intelligently translates your file system operations into efficient S3 requests on your behalf. Your file-based applications run on your S3 data with no code changes, AI agents persist memory and share state across pipelines, and ML teams run data preparation workloads without duplicating or staging files first. Now, file-based tools and applications across your organization can work with your S3 data directly from any compute instance, container, and function using the tools your teams and agents already depend on.

Organizations store their analytics data and data lakes in S3, but file-based tools, agents, and applications have never been able to directly work with that data. Bridging that gap meant managing a separate file system, duplicating data, and building complex pipelines to keep object and file storage in sync. S3 Files eliminates that friction and overhead. Using S3 Files, your data is accessible through the file system and directly through S3 APIs at the same time. Thousands of compute resources can connect to the same S3 file system simultaneously, enabling shared access across clusters without duplicating data.
S3 Files works with all of your new and existing data in S3 buckets, with no migration required. S3 Files caches actively used data for low-latency access and provides up to multiple terabytes per second of aggregate read throughput, so storage never limits performance. There are no data silos, no synchronization complexities, and no tradeoffs. File and object storage, together in one place without compromise.
S3 Files is now generally available in 34 AWS Regions. For the full list of supported Regions, visit the AWS Capabilities tool. To learn more, visit the product page, S3 pricing page, documentation, and AWS News Blog.
Source: aws.amazon.com