Informatica Innovations Unveiled at AWS re:Invent 2025
Last Published: Dec 02, 2025
As thousands of cloud, data and AI enthusiasts, innovators and leaders come together at AWS re:Invent 2025, one question is paramount: How can we unlock the full potential of AI? In this blog, we will explore the latest Informatica innovations and integrations showcased at re:Invent, highlighting how these advancements empower our joint AWS customers to accelerate their AI and analytics journey with secure, trusted data.
Informatica MCP Servers for Amazon Bedrock AgentCore
Informatica introduced its new Model Context Protocol (MCP) servers, enabling enterprises to seamlessly integrate Informatica’s advanced data management capabilities into their agentic AI workflows on AWS. These MCP servers provide secure, programmatic access to enterprise-grade data management features — including data integration, data governance, and data quality — from Informatica’s Intelligent Data Management Cloud (IDMC) directly to agents running on Amazon Bedrock AgentCore. With this integration, you can easily configure and connect one or more Informatica MCP servers to the AgentCore Gateway, creating a unified endpoint for your agents. You can also leverage the MCP servers’ built-in support for OAuth 2.0 to secure the connection with the AgentCore Gateway.
Through the unified endpoint, you can build and scale agent workflows faster without the hassle of managing multiple tool connections or redoing integrations. Combined with the AgentCore Gateway’s built-in semantic search, Informatica’s rich API set enables agents to select the right Informatica tools based on task context, boosting agent performance and reducing development complexity at scale. Please use the link to sign up for preview access today.
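At the protocol level, an agent discovers tools behind the gateway by sending a standard MCP JSON-RPC request with an OAuth 2.0 bearer token. The sketch below builds such a request; the gateway URL and token are placeholder values, not real Informatica or AWS endpoints, and the exact transport details are defined by your AgentCore Gateway configuration.

```python
import json

# Placeholder gateway endpoint and OAuth 2.0 access token; real values come
# from your AgentCore Gateway configuration and your identity provider.
GATEWAY_URL = "https://example-gateway.invalid/mcp"
ACCESS_TOKEN = "example-access-token"

def build_tools_list_request(request_id: int):
    """Build the HTTP headers and JSON-RPC 2.0 body for an MCP tools/list call."""
    headers = {
        "Authorization": f"Bearer {ACCESS_TOKEN}",  # OAuth 2.0 bearer token
        "Content-Type": "application/json",
    }
    body = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",  # standard MCP method for tool discovery
        "params": {},
    }
    return headers, body

headers, body = build_tools_list_request(1)
print(json.dumps(body))
```

An agent framework would POST this body to the unified endpoint and receive back the combined tool list from every Informatica MCP server registered with the gateway.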
CLAIRE® AI Engine Integration with Anthropic Claude through Amazon Bedrock
Informatica’s CLAIRE® AI Engine now leverages Anthropic’s Claude models through Amazon Bedrock, bringing advanced reasoning, semantic parsing, and natural language understanding to Informatica’s agentic services. This integration enhances CLAIRE®’s ability to support intelligent agents across data integration, data quality, governance, and Master Data Management (MDM).
With Anthropic Claude via Amazon Bedrock, CLAIRE® Agents can perform complex tasks such as schema grounding, SQL optimization and semantic query generation, accelerating how enterprises access, manage and trust their data. Using CLAIRE® Agents, you can interact with Informatica’s cloud platform to:
- Automate goal-based exploration of enterprise data in cloud data warehouses and data lakes using the Data Exploration Agent
- Continuously monitor and remediate data quality across cloud warehouses, MDM systems, and third-party repositories through the Data Quality Agent (Preview)
- Enrich product data attributes from hierarchies and taxonomies in Informatica MDM with the Product Experience Agent (Preview)
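To make the data quality monitoring above concrete, here is a minimal, generic sketch of the kind of rule-based completeness and validity checks such an agent might run; the rules and records are illustrative only and do not reflect the actual Data Quality Agent API.

```python
# Illustrative records with deliberate quality problems.
records = [
    {"customer_id": "C001", "email": "ann@example.com", "country": "US"},
    {"customer_id": "C002", "email": "", "country": "US"},
    {"customer_id": None, "email": "bob@example.com", "country": "DE"},
]

# Hypothetical quality rules: each maps a rule name to a pass/fail predicate.
rules = {
    "customer_id_present": lambda r: bool(r["customer_id"]),
    "email_present": lambda r: bool(r["email"]),
    "email_has_at": lambda r: "@" in (r["email"] or ""),
}

def run_checks(records, rules):
    """Return, per rule, the indexes of records that fail it."""
    failures = {name: [] for name in rules}
    for i, record in enumerate(records):
        for name, check in rules.items():
            if not check(record):
                failures[name].append(i)
    return failures

failures = run_checks(records, rules)
```

A real agent would go further, continuously sampling tables, scoring results against thresholds, and triggering remediation flows when a rule's failure rate exceeds its limit.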
Informatica Agentic AI Blueprint for Amazon Bedrock AgentCore

Informatica unveiled a new Enterprise Agent Blueprint, a prescriptive framework for joint customers and partners to build and scale agentic workflows within their organization using trusted data foundations from Informatica. The architectural blueprint enables users to:
- Incorporate domain and enterprise context into agentic workflows by leveraging metadata (including data quality, lineage, access policies, business glossary terms, etc.) along with semantic insights from Informatica’s Metadata System of Intelligence
- Delegate data validation-related flows via MCP to Informatica IDMC
- Integrate trusted context (e.g., customer details, product details, cross-domain relationships, etc.) from Informatica Master Data Management and 360 applications
- Leverage the 300+ Informatica connectors and data integration pipelines for ingesting data from diverse internal and external enterprise sources into Amazon Bedrock Knowledge Bases
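The first step of the blueprint, enriching an agent's prompt with governance metadata, can be sketched as follows. The metadata shape here (quality score, lineage string, glossary term) is a hypothetical simplification, not IDMC's actual schema.

```python
# Hedged sketch: fold governance metadata into the context an agent sees,
# so tool selection and answers are grounded in data quality and lineage.
def build_agent_context(question: str, metadata: dict) -> str:
    """Render a question plus per-dataset governance metadata as prompt text."""
    lines = [f"Question: {question}", "Data context:"]
    for dataset, meta in metadata.items():
        lines.append(
            f"- {dataset}: quality={meta['quality_score']}, "
            f"lineage={meta['lineage']}, term='{meta['glossary_term']}'"
        )
    return "\n".join(lines)

# Illustrative metadata for one dataset (values are invented).
metadata = {
    "sales.orders": {
        "quality_score": 0.97,
        "lineage": "erp -> staging -> sales.orders",
        "glossary_term": "Customer Order",
    },
}

context = build_agent_context("What was Q3 revenue?", metadata)
```

In a full blueprint implementation this context string would be assembled from catalog lookups at request time and passed to the agent alongside the MCP tool definitions.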
Informatica Achieves AWS Agentic AI Specialization
Back in May 2025, Informatica earned the AWS Generative AI Competency, showcasing its expertise and success in driving digital transformation with AWS generative AI technologies. And now Informatica is a launch partner for the AWS Agentic AI Specialization, a new category within the AWS AI Competency program. This distinction acknowledges Informatica as a trusted AWS Partner that empowers customers to deploy intelligent, self-directed AI systems capable of thinking, planning, and autonomously executing complex business processes. This achievement underscores Informatica’s commitment to keeping pace with the rapid innovation essential for enabling customers and partners to thrive in the fast-evolving AI landscape.
Informatica AI Agent Engineering Support for Amazon Bedrock
Informatica previewed its new AI Agent Engineering Service, which lets you build, connect, orchestrate, and manage fully customizable Informatica agents through an intuitive no-code interface while leveraging a choice of large language models from Amazon Bedrock. Featuring built-in test consoles for thorough validation, advanced monitoring tools, comprehensive SDLC support, and robust logging and observability, this service enables customers to efficiently build, test, deploy, monitor, and govern AI agents at scale. Additionally, the AI Agent Hub (also in Private Preview) provides a rich library of pre-built, domain-specific AI agents and automation recipes, further reducing development and deployment effort.
Informatica Connector for Amazon SageMaker Lakehouse
Informatica announced the general availability of its Cloud Data Integration (CDI) connector for Amazon SageMaker Lakehouse. The unified connector supports both Amazon S3 data lakes and S3 Tables, and it enables you to ingest data from over 300 sources and quickly build no-code or low-code pipelines for machine learning, generative AI, and analytics. With built-in support for Apache Iceberg, the connector offers open and interoperable access to datasets, making it easier for data scientists and engineers to prepare, transform, and operationalize data across AWS analytics and AI environments. For more details, refer to Introduction to Amazon SageMaker Lakehouse Connector.
Row-Level Access Control for Amazon Redshift with Informatica CDAM
Informatica announced a new feature that enables the creation, management, and enforcement of data access policies directly within Amazon Redshift through Informatica’s Cloud Data Access Management (CDAM) service.
With this release, you can assign granular permissions (Read, Write, Delete) on Redshift tables and views and apply filter policies on Redshift tables. Both the access and filter policies are linked to user roles or groups and integrate seamlessly with metadata stored in the Informatica Cloud Data Governance and Catalog service. These permissions and policies are automatically synchronized between CDAM and Redshift, ensuring that any changes to metadata, policy definitions, or policy expiration are instantly propagated, eliminating the need for manual updates. This release streamlines secure data access management within Amazon Redshift, helping you enforce compliance and governance at scale.
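Conceptually, a filter policy is a predicate bound to a role: when that role queries a table, only matching rows come back. The sketch below illustrates the idea with invented roles and data; in practice, CDAM generates and enforces the equivalent policies inside Amazon Redshift itself, so filtering happens in the database, not in application code.

```python
# Illustrative filter policies: each role maps to a row predicate.
# Role names, columns, and rows are hypothetical.
filter_policies = {
    "analyst_emea": lambda row: row["region"] == "EMEA",
    "admin": lambda row: True,  # no filter: admins see every row
}

rows = [
    {"order_id": 1, "region": "EMEA", "amount": 120},
    {"order_id": 2, "region": "APAC", "amount": 80},
]

def apply_filter_policy(role: str, rows: list) -> list:
    """Return only the rows the role's filter policy allows."""
    policy = filter_policies[role]
    return [row for row in rows if policy(row)]

emea_rows = apply_filter_policy("analyst_emea", rows)
all_rows = apply_filter_policy("admin", rows)
```

The synchronization described above means that editing the policy definition in CDAM (say, changing the region predicate) updates what every query under that role returns, without touching the tables or the queries themselves.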
Informatica Managed Secure Agent with AWS PrivateLink
Informatica has announced the general availability of Informatica Managed Secure Agent on AWS. This release enables you to build and deploy no-code, low-code data integration pipelines using Informatica’s managed infrastructure on AWS, reducing the overhead of managing compute resources for Informatica Secure Agents. By leveraging AWS PrivateLink for Service Endpoints, you can securely connect to AWS services such as Amazon S3, Amazon Redshift, Amazon Elastic File System, and Amazon RDS, as well as third-party services like Snowflake, ServiceNow, and Workday, with resource-level granularity. You can also share AWS resources by granting permissions to specific AWS principals, enabling Informatica Secure Agents to access the shared services through private, secure connectivity. For more details, refer to Serverless setup using AWS PrivateLink for Service Network Endpoints.
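On the AWS side, the PrivateLink connectivity described above rests on an interface VPC endpoint. The sketch below assembles the parameters such an endpoint would need for Amazon S3; all resource IDs are placeholders, and the actual provisioning is handled through the Informatica Managed Secure Agent setup rather than by you calling the API directly.

```python
# Parameters for an interface VPC endpoint (PrivateLink) to Amazon S3.
# All IDs below are placeholders for illustration only.
endpoint_params = {
    "VpcEndpointType": "Interface",
    "VpcId": "vpc-0123456789abcdef0",              # placeholder VPC ID
    "ServiceName": "com.amazonaws.us-east-1.s3",   # S3 interface endpoint service
    "SubnetIds": ["subnet-0123456789abcdef0"],     # placeholder subnet
    "SecurityGroupIds": ["sg-0123456789abcdef0"],  # placeholder security group
    "PrivateDnsEnabled": True,
}

# With AWS credentials configured, the endpoint could be created via boto3:
#   import boto3
#   ec2 = boto3.client("ec2", region_name="us-east-1")
#   ec2.create_vpc_endpoint(**endpoint_params)
```

Traffic through such an endpoint stays on the AWS network rather than traversing the public internet, which is what gives the Secure Agent its private, resource-scoped connectivity.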