The ELT Process: Advanced Pushdown Optimization Versus Open Database Connectivity Pushdown

What’s Right for Your ELT Process/Data Pipeline?

Last Published: Aug 04, 2022

Arkapravo Chakraborty

Principal Product Manager


If you are processing your data pipeline using ELT (extract, load, transform) with Informatica Cloud Data Integration (in PowerCenter or Intelligent Data Management Cloud (IDMC)), you have likely used Open Database Connectivity (ODBC)-based pushdown. Pushdown, the mechanism behind ELT, translates the entire transformation logic into SQL queries and pushes them down to the underlying target. This speeds up mapping processing because the whole transformation logic is encapsulated in a SQL query, and that query executes in the target itself rather than requiring back-and-forth data movement between the target and the Informatica engine.
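To make the mechanism concrete, here is a minimal sketch of the pushdown idea, using Python's built-in sqlite3 as a stand-in target database. The table names, column names, and transformation logic are hypothetical and purely illustrative; the point is that the filter-and-aggregate step runs as one SQL statement inside the target instead of row by row in an external engine.

```python
# Illustrative sketch of pushdown: the transformation runs as a single
# SQL statement inside the target database (sqlite3 stands in for a
# cloud data warehouse). All names here are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Stage some raw data in the target (the "load" step of ELT).
cur.execute("CREATE TABLE staging_orders (order_id INTEGER, amount REAL, region TEXT)")
cur.executemany(
    "INSERT INTO staging_orders VALUES (?, ?, ?)",
    [(1, 120.0, "emea"), (2, 80.0, "amer"), (3, 200.0, "emea")],
)

# Without pushdown, the integration engine would fetch every row,
# transform it in its own process, and write the results back.
# With pushdown, the whole filter/expression/aggregate logic is
# translated into one SQL statement executed where the data lives:
cur.execute("""
    CREATE TABLE region_totals AS
    SELECT UPPER(region) AS region, SUM(amount) AS total_amount
    FROM staging_orders
    WHERE amount > 100
    GROUP BY UPPER(region)
""")

print(cur.execute("SELECT region, total_amount FROM region_totals").fetchall())
# -> [('EMEA', 320.0)]
```

Only one statement crosses the wire; no intermediate rows leave the database. That is the core efficiency argument for pushdown, whether it is delivered via an ODBC connection or via APDO.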

Last year we introduced an advanced pushdown optimization (APDO) capability, designed to handle various cloud data lake/data warehouse use cases and patterns. While it serves the same purpose (pushing the transformation logic down to the target), it differs from ODBC-based pushdown in many ways.

This blog will clarify the differences between APDO and ODBC-based pushdown and provide guidelines on which is best for your ELT process.

6 Tips on When to Use APDO or ODBC-based Optimization

  1. To run a mapping with ODBC pushdown, an ODBC connection must first be established. APDO, by contrast, supports all native connector features, so there is no need to create an additional connection.
  2. While ODBC pushdown is helpful for pushing logic down to ODBC-based targets, its capability is limited when it comes to cloud data warehouse and data lake (CDW/DL) use cases. ODBC pushdown currently supports only classic data warehouse patterns, which generally involve dimensional modeling (dimensions, facts, SCD, etc.); other patterns, such as data vaults and operational data stores, are not covered. APDO is specifically designed to support additional use cases, such as data movement from cloud data lakes to cloud data warehouses, alongside the classic cloud data warehouse patterns.
  3. ODBC pushdown is considered a legacy capability, and there are no plans to add more features to it. APDO is designed to handle advanced CDW/DL patterns and related use cases, and new features are in active development.
  4. In cloud data integration use cases, APDO coverage is broader than that of ODBC-based pushdown. Advanced pushdown supports all major ecosystems and cloud data warehouses (Snowflake, Microsoft Azure, Google BigQuery, Amazon Redshift, and Databricks), as well as the most widely used transformations, functions, and expressions.
  5. To run ODBC pushdown, a Secure Agent is mandatory. APDO, however, can run with a Secure Agent, the Informatica runtime, or Advanced Serverless.
  6. Regarding licensing, APDO carries a separate license; if you are using Informatica's IDMC consumption-based pricing model, it is included in the package. Note also that APDO is metered differently from standard Cloud Data Integration (CDI) runs: a mapping running in APDO full pushdown mode is metered against advanced pushdown, not CDI. ODBC pushdown does not carry a separate license, and if you use ODBC pushdown optimization with CDI, you are metered under CDI metering.

Based on the above, here are our recommendations:

  • If your primary use case is processing data in the cloud, moving it from a cloud data lake (CDL) to a CDW or transforming it within a CDW, APDO is ideal given its wide range of capabilities.
  • If you are using an on-premises data warehouse or any ODBC-based target, ODBC-based pushdown is the better choice.

See ELT Integration in Action

If you are currently using Informatica Cloud Data Integration, our APDO capabilities are suited to any organization that wants to bring cost-efficient and productive ELT integration to its data architecture. Ready to dive in? Get started with a 30-day trial. Or, learn how APDO can speed up your data processing.

Ready to solve complex use cases with APDO? Then join us on August 31 for an exciting webinar, which includes a live demo of how APDO runs in Snowflake!

First Published: Jun 22, 2022