Data Quality and Integration: Central to All Data Projects

Last Published: Nov 11, 2024

Donal Dunne

Associate Director, Product Marketing

While researching this article, I came across the following quote from TDWI in 2006: “Profiling, data quality, and data integration are three business practices that go together like bread, butter, and jam. . . data management professionals and their business counterparts need to coordinate efforts and design projects that integrate all three practices effectively.” Fast-forward to 2020, and TDWI still highlights the importance of data quality and data management: “Organizations must take a holistic approach with key data management practices including data quality, governance, data preparation, and integration.”

Despite the many advances in information technology since both articles were written, only now are some organizations realizing the true value of their enterprise data. Increasing reliance on automated business processes, more stringent regulation, and intensifying competition are driving organizations to focus on the data they use to run their business as they strive to improve customer service, comply with government regulations and customer mandates, and streamline global operations through digital transformation.

Data Everywhere

To support today’s business processes and goals, all corporate data needs to be universally accessible, flexible, reusable, and trusted. Organizations need to know more about what is in their source systems; they need to integrate data from multiple systems into new, more productive data-intensive applications; and they need to cleanse, validate, and enhance data, as well as monitor and manage its quality as it is used across different applications.

Businesses are using more data from more sources in more systems today, and new digital transformation initiatives mean that often data collected for one purpose is reused and repurposed in other applications. But data collected for use in operational systems may not be suitable for other applications such as business intelligence, customer relationship management, machine learning, or regulatory reporting. At the same time, fragmented and hybrid IT systems can lead to various data quality issues, such as data duplication, lack of conformity, and other discrepancies if all related copies of data are not kept synchronized and current.

Symbiotic Relationship

Data quality and data integration technologies have evolved to support organizations grappling with fragmented and defective data. There is no doubt that the technologies support intrinsically symbiotic activities, but where does one end and the other begin? The answer is that the two functions should work together seamlessly.

Before data integration processes can deliver data to the data warehouse, customer relationship management system, or business analytics applications, the data needs to be analyzed and cleansed.

Analyzing and profiling the data prior to data integration are essential steps in the planning process and dramatically speed up the development of data integration workflows and mappings. This initial profiling helps organizations to identify and understand their source data and ultimately to reconcile the source data and the target systems through effective data transformation.
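Profiling of this kind is straightforward to sketch. The snippet below is a minimal illustration in plain Python, not a real profiling tool; the column names and sample records are hypothetical. It computes the basic per-column statistics a profiling step would surface: null counts, distinct values, and observed value types.

```python
def profile(rows, columns):
    """Compute simple per-column profile statistics:
    row count, null count, distinct values, and observed types."""
    stats = {}
    for col in columns:
        values = [r.get(col) for r in rows]
        non_null = [v for v in values if v not in (None, "")]
        stats[col] = {
            "rows": len(values),
            "nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
            "types": sorted({type(v).__name__ for v in non_null}),
        }
    return stats

# Hypothetical records extracted from an operational source system
rows = [
    {"customer_id": 1, "country": "IE", "email": "a@example.com"},
    {"customer_id": 2, "country": "ie", "email": None},
    {"customer_id": 2, "country": "IE", "email": "b@example.com"},
]
report = profile(rows, ["customer_id", "country", "email"])
print(report["email"]["nulls"])          # 1 missing email
print(report["customer_id"]["distinct"]) # 2 distinct keys in 3 rows
```

Even this rough pass reveals the kind of findings that shape integration mappings: a nullable email field and a duplicated customer key that the target system will need to handle.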

But data quality doesn’t stop there. Data quality is not an exercise that can be completed once and then forgotten about. The problem with data is that its quality degenerates over time. Just as data integration is an ongoing process, so too is data quality. Data quality encompasses more than finding and fixing missing or inaccurate data. It means delivering comprehensive, consistent, relevant, fit-for-purpose, and timely data to the business regardless of its application, use, or origin.

As data quality reaches across the enterprise, the distinctions between data quality and data integration blur even further. For just as data integration processes take advantage of data quality, so too enterprise data quality leverages data integration technology. As data quality initiatives get bigger and become more deeply embedded in operational systems and processes, they require the underlying data integration platform to deliver performance and scale, to help organizations realize the true value of their information assets.

The Right Tools for the Right Role

Despite the close linkages between data quality and data integration, the two disciplines require different skill sets and different tools. The data integration developer focuses on moving data from source to target as fast as possible, while the data quality professional works to ensure the quality of the raw data content, using data quality metrics to track accuracy, consistency, and completeness. Having a single platform for data quality and data integration provides organizations with all the tools they need to access, analyze, cleanse, and deliver all data types on-premises, in the cloud, or in a hybrid environment.
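Two of the metrics mentioned above can be sketched in a few lines of standard Python. This is a simplified illustration, not a production data quality rule; the email field, the records, and the format pattern are all hypothetical, and format validity stands in for a fuller accuracy check.

```python
import re

def completeness(rows, col):
    """Fraction of rows where the column is populated."""
    filled = sum(1 for r in rows if r.get(col) not in (None, ""))
    return filled / len(rows)

def validity(rows, col, pattern):
    """Fraction of populated values matching an expected format,
    a simple stand-in for an accuracy check."""
    values = [r[col] for r in rows if r.get(col) not in (None, "")]
    ok = sum(1 for v in values if re.fullmatch(pattern, v))
    return ok / len(values) if values else 1.0

# Hypothetical records with one missing and one malformed email
rows = [
    {"email": "a@example.com"},
    {"email": "not-an-email"},
    {"email": None},
]
print(round(completeness(rows, "email"), 2))  # 0.67
print(validity(rows, "email", r"[^@\s]+@[^@\s]+\.[^@\s]+"))  # 0.5
```

Scores like these, computed on a schedule rather than once, are what make data quality the ongoing process described above: a drop in either number signals that the data has degraded since the last check.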

How Informatica Unifies Data Quality and Integration

Companies of all sizes rely on Informatica’s data integration and data quality solutions to access data they can trust throughout the enterprise. Data Engineering Integration (DEI) ingests data from virtually any business system, in any format, and delivers that data throughout the enterprise at any speed. Data Engineering Quality (DEQ) provides profiling, cleansing, standardization, deduplication, and monitoring of data to ensure that the data delivered is not only accurate, but certifiably accurate. The products in the Intelligent Data Platform work together at different stages of the data integration and data quality process, providing robust, enterprise-class, quality data from a single integrated environment.
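To make standardization and deduplication concrete, here is a generic sketch of the underlying techniques in plain Python. This is not Informatica's implementation or API; the field names, records, and exact-match key are hypothetical, and real tools typically use fuzzy matching rather than an exact key.

```python
def standardize(record):
    """Normalize fields so equivalent records compare equal:
    collapse whitespace, uppercase country codes, lowercase emails."""
    return {
        "name": " ".join(record["name"].split()).title(),
        "country": record["country"].strip().upper(),
        "email": record["email"].strip().lower(),
    }

def deduplicate(records):
    """Keep one record per match key after standardization
    (first record wins)."""
    seen = {}
    for rec in map(standardize, records):
        key = rec["email"]  # exact match key; real tools use fuzzy keys
        seen.setdefault(key, rec)
    return list(seen.values())

# Two hypothetical records describing the same customer
records = [
    {"name": "donal  dunne", "country": "ie", "email": "DD@Example.com "},
    {"name": "Donal Dunne", "country": "IE", "email": "dd@example.com"},
]
print(len(deduplicate(records)))  # 1
```

The key point is the ordering: standardization runs before matching, because records that differ only in case, spacing, or formatting cannot be recognized as duplicates until they are normalized.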

Want to learn more? For the 12th straight year Informatica has been named a Leader in the Gartner Magic Quadrant for Data Quality Tools,[1] and has been named a Leader for 14 years in the Gartner Magic Quadrant for Data Integration Tools.[2]

[1] Gartner, Magic Quadrant for Data Quality Tools, Melody Chien and Ankush Jain, 27 March 2019.

[2] Gartner, Magic Quadrant for Data Integration Tools, Ehtisham Zaidi, Eric Thoo, Nick Heudecker, 1 August 2019.

Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, express or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

First Published: Jun 23, 2020