Transforming data from liability to competitive advantage

With more than 100 data sources to validate, Fannie Mae needed to tackle data quality from both a technology and a workforce angle.

“By 2017, a third of Fortune 100 companies will experience information crises, including compromised, non-compliant, or inaccurate data.”


Bad data can lead to poor business agility, policy breaches, and even hefty fines from regulators. In fact, Gartner predicts that by 2017, a third of Fortune 100 companies will experience information crises, including compromised, non-compliant, or inaccurate data. These problems will stem from an inability to adequately assess the value of their data or completely trust their enterprise information.

Fortunately, there are ways to transform data quality from a liability to a competitive advantage. Take Fannie Mae, for example, which provides large-scale access to housing finance in the United States. The organization can’t afford to sacrifice data quality. It is integral to Fannie Mae’s critical business functions, including regulatory compliance, accounting, and financial reporting.

Fannie Mae relies on more than 100 data sources, including mortgage data, real estate attributes, and securities information. To put that in context, this data influenced business decisions that led to the financing of 215,000 home purchases, 212,000 refinanced mortgages, and 93,000 rental units in the second quarter of 2014 alone.

Quality assurance

Because of its role as the leading source of liquidity for housing in the United States, the organization is subject to federal regulations established after the 2008 global financial crisis. Fannie Mae is required to file quarterly reports with both the U.S. Federal Reserve and Wall Street demonstrating the accuracy of its data. It must show how data enters, flows through, and is changed by multiple internal processing and reporting applications. Beyond that, it is responsible for tracing that data to the actions of individual users.

To meet these requirements, Fannie Mae devised a data quality strategy comprising both innovative technology solutions and data governance best practices. This unified approach to data quality allows the organization to proactively discover, profile, monitor, and cleanse its data in a consistent and reusable manner.

Establishing best practices

Instead of evaluating data quality in vertical business silos, Fannie Mae takes an integrated approach to defining and measuring data quality across the enterprise. Dubbed “Holistic Data Quality,” this unique strategy consists of:

  1. A consistent definition of data quality requirements across the organization
  2. Proactive measuring of data quality
  3. Continuous improvement of the quality of business-critical data

The result is a framework that provides greater consistency and transparency into data quality issues across the entire information supply chain.
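To make the idea concrete, here is a minimal, hypothetical sketch of what "a consistent definition of data quality requirements" and "proactive measuring" can look like in practice. This is illustrative only, not Fannie Mae's actual implementation; the field names, records, and rules are invented.

```python
# Illustrative sketch: reusable data-quality rules defined once and
# applied consistently to any data source (names are invented).

def completeness(records, field):
    """Share of records where `field` is present and non-empty."""
    if not records:
        return 1.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def validity(records, field, predicate):
    """Share of records whose `field` satisfies a business rule."""
    if not records:
        return 1.0
    valid = sum(1 for r in records if predicate(r.get(field)))
    return valid / len(records)

# The same rule definitions are reused for every source and every run,
# so quality is measured the same way across the enterprise.
loans = [
    {"loan_id": "A1", "rate": 4.1},
    {"loan_id": "A2", "rate": None},
    {"loan_id": "", "rate": 3.9},
]

scores = {
    "loan_id_completeness": completeness(loans, "loan_id"),
    "rate_validity": validity(loans, "rate",
                              lambda v: v is not None and 0 < v < 20),
}
print(scores)
```

Because the rules are defined once and parameterized by field, the same checks can be scored against every feed, which is what makes the measurements comparable across business lines.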

To avoid missing data anomalies, Fannie Mae also proactively monitors and measures data quality via automatic alerts. This significantly reduces systemic issues and operational incidents.
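The alerting pattern described above can be sketched as a simple threshold check: score each feed, and flag any feed that falls below an acceptable quality level before bad data propagates downstream. Again, this is a hypothetical illustration; the threshold value and feed names are assumptions.

```python
# Hypothetical sketch of proactive monitoring: compare each feed's
# quality score against a threshold and raise alerts automatically.

THRESHOLD = 0.95  # assumed minimum acceptable quality score

def check_feeds(feed_scores, threshold=THRESHOLD):
    """Return alert messages for feeds scoring below the threshold."""
    return [
        f"ALERT: {feed} quality {score:.0%} below {threshold:.0%}"
        for feed, score in sorted(feed_scores.items())
        if score < threshold
    ]

alerts = check_feeds({"mortgage_data": 0.99, "real_estate_attrs": 0.91})
for alert in alerts:
    print(alert)
```

In a real deployment a check like this would run on a schedule and route alerts to the owning team, so anomalies surface as they occur rather than during quarterly reporting.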

The organization also makes it a practice to involve all parties in the pursuit of data quality. Following an initial focus on enterprise-critical data, each line-of-business (LOB) leader is responsible for governing and managing the quality of the data within that LOB.

Consistent data quality requirements, a holistic framework, and greater accountability can help bring order to today's data chaos. For Fannie Mae, these best practices have replaced paper reports with business intelligence, which in turn drives greater transparency into data flows and ensures clean, correct data is collected from multiple sources.

See how the company was able to find the “one source of truth” through its data management solution.
