Driving value without locking down your data

No data should be off-limits. Learn to unleash the power of all your data while still preserving its security.

There’s a perception that if data is considered sensitive it is in ‘lock-down,’ which prevents innovation and improvements across the enterprise. There’s a do-nothing mentality around sensitive information.

—Julie Lockner, vice president of product marketing at Informatica

Since the dawn of computing, there has been a struggle between openness and security, and information managers face an ongoing battle over sensitive data whose use could deliver significant insight to the business. The chief information security officer (CISO), with support from the chief financial officer, insists that sensitive data be locked down to keep it from falling into the wrong hands. The chief operating officer, chief marketing officer, and head of sales, however, want to extract as much value from the data as possible to grow the business, differentiate user experiences, and streamline operations. How can you drive value from sensitive data without risking exposure?

"There’s a perception that if data is considered sensitive it is in "lock-down," which prevents innovation and improvements across the enterprise. There’s a do-nothing mentality around sensitive information. But tools exist that will allow you to conduct successful projects with that data without falling out of compliance or exposing that information," says Julie Lockner, vice president of product marketing at Informatica and co-author of the Potential at Work Community for Application Leaders.

Data masking protects sensitive data from unauthorized access by altering its values while maintaining its original characteristics. You can preserve data security and comply with data privacy regulations while retaining the meaning of the data: the information remains useful and relevant to the business, compliance is maintained, and the risk of a data breach is reduced.
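To make "altering its value while maintaining its original characteristics" concrete, here is a minimal Python sketch of format-preserving masking: each digit or letter is replaced with a random one of the same kind, so separators, length, and shape survive while the real values disappear. This is an illustration only, not Informatica's implementation.

```python
import random
import string

def mask_preserving_format(value: str) -> str:
    """Replace each letter/digit with a random one of the same kind,
    keeping punctuation, case, and length intact (simplified sketch)."""
    out = []
    for ch in value:
        if ch.isdigit():
            out.append(random.choice(string.digits))
        elif ch.isalpha():
            pool = string.ascii_uppercase if ch.isupper() else string.ascii_lowercase
            out.append(random.choice(pool))
        else:
            out.append(ch)  # keep separators such as '-' or '@' as-is
    return "".join(out)

# A masked US-style SSN keeps the NNN-NN-NNNN shape, but the digits are fake
masked = mask_preserving_format("123-45-6789")
```

Because the masked value still looks like an SSN, downstream applications, tests, and analytics that validate formats continue to work against it.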

Different use cases

There are two approaches to data masking:

  1. Dynamic data masking, in which data is masked dynamically in the application
  2. Persistent data masking, in which data is permanently altered at the source

Dynamic data masking integrates with standard access controls such as the Lightweight Directory Access Protocol (LDAP), Active Directory, and identity and access management software. It masks data in production applications without the need to write any code.

"Dynamic data masking lets you mask sensitive fields on the fly. You can share and move data around while ensuring only authorized users are allowed to see the true values. You can use the data in analytics and research without violating data privacy regulations," explains Lockner.

Persistent data masking protects sensitive data in non-production environments. Production applications are often cloned for development, user acceptance testing, and training. Data is automatically and permanently altered during the cloning process, so the copies never contain real values that could be breached or exposed. Because many companies outsource testing and development, security in these environments has become paramount.
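A persistent-masking clone step might look like the following Python sketch, assuming a hypothetical row-based copy. Sensitive fields are replaced with deterministic tokens during the copy, so the test data stays internally consistent (the same customer always maps to the same token) while the originals never leave production.

```python
import hashlib

def pseudonymize(value: str, salt: str = "clone-salt") -> str:
    """Deterministically replace a value with an opaque token so the
    clone stays internally consistent (same input -> same token)."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def clone_with_masking(prod_rows, sensitive_fields):
    """Copy rows into a non-production dataset, permanently masking
    the listed fields during the copy (simplified sketch)."""
    cloned = []
    for row in prod_rows:
        masked = dict(row)
        for field in sensitive_fields:
            masked[field] = pseudonymize(masked[field])
        cloned.append(masked)
    return cloned

prod = [{"customer": "Ada King", "email": "ada@example.com", "tier": "gold"}]
test_data = clone_with_masking(prod, ["customer", "email"])
```

Non-sensitive fields such as `tier` are copied untouched, so developers and outsourced testers work with realistic data that simply contains no real identities.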

Overcoming organizational silos

An organization's operational silos often make it difficult to implement data masking consistently from production to non-production environments. Teams with differing perspectives and goals struggle to coordinate, so it falls to the information manager to act as the cross-functional liaison among them.

"Most people perceive data governance and data security as two different competencies, but they’re not. They should be managed and coordinated as highly complementary initiatives," insists Lockner. "There would be significant economies of scale and greater business value delivered if they work together."

Find out what data masking can mean to your organization and why Gartner calls data masking "mandatory" for certain enterprises in its 2012 Magic Quadrant for Data Masking Technology report.

Related content


Fannie Mae

Informatica helped Fannie Mae ensure clean and correct data is collected and integrated from more than 100 data sources.


Take a collaborative approach to defining data quality

EMC’s Barbara Latulippe explains the importance of working closely with the business to understand who is going to use the data and how.


How fresh is your data?

Simply getting data is not good enough. You must get it to the right people at the right time while it is still fresh enough to be useful.


Turn an application data migration initiative into a data governance pilot

Make application data migration into more than simply moving old data to a new system. It is also an ideal opportunity to showcase the potential value from a data governance program.