Data virtualization turns stakeholders into ‘co-developers’

Rapid prototyping turns a normally difficult requirements process into a collaborative exercise benefiting both business and IT.

“The previous way of doing things took too long—six to eight months. By that point the business or the understanding of the requirement has changed.”

—Robert Myers, technical delivery manager at Informatica Professional Services

Integration projects rely on complex workflows to ensure quality outcomes. To build a strong foundation for your project, you need requirements that reflect business needs. Unfortunately, requirements gathering is often fraught with miscommunication, which invariably results in rework.

So how do you fix this traditionally dysfunctional relationship? Start with prototyping. By building co-development into the process, stakeholders can see better results more quickly. Robert Myers, technical delivery manager at Informatica Professional Services, can attest to its success.

How are businesses solving data integration problems with prototyping?

Robert Myers: The previous way of doing things took too long—six to eight months. By that point the business or the understanding of the requirement has changed. So we’ve started using virtualization.

We get a business user and an IT person together and use a rapid prototyping method. You’re capturing a requirement, you’re prototyping it, and you’re putting it in a data model. Then, you’re co-developing and exposing the data within a day or less.

Can data virtualization tools help with this?

Myers: Yes. You’re able to let the business user start mapping the data and let them apply the transformations. They’re actually doing some of the initial work so they can see what they really want. Then, we hand it to the developer, who puts it in a more “productionalized”—or finished—state.

How do you avoid compromising quality?

Myers: Data virtualization enables the business to put quality rules in place as the method and the data are being presented to them. Businesses might say they don’t have the time to participate this way. To counter that argument, IT must be seen as offering a service superior to what the business could do on its own.

You’re doing requirements validation through a business analyst “seeing” the prototyped data. That is an explicit process requirement. You’re not lowering quality. You are actually increasing quality.

What are the governance advantages with this approach?

Myers: You’re measuring the data quality as it passes through the system of initiation, into your system of record, and into your analytics program. So you want to be able to very rapidly put those measurements in, which ties into your governance program.
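The measurements Myers describes can be as simple as rule-based scores computed wherever the data lands. As a rough illustration only (this is not Informatica’s tooling; the rule names, fields, and sample records below are all hypothetical), a quality scorecard for data arriving from a system of initiation might look like:

```python
def completeness(records, field):
    """Fraction of records where `field` is present and non-empty."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def validity(records, field, predicate):
    """Fraction of records whose `field` value satisfies `predicate`."""
    if not records:
        return 0.0
    valid = sum(1 for r in records if predicate(r.get(field)))
    return valid / len(records)

# Hypothetical customer records from the system of initiation.
customers = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": "", "age": 29},
    {"id": 3, "email": "c@example.com", "age": -5},
]

# The scorecard ties each measurement back to a governance rule.
scorecard = {
    "email_completeness": completeness(customers, "email"),
    "age_validity": validity(
        customers, "age", lambda v: isinstance(v, int) and 0 <= v <= 120
    ),
}
print(scorecard)
```

The point of the sketch is that the same scorecard can be recomputed as the data moves into the system of record and on into analytics, so the governance program sees quality at each hop rather than only at the end.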

The other thing we’re starting to see now is the ability to map logical data objects into the business glossary of a governance program. Now they can immediately say, “let’s change that definition,” or “we need to add a new definition.”

How does this lead to more collaboration between business and IT?

Myers: With prototyping, what developers are saying is, “I want to sit with the user, I want to understand the requirements, and I want to build it with them.” That’s where this combination of virtualization and user-enabled tools is cementing the partnership between business and IT.

For a real-world example, read how HealthNow improved collaboration and benefited from prototyping in this case study.
