Build on innovation and avoid reinvention

How can developers reuse objects if they don’t know those objects exist in the first place? Try a “write once, deploy anywhere” methodology.

It may seem like you are tasked with a never-ending stream of business-critical integration projects. The key to efficiency is building on existing development so you avoid wasting hundreds of hours rewriting code. In a world where reuse is what delivers speed and responsiveness, you need to embrace plagiarism.

Redeployment has rarely been an option, for a few reasons: you don’t tend to have insight into the original processes, and most integration logic is hidden in files that aren’t searchable. The popular mantra “write once, deploy anywhere” therefore raises a number of questions:

  • How can you find out whether a relevant object already exists?
  • How can you be motivated to deploy an existing object a second time if you are unaware of its origins?
  • Can you trust the work of the original developer?
  • How can you reuse what you can’t see or understand?

Reuse, reinvent, respond

You need to know that relevant objects and artifacts actually exist before you can redeploy them even once. This is where an embeddable data management engine, such as Informatica’s Vibe virtual data machine (VDM), provides an advantage.

Vibe lets you deploy logic regardless of platform or the type, volume, or source of the data. And if any of those elements change, you can redeploy objects without recoding or redeveloping them. The logical mappings are stored in the metadata repository; at runtime, the VDM extracts the logic and compiles it into the appropriate execution instructions.

In other words, not only can you map once and deploy anywhere, but you can also apply the same mapping whether your data is stored in Hadoop or a commercial database management system. This capability also applies to data quality, masking, and profiling logic. The biggest ramification, however, may be the business value a VDM provides by helping an organization integrate new technology such as big data (e.g., Hadoop) using employees’ existing skill sets.
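To make the “map once, deploy anywhere” idea concrete, here is a minimal sketch in Python. It is not Informatica’s API; Mapping, SqlExecutor, and HadoopExecutor are hypothetical names chosen only to illustrate the general pattern the article describes: the integration logic lives in engine-neutral metadata, and each execution engine compiles that metadata into its own instructions.

    # Purely illustrative sketch -- these classes are hypothetical and are not
    # the Vibe VDM API. They show only the pattern: logic is declared once as
    # engine-neutral metadata, and each runtime compiles it into its own
    # execution instructions.

    from dataclasses import dataclass, field


    @dataclass
    class Mapping:
        """Engine-neutral description of the integration logic (the metadata)."""
        name: str
        source: str       # logical source, e.g. "orders"
        target: str       # logical target, e.g. "orders_clean"
        transforms: list[str] = field(default_factory=list)  # ordered transform expressions


    class SqlExecutor:
        """Compiles a mapping into SQL for a relational database."""
        def compile(self, m: Mapping) -> str:
            cols = ", ".join(m.transforms) or "*"
            return f"INSERT INTO {m.target} SELECT {cols} FROM {m.source}"


    class HadoopExecutor:
        """Compiles the same mapping into a (pretend) Spark-style plan."""
        def compile(self, m: Mapping) -> str:
            cols = ", ".join(repr(t) for t in m.transforms)
            return (f"spark.table({m.source!r}).selectExpr({cols})"
                    f".write.saveAsTable({m.target!r})")


    # Map once...
    mapping = Mapping(
        name="standardize_orders",
        source="orders",
        target="orders_clean",
        transforms=["trim(customer_name)", "upper(country_code)"],
    )

    # ...deploy anywhere: the logic never changes, only the executor does.
    for executor in (SqlExecutor(), HadoopExecutor()):
        print(executor.compile(mapping))

The design choice the sketch highlights is separation of concerns: because the transforms are data rather than code tied to a runtime, moving from a relational database to Hadoop (or back) means swapping executors, not rewriting mappings.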

A changing paradigm

In some ways, developers have historically been compensated for the code they write. A VDM shifts this paradigm, requiring you to be agile in meeting the demands of the business. Not only will your big data integration work deliver business value to your organization, but it will also help lower operating costs.

The role of data integration in a rapidly changing application landscape is becoming more critical. Gone are the days of standardizing on a single vendor’s suite; you have a much more heterogeneous world to deal with. Take advantage of the opportunity to focus less on rewriting code and more on innovating and evolving alongside new technology.

For more information on how a VDM not only benefits your organization but also increases the importance of your role within it, read "Taking Charge of Change."

Related content

LexisNexis

Informatica gives LexisNexis the power to shape the world with information and technology.

Take agility to the next level by using data virtualization for prototyping

Shed the waterfall model and reap the benefits of prototyping: earlier feedback, tighter collaboration, and more accurate results.

An old-school technique for new-school big data

Use this obscure technique to realize the benefits of big data while avoiding flat file bottlenecks.

Profile, test, measure, improve, repeat

Learn from your own experience, and Disney's, by leaving nothing to chance and everything to data.