
5 steps to deriving business value from the IoT with existing resources

Gathering and analyzing data for real-time decision-making doesn’t require investing in brand-new tools, technologies, or training.


Empowering business users with self-service access to big data is of little value if they cannot analyze and act on what they find while it’s still fresh and relevant. From a design standpoint, it is impractical to adapt ill-suited, unwieldy traditional architectures to innovative uses. And it is wholly inefficient to adopt new solutions that demand unrealistic levels of expertise to manage or that lack maturity as enterprise solutions.

There is a way, however, to improve responsiveness and extract value without sacrificing agility when designing applications that introduce Internet of Things (IoT) data. The five steps below provide an admittedly simplified snapshot of what’s involved in transforming machine data from mere logs into valuable insights about customers and operations. They can likewise transform your role as a developer or architect from playing support to contributing real-time operational intelligence that can drive business strategy.

Step 1: Connect

By quickly configuring, deploying, and monitoring connections to any number of data sources, you remove the complexity inherent in hand-mapping sources to targets. To support this process, look to a visual big data ingestion tool, like Informatica’s Vibe Data Stream, that lets you choose data sources, identify target systems, and define desired outcomes in a graphical mapping function. You can also easily define processing speed, prioritizing use cases that require real-time insight into the data over those that can tolerate near-real-time or batch processing.
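
To make the mapping idea concrete, here is a minimal sketch in Python of how a source-to-target connection might be expressed declaratively instead of hand-coded. The Connection fields, endpoints, and paths are hypothetical illustrations, not Vibe Data Stream’s actual configuration format; a visual tool would generate an equivalent definition for you.

```python
# A minimal sketch of declarative source-to-target mapping.
# All names and endpoints are hypothetical.

from dataclasses import dataclass

@dataclass
class Connection:
    source: str    # e.g. a message queue or log directory
    target: str    # e.g. an HDFS path or analytics store
    mode: str      # "real-time", "near-real-time", or "batch"
    priority: int  # lower numbers are deployed first

connections = [
    Connection(source="sensor-gateway:9092", target="hdfs:///iot/raw",
               mode="real-time", priority=1),
    Connection(source="/var/log/app/*.log", target="hdfs:///logs/daily",
               mode="batch", priority=3),
]

# Real-time use cases are dispatched ahead of batch ones.
for conn in sorted(connections, key=lambda c: c.priority):
    print(f"deploying {conn.mode} connection: {conn.source} -> {conn.target}")
```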

Step 2: Collect

You cannot expect real-time data streaming from a tool that is unreliable or unable to scale; if those were your expectations, the legacy solutions you already have would suffice. Instead, you need to trust that your data collection, or ingestion, solution can meet high-volume demands with high availability and built-in failover.

Scalability is not only important in terms of your data; it is also key to managing resources and getting new employees up to speed. A tool built on standard APIs and SDKs is accessible to a larger pool of current and future employees, adding another layer of agility to the process.
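
As an illustration only, the sketch below shows the kind of failover logic a mature collection tool handles for you: retry with backoff against a primary endpoint, then fail over to a standby. The send() transport and endpoint names are placeholders, not a real ingestion API.

```python
# A minimal sketch of collection with built-in failover.
# send() is a placeholder transport, assumed to raise
# ConnectionError when an endpoint is unreachable.

import time

ENDPOINTS = ["collector-a.example.com", "collector-b.example.com"]  # primary, standby

def send(endpoint: str, record: dict) -> None:
    """Placeholder transport for illustration."""
    print(f"sent {record} to {endpoint}")

def collect(record: dict, retries: int = 3) -> None:
    # Try the primary endpoint first, then fail over to the standby.
    for endpoint in ENDPOINTS:
        for attempt in range(retries):
            try:
                send(endpoint, record)
                return
            except ConnectionError:
                time.sleep(2 ** attempt)  # exponential backoff before retrying
    raise RuntimeError("all collectors unavailable")

collect({"device": "pump-7", "temp_c": 81.4})
```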

Step 3: Parse

When you collect real-time data, you are on the receiving end of everything from social to log to machine data. An essential step is processing and normalizing the data you need while bypassing the data you don’t. By connecting your data source directly to your target, you avoid the often costly and time-consuming step of staging your data. Instead, data is delivered according to pre-defined user requirements, and irrelevant records are bypassed. As a result, you can guarantee that “real time” really is real time.
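
Here is a minimal sketch of that parse-and-forward pattern, assuming a hypothetical key=value log format: relevant records are normalized and streamed straight to the target, while everything else is dropped inline, with no staging area in between.

```python
# A minimal sketch of the parse step with no staging: records flow
# from source to target, and irrelevant lines are filtered inline.
# The log format and field names are hypothetical.

import json

RAW_STREAM = [
    "2015-03-02T10:14:07 device=pump-7 temp_c=81.4",
    "2015-03-02T10:14:08 heartbeat",                  # irrelevant: bypassed
    "2015-03-02T10:14:09 device=pump-7 temp_c=83.1",
]

def parse(lines):
    for line in lines:
        if "device=" not in line:      # bypass the data you don't need
            continue
        ts, *fields = line.split()
        record = dict(f.split("=") for f in fields)
        record["timestamp"] = ts
        record["temp_c"] = float(record["temp_c"])  # normalize types
        yield record

for record in parse(RAW_STREAM):
    print(json.dumps(record))          # delivered straight to the target
```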

Step 4: Analyze

You are now ready to put your data to work. At this stage, your data is analyzed, with anomalies or events triggering pre-defined responses, such as those described in Step 5. You reap the immediate rewards of an event-driven architecture fueled by real-time data instead of gambling that manual queries will detect anything of immediate or meaningful value.
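
The sketch below illustrates the event-driven pattern with a hypothetical temperature rule: each incoming record is evaluated against pre-defined predicates, and a match fires its handler immediately rather than waiting for someone to run a query.

```python
# A minimal sketch of event-driven analysis: pre-defined rules fire
# handlers as records arrive. The threshold and handler are illustrative.

TEMP_LIMIT_C = 85.0  # hypothetical operating limit

def on_overheat(record):
    print(f"ALERT: {record['device']} at {record['temp_c']} C")

RULES = [
    (lambda r: r["temp_c"] > TEMP_LIMIT_C, on_overheat),
]

def analyze(record):
    for predicate, handler in RULES:
        if predicate(record):
            handler(record)   # the event drives the response

analyze({"device": "pump-7", "temp_c": 86.2})
```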

Step 5: Act

In the final step, your pre-defined scenarios trigger actions. Patterns suggesting specific criminal activity, for example, may trigger immediate notifications alerting authorities to an urgent need for intervention. Inconsistent credit card use, such as charges in two different cities at the same time, may prompt a phone call from a customer service representative to confirm that the unusual spending is legitimate. Or you may want to batch-process data to track correlations between behaviors that, while timely, don’t require an immediate response, such as the likelihood that a couple newly expecting a child will want to take out a home equity loan.
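
To ground the credit card example, here is a minimal sketch of the act step. The 30-minute window, the request_callback() action, and the data shapes are all illustrative assumptions, not a prescribed fraud-detection design.

```python
# A minimal sketch of the act step: two charges on the same card in
# different cities within a short window trigger a pre-defined action.

from datetime import datetime, timedelta

def request_callback(card: str) -> None:
    print(f"queueing customer-service call for card {card}")

last_seen = {}  # card -> (city, timestamp)

def act_on_charge(card: str, city: str, ts: datetime) -> None:
    prev = last_seen.get(card)
    if prev and prev[0] != city and abs(ts - prev[1]) < timedelta(minutes=30):
        request_callback(card)   # immediate, pre-defined response
    last_seen[card] = (city, ts)

act_on_charge("card-1234", "Boston", datetime(2015, 3, 2, 10, 15))
act_on_charge("card-1234", "Denver", datetime(2015, 3, 2, 10, 20))
```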

Designing big data applications that use IoT data in a pragmatic, systematic fashion removes many implementation hurdles. Using new Informatica tools lets you capitalize on existing technology and the skills of current employees. This translates into operational efficiencies that free you to devote more time to strategy that influences the top and bottom line.

Get more information and apply these steps to your own data by downloading Vibe Data Stream for Machine Data.
