Author: Wes Caldwell, CTO – ISS
Since the turn of the 21st century, technology thought leaders have seen the possibilities (and perils) of ever-increasing volumes of data, sources of data, and ways to marshal and understand it. The big question is: how can organizations benefit from all this data, rather than being swamped by it or having their useful data disconnected and slumbering away in various databases and storage media? Big data experts have been using variations of the “Vs” model to help people understand the landscape and critical factors of big data. Gartner identified three Vs back in 2001, along with 12 associated dimensions of Big Data. Below is an example graphic of the classical ‘three Vs’ of Big Data.
In a recent infographic, IBM cites four Vs—Volume, Velocity, Variety and Veracity—as a framework for understanding Big Data. Others throw Volatility and Validity into the mix. Any of these terms can be combined into a valuable framework for understanding the elements and challenges of the arena.
You can cite as many factors as you like, but this alphabet soup all needs to add up to the ultimate V: Value. That value could be the ability to spot a key trend that was previously invisible, to find cost savings by looking at an initiative or line of business from a different logistical perspective, or to detect a point of contact with customers or clients where small improvements could yield significant gains.
The evolution of data integration (getting all your legacy and real-time data into one place) and enterprise search (enabling you to query and actually make sense of data) together represent the keys to unlocking the fruits of big data.
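To make that pairing concrete, here is a deliberately tiny Python sketch—a toy stand-in, not a real integration platform or search engine. The record layout, the `integrate` helper, and the keyword-match `search` are all illustrative assumptions; a production system would use a proper ETL pipeline and a full-text index.

```python
# Toy illustration: pull a "legacy" batch source and a "real-time" feed
# into one place (data integration), then query it (enterprise search).

legacy_db = [
    {"id": 1, "text": "Q3 shipping costs rose in the midwest region"},
    {"id": 2, "text": "customer churn steady across retail accounts"},
]
realtime_feed = [
    {"id": 3, "text": "sensor alert: shipping delay at midwest hub"},
]

def integrate(*sources):
    """Flatten multiple data sources into a single collection."""
    return [record for source in sources for record in source]

def search(index, term):
    """Naive search stand-in: case-insensitive keyword match on text."""
    return [r["id"] for r in index if term.lower() in r["text"].lower()]

index = integrate(legacy_db, realtime_feed)
print(search(index, "shipping"))  # matches records from both sources: [1, 3]
```

The point of the sketch is the shape of the workflow, not the code: value only appears once previously siloed records sit in one queryable place, so a single question ("shipping") surfaces related signals from both the legacy system and the live feed.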
To help understand (and eventually unlock) these gains, I will discuss the nature of three of the foundational Vs in upcoming blog posts—Volume, Variety and Velocity—and how meeting their challenges yields the one that really matters: Value. At the end of the day, you can have all the Vs you want, but if there’s no value, it’s all an academic exercise anyway.