Big Data: the top 3 misconceptions

We count down the top 3 most common misconceptions about Big Data

Big data: It's the thing everyone's talking about. But how many of us actually understand it?

Big Data is a big deal. Over the past 18 months or so, interest in the technology has skyrocketed, and it has become an indispensable business tool. However, though the term is widely used and accepted, not everyone is entirely sure what Big Data actually is or how it works.

Most of us know that Big Data is about storing and analysing the vast amounts of information generated every day in order to extract insights that will make businesses more efficient. But how this data collection, storage and analysis actually works is still beyond many of us.

Here are some common misconceptions about Big Data, and a healthy dose of reality:

1. The most important thing about Big Data is its size

No. “I have an issue with the term Big Data,” Stefan Groschupf, CEO of data analytics company Datameer, told a rapt audience at a Big Data event in London. The CEO suggested that though the term is business-friendly, it is misleading: by referring only to the data’s size, it reduces the challenges of the industry to a problem of storage and technology. And though data storage is certainly a huge part of Big Data, the speed with which that data can be accessed and processed is perhaps an even more central issue.


Big Data has three main aspects, known as the three Vs: Volume, Variety and Velocity. So while volume is important, the real key to Big Data is variety. “It is not just the fact that Big Data has changed, it is more that the definition of what is considered data has changed,” explains David Dietrich, an advisory technical consultant at EMC, who specialises in Big Data. “That is, most people think of data as rows and columns of numbers, such as Excel spreadsheets, RDBMSs, and data warehouses storing terabytes of structured data.”
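Dietrich’s point about variety is easier to see with a small, purely illustrative Python sketch (the field names and records below are invented for the example): the same business event can arrive as a tidy table row, as a nested JSON document, or as free text, and each form needs different handling.

```python
import csv
import io
import json

# Three records describing the same kind of purchase event, arriving in three
# forms: structured (CSV), semi-structured (JSON) and unstructured (free text).
structured_row = "1001,2024-05-01,49.99"
semi_structured = '{"order_id": 1001, "items": ["book", "pen"], "total": 49.99}'
unstructured = "Customer emailed: 'Loved the book, but the pen arrived late.'"

def parse_structured(row: str) -> dict:
    """Classic tabular data: a fixed schema of columns."""
    order_id, date, total = next(csv.reader(io.StringIO(row)))
    return {"order_id": int(order_id), "date": date, "total": float(total)}

def parse_semi_structured(doc: str) -> dict:
    """A JSON document: nested, variable fields that a spreadsheet never anticipated."""
    return json.loads(doc)

def parse_unstructured(text: str) -> dict:
    """Free text: no schema at all; here we only extract a crude complaint flag."""
    return {"raw": text, "mentions_complaint": "late" in text.lower()}

print(parse_structured(structured_row))
print(parse_semi_structured(semi_structured))
print(parse_unstructured(unstructured))
```

Only the first record fits comfortably into rows and columns; the other two are exactly the kinds of input that have stretched the definition of “data”.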

2. There is a ‘one-size-fits-all’ data solution

Not even close. There is a wide variety of tools available to store, process and handle data, and each company should spend some time finding the solutions that best fit its needs. Every company produces useful data that can lead to valuable insights, but not all of that data is the same, and not every company will need to use it in the same way.

So there is very little point in investing in the latest fashionable Big Data solution if it will not meet your company’s needs. It is, however, important to invest in the right type of analytics and storage for your company; otherwise the investment will simply be wasted.

For smaller companies in particular, it is important to decide what types of insight are valuable, and to focus resources accordingly. More data does not usually mean better insights. In fact, collecting and storing an enormous amount of data, or simply the wrong type of data, can lead to poor decisions being taken. There is so much data out there that companies must be clear about what they are looking for, and their Big Data solutions must be geared towards answering those questions.

3. Data analysis can eliminate uncertainty

It cannot. Big Data can and should be deployed to reduce uncertainty, but no amount of analysis can remove uncertainty from the equation entirely. In fact, believing that data can eliminate surprises and mistakes, or predict the future, will only lead to frustration and disappointment. To rely too much on data solutions is to underestimate the human experience that is a vital part of business.

Systematic risk can never be fully eliminated, but it can be mitigated. Final decisions will always be made by people, and Big Data is simply another tool to help this process along. A good decision-maker will understand that uncertainty is a fact of life and business; he or she will try to reduce it, then move on.
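To make that concrete, here is a small, purely illustrative Python sketch (the sales figures are invented): as the sample of data grows, the margin of error around an estimated average shrinks steadily, yet it never reaches zero.

```python
import random
import statistics

random.seed(42)  # reproducible toy example

# Hypothetical daily-sales figures drawn from a process with genuine noise.
def sample_daily_sales(n: int) -> list:
    return [random.gauss(500, 80) for _ in range(n)]

for n in (10, 100, 1_000, 10_000):
    sales = sample_daily_sales(n)
    mean = statistics.mean(sales)
    # Rough 95% margin of error on the estimated average.
    margin = 1.96 * statistics.stdev(sales) / (n ** 0.5)
    print(f"n={n:>6}: estimated average {mean:7.2f} +/- {margin:5.2f}")

# The margin of error shrinks as n grows, but it never reaches zero:
# more data reduces uncertainty about the average, it does not abolish it.
```

That is the sense in which Big Data reduces, rather than eliminates, uncertainty: the picture gets sharper, but a residue of doubt always remains for a person to judge.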
