60% of Big Data projects fail to take off
Big Data has many potential advantages. It can provide new insights into consumer behavior, show where your organization can work more efficiently, predict future changes, and much more. However, many companies forge ahead into Big Data without being adequately prepared and charge straight into a pitfall. Research shows that about 60% of Big Data projects stumble out of the starting blocks. How can you make sure your organization doesn't become part of that statistic?
Here are the seven most common Big Data pitfalls, and how to avoid them.
1. Not having a clear purpose for data
A lot of companies plunge into Big Data projects without a clear end goal in sight. Like a dog chasing cars, they follow the Big Data hype and start collecting data without knowing what to do with it once they have it. That's a good way to waste a lot of time, money, and (human) resources. Before building a data warehouse or purchasing expensive BI tools, think about what you want to achieve with Big Data. Which decisions do you want to make faster? Which processes should work more efficiently? What consumer behavior do you want to research? Once you know what you want to achieve, you know which data to gather and how to use that data to get the desired results.
For instance, the city of Dublin started gathering traffic data with the goal of improving traffic flow and increasing road safety. It placed sensors on roads and in buses and visualized the data on maps, allowing the city to respond to problems, in many cases before they arise. And a hospital in the Netherlands managed to shave 20 minutes off the time it takes to treat heart patients, a treatment where every second counts. So Big Data can even save lives.
2. Only looking at your own Big Data
Many companies don’t look past their own data, but analyzing Big Data from other sources can add a lot of value. Data from social media, web content, and data providers can supplement your internal view with useful external information, providing new insights. Don’t just look within your own company and industry, but also outside of it.
A store owner who only looks at the products they sell themselves, and not at the competition, is missing crucial insight into consumer behavior. Consumer behavior is changing faster than ever, and staying agile and anticipating shifts in the marketplace is crucial to remaining competitive.
Besides internal and external data, there’s also the so-called Zero Data: the data that you’re not registering (yet). Sometimes, the answer you’re looking for might be hiding somewhere you hadn’t thought to look. There’s a risk of missing the forest for the trees when you just look at the data you have. Always be on the lookout for new angles that may provide the insight you’re missing.
3. Not using the right tools
Many companies still misuse Excel as their “data warehouse.” This costs a lot of time and money, because spreadsheets are anything but efficient for the job. Most spreadsheets contain errors, and filtering them out takes a lot of time. Entire meetings can be lost to interpreting figures and checking their accuracy, when using data properly should prevent exactly these problems. After all, data is about facts, not feelings.
To help with this choice, Passionned Group regularly conducts its 100% vendor-independent BI Tools Survey, which assists companies in selecting the Business Intelligence tool that best matches their specific needs and desires.
4. Not complying with privacy regulations
Big Data comes with a lot of potential advantages, but also with risks and temptations. Data has to be handled with respect for privacy. All kinds of laws and regulations exist for exactly this reason, such as the recently introduced GDPR. Make sure to treat and store personal data with care and to comply with all applicable rules. Do you have the right to store and use the data? Is the data anonymized? Are you allowed to sell the data? Consider all the rules and avoid costly mistakes.
Nestlé Germany is a good example. The company handled data with such care that it didn’t have to make any adjustments when GDPR arrived: it was already compliant. It’s okay not to push the rules to their limits; just because you can, doesn’t mean you should. At the other end of the spectrum, some companies have gone so far as to pay hackers to keep a breach of personal information quiet. That’s exactly what you want to avoid.
5. Confirmation bias
When doing research, there can be a tendency to seek out results that support the answer you already want, and to leave out anything that doesn’t fit your preconceived notions. This phenomenon is known as confirmation bias. It’s best to do research without hunting for a specific answer. Most professional data analysis therefore works the other way around: the analyst formulates a so-called null hypothesis and tries to reject it, rather than trying to confirm a favored conclusion.
Using Big Data is about making decisions based on facts and data, not gut feelings and opinions. That’s the way towards success and becoming an intelligent organization.
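As a small sketch of what testing against a null hypothesis looks like in practice, the following Python simulation compares two hypothetical webpage variants with a permutation test. All the numbers (visitor counts, conversion rates) are invented purely for illustration and are not from the examples above.

```python
import random

random.seed(42)

# Hypothetical data: conversions among 1,000 visitors per page variant.
# These figures are invented for illustration only.
control = [1] * 50 + [0] * 950   # 5.0% conversion rate
variant = [1] * 68 + [0] * 932   # 6.8% conversion rate

observed_diff = sum(variant) / len(variant) - sum(control) / len(control)

# Null hypothesis: the variant makes no difference, so the group labels
# are interchangeable. Shuffle the pooled data many times and count how
# often a difference at least this large appears by pure chance.
pooled = control + variant
trials = 5_000
extreme = 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = sum(pooled[:1000]) / 1000 - sum(pooled[1000:]) / 1000
    if diff >= observed_diff:
        extreme += 1

p_value = extreme / trials
print(f"observed difference: {observed_diff:.3f}, p-value: {p_value:.3f}")
```

A small p-value means the observed difference is unlikely under the null hypothesis, so you reject it based on the data, not because you went looking for a particular answer.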
6. Analyzing a small sample size
Analyzing a small data sample is likely to lead to wrong conclusions, because a small sample typically isn’t representative of the whole. With a small sample, it’s also difficult to determine whether a value is a statistical outlier or not. Extrapolating from a small sample to a much larger data set therefore often produces misleading results, so make sure your sample size is adequate.
As an example, say there’s a parking lot with five cars in it: one blue, one red, one green, one black, and one white. Based on this sample alone, you might conclude that every car in the world is one of these five colors, each with a one in five probability. Common sense tells us this is nonsense, but the same principle applies to small samples in general: they aren’t representative of the whole.
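The parking-lot intuition can be made concrete with a quick simulation. In this sketch the population of cars and its color mix are invented for illustration: a tiny sample of five cars can land far from the true proportion, while a larger sample usually comes close.

```python
import random

random.seed(0)

# Hypothetical population: 100,000 cars, 40% of them white.
# These proportions are invented for illustration only.
population = ["white"] * 40_000 + ["other"] * 60_000
random.shuffle(population)

def white_share(sample_size):
    """Estimate the share of white cars from a random sample."""
    sample = random.sample(population, sample_size)
    return sample.count("white") / sample_size

# A sample of five can only ever report 0%, 20%, 40%, 60%, 80%, or 100%...
print("n=5:   ", white_share(5))
# ...while a sample of 5,000 will typically sit very close to the true 40%.
print("n=5000:", white_share(5000))
```

Running this a few times shows the five-car estimate jumping around wildly while the large-sample estimate stays stable, which is exactly why extrapolating from a handful of observations is risky.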
7. Not having the right skills and staff
So, you’ve got the infrastructure and you’ve got the data, but there’s another problem: you don’t have the right data specialists or data scientists. When building a data warehouse, for example, it’s important to have specialists with the right expertise to interpret the data and to safeguard data quality. Employing the right people is crucial here.
Data scientists are in high demand right now, but companies are often looking for a polymath superhero who can swoop in and solve all of their Business Intelligence problems at once. That person doesn’t exist. The trick is to build teams whose members complement each other where necessary. That keeps the organization agile enough to solve its problems.