The latest BI trends, big data trends, and AI trends put in perspective
BI and AI are becoming an inseparable dynamic duo. This makes sense when you consider that both disciplines support the business in making critical business and operational decisions. This duo is also fueled by the same thing: (big) data. Although BI and AI each have their own goals, applications, and issues, we’re seeing several converging trends that will dominate the twenties. Our top nine technology trends combine business intelligence trends with big data trends and data analysis trends.
Trend 1. “AI-first” strategies will dominate
The pace of technological development in BI is staggering. The datafication and algorithmization of society is in full swing, but not everyone sees the full scope of it. You won’t see it until you understand it. After mobile-first and cloud-first strategies, it’s now time to seriously consider an AI-first strategy.
Executing a strategy consistently and without compromises is the key to success. Just ask Amazon’s or Apple’s shareholders. This does not mean, however, that you never have to adjust your strategy when new ICT trends appear. Organizations that have neglected to take advantage of the possibilities offered by the overarching SMAC trend (Social, Mobile, Analytics, Cloud) are most likely suffering the consequences, if they haven’t already been wiped out. The market in 2020 is still very much VUCA: volatile, uncertain, complex, and ambiguous. Or, as a software vendor recently put it: “Uber yourself before you get Kodak’ed.”
Leverage the top trend: make AI-first the norm
If you don’t want to miss the boat, make AI-first a standard and implement artificial intelligence (AI) in your business processes. Only by making AI a top priority can you ensure that you will still be relevant tomorrow. In short: think like Google in terms of AI-first, keep up with technological trends and apply Artificial Intelligence in all your platforms, products, and services. But don’t let all the IT trends distract you from improving your data quality.
Only by making AI a top priority can you ensure that you will still be relevant tomorrow
Trend 2. Data quality as a stable BI trend is more crucial than ever
Data quality has been a big issue ever since the first printing of The Intelligent Organization, and that’s not going to change any time soon, despite changing trends in data and the rise of AI. Data quality is the degree to which data is suited to your intended purpose. Nothing more and nothing less.
Organizations are losing millions of dollars because their data quality isn’t up to snuff. Fortunately, many managers already realize that low-quality data leads to unreliable predictions. This applies to responding to BI trends as well as AI trends in a timely and adequate fashion. More and more organizations are realizing that reliable data has become a key factor that impacts operations.
Never use bad data as input for AI applications
Using “bad data” as input for AI applications completely kills the quality of the resulting predictions. The resulting algorithms will be tainted, so you probably won’t be able to put them to use. With traditional reports, a human eye can spot errors, but the traditional report builders are threatened with extinction, and an algorithm lacks that critical gaze. Bad data can lead to fatal results, especially if you want to take advantage of the Internet of Things trend. But there’s hope.
Algorithms are becoming better at assessing data quality using smart data profiling. Traditional methods for checking data quality are based on human estimates, which are often inaccurate. Deep learning will take over this labor-intensive task. The algorithm can detect outliers in the data: peaks and troughs will stand out and point to a data quality problem.
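To make this concrete, here is a minimal sketch of automated data profiling in Python. It uses a simple z-score check rather than deep learning, and the order counts are invented for illustration, but it shows the core idea: outliers stand out statistically and flag a possible data quality problem.

```python
from statistics import mean, stdev

def flag_outliers(values, threshold=3.0):
    """Flag values more than `threshold` standard deviations from the mean.

    A simple statistical stand-in for the smarter data profiling
    described above; real tools may use learned models instead.
    """
    mu = mean(values)
    sigma = stdev(values)
    if sigma == 0:
        return []  # all values identical: nothing stands out
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Hypothetical daily order counts with one suspicious spike
orders = [102, 98, 105, 99, 101, 97, 103, 100, 985, 104]
print(flag_outliers(orders, threshold=2.0))  # [985]
```

The flagged value may be a genuine sales spike or a data entry error; either way, it is exactly the kind of peak that a profiling algorithm should surface for review.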
Any way you slice it, though, the responsibility for data quality belongs with the process owners. But who looks out for the ethical use of data?
Trend 3. Data ethics and AI: algorithms have some explaining to do
The ethical use of AI and data has become a hot-button issue for politicians, policy makers, and pundits. Now that algorithms are playing an increasingly prominent role in decision-making processes, this subject will continue to fuel many discussions. We’re going to hear a lot more about it, from Explainable AI (algorithms that explain themselves) to Responsible AI, which covers more ground than just AI for Good (a trend we discussed last year).
Big data trend: how does the AI algorithm make a decision?
The million-dollar question is: how does the AI algorithm make a decision? It’s often impossible to explain this exactly (to both laymen and experts), but organizations that employ AI will have to be able to justify those decisions, especially when it comes to decisions that affect people’s livelihoods. Artificial intelligence doesn’t have any ethics or morals in and of itself. What happens when algorithms play judge and jury in criminal justice? Or when they make automated decisions at the IRS or in the healthcare industry? How ethical is the AI-driven pre-selection process for job applicants?
Leverage data science trends, but beware AI biases
In short: we have to be very careful to prevent historical biases from creeping into the decision-making process. A computer could incorrectly interpret the historical lack of women in the STEM fields as an inherent lack of competency, for example. Historical socio-economic factors are hard for an AI to consider, so it’s paramount to keep such things in mind when feeding data to an algorithm or machine learning model.
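As a rough illustration of what such a bias check can look like, here is a small Python sketch. The hiring records are hypothetical, and real fairness audits go much deeper, but it shows the basic move: compute selection rates per group and apply the well-known “four-fifths” rule of thumb, under which one group’s rate falling below roughly 80% of another’s is a red flag worth investigating before the data is fed to a model.

```python
def selection_rates(applicants):
    """Compute the selection (hiring) rate per group."""
    totals, selected = {}, {}
    for group, hired in applicants:
        totals[group] = totals.get(group, 0) + 1
        selected[group] = selected.get(group, 0) + (1 if hired else 0)
    return {g: selected[g] / totals[g] for g in totals}

def disparate_impact(rates, protected, reference):
    """Ratio of selection rates; values below ~0.8 are a common red flag."""
    return rates[protected] / rates[reference]

# Hypothetical historical hiring data: (group, was_hired)
history = [("women", True), ("women", False), ("women", False), ("women", False),
           ("men", True), ("men", True), ("men", False), ("men", False)]

rates = selection_rates(history)
print(rates)                                    # {'women': 0.25, 'men': 0.5}
print(disparate_impact(rates, "women", "men"))  # 0.5 -> warning sign
```

An algorithm trained naively on this history would likely reproduce the 2-to-1 gap, which is precisely the kind of creeping historical bias described above.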
An algorithm that’s better at doing good than people? The proof of the pudding is in the eating.
At the same time, algorithms expose human shortfalls. Algorithms can identify human prejudices. People, even more so than algorithms, suffer from low-quality data, programming errors, unclear definitions, and “the chaos of life”. An algorithm only has to be better than human beings on average, as Yuval Noah Harari posits in his 21 Lessons for the 21st Century. And technology just keeps developing ever faster. Researchers at Delft University of Technology (the Netherlands) are working on an algorithm that considers the ethics of decisions. An algorithm that’s better at doing good than people? The proof of the pudding is in the eating.
Trend 4. GANs satisfy the hunger for data
Generative Adversarial Networks (GANs) may satisfy the hunger for data. Machine learning models require vast quantities of big data. The model has to be trained using historical data, after all. The more data, the more accurate the predictions it can make, as conventional wisdom has it. That data doesn’t just grow on trees for the average organization, however, which makes it hard to take advantage of this big data trend.
In a GAN, two neural networks compete: a generator synthesizes new data based on an existing training set, while a discriminator tries to distinguish the synthetic data from the real thing. Thanks to this data synthesis, AI and BI’s data hunger can be satisfied, and the algorithm can be further developed. The website ThisPersonDoesNotExist.com, for example, shows lifelike faces of people who don’t exist. In a similar vein, there is a website with AI-generated news articles.
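For the curious, here is a toy sketch of the adversarial idea in pure Python. The one-dimensional “real” data, the linear generator, and the logistic discriminator are drastic simplifications of real GANs (which use deep networks and frameworks such as TensorFlow or PyTorch), but the competitive training loop is the same in spirit: the discriminator learns to tell real from fake, and the generator learns to fool it.

```python
import math
import random

random.seed(0)

def sigmoid(t):
    # Clamp the input to avoid overflow in exp() for extreme values.
    return 1.0 / (1.0 + math.exp(-max(-60.0, min(60.0, t))))

# "Real" data: samples from a normal distribution the generator must imitate.
REAL_MEAN, REAL_STD = 4.0, 1.0

# Generator: g(z) = a*z + b, with noise z ~ N(0, 1).
a, b = 1.0, 0.0
# Discriminator: d(x) = sigmoid(w*x + c), estimated probability x is real.
w, c = 0.1, 0.0

lr = 0.05
for step in range(2000):
    z = random.gauss(0, 1)
    fake = a * z + b
    real = random.gauss(REAL_MEAN, REAL_STD)

    # Discriminator step: push d(real) up and d(fake) down.
    s_real = sigmoid(w * real + c)
    s_fake = sigmoid(w * fake + c)
    w += lr * ((1 - s_real) * real - s_fake * fake)
    c += lr * ((1 - s_real) - s_fake)

    # Generator step: push d(fake) up (non-saturating GAN loss).
    s_fake = sigmoid(w * fake + c)
    a += lr * (1 - s_fake) * w * z
    b += lr * (1 - s_fake) * w

# Synthesize new data from the trained generator.
synthetic = [a * random.gauss(0, 1) + b for _ in range(1000)]
print(sum(synthetic) / len(synthetic))  # should drift toward REAL_MEAN
```

The synthesized samples never appeared in the training data, which is exactly how GAN-generated faces and articles can look real without copying any real person or text.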
Real data remains essential in the AI age
However promising these developments seem, data from the real world remains essential to the development of reliable algorithms. On top of that, GANs raise new ethical dilemmas, as they are ripe for abuse. Still, we suspect AI will become less data-hungry, making it easier to profit from data trends. But that doesn’t solve all of our data problems.
A data science trend for the real geeks
Until recently, predictive models had to be fed data. A lot of data. Data that comes in all kinds of shapes, sizes, and formats. According to a well-known software developer, the number of data formats more than doubled last year, from 162 to 342. Different formats are hard to connect and combine, which increases the time it takes to analyze the data. That’s why we expect that 2020 will see greater investments in a data science trend for the geeks among us: the fine-tuning of machine learning models, so that gigantic data sets will no longer be required. If this takes off, the speed of analysis, and thus productivity, will increase.
There’s a lot of ground to gain when it comes to metadata, as well. Using a metadata catalog, you can quickly see where certain data can be found. This makes it much easier and faster for users to get to the data they need. But there are more helpful tools, such as NLP.
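A minimal sketch of such a metadata catalog, with invented dataset names, locations, and owners, might look like this in Python:

```python
# A toy metadata catalog: each entry records where a dataset lives,
# who owns it, and what it contains. All names here are hypothetical.
CATALOG = {
    "sales_2020": {"location": "warehouse.sales.orders_2020",
                   "owner": "finance", "tags": ["sales", "orders", "revenue"]},
    "customers":  {"location": "crm.core.customers",
                   "owner": "marketing", "tags": ["customer", "contact"]},
    "telemetry":  {"location": "lake.iot.device_events",
                   "owner": "engineering", "tags": ["iot", "events", "sensor"]},
}

def find_datasets(keyword):
    """Return (name, location) for every dataset whose name or tags match."""
    keyword = keyword.lower()
    return [(name, meta["location"])
            for name, meta in CATALOG.items()
            if keyword in name or keyword in meta["tags"]]

print(find_datasets("sales"))  # [('sales_2020', 'warehouse.sales.orders_2020')]
```

Production-grade catalogs add lineage, schemas, and access control on top of this, but the payoff is the same: users find the data they need in one lookup instead of a scavenger hunt.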
Trend 5. NLP is taking off
More and more employees will have to start working with data and data trends, but organizations don’t always have the time and resources to make their employees data literate. The solution may lie in Natural Language Processing (NLP), a technology that enables computers to answer questions asked in natural speech, so that laypeople can also start analyzing data.
NLP can recognize speech, or search text documents to unearth relevant insights. That makes it easy to generate a report without having to be a data expert. The user can ask a simple question, such as “what is the worst-selling product per region?”, and the software scans the data and provides an answer. This data analytics trend is still in its infancy.
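Under the hood, even this simple scenario boils down to mapping a question onto a query. The sketch below fakes the NLP part with a keyword match over hypothetical sales records; real NLP engines parse far more freely, but the question-to-query idea is the same.

```python
# Hypothetical sales records: (region, product, units_sold)
SALES = [
    ("North", "Widget", 120), ("North", "Gadget", 45), ("North", "Gizmo", 300),
    ("South", "Widget", 80),  ("South", "Gadget", 15), ("South", "Gizmo", 200),
]

def answer(question):
    """Answer one canned natural-language question via keyword matching.

    A real NLP engine would parse arbitrary phrasings; this sketch only
    recognizes the example question from the text above.
    """
    q = question.lower()
    if "worst-selling" in q and "region" in q:
        worst = {}
        for region, product, units in SALES:
            if region not in worst or units < worst[region][1]:
                worst[region] = (product, units)
        return {region: product for region, (product, _) in worst.items()}
    return "Sorry, I don't understand that question yet."

print(answer("What is the worst-selling product per region?"))
# {'North': 'Gadget', 'South': 'Gadget'}
```

The layperson never sees the grouping logic; they just ask the question and get the answer, which is exactly the promise of this trend.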
Trend 6. Data experts with AI skills are still rare and expensive
Despite the rise of technologies like GANs and NLP, the demand for data experts (such as data scientists) just keeps on growing. The shortage of qualified data professionals is far from being resolved. The speed of technological developments is only increasing, which makes it harder and harder for training programs and curriculums to keep up.
College graduates are often ill-prepared to apply modern BI techniques and AI in bleeding-edge organizations. The experience requirements for data scientists are impossible to meet for those who just graduated. The question is whether the data scientist superhero that many organizations want is even reasonable in the first place. In sum: a shortage of qualified data scientists could pump the brakes on the data science trend.
Trend 7. Continuous Intelligence is the wave of the future
Continuous Intelligence (CI), also called real-time analytics, is not just another buzzword, but a trend recognized and acknowledged by most analysts. It’s an AI-based method of continuously interpreting data, discovering patterns, and learning the value of the data.
Business Intelligence tools fall short
Traditional BI tools fall short in this area, because they rely on the human touch. Orchestration is required in every phase, from accessing the data to visualizing the insights in BI dashboards. This isn’t odd when you consider that BI was never meant to handle the complexity of large-scale analytics that arose from the digital revolution. In 2020, users expect to get to the information they need much faster and to query data in real time.
Use CI as leverage for other AI-based technologies
Gartner expects that by 2022, more than half of new business systems will incorporate Continuous Intelligence, allowing users to access context-sensitive data in real time and drastically improving the speed and quality of decision-making. Continuous Intelligence will also function as leverage for various other technologies, such as augmented analytics, event stream processing, business rule management, and machine learning (ML). All this combines to make CI one of the most important overarching technological trends.
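The core of Continuous Intelligence, continuously interpreting a stream of data and reacting to patterns the moment they appear, can be sketched in a few lines of Python. The event stream and threshold rule below are invented for illustration:

```python
def continuous_monitor(stream, window=5, factor=2.0):
    """Continuously interpret a metric stream: yield an alert whenever a
    reading exceeds `factor` times the average of the last `window` readings.

    A toy stand-in for real event stream processing engines.
    """
    recent = []
    for value in stream:
        if len(recent) == window and value > factor * (sum(recent) / window):
            yield value  # anomaly detected in real time
        recent.append(value)
        if len(recent) > window:
            recent.pop(0)

# Simulated event stream, e.g. orders per minute.
events = [10, 12, 11, 9, 10, 11, 10, 43, 12, 10]
print(list(continuous_monitor(events)))  # [43]
```

Because the monitor is a generator, it processes each event as it arrives rather than waiting for a batch report, which is the essential difference between CI and a traditional BI dashboard.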
Trend 8. AutoML takes off
Speaking of machine learning, there’s a trend emerging here, too: AutoML. Gartner expects that this year alone, 40% of all the tasks of a data scientist will be taken over by machines, with no human interaction at all. This trend dovetails with the shortage of data scientists described earlier. Frameworks and tools will be developed that allow end users to build their own machine learning models. For data scientists, all that will remain is developing deep learning models.
Use ready-made machine learning models
Google has built up a reputation when it comes to image recognition software, speech recognition, and last, but not least: translation software. Google Cloud has been making ready-made Machine Learning models available to developers for some time. These do-it-yourself machine learning models were trained by Google itself based on data from Google’s own search engine. But that means that these models are only familiar with public data.
If you want to deploy ML based on your own specific data, you have to train the model yourself. And that makes ML a job for specialists, and these are in short supply. This is why Google developed AutoML, a collection of machine learning products that enables you to train your own ML models without any prior knowledge of machine learning. You don’t need any data scientists to integrate ML in your business processes anymore.
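At its heart, AutoML automates the search for a good model. The Python sketch below, with made-up data and only two candidate models, shows that core loop: fit each candidate, score it on held-out data, and keep the winner, no data scientist required. Real AutoML products such as Google’s search over far larger model and hyperparameter spaces, but the principle is the same.

```python
def fit_mean(xs, ys):
    """Baseline model: always predict the training mean."""
    m = sum(ys) / len(ys)
    return lambda x: m

def fit_linear(xs, ys):
    """Ordinary least squares for y = slope*x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    var = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / var
    intercept = my - slope * mx
    return lambda x: slope * x + intercept

def auto_select(xs, ys, candidates):
    """Pick the candidate with the lowest squared error on a holdout split,
    the core idea behind AutoML's automated model search."""
    split = int(len(xs) * 0.7)
    best_name, best_model, best_err = None, None, float("inf")
    for name, fit in candidates.items():
        model = fit(xs[:split], ys[:split])
        err = sum((model(x) - y) ** 2 for x, y in zip(xs[split:], ys[split:]))
        if err < best_err:
            best_name, best_model, best_err = name, model, err
    return best_name, best_model

# Made-up training data following y = 2x + 1.
xs = list(range(10))
ys = [2 * x + 1 for x in xs]
name, model = auto_select(xs, ys, {"mean": fit_mean, "linear": fit_linear})
print(name, model(20))  # linear 41.0
```

The end user only supplies data and candidates; the selection, the part that used to require a specialist, happens automatically.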
Tech giants support AI
It’s not just Google working on AutoML: other tech giants, such as IBM, Microsoft, and Alibaba, are developing their own platforms to support this growing AI trend.
Trend 9. Data-driven working and transparency are essential AI success factors
In 2020, data-driven working will no longer just be an ideal that organizations cautiously strive for. Data-driven working in business and nonprofits will increasingly be seen as a key factor in doing business. The same goes for transparency.
Expectations are sky-high. Customers, suppliers, stockholders, and other stakeholders simply expect organizations to have their affairs in order. A 360-degree customer view, an omnichannel experience, and an optimal customer journey are prerequisites for building a reliable customer relationship. Stockholders expect business controllers to use algorithms and storytelling to enable real-time reporting, fast closing, and forecasting.
Transparency and data-driven working are persistent trends with great social impact and relevance that transcend the technological trends and fads of the day.