Three Ways Generative AI Will Redefine Business Intelligence

By Greg Bucko, Data Consultant

AI Meets BI: How Generative Artificial Intelligence Will Change Visualization Tools Forever

Celebrating its 20th anniversary this year, Tableau has revolutionized data interaction, visualization, and dashboarding (alongside competitors like Power BI and Qlik). These tools aim to simplify data interpretation through interactive visualization, catering to both non-technical and advanced users. Over the past five years, a shift towards “augmented analytics” has emerged, with AI-driven chat interfaces becoming integral to these platforms. Features like Power BI’s “Q&A” and Tableau’s “Ask Data” allow users to query data in natural language, albeit in a structured manner.

However, the advent of generative AI, popularized by ChatGPT, is set to further transform these tools. Generative AI will introduce new ways to interact with data, potentially eradicating the divide between analysts and end-users once and for all. This shift will necessitate new skills for data professionals, reshaping their roles in ways yet to be fully explored. With that said, let us explore three ways generative AI will redefine business intelligence and data analytics in the months and years to come.

1. Data Preparation

Analysts spend much of their time cleaning, blending, and transforming data to prepare it for analysis. Generative AI could assist by creating and interpreting the code needed to execute queries or data transformations, allowing the analyst to simply request that a set of features be brought together for analysis or that a new variable be created. This is a step towards a “post-code” future for data analysts, a shift that has been evolving for years but is now accelerated by generative AI.

For example, although Power BI data transformations can be performed almost entirely through the user interface without writing code, blending multiple datasets can quickly lead to complexity, computational slowdown, and human error. Generative AI tools have already shown significant skill in generating, testing, and optimizing code, including DAX and M, the primary languages used to manage and transform data within Power BI. Code interpretation and execution based on an analyst’s chat input will therefore quickly become an integral feature of these visualization tools.
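To make the idea concrete, here is a minimal sketch, written in Python with pandas rather than DAX or M, of the kind of blend-and-transform an analyst might request in plain language (e.g., “join orders to customers and add a revenue-per-unit column”). The file and column names are hypothetical.

    import pandas as pd

    # Two hypothetical source files the analyst wants blended.
    orders = pd.read_csv("orders.csv")        # order_id, customer_id, units, revenue
    customers = pd.read_csv("customers.csv")  # customer_id, region, segment

    # Blend the datasets on their shared key.
    blended = orders.merge(customers, on="customer_id", how="left")

    # Create the requested new variable from existing columns.
    blended["revenue_per_unit"] = blended["revenue"] / blended["units"]

    print(blended.head())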

2. Outlier Detection and Active Data Monitoring

Today, outlier detection in Tableau requires engaging with tools such as the “Box and Whisker Plot” chart type to visualize the distribution of your data and identify observations outside of expected norms. You may also need to create a calculated field that flags outliers based on specific filtering criteria. This process can be time-consuming and requires a good understanding of Tableau’s visualization and analysis capabilities. With generative AI, the tools could highlight outliers automatically, or in response to a prompt such as “calculate x, removing any outliers beyond three standard deviations.” While this may appear to save only a few minutes for trained analysts, the real benefit comes when Tableau dashboards are “actively monitored” by AI, allowing them to rapidly highlight outlying observations or activity and alert end users, both technical and non-technical.
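As an illustration, the prompt above maps to only a few lines of code behind the scenes. A minimal Python sketch of “removing any outliers beyond three standard deviations” (the sample values are hypothetical):

    import pandas as pd

    def drop_outliers(series: pd.Series, n_std: float = 3.0) -> pd.Series:
        """Keep only observations within n_std standard deviations of the mean."""
        mean, std = series.mean(), series.std()
        return series[(series - mean).abs() <= n_std * std]

    # Hypothetical daily sales figures; 9800 is an obvious data error.
    sales = pd.Series([118, 125, 131, 127, 122, 135, 129, 124, 130, 126,
                       133, 121, 128, 132, 119, 127, 125, 130, 123, 9800])
    print(drop_outliers(sales).mean())  # mean with the outlier excluded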

What is Active Data Monitoring?

Imagine an analyst who is always keeping an eye on your data, looking for early signs of potential problems, day after day, around the clock. These monitoring tools will operate in an “always-on” state, with AI continuously exploring data and highlighting issues to business users. Features like this exist to some degree today, but they are primarily driven by pre-coded upper and lower bounds set by experienced humans. AI will take this a step further, replacing strict pre-coding with sophisticated pattern-detection algorithms that uncover novel relationships in the data, similar to those used in applications such as fraud detection or predicting machine failures in factories.
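Here is a conceptual sketch of such a monitor, using an unsupervised anomaly detector (scikit-learn’s IsolationForest) in place of hand-coded bounds. The metric names and the alerting hook are hypothetical, and a production system would run this on a schedule against fresh data:

    import pandas as pd
    from sklearn.ensemble import IsolationForest

    def scan_for_anomalies(metrics: pd.DataFrame) -> pd.DataFrame:
        """Fit a pattern-based detector and return the rows it flags as anomalous."""
        model = IsolationForest(contamination=0.01, random_state=0)
        flags = model.fit_predict(metrics[["daily_revenue", "order_count"]])
        return metrics[flags == -1]  # -1 marks anomalous observations

    # In an "always-on" deployment this would run hourly or daily:
    # anomalies = scan_for_anomalies(latest_metrics)
    # if not anomalies.empty:
    #     notify_dashboard_subscribers(anomalies)  # hypothetical alert hook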

3. Uncovering Insights

If the first two items on this list give the impression that the role of the professional analyst could become redundant, let this third item give you hope. People have learned a lot about generative AI over the last year, and one thing is certain: it still needs humans to work at its best. Whether through powerful prompt-chaining techniques, embedding enterprise data, or simply verifying that the AI is giving the correct answer, the need for a human analyst remains for the foreseeable future. However, less time dedicated to data prep and exploration means more time for uncovering insights in the data, something most organizations are starving for.

The very beginning of this trend is emerging, as tools such as ChatGPT’s Code Interpreter can now accept raw data and provide insights into it in seconds. For example, after uploading the famous dataset documenting the details (and fates) of those aboard the Titanic, simply prompting Code Interpreter with “provide insights and analysis of this dataset” produces a wealth of analysis on the first pass. It begins by listing the variables contained in the dataset, provides an overview of outliers and null values, then dives into the data, producing an array of histograms to highlight significant population differences. It then discusses those differences among passenger groups (such as gender and ticket class) in natural language, not unlike an analyst writing up their findings. For example, ChatGPT easily identified key Titanic dataset insights such as: “Passengers in the first class had the highest survival rate, followed by those in the second and third class. This indicates that socio-economic status played a role in the survival of the passengers.”
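For readers curious about what happens behind the scenes, Code Interpreter writes and runs Python much like the following. This sketch reproduces the class and gender comparisons, assuming the common Kaggle column names for the Titanic file (“Survived”, “Pclass”, “Sex”):

    import pandas as pd

    titanic = pd.read_csv("titanic.csv")

    # Survival rate by ticket class: first class highest, third class lowest.
    print(titanic.groupby("Pclass")["Survived"].mean())

    # Survival rate by gender, another difference the model calls out.
    print(titanic.groupby("Sex")["Survived"].mean())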

Although this capability is certainly a game-changer for rapid analysis and exploration, there are important limitations. Most notably, hallucinations and errors still plague generative AI models, often in mathematical computations. This means that nothing produced by a generative AI model can be taken at face value at this time. Analysts will still be required to check the accuracy of the output, which will somewhat reduce the time-saving benefit. Context is another limitation: models such as GPT-4 are not connected to the internet and are not trained on the most current data, so they can miss context that would improve an analysis. The generative AI tools implemented in modern BI software will need to be fine-tuned for mathematical and statistical accuracy and given additional ways to incorporate context.
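The verification step can itself be lightweight. A minimal sketch of recomputing an AI-reported figure directly from the data before trusting it (the claimed value here is hypothetical):

    import pandas as pd

    titanic = pd.read_csv("titanic.csv")

    ai_claimed_rate = 0.63  # e.g., a first-class survival rate quoted by the model
    actual_rate = titanic.loc[titanic["Pclass"] == 1, "Survived"].mean()

    if abs(actual_rate - ai_claimed_rate) > 0.01:
        print(f"AI figure {ai_claimed_rate} disagrees with computed {actual_rate:.2f}")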

What’s next for business intelligence tools?

This is just the beginning of the integration of generative AI into business intelligence tools, a trend poised to dramatically transform how businesses manage, visualize, and interpret their data. By automating data exploration and transformation while also generating visual representations and insights, generative AI will empower businesses to make more informed decisions. But AI won’t be doing that empowering alone: it will still require an engaged analyst who understands the strengths and limitations of generative AI and can squeeze the maximum value from these tools.


The Keiter Technologies Data Solutions team has a wealth of experience helping to implement AI, machine learning, and analytics capabilities for our clients. From optimization models to advanced analytics, our team can provide the tools, guidance, strategy, and support necessary to ensure a smooth transition to an AI-driven, data-centric future for your business. As a trusted partner, Keiter’s Data Solutions team is committed to helping organizations unlock the full potential of their data and derive actionable insights that can help your business grow. Contact us for assistance with your business’s data challenges and opportunities.


About the Author


Greg Bucko, Data Consultant

Greg is a highly experienced analytics professional who has helped lead data strategy in a variety of industries including hospitality, insurance, retail, agriculture, and financial services. He is a data consultant at Keiter Innovative Data Solutions and provides thought leadership and advisement on matters of enterprise data and analytics strategy.

He is a passionate leader focused on helping organizations understand their customers, markets, and products better through the use of advanced data analytics and consumer insights. Greg specializes in building and leading data analytics teams in organizations that are transitioning to a more data-driven, customer-centric business strategy. He is also skilled in delivering complex analysis in a way that is clear, concise, and easy to apply directly to corporate strategy.


