AI in BI
Opportunities for Differentiation
Made by Matt David
AI is changing the way we BI
BI tools are adopting AI features at a rapid pace — text to code, text to chart, text to dashboard, etc.
While teams are excited about these features, their impact on analysts has been mixed. This is partially due to data quality, partially due to the models, and partially due to a lack of understanding of where value is created in the analytics process.
I wrote about why to BI in the AI age to show the value of the analytics process and why we analysts are still relevant.
On this page I want to document how AI features are getting commoditized across the BI landscape (Power BI, Tableau, Looker) and what areas have not been explored that might yield better results for AI in the BI world.
Existing AI Capabilities
The most widely used BI tools are quickly reaching parity on what can be done with AI. With a bit of natural language you can generate code, charts, and in some cases full dashboards/reports, get summaries, and get ideas about what to ask next.
It is important to note that there can be large differences in how helpful they are depending on what context the AI has access to, which model is being used, and the quality of the data in general.
AI Capability Opportunities
I want to draw attention to some areas where I have seen little progress and where I believe investment would improve the experience regardless of the maturity of the data org.
Opinionated Analysis
Most tools give people exactly what they ask for, but that's not what analysts do. Analysts explore the data in different ways, make judgments about what to present at what level of detail, and then provide context and caveats.
While it would be hard to mimic this exactly, tools could present data in opinionated ways that increase the rate of accurate understanding. For instance, a tool could always deliver data one level deeper than what was asked. If someone asks for average sales, plot it on top of a histogram so they can see what is being summarized by that metric. If they ask for sales yesterday, plot it over a week so they can see it in more context.
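To make "one level deeper" concrete, here is a minimal sketch in Python (pandas + matplotlib). The sale_amount column and the function itself are hypothetical stand-ins; a real BI tool would do this inside its own charting layer.

```python
# Minimal sketch: answer "what is average sales?" with the average drawn on top
# of the distribution it summarizes. The DataFrame and the "sale_amount" column
# are hypothetical stand-ins for whatever the BI tool actually queried.
import pandas as pd
import matplotlib.pyplot as plt

def answer_average_with_context(df: pd.DataFrame, column: str = "sale_amount"):
    avg = df[column].mean()
    fig, ax = plt.subplots()
    ax.hist(df[column], bins=30, alpha=0.6, label=f"Distribution of {column}")
    ax.axvline(avg, color="red", linestyle="--", label=f"Average = {avg:,.2f}")
    ax.set_xlabel(column)
    ax.set_ylabel("Number of sales")
    ax.legend()
    return avg, fig
```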
Unstructured Analysis
Currently, unstructured data analysis largely happens in the tool where the data is produced — call sentiment in Gong/Grain, service ticket insights in Zendesk/Intercom, etc.
This works well but could be more powerful if the sentiment of a call or ticket could be correlated with churn data. To do this, BI tools could explore how to mix this new qualitative data in with quantitative data. This may be easier in code-based tools but could be supported with no-code as well. In fact, some data quality tools are already dipping their toes in.
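As a rough sketch of what that blend could look like, assume call sentiment has been exported per account and churn lives in the warehouse; every table and column name below is made up for illustration.

```python
# Hypothetical blend of qualitative and quantitative data: average per-account
# call sentiment joined to churn outcomes, then compared across the two groups.
import pandas as pd

calls = pd.DataFrame({            # e.g. sentiment scores exported from a call tool
    "account_id": [1, 1, 2, 3],
    "sentiment": [0.2, -0.4, 0.7, 0.1],   # -1 = negative, 1 = positive
})
accounts = pd.DataFrame({         # e.g. churn flags from the warehouse
    "account_id": [1, 2, 3],
    "churned": [True, False, False],
})

avg_sentiment = calls.groupby("account_id", as_index=False)["sentiment"].mean()
blended = avg_sentiment.merge(accounts, on="account_id")

# Average sentiment for churned vs. retained accounts
print(blended.groupby("churned")["sentiment"].mean())
```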
There's probably an opportunity here for a bigger architectural trend. We could speed-run a modern unstructured data stack, developing pipelines, a centralized warehouse, and preparation techniques that make this data widely available not only to BI tools but to all manner of SaaS applications.
Existing Context Provided
More context typically means a higher chance of the AI selecting the right table and column, which are often the biggest culprits in a faulty analysis.
Context Opportunities
Currently the focus is on granting access to more and more context locations. While this is obviously worthwhile, I'd love to see more focus on creating context that makes existing analyses more discoverable and understandable.
Make Search Functional
Charts are notoriously sparse on keywords, sometimes containing only a one-word description of what's on the x and y axes. This makes searching a BI tool for a chart about a specific topic very difficult.
AI can enrich a chart with relevant keywords by extracting them from the SQL and by generating an estimated summary of the chart for people to search against. I would still push for the creator of the chart to edit and approve, but AI can jump-start this.
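Here is a rough sketch of the keyword-extraction half of that idea. A real implementation would use a proper SQL parser plus an LLM-generated summary; a regex over a hypothetical query is enough to show the shape of it.

```python
# Pull candidate search keywords out of a chart's SQL by grabbing identifiers
# and dropping common SQL keywords. Purely illustrative; a parser would be better.
import re

SQL_KEYWORDS = {"select", "from", "where", "group", "by", "order", "as", "and",
                "or", "on", "join", "left", "inner", "limit", "sum", "count", "avg"}

def chart_keywords(sql: str) -> list[str]:
    tokens = re.findall(r"[A-Za-z_][A-Za-z0-9_]*", sql.lower())
    return sorted({t for t in tokens if t not in SQL_KEYWORDS})

sql = "SELECT region, SUM(revenue) AS total_revenue FROM sales GROUP BY region"
print(chart_keywords(sql))   # ['region', 'revenue', 'sales', 'total_revenue']
```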
Take Annotations Seriously
Humans need context as well to reliably interpret charts. Oftentimes annotations are never added, and only a few people have the institutional knowledge to know when things were launched and which spikes were caused by some bug or campaign.
While AI can come up with plausible reasons why data is up or down, it would provide a lot of value simply by noticing where changes in the data occur and prompting the right human to add an annotation.
This would make it easier for fellow humans and our friendly cyborg compatriots to interpret the data going forward!
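A simple version of this "notice the change, ask a human" flow could look like the sketch below, assuming a daily, datetime-indexed metric series; the window and threshold are arbitrary guesses, not a recommendation.

```python
# Flag days where a metric jumps well outside its trailing window, and draft a
# note nudging a human to add the annotation. Window and threshold are placeholders.
import pandas as pd

def suggest_annotations(series: pd.Series, window: int = 14, threshold: float = 3.0) -> list[str]:
    rolling_mean = series.shift(1).rolling(window).mean()
    rolling_std = series.shift(1).rolling(window).std()
    z = (series - rolling_mean) / rolling_std
    flagged = series[z.abs() > threshold]
    return [
        f"{date:%Y-%m-%d}: value {value:,.0f} looks unusual. What happened here? "
        "Please add an annotation."
        for date, value in flagged.items()
    ]
```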
Existing Interaction Patterns
Clippy-style helpful wizards have made their return, with slide-in chat windows to invoke AI across tools.
It is important to note that most code-oriented BI tools provide inline prompting or AI-informed autocomplete at this point.
Interaction Pattern Opportunities
Most tools are focused on giving people what they ask for; we can give them more!
Bottom Up vs Top Down
The danger of most interaction patterns with AI in BI is that they can become confirmation bias machines. I want to see how well sales is doing, and it shows me it's up and to the right. While this can be compensated for with careful prompting and by investigating the data, I'd like to propose another model that may be better suited to our AI friends.
AI could monitor our data across more segmentations than we'd ever check and surface anomalous data all the time. It could passively be doing a bunch of deep dives in the background, giving the human analyst interesting spots to dig into further.
Imagine your BI tool had an anomaly tab, an up-and-to-the-right tab, a down-and-to-the-right tab, an on-trend/estimate tab, and an off-trend/estimate tab, where you could quickly see what's going up, what's going down, and what's spiking. You could filter by topic and sort by popularity or by governed metrics.
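As a toy illustration of how those tabs could be populated, the sketch below fits a line to each metric's recent history and buckets it; the thresholds are arbitrary and a production version would use real anomaly detection.

```python
# Bucket a metric into "up and to the right", "down and to the right", or
# "on trend", and flag it as anomalous when the latest point sits far off the
# fitted trend. Thresholds here are arbitrary placeholders.
import numpy as np

def bucket_metric(values: list[float]) -> dict:
    y = np.asarray(values, dtype=float)
    x = np.arange(len(y))
    slope, intercept = np.polyfit(x, y, 1)
    fitted = slope * x + intercept
    resid_std = np.std(y - fitted) or 1.0
    anomaly = abs(y[-1] - fitted[-1]) > 2 * resid_std
    if slope > 0.01 * np.mean(np.abs(y)):
        trend = "up and to the right"
    elif slope < -0.01 * np.mean(np.abs(y)):
        trend = "down and to the right"
    else:
        trend = "on trend"
    return {"trend": trend, "anomaly": bool(anomaly)}

print(bucket_metric([100] * 20 + [160]))   # {'trend': 'on trend', 'anomaly': True}
```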
I believe this pattern is going to become a much larger trend and that data quality companies will add this to their offerings.
Automatic Test Setup
Manually configuring dashboards or analyses for every launch can get tedious. While it is important to ensure that new events are instrumented properly, there are many existing, relevant metrics that analysts find, build, and often tweak slightly to add to the dashboard for monitoring.
Much of this work could be automated with the help of AI. It could notice new fields or new entry types in a column and spin up a few charts to monitor them. The experience would be that whenever a new version of your app or website goes live, your dashboard to monitor the impact is preconfigured, and it could even set up the relevant test charts to show whether performance is improving. It could also ask whether there is a particular segment you want to focus on and adjust accordingly.
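A stripped-down sketch of this "notice the new thing and preconfigure monitoring" idea is below; the events table, event_name column, and chart-spec shape are all made up for illustration.

```python
# Compare today's distinct values in a column against yesterday's snapshot and
# propose a monitoring chart spec for each brand-new value. Everything here
# (table name, column name, spec format) is a hypothetical placeholder.
def propose_monitoring_charts(previous_values: set[str], current_values: set[str],
                              column: str = "event_name") -> list[dict]:
    new_values = current_values - previous_values
    return [
        {
            "title": f"Daily count of {column} = '{value}'",
            "query": (f"SELECT event_date, COUNT(*) FROM events "
                      f"WHERE {column} = '{value}' GROUP BY event_date"),
            "chart_type": "line",
        }
        for value in sorted(new_values)
    ]

charts = propose_monitoring_charts(
    previous_values={"signup", "checkout"},
    current_values={"signup", "checkout", "new_onboarding_step"},
)
print(charts[0]["title"])   # Daily count of event_name = 'new_onboarding_step'
```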
Close the Iteration Loop
While I'm a believer in humans in the loop for analytics, one could conceivably feed data results from a feature or a landing page into a "vibe coding" tool to make adjustments and then generate an A/B test with a new version to continuously improve performance. There are obviously many brand/experience/security things to consider, but these could largely be resolved with an approval flow.
In fact, Amplitude is doing this already. Don't get dusted, BI folks!
AI is moving fast, so keep your head on a swivel.
Want to talk about the future of analytics? Shoot me a note @fronofro or mcdavid1991@gmail.com.