The generative AI boom is fundamentally changing the landscape of vendor offerings. We believe that one largely ignored area where generative AI will have a disruptive impact is enterprise analytics, specifically business intelligence (BI).

In the last few weeks, Forrester has seen a flurry of BI vendor announcements (surprise, surprise) about recently rolled-out functionality or near-term plans to take advantage of large language models (LLMs). While each vendor's approach differs somewhat, similar capabilities and patterns are emerging:

  • Data sourcing. Using publicly available third-party data (e.g., from the US Census Bureau or the Department of Labor) to enrich enterprise data has traditionally been a manual search and copy-and-paste exercise. LLM-based search can now return such data in a tabular format for cataloging in a BI system. We expect most BI vendors to offer such functionality. The LLM-based search part of the feature will become a commodity, but the way each vendor catalogs the data and adds the new data source to the semantic layer will remain differentiated.
  • Metadata enrichment. A rich semantic layer is key to successful BI implementations. Until recently, populating a semantic layer with database column descriptions, synonyms, and other attributes was largely manual. This task can be automated by feeding sample metadata to an LLM and having it generate enriched descriptions and synonyms (a sketch of this pattern follows the list). We expect this functionality to quickly become a commodity. However, each vendor may offer different approaches to creating calculated fields based on LLM recommendations.
  • Text mining. Traditionally, BI has focused on querying and analyzing structured data. In recent years, the leading BI vendors have introduced NLP-based text mining capabilities to analyze data stored in long strings (e.g., open-ended responses in a VoC/CX survey) and extract key topics and sentiment. Forrester expects most BI vendors to rapidly shift to LLMs as a significant part of their text mining pipeline (sketched after this list). While domain-specific ontologies and training will continue to provide market advantage, we expect this functionality to become largely undifferentiated.
  • Natural language query (NLQ). Forrester sees conversational UI as a vital capability to help enterprises further democratize data. In the past, each BI vendor used proprietary NLP to convert a natural language question into an SQL query. We believe that most vendors will shift to LLMs for this conversion, differentiating through prompt engineering that tunes the question and enriches it with data and semantic context (see the NLQ sketch after this list). Vendors will also be able to differentiate on NLQ transparency, explainability, and customization.
  • Natural language generation (NLG). NLG is a key capability for effective data communication and data storytelling. Once again, this is a space where BI vendors historically built proprietary functionality. Forrester now expects much of this capability to be driven by LLMs at a much lower cost of entry, putting simple NLG within reach of every BI vendor. Advanced capabilities (such as how the result set is prepared and passed to the LLM for narration, or the ML models used to enhance data stories) will remain an opportunity for differentiation (see the NLG sketch after this list).
  • Guided analytics. The nirvana of LLM-based BI is guided analysis, as in “Here is the next step in the analysis” or “Since you asked that question, you should also ask the following questions.” Most leading BI platforms already offer basic guided analysis based on proprietary approaches, but we expect most of them to port this functionality to LLMs (a sketch of next-question suggestions follows the list). LLM-based guided analysis could be a meaningful differentiator.
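
To make the metadata-enrichment pattern concrete, here is a minimal sketch: a small sample of column metadata goes to an LLM, and draft descriptions and synonyms come back for review before they land in the semantic layer. The OpenAI client, model name, and function names are illustrative assumptions, not any vendor's actual implementation.

```python
import json

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set; any embedded LLM could stand in

def enrich_column_metadata(table: str, column: str, sample_values: list[str]) -> dict:
    """Draft a description and synonyms for one column from a small metadata sample."""
    prompt = (
        f"Table: {table}\nColumn: {column}\nSample values: {sample_values}\n"
        "Return JSON with 'description' (one sentence) and 'synonyms' "
        "(a list of business-friendly names for this column)."
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        response_format={"type": "json_object"},
        messages=[{"role": "user", "content": prompt}],
    )
    return json.loads(resp.choices[0].message.content)

# A data steward reviews the draft before it is published to the semantic layer:
# enrich_column_metadata("orders", "ord_dt", ["2023-01-04", "2023-01-05"])
```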
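
The text-mining shift can be sketched in the same spirit: an open-ended survey comment is classified for topics and sentiment by an LLM rather than by a proprietary NLP pipeline. Again, the client, model choice, and JSON contract are assumptions for illustration only.

```python
import json

from openai import OpenAI

client = OpenAI()  # illustrative; a domain-tuned or vendor-embedded model could stand in

def mine_response(text: str) -> dict:
    """Extract key topics and overall sentiment from one open-ended survey comment."""
    prompt = (
        f"Customer comment: {text}\n"
        "Return JSON with 'topics' (a list of short phrases) and "
        "'sentiment' (positive, neutral, or negative)."
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        response_format={"type": "json_object"},
        messages=[{"role": "user", "content": prompt}],
    )
    return json.loads(resp.choices[0].message.content)

# mine_response("Checkout was fast, but the delivery arrived two days late.")
```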
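
Here is a hedged sketch of the prompt-engineering approach to NLQ: the user's question is wrapped with schema and semantic-layer context before the LLM is asked for SQL. The schema string, glossary format, and client are assumed for illustration; keeping the full prompt alongside the generated SQL is one simple way to support the transparency and explainability noted above.

```python
from openai import OpenAI

client = OpenAI()  # illustrative stand-in for whichever LLM the vendor embeds

def question_to_sql(question: str, schema_ddl: str, glossary: dict[str, str]) -> str:
    """Enrich a business question with schema and semantic context, then request SQL."""
    definitions = "\n".join(f"- '{term}' means {meaning}" for term, meaning in glossary.items())
    prompt = (
        "Translate the business question into ANSI SQL. Return only the query.\n"
        f"Schema:\n{schema_ddl}\n"
        f"Business glossary:\n{definitions}\n"
        f"Question: {question}"
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    # Logging the prompt together with the generated SQL enables review and explanation.
    return resp.choices[0].message.content
```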
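
A sketch of the NLG pattern follows: a small query result set is serialized and passed to the LLM, which drafts the narrative for the data story. The function signature and audience parameter are hypothetical.

```python
import json

from openai import OpenAI

client = OpenAI()  # illustrative client

def narrate_result_set(rows: list[dict], audience: str = "executive") -> str:
    """Turn a small query result set into a short written summary."""
    prompt = (
        f"Write a three-sentence {audience} summary of this query result, "
        "calling out the largest change and any outliers.\n"
        f"Data: {json.dumps(rows)}"
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

# narrate_result_set([{"region": "EMEA", "q1_sales": 1.2, "q2_sales": 1.9},
#                     {"region": "APAC", "q1_sales": 2.1, "q2_sales": 2.0}])
```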
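
Finally, the "you should also ask..." style of guided analysis can be prototyped by handing the analyst's question history to an LLM and asking for follow-ups. This is a sketch under assumed names, not any platform's implementation.

```python
import json

from openai import OpenAI

client = OpenAI()  # illustrative client

def suggest_next_questions(question_history: list[str], n: int = 3) -> list[str]:
    """Suggest follow-up analytical questions given what the analyst has already asked."""
    asked = "\n".join(f"{i + 1}. {q}" for i, q in enumerate(question_history))
    prompt = (
        f"A business analyst asked these questions in order:\n{asked}\n"
        f"Return JSON with a 'questions' key holding {n} logical follow-up questions."
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        response_format={"type": "json_object"},
        messages=[{"role": "user", "content": prompt}],
    )
    return json.loads(resp.choices[0].message.content)["questions"]

# suggest_next_questions(["What were Q2 sales by region?", "Which region grew fastest?"])
```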

The bottom line for enterprises: be ready for LLM-based functionality in your BI tools. Be prepared to ask vendors what capabilities they offer, how those capabilities work, how the integration is delivered, and what the pricing options look like (e.g., who pays for the LLM API calls).

If you have more questions, please reach out and schedule an inquiry with us.