Grounding Analytical AI Agents with Looker’s Trusted Metrics


The growth and adoption of generative AI enables new ways for users to engage with products and services. Your organization has likely spent significant time and resources establishing trusted self-service analytics, and AI is revolutionizing how users interact with their business data. For example, instead of building a query, users can ask their question in natural language, by voice or by typing. AI is the ‘magic’ that sits between the user and the data being queried, and these are the interactions and experiences product and engineering teams are now tasked with bringing to their users.

Over the last decade, Bytecode has helped over 1,000 organizations build successful data stacks and adopt Looker for trusted self-service analytics. Today, the common questions we hear from clients are: how can I bring AI to my analytics, and how long will it take? Clients who have invested in Google Cloud's vertically integrated data and AI cloud (BigQuery, Looker and Vertex AI) are excited to learn that the answer is both easier and faster than they expected.

In the age of generative AI, Looker customers have a tremendous time-to-value advantage: extending Looker’s trusted metrics into analytical agents delivers fast, reliable insights for AI-powered Business Intelligence.

At the heart of many generative AI use cases are Large Language Models (LLMs). A growing number of available LLMs are pre-trained on massive amounts of data, which makes them generally impressive out of the box at interpreting, translating or summarizing text and other modalities, but largely lacking when it comes to focused knowledge of your data and your business. Point the ‘magic’ of AI at your data without that knowledge and frustration sets in just as quickly. Off-the-shelf LLMs aren’t (yet) trained to speak your specific language or know your unique business. Frustration arises when LLM-generated definitions differ from your actual business terms, and this becomes particularly clear in analytical use cases. For example, how does your organization define a high-value customer? It could be a combination of dollars spent over their lifetime, the recency of their last purchase or their repeat purchasing cadence. The permutations are endless and specific to every business.

This data dilemma presents an opportunity to leverage your existing investments in Google’s Data Cloud, including Looker’s semantic modeling layer, where your business definitions are globally available to all your users and systems, including generative AI. By combining Gemini’s large language models with Looker’s semantic layer, we can quickly fine-tune and ground your analytical agent in your ‘business truths’, delivering a new interface for end users to access trusted insights and make decisions.
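
In practice, grounding starts by pulling those business definitions out of the semantic layer. Below is a minimal sketch using the Looker Python SDK (looker_sdk) to read the dimensions and measures of an explore so their names, labels and descriptions can be handed to the model as context; the "workplace" model and "space_utilization" explore are hypothetical placeholders, not names from the original example.

```python
# Sketch: pull governed field definitions from a LookML explore so they can
# ground an LLM prompt. Assumes Looker API credentials in looker.ini; the
# model and explore names below are hypothetical.
import looker_sdk

sdk = looker_sdk.init40()  # Looker API 4.0 client

explore = sdk.lookml_model_explore(
    lookml_model_name="workplace",      # hypothetical LookML model
    explore_name="space_utilization",   # hypothetical explore
    fields="fields",                    # only return field metadata
)

# Build a compact "business vocabulary" from the semantic layer.
vocabulary = [
    f"{f.name}: {f.label} - {f.description or 'no description'}"
    for f in list(explore.fields.dimensions or []) + list(explore.fields.measures or [])
]
grounding_context = "\n".join(vocabulary)
print(grounding_context)
```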

For end users, getting analytical answers can now be as easy as asking a question. We’re seeing demand skyrocket within organizations that have successfully deployed analytical agents.

One of Bytecode’s clients is OfficeSpace, a SaaS workplace management platform. OfficeSpace is integrating AI agents into its core product to make desk reservations and space management faster and easier. Central to the product are analytics on office space usage and utilization. By leveraging Gemini models and Looker, OfficeSpace is able to go beyond dashboards and deliver accurate, governed insights directly within its chat experience. The AI models are trained on OfficeSpace’s data, tuned to their specific use cases, and use LookML to generate the queries, ensuring the answer is the same whether the user is chatting a question, reviewing a dashboard or downloading a dataset.

The challenge

LLMs provide a critical piece of the technology needed to make AI-powered BI a reality. For OfficeSpace, delivering their analytics chat experience is a two-part problem:

  1. How to translate the many different ways a person could ask about occupancy rate into a standard question. This is the domain where LLMs are strongest.

  2. How to create a query that accurately calculates occupancy rate. This is the domain where semantic layers are strongest. 

The solution

What each technology is doing (a code sketch of this flow follows the list):

  1. The LLM is responsible for:

    1. Translating the end user question into a standard question

    2. Picking the Looker measure and dimension that most closely matches the question

  2. Looker is responsible for:

    1. Using its semantic layer to define the business logic and generate the query

    2. Delivering the resulting data summary and/or visualization
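
A minimal sketch of that division of responsibilities is shown below, assuming the Looker Python SDK and the Vertex AI SDK for Gemini; the project, LookML model, explore, field catalog and prompt wording are illustrative assumptions, not OfficeSpace’s actual implementation.

```python
# Sketch of the flow above: Gemini maps the question to governed LookML fields,
# then Looker's semantic layer generates and runs the query.
import json

import looker_sdk
import vertexai
from looker_sdk import models40 as models
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-gcp-project", location="us-central1")  # hypothetical project
sdk = looker_sdk.init40()
gemini = GenerativeModel("gemini-1.5-pro")

question = "What was our occupancy rate by floor last month?"
field_catalog = (
    "space_utilization.occupancy_rate: Occupancy Rate\n"
    "space_utilization.floor: Floor\n"
    "space_utilization.date: Date"
)  # in practice, built from the explore metadata in the semantic layer

# 1. The LLM translates the question and picks the matching LookML fields.
prompt = (
    "You translate business questions into LookML field selections.\n"
    f"Available fields:\n{field_catalog}\n\n"
    'Return JSON with keys "fields" (list of field names) and "filters" '
    f"(field name to filter expression) for this question: {question}"
)
selection = json.loads(
    gemini.generate_content(
        prompt, generation_config={"response_mime_type": "application/json"}
    ).text
)

# 2. Looker, not the LLM, generates the SQL from the governed semantic model.
result = sdk.run_inline_query(
    result_format="json",
    body=models.WriteQuery(
        model="workplace",            # hypothetical LookML model
        view="space_utilization",     # the explore name goes in "view"
        fields=selection["fields"],
        filters=selection.get("filters", {}),
    ),
)
print(result)
```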

[Architecture diagram]

Getting Gemini and Looker to communicate end-to-end is trivial. The accuracy of the experience is unlocked through a robust semantic layer, grounding of the language model, and prompt training that tunes Gemini to accurately translate the question and select the right objects in LookML. Creating data agents with Gemini is very similar to creating curated self-service analytics: we focus on the end users, the questions they need to be able to answer, and then curate the experience.

Demystifying the training process 

Training sounds complicated, but it’s really similar to a traditional analytics problem. Just as when we build a dashboard, we identify the business questions our end users need to be able to answer and then build a set of visualizations to help them make decisions.

In the training process, we take the same approach: we come up with a list of business questions and provide Gemini with the expected Looker Explore result for each. By giving Gemini common business questions and their expected results, you will see dramatic improvements in the accuracy of the auto-generated responses.
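
As an illustration, those question-and-expected-result pairs can be kept as simple structured examples and prepended to the Gemini prompt as few-shot guidance. The questions, fields and filters below are hypothetical, not taken from a real model.

```python
# Hypothetical few-shot examples: each business question is paired with the
# Looker Explore selection we expect the agent to produce.
FEW_SHOT_EXAMPLES = [
    {
        "question": "How full were our offices last quarter?",
        "expected": {
            "fields": ["space_utilization.occupancy_rate"],
            "filters": {"space_utilization.date": "last quarter"},
        },
    },
    {
        "question": "Which floors are busiest on Mondays?",
        "expected": {
            "fields": ["space_utilization.floor", "space_utilization.badge_ins"],
            "filters": {"space_utilization.day_of_week": "Monday"},
        },
    },
]

def format_examples(examples: list[dict]) -> str:
    """Render the pairs as text that can be prepended to the Gemini prompt."""
    lines = []
    for ex in examples:
        lines.append(f"Question: {ex['question']}")
        lines.append(f"Expected selection: {ex['expected']}")
    return "\n".join(lines)
```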

Lowering barriers grows data-empowered users

The delivery of trusted AI in Business Intelligence (BI) is a significant development for both internal and productized analytics. We are breaking down even more barriers to adoption as we simplify the interface users need to answer data questions. Other customers are leveraging AI in BI to deliver analytics to front-line workers who just need a quick answer.

We’re excited to partner with Google Data Cloud to demystify the process and deliver accurate, governed AI-powered BI. Reach out to Bytecode if you are searching for an experienced partner to help realize your analytical AI agent use cases, or visit the Google Cloud console to access Looker, Vertex AI and Gemini models to get started on your own.
