AI Knowledge Analytics provides comprehensive visibility into your AI agents’ performance, usage patterns, and resource consumption. These powerful dashboards help you optimize costs, identify improvement opportunities, and ensure your AI solutions deliver maximum value to your organization.

Key Analytics Features

  • Usage Metrics: Track conversations, users, and document generation across all agents
  • Token Consumption: Monitor input and output token usage with detailed breakdowns
  • Cost Analysis: Track expenditure with detailed cost attribution by agent and model
  • Performance Trends: Identify usage patterns and performance changes over time
  • Agent Comparison: Compare effectiveness and efficiency across your AI agent portfolio
  • Custom Date Ranges: Analyze data across flexible time periods, from 1 day to 12 months

Analytics Dashboards

AI Knowledge Analytics offers multiple dashboards to help you understand different aspects of your AI deployment.

The Usage Analytics dashboard provides detailed insights into how your agents are being utilized:

Key Metrics

  • Generated Responses: Total number of AI responses generated across all agents
  • End Users: Number of unique users interacting with your agents
  • Created Documents: Total documents generated by your agents
  • Total Tokens Count: Aggregate token usage across all conversations
  • Input Tokens: Tokens consumed by user queries and context
  • Output Tokens: Tokens generated in AI responses

This dashboard helps you understand usage patterns and user engagement, allowing you to identify your most valuable agents and usage trends over time.

Using Analytics Effectively

1. Access Analytics

Navigate to the Analytics section from your AI Knowledge dashboard.

2. Select time period

Choose your desired timeframe for analysis using the date selectors. Options include standard periods (1 day, 7 days, 30 days, 12 months) or custom date ranges.

3. Review key metrics

Examine the main performance indicators for your agents. Pay special attention to significant changes or trends in usage and costs.

4. Drill down into specific agents

Click on individual agents to see detailed performance metrics. Compare agents to identify best practices and improvement opportunities.

Best Practices for Analytics

  • Regular Reviews: Schedule weekly or monthly analytics reviews to track performance trends
  • Benchmark Agents: Compare similar agents to establish performance benchmarks
  • Token Optimization: Identify and optimize high token consumption scenarios
  • User Feedback Correlation: Connect analytics data with user feedback for deeper insights
  • Cost Allocation: Use analytics to allocate AI costs to appropriate departments
  • Continuous Improvement: Implement regular optimizations based on analytics insights

Token Optimization Strategies

Based on analytics insights, consider these strategies to optimize token usage and costs:

1. Knowledge base refinement

Streamline knowledge bases to include only the most relevant information.

2. Prompt engineering

Refine system prompts and instructions to be more efficient.

3. Model selection

Choose the most cost-effective model for each use case.

4. Context window management

Optimize how much context is included in each interaction; a minimal sketch follows below.
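
For context window management in particular, one common pattern is to cap retrieved knowledge-base context at a fixed token budget before it reaches the model. The sketch below is illustrative only: the tiktoken tokenizer, the trim_context helper, and the budget value are assumptions for the example, not part of the AI Knowledge platform.

# Illustrative sketch only: trim retrieved context to a fixed token budget
# before it is added to the prompt. Assumes the open-source tiktoken library;
# the helper name, encoding, and budget are examples, not an AI Knowledge API.
import tiktoken

def trim_context(chunks: list[str], budget: int = 2000) -> list[str]:
    """Keep the highest-ranked chunks that fit within the token budget."""
    enc = tiktoken.get_encoding("cl100k_base")  # any tokenizer works for budgeting
    kept, used = [], 0
    for chunk in chunks:  # chunks assumed sorted by relevance, best first
        tokens = len(enc.encode(chunk))
        if used + tokens > budget:
            break
        kept.append(chunk)
        used += tokens
    return kept

# Only the chunks that fit the budget end up in the prompt context.
context_chunks = trim_context(["chunk A ...", "chunk B ...", "chunk C ..."], budget=50)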

Custom Usage Analytics

Both LLM and embedding usage are tracked as usage events, persisted with a custom aggPayload mapping that enables numeric aggregations in Elasticsearch/OpenSearch requests.

Example usage event:

{
  "type": "usage",
  "aggPayload": {
    "channel": "api",
    "api": "completions",
    "tool": "genericQuery",
    "usage": {
      "firstTokenDuration": 804,
      "completion_tokens": 92,
      "prompt_tokens": 2515,
      "total_tokens": 2607,
      "duration": 3953,
      "cost": 0.00720749999999999951
    },
    "projectId": "project id",
    "model": "gpt-4o",
    "provider": "openai",
    "messageId": "message id",
    "userId": "user id",
    "finishReasons": [
      "stop"
    ]    
  }
}

Example Elasticsearch/OpenSearch aggregation request:

Search request body
{
  "size": 0,
  "query": {
    "bool": {
      "must": {
        "range": {
          "@timestamp": {
            "gte": "2025-04-09T14:27:41.684Z",
            "lt": "2025-04-16T14:27:41.684Z"
          }
        }
      },
      "filter": [
        {
          "terms": {
            "type": ["usage"]
          }
        },
        {
          "bool": {
            "should": [
              {
                "bool": {
                  "must_not": [
                    {
                      "term": {
                        "payload.api": "embeddings"
                      }
                    }
                  ]
                }
              }
            ]
          }
        },
        {
          "terms": {
            "payload.projectId": ["<project id>"]
          }
        }
      ]
    }
  },
  "aggs": {
    "totalUsers": {
      "cardinality": {
        "field": "aggPayload.userId"
      }
    },
    "totalMessages": {
      "filter": {
        "term": {
          "type": "usage"
        }
      }
    },
    "tokens": {
      "sum": {
        "field": "aggPayload.usage.total_tokens"
      }
    },
    "completionTokens": {
      "sum": {
        "field": "aggPayload.usage.completion_tokens"
      }
    },
    "promptTokens": {
      "sum": {
        "field": "aggPayload.usage.prompt_tokens"
      }
    }
  }
}
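
The request above can be sent to the standard _search endpoint of your Elasticsearch or OpenSearch cluster with any HTTP client. The sketch below is a minimal example using Python's requests library; the cluster URL, index name (events), and file name are placeholders rather than values defined by AI Knowledge, and authentication is omitted.

# Minimal sketch: send the aggregation request above to the _search endpoint
# and read back the totals. The cluster URL, index name ("events"), and file
# name are placeholders; add authentication as required by your cluster.
import json
import requests

with open("usage_aggregations.json") as f:  # the search request body shown above
    search_body = json.load(f)

resp = requests.post(
    "http://localhost:9200/events/_search",
    json=search_body,
    timeout=30,
)
resp.raise_for_status()
aggs = resp.json()["aggregations"]

print("unique users:     ", aggs["totalUsers"]["value"])
print("usage events:     ", aggs["totalMessages"]["doc_count"])
print("total tokens:     ", aggs["tokens"]["value"])
print("completion tokens:", aggs["completionTokens"]["value"])
print("prompt tokens:    ", aggs["promptTokens"]["value"])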

Next Steps

Explore more detailed guides for AI Knowledge Analytics: