Advanced Topics
Explore advanced techniques for leveraging event-driven architecture in AI Builder applications
As you become more proficient with AI Builder, understanding and leveraging its event-driven architecture can help you build more sophisticated, powerful, and efficient applications. This guide explores advanced topics focused on event-driven patterns and their practical applications.
Event-Driven Architecture (EDA)
Event-driven architecture is the foundation of AI Builder’s flexibility:
- Events as First-Class Citizens: All system and user actions generate events
- Decoupled Components: Services communicate through events, not direct calls
- Asynchronous Processing: Actions occur independently of event producers
- Scalability: Components can scale independently based on event load
- Extensibility: New capabilities can subscribe to existing event streams
In Prisme.ai, events flow through the system as messages containing:
- An event type (e.g., `message.created`, `user.login`)
- A payload with event-specific data
- Metadata about the source, timestamp, and routing information
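For illustration, a chat message event might be structured like this (the envelope field names here are indicative, not an exact schema):

```yaml
# Illustrative event envelope -- field names are indicative
type: message.created
payload:
  messageId: msg-123
  text: Hello!
source:
  workspaceId: my-workspace
  userId: user-456
createdAt: '2024-01-15T10:30:00Z'
```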
Event-driven architecture provides several advantages:
- Loose Coupling: Components can evolve independently
- Real-Time Processing: Events are processed as they occur
- Resilience: Failures in one component don’t cascade to others
- Auditability: Complete event history provides audit trail
- Flexibility: Easy to add new event consumers without modifying producers
For AI applications, EDA enables:
- Seamless integration of multiple AI models and tools
- Progressive enhancement of features without disruption
- Detailed tracking of user interactions for personalization
- Complex workflows that adapt based on AI processing results
Prisme.ai implements EDA through several mechanisms:
- System Events: Generated by the platform for actions like page loads or authentication
- User Events: Triggered by user interactions with blocks
- Automation Events: Created by automation execution
- Custom Events: Defined by developers for application-specific needs
Events can be:
- Emitted by blocks and automations
- Listened for by automations to trigger actions
- Queried for analysis and reporting
- Persisted for auditing and historical analysis
Working with Events
Emitting Events
In automations, you can emit events to trigger other processes:
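A minimal sketch of an automation emitting a custom event with the `emit` instruction (the slug, event name, and payload fields are illustrative):

```yaml
slug: notify-on-order
name: Notify on order
do:
  - emit:
      event: order.processed        # custom event name (illustrative)
      payload:
        orderId: '{{orderId}}'
        status: completed
```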
Blocks can also emit events when users interact with them:
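For example, an Action block can be configured to emit an event when clicked (block properties shown here are illustrative):

```yaml
# Illustrative page block configuration
- slug: Action
  text: Submit feedback
  type: event
  value: feedback.submitted         # event emitted on click
  payload:
    rating: 5
```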
These events flow through the system and can trigger other automations or be recorded for analysis.
Listening for Events
Automations can be triggered by specific events:
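A sketch of an event-triggered automation using a `when.events` trigger (names are illustrative); here the automation itself emits a follow-up event, forming a chain:

```yaml
slug: send-confirmation
name: Send confirmation
when:
  events:
    - order.processed               # fires whenever this event is emitted
do:
  - emit:
      event: email.requested
      payload:
        orderId: '{{payload.orderId}}'
```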
This creates a chain of actions that can flow through your application, each step triggered by the completion of previous steps.
Accessing Event History
View event history in several ways:
- Activity Tab: See recent events in your workspace
- Event Explorer: Query and filter events for analysis
- Elasticsearch/OpenSearch: Advanced querying for deeper analysis
The complete event stream provides valuable insights into application usage, performance, and user behavior.
Analyzing Event Patterns
Advanced analytics can reveal important patterns:
- User Journeys: Track how users move through your application
- Bottlenecks: Identify where processes slow down
- Error Patterns: Detect recurring issues
- Usage Trends: See how usage evolves over time
- Feature Adoption: Measure which features are most used
These insights drive continuous improvement of your applications.
Advanced Event Analytics
Every event in your workspace is stored in Elasticsearch/OpenSearch, enabling custom analysis:
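For example, a query like the following could retrieve recent events of a given type (index and field names vary per workspace, so treat this as a sketch):

```json
{
  "query": {
    "bool": {
      "filter": [
        { "term": { "type": "usage" } },
        { "range": { "createdAt": { "gte": "now-7d" } } }
      ]
    }
  }
}
```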
System Mapping
Create visual maps of your systems based on actual usage:
- Track event flows between components
- Visualize user journeys through your application
- Identify unused features or dead-end paths
- Discover unexpected usage patterns
- Map integration points with external systems
Usage Analytics
Understand how users engage with your applications:
- Measure feature adoption and frequency of use
- Track user session patterns and duration
- Identify popular and underutilized features
- Analyze conversion funnels and drop-off points
- Segment users by behavior patterns
Performance Monitoring
Track system performance metrics:
- Measure response times for different operations
- Identify bottlenecks in processing flows
- Track API usage and latency
- Monitor automation execution times
- Analyze resource utilization patterns
Pattern Discovery
Find meaningful patterns in your event data:
- Discover common user behavior sequences
- Identify correlations between events
- Detect anomalies that may indicate issues
- Recognize seasonal or time-based patterns
- Find optimization opportunities
Event Mapping for Analytics
Introduction to Event Mapping
As part of Prisme.ai’s event-driven architecture, events carry a dynamic `payload`. To ensure consistent and efficient aggregation in both Elasticsearch and OpenSearch, it’s essential to explicitly define the mapping for fields used in the payload.
Without proper mapping, you may encounter issues such as:
- Aggregation inconsistencies between Elasticsearch and OpenSearch
- Fields interpreted with incorrect data types
- Performance issues with complex queries
- Limitations in available analysis capabilities
Benefits of Explicit Mapping
Reliability and Consistency
Ensures uniform data treatment:
- Consistent field types across all events
- Prevents errors caused by automatic inference
- Guarantees that aggregations work properly
- Maintains data integrity over time
Performance Optimization
Improves query and analysis speed:
- Optimizes indexing for known data structures
- Enables more efficient storage patterns
- Improves complex aggregation performance
- Reduces processing overhead for queries
Maintenance and Scalability
Simplifies ongoing management:
- Easy-to-read YAML configuration
- Workspace-specific mapping definitions
- Simplified schema evolution
- Better documentation of data structures
Cross-Platform Compatibility
Works consistently across search engines:
- Identical behavior in Elasticsearch and OpenSearch
- Consistent query results across environments
- Reliable migrations between search technologies
- Future-proof for search engine updates
Implementing Event Mapping
To implement explicit event mapping, add configuration to your workspace YAML:
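A sketch of what such a configuration might look like; the exact top-level keys depend on your platform version, while the field types (`long`, `float`, `keyword`) are standard Elasticsearch/OpenSearch mapping types:

```yaml
# Illustrative workspace configuration -- the exact top-level
# keys may differ in your platform version
events:
  types:
    usage:
      schema:
        usage:
          properties:
            total_tokens:
              type: long        # integer token counts
            cost:
              type: float       # fractional monetary values
            model:
              type: keyword     # exact-match string for aggregations
```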
This example defines the schema for the `usage` event type, specifying that fields like `usage.total_tokens` and `usage.cost` should be treated as numeric values with specific formats.
When implementing:
- Identify the event types requiring explicit mapping
- Define their schema with appropriate data types
- Add the configuration to your Workspace
- Emit sample events
- Test with analytical queries to verify proper behavior
Advanced Usage Analytics Example
Here’s how you might use mapped events for advanced analytics:
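For instance, an aggregation over mapped `usage` events could break down token consumption and cost by model (field paths are illustrative and depend on your mapping):

```json
{
  "size": 0,
  "query": { "term": { "type": "usage" } },
  "aggs": {
    "cost_by_model": {
      "terms": { "field": "payload.usage.model" },
      "aggs": {
        "total_cost": { "sum": { "field": "payload.usage.cost" } },
        "total_tokens": { "sum": { "field": "payload.usage.total_tokens" } }
      }
    }
  }
}
```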
With proper mapping, these aggregations will be fast and accurate, providing valuable insights into application usage and performance.
Practical Event-Driven Patterns
User Activity Tracking
Track and analyze user behavior:
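One approach is to emit a dedicated tracking event from your automations or blocks; a sketch with illustrative event and variable names:

```yaml
do:
  - emit:
      event: user.activity          # custom tracking event (illustrative)
      payload:
        action: viewed_dashboard
        page: /dashboard
        userId: '{{user.id}}'       # context variable names are indicative
```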
These events can be analyzed to:
- Create user journey maps
- Identify popular features
- Measure engagement
- Detect unusual behavior
- Personalize experiences based on usage patterns
AI Model Performance Tracking
Monitor and optimize AI model usage:
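After each model call, an automation could emit a `usage` event carrying the metrics you want to analyze (the values and the `latencyMs` field are illustrative):

```yaml
do:
  - emit:
      event: usage
      payload:
        model: gpt-4o               # illustrative values
        usage:
          total_tokens: 1250
          cost: 0.0125
        latencyMs: 840
```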
This data enables:
- Cost tracking and optimization
- Performance benchmarking
- Model selection refinement
- Usage pattern analysis
- Identifying optimization opportunities
Multi-Step Workflows
Implement complex business processes through event chains:
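A sketch of such a chain: each step is a separate automation that listens for the previous step's event and emits its own on completion (slugs and event names are illustrative):

```yaml
# Step 1: validate the request, then hand off via an event
slug: validate-request
when:
  events:
    - request.submitted
do:
  - emit:
      event: request.validated
      payload:
        requestId: '{{payload.requestId}}'
---
# Step 2: process the validated request independently
slug: process-request
when:
  events:
    - request.validated
do:
  - emit:
      event: request.completed
      payload:
        requestId: '{{payload.requestId}}'
```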
This approach creates modular, maintainable workflows that are:
- Easily extendable with new steps
- Resilient to failures (steps can be retried individually)
- Transparent (full visibility into process state)
- Analyzable (measure performance of each step)