# Event Warehouse
Write collected events to your data warehouse for long-term storage and analysis.
## Overview
The event warehouse writer automatically writes incoming events to a dedicated table in your data warehouse. This gives you a queryable event history alongside your other customer data.
## Configuration

1. Navigate to Events > Warehouse
2. Select a source (warehouse connection)
3. Configure the target schema and table name
4. Click Save
## Table Schema

Events are written with the following columns:

| Column | Type | Description |
|---|---|---|
| `event_id` | string | Unique event identifier |
| `type` | string | Event type (`track`, `identify`, `page`, `group`) |
| `event` | string | Event name (for `track` events) |
| `user_id` | string | User identifier |
| `anonymous_id` | string | Anonymous identifier |
| `properties` | JSON | Event properties |
| `traits` | JSON | User traits (for `identify` events) |
| `context` | JSON | Event context (IP, user agent, etc.) |
| `timestamp` | timestamp | Event timestamp |
| `received_at` | timestamp | When SignalSmith received the event |
| `write_key_id` | string | Which write key sent the event |
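As a sketch of what querying this table looks like, the following SQL counts daily `track` events by name. The schema and table name (`signalsmith.events`) are illustrative; substitute the schema and table you configured, and note that date-interval syntax varies slightly between warehouses:

```sql
-- Daily track-event counts for the last 7 days.
-- `signalsmith.events` is an illustrative name; use your configured table.
SELECT
  DATE(received_at) AS event_date,
  event,
  COUNT(*) AS event_count
FROM signalsmith.events
WHERE type = 'track'
  AND received_at >= CURRENT_DATE - INTERVAL '7' DAY
GROUP BY 1, 2
ORDER BY event_date, event_count DESC;
```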
## Schema Evolution

When new event types or properties appear:

- New columns are not automatically added (the schema is fixed)
- New properties are stored in the `properties` JSON column
- Query JSON columns using your warehouse's JSON functions
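For example, a new event property lands inside `properties` rather than in its own column, and you extract it with your warehouse's JSON functions. This sketch uses BigQuery's `JSON_VALUE`; the table name and the `plan` property are illustrative, and other warehouses use different syntax (e.g. Snowflake's `properties:plan` path notation):

```sql
-- BigQuery-style example: extract the `plan` property from the
-- properties JSON column. Table name and property are illustrative.
SELECT
  user_id,
  JSON_VALUE(properties, '$.plan') AS plan
FROM signalsmith.events
WHERE type = 'track'
  AND event = 'Subscription Started';
```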
## Partitioning

Events are partitioned by date for efficient querying:

- BigQuery — partitioned by `received_at` date
- Snowflake — clustered by `received_at`
- Databricks — partitioned by `received_at` date
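Because partitioning and clustering are keyed on `received_at`, filtering on that column lets the warehouse prune partitions and scan only the matching days. A minimal sketch (illustrative table name; date arithmetic syntax varies by warehouse):

```sql
-- Restricting on received_at prunes partitions, so only
-- yesterday's partition is scanned (illustrative table name).
SELECT COUNT(*) AS events_yesterday
FROM signalsmith.events
WHERE received_at >= CURRENT_DATE - INTERVAL '1' DAY
  AND received_at < CURRENT_DATE;
```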
## API Reference
See Events API for warehouse configuration endpoints.