# Getting Started

Get up and running with Enrich.sh in under 5 minutes.
## Prerequisites
- An Enrich.sh account — sign up at dashboard.enrich.sh
- Your API key (found in Dashboard → Settings → API Keys)
## Step 1: Get Your API Key

Your API key looks like this:

```
sk_live_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```

> **Warning:** Keep your API key secret! Never expose it in client-side code. Use a server-side proxy or the SDK's `beacon()` method.
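One way to keep the key off the client is a small server-side proxy that your pages POST to, with the key injected from an environment variable. A minimal sketch (fetch-handler style, as used by Workers/Deno/Bun runtimes; the `ENRICH_API_KEY` variable and the handler name are assumptions, not part of the SDK):

```typescript
// Sketch of a server-side ingest proxy. The browser POSTs event JSON to
// this handler on your own origin; the Enrich.sh key is added here,
// server-side, and never reaches the client.
const ENRICH_KEY = process.env.ENRICH_API_KEY ?? ''

export async function handleIngest(request: Request): Promise<Response> {
  const body = await request.text() // pass the event payload through untouched
  return fetch('https://enrich.sh/ingest', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${ENRICH_KEY}`,
      'Content-Type': 'application/json',
    },
    body,
  })
}
```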
## Step 2: Install the SDK

```shell
npm install enrich.sh
```

Or use curl directly — no SDK required.
## Step 3: Create a Stream

A stream defines where and how your data is stored. Create one via the Dashboard or API.

### Option A: Dashboard (Recommended)

- Go to dashboard.enrich.sh → Streams
- Click **Create New Stream**
- Enter a `stream_id` (e.g., `events`, `logs`, `purchases`)
- Configure fields and click **Create**
### Option B: API

Simple stream (flex mode — accepts everything):

```shell
curl -X POST https://enrich.sh/streams \
  -H "Authorization: Bearer sk_live_your_key_here" \
  -H "Content-Type: application/json" \
  -d '{
    "stream_id": "events"
  }'
```

With typed fields + evolve mode (recommended):

```shell
curl -X POST https://enrich.sh/streams \
  -H "Authorization: Bearer sk_live_your_key_here" \
  -H "Content-Type: application/json" \
  -d '{
    "stream_id": "events",
    "schema_mode": "evolve",
    "fields": {
      "event": { "type": "string" },
      "url": { "type": "string" },
      "user_id": { "type": "string" },
      "ts": { "name": "timestamp", "type": "int64" },
      "metadata": { "type": "json" }
    }
  }'
```

> **Info:** `evolve` mode auto-detects schema changes and alerts you in the Dashboard. See Streams Configuration for all modes.
### Supported Field Types

| Type | Description | Example |
|---|---|---|
| `string` | Text (default) | `"hello"` |
| `int64` | Integer / timestamp | `1738776000` |
| `float64` | Decimal number | `99.99` |
| `boolean` | True / false | `true` |
| `json` | Nested objects / arrays | `{"a": 1}` |
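As a concrete illustration, here is one event that exercises every type in the table (it reuses the Step 3 fields; `price` and `returning` are hypothetical fields added here only to show `float64` and `boolean`):

```typescript
// A single event touching every supported field type.
const event = {
  event: 'purchase',                            // string
  ts: 1738776000,                               // int64 (Unix timestamp)
  price: 99.99,                                 // float64 (hypothetical field)
  returning: true,                              // boolean (hypothetical field)
  metadata: { plan: 'pro', tags: ['a', 'b'] },  // json: nested objects / arrays
}

console.log(JSON.stringify(event))
```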
> **Tip:** See Streams Configuration for complete details on field types, nested objects, and schema modes.
## Step 4: Send Your First Event

### Using curl

```shell
curl -X POST https://enrich.sh/ingest \
  -H "Authorization: Bearer sk_live_your_key_here" \
  -H "Content-Type: application/json" \
  -d '{
    "stream_id": "events",
    "data": [
      {
        "event": "page_view",
        "url": "https://example.com/pricing",
        "user_id": "user_123",
        "ts": 1738776000
      }
    ]
  }'
```

### Using the SDK
```typescript
import { Enrich } from 'enrich.sh'

const enrich = new Enrich('sk_live_your_key_here')

// Buffer + auto-flush (recommended for high volume)
enrich.track('events', {
  event: 'page_view',
  url: 'https://example.com/pricing',
  user_id: 'user_123',
})

// Or send immediately
await enrich.ingest('events', {
  event: 'page_view',
  url: 'https://example.com/pricing',
  user_id: 'user_123',
})
```

### Response

```json
{
  "accepted": 1,
  "buffered": 1
}
```

## Step 5: Send a Batch (Recommended)
For production use, always batch your events:
```shell
curl -X POST https://enrich.sh/ingest \
  -H "Authorization: Bearer sk_live_your_key_here" \
  -H "Content-Type: application/json" \
  -d '{
    "stream_id": "events",
    "data": [
      { "event": "page_view", "url": "/home", "ts": 1738776000 },
      { "event": "page_view", "url": "/pricing", "ts": 1738776001 },
      { "event": "click", "element": "signup_btn", "ts": 1738776002 },
      { "event": "page_view", "url": "/signup", "ts": 1738776003 }
    ]
  }'
```

> **Tip:** Batch 50–100 events per request for optimal throughput.
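If your queue grows past that range, a tiny batching helper keeps each request within it (a sketch; `chunk` is not part of the SDK):

```typescript
// Split a queued event array into batches of at most `size` for /ingest.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = []
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size))
  }
  return batches
}

// 250 queued events -> three requests of 100, 100, and 50 events,
// each sent as one POST to /ingest (or one enrich.ingest call).
const queued = Array.from({ length: 250 }, (_, i) => ({ event: 'page_view', ts: 1738776000 + i }))
const batches = chunk(queued, 100)
console.log(batches.map(b => b.length)) // [ 100, 100, 50 ]
```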
## Step 6: Connect Your Warehouse
Go to Dashboard → Stream → Connect to get S3-compatible credentials and ready-to-paste SQL for your warehouse.
| What You Get | Details |
|---|---|
| S3 endpoint | {account_id}.r2.cloudflarestorage.com |
| Bucket | enrich-{customer_id} |
| Access | Scoped read-only |
| Egress cost | $0 |
| Works with | ClickHouse, BigQuery, DuckDB, Snowflake, Python |
See Streams → Connecting Your Warehouse for copy-paste SQL examples.
## Step 7: Query Your Data

### Using DuckDB

```sql
INSTALL httpfs;
LOAD httpfs;

-- Configure R2 credentials (from Dashboard → Connect)
SET s3_region = 'auto';
SET s3_endpoint = 'your-account.r2.cloudflarestorage.com';
SET s3_access_key_id = 'your_r2_access_key';
SET s3_secret_access_key = 'your_r2_secret';

-- Query your data
SELECT *
FROM read_parquet('s3://enrich-your-id/events/2026/02/**/*.parquet');
```

### Using the SDK
```typescript
const urls = await enrich.query('events', { days: 7 })

// Pass directly to DuckDB
await conn.query(
  `SELECT * FROM read_parquet(${JSON.stringify(urls)})`
)
```

### Using Python
```python
import duckdb

conn = duckdb.connect()
conn.execute("""
    SET s3_region = 'auto';
    SET s3_endpoint = 'your-account.r2.cloudflarestorage.com';
    SET s3_access_key_id = 'your_r2_access_key';
    SET s3_secret_access_key = 'your_r2_secret';
""")

df = conn.execute("""
    SELECT event, COUNT(*) AS count
    FROM read_parquet('s3://enrich-your-id/events/2026/02/**/*.parquet')
    GROUP BY event
    ORDER BY count DESC
""").fetchdf()
print(df)
```

## Next Steps
- Streams Configuration — Define data types, nested objects, and schema modes
- API Reference — Full endpoint documentation
- Best Practices — Optimize your integration
- Enrichment Templates — Automatic data enrichment
- SDK Reference — Node.js & browser SDK
