
API Reference

Base URL: https://enrich.sh

All endpoints require authentication via Bearer token unless noted otherwise.

Authentication

Include your API key in the Authorization header:

```
Authorization: Bearer sk_live_your_key_here
```

| Prefix | Environment | Usage |
| --- | --- | --- |
| `sk_live_` | Production | Real data, billing applies |
| `sk_test_` | Sandbox | Testing, no billing |

Ingest Events

POST /ingest

Send events to be buffered and stored as Parquet. Events are validated against the stream's schema_mode — in strict mode, invalid events are routed to the Dead Letter Queue.

Request Body:

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| `stream_id` | string | Yes | Target stream identifier |
| `data` | array | Yes | Array of event objects |

Example:

```bash
curl -X POST https://enrich.sh/ingest \
  -H "Authorization: Bearer sk_live_your_key" \
  -H "Content-Type: application/json" \
  -d '{
    "stream_id": "events",
    "data": [
      { "event": "click", "url": "/buy", "ts": 1738776000 },
      { "event": "purchase", "amount": 99.99, "ts": 1738776001 }
    ]
  }'
```

Response (200 OK):

```json
{
  "accepted": 2,
  "buffered": 1502
}
```

Limits

| Limit | Value |
| --- | --- |
| Max payload size | 1 MB |
| Max events per request | ~3,000 (within 1 MB) |
| Min events per request | 1 |
| Rate limit | ~200 req/s per customer |

TIP

For high-volume ingestion, batch 50–100 events per request for optimal throughput.
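As a sketch of that batching pattern (the helper names are hypothetical, and the key is a placeholder), events can be chunked client-side before posting to `/ingest`:

```python
import json
import urllib.request

API_URL = "https://enrich.sh/ingest"   # base URL from this reference
API_KEY = "sk_test_your_key"           # placeholder sandbox key

def chunk(events, size=100):
    """Split a list of events into batches of at most `size`."""
    return [events[i:i + size] for i in range(0, len(events), size)]

def send_batch(stream_id, batch):
    """POST one batch to /ingest (network call; sketch only)."""
    body = json.dumps({"stream_id": stream_id, "data": batch}).encode()
    req = urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )
    return urllib.request.urlopen(req)

events = [{"event": "click", "ts": 1738776000 + i} for i in range(250)]
batches = chunk(events, size=100)  # 250 events -> batches of 100, 100, 50
```

Batches of 50–100 stay comfortably under the 1 MB payload limit for typical event sizes while amortizing per-request overhead.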

Error Responses

| Status | Error | Description |
| --- | --- | --- |
| 400 | `stream_id is required` | Missing `stream_id` field |
| 400 | `data must be an array` | Invalid data format |
| 400 | `data array is empty` | Empty `data` array |
| 404 | `Stream 'x' not found` | Stream doesn't exist |
| 413 | `Payload too large` | Exceeds 1 MB request limit |
| 429 | `Monthly event limit exceeded` | Upgrade plan required |

Streams

List Streams

GET /streams

Response:

```json
{
  "streams": [
    {
      "stream_id": "events",
      "schema_mode": "evolve",
      "template": null,
      "created_at": "2026-02-01T10:00:00.000Z"
    },
    {
      "stream_id": "clicks",
      "schema_mode": "flex",
      "template": "clickstream",
      "created_at": "2026-02-03T15:00:00.000Z"
    }
  ]
}
```

Create Stream

POST /streams

| Field | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| `stream_id` | string | Yes | | Unique identifier (alphanumeric + `_` + `-`) |
| `fields` | object | No | null | Field type definitions |
| `template` | string | No | null | Enrichment template |
| `schema_mode` | string | No | `flex` | `strict`, `evolve`, or `flex` |
| `webhook_url` | string | No | null | URL to forward events on flush |

Example:

```bash
curl -X POST https://enrich.sh/streams \
  -H "Authorization: Bearer sk_live_your_key" \
  -H "Content-Type: application/json" \
  -d '{
    "stream_id": "purchases",
    "schema_mode": "strict",
    "fields": {
      "order_id": { "type": "string" },
      "amount": { "type": "float64" },
      "currency": { "type": "string" }
    }
  }'
```

Response (201 Created):

```json
{
  "stream_id": "purchases",
  "schema_mode": "strict",
  "template": null,
  "created_at": "2026-02-05T12:00:00.000Z"
}
```

Update Stream

PUT /streams/:stream_id

Accepts the same body as Create Stream; include only the fields you want to update.

Delete Stream

DELETE /streams/:stream_id

WARNING

This deletes the stream configuration only. Data already stored in R2 is not deleted.


Dead Letter Queue

List DLQ Events

GET /streams/:stream_id/dlq

Get events rejected by strict mode validation.

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `days` | integer | 7 | Time range (last N days) |
| `limit` | integer | 100 | Max events to return |

Response:

```json
{
  "stream_id": "transactions",
  "dlq_count": 42,
  "events": [
    {
      "rejected_at": "2026-02-15T10:30:00Z",
      "reason": "extra_field",
      "field": "unknown_col",
      "original": {
        "order_id": "abc123",
        "amount": 99.99,
        "unknown_col": "should not be here"
      }
    }
  ]
}
```

DLQ Event Metadata

| Field | Description |
| --- | --- |
| `rejected_at` | Timestamp of rejection |
| `reason` | `missing_field`, `extra_field`, or `type_mismatch` |
| `field` | Which field caused the rejection |
| `original` | Full original event payload |

DLQ events are also stored as Parquet at {stream_id}/_dlq/ and queryable via DuckDB.
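For example, a DuckDB query over that DLQ prefix might be assembled like this (the bucket name is illustrative, and the `dlq_query` helper is hypothetical; the glob follows the `{stream_id}/_dlq/` layout noted above):

```python
def dlq_query(bucket: str, stream_id: str) -> str:
    """Build a DuckDB read_parquet query over a stream's DLQ prefix."""
    # Parquet files land under {stream_id}/_dlq/ per the docs above;
    # the ** glob recurses into any date partitioning beneath it.
    path = f"s3://{bucket}/{stream_id}/_dlq/**/*.parquet"
    return f"SELECT * FROM read_parquet('{path}')"

sql = dlq_query("enrich-cust_abc123", "transactions")
# Run with duckdb.sql(sql) after applying the S3 settings from GET /connect.
```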


Schema Events

Get Schema Change History

GET /streams/:stream_id/schema-events

View detected schema changes for streams in evolve mode.

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `days` | integer | 7 | Time range (last N days) |

Response:

```json
{
  "stream_id": "erp_events",
  "schema_events": [
    {
      "type": "new_field",
      "field": "discount_code",
      "detected_type": "string",
      "detected_at": "2026-02-15T10:30:00Z",
      "event_count": 127
    },
    {
      "type": "type_change",
      "field": "amount",
      "previous_type": "int64",
      "detected_type": "string",
      "detected_at": "2026-02-15T11:00:00Z",
      "event_count": 3
    },
    {
      "type": "missing_field",
      "field": "currency",
      "detected_at": "2026-02-15T12:00:00Z",
      "event_count": 45
    }
  ]
}
```

| Change Type | Description |
| --- | --- |
| `new_field` | A field appeared that isn't in the schema |
| `type_change` | A field's data type changed |
| `missing_field` | A previously-present field is no longer being sent |
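A consumer of this endpoint might, for instance, surface only type changes for alerting, since those are the most likely to break downstream queries. This is a hypothetical client-side filter, not part of the API:

```python
def type_changes(schema_events):
    """Pick out type_change events from a schema-events response."""
    return [e for e in schema_events if e["type"] == "type_change"]

# Shape mirrors the /schema-events response above.
events = [
    {"type": "new_field", "field": "discount_code"},
    {"type": "type_change", "field": "amount",
     "previous_type": "int64", "detected_type": "string"},
]
breaking = type_changes(events)  # only the amount int64 -> string change
```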

Connect (S3 Credentials)

Get S3 Access Credentials

GET /connect

Get S3-compatible credentials for direct warehouse access to your data.

Response:

```json
{
  "endpoint": "https://abcdef123456.r2.cloudflarestorage.com",
  "bucket": "enrich-cust_abc123",
  "access_key_id": "your_r2_access_key",
  "secret_access_key": "your_r2_secret_key",
  "region": "auto",
  "egress_cost": "$0",
  "example_path": "s3://enrich-cust_abc123/events/2026/02/**/*.parquet",
  "sql": {
    "duckdb": "SET s3_region='auto'; SET s3_endpoint='abcdef123456.r2.cloudflarestorage.com'; SET s3_access_key_id='...'; SET s3_secret_access_key='...'; SELECT * FROM read_parquet('s3://enrich-cust_abc123/events/2026/02/**/*.parquet');",
    "clickhouse": "SELECT * FROM s3('https://abcdef123456.r2.cloudflarestorage.com/enrich-cust_abc123/events/2026/02/**/*.parquet', '...', '...', 'Parquet');"
  }
}
```

TIP

The Dashboard Connect page provides ready-to-paste SQL for ClickHouse, BigQuery, DuckDB, Snowflake, and Python.
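As a sketch, a `/connect` response can be turned into the session settings DuckDB expects (the `creds` dict mirrors the response above with placeholder keys; running the resulting statements requires DuckDB with the httpfs extension loaded):

```python
def duckdb_settings(creds: dict) -> str:
    """Render DuckDB S3 SET statements from a /connect response."""
    # DuckDB's s3_endpoint setting takes a bare host, not a URL.
    endpoint = creds["endpoint"].removeprefix("https://")
    return "; ".join([
        "SET s3_region='auto'",
        f"SET s3_endpoint='{endpoint}'",
        f"SET s3_access_key_id='{creds['access_key_id']}'",
        f"SET s3_secret_access_key='{creds['secret_access_key']}'",
    ]) + ";"

creds = {
    "endpoint": "https://abcdef123456.r2.cloudflarestorage.com",
    "access_key_id": "AKIA_EXAMPLE",       # placeholder
    "secret_access_key": "SECRET_EXAMPLE",  # placeholder
}
settings = duckdb_settings(creds)
```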


Stream Replay

Replay Events to Webhook

POST /streams/:stream_id/replay

Re-send historical events from a time range to a webhook URL.

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| `from` | string | Yes | Start date (YYYY-MM-DD) |
| `to` | string | Yes | End date (YYYY-MM-DD) |
| `webhook_url` | string | Yes | Target URL |

Example:

```bash
curl -X POST https://enrich.sh/streams/events/replay \
  -H "Authorization: Bearer sk_live_your_key" \
  -H "Content-Type: application/json" \
  -d '{
    "from": "2026-02-01",
    "to": "2026-02-14",
    "webhook_url": "https://your-api.com/replay-target"
  }'
```

Response:

```json
{
  "replay_id": "rpl_abc123",
  "status": "started",
  "stream_id": "events",
  "from": "2026-02-01",
  "to": "2026-02-14",
  "estimated_events": 450000
}
```

Use cases: ML model retraining, backfilling downstream systems, disaster recovery.


Templates

List Templates

GET /templates

Get available enrichment templates with their schemas.

Response:

```json
{
  "templates": [
    {
      "id": "clickstream",
      "name": "clickstream",
      "description": "Web and app analytics with session tracking, geo, and domain enrichment",
      "fields": [
        { "name": "url", "type": "string" },
        { "name": "timestamp", "type": "int64" },
        { "name": "user_id", "type": "string" }
      ]
    },
    {
      "id": "transaction",
      "name": "transaction",
      "description": "Payment and purchase event enrichment",
      "fields": [
        { "name": "txn_id", "type": "string" },
        { "name": "txn_amount", "type": "string" },
        { "name": "txn_currency", "type": "string" }
      ]
    }
  ]
}
```

Usage

Get Usage Stats

GET /usage

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `start_date` | string | 30 days ago | Start date (YYYY-MM-DD) |
| `end_date` | string | today | End date (YYYY-MM-DD) |
| `stream_id` | string | all | Filter by stream |

Example:

```bash
curl "https://enrich.sh/usage?start_date=2026-02-01&end_date=2026-02-05" \
  -H "Authorization: Bearer sk_live_your_key"
```

Response:

```json
{
  "usage": [
    {
      "date": "2026-02-05",
      "stream_id": "events",
      "event_count": 125000,
      "bytes_stored": 4521000,
      "file_count": 3
    }
  ],
  "totals": {
    "event_count": 223000,
    "bytes_stored": 7721000,
    "file_count": 5
  }
}
```

Live Stats

Get Real-Time Stats

GET /stats

Real-time ingestion statistics powered by Cloudflare Analytics Engine.

Response:

```json
{
  "status": "ok",
  "customer_id": "cust_abc123",
  "hour": "2026-02-07T09",
  "requests": 1523,
  "events": 45690,
  "errors": 2,
  "histogram": [
    { "minute": "2026-02-07T09:30", "events": 1200, "requests": 40, "errors": 0 }
  ],
  "hourly": [
    { "hour": "2026-02-07T08", "events": 52000, "requests": 1700, "errors": 5 }
  ]
}
```

| Field | Description |
| --- | --- |
| `status` | `ok` if data is flowing, `idle` if no recent activity |
| `histogram` | Minute-by-minute stats for the last hour |
| `hourly` | Hourly aggregates for the last 24 hours |

Errors

Get Recent Errors

GET /errors

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `limit` | integer | 50 | Max errors to return (max 100) |
| `stream_id` | string | all | Filter by stream |

Response:

```json
{
  "customer_id": "cust_abc123",
  "errors_24h": 12,
  "by_type": {
    "validation": 8,
    "schema_rejection": 2,
    "rate_limit": 1,
    "processing": 1
  },
  "recent": [
    {
      "id": "err_xyz789",
      "type": "validation",
      "message": "stream_id is required",
      "endpoint": "/ingest",
      "request_body": "{ ... }",
      "created_at": "2026-02-07T09:45:00.000Z"
    }
  ]
}
```

Health Check

GET /health

Public endpoint — no authentication required.

```json
{
  "status": "ok",
  "timestamp": "2026-02-05T18:00:00.000Z",
  "version": "1.0.0"
}
```

HTTP Status Codes

| Code | Description |
| --- | --- |
| 200 | Success |
| 201 | Created |
| 400 | Bad Request (validation error) |
| 401 | Unauthorized (invalid/missing API key) |
| 404 | Not Found |
| 405 | Method Not Allowed |
| 413 | Payload Too Large |
| 429 | Rate Limited / Quota Exceeded |
| 500 | Internal Server Error |

Rate Limits

| Tier | Requests/sec | Events/month | Features |
| --- | --- | --- | --- |
| Starter | 50 | 10M | Flex mode, shared storage |
| Pro | 200 | 100M | All schema modes, dedicated R2 bucket, DLQ, alerts |
| Scale | 300 | 500M | Everything + webhook forwarding, stream replay |
| Enterprise | Custom | Custom | Custom SLAs, dedicated support |

When rate limited, you'll receive a `429` response with a `Retry-After` header.
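One way a client might handle that: honor `Retry-After` when the server sends it, and fall back to exponential backoff otherwise. A minimal sketch (the helper name and the 30-second cap are choices of this example, not part of the API):

```python
import random

def retry_delay(attempt, retry_after=None):
    """Seconds to wait before retrying a 429'd request.

    Uses the Retry-After header value when present; otherwise
    exponential backoff with jitter, capped at 30 seconds.
    """
    if retry_after is not None:
        return float(retry_after)
    return min(2 ** attempt, 30) + random.uniform(0, 1)

# A 429 carrying "Retry-After: 5" means wait exactly 5 seconds:
delay = retry_delay(attempt=0, retry_after="5")
```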

Serverless data ingestion for developers.