How to stream Postgres to Elasticsearch
Build real‑time search indices with Postgres change data capture (CDC) and Elasticsearch. Learn to keep your search indices in sync with your database.
This guide shows you how to capture Postgres changes with Sequin and stream them to an Elasticsearch index.
With Postgres data flowing into Elasticsearch you can power full‑text search, vector retrieval, and analytics without manual re‑indexing. By the end of this guide you’ll have an Elasticsearch index that stays continuously in sync with your database.
Prerequisites
If you self‑host Sequin:
If you use Sequin Cloud:
Basic setup
Prepare an Elasticsearch cluster
Sequin converts Postgres rows to JSON documents and sends them to the Elasticsearch Bulk API.
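Each change becomes one action in a bulk request: an `index` action followed by the document source for inserts and updates, or a `delete` action for deletes. The exact payload is an internal detail of Sequin; the values below are hypothetical, but the shape follows the Bulk API:

```json
{ "index": { "_index": "products", "_id": "42" } }
{ "name": "Dune", "price": 19.99, "in_stock": true }
{ "delete": { "_index": "products", "_id": "43" } }
```

You need a reachable cluster: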
- Local development ⇒ run Elasticsearch in Docker with the script shown after this list. The script starts Elasticsearch + Kibana and prints connection details. Use the API key (not the user/password) when configuring the sink.
- Production ⇒ Elastic Cloud or a self‑managed cluster.
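If you don't already have a local cluster, Elastic's start-local helper is one way to launch it (this assumes Docker is installed and running; the URL is Elastic's published helper script):

```bash
# Downloads and runs Elastic's start-local helper, which launches
# Elasticsearch and Kibana in Docker and prints the connection URL,
# password, and an API key.
curl -fsSL https://elastic.co/start-local | sh
```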
If your Elasticsearch instance is running on your laptop but Sequin runs in the cloud, connect them with the Sequin CLI’s tunnel command.
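For example (a sketch – verify the exact flags with `sequin tunnel --help`, and note that `my-elasticsearch` is a hypothetical name for the endpoint you configure in Sequin):

```bash
# Forwards traffic from Sequin Cloud to localhost:9200 so the sink
# can reach the cluster running on your machine.
sequin tunnel --ports=9200:my-elasticsearch
```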
Map Postgres tables to Elasticsearch indices
Elasticsearch stores documents in indices. As a rule of thumb, create one sink per Postgres table, each writing to its own index. That keeps the mapping consistent and simplifies queries.
Advanced scenarios – sharding, multi‑tenancy, or routing documents to multiple indices – can be handled with filters and soon with Sequin Routing functions.
Create an Elasticsearch sink
Navigate to Sinks → Create sink → Elasticsearch and follow the steps below.
Source configuration
Under Source, pick the table whose changes you want to index.
Optionally:

- Select which actions (`insert`, `update`, `delete`) you care about.
- Add filters such as `in_stock = true` to index only a subset of rows.
Backfill existing rows
Enable Initial backfill if you want Sequin to load existing rows into Elasticsearch before streaming live changes.
Transform rows to documents
Your transform must emit JSON compatible with the index mapping.
Sequin automatically builds the document `_id` from the table's primary key. Let us know if you need to use a different `_id` field.
Typical transform:
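A sketch, assuming Sequin's Elixir function transforms and a hypothetical `products` table with `name`, `description`, `price`, and `in_stock` columns:

```elixir
def transform(action, record, changes, metadata) do
  # Emit only the fields the index mapping expects; extra columns in
  # the Postgres row are dropped here rather than rejected by Elasticsearch.
  %{
    name: record["name"],
    description: record["description"],
    price: record["price"],
    in_stock: record["in_stock"]
  }
end
```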
For dynamic schemas you can just return `record` unchanged.
See Elasticsearch's ingest best practices for mapping guidance.
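If you manage the mapping yourself, it can help to create it before the backfill so fields get explicit types rather than inferred ones. A minimal sketch, assuming the hypothetical `products` fields used above:

```bash
# Creates the products index with an explicit mapping.
curl -X PUT "http://localhost:9200/products" \
  -H "Authorization: ApiKey <api-key>" \
  -H 'Content-Type: application/json' \
  -d '{
    "mappings": {
      "properties": {
        "name":        { "type": "text" },
        "description": { "type": "text" },
        "price":       { "type": "float" },
        "in_stock":    { "type": "boolean" }
      }
    }
  }'
```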
Delivery settings
In the Elasticsearch card, enter:
- Endpoint URL: `http://host.docker.internal:9200` (or your cluster URL)
- Index name: `products`
- Authentication type: `api_key`
- Authentication value: `<api-key>`
Keep the default Batch size unless you have special throughput needs. Sequin supports up to 10,000 docs per batch.
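Before creating the sink, a quick request can confirm the cluster and API key work (shown against `localhost:9200`; use whatever URL your cluster listens on):

```bash
# Should return cluster name and version info; a 401 means the API key
# is wrong or expired.
curl -H "Authorization: ApiKey <api-key>" http://localhost:9200
```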
Create the sink
Name the sink (e.g. `products-elasticsearch`) and click Create sink. Sequin queues a backfill (if selected) and then starts streaming live changes.
Verify & debug
- In Sequin's web console, watch the Messages count increase.
- Query Elasticsearch to confirm documents are arriving (see the example after this list).
- If documents are missing:
  - Check the Messages tab for failed deliveries.
  - Inspect the error returned by Elasticsearch; mapping conflicts and authentication issues are common.
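A `match_all` search is a handy sanity check that documents are landing in the index (swap in a real query as needed):

```bash
curl -X GET "http://localhost:9200/products/_search?pretty" \
  -H "Authorization: ApiKey <api-key>" \
  -H 'Content-Type: application/json' \
  -d '{ "query": { "match_all": {} } }'
```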
Next steps
Ready to go further? Explore: