- Boot Sequin
- Connect to a sample playground database
- Start a local Elasticsearch + Kibana stack
- Create an Elasticsearch index
- Create a Sequin sink from Postgres to Elasticsearch
- Watch your data flow in real‑time
This is the quickstart for streaming Postgres to Elasticsearch. See the how-to guide for an explanation of how to use the Elasticsearch sink or the reference for details on all configuration options.
Run Sequin
The easiest way to get started with Sequin is with our Docker Compose file. This file starts a Postgres database, a Redis instance, and the Sequin server, along with Prometheus and Grafana for monitoring.
1. Create directory and start services
- Download sequin-docker-compose.zip.
- Unzip the file.
- Navigate to the unzipped directory and start the services:
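For example, assuming the archive unzips to a `sequin-docker-compose` directory (the actual name may differ):

```bash
# Unzip the bundle, enter the directory, and start the services
# in the background.
unzip sequin-docker-compose.zip
cd sequin-docker-compose
docker compose up -d
```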
Alternative: Download with curl
Alternative: Clone the repository
2. Verify services are running
Check that Sequin is running using `docker ps`. In the output, Sequin, Postgres, Redis, Prometheus, and Grafana should be up and running (status: `Up`).

Login
The Docker Compose file automatically configures Sequin with an admin user and a playground database. Let’s log in to the Sequin web console:
1. Open the web console
After starting the Docker Compose services, open the Sequin web console at http://localhost:7376.
2. Log in with default credentials

Use the following default credentials to log in:
- Email:
- Password:
View the playground database
To get you started quickly, Sequin’s Docker Compose file creates a logical database called `sequin_playground` with a sample dataset in the `public.products` table. Let’s take a look:

1. Navigate to Databases
In the Sequin web console, click Databases in the sidebar.
2. Select playground database
Click on the pre-configured `sequin-playground` database.
The database’s “Health” status should be green.
3. View contents of the products table
Let’s get a sense of what’s in the `products` table. Run the following command:
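A sketch of that command; the container name, user, and database are assumptions based on common Compose defaults, so adjust them to match your stack:

```bash
# Run psql inside the running Postgres container and list the
# sample rows.
docker exec -it sequin-postgres-1 \
  psql -U postgres -d sequin_playground \
  -c "SELECT * FROM products;"
```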
This command connects to the running Postgres container and runs a `psql` command. You should see a list of the rows in the `products` table. We’ll make modifications to this table in a bit.

Start Elasticsearch & Kibana
We’ll run Elasticsearch locally with Docker using Elastic’s start-local helper script.
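To run it, Elastic publishes a one-line installer (as always, review a script before piping it to your shell):

```bash
# Download and run Elastic's start-local helper. It prints the
# generated credentials, including an API key, when it finishes.
curl -fsSL https://elastic.co/start-local | sh
```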
The script:

- Downloads the Elasticsearch & Kibana images
- Generates credentials
- Starts both services via docker-compose

Copy the API key and API endpoint URL – you’ll need them when configuring the sink.
Create an index
Next, create the `products` index that will receive documents. Make sure to replace `<api-key>` with the API key you copied earlier.
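A sketch of the request, assuming Elasticsearch is listening on the default local port:

```bash
# Create the products index. Substitute <api-key> with the key
# generated by the start-local script.
curl -X PUT "http://localhost:9200/products" \
  -H "Authorization: ApiKey <api-key>"
```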
You should receive an acknowledgment from Elasticsearch with `"acknowledged": true` in the response.

Create an Elasticsearch sink
With the playground database connected and the index created, you’re ready to add a sink that pushes changes to Elasticsearch.
1. Head back to the Sequin console and navigate to the Sinks tab
Click Sinks in the sidebar, then Create Sink.
2. Select sink type
Choose Elasticsearch and click Continue.
3. Verify source configuration
In the Source card you’ll see the `sequin_playground` database and the `products` table pre-selected. Leave the defaults.
4. Add a transform
Open the Transform card, click + Create new transform, and use the following Elixir function as a Transform function:
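A minimal pass-through works for this quickstart; this sketch assumes Sequin’s `transform(action, record, changes, metadata)` function signature:

```elixir
# Return the record unchanged so the whole row becomes the
# Elasticsearch document. Unused arguments are underscored.
def transform(_action, record, _changes, _metadata) do
  record
end
```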
Name the transform `products-elasticsearch` and click Create transform.

5. Select the transform
Navigate back to the Sinks tab and select the transform you just created.
If you don’t see the transform you just created, click the refresh button.
6. Configure a backfill
Open Initial backfill and choose Backfill all rows so the existing data is loaded into Elasticsearch as soon as the sink is created.
7. Configure Elasticsearch
In the Elasticsearch card, enter:

- Endpoint URL: `http://host.docker.internal:9200`
- Index name: `products`
- Authentication type: `api_key`
- Authentication value: `<api-key>` (copied earlier)

8. Create the sink
Give it a name, e.g. `products-elasticsearch`, and click Create Sink. Sequin will first backfill all rows from the `products` table, then stream every change in real-time.

Query your data in Elasticsearch
Your backfill should load all rows from the `products` table into Elasticsearch. When it completes, the sink health should be green and the backfill card should display “Processed 6 and ingested 6 records in 1s”. You can now query your data in Elasticsearch:
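For example, with a match-all query (same endpoint and API-key assumptions as before):

```bash
# Search the products index; expect the six backfilled documents.
curl -X GET "http://localhost:9200/products/_search" \
  -H "Authorization: ApiKey <api-key>" \
  -H "Content-Type: application/json" \
  -d '{"query": {"match_all": {}}}'
```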
You should see the documents from your Postgres table.

See changes flow to Elasticsearch
Let’s test live updates:
1. Insert a product
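A sketch, reusing the container assumptions from earlier; the column names are guesses at the sample schema, so check them against your earlier `SELECT *` output:

```bash
# Insert a row; Sequin streams the change to Elasticsearch.
docker exec -it sequin-postgres-1 \
  psql -U postgres -d sequin_playground \
  -c "INSERT INTO products (name, price) VALUES ('Pocket Blender', 29.99);"
```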
Search for the new product by re-running the `_search` request above; the new row should appear as a document.

2. Update a product's price
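Under the same assumptions:

```bash
# Update a price; the re-indexed document replaces the old one.
docker exec -it sequin-postgres-1 \
  psql -U postgres -d sequin_playground \
  -c "UPDATE products SET price = 42.00 WHERE name = 'Pocket Blender';"
```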
Change a product's name
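A sketch, again under the same container and schema assumptions:

```bash
# Rename a product; the document's name field updates in Elasticsearch.
docker exec -it sequin-postgres-1 \
  psql -U postgres -d sequin_playground \
  -c "UPDATE products SET name = 'Pocket Blender Pro' WHERE name = 'Pocket Blender';"
```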
Delete a product
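And to remove the row entirely:

```bash
# Delete a row; the corresponding document is removed from the index.
docker exec -it sequin-postgres-1 \
  psql -U postgres -d sequin_playground \
  -c "DELETE FROM products WHERE name = 'Pocket Blender Pro';"
```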
3. Each change appears (or disappears) in Elasticsearch within a few seconds.
Great work!
- Started Elasticsearch + Kibana locally
- Created an index
- Loaded existing data via backfill
- Streamed live changes
- Queried Elasticsearch