This guide shows you how to set up Postgres change data capture (CDC) and stream changes to Meilisearch using Sequin.

With Postgres data streaming to Meilisearch, you can build real-time search experiences, maintain up-to-date search indexes, and provide fast, typo-tolerant search for your users.

By the end of this how-to, you’ll have a Meilisearch index that stays in sync with your Postgres database.

Prerequisites

If you’re self-hosting Sequin, you’ll need:

  1. Sequin installed
  2. A database connected

If you’re using Sequin Cloud, you’ll need:

  1. A Sequin Cloud account
  2. A database connected

Basic setup

Prepare your Meilisearch instance

Sequin converts Postgres changes and rows into JSON documents and sends them to your Meilisearch index.

You’ll need to have a Meilisearch instance running and accessible to Sequin. You can run Meilisearch locally for development or use a hosted version.

If you’re using Sequin Cloud and developing locally, you can use the Sequin CLI’s tunnel command to connect Sequin’s servers to a local Meilisearch instance.

Mapping Postgres tables to Meilisearch indexes

From the Meilisearch docs:

“In Meilisearch, each index contains documents with the same structure.”

Sequin recommends creating one sink per Postgres table. Each sink will stream changes from a single Postgres table to a single Meilisearch index. This ensures schema consistency across indexed documents.

For large-scale applications, you may consider sharding indexes—e.g., sharding a users index by region into users_us, users_eu, users_asia, etc.

You can use Sequin’s filters to accomplish this. For example, to shard users by region, create a sink per region with filters to route the correct rows to the corresponding Meilisearch index.
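
Depending on your Sequin version, you can configure these filters as column filters in the console or as a filter function written in code. As a hedged sketch, assuming a region column on the users table and a filter function with the same argument shape as the transforms shown later in this guide, the sink feeding users_eu would keep only EU rows:

def filter(action, record, changes, metadata) do
  # Return true to deliver this row to the users_eu sink, false to skip it.
  record["region"] == "eu"
end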

Sequin will soon support Routing functions for dynamically routing rows from one table to multiple Meilisearch indexes within a single sink.

Create Meilisearch sink

Navigate to the “Sinks” tab, click “Create Sink”, and select “Meilisearch Sink”.

Configure the source

1. Select source table or schema

Under “Source”, select the table or schema you want to stream data from.

2. Specify filters

Choose whether to include insert, update, and/or delete actions. To keep your index fully in sync with Postgres, select all three.

For example, if you uncheck deletes, rows deleted from Postgres won’t be removed from the Meilisearch index.

You can also use filters to only include certain rows—e.g., WHERE in_stock = true for product availability or for sharding by tenant.

3. Specify backfill

You can trigger a backfill to populate your Meilisearch index with the existing rows in your table. A backfill is optional at sink creation; you can also run one later.

4. Transforms

Meilisearch documents need:

  1. A unique document primary key (string or integer).
  2. Searchable and filterable fields.

Use a function transform to convert Postgres rows into Meilisearch documents:

def transform(action, record, changes, metadata) do
  # Pick the columns to index; "id" is used as the Meilisearch primary key.
  %{
    id: record["id"],
    name: record["name"],
    description: record["description"],
    price: record["price"],
    in_stock: record["in_stock"]
  }
end

For dynamic schemas or minimal transformation:

def transform(action, record, changes, metadata) do
  # Pass the row through unchanged; every Postgres column becomes a document field.
  record
end

Or explicitly assign the id:

def transform(action, record, changes, metadata) do
  # Copy product_id into "id" so Meilisearch has a primary key field.
  Map.put(record, "id", record["product_id"])
end

5. Specify message grouping

Keep the default setting to ensure updates to the same row are processed in order. This helps maintain consistent state in Meilisearch.

Configure delivery

1. Specify batch size

The right value for “Batch size” depends on your data and your requirements.

Meilisearch supports bulk document addition via the /documents endpoint. A good starting point is 100–1000 documents per batch. Very large batches (10k+) may impact memory or throughput depending on your instance size.
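
For reference, a batch write to Meilisearch is a single JSON array posted to the index’s /documents endpoint. Sequin sends these requests for you; the sketch below, assuming the Req HTTP client, a local instance, and a placeholder index name and API key, only illustrates what one batch looks like:

# batch_sketch.exs (illustration only)
Mix.install([{:req, "~> 0.5"}])

# Two example documents, shaped like the output of the transform above.
docs = [
  %{id: 1, name: "Keyboard", in_stock: true},
  %{id: 2, name: "Mouse", in_stock: false}
]

# POST /indexes/{index_uid}/documents adds or replaces documents in bulk.
Req.post!(
  "http://localhost:7700/indexes/products/documents",
  json: docs,
  headers: [{"authorization", "Bearer <your-meilisearch-api-key>"}]
)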

2. Select the Meilisearch index you created

Under “Meilisearch Index”, select the index you want to stream data to.

3. Create the sink

Give your sink a name, then click “Create Meilisearch Sink”.

Verify & debug

To verify that your Meilisearch sink is working:

  1. Make some changes in your source table.
  2. Verify that the count of messages for your sink increases in the Sequin web console.
  3. Search your Meilisearch index to confirm the document changes, as in the sketch below. (You can use meilisearch-ui as a simple UI for browsing your Meilisearch indexes and tasks.)
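
You can also query the search endpoint directly to confirm documents are landing. A minimal sketch, assuming a local Meilisearch instance, the Req HTTP client, and an index named "products"; adjust the host, index, API key, and query for your setup:

# search_check.exs
Mix.install([{:req, "~> 0.5"}])

resp =
  Req.post!(
    "http://localhost:7700/indexes/products/search",
    json: %{q: "keyboard"},
    headers: [{"authorization", "Bearer <your-meilisearch-api-key>"}]
  )

# Each hit is one document synced from Postgres.
IO.inspect(resp.body["hits"])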

If documents don’t seem to be flowing:

  1. Click the “Messages” tab to view the state of messages for your sink.
  2. Click any failed message.
  3. Check the delivery logs for error details, including the response from Meilisearch.

Next steps

Assuming you’ve followed the steps above in your local environment, “How to deploy to production” will show you how to take your implementation to production.