Provision a Postgres user for Sequin
In development, it's usually fine to supply Sequin with an existing user. In production, however, you should create a dedicated user for Sequin. The user needs the following permissions:
- `connect` permission on the database
- `select` permission on all the tables you want to connect to Sequin
- `replication` permission to read from replication slots
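As a minimal sketch of provisioning such a user: the user name `sequin`, the placeholder password, and the `public` schema below are assumptions, so adjust them to your setup:

```sql
-- Create a dedicated user for Sequin with the replication attribute.
create user sequin with replication password 'use-a-strong-password-here';

-- Allow the user to connect to your database (replace your_database).
grant connect on database your_database to sequin;

-- Allow the user to read the tables you want to connect to Sequin.
grant usage on schema public to sequin;
grant select on all tables in schema public to sequin;
```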
Enable logical replication on GCP Cloud SQL
By default, logical replication is not enabled on GCP Cloud SQL. You can double-check whether it's enabled by connecting to your database and running the command below. If the result is `off`, then logical decoding, and therefore logical replication, is not enabled.
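A minimal check, assuming the `cloudsql.logical_decoding` setting is what gates logical decoding on your instance:

```sql
-- Returns 'on' if logical decoding is enabled, 'off' otherwise.
show cloudsql.logical_decoding;
```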
To enable it, follow these steps:
Enable pglogical flag
In the Google Cloud Console, navigate to your Cloud SQL instance and follow these steps:
- Go to the “Configuration” tab for your database instance
- Click “Edit configuration”
- Under the “Flags” section, add a new flag: `cloudsql.enable_pglogical=on`
- Click “Save” to apply the changes
Connect Sequin to your GCP Cloud SQL database
After enabling logical replication on GCP Cloud SQL, you can now connect your database to Sequin.

Enter connection details in Sequin
In the Sequin Console, click on the Connect Database button and enter the credentials for your GCP Cloud SQL database:
You can find these connection details in your GCP Cloud SQL instance overview.
- Host: Your Cloud SQL instance IP address
- Port: 5432 (default Postgres port)
- Database: Your database name
- Username: The `sequin` database user you created earlier
- Password: The password for your `sequin` database user
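Before saving, you can optionally verify the credentials yourself. A quick sanity check, assuming you connect with the `sequin` user created earlier:

```sql
-- Confirm who you're connected as and to which database.
select current_user, current_database();

-- Confirm the user has the replication attribute needed to read replication slots.
select rolreplication from pg_roles where rolname = current_user;
```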
Create a publication
Connect to your database using the SQL client of your choice and execute a SQL statement like the one below to create a publication for the tables you want to stream. If you want to publish changes from all tables, you can use `FOR ALL TABLES` instead, as shown in the commented alternative.
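A minimal sketch; the publication name `sequin_pub` and the table names are assumptions, so substitute your own:

```sql
-- Create a publication covering specific tables (replace with your table names).
create publication sequin_pub for table public.orders, public.customers;

-- Alternatively, publish changes from all tables:
-- create publication sequin_pub for all tables;
```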
Create a sink
With your GCP Cloud SQL database connected to Sequin, you are ready to create a sink. Follow one of our guides below to get started:

Stream to GCP Pub/Sub
Send changes to GCP Pub/Sub topics to trigger Cloud Functions and power event-driven architectures
Stream to Webhooks
Send database changes to your HTTP endpoints to trigger workflows and keep services in sync
Stream to Redis
Stream changes to Redis Streams for real-time data processing and caching
Stream to Kafka
Publish database changes to Kafka topics for event streaming and processing

