Postgres Changes

Listen to Postgres changes using Supabase Realtime.

Let's explore how to use Realtime's Postgres Changes feature to listen to database events.

Quick start#

In this example we'll set up a database table, secure it with Row Level Security, and subscribe to all changes using the Supabase client libraries.

1

Set up a Supabase project with a 'todos' table

Create a new project in the Supabase Dashboard.

After your project is ready, create a table in your Supabase database. You can do this with either the Table interface or the SQL Editor.


-- Create a table called "todos"
-- with a column to store tasks.
create table todos (
  id serial primary key,
  task text
);

2

Allow anonymous access

In this example we'll turn on Row Level Security for this table and allow anonymous access. In production, be sure to secure your application with the appropriate permissions.


-- Turn on security
alter table "todos"
enable row level security;

-- Allow anonymous access
create policy "Allow anonymous access"
on todos
for select
to anon
using (true);

3

Enable Postgres replication

Go to your project's Replication settings, and under supabase_realtime, toggle on the tables you want to listen to.
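
If you prefer SQL, adding the table to the supabase_realtime publication directly should have the same effect:

-- Add the todos table to the publication Realtime listens to
alter publication supabase_realtime add table todos;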

4

Install the client

Install the Supabase JavaScript client.


npm install @supabase/supabase-js

5

Create the client

This client will be used to listen to Postgres changes.


import { createClient } from '@supabase/supabase-js'

const client = createClient(
  'https://<project>.supabase.co',
  '<your-anon-key>'
)

6

Listen to changes by schema

Listen to changes on all tables in the public schema by setting the schema property to 'public' and event name to *. The event name can be one of:

  • INSERT
  • UPDATE
  • DELETE
  • *

The channel name can be any string except 'realtime'.


const channelA = client
  .channel('schema-db-changes')
  .on(
    'postgres_changes',
    {
      event: '*',
      schema: 'public',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

7

Insert dummy data

Now we can add some data to our table, which will trigger the channelA event handler.


insert into todos (task)
values
  ('Change!');

Usage#

You can use the Supabase client libraries to subscribe to database changes.

Listening to specific schemas#

Subscribe to specific schema events using the schema parameter:


const changes = client
  .channel('schema-db-changes')
  .on(
    'postgres_changes',
    {
      schema: 'public', // Subscribes to the "public" schema in Postgres
      event: '*', // Listen to all changes
    },
    (payload) => console.log(payload)
  )
  .subscribe()

The channel name can be any string except 'realtime'.

Listening to INSERT events#

Use the event parameter to listen only to database INSERTs:


const changes = client
  .channel('schema-db-changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT', // Listen only to INSERTs
      schema: 'public',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

The channel name can be any string except 'realtime'.

Listening to UPDATE events#

Use the event parameter to listen only to database UPDATEs:


const changes = client
  .channel('schema-db-changes')
  .on(
    'postgres_changes',
    {
      event: 'UPDATE', // Listen only to UPDATEs
      schema: 'public',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

The channel name can be any string except 'realtime'.

Listening to DELETE events#

Use the event parameter to listen only to database DELETEs:


const changes = client
  .channel('schema-db-changes')
  .on(
    'postgres_changes',
    {
      event: 'DELETE', // Listen only to DELETEs
      schema: 'public',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

The channel name can be any string except 'realtime'.

Listening to specific tables#

Subscribe to specific table events using the table parameter:


const changes = client
  .channel('table-db-changes')
  .on(
    'postgres_changes',
    {
      event: '*',
      schema: 'public',
      table: 'todos',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

The channel name can be any string except 'realtime'.

Listening to multiple changes#

To listen to different events and schema/table/filter combinations on the same channel:


const channel = supabase
  .channel('db-changes')
  .on(
    'postgres_changes',
    {
      event: '*',
      schema: 'public',
      table: 'messages',
    },
    (payload) => console.log(payload)
  )
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'users',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

Filtering for specific changes#

Use the filter parameter for granular changes:


const changes = client
  .channel('table-filter-changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'todos',
      filter: 'id=eq.1',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

Available filters#

Realtime offers filters so you can specify the data your client receives at a more granular level.

Equal to (eq)#

To listen to changes when a column's value in a table equals a client-specified value:


const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'UPDATE',
      schema: 'public',
      table: 'messages',
      filter: 'body=eq.hey',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

This filter uses Postgres's = filter.

Not equal to (neq)#

To listen to changes when a column's value in a table does not equal a client-specified value:


const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'messages',
      filter: 'body=neq.bye',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

This filter uses Postgres's != filter.

Less than (lt)#

To listen to changes when a column's value in a table is less than a client-specified value:


const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'messages',
      filter: 'id=lt.100',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

This filter uses Postgres's < filter, so it works for non-numeric types. Make sure to check the expected behavior of the compared data's type.

Less than or equal to (lte)#

To listen to changes when a column's value in a table is less than or equal to a client-specified value:


const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'UPDATE',
      schema: 'public',
      table: 'profiles',
      filter: 'age=lte.65',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

This filter uses Postgres' <= filter, so it works for non-numeric types. Make sure to check the expected behavior of the compared data's type.

Greater than (gt)#

To listen to changes when a column's value in a table is greater than a client-specified value:


const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'products',
      filter: 'quantity=gt.10',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

This filter uses Postgres's > filter, so it works for non-numeric types. Make sure to check the expected behavior of the compared data's type.

Greater than or equal to (gte)#

To listen to changes when a column's value in a table is greater than or equal to a client-specified value:


const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'products',
      filter: 'quantity=gte.10',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

This filter uses Postgres's >= filter, so it works for non-numeric types. Make sure to check the expected behavior of the compared data's type.

Contained in list (in)#

To listen to changes when a column's value in a table equals any of the client-specified values:


const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'colors',
      filter: 'name=in.(red, blue, yellow)',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

This filter uses Postgres's = ANY. Realtime allows a maximum of 100 values for this filter.

Receiving old records#

By default, only new record changes are sent. If you want to receive the old record (its previous values) whenever you UPDATE or DELETE a record, set the replica identity of your table to full:


alter table messages replica identity full;

caution

RLS policies are not applied to DELETE statements, because there is no way for Postgres to verify that a user has access to a deleted record. When RLS is enabled and replica identity is set to full on a table, the old record contains only the primary key(s).
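
For illustration, a DELETE payload received by a supabase-js handler would then look roughly like this (values are made up and only the most relevant fields are shown):

// With RLS enabled and replica identity set to full,
// "old" carries only the primary key of the deleted row.
{
  schema: 'public',
  table: 'messages',
  eventType: 'DELETE',
  new: {},
  old: { id: 42 },
  errors: null
}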

Private schemas#

Postgres Changes works out of the box for tables in the public schema. You can listen to tables in your private schemas by granting table SELECT permissions to the database role found in your access token. You can run a query similar to the following:


grant select on "non_private_schema"."some_table" to authenticated;

caution

We strongly encourage you to enable RLS and create policies for tables in private schemas. Otherwise, any role you grant access to will have unfettered read access to the table.
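
For example, a minimal policy sketch for a table in a private schema (the private_schema, some_table, and user_id names below are placeholders for your own objects):

-- Enable RLS on the private-schema table
alter table private_schema.some_table enable row level security;

-- Only let authenticated users read their own rows
create policy "Users can read own rows"
on private_schema.some_table
for select
to authenticated
using ((select auth.uid()) = user_id);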

Custom tokens#

You may choose to sign your own tokens to customize claims that can be checked in your RLS policies.

Your project JWT secret is found with your Project API keys in your dashboard.
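
As a sketch, a custom token could be signed server-side with the jsonwebtoken package, assuming the project JWT secret is available in a SUPABASE_JWT_SECRET environment variable:

const jwt = require('jsonwebtoken')

// Sign a short-lived token; add any custom claims your RLS policies check.
const customJwt = jwt.sign(
  {
    role: 'authenticated',
    exp: Math.floor(Date.now() / 1000) + 60 * 60, // expires in one hour
  },
  process.env.SUPABASE_JWT_SECRET
)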

caution

Do not expose the service_role token on the client because the role is authorized to bypass row-level security.

To use your own JWT with Realtime, make sure to set the token after instantiating the Supabase client and before connecting to a Channel.


const { createClient } = require('@supabase/supabase-js')

const supabase = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_KEY, {})

// Set your custom JWT here
supabase.realtime.setAuth('your-custom-jwt')

const channel = supabase
  .channel('db-changes')
  .on(
    'postgres_changes',
    {
      event: '*',
      schema: 'public',
      table: 'messages',
      filter: 'body=eq.bye',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

Refreshed tokens#

You will need to refresh tokens on your own, but once generated, you can pass them to Realtime.

For example, if you're using the supabase-js v2 client then you can pass your token like this:


// Client setup

supabase.realtime.setAuth('fresh-token')
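
For example, a minimal refresh loop might look like the sketch below, where signRealtimeToken() is a placeholder for however you mint fresh tokens:

// Push a fresh token to Realtime before the current one expires.
setInterval(async () => {
  const freshToken = await signRealtimeToken() // your own signing logic
  supabase.realtime.setAuth(freshToken)
}, 30 * 60 * 1000) // e.g. every 30 minutes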

Limitations#

Realtime systems usually require forethought because of their scaling dynamics. For the Postgres Changes feature, every change event must be checked to see if the subscribed user has access. For instance, if you have 100 users subscribed to a table where you make a single insert, it will then trigger 100 "reads": one for each user.

This can create a database bottleneck that limits message throughput. If your database cannot authorize the changes rapidly enough, the changes are delayed until you receive a timeout.

If you are using Postgres Changes at scale, you should consider using a separate "public" table without RLS and filters. Alternatively, you can use Realtime server-side only and re-stream the changes to your clients using Realtime Broadcast, as sketched below.
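
As a rough sketch of the re-streaming approach, a single server-side process could subscribe to Postgres Changes and fan the events out over Broadcast (the channel and event names here are illustrative):

// Server-side: a Broadcast channel that clients subscribe to.
const outbound = supabase.channel('todos-broadcast')
outbound.subscribe()

// One database subscription, re-streamed to all clients as Broadcast messages.
supabase
  .channel('server-db-changes')
  .on(
    'postgres_changes',
    { event: '*', schema: 'public', table: 'todos' },
    (payload) => {
      outbound.send({
        type: 'broadcast',
        event: 'todos-change',
        payload,
      })
    }
  )
  .subscribe()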

From our observations, we recommend the following limits depending on your database size:

Micro to medium#

| Filters | RLS Usage | Concurrent Clients | Records per second per client | Messages per second (total) | Latency p95 (ms) |
| --- | --- | --- | --- | --- | --- |
| 🚫 | 🚫 | 500 | 10 | 5,000 | 257 |
| 🚫 | 🚫 | 1,000 | 10 | 10,000 | 800 |
| 🚫 | 🚫 | 5,000 | 2 | 10,000 | 1,120 |

Large and above#

| Filters | RLS Usage | Concurrent Clients | Records per second per client | Messages per second (total) | Latency p95 (ms) |
| --- | --- | --- | --- | --- | --- |
| 🚫 | 🚫 | 1,000 | 10 | 10,000 | 261 |
| 🚫 | 🚫 | 5,000 | 2 | 10,000 | 952 |
| 🚫 | 🚫 | 10,000 | 1 | 10,000 | 949 |
| 🚫 | 🚫 | 200,000 | 0.04 (1 in 25 seconds) | 8,000 | 15,709 |
| ✅ | ✅ | 500 | 2 | 1,000 | 262 |
| ✅ | ✅ | 1,000 | 1 | 1,000 | 431 |
| ✅ | ✅ | 5,000 | 0.2 (1 in 5 seconds) | 1,000 | 702 |

Don't forget to run your own benchmarks to make sure that the performance is acceptable for your use case.

We are making many improvements to Realtime's Postgres Changes. If you are uncertain about the performance of your use case, please reach out using the Support Form and we will be happy to help you. We have a team of engineers who can advise you on the best solution for your use case.