Written with Claude
IMPORTANT

As you may notice, this page and pretty much the entire website were obviously created with the help of AI. I wonder how you could tell? Was it the big "Written With Claude" badge on every page? I moved it to the top now (with the help of AI, of course) to make it even more obvious. There are a few blog posts that were written by me manually, the old-fashioned way (I hope there will be more in the future), and those carry a similar "Human Written" badge. This project (not the website), on the other hand, is a very, very different story. It took me more than two years of painstaking and unpaid work in my own free time. A story that, hopefully, I will tell someday. But meanwhile, what would you like me to do? Create a complex documentation website with a bunch of highly technical articles with the help of AI and then fake it, to give you the illusion that I did that manually too? Like half of the internet is doing at this point? How does that make any sense? Is that even fair to you? Or maybe create this website manually, the old-fashioned way, just for you? While working a paid job? For a salary most of you wouldn't even get up in the morning for? Would you like me to sing you a song while we're at it? For your personal entertainment? Seriously, get a grip. Do you find this information less valuable because of the way this website was created? I do my best to keep the information as accurate as possible, and I think it is very accurate at this point. If you find mistakes, inaccuracies, or problems, there is a comment section at the bottom of every page, which I also made with the help of AI, and I would very much appreciate it if you left your feedback there. Look, I'm just a guy who likes SQL, that's all. If you don't approve of how this website was constructed and the use of AI tools, I suggest closing this page and never coming back. And good riddance. I would ban your access if I knew how. Thank you for your attention to this matter.

Zero to CRUD API: Auto-Generate REST Endpoints from PostgreSQL Tables

CRUD · No SQL Required · TypeScript · Rapid Development · January 2026


What if you could build a complete REST API with Create, Read, Update, and Delete operations without writing a single line of backend code? With NpgsqlRest's CRUD auto-generation, you define a table, add a comment, and you're done. This post walks through building a fully functional Contacts Manager application in under 50 lines of SQL.

The Traditional Way vs. The NpgsqlRest Way

Building a typical CRUD API requires:

| Traditional Stack | Lines of Code |
|---|---|
| Database schema | ~20 |
| ORM models | ~50 |
| Repository layer | ~100 |
| Service layer | ~150 |
| Controller/routes | ~200 |
| Validation logic | ~100 |
| TypeScript types | ~50 |
| API documentation | ~100 |
| Total | ~770 lines |

With NpgsqlRest:

| NpgsqlRest | Lines of Code |
|---|---|
| Database schema | ~15 |
| API annotation comment | ~9 |
| Config JSON | ~20 |
| TypeScript types | Auto-generated |
| API documentation | Auto-generated |
| Total | ~44 lines |

That's a 95% reduction in code. And the code you don't write has no bugs.

How It Works

NpgsqlRest reads your PostgreSQL table structure and automatically generates REST endpoints based on:

  1. Table columns become request/response fields
  2. Primary keys enable update and delete operations
  3. Column types become TypeScript types
  4. Table comments control which operations are exposed
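
The third point, column types becoming TypeScript types, can be sketched with a small mapping function. This is entirely hypothetical; the generator's real mapping lives inside NpgsqlRest and may differ in its details:

```typescript
// Hypothetical sketch: how PostgreSQL column types could translate to
// TypeScript types. Illustrative only, not the generator's actual table.
function pgToTsType(pgType: string): string {
    switch (pgType) {
        case "int2":
        case "int4":
        case "serial":
        case "float8":
        case "numeric":
            return "number";
        case "text":
        case "varchar":
        case "uuid":
        case "timestamp": // timestamps typically surface as ISO strings
            return "string";
        case "bool":
            return "boolean";
        default:
            return "unknown"; // anything unmapped stays opaque
    }
}
```

With a mapping like this, `id serial` becomes `id: number` and `created_at timestamp` becomes `createdAt: string` in the generated interfaces.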

The magic is in one SQL comment:

sql
comment on table contacts is 'HTTP';

This single line creates:

| Method | Endpoint | Operation |
|---|---|---|
| GET | /api/contacts/ | Select all rows |
| GET | /api/contacts/?id=1 | Filter by any column |
| PUT | /api/contacts/ | Insert new row |
| PUT | /api/contacts/returning/ | Insert and return data |
| POST | /api/contacts/ | Update existing row |
| DELETE | /api/contacts/?id=1 | Delete row |

Plus additional endpoints for conflict handling, returning variants, and more.

Project Structure

Source Code: The complete working example is available at github.com/NpgsqlRest/npgsqlrest-docs/examples/11_crud_auto_api

code
11_crud_auto_api/
├── sql_11/
│   ├── V1__example_11_schema.sql         # Tables and sample data
│   ├── R__example_11_contacts_api.sql    # API annotations (repeatable)
│   ├── R__example_11_login.sql           # Authentication function
│   └── R__example_11_logout.sql          # Logout function
├── src/
│   ├── example11Api.ts                   # Auto-generated by NpgsqlRest
│   └── app.ts                            # Frontend application
├── public/
│   └── index.html                        # HTML entry point
└── config.json                           # NpgsqlRest configuration

Step 1: Define Your Table (15 Lines)

The schema is simple - just define your table structure:

sql
-- V1__example_11_schema.sql

drop schema if exists example_11 cascade;
create schema example_11;

-- A simple contacts table
create table example_11.contacts (
    id serial primary key,
    name text not null,
    email text,
    phone text,
    created_at timestamp default now()
);

-- Sample data
insert into example_11.contacts (name, email, phone) values
    ('Alice Johnson', 'alice@example.com', '555-0101'),
    ('Bob Smith', 'bob@example.com', '555-0102'),
    ('Carol White', 'carol@example.com', '555-0103');

That's it. No ORM mappings. No repository interfaces. No service classes. Just a table.

Step 2: Enable the API (9 Lines)

In a separate repeatable migration, add the API annotations:

sql
-- R__example_11_contacts_api.sql

-- Enable only the CRUD endpoints we need
-- disabled = turn off all endpoints by default
-- enabled = turn on only what we use
comment on table example_11.contacts is '
HTTP
@disabled
@enabled select, insert_returning, update, delete
@authorize
';

Let's break down each line:

| Annotation | Effect |
|---|---|
| HTTP | Enable CRUD endpoint generation for this table |
| @disabled | Start with all operations disabled |
| @enabled select | Enable GET (list/filter contacts) |
| @enabled insert_returning | Enable PUT with response (create contact, return created data) |
| @enabled update | Enable POST (update contact) |
| @enabled delete | Enable DELETE (remove contact) |
| @authorize | Require authentication for all operations |

This fine-grained control means you only expose what you need. No accidental data leaks.

Step 3: Configure NpgsqlRest (20 Lines)

json
{
  "ApplicationName": "11_crud_auto_api",

  "StaticFiles": {
    "RootPath": "./11_crud_auto_api/public"
  },

  "Auth": {
    "CookieAuth": true,
    "CookieAuthScheme": "cookies",
    "CookieValidDays": 1,
    "CookieName": "example_11_auth"
  },

  "NpgsqlRest": {
    "IncludeSchemas": [ "example_11" ],
    "RequiresAuthorization": true,

    "CrudSource": {
      "Enabled": true,
      "CommentsMode": "OnlyWithHttpTag",
      "CrudTypes": ["All"]
    },

    "ClientCodeGen": {
      "FilePath": "./11_crud_auto_api/src/{0}Api.ts"
    }
  }
}

Key settings:

  • CrudSource.Enabled: true - Turn on automatic CRUD generation
  • CommentsMode: "OnlyWithHttpTag" - Only create endpoints for tables with HTTP comment
  • ClientCodeGen.FilePath - Auto-generate TypeScript client

The Generated API

When NpgsqlRest starts, it reads the table structure and generates these endpoints:

GET - List and Filter

bash
# Get all contacts
curl http://localhost:8080/api/example-11/contacts/

# Filter by any column
curl "http://localhost:8080/api/example-11/contacts/?name=Alice%20Johnson"
curl "http://localhost:8080/api/example-11/contacts/?email=bob@example.com"
curl "http://localhost:8080/api/example-11/contacts/?id=1"

Response:

json
[
  {
    "id": 1,
    "name": "Alice Johnson",
    "email": "alice@example.com",
    "phone": "555-0101",
    "createdAt": "2026-01-07T12:00:00"
  }
]

PUT - Create

bash
curl -X PUT http://localhost:8080/api/example-11/contacts/returning/ \
  -H "Content-Type: application/json" \
  -d '{"name": "David Brown", "email": "david@example.com", "phone": "555-0104"}'

Response (the created contact with auto-generated ID):

json
[
  {
    "id": 4,
    "name": "David Brown",
    "email": "david@example.com",
    "phone": "555-0104",
    "createdAt": "2026-01-07T12:30:00"
  }
]

POST - Update

bash
curl -X POST http://localhost:8080/api/example-11/contacts/ \
  -H "Content-Type: application/json" \
  -d '{"id": 1, "name": "Alice Smith", "phone": "555-9999"}'

The primary key (id) identifies which row to update. Only provided fields are modified.

DELETE - Remove

bash
curl -X DELETE "http://localhost:8080/api/example-11/contacts/?id=1"

The Generated TypeScript Client

NpgsqlRest automatically generates a fully-typed TypeScript client:

typescript
// example11Api.ts (auto-generated)

interface IContactsGetRequest {
    id?: number | null;
    name?: string | null;
    email?: string | null;
    phone?: string | null;
    createdAt?: string | null;
}

interface IContactsGetResponse {
    id: number | null;
    name: string | null;
    email: string | null;
    phone: string | null;
    createdAt: string | null;
}

export async function contactsGet(
    request: IContactsGetRequest
) : Promise<{
    status: number,
    response: IContactsGetResponse[],
    error: {status: number; title: string; detail?: string | null} | undefined
}> {
    const response = await fetch(baseUrl + "/api/example-11/contacts" + parseQuery(request), {
        method: "GET",
        headers: { "Content-Type": "application/json" },
    });
    return {
        status: response.status,
        response: response.ok ? await response.json() : undefined!,
        error: !response.ok ? await response.json() : undefined
    };
}

export async function contactsPut(request: IContactsPutRequest) { /* ... */ }
export async function contactsPost(request: IContactsPostRequest) { /* ... */ }
export async function contactsDelete(request: IContactsDeleteRequest) { /* ... */ }

Every column becomes a typed property. Every endpoint becomes a typed function. If you rename a column in PostgreSQL, the TypeScript build fails - catching errors before production.
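
The generated `contactsGet` above leans on a small `parseQuery` helper to turn the request object into a query string. As an illustration (this is a sketch, not the generator's actual implementation), such a helper could look like:

```typescript
// Sketch of a parseQuery-style helper: request object in, query string out.
// Null/undefined fields are skipped, matching the all-optional request interface.
function parseQuery(request: Record<string, unknown>): string {
    const parts: string[] = [];
    for (const [key, value] of Object.entries(request)) {
        if (value === null || value === undefined) {
            continue; // an omitted filter simply doesn't constrain the SELECT
        }
        parts.push(`${encodeURIComponent(key)}=${encodeURIComponent(String(value))}`);
    }
    return parts.length > 0 ? "?" + parts.join("&") : "";
}

console.log(parseQuery({ id: 1, name: "Alice Johnson", email: null }));
// → "?id=1&name=Alice%20Johnson"
```

Passing an empty object yields an empty string, which is why `contactsGet({})` fetches all rows.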

The Frontend Application

Using the generated client, the entire frontend is straightforward:

typescript
// app.ts
import {
    login,
    logout,
    contactsGet,
    contactsPut,      // INSERT with returning
    contactsPost,     // UPDATE
    contactsDelete    // DELETE
} from "./example11Api.ts";

// GET all contacts
async function loadContacts() {
    const result = await contactsGet({});
    if (result.status === 200) {
        renderContacts(result.response);
    }
}

// PUT - Create new contact
async function createContact(name: string, email: string, phone: string) {
    const result = await contactsPut({
        name,
        email: email || null,
        phone: phone || null
    });

    if (result.status === 200 && result.response.length > 0) {
        showMessage(`Contact "${result.response[0].name}" created!`);
        loadContacts();
    }
}

// POST - Update contact
async function updateContact(id: number, name: string, email: string, phone: string) {
    const result = await contactsPost({
        id,
        name,
        email: email || null,
        phone: phone || null
    });

    if (result.status === 204) {
        showMessage("Contact updated!");
        loadContacts();
    }
}

// DELETE - Delete contact
async function deleteContact(id: number) {
    const result = await contactsDelete({ id });

    if (result.status === 204) {
        showMessage("Contact deleted!");
        loadContacts();
    }
}

No manual fetch calls with string URLs. No manual JSON parsing. No manual type definitions. The generated client handles everything.
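
For contrast, here is roughly what you would hand-write for just one of those calls without the generated client. This is a sketch; the URL mirrors the curl examples above, and the `NewContact` type and `buildCreateRequest` helper are illustrative names, not part of NpgsqlRest:

```typescript
// What the generated client replaces: hand-building a single create request.
interface NewContact {
    name: string;
    email?: string | null;
    phone?: string | null;
}

function buildCreateRequest(contact: NewContact): {
    url: string;
    init: { method: string; headers: Record<string, string>; body: string };
} {
    return {
        url: "/api/example-11/contacts/returning/",
        init: {
            method: "PUT", // insert_returning maps to PUT .../returning/
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify(contact),
        },
    };
}

const req = buildCreateRequest({ name: "David Brown", email: "david@example.com" });
console.log(req.init.method); // PUT
```

Multiply that by every endpoint, plus response parsing and type definitions, and the hand-written total adds up quickly.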

Fine-Grained Access Control

The annotation system gives you precise control over what's exposed:

Enable Specific Operations

sql
-- Read-only table
comment on table audit_log is '
HTTP
@disabled
@enabled select
';

-- Insert-only table (no updates or deletes)
comment on table events is '
HTTP
@disabled
@enabled insert_returning
';

-- Full CRUD except delete
comment on table products is '
HTTP
@disabled
@enabled select, insert_returning, update
';

Require Authentication

sql
-- All operations require login
comment on table contacts is '
HTTP
@authorize
';

-- Reads are public; updates and deletes require authorization
comment on table public_data is '
HTTP
for select
@allow_anonymous
for update, delete
@authorize

Custom Endpoints

sql
-- Custom URL path
comment on table users is '
HTTP GET /v1/users
HTTP POST /v1/users
';

Understanding CRUD Tags: The for Keyword

When NpgsqlRest generates CRUD endpoints from a table, each operation receives tags that identify what type of operation it is. The for keyword creates a scope where subsequent annotations apply only to endpoints matching those tags.

Available CRUD Tags

| Operation | Tags |
|---|---|
| Select | select, read, get |
| Insert | insert, put, create |
| Insert Returning | insert_returning, returning |
| Update | update, post |
| Update Returning | update_returning, returning |
| Delete | delete |
| Delete Returning | delete_returning, returning |
| On Conflict Do Nothing | on_conflict_do_nothing, on_conflict |
| On Conflict Do Update | on_conflict_do_update, on_conflict |

How for Creates Scopes

The for keyword starts a scope - all annotations that follow apply only to operations matching those tags, until the next for or end of comment:

sql
comment on table orders is '
HTTP

for select                    -- Start scope for SELECT endpoints
@allow_anonymous              -- SELECT is public
@cached                       -- SELECT results are cached

for insert update             -- Start scope for INSERT and UPDATE
@authorize                    -- Require authentication

for delete                    -- Start scope for DELETE only
@authorize admin              -- Require admin role
';

This creates different configurations for different operations from a single table comment.

Real-World Tag Patterns

Pattern 1: Read-Only API

Expose data for querying but prevent any modifications:

sql
comment on table reference_data is '
HTTP
@disabled insert, update, delete
';

Or using the whitelist approach:

sql
comment on table reference_data is '
HTTP
@disabled
@enabled select
';

Pattern 2: Append-Only Audit Log

Allow inserting records but prevent updates and deletes:

sql
comment on table audit_log is '
HTTP
@disabled update, delete
';

Pattern 3: Simplified API (No RETURNING Variants)

Keep the API surface minimal by removing /returning/ endpoints:

sql
comment on table orders is '
HTTP
@disabled returning
';

The returning tag matches all RETURNING variants: insert_returning, update_returning, delete_returning, and their on_conflict counterparts.

Pattern 4: Upsert-Only (Idempotent API)

For sync endpoints where clients should only use upsert operations:

sql
comment on table sync_state is '
HTTP
@disabled
for on_conflict
@enabled
';

Only /on-conflict-do-update/ and /on-conflict-do-nothing/ endpoints are exposed.

Pattern 5: Public Read, Authenticated Write

sql
comment on table products is '
HTTP
for select
@allow_anonymous

for insert update delete
@authorize
';

Anyone can browse products, but only authenticated users can modify them.

Pattern 6: Different Roles for Different Operations

sql
comment on table sensitive_data is '
HTTP
for select
@authorize viewer

for update
@authorize editor

for delete
@authorize admin
';

Viewers can read, editors can modify, only admins can delete.

For production APIs, start with everything disabled and explicitly enable what you need:

sql
comment on table contacts is '
HTTP
@disabled                                    -- 1. Disable all endpoints
@enabled select, insert_returning, update, delete  -- 2. Enable only these
@authorize                                   -- 3. All require auth
';

Why use whitelist?

  1. Explicit is better than implicit - You know exactly what's exposed
  2. Safer defaults - New CRUD types in future versions won't be auto-enabled
  3. Cleaner API surface - Clients see only the endpoints they need
  4. Easier security auditing - Clear documentation of what's public

For full documentation on the tag system, see Tags annotation, Disabled annotation, and Enabled annotation.

Running the Example

bash
cd examples/11_crud_auto_api

# Create database tables
bun run db:up

# Build frontend
bun run build

# Start server
bun run dev

Open http://127.0.0.1:8080 and login with bob / bob123.

What You Get Without Writing

Let's tally what NpgsqlRest generated automatically:

| Component | Traditional | NpgsqlRest |
|---|---|---|
| REST endpoints (4 operations) | Hand-coded routes | Auto-generated |
| Request validation | Manual checks | Column constraints |
| Response serialization | JSON library config | Built-in |
| TypeScript interfaces | Hand-maintained | Auto-generated |
| API client functions | Hand-written fetch | Auto-generated |
| Query parameter filtering | Custom parsing | Built-in |
| Primary key handling | ORM configuration | Automatic |
| Authentication hooks | Middleware setup | Annotations |
| Error responses | Custom error handling | RFC 7807 compliant |
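
The last row deserves a small illustration. Because the generated client's `error` field already carries the RFC 7807 problem-details shape, uniform error handling shrinks to a few lines. This is a sketch built on the `{ status, response, error }` result shape shown earlier; `describeResult` is an illustrative helper, not generated code:

```typescript
// Sketch: uniform handling of the { status, response, error } result shape
// returned by the generated client. Errors follow RFC 7807 problem details.
interface ProblemDetails {
    status: number;
    title: string;
    detail?: string | null;
}

interface ApiResult<T> {
    status: number;
    response: T;
    error: ProblemDetails | undefined;
}

function describeResult<T>(result: ApiResult<T[]>): string {
    if (result.error) {
        // Every endpoint fails the same way, so one branch covers them all.
        return `Error ${result.error.status}: ${result.error.title}`;
    }
    return `OK: ${result.response.length} row(s)`;
}

console.log(describeResult({ status: 200, response: [{ id: 1 }], error: undefined }));
// → "OK: 1 row(s)"
```

One helper like this can front every generated endpoint, because they all share the same result contract.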

Beyond Basic CRUD

The CRUD auto-generation is just the starting point. You can combine it with:

Computed Columns

sql
-- Note: PostgreSQL generation expressions must be immutable and cannot
-- contain subqueries, so the computed value derives from plain columns.
create table orders (
    id serial primary key,
    unit_price numeric not null,
    quantity int not null,
    subtotal numeric generated always as (unit_price * quantity) stored
);

The generated API includes subtotal as a read-only field.

Row-Level Security

sql
alter table contacts enable row level security;

create policy contacts_user_policy on contacts
    using (user_id = current_setting('app.user_id')::int);

Combined with NpgsqlRest's user context injection, each user only sees their own data.

Custom Functions for Complex Operations

When CRUD isn't enough, add functions for complex logic:

sql
create function example_11.merge_contacts(
    _primary_id int,
    _secondary_id int
) returns example_11.contacts
language plpgsql
as $$
declare
    _result example_11.contacts;
begin
    -- Merge logic here
    return _result;
end;
$$;

comment on function example_11.merge_contacts is 'HTTP POST';

The function appears alongside CRUD endpoints in the generated client.

Performance

CRUD auto-generation has zero runtime overhead compared to hand-written endpoints:

  • Direct SQL execution - No ORM translation layer
  • Prepared statements - Queries are parameterized and cached
  • Streaming responses - Large result sets don't consume memory
  • Connection pooling - Efficient database connection reuse

In benchmarks, NpgsqlRest consistently outperforms traditional frameworks.

When to Use CRUD Auto-Generation

Ideal for:

  • Admin panels and dashboards
  • Internal tools
  • MVPs and prototypes
  • Data management interfaces
  • Any table-centric UI

Consider custom functions when:

  • Complex business logic spans multiple tables
  • You need computed fields not in the table
  • Operations require transactions with specific isolation
  • You want to hide the underlying schema

The beauty is you can mix both approaches. Use CRUD for simple tables, custom functions for complex operations.

Conclusion

Building a REST API shouldn't require hundreds of lines of boilerplate. With NpgsqlRest's CRUD auto-generation:

  1. Define your table - The schema is your API contract
  2. Add one comment - HTTP enables everything
  3. Fine-tune with annotations - Enable only what you need
  4. Get a typed client - TypeScript interfaces generated automatically

The Contacts Manager example demonstrates a complete, production-ready CRUD application in:

  • 15 lines of SQL for the schema
  • 9 lines of SQL for API annotations
  • 20 lines of JSON configuration
  • ~44 lines total vs ~770 lines in a traditional stack

That's a 95% reduction in code - and the code you don't write has no bugs. The types you don't maintain never drift. The endpoints you don't document are never outdated.

That's the power of making PostgreSQL the single source of truth.

Comments