Written with Claude
IMPORTANT

As you may notice, this page and pretty much the entire website were obviously created with the help of AI. I wonder how you could tell? Was it the big "Written With Claude" badge on every page? I have now moved it to the top (with the help of AI, of course) to make it even more obvious. A few blog posts were written by me manually, the old-fashioned way, and I hope there will be more in the future; those have a similar "Human Written" badge. This project (not the website), on the other hand, is a very, very different story. It took me more than two years of painstaking and unpaid work in my own free time. A story that, hopefully, I will tell someday. But meanwhile, what would you like me to do? Create a complex documentation website with a bunch of highly technical articles with the help of AI and then fake it, to give you the illusion that I also did that manually? Like half of the internet is doing at this point? How does that make any sense? Is that even fair to you? Or maybe create this website manually, the old-fashioned way, just for you? While working a paid job for a salary most of you wouldn't even get up in the morning for? Would you like me to sing you a song while we're at it? For your personal entertainment? Seriously, get a grip. Do you find this information less valuable because of the way this website was created? I do my best to keep the information as accurate as possible, and I think it is very accurate at this point. If you find mistakes, inaccuracies, or problems, there is a comment section at the bottom of every page (which I also made with the help of AI), and I would very much appreciate it if you left your feedback there. Look, I'm just a guy who likes SQL, that's all. If you don't approve of how this website was constructed and the use of AI tools, I suggest closing this page and never, ever coming back. And good riddance. I would ban your access if I knew how. Thank you for your attention to this matter.

Changelog v3.8.0 (2025-02-11)


Full Changelog

New Feature: Configuration Key Validation

Added startup validation that checks all configuration keys in appsettings.json against the known defaults schema. This catches typos and unknown keys that would otherwise be silently ignored (e.g., LogCommand instead of LogCommands).

Controlled by the new Config:ValidateConfigKeys setting with three modes:

  • "Warning" (default) — logs warnings for unknown keys, startup continues.
  • "Error" — logs errors for unknown keys and exits the application.
  • "Ignore" — no validation.
```json
"Config": {
  "ValidateConfigKeys": "Warning"
}
```

Example output:

```
[12:34:56 WRN] Unknown configuration key: NpgsqlRest:KebabCaselUrls
```

Removed

  • Removed the Config:ExposeAsEndpoint option. Use the --config CLI switch to inspect configuration instead.

Kestrel Configuration Validation

Configuration key validation also covers the Kestrel section, checking against the known Kestrel schema including Limits, Http2, Http3, and top-level flags like DisableStringReuse and AllowSynchronousIO. User-defined endpoint and certificate names under Endpoints and Certificates remain open-ended and won't trigger warnings.

Syntax Highlighted --config Output

The --config CLI switch now outputs JSON with syntax highlighting (keys, strings, numbers/booleans, and structural characters in distinct colors). When output is redirected to a file, plain JSON is emitted without color codes. The --config switch can now appear anywhere in the argument list and be combined with config files and --key=value overrides.
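For instance, the same switch behaves differently depending on where its output goes, and it can be mixed with other arguments. The config file name below is illustrative; the `Config:ValidateConfigKeys` key is the one documented above:

```
# Highlighted output in an interactive terminal
npgsqlrest --config

# Plain JSON (no color codes) when redirected to a file
npgsqlrest --config > config-snapshot.json

# Combined with a config file and a --key=value override
npgsqlrest appsettings.Production.json --Config:ValidateConfigKeys=Error --config
```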

Improved CLI Error Handling

Unknown command-line parameters now display a clear error message with a --help hint instead of an unhandled exception stack trace.

Universal fallback_handler for All Upload Handlers

The fallback_handler parameter, previously Excel-only, is now available on all upload handlers via BaseUploadHandler. When a handler's format validation fails and a fallback_handler is configured, processing is automatically delegated to the named fallback handler.

This enables scenarios like: CSV format check fails on a binary file → fall back to large_object or file_system to save the raw file for analysis.

```sql
comment on function my_csv_upload(json) is '
@upload for csv
@check_format = true
@fallback_handler = large_object
@row_command = select process_row($1,$2)
';
```

Optional Path Parameters

Path parameters now support the ASP.NET Core optional parameter syntax {param?}. When a path parameter is marked as optional and the corresponding PostgreSQL function parameter has a default value, omitting the URL segment will use the PostgreSQL default:

```sql
create function get_item(p_id int default 42) returns text ...
comment on function get_item(int) is '
HTTP GET /items/{p_id?}
';
```
  • GET /items/5 → uses the provided value 5
  • GET /items/ → uses the PostgreSQL default 42

This also works with query_string_null_handling null_literal to pass NULL via the literal string "null" in the path for any parameter type:

```sql
create function get_item(p_id int default null) returns text ...
comment on function get_item(int) is '
HTTP GET /items/{p_id}
query_string_null_handling null_literal
';
```
  • GET /items/null → passes SQL NULL to the function

Fixes

  • Fixed query string overload resolution not accounting for path parameters. GET endpoints with path parameters and overloaded functions (same name, different signatures) would resolve to the wrong function. The body JSON overload resolution already handled this correctly.
  • Added missing QueryStringNullHandling and TextResponseNullHandling entries to ConfigDefaults, which caused them to be absent from --config output.
  • Added missing Pattern, MinLength, and MaxLength properties to default validation rule schemas in ConfigDefaults.

Machine-Readable CLI Commands for Tool Integration

Added new CLI commands designed for programmatic consumption by tools like pgdev. All JSON-outputting commands use syntax highlighting when run in a terminal and emit plain JSON when piped or redirected.

--version --json

Outputs version information as structured JSON including all assembly versions, runtime, platform RID, and directories:

```
npgsqlrest --version --json
```

--validate [--json]

Pre-flight check that validates configuration keys against known defaults and tests the database connection, then exits with code 0 (success) or 1 (failure):

```
npgsqlrest --validate
npgsqlrest --validate --json
```
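Because the result is reported through the exit code, the check can be wired directly into a deployment script or CI step. A minimal sketch (the surrounding script is hypothetical):

```
# Abort a deployment script if the configuration or the
# database connection fails the pre-flight check
if ! npgsqlrest --validate; then
  echo "npgsqlrest pre-flight check failed" >&2
  exit 1
fi
```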

--config-schema

Outputs a JSON Schema (draft-07) describing the full appsettings.json configuration structure — types, defaults, and enum constraints. Can be used for IDE autocomplete via the $schema property or as the foundation for config editing UIs:

```
npgsqlrest --config-schema
```
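One way to get IDE autocomplete from this is to save the output to a file next to your configuration and reference it via `$schema`. The file name here is just an example:

```json
{
  "$schema": "./npgsqlrest-config-schema.json",
  "Config": {
    "ValidateConfigKeys": "Warning"
  }
}
```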

--annotations

Outputs all 44 supported SQL comment annotations as a JSON array with name, aliases, syntax, and description for each:

```
npgsqlrest --annotations
```

--endpoints

Connects to the database, discovers all generated REST endpoints, outputs full metadata (method, path, routine info, parameters, return columns, authorization, custom parameters), then exits. Logging is suppressed to keep output clean:

```
npgsqlrest --endpoints
```

--config (updated)

The --config --json flag has been removed. The --config command now always uses automatic detection: syntax highlighted in terminal, plain JSON when output is piped or redirected.

Stats Endpoints: format Query String Override

Stats endpoints now accept an optional format query string parameter that overrides the configured Stats:OutputFormat setting per-request. Valid values are html and json.

```
GET /api/stats/routines?format=json
GET /api/stats/tables?format=html
```
