Written with Claude
IMPORTANT

As you may notice, this page and pretty much the entire website were obviously created with the help of AI. I wonder how you could tell? Was it the big "Written With Claude" badge on every page? I have now moved it to the top (with the help of AI, of course) to make it even more obvious. A few blog posts were written by me manually, the old-fashioned way, and those carry a similar "Human Written" badge; I hope there will be more of them in the future. This project (not the website), on the other hand, is a very, very different story. It took me more than two years of painstaking, unpaid work in my own free time. A story that, hopefully, I will tell someday. But meanwhile, what would you like me to do? Create a complex documentation website with a bunch of highly technical articles with the help of AI and then fake it, to give you the illusion that I also did that manually? Like half the internet is doing at this point? How does that make any sense? Is that even fair to you? Or should I create this website manually, the old-fashioned way, just for you, while working a paid job for a salary most of you wouldn't even get up in the morning for? Would you like me to sing you a song while we're at it? For your personal entertainment? Seriously, get a grip.

Do you find this information less valuable because of the way this website was created? I do my best to keep the information as accurate as possible, and I think it is very accurate at this point. If you find any mistakes, inaccuracies, or problems, there is a comment section at the bottom of every page, which I also made with the help of AI, and I would very much appreciate it if you left your feedback there.

Look, I'm just a guy who likes SQL, that's all. If you don't approve of how this website was constructed and the use of AI tools, I suggest closing this page and never, ever coming back. And good riddance. I would ban your access if I knew how. Thank you for your attention to this matter.

Changelog v3.7.0 (2025-02-07)


Fixes

  • Fixed comma separator bug in Excel Upload Handler error response when processing multiple files. The fileId counter was not incremented on error, causing malformed JSON output when an invalid file was followed by additional files.
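The separator logic behind this fix can be pictured with a small sketch (illustrative only, not the actual handler source, which is C#): a comma is written before every file entry except the first, keyed off the counter, so the counter must advance even when a file fails.

```typescript
// Sketch of the comma/counter pattern. The fix is that fileId is now
// incremented on the error branch too, so the JSON stays well-formed
// when an invalid file is followed by additional files.
type FileResult = { ok: boolean; detail: string };

function buildResponse(files: FileResult[]): string {
  let json = "{";
  let fileId = 0;
  for (const file of files) {
    if (fileId > 0) {
      json += ","; // separator before every entry except the first
    }
    json += `"${fileId}":` + (file.ok
      ? `{"status":"ok","detail":"${file.detail}"}`
      : `{"status":"error","detail":"${file.detail}"}`);
    fileId++; // increment on success AND error, otherwise the next entry collides
  }
  return json + "}";
}
```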

  • Fixed CustomHost configuration in ClientCodeGen not accepting an empty string value. Setting "CustomHost": "" was treated the same as null (triggering host auto-detection) because GetConfigStr uses string.IsNullOrEmpty. Now an explicit empty string correctly produces const baseUrl = ""; in generated TypeScript, which is useful for relative URL paths.
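The corrected behavior can be sketched like this (a TypeScript illustration; the real implementation is C# and differs in detail): only a truly missing value should trigger host auto-detection, while an explicit empty string must survive as-is.

```typescript
// Sketch of null vs empty-string handling for CustomHost. Before the fix,
// an IsNullOrEmpty-style check collapsed "" and null into the same
// auto-detect branch; now "" is kept, yielding a relative-URL base.
function resolveBaseUrl(customHost: string | null, autoDetected: string): string {
  // customHost === "" must produce an empty base URL, not auto-detection
  return customHost === null ? autoDetected : customHost;
}
```

With `resolveBaseUrl("", ...)` the generated client would emit `const baseUrl = "";`, so all request URLs become relative paths.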

New Features

  • Added fallback_handler parameter to the Excel Upload Handler. When set (e.g., fallback_handler = csv), if ExcelDataReader fails to parse an uploaded file (invalid Excel format), the handler automatically delegates processing to the named fallback handler. This allows a single upload endpoint to accept both Excel and CSV files transparently:
```sql
comment on function my_upload(json) is '
@upload for excel
@fallback_handler = csv
@row_command = select process_row($1,$2)
';
```
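The delegation flow amounts to a try/fallback pattern, sketched here with illustrative names and signatures (not the actual handler API):

```typescript
// Sketch of fallback delegation: try the Excel parser first; if it rejects
// the payload, hand the same bytes to the configured fallback handler.
type Handler = (data: Uint8Array) => string[];

function makeExcelHandler(parseExcel: Handler, fallback?: Handler): Handler {
  return (data) => {
    try {
      return parseExcel(data); // the ExcelDataReader path
    } catch {
      if (fallback) {
        return fallback(data); // e.g. the csv handler
      }
      throw new Error("invalid Excel file and no fallback_handler configured");
    }
  };
}
```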

New Feature: Pluggable Table Format Renderers

Added a pluggable table format rendering system that allows PostgreSQL function results to be rendered as HTML tables or Excel spreadsheet downloads instead of JSON, controlled by the @table_format annotation.

HTML Table Format

Renders results as a styled HTML table suitable for browser viewing and copy-paste into Excel:

```sql
comment on function get_report() is '
HTTP GET
@table_format = html
';
```

Configuration options in TableFormatOptions: HtmlEnabled, HtmlKey, HtmlHeader, HtmlFooter.

Excel Table Format

Renders results as an .xlsx Excel spreadsheet download using the SpreadCheetah library (streaming, AOT-compatible):

```sql
comment on function get_report() is '
HTTP GET
@table_format = excel
';
```

Configuration options in TableFormatOptions: ExcelEnabled, ExcelKey, ExcelSheetName, ExcelDateTimeFormat, ExcelNumericFormat.

  • ExcelDateTimeFormat — Excel Format Code for DateTime cells (default: yyyy-MM-dd HH:mm:ss). Examples: yyyy-mm-dd, dd/mm/yyyy hh:mm.
  • ExcelNumericFormat — Excel Format Code for numeric cells (default: General). Examples: #,##0.00, 0.00.
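These options would typically live in the application's JSON configuration. The fragment below is a sketch: the option names come from this changelog, but the section path and the chosen values are assumptions to be checked against your own configuration layout:

```json
{
  "TableFormatOptions": {
    "HtmlEnabled": true,
    "ExcelEnabled": true,
    "ExcelSheetName": "Report",
    "ExcelDateTimeFormat": "dd/mm/yyyy hh:mm",
    "ExcelNumericFormat": "#,##0.00"
  }
}
```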

Per-Endpoint Custom Parameters

The download filename and worksheet name can be overridden per-endpoint via custom parameter annotations:

```sql
comment on function get_report() is '
HTTP GET
@table_format = excel
@excel_file_name = monthly_report.xlsx
@excel_sheet = Report Data
';
```

These also support dynamic placeholders resolved from function parameters:

```sql
comment on function get_report(_format text, _file_name text, _sheet_name text) is '
HTTP GET
@table_format = {_format}
@excel_file_name = {_file_name}
@excel_sheet = {_sheet_name}
';
```
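A caller could then drive the placeholders through query parameters. The helper below is hypothetical: the endpoint path `/api/get-report` and the camel-cased parameter names assume typical default name conversion, so verify them against your generated client.

```typescript
// Hypothetical URL builder for the dynamic-placeholder example above.
// Path and query parameter names are assumptions, not generator output.
function reportDownloadUrl(format: string, fileName: string, sheetName: string): string {
  const query = new URLSearchParams({ format, fileName, sheetName });
  return "/api/get-report?" + query.toString();
}
```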

TsClient: Per-Endpoint URL Export Control

Added two new custom parameter annotations to control TypeScript client code generation per-endpoint:

tsclient_export_url

Overrides the global ExportUrls configuration setting for a specific endpoint:

```sql
comment on function login(_username text, _password text) is '
HTTP POST
@login
@tsclient_export_url = true
';
```

When enabled, the generated TypeScript exports a URL constant for that endpoint:

```typescript
export const loginUrl = () => baseUrl + "/api/login";
```

tsclient_url_only

When set, only the URL constant is exported — the fetch function and response type interface are skipped entirely. Implies tsclient_export_url = true:

```sql
comment on function get_data(_format text) is '
HTTP GET
@table_format = {_format}
@tsclient_url_only = true
';
```

This generates only the URL constant and request interface, which is useful for endpoints consumed via browser navigation (e.g., table format downloads) rather than fetch calls.
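A URL-only export would then be consumed by navigation rather than fetch. In this sketch, `getDataUrl` mirrors the `loginUrl` naming shown earlier, but how the generated constant actually accepts parameters is an assumption to check against your generated client:

```typescript
// Sketch of consuming a URL-only export in the browser.
// getDataUrl is an assumed constant name modeled on the loginUrl example.
const baseUrl = "";
const getDataUrl = (format: string) =>
  baseUrl + "/api/get-data?" + new URLSearchParams({ format }).toString();

function startDownload(format: string): string {
  const url = getDataUrl(format);
  // In a browser, navigating streams the file as a download:
  // window.location.href = url;
  return url;
}
```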

