Written with Claude
IMPORTANT

As you may notice, this page and pretty much the entire website were obviously created with the help of AI. I wonder how you could tell? Was it the big "Written With Claude" badge on every page? I moved it to the top now (with the help of AI, of course) to make it even more obvious. There are a few blog posts that were written by me manually, the old-fashioned way, and I hope there will be more in the future; those have a similar "Human Written" badge. This project (not the website), on the other hand, is a very, very different story. It took me more than two years of painstaking and unpaid work in my own free time. A story that, hopefully, I will tell someday. But meanwhile, what would you like me to do? To create a complex documentation website with a bunch of highly technical articles with the help of AI and fake it, to give you the illusion that I also did that manually? Like half of the internet is doing at this point? How does that make any sense? Is that even fair to you? Or maybe to create this website manually, the old-fashioned way, just for you? While working a paid job for a salary most of you wouldn't even get up in the morning for? Would you like me to sing you a song while we're at it? For your personal entertainment? Seriously, get a grip. Do you find this information less valuable because of the way this website was created? I do my best to fix mistakes and keep the information as accurate as possible, and I think it is very accurate at this point. If you find any mistakes, inaccuracies, or problems, there is a comment section at the bottom of every page, which I also made with the help of AI. And I would very much appreciate it if you left your feedback there. Look, I'm just a guy who likes SQL, that's all. If you don't approve of how this website was constructed and the use of AI tools, I suggest closing this page and never, ever coming back. And good riddance. And I would ban your access if I knew how. Thank you for your attention to this matter.

BUFFER_ROWS

Also known as

buffer (with or without @ prefix)

Set the number of rows to buffer in the string builder before sending the response.

Syntax

@buffer_rows <count>
@buffer <count>

Or using custom parameter syntax:

@buffer_rows = <count>
@buffer = <count>
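
As a quick sketch of the parameter (=) form used in a full annotation (the function name here is just a hypothetical placeholder), it behaves the same as the plain form above:

sql
comment on function list_orders() is
'HTTP GET
@buffer_rows = 50';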

Default Value

The default value is 25 rows.
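
For illustration, assuming an annotation without any buffer parameter is valid as-is (hypothetical function name), such an endpoint falls back to the default of 25 buffered rows:

sql
comment on function get_report() is
'HTTP GET';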

Special Values

Value   Behavior
0       Disable buffering - write the response for each row
1       Buffer the entire array (all rows)
25      Default - buffer 25 rows before writing
> 1     Buffer the specified number of rows before writing

Examples

Disable Buffering

Write each row immediately to the response stream:

sql
comment on function stream_live_data() is
'HTTP GET
@buffer_rows 0';

Buffer Entire Response

Wait for all rows before sending the response:

sql
comment on function get_small_dataset() is
'HTTP GET
@buffer 1';

Large Buffer for Throughput

Buffer 5,000 rows per write to reduce the number of response stream writes during large exports:

sql
comment on function export_all_data() is
'HTTP GET
@buffer_rows 5000';

Small Buffer for Memory Efficiency

Use a smaller buffer to limit memory usage while still batching writes:

sql
comment on function stream_data() is
'HTTP GET
@buffer 100';

Behavior

  • Controls how many rows are buffered in the string builder before writing to the response stream.
  • Applies to rows in JSON object arrays when returning records from the database.
  • Buffering is more efficient than writing to the response stream for each row.
  • Disabling buffering (0) can have a slight negative impact on performance.
  • Higher values can have a negative impact on memory usage, especially with large datasets.

Performance Considerations

  • Low values (0-10): Lower memory usage, more response stream writes, slight performance overhead.
  • Default (25): Balanced trade-off between memory and performance.
  • High values (1000+): Better throughput, higher memory usage per request.
  • Value of 1: Entire result buffered before sending - best for small datasets where you want atomic responses.
