99 days ago · Tech · 0 comments

Every developer has at least once in their career stumbled upon a scenario where a program would not print something as they initially thought. A few years ago, you would search for an answer on StackOverflow and find a page stating the dreaded "you have to flush the buffer". I'd never been curious enough to dig into the why until recently.

Let's take a look at a quick example:

```c
#include <stdio.h>
#include <unistd.h>

int main() {
    for (int i = 1; i <= 5; i++) {
        printf("line %d\n", i);
        sleep(1);
    }
    return 0;
}
```

If we compile and run this in our terminal, we'll get the following:

```
$ gcc -o cprint cprint.c
$ ./cprint
line 1    ← appears at t=1
line 2    ← appears at t=2
line 3    ← appears at t=3
line 4    ← appears at t=4
line 5    ← appears at t=5
```

Something different happens if you pipe the result:

```
$ ./cprint | cat
          ← nothing for 5 seconds...
line 1    ← appears at t=5
line 2
line 3
line 4
line 5
```

The answer lies in how buffering is handled in TTY and in non-TTY environments. A libc implementation will use…