Stop generating, start thinking
LLMs are trained (without our explicit consent) on all our shitty code, and we've taught them that this is what they should output. They're doomed to repeat humans' mistakes, then be trained on the shitty, reconstituted mistakes made by other LLMs, in what's (brilliantly) been called human centipede epistemology. We humans don't write good enough code to deserve something that produces the same stuff faster.
This post is licensed under CC BY 4.0 by the author.