Who would have thought you'd have to read AI-generated code before pushing it to production? This is what I've been saying all along: LLMs aren't deterministic, they can hallucinate, and if just one hallucination slips through, it can destroy your business.
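The non-determinism comes from how LLMs pick each token: they sample from a probability distribution over the vocabulary, so the same prompt can yield different outputs, occasionally including a plausible-sounding name that doesn't exist. A minimal sketch of temperature-based sampling (the token names, logits, and the made-up package "fetchlib" are purely illustrative assumptions, not from the incident):

```python
import math
import random

def softmax(logits, temperature=1.0):
    # Scale logits by temperature; higher T flattens the distribution.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_token(tokens, logits, temperature, rng):
    # Draw one token from the temperature-scaled distribution.
    probs = softmax(logits, temperature)
    return rng.choices(tokens, weights=probs, k=1)[0]

# Hypothetical next-token candidates for "import ..." in generated code.
tokens = ["requests", "httpx", "fetchlib"]  # "fetchlib" is a fictitious package
logits = [2.0, 1.5, 1.2]

rng = random.Random(42)
draws = {sample_token(tokens, logits, temperature=1.0, rng=rng) for _ in range(50)}
# With temperature > 0, repeated draws produce different tokens, including
# the plausible-looking but nonexistent "fetchlib" — a hallucination in miniature.
```

At temperature 0 the model always takes the highest-probability token, but any positive temperature makes the output a random draw, which is why the same coding prompt can occasionally surface a wrong repository or package name.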

Guillermo Rauch
@rauchg
03-04
A Vercel user reported an issue that sounded extremely scary: an unknown GitHub OSS codebase was being deployed to their team.
We, of course, took the report extremely seriously and began an investigation. Security and infra engineering were engaged.
Turns out Opus 4.6 *hallucinated a
From Twitter

