Normally people use ChatGPT to vibe code; this is the first instance I’m aware of where ChatGPT used people to vibe code!
In Soviet Russia, ChatGPT uses you for vibe coding!
ChatGPT didn’t “think” anything. It generated instructions telling users to do things incorrectly, based on the human-generated content in its training data — content it didn’t understand, because it doesn’t understand anything.
Something like this should be a warning label on AI