Imagine you scheduled a group brainstorming session to get some important work done, and one of the attendees casually mentions that they took LSD for the first time right before the meeting.
That's how you should treat ChatGPT output.
@mattblaze and that's why anyone who even mentions #AI gets kicked off premises by me.
@mattblaze am I the only one thinking that that brainstorming session could actually be quite productive? And if not, it would at least be fun.
@Disputatore Maybe. But you definitely don't want to give the LSD guy final edit.
@mattblaze I honestly think you might get more out of the person on LSD than you would out of ChatGPT. (Okay, casually mentioning it might be a red flag, but still.)
@mattblaze (Hallucination is really a terrible term in relation to LLMs, because it bolsters the illusion that perception is actually happening, instead of it just being statistical error.)
@jbaggs Yes. Hallucination is just being used as a fancy word for "being wrong".
@mattblaze @Disputatore Oh wow, this reminded me of something I hadn't thought of in quite a while...
In college, a friend's chemistry lab group had a report due, and after each doing their respective parts, one of them was tasked with the final document preparation and hand-in. The other two found out some time later (after getting some concerned questions from the professor) that hand-in guy had apparently been quite high while doing so (on exactly what I don't know, possibly LSD?). At the time I saved copies of the actual files from my friend, and I managed to dig them up; it's really quite something. (Some choice excerpts pictured here.)
So yeah, don't give LSD guy final edit...but sometimes LSD guy can take you by surprise.
@mattblaze Just one?
In Michael Pollan's book "How To Change Your Mind," I remember he described three instances of EE chip designers taking LSD to "focus on and extend the computer's memory" and complete their designs. YMMV.
So maybe a different drug?
@MHowell no. LSD is a fine example here. Just like with LLMs, maybe they’ll do something useful. Or dangerously nonsensical. Either way, you need to check, carefully, to find out which.