Make sure you ask the AI not to hallucinate because it will sometimes straight up lie. It’s also incapable of counting.

But where's the fun in it if I can't make it hallucinate?

I do feel bad when I have to tell it not to. Hallucinating is fun!

But does it work to tell it not to hallucinate? And does it work the other way around too?

It’s honestly a gamble based on my experience. Instructions that I’ve given ChatGPT have worked for a while, only to be mysteriously abandoned for no valid reason. Telling AI not to hallucinate is apparently common practice from the research I’ve done.

Makes me wonder: Can I just ask it to hallucinate?

Yep. Tell it to lie and it will.
