Experts are worried that mushroom-foraging books apparently produced by ChatGPT and sold on Amazon, aimed at beginner foragers, could end up killing someone.

Many mushroom identification and foraging books sold on Amazon appear to be AI-generated, with no human author. These books can contain dangerous misinformation and could lead to deaths if foragers eat poisonous mushrooms based on inaccurate descriptions. Two New York mushroom societies have warned about the risks of AI-generated foraging guides, and experts note that safely identifying wild mushrooms requires careful research and hands-on experience that an AI system lacks. Amazon has since removed some books flagged as AI-generated, but more may remain. Detecting AI-generated books and authors can be difficult, since the systems can fabricate author bios and photos. For anyone pursuing mushroom foraging, relying on multiple credible sources, along with guidance from local foraging groups, is advised.

@marco@beehaw.org

You can eat any mushroom… At least once.

You can easily take a bullet to the brain… Once

You can eat lava… but only once.

rayyyy

Also, there are old mushroom hunters and bold mushroom hunters, but no old, bold mushroom hunters.

