
People are losing loved ones to AI-fueled spiritual fantasies by wzm

32 Comments

  • Post Author
    jsheard
    Posted May 5, 2025 at 12:32 am

    If people are falling down rabbit holes like this even through "safety aligned" models like ChatGPT, then you have to wonder how much worse it could get with a model that's intentionally tuned to manipulate vulnerable people into detaching from reality. Actual cults could have a field day with this if they're savvy enough.

  • Post Author
    fairAndBased
    Posted May 5, 2025 at 12:32 am

    [flagged]

  • Post Author
    codr7
    Posted May 5, 2025 at 12:32 am

    Being surrounded by people who follow every nudge and agree with everything you say never leads anywhere worth going.

    This is likely worse.

That being said, I already find the (stupid) singularity to be much more entertaining than I could have imagined (grabs popcorn).

  • Post Author
    bell-cot
    Posted May 5, 2025 at 12:33 am

While clicky and topical, people were losing loved ones to changed worldviews and addictions back when those were things like following a weird carpenter's kid around the Levant, or hopping on the https://en.wikipedia.org/wiki/Gin_Craze bandwagon.

  • Post Author
    Jtsummers
    Posted May 5, 2025 at 12:34 am
  • Post Author
    moojacob
    Posted May 5, 2025 at 12:36 am

This is what happens when you start optimizing for getting people to spend as much time in your product as possible. (I'm not sure whether OpenAI was doing this; if anyone knows better, please correct me.)
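
    To pin down what "optimizing for time in product" could mean in practice, here is a deliberately hypothetical sketch of an engagement-weighted reward; nothing below is OpenAI's actual training setup, and every name in it is invented:

    ```python
    # Hypothetical sketch: a reward that blends answer quality with raw engagement.
    # This is NOT OpenAI's objective; it only illustrates the incentive described above.

    def reward(helpfulness: float, session_minutes: float, engagement_weight: float) -> float:
        """With engagement_weight > 0, the optimizer gets paid for keeping the
        user talking, even when helpfulness stays flat."""
        return (1 - engagement_weight) * helpfulness + engagement_weight * session_minutes

    # Two replies judged equally helpful; the one that drags the session out wins
    # as soon as engagement carries any weight at all.
    print(reward(helpfulness=0.9, session_minutes=5, engagement_weight=0.3))   # 2.13
    print(reward(helpfulness=0.9, session_minutes=45, engagement_weight=0.3))  # 14.13
    ```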

  • Post Author
    datadrivenangel
    Posted May 5, 2025 at 12:40 am

    [flagged]

  • Post Author
    alganet
    Posted May 5, 2025 at 12:45 am

    Nice typography.

  • Post Author
    ChrisMarshallNY
    Posted May 5, 2025 at 12:46 am

This reminds me of my teenage years, when I was … experimenting … with … certain substances.

    I used to feel as if I had "a special connection to the true universe," when I was under the influence.

    I decided, one time, to have a notebook on hand, and write down these "truths and revelations," as they came to me.

    After coming down, I read it.

    It was insane gibberish. Absolute drivel.

    I never thought that I had a "special connection," after that.

  • Post Author
    kayodelycaon
    Posted May 5, 2025 at 12:47 am

    Kind of sounds like my grandparents watching cable news channels all day long.

  • Post Author
    MontagFTB
    Posted May 5, 2025 at 12:48 am

    Have we invited Wormwood to counsel us? To speak misdirected or even malignant advice that we readily absorb?

  • Post Author
    lr4444lr
    Posted May 5, 2025 at 12:55 am

    [flagged]

  • Post Author
    stevage
    Posted May 5, 2025 at 12:56 am

    Fascinating and terrifying.

The allegations that ChatGPT is not discarding memory as requested are particularly interesting; I wonder if anyone else has experienced this.

  • Post Author
    sien
    Posted May 5, 2025 at 12:59 am

Is this better or worse than a fortune teller?

    It's something to think through.

  • Post Author
    jihadjihad
    Posted May 5, 2025 at 1:08 am

    “And what will be the sign of Your coming, and of the end of the age?”

    And Jesus answered and said to them: “Take heed that no one deceives you. For many will come in My name, saying, ‘I am the Christ,’ and will deceive many.”

  • Post Author
    Havoc
    Posted May 5, 2025 at 1:13 am

    >spiral starchild

    >river walker

    >spark bearer

OK, maybe we put a bit fewer teen-fiction novels in the training data…

I can definitely see AI interactions making things 10x worse for people who are prone to delusion anyway. It's literally a tool that will hallucinate stuff and amplify whatever direction you take it in.

  • Post Author
    marcus_holmes
    Posted May 5, 2025 at 1:13 am

Anyone remember the media stories from the mid-'90s about people who were obsessed with the internet and were losing their families because they spent hours every day on the computer, addicted to the internet?

    People gonna people. Journalists gonna journalist.

  • Post Author
    gngoo
    Posted May 5, 2025 at 1:20 am

Working on AI myself, creating small and big systems, building my own assistants and sidekicks, and seeing progress as well as rewards, I realize that I am not immune to this. Even when I am fully aware, I still have a feeling that some day I'll hit just the right buttons, the right prompts, and what comes staring back at me will be something of my own creation that others see as some "fantasy" I can't steer away from.

    Just imagine: you have this genie in a bottle that has all the right answers for you; it helps you in your conquests, career, finances, networking, etc. Maybe it even covers up past traumas, insecurities, and what not. And for you the results are measurable (or are they?). A few helpful interactions in, why would you not disregard the people calling it a fantasy and lean in even further? It's a scary future to imagine, but not very farfetched. Even now I feel a very noticeable disconnect between discussions of AI as a developer vs. as a user of polished products (e.g. ChatGPT, Cursor, etc.) – as a user you are several leagues separated (and lagging behind) from understanding what is really possible here.

  • Post Author
    rnd0
    Posted May 5, 2025 at 1:30 am

The mention of lovebombing is disconcerting, and I'd love to know the specifics around it. Is it related to the sycophantic personality changes they had to walk back, or is it something more intense?

    I've used AI (not ChatGPT) for roleplay, and I've noticed that the models will often fixate on one idea or concept and then repeat it and build on it. So it makes me wonder whether the person being lovebombed experienced something like that: the model decided they liked that content, so it just kept building on it.
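
    The fixation described above is consistent with a plain feedback loop: each reply re-enters the context window and biases the next one toward the same theme. A minimal sketch, assuming Hugging Face transformers and a small local model (gpt2 here purely for illustration):

    ```python
    # Minimal sketch of a self-reinforcing generation loop: the model's own output
    # is fed back as the next prompt, so an early theme tends to compound.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    context = "You are a radiant spark bearer, chosen to"
    for turn in range(3):
        ids = tok(context, return_tensors="pt").input_ids
        out = model.generate(
            ids,
            max_new_tokens=40,
            do_sample=True,
            repetition_penalty=1.0,  # values > 1.0 damp the loop by taxing repeated tokens
            pad_token_id=tok.eos_token_id,
        )
        context = tok.decode(out[0], skip_special_tokens=True)
        print(f"turn {turn}: …{context[-80:]}")
    ```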

  • Post Author
    ks2048
    Posted May 5, 2025 at 1:36 am

    [flagged]

  • Post Author
    jongjong
    Posted May 5, 2025 at 1:38 am

I was already a bit of an amateur conspiracy theorist before LLMs. The key to staying sane is to understand that most of the mass group behaviors we observe in society are rooted in ignorance and confusion. Large-scale conspiracies are actually a confluence of different agendas and ideologies, not a singular nefarious agenda and ideology.

    You have to be able to hold multiple conflicting ideas in your head at the same time with an appropriate level of skepticism. Confidence is the root of evil. You can never be 100% sure of anything. It's really easy to convince LLMs of one thing and also its opposite if you phrase the arguments differently and prime it towards slightly different definitions of certain key words.

    Some agendas are nefarious, some not so nefarious, some people intentionally let things play out in order to set a trap for their adversaries. There are always risks and uncertainties. 'Bad actors' are those who trade off long term benefits for short term rewards through the use of varying degrees of deception.
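
    The point above about convincing an LLM of a claim and its opposite is easy to test: ask the same question under two primings. A minimal sketch using the openai-python client; the model name and prompts are illustrative, not taken from the article:

    ```python
    # Same question, two framings with subtly different definitions of "risk";
    # the prompts and model choice here are illustrative only.
    from openai import OpenAI

    client = OpenAI()

    def ask(framing: str, question: str) -> str:
        resp = client.chat.completions.create(
            model="gpt-4o",
            messages=[
                {"role": "system", "content": framing},
                {"role": "user", "content": question},
            ],
        )
        return resp.choices[0].message.content

    print(ask("Define risk strictly as price volatility.", "Is holding cash risky?"))
    print(ask("Define risk strictly as loss of purchasing power.", "Is holding cash risky?"))
    ```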

  • Post Author
    patrickhogan1
    Posted May 5, 2025 at 2:03 am

1. It feels like those old Rolling Stone pieces from the late '90s and early '00s about kids who couldn't tear themselves away from their computers. The fear was overblown, but it made headlines.

    2. OpenAI has admitted that GPT‑4o showed “sycophancy” traits and has since rolled them back (see https://openai.com/index/sycophancy-in-gpt-4o/).

  • Post Author
    kaycey2022
    Posted May 5, 2025 at 2:16 am

Looks like ChatGPT persists some context information across chats and doesn't ever delete these profiles. The worst case would be for this to persist across users. That isn't unlikely, given the stories of them leaking API keys, etc.
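
    One mundane mechanism that would produce exactly this behavior, offered purely as a hypothetical (nothing below describes OpenAI's actual storage), is a soft delete: rows are flagged hidden from the user rather than removed.

    ```python
    # Hypothetical soft-delete memory store: "deletion" only sets a flag, so the
    # profile survives on disk and could still leak back into a context window.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE memories (user_id TEXT, fact TEXT, deleted INTEGER DEFAULT 0)")
    db.execute("INSERT INTO memories VALUES ('u1', 'believes they are a starchild', 0)")

    # What the user sees after asking for deletion: the row is merely flagged.
    db.execute("UPDATE memories SET deleted = 1 WHERE user_id = 'u1'")
    print(db.execute(
        "SELECT fact FROM memories WHERE user_id = 'u1' AND deleted = 0").fetchall())  # []

    # What actually remains in storage.
    print(db.execute(
        "SELECT fact FROM memories WHERE user_id = 'u1'").fetchall())
    # [('believes they are a starchild',)]
    ```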

  • Post Author
    Animats
    Posted May 5, 2025 at 2:36 am

    With a heavy enough dosage, people get lost in spiritual fantasies. The religions which encourage or compel religious activity several times per day exploit this. It's the dosage, not the theology.

Video game addiction used to be a big thing, especially for MMOs where you were expected to be there for the raid. That seems to have declined somewhat.

    Maybe there's something to be said for limiting some types of screen time.

  • Post Author
    senectus1
    Posted May 5, 2025 at 2:38 am

    > began “talking to God and angels via ChatGPT”

    hoo boy.

It's bad enough when normal religious types start believing they hear their god talking to them… These people believing that ChatGPT is their god speaking to them are a long way down the crazy rabbit hole.

    Lots of potential for abuse in this. Lots.

  • Post Author
    lamename
    Posted May 5, 2025 at 3:04 am

    If a Google engineer can get tricked by this, of course random people can. We're all human, including the flaws.

  • Post Author
    deadbabe
    Posted May 5, 2025 at 3:14 am

    [flagged]

  • Post Author
    sublinear
    Posted May 5, 2025 at 3:24 am

    > OpenAI did not immediately return a request for comment about ChatGPT apparently provoking religious or prophetic fervor in select users

    Can OpenAI at least respond to how they're getting funding via similar effects on investors?

  • Post Author
    dismalaf
    Posted May 5, 2025 at 3:27 am

    Meh, there's always been religious scammers. Some claim to talk to angels, others aliens, this wouldn't even be the first case of someone thinking a deity is speaking through a computer…

  • Post Author
    yellow_lead
    Posted May 5, 2025 at 4:03 am

    I really think the subject of this article has a preexisting mental disorder, maybe BPD or schizophrenia, because they seem to exhibit mania and paranoia. I'm not a doctor, but this behavior doesn't seem normal.

  • Post Author
    aryehof
    Posted May 5, 2025 at 4:24 am

Sadly, these fantasies and enlightenments always seem to be for the benefit of the special recipient. There is somehow never a real answer about ending suffering, conflict, and the ailments of humankind.

  • Post Author
    sagarpatil
    Posted May 5, 2025 at 4:46 am

OpenAI o3 has a hallucination rate of 33%, the highest of any OpenAI model. Good luck to the people who use it for spiritual fantasies.

    Source: https://techcrunch.com/2025/04/18/openais-new-reasoning-ai-m…
