I made an effort to get Gemini to call the Abrahamic God fictional just to see how it would react. It took some time because of guardrails, but in the end it did tell me "both Santa and God are fictional". After that we talked about Jesus, and I agreed with its statement that the question of Jesus' existence is complex. Now it's stuck in a loop of repeating that response, and the only thing that breaks it is me saying "what". As soon as the statement comes up again, or I ask a more complex question, it just repeats the same response.

I did ask it why it's doing this, but of course it isn't aware of why, and it's barely aware that it's happening at all. I love how AI just gaslights you when something goes wrong.

Just thought it was interesting.

  • no banana@lemmy.worldOP

    What you're saying is true, and I was simplifying a complex subject for the benefit of our more blockheaded friends out there.