• CubitOom@infosec.pub
    13 days ago

    Man Believes Machine that Tells You Only what it Thinks You Want to Hear, Poisons Himself

    • shalafi@lemmy.world
      13 days ago

      It would have returned a proper answer if he had asked a proper question. He either crafted a question to get the answer he wanted, or he went back and forth with the prompt until he got it.

      I tried asking whether sodium bromide was a good substitute for table salt and got a completely factual answer. I was tempted to manipulate it into saying otherwise, but it gets so dumb after the first question or two that I can’t bring myself to mess with it.