OpenAI fears humans will become ‘emotionally reliant’ on ChatGPT’s human voice

OpenAI, the maker of ChatGPT, has revealed concerns users may develop emotional dependency on the chatbot’s forthcoming voice mode.

The GPT-4o voice mode is currently being analysed for safety ahead of a wider rollout. It enables users, to a certain extent, to converse naturally with the assistant as if it were a real person.

With that comes the risk of emotional reliance, and “increasingly miscalibrated trust” in an AI model, exacerbated by interactions with an uncannily human-like voice that can also pick up on the user’s emotions through their tone of voice.


The findings of the safety review (via Wired), published this week, expressed concerns about language that reflected a sense of shared bonds between the human and the AI.

“While these instances appear benign, they signal a need for continued investigation into how these effects might manifest over longer periods of time,” the review reads. It also says the dependence on the AI might affect relationships with other humans.

“Human-like socialization with an AI model may produce externalities impacting human-to-human interactions. For instance, users might form social relationships with the AI, reducing their need for human interaction—potentially benefiting lonely individuals but possibly affecting healthy relationships. Extended interaction with the model might influence social norms. For example, our models are deferential, allowing users to interrupt and ‘take the mic’ at any time, which, while expected for an AI, would be anti-normative in human interactions,” the document adds.

Furthermore, the review pointed out the possibility of over-reliance and dependence.

“The ability to complete tasks for the user, while also storing and ‘remembering’ key details and using those in the conversation, creates both a compelling product experience and the potential for over-reliance and dependence.”

The team said there’ll be further study of the potential for emotional reliance on the voice-based version of ChatGPT. The feature drew mainstream attention earlier this summer due to the voice’s startling resemblance to the actress Scarlett Johansson. The Hollywood star, who played an AI that its user fell in love with in the film Her, declined the offer to voice OpenAI’s assistant.

However, the voice ended up sounding suspiciously like her anyway, despite CEO Sam Altman’s insistence that it wasn’t cloned.

Source: www.trustedreviews.com
