ChatGPT is referring to users by their names unprompted, and some find it ‘creepy’

Some ChatGPT users have noticed a strange phenomenon recently: occasionally, the chatbot refers to them by name as it reasons through problems. That wasn’t the default behavior previously, and several users claim ChatGPT is mentioning their names despite never having been told what to call them.

Reviews are mixed. One user, software developer and AI enthusiast Simon Willison, called the feature “creepy and unnecessary.” Another developer, Nick Dobos, said he “hated it.” A cursory search of X turns up scores of users confused by — and wary of — ChatGPT’s first-name basis behavior.

“It’s like a teacher keeps calling my name, LOL,” wrote one user. “Yeah, I don’t like it.”

It’s not clear when, exactly, the change happened, or whether it’s related to ChatGPT’s upgraded “memory” feature that lets the chatbot draw on past chats to personalize its responses. Some users on X say ChatGPT began calling them by their names even though they’d disabled memory and related personalization settings.

OpenAI hasn’t responded to TechCrunch’s request for comment.

In any event, the blowback illustrates the uncanny valley OpenAI might struggle to overcome in its efforts to make ChatGPT more “personal” for the people who use it. Last week, the company’s CEO, Sam Altman, hinted at AI systems that “get to know you over your life” to become “extremely useful and personalized.” But judging by this latest wave of reactions, not everyone’s sold on the idea.

An article published by the Valens Clinic, a psychiatry office in Dubai, may shed some light on the visceral reactions to ChatGPT’s name use. Names convey intimacy. But when a person — or chatbot, as the case may be — uses a name a lot, it comes across as inauthentic.

“Using an individual’s name when addressing them directly is a powerful relationship-developing strategy,” writes Valens. “It denotes acceptance and admiration. However, undesirable or extravagant use can be looked at as fake and invasive.”

In a similar vein, perhaps another reason many people don’t want ChatGPT using their name is that it feels ham-fisted — a clumsy attempt at anthropomorphizing an emotionless bot. In the same way that most folks wouldn’t want their toaster calling them by their name, they don’t want ChatGPT to “pretend” it understands a name’s significance.

This reporter certainly found it disquieting when o3 in ChatGPT earlier this week said it was doing research for “Kyle.” (As of Friday, the change seemingly had been reverted; o3 called me “user.”) It had the opposite of the intended effect — poking holes in the illusion that the underlying models are anything more than programmable, synthetic things.