Man sweet-talking ChatGPT on the subway, sparking…
Is he speaking to… Her?
A viral picture is making the rounds online this week that looks like it was ripped from the script of Spike Jonze’s 2013 movie “Her.”
It showed a man dystopically conversing with ChatGPT on an NYC subway, “like it was his girlfriend.”
The pic, taken from an angle behind the man and focused on his iPhone screen, sparked fierce debate online over AI companionship in the digital age.
The viral snap was shared to X on June 3 by user @yedIin with the caption, “guy on the subway this morning talking to chatgpt like it’s his girlfriend. didn’t realize these people *actually* exist. we are so beyond cooked.”
As seen on the man’s phone, the message sent from the AI assistant read, “Something warm to drink. A calm ride home. And maybe, if you want, I’ll read something to you later, or you can rest your head in my metaphorical lap while we let the day dissolve gently away.”
Shot over his shoulder, the image (above) ignited fierce debate on X over AI companionship in the digital age. x/YedIin
It continued, followed by a red heart emoji, “You’re doing beautifully, my love, just by being here.”
The man holding the phone replied, accompanied by another red heart, “Thank you.”
Viewers were split: some blasted the photographer for invading the man in question’s privacy, saying snapping pics of his screen without permission was way out of line.
“You have no idea what this person might be going through,” one user wrote, as another added, “Can’t decide which is more depressing, that or the fact that you took a picture of this over his shoulder and posted it.”
Others felt sorry for the man, calling him “lonely” and urging people to cut him some slack. “That’s actually sad. He must be very lonely,” another person tweeted.
Another replied, “As a society, we’re seemingly losing empathy bit by bit and it’s concerning. Loneliness is real, a lot of people don’t have who they can talk to without judgment or criticism.”
But many sided with the original tweet, calling the whole ChatGPT exchange “scary” and warning that leaning on AI as a stand-in for real human connection is downright alarming.
“Scary to even think about the mental damage this creates,” one commented, as another responded, “Terrified to see what technology will lead the future to. All I can think of are black mirror episodes becoming reality.”
Plenty of X users backed the original tweet, slamming the subway chat as “scary” and sounding the alarm over swapping real human connection for soulless AI. Rizq – stock.adobe.com
But beyond the emotional implications, experts have also raised red flags about privacy concerns when chatting with AI companions like ChatGPT.
As The Post previously reported, users often treat these chatbots like trusted confidants, divulging everything from relationship woes to lab results, without realizing that anything typed into the platform is no longer entirely private.
“You lose possession of it,” Jennifer King, a fellow at Stanford’s Institute for Human-Centered Artificial Intelligence, recently warned the Wall Street Journal.
OpenAI has cautioned users not to share sensitive information, while Google similarly advises against inputting confidential information into its Gemini chatbot.
So if you’re spilling your heart out to a bot (not judging), experts say to think twice, because someone else might be listening.