A young influencer couple said they were barred from boarding their flight to Puerto Rico after ChatGPT gave them the wrong visa information to enter the Caribbean island.
In a video shared by the Spanish tourists on social media, TikToker Mery Caldass appears in tears as she walks through the airport after not being allowed to travel.
As partner Alejandro Cid comforts her, Caldass tells the camera: ‘Look, I always do a lot of research, but I asked ChatGPT and they said no,’ referring to whether they needed a visa to enter the US territory.
‘I don’t trust that son of a b***h anymore’, she adds.
But through her tears, the influencer jokes that the AI tool gave them the wrong information as an act of revenge after she insulted it.
‘I don’t trust that one anymore because sometimes I insult him, I call him a b*****d, you’re useless, but inform me well…that’s his revenge’.
The video has racked up 6.1 million views on TikTok and several users have poked fun at the couple for asking ChatGPT for information instead of checking official travel advice.
‘Well, natural selection I guess. If you are going to take a transoceanic trip and you put all your trust in ChatGPT, you got off lightly,’ one user commented.

A young influencer couple were barred from boarding their flight to Puerto Rico after ChatGPT gave them the wrong visa information

Influencer Mery Caldass and her boyfriend Alejandro Cid posted a video on TikTok following their ordeal. Screen grab shows Caldass crying in an airport after being barred from their flight
‘But who trusts ChatGPT for those types of situations?’ another said.
Others came to ChatGPT’s defence, claiming the AI tool’s answer was not incorrect and that instead the couple had asked it the wrong question about the necessary documents to enter Puerto Rico.
Spanish tourists do not need a visa to enter the Caribbean island; however, holidaymakers must apply online for an Electronic System for Travel Authorization (ESTA).
The couple’s ordeal with ChatGPT comes a day after it emerged that a man had been left fighting for his sanity after following AI advice and replacing table salt with a chemical more commonly used to clean swimming pools.
The 60-year-old American spent three weeks in hospital suffering from hallucinations, paranoia and severe anxiety after taking dietary tips from ChatGPT.
Doctors revealed in a US medical journal that the man had developed bromism – a condition virtually wiped out since the 20th century – after he embarked on a ‘personal experiment’ to cut salt from his diet.
Instead of using everyday sodium chloride, the man swapped it for sodium bromide, a toxic compound once sold in sedative pills but now mostly found in pool-cleaning products.
Symptoms of bromism include psychosis, delusions, skin eruptions and nausea – and in the 19th century it was linked to up to eight per cent of psychiatric hospital admissions.

The influencer joked that the AI tool gave her the wrong information as an act of revenge after she insulted it

Pictured: Spanish influencer Mery Caldass, who has nearly 400,000 followers on Instagram

Caldass and Cid’s video has racked up over 6 million views on social media
The bizarre case took a disturbing turn when the man turned up at an emergency department insisting his neighbour was trying to poison him.
He had no previous history of mental illness.
Intrigued and alarmed, doctors tested ChatGPT themselves. The bot, they said, still recommended sodium bromide as a salt alternative, with no mention of any health risk.