What a pain!
Some of the UK’s most famous TV doctors are increasingly seeing their names and likenesses co-opted to sell scam products to unsuspecting social media users, new research warns.
The phenomenon is known as deepfaking — using artificial intelligence to create sophisticated digital fabrications of real people. In these faux videos, a person’s head may be superimposed onto another person’s body, or their voice may be replicated in a convincing way.

The research — published as a feature article Wednesday in the BMJ — finds that general practitioners Hilary Jones and Rangan Chatterjee, as well as the late health guru Michael Mosley, who died last month, are being used to promote products without their consent.
In Jones’ case, that means unwittingly shilling blood pressure and diabetes cure-alls and hemp gummies.
Jones, 71, who is known for his work on “Good Morning Britain,” among other TV shows, said he employs a social media specialist to scour the web for deepfake videos that misrepresent his views and tries to get them taken down.
“There’s been a huge increase in this kind of activity,” Jones said. “Even if they’re taken down, they just pop up the next day under a different name.”
It can be tricky to discern which videos are forged. Recent research finds that 27% to 50% of people cannot distinguish authentic videos about scientific subjects from deepfakes.
It may be even harder when the video features a trusted medical professional who has long appeared in the media.

John Cormack, a retired UK doctor, worked with the BMJ to try to get a sense of how widespread the deepfake doctor phenomenon is across social media.
“The bottom line is, it’s much cheaper to spend your money on making videos than it is on doing research and coming up with new products and getting them to market in the conventional way,” Cormack said in the article. “They appear to have found a way of printing money.”
Cormack said the platforms that host the content — such as Facebook, Instagram, X, YouTube and TikTok — should be held accountable for the computer-generated videos.
A spokesperson for Meta, which owns and operates Facebook and Instagram, told the BMJ that it will investigate the examples highlighted in the research.
“We don’t permit content that intentionally deceives or seeks to defraud others, and we’re constantly working to improve detection and enforcement,” the spokesperson said. “We encourage anyone who sees content that might violate our policies to report it so we can investigate and act.”
What to do if you detect a deepfake video
- Look carefully at the content or listen to the audio to make sure your suspicions are justified
- Contact the person shown endorsing the product to see if the video, image or audio is legitimate
- Question its veracity with a comment on the post
- Use the platform’s built-in reporting tools to share your concerns
- Report the user or account that shared the post
