Channel 4 is causing a stir for its decision to use a deepfake version of Queen Elizabeth II as an alternative to her traditional festive broadcast.

The faux version of the Queen is voiced by actress Debra Stephenson. Channel 4 describes its decision to air the deepfake as a “stark warning” and “powerful reminder that we can no longer trust our own eyes.”

Channel 4’s stated intentions did not stop people from questioning its decision.

“We haven’t seen deepfakes used widely yet, except to attack women,” said Sam Gregory, the programme director of Witness, an organization using video and technology to protect human rights, per The Guardian.


“We should be really careful about making people think that they can’t believe what they see,” Gregory added. “If you’ve not seen them before, this could make you believe that deepfakes are a more widespread problem than they are.”

The programme director argued: “It’s fine to expose people to deepfakes, but we shouldn’t be escalating the rhetoric to claim we’re surrounded by them.”

Gregory was not the only one to warn of the dangers of deepfakes. Areeq Chowdhury is a technology policy researcher who created deepfakes of politicians during the 2019 UK general election.

Chowdhury argued that people should be aware of the technology’s risks, but said the dangers of misinformation are far lower than those of deepfake pornography.

“The risk is that it becomes easier and easier to use deepfakes, and there is the obvious challenge of having fake information out there, but also the threat that they undermine genuine video footage, which could be dismissed as a deepfake,” he said.


“My view is that we should generally be concerned about this tech, but that the main problem with deepfakes today is their use in non-consensual deepfake pornography, rather than information,” Chowdhury noted.

Deepfakes expert Henry Ajder was not too worried about the Queen’s deepfake, but recommended including disclaimers to drive the point home.

“I think in this case the video is not sufficiently realistic to be a concern, but adding disclaimers before a deepfake video is shown, or adding a watermark so it can’t be cropped and edited, can help to deliver them responsibly,” he said.

“As a society, we need to figure out what uses for deepfakes we deem acceptable, and how we can navigate a future where synthetic media is an increasingly big part of our lives,” Ajder concluded. “Channel 4 should be encouraging best practice.”