This round-up of claims has been compiled by Full Fact, the UK's largest fact checking charity working to find, expose and counter the harms of bad information.
Glasgow candidate's 'illustrative' AI campaign videos do not show real events
An independent candidate in the Scottish parliamentary elections has told Full Fact that "illustrative" AI-generated videos he shared of himself meeting voters, speaking at rallies and visiting a primary school and a hospital do not show real events.
Two videos show Arzoo Waqqar Abdullah, who is standing for the Glasgow Southside constituency, carrying out campaign activities. The posts feature a disclaimer that describes each video as an "illustrative AI scene". In the original posts, however, this was only mentioned at the end of the caption, which in many cases was only visible if users clicked "see more". It is therefore possible that some voters watched these videos without seeing any warning that they are not real.
After we contacted Mr Abdullah, he removed the original posts and re-uploaded the videos with the disclaimer that each was an "illustrative AI scene" more prominently displayed at the top of the captions. We're grateful to him for engaging with us.
Mr Abdullah told Full Fact the scenes depicted in the video were "not real events", and said: "The videos were always meant to be illustrative and represent my goals – things I aspire to do – rather than past events."
We shared Mr Abdullah's original posts with the Electoral Commission – the body responsible for regulating election campaigns. It told us: "We expect anyone using AI-generated campaign material to do so in a way that does not mislead voters, and to label it clearly so that voters know how it has been created."
The Electoral Commission is currently piloting a "deepfake detection" system that "monitors online content for deepfake audio and video intended to mislead voters about the electoral process or falsely depict candidates". However, it told us that because each of the videos in question was labelled as an "illustrative AI scene", they would not have fallen under the scope of its pilot.
There are a number of clues that the videos were made using AI. For example, in the background there are buses that blend into each other, and shops with invented names or gibberish signs, or which look nothing like their real-world counterparts. Full Fact's AI tools also identified the clips as carrying SynthID, Google AI's invisible watermark. Our guide to spotting AI content, and our toolkit, can help you tell when something might not be what it seems.
Fake AI videos of Muslims praying in the street
Fake videos that appear to show Muslims praying in the middle of busy streets have been gaining traction online. The clips, which have been shared on Facebook, show people apparently praying on roads lined with UK flags, and also feature audio in the background that sounds like someone saying "Allahu Akbar" on a loudspeaker.
The videos have generated hundreds of negative comments from people who seem to believe they are real. But they are not genuine, and have been created with artificial intelligence (AI).
All four similar videos Full Fact looked at contain SynthID, an invisible watermark added to content created or edited with Google's AI tools, in both their visuals and the audio. While the presence of a watermark cannot tell us whether AI was used to completely generate something new or modify existing content, other clues suggest these videos were almost certainly wholly generated with AI. Text on the shopfronts and buses is garbled and some of the flags are distorted throughout the videos. People and vehicles also regularly glitch.