Video-sharing platform TikTok has said that it removed 43,000 pieces of content for violating its misinformation policies in the four weeks leading up to and during the European elections in June.
The company said it also removed over 2,600 pieces of content for violating its civic and election integrity policies.
According to TikTok, over 96% of the violative misinformation content was removed before it was reported, and over 80% was removed before receiving a single view.
"Some of the misinformation narratives we observed included narratives around migration, climate change, security and defence and LGBTQ rights," TikTok said.
The company established a dedicated 'Mission Control Centre' in its Dublin office to tackle the spread of disinformation ahead of the elections.
It also launched a dedicated in-app Election Centre for each EU Member State.
"In Ireland, this was visited more than 104,000 times and search banners were viewed by more than 912,000," TikTok said.
"The rise of AI-generated content more generally has created new challenges for the industry ahead of the many elections taking place this year," it added.
Last month, an investigation by campaign group Global Witness found that TikTok approved 16 advertisements targeting Ireland that contained election disinformation.
The group submitted adverts containing false information encouraging people to vote online and by text; false information around the voting age; and incitement of force against immigrant voters.
TikTok approved all of the ads for publication, but Global Witness withdrew them before they could be seen by users.
In response to the investigation, TikTok said that while all 16 ads violated its advertising policies and were correctly flagged by its systems, they were approved at a subsequent review stage due to human error by a moderator, who has since received retraining.
The company said it had instituted new practices for moderating political ads.