Facebook says it will continue to run political adverts and has no plans to fact check their content – despite Google and Twitter already banning them.
The social networking giant will instead ‘improve transparency’ by making it easier for users to choose to see ‘fewer’ political and social adverts on its platform.
Ahead of the UK general election in 2019, Twitter and TikTok banned political ads, while Google banned advertisers from targeting voters based on their politics.
Rob Leathern, Facebook’s director of product management, said the firm was ‘not deaf’ to the criticism but felt banning or restricting ads wasn’t the solution.
He said the company had considered a similar approach to Google’s but decided to continue allowing advertisers to target users by political affiliation.
‘We think people should be able to see what politicians have to say,’ Facebook CEO Mark Zuckerberg said in defending the policy.
‘I don’t think it’s right for tech companies to censor politicians in a democracy.’
The company met with a range of organisations to discuss options, including non-profits, government organisations and political groups including parties.
‘In the absence of regulation, Facebook and other companies are left to design their own policies,’ Mr Leathern said.
‘We have based ours on the principle that people should be able to hear from those who wish to lead them, warts and all, and that what they say should be scrutinised and debated in public.’
Expanded transparency features will be available from the first quarter of 2020 in countries where ‘paid for by’ disclaimers are placed on political ads.
Meanwhile, political ad controls will be rolled out from early summer in the US, before eventually expanding to more locations, Facebook confirmed.
Facebook has been criticised over its content policies by politicians from the right, left and centre.
Democrats have blasted the company for refusing to fact-check political advertisements, while Republicans have accused it of discriminating against conservative views – a charge that it has denied.
The company has also been heavily criticised and fined for the part it played in the 2016 US presidential election and the harvesting of user data from the network by Cambridge Analytica.
In October 2018, Facebook agreed to pay a fine of £500,000 following an investigation into the misuse of personal data in political campaigns.
While the company isn’t banning political adverts, it says it is taking measures to reduce fake content.
Facebook says it is cracking down on deepfake videos in the lead-up to the 2020 US presidential election, in an attempt to curb the spread of misinformation.
Videos will also be banned if they are produced using AI or machine learning in a way that ‘merges, replaces or superimposes content on to a video, making it appear to be authentic’.