Met Police and Facebook unite to fight live streaming of terrorism

The project will begin in October

Tuesday, September 17, 2019

Facebook and the Metropolitan Police have joined forces to improve the site’s ability to detect live streaming of terrorism and potentially alert officers about attacks sooner.

The social media giant will provide body cameras for officers at the force’s firearms training centres to try to help its artificial intelligence (AI) system identify videos of incidents more quickly and accurately.

With more first-person footage of violent events to learn from, the AI should find it easier to spot violating content and remove it.

Facebook came under fire for the spread of a live-streamed video of the New Zealand mosque shootings in March, which left 51 people dead.

The video was viewed fewer than 200 times during its live broadcast but remained on the site and was watched around 4,000 times in total before being removed.

Facebook said it did not have enough first-person footage for the system to match the Christchurch video against.

The company has therefore approached the Met to expand the pool of images used to train its machine learning tools.

The UK’s counter-terrorism chief has welcomed the efforts.

Neil Basu said the technology could help improve police response times, as well as prevent the “glorification” of acts of terrorism and the “promotion of the toxic ideologies that drive them”.

The project - which is part of a global effort and also includes Instagram - will begin in October.

Read more

Businessman jailed for sharing Christchurch shooting video

Facebook revamps livestreaming policy after Christchurch attack