The white supremacist attacker was able to broadcast a 17-minute live stream of the attack on two mosques that was not detected by the company's systems, allowing it to be swiftly replicated online. No Facebook user complained for 29 minutes, and executives were forced to admit the company's detection systems were "not perfect". Some 1.5m uploads of the footage had to be removed in the 24 hours afterwards. At the time, Facebook admitted its AI systems had failed to prevent the broadcast, and the video was only removed after the company was alerted by New Zealand police. "It was clear that Live was a vulnerable surface which can be repurposed by bad actors to cause societal harm," the leaked review stated. "Since this event, we've faced international media pressure and have seen regulatory and legal risks on Facebook increase considerably."