Another nation has joined the growing ranks of those saying Facebook’s content-filtering systems aren’t good enough.
Sri Lanka, faced with online content it says has spurred deadly sectarian violence, this week banned Facebook’s social media and messaging services in that country.
By forcing internet service providers to pull the plug on Instagram, WhatsApp and Facebook, the government there hopes to stem the spread of hate speech and fake news it blames for attacks on the country’s Muslim minority.
The ban followed several years of criticism that neither Facebook nor the government was doing enough to prevent the spread of such harmful posts.
The move is also a reminder of why ridding social media sites of dangerous content may be hardest here in Facebook’s own home country.
Because the U.S. Constitution protects all speech except narrow categories such as true threats and incitement to imminent violence, the government cannot arbitrarily force Facebook, YouTube (a unit of Alphabet), Twitter or other social media sites off the internet.
In an emailed statement to CNBC, Facebook said:
“We have clear rules against hate speech and incitement to violence and work hard to keep it off our platform. We are responding to the situation in Sri Lanka and are in contact with the government and non-governmental organizations to support efforts to identify and remove such content.”
A person familiar with the company’s thinking on the Sri Lanka situation told CNBC that Facebook believes restricting internet access can deprive people of an important communication tool during a crisis, and that the company hopes access will soon be restored.
The ban in Sri Lanka came the same week that Germany’s new coalition government said it may revise a recently enacted law punishing internet firms that don’t remove hate speech quickly enough.
The law, seen as a test case in Europe’s effort to rein in harmful social media content, has caused some speech to be removed unfairly, critics say.