Monday, May 29, 2017
Charlotte Silver-24 May 2017

A US federal court in New York has ruled that Facebook cannot be held liable for its users’ activity.
Israeli lawfare group Shurat HaDin filed a $1 billion lawsuit last July, alleging that Facebook violates the Anti-Terrorism Act by serving as a platform for Hamas and “knowingly … facilitat[ing] this terrorist group’s ability to communicate, recruit members, plan and carry out attacks and strike fear in its enemies.”
Shurat HaDin has strong links to Israel’s Mossad spy agency.
In dismissing the case on 18 May, US District Judge Nicholas Garaufis upheld Facebook’s argument that the company was protected by the Communications Decency Act, which says certain internet services are not liable for content created by a third party.
The immunity this law extends to Facebook is based on the idea that such internet companies are not the publishers or speakers of the content under question, and therefore cannot be held liable for its alleged harm.

New legal strategy fails

Shurat HaDin’s case was filed on behalf of the American families of five victims of attacks allegedly carried out by Hamas – four who died and one who survived.
The case was among the first to invoke the Anti-Terrorism Act to argue that an internet company is liable for its users’ activity.
The Anti-Terrorism Act allows US citizens to file civil suits against organizations and people accused of providing material support to groups designated as “foreign terrorist organizations” by the US government.
In order to invoke the law, plaintiffs must show that the alleged material support was in the service of a designated terrorist group. The US designated Hamas as a terrorist organization in the 1990s.

Earlier lawsuit also dismissed

In the same ruling, Judge Garaufis also dismissed an earlier lawsuit filed by Shurat HaDin in 2015 on behalf of 20,000 Israelis, who claimed they were threatened by potential violence that was being planned or incited on Facebook.
The court dismissed this lawsuit on the basis that the plaintiffs failed to establish an actual injury that could be connected to Facebook’s conduct.
Nitsana Darshan-Leitner, the founder of Shurat HaDin, said her group will appeal the ruling.
“We believe there is a fatal mistake in their ruling because the court totally did not address the issue of aiding and abetting terrorism,” she said following the ruling last week.
In its lawsuit, Shurat HaDin claimed Facebook’s approach to regulating content was inconsistent and piecemeal, arguing that Hamas should not be allowed to operate any pages on Facebook.

Censorship concerns

Facebook has been the target of an intense campaign by the Israeli government. Last year, public security minister Gilad Erdan charged that the blood of Israelis killed by Palestinians in violence related to Israel’s military occupation in the West Bank is “on the hands of Facebook.”
Along with its lawsuits, Shurat HaDin sponsored a campaign, which included plans for a billboard opposite Facebook CEO Mark Zuckerberg’s home, personally implicating him in conflict-related deaths.
Over the weekend, The Guardian published leaked information on how Facebook monitors and censors activity on its social media platform. The exposé portrays the company as struggling to implement consistent censorship rules for categories of content that can be hard to define, including so-called terrorism, pornography and threats of violence.
The Guardian’s report suggests “terrorism” is one of the trickiest topics for Facebook to monitor.
The company has been accused of censoring political activity, especially by Palestinians and Kashmiris, in the name of clamping down on terrorist recruitment and propaganda.
Facebook has defended its censorship of content related to Kashmir, where the Indian government is waging a brutal military campaign against protesters, by stating: “There is no place on Facebook for content that praises or supports terrorists, terrorist organizations or terrorism.”
Facebook is reportedly developing an artificial intelligence mechanism “to distinguish between news stories about terrorism and actual terrorist propaganda.”