Facebook feed stripped of legal protections

Democratic lawmakers want Facebook’s News Feed stripped of legal protections, making social networks legally liable when they recommend harmful content to users.

Congress is again seeking to limit Section 230 safeguards in certain circumstances, out of concern that tech platforms intentionally amplify harmful material, and wants companies held liable for that harm.

A group of House Democrats introduced the Justice Against Malicious Algorithms Act, a bill that would modify the safeguards under Section 230 to exclude personalized recommendations of content that contributes to severe physical or emotional injury.

The bill follows a recommendation that whistleblower Frances Haugen made to Congress last week.

Haugen, a former Facebook employee who leaked extensive internal research, encouraged lawmakers to crack down on algorithms that promote or otherwise rank content based on user engagement.

The bill applies to web services with more than 5 million visitors per month. It exempts certain categories of services, including infrastructure services such as web hosting, as well as systems that display search results.

For covered platforms, the bill targets Section 230 of the Communications Decency Act, which shields web services from lawsuits over third-party content posted by users.

The new exception would allow such lawsuits to proceed if a service knowingly or recklessly used a personalized algorithm to recommend third-party content, which may include posts, groups, accounts, and other user-provided information.

The bill would not necessarily allow people to sue over the kinds of material Haugen has criticized, which include hate speech and anorexia-related content.

Much of that content is legal in the United States, so platforms would face little liability for hosting it even without Section 230’s protection.


Facebook may face upcoming legal difficulties

The bill also covers personalized recommendations, defined as ranking content with an algorithm that relies on information specific to an individual. Companies could apparently still use aggregate analytics to recommend the most popular public content.
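To make the distinction concrete, here is a minimal, purely illustrative sketch of the two ranking approaches the bill separates: aggregate popularity ranking, which orders content the same way for everyone, and personalized ranking, which uses information specific to an individual user. The data, function names, and scoring are hypothetical, not taken from the bill or any platform's actual system.

```python
# Illustrative posts with aggregate engagement counts (hypothetical data).
posts = [
    {"id": "a", "topic": "sports", "likes": 120},
    {"id": "b", "topic": "cooking", "likes": 300},
    {"id": "c", "topic": "sports", "likes": 50},
]

def rank_by_popularity(posts):
    """Aggregate ranking: uses only engagement totals, so every
    user sees the same order. Arguably outside the bill's scope."""
    return sorted(posts, key=lambda p: p["likes"], reverse=True)

def rank_personalized(posts, user_interests):
    """Personalized ranking: boosts topics tied to this specific
    user's profile, the kind of ranking the bill targets."""
    def score(p):
        # Hypothetical boost factor for topics the user follows.
        boost = 3.0 if p["topic"] in user_interests else 1.0
        return p["likes"] * boost
    return sorted(posts, key=score, reverse=True)

print([p["id"] for p in rank_by_popularity(posts)])             # ['b', 'a', 'c']
print([p["id"] for p in rank_personalized(posts, {"sports"})])  # ['a', 'b', 'c']
```

The same posts come out in a different order once per-user data enters the scoring function, which is the line the bill draws between protected and unprotected recommendations.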

In her testimony, Haugen argued that the goal is to add enough legal risk that Facebook and similar companies stop using personalized recommendations.

“If we fix Section 230 to make Facebook responsible for the consequences of its intentional ranking decisions,” she said, “I think it would get rid of engagement-based ranking.”

