Most Americans believe social media companies have a responsibility to remove offensive content from their platforms. But how tech firms decide which posts should be taken down is where public confidence fades.
In a survey published Wednesday by Pew Research Center, 66 percent of Americans say social networks have a responsibility to delete offensive posts and videos. But determining the threshold for removal has been a tremendous challenge for such companies as Facebook, YouTube and Twitter — exposing them to criticism that they’ve been too slow and reactive to the relentless stream of abusive and objectionable content that populates their platforms.
The screening process also has drawn accusations of selective enforcement colored by political bias, though the social networks are adamant they are politically neutral.
When it comes to removing offensive material, 45 percent of those surveyed said they did not have much confidence in the companies to decide what content to take down. And nearly 1 in 4 say they have no confidence in the companies to make these calls at all.
Pew, which broke down the survey results by political affiliation, found that this particular lack of faith in social media companies was shared on both sides of the aisle.
Among Democrats, 37 percent have confidence in the companies to determine what content should be taken down, compared with 62 percent who said they did not have too much confidence or none at all, the study found.
Republicans were even more skeptical: 23 percent of GOP supporters said they trusted the decisions of the companies to remove offensive posts, compared with 76 percent who largely lacked confidence.
Earlier this month, YouTube, the Google-owned video site, announced it would take a more aggressive stance against hate speech, including removing videos that falsely deny the Holocaust and other major historical events took place. In response to criticism that social media companies are doing too little to combat hateful ideologies that rely on their platforms to gain an audience, YouTube and its peers have begun to take a broader view of what constitutes hate speech. Critics have zeroed in on the prevalence of posts that promote discrimination, peddle conspiracy theories about world events and harm children.
On Wednesday, The Washington Post reported that the Federal Trade Commission is in the late stages of an investigation into YouTube for allegedly violating children’s privacy. YouTube declined to comment on the FTC probe. Consumer groups and youth advocates have claimed that the video site fails to filter content by appropriate age levels, exposing children to a near-endless stream of troubling content.
Hamza Shaban is a technology reporter for The Washington Post. Previously, he covered tech policy for BuzzFeed.