The world’s biggest social-media companies, under fire for failing to police content on their sites, have invited an array of outside groups to help them figure out who should be banned and what’s considered unacceptable.

That solution is creating a new set of problems — public fights, complaints and legal battles.

Facebook Inc., Twitter Inc. and Google’s YouTube unit have made a concerted push to seek out input from hundreds of groups, a growing number of which lean to the right. The companies have become receptive to behind-the-scenes lobbying as well.

Among the initiatives, Facebook has privately sought advice from the Family Research Council, a conservative Christian public-policy group, and its president, Tony Perkins, according to people familiar with those meetings. Twitter’s Chief Executive Jack Dorsey recently hosted dinners with conservatives, including Grover Norquist, the founder and president of Americans for Tax Reform, which advocates for lower taxes. Advisers on the left include the Southern Poverty Law Center, a civil-rights group that keeps a list of hate groups.

For users frustrated by the lack of clarity around how these companies make decisions, the added voices have made matters even murkier. Meetings between companies and their unofficial advisers are rarely publicized, and some outside groups and individuals have to sign nondisclosure agreements.

And in many cases, posts that are hateful to one group are considered fair game — or even uncomfortable truths — to others on the opposite end of the spectrum, opening a whole new arena to continue the political and ideological fights that are often a staple of social media.

When Twitter executives struggled with whether to follow other Silicon Valley companies and remove conspiracy theorist Alex Jones from the platform in August, Dorsey privately sought counsel from Ali Akbar, a conservative political activist, Akbar says.

Akbar advised Dorsey against kicking off Jones, despite pressure from users and Twitter employees. Akbar argued that Jones hadn’t violated any of the site’s rules — a point Dorsey also made when he explained the decision in a Twitter post. Dorsey didn’t disclose Akbar’s involvement.

“It’s important that Jack sought a right-of-center perspective which cannot be found at Twitter,” Akbar says. “Jack was brave.”

Twitter ultimately banned Jones about a month later, citing a violation of its abusive-behavior policy.

Akbar says in 2018 he also complained to Dorsey about potential discrimination against a survivor of the shooting at Marjory Stoneman Douglas High School in Parkland, Fla., who was in favor of gun rights. The student wasn’t “verified” on Twitter — a badge the company gives to accounts it deems of public interest — while several other survivors of the shooting who favored more gun control were given the recognition.

After Akbar’s intervention, the student’s account was verified, Akbar says.

Twitter spokesman Brandon Borrman says the company and its executives personally maintain many outside relationships “to help us benefit from other perspectives on the critical societal issues we deal with.” He says outsiders “never override our rules and no outside adviser makes the ultimate decision or dictates our actions,” and that Twitter is working to be more transparent on the outsiders involved in its process.

On the Alex Jones decision, Borrman says Dorsey “did not and does not personally make enforcement decisions, he defers to the deep expertise of the team.”

The reliance on outside opinions goes along with other initiatives tech companies have launched to build their defenses. Companies have added complex internal guidelines on what kinds of posts should be banned and hired thousands of new employees to review content.

YouTube has expanded its “trusted flaggers” program — groups the company asks to point out inappropriate content on the site — from 10 groups to more than 100 between 2017 and 2018. Twitter’s Trust and Safety Council includes about 48 organizations around the world.

Facebook says it now consults with hundreds of organizations after it decided late last year to seek more outside counsel on issues like hate speech and misinformation — broadly known as “content moderation issues.”

The tech companies have found themselves in an impossible situation, given the billions of posts that are generated each month and the conflicting agendas of their users, says Klon Kitchen, who manages tech policy for the conservative Heritage Foundation. The foundation has recently forged a relationship with Facebook.

Kitchen has advised the company that these issues are not likely to ever go away. “These are problems you manage, not problems you solve.”

Peter Stern, who handles Facebook’s outside engagement efforts from the company’s Menlo Park, Calif., headquarters, says the company now seeks advice from up to a dozen outside groups on each policy decision it makes on its platform. He declined to say which groups are consulted.

“If we change the policy, we’re going to hear about it, so we might as well involve them,” Stern says. “We had been doing it, but not in a systemized way.”

Adam Thierer, a senior research fellow at the right-leaning Mercatus Center at George Mason University, says he used to consult with Facebook and other tech companies. The futility of trying to please all sides hit home after he heard complaints about a debate at YouTube over how much skin could be seen in breast-feeding videos.

While some argued the videos had medical purposes, other advisers wondered whether videos of shirtless men with large mammaries should be permitted as well.

“I decided I don’t want to be the person who decides on whether man boobs are allowed,” says Thierer.

Brian Amerige, a former Facebook senior engineering manager, resigned in October after seven years at the company, in part because he objected to how it decides which content is considered objectionable.

Amerige says he felt Facebook was trying to avoid allowing anything controversial on the platform, and hampering free speech in doing so. The move to involve more outside groups — conservative or liberal — is in his opinion only making things worse.

“What happens when you have an undefinable principle and you defer to other people? It becomes a bunch of one-off decisions,” he says.

A Facebook spokeswoman declined to comment.

While outside groups are technically unpaid, the tech companies contribute to some of the organizations they are seeking out for guidance. Alphabet Inc.’s Google contributes to more than 200 third-party groups, including the Heritage Foundation, National Cyber Security Alliance, and Americans for Tax Reform, according to the company. Facebook and most other companies don’t disclose their donations to outside groups.

Executives see the outreach to a cross-section of groups in part as a form of political protection, to defend against the allegation that they are biased against conservatives, a complaint lodged repeatedly last year by President Donald Trump and Republican lawmakers. Some of the conservative groups tapped recently by tech platforms complain that the companies defer too closely to the Southern Poverty Law Center when defining what constitutes hate speech.

Many companies and other groups rely on the center’s list of hate groups, which counts nearly 1,000 across the U.S., according to its website. The center also writes about some of those groups on its “Hatewatch” blog.

Keegan Hankes, a senior research analyst at the SPLC, says the group lobbies tech platforms to remove content it considers hate speech, such as when it successfully asked Facebook to remove content posted by the League of the South, a neo-Confederate group.

A spokesman for League of the South didn’t respond to requests for comment.

Several of the liberal-leaning organizations that Facebook works with turned against the company in December, sending a letter that criticized the company for its role in “generating bigotry and hatred towards vulnerable communities and civil rights organizations,” according to a copy of the letter from groups including the NAACP and the Interfaith Center on Corporate Responsibility.