Facebook, Twitter and The New York Times all have different business models and missions. But this week, all three companies were once again rocked by a persistent question they have struggled to answer for years: how to balance free speech and political neutrality.
Facebook Chief Executive Mark Zuckerberg has tried to keep his platform agnostic toward ideological clashes by allowing most speech, including President Donald Trump’s post this week that “when the looting starts, the shooting starts.” Twitter took a different approach, hiding the tweet behind a warning that it “glorifies violence.” The New York Times, meanwhile, decided to run an op-ed by U.S. Senator Tom Cotton (R-Ark.) that called for the federal government to send in the military to suppress protests, in order to help “provide a debate on important questions.”
All three companies were criticized for their decisions. Hundreds of Facebook employees staged a virtual walkout on Monday. President Trump called out Twitter for hypocrisy, accusing the company of targeting “Republicans, Conservatives & the President of the United States” and asking for “Section 230 to be revoked,” an allusion to the provision in the Communications Decency Act that protects technology companies from legal liability for user content. More than 800 New York Times staff members signed a letter opposing the publication of Cotton’s op-ed Thursday, prompting the Times to issue a statement blaming “a rushed editorial process” for its publication.
The underlying problem has been lingering for years: there are no clear guidelines for how to deal with controversial political speech, especially when messages disseminate false or unverified information.
Right now, company execs and editors make decisions on an ad hoc basis, flitting from crisis to crisis.
One solution, according to Jeff Jarvis, a professor at the City University of New York (CUNY) Graduate School of Journalism, is for platforms to define their “north stars” — guiding principles that inform and dictate the type of conversation they expect from participants.
“One-size-fits-all regulation doesn’t work,” Jarvis said in an interview. “Platforms need to set their own standard and then be held accountable to whatever standard they set. Reddit is different from Facebook, which is different from Twitter. If Facebook says, we expect a respectful humane conversation, they’re going to be expected to do that. If it says, nope, actually, anything and everything goes, they’ll be accountable for that. The problem is Facebook hasn’t established a clear north star.”
The lack of a defined north star also led The New York Times to stumble with the Cotton op-ed, Jarvis said. Initially, the Times defended the decision as an effort to ensure its readers had access to all types of opinions.
“It would undermine the integrity and independence of The New York Times if we only published views that editors like me agreed with, and it would betray what I think of as our fundamental purpose — not to tell you what to think, but to help you think for yourself,” wrote New York Times editorial page editor James Bennet in a column Thursday.
The Times’s problem, Jarvis said, is that manufacturing debate by ensuring different viewpoints are heard, including those that some viewed as fascist, is overly simplistic and does a disservice to the newspaper’s readers.
“Again, what is The New York Times’ north star?” Jarvis said. “Neutrality is a myth. Journalism cannot be objective but we must be transparent. We get in trouble all the time when we try to chase the god of objectivity to balance neutrality. I think that’s what happened at The Times.”
Europe tests government oversight
Several countries in Europe have turned to government regulation.
Germany passed a law in 2018 mandating that companies operating social media platforms remove “obviously illegal” content within 24 hours or face fines of up to 50 million euros. France upped the ante with a law, effective July 1, that gives social media platforms just one hour to remove extreme content and 24 hours to eliminate hate speech.
Still, there is so much ambiguity around hate speech and disinformation, especially when it comes from world leaders, that government regulation alone won’t settle how such decisions should be made.
Journalism schools could be the natural place to develop standards that news organizations could adopt to guide decision-making. But more than three years into the Trump presidency, which has been dominated by an astounding number of lies from the president and his surrogates and consistent responses of “fake news,” schools do not appear to be coalescing around any kind of standard.
“While we have had informal discussions around how to treat the false utterances of public figures (political or otherwise), we’ve not codified those discussions into any form of style guide or recommendations, and I am not aware that other schools are either,” said Charles Whitaker, dean of Northwestern’s Medill School of Journalism, in an email.
The onus, then, falls on technology platforms and journalism organizations to devise their own standards. Defining those standards to embrace potential biases, rather than avoiding them, would do the world a service, and would not lead to the breakdown of democracy through isolated factions, one of the fears of 19th-century historian and political scientist Alexis de Tocqueville, Jarvis said.
“The filter bubble doesn’t exist,” said Jarvis. “People online are more aware of opponents’ arguments than those offline.”
Producing media and technology platforms that serve particular social and political needs extends the conversation rather than limits it, said Jarvis. The imperative is for media companies to fill community gaps by focusing on niches rather than trying to be all things to all people. This also makes sense from an economic viability perspective: with defined “north stars,” news organizations can tailor conversations and content to underserved communities.
“Monopoly mass media no longer needs to serve everyone,” Jarvis said. “There should be more conservative networks, whatever that means today, than just Fox News. We need to have outlets that better serve those that are left out of mass media — African Americans, Latino Americans, LGBTQ Americans, and so on. I don’t think it comes from a single publication. I think it comes from the collection of the conversation.”