
Social-media platforms were used like lethal weapons in New Zealand - that must change now

Right from the twisted start, those who plotted to kill worshipers at two New Zealand mosques depended on the passive incompetence of Facebook, YouTube and other social-media platforms.

They depended on the longtime priorities of tech giants that, for years, have concentrated on maximizing revenue, not on protecting safety or decency.

They got it.

Many hours after the massacre, a horrific 17-minute video - showing a man in black shooting with a semiautomatic rifle at those running from mosques and shooting into piles of bodies - could still be easily accessed on YouTube.

My colleague, Washington Post tech reporter Drew Harwell, summed up the social-media disaster succinctly in a tweet: "The New Zealand massacre was live-streamed on Facebook, announced on 8chan, reposted on YouTube, commented about on Reddit, and mirrored around the world before the tech companies could even react."

It gets worse. The brutality that killed at least 49 people and wounded many others was fueled and fomented on social media - inviting support and, no doubt, inspiring future copycats.

One of the suspects had posted a 74-page manifesto railing against Muslims and immigrants, making it clear that he was following the example of those like Dylann Roof, who in 2015 murdered nine black churchgoers in Charleston, South Carolina.


All of it ricocheted around the globe, just as planned.

The platforms, when challenged on their role in viral violence, tend to say that there is no way they can control the millions of videos, documents and statements being uploaded or posted every hour around the world. When they do respond, it is often with agonizing slowness and far too late.

And they insist on presenting themselves not as media companies with some sort of gatekeeping or editing responsibility, but as mere platforms - places for their billions of users to do pretty much what they wish.

To the extent that the companies do control content, they depend on low-paid moderators or on faulty algorithms. Meanwhile, they put tremendous resources and ingenuity - including the increasing use of artificial intelligence - into their efforts to maximize clicks and advertising revenue.

This is far from the first time acts of violence have been posted in real time. Since Facebook launched its live-video tool in 2015, it has been used to simulcast murder, child abuse and every sort of degradation.

But the tragedy in New Zealand takes this dangerous - and largely untended - situation to a new level that demands intense scrutiny and reform.

Granted, there are tough issues here, including those involving free speech and the free flow of information on the internet.

Reddit, for one, often takes the view that its users deserve to be treated like grown-ups, to see what they want to see.

As its representatives closed down a thread called "watchpeopledie" on Friday, where users had commented on the massacre video, they sounded regretful:

"The video is being scrubbed from major social-media platforms, but hopefully Reddit believes in letting you decide for yourself whether or not you want to see unfiltered reality," their post said. "Regardless of what you believe, this is an objective look into a terrible incident like this."

Where are the lines between censorship and responsibility?

These are issues that major news companies have been dealing with for their entire existences - what photos and videos to publish, what profanity to include.

Editorial judgment, often flawed, is not only possible. It's necessary.

The scale and speed of the digital world obviously complicate that immensely. But "We can't help it" and "That's not our job" are not acceptable answers.

Friday's massacre should force the major platforms - which are really media companies, though they don't want to admit it - to get serious.

As violence goes more and more viral, tech companies need to deal with the crisis that they have helped create.

They must figure out ways to be responsible global citizens as well as profit-making machines.

The views expressed here are the writer's and are not necessarily endorsed by the Anchorage Daily News, which welcomes a broad range of viewpoints.

Margaret Sullivan

Margaret Sullivan is The Washington Post's media columnist. Previously, she was the public editor of The New York Times and the chief editor of The Buffalo News, her hometown paper.
