
Facebook photo of WDBJ7 reporter Alison Parker, 24, and cameraman Adam Ward, 27.

Shortly after 11 a.m. on Wednesday, somebody on Twitter under the handle @Bryce_Williams7 wrote: "I filmed the shooting see Facebook."

Attached to the tweet was a 56-second video showing Virginia-based reporter Alison Parker and cameraman Adam Ward conducting a live interview with a local official. A shooter approaches and raises a handgun. Shots are fired, killing the two employees of television station WDBJ.

The graphic, GoPro-style video, filmed from the shooter's eye level, ushered in a horrific new chapter in social media: an audience not only learned of gun violence but could, almost immediately, watch video of it filmed by the gunman and spread far and wide across all manner of platforms. In many cases, users couldn't avoid the video, which appeared in their timelines and often began playing without an active click.

Police say the suspect, Vester Lee Flanagan II, who once worked on air at WDBJ under the name Bryce Williams, later died of a self-inflicted gunshot wound.

The distribution of a video – evidence of a crime posted by a suspect – raises a host of thorny issues about the role of social media, the value of violent images and the ability of companies and users to set and maintain content standards on privately owned platforms.

The @Bryce_Williams7 Twitter account was suspended mere minutes after the video was posted; the Facebook page that also hosted the video was removed a few minutes later. Neither was fast enough to stop users from downloading the clips, which were then reuploaded on dozens of Twitter accounts, YouTube pages, Facebook accounts, Reddit posts and many other sites.

"Social-media sites are popular because they allow for content to be distributed virally," said Anatoliy Gruzd, director of the Social Media Lab at Ryerson University. "It's really hard to come up with policies for all possible violations. Something that's considered violent for one group could be educational for another group."

He said the platforms face similar struggles with footage from natural disasters, which often shows dead victims, and from bloody civil conflicts such as the Maidan protests in Ukraine.

Some media outlets have questioned the ethics of auto-play, in which videos posted to the Facebook or Twitter apps begin rolling as soon as users scroll past them, with no deliberate action on the user's part.

Dozens of users have expressed disgust at seeing the Virginia shooting video appear in their timelines, and many have disabled the auto-play feature on their devices.

YouTube has also struggled to remove the shooting videos posted by its users. "Our hearts go out to the families affected by this terrible crime," the Google-owned company said in a statement. "YouTube has clear policies against videos of gratuitous violence and we remove them when they're flagged."

And yet a simple YouTube search of the shooter's name turned up dozens of copies of the video, some of which had been online for hours.

The site's posted policy says: "It's not okay to post violent or gory content that's primarily intended to be shocking, sensational, or disrespectful." But it also goes on to offer something of a dispensation for such content: "If posting graphic content in a news or documentary context, please be mindful to provide enough information to help people understand what's going on in the video."

Facebook is more declarative: "We remove graphic images when they are shared for sadistic pleasure or to celebrate or glorify violence." That also means Facebook has broad leeway to decide what is sadistic or glorifying, and what is not.

Reddit's policies on prohibited content are black and white when it comes to spam, so-called revenge porn and impersonations, but graphic violence is not explicitly banned unless it incites others to violence.

Mr. Gruzd said better video-and-image analysis might help platforms crack down on content they wish to ban, but even then the potential for false positives and overpolicing remains high. It's not clear how an algorithm could tell the difference between a killer's attempt to document his crimes and footage from a police body camera that captures a shooting of great public interest, such as the death of Samuel DuBose, shot by a University of Cincinnati police officer in July.

"Citizens might want to have access to a copy of that video, versus to be censored by a certain authority," Mr. Gruzd said. "The same feature that can help democracy can hurt moral society."

Spokespeople for WDBJ have asked other outlets not to use or share the video of their co-workers' deaths, but broadcasters have aired footage captured on Mr. Ward's camera, including screen grabs of the alleged shooter.

"Other than the astonishing nature of the video, it adds little information about what happened. The facts are clear without using it," wrote Al Tompkins, senior faculty member for broadcasting and online at the Poynter Institute for journalism.

Some users on Twitter have suggested that, since anyone with a social-media account can be a publisher, everyone ought to consider the ethics that traditional publishers have tried to apply to graphic content.

For instance, the code of ethics of the U.S.-based Society of Professional Journalists states: "Avoid pandering to lurid curiosity, even if others do."
