#Logang4life: Online Virality and Virtual Witch Hunts


By Georgia Griffiths

First came the backlash. Thousands upon thousands of tweets called for Paul to be removed from YouTube. The more vulgar called for his head on a platter, while some took the opportunity to publicise suicide prevention resources – the exact causes Paul purported to be drawing attention to through his video. The video was taken down within a few days, and Paul issued a somewhat non-apology apology in a tweet:

“I do this shit every day. I’ve made a 15 minute TV show EVERY SINGLE DAY for the past 460+ days. One may understand that it’s easy to get caught up in the moment without fully weighing the possible ramifications[2].”

Paul pointed to the fact that the video hadn’t been monetised, meaning he wouldn’t make money from ads on that video, as evidence of his altruistic purpose. In his eyes, it was a misguided mistake. In the eyes of the broader Internet, it shouldn’t have been one he recovered from.

YouTube’s policies, however, in combination with decisions from its upper management, have allowed Paul to continue his career on the platform relatively unscathed. YouTube has a range of guidelines that creators must, supposedly, adhere to. It operates on a “three-strike” system, under which creators who receive three strikes for inappropriate content within three months are banned. As it currently stands, the Policy Centre section entitled “Violent or Graphic Content” contains this key paragraph:

“It’s not okay to post violent or gory content that’s primarily intended to be shocking, sensational or gratuitous. If a video is particularly graphic or disturbing, it should be balanced with additional context and information…In some cases, content may be so violent or shocking that no amount of context will allow that content to remain on our platforms.”

It is likely that Paul’s video was “primarily intended to be shocking, sensational” and “gratuitous”, and very likely that the footage wasn’t balanced with enough “additional context and information” to justify YouTube’s decision not to remove it. Logan and his friends appear to stumble upon the body in the vlog-style video, with minimal warning, and Logan both shows the body on-screen and describes what may have happened. He turns back to the body multiple times while saying, “I’m so sorry Logang, this was supposed to be a fun vlog”. There is a brief mention of how devastating suicide can be after the body is shown, but no reference to any support services. In fact, it could be argued, especially given the young demographic of Paul’s viewers, that the content may have been “so violent or shocking that no amount of context” would be suitable for it to remain on the platform. The video was reported thousands of times, but it was manually reviewed by YouTube and left up without so much as an age restriction. In the end, it was removed by Paul himself, but not until it had been viewed more than six million times.

There’s no doubt that Paul’s monolithic position on the site means he gets special privileges. As much as YouTube tries to deny it, Paul generates huge amounts of revenue for both himself and the site, and to severely punish him would be to lose a key moneymaker. After waiting nine days to comment on the incident, YouTube tweeted a vague response stating “the channel violated our community guidelines, we acted accordingly, and we are looking at further consequences.” Paul was eventually given one strike. In contrast, a member of YouTube’s ‘Trusted Flagger’ program reported that smaller accounts which re-uploaded the footage after its deletion were automatically given strikes[3]. It wasn’t until Paul later uploaded videos of tasering dead rats and taking a live fish out of a pond that YouTube management temporarily suspended his ad privileges. It seems that online, as within most of Western society, money and fame can give you a free pass.

The Logan Paul incident highlights a key issue for social media platforms: where is the line to be drawn between protecting content consumers, particularly those of a younger demographic such as Paul’s viewers, and creating an inappropriately censored environment? To what degree are companies like YouTube, Facebook, and Twitter even responsible for the content posted on their sites in the first place? Many platforms argue that it’s not their role to police user-generated content. Facebook, among others, frequently refers to itself as a “technology company”, similar to Microsoft or Apple, as opposed to a “media company” like BuzzFeed. By positioning themselves as mere “disseminators of information” rather than “media outlets”, social media companies aim to protect themselves from the predictable onslaught in the wake of questionable content becoming available on their platforms. Paul himself, as an example of a creator responsible for “questionable content”, takes a similar view, suggesting that the onus to protect vulnerable viewers falls to sources outside the platform, such as parents. “I’m going to be honest… I think parents should be monitoring what their children are watching more,” he told Good Morning America in the aftermath of the Aokigahara video[4].

The risk we run in giving responsibility over user-generated content to platform owners is systematic, or (possibly more insidious) accidental, censorship. Just last year, The New York Times reported on the automatic removal of videos documenting the atrocities occurring in Syria, a move that could “potentially jeopardiz[e] future war crimes prosecutions[5].” This was an inadvertent side effect of YouTube’s efforts to stop militant propaganda from being posted on the site, through the introduction of algorithms that remove content in breach of the company’s guidelines without it first being flagged by users.

Yet, despite their hesitations, social media platforms are clearly starting to take some responsibility for filtering uploaded content. This is demonstrated, for example, in YouTube’s attempt to remove extremist content. Facebook has also recently begun removing any live streams containing graphic or violent content, after a spate of live-streamed murders cast light on the unpredictability of the medium. There remains an argument, however, that despite these efforts it is simply impossible for platforms to keep track of the huge volumes of content being created each day. According to Forbes Magazine, four hundred hours of content were uploaded to YouTube every minute in 2017. Given the impossibility of human moderators assessing all uploaded content, the platforms rely both on algorithmic methods, which are evidently far from perfect, and on users to report inappropriate content.

Humans reviewing content are, of course, also prone to mistakes. Combined with our increased ability to connect with other people, this fallibility means metaphorical online “witch hunts” are becoming more visible and more problematic every time someone offends. The concept of a “witch hunt” is obviously not new, and it is not confined to the cybersphere; the advent of social media, however, has allowed instances that may otherwise have stayed within limited groups to be broadcast across the world, meaning anyone with a passing interest can comment on and dissect the actions or words of others. In some contexts, this can be valuable – having the ability to report images that are genuinely offensive, inappropriate, and/or distressing is not an inherently bad thing. The issue arises when misinformed or uninformed parties throw their opinions into the ring as if they were truth.

Not knowing all the facts of a case is not always entirely the fault of the participants. The Internet news cycle tends to prioritise the juiciest parts of a story, those that will garner the most clicks and follows. Similarly, the ‘offending’ content may not be offensive to all, but those who do find it offensive might be the ones who can, or do, shout the loudest. At its core, whether material is offensive or not is an individual’s subjective moral decision. Some things, such as showing a dead body online like Paul did, are generally recognised within Western society to be offensive and inappropriate, but it isn’t always so clear cut. This is where reasonably justifiable online outrage morphs into a “witch hunt”.

For Rebecca Tuvel, an assistant professor of philosophy at Rhodes College, “mob justice” was delivered swiftly. In 2017, Tuvel wrote an article for Hypatia, a feminist journal, considering, as summarised by Justin Weinberg of the Daily Nous, whether social frameworks which “support accepting transgender individuals’ decisions to change sexes…provide support for accepting transracial individuals’ decisions to change races”[6]. While that topic is undeniably controversial, the backlash directed at Tuvel was harsh and disproportionate, considering that she was merely exploring a theory in an academic journal, supposedly the environment in which all theories can be rigorously tested. According to Kelly Oliver, from Vanderbilt University, “some academics supported Tuvel in private while actually attacking her in public… [while others] were pressuring, even threatening, Tuvel that she wouldn’t get tenure and her career would be ruined if she didn’t retract her article[7].” The attacks on Tuvel included comments by other academics on social media sites, and an open letter to Hypatia seeking that the article be retracted[8]. This case, while largely remaining confined to academic circles, is a clear example of how a “witch hunt” can seriously damage an individual’s life. Googling Tuvel’s name reveals a first page of results largely dedicated to reports on and analysis of the article and the backlash that followed. While Paul may be able to overcome that kind of backlash, as his massive following is primarily among children, it’s likely that the circus around Tuvel’s 2017 article will follow her forever. Justifiable backlash or not, there’s a strong argument to be made for the right to a fair trial, as opposed to lifelong punishment at the hands of a group of mostly uninformed strangers online.

Ultimately, these questions will keep re-emerging as social media platforms continue to grow and evolve. Should “justice” be left to the online mob? Or should we trust huge corporations to deliver appropriate sanctions to those who violate social standards online? There will be more Logan Pauls. There will be more Rebecca Tuvels. The highs and lows of online virality will reappear just as soon as the next online spectacle occurs.


  1. Koerber, Brian, Logan Paul Wishes His Fans Were Older. They’re Not, Mashable Australia, 2 February 2018, <https://mashable.com/2018/02/01/logan-paul-young-fans-demograpphic/>
  2. Paul, Logan, Tweet dated 1 January 2018, <https://twitter.com/LoganPaul/status/948026294066864128>
  3. Titcomb, James, YouTube moderators ‘approved’ Logan Paul’s dead body video, The Telegraph, 3 January 2018, <https://www.telegraph.co.uk/technology/2018/01/03/youtube-moderators-approved-logan-pauls-dead-body-video/>
  4. Farokhmanesh, Megan, Logan Paul tells Good Morning America his controversial video was intended to show the ‘harsh realities’ of suicide, The Verge, 1 February 2018, <https://www.theverge.com/2018/2/1/16959022/logan-paul-suicide-video-apology-good-morning-america>
  5. Browne, Malachy, YouTube Removes Videos Showing Atrocities in Syria, The New York Times, 22 August 2017, <https://www.nytimes.com/2017/08/22/world/middleeast/syria-youtube-videos-isis.html>
  6. Weinberg, Justin, Philosopher’s Article on Transracialism Sparks Controversy, Daily Nous, 1 May 2017, <http://dailynous.com/2017/05/01/philosophers-article-transracialism-sparks-controversy/>
  7. Ibid.
  8. Ibid.
