Censoring Facebook: Social network's violent video dilemma

 
Facebook said on average 665 million people accessed its site every day in March

Facebook's decision to remove videos showing people being decapitated leaves the firm in a quandary: should or shouldn't it impose a wider censorship policy?

Originally the social network rejected calls from users to delete the clips, saying that it wanted to "preserve people's rights to describe, depict and comment on the world".

But after the BBC revealed that one of its own safety advisers - the head of the Family Online Safety Institute - had criticised its decision, the firm announced a U-turn, saying it would remove clips showing beheadings while it re-evaluated its rules.

That potentially opens a can of worms.

Since publishing the article, readers have contacted the BBC to complain about other videos, including:

  • one that shows killings which do not involve beheadings
  • clips involving cruelty to dogs and other animals
  • a smartphone recording of a schoolgirl being punched to the ground by another pupil

In all cases they said the network had refused their requests to remove the material. A spokeswoman for Facebook confirmed its policy had only been amended in regard to decapitations.

But imposing stricter controls would open the firm up to other criticism.

Before his death, internet freedom campaigner Aaron Swartz warned of the dangers of privately owned parts of the net limiting what was posted onto their sites. He called this "corporate tyranny" and named Facebook as a specific concern.

The social network could not provide a date for when its review would be complete. The following range of opinions suggests it will struggle to please everyone.

Richard Allan, Facebook

More than a billion people express themselves and comment on the world in which we live through Facebook and most of the time this is entirely without problem.

On occasions, there are concerns about some of the content that is being shared and we have put in place a reporting system so that people can tell us about this.

The reported content is evaluated against our community standards and appropriate action is taken where our rules have been breached.

When drawing up and enforcing our approach to acceptable behaviour and content on Facebook, we aim to strike the right balance between enabling people to share information, news and content - and protecting the community as a whole.

This is a complex challenge as Facebook is a large, diverse community and we are continually presented with novel situations.

While we freely admit that we do not always get it right, the trouble-free daily experience of the vast majority of Facebook users demonstrates that our systems are working well in all but the most exceptional cases and that they are improving over time.

Richard Allan

  • Joined Facebook in 2009 to become its director of policy in Europe, Middle East and Africa
  • Previously a Liberal Democrat MP

As we said last week, we are reviewing our rules related to content showing graphic violence.

In doing so we are clear that there are situations where it is important for people to be able to share content through Facebook even if this can at times be quite shocking.

For example, people caught up in violent incidents such as the recent Boston bombings or the ongoing conflict in Syria want to be able to report on their experiences and may use quite graphic content to do this.

This illustrates the kind of challenge that our highly experienced team deals with on a daily basis as we strive to offer a space for sharing that is mindful of everyone's expectations.

Celia Mellow, petition organiser

As a person who holds a strong sense of justice, I had no hesitation in setting up a petition for the removal of the sickening decapitation video I was shocked to find on my Facebook news feed.

What shocked me even more was the fact that I actually had to make a petition in the hope that the video would be removed.

No matter how many times my friends and I reported it, we all received the same message, stating that "it doesn't violate Facebook's community standard on graphic violence, which includes depicting harm to someone or something".

How does a video of an innocent woman being brutally murdered not "violate" this? I can only hope that there is a criminal investigation that will bring her justice.

As a loyal Facebook fan, I understand that Facebook is only allowing people to have freedom of speech. However, I think it is about time they drew a line between what is and isn't appropriate for the public.

Celia Mellow

  • Surrey-based 15-year-old who started a petition to urge Facebook to delete decapitation videos
  • Plans to join the police after leaving school

Facebook's audience starts from children aged 13 - what I feared the most was that my younger sister could easily have witnessed that disgusting video.

No-one should be exposed to such graphic horror. Sadly, that video isn't the only inappropriate content to have wandered onto Facebook recently. I have heard of others showing extreme violence and cruelty to both humans and animals.

It's time that new stricter regulations are made by Facebook in order to remove these vile videos for good so that it might return to being the safe social network it used to be.

Jeremie Zimmermann, La Quadrature du Net

Any intervention by Facebook to remove or block access to content beyond what a court might order - while respecting basic fundamental rights and the principle of proportionality - would in practice amount to privatised censorship, and nobody has an interest in going there.

A dominant, centralised actor such as Facebook would be incentivised to spend as little money as possible determining which content would be lawful or not, suitable or not, etc.

This would raise the question of what criteria would be used. Opening such a breach would ensure that any government could pressure Facebook to consider their own criteria, whether for political, religious or other reasons.

Under such conditions we can be sure that the fundamental right to freedom of speech or the right to a fair trial would not be respected.

As surely as we cannot trust giant centralised corporations to defend our fundamental freedoms, we cannot ask them to become the judges and enforcers of what information should be shared online.

Jeremie Zimmermann

  • Co-founder of the French digital rights group La Quadrature du Net
  • Campaigns for the "free circulation of knowledge" on the internet

Protecting children on the net is a responsibility of their parents in the first place. It cannot be outsourced to Facebook.

It is a matter of educating them about the difference between privacy, publicity and a circle of trust.

Since Facebook collects and stores so much information it should be able to determine when one of its members is a minor and is about to be exposed to content that has been reported as unsuitable, and display a warning message.

Users would then be free to choose to take that advice, or make a conscious choice to access the content.
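Zimmermann's proposed warning mechanism can be sketched in a few lines. The following is a minimal, hypothetical illustration - not Facebook's actual system - in which a user profile carries a known age and a post carries a flag set when users report it as unsuitable; every name and the age threshold here are assumptions made for the sake of the example.

```python
# Hypothetical sketch of the warning gate Zimmermann describes - not
# Facebook's actual system. All names and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class User:
    id: int
    age: int  # Facebook's minimum age is 13; minors get extra protection

@dataclass
class Post:
    id: int
    reported_as_unsuitable: bool  # set once users report the content

ADULT_AGE = 18  # assumed threshold; jurisdictions differ

def render_decision(user: User, post: Post) -> str:
    """Return "warn" to show a warning interstitial, "show" to display normally."""
    if post.reported_as_unsuitable and user.age < ADULT_AGE:
        return "warn"  # the user can still make a conscious choice to view it
    return "show"

# A 13-year-old reaching reported content sees a warning first; an adult,
# or anyone viewing unreported content, sees the post directly.
```

The point of the design is that the network never deletes anything on its own judgement: it only interposes a choice, which the viewer is free to override.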

Stephen Balkam, Family Online Safety Institute (Fosi)

Facebook and most other social media sites have explicit terms of service about what is and is not acceptable to host on their websites.

Stephen Balkam

  • Chief executive of Fosi
  • One of five charity leaders consulted by Facebook about online safety issues

Some go further and have created what are known as community standards.

These more clearly state the rules about what kinds of content will be removed.

Facebook, YouTube and Twitter have robust reporting mechanisms so that ordinary users can flag inappropriate or abusive content for review.

What is challenging for these companies is how and where to draw the line.

Drawing that line will help them determine when to invoke the "public interest" principle in keeping material - such as images from the Boston Marathon bombing - up on their sites, even though it depicts graphic violence.

This is new territory for us all as we navigate the rules, ethics and standards of user-generated content sites.

Andrew McDiarmid, Center for Democracy & Technology

The controversy over Facebook's treatment of shocking videos of beheadings is the latest illustration of the enormous complexity at work when it comes to promoting the exercise of human rights online.

Billions of people rely on internet platforms to speak and access information in the networked public sphere, but the platforms are controlled by private companies, whose terms of service in large part determine the contours of free expression.

In one sense, platform operators are themselves speakers that have the right to determine their own policies. At the same time, these "digital sovereigns" - to borrow a phrase from Rebecca MacKinnon - effectively govern their users' exercise of free expression rights.

Platforms have a responsibility, particularly as they grow to Facebook-scale, to consider the human rights impact of their policies and to minimise restrictions on free expression.

Andrew McDiarmid

  • Senior policy analyst at the Center for Democracy & Technology in Washington DC
  • The non-profit organisation campaigns against government censorship and other threats to the free flow of information on the internet

This is especially true with respect to government restrictions. It would be troubling indeed if government pressure precipitated the video's removal in this case.

A key step in carrying out this responsibility is ensuring that content policies are clearly communicated and fairly applied.

The horrific beheading video and Facebook's reported reaction demonstrate the challenges that arise when trying to develop and apply clear, consistent standards in the complex and multi-faceted realm of online communication.

Context matters a great deal. Different companies might draw the line in different places, and just because something is offensive or disturbing does not mean it necessarily violates a particular term. And it certainly does not make it illegal.

Because of this complexity, systems for assessing content require constant refinement to ensure that free expression is protected.

Advocates, too, must remain vigilant that the private players that provide so much public value online are meeting their responsibilities to users.

Is it complicated and prone to mistakes and close calls? Yes, but the alternative - mandated content policies and individual governments vying for control over the global internet - is untenable and fraught with risk for free expression.

Dr Lynne Jordan, British Psychological Society

My main concern, as a psychologist experienced in working with the effects of actual and vicarious violence, is a lack of awareness of the violation of choice.

Material is posted on news feeds and "liked" indiscriminately without thought as to the rights of under-aged youngsters and others who may view it.

People, whether young or old, can be negatively affected by witnessing violence either on screen or in reality.

Effects include trauma responses such as replaying the images, feeling scared and vulnerable, ashamed, invaded or violated and confused, as well as angry and helpless, which is reinforced via the news feed as these things pop up uninvited.

Ethical codes are there for safety and to preserve the right to choose what is viewed when users are considered of age or able to understand the implications. Social media sites are mostly not obliged to adhere to such codes, which creates a problem, particularly if they issue their own vague, inadequate guidelines.

Social networks' news feeds allow material to arrive on people's pages that might never be sought by choice.

Dr Lynne Jordan

  • A chartered psychologist and member of the British Psychological Society
  • Has worked as a therapist for nearly 30 years focusing on problems caused by trauma

Extensive "friendship lists" develop with people who may not be actual friends but who, through casual contact, get "befriended", perhaps out of obligation or a need to fit in or be liked.

The material is often posted supposedly to prevent the spread of violent crime or other violations, but in fact it can inadvertently escalate it by sidestepping the consent of the people accessing the feeds.

This is reminiscent of the "ban smoking in public places" debate with the concern of whose rights we are protecting.

In that debate it was largely about public physical health. This debate concerns public mental health and wellbeing.

 

Comments

This entry is now closed for comments

  • Comment number 205 (+3).

    Chris hc @ 183:
    If you don't understand something, don't comment on it!
    If you don't use FB then you won't know that when someone else likes or shares a photo or video it shows up in their friends' stream; they have little or no control over what appears in their stream short of deleting or hiding the friends who share the content.
    BTW, I think you mean COULDN'T care less!

  • Comment number 204 (+2).

    @192 "Ban facebook"

    Just like communist China.

  • Comment number 203 (+7).

I haven't read the full article yet, as it's too long, but I get the gist of it. I use FB most days and have seen some horrific sights - a poor dog being skinned alive; it made me sick. As far as I am concerned, if what you see could be prosecuted in a court of law as a crime, then it shouldn't be seen on FB.

  • Comment number 202 (+2).

    171.Minerve

The world is a big bad place; just because you censor it doesn't change that fact. Kids will be kids, curiosity will get the better of them, boys will look at porn - always have, always will. Yes, it's more accessible now, but that's just the way it is. You can't change it, you can't fight it, so you have to make sure your kids are robust enough to cope with it.

  • Comment number 201 (-2).

Hi Rupert #170 - but creative, constructive and social interaction did take place pre social media, with more physical, eye-to-eye, heart-to-heart human contact than now, which is surely the essence of why we exist. Facebook, Twitter etc are, after all, just limited shelf-life fads, even if they last a generation, before we move onto the next "thing". These things are OK, but keep them in context.

  • Comment number 200 (+17).

    I was horrified recently to see a page saying that people with autism were a danger to society, using Newton as an example. Some awful statements. Also pages dedicated to extreme homophobia and sexism. Violence isn't the only thing that needs to be addressed on Facebook.

  • Comment number 199 (+2).

The big problem with the net is not necessarily what is posted, but the fact that people can post anonymously and hide behind a pseudo-profile.

    Technology will eventually become so advanced that everyone will have a much stronger identity online - e.g. FB won't allow profiles that aren't genuine.

    This will solve a whole host of problems and leave the sick/mentally disturbed exposed for all to see.

  • Comment number 198 (0).

People who think killing people on camera for a stunt is acceptable are the ones the arc light of publicity should shine fiercely on.
    The world media have a responsibility to ignore (and say they ignore) barbarity!
    The people who think it's OK are from a Stone Age mentality. We should not allow such people into civilisation until they evolve, never mind allow them onto Facebook!

  • Comment number 197 (+3).

If kids are anything like I was at the age of 10 or so, they'd have seen all kinds of stuff that would make their parents' eyes bleed. I've made far more physically and psychologically lasting mistakes as an adult.

    Don't worry, as long as you're a half decent parent your little darlings will be fine.

  • Comment number 196 (-2).

The fact that they are even questioning whether to censor these videos is beyond me. Children use Facebook! They do not need to be exposed to images that graphic at such a young age. I have no doubt it would mentally scar them.

  • Comment number 195 (+2).

Facebook did not even act when pictures and information that violated a court order were reported. Reports of images relating to the new identities of Jon Venables and Robert Thompson were repeatedly rejected. Its reporting system needs a massive overhaul.

  • Comment number 194 (+1).

The "if you don't like it, switch off/log off" arguments do miss the point somewhat. You could use the same argument to justify the broadcast of live rapes, child abuse or torture. There has to be a line in the sand to protect the victims and/or their families, if nothing else. It's not just about the viewer - a detail that seems to be lost on many internet warriors.

  • Comment number 193 (+7).

    Anyone who wants these videos censored should remember that one of their "friends" felt it was appropriate to post a link to it in the first place. Your first point of action is to contact that "friend" with your concern and ask them to remove it, or to delete, or hide, them from your friends list if you feel strongly enough that it is inappropriate. YOU have the power to control your news feed.

  • Comment number 192 (-6).

    There wouldn't be a dilemma if there wasn't a facebook..
    There is a facebook, so there is a dilemma.
    Ban facebook.

  • Comment number 191 (+2).

This isn't about letting people put up photos/videos of unpleasant things they've seen on holiday. There are professional purveyors of "snuff" video who love the idea of being able to demonstrate their product on FB. These people are highly experienced and will run rings round FB. It effectively has a ban on pornography - if it didn't, professional pornography would move in - and it's the same with extreme violence.

  • Comment number 190 (+4).

    The amount of spam and chain messages being shared on Facebook, without any consideration of the content (most of which is factually incorrect or simply made up!), means that I use it less and less these days. Thankfully none of my friends have ever shared violent or sexual content, and if they did, at the very least I'd be blocking their news feeds.

  • Comment number 189 (+31).

    Why don't Facebook users just 'unfriend' those that post offensive content? Your Facebook feed is something you create yourself by the people you connect to.

    I would rather police my own feed than demanding censorship. It's disappointing to me that so many people demand an external authority decide what they are allowed to see.

  • Comment number 188 (0).

    The disputed founder of Facebook has no real sense of a wider responsibility, he's a geeky, weird nutcase who's made himself immensely wealthy on the back of a non-essential frippery.

  • Comment number 187 (+15).

This is a typical stance of a US company (and the US in general): extreme, graphic (and real) violence is all too readily accepted, while sexual activity is censored as a despicable act. You would be banned for posting a video of consensual sex.
    I firmly believe it is this attitude that leads to the vast disparity in violent crime statistics between the US and Europe.

    Violence is fine. Sex is evil.

  • Comment number 186 (-2).

#178 - and here we go again.

    The point I am making is that they use my FB account, as I think they are too young to have their own and I am over 18. I have now deleted the account as I cannot be sure about what is posted/uploaded, so this company has lost a user as a direct consequence of its policy, and I hope other parents do the same until the policy changes.

 

BBC © 2014