Moderation, Hosting, Escalation and User Management
Part 1: Hosting, Moderation and Escalation
In this article
- Editorial Responsibility
- Managing the Space
- Hosting and User Management
- Rationale for Moderation
- House Rules
- Scope and Level of Moderation
- Alternative Forms of Moderation
- Moderation of Personal Spaces for Users on BBC Online
- Moderation of Community Spaces
- Reporting Serious Incidents
- Keeping a Record
- Moderation at Times of Special Sensitivity
- Special Sensitivity: Election Campaigns
- Special Sensitivity: Armed Conflict
- Interactivity and Safeguarding Trust
- Freedom of Expression and Criticism of the BBC
- BBC Investigation Service
Editorial Responsibility
There must be a named individual in the relevant Division who takes responsibility for user contributions, just as for other BBC content. They should ensure that the space maintains appropriate overall standards of moderation, hosting and user management. This will be the relevant Commissioning Executive or equivalent, including where a site or space is run by an independent company for the BBC.
The named individual is responsible for seeing that their staff are properly trained, that appropriate levels of hosting are maintained and that alerts and users' complaints about moderation decisions are answered within a reasonable time. The relevant Divisional Social Media Executive and the Moderation Services team in FM&T can offer specialist advice and guidance.
Managing the Space
Every online space where user contributions are published must be moderated to remove illegal and inappropriate content, must have appropriate user management, and should normally have a host to provide a visible and active presence.
Hosting and User Management
A BBC host should enhance the quality, enjoyment and distinctiveness of online spaces for our users. A host's active and visible presence should also lessen the risk of an online space becoming dominated by a small number of bullies or troublemakers. Hosting can save time and effort in the long run because if a space is taken over by a small group, it may require a huge amount of effort to restore the space to a point where other users feel welcome enough to participate again in significant numbers. The presence of a host may also enable lighter touch moderation.
A good host will understand online etiquette and use it to work collaboratively with individuals and groups of users. As the local public voice of the BBC, a host should be able to demonstrate the values of transparency and accountability in action, for example by providing authoritative information about when the next series starts or why a BBC site is being closed down, and they should ensure that the BBC House Rules are upheld. They will work with the moderation team and let them know about any particular sensitivities or issues.
The appropriate level of hosting will vary depending on the nature of the service and the expectations of the users of that service.
Users may value BBC blogs and message boards as an opportunity to interact directly with the BBC as much as with each other. Good hosting on a blog may mean that blog authors or members of their production team interact regularly with their users through blog comments. Where authors are not going to read and reply to comments on their blog, it may be better to make this clear to the audience or, for example, to describe the space as something other than a blog; we should take care to manage audience expectations appropriately. Message board hosts should provide a more visible and dedicated presence. They are likely to meet and greet new users, encourage more contributions, defuse rows and monitor the overall tone of conversations; they may seek feedback from the audience and should be seen to act on this where possible.
Some BBC social media spaces may not require a traditional hosting presence e.g. simple picture galleries of images provided by members of the public. But whether or not a BBC space has an active and visible host, a named individual must be responsible for user management. This will normally be the BBC host, but where there isn't one it will usually be the site producer or another member of the production team. The person responsible for user management should deal with moderation referrals, escalation and complaints about moderation in a timely way. For popular or controversial sites, this may involve some significant effort. See the section on escalation for guidance on dealing with complaints about moderation decisions.
Hosts may be members of staff or part of an independent production team, if that team is providing online content to the BBC. Hosts must be properly trained so that for example they can escalate tricky issues with confidence. Any proposal for an independent production team to act as hosts should be agreed well in advance with the Commissioning Executive or equivalent and the relevant Divisional Social Media Executive.
Rationale for Moderation
The BBC moderates user contributions in order to enhance the experience for our users when they participate in public online spaces and to protect the BBC's reputation. Successful online spaces operate by consent and we should encourage a genuine sense of ownership among our users. But effective moderation requires a delicate balancing act: if we intervene too much, users are likely to resent the intrusion and may vote with their feet; if we do not remove material which is likely to cause extreme offence, users may be offended and stay away.
House Rules
Every interactive space on BBC Online should publish or link to simple, easily accessible rules of conduct or House Rules. These give users a clear signal about how to interact, they give moderators and hosts a clear set of standards against which to assess every contribution to a specific space and they do the same for users who may wish to use an Alert to complain about other users' comments or conduct. Standard House Rules are available and will normally be suitable.
Moderators and hosts may take users' expectations of different spaces into account where appropriate e.g. when they moderate for strong language. Content which breaks the House Rules should normally be removed. However, it may sometimes be possible for material to remain if for example the online community responds robustly and in an authoritative way to an offensive comment.
Scope and Level of Moderation
All BBC Internet sites and digital public spaces which feature user contributions and other forms of social media must be moderated. This may involve premoderation, postmoderation or reactive moderation. The moderator may in some exceptional cases also be the host.
Premoderation is where material cannot be accessed by visitors to the site until the moderator has seen it and decided it is suitable for placing on the site.
Postmoderation is where the moderator sees the material, and decides whether it is suitable to remain on the site, after it has been posted.
Reactive moderation is where visitors to the site alert the moderator to an inappropriate or offensive comment after it has been published. The moderator does not read every comment.
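The practical difference between the three models comes down to two questions: is a contribution visible before a moderator has seen it, and does a moderator read every contribution? A minimal sketch (the type and function names here are our own illustration, not a BBC system):

```python
from enum import Enum


class ModerationLevel(Enum):
    """The three moderation models described above (names are illustrative)."""
    PREMODERATION = "pre"    # checked before it appears on the site
    POSTMODERATION = "post"  # checked after it appears on the site
    REACTIVE = "reactive"    # checked only if a visitor sends an alert


def visible_before_review(level: ModerationLevel) -> bool:
    """Only premoderation holds a contribution back until a moderator
    has seen it; the other two models publish immediately."""
    return level is not ModerationLevel.PREMODERATION


def every_comment_read(level: ModerationLevel) -> bool:
    """Under reactive moderation the moderator does not read every comment."""
    return level is not ModerationLevel.REACTIVE
```
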
Deciding on the appropriate level of moderation for a BBC site is the responsibility of the Commissioning Executive or equivalent for the site where the content is viewed. See below for referral for reactive moderation. Executives may seek guidance from their Divisional Social Media Executive or from Editorial Policy. The Moderation Services team in FM&T can also help. Adequate provision should be made in case the level of moderation needs to be strengthened in response to changes in circumstance (e.g. a move from reactive to pre or postmoderation in response to a specific item on the news agenda).
There should be a clear escalation strategy. See below.
The audience should be informed what the current level of moderation is and, in broad terms, how it works.
Moderation may be done by a suitably qualified external contractor approved by the Social Media Executive, FM&T, or by suitably trained BBC staff.
Moderators will not normally edit contributions for grammar or spelling, although they may edit for use of strong language. Comments with substantial problematic content are normally rejected as a whole rather than edited. We should give a reason; as a general rule, user contributions may be resubmitted once altered.
Sites dealing with particularly sensitive areas may require premoderation. Sites designed to appeal to children are usually premoderated. Other sensitive areas which might require premoderation include those discussing personal health problems. Areas on BBC Online which invite users to email in or upload pictures are usually premoderated. New users may be premoderated for a limited period.
It may sometimes be necessary to move an area or an individual from postmoderation or reactive moderation into premoderation for a limited period.
Before considering whether to set up a live picture feed of images contributed by users from a third party site, see the risk assessment section in the Guidance Note on Links and Feeds.
A minority of areas where users contribute on the BBC site are postmoderated. Postmoderation allows users to see their comments being published without delay while every message is read by a moderator. Postmoderation is likely to be suitable, for example, for sites which attract robust debate about current affairs. Sites which have a history of controversy or polarised debate, personal or racist abuse or potentially defamatory comments are likely to benefit from the level of attention provided by postmoderation.
Commissioners of BBC sites which carry postmoderated content should ensure that comments from the public are seen, checked and, where necessary, removed within an agreed time limit. This limit, the maximum period a postmoderated comment should normally remain visible before being checked by a moderator, should be agreed in advance with the relevant Divisional Social Media Executive.
In cases of sensitivity, the time limit may need to be within one hour of posting.
Any agreed time limit should not prevent a moderator from checking comments sooner after publication, particularly where special vigilance is called for. For example, when publishing potentially sensitive comments about an armed conflict during the conflict itself, it may be necessary to check all such messages even more rapidly.
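The agreed time limit amounts to a simple check on how long a comment has been visible without review. A minimal sketch, with the caveat that the function name and the one-hour default are our own illustration (real limits are agreed per site):

```python
from datetime import datetime, timedelta, timezone


def overdue_for_check(published_at: datetime, now: datetime,
                      limit: timedelta = timedelta(hours=1)) -> bool:
    """True if a postmoderated comment has been visible for longer than
    the agreed limit without being checked. The one-hour default mirrors
    the sensitive-case figure above; actual limits are agreed per site
    with the Divisional Social Media Executive."""
    return now - published_at > limit
```

Comments flagged as overdue could, for example, be pushed to the front of the moderation queue.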
Reactive moderation may be suitable for less sensitive sites where a higher degree of self-regulation is appropriate. It is more likely to work with an active, mature online community where few comments have to be removed, or where a large fan base cares passionately about the nature and quality of debate and can be relied upon to alert the BBC to any issues. Reactive moderation is more likely to suit topics which tend not to attract polarised or extreme responses, for example a site supporting a gardening or cooking programme; but with a high level of supportive alerts from the online community and the constant presence of a BBC host, it may exceptionally be suitable for some other, more contentious, subjects.
To maintain the quality of contributions, promote a congenial atmosphere and encourage new users to participate, reactive moderation requires consistently active and visible hosting of the site. It is not suitable for a site which is likely to attract a high proportion of children.
It may be necessary to move a reactively moderated site to postmoderation for a limited period for example in war, national crisis or during elections. See below. Producers should consider whether it may be necessary to move a programme support space into postmoderation for a limited period around the transmission of a high profile or sensitive TV or radio programme or series.
Any proposal for reactive moderation of user contributions on BBC Online must be referred in advance to the relevant Divisional Social Media Executive who may consult Editorial Policy.
Online user generated content linked to any radio or TV programme must be appropriate to the programme and its likely audience. See s.5 Editorial Guidelines on Harm and Offence for more details.
How to Handle Alerts
Reactive moderation relies on alerts from members of the public and moderators should treat each one with care and consideration. If in doubt, moderators should refer an alert to the host or named individual responsible for user management for advice. The most difficult cases may need to be referred to the relevant Commissioning Executive or equivalent. We may respond to an alert by removing content on a temporary basis while a problem is being investigated. Each alert should get a clear response in a reasonable time from the moderator. A record should be retained of the alert, the comment complained about, the decision reached and the reason given for the decision. The procedures in each area should be checked with the Editor, Moderation Services, FM&T.
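The record the paragraph asks moderators to retain (the alert, the comment complained about, the decision reached and the reason given) could be captured in a structure along these lines; this is a sketch with invented class and field names, not a BBC system:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class AlertRecord:
    """One record per alert; field names are illustrative only."""
    comment_id: str                   # the comment complained about
    alert_text: str                   # the user's complaint
    received_at: datetime
    decision: str = "pending"         # e.g. "removed", "retained", "escalated"
    reason: str = ""                  # the reason given for the decision
    responded_at: Optional[datetime] = None

    def resolve(self, decision: str, reason: str) -> None:
        """Every alert should get a clear, reasoned response in a
        reasonable time; record when that response was given."""
        self.decision = decision
        self.reason = reason
        self.responded_at = datetime.now(timezone.utc)
```
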
Alternative Forms of Moderation
As user expectations and the technology supporting online communities evolve, we may wish to try out other forms of moderation, including models based on trust, reputation and peer-assistance. There should be a clear editorial justification.
Any proposals must first be discussed with the Divisional Executive for Social Media and Editorial Policy.
Moderation of Personal Spaces for Users on BBC Online
Where the BBC offers users the chance to have their own personal pages on our site, with room for more substantial content than is available from posting a comment on a blog, the BBC will need to consider several factors:
- visitors should be aware that they are accessing content which has not been written or created by the BBC
- to avoid giving the perception of BBC endorsement on a BBC branded page, it should be made clear at an early stage to users that these pages are intended for personal use only. The space should not be used as a campaigning platform to promote any political, commercial or charitable cause, or for fundraising. Particular care needs to be taken when moderating, to avoid this.
For the same reason, we should be able to moderate BBC usernames, which may appear as BBC URLs. For example, users should not have BBC display names which promote the names of commercial products or services.
Moderation of Community Spaces
The BBC may offer sections of its sites to serve communities of people: those who share the same hobby or pastime, those who have shared difficulties or those who belong to the same geographical community. Users may be groups of people or local organisations as well as individuals. The same care should be taken when moderating such sites as outlined in the "personal space for users" section above. It is particularly important that no group should use BBC online pages as a platform to promote campaigns which are commercial, political or particularly controversial.
We would not normally allow our pages to be used for any significant fundraising activity. Editorial Policy can advise.
See How to Handle Alerts above.
Escalation
In order to protect our users and our brand, it may sometimes be necessary to go further than simply rejecting or removing a single contribution. It may become necessary to stop a user from publishing on a specific space, to put a space into read only mode, to remove a user's ability to publish to any BBC space or even to report them to the BBC Investigation Service. Every interactive site or service must have a formal escalation policy with a clear line of editorial referral. Every member of the team should know how to escalate a serious editorial issue promptly.
Complaints about moderation should be handled in accordance with the House Rules and the appeals system for hosting and moderation decisions for social media. See http://www.bbc.co.uk/blogs/moderation.shtml#canappeal for details. See s.19 Editorial Guidelines for more on handling complaints generally.
We should treat as urgent:
- incidents which may compromise individual safety (whether a member of the public or staff)
- serious incidents involving children and young people. See below for the reporting route
- issues which may damage the reputation of the BBC (e.g. audience deception, or legal issues such as defamation, contempt of court or serious breach of data protection)
We should not be afraid to act decisively where necessary.
This may for example be appropriate if the moderators have been overwhelmed by demand or because a BBC space has been taken over by a group of extremists or because a conversation thread has degenerated into a vehicle for repetitive abuse. In such cases, it may be much better to stop updating a page - or many pages - of user contributions rather than to publish something which may be harmful or which could bring the BBC into disrepute.
The Commissioning Executive or equivalent should always be kept informed. Editor, Moderation Services, FM&T should be informed of any serious incident. Legal issues should be referred to BBC Programme Legal Advice. Editorial Policy may also be consulted.
The escalation policy should enable technical as well as editorial issues to be notified promptly to the correct team e.g. the failure of part of the moderation interface may require rapid technical and editorial intervention.
We should be open and honest with our audience if we decide to suspend or close all or part of an interactive space or if it becomes inaccessible as a result of technical failure.
Reporting Serious Incidents
Any incident of suspected "grooming" must be referred promptly to the CBBC Interactive Executive Management Team, who will be responsible for reporting it to the appropriate authorities. The team can be contacted via the internal global address list under CBBC Interactive Executive Management Team. Where necessary, the phone number can also be accessed externally via the BBC switchboard.
For more details of how to report suspected "grooming" and how to respond to other serious incidents including serious cyberbullying, suspected child sex abuse images or criminal incitement to racial hatred, see the Guidance Note on Interacting with Children and Young People Online.
See also the section below on when to refer to the BBC Investigation Service.
Keeping a Record
The Commissioning Executive or equivalent for a site featuring user contributions should ensure that an adequate record of editorial decisions is maintained during the lifetime of the site or service. This should include
- logging every user contribution received, and whether it was published to the site or removed after publication
- keeping a record of all hosting and moderation decisions, with reasons given
- keeping a copy of any content removed
- noting any events which were escalated to senior management, with the outcome
This may be achieved via automated systems (e.g. content management systems or the moderation software) or by keeping manual records. We should have an appropriate policy for how long we store these records, bearing in mind that some spaces may generate more moderation and hosting decisions and queries than others.
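A retention policy of the kind described can be sketched as a simple pruning rule over a decision log; the names and the retention period below are illustrative assumptions, since the guidance deliberately leaves the period as a local decision:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List


@dataclass
class ModerationEvent:
    """One logged decision; the fields mirror the bullet points above."""
    contribution_id: str
    published: bool      # published to the site, or removed after publication
    decision: str        # the hosting/moderation decision taken
    reason: str          # the reason given
    escalated: bool      # whether it went to senior management
    recorded_at: datetime


def prune_log(log: List[ModerationEvent], retain_days: int,
              now: datetime) -> List[ModerationEvent]:
    """Keep only events newer than the agreed retention period. The
    period itself is a local policy decision, not fixed by this guidance."""
    cutoff = now - timedelta(days=retain_days)
    return [e for e in log if e.recorded_at >= cutoff]
```
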
Moderation at Times of Special Sensitivity
At times of special sensitivity, for example during armed conflict, General Elections or in the aftermath of a terrorist incident, we may need to change our arrangements for handling social media and interactivity, in order to protect the reputation of the BBC.
In such circumstances
- Hosting and moderation policies for blogs, message boards and comment pages will need to be reviewed and, where necessary, steps taken to increase the level of scrutiny
- Where members of the public send us user contributions for publication by BBC News and other outlets, greater emphasis on attribution and verification may be necessary. See the Guidance Note on User Contributions for BBC News Output
- BBC participation on third party sites and services may need to be reviewed, together with the use of third party content on our site (for example, content provided by automated feeds from community sites like Wikipedia)
Commissioning Executives or equivalent will be responsible for making sure that their sites respond appropriately. Divisional Executives for Social Media may jointly review the circumstances with Editorial Policy to agree a consistent approach across the BBC.
Special Sensitivity: Election Campaigns
Special care needs to be taken during election campaigns, when the BBC's obligations of due impartiality are under intense public scrutiny. This applies to General Elections, European Elections, local government elections, elections to the devolved legislatures and any referendums. Chief Adviser Politics will issue detailed guidelines for such elections at the appropriate time. Appropriate steps may include:
- concentrating debate and discussion about the campaign in a limited number of spaces where it is customary to debate political issues
- taking particular care with moderation and hosting in those spaces
- taking care to ensure that interactive spaces are vehicles for lively debate and not hijacked by organised campaigns of one particular group or party
- avoiding any debate about the election while the polls are open
More advice can be obtained from Chief Adviser Politics.
Special Sensitivity: Armed Conflict
At times of armed conflict, there are particular sensitivities for example about not disclosing operational military plans, about not naming casualties until next-of-kin have been informed and about how to handle unsubstantiated rumours. This applies particularly to user-generated content as our users have become accustomed to seeing their content published as soon as they send it to us.
See the Editorial Guidelines Section 11: War, Terror and Emergencies for more advice.
Special arrangements may also be necessary at other times, for example in the aftermath of a terrorist outrage or other emergency.
Interactivity and Safeguarding Trust
Anyone actively intervening to steer the course of an online discussion for a BBC purpose without revealing their link to the BBC must be acting in the public interest and must refer to a senior editorial figure or, for independents, to the commissioning editor. In the most serious cases, for example where there is a proposal to use a false identity as part of a BBC investigation, referral must also be made to Director, Editorial Policy and Standards. See the Editorial Guidelines Section 7: Privacy for more on "the public interest".
In any other circumstances, BBC staff and others working for the BBC (e.g. independent production teams or marketeers) must not mislead our online audiences by contributing content on BBC interactive spaces under pseudonyms or screen names which give the impression that they are not working for the BBC. For example, an interactive producer working on a new commenting space must not seed it with contributions in a way which gives the false impression that they are just another ordinary member of the public when they are in fact working for the BBC to develop that space. Active hosting and good marketing and promotion can do that job. By the same token, a producer who wishes to rebut criticism of their programme on a BBC interactive space should either do so via the space's host or find a way to make it clear that they are contributing as a BBC employee who works on that programme. BBC staff and others working for the BBC should not contribute to BBC Online in ways which may artificially inflate a programme's ratings or reputation.
A BBC member of staff who is acting in a purely personal capacity should be entitled to post to a message board or to rate a BBC programme. See the Conflicts of Interest Guidelines and the Guidance on Personal use of Social Networking and, if in any doubt, refer to your line manager.
Freedom of Expression and Criticism of the BBC
We should aim to accommodate the widest possible range of opinions consistent with the rules of the relevant community and the law.
Content which is critical of the BBC for example of talent, programmes or policies, should not be removed unless it breaks the House Rules. It is part of our commitment to freedom of expression and part of being accountable to our audiences that we welcome critical comments and contributions as well as praise and congratulations. This is not always easy and sometimes talent managers and interactive executives may need to remind other BBC people about why we should value the full range of opinion about the BBC on BBC sites. Allegations of censorship are likely to be much more damaging to the BBC's reputation overall than critical comments about our output. This applies equally to BBC branded spaces on third party social networking sites. However, people who work for the BBC should be treated as well on our spaces as people who don't. So repeatedly posting personal or offensive comments about individual members of the public or about people who work for the BBC may be treated as harassment by BBC moderators and hosts.
BBC Investigation Service
If a user has been disrupting a BBC space for a significant period, to the extent that our ability to offer an effective and enjoyable service on that space is seriously threatened or compromised, or they have been conducting a campaign of serious harassment against another user on BBC Online, and we have exhausted the other means at our disposal for managing the user (e.g. by banning them), we may wish to refer the case to the BBC Investigation Service. Producers should also consult Editor, Moderation Services, FM&T.
The user may be committing a criminal offence and where necessary the BBC Investigation Service can refer the matter to the police.
See the Editorial Guidelines Section 7: Privacy for our policy on handing over contributors' personal details to third parties.
For guidance on defamation and contempt of court, see the Editorial Guidelines Section 18: The Law. Programme Legal Advice can advise.