Interacting with Children and Young People Online
Part 3: Content
Moderation of User Generated Content
We should take special care to mitigate risk around content, contact and conduct when running interactive online spaces designed to appeal to children.
Messages and other user contributions on children's sites will normally be checked by a moderator before publication. On some occasions, it may not be appropriate to publish a message, even though it does not break the House Rules, because the content is very personal or the child is in distress. We may then wish to offer suitable online support or helpline information.
Spaces designed to appeal to children and young people are usually premoderated. Any proposal to use another form of moderation for children and young people must demonstrate that it offers a high level of child protection, and must be referred to Editorial Policy.
Spaces which publish pictures or video from members of the public are usually premoderated. Any proposal to use any other form of moderation must be referred to Editorial Policy.
BBC moderators will not post messages containing personally identifiable information, such as email addresses, instant messenger usernames or phone numbers, on children's interactive spaces, eg message boards. Where a child is clearly identified as having posted an email address on a BBC space which is not designed for children, the moderator will remove the message.
Sites will explain to users what pre-moderation is and broadly how it works.
Search
Where the BBC offers online search specifically for children, each site and any associated subsite must first have been approved as suitable for children by BBC editorial staff.
Sites selected by us as being suitable for children should not include unmoderated chatrooms or forums, private messaging features, embedded video or dynamic advertising. Even if we check embedded video and dynamic advertising carefully, the content may change again shortly afterwards.
All sites should be regularly reviewed for continued inclusion in the service.
Users should see an interstitial which makes it clear that they are about to leave the BBC site and offers them the choice of going on or returning to the BBC site.
Users should be able to alert the editorial owner of the search engine if they find an approved site which contains harmful or inappropriate material.
While the selection and evaluation of suitable sites is done by BBC editorial staff, technical measures may also help. For example, automated keyword alerts may help to alert the editorial owner of the search engine to any significant changes to an approved site.
Links
The same principles apply to the selection of individual external links from pages aimed at children (eg CBBC pages) as to the selection of external links included in any search services specifically for children (eg CBBC Search) - see above.
Links on global navigation pages which appear on pages designed for children should be suitable for a general audience.
For an audience of children, we should not link to social networking sites whose minimum age for participation is 13 or over, or to multiplayer games with unmoderated chat.
(See Guidance: Links and Feeds.)
Where programmes or sites designed for children feature difficult stories or issues which may require online support, including links to external charities, CBBC can advise on which external sites are appropriate for children.