- 28 Nov 08, 16:42 GMT
A young producer on Radio 4's Today programme got a shock the other day when he logged onto Facebook. In his newsfeed was an item posted by a friend linking to a photo, allegedly of the mother of Baby P, the child killed in the recent notorious case. Various obscene and threatening comments were then posted next to the photo. Facebook is removing the picture and other details when notified by members, but new groups keep popping up.
The names of the baby's parents are also being distributed via text, and presumably on other web outlets. A court order protects the identity of these people and no mainstream media outlet would even consider breaking that ruling. So what are the responsibilities of social networks when their members break the law in this manner - and how carefully should they police the material that is generated by their users?
It all comes down to how you see the likes of Facebook and MySpace - are they just technology platforms, playgrounds for their users to exploit as they will? Or are they now becoming major media businesses facing the same regulatory demands as a broadcaster or a newspaper? The networks would like to be treated much the same as telecoms businesses - after all nobody is suggesting that the mobile network on which that text about Baby P's parents was sent should have any responsibility to stop that happening.
But there is growing pressure from politicians and regulators, who, as the online child safety expert John Carr told me, "are getting more self-confident, more assertive, about the internet, and are no longer convinced it's lawless and ungovernable." Mr Carr, who acts as an adviser to Facebook's rival MySpace, pointed out that MPs on the Culture Select Committee had ordered social networks to be more proactive in their policing of their members.
Nicholas Lansmann, chief executive of the Internet Service Providers Association, said on the Today programme that sites like Facebook are "not the police of the internet" and worked under the "notice and take-down" regulations, which required them to remove illegal content once notified about it. He said that system was working well, but that it was very difficult to tackle what were in effect millions of self-publishers, particularly given the cross-border nature of the internet.
Facebook is becoming the place where millions of British people get news, views, and all sorts of media content, and that means it will be under the spotlight more and more. It has a perfectly decent case to argue when it comes to its self-regulatory model - so shouldn't it be out there making that case?