Social media firms have come under criticism after a massacre at two mosques in New Zealand, which killed 50 people, was live-streamed on Facebook.
Several world leaders have called on social media companies to take more responsibility for the extremist material posted on their platforms.
Facebook has said the live broadcast of last week's massacre in Christchurch was viewed fewer than 200 times, but that the video was viewed 4,000 times in total before it was removed.
Google said in an emailed statement it had removed tens of thousands of videos since Friday's tragedy and terminated hundreds of accounts created to promote or glorify the shooter.
"We took a number of steps, including automatically rejecting any footage of the violence, temporarily suspending the ability to sort or filter searches by upload date, and making sure searches on this event pulled up results from authoritative news sources," it said.
"We know there is much more work to do."
New Zealand Prime Minister Jacinda Ardern has called on social media platforms to do more to combat terror in the wake of the attack.
"We cannot simply sit back and accept that these platforms just exist and that what is said on them is not the responsibility of the place where they are published," she said during a special meeting of parliament. "They are the publisher. Not just the postman. There cannot be a case of all profit no responsibility."
A group of MPs is calling for a tax on social media companies' profits, saying the firms are operating in "an online wild west".
Its report, which follows a year-long inquiry into the health impact of social media, says the industry should do more to protect children and young people online.
The government is due to publish its own proposals within weeks.
It says a range of measures to improve online safety is being explored.