A new update to the YouTube Kids app has been released, which the company hopes will help protect children and young people online.
Announcing the update on 10 May, YouTube said: "Parents can now choose from multiple collections of channels by trusted partners and YouTube Kids. Topics include a variety of subjects, from arts and crafts, and music to gaming, learning, and so much more."
YouTube announced recently that it would give parents more control over which videos you can see on the YouTube Kids app, and that it would use people instead of computers to decide which videos are appropriate for you to see.
This comes after some children reported repeatedly finding upsetting videos on the service.
Earlier this year some of you told us you'd seen videos on the YouTube Kids app that worried you.
YouTube currently uses something called an algorithm, a special computer programme, to decide which videos can appear on YouTube Kids.
This means any video uploaded to YouTube can also appear on YouTube Kids if the company's algorithms decide it is suitable.
But this doesn't always work very well: the computers can make mistakes, and inappropriate videos have repeatedly appeared on YouTube Kids.
Your parents will be able to change the settings on YouTube Kids to:
- Approve every video before the app lets you watch it,
- Get YouTube staff to decide what videos are safe for kids, rather than just letting the computer algorithm decide, and
- Only let you watch "trusted collections" (these are videos from approved brands, which are more likely to be safe for you to watch than other random videos on the site)
Back in February, Newsround gave five of you the chance to talk to someone who helps run YouTube about your concerns.
These five children had all seen things on YouTube Kids that upset them or worried them.
Google's Katie O'Donovan is one of the many people in charge of how YouTube is run.
After hearing about your bad experiences, she said that she was sorry and that the company was working to better protect children.
Children's charity the NSPCC said these new, stricter controls for parents were "encouraging" but "long overdue".
A spokesperson for the NSPCC said: "Parents should have the confidence that a platform designed for children only shows appropriate content, and that videos which some children might find distressing or upsetting do not slip through the net."
YouTube says there are also more changes planned for YouTube Kids, which will let parents block videos they don't want children to see.
YouTube's Malik Ducard says the company has never stopped listening to feedback and is continuing to improve the app.