A machine has taught itself how to play video games better than a human.
The computer program, which is inspired by the human brain, learned how to play 49 classic games, from Space Invaders and Pong to boxing and tennis games.
In more than half of the games, it played as well as or better than a professional human player!
Researchers from Google DeepMind said this was the first time a system had learned how to master such a wide range of complex tasks.
You might think a computer would be brilliant at gaming from the start, but this one had to learn how to play.
Dr Hassabis from DeepMind explains: "The only information we gave the system was the raw pixels on the screen and the idea that it had to get a high score. And everything else it had to figure out by itself."
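The idea of learning a game from nothing but the score can be sketched in a few lines of code. The toy example below is not DeepMind's actual system (which used a deep neural network reading raw screen pixels); it is a much simpler, standard technique called tabular Q-learning, applied to a made-up one-dimensional game where the program is told only its position and its score, and has to figure out the rest by trial and error.

```python
import random

# Toy illustration only: tabular Q-learning on a tiny 1-D "game".
# Positions run 0..4; reaching position 4 scores a point. The learner
# is given no rules -- only the score, as in the article.
N = 5                      # positions 0..4; position 4 is the goal
ACTIONS = (-1, +1)         # move left or move right

def step(pos, action):
    """Apply a move; the game reports only the new position and the score."""
    new_pos = max(0, min(N - 1, pos + ACTIONS[action]))
    return new_pos, (1.0 if new_pos == N - 1 else 0.0)

Q = [[0.0, 0.0] for _ in range(N)]   # learned value of each (position, move)
alpha, gamma = 0.5, 0.9              # learning rate and discount factor

random.seed(0)
for _ in range(500):                 # play many short games...
    pos = random.randrange(N - 1)
    for _ in range(20):
        action = random.randrange(2)             # ...with random moves
        new_pos, reward = step(pos, action)
        # Nudge the estimate toward: reward + discounted best future value
        Q[pos][action] += alpha * (reward + gamma * max(Q[new_pos]) - Q[pos][action])
        pos = new_pos
        if reward:                   # scored -- this game is over
            break

# The strategy it figured out by itself: move right (1) from every position
policy = [q.index(max(q)) for q in Q[:-1]]
print(policy)  # -> [1, 1, 1, 1]
```

After enough random play, the score estimates settle and the program "knows" the winning strategy, even though nobody told it the rules; scaling this idea up to raw pixels is what required the deep-learning machinery described below.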
This research is the latest advance in "deep learning", a field that is paving the way for smarter machines.
Scientists are developing computer programs that, like the human brain, can be given large amounts of data, such as images or sounds, and then find useful information or patterns in it.
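To make "finding patterns in data" concrete, here is a minimal sketch using a single artificial neuron (a perceptron) that learns to separate two groups of points purely from labelled examples. Deep-learning systems stack many thousands of such units, but the core idea of adjusting weights from data is the same; the data points below are invented for illustration.

```python
# Minimal sketch: one artificial neuron learning a pattern from examples.
# Training data (invented): (x, y) points labelled 1 if x + y is large,
# 0 otherwise -- the program is never told that rule, only the examples.
data = [((0.0, 0.0), 0), ((1.0, 1.0), 1), ((0.2, 0.3), 0), ((0.9, 0.8), 1),
        ((0.1, 0.9), 0), ((0.8, 0.9), 1), ((0.4, 0.2), 0), ((1.0, 0.6), 1)]

w = [0.0, 0.0]   # connection weights, adjusted as the neuron learns
b = 0.0          # bias term
for _ in range(200):                    # repeatedly show the examples
    for (x, y), label in data:
        guess = 1 if w[0] * x + w[1] * y + b > 0 else 0
        error = label - guess           # learn only from mistakes
        w[0] += 0.1 * error * x
        w[1] += 0.1 * error * y
        b += 0.1 * error

# Check how well the learned pattern fits the examples
correct = sum((1 if w[0] * x + w[1] * y + b > 0 else 0) == label
              for (x, y), label in data)
print(f"{correct}/8 points classified correctly")  # -> 8/8
```

The neuron is never given the rule behind the labels; it recovers the pattern from the data alone, which is exactly the capability the paragraph above describes, just at a vastly smaller scale.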
It's important because machines like self-driving cars have to deal with the unexpected; they can't be pre-programmed for every eventuality.