
Brian May's Cosmic Clouds 3-D; how fish move between water bodies; and Jim Al-Khalili's take on physics

Brian May's Cosmic Clouds 3-D, how fish move between water bodies, and shortlisted Royal Society Science Book Prize author Jim Al-Khalili's take on physics.

There are few images as awe-inspiring as those of the deep cosmos. Photos of stars, galaxies, constellations and cosmic nebulae are difficult to improve on, but a new book may have done just that, by making them stereoscopic. David Eicher, Editor-in-Chief of Astronomy Magazine, teamed up with astro-photographer J. P. Metsavainio, with the stereoscopy engineered by astrophysicist and stereoscope enthusiast Dr Brian May. Together they have created the first ever book on nebulae in 3-D. It's called 'Cosmic Clouds 3-D' and is published by The London Stereoscopic Company.

Have you ever thought about how fish arrive in a new pond or lake? Birds fly and other animals walk or crawl, but fish are restricted to watery routes, and new lakes don't necessarily have watery routes that fish can swim along. This question has puzzled biologists for centuries. Andy Green, professor at the Doñana Biological Station in Spain, has finally come up with the answer: a small number of fish eggs can survive in the guts of birds such as ducks.

The Royal Society's Insight Investment Science Book Prize shortlist was announced last week. As it does every year, Inside Science is previewing each of the books and talking to the six authors in line for this most prestigious literary prize. This week, physicist and Radio 4 colleague Jim Al-Khalili talks to Adam about how his book The World According to Physics shines a light on the most profound insights revealed by modern physics.

Presenter – Adam Rutherford
Producer – Fiona Roberts

Produced in partnership with The Open University

Available now

37 minutes


  • Thu 1 Oct 2020 16:30
  • Thu 1 Oct 2020 21:00

Take the Synthetic Voices and Personality Test

Prof Trevor Cox wants help in researching how we respond to artificially generated voices