How are humans going to become extinct?

 
Prepare to meet your maker: Will humans become extinct at our own hand?

What are the greatest global threats to humanity? Are we on the verge of our own unexpected extinction?

An international team of scientists, mathematicians and philosophers at Oxford University's Future of Humanity Institute is investigating the biggest dangers.

And they argue in a research paper, Existential Risk Prevention as Global Priority, that international policymakers must pay serious attention to the reality of species-obliterating risks.

Last year there were more academic papers published on snowboarding than on human extinction.

The Swedish-born director of the institute, Nick Bostrom, says the stakes couldn't be higher. If we get it wrong, this could be humanity's final century.

Been there, survived it

So what are the greatest dangers?

First the good news. Pandemics and natural disasters might cause colossal and catastrophic loss of life, but Dr Bostrom believes humanity would be likely to survive.

The femur of a dodo: an estimated 99% of all species that have ever existed have become extinct

This is because as a species we've already outlasted many thousands of years of disease, famine, flood, predators, persecution, earthquakes and environmental change. So the odds remain in our favour.

And in the time frame of a century, he says the risk of extinction from asteroid impacts and super-volcanic eruptions remains "extremely small".

Even the unprecedented self-inflicted losses in the 20th Century in two world wars, and the Spanish flu epidemic, failed to halt the upward rise in the global human population.

Nuclear war might cause appalling destruction, but enough individuals could survive to allow the species to continue.

If that's the feelgood reassurance out of the way, what should we really be worrying about?

Dr Bostrom believes we've entered a new kind of technological era with the capacity to threaten our future as never before. These are "threats we have no track record of surviving".

Lack of control

Likening it to a dangerous weapon in the hands of a child, he says the advance of technology has overtaken our capacity to control the possible consequences.

Nick Bostrom says there is a plausible threat of extinction in the next century

Experiments in areas such as synthetic biology, nanotechnology and machine intelligence are hurtling forward into the territory of the unintended and unpredictable.

Synthetic biology, where biology meets engineering, promises great medical benefits. But Dr Bostrom is concerned about unforeseen consequences in manipulating the boundaries of human biology.

Nanotechnology, working at a molecular or atomic level, could also become highly destructive if used for warfare, he argues. He has written that future governments will have a major challenge to control and restrict misuses.

There are also fears about how artificial or machine intelligence might interact with the external world.

Such computer-driven "intelligence" might be a powerful tool in industry, medicine, agriculture or managing the economy.

But it can also be completely indifferent to any incidental damage.

Unintended consequences

These are not abstract concepts.

Seán O'Heigeartaigh, a geneticist at the institute, draws an analogy with algorithms used in automated stock market trading.


These strings of mathematical instructions can have direct and destructive consequences for real economies and real people.

Such computer systems can "manipulate the real world", says Dr O'Heigeartaigh, who studied molecular evolution at Trinity College Dublin.

In terms of risks from biology, he worries about misguided good intentions, as experimenters carry out genetic modifications, dismantling and rebuilding genetic structures.

"It's very unlikely they would want to make something harmful," he says.

But there is always the risk of an unintended sequence of events or something that becomes harmful when transferred into another environment.

"We are developing things that could go wrong in a profound way," he says.

"With any new powerful technology we should think very carefully about what we know - but it might be more important to know what we don't have certainty about."

And he says his isn't a career in scaremongering; he is motivated by the seriousness of his work. "This is one of the most important ways of making a positive difference," he says.

Chain reaction

This eclectic group of researchers talk about computers able to create more and more powerful generations of computers.

It won't be that these machines suddenly develop a line in sarcasm and bad behaviour. But research fellow Daniel Dewey talks about an "intelligence explosion" where the accelerating power of computers becomes less predictable and controllable.

A DNA molecule: there are fears of unintended consequences from biological engineering

"Artificial intelligence is one of the technologies that puts more and more power into smaller and smaller packages," says Mr Dewey, a US expert in machine super-intelligence who previously worked at Google.

Along with biotechnology and nanotechnology, he says: "You can do things with these technologies, typically chain reaction-type effects, so that starting with very few resources you could undertake projects that could affect everyone in the world."

The Future of Humanity Institute at Oxford is part of a trend towards focusing research on such big questions. It was launched by the Oxford Martin School, which brings together academics from across different fields with the aim of tackling the most "pressing global challenges".

There are also ambitions at Cambridge University to investigate such threats to humanity.

Lord Rees, the Astronomer Royal and former president of the Royal Society, is backing plans for a Centre for the Study of Existential Risk.

"This is the first century in the world's history when the biggest threat is from humanity," says Lord Rees.

He says that while we worry about more immediate individual risks, such as air travel or food safety, we seem to have much more difficulty recognising bigger dangers.

'Error or terror'

Lord Rees also highlights concerns about synthetic biology.

"With every new technology there are upsides, but there are also risks," he says.

Geneticist Seán O'Heigeartaigh warns of the uncertain outcomes of biological experiments

The creation of new organisms for agriculture and medicine could have unforeseen ecological side-effects, he suggests.

Lord Rees raises concerns about the social fragility and lack of resilience in our technology-dependent society.

"It's a question of scale. We're in a more inter-connected world, more travel, news and rumours spread at the speed of light. Therefore the consequences of some error or terror are greater than in the past," he says.

Lord Rees, along with Cambridge philosopher Huw Price, economist Sir Partha Dasgupta and Skype founder Jaan Tallinn, wants the proposed Centre for the Study of Existential Risk to evaluate such threats.

So should we be worried about an impending doomsday?

This isn't a dystopian fiction. It's not about a cat-stroking villain below a volcano. In fact, the institute in Oxford is in university offices above a gym, where self-preservation is about a treadmill and Lycra.

Dr Bostrom says there is a real gap between the speed of technological advance and our understanding of its implications.

"We're at the level of infants in moral responsibility, but with the technological capability of adults," he says.

As such, the significance of existential risk is "not on people's radars".

But he argues that change is coming whether or not we're ready for it.

"There is a bottleneck in human history. The human condition is going to change. It could be that we end in a catastrophe or that we are transformed by taking much greater control over our biology.

"It's not science fiction, religious doctrine or a late-night conversation in the pub.

"There is no plausible moral case not to take it seriously."

 


Comments

 
  • Comment number 735.

    George Carlin commented many years ago that now the earth has plastic, it doesn't need us. There's probably enough plastic to last many eons after we're gone.

  • Comment number 734.

    Nothing lasts forever: even the sun will burn out and the universe will end. Everything that has a beginning has an end.

  • Comment number 733.

    The Universe will not notice the absence of humans; it will go on experimenting in its way with new forms of life. Humans cannot hope to hang on indefinitely. How long: a million more years? Ten million? Right now the dramatic inequality of wealth in the world is the nearest danger, with corporate fascism taking over. Malaysians have been protesting against this new world order of the 1% dictatorship.

  • Comment number 732.

    696.GhostOfTheSun
    24th April 2013 - 22:15
    "If we colonise other worlds not even the destruction of Earth will kill us off"

    Agreed. At present we have 7 billion eggs in one 8000 mile diameter basket.

    Time to colonise the solar system and get ourselves some insurance against dinosaur killers and other planetary extinction events.

  • Comment number 731.

    Only people who think about these things become stressed about them, and only for a few minutes. The rest just don't give a damn. The human race is not immortal; it's just a matter of the time and method of our demise. It could be any of a thousand and one things that kills us off. If we can delay things, then that's worth fighting for.

  • Comment number 730.

    It's a simple case of profit before people and communities. Multinational companies and corporations would shift all their plant to the Antarctic and teach the penguins to do the work of humans if it meant they could make a penny more per unit of profit. How our best and most highly intelligent doctors and professors can't see what is happening is beyond me.

  • Comment number 729.

    This comment was removed because the moderators found it broke the house rules.

  • Comment number 728.

    We have become widely distributed geographically and culturally. With such diversity of lifestyle we are going to be a hard species to wipe out.

    Our civilization is another matter. We may find it hard to sustain a high-technology civilization through the triple rope-a-dope of fossil fuel depletion, overpopulation and climate change.

  • Comment number 727.

    The FHI eggheads are a supremely brainy bunch, and their concerns should be taken seriously. However, as an aspiring synthetic biologist I cringe every time the field is used as a scapegoat for research potentially going wrong. In molecular biology progress is glacial and we are still taking baby steps. Synbio has a promising and radical future, but people really need to keep their knickers on.

  • Comment number 726.

    The most depressing thing is that none of the people who posted today will ever know how it eventually ends :(

  • Comment number 725.

    People said much the same about the dangers to the human body of travelling at high speed on railways when they were introduced. This has more in common with science fiction films like 'I, Robot' than it does with reality. I bet you that this will not happen, just like all of the other imminent threats that spurious science throws up. Then again, if I'm wrong, at least I won't have to settle the debt...

  • Comment number 724.

    As someone who spent his childhood and early adulthood under the threat of total extinction of life on this planet from nuclear war, I would say we've come a long way. To say this century is the most dangerous is myopic. And we were told back in the seventies ("Limits to Growth", Club of Rome) that we would all be starving by the year 2000 - another group of "experts" who got it SO wrong.

  • Comment number 723.

    I tend to agree. Take (a)social media for example and cellular phones. We, as people, are addicted to these two things. Many people are oblivious to the world around them, instead focused on the screens in front of them. We feel the need to share everything in our lives with everyone else, even if it harms us in the end. We spend our lives on these things. www.facebookdetox.com

  • Comment number 722.

    My money's on the cats taking over and enslaving humanity through the dual tactic of cuteness and human treats.

  • Comment number 721.

    Perfection or extinction. There is no other option. Stagnating will result in our eventual race-death anyway, so we might as well go for it.

    Here's to tomorrow. Onwards and upwards.

  • Comment number 720.

    Humanity is condemned by the very thing that has made it successful. We relent to our animal instincts and act out Darwin's theory of survival of the fittest. Since we believe in nothing other than what our senses tell us, we will forever be doomed to satisfy those senses. To do otherwise is to be superstitious, ignorant, and just plain stupid.....Right?

  • Comment number 719.

    I completely disagree. One of many large global artificial intelligence computer networks gave me the answer in 0.0043 milliseconds: the BBC did this same story in 2004, and Oxford slam dunks the age-old question for good. There is a scientific probability of technology being problematic. Did anyone else have the genetically modified chicken on a genetically modified imitation hamburger bun for lunch?

  • Comment number 718.

    I do not think we as a species will see the end of this century.

    As a previous comment stated, compassion is one of the keys.

    I just wish I was at the controls.

    Over & out.

  • Comment number 717.

    Homo sapiens WILL eventually become extinct.

    When we do, the world will just keep on turning, just as it did when the dinosaurs became extinct via asteroid 65,000,000 years ago.
    We, when compared to the immensity of time since the earth was born, are totally insignificant.
    It is only the religions which have claimed the opposite to be true!

  • Comment number 716.

    Wait, I thought the answer to all this was simply 42 !!!!

 

BBC © 2014