Stephen Hawking warns of mankind wiping itself out: where to find hope?

Stephen Hawking spoke of a multitude of manmade threats encircling the planet in a recent BBC lecture. But how real are the risks, and what hope is there of mitigation?

Dave Chidley/AP/File
Professor Stephen Hawking lectures on his research, life and times at the Perimeter Institute in Waterloo, Ontario, June 20, 2010.

Physicist Stephen Hawking has said the chances of cataclysmic events which could affect the survival of humanity are soaring, and we have only ourselves to blame.

In comments while recording the annual BBC Reith Lectures, the renowned physicist asserted that disaster befalling planet Earth in the next 1,000 to 10,000 years is a "near certainty," and that increasingly the threats facing humankind are of our own making.

Yet he also insisted that humanity will likely survive, because by the time catastrophe strikes, we shall have colonized other worlds.

"However, we will not establish for at least the next hundred years,鈥 said Prof. Hawking, 鈥渟o we have to be very careful in this period".

The threats Hawking specified included nuclear war, global warming, and genetically engineered viruses; he argued that progress in science and technology is in some ways a gamble, improving the lives of billions but also introducing the means to end humanity.

Yet this is no new debate. The idea that man's advancement could be his very undoing has been with us for centuries.

"The vices of mankind are active and able ministers of depopulation," wrote the British economist and demographer Thomas Malthus in 1798. "They are the precursors in the great army of destruction; and often finish the dreadful work themselves."

Writing in 1996, geographer William B. Meyer observed that "humankind has become a force in the biosphere as powerful as many natural forces of change, stronger than some, and sometimes as mindless as any."

Meyer continued, "Nature has not retired from the construction (or demolition) business, but humankind has in the recent past emerged as a strong competitor."

One area of particular concern is artificial intelligence and the autonomous weapons being enabled by huge advances in the technology: systems such as "armed quadcopters that can search for and eliminate people meeting certain pre-defined criteria," as an open letter published in July 2015 warned.

Hawking signed this letter along with more than 1,000 researchers, experts, and business leaders, including co-founders of Apple and Skype. The letter went on to say that "the deployment of such systems is, practically if not legally, feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms."

The letter continued: "If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the end point of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow."

In an interview on Thought Economics, Jaan Tallinn, co-founder of Skype and of the Cambridge Centre for the Study of Existential Risk, divides AI into two categories: 'sub-human' AI, which includes technologies such as autonomous weapons, and 'super-human' AI, which would have the ability to reason and model better than humans themselves.

In the same interview, however, Sir Crispin Tickell, former diplomat and advisor to successive UK Prime Ministers, insists that any risk to humanity from the intentional misuse of technology is far less of a concern than that of accidental misuse.

"Science is hard, and scientific breakthroughs are even harder, and so most scientists are not motivated to think of these negative consequences," said Sir Crispin.

"When you are an AI researcher, for example, you're highly motivated to improve the capability and performance of your system, rather than research the side effects those systems could have in the world."

But while our advancing technologies carry inherent risks, there are also many experts and researchers conscious of the dangers and working hard to mitigate them.

Many organizations have arisen to undertake this responsibility, including the Future of Life Institute (funded in part by Elon Musk, founder of Tesla), the Future of Humanity Institute at Oxford University in the UK, and the Centre for the Study of Existential Risk at Cambridge University in the UK.

And while it is ironic, as the BBC notes, that such a prominent scientist as Stephen Hawking should be the one to warn of the dangers of scientific progress, perhaps such eminent academics are exactly the people who need to act as guardians, as watchmen, over the technologies that most of us know little about.
