"Modern science is well-acquainted with the idea of natural risks, such as asteroid impacts or extreme volcanic events, that might threaten our species as a whole. It is also a familiar idea that we ourselves may threaten our own existence, as a consequence of our technology and science. Such home-grown “existential risk” – the threat of global nuclear war, and of possible extreme effects of anthropogenic climate change – has been with us for several decades.
However, it is a comparatively new idea that developing technologies might lead – perhaps accidentally, and perhaps very rapidly, once a certain point is reached – to direct, extinction-level threats to our species. Such concerns have been expressed about artificial intelligence (AI), biotechnology, and nanotechnology, for example...."
The Cambridge University Centre for the Study of Existential Risk resources page: http://cser.org/resources-reading/