The Expanse and Philosophy. Various authors
Arginusae and later of Leon of Salamis. Eventually execution came for Socrates too.
We have to wonder whether Avasarala and her cautious approach ever had a chance. The colonists on Ilus arrived by running the blockade maintained by the truce between Earth, Mars, and the Belt. That truce could not last forever; eventually one of the three powers in the solar system would break with the others and attempt to use the Ring System. Avasarala goes so far as to leak classified video from Ilus in order to frighten the electorate enough to side with her (“The One‐Eyed Man”). Yet scare tactics like these will only work in the short run, if at all. Hundreds of ships from Earth waited outside Abaddon’s Gate as the season began. With pirates like Marco Inaros picking them off, it seems likely that many of the would‐be colonists would eventually prefer to take their chances running the blockade. While UN ships were willing to destroy Belters who tried to run the blockade, they probably wouldn’t have done so if the ships were full of their fellow citizens. Gao was likely more right than she knew when she said, “Colonization of the Ring systems is inevitable” (“Subduction”).
While philosophers might imagine that truth will always win in the marketplace of ideas, Avasarala had no such illusions after losing the election. She left the incoming Secretary General a remarkable message: “As for policy and the direction you’re taking Earth and all her peoples, well, we disagree. One of us is wrong. I think it’s you. But I hope it’s me” (“Cibola Burn”).
Actual Threats to Our Existence
Of course, The Expanse is a work of science fiction, and its success or failure rests primarily on whether it entertains us, not whether it tells us something significant about our own world. And we certainly do not face the threat of extinction because of accidental exposure to alien technology! However, we would do well to recognize the actual threats to our existence. One survey of risk experts concluded that there is a 19 percent chance of human extinction by the year 2100.8 Ord’s estimate is closer to 17 percent.9 Some of that risk is the result of familiar nightmares such as nuclear war and engineered pandemics, and some of that risk is a function of technology that is, for the moment, only the stuff of science fiction, such as molecular nanotech weapons and runaway artificial general intelligence. These technologies present humanity with a dilemma that is similar to the one presented by Abaddon’s Gate. Should we develop these technologies quickly, even though they pose a significant risk of killing us all and extinguishing forever our vast potential? Can we resist developing them even if we recognize it is a manifestly bad idea to do so? We may soon see for ourselves. Like Avasarala, we can only hope that the pessimists are wrong.
Notes
1. Derek Parfit, Reasons and Persons (Oxford: Clarendon Press, 1984), 453.
2. Toby Ord, The Precipice: Existential Risk and the Future of Humanity (New York: Hachette Books, 2020), 3.
3. Ibid.
4. Nick Bostrom, “Existential Risk Prevention as Global Priority,” Global Policy 4 (2013), 15–31.
5. Max Tegmark, Life 3.0: Being Human in the Age of Artificial Intelligence (New York: Knopf, 2017), 221.
6. Nick Bostrom, “Astronomical Waste: The Opportunity Cost of Delayed Technological Development,” Utilitas 15 (2003), 308–314.
7. Ord, The Precipice, 37.
8. Anders Sandberg and Nick Bostrom, “Global Catastrophic Risks Survey,” Technical Report #2008‐1, Future of Humanity Institute, Oxford University (2008), 1–5.
9. Ord, The Precipice, 167.