Brief Note on the Survival of Humanity

Hello Bard CEP readers!

It’s that time of year again – summer is upon the Northern Hemisphere. For those of us still involved in the world of schooling, that means a time of rest and, perhaps, a time of change. Over the past few months, I’ve found myself moving away from the beautiful natural scenery of the Hudson Valley and toward the wonderful bustling life of New York City (or Queens, as I am often corrected). In June, I also began my internship as a Research Assistant at the Global Catastrophic Risk Institute (GCRI), a startup think tank focused on finding ways to make the universe a better place.

This past year at Bard CEP, many of us in Gautam’s economics courses learned how to calculate risk and why doing so matters. Risk has many definitions, but the simplest may be the magnitude of an event multiplied by the probability of its occurrence. In my opinion, risk analysis is fascinating and incredibly useful, if for no other reason than that it gives people a common metric by which to value an action. Because of that common metric, the outcomes of risk analyses can strongly shape decision-making, and analyzing global risks could likewise have a very large impact on policy-making.
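To make that definition concrete, here is a minimal sketch in Python; the events, probabilities, and magnitudes are hypothetical numbers chosen purely for illustration:

```python
# Minimal sketch of the risk definition above: risk = magnitude x probability.
# All events, probabilities, and magnitudes below are made up for illustration.

def risk(probability: float, magnitude: float) -> float:
    """Expected harm: the probability of an event times the magnitude of its harm."""
    return probability * magnitude

# A frequent, low-harm event vs. a rare, high-harm event (arbitrary harm units):
frequent_minor = risk(probability=0.10, magnitude=100)            # 10.0
rare_catastrophic = risk(probability=0.001, magnitude=1_000_000)  # 1000.0

# The common metric makes the two directly comparable:
print(f"frequent but minor: {frequent_minor}")
print(f"rare but catastrophic: {rare_catastrophic}")
```

Even with a much lower probability, the catastrophic event carries the larger risk once magnitude is factored in – which is precisely why low-probability, high-magnitude risks are worth studying.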

Joining GCRI has allowed me to build on the strong foundation that Bard CEP provided and apply that knowledge to new, original research that could reduce the likelihood of human extinction! Honestly, I would be hard pressed to find better motivation. Global catastrophic risks (GCRs) are risks that pose a significant threat to humanity on a global scale. GCRs are risks of the highest magnitude, regardless of probability; indeed, many GCRs may be highly unlikely. Some commonly studied examples include nuclear war, catastrophic climate change, plagues, and artificial intelligence and other emerging technologies. Also relevant is work from the seemingly disparate fields of astrobiology (that’s right, aliens!), astrophysics (cosmic impacts), geology (supervolcanoes), ecology (ocean acidification, anoxia, and biodiversity loss), social-ecology (societal collapse), and many more that I have yet to encounter.

Within my first month with GCRI, I researched, wrote, and submitted my first paper for publication in an academic journal along with my colleagues, Seth Baum and Jacob Haqq-Misra. The paper proposes a scenario in which humanity decides to implement one of the more popular geoengineering methods available, solar radiation management. If some future catastrophe were to eliminate society’s ability to keep managing this intervention, humanity could face a second catastrophe in the form of rapid planetary warming. This double-catastrophe scenario could ultimately result in the extinction of humanity and should be avoided. Writing this paper was very exciting, and I’m interested to see how it is received by the scientific and risk communities.

I’m currently working on a paper discussing the importance of catastrophe recovery research and laying out a possible methodology for future work in this area. The paper is mainly focused on helping future researchers answer the question: what should be done after a large catastrophe that eliminates the majority of the human population and advanced technological civilization? A related question is: after such a catastrophe, how can we increase the likelihood of human survival and the reconstruction of advanced technological civilization? Exciting stuff!

I’ll be sure to keep this blog updated with any other relevant work as well.

Here’s to humanity!

-Tim
