Kip Hansen's News Brief—November 15, 2024
“In 2022, Congress passed the Global Catastrophic Risk Management Act (GCRMA). GCRMA requires the Secretary of Homeland Security and the head of the Federal Emergency Management Agency (FEMA) to coordinate an assessment of global catastrophic and existential risks over the next 30 years.”
Now, the Homeland Security Operations Analysis Center (HSOAC) has completed this assessment. The report was in fact produced by a division of the RAND Corporation: “The RAND Corporation's Homeland Security Research Division (HSRD) operates the Homeland Security Operations Analysis Center (HSOAC).”
The report is titled simply “Global Catastrophic Risk Assessment.” It focuses on risks in six theme areas:
1) Artificial Intelligence, 2) Asteroid and Comet Impacts, 3) Nuclear War, 4) Rapid and Severe Climate Change, 5) Severe Pandemics, and 6) Super Volcanoes.
Dr. Roger Pielke Jr. covered the government report on his Substack under the heading “Global Existential Risks.” Pielke Jr.'s article is well worth reading in full, but for readers' convenience we summarize it here with an excerpt chart from the full RAND report, followed by Pielke's summary chart.
Read Pielke’s entire article here.
######
Author comments:
I basically agree with Pielke Jr.
Artificial Intelligence will not become “sentient” and threaten humanity – but if allowed to direct or control anything, it could wreak havoc. Artificial Intelligence is neither smart nor rational; it cannot distinguish truth from error or fact from fiction, and, like your five-year-old, it is perfectly happy to make things up and pass them off as reality.
Supervolcanoes are a geological reality – they may erupt and may cause massive damage, but they do not necessarily pose a worldwide catastrophic risk.
It is generally accepted that comet and asteroid impacts have happened in the past and are possible in the future – the risk depends on the magnitude of the impact. Likewise, nuclear weapons have been used and may be used again. A widespread, all-out nuclear war would have the potential to cause a global catastrophe with even existential consequences.
Intentionally or unintentionally released ultra-lethal pathogens could wipe out humanity – or do enough damage to force us back to the Stone Age. The population does not have to shrink much before we lose advanced technological capabilities: not even the smartest among us can produce computer chips, provide cell phone service, or create a vaccine from scratch.
So, what risks should our governments and think tanks focus on?
Hint: Not climate change.
Thank you for reading.
######