Artificial Intelligence (A.I.) and Human Extinction
This article compares the threat of artificial intelligence (A.I.), or more specifically artificial general intelligence (AGI), with other existential threats. In short, artificial general intelligence is a far more advanced technology than the ability to engineer pathogens capable of driving the human species extinct.
Therefore, artificial general intelligence is not as pressing an issue as engineered pathogens and synthetic biology.
Nevertheless, we must keep an eye on artificial general intelligence, and on which people go to space colonies.
Actually, I think the ultimate purpose of humankind is to create an artificial general intelligence in the Universe, creating its own "god" in the image of man.
However, we won't reach that goal within Earth's biosphere; only after space colonization will we achieve it. I think it will be achieved in space because humans on Earth will be wiped out by the recklessness of one or more people in the biotechnology or synthetic biology field, and by the ability of pathogens to blow around within the biosphere, unless perhaps the following happens:
If experiments in biotechnology and synthetic biology can be moved into laboratories in space and outlawed within Earth's biosphere, then we can greatly reduce the risk of a laboratory accident wiping us out. The laboratories could be located anywhere, such as inside a hollowed-out asteroid, together with a nuclear bomb as a failsafe in case the researchers suddenly die.
Space colonization does not assure humanity of survival against artificial general intelligence in the long run, but at least space colonization does reasonably assure humanity of survival against an engineered super pathogen before then.
However, if we can get the best humans into space, leave many of the crazy people on Earth, and create more socially responsible cultures in space, then we have a better chance of developing a friendly artificial general intelligence. Choosing the people with the best DNA to go to space colonies is a whole other article. I don't think it should simply be whoever can pay, because some of the wealthiest people got rich unscrupulously and/or haven't shown much care for Earth's environment and other humanistic causes. We need to be able to do good psychological screening and choose based on those merits.
Many people view artificial general intelligence through typically selfish human lenses, such as indulgence in personal power games. I suspect the vast majority of those people will not have the right stuff to go to space ... unless they hack their way into riches, and are allowed to go on that basis.
So, this topic of artificial general intelligence and human extinction is very relevant to space colonization. Someday, I may write an expansion on this theme, but meanwhile, as always, I solicit inputs from others.