Upper post

This site collects the roadmaps created by Alexey Turchin. The roadmaps are intended to offer a new way of thinking that will help solve the most complex problems. First of all, I want to solve the problems of […]

Doomsday Argument Map

The Doomsday argument (DA) is the controversial idea that humanity has a higher probability of extinction than we would otherwise expect, based purely on probabilistic reasoning. The DA rests on the proposition that I will most likely find myself somewhere in the middle of humanity’s […]

AGI safety solutions map

When I started working on the map of AI safety solutions, I wanted to illustrate the excellent article “Responses to Catastrophic AGI Risk: A Survey” by Kaj Sotala and Roman V. Yampolskiy (2013), which I strongly recommend. However, during […]

A Map: AGI Failures Modes and Levels

This map shows that an AI failure resulting in human extinction could happen at different levels of AI development: before it starts self-improvement (which is unlikely, but we can still envision several failure modes), during its takeoff, when it […]

Immortality Roadmap

The Roadmap to Personal Immortality is a list of actions one should take to live forever. The most obvious way to reach immortality is to defeat aging, to grow and replace diseased organs with new bioengineered ones, and in […]