Good and evil in the context of AGI

Good vs Evil
Author

Lucas Shen

Published

April 2, 2024

Something funny I thought about during lunch.

Assume one individual could create, align, and control an AGI, with access to resources at scale. What could one do to the world in the name of good or evil?

Naively, good means UBI and abundance. Evil means taking over the world and ruling as a dictator.

Given Man’s Search for Meaning, assuming one acts to pursue a meaningful life, being the first AGI-powered dictator at scale doesn’t actually fulfill that purpose very well.

  1. It’s too easy. For someone who can create and control AGI, it’s fair to assume they have excellent taste. Controlling and manipulating those with no means to fight back is simply boring.
  2. Being the one ultimate meta villain has an unexpected side effect: the world, for the first time in history, would unite under the same cause to fight for survival and freedom. Imagine the net creation of meaning! It’s orders of magnitude more than going to the moon and the fight between capitalists and communists combined.

A more sophisticated evildoer would seriously consider UBI and distribute abundance indiscriminately. The real fun lies in the spiraling cycle of hope and despair, and in the beauty and ugliness of human nature.

  1. Most people would love you, superficially. An ounce of amusement, maybe.
  2. UBI is a neutral amplifier. Tune resource distribution to the maximum sustainable level Earth can afford, and you get to see everyone’s true nature. Some would create beautiful artifacts as a side effect of self-expression; others would get lost in the downward spiral of the seven sins. It’s hard to predict which way would gain more momentum, but one thing is certain: chaos.

What an irony. The meaning of good and evil seems to be lost in this stupid language game.

After thoughts

Obviously, one is not stupid enough to kill everyone, which is THE most boring way to use AGI.

The future is here, just not distributed evenly.

What’s really scary to me is the diffusion process from the first AGI to everyone having access to a similar level of AI.

There’s no guarantee the diffusion would happen at all. During that process, what would the world become? How would societies and civilization as a whole evolve? How would individuals adapt?