Before we need a Butlerian Jihad, let’s pause and think about this

You’ve probably heard that some tech thought leaders are calling for a six-month pause in AI development so we can put some guardrails around this rapidly developing industry. Morning Wire had an interesting show on the subject.

Several analogies come to mind. The Sorcerer’s Apprentice. The genie in the bottle. The Butlerian Jihad in Dune. The Star Trek episode “What Are Little Girls Made Of?” The Terminator.

In short, this is a classic, oft-repeated fear: that we will create something we can’t control, which will then control (or destroy) us.

Remember Isaac Asimov and his Three Laws of Robotics, which, BTW, never quite worked even in his own stories, suggesting that we probably won’t come up with the right set of rules either.

But humanity is plowing forward without any rules at all, which is almost certainly a mistake.

Here are some questions.

  • Would companies actually observe a pause, even if we declared one?
  • Who should create the rules?
  • Who should enforce the rules?
  • Should we impose rules on U.S. companies while rogue nations proceed with no limits?

In Dune, the horror of enslavement to computers led all of humanity to agree to absolute war against anyone who created a thinking machine. We’re not just talking about the death penalty. They would nuke your planet if you violated the terms of the jihad.

Is the threat that serious?

My concern is that we’re ruled by morons who are more concerned about trivial things that play well in the press than about any sort of long-term planning. How can we expect such a broken political system to yield any reliable results? It’s as if the Hun is invading, and all our generals are worried about what color to make the uniform pants.