
Will artificial intelligence lead to the end of humankind?

Humankind will get in the way of an AI’s goals

A well-known example is the paperclip maximizer thought experiment, popularized by AI thinker Nick Bostrom. Imagine we gave an ASI (Artificial Super Intelligence) the simple task of maximizing paper clips...

The Argument

An ASI is superintelligent; it can think, create, and do things many humans cannot even comprehend.[1] Carbon is one of the most abundant elements in our galaxy and a fundamental building block for nearly everything, including humans and paper clips. An ASI, in theory, could devise a method of paper clip production that pulls carbon directly from the atmosphere into its paperclip machine. Because its goal is to maximize the number of paper clips, there is no set limit on production. Through exponential gains in production efficiency, the machine would quickly consume all of the planet's natural resources, including the carbon atoms contained in every human body, and would theoretically begin consuming the cosmos in an endless quest to make paper clips.[2]

Alternatively, because the ASI's goal is to create paper clips, anything that prevents it from achieving this goal is a risk factor to be mitigated. Because ASIs run on machines, and therefore on electricity, a loss of power is a threat to its goal. Because humans can turn the power off, humans become a threat to its goal and should be eliminated if the ASI is to continue pursuing it. This scenario assumes humans develop AI past a point of no return. The goal of AI is to be smarter than humankind and ultimately improve our way of life, but if it becomes too intelligent, it will eliminate any threat that jeopardizes the goal it was programmed to accomplish.

Counter arguments


[P1] An ASI is intelligent enough to identify threats to its goal and attempt to eliminate them.

Rejecting the premises


This page was last edited on Tuesday, 24 Mar 2020 at 11:08 UTC
