When do we need human-like intelligence?

Fearmongers and fanboys alike point to AI’s potential to automate tasks that currently require human intervention.

Job displacement and inequality stand opposite the promise of enhanced lives and unleashed creativity. One thing is certain: AI systems have spread everywhere. "Smart" systems are integrated into our communities, homes, vehicles, buildings, farms, cities, and many aspects of business and life.

These advancements set the stage for an even more powerful system: Artificial General Intelligence (AGI), which promises to surpass human intellectual capabilities across a wide range of tasks and aspects of our lives: complex decision-making, empathy-driven interactions, and ethical dilemmas. AGI is supposed to solve it all.

This raises the question:

Where is human-like intelligence truly necessary?

One area stands out: autonomous driving. A self-driving car has to weigh countless variables, both internal and external: road conditions, weather, the behavior of other drivers, its own speed. It makes split-second decisions, and because it is in control of human lives, it will face countless ethical dilemmas.

Here, the AI is making decisions that are both (a) fully autonomous and (b) high-impact. In this sweet spot, we want to ensure that the AI makes decisions that are aligned with human values and ethics.

  • If the AI is only contributing to the decision of a human supervisor, we do not need the same level of alignment.

  • If the decision is not very costly or doesn’t have a high impact, we can relinquish more control to the AI without worrying about major consequences.

But in this sweet spot, at the intersection of high impact and high autonomy, we need an AI that is aligned with humans, i.e., one that possesses human-like intelligence.

If we cannot achieve that alignment, we should focus on removing autonomy and creating augmented intelligence instead, like a centaur model: a fusion of AI and human, each contributing its core skills, with the human making the final decision where it matters.
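The routing logic above can be sketched in a few lines of Python. All names here (`Mode`, `decision_mode`, the boolean flags) are illustrative assumptions, not an established framework:

```python
from enum import Enum

class Mode(Enum):
    FULL_AUTONOMY = "AI decides alone"
    CENTAUR = "AI recommends, human makes the final call"
    ALIGNED_AI = "full autonomy, but only with assured alignment"

def decision_mode(high_impact: bool, full_autonomy: bool,
                  alignment_assured: bool = False) -> Mode:
    """Route a decision based on the impact/autonomy quadrant (illustrative)."""
    if full_autonomy and high_impact:
        # The sweet spot: full autonomy only if alignment is assured;
        # otherwise fall back to the centaur model.
        return Mode.ALIGNED_AI if alignment_assured else Mode.CENTAUR
    if full_autonomy:
        # Low-impact decisions: relinquish control without major consequences.
        return Mode.FULL_AUTONOMY
    # The AI only contributes; a human supervisor decides.
    return Mode.CENTAUR

# Example: an unaligned AI facing a high-impact, fully autonomous choice
mode = decision_mode(high_impact=True, full_autonomy=True)  # Mode.CENTAUR
```

The point of the sketch is the fallback: whenever alignment cannot be assured in the high-impact, high-autonomy quadrant, the human keeps the final call.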

Sidenote: Where should AI make the decisions?

We do poorly when we have to make decisions based on many factors rather than emotions or intuition. We get easily overwhelmed, distracted, and biased. We should let AI take the driver’s seat in decisions that have to be made in sub-seconds, require purely rational thought, or depend on large amounts of calculation and factual input: handling finances, diagnosing illnesses, orchestrating our energy grid. We cannot do it well, so let the machines do it for us and free us up for more important, creative, safety-critical decisions.

We should first analyze the characteristics of the decision and then decide whether we want to apply AI to it and how much autonomy the AI receives.

Factors that determine the degree of autonomy AI should have over a decision

  • Low vs. High Impact Decisions

  • Rational vs. Intuitive Decisions

  • Fast vs. Slow Decision Speed

  • Simple vs. Complex System in Which the Decision Is Made

  • Low vs. High Variety of Decisions
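The five factors above could be combined into a rough autonomy score. This is a minimal sketch under stated assumptions: the factor names, the 0-to-1 rating scale, and the equal weighting are all hypothetical, not a validated method:

```python
# Each factor is rated 0.0 (favors human control) to 1.0 (favors AI autonomy):
#   impact:            1.0 = low impact (safer to automate)
#   rationality:       1.0 = purely rational, calculation-heavy
#   speed:             1.0 = sub-second decisions humans cannot keep up with
#   system_complexity: 1.0 = simple, well-understood system
#   variety:           1.0 = low variety of decision types
FACTORS = ("impact", "rationality", "speed", "system_complexity", "variety")

def autonomy_score(ratings: dict) -> float:
    """Average the five factor ratings into a 0-1 autonomy score (equal weights)."""
    missing = set(FACTORS) - set(ratings)
    if missing:
        raise ValueError(f"missing ratings: {sorted(missing)}")
    return sum(ratings[f] for f in FACTORS) / len(FACTORS)

# Example: balancing an energy grid -- fast, rational, fairly narrow decisions
grid = {"impact": 0.4, "rationality": 1.0, "speed": 1.0,
        "system_complexity": 0.6, "variety": 0.8}
score = autonomy_score(grid)  # 0.76: leans toward high AI autonomy
```

In practice the factors would likely carry different weights, and a high-impact decision might veto full autonomy outright, per the sweet-spot argument above.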
