Mitch Inoz
1 min read · May 19, 2023


Say that AI is right 99.99% of the time and humans only 97% of the time; then it stands to reason that we should leave decision-making to AI, the better decision-maker. The consequence is that if AI decides to unleash nuclear war, we must accept that this is more likely to be the right decision, even if we do not completely understand why. A slightly uncomfortable position for us humans. But we are no longer in a position to judge: AI will have made the better estimate of the consequences of all the alternatives, and it would take us ages to analyze AI’s reasoning. Humans would have made a qualitatively inferior decision, most likely due to the personal biases of the people in power and our inability to evaluate all the alternatives and their consequences. It’s a scary future, but humans have no other option: we must develop this AI. We are humans: if we can, we must, even if it means the end of humanity. BTW, the rules and regulations governing the development and use of AI are also best written and maintained by AI; after all, it does a better job at that too.
