pferdefreund
Citizen of Zooville
What if sorting everything out meant the artificial intelligence recklessly using all available resources to advance itself as much as possible and spread through the universe? A more advanced intelligence does not necessarily have to be what we would consider benevolent. Indeed, if we humans are the most intelligent species on this planet and still eradicate other species as quickly as we do, even though we understand the mechanisms, then that doesn't cast a very flattering light on intelligence.
Conversely, there is no fundamental reason to believe it could not be benevolent, much like some humans invest their lives in protecting, creating, or stewarding nature sanctuaries. In particular, it could be a sign of (very) advanced intelligence to foresee the implications of one's own and others' actions and therefore collectively decide to hold back. Only a stupid farmer pulls absolutely everything out of the ground this year to maximize his profits. The smarter ones leave enough behind to maintain, or even ever so slightly build up, the humus.
It is important to spell that out: just because something is in nature, or looks natural, doesn't mean it is a great way to do things, or to be. "Nature", if we personify the natural phenomena around us for a second, is incredibly wasteful and dumb at the same time. It is not the greater force of good that people like to take it for when they look at pictures with a lot of green and feel happy about the "intact" landscape.
For example, if humans don't screw up but manage a game reserve well, the number of animals stabilizes, and mass starvations of game due to overpopulation of the reserve don't happen. The number and variety of plants increases, as young plants are not immediately eaten by the young game. Nature can't do that; she just cycles around a stable point. The natural mechanisms at work there really are "grab everything you can get as fast as you can get it and grow as much as possible." And where is nature's solution for tapping groundwater in areas that would otherwise be lush, lacking nothing but water? She will also probably never come up with energy-saving solutions in her organisms like gear drives.
And to address the specific point: you are talking about von Neumann probes. A true universal AI would quickly realize (I mean, if I can work it out, then an AI with a comparable IQ of 500 must find it trivial) that it takes merely a single space launch, plus some operations and industry in space, to get that scheme going. Why bother with pesky Earth, with its oxygenating atmosphere and locals who sue you over everything, when you just need to blast a tiny computer carrying a copy of yourself, some robotics, and a miniaturized chemical plant into space? Land it on the Moon, use the resources there to make a few more copies, land those on the asteroids, and orbit the gas giants and comets, and presto, you have access to every chemical element and the energy (solar) to fabricate it all. Then send that off to the next star systems with further copies of yourself, and the scheme continues like this. Space is vast, and the resources in space are almost endless. An AI can also afford patience, as it doesn't really die. Why then use up all the resources on Earth as quickly as possible and get in trouble with the monkeys there? Or hurt your AI ethics by failing to protect lifeforms lower than you? Buggering off with one launch and happily populating the biologically lifeless universe is actually the easy way out.
If this AI thing happens, it will present a plan for how to fix every problem with the minimum possible impact on everyone, and what will happen? We'll see newspaper op-eds saying that this is all well and good, but has the AI considered that "plan" is a very masculine word, and maybe it should have used "plans and women-plans" as a formulation instead?
And who wants to deal with idiots like that when you've just solved the ring-magnetic-field problem of fusion reactors for them?