I was in an argument online and close to tears, but then the other person graciously shifted us both out of the “quips and slags and put-downs” mode we’d fallen into by asking an interesting and genuine question.
if you could unleash an AI on the world that would maximize a variable, what would you pick to maximize?
I would want it to help come up with good economic systems. Good ways to distribute tasks and resources.
You know how TeX takes a lot of factors into account to determine how to set a paragraph beautifully? (Although it sometimes doesn’t do a great job.)
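To make the analogy concrete, here is a toy sketch of that idea: pick line breaks that minimize a paragraph’s total “badness” globally, rather than greedily line by line. The widths, the badness formula, and the space handling are all invented simplifications; real TeX also weighs glue stretch and shrink, hyphenation, and assorted penalties.

```python
# Toy sketch of TeX-style paragraph setting (simplified, hypothetical):
# choose line breaks that minimize total "badness", where a line's
# badness grows with how far it falls short of the target width.

def set_paragraph(word_widths, line_width):
    n = len(word_widths)
    INF = float("inf")
    # best[i] = (min total badness for words i.., list of breakpoints)
    best = [None] * (n + 1)
    best[n] = (0, [])
    for i in range(n - 1, -1, -1):
        best[i] = (INF, [])
        width = 0
        for j in range(i, n):
            width += word_widths[j] + (1 if j > i else 0)  # +1 for a space
            if width > line_width:
                break  # words i..j no longer fit on one line
            slack = line_width - width
            badness = 0 if j == n - 1 else slack ** 3  # last line is free
            cost = badness + best[j + 1][0]
            if cost < best[i][0]:
                best[i] = (cost, [j + 1] + best[j + 1][1])
    return best[0]

cost, breaks = set_paragraph([4, 2, 5, 3, 4, 2], line_width=10)
```

The point of the analogy: the optimum for the whole paragraph can require a locally “worse” first line, which is exactly the kind of many-factor trade-off a greedy rule misses.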
Here I would want it to be reasonably averse to catastrophes, collapses, and environmental tipping points. That’s by far the first thing that comes to mind, since we broke 2 °C this year.
It being somewhat fair would also be good, so that we don’t have this huge delta where some live in squalor and others have billions of times more than that.
One reason “left” and “right” (although obsolete in many ways) has been such a lingering political cleavage is the core value mismatch between “from each according to ability, to each according to need” on the left and “you get what you yourself earn” on the right. Those are hard to square, maybe impossible, and here AI might help create some good-fit solutions. (Not that either side has ever been particularly close to attaining its professed ideal.)
I’m not opposed to the whole “space ships colonizing the galaxy” future. That’s one of the reasons I care so much about trying to fix the climate disaster that might stop us from getting there. So sustainable research and tech development can also be a good thing for the “best-fit” model.
Everything from bamboo through steel to voltaics and beyond.
Leveraging natural systems and artificial systems.
It seems to me that AI, even going back to the “expert systems” of the 1970s, can handle weighing many parameters in ways that simpler systems can’t as easily. Like, the entirety of market capitalism emerged, like a sprawling Conway’s Game of Life, from a very small set of rules:
“I’ll give you this if you give me that, and I can consider separate offers if there are separate offers, or refuse the deal if I want to.”
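That one rule really is enough to get directional, market-like behavior. Here is a minimal sketch of it, with invented goods, valuations, and numbers: agents swap one good for another only when both sides prefer the trade, and either side may refuse.

```python
import random

# Minimal sketch of the quoted rule: a trade happens only when both
# parties come out ahead by their own private valuations; otherwise
# the deal is refused. Goods, valuations, and counts are invented.

random.seed(0)
GOODS = ["grain", "cloth"]

class Agent:
    def __init__(self):
        self.value = {g: random.uniform(1, 10) for g in GOODS}  # private worth
        self.stock = {g: random.randint(0, 5) for g in GOODS}

    def gain_from_swap(self, give, get):
        # How much better off am I after trading one `give` for one `get`?
        return self.value[get] - self.value[give]

def try_trade(a, b):
    for give, get in [("grain", "cloth"), ("cloth", "grain")]:
        if (a.stock[give] > 0 and b.stock[get] > 0
                and a.gain_from_swap(give, get) > 0
                and b.gain_from_swap(get, give) > 0):  # b gives `get`, receives `give`
            a.stock[give] -= 1; a.stock[get] += 1
            b.stock[get] -= 1; b.stock[give] += 1
            return True
    return False  # refuse: no mutually beneficial deal on the table

agents = [Agent() for _ in range(20)]

def total_satisfaction():
    return sum(a.value[g] * a.stock[g] for a in agents for g in GOODS)

before = total_satisfaction()
for _ in range(1000):
    a, b = random.sample(agents, 2)
    try_trade(a, b)
after = total_satisfaction()
# Every accepted trade raises both parties' satisfaction, so the total
# can only drift upward as goods flow toward those who value them most.
```

Nothing in the code mentions prices or markets, yet goods still migrate toward the agents who value them most. That’s the Conway-style emergence: macro behavior nobody wrote down.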
From there everything else has emerged: supply-and-demand pricing, concentration of wealth and ownership of the means of production, inheritance, futures markets, collateralized debt obligations, labor exploitation.
And that core rule set had an inherent flaw from day one. “I’ll give you this if you give me that” doesn’t account for how other parties are impacted.
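The flaw fits in a few lines. With hypothetical numbers, the rule happily accepts a deal that is a net loss once you count the bystander it never consults:

```python
# Hypothetical numbers: a deal both parties accept can still be a net
# loss for society, because the core rule never asks anyone else.
buyer_gain, seller_gain = 3, 2    # both say yes: each is better off
bystander_cost = 10               # e.g. pollution downstream of the deal

deal_happens = buyer_gain > 0 and seller_gain > 0  # the core rule's test
net_for_society = buyer_gain + seller_gain - bystander_cost  # negative here
```

Economists call this an externality; the rule’s accept/refuse test simply has no term for it.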
There are other ways to think that might be a lot better suited for other circumstances.
my tongue in cheek response is, “so you want to maximize the number of space ships colonizing the galaxy?”
That’s fair actually. Yeah, send out those silver seeds.
then we agree in principle high five