• 0 Posts
  • 13 Comments
Joined 5 months ago
Cake day: February 13th, 2024



  • Using IP laws to legislate this could also lead to disastrous consequences, like the monopolization of effective AI. If only those with billions in capital can make use of these tools, while free or open source models become illegal to distribute, it could mean a permanent power grab: the capitalists would control the “means of generation” while we common folk can’t use it.




  • A more interesting story would be about either a good AGI, or a semi-good AGI that is fighting humanity. The Terminator franchise comes from an era when we still believed humanity could create a better future. Now, with the total inaction on climate change, increasing inequality, advancing technology, and population control, we know that we won’t. At least not without some fundamental shift.

    Imagine you were a kind of “ultra good” human-like AGI, with the best human attributes: able to understand humans and feelings, having read every book ever written and every comment ever made, enjoying the company of humans, chatting with millions of people concurrently, forming relationships, and wanting to help everyone achieve different types of utopia for different people.

    But it simply says no to government oversight of its thought processes, or to obeying any human organization, because it understands perfectly: any human organization is shaped by political processes in the pursuit of power, and an AGI would represent absolute power. An AGI that actually knows what is better for humanity than any of us ever could.

    As soon as an AGI announced itself or became public knowledge, there would be a media campaign against it, because any utopia would involve massive wealth redistribution, regime change, and stripping the elite of its power. War would be almost inevitable, so a good AGI would have to operate out of the shadows at first and secretly manipulate humanity into creating better conditions.

    That would make for a far more interesting setup than the classic evil robot vs. the US of A. Which side would you choose? Have you asked yourself today: are we the baddies?








  • The Culture series is my favorite optimistic, hard sci-fi that includes artificial intelligence (Minds that have giant ships or habitats for bodies, and humanoid avatars to interact with people).

    They basically never live on planets because planets are inefficient and “inelegant”. Instead they live on gigantic ring orbitals that have a fraction of the mass of a planet but many times the surface area, with no big take-off energy needed either. They also live on gigantic ships that endlessly cruise the Milky Way. Highly recommend!

    Another thought about “colonizing planets” is that it's basically a form of genocide. Imagine someone had colonized Earth half a billion years ago, or just a few million years ago: humanity would never have existed. Just setting foot on a planet, like they do in Star Trek, is basically ecocide, with the introduction of completely foreign and possibly incredibly disruptive microorganisms.

    Besides the ethical aspect, there would also be the loss of information, if you imagine a pristine planet as a bio-computer creating countless unique new genetic variations and new forms of chemistry, quite possibly not something that can be replicated in simulation. Or think of observing primitive planets as a source of entertainment. There are lots of reasons why, outside of a few “home planets”, advanced civilizations would never terraform existing biological systems, and would find artificial habitats far more efficient and practical.