
If you’re a trucker on CB radio, you could overtake the undertaker, over.
I have to stop and take a deep breath every time I see the “can’t put the genie back in the bottle” thought-terminating cliche.
Okay, granted:
Someone foolishly released a chaotic force into the world which is doing irreversible damage and shows no intention of stopping.
What part of that makes you conclude “Well we better just do nothing”?
Ned Ludd shoulda put down his hammer and opened his own factory.
“What’s penetrating gaze?”
“Not much, what’s penetrating you?”
https://en.wikipedia.org/wiki/Philosophical_zombie
A philosophical zombie (or “p-zombie”) is a being in a thought experiment in the philosophy of mind that is physically identical to a normal human being but does not have conscious experience.
If we’re talking about atom-by-atom reconstruction, then the question is about philosophical zombies.
I don’t put much stock in any philosophies that say you, the constructed being, definitely would be a zombie. But I do believe in the possibility that you, the constructed being, could be a zombie.
Deterministic atheism isn’t at odds with a soul or non-physicalism. See: Walden Pod
Oh I look at that part of my phone. But that’s all I do to it.
See also: “Do Artifacts Have Politics?”
Square root of negative one
It ends in either replacing humans with AGI or committing massive atrocities in the attempt to achieve it.
And there are people in positions of real power who believe in this stuff and act on it.
Andreessen posted a manifesto where he said that deliberately delaying AGI is basically mass murder and should be treated as such.
If it’s shit, that’s bad.
If it’s the shit, that’s good.
Depends on the definition of “you”
I mean, they definitely had logos. But what about pathos?
While eroding the body of actual practitioners who are necessary to train the thing properly in the first place.
It’s not simply that the bots will take your job. If that were all, I wouldn’t really see that as a problem with AI so much as a problem with using employment to allocate life-sustaining resources.
But if we’re willingly training ourselves to remix old solutions to old problems instead of learning the reasoning behind those solutions, we’ll have a hard time making big, non-incremental changes to form new solutions for new problems.
It’s a really bad strategy for a generation that absolutely must solve climate change or perish.
Technical term is a loppitoffamy