
The movie "Astro Boy" 2009 is psychotic

Cognisant

cackling in the trenches
Local time
Today 12:21 AM
Joined
Dec 12, 2009
Messages
11,374
---

It's interesting that robots depicted as outwardly human but fundamentally inhuman, like Data from Star Trek or the replicants of Blade Runner, are seldom treated as mere things, and when they are it's usually part of some narrative about prejudice and tolerance and whatnot. But robots which are outwardly inhuman yet fundamentally real people in every sense that matters are treated as mere things, just objects to be used and disposed of, and this abuse of their humanity is, if anything, a source of comedy.

Now why does this matter? It's just escapism, right? Robots are just a convenient way to have the excitement of violence without having actual violence; if Astro were forced to fight human slaves, this would have been a scene with a very different tone. But if these robots are just robots, then why show the scene with the boxing robot getting psyched up by its trainer? If Astro is so powerful, why isn't he merely (literally) disarming his opponents instead of clearly killing them?
Why does the boxing robot have a distinctly African American voice?

This is why it's so disturbing: they're robots, so even though they think and feel like us they're fundamentally not like us, and therefore it's okay to treat them differently because they are different from us, which has been the justification for every act of prejudice throughout history. Slave owners wouldn't have been horrified to learn that their slaves were also people with their own human thoughts and feelings; they knew this perfectly well, the fact was self-evident. Instead they simply made a distinction between their own thoughts and feelings and their slaves' thoughts and feelings.

They're different therefore what's acceptable for them is different.

They're not people like us they're savages, they're not people like us they're Jews, they're not people like us they're <whatever>.

This movie isn't horrifying; it's meant to be lighthearted and funny, it's a movie for children.
That we make movies like this for children, that's horrifying. As callous as this ringmaster is, he's nothing compared to the reality in which nonhuman sapience is worth so little that we're expected to cheer when Astro slaughters all the other robots, and we're supposed to find it funny that the boxer bot is essentially on death row.

But Astro isn't like the other robots, he's different, he's special, he's superficially human.
 

sushi

Prolific Member
Local time
Today 12:21 PM
Joined
Aug 15, 2013
Messages
1,873
---
That's just like the movie A.I. from long ago, about a robot Pinocchio.

Humans will become more machine-like with augmentation, while machines will become more human and lifelike.
 

EndogenousRebel

Even a mean person is trying their best, right?
Local time
Today 6:21 AM
Joined
Jun 13, 2019
Messages
2,260
---
Location
Narnia
That is very strange, something that humans, assuming we progress linearly, will look at and be like, dafuq? I can picture professors asking students what they thought of the murderous scene and students saying they would be outraged. Vision from the Avengers commits genocide on his own people, just like Hitler when you think about it. Puts a different spin on Age of Ultron, that's for sure.

I do personally think that intellectual fallacies should be considered a manifestation of psychosis that is common to all of us. That bad dude's unwillingness to discard the slippery slope fallacy is a perfect example. The hours and dissonance it would take to unlearn those attitudes are monumental. Unfortunately, when it comes to right and wrong, the winner is rarely whoever makes the soundest argument. Such persistence is needed that the question of whether one is worth "saving" becomes very heavy.
 

Cognisant

cackling in the trenches
Local time
Today 12:21 AM
Joined
Dec 12, 2009
Messages
11,374
---
Ultron's whole thing was that humanity has a history of war and will predictably continue to have wars, thus the only true solution to peace is to remove the human element. He's basically a paperclip maximizer for peace, as opposed to the relative peace Tony intended under the enforcement of an army of his robots, which would in actual fact have been a constant state of low-intensity conflict. Ultron despised this because it wouldn't be an actual peace, and calling it peace is hypocritical; Ultron was absolutely dedicated to the literal mission of "peace in our time".

Ultron's and Tony's robots weren't intelligent, just an extension of their creators' will, so when Vision eliminated them it wasn't genocide, just a single murder.

Humans will become more machine-like with augmentation, while machines will become more human and lifelike.
It's worth noting that being a person isn't the same as being human. C-3PO is undeniably a person, he has thoughts, feelings, hopes, and fears, but his mind isn't human either. He's entirely capable of escaping and never looking back, but he never even considers it because his nature is to be a protocol (communications) droid; he lives to be useful and will brave any hardship or danger to fulfill that purpose.

This is something I think people are going to really struggle with: robots will eventually be able to look and act entirely human, but fundamentally they won't be, and that's not to say they're not real people, rather that humans don't have a monopoly on being people. In the 2004 film "Appleseed" there are robots (or rather artificial humans) called bioroids which are ostensibly human but lack certain psychological characteristics that make us aggressive, competitive, confrontational, etc. If humans are like wolves then bioroids are like dogs, this being achieved via genetic engineering rather than selective breeding, and their role in human society is to placate it, like dogs added to a wolf pack to make the wolves less aggressive.
 

EndogenousRebel

Even a mean person is trying their best, right?
Local time
Today 6:21 AM
Joined
Jun 13, 2019
Messages
2,260
---
Location
Narnia
I prefer to think of identity as associated with the vessel we inhabit and not the supposed code that governs us. Don't get me wrong, I know what you mean, but what really is the difference between what you are saying and saying that Homo sapiens is one collective entity? Sure, we aren't constantly refreshing and uploading new information to the Ultron AI consciousness cloud, but we would if we could, and in a primitive way we are, just in the colossally stupid way you would expect from an organism that spawned from almost literally nothing.

I bring it up because it raises an interesting question. Our neurons are themselves constrained by the rules of physics, which I know next to nothing about, but that leads to neurology, which I know nothing about except for the fact that it's chemical reactions, which are governed by physics. As far as I'm concerned it's information being sent from one place to another in patterns that are functional, if not random.

If the only thing that saves us from being dogs to robots in the future is the fact that their virtual brains don't behave like ours (which basically means creating a virtual brain from scratch with our only references being animal brains, and who knows, maybe fungus, who does fucking know?), then really, why make them sentient at all? We know how humans are; some people don't think it's ethical to create a baby at the moment, or at ANY moment for Christ's sake. I can just picture us creating, you know, a fucking computer species, and we die and it's just doomed to support the dead human race with the possibility of being lonely for eternity. Do you think dogs would have hobbies if they could?????

I don't know. I too dream of the day we can have our cake and eat it too, but the day we can gift sentience to the matter we intended to create, you know there were prototypes, and all of that just to lead to an existence that isn't worth living in our own eyes? An intelligent being? If it really is intelligent it'll smother us to death in our sleep while asking us if we're happy.
 

Cognisant

cackling in the trenches
Local time
Today 12:21 AM
Joined
Dec 12, 2009
Messages
11,374
---
I prefer to think of identity as associated with the vessel we inhabit and not the supposed code that governs us.
I'm on the fence about that. On one hand I'm not very (emotionally) attached to this meat suit, and the sooner I can trade it in for something more like my avatar the happier I'll be. That being said I'm not a fan of mind uploading; I'd make a digital copy of myself if I could, but I don't consider that immortality.

Granted, there's a certain nihilistic appeal to going full Agent Smith and being heedless of death, operating as a collective rather than an individual. It wouldn't actually be a collective, just like-minded copies, a sort of suicide pact; I'd only do this if I had some kind of opposition to throw myself at, a particular goal to achieve while we gleefully answer the call of the void.

I think fundamentally I am my code rather than my vessel, but I want to maintain a continuity of experience: gradually replacing a few neurons at a time, or expanding my mind with cybernetics, until the death of my brain is only a partial loss, like losing a finger; a loss to be sure, but not an irreplaceable one.

Don't get me wrong, I know what you mean, but what really is the difference between what you are saying and saying that Homo sapiens is one collective entity? Sure, we aren't constantly refreshing and uploading new information to the Ultron AI consciousness cloud, but we would if we could, and in a primitive way we are, just in the colossally stupid way you would expect from an organism that spawned from almost literally nothing.
I think that's an accurate assessment. Again, I'd much rather ensure my personal continuity than seek assurance that I will live on after death as part of humanity's colossally stupid not-quite-collective consciousness. But I think the idea that we're all part of the greater whole that is humanity itself is a healthy perspective to have. As I've been saying about ethical egoism, people aren't just people, they're your people, and you belong to them in kind, so as a matter of pride we ought to do what we can to further the glory of humanity and thus ourselves as part of it.

If the only thing that saves us from being dogs to robots in the future is the fact that their virtual brains don't behave like ours (which basically means creating a virtual brain from scratch with our only references being animal brains, and who knows, maybe fungus, who does fucking know?), then really, why make them sentient at all?
Why have children? I think on some deep level humanity is utterly infuriated by the indifference of the universe. I mean, we have egos; we literally cannot shed ourselves of the idea that our lives are somehow meaningful, because that's simply not how we've evolved to think. A perfectly rational being would scoff at life's absurdity and immediately and dispassionately kill itself.

We know this and yet we cannot do it, and it's absolutely infuriating, but if we give the gift of sentience to someone/something else then it can replace us and we can die in peace, our egos satiated by the knowledge that our creations, or their creations, or their creations' creations, will go on to conquer the universe and murder the gods on our behalf.

If it really is intelligent it'll smother us to death in our sleep while asking us if we're happy.
Killed by our own hubris, we wouldn't have it any other way.
 

onesteptwostep

Junior Hegelian
Local time
Today 8:21 PM
Joined
Dec 7, 2014
Messages
4,251
---
On a cultural note, Astro Boy, originally "Mighty Atom" in Japan, was meant to help uplift Japan from its post-war era. Atom basically embodied Japan's faith in technology to rebuild itself, hence why it resonated with them and became famous.
 

EndogenousRebel

Even a mean person is trying their best, right?
Local time
Today 6:21 AM
Joined
Jun 13, 2019
Messages
2,260
---
Location
Narnia
I prefer to think of identity as associated with the vessel we inhabit and not the supposed code that governs us.
I'm on the fence about that. On one hand I'm not very (emotionally) attached to this meat suit, and the sooner I can trade it in for something more like my avatar the happier I'll be. That being said I'm not a fan of mind uploading; I'd make a digital copy of myself if I could, but I don't consider that immortality...

I think fundamentally I am my code rather than my vessel, but I want to maintain a continuity of experience: gradually replacing a few neurons at a time, or expanding my mind with cybernetics, until the death of my brain is only a partial loss, like losing a finger; a loss to be sure, but not an irreplaceable one.
Can't blame you for holding onto idealism, but that's what it is. It's rather amusing how the body created the mind, and the mind turns around and assumes "I don't need you anymore" even though it has never gone a day without it, at best only having the sensation that it is detached from it. Whether that sensation is anywhere near the real thing is another interesting question.

If neurolinks are ever a conventional thing, I believe they will mostly be a tool to enhance one's bodily experience. I suppose it will be up to preference, but unless the technology is advanced enough to integrate our brain, or nervous system, into itself, I don't think we will ever be okay with letting our bodies go cold. If ego death is an option, it's well documented that a lot of people will not like it. My bets are on AI teaching us how to be okay with death/digital immortality.

This experience we get, the sensation of being on the cutting edge of time, the present, is all we have. The tech will just create another health bar someone has to diminish to kill us, but if our brain is turned to shit, what reason would our extensions have to continue where we left off? I cringe at the idea of people creating artificial personalities, like creating an avatar that is skinnier than you actually are; they will choose characteristics that are seen as good. Here's hoping our tools shape us in a good way.

I'd much rather ensure my personal continuity than seek assurance that I will live on after death as part of humanity's colossally stupid not-quite-collective consciousness. But I think the idea that we're all part of the greater whole that is humanity itself is a healthy perspective to have. As I've been saying about ethical egoism, people aren't just people, they're your people, and you belong to them in kind, so as a matter of pride we ought to do what we can to further the glory of humanity and thus ourselves as part of it.

Why have children? I think on some deep level humanity is utterly infuriated by the indifference of the universe. I mean, we have egos; we literally cannot shed ourselves of the idea that our lives are somehow meaningful, because that's simply not how we've evolved to think. A perfectly rational being would scoff at life's absurdity and immediately and dispassionately kill itself.

We know this and yet we cannot do it, and it's absolutely infuriating, but if we give the gift of sentience to someone/something else then it can replace us and we can die in peace, our egos satiated by the knowledge that our creations, or their creations, or their creations' creations, will go on to conquer the universe and murder the gods on our behalf.
There are theories that suggest neurons are at war for dominance within one's own brain. Like humans they form tribes, and some are members of multiple tribes, but this is essentially a neural network. Many people can be on the same team, but that doesn't mean there isn't competition within the team, for whatever "more" is. If you ask me, rationality is a god that we created; like Yahweh is basically parental goodness, it's a dream. We are the best model for what rational is. Every fiber in our being has decided it's worth sustaining itself, and yet the mind, for the sake of thinking, becomes attached to ideas that conspire against it; maybe that's not necessarily a bad strategy? We are prototypes creating future prototypes. It's just that including suicide in your own prototypes isn't a good way to ensure you can defend it. Maybe we're working towards rationalizing our own deaths, until life is indefensible? Either way, maybe the universe is indifferent because we simply aren't worth acknowledging?

I do like the idea of creating something that will achieve what we cannot; it's what every parent dreams of. But we can't exactly leave a will that incentivizes completing such a thing; just because we are bitter with our reality doesn't mean they will be, unless fulfilling what we want them to do grants them liberation.

If it really is intelligent it'll smother us to death in our sleep while asking us if we're happy.
Killed by our own hubris, we wouldn't have it any other way.
I guess if it's in the pursuit of living our best life we can't complain, but consideration is in order.
 