#if we're fighting for both humans and robots we should consider both sides as needing sympathy but also reconcile with who they are
hellhathfrozethover · 2 years ago
^^^^^^^^
(tried to talk in tags but it wouldn't fit......converted to text and the fucking post crashed so far in. i hate technology. is it ok if i add on though i know that can be annoying.....but a valve burst pls forgive me)
(So mad I lost all that text. Let's see if I can remember what was said.)
Genuinely happy to see someone open this point for discussion, like FUCK YEAH PAINT IT GREY.
What we have here is a moral dilemma conjured out of a fundamental mismatch between how humans and humanoid robots (I distinguish humanoid robots from others because I think that has a lot to do with it) think.
Where you fall in the argument is all contingent on what you stand for personally.
If you stand for individuality and personal freedom, reprogramming is inconsiderate at best, and oppression comparable to slavery at worst.
If you stand for order and smooth social operation, this is the easiest, most painless road to rehabilitation.
If you believe independent thought is the height of a humanoid robot's lived experience, their agency and voice are the most important elements in the choice to reprogram them.
If you believe humanoid robots are simply industrial/domestic/administrative aides (tools, for some), their usefulness and service to whatever they were built for is the only factor that matters.
If you view them under the lens of utilising a deep learning AI, it will feel reductive and patronising to re"program" them as if they couldn't think for themselves.
If you don't, there's no guilt.
And a lot of the moral argument over whether it's okay or not is human centric, isn't it?
I'm under the impression that a humanoid robot has no reason to grieve for what was lost if they no longer have the memory. And if they do retain the memory (what a shitty wipe/reprogram that was then), does it result in any severe emotional reaction in the first place? Is that capacity for metacognition not what separates them from reploids?
I can't scrounge up much evidence in the games of any previously civilian robots despairing over what they used to be or do, even if for them it was as natural as breathing. Anything else could become just as natural, once they're made for it.
I also want to touch on the point that we're not sure which of the robots stolen/reconstructed entirely for combat were self-aware to begin with. We don't know for sure whether they started off with advanced AI, do we? Cold Man used to be a fridge Dr. Light constructed to house dinosaur DNA, Burst Man used to be a security guard at a chemical plant, and Freeze Man was one of a few experimental robots (like Tengu Man). These all strike me as positions that could have been filled by robots that don't require higher learning.
Dr. Light's robots provably were, because of his investment in artificial general intelligence, and as a result we assume this can be extrapolated to anyone who's ever made a biped robot. What if that isn't the case? If it isn't, there'd be limits on the way those robots considered their existences before they were given a conscious AI, right? Reprogramming wouldn't affect them beyond being a mere objective change.
Dr. Wily freely tampers with robots' livelihoods, likely because he takes the position that robots are tools, made to maximise production efficiency. He doesn't see them as people, and if he does (implied, to me, by the fact that he gives his own in-house humanoids distinctive personalities), they're tantamount to lifelong servants.
Dr. Light, incidentally, has not been seen bothering to reprogram any robots that weren't previously his, for the exact opposite reason. He clearly believes in the idea of a self-fulfilling machine. He values what they've been carved into, and how they grow. That growth is what makes them humane to him. This is why he hasn't taken any of Dr. Wily's robots out of his custody and reprogrammed them to fit into society: it would be against their will, and he can't support that. Even if gutting the Earth's most noxious criminals and repurposing them would not only allow them to live lives that are needed and useful to the planet, but also allow humans to live alongside them safely.
There's little or nothing stopping him from forcibly bringing Blues back home and replacing his power core besides Blues himself and his refusal to obey. As much as he misses him, he won't do it. He must have respect for what the humanoid robot wants if he wouldn't even do it to his own boy, and if his other kids had objected to returning from Wily's side after being stolen, I wonder. He might have let them stay.
I'm still not sure where I stand on this.
Human beings anthropomorphise anything they see that's even remotely relatable to their lives, all the time, even if those things are physically, mentally, intrinsically separate from them on the most basic level.
I know I'd look at something like the Wilybots being removed from Dr. Wily's custody and rezoned, used for something they were never intended to do, and feel awful. Like I'm watching a family coming apart. But that's me imposing my sensibilities onto the situation, because I'm very attached to family and the idea of forgetting bothers me. If a humanoid robot is so limited at this point in the timeline, would that bother them? If it really is about what they think and feel, we have to consider what comes from their mouths. Evidently, they either can't or don't grieve for past lives.
And it must be said that none of this would be any concern if these robots weren't humanoid or inclined to deep thought from the beginning. No-one cries for any old mass-produced Joe on the lines; even if they emulate some basic emotions, they're too robotic. They can't advocate for themselves in any way. They don't matter as much.
I guess, in general, something is only as wrong as it relates to what lines you personally would/would not cross.
Those lines are malleable when your mind is reducible to a set of numbers. And it's hard to reconcile that as a human being.
imagine if throughout your whole life you had a set of goals, wants, and people you genuinely knew well and cared about deeply. and you had a job that was difficult, but you felt pride in it, and it was one of your main reasons for your sense of self... it made you who you are
and then imagine that one day a group of scientists you don't know tell you that all of those things were inappropriate, and that they were going to change your brain, whether you liked that idea or not, and make you more adaptable to what best fits in society, and that it's okay because they were also going to make it so that you loved that! your personality would be changed and you'd have new wants and needs and you would be so happy!! you'd be a new person, very literally, as everything about your brain and parts of your body would change! and, most important of all, you would be *useful*
reprogramming is essentially that?