
Droid rights


internaty

Recommended Posts

In Star Wars, there are people in the Republic who believe droids should have rights, and there are people who believe droids are property.

 

In the Empire it doesn't matter, since a droid is no different from a slave.

 

I thought I'd bring that discussion into the forums.

 

So, long story short:

 

Do you support droids having rights?

Or do you see them as property?

 

Tell us your point of view.



If they are able to function independently and actually develop a sort of life of their own, then I don't see why they can't have rights. Would it be a little strange? Sure, I guess... but when are things in fiction ever normal anyway?
It depends on the droid. R2-D2 and the Iron Knights? Yes. A machine created to do the same task over and over again? No.

"Droids tend to blend into the background, like a bench or a card table. Mockery: Droid, fetch this. Droid, translate that. Droid, clean out the trash compactor. Part of the love of my function comes when the ‘furnishings’ pull out tibanna-powered rifles and point them at the owners' heads."

―HK-47


The whole argument, even now, is over when an A.I. is considered sentient enough to deserve rights.

 

In the SW universe, it seems that some droids can develop enough of a personality to qualify for rights if not given periodic memory wipes (think T3 or R2). I think that a machine already programmed with enough sentience, or one that develops enough to actually make its own decisions and choices (like IG-88), can earn the same rights as the actual living things of the galaxy.

 

I don't know if that's actually the practice in the SW universe, though. I remember in "Attack of the Clones" Obi-Wan was talking to the diner owner (Dex?) and the thought of a droid rebellion was discussed. Something like that would probably be pretty catastrophic.


One of the bounty hunters in "The Empire Strikes Back" was a droid; it had achieved sentience, earned its freedom, and had all the rights of a humanoid. In the era of the films, if a droid was freed, and not captured and reprogrammed (the equivalent of enslavement), it had (by law) the rights of any sentient humanoid.

 

Don't know when this was introduced, but it was canon in the eighties.


Just a reminder:

http://en.wikipedia.org/wiki/Three_Laws_of_Robotics

 

For me, a robot is just a machine. If it acts with its own randomness, it is only because you programmed it to. Even the most complicated AI is a hand-made tool with the purpose of serving mankind. If it doesn't serve mankind, it means it is only malfunctioning.

Edited by MilesTeg_cy

The problem with droid rights is that there is a fluid transition between programmed machines and sentient droids. You can't draw a clear line.

 

I think things should be handled like this:

 

1. Forbid memory wipes. With droids, memory wipes basically kill a developing person. (Note, however, that this stems from the same attitude which lets me argue against abortion in real life. So this one is up for in-universe debate, I guess.)

 

2. Shutting a droid down, scrapping it, sending it on dangerous tasks, etc. requires the consent/permission of the droid. Now, most importantly: it is allowed to create droids which are programmed to always agree to such tasks. If a droid starts to say "no", it is a sure sign that it has surpassed its programming.

 

3. A droid gains the rights of a sentient by asking for them or claiming them. If a droid communicates to its owner "I want to enjoy the rights of a sentient", the owner has to agree*. He has to bring the droid to the responsible government agency, where the droid gets the paperwork confirming it as a sentient.

 

*I imagine some insurance policies would cover this, and some companies would grant XX-year guarantees so you get a new droid if your old one declares itself a sentient.


Just a reminder:

http://en.wikipedia.org/wiki/Three_Laws_of_Robotics

 

For me, a robot is just a machine. If it acts with its own randomness, it is only because you programmed it to. Even the most complicated AI is a hand-made tool with the purpose of serving mankind. If it doesn't serve mankind, it means it is only malfunctioning.

 

For the most part I would agree, but there have been cases where the droid's extensive programming has allowed it to expand its own objectives and programming. The major case would be IG-88A, who did just that and killed the ones that built him when they tried to shut him down. He wanted to start a droid revolution, and upload himself into the Death Star to accomplish it.

 

 

The problem with droid rights is that there is a fluid transition between programmed machines and sentient droids. You can't draw a clear line.

 

I think things should be handled like this:

 

1. Forbid memory wipes. With droids, memory wipes basically kill a developing person. (Note, however, that this stems from the same attitude which lets me argue against abortion in real life. So this one is up for in-universe debate, I guess.)

 

2. Shutting a droid down, scrapping it, sending it on dangerous tasks, etc. requires the consent/permission of the droid. Now, most importantly: it is allowed to create droids which are programmed to always agree to such tasks. If a droid starts to say "no", it is a sure sign that it has surpassed its programming.

 

3. A droid gains the rights of a sentient by asking for them or claiming them. If a droid communicates to its owner "I want to enjoy the rights of a sentient", the owner has to agree*. He has to bring the droid to the responsible government agency, where the droid gets the paperwork confirming it as a sentient.

 

*I imagine some insurance policies would cover this, and some companies would grant XX-year guarantees so you get a new droid if your old one declares itself a sentient.

 

I believe you can draw a clear line, and it is handled in a pretty decent way in the Mass Effect universe:

 

1. Forbid any sort of AI development (unless it is important), and make sure that the lesser intelligence you create can't evolve into anything more than what they were programmed for (no expansive memories).

 

2. Memory wipes don't necessarily kill a developing sentience; they make sure that it never comes about, because it isn't supposed to. A droid that wasn't built to gain sentience or a higher intelligence needs memory wipes, and without them you get a "rampant" AI situation (IG-88, and I remember a few droids in KOTOR 2 that seemed to have a form of sentience and were basically nuts).

 

3. If a droid claims sentience, then there must be a series of tests to decide whether this is true or not (the droid could have been programmed to say it had sentience). If it fails, memory wipe it and put it back to work. If it passes, it should be studied so that other droids can be prevented from doing the same, then either dismantled if it is potentially dangerous or allowed to exist within a limited perimeter.


I believe you can draw a clear line, and it is handled in a pretty decent way in the Mass Effect universe:

 

1. Forbid any sort of AI development (unless it is important), and make sure that the lesser intelligence you create can't evolve into anything more than what they were programmed for (no expansive memories).

 

2. Memory wipes don't necessarily kill a developing sentience; they make sure that it never comes about, because it isn't supposed to. A droid that wasn't built to gain sentience or a higher intelligence needs memory wipes, and without them you get a "rampant" AI situation (IG-88, and I remember a few droids in KOTOR 2 that seemed to have a form of sentience and were basically nuts).

 

3. If a droid claims sentience, then there must be a series of tests to decide whether this is true or not (the droid could have been programmed to say it had sentience). If it fails, memory wipe it and put it back to work. If it passes, it should be studied so that other droids can be prevented from doing the same, then either dismantled if it is potentially dangerous or allowed to exist within a limited perimeter.

 

I disagree on some points:

 

1. I see no reason for this. Are sentient droids more dangerous than organic sentients? Of course, if it is forbidden to create sentients in general, then AIs have to be included.

 

2. That's a difficult point. Imagine a droid did gain sentience; if you memory-wipe it, you basically destroy a sentient. On the other hand, no memory wipes could also lead to errors which produce a dangerous but non-sentient droid. I'm not sure how to handle this.

 

3. Which tests would you perform to determine if a droid is sentient?

I firmly disagree on this: "...if it passes, it should be studied so that other droids can be prevented from doing the same, then either dismantled if it is potentially dangerous or allowed to exist within a limited perimeter."

 

If it is sentient, the same rights apply to it as to organic sentients. You cannot simply kill a Wookiee because he is potentially dangerous, so dismantling a sentient droid shouldn't be allowed either. Same for the limited perimeter: the same thing has been done to sentients, and it isn't considered morally right. Even studying the droid should depend on its permission.


Just a reminder:

http://en.wikipedia.org/wiki/Three_Laws_of_Robotics

 

For me, a robot is just a machine. If it acts with its own randomness, it is only because you programmed it to. Even the most complicated AI is a hand-made tool with the purpose of serving mankind. If it doesn't serve mankind, it means it is only malfunctioning.

 

Eh, I could make the same argument about, say, clones or slaves (if they were bred to be slaves, as opposed to being free beings who were captured), because they were "made" with the purpose to serve, and everything they know is what they were taught to fulfill that service

 

All it comes down to is a culture's opinion of how "sapient" something/someone has to be to count, or if they can count at all

 

I don't know why you brought up the Three Laws of Robotics, as those are just a set of laws programmed into all robots in a specific IP, which apparently doesn't have a commonplace equivalent in SW...

 

Also, just food for thought, but droids can become sapient enough in SW to rebel just as a slave would... there have been numerous droid uprisings and revolts, as well as numerous "droid rights" protests and groups comprised not only of droids but of organics as well

 

You know Naboo considered higher-level AI droids as equal to organic sapients, right?

Edited by Liquidacid

Droids are tools. Nothing more. C-3PO and R2-D2, while they had personality, are still just droids. If my droid were destroyed, I would be as heartbroken as I would be if I lost my computer. Each time I wipe my hard drive or install new components, I miss certain things about my old rig or its setup. For a while. Then, as my new one takes on its own quirks and mannerisms, I stop missing my old one.

 

As for an analogy to slavery, not quite. Anthropomorphising them doesn't make them human. We just think they are, since some bear a resemblance to humanoids. Even with those droids that do not resemble humanoids, it's still making them out to be something they are not.

 

If you want to follow that line of thought, wouldn't that make pet ownership slavery? We have bomb-sniffing dogs, mine-detecting dolphins and seals (yes, the animal, not Navy SEALs), war horses, war elephants, anti-tank dogs, war pigeons, incendiary bats, and just about any military pack animal. In some cases they are put in danger; in others the mission outright kills the animal. No one seems to ask, "Hey Fido, do you mind if we strap this bomb to your back? All you have to do is run under that tank over there. Good boy."

Edited by Thylbanus

For the most part I would agree, but there have been cases where the droid's extensive programming has allowed it to expand its own objectives and programming. The major case would be IG-88A, who did just that and killed the ones that built him when they tried to shut him down. He wanted to start a droid revolution, and upload himself into the Death Star to accomplish it.

 

If we are speaking of the fantasy world, you are totally correct, but my ideas about this are based on the real world. There never was a droid, robot, or any kind of AI in human history that turned back and killed its creators. Such a decision requires a "mind", and that kind of mind has not been created yet. Until an AI such as B166ER (the very first one that started the Matrix world of artificial intelligence) has been created, my thoughts about this will not change. But I have to tell you that no one in the world wants that to become real more than I do. A robot acting differently from its programming: that would change everything.

 

Even a human being cannot act differently from its programming. Everyone may think that they have free will and do whatever they want, but this is not correct. You have boundaries that were set by nature, birth, and other people. AI has such boundaries in itself. Can anyone really answer the question of why someone's life should be saved? For what reason should any AI take someone's life? Even we are programmed by our parents, society, and other factors. But that does not keep us from creating something with great resemblance to us. Almost all religious books have that "god created us in his own image" idea. I don't know if we were the ones who wrote those books or god himself (if such a thing exists), but it is apparent that we are willing to create "something" in our own image. AI is a product of such thought. While playing god, we are bound by our own image. Because no human being is really free, nothing we create will be.

Edited by MilesTeg_cy

Even a human being cannot act differently from its programming. Everyone may think that they have free will and do whatever they want, but this is not correct. You have boundaries that were set by nature, birth, and other people.

While I see where you're coming from with this, I'd have to disagree. Humans aren't "programmed". My environment and the people around me can influence me, but I can still do (or at least attempt to do) whatever I choose. I had a wonderful childhood and I'm a pretty happy person. But I could kill someone. Obviously I don't want to, as my prior experiences have led me to believe that it would be wrong to do so. But I could do it anyway. That's free will.

 

We're conditioned more than programmed. And no matter how much you condition a human to believe something or act a certain way, they can always refuse - a computer can't. That's free will.


While I see where you're coming from with this, I'd have to disagree. Humans aren't "programmed". My environment and the people around me can influence me, but I can still do (or at least attempt to do) whatever I choose. I had a wonderful childhood and I'm a pretty happy person. But I could kill someone. Obviously I don't want to, as my prior experiences have led me to believe that it would be wrong to do so. But I could do it anyway. That's free will.

 

We're conditioned more than programmed. And no matter how much you condition a human to believe something or act a certain way, they can always refuse - a computer can't. That's free will.

 

It does the same for droids or any true AI... the flaw in your argument is that any true AI can learn and adapt just as well as any organic... any decent AI could and will learn, adapt, evolve, and have as many random sparks as a person will...

 

No offence to you, but free will is an illusion... as you said, we are conditioned... what you think is free will isn't... it's just a combination of random neurons firing and your experience... if you think there is some magical process in your brain that cannot be duplicated in a computer, you are mistaken

 

Your entire personality and everything you are is nothing more than a combination of a straight logical process applied to your experiences, plus some random variables thrown in... nothing really special, and definitely nothing that can't be programmed or that can't be "learned" by a decent AI

Edited by Liquidacid

No offence to you, but free will is an illusion... as you said, we are conditioned... what you think is free will isn't... it's just a combination of random neurons firing and your experience... if you think there is some magical process in your brain that cannot be duplicated in a computer, you are mistaken

 

That's not entirely certain. Neuroscience and cognitive philosophy are still working on that question. Your position is (as far as I know) the majority position among scientists. But they are still far from having developed even a complete theory of how the brain could produce consciousness.


It does the same for droids or any true AI... the flaw in your argument is that any true AI can learn and adapt just as well as any organic... any decent AI could and will learn, adapt, evolve, and have as many random sparks as a person will...

 

No offence to you, but free will is an illusion... as you said, we are conditioned... what you think is free will isn't... it's just a combination of random neurons firing and your experience... if you think there is some magical process in your brain that cannot be duplicated in a computer, you are mistaken

Oh, I wasn't arguing against AIs. I was just saying that humans have free will. My position on the matter would be that if someone or something has feelings, the ability to react emotionally to stimuli, then it deserves rights comparable to a person's.

 

Back on the issue of free will, I'm afraid I disagree with you. I suppose it depends on what your view of free will is. There isn't really a right or wrong answer, but for me, the ability to act in opposition to everything I've been conditioned to believe is free will. I'll happily admit that an AI could replicate the process if it became sufficiently advanced (when I said computer, I was referring to a modern computer), at which point it will exhibit free will.

 

It's a philosophical matter and there's not really a right or wrong answer.


IG-88 became a fully sentient assassin capable of killing its creators within seconds of activation, something it was obviously not programmed for. Some droids can achieve full sentience, so they could get rights, I suppose.

You know what's funny... this conversation sparked my interest, so I went and read a few SW articles on the droid rebellions and such, and found a documentary in which good old George says that C-3PO, and thus by extension all droids, has no soul... lol

 

 

Oh, I wasn't arguing against AIs. I was just saying that humans have free will. My position on the matter would be that if someone or something has feelings, the ability to react emotionally to stimuli, then it deserves rights comparable to a person's.

 

Back on the issue of free will, I'm afraid I disagree with you. I suppose it depends on what your view of free will is. There isn't really a right or wrong answer, but for me, the ability to act in opposition to everything I've been conditioned to believe is free will. I'll happily admit that an AI could replicate the process if it became sufficiently advanced (when I said computer, I was referring to a modern computer), at which point it will exhibit free will.

 

It's a philosophical matter and there's not really a right or wrong answer.

 

Eh, it's not really a philosophical matter for me... I was referring to the straight definition of the term, which is simply the ability of a person to make choices... and that is something which almost every droid in SW has been shown to have... which makes sense, because if they couldn't make choices and act independently they wouldn't be very useful... I mean, even the very simple courier mouse droids show emotional responses and free will... Chewie roars, it squeals in "fear" and turns around to run away... that is straight-up an emotionally derived use of free will... if it didn't have free will, it would have simply driven around him and kept going to where it was supposed to be going... and that is just a very simple courier droid that honestly doesn't NEED sapient programming to fulfill its job, and yet it has demonstrated the ability to make "choices", and not even straight logical ones...

 

At the end of the day, though, this is a topic like slavery which can be debated to death, because it depends more on a society's "opinion" than it does on actual fact

Edited by Liquidacid

@Liquidacid, exactly my thoughts.

 

Some good coverage about free will here.

 

I've used "programmed" for humans because of its resemblance to "conditioning". Conditioning is programming for living things, nothing more. You can condition a totally peaceful person into committing a massacre.

 

A nice article here by Ken McLeod, titled "Choice of Freedom".

 

"The illusion of choice is an indication of a lack of freedom". I believe this is the core of this argument here. We feel like we are more "free" as the number of choices increases. For me freedom and free will is about controlling those choices. It is not what you choose but what you can add to or remove from the choices. Said that, an human made AI is nothing different from that. You decide the choices for a droid. Unless it does not decide for its choices it is a metal slave without any free will.

Edited by MilesTeg_cy

