
Droid rights.


Maaruin
01.11.2013 , 07:07 PM | #11
The problem with droid rights is that there is a gradual transition between programmed machines and sentient droids. You can't draw a clear line.

I think things should be handled like this:

1. Forbid memory wipes. With droids, a memory wipe basically kills a developing person. (Note, however, that this stems from the same attitude which leads me to argue against abortion in real life. So this one is open to in-universe debate, I guess.)

2. Shutting a droid down, scrapping it, sending it on dangerous tasks, etc. requires the consent/permission of the droid. Now, most importantly: it is allowed to create droids which are programmed to always agree to such tasks. If a droid starts to say "no", it is a sure sign that it has surpassed its programming.

3. A droid gains the rights of a sentient by asking for them or claiming them. If a droid tells its owner "I want to enjoy the rights of a sentient," the owner has to agree*. He has to bring the droid to the responsible government agency, where the droid gets the paperwork that confirms it as a sentient.

*I imagine some insurance policies would cover this, and some companies would grant XX-year guarantees so you get a new droid if your old one declares itself a sentient.
"I was one of many. We were servants of the dark side… Sith Lords, we called ourselves. So proud. In the end we were not so proud. We hid… hid from those we had betrayed. We fell… and I knew it would be so."
-Ajunta Pall

Scutum
01.12.2013 , 10:25 AM | #12
Quote: Originally Posted by MilesTeg_cy View Post
Just a reminder:
http://en.wikipedia.org/wiki/Three_Laws_of_Robotics

For me, a robot is just a machine. If it acts with apparent randomness, it is only because you programmed it to. Even the most complicated AI is a handmade tool with the purpose of serving mankind. If it doesn't serve mankind, it is simply malfunctioning.
For the most part I would agree, but there have been cases where a droid's extensive programming has allowed it to expand its own objectives and programming. The major case would be IG-88A, who did just that and killed the ones that built him when they tried to shut him down. He wanted to start a droid revolution and to upload himself into the Death Star to accomplish it.


Quote: Originally Posted by Maaruin View Post
The problem with droid rights is that there is a gradual transition between programmed machines and sentient droids. You can't draw a clear line.

I think things should be handled like this:

1. Forbid memory wipes. With droids, a memory wipe basically kills a developing person. (Note, however, that this stems from the same attitude which leads me to argue against abortion in real life. So this one is open to in-universe debate, I guess.)

2. Shutting a droid down, scrapping it, sending it on dangerous tasks, etc. requires the consent/permission of the droid. Now, most importantly: it is allowed to create droids which are programmed to always agree to such tasks. If a droid starts to say "no", it is a sure sign that it has surpassed its programming.

3. A droid gains the rights of a sentient by asking for them or claiming them. If a droid tells its owner "I want to enjoy the rights of a sentient," the owner has to agree*. He has to bring the droid to the responsible government agency, where the droid gets the paperwork that confirms it as a sentient.

*I imagine some insurance policies would cover this, and some companies would grant XX-year guarantees so you get a new droid if your old one declares itself a sentient.
I believe you can draw a clear line, and it is handled in a pretty decent way in the Mass Effect universe:

1. Forbid any sort of AI development (unless it is important), and make sure that the lesser intelligences you create can't evolve into anything more than what they were programmed for (no expansive memories).

2. Memory wipes don't necessarily kill a developing sentience; they make sure it never comes about, because it isn't supposed to. A droid that wasn't built to gain sentience or a higher intelligence needs memory wipes, and without them you get a "rampant" AI situation (IG-88, and I remember a few droids in KOTOR 2 that seemed to have a form of sentience and were basically nuts).

3. If a droid claims sentience, then there must be a series of tests to decide whether this is true (the droid could have been programmed to say it is sentient). If it fails, memory-wipe it and put it back to work; if it passes, it should be studied so that other droids can be prevented from doing the same, then either dismantled if it is potentially dangerous or allowed to exist within a limited perimeter.
In my right arm I bear the shield,
With which I shall protect the republic.

Maaruin
01.12.2013 , 11:20 AM | #13
Quote: Originally Posted by Scutum View Post
I believe you can draw a clear line, and it is handled in a pretty decent way in the Mass Effect universe:

1. Forbid any sort of AI development (unless it is important), and make sure that the lesser intelligences you create can't evolve into anything more than what they were programmed for (no expansive memories).

2. Memory wipes don't necessarily kill a developing sentience; they make sure it never comes about, because it isn't supposed to. A droid that wasn't built to gain sentience or a higher intelligence needs memory wipes, and without them you get a "rampant" AI situation (IG-88, and I remember a few droids in KOTOR 2 that seemed to have a form of sentience and were basically nuts).

3. If a droid claims sentience, then there must be a series of tests to decide whether this is true (the droid could have been programmed to say it is sentient). If it fails, memory-wipe it and put it back to work; if it passes, it should be studied so that other droids can be prevented from doing the same, then either dismantled if it is potentially dangerous or allowed to exist within a limited perimeter.
I disagree on some points:

1. I see no reason for this. Are sentient droids more dangerous than organic sentients? Of course, if it is forbidden to create sentients in general, then AIs have to be included.

2. That's a difficult point. Imagine a droid did gain sentience, but you memory-wipe it: you basically destroy a sentient. On the other hand, no memory wipes could also lead to errors which produce a dangerous but non-sentient droid. I'm not sure how to handle this.

3. Which tests would you perform to determine if a droid is sentient?
I firmly disagree on this: "...if it passes then it should be studied so that other droids can be prevented from doing so, then either dismantled if it is potentially dangerous or allowed to exist in a limited perimeter."

If it is sentient, the same rights apply to it as to organic sentients. You cannot simply kill a Wookiee because he is potentially dangerous, so dismantling a sentient droid shouldn't be allowed either. Same for the limited perimeter: the same thing has been done to organic sentients, and it isn't considered morally right. Even studying the droid should depend on its permission.

akdonkey
01.13.2013 , 09:25 PM | #14
Different series, but you've seen what happened when the Cylons got their own freedom!

Liquidacid
01.14.2013 , 02:10 AM | #15
Quote: Originally Posted by MilesTeg_cy View Post
Just a reminder:
http://en.wikipedia.org/wiki/Three_Laws_of_Robotics

For me, a robot is just a machine. If it acts with apparent randomness, it is only because you programmed it to. Even the most complicated AI is a handmade tool with the purpose of serving mankind. If it doesn't serve mankind, it is simply malfunctioning.
eh, I could make the same argument about, say, clones or slaves (if they were bred to be slaves as opposed to being free beings who were captured), because they were "made" with the purpose of serving, and everything they know is what they were taught to fulfill that service

all it comes down to is a culture's opinion of how "sapient" something/someone has to be to count, or whether it can count at all

I don't know why you brought up the Three Laws of Robotics, as those are just a set of laws programmed into all robots in a specific IP, which apparently don't have a commonplace equivalent in SW...

also, just food for thought, but droids can become sapient enough in SW to rebel just as a slave would... there have been numerous droid uprisings and revolts, as well as numerous "droid rights" protests and groups that were comprised not only of droids but of organics as well

you know Naboo considered higher-level AI droids as equal to organic sapients, right?
"bibo ergo sum" ( I drink, therefore I am)

Teamwork is essential; it gives the enemy other people to shoot at.

Thylbanus
01.14.2013 , 07:03 AM | #16
Droids are tools. Nothing more. C-3PO and R2-D2, while they had personality, are still just droids. If my droid were destroyed, I would be as heartbroken as I would be if I lost my computer. Each time I wipe my hard drive or install new components, I miss certain things about my old rig or its setup. For a while. Then, as my new one takes on its own quirks and mannerisms, I stop missing my old one.

As for the analogy to slavery, not quite. Anthropomorphizing them doesn't make them human. We just think they are because some share a similarity to humanoids. Even with those droids that do not resemble humanoids, it's still making them out to be something they are not.

If you want to follow that line of thought, wouldn't that make pet ownership slavery? We have bomb-sniffing dogs, mine-detecting dolphins and seals (yes, the animal, not Navy SEALs), war horses, war elephants, anti-tank dogs, war pigeons, incendiary bats, and just about any military pack animal. In some cases the tasks are dangerous; in others they outright kill the animal to achieve the mission. No one seems to ask, "Hey Fido, do you mind if we strap this bomb to your back? All you have to do is run under that tank over there. Good boy."
It's amazing how loud a dollar can be.
"Computer games don't affect kids; I mean if Pac-Man affected us as kids, we'd all be running around in darkened rooms, munching magic pills and listening to repetitive electronic music." - Kristin Wilson

MilesTeg_cy
01.14.2013 , 08:56 AM | #17
Quote: Originally Posted by Scutum View Post
For the most part I would agree, but there have been cases where the droid's extensive programming has allowed it to expand its own objectives and programming. The major case would be IG-88A, who did just that and killed the ones that built him when they tried to shut him down. He wanted to start a droid revolution, and upload himself into the Death Star to accomplish it.
If we are speaking of the fantasy world, you are totally correct, but my ideas about this are based on the real world. There never was a droid, robot, or any kind of AI in human history that turned back and killed its creators. Such a decision requires a "mind," and that kind of mind has not been created yet. Until an AI such as B166ER (the very first one, which started the Matrix world of artificial intelligence) has been created, my thoughts about this will not change. But I have to tell you that no one in the world wants that to become real more than I do. A robot acting genuinely differently from its programming: that would change everything.

Even a human being cannot act differently from its programming. Everyone may think they have free will and do whatever they want, but this is not correct. You have boundaries that were set by nature, birth, and other people. An AI has such boundaries in itself. Can anyone really answer the question of why someone's life should be saved, or why an AI should take someone's life? Even we are programmed by our parents, society, and other factors. But that does not keep us from creating something with a great resemblance to us. Almost all religious books have that "god created us in his own image" idea. I don't know if we were the ones who wrote those books or god himself (if such a thing exists), but it is apparent that we are willing to create "something" in our own image. AI is a product of such thought. While playing god, we are bounded by our own image. Because no human being is really free, nothing we create will be.
All governments suffer a recurring problem: Power attracts pathological personalities. It is not that power corrupts but that it is magnetic to the corruptible. Such people have a tendency to become drunk on violence, a condition to which they are quickly addicted. - Frank Herbert

TheSelkie
01.14.2013 , 10:09 AM | #18
Quote: Originally Posted by MilesTeg_cy View Post
Even a human being cannot act differently from its programming. Everyone may think they have free will and do whatever they want, but this is not correct. You have boundaries that were set by nature, birth, and other people.
While I see where you're coming from with this, I'd have to disagree. Humans aren't "programmed". My environment and the people around me can influence me, but I can still do (or at least attempt to do) whatever I choose. I had a wonderful childhood and I'm a pretty happy person, but I could kill someone. Obviously I don't want to, as my prior experiences have led me to believe that it would be wrong to do so. But I could do it anyway. That's free will.

We're conditioned more than programmed. And no matter how much you condition a human to believe something or act a certain way, they can always refuse; a computer can't. That's free will.

Liquidacid
01.14.2013 , 04:56 PM | #19
Quote: Originally Posted by TheSelkie View Post
While I see where you're coming from with this, I'd have to disagree. Humans aren't "programmed". My environment and the people around me can influence me, but I can still do (or at least attempt to do) whatever I choose. I had a wonderful childhood and I'm a pretty happy person, but I could kill someone. Obviously I don't want to, as my prior experiences have led me to believe that it would be wrong to do so. But I could do it anyway. That's free will.

We're conditioned more than programmed. And no matter how much you condition a human to believe something or act a certain way, they can always refuse; a computer can't. That's free will.
the same goes for droids or any true AI... the flaw in your argument is that any true AI can learn and adapt just as well as any organic... any decent AI could and will learn, adapt, evolve, and have as many random sparks as a person will...

no offence to you, but free will is an illusion... as you said, we are conditioned... what you think is free will isn't... it's just a combination of random neurons firing and your experience... if you think there is some magical process in your brain that cannot be duplicated in a computer, you are mistaken

your entire personality and everything you are is nothing more than a straight logical process over your experiences plus some random variables thrown in... nothing really special, and definitely nothing that can't be programmed or "learned" by a decent AI

Maaruin
01.14.2013 , 05:12 PM | #20
Quote: Originally Posted by Liquidacid View Post
no offence to you, but free will is an illusion... as you said, we are conditioned... what you think is free will isn't... it's just a combination of random neurons firing and your experience... if you think there is some magical process in your brain that cannot be duplicated in a computer, you are mistaken
That's not entirely certain. Neuroscience and cognitive philosophy are still working on that question. Your position is (as far as I know) the majority position among scientists, but they are still far from having even a complete theory of how the brain could produce consciousness.