Page 1 of 2 12 LastLast
Results 1 to 15 of 19

Is it morally right for a robot to make the decision to kill?

This is a discussion on Is it morally right for a robot to make the decision to kill? within the Off Topic forums, part of the Entertainment category; This is for a paper I am doing in my degree and I've loved every bit of it so far. ...


  1. #1
    Da Mod Father Senior Member Felix's Avatar
    Join Date
    Mar 2006
    Location
    England
    Posts
    6,003

    Default Is it morally right for a robot to make the decision to kill?

This is for a paper I am doing in my degree and I've loved every bit of it so far. Really interesting stuff about the laws of robotics and the implications of such a machine being built.

In my answer to this I put forward the view that, socially, because of sci-fi films and books, we are not ready for such a creation to be formed, and fears of a Terminator scenario happening are too strong.

I know it's really nerdy stuff, but it was just amazing reading up on the different theories and ideas about this. Can you really program morals into a machine so it can make the split-second judgement to kill someone or to let them live?

I hope some of you find this as interesting a subject to talk about as I have.

  2. #2
    Tits or destiny? Senior Member Dark3lf's Avatar
    Join Date
    Jul 2006
    Location
    Fınland oke?
    Posts
    2,976

    Default

Personally I don't think that is possible, because the machine would have to have a sense of judgement. That means it would have to be able to think rationally and calculate different scenarios.
I don't think making machines that think like humans is even close to reality at the moment. Maybe in the future we'll have machines with some kind of ability to think. Emotions and judgement skills are just way too hard to code into a system at the moment.

  3. #3
    Legendary Mage Senior Member tHeUnBeAtAbLe's Avatar
    Join Date
    Dec 2007
    Posts
    4,981

    Default

Technically, if there is someone controlling it, I would say sure, why not, if it's government property or whatever. But if you're talking about the robot's artificial intelligence, as in its own mind, well, that hasn't quite been accomplished yet; no one has been able to win the Loebner Prize for creating a model whose artificial intelligence is strongly equivalent to a human's.

  4. #4
    Sεмρяε Fι Senior Member CroNe's Avatar
    Join Date
    Mar 2006
    Location
    Belgium
    Posts
    1,258

    Cool

You should watch the movie I, Robot, with Will Smith.
It's pretty good.

  5. #5
    Xtr3m3
    Guest

    Default

If such a creation were possible in today's world, I think the great majority of people would consider the decision "morally unacceptable". But it would also depend on the type of situation it was used in. For example, if a mad scientist just programmed a robot to kill any living thing it sees, then yes, the majority of people would oppose it. But if a robot were to kill a person because that person was going to kill millions, then the majority would consider it right, because it serves a greater good.

From my point of view there are such things as good and bad... but do they really exist? No. From a third point of view there is no such thing as right or wrong, only the values of the society we live in at our current point in time. At first society will view this (robots killing humans) as bad, but later it will grow accustomed to it and become indifferent. But does that establish a real morality of right and wrong? Not really...

    I might have gone a little off topic there

Hope this helps. Another good movie is A.I.

    And yeah the whole concept of robots is truly amazing!

Here is some interesting robot stuff I found on other forums.
The next clip will blow your mind! Although it's old, I must admit it looks like a robot when it's really NOT (first 2 clips)... hopefully in the near future they can come out with such technology.

    [ame]http://www.youtube.com/watch?v=h99I5LxoxBc[/ame]
    [ame]http://www.youtube.com/watch?v=xsTX8gHVeB8[/ame]
    [ame]http://www.youtube.com/watch?v=-KxjVlaLBmk[/ame]

    'Nightmarish' blob-like robot unveiled in US
    Video: Robot hand shows off amazing dexterity, speed | Dominic Fallows
    Last edited by Xtr3m3; 10-21-2009 at 06:56 PM.

  6. #6
    ▁ ▂ ▃ ▄ ▅ ▆ █ 100%VOLUME Senior Member SamEs's Avatar
    Join Date
    Apr 2008
    Location
    Bielefeld|Germany
    Posts
    1,248

    Default

The Matrix :X
there you'll find your answers lol

  7. #7
    El Gran Tanke Senior Member gimmecookiesnao's Avatar
    Join Date
    Oct 2007
    Posts
    1,175

    Default

I think for a robot to make any kind of moral choice, it needs the ability not just to reason and think logically, but also to feel emotions the way humans do.

So my answer: if it's just a mindless robot like Skynet, then no, it's not morally right.

But if it's a robot like Robin Williams' character in Bicentennial Man, then yes, it would be right for him to make moral decisions.

    [ame]http://www.youtube.com/watch?v=z5YMEwX2-88[/ame]

This is a good movie, in my opinion.

  8. #8
    Legendary Mage Senior Member tHeUnBeAtAbLe's Avatar
    Join Date
    Dec 2007
    Posts
    4,981

    Default

    Quote Originally Posted by gimmecookiesnao View Post
I think for a robot to make any kind of moral choice, it needs the ability not just to reason and think logically, but also to feel emotions the way humans do.

So my answer: if it's just a mindless robot like Skynet, then no, it's not morally right.

But if it's a robot like Robin Williams' character in Bicentennial Man, then yes, it would be right for him to make moral decisions.

    YouTube - Bicentennial Man

This is a good movie, in my opinion.
It's true; that's one of the biggest things people are looking forward to achieving: making a robot that has emotion. That's basically what the Loebner Prize is about. Someone has to create a program that is indistinguishable from a human, meaning that when it's tested, an observer asking questions should not be able to tell which of the two participants is the robot and which is the human. But it hasn't been done yet; that $100,000 is still waiting to be picked up.

  9. #9
    Da Mod Father Senior Member Felix's Avatar
    Join Date
    Mar 2006
    Location
    England
    Posts
    6,003

    Default

It's something I am having to learn about in great depth in my PC systems lessons. Basically, the robot would be a flying drone of the kind already used in the combat field. It would be totally unmanned: it would seek out a target, execute that target, and then continue. There are already machines that will pick up a heat source and fire on it in seconds, but what if these types of weapons were left running, effectively programmed to use a set of predetermined parameters (morals, if you like) to decide whether or not to kill?

So far I've only written about 700 words on it, as, like you guys, I am answering the title question. Also, the Harvard referencing system is a pain to keep on top of, but it's making me do so much more background reading, and I am enjoying discovering all the different theories.

I, Robot is an excellent movie for a question like this, as it tackles the Three Laws of Robotics head on and the effects of their being broken. The laws in the movie were actually made up by a sci-fi writer (Isaac Asimov) in the 1940s and are widely accepted as the laws by which a robot should be governed.
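Since the thread is partly about whether morals can be programmed at all, here is a toy sketch of those Three Laws expressed as an ordered veto check. This is just an illustration in Python, nothing like a real control system, and the dictionary keys are invented for the example:

```python
# Toy sketch of Asimov's Three Laws as an ordered veto check.
# The "action" flags below are invented for illustration only.

def allowed(action):
    """Check a proposed action against the Three Laws, in priority order."""
    # First Law: a robot may not injure a human being, or through
    # inaction allow a human being to come to harm.
    if action.get("harms_human") or action.get("inaction_harms_human"):
        return False
    # Second Law: a robot must obey orders given by humans, except
    # where such orders would conflict with the First Law.
    if action.get("disobeys_order"):
        return False
    # Third Law: a robot must protect its own existence, as long as
    # that does not conflict with the first two laws.
    if action.get("endangers_self"):
        return False
    return True

print(allowed({"harms_human": True}))  # False: vetoed by the First Law
print(allowed({}))                     # True: no law is violated
```

Of course, the hard part the thread is debating is not the check itself but deciding, in the real world, what counts as "harm" in the first place.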

Personally, I think the day we let machines decide whether someone should live or die will be the start of the human race no longer being in control of its own existence. It's frightening how clearly the Terminator movies put this whole subject matter into perspective. Thanks to those movies, I also think people will think twice before allowing it to happen.

I'm really chuffed with the replies, guys; it's nice to hear other people's views on things like this. I have been out of any type of education for 11 years now, and I am really enjoying the whole learning process and stimulating my mind. I feel a completely different person. Just need to learn how to spell and get my back sorted.

I look forward to seeing more replies. If people enjoy these types of discussions, I will put some more of my computing degree stuff up for us to talk about on here.

    Cheers all Felix

  10. #10
    El Gran Tanke Senior Member gimmecookiesnao's Avatar
    Join Date
    Oct 2007
    Posts
    1,175

    Default

I guess, in the end, those predetermined parameters you're talking about are things already used by humans in the military.

Say a human pilot is given authority to bomb a building where an important enemy general is, but only if certain conditions are met:

a: confirmation the general is there
b: the attack has a certain success probability (say a 75% chance of killing the general is guaranteed)
c: collateral damage is kept below a certain percentage
d: estimated civilian loss of life is kept below a certain percentage as well

When humans make that choice it becomes moral, because I'm sure some humans would hesitate at the thought of killing that many civilians just to get a little bit below the predetermined percentage.

If a robot were given the same parameters, it would just be following directions.

It would kill the general the moment the civilian percentage was allowed. No more, no less.

So yeah, I think for any robot to make a moral decision, emotions need to be involved.

Even though emotions have nothing to do with morality =/


A robot would just follow orders, no questions asked.
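Those a-d criteria could literally be hard-coded as a single rule check, which is exactly the point: the machine applies the thresholds with no hesitation at the margin. A hypothetical Python sketch (the 0.75 / 0.10 / 0.05 threshold values are made-up illustrative numbers):

```python
# Hypothetical engagement-rule check mirroring criteria a-d above.
# All thresholds are invented for illustration, not real doctrine.

def engage(target_confirmed, kill_probability, collateral, civilian_loss):
    """Return True only when every predetermined parameter is satisfied."""
    return (
        target_confirmed              # a: confirmation the general is there
        and kill_probability >= 0.75  # b: at least a 75% chance of success
        and collateral <= 0.10        # c: collateral damage below threshold
        and civilian_loss <= 0.05     # d: civilian losses below threshold
    )

print(engage(True, 0.80, 0.05, 0.04))  # True: all criteria met
print(engage(True, 0.80, 0.05, 0.06))  # False: civilian threshold exceeded
```

A human given the same rules might still refuse when the numbers are only just inside the limits; the function cannot.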
    Last edited by gimmecookiesnao; 10-22-2009 at 02:18 AM.

  11. #11
    Legendary Mage Senior Member tHeUnBeAtAbLe's Avatar
    Join Date
    Dec 2007
    Posts
    4,981

    Default

Try not to include the movies in the essay, though; in my opinion they would make weak references. But it's just a suggestion.

  12. #12
    Ignore Me Senior Member
    Join Date
    Jan 2007
    Location
    AP Whore House
    Posts
    1,873

    Default

If a weapon is controlled by computers, it can be hacked. Hacking is automatable, and the hacker can let a computer decide what to do afterwards.

The Terminator series is essentially based on that. If you automate a weapon imagining that a person will operate the computer, you are just one computer virus or hack away from a computer deciding to pull the trigger itself.

In that sense it's already happened; we're just waiting for the security vulnerability to be exposed.

Anyway, it's a law. Murder is also illegal. A law doesn't stop something from happening.

  13. #13
    Da Mod Father Senior Member Felix's Avatar
    Join Date
    Mar 2006
    Location
    England
    Posts
    6,003

    Default

    Quote Originally Posted by tHeUnBeAtAbLe View Post
    Try not to include the movies in the essay though, in my opinion, it would be a bad reference to the essay. But it's just a suggestion.
I was explaining that, because of modern culture and what we think robots are, when a robot is given that kind of power people instantly start relating it to the subjects raised in movies like Terminator.

Uniquely, the guy who made these Three Laws of Robotics was a sci-fi writer, so I think there's a connection between the effect today's sci-fi has on these laws and how the sci-fi of the 1940s actually moulded them.

Basically, the army is creating a crude version of the Terminator with these robots. It's a machine that will not stop until it has fulfilled its job: killing the target. But the biggest concern is how this thing will know when it should kill and when it should not!

  14. #14
    LegendarY Senior Member Lev0's Avatar
    Join Date
    Jul 2006
    Location
    The Netherlands//Turkey
    Posts
    2,582

    Default

There are always two sides to a coin, and this has two sides as well.
A robot won't fear killing someone, can't be blackmailed or bribed, and doesn't think, so there's no personal interest to gain.

On the other hand, as someone else posted, a robot needs to be connected to a network or has a computer in it, so it could be hacked and taken over.
Another problem is that it could become corrupt and start taking out people.

  15. #15
    Senior Member
    Join Date
    Mar 2006
    Posts
    1,457

    Default

I don't really understand the premise. If a machine is a "terminator" with a mission to execute, then it will know when to stop, because its target will be dead. If anything, such a machine would be more reliable than a human. It doesn't need to make any judgement calls, because the judgement is made by its designer or master.

On the other hand, if you are talking about some kind of android that tries to replicate human responsibilities, then such a robot, by virtue of being able to so closely mimic human behaviour, would have the capacity to make judgements. All the decisions that go into a decision to kill are based on some form of logic, and that can be simulated with a powerful enough computer.

