UK report says robots will have rights

Discussion in 'Chit Chat' started by isaac000, Dec 19, 2006.

  1. UK report says robots will have rights
    By Salamander Davoudi in London

    Published: December 19 2006 22:01

    The next time you beat your keyboard in frustration, think of a day when it may be able to sue you for assault. Within 50 years we might even find ourselves standing next to the next generation of vacuum cleaners in the voting booth.

    Far from being extracts from the extreme end of science fiction, the idea that we may one day give sentient machines the kind of rights traditionally reserved for humans is raised in a British government-commissioned report which claims to be an extensive look into the future.

    Visions of the status of robots around 2056 have emerged from one of 270 forward-looking papers sponsored by Sir David King, the UK government’s chief scientist. The paper covering robots’ rights was written by a UK partnership of Outsights, the management consultancy, and Ipsos Mori, the opinion research organisation.

    “If we make conscious robots they would want to have rights and they probably should,” said Henrik Christensen, director of the Centre of Robotics and Intelligent Machines at the Georgia Institute of Technology.

    The idea will not surprise science fiction aficionados. It was widely explored by Dr Isaac Asimov, one of the foremost science fiction writers of the 20th century. He wrote of a society where robots were fully integrated and essential in day-to-day life.

    In his system, the ‘three laws of robotics’ governed machine life. They decreed that robots could not injure humans, must obey orders and protect their own existence – in that order.

    Robots and machines are now classed as inanimate objects without rights or duties but if artificial intelligence becomes ubiquitous, the report argues, there may be calls for humans’ rights to be extended to them.

    It is also logical, the report suggests, that such rights would be accompanied by citizens’ duties, including voting, paying tax and compulsory military service.

    Mr Christensen said: “Would it be acceptable to kick a robotic dog even though we shouldn’t kick a normal one?

    “There will be people who can’t distinguish that so we need to have ethical rules to make sure we as humans interact with robots in an ethical manner so we do not move our boundaries of what is acceptable.”

    The Horizon Scan report argues that if ‘correctly managed’, this new world of robots’ rights could lead to increased labour output and greater prosperity.

    “If granted full rights, states will be obligated to provide full social benefits to them including income support, housing and possibly robo-healthcare to fix the machines over time,” it says.

    But it points out that the process has casualties and the first one may be the environment, especially in the areas of energy and waste.
     
  2. "Salamander Davoudi"?????????

    Come on, nearly every name in that article is a rip-off of some lame sci-fi stuff.

    However, I maintain the ultimate, predetermined objective of man is to produce a functional robot woman (kind of ironic, really). All the cards stack up.

    True.
    If you think otherwise, I'd be happy to hear your reasoning.

    Robot women.........you KNOW you want it!!!!!!!!!!
     
  3. sdavoudi

    So, if my name is lame sci-fi stuff, what's yours?
     
  4. Lol, that was unexpected.
    :D
    My name? Kirk Andromedan-Borg.

    If, at this point, most people don't have any particular rights, is there really a logical reason robots should?

    I guess that will depend on how advanced AI becomes, but the only guarantee is that even if such sophisticated robots are around, people will screw it up somehow.
    Let's face it: as soon as they're available cheaply enough, conventional armies are out the door. You only need to look at the super soldier program to see the logical extensions of the idea.

    The idea that such robots will somehow benefit humankind is highly dubious.