The only trouble is...

Date: 17 Mar 2006 21:11 (UTC)
The swapping of the second and third laws lends itself to all manner of problems. If a robot decided that to protect its existence it had to take over the world, then it would happily do so and there would be nothing anyone could do about it. Sure, it'd make certain no human was harmed, but it wouldn't be all that nice for humanity. Which leads me to the problem with the first law, that being the definition of 'harm'. It effectively means that your expensive robot would be helping everyone it met, no matter what you had ordered it to do. It could be argued that unless the robot did something about it, all those people consuming large amounts of alcohol and having a jolly good time were actually harming themselves; so the robot had better take care of all that for humanity's own good. A sort of Puritanical enforcer. :8)
