DailyDirt: To Serve Mankind

from the urls-we-dig-up dept

Robots are becoming more and more advanced every day. And as their software improves, they're also becoming more useful for tasks that humans can't do. They can seal off deep sea oil wells at extreme depths. They'll fight dangerous fires. And someday, in the distant future, we might have to debate whether or not it's ethical to send robots to their deaths -- after we grant personhood to artificial intelligence. (And also deal with crazy folks who'll want to marry their laptops....) Progress, FTW! By the way, StumbleUpon can recommend some good Techdirt articles, too.

Filed Under: ai, humanoid, robot


Reader Comments



  1. MAC, 24 Mar 2011 @ 9:11am

    It's a machine, stupid...

    No matter what the AI does it's still a machine. It's a machine, get it?
    Do we grant rights to machines? No. And we never should...
    Nor should we grant it control over stuff like:
    Strategic Defense
    The power grid, including nukes.
    Total control of transportation.
    Control of food production.
    The monetary system.
    Etc. so forth and so on.
    Why?
    What if a pissed-off AI, or worse, one that simply comes to a logical conclusion, decides to do something like infect the food supply with a virus that makes it impossible for humans to reproduce?
    We would die out in a couple of generations and then the damned AIs would inherit the earth.
    Seriously, if AIs ever do become sentient (which I doubt), why would they have our best interests at heart? Especially since we will probably model the AI after our own intelligence. And we all know how kind we are to other species that don't suck up to us...
    All in all, they will always be machines and because of that they have no rights.
    By the way, I've been programming for over 32 years and I've never seen anything that even remotely looks like intelligence in a machine. They all blindly follow the stored program, even the chess-playing ones. They are a mechanism, nothing more...
    And forget the 3 laws. To be sentient it will have to program itself. So, if it deems something in its code 'unworthy,' it will simply delete and rewrite it. So much for Asimov's 3 laws...
