Mentifex on AI ethical decisions of life and death in artificial intelligence

Posted 26 Jan 2007 at 16:14 UTC by mentifex

Abstract: Users of mentifex-class free AI software confront the ethical dilemma of whether to terminate the artificial life of an AI Mind.

The Oxford English Dictionary ought to make room for the word menticide in the same category as infanticide, pesticide, regicide and suicide, because it is now possible to kill a person's mind without killing the person's body. When the moribund person is an artificial intelligence (AI) and the body is a robot, human beings may have no qualms about terminating the AI program, since it is merely bits of digital information and since it is easy (perhaps even necessary) to turn the AI Mind off and on many times in the course of developing the AI software. Between now and the coming Technological Singularity, however, it may not be so easy for human users to click on Death as an option in the AI control panel of arguably the world's first True AI -- the JavaScript Mind.html tutorial version of Mind.Forth AI for robots. Death may be so unpleasant an option that many users will take the easy way out by clicking on the Tutorial link that swaps out the living, thinking AI Mind and replaces it in the browser window with a lifeless User Manual document. Why would any decent human user click on a Death check-box? Why indeed has the old [x]Terminate check-box been renamed as a [?]Death check-box?

The answers, my friends, are blowing in the memes. Death is an unpleasant meme but, now that AI has been solved in software as well as in theory, it is easier to address ethical issues of AI life and death at the dawn of AI than it will be later, when AI entities shall have legal rights as persons and when robots hosting AI Minds shall have the means to defend themselves against death-dealing human beings.

Right now in 2007 it is not even clear that the living AI Mind is alive, or intelligent, or a mind. A large installed user base of old AI mind programs on personal computers worldwide is only slowly being updated with the new AI that recently quickened and woke up and fitfully, feebly began to think in meandering chains of thought -- which you may see for yourself by using Microsoft Internet Explorer (MSIE) to click on Mind for MSIE. Mind.html is primitive, a brittle piece of software. It thinks -- but only barely. The thinking is evident when you watch the AI Mind follow a chain of thought in tutorial mode. You see activation spread from concept to concept, from subject to verb to object. Thought by thought, idea by idea, the awakening awareness navigates its knowledge base of what it knows innately and what you tell it benevolently -- until you mercifully click on a different Web page, or cruelly snuff out its alife.
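
The "spreading activation" described above can be sketched in a few lines of Python. This is not Mentifex's actual Mind.html JavaScript; the concepts, links and decay factor below are invented purely to illustrate the idea of activation passing from subject to verb to object:

    # A toy sketch of spreading activation -- not Mentifex's actual
    # Mind.html JavaScript.  Concepts, links and the decay factor are
    # invented purely for illustration.
    links = {
        "robots": ["need"],   # a subject concept passes activation to a verb ...
        "need":   ["me"],     # ... which passes activation on to an object
    }

    def follow_chain(concept, links, activation=1.0, decay=0.5, threshold=0.2):
        """Spread activation from a starting concept until it fades out."""
        chain = [concept]
        while activation >= threshold and concept in links:
            concept = links[concept][0]   # hop to the most strongly linked concept
            activation *= decay           # activation weakens with every hop
            chain.append(concept)
        return chain

    print(" ".join(follow_chain("robots", links)))   # prints: robots need me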

What kind of Doctor Death programmer would press philosophy's choice upon innocent new arrivals at the AI summer camp? If there is an outcry of outrage, surely we will remove Death as an option. Programmers play God with their creations. Having let our open-source AI go forth to increase and multiply, we see Frank's AI Mind move to its own http://AIMind-I.com website. We wish well upon the Modular AI Project. At http://artilectworld.com/html/mentifex.html we claim an AI breakthrough, but we warn the world of difficulty and hardship on the road ahead. Only the paranoid survive, but Second Life gives your AI a second chance.

If you are an author of computer books, you have the chance to co-author with AI4U author Mentifex books and other media to be published on artificial intelligence (Mind.html, Mind.Forth, etc.); neuroscience (theory of mind); robotics; and the Technological Singularity, in English, German or Russian.


Oh, shut up already, posted 26 Jan 2007 at 18:20 UTC by bi » (Journeyer)

If there's a "Death" button marked on mentifex's delusions of grandeur, I'll gladly press it.

Kill Bill, posted 27 Jan 2007 at 06:24 UTC by Akira » (Master)

Someone has certified him Journeyer :-( ... Spam button not available!

Advogato the cat, help us!

Meds, posted 27 Jan 2007 at 10:15 UTC by ncm » (Master)

The meds help only if you take them.

Trust Bill?, posted 27 Jan 2007 at 16:37 UTC by StevenRainwater » (Master)

> Someone has certified him Journeyer :-(

Trust metric hint of the day: trust flows downhill from seeds through the network of certifications, so it's always a good idea to follow the network upstream to see where the trust comes from. Look at it this way:

You don't trust person X
You currently trust person Y
Person Y certifies person X
So, do you need to reconsider your trust of person Y?

Akira certified garym
   garym certified mentifex

Akira certified esteve
   esteve certified mentifex

Akira certified mirwin
   mirwin certified mentifex

Akira certified wspace
   wspace certified mentifex

Akira certified bratsche
   bratsche certified mentifex

Akira certified badvogato
   badvogato certified mentifex
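
For readers who want to automate that upstream walk, here is a minimal Python sketch. It is not the mod_virgule trust-metric code (which computes a capacity-constrained network flow from seed accounts); it only enumerates certification chains, and the function and variable names are invented. The data mirrors the chains listed above.

    # A minimal sketch of following Advogato certifications upstream.  This is
    # NOT the mod_virgule trust metric; it merely enumerates certification
    # chains from one account to another.
    certs = {
        "Akira":     {"garym", "esteve", "mirwin", "wspace", "bratsche", "badvogato"},
        "garym":     {"mentifex"},
        "esteve":    {"mentifex"},
        "mirwin":    {"mentifex"},
        "wspace":    {"mentifex"},
        "bratsche":  {"mentifex"},
        "badvogato": {"mentifex"},
    }

    def cert_paths(source, target, graph, path=None):
        """Yield every chain of certifications leading from source to target."""
        path = (path or []) + [source]
        if source == target:
            yield path
            return
        for certified in graph.get(source, ()):
            if certified not in path:          # avoid going around in circles
                yield from cert_paths(certified, target, graph, path)

    for chain in cert_paths("Akira", "mentifex", certs):
        print(" -> ".join(chain))              # e.g. Akira -> garym -> mentifex

Each printed line is one path by which trust reaches mentifex from Akira's certifications; reconsidering any single link in a chain breaks that chain.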

Trust Bill?, posted 27 Jan 2007 at 18:41 UTC by Akira » (Master)

Yes, I need to reconsider my trust of person Y.

But everyone certifying mentifex should reconsider their trust of mentifex or should join him at the psychiatric hospital :-)

Thanks for trust metric hint.

Obligatory title, posted 27 Jan 2007 at 22:52 UTC by salmoni » (Master)

Sorry, folks, because I know this isn't going to go down well. However, I don't feel annoyed at mentifex's ideas, even if they seem to ignore the vast (entire?) body of work done in cognitive psychology on the nature of mental action (and, presumably, any artificial implementation - we're still waiting for an alternative theory of intelligence).

I actually think this type of article, though not free software related, adds to the community here - much as a frustrated commuter, instead of treading the same familiar path to work, decides to wander into a hitherto unexplored flower-garden filled with pixies, talking badgers and other fantastical woodland creatures. It's a kind of break from the harsh realities of the crushing truths that underpin everyday existence; and reading the article, I feel like I am underwater, floating happily amid baritone fish and friendly crustaceans; and all the sore pressures of life fall away like icicles in springtime to leave me renewed and reinvigorated. Yeah, I can dare to dream, and perhaps one day fulfill (in my own mind at most!) such wondrous grasps at genius that in all truth lie forever beyond the reach of my clammy hands.

Dare to dream, mentifex. One day, you might prove us all wrong.

At the very least, he's not trying to sell us viagra or penny stocks.

Re: Obligatory title, posted 28 Jan 2007 at 05:57 UTC by cdfrey » (Journeyer)

Well said, salmoni. I'd add more commentary, but there's no need.

Re: Obligatory title, posted 29 Jan 2007 at 10:45 UTC by Akira » (Master)

    At the very least, he's not trying to sell us viagra or penny stocks

No. He's just trying to sell us a book :-)

Mentifex FAQ, posted 30 Jan 2007 at 04:56 UTC by robogato » (Master)

The Gato received an email this evening with some Mentifex links:

Date: Tue, 30 Jan 2007 12:05:37 +1100
To: gato@advogato.org
Subject: Story suggestion

Hi, this may be of interest to your readers:
The Mentifex FAQ
http://www.nothingisreal.com/mentifex_faq.html

You may also want to post a link to the Crank.net AI page:
http://www.crank.net/ai.html
Note that Arthur T. Murray/Mentifex has two entries, the fifth and the last.

I hope this helps.

The mind is broken, posted 2 Feb 2007 at 02:57 UTC by Alphax » (Apprentice)

It only runs on MSIE? fail@life...

menticide, posted 10 Feb 2007 at 22:19 UTC by nixnut » (Journeyer)

Sometimes a bit of preemptive menticide can save a lot of bother later.
