Nulono wrote:It'd really just be an AI. Should PETA arrest me if I don't feed my virtual dogs?
If your virtual dogs are living in a virtual world where they can feel pain just like a "normal" dog, then yes. Not feeding your dogs is bad (uhm, let's not get into a long discussion about animal ethics), and not feeding your virtual dogs, which work just like your "normal" dogs, is also bad.
zug wrote:I think it would be a neat experiment to do. Assuming we were able to synthesize this kind of compuperson, why don't we just ask them about it? Ask if something hurts or if they want us to stop.
So if you program a robot, for example a Tickle Me Elmo, to say "Stop it, that hurts!" in response to physical stimulus, that means we should believe it?
Uhm, the exact same thing applies to "normal" humans. So yes, we should believe it, or at least partly (we don't believe everything a "normal" human says either).
nbonaparte1 wrote:I think simulating the entire person (much less a universe) is unnecessary. If you want to do an experiment on the body, do a simulation on the body. If the brain never existed, there arises no ethical problem whatsoever. Mental experiments are different, though, but nothing's stopping you from only simulating what you need. If you have to disconnect the cerebral cortex, then do so.
Well, much of our bodies is controlled by the brain; you'd lose a lot by removing it. Besides, why would you even need to run an experiment on an actual simulated body? There's no need to make your program more abstract than it has to be.
slacks wrote:I'm not sure how we would resolve punishment issues for a being that can live so much longer than a human being either. I mean, you can't exactly sentence a computer intelligence to life in prison and figure that is equivalent to the same sentence for a human being (an argument could be made for it being either worse or better!).
I'm also unsure how you would handle people owning/maintaining the physical hardware that stores a computer intelligence, particularly if said software is loaded without permission.
Well, the idea that the ultimate punishment is putting someone in jail was created by "normal" humans, for "normal" humans. We'd just find some other way of punishing someone, if that is even needed.
The idea of "life in prison" is based upon the fact that people rarely reaches 100 years of age. What's the point in sentencing a virtual being to ten billion years of prison? Wouldn't it be way easier to just reprogram the virtual being and then continue on? I don't buy into the notion that punishment should be given "just because". If human A kills human B and human A is reprogrammed to not kill more humans, shouldn't that be enough? I mean, why should the reprogrammed human A have to do jail time for what the previous human A did?