Jul 02 2015
 

Recent comments on the previous post, regarding valuing how brain-states are achieved, deserve reflection and a reply.

 

 

Rowan:

how is the process of playing Doom without cheat codes distinguished from the process of repeatedly pushing a button connected to certain electrodes in your head that produce the emotions associated with playing Doom without cheat codes? (Or just lying there while the computer chooses which electrodes to stimulate on your behalf?)

If it’s just the emotions without the experiences that would cause those emotions, I think that’s a huge difference. That is once again just jumping right to the end-state, rather than experiencing the process that brings it about. It’s first-order control, and that efficiency and directness strips out all the complexity and nuance of a second-order experience.

See Incoming Fireball -> Startled, Fear
Strafe Right -> Anticipation, Dread
Fireball Dodged -> Relief
Return Fire -> Vengeance!!

Is strictly more complicated than just

Startled, Fear
Anticipation, Dread
Relief
Vengeance!!

I think the key difference is that in the first case, the player is entangled in the process. While these things are designed to produce a specific and very similar experience for everyone (which is why they’re popular with a wide player base), it takes a pre-existing person and combines them with a series of elements that is supposed to lead to an emotional response. The exact situation is unique(ish) for each person, because the person is a vital input. The output (of person feeling X emotions) is unique and personalized, as the input is different in every case.

When simply conjuring the emotions directly via wire, the individual is removed as an input. The emotions are implanted directly and do not depend on the person. The output (of person feeling X emotions) is identical and of far less complexity and value. Even if the emotions are hooked up to a random number generator or in some other way made to result in non-identical outputs, the situation is not improved. Because the problem isn’t so much “identical output” as it is that the Person was not an input, was not entangled in the process, and therefore doesn’t matter.
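The “person as input” distinction can be sketched as a toy pair of function signatures (my own illustration, with made-up names and events, not anything from the original post):

```python
# Toy sketch of the argument above: in the "played" case the output emotion
# is a function of BOTH the event and the person; in the "wireheaded" case
# the person is not an argument at all, so everyone gets the identical output.

def played_emotion(person: str, event: str) -> str:
    # The person's unique history shapes the felt experience,
    # so different people yield different outputs for the same event.
    return f"{event}-as-felt-by-{person}"

def wireheaded_emotion(target_emotion: str) -> str:
    # The person does not appear anywhere in the computation:
    # the end-state is simply injected, identical for everyone.
    return target_emotion

print(played_emotion("Rowan", "fireball_dodged"))  # varies with the person
print(played_emotion("Billy", "fireball_dodged"))
print(wireheaded_emotion("relief"))                # same for everyone
```

The point of the sketch is just the signatures: removing `person` from the parameter list is exactly what makes the second process cheaper, and also what makes the person not matter.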

 

Billy:

I may be misunderstanding how you use the term “wireheading”, but a sufficiently advanced machine could stimulate the right parts of your brain at the right time to give you the experience of watching a movie, and there would be no way to distinguish between the “real” experience and the “wired” experience. (Or substitute any of your other examples.)

So before we start, I want to state that I don’t think there’s anything bad about simulated experiences per se. “Wireheading” is commonly defined as directly activating the end-state that is desired. In the classic example, by running a wire to the joy-parts of the brain and stimulating them. What you seem to be describing is more of a Matrix-style full sensory replacement.

I actually don’t have much of a problem with simulated realities. Already a large percentage of the emotions felt by middle-class people in the first world are due to simulated realities. We induce feelings via music, television/movies, video games, novels, and other art. I think this has had some positive effects on society – it’s nice when people can get their Thrill needs met without actually risking their lives and/or committing crimes. In fact, the sorts of people who still try to get all their emotional needs met in the real world tend to be destructive and dramatic; I’m sure everyone knows at least one person like that, and tries to avoid them.

Of course I think a complete retreat to isolation would be sad, because other human minds are the most complex things that exist, and to cut that out of one’s life entirely would be an impoverishment. But a community of people interacting in a cyberworld, with access to physical reality? Shit, that sounds amazing!

Perhaps you meant something different? A “Total Recall” style system has the potential to become nightmarish. Right now when someone watches a movie, they bring their whole life with them. The movie is interpreted in light of one’s life experience. Every viewer has a different experience (some people have radically different experiences, as my SO and I recently discovered when we watched Birdman together; in fact, comparing these differences is the most fun part of my bi-weekly book club meetings. It’s kinda the whole point). The person is an input in the process, and they’re mashed up into the product. If your proposed system would simply impose a memory or an experience onto someone else wholesale* without them being involved in the process, then it would be just as bad as Rowan’s “series of emotions” process.

I have a vision of billions of people spending all of eternity simply reliving the most intense emotional experiences ever recorded, in perfect carbon copy, over and over again, and I shudder in horror. That’s not even being a person anymore. That’s overwriting your own existence with the recorded existence of someone(s) else. :(

  2 Responses to “Removing The Person From The Output”

  1. I don’t know if you picked up the idea of ‘human as input’ from anywhere, but that totally made the argument click for me. I was sort of leaning on a sufficiently advanced wirehead being able to replicate what you were talking about, but this post let me see exactly what you mean. Excellent post, and I will likely use the ‘person as input’ framing in discussions I have about simulated realities and such in the future!
