Jan 02 2017
 

(text of pic: You and some other guy are glued to the tracks. Either of you can pull the lever, releasing the trolley and killing the other guy. You told the other guy that you’ll pull the lever if he does, hoping he won’t kill you. After a long time, he finally pulls the lever. Do you keep your promise and cause 2 deaths instead of 1?)

This is prob common knowledge, but hey, here are my answers:
1 – Yes. If someone’s killing me, sure as hell I’m gonna do everything in my power to kill him back. Screw that guy.
2 – Yes. If you don’t follow through on precommitments, it leaves you open to exploitation by defectors. Slightly less important for you personally if there’s only one of you but…
..2a – could still be very important if someone has access to your source code, OR if you’re actually a simulation being simulated by a predictor to see what the real-you would do in this situation so they know if it’s safe to defect or not, and
..2b – even if this is the real world rather than a simulation, your actions will reflect on those who are similar to you, which likely includes many of your friends and loved ones. If you don’t pull the lever, this is weak-to-moderate evidence that your loved ones also wouldn’t pull the lever if put in the same situation, and that leaves them open to exploitation.
3 – This would be much harder to apply in the case of actual nuclear weapons. But fortunately these are trolleys, so I don’t have to think that hard :)
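The precommitment logic in 2a can be sketched as a toy model. This is just my own illustrative sketch, not anything from the comic: assume the other guy is a perfect predictor who pulls the lever only if he predicts you won’t retaliate, and that the payoffs are as described in the scenario (nobody pulls: both live; he pulls and you retaliate: both die; he pulls and you don’t: you die alone).

```python
# Toy model of the precommitment argument (my own sketch, assuming a
# perfect predictor and the payoffs implied by the trolley setup).

def predictor_pulls(you_retaliate: bool) -> bool:
    """The other guy defects (pulls) only if he predicts it's safe,
    i.e. that you won't pull back."""
    return not you_retaliate

def outcome(you_retaliate: bool) -> tuple[bool, bool]:
    """Return (you_survive, he_survives) given your disposition."""
    he_pulls = predictor_pulls(you_retaliate)
    if not he_pulls:
        return True, True    # nobody pulls: both live
    if you_retaliate:
        return False, False  # mutual destruction
    return False, True       # he defects, only you die

print(outcome(you_retaliate=True))   # committed to retaliate
print(outcome(you_retaliate=False))  # known to be exploitable
```

Under these (admittedly loaded) assumptions, the disposition to retaliate is exactly what keeps the lever from ever being pulled: `outcome(True)` is both surviving, while `outcome(False)` gets you killed. The whole argument, of course, leans on the predictor actually reading your disposition correctly.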

The more interesting question is… if you’re glued to the tracks and have nothing else to do for your entire life (and you can’t talk to the other guy)… should you pull the lever just for the excitement of seeing what he’ll do?

  9 Responses to “MAD trollies”

  1. I am bad at this. If you both pull the lever at the same time the trolleys will collide though, right? Also, if you stagger the release enough, one trolley could push the other, maybe close enough to be useful in escaping.

    • Oops, clicked too fast. I don’t think I would pull the lever. That guy is going through some shit; he got picked up and glued to a trolley track. It sucks, but I can’t hate him for being scared, and I don’t think I would be able to pull the lever out of spite.

  2. I wouldn’t be in that situation. If I were able to communicate with him I’d use it for something other than that.

  3. I find these moral decisions kind of tiring, because whenever something like this comes up in reality, people seem to make their choices based on whatever they feel like anyway, without considering what choice they’d make in a hypothetical situation. I don’t mean people being glued to rails; I mean, for example, autonomously driving cars. Even in their current state they appear to be a lot safer than human drivers, but I doubt they’re gonna be allowed to be used for the next ten years at least.
    Or voluntary euthanasia.

    • I think part of the hope is that by presenting these sorts of things and inviting people to think about them, maybe it’ll start to sway their thinking a bit when they do run into those sorts of situations in real life. So maybe they’ll end up acting more like they’d prefer to upon reflection, rather than just going with their first instinctive response.

      • That is a good point. Unfortunately the people making the decisions don’t hang around in rationality forums (or they’re very good at hiding that). But you’re right, the more those ideas become mainstream, the more likely they start influencing decisions.
