I’d pull the lever to kill one person immediately. Assuming the decision maker at each stage is a different person with different opinions on moral, ethical, religious, and logical questions, then it’s a near certainty that someone is going to pull the lever to kill the people at their stage. If you’re lucky, it’s the very next guy. If you’re not, it’s the guy killing a million people a couple of iterations later. If I’m the first guy, I’ll take the moral hit to save the larger number of people.
I feel like running over all those bodies would make the train come to a stop way before it ran over a million people.
Now I sit back and wait for some morbid soul who is better at math and physics than me to figure out the answer.
Now if we assume the victims tied up are frictionless orbs, and the train is also a frictionless orb, and the two of them are travelling in a frictionless void, then I reckon we could kill a few more.
But then would they die if they don’t slow the train down? The train would necessarily have to impart some energy in order to effect a change in their bodies.
Maybe the train is an unstoppable force.
Like the GTA train
I guess sticking people in the void is a good way to kill them in any case.
I mean if you’re going fast enough with a pointy train, you could chop up people pretty easy. You just need to make sure that each person is a tire width apart to make sure the wheels don’t lose traction. Assuming a person is roughly half a metre across and a tire is 75cm in diameter, we get 1.25m per person, so a track of 1250km for a million people. Not very long at all.
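For anyone who wants to check that track length, here's the arithmetic spelled out (using the post's assumed 0.5 m person width and 0.75 m tire width):

```python
# Back-of-envelope check of the 1250 km figure: each person is assumed to be
# 0.5 m across, spaced one 0.75 m tire-width apart (the post's numbers).
person_width_m = 0.5
tire_width_m = 0.75
spacing_m = person_width_m + tire_width_m   # 1.25 m of track per person

people = 1_000_000
track_km = people * spacing_m / 1000
print(track_km)   # 1250.0
```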
I agree with your logic, so far as it goes. However, there are currently just over eight billion humans in existence. If my quick, over-tired math is correct, that means only 34 people have to say no before we run out of people to tie to the tracks. Assuming that, at that point, the system collapses and nobody dies, betting on 34 people refusing might be the better choice.
Would you trust the entirety of human existence to be decided by 34 people? In my experience from watching reality TV, the last one always screws the rest over for their own benefit.
Imagine being the last one. You could singlehandedly wipe out half the global population. This would normally be a bad thing, and it is, but it would also make every survivor twice as rich, solve food scarcity and halve the pollution, perhaps even saving humanity from itself.
If that’s not enough, think about everyone now having double the amount of kittens and half the traffic on the roads.
Society and the economy are not a zero sum game. Killing half the population wouldn’t make the survivors twice as rich. It would send society into chaos which would make the remaining people’s lives far worse.
I’m not sure reality TV is a good basis, it’s very manipulated and set up for drama. I have a lot more faith in humanity in general than I do in reality TV stars. But you still have a good point, it’s definitely not a sure thing.
Oh yeah. I was assuming an infinite series (somehow). Also, odds are good that out of 34 people, one of them would misunderstand the rules or be crazy enough to do it anyway for various reasons. I’d probably still do it.
You weren’t wrong, the meme implies an infinite series, and I might be cheating to apply real-world constraints to an absurd hypothetical.
After we run out of people, they start adding cats & dogs.
Yikes! Pull the lever now!
Exactly. If you have the means at hand, you have the responsibility to act. At the risk of taking a shitpost way too seriously, if you were in that situation and actively chose to leave the decision to someone else to kill double the people, then you acted unethically.
Technically the 2nd guy could just let it go through and nobody dies. However, if it were to keep doubling over and over forever, then technically the best option is to just double it forever. Nobody would ever die? If someone decided to end “the game”, as it were, and kill some people, then that’s on them.
Pretty sure there’s a base case when you run out of people to tie to the tracks. A naive log2 of 8 billion is only 33 decisions.
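A quick sketch of where the 33 comes from (treating decision n as putting 2^n people on the tracks, with decision 0 killing 1):

```python
import math

# How many doubling decisions fit inside Earth's population: decision n puts
# 2**n people on the tracks, so the game runs out of people once 2**n
# exceeds ~8 billion.
population = 8_000_000_000
stages = math.ceil(math.log2(population))
print(stages)   # 33
```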
Except, given finite resources, the tracks would run out before having enough space for 8 billion tied-up people.
There are 1.3 million km of railroads in the world. At 200km/h, a trolley could travel them in little over 270 days.
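The travel-time arithmetic checks out; here it is spelled out (assuming a constant 200 km/h with no stops):

```python
# Sanity check on the ~270-day figure: 1.3 million km of track at 200 km/h.
track_km = 1_300_000
speed_kmh = 200

hours = track_km / speed_kmh   # 6500.0 hours
days = hours / 24              # ~270.8 days
print(round(days, 1))   # 270.8
```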
Rearrange them in a circle and, providing everyone is cooperating for the sake of philosophy, there’s plenty of time to tie 8 billion people to the rails and run them over.
Yes. Say there are 2^33 people, for illustration's sake: by 33 decisions, you (the first puller) are guaranteed to be dead too. At 32 it’s 50/50, and the odds increase as the decisions get made. From a self-preservation standpoint, the best thing you can do to minimize your personal risk is pull the lever. It also happens to kill the fewest other people.
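A rough sketch of those odds (assuming exactly 2^33 people and random assignment to the tracks, per the illustration above):

```python
# With 2**33 people total, the share tied to the tracks at decision n is
# 2**n / 2**33 -- and, if positions are assigned at random, that share is
# also the first puller's own chance of being a victim at that stage.
total_people = 2 ** 33

for decision in (1, 32, 33):
    at_risk = 2 ** decision
    print(decision, at_risk / total_people)
# decision 32 -> 0.5 (a coin flip), decision 33 -> 1.0 (everyone, you included)
```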
The only out is nobody pulls the lever.
True, since we’re analyzing a hypothetical ethical question I shouldn’t leave any open assumptions. I made the assumption that at some point, at least one person will have to die, as in I see this trolley problem as a situation where at the end there is no choice and the maximum number of people die.
It’s on them, but it affects thousands or millions of others.
As such if you can prevent that, and don’t, it’s also on you too.
I think that’s bad logic. The choice everyone has is kill or not kill. I can’t be held responsible for someone deciding to pick kill when they have the ability to pick not kill.
You’re not responsible for their choice.
You’re responsible for giving them the choice.
Ok, and what does that actually mean for me? It’s not the same as intentionally putting someone in a situation where both choices knowingly result in death. And even if I was in this situation, wouldn’t it ultimately be the fault/responsibility of whoever set up the scenario to begin with?
I think this is a good metaphor for how humanity has “dealt” with problems like climate change.
If you make a tough decision, it causes hardship now, but prevents hardship in the future. If you don’t make a tough decision now, someone in the future has to either kill a lot of people, or just pass the buck to the next guy. People justify not making the tough decisions by saying that maybe eventually down the line someone will have an easy decision and there will be nobody on the side path, even though all observable evidence says that the number of people on that path just keeps growing exponentially.
On the one hand, the possibility exists that the buck gets passed forever, especially as the kill counts grow substantially, making the impermissibility of allowing the deaths grow with them. It’s not likely that any given person would kill one stranger, let alone millions.
On the other hand, in an infinite series, even something with minuscule odds will still inevitably happen eventually, and some psycho will instantly become the most infamous murderer in history, followed immediately by the person that didn’t just kill one person and end the growth before it started.
If you are really unlucky, the number doubles so many times you end up tied to the tracks.
But what if you’re the tenth person with 1024 on the line? Or the 20th person with 1,048,576? Etc. Is there ever a point (before it’s everyone, in which case risk doesn’t increase) where you stop pulling it?
I don’t think so.