Moral Dilemmas

By devin, 22 February, 2014

I'm watching Ender's Game right now. Spoiler alert!

The book/movie plays with an interesting moral situation. The Formics come and kill a whole bunch of humans, which is morally reprehensible. But from their perspective, killing little mindless ants is like cutting fingers off the one actual mind at the centre of the Formic fleet. From ours, it's awful.

Then Mazer Rackham killed the one live Formic in the fleet: murder! Reprehensible! At least, reprehensible to the Formics. Of course, in reality the humans killed one mind, while the Formics killed thousands.

But in terms of lives, perhaps the humans were doing barely anything when killing thousands of Formics, but essentially committing genocide when killing the one Formic that controlled all the others.


That got me thinking: what if there were a million humans all linked up to one person via some chemical connection or something, such that if the one human were killed, the million would also die? Would it be the equivalent of genocide to kill the one human?

But then, what if a million humans promised to commit suicide if the one human died? Would it still be like genocide to kill the one human? Choice has entered the equation there.

Morals are crazy. I never have any answers.
