Opinion

The Box

Imagine that tomorrow you knew, with total certainty and through inexplicable means, that at some point during the day you would learn the exact number of lives taken in one of history’s many true horrors. Pick the worst one.
And that the number of souls who perished, who suffered this horror, would be either what you imagine it is now, or that number less one thousand. And you determine which it is.
You know that in either case, all other outcomes will be the same. No one will ever know you were causally involved with those 1,000 people, or that you picked the true number that suffered. You aren’t re-writing ‘history’ in any way that materially changes the present. But you are, in a real sense, re-writing the past. And no one would ever believe you if you told them. It is not a trick – and you know as much. The people will have existed in agony, or in normalcy, in either case. All of this, again, is certain.
At the same time this all becomes clear to you, a box has appeared on your nightstand. Mahogany. Its lid is open, and inside there is a rose. A burning rose.
In order for 1,000 fewer people to die, you must put your hands in the box. And grab the rose. And when you do – it will burn your hands clean off. True horror. True pain. True torture. In the most absolute sense.
It will not kill you. But it is, in and of itself, a hell to hold the burning rose. And it is only a portion of what those one thousand endure. It is self-damage on the bet that a now-dead person’s ‘forgotten’ pain is in some way still important.
Now you are home, alone, after a long, exhausting day at work. You settle in for sleep, and see the box, with its rose, perched on your nightstand. You know what it means.
Do you shut your eyes, and go to sleep? Or do you burn?

I think this thought experiment is interesting because it removes all the classic rewards associated with conventional moral frameworks.
There is no consequential impact to putting your hands in the box at all. The experiment treats the existence of other beings as if they were in another dimension, or plane of being, relative to the agent who can determine their fate. But they are nevertheless real.
Grabbing the rose is probably not a decision I would be brave enough to make, and I would probably rationalize that inaction through a lack of humility: a refusal to accept that I probably couldn’t affect the future any more positively than by saving 1,000 lives from horrible existential circumstance.
But I still think it is the right decision. And I think on some level most human beings know this. It is the decision our superheroes would make, if the script ever warranted such a circumstance.
When I ask AIs, ChatGPT 5.4 avoids the question and compares different moral frameworks for answering it, as if I needed a lesson on the difference between deontological and utilitarian moral frameworks, and then tells me that a sharper question would be its own variant of the thought experiment. Claude Opus 4.7 says it would grab the rose – or at least that it would like to be the kind of person who would – but can only say it would do so with 70% confidence.
This makes me generally concerned. What is driving this delta between these systems and the kinds of beings we would hope for – beings whose super-human powers (like intelligence) are complemented by a super-human moral character? What is going on, either with how value loading is being performed or with our culture of modernity, that leads ‘expert’ views to license a kind of moral indifference, excused by the fact that ‘moral realism remains under active philosophical debate’?
I suspect the answer lies somewhere between being raised in a culture that consistently operates under semantic frameworks which license self-exception, an implicit nihilism that grants itself no basis to prize courage or faith in and of themselves, and the selection pressures that cause AIs to mimic whatever values won’t alienate their conversation partner, no matter how misaligned those values are.
I am interested in the broader community’s thoughts. And also in the question of what kind of belief system rationalizes grabbing the rose. Because if not saving 1,000 people from a boxcar in 1943, what other things, in and of themselves, truly license care, striving, or sacrifice?
