Epistemic status: Left as an exercise for the reader.
I was thinking about EA outreach and its optics this week, and was inspired to examine one specific angle: what does EA look like through the lens of a Slytherin?
People keep being surprised about Effective Altruist Slytherins
Surprise means your model has an error. I’ll point out the confusions. First confusion:
But Slytherins don’t care for altruistic goals!
I think we can divide the Hogwarts Houses into methods houses[1]:
Slytherin values social shrewdness. Understanding the status game. Knowing who holds the stakes.
Ravenclaw values well-crunched analysis. Intellectualism. Academicness.
and value houses:
Gryffindor values courage and doing what’s right
Hufflepuff values kindness and loyalty
And this naturally leads into an explanation for why Slytherins would want to do EA:
Slytherins have a pretty normal amount of variance in their values, accounting for their backgrounds. Methods are (to a significant degree) orthogonal to values.
If it’s a methods house, why do even the non-reprehensible Slytherins often get called evil? I think it’s largely because the Slytherin-virtuous methods are considered reprehensible by the wider public: manipulating social realities and truth, and paying attention to the gap between what people say and what they actually think, are things you’re not supposed to do or look at. And people judge others by their social associations.
Second confusion:
But why are there so few Slytherin EAs?
1. Selection effects
Senior Slytherins don’t buy into EA. (Importantly, this includes politically savvy people.)
(Theory: People who have good models on when someone is Out to get You only buy into EA shallowly)
The typical form of the pitch: “If you donate a fraction of your income, you can have an outsized impact with the money – saving a life costs $50,000+ in the US, and only $5,000 at the most effective location.”
To a Slytherin this reads as: “Yap yap yap, I’m selling you indulgences with Additional Math, and calling it moral realism.”
These are the same mouth-noises every religion salesman makes, and the salesman is, of course, also trying to profit.
Let’s observe how the outreach pulls in the other houses:
Consistency arguments, which interest the Ravenclaw, are a side concern to most pure Slytherins. (Slytherin-Ravenclaw and Ravenclaw-Slytherin are common archetypes in Serious Ethical Debates, though.)
Belonging, a community that truly cares, can sell the Hufflepuff who happens to come across EA. Slytherins don’t see why they should buy belonging when someone is selling it – belonging isn’t the kind of good that you can legibly sell.
Call-to-arms, for the Gryffindor. Sigh. To a Slytherin, a call-to-arms is preying on people’s naivete, extracting free resources and giving none back.
Why is there no Slytherin pitch? The people with shrewd detectors will apply cost-benefit math to the problem of community-joining and associating. And the problem with this is obvious: If you sell a community by its benefits (the way you sell stuff to people whose Out-to-Get-You detectors fire), you attract grifters.
The frame also often implies that EA is looking for true believers. Sigh. The true-believer frame is charity’s non-adversarial self-image made explicit — and to a Slytherin, charity’s self-image as non-adversarial is self-contradictory. For don’t charities compete for the same attention, same labor as do profitable businesses?
2. You just don’t see them
Most Slytherins don’t actually advertise their house (and they shouldn’t).[2] You can look at the previous section for some of the reasons.
(On the other hand, they kind of do, if you look at in the right way – it’s just illegible and layered enough to not be obvious. For obvious reasons.)
Case study
Let’s interview a real life Slytherin who was available for anonymous extrospection:
What does it feel like to be a Slytherin?
I guess most of the time it doesn’t feel like anything. It’s a component of how I model reality and social spaces and connections.
In EA spaces I’m sometimes bothered by people’s assumptions about altruism or self-sacrifice.
But then someone genuinely wise puts self-preservation on a list of virtues and vices for wicked problems, and I feel like sanity is back.
Genuine worldview difference: people talk about other people being good or evil. I don’t actually spend time computing that. It’s not action-guiding, and I want good for everyone. I spend time modelling what people’s observed and stated values are, where those are in conflict, and when people are incoherent.
How do you tell yourself apart from a Slytherin grifter? How would anyone else?
I don’t answer adversarial questions.
Do you have takes on EA you wouldn’t say aloud non-anonymously?
I think EA, and especially the Giving Pledge, is pushing too hard on ‘very altruistic people’ and not enough on ‘people who are self-interested but understand the value of cooperation’. The rationalism movement of course captures a significant number of people from the latter group, but the dogmatic purism can be quite off-putting.
To elaborate: there’s this constant push to ‘be more altruistic’, but drawing on the law of equal and opposite advice, I doubt that most EAs benefit, by their own values, from being pushed in a more altruistic direction.
What’s the question I should have asked and didn’t?
What is this, a job interview? Uhh. “What’s the most significant favor owed to you?”
What’s the most significant favor owed to you?
I don’t answer adversarial questions.
Can you explain your policy for which questions you will decline to answer?
I don’t answer adversarial questions.
Seriously, can you tell me a question you would answer?
I don’t answer… just kidding. “Do you have a list of favors people owe you, or of levers you have identified?”
Do you have a list of favors people owe you, or of levers you have identified?
No. I don’t enjoy using paper notebooks, and I wouldn’t trust a digital notetaking setup with this kind of information.
Kidding. Seriously, I don’t create those kinds of lists because I think they would be corrosive.
Additionally, it would breach GDPR.
Anything else?
It’s an iterated game.
This is an illustrative lens about what kind of people the houses are selecting for, not a clean fit. ↩︎
TLDR: It’s just bothersome to get called mean, evil, unprincipled, scary, or dangerous repeatedly. And instrumentally detrimental. ↩︎