Politics is an EA blindspot
Effective Altruism is failing to grapple with US politics
TL;DR
I argue that maintaining nonpartisan norms on the EA Forum, in public communications by influential community members, and in funding decisions may be more costly than people realize. The lack of public discussion means that people don't take political issues as seriously as they should, research that depends on understanding the political situation doesn't get done, and the community moves forward with a poor model of what is probably the most consequential actor in the world for any given cause area - the US government. Importantly, I'm not saying that most community members should abandon studious nonpartisanship! I merely want to argue that we should be aware of the downsides and do what we can to mitigate them.
Why nonpartisan norms in EA are a big deal
Individual politicians (not naming names) are likely the most important single actors affecting the governance of AI. The same goes for most of the cause areas EAs care about. While many prominent EAs think political issues may be a top priority, and politics is discussed somewhat behind closed doors, there is almost no public discussion of politics. I argue that the community's lack of a public conversation about the likely impacts of these political actors, and about what to do in response to them, creates large costs for how the community thinks about and addresses important issues (i.e. self-censorship matters, actually). Some of these costs include:
Perceived unimportance: I suspect a common, often subconscious, thought is, 'no prominent EAs are talking about politics publicly, so it's probably not as big a deal as it seems'. Lack of public conversation means social permission is never granted to discuss the issue as a top priority; it means the topic comes up less & so is thought about less; and it means institutions don't feel pressured to do anything.
There is a culture of deferring to experts in EA (i.e. people base their opinions on those of senior figures in the community). While I think EA is better than most communities about deference, and deference is often a reasonable strategy, here it backfires in a big way.
Newcomers looking to download the wisdom of EA will look around (e.g. at the EA Forum or the 80k website) and see that no one seems to care about politics even a little. I expect many will think either, ‘woah, EA is kinda stupid. Seems like they’re missing something big’ or ‘I guess politics is another one of those things that seems important but actually doesn’t matter’.
Nonpartisanship forces self-censorship & erodes epistemic honesty: Many EAs and rationalists believe it best to present a relatively unfiltered and authentic version of their beliefs to the public. Aiming for nonpartisanship in your public communications means you cannot convey your honest opinions to the world or maintain a more authentic presence (which does matter; I think people are good at noticing inauthenticity, and they do care about it).
Failure to develop accurate models of (political) reality: I fear that, because of the above factors and because many in the community aren't really keeping up with what's happening, many people don't really know what's going on (e.g. the DOGE cuts were a total surprise to most but are probably the single biggest event in global health this decade). When I tell people facts about the current situation, they're often surprised and update their models at least somewhat. Given this lack of understanding, some people act as if we're in a business-as-usual world where the US can be modeled as acting roughly as it did under Biden but with some Republican flavoring. I think this is incorrect and could hurt decision-making in crucial ways.
Research gap: There is only a tiny amount of funding in the EA space available for work which is necessarily political in nature. This is partly because of a desire by researchers & funders to be nonpartisan, but also because most donors either strongly prefer, or are legally obligated, to donate only to nonpartisan nonprofits (501(c)(3)s).
Lack of knowledge transfer: It would appear that most organizations do essentially no analysis of the political situation; those that have done such analyses will likely never share them publicly and, I suspect, don't even share them privately. When no one publishes, there's no build-up of knowledge, and the frontier of understanding advances dramatically more slowly than it would under an uncensored information-sharing regime.
Being nonpartisan closes down many options for impact: While maintaining nonpartisanship is a way to keep options open, it is more costly than I think many realize. A nonpartisan stance means you miss many potentially impactful opportunities (e.g. getting involved in political campaigns is a common way to get roles in future administrations), and I worry people are overweighting the option value they preserve by staying nonpartisan.
Failure to prioritize and act on high-impact opportunities: All of the above culminate in a failure to consider political interventions and to engage in real thinking about one of the most important leverage points for determining the trajectory of the future. If the single most important thing to do right now were to intervene in the US political situation, it feels unlikely the EA community would recognize that and act on it.
Why EAs tend not to engage much with politics
There are plenty of excellent reasons why being openly political can be a bad idea. By engaging publicly with political issues, organizations and individuals risk:
Antagonizing political opponents
Losing influence with influential people
Losing or being unable to get funding (e.g. 501(c)(3)s aren't allowed to engage in political advocacy1)
Polarizing the EA community and top-priority cause areas (especially if there’s an imbalance in support for certain candidates or parties)
Being canceled by the other side
There are also some more cultural and temperamental reasons why EAs tend not to engage with politics as much as might be ideal. EAs tend to focus mostly on issues where there isn't a ton of public attention, tend to dislike the adversarial atmosphere of partisan politics, and prefer to work on the margins (this recent piece goes into more detail). I also suspect the meme that 'politics is the mindkiller'2 has been influential in the community and pushes many toward believing it's not worth spending time trying to be rational about such an emotionally charged space.
Many should remain nonpartisan but improvements can be made
While there are large costs, I believe most of the community should remain largely nonpartisan. What I am saying is that, given the importance of politics, we should find more ways to ensure political analysis and action are happening. How might we go about doing this?
Some community members could talk more openly about their analysis of the current political situation. That is, someone should be willing to name names & spell out how politics affects the cause areas they care about (e.g. Kelsey Piper does this well). Importantly, it's best to avoid making claims on behalf of a given cause area or EA as a whole, as I think the broader community should remain largely nonpartisan.
More advocacy movements should aim for bipartisanship rather than nonpartisanship by having partisan advocates, operatives, and funders on both sides of the aisle, as is often the case with successful advocacy and lobbying (e.g. corporations, AIPAC, crypto, etc.).
In an ideal world, there would be many Republicans and many Democrats who care about nonpartisan issues like mitigating existential risks and ensuring a flourishing future. These priorities and causes are likely to get more buy-in when politicians and others involved in party politics see that people with these priorities are also allied on other causes they already care about.
For this to happen we need to have more members of the community who are openly partisan on both sides. Admittedly the community is likely to lean more toward one side than the other and I’m not sure what to do about that.
Some community members should get involved directly with politics (with both parties).
Organizations could have internal discussions and commission research to keep track of what’s happening and consider potential interventions.
There could be a few more funders open to funding research which is necessarily political.
There could be broader awareness of just how consequential the political situation is to ~everything the EA community cares about.
It could be useful for events, organizations, and places where discussion happens to make it clearer that the reason people don't talk about politics in these spaces is not that they think it's unimportant but that partisan speech is restricted or disincentivized.
1. Though these restrictions are easier to circumvent than I think many realize; there are clear, albeit moderately costly, ways to get around them. After all, Charlie Kirk runs a 501(c)(3) (along with a 501(c)(4)) and has no qualms about endorsing candidates. It seems quite common for advocacy groups to have both & to do very explicitly partisan activities with purely nominal separation.
2. There's an old Yudkowsky post called 'Politics is the Mind-Killer'. The title has become a meme in the EA & Rationalist community, and I think it's contributed to the community's general allergy to politics. I've certainly heard the phrase 'politics is the mindkiller' said, often jokingly, in EA circles. While the post doesn't actually advocate against talking about politics (it's pretty narrowly saying don't use politically charged examples when making an unrelated point), I think the meme it's created has pushed many away from engaging with the topic.
