Weirdly enough, both can be true. I was tangentially involved in EA in the early days, and have some friends who were more involved. Lots of interesting, really cool stuff going on, but there was always latent insecurity paired with overconfidence and elitism as is typical in young nerd circles.
When big money got involved, the tone shifted a lot. One phrase that really stuck with me is "exceptional talent". Everyone in EA was suddenly talking about finding, involving, and hiring exceptional talent, at a time when there was more than enough money going around to give some to us mediocre people as well.
In the case of EA in particular, circlejerks lead to idiotic ideas even when paired with rationalist rhetoric: so they bought mansions for team building (how else are you getting exceptional talent?), praised crypto (because it was funding the best and brightest), and started caring a lot about shrimp welfare (no one else does).
I don't think this validates the criticism that "they don't really ever show a sense of[...] maybe I'm wrong".
I think that sentence would be a fair description of certain individuals in the EA community, especially SBF, but that is not the same thing as saying that rationalists don't ever express epistemic uncertainty, given that, on average, they spend more words on that than just about any other group I can think of.
> caring a lot about shrimp welfare (no one else does).
Ah. I guess they are working out ecology from first principles?
I feel like a lot of the criticism of EA and rationalism does boil down to some kind of general criticism of naivete and entitlement, which... is probably true when applied to lots of people, regardless of whether they espouse these ideas or not.
It's also easier to criticize obviously doomed/misguided efforts at making the world a better place than to think deeply about how many of the pressing modern day problems (environmental issues, extinction, human suffering, etc.) also seem to be completely intractable, when analyzed in terms of the average individual's ability to take action. I think some criticism of EA or rationalism is also a reaction to a creeping unspoken consensus that "things are only going to get worse" in the future.
>I think some criticism of EA or rationalism is also a reaction to a creeping unspoken consensus that "things are only going to get worse" in the future.
I think it's that, combined with the EA approach to it, which is: let's focus on space flight and shrimp welfare. Not sure which side is more in denial about the impending future?
I don't believe any particular individual can do more about shrimp welfare than they can about the intractable problems we actually face.
> I think it's that combined with the EA approach to it which is: let's focus on space flight and shrimp welfare. Not sure which side is more in denial about the impending future?
I think it's a result of its complete denial and ignorance of politics. The rationalist and effective altruist movements make a whole lot more sense if you realize they are talking about deeply social and political issues with all the politics removed. It's technocratism, the poster child of the kind of "there is no alternative" neoliberalism that everyone in the western world has been indoctrinated into since the 80s.
It's a fundamental contradiction: we don't need to talk about politics because we already know liberal democracy and free-market capitalism are the best we are ever going to achieve, even as we face numerous intractable problems that cannot possibly be related to liberal democracy and free-market capitalism.
The problem is: How do we talk about any issue the world is facing today without ever challenging or even talking about any of the many assumptions the western liberal democracies are based upon? In other words: the problems we face are structural/systemic, but we are not allowed to talk about the structures/systems. That's how you end up with space flight and shrimp welfare and AGI/ASI catastrophizing taking up 99% of everything these people talk about. It's infantile, impotent liberal escapism more than anything else.
They bought one mansion to host fundraisers with the super-rich, which I believe is an important correction. You might disagree with that reasoning as well, but it's definitely not as described.
As far as I know it's never hosted an impress-the-oligarch fundraiser, which as you say would at least have a logic behind it[1] even if it might seem distasteful.
For a philosophy which started out from the point of view that much of mainstream aid was spent with little thought, it was a bit of an end of Animal Farm moment.
(to their credit, a lot of people who identified as EAs were unhappy. If you drew a Venn diagram of the people that objected, the people who sneered at the objections[2], and the people who identified as rationalists, you might only need two circles though...)
[1]a pretty shaky one considering how easy it is to impress American billionaires with Oxford architecture without going to the expense of operating a nearby mansion as a venue, particularly if you happen to be a charitable movement with strong links to the university...
[2]obviously people are only objecting to it for PR purposes because they're not smart enough to realise that capital appreciates and that venues cost money, and definitely not because they've got a pretty good idea how expensive upkeep on little used medieval venues are and how many alternatives exist if you really care about the cost effectiveness of your retreat, especially to charitable movements affiliated with a university...
> If you drew a Venn diagram of the people that objected, people who sneered at the objections[2] and people who identified as rationalists you might only need two circles though...)
I’m a bit confused by this one.
Are you saying that no-one who identifies as rationalist sneered at the objections? Because I don’t think that’s true.
Nope, I'm implying the people sneering at the objections were the self proclaimed rationalists. Other, less contrarian thinkers were more inclined to spot that a $15m heritage building might not be the epitome of cost-effective venues...
Yes! It can be true both that rationalists tend, more than almost any other group, to admit and try to take account of their uncertainty about things they say and that it's fun to dunk on them for being arrogant and always assuming they're 100% right!
Because they were doing so many workshops that buying a building was cheaper than renting one all the time.
You may argue that organizing workshops is wrong (and you might be right about that), but once you choose to do them, it makes sense to choose the cheaper option rather than the more expensive one. That's not rationalist rhetoric, that's just basic economics.