(no subject)
Dec. 21st, 2018 12:27 am

I have a lot of effective altruists on my tumblr dashboard. I also have a lot of people who have serious criticisms of effective altruism on my tumblr dashboard. This means I see them bicker a lot about how to do charity and what the right way to help people is. And there's this pattern I keep seeing over and over again where the EAs and the people who don't like EA keep getting into huge arguments-- because they're not actually talking to each other.
From what I can tell, most EAs are systematizing nerdy types- people who genuinely do want to make the world better, do not have the skills you need to do 'traditional' charity work without going batshit insane, and are really really good at numbers. Meanwhile, most of the people I've seen who are really really upset about EAs and think they're culty are people who *do* traditional charity work- sometimes for a living.
So... effective altruism, as a philosophy, is supposed to be a way of optimizing doing good. Whether or not it actually is... well, I'm not informed enough to say one way or the other. Whether or not it's optimal *overall*, though, for a certain kind of systematizing-nerdy-person-- someone already planning a career in a high-powered, numbers-heavy industry-- it is probably one of the better ways to do some good in the world. If you have not much time but a lot of money, and you can put all that money into helping people in a specific way that you know will have a positive effect... it's probably better to do that than not.
The thing is, there honestly *is* a stigma in our society about wanting to make lots of money. ...Don't get me wrong, there's absolutely no stigma about having lots of money, quite the opposite, and there's very little stigma against most of the jobs that get you lots of money. But if you say "I would like to do this thing specifically because it will get me lots of money and for no other reason; I'm not particularly passionate about it and I don't think it will actively Make The World Better"... people are going to call you greedy. On Tumblr especially there are a lot of left-leaning people who think that trying to become wealthy is guaranteed to make the world worse, but even in more 'YAY, CAPITALISM' places you'll still get some side-eye.
The trouble is, on the other side of the debate... a lot of the people I've seen who argue with EAs are in the helping professions or do a lot of charity work in their own communities. If you're helping vulnerable people- especially in the USA- the odds are good that you have almost no money to work with and you're getting less and less every year. Your clients get a lot of shit for not just pulling themselves up by their own bootstraps, *you* get a lot of shit for not kicking your clients in the pants until they bootstrap themselves, and meanwhile your *community* has less and less stuff available to help people pull themselves out of poverty, period-- even simple things like "good jobs" and "affordable housing". There are a lot of shitty people out there who like to claim that anyone who says they need help is trying to fake their way out of doing real work, and a lot of the time these people *are* making massive amounts of money and have had massive amounts of luck to boot.
...Both problems come from the same place. People like Horatio Alger stories, but they don't like the truth of what it's really like to drag yourself out of poverty or make yourself rich. They would like to believe that everyone who gets rich sort of lucked into it by being a Good Person Who Followed Their Bliss, but also that anyone who's a Good Person Who Follows Their Bliss can wind up that way. People don't like to realize that it takes a lot of work and a lot of planning and a lot of other people helping you to drag yourself up, and that one of the things you have to do is go "yes, I am making this choice because it will make me money; I do not like it, but them's the breaks".
And on top of everything else I just mentioned, there's this... thing systematizing nerdy people do, where they find The Optimal Way To Do Something and then try to get everyone else to do it that way, too, and get really baffled and upset- sometimes to the point of borderline hostility, but it's not coming from a place of anger as much as bafflement- because why would you want to do things this other way that is clearly non-optimal? Like... you know the feeling you have when your elderly relative spends five minutes slooooowly typing 'google.com' into the Bing search bar, clicking on the link to Google, slooooowly typing in 'gmail.com', and then sloooooowly clicking the link to Gmail?
I get the impression that systematizing-nerdy-type people get that feeling a lot more easily than other people because they spend a lot of time trying to make the world around them work more efficiently. (Because they're engineers or accountants or technical writers in their Other Hat and you need a brain that does that to be a good engineer/accountant/technical writer.) It is a frustrating feeling to have AND a frustrating feeling to be on the other end of. So people who have a strong Optimizing Instinct often upset other people because they're trying to non-consensually Optimize them, and that means that the person with the strong Optimizing Instinct feels like they have to either constantly be a jackass to everyone or keep their mouth shut about everything and eternally cringe in their souls.
...What I'm saying is... this entire situation is a perfect storm of Things People Are Super Defensive about.
The EAs are coming into this situation with a lot of baggage. First, their decision to try to make a lot of money to make the world a better place is not really a popular one, especially in Tumblrspace, because there are a lot of people who think trying to become wealthy makes the world worse by default. Second, their Optimizing Instincts have gotten them in trouble in the past, so saying "this thing is the optimal way to do things!" already feels like a dangerous proposition. Third, people tend to think that EA is a cult because... well... getting enough people in one place EXCITEDLY TALKING about how passionate they are about optimizing things in a specific way tends to look like that, and when one of the things they're passionate about is "not letting insane rogue AI take over the world"... yeah. I don't think EA qualifies as a cult by any sane definition, but honestly at this point it doesn't matter; if people think you're brainwashed and part of a cult, the more you try to go 'no, really, it's not a cult', the crazier you look.
But then on the flip side, the people who are not fans of EA have a similar amount of baggage. They're dealing with an ever-expanding client pool, an ever-shrinking amount of money, and people trying to splain to them that their entire field of work is bad and evil. They tend to be people who really, really deeply care about the work they're doing and about the communities around them, and they tend to be more empathetic and thus more affected by the way their clients are hurting. They also tend- at least on my dash- to be people who have been abused by people they cared about, often in a way that's taken the form of Nonconsensual Optimizing.
So what you get, basically, is everyone trying to justify themselves. The argument devolves from "EA is/isn't a good idea" into one person trying to justify that the people they're trying to help genuinely need help, and that they're not evil or crazy for wanting to help them-- and the other person trying to justify that it's okay for them to want to help people in the way they can help, and that they're not evil or crazy for wanting to parlay the work they're good at into something that can actually make the world better. And it usually gets worse if someone involved has an unusually high Optimizing Instinct and/or an unusually high sense of empathy, because what you get is the "agh.... please.... stop binging google to get to your gmail....." feeling applied to the way someone has chosen to find deep meaning and purpose-- and ultimately to the lives of people who genuinely do need help.
Justifying yourself does not usually lead to a productive discussion, unless you're justifying yourself to someone with power over you as a way to avoid punishment (and EVEN THEN). If you've started to justify yourself, it's no longer a rational debate. You're not talking to the other person, you're protecting yourself from them. No one is winning. No one is going to come out of this feeling good.
I don't know if there's a solution to this. It's probably inevitable, to some extent, just because people who identify strongly with their value system are going to take any attack on that as an attack on *them*, and at that point they're pretty much going to feel like they have to justify themselves.
But it's something to keep in mind.
no subject
Date: 2018-12-21 11:26 am (UTC)

no subject
Date: 2018-12-21 05:51 pm (UTC)

I don't think rationalists are the only systematizing nerdy types. But in this specific example-- people on my tumblr dash who are part of the rat-adjacent bubble-- I've seen this dynamic play out a couple times, and generally the rationalist in question is a highly systematizing enthusiastic nerdy type and the not-a-rationalist is someone who's not nearly as systematizing (though still pretty darn nerdy).
no subject
Date: 2018-12-21 09:24 pm (UTC)

no subject
Date: 2018-12-22 06:46 am (UTC)

no subject
Date: 2018-12-22 05:40 pm (UTC)

I do think the interpersonal dynamic is definitely A Thing though.
no subject
Date: 2018-12-22 03:14 am (UTC)

I do feel like, in addition to rationalists being on the analytical side, there's a tendency to downplay non-rationalists' analyticalness and play up their own when it functions as a kind of identity marker. But in favor of that I don't think I have anything better than intuitive first-person impressions of my own-- and of course the interactions we're exposed to could be quite different.
And nothing wrong with flippant references! Or at least I should hope not.
no subject
Date: 2018-12-21 06:59 pm (UTC)

So I'm in a very traditional nonprofit field, and have come up through other nonprofit fields, including health work and higher-ed work. And I guess at the end of the day I think that people engage with altruism of any kind when it provides some kind of emotional payoff that keeps them involved:
1) the shivery joy of optimizing correctly (EA, but also other kinds of money-and-management altruism)
2) personal, affective-empathetic, social urgency (direct services, direct action)
3) frankly, social rewards for being impressive and competent in an approved-of and pro-social direction (leadership tracks in mid-tier and national NGOs, for example).
I work in a somewhat chilly position within a fairly huggy field, and I'm where I am because I absolutely can't keep my brain engaged with work if the end goal isn't compelling for me. A lot of people I know in more direct-services, crisis-driven fields work in them because they are at their highest level of functioning in situations of that kind of urgency.
Meanwhile, there's absolutely a secondary thread where putting money before social good is strongly gendered. I understand that people of different genders get criticized for being "greedy", but from within a traditional nonprofit field, it's much more likely that ambitious and very competent women end up permanently in nonprofit management because of social pressure instead of preference. So that's a situation where people MIGHT be more effective altruists if they did effective altruism-- their choice of jobs is stymied for reasons other than "they function best in mission-driven jobs".
no subject
Date: 2018-12-22 02:46 pm (UTC)

I'm also glad you wrote this, because it made me think about my views and figure out how to phrase some things I hadn't pinned down before. I appreciate that.
no subject
Date: 2018-12-22 09:54 pm (UTC)

“And I can't help the poor if I'm one of them
So I got rich and gave back, to me that's the win/win”
—Jay-Z, “Moment of Clarity”

So it’s definitely possible to do this without turning it into an argument.
+1
Date: 2018-12-23 03:18 am (UTC)