Monday 14 November 2022

Ours is a High and Lonely Destiny: overcoming disingenuous cant from the high priest of Effective Altruism

As I have said before, I have a great deal of respect for the ideals of Effective Altruism (EA). But this Twitter thread about EA from Will MacAskill is weak and pathetic.

MacAskill is "known for being a leader, perhaps the intellectual leader, of the effective altruism movement", according to Tyler Cowen. He recently published a book called What We Owe the Future, which you may have seen reviewed or perhaps even read. He's kind of a big deal.

His thread was prompted by the collapse of FTX, the something-to-do-with-crypto-currency set of companies run by Sam Bankman-Fried (SBF). SBF (reportedly) was a billionaire and (definitely) was a big supporter of EA. FTX's collapse was sudden and there are murky aspects to it: since I couldn't tell you what FTX was meant to do if it was running well, I can't tell you whether SBF was running it badly; no doubt all will become clear in due course.

Anyway, MacAskill appears to consider the possibility that SBF's many generous donations to EA causes were funded by fraud. MacAskill mounts the highest horse available to the EA community and expresses outrage at the very idea of such a thing. He says that he wants to make it "utterly clear" (normal clarity being insufficient) that "if those involved deceived others and engaged in fraud (whether illegal or not) that may cost many thousands of people their savings, they entirely abandoned the principles of the effective altruism community" because "clear-thinking EA should strongly oppose “ends justify the means” reasoning". In a development that surprised me (and I suspect many others), MacAskill tells us that "the EA community has emphasised the importance of ... the respect of common-sense moral constraints", indeed that "we do not see ourselves as above common-sense ethical norms". 

Now it's my turn to be utterly clear: this is disingenuous nonsense. It's cant. Why do I say that? Because the whole idea of EA is to step above common-sense ethical norms.

The starting point for EA is the insight, to adapt Peter Singer's example, that it should not matter whether the child I help, at minimal cost to myself, is the child physically in front of me, drowning in a small pond, who needs me to wade in and ruin a pair of shoes, or the victim of famine on the other side of the world who needs me to donate money equivalent to the cost of a pair of shoes. A dying child is a dying child.

But of course this sort of distinction does matter to common sense morality: one has some kind of responsibility to the people one might see drowning while out on a stroll in the countryside that one does not have to the nameless multitudes overseas. The man who passed by on the other side and left a child to drown would be hated and ostracised by his common-sensical peers, while those of us who spend our money on shoes rather than charitable donations are still considered decent citizens in good standing. That's just how common sense morality works.

The world of EA is full of attempts to cast off the shackles of common sense morality. Take this article (funded by some emanation of FTX, I now see), which links to a piece explaining that it's a good idea for doctors to kill healthy patients and use their organs to save others, or this piece, arguing that we should "herbivorize" predators to avoid wild animal suffering. Maybe MacAskill himself is arguing, properly understood, that the comfortably-off are morally obliged to have children? All a little at odds, shall we say, with common sense morality.

Perhaps the most ambitious EA or EA-adjacent thought programmes at the moment are the 'longtermist' ones, i.e., the ones that look ahead to humanity's potential future as a species numbering in the trillions, inhabiting alien realms across the observable universe. Common sense morality probably tells us that we shouldn't spend much time worrying about these unborn trillions when there are plenty of problems in the here and now.

So much the worse for common sense morality, surely? That's what the EA-er should say. But MacAskill doesn't. Instead he weasels his way around the issue with some quotations from his book: he tells us that "naive calculations" that the good will outweigh the harm are "almost never correct", without talking about more sophisticated calculations that are correct; that violating rights is "almost never" the best way to bring about good results (almost?); and that the ends "do not always" justify the means (not always - but sometimes? 5% of the time? 49%?). That kind of hedging might be right, but it doesn't support the absolute prohibitions declaimed in high dudgeon in MacAskill's Twitter thread.

Stop being pathetic, say I. Have the courage of your convictions, MacAskill! Sometimes your moral calculations lead you to endorse actions that common sense regards as anywhere on a spectrum from stupid to evil. Own it! Proclaim it out loud!

There are plenty of decent arguments MacAskill could use. Why should he be the person who says, "well, I understand the arguments for the abolition of slavery but it will almost never be right to disregard property rights?" Disregard away! Can't he say: "if we stuck to common sense morality then a woman's place would still be in the home?" Or: "don't let petty bourgeois common sense morality stop you from considering the big picture." If we don't come to believe that we are wrong about some things that now seem to be common sense then we are not making moral progress.

Let's return to our well-intentioned entrepreneur setting out to save the world. Would it be wrong for him to steal $10 from the petty cash if he knew that that money would save a billion people from devastation and suffering? Of course not! What EA-er could object? But now we're just haggling over the amounts. If you're consistent about these matters then you have to be open to the possibility that someone can abscond with some crypto-currency (whatever that is) and use it to do good - and that he would be justified in doing so. (Since I started to write this, Tyler Cowen and his dark alter ego Tyrone have made similar points.)

I said above that the EA-er "should" say "so much the worse for common sense morality". What kind of a "should" is that? From my point of view - a point of view not terribly far from common sense morality - it's a moral "should": people should be honest. But if you are a utilitarian then honesty is not an absolute value and you can, for example, happily embrace the Noble Lie. The man who says "honesty is the best policy" is not an honest man - a better policy may present itself, and the use of weasel words (e.g., "almost never") designed to ensure popular acceptance may well be such a policy.

Let me instead give MacAskill a prudential reason to embrace honesty here: he might convert people to his cause.

In The Magician's Nephew, my favourite of the Narnia books, two characters tell the eponymous nephew, Digory, that they are not bound by the constraints of common sense morality because "ours is a high and lonely destiny". First, there is Digory's Uncle Andrew:

"Men like me, who possess hidden wisdom, are freed from common rules just as we are cut off from common pleasures. Ours, my boy, is a high and lonely destiny."

Then there is Queen Jadis, who at this point has just told Digory how she killed the entire population of her world, save only herself:

"You must learn, child, that what would be wrong for you or for any of the common people is not wrong in a great Queen such as I. The weight of the world is on our shoulders. We must be freed from all rules. Ours is a high and lonely destiny."

Let's return to EA. What could be more fine or noble than to survey the whole world of creation, present and future - the dumb beasts subject to the depredations of predators; the sick, weak and defenceless poor of the wretched corners of the Earth; the uncountable trillions still to come, even unto the farthest stars and most distant galaxies - and then to consider, with that immense burden on one's shoulders and nothing but an Excel workbook before one's eyes, how the resources of mankind, real and crypto alike, should best be allocated among those fathomless chasms of need? Surely there is no higher or more lonely destiny? For the common people - little boys, servants and women - words like "fraud" and "lies" sound very big and scary, but in the immensity of the moral universe they may surely be no more than little bumps in a path that slopes to the highest and most sunlit of uplands.

Or perhaps you prefer William Roper's approach?

William Roper: “So, now you give the Devil the benefit of law!”

Sir Thomas More: “Yes! What would you do? Cut a great road through the law to get after the Devil?”

William Roper: “Yes, I'd cut down every law in England to do that!”


That's the spirit - don't let the devil hide behind mere man-made "laws" and "rights"!

Uncle Andrew is a weak and silly man, and yet he almost persuades Digory. Jadis is neither weak nor silly: she is magnificent, splendid and terrible, strong in mind and body. If Digory had not been inoculated against the message by having first received it from Uncle Andrew - or perhaps if he were more than merely a little boy - then surely he would have succumbed to the grandeur, breadth and daring of Queen Jadis' ideals.

I don't think MacAskill is seven feet tall nor, for all that his features have a certain boyish charm, would I call him dazzlingly beautiful. But these are minor matters. Once the right messenger is found, I feel sure that the world will come to see the superiority of the High and Lonely Destiny that MacAskill and the EA community urge upon us. 
