This will be a short post, I hope. I don't have time at the dog-end of the day for a long one.
I read Eliezer Yudkowsky's "Torture vs. Dust Specks" for the first time today. It's an old post and I never really followed Less Wrong, though I was very briefly entranced by its premise in my 20s and knew people who were involved in that community. The Mastodon post that directed me to it intimated that Luigi Mangione had been influenced by this idea. That's neither here nor there for me, honestly.
In my 2024-11-08 post I think I made it clear that equity is an important element of any ethical system, which is part of why I personally consider utilitarianism to really only apply in situations where some sort of equity already holds. I.e., yes, one innocent dying is better than one hundred innocents dying.
Yudkowsky does a good job of illustrating unregulated utilitarianism's limits when he reveals that he believes there's some number of people experiencing a barely perceptible, momentary irritation beyond which it's justified to horribly torture one person for 50 years in order to spare them that irritation. I didn't believe the Less Wrong folks were barking up the right tree all those years ago, and I don't believe it now.
From my perspective: reason is a dogged servant of axioms. Reason does and will only ever serve the irrational driving forces at all of our cores that get us up in the morning and make most of us at least uncomfortable in the face of injustice. You can absolutely construct a beautiful, internally consistent system of beliefs that is also wholly disconnected from reality; from the little glimpses I've had, I have the impression that the Less Wrong community has been constructing a beautiful edifice of reason that completely misunderstands the role of ethics. As I see it, ethics is a formal mapping of moral sense in the same way that, e.g., physics is a formal mapping of physical phenomena. If your physics contradicts observation, you have most likely fucked up somewhere. If your ethical system violates the moral sense of most people, you have most likely fucked up somewhere.
This is difficult because "moral sense" gets mixed up with "cultural conditioning" and "religious indoctrination" and other forms of "self-delusion for the sake of inclusion" for a lot of people, and it can seem impossible to disentangle them. Reason may often be the only way to cut through that kind of bullshit. The hope of these groups (because Less Wrong and its Rationalist followers join a very long tradition of extrapolating confidently from reasonable-sounding premises to arrive at unreasonable conclusions without going, "wait a minute, have I reductio ad absurdumed myself?") is to cut through those sorts of "false selves" to arrive at some sort of universal ethical system. This is a noble project, though my suspicion is that they may be in error to presuppose such a thing can be constructed any more than, e.g., a universal measure of musical merit can. At the very least, they oughta listen to their guts a little more often and revisit their premises, because their conclusions can only ever be as good as those. And what good is an ethical system anyway if it leads you to violate your own moral sense? From my perspective, the whole point of constructing an ethical system is to codify one's moral sense, not to supplant it or substitute for the lack of one.