We judge intents, actions, or consequences, not necessarily people — the actors — as a whole. For example, in the Lojban language it’s not grammatically possible to say someone is crazy, only that a state, action or quality is crazy.
But as for the judging itself, that’s the easy part.
It’s just turtles all the way down. Recursive ethics!
If I think a particular intent/action/consequence is similar to another intent/action/consequence, and I think the second i/a/c is bad, then I probably also think the first i/a/c is bad (to the extent that they are similar).
Now the question becomes whether the second i/a/c is bad. Well, to the extent that it is similar to a third i/a/c that I think is bad, it is.
Soon enough I’ll reach an i/a/c that I either feel pretty good about (an act of kindness) or that feels like an absolute gut punch to even think about.
Problem solved by feels over reals.♥︎♥︎♥︎♥︎
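If you want to see the shape of that recursion as code, here’s a toy sketch in Python. The names (judge, gut_reaction, similarity) are stand-ins I made up; the point is only the structure: recurse on similar precedents, and stop when the gut speaks loudly enough.

```python
# Toy sketch of the "recursive ethics" rule of thumb above.
# gut_reaction and similarity are hypothetical stand-ins, not a real moral calculus.

def judge(iac, precedents, gut_reaction, similarity):
    """Return a badness score in [-1.0, 1.0] for an intent/action/consequence.

    Base case: a strong gut feeling (an act of kindness, or a gut punch).
    Recursive case: average the judgments of similar precedents,
    weighted by how similar they are.
    """
    feeling = gut_reaction(iac)  # -1.0 = clearly good .. +1.0 = gut punch
    if abs(feeling) > 0.8 or not precedents:
        return feeling  # strong feeling (or nothing left to compare): stop recursing

    weighted_sum = 0.0
    total_weight = 0.0
    for other in precedents:
        w = similarity(iac, other)  # 0.0 = unrelated .. 1.0 = basically the same
        if w > 0.0:
            # Judge the precedent itself, against the remaining precedents,
            # so the recursion always terminates.
            rest = [p for p in precedents if p is not other]
            weighted_sum += w * judge(other, rest, gut_reaction, similarity)
            total_weight += w
    return weighted_sum / total_weight if total_weight else feeling
```

Of course, similarity and gut_reaction are exactly the fuzzy, emotional judgments the formula can’t actually replace, which is the whole point.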
When we don’t use emotions, we are only using half our brains.
(Which is also true for only using our emotions.)
When you subscribe to existentialist ethics (and I do need to read The Ethics of Ambiguity), every choice is a choice. We can’t rely on a sacred text or a trolley problem math formula. We are responsible for everything we do.
Now, of course it’s a good idea to have some rules of thumb within that. If I had to decide every morning whether it’s a good idea to open my eyes that morning, I’d get nowhere. Instead, I am allowed to set up something like “OK, unless circumstances really change, I’m allowed to open my eyes every morning for the next six months, and then it’s time to reconsider.”
“Recursive ethics” is one such rule of thumb. It’s not the be-all and end-all, but it handles a lot of situations pretty well.
My personal belief is that human morality and ethics are just as complex as language. They evolved to allow us to deal with complicated social interactions. We have been arguing over formalism, and the history of Moral Philosophy is littered with incomplete, dissatisfying theories that have strengths in some artificial cases and weaknesses in day-to-day use.
I agree with this analogy.
A simple formula can’t substitute for intuition, pattern recognition, gut feel, emotion. I’m not saying computers can never do those things—witness AlphaGo or Waifu2X—but they’re not using simple heuristics to do it.