Cultural Relativism and Moral Authority

Linas Vepstas
Jul 22, 2018

A short essay inspired by discussions of Jonathan Haidt’s book “The Righteous Mind: Why Good People Are Divided by Politics and Religion” (2012).

Those who studied science in high school may recall learning how to fit a straight line through a set of points that were not so straight. This was called the “least squares fit”.

If you have multiple dimensions, the name changes to “linear regression”, but the idea doesn’t. Of course, trying to explain complicated things with straight lines or planes sometimes isn’t good enough, so the machine learning guys have developed all sorts of ways of trying to capture the shape of curvy, convoluted things that twist around and fold back.
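For concreteness, here’s a minimal sketch of the idea in Python with numpy; the data and numbers are made up purely for illustration:

```python
import numpy as np

# Made-up, not-so-straight points: roughly y = 2x + 1, plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 20)
y = 2.0 * x + 1.0 + rng.normal(scale=1.5, size=x.shape)

# Least-squares fit: choose slope m and intercept b to minimize
# the sum of squared errors, sum((y - (m*x + b))**2).
A = np.column_stack([x, np.ones_like(x)])   # design matrix: [x, 1]
(m, b), *_ = np.linalg.lstsq(A, y, rcond=None)

print(f"fitted line: y = {m:.2f} x + {b:.2f}")
```

Stack more feature columns into A and the very same lstsq call does multi-dimensional linear regression; nothing about the idea changes, it just stops being a picture you can draw.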

Whatever it is that these algos do, they all tend to smooth things out, gloss over details, misclassify the exceptions, mistreat anything unusual or out of the ordinary in their algorithmic rush to create a simple model that fits the data. And most of the time, that’s OK.

But sometimes, the data really really does curve and jump around in an important way, and the machine-learning algo gets it wrong. You need a human to look at that, to spot it, to fix it. You need a human rider to ride herd on these big lumbering compute clouds, gray and four-legged.

I propose that the human brain is like that: what it can do quickly is done so often that it’s a habit. Once it’s a habit, we stop training, we stop attempting to do better. I learned how to walk when I was two. I practiced running for the next 10 years. Then I stopped. Every now and then, the autonomous, subconscious, automatic foot-placement algorithm gets it wrong, and the elephant-rider is just scrambling for dear life, trying to avoid a face-plant.

I propose that everything, just about everything we do, is initially under conscious control. We train, until we can do it automatically. When we do it automatically, it’s fast, efficient, easy, thoughtless. We stop training. Problem is, the fast, automatic circuits smooth over certain details, gloss over certain distinctions that are important. If I can’t play the piano, it’s only because my fast, automatic finger-movement neural circuits are glossing over minor details of timing, pressure, speed and position.

I propose that morality is like that. In childhood, you learn certain moral codes, and you stick to them. Heck, maybe there’s even a genetic component — gazelles walk from the minute they’re born, with a minimum of training.

But just because your moral judgments are automatic, unconscious, doesn’t mean that you stop there. You can train these too, and learn to be more subtle, more accurate, distinguish more carefully, and move more precisely. This is what WEIRD (“Western, Educated, Industrialized, Rich, Democratic”) culture shows you how to do. Colleges famously teach you “how to think”. College students can think in articulate ways about dead chickens. The less-educated, less practiced, less thoughtful can only react in an automatic fashion.

And so we come to cultural relativism. If a whole culture or society is running on automatic, taking the ingrained, knee-jerk reaction to any given moral dilemma, well, I don’t think that necessarily implies that the society or culture is right, that it’s doing the right thing. Maybe they just never bothered to think about it deeply. Maybe the decisions they are collectively making really are wrong, really are incorrect. (footnote 1)

Great. Now what? I mostly don’t go around criticizing how people walk, because, umm, well, “boundaries”. You can’t just say whatever you want to anyone and have it go over well. You have to pick your words wisely, or possibly stay silent. Using blunt words to change someone’s mind is like using a hammer to fix your plumbing. Shanta Stevens has a story about a man who can talk racists into being not racists. Funny thing, he doesn’t have any stories about men who can talk non-racists into being racists. Those stories are morally repugnant.

So, to what degree should we be culturally relative, and to what degree should we make informed, conscious choices and decisions about the correct manner of behaving? Is it OK for one culture or subculture to tell another one that they are wrong?

So, Haidt has done something interesting: he’s identified certain subconscious, automatic moral circuits commonly shared by all humans. He’s come up with the metaphor of the elephant and the rider. What he doesn’t say is that maybe, just maybe, some elephants are rampaging out of control, and some riders are doing a really terrible job of steering. Yet, socially, culturally, this is what we are dealing with. Not all moral judgments are created equal, and not all moral judgments should be accepted or tolerated, just because some large segment of society is down with it. One can do better.

— — — —

footnote 1: And maybe some subcultures have actually thought about it more deeply and thoroughly. Which is why subcultures can be interesting. Especially when their story is told with empathy and grace, such as by vice.com.
