Commentary

Racism In Our Algorithms -- And In Ourselves

As soon as I gunned it through the intersection -- which was yellow-turning-red if I’m being generous -- I knew I was in trouble. Sure enough: the lights went on, the siren sounded, and the cop swung around the corner to pull up behind me.

I pulled over, put the car in park, and lowered the window. My wallet was in my backpack, tucked behind the passenger seat. I reached back to get it. My hand was in the bag when the officer arrived at my door.

“Left that one a little late, didn’t you?” he asked.

“I guess I did,” I replied.

“Got your driver’s license?”

“Sure do. I’m just getting it from my bag.” I pulled out my wallet, got the license out, and gave it to him.

He entered it into his system and asked me a few more questions. Where was I coming from, where was I going, what’s my address, had I had anything to drink?

“Well,” he said, finally, “you seem like a reasonable person. So I’m just going to give you a warning. Be a bit more careful next time.” I thanked him and drove away.

Part of me was grateful. But part of me was thinking about how different my experience could have been: if I didn’t live in New Zealand, if I weren’t so visibly middle-class, if I weren’t so white.

I had been thinking about this from the moment he pulled me over and I reached behind the seat for my wallet. If I were a person of color in the U.S., there is no way I would have had my hands anywhere other than where the cop could see them.

I had also been thinking about it since the week before, when I took some of the Harvard Implicit Association Tests. These short, free tests measure some of our automatic associations based on traits like race, gender, age, and physical ability.

So far, I’ve learned that I have a moderate preference for white people over black people, a moderate preference for cis people over trans people, and no preference either way for straight people over gay people.

I’m not surprised by the results, but I’m saddened by them. I have no desire to have a moderate preference for white people over black people. But it’s there. And if I can’t admit it, I will never be able to address it. The same goes for us as a society: if we can’t admit that we carry an array of biases that are both profound and unexamined, we will never be able to address them.

These biases are the reason our technology ends up biased. Perhaps some coders intentionally make their algorithms discriminatory, but many more simply fail to address the profound and unexamined biases that are all around us: developers who don’t consider that their image database comprises mostly white people, or data scientists who overlook that medical trials have historically included only men. A couple of years ago, Robyn Speer showed how training a sentiment classifier on standard, off-the-shelf datasets turns it racist.

You seem like a reasonable person.

I don’t think my cop was racist. I’m sure I seemed reasonable because I spoke to him nicely, accepted I was wrong, didn’t argue, wasn’t drunk or belligerent.

But I also feel confident that it was easier for me to seem reasonable because I fit nicely into his automatic preference system.

We shouldn’t be building algorithms to reinforce our historic prejudices. Our algorithms should counter our biases, not amplify them. But the only way that happens is if we are all willing to acknowledge we have them. I know it’s scary and hard to admit. But I’m willing. Are you?
