Grappling With Algorithms And Justice (Oh, The Humanity)
dogtrax, Kevin's Meandering Mind, Feb 21, 2020
Last night was the second online session of an inquiry group project called The Grapple Series — hosted by the National Writing Project, the Western Pennsylvania Writing Project, and the CMU CREATE Lab — that is looking at the impact of AI and technology on our lives. The theme last night was algorithms and justice, a pairing that made for interesting conversations about how blind trust in both often leads to disastrous consequences.

We explored some interesting reading and video pieces before gathering in our online session. The articles explored the issue from multiple angles, but the overall connecting concepts are clear: algorithms are created by people, and people have bias, so algorithms have bias, too — and when algorithms are embedded with bias, it impacts our notions of justice in the world.

Sometimes, this is literal — as in the case of computer software being used to determine the length of parole. Sometimes, it is more nuanced — the way search engines bring racial stereotypes to the surface. Sometimes, it is not yet known — the way facial recognition is changing our sense of privacy in the public sphere.

The Grapple gathering began with a large-group discussion and writing about justice and algorithms, and then broke into smaller groups, where we engaged in deeper debate about the role of algorithms in society.

We also teamed up to create our own paper “algorithm” for fighting off the common cold, and while our group went a sort of silly route (Should a teacher call in sick or not?), the short flowchart activity reminded us how often we can fall into Yes/No binary decisions that leave the human element out. Another small group did integrate ideas of humanity into their algorithm, and I found that quite interesting.
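A paper flowchart like ours translates directly into nested Yes/No checks in code, which makes the flattening effect easy to see. Here is a minimal sketch in Python — the questions, inputs, and outcomes are invented for illustration, not our group's actual flowchart:

```python
def should_call_in_sick(has_fever: bool, is_contagious: bool) -> str:
    """A toy Yes/No flowchart: should a teacher call in sick?

    Every branch is a hard binary. There is no input for how bad you
    actually feel, who covers your class, or what your students need —
    the human context the flowchart format squeezes out.
    """
    if has_fever:
        return "stay home"
    if is_contagious:
        return "stay home"
    return "go to work"

print(should_call_in_sick(has_fever=True, is_contagious=False))   # stay home
print(should_call_in_sick(has_fever=False, is_contagious=False))  # go to work
```

Notice that the function can only ever ask the questions it was written with; anything the author didn't anticipate simply isn't part of the decision — which was the point of the activity.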

I appreciate being able to work through and “grapple” with these complex questions rippling through society. There is no real solution — the algorithmic genie is long gone from its bottle. But we can be aware, and make some decisions about what information we share and how we are being manipulated by technology.

Here are resources shared before our session, if you are interested:

Peace (ain’t no code for that),
Kevin