Technical Bias

The most popular post on this site is “Emergent Biases,” where I talk about Tay, Microsoft’s AI chatbot, which went quite sour as she learned neo-Nazism and misogyny from Twitter users.

Due to the post’s popularity, I will be doing a brief series on biases in computer systems, based on Batya Friedman and Helen Nissenbaum’s “Bias in Computer Systems.”  They define three types of bias: pre-existing biases, technical biases, and emergent biases.  This post details one of them.

I find all biases in computer systems interesting.  Friedman and Nissenbaum only posit three…so it’s hard to choose which one I find most interesting.  But for today, let’s say it’s technical biases.  Technical biases are difficult for me to wrap my head around: it’s hard for me to think of solutions to them.  This is not because it should be difficult, but because I have lived so long with certain technologies just being “that way” that it’s hard to imagine a radically different technology doing the same thing.

[Image: Victor Index Typewriter, circa 1891. From https://www.collectorsweekly.com/stories/33818-victor-index-typewriter-circa-1891]

(Think of it like this.  I’m so used to keyboards for typing.  But typing didn’t have to include keyboards.  Typewriters didn’t start off with the QWERTY or AZERTY layout as standard.  In fact, there are these things called index typewriters that use a plate and a needle to type.  So, if someone asked me to develop a typing technology for Klingon, I would immediately think of a keyboard…but that might not actually be the best technology.)

Friedman and Nissenbaum define technical biases as resulting from “issues in the technical design…including limitations of computer tools such as hardware, software, and peripherals” (335).

In their article, Friedman and Nissenbaum note that screen sizes limit how much information can be displayed at once.  So, software developers had to find a way to present all of the available information readably within the limits of the screen.  They developed pages.

Think of your Google results.  You don’t see all 3,908,393 results on your screen.  First, you scroll down to see more results and, eventually, you have to click to see the next page of results.
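
To make the idea concrete, here is a minimal sketch of what pagination amounts to.  This is just an illustration, not how any search engine actually does it, and the page size of ten results is my own assumption: the software never hands you the whole list, only a slice of it.

```python
def paginate(results, page, page_size=10):
    """Return only the slice of results that fits on the requested page."""
    start = (page - 1) * page_size
    return results[start:start + page_size]

# Toy data standing in for the millions of results a real query returns.
all_results = [f"result {i}" for i in range(1, 101)]

# Page 1 is the only slice most people ever see.
print(paginate(all_results, page=1))
```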

The pages themselves aren’t biased; they’re really just a practical reality.  What is a technical bias is the Google algorithm (a black box whose workings we don’t know).  It ranks results and then displays them.  We can see some biases immediately: advertisements are filtered into the top two slots of the results.  And then there are the non-ads, ordered in some unknown manner.  Few of us even scroll to the end of the first page, let alone go on to the next page of results.  So those first few results are crucial, because that’s the information most people will access.  And we know Google’s system is deciding what that information is.  So we know it’s biased, but we don’t know how.
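
We can’t see inside that black box, but a toy sketch makes the point.  Everything below is an assumption for illustration only: the scoring function (here, just string length), the two ad slots, and the ten-result page are stand-ins for whatever hidden rules the real ranker uses.  The structure, not the details, is what matters: some opaque ordering, plus a rule that promotes ads, decides the handful of results most people ever see.

```python
def rank(results, score):
    """Order organic results by some opaque scoring function."""
    return sorted(results, key=score, reverse=True)

def build_first_page(ads, organic, score, slots=10):
    """Ads take the top slots; the scored organic results fill the rest."""
    return (ads + rank(organic, score))[:slots]

# Hypothetical data; len() stands in for the real ranker's unknown signals.
ads = ["sponsored result A", "sponsored result B"]
organic = ["a short page", "a much longer, more detailed page", "another page"]
print(build_first_page(ads, organic, score=len))
```

Whatever you swap in for that scoring function, the first page it produces becomes the information most people get.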

This is a technical bias, which arose from a limitation in typical hardware…and from the code.
