The issue with #AmazonFail isn’t that a French employee pressed the wrong button, or could affect the system by changing “false” to “true” in the filtering of certain items classified as “adult”; it’s that Amazon’s system has assumptions built into it, such as: sexual orientation is part of “adult,” and “gay” is part of “adult.” In other words, #AmazonFail is about the subconscious assumptions of people built into algorithms and classification systems that contain discriminatory ideas. When other employees use the system, whether or not they agree with the underlying assumptions of the algorithms and classification system, or even realize the system has these points of view built in, they can put those assumptions into force, as the Amazon France employee apparently did, according to Amazon.
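To make the mechanism concrete, here is a minimal sketch of how that could happen. This is an illustration only, not Amazon’s actual code: the taxonomy, the function names, and the exclude_adult flag are all invented for the example.

```python
# Hypothetical sketch of a catalog filter. The discriminatory assumption
# lives in the taxonomy, not in the flag an operator flips.

# Assumption baked in by the system's designers: these subject categories
# are treated as subcategories of "adult". (Invented for illustration.)
ADULT_SUBCATEGORIES = {"erotica", "sexuality", "gay and lesbian"}

def is_adult(book_categories):
    """A book counts as 'adult' if any of its categories falls under the
    designers' notion of 'adult'."""
    return any(cat in ADULT_SUBCATEGORIES for cat in book_categories)

def visible_in_search(book_categories, exclude_adult):
    """One boolean, flipped from False to True, silently hides everything
    the taxonomy labels 'adult', whether or not the person flipping it
    knows what that label covers."""
    return not (exclude_adult and is_adult(book_categories))

# A memoir shelved under "gay and lesbian" vanishes the moment the
# filter is switched on.
print(visible_in_search({"memoir", "gay and lesbian"}, exclude_adult=False))  # True
print(visible_in_search({"memoir", "gay and lesbian"}, exclude_adult=True))   # False
```

The point of the sketch is that the employee only touches the boolean; the judgment that “gay” belongs under “adult” was made earlier, by whoever built the classification.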
As Hodder observes,
The ethical issue with algorithms and information systems generally is that they make choices about what information to use, display, or hide, and this makes them very powerful. These choices are never made in a vacuum and reflect both the conscious and subconscious assumptions and ideas of their creators.
The ethics bar in creating algorithms and classification systems should be very high. In fact, I would suggest that companies with power in the marketplace, of both commerce and ideas, should consider outside review of these systems’ assumptions and points of view so that the results are fair.
Algorithms are often invisible, and difficult to detect by design, because technologies that use them are built not to share their methods for providing information. This is partly because users are focused on the task at hand, and, given good information, don’t need to know everything under the system’s hood, and partly because technology makers like to keep the “secret sauce” hidden from competitors, not to mention from people who would game the system for their own ends, such as spammers and other bad actors.
A post on the Equal Justice Society blog extends Hodder's discussion: How the Amazon "Glitch" Relates to Structural Discrimination and Racism. The author of the post, Keith, writes:
In both the Amazon glitch and structural social discrimination, the impact of system-driven automatic choices is often irrefutable: a category of books and a category of people suffer from discrimination that has a clear negative impact on their opportunity to succeed.
In both cases, the causes of the problem are constructs - one technological, one sociological - creations by human beings that have no inherent malice, but that result in discrimination because bias seeds the way the systems make choices.
Some of the reactions to Hodder’s analysis also sing the same tunes as those we hear when we present the notion that unconscious bias, even in the absence of conscious discrimination, impedes opportunity.
(Links thanks to Michelle Murrain.)
And by the way, just so you know: the Aqueduct book struck by #AmazonFail, Centuries Ago and Very Fast, is now available at Powells.com.
Timmi - Thanks so much for linking to my post. I didn't think it worthy of being mentioned on someone else's blog, but appreciate it!
I think we've only scratched the surface of #amazonfail ...
If Amazon hadn't over-extended their paradigm, I wouldn't have suspected that Centuries Ago and Very Fast wasn't included with my other books deliberately.
Rats! I was planning to make a pretty big order from Amazon, and now I can't.