Allowing algorithms to curate our lives might have unintended consequences, an Australian researcher says.
Algorithms are used throughout our interactions with technology. They range from those Facebook uses to decide which friends we find most interesting, to Google Now, which reminds us of meetings we care so little about that we have forgotten them. Algorithms dictate the pictures we see in social media photo streams, as well as search results and the music we listen to on online radio stations.
To what extent this shaping of our lives is a potentially dangerous transfer of power is a question Michele Williams of Curtin University in Western Australia poses in a new paper published in Information, Communication & Society. The delegation is of particular concern when it happens without much of our conscious involvement.
Williams says that although algorithms are instigated and developed by humans, people lack the ability "to critique and guide many of the algorithmic processes." That's done by the big businesses with "which we increasingly interact."
She questions whether that’s a good idea.
Bias is obviously one potential issue, but there's another, less tangible one related to "particular ways of seeing the world," she says.
Algorithm-driven “stereotypes” could be one example of that, Williams says. Taste could be another. If an algorithm calculates that images in a photo library should look a certain way in order to appear at the top of a stream, then that’s the algorithm dictating taste.
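The mechanism Williams describes can be made concrete with a minimal sketch. The feature names and weights below are hypothetical, not any real service's formula; the point is that a fixed scoring function is the "taste" — the stream simply sorts by it.

```python
def score(photo, weights):
    """Weighted sum of image features; the weights themselves encode the taste."""
    return sum(weights[f] * photo.get(f, 0.0) for f in weights)

# Hypothetical features a ranking algorithm might favor.
WEIGHTS = {"sharpness": 0.5, "saturation": 0.3, "faces": 0.2}

photos = [
    {"id": "a", "sharpness": 0.9, "saturation": 0.2, "faces": 1.0},
    {"id": "b", "sharpness": 0.4, "saturation": 0.9, "faces": 0.0},
]

# Sort descending by score: whichever photo best matches the weights
# appears at the top of the stream.
ranked = sorted(photos, key=lambda p: score(p, WEIGHTS), reverse=True)
print([p["id"] for p in ranked])  # → ['a', 'b']
```

Change the weights and the "best" photo changes with them — which is exactly the sense in which the algorithm, not the viewer, dictates taste.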
But taste changes. And just how much should we depend on, say, Flickr, a Yahoo-owned picture gallery, to dictate taste? Can it keep up? One could argue that Yahoo's garish purple logo is an indication it doesn't have any taste in the first place.
How much can we trust algorithms?
"De gustibus non est disputandum," as the famous Latin phrase has it: in matters of taste, there can be no disputes. In other words, taste is entirely subjective. So is big business's taste, delivered via its algorithms, something we really want to buy into? Wouldn't an art gallery in a city's downtown regeneration project, say, be a better arbiter? Taste is its business, after all.
And in that case, are we not handing our hard-earned taste, derived through knowledge and experience, over to big corporations and data scientists, however well-meaning they may be?
Algorithms shape "our everyday practices and understandings," Williams explains. So just how much should we rely on them when we have little involvement in their design, and perhaps even less in the cultures and make-up of the companies that do the designing?
Even though a dynamic algorithm can learn what its human master wants to see through analysis of data that’s then added to the calculation, how much can it be trusted when we’re not involved in the formulae?
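The "dynamic" learning described above can be sketched as a toy feedback loop. The update rule and learning rate here are assumptions for illustration, not any real product's method: the algorithm nudges its feature weights toward items the user engages with, so the ranking gradually mirrors observed behavior without the user ever seeing or editing the formula.

```python
def update(weights, item_features, liked, lr=0.1):
    """Move weights toward a liked item, or away from a skipped one."""
    sign = 1.0 if liked else -1.0
    for feature, value in item_features.items():
        weights[feature] = weights.get(feature, 0.0) + sign * lr * value
    return weights

w = {"sharpness": 0.5, "saturation": 0.5}
# The user likes a sharp photo and skips a saturated one:
update(w, {"sharpness": 1.0}, liked=True)
update(w, {"saturation": 1.0}, liked=False)
print(w)  # sharpness weight rises to 0.6, saturation falls to 0.4
```

Even in this toy form, the user only supplies clicks; the formula that converts those clicks into a worldview remains entirely the algorithm's.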
In one case, Williams cites the example of algorithms that can tell you which selfies are no good and should thus be deleted, leaving the ones that make you look beautiful.
The algorithm that spots beauty carries the implication that "you are less than capable of doing the same thing," she says. Should one "trust the technology more than yourself to make these aesthetic choices?" is the question she pointedly poses.
This article is published as part of the IDG Contributor Network.