Who's to Blame? Human or Machine?

In an article featured in The Guardian a few years ago, author Tom Chatfield states, "Meaningful collaboration between people and machines must not subvert human creativity, feeling, and questioning over speed, profit, and efficiency." This sentiment has been echoed over the past several years and is getting even louder.

Google has made blunders before, for instance when it released a new photo app in 2015 that automatically tagged black people as gorillas. Google was quick to blame the problem on its algorithms and removed the "gorilla" category.

The most recent controversy has been about an app, released in 2016, that matches selfies to works of art. Once again, Google's app is having a hard time with images of people of color. Much of the conversation has focused on the museums and organizations Google has chosen to partner with and what their collections hold, or rather what they lack. Others, like Joy Buolamwini, a researcher at the MIT Media Lab and founder of the Algorithmic Justice League, feel that the heart of the problem lies in the teams of mostly white engineers who create facial recognition algorithms based on their own experiences.

Neither of these arguments speaks to what some feel is the real issue: the intertwining of human and machine relationships, and how we as humans view our roles alongside these machines. In the same Guardian article, Chatfield writes, "We think of ourselves as individual, rational minds, and describe our relationships with technology on this basis." But we do not have as much individual freedom and autonomy as we think; we are interdependent, relying on our devices far more than we would like to admit.

The same logic behind Google's algorithms is at work in all aspects of our lives: cars that drive themselves, medical procedures that don't require a physician. The problem is that technology is Darwinian, and data and performance drive where our culture is headed.

Argodesign's Mark Rolston recently wrote an article for Co.Design that points designers to doctors as a model for adopting an ethical code. While I agree that ethics are called for, I don't know whether any profession has yet created a model that deals with the issues ahead of us.

ProPublica is one organization actively working to change things, as Katharine Schwab reports for Co.Design. Led by Pulitzer Prize-winning reporter Julia Angwin and dedicated to investigating algorithms that impact people's lives, its team has ended up building algorithms of its own in order to hold big tech companies accountable.

Google's Arts & Culture app may not seem very serious today, especially to those who don't bother with such silly things on social media sites.
However, it may very well be foreshadowing a future where algorithms and data control our world more than we do.

Sources:
http://digg.com/2018/google-arts-culture-racist-face
https://www.fastcodesign.com/90159804/what-designers-could-learn-from-lawyers-doctors-and-priests
https://www.theguardian.com/technology/2016/jan/20/humans-machines-technology-digital-age
http://bgr.com/2018/01/17/google-art-selfie-viral-app-privacy-racism/
https://www.bustle.com/p/googles-arts-culture-app-is-being-called-racist-but-the-problem-goes-beyond-the-actual-app-7929384
https://www.fastcodesign.com/90160486/how-propublica-became-big-techs-scariest-watchdog
