Now, increasingly, it is not humans telling us what to do and what not to do. Machines do that. Algorithms are the enforcers. It scares the bejesus out of me.
It is not that I don't care for the comforts that these machines provide. It is awesome that, for instance, the heater kicks in automatically at five in the morning and warms up the home before I get out of bed. But I worry that most of us are mindlessly yielding to computers.
As we transfer agency to computers and software, we also begin to cede control over our desires and decisions.
That loss of agency worries me. It has always worried me. Agency is what I have always urged my students to cultivate too. I have even semi-seriously joked that my view of college education is to make sure we don't turn students into automatons, and that I want to make sure they can think.
But automatons we are rapidly becoming.
Already, many people have learned to defer to algorithms in choosing which film to watch, which meal to cook, which news to follow, even which person to date. (Why think when you can click?) By ceding such choices to outsiders, we inevitably open ourselves to manipulation. Given that the design and workings of algorithms are almost always hidden from us, it can be difficult, if not impossible, to know whether the choices being made on our behalf reflect our own interests or those of corporations, governments, and other outside parties.
I feel like it is a lost cause. The battle, the war, has been lost. A few of us jumping up and down shouting about this won't matter--most are not even listening to us. Heck, they don't even know we exist!
I often argue, as I did even yesterday, that it is all about making conscious decisions about "the trade-offs inherent in offloading tasks and decisions to computers." Agency. "If we don’t accept that responsibility, we risk becoming means to others’ ends."
We humans are becoming more and more like machines. We are becoming robotic, even as robots are getting better and better. And that means:
it will be increasingly impossible to distinguish between humans and robots because of our machine-like behavior as much as robots’ human-like features. And could this eventually become the norm, with humans spending their entire lives acting like machines?
I am sure that even now many amongst us would fail the "Voight-Kampff test," which in Blade Runner "assesses capacity for empathy, a human facility that even the most intelligent androids lack." How else can one explain the election of a man completely devoid of empathy as President of the US!