Wednesday, March 01, 2017

From programming computers to programming people

In some of the dystopian portrayals of the future, inequality widens so much that the have-nots are left way behind.  Sometimes, the rich and the affluent are on an entirely different planet--literally.  Other times, the underclass is underground--literally.

Given the rate at which technology is progressing, I won't be surprised if that is exactly how the future unfolds.  (Though, thankfully, I am confident that I won't be around for that dystopia!)

Which is why reading something like this in Scientific American becomes more fodder for this worry-wart.  I mean, consider the following sentences, for instance:
One thing is clear: the way in which we organize the economy and society will change fundamentally. We are experiencing the largest transformation since the end of the Second World War; after the automation of production and the creation of self-driving cars the automation of society is next. With this, society is at a crossroads, which promises great opportunities, but also considerable risks. If we take the wrong decisions it could threaten our greatest historical achievements.
Oh, hey, have a nice day!

The automation of society is next.  Ouch!
Everything started quite harmlessly. Search engines and recommendation platforms began to offer us personalised suggestions for products and services. This information is based on personal and meta-data that has been gathered from previous searches, purchases and mobility behaviour, as well as social interactions. While officially, the identity of the user is protected, it can, in practice, be inferred quite easily. Today, algorithms know pretty well what we do, what we think and how we feel—possibly even better than our friends and family or even ourselves. Often the recommendations we are offered fit so well that the resulting decisions feel as if they were our own, even though they are actually not our decisions. In fact, we are being remotely controlled ever more successfully in this manner. The more is known about us, the less likely our choices are to be free and not predetermined by others.
And you thought trump was the biggest problem!
we urgently need to impose high standards, especially scientific quality criteria and a code of conduct similar to the Hippocratic Oath.
Has our thinking, our freedom, our democracy been hacked?
Remember the idea of a Hippocratic Oath for the digital technology/AI fields?  No?  You forgot this post from October, BT?  Ahem, BT means "before trump" ;)

Back to the Scientific American:
In summary, it can be said that we are now at a crossroads (see Fig. 2). Big data, artificial intelligence, cybernetics and behavioral economics are shaping our society—for better or worse. If such widespread technologies are not compatible with our society's core values, sooner or later they will cause extensive damage. They could lead to an automated society with totalitarian features. In the worst case, a centralized artificial intelligence would control what we know, what we think and how we act. We are at the historic moment, where we have to decide on the right path—a path that allows us all to benefit from the digital revolution.
Like I said earlier, have a nice day!

2 comments:

Ramesh said...

Sure, I'll have a nice day. All I have to do is once in a few days do the following

- Write a post that Trump is the greatest guy on earth
- Agree with you on a random post
- Post a comment on a food blog that the recipe worked out great
- Trash the Icelandic President

That's enough to throw those stupid computers that are monitoring me off track.

Here's clear proof - YouTube continues to offer me Jayalalithaa death conspiracy videos all the time, just because I once watched a Sun News video. If that's the quality of "intelligence" they get from monitoring me, I have no problems for my lifetime.

Sriram Khé said...

That's what you think, my friend ... you may continue to live in your own bubble ... but, "they" know more about you than you know about yourself.