Showing posts with label computers.

Thursday, April 19, 2018

The machines have won :(

Ever since I started thinking on my own, following something that somebody lays down as a rule has been difficult.  I don't want to merely follow orders.

Now, increasingly, it is not humans telling us what to do and what not to do.  Machines do that.  Algorithms are the enforcers.  It scares the bejesus out of me.

It is not that I don't care for the comforts that these machines provide.  It is awesome that, for instance, the heater kicks in automatically at five in the morning and warms up the home before I get out of bed.  But, I worry that most of us are mindlessly yielding to computers.
As we transfer agency to computers and software, we also begin to cede control over our desires and decisions.
That loss of agency worries me. It has always worried me.  Agency is also what I have always urged my students to claim for themselves.  I have even semi-seriously joked that my view of college education is to make sure we don't turn students into automatons, and that they leave knowing how to think.

But, automatons we are rapidly becoming.
Already, many people have learned to defer to algorithms in choosing which film to watch, which meal to cook, which news to follow, even which person to date. (Why think when you can click?) By ceding such choices to outsiders, we inevitably open ourselves to manipulation. Given that the design and workings of algorithms are almost always hidden from us, it can be difficult if not impossible to know whether the choices being made on our behalf reflect our own interests or those of corporations, governments, and other outside parties.
I feel like it is a lost cause.  The battle, the war, has been lost.  A few of us jumping up and down shouting about this won't matter--most are not even listening to us.  Heck, they don't even know we exist!

I often argue, as I did just yesterday, that it is all about making conscious decisions about "the trade-offs inherent in offloading tasks and decisions to computers."  Agency.  "If we don’t accept that responsibility, we risk becoming means to others’ ends."

We humans are becoming more and more like machines. We are becoming robotic, even as robots are getting better and better.  And that means:
it will be increasingly impossible to distinguish between humans and robots because of our machine-like behavior as much as robots’ human-like features. And could this eventually become the norm, with humans spending their entire lives acting like machines?
I am sure that even now many amongst us would fail the "Voight-Kampff test," which in Blade Runner "assesses capacity for empathy, a human facility that even the most intelligent androids lack."  How else can one explain the election of a man completely devoid of empathy as the President of the US!

Wednesday, April 11, 2018

Stayin' Alive in The Wall

I have forever blogged about creativity in the time of advanced computing.  Routine tasks can be translated into algorithms--even facial recognition, yes.  But, creativity?

There is no formula for creativity.  After all, if there is one, then that can be written up as an algorithm, right?

Creativity is something that has always intrigued me; I have always felt that formal education, the way we offer it, simply kills any creativity. Only the fortunate ones survive with their creative skills intact.

All this adds to my frustration with the mantras of STEM and coding. If I could, I would tell educators to "fuck off."  But, alas, in the academic and professional worlds, we cannot ;)

Which is why I fully resonate with the following:
Machines are already superintelligent on many axes, including memory and processing speed. Unfortunately, those are the attributes our education system currently rewards, with an emphasis on learning by rote.
It doesn’t make sense to me. Part of my job as an investor is to attempt to predict the future – I need to make bets on the way we’ll be behaving in the next two, five, ten and 20 years. Computers already store facts faster and better than we do, but struggle to perfect things we learn as toddlers, such as dexterity and walking.
The system in K-12 and in higher ed increasingly makes no sense to me.

As an example, think about how music comes about.  And then think about such remixing:




Of course, there is a lot more to creativity than music alone.
We need to rethink the way we teach our children and the things we teach them. Creativity will increasingly be the defining human talent. Our education system should emphasise the use of human imagination to spark original ideas and create new meaning. It’s the one thing machines won’t be able to do.
We should aim to teach our kids about the power of creativity in every area. Science and maths, which are often considered uncreative, have shaped human history with huge creative leaps. It was creativity that allowed Newton to discover gravity while observing a falling apple as he was thinking about the forces of nature.
Tell me something that I have not been yelling about!

Oh well ... nobody cares :(

Here is Sir Ken Robinson, whom I have quoted a lot when it comes to creativity:


Sunday, January 04, 2015

For want of a nail ... a plane-load of people were stranded!

Remember that old elementary school rhyme on the want of a horseshoe nail?
For want of a nail the shoe was lost.
For want of a shoe the horse was lost.
For want of a horse the rider was lost.
For want of a rider the message was lost.
For want of a message the battle was lost.
For want of a battle the kingdom was lost.
And all for the want of a horseshoe nail.
You forgot, eh!  What a shame!!! ;)

It is not to be taken literally that a kingdom was lost because of one little horseshoe nail. It is a metaphor at various levels, primarily about cause and effect--which this blogger loves exploring--and a rhyming variation on that other old idea that a chain is only as strong as its weakest link.

I had plenty of time to think about that after I was stranded, again, in Denver.

Here's how the story developed: the aircraft arrived earlier than scheduled--we could see it through the glass.  I was looking forward to reaching home by midnight, after a long, long flight and a lengthy absence from my sweet home by the river--the Riverhouse, as the friend calls it.

It got to boarding time, but the person at the counter was working the phone instead of announcing boarding.

And then came the announcement at 9:40: flight canceled.

It was not a technical malfunction.  The aircraft was fine.

It was not a pilot issue--they were right there.

The flight was canceled because the cabin attendant did not show up for work.

Yep, no cabin attendant, no go.

Almost right away, I got the automated email from the airline:


I think the capacity of that plane is sixty or seventy.  It was a full flight, post-holiday, and all of us were now stranded in Denver for the night, perhaps longer.

All was lost for the want of a horseshoe nail :(

We can expect a lot more such events because we now deal with immensely complex systems that can fail at any minute at their weakest link.  In the digital world, it is possible to build in redundancy and drive the failure rate to near zero.  But that works for bits, for ones and zeros.  Building human redundancy is expensive: it means hiring and keeping "spare" replacement personnel available.  An organization run for "efficiency" tends not to carry many such spares.  Which means ... incidents similar to what happened in Denver.
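To put a rough number on that redundancy argument, here is a minimal back-of-envelope sketch in Python.  The 2% no-show rate, the independence assumption, and the function name are all mine, purely for illustration; no airline publishes such figures for me to use.

```python
# Hypothetical illustration: if each attendant independently fails to show up
# with probability p, every extra stand-by cuts the chance of a grounded
# flight geometrically -- the same logic as redundancy in digital systems.

def prob_flight_grounded(p_no_show: float, attendants_scheduled: int) -> float:
    """Probability that no scheduled attendant shows up (assumed independent)."""
    return p_no_show ** attendants_scheduled

for standbys in range(4):
    p = prob_flight_grounded(0.02, 1 + standbys)  # assumed 2% no-show rate
    print(f"{standbys} stand-by attendant(s): {p:.6%} chance of cancellation")
```

With no stand-by, the assumed numbers give a 2% chance of a grounded flight; a single stand-by drops it to 0.04%.  That drop is exactly the redundancy that "efficient" staffing declines to pay for.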

When life goes really well, we even describe it as "everything worked like a well-oiled machine."  That understanding is so ingrained that we never describe life going well as "everything worked like a human."  The temptation, then, is to remove more and more humans from commercial and other enterprises and replace them with ones and zeros.

But, here is what we need to remember: an error-proof technological world can be sterile. It can be, well, like a well-oiled machine. A life that is devoid of humanity itself.

I don't want to be in that world.   It is the humanness that makes life all the more exciting and worthwhile.  And painful, like when the attendant did not show up.

I worry that our rush to embrace technology removes, little by little, all those qualities that make the human existence, well, human.

Of course, I cursed away the night I was stranded in Denver--when pricked, do I not bleed?  But I am glad that we are humans and not automatons. At least, not yet!

Thursday, November 06, 2014

If the Internet of Things is the future, then I want no part of it!

The other day, I went to Best Buy.  I walked about the store looking at some of the latest gizmos.  From flashy refrigerators and laundry machines, to snazzy coffeemakers, to cameras big and small, to phones and tablets, all of them made me feel like everything I own dates back to the Flintstones.  I worried that if I talked to a salesperson there, whatever I said might come across as Yabba Dabba Doo! ;)

And then there were all those gizmos with which you can watch your home from anywhere on the planet, set the thermostat, open the garage door, switch lights on and off.  Gizmos that are connected to the internet, where all you need is your smartphone.  They looked cool.  And, at the same time, freaky!!!

I am not the only one who is uncomfortable about the future that is already here:
It is already possible to buy Internet-enabled light bulbs that turn on when your car signals your home that you are a certain distance away and coffeemakers that sync to the alarm on your phone, as well as WiFi washer-dryers that know you are away and periodically fluff your clothes until you return, and Internet-connected slow cookers, vacuums, and refrigerators. “Check the morning weather, browse the web for recipes, explore your social networks or leave notes for your family—all from the refrigerator door,” reads the ad for one.
Welcome to the beginning of what is being touted as the Internet’s next wave by technologists, investment bankers, research organizations, and the companies that stand to rake in some of an estimated $14.4 trillion by 2022—what they call the Internet of Things (IoT).
I used to have a coffeemaker that had a simple clock-driven program feature--load up the coffee and water the night before going to sleep, set it to switch on at a certain time and, presto, the coffee is ready and waiting in the morning.  I hated that, and stopped programming it.  It felt, yes, freaky.  I was missing my connection to the experience of brewing coffee through all the steps, being there as the first drops percolated through and as the wonderful aroma of coffee started wafting through the home.  The Internet of Things takes those programmable features to a dimension where no man has gone before.

These gizmos are slowly creeping up on us:
One reason that it has been easy to miss the emergence of the Internet of Things, and therefore miss its significance, is that much of what is presented to the public as its avatars seems superfluous and beside the point. An alarm clock that emits the scent of bacon, a glow ball that signals if it is too windy to go out sailing, and an “egg minder” that tells you how many eggs are in your refrigerator no matter where you are in the (Internet-connected) world, revolutionary as they may be, hardly seem the stuff of revolutions; because they are novelties, they obscure what is novel about them.
Many of these novelties that are tethered to the internet are simultaneously devices that keep track of where we are at any moment.  (Even now, the location feature on my smartphone is one that I always keep off unless I really, really need it.)
as human behavior is tracked and merchandized on a massive scale, the Internet of Things creates the perfect conditions to bolster and expand the surveillance state. In the world of the Internet of Things, your car, your heating system, your refrigerator, your fitness apps, your credit card, your television set, your window shades, your scale, your medications, your camera, your heart rate monitor, your electric toothbrush, and your washing machine—to say nothing of your phone—generate a continuous stream of data that resides largely out of reach of the individual but not of those willing to pay for it or in other ways commandeer it.
Freaky!
in September Apple offered a glimpse of how the Internet of Things actually might play out, when it introduced the company’s new smart watch, mobile payment system, health apps, and other, seemingly random, additions to its product line. As Mat Honan virtually shouted in Wired:
Apple is building a world in which there is a computer in your every interaction, waking and sleeping. A computer in your pocket. A computer on your body. A computer paying for all your purchases. A computer opening your hotel room door. A computer monitoring your movements as you walk through the mall. A computer watching you sleep. A computer controlling the devices in your home. A computer that tells you where you parked. A computer taking your pulse, telling you how many steps you took, how high you climbed and how many calories you burned—and sharing it all with your friends…. THIS IS THE NEW APPLE ECOSYSTEM. APPLE HAS TURNED OUR WORLD INTO ONE BIG UBIQUITOUS COMPUTER.
So, where are we headed?
So here comes the Internet’s Third Wave. In its wake jobs will disappear, work will morph, and a lot of money will be made by the companies, consultants, and investment banks that saw it coming. Privacy will disappear, too, and our intimate spaces will become advertising platforms—last December Google sent a letter to the SEC explaining how it might run ads on home appliances—and we may be too busy trying to get our toaster to communicate with our bathroom scale to notice. 
First those evil corporations come for our toasters ...

The Flintstones era is looking better in the rear-view mirror, but is fading rapidly.

But, hey, only twenty-five more years! ;)


Sunday, February 22, 2009

Praise the transistor!

One of the valuable benefits of being a member (hey, I am a full member with voting rights!) of Sigma Xi is the magazine, American Scientist. Even though my life now is in the social sciences, the magazine is a wonderful way for me to educate myself--perhaps in a half-baked way, eh--about a few of the topics in science.

One article there is about the transistor--a review of the past, and what the future might hold. Lots of wonderful observations there. One, even though it might sound trivial, actually speaks volumes about the fantastic transformation of our lives with innovations in the semiconductor field. So, what is that trivial yet profound observation?
The price of integrated circuitry has long been a constant one billion dollars per acre, in spite of the increasing number of transistors on that acre. The current price per transistor on an integrated chip is about 0.002 cents. A staple used for fastening together sheets of paper costs 10 times as much as a transistor.
Awesome, right? Thanks to all those people who toil day in and day out in research labs and chip factories.
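Out of curiosity, here is a quick arithmetic check of what those two quoted prices imply.  The unit conversions and variable names are mine; the only inputs are the figures in the excerpt above.

```python
# Back-of-envelope check of the quoted figures (conversions are mine).
price_per_acre_usd = 1e9                  # "one billion dollars per acre"
price_per_transistor_usd = 0.002 / 100    # 0.002 cents, converted to dollars

implied_transistors_per_acre = price_per_acre_usd / price_per_transistor_usd
staple_price_usd = 10 * price_per_transistor_usd   # "10 times as much"

print(f"Implied transistors per acre: {implied_transistors_per_acre:.1e}")  # ~5.0e13
print(f"Implied price of a staple: {staple_price_usd * 100:.3f} cents")     # 0.020 cents
```

Fifty trillion transistors to the billion-dollar acre, with a staple still costing a fiftieth of a cent: that is the scale those two quoted sentences are quietly describing.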

BTW, some of the other articles there are equally fascinating:
The Tacoma Narrows Bridge
If you gamble, knowing when to stop is to your advantage


Enjoy.

Monday, December 08, 2008

Happy birthday, mouse!

Those of us old enough to remember the days of strange keyboard commands in WordPerfect might also remember how we thought the mouse and GUI were the greatest inventions ever. That mouse is now 40 years old--I didn't know that it pre-dated the Mac, which is where I used a mouse for the first time. The BBC:
On 9 December 1968 hi-tech visionary Douglas Engelbart first used one to demonstrate novel ways of working with computers.
The first mouse that Dr Engelbart used in the demo at the Fall Joint Computer Conference (FJCC) was made of wood and had one button.

Thank you, Dr. Engelbart.