Sunday, May 24, 2015

Racists like us. Our robotic future!

Did you, like me, guffaw a few years ago when a Google search for "miserable failure" led to this?

Soon after that, Google modified its algorithm and the laughter subsided.  But the fundamental problem continues.

What's the problem?  Remember all my posts (like here) worrying that the searches we do, the photographs we tag with people's names, and our interactions with voice-activated features are all how the machines learn to become better and better?  I.e., we provide the data, which eventually becomes the basis for the machines' "brain," and we then turn around and ask questions to which the machines provide the answers.

Aha, now you see the problem, don't you?

Latest exhibit: our racism means that the machines become racist.  We--not you the reader, nor me the blogger, but the collective "we"--are apparently racist folks in how we do searches.

Do you see the huge problem in the graphic below, which is a Google map of the White House?

Source

What happened?
It was discovered that when searching for “n***a house” and “n***a king,” Maps returned a surprising location: the White House.
It is one thing to laugh at the result of a search for "miserable failure" but another when it takes a sharp racist turn.

BTW, it was not only racism:
A search for “slut’s house” led to an Indiana women’s dorm.
What's going on?  Apparently, "the internet itself is racist and degrading":
“Certain offensive search terms were triggering unexpected maps results, typically because people had used the offensive term in online discussions of the place,” wrote Jen Fitzpatrick, VP of Engineering & Product Management. “This surfaced inappropriate results that users likely weren’t looking for.”
The more we users referred to the White House as the place where a N*a lives, the more the machines learned that association--and then they repeated it back to us.
The type of invective that led to this more recent Google Maps grotesqueness, though, isn’t something you can simply flip a switch to turn off, because it’s woven into the fabric of the internet itself. Essentially, we’re making internet algorithms racist.
This is truly atrocious, and not at all funny like the old "miserable failure" search because:
And it’s important to understand that while the technical function of producing the recent racist results are similar to how a Googlebomb works, there’s one very big fundamental difference: A Googlebomb is calculated. A group of people decided they wanted to game the “santorum” results and made it happen. In the case of the White House and other offensive Maps searches, the algorithm wasn’t subject to a coordinated effort, it just gathered up all the data the internet could provide, and the internet provided trash.
Yep, nobody was orchestrating a campaign.  The machines simply picked up our usage--they learn really well!
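The mechanism described above--a place becoming associated with whatever words people use when discussing it online--can be sketched as a toy co-occurrence counter. This is purely a hypothetical illustration, not Google's actual algorithm; the class and method names here are invented:

```python
from collections import defaultdict

class PlaceIndex:
    """Toy sketch: associate places with the words people use near them."""

    def __init__(self):
        # term -> place -> co-occurrence count
        self.counts = defaultdict(lambda: defaultdict(int))

    def ingest(self, text, place):
        """Record that these words appeared in a discussion of `place`."""
        for word in text.lower().split():
            self.counts[word][place] += 1

    def search(self, query):
        """Return the place most strongly associated with the query terms."""
        scores = defaultdict(int)
        for word in query.lower().split():
            for place, n in self.counts[word].items():
                scores[place] += n
        return max(scores, key=scores.get) if scores else None

index = PlaceIndex()
index.ingest("the president lives in the white house", "White House")
index.ingest("tour the white house gardens", "White House")
# If enough users attach a slur to a place, the index dutifully learns it:
index.ingest("<slur> house", "White House")

print(index.search("<slur> house"))  # the association is now baked in
```

Notice that no step in this sketch is malicious: the code just counts what users say. That is the point of the "no coordinated effort" distinction--garbage in, garbage out, with no one at the switch.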

In addition to such racist behavior being despicable, we also need to keep this in mind:
Google doesn’t show us the world; just a curated version that it thinks we want to see.
What it shows is really ugly.

If only the internet users were really just lovable dogs!

Source

2 comments:

  1. It's a sad commentary on the human race that most of us are "racists" in some form or another. It may be colour, religion, caste, nationality, whatever. We seem to like to belong to one group and consider another group as inferior. Maybe it's part of the survival instinct honed by millions of years of evolution. As you observed, the internet simply throws these biases back at us like a mirror.

  2. Oh, no reason for this awful behavior ... we humans are messed up--as simple as that.
    The internet serves as a mirror, and we now know that we have met the enemy and it is us :(

