Friday, March 13, 2015

Google as the Oracle at Delphi

In the old, old days, when we wanted to know something, we might have walked up to the elders and asked them the questions to which we sought answers.  Almost always, the elders didn't know, and they bullshitted.  But we didn't know any better and took their answers as the truth.

Then, we started referring to books and encyclopedias, and even our teachers.  We were a lot more confident about what we read in print than we were about the words that came out of our elders' mouths.

Now, things have changed.  Dramatically. So much so that "I will google it" has become a part of our conversation.  Google has replaced the elders, the books, the encyclopedias, the teachers, the ...  Google knows it all.  When I want to find out what Google might not know, it even helps me fill out the search query!


Given its importance in our lives, of course it became big news that Google was thinking of altering its algorithm:
Currently the biggest factor is how many other pages link to the page in question, but this isn’t always a good metric for determining quality content. Often viral hoaxes are linked to tons of times simply because they're being talked about, not because they’re correct.
The Google research team wants to revise the current system to look for inaccuracies instead of links. The strategy isn’t being implemented yet, but the paper presented a method for adapting algorithms such that they would generate a “Knowledge-Based Trust” score for every page. To do this, the algorithm would pick out statements and compare them with Google’s Knowledge Vault, a database of facts. It would also attempt to assess the trustworthiness of the source—for example, a reputable news site versus a newly created Wordpress blog. Another component of the strategy involves looking at “topic relevance.” The algorithm scans the name of the site and its “about” section for information on its goals.
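To see what such a scoring scheme might look like at its very simplest, here is a toy sketch in Python.  Everything in it is made up for illustration: the tiny fact dictionary stands in for Google's Knowledge Vault, and the "extraction" step simply assumes the page already lists its claims as subject | predicate | object triples, which real systems have to work hard to produce.

# Toy illustration of a "knowledge-based trust"-style score.
# The fact store, the extraction step, and the scoring rule are all
# hypothetical; this is NOT how Google's Knowledge Vault actually works.

KNOWN_FACTS = {
    ("barack obama", "born_in"): "hawaii",
    ("water", "boils_at_celsius"): "100",
}

def extract_triples(page_text):
    """Stand-in for a real information-extraction step.

    Pretends each line of the page is already a
    'subject | predicate | object' triple.
    """
    triples = []
    for line in page_text.strip().splitlines():
        parts = [p.strip().lower() for p in line.split("|")]
        if len(parts) == 3:
            triples.append(tuple(parts))
    return triples

def knowledge_based_trust(page_text):
    """Fraction of a page's checkable claims that agree with the fact store."""
    triples = extract_triples(page_text)
    checkable = [t for t in triples if (t[0], t[1]) in KNOWN_FACTS]
    if not checkable:
        return None  # nothing we can verify either way
    correct = sum(1 for s, p, o in checkable if KNOWN_FACTS[(s, p)] == o)
    return correct / len(checkable)

page = """
Barack Obama | born_in | Kenya
Water | boils_at_celsius | 100
"""
print(knowledge_based_trust(page))  # 0.5 -- one of the two checkable claims matches

A real system would also have to weigh how confident it is in each extracted claim and in each stored fact, and fold in the source-trustworthiness and topic-relevance signals the excerpt mentions; that is where most of the hard research lives.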

In other words, Google is improving the "information literacy" of the algorithm.  If Google can successfully do this, then do we no longer need to teach students about information literacy?

Not so fast; "The search engine should not be the arbiter of truth":
Of course, figuring out what’s worthy of belief isn’t easy. In fact, it’s one of civilization’s basic and continuing tasks. We’re not there yet, despite the efforts of people who prefer to live in an evidence-based society. And we’ll never have perfect knowledge. Still, we should applaud any progress Google and other search engines can make in undoing some of the pernicious effects of the Internet.
But, no, Google absolutely should not become the arbiter of truth. That’s dangerous in every kind of way.
Not that Google wants to become the arbiter of truth.
No one—not even Google—wants Google to step in and settle hash that scientists themselves can’t fully settle.
But, we lesser mortals who once thought that the elders' words were the truth, and then thought that our teachers' words were the truth, and then ... could now lazily think that what Google gives us is the truth.
We’ll all have some responsibility, too—not just to complain (properly) when Google gets it wrong, but to recognize and appreciate the nuance and complexity that are part of almost everything. In other words, we’re all in this together.
And that is the truth. Because I read it on the internet! ;)

3 comments:

Ramesh said...

For once, just for once, I completely agree with you :)

Anne in Salem said...

Google as the arbiter of truth? Perish the thought! Two humans can't even agree on truth. How will a machine, which is programmed by one such human, do so? Impossible.

Sriram Khé said...

Nothing is impossible, Anne. As long as people are too lazy to think for themselves--and I run into such people everywhere--the machine wins in the long run.