  • Kenneth Jacobs

The Illusion of Inquiry: Google It

The status of truth in America is inextricably linked to our means of inquiry. Most common among those means is "Googling it." As the predominant "engine" for "search," Google is the authority. Google dissolves doubt and fixes our beliefs. Unfortunately, Google tends to confirm what you already know.


The illusion of inquiry is automated "search."

  1. Enter your query

  2. View suggested queries

  3. Accept or reject suggested queries

  4. Click through to your preferred, most befitting source

Your preferred link is whichever one gave you what you wanted in the past, quickly and effortlessly. Search engines capitalize on this law of effect with "zero-click searches." Google, for example, works hard to trim the time it takes to get from "search" to "answer." The effect of this brevity is habit-forming; it is what keeps you saying, "Google it":

  • to "prove" your own point ("I knew it!")

  • to save face ("See! I knew I wasn't imagining things.")

  • to find "facts" ("Told you so!")

  • etc.

The problem with Google's Quick Answers Box is that it brings inquiry to an end. We hail Google in moral and political affairs, as if it could solve a problem in the same way that a calculator "solves for" a "problem" like 5!. Unlike the solution to the factorial of 5, moral and political inquiry requires us to hear what others have to say. I need your individual experience to bear on my thinking, because that is the sort of associated action we need to form a community. The associated activity of two or more people, in talking, planning, or reflecting, is the conjoint behavior we miss out on when our transactions are with an algorithm.
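The contrast drawn above can be made concrete. A factorial is fully determined by its rule: once the procedure is written down, the answer follows mechanically, with no one else to consult. A minimal sketch in Python (the function name and structure are illustrative, not anything Google runs):

```python
def factorial(n: int) -> int:
    """Return n!, the product 1 * 2 * ... * n, for a non-negative integer n."""
    if n < 0:
        raise ValueError("factorial is undefined for negative integers")
    result = 1
    for k in range(2, n + 1):
        result *= k
    return result

print(factorial(5))  # 5! = 5 * 4 * 3 * 2 * 1 = 120
```

Every run yields 120; the method is transparent and the question is closed. Moral and political questions have no such terminating procedure, which is the essay's point.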


The proof is in the algorithm


"Googling it" is automated inquiry. Things go in and things come out, but we don't quite know what happens in between. We move from "search" to "result" without ever having access to the methods of our inquiry. In attempting to solve for this, we end up misplacing method altogether. Rather than question the intermediary between search and result, we blame the user for her choice of "search terms."


We blame the user while knowing full well that the "results" of our "search" are curated according to our "search history." The intermediary between search and result is confounded; it is biased. In other words, the proof is not in the pudding because the intermediary between our mouths and the pudding is contaminated. Google's algorithm, like a dirty spoon, contaminates what we know as true, false, good, bad, right, or wrong.


When in doubt, consider method


When in doubt, we "Google it" at the expense of other means of inquiry. If not Google, then YouTube (owned by Google), Facebook, Instagram, etc.

The problem is not so much with the source as it is with its method. Like the results of your Google Search, social media gives you what you want by confirming what you already know; hence the term filter bubble. Facebook's carefully curated "news feed" keeps you scrolling in the same way that zero-click searches keep you on Google's platform. Ad revenue is prioritized above all else, regardless of what might be true "in the long run and on the whole" (James, p. 106).


When in doubt, we need to consider the method that fixes our beliefs. Truth in America is a result of the methods of authority, tenacity, agreeableness, and science. To those methods we can now add Googling it and virality. In other words, "truth happens to an idea" when we Google Search it or when it goes viral on social media. With the exception of science, our current President, whose lies measure in the thousands, uses these methods to his advantage. What's true for @realDonaldTrump, though, is not true for everyone else. His truths are personal truths that for-profit Search and Social Media are happy to promulgate.


Science seems the only way out of this filter bubble, but only so long as it, too, is not biased by measures of virality like the h-index. Like the methods of inquiry in science, the methods of inquiry in Search and Social Media need to be made more transparent. Algorithms alone cannot decipher what is true, false, right, wrong, good, or bad. And to be clear, the illusion of inquiry is NOT a problem with technology per se. In Dewey's (1927) words:


There are those who lay the blame for all the evils of our lives on steam, electricity, and machinery. It is always convenient to have a devil as well as a savior to bear the responsibilities of humanity. In reality, the trouble springs rather from the ideas and absence of ideas in connection with which technological factors operate (p. linked).

The problem is with the thinly veiled idea that Search and Social Media are for connectivity and communication across people and places. The problem is the idea that we are "searching," "sharing," and "liking." Search and Social Media are neither free nor equitable, as our exchanges are mediated and reciprocated by bots, not people. We have the means of communication to form what Dewey (1927) called a Great Community, but it is contaminated by the algorithmic workings we cannot see.


We have the physical tools of communication as never before. The thoughts and aspirations congruous with them are not communicated, and hence are not common. Without such communication the public will remain shadowy and formless, seeking spasmodically for itself, but seizing and holding its shadow rather than its substance. Till the Great Society is converted into a Great Community, the Public will remain in eclipse. Communication can alone create a great community. Our Babel is not one of tongues but of the signs and symbols without which shared experience is impossible (Dewey, 1927, p. linked).

As of now, and if Election Day 2020 is any indication, our Search and Social Media are missing the genuine signs and symbols that connect us.


©2019 by Pragmatic Means.