All of the major tech companies in the United States are promoting Hillary Clinton, just as Wall Street money favors Clinton over Trump by a margin of roughly 99 to 1.
REPORT: In this exclusive report, distinguished research psychologist Robert Epstein explains his new study and reviews evidence that Google's search suggestions are biased in favor of Hillary Clinton. He estimates that biased search suggestions could shift as many as 3 million votes in the upcoming US presidential election.
Biased search rankings can swing votes and alter opinions, and a new study shows that Google's autocomplete can too. A scientific study I published last year showed that search rankings favoring one candidate can quickly convince undecided voters to vote for that candidate — as many as 80 percent of voters in some demographic groups. My latest research shows that a search engine could also shift votes and change opinions with another powerful tool: autocomplete. Because of recent claims that Google has been deliberately tinkering with search suggestions to make Hillary Clinton look good, this is probably a good time both to examine those claims and to look at my new research.
As you will see, there is some cause for concern here. In June of this year, Sourcefed released a video claiming that Google's search suggestions (often called "autocomplete" suggestions) were biased in favor of Mrs. Clinton. The video quickly went viral: the full 7-minute version has now been viewed more than a million times on YouTube, and an abridged 3-minute version has been viewed more than 25 million times on Facebook. The video's narrator, Matt Lieberman, showed screenshot after screenshot that appeared to demonstrate that searching for just about anything related to Mrs. Clinton generated only positive suggestions. This occurred even though Bing and Yahoo searches produced both positive and negative suggestions, and even though Google Trends data showed that searches on Google that characterize Mrs. Clinton negatively are quite common, in some cases far more common than the search terms Google was suggesting. Lieberman also showed that autocomplete did offer negative suggestions for Bernie Sanders and Donald Trump. "The intention is clear," said Lieberman. "Google is burying potential searches for terms that could have hurt Hillary Clinton in the primary elections over the past several months by manipulating recommendations on their site."
Google responded to the Sourcefed video in an email to the Washington Times, denying everything. According to the company's spokesperson, "Google Autocomplete does not favor any candidate or cause." The company explained away the apparently damning findings by saying that "Our Autocomplete algorithm will not show a predicted query that is offensive or disparaging when displayed in conjunction with a person's name."
Since then, my associates and I at the American Institute for Behavioral Research and Technology (AIBRT) — a nonprofit, nonpartisan organization based in the San Diego area — have been systematically investigating Lieberman's claims. What we have learned has generally supported those claims, but we have also learned something new — something quite disturbing — about the power of Google's search suggestions to alter what people search for. Lieberman insisted that Google's search suggestions were biased, but he never explained why Google would introduce such bias. Our new research suggests why — and also why Google's lists of search suggestions are typically much shorter than the lists Bing and Yahoo show us. Our investigation is ongoing, but here is what we have learned so far...
The three main findings were as follows:
1) Overall, people clicked on the negative items about 40 percent of the time, twice as often as one would expect by chance. What's more, compared with the neutral items we showed people in searches that served as controls, negative items were selected about five times as often.

2) Among eligible, undecided voters (the impressionable people who decide close elections), negative items attracted more than 15 times as many clicks as neutral items attracted in matched control questions.

3) People affiliated with one political party selected the negative suggestion for the candidate from their own party less frequently than they selected the negative suggestion for the opposing party's candidate.
In other words, negative suggestions attracted the largest number of clicks when they were consistent with people's biases. These findings are consistent with two well-known phenomena in the social sciences: negativity bias and confirmation bias.
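For readers who want to see the arithmetic behind these figures, here is a minimal sketch in Python. The study's design is not spelled out above, so the one-negative-item-among-five setup below is my assumption, chosen only because it reproduces the 20 percent chance rate implied by finding 1; the 5x and 15x multiples come directly from the findings.

```python
# Back-of-envelope arithmetic for the findings above (illustrative only;
# the study's actual design and raw counts are not given in this article).

# Reported: negative suggestions were clicked about 40 percent of the time,
# described as twice as often as one would expect by chance.
observed_negative_rate = 0.40
chance_multiple = 2.0

# Implied chance rate if 40 percent really is twice chance:
implied_chance_rate = observed_negative_rate / chance_multiple  # 0.20

# One way (an assumption, not stated in the article) a 20 percent chance
# rate could arise: one negative item among five suggestions, clicked at
# random.
negatives_shown = 1
suggestions_shown = 5
chance_rate_if_random = negatives_shown / suggestions_shown     # 0.20

# Reported click multiples versus matched neutral control items:
negative_vs_neutral_all = 5         # all participants: ~5x as many clicks
negative_vs_neutral_undecided = 15  # undecided voters: >15x as many clicks

print(f"Implied chance rate: {implied_chance_rate:.0%}")
print(f"Chance rate under the one-in-five assumption: {chance_rate_if_random:.0%}")
print(f"Negative vs. neutral clicks (all participants): ~{negative_vs_neutral_all}x")
print(f"Negative vs. neutral clicks (undecided voters): >{negative_vs_neutral_undecided}x")
```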
Negativity bias refers to the fact that people are far more affected by negative stimuli than by positive ones. As a famous paper on the subject notes, a single cockroach in one's salad ruins the whole salad, but a piece of candy placed on a plate of disgusting crud will not make that crud seem even slightly more palatable. Negative stimuli draw more attention than neutral or positive ones, they activate more behavior, and they create stronger impressions — negative ones, of course.
In recent years, political scientists have even suggested that negativity bias plays an important role in the political choices we make: that people adopt conservative political views because they have a heightened sensitivity to negative stimuli.

Confirmation bias refers to the fact that people almost always seek out, pay attention to, and believe information that confirms their beliefs more than they seek out, pay attention to, or believe information that contradicts those beliefs.

When you apply these two principles to search suggestions, they predict that people are far more likely to click on negative search suggestions than on neutral or positive ones, especially when those negative suggestions are consistent with their own beliefs. This is exactly what the new study confirms.

Google data analysts know this too. They know because they have ready access to billions of pieces of data showing exactly how many times people click on negative search suggestions. They also know exactly how many times people click on every other kind of search suggestion one can categorize.
To put this another way, what I and other researchers must stumble upon and can study only crudely, Google employees can study with exquisite precision every day. Given Google's strong support for Mrs. Clinton, it seems reasonable to conjecture that Google employees manually suppress negative search suggestions relating to Clinton in order to reduce the number of searches people conduct that will expose them to anti-Clinton content. They appear to work a bit less hard to suppress negative search suggestions for Mr. Trump, Senator Sanders, Senator Cruz, and other prominent people.
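To illustrate how negativity bias and confirmation bias combine to produce the click patterns reported above, here is a toy model in Python. Nothing in it comes from the study itself: the weight values, the suggestion list, and the click_weight function are assumptions invented for illustration. The model simply encodes the two claims that negative items attract more clicks than neutral ones, and that belief-consistent negative items attract the most.

```python
# Toy model (my illustration, not the study's methodology) of how negativity
# bias and confirmation bias could jointly shape clicks on search suggestions.

NEGATIVITY_BOOST = 5.0    # assumed: negative items outdraw neutral ones
CONFIRMATION_BOOST = 3.0  # assumed: extra pull when negativity fits one's views

def click_weight(tone: str, target_party: str, user_party: str) -> float:
    """Relative likelihood that a user clicks a single suggestion."""
    weight = 1.0
    if tone == "negative":
        weight *= NEGATIVITY_BOOST
        if target_party != user_party:    # negative about the other side:
            weight *= CONFIRMATION_BOOST  # belief-consistent, so extra pull
    return weight

# A mixed list of suggestions: (tone, party of the candidate it concerns).
suggestions = [
    ("neutral", "D"), ("neutral", "R"),
    ("negative", "D"), ("negative", "R"),
]

for user_party in ("D", "R"):
    weights = [click_weight(tone, party, user_party) for tone, party in suggestions]
    total = sum(weights)
    print(f"Simulated user affiliated with party {user_party}:")
    for (tone, party), w in zip(suggestions, weights):
        print(f"  {tone:8s} item about {party}: predicted click share {w / total:.0%}")
```

Under these invented weights, both simulated users click negative items most of the time, and each clicks the negative item about the opposing candidate far more often than the one about their own candidate, which is the qualitative pattern the three findings describe.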