Google’s influence on society is both unparalleled and poorly understood. The company handles over FIVE BILLION searches per day, and that number is only increasing. Most of us use Google, and we’re far more likely to click the top few results. If Google determines what we’re more likely to read, and if reading can change belief, then Google’s algorithm can change millions of beliefs daily.
I myself have changed beliefs – and in one case my entire worldview – based on what I found through Google. I’m happier for it, but I can’t help thinking a few different top results may have changed the course of my life forever.
Most people either don’t know or don’t care how Google chooses results, as long as they find what they’re looking for. That might be fine when you’re looking for an address or a cake recipe, but are we comfortable with an unknown algorithm subtly deciding which political and philosophical opinions are most relevant and which opposing voices are silenced? Sure, the algorithm changes several times a year, but it’s worth understanding its underlying foundation.
Google and the Spiral of Silence
A “spiral of silence” occurs when a system pushes one opinion into obscurity: because the opinion differs from the norm, and most people don’t like to stick out like a sore thumb, fewer people voice it, which makes it seem even less common.
Any social media site exhibits this behavior, and the popular social sharing site reddit is a clear example. When users post a new link or comment, other users vote on whether they like the content. Reddit sorts content by popularity: voting a comment up gives it more visibility and thus a higher chance of being voted up further, while a comment voted down quickly is one most people will never see. Downvotes push unpopular arguments so far down the page that casual readers rarely encounter opposing viewpoints.
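The mechanic is simple enough to sketch. Below is a minimal, hypothetical model of vote-driven sorting – not Reddit’s actual “hot” algorithm, which also weighs submission time – showing how a page that displays only the top few comments by net score buries a downvoted dissenting view:

```python
# Minimal sketch of vote-driven ranking (hypothetical; not Reddit's real algorithm).
# Each comment carries upvotes and downvotes; the page shows only the top few
# by net score, so early downvotes can hide a comment from most readers.

def rank_comments(comments, visible=3):
    """Sort comments by net score; return (visible slice, buried remainder)."""
    ranked = sorted(comments, key=lambda c: c["ups"] - c["downs"], reverse=True)
    return ranked[:visible], ranked[visible:]

comments = [
    {"text": "popular take",    "ups": 120, "downs": 4},
    {"text": "mainstream view", "ups": 85,  "downs": 10},
    {"text": "neutral comment", "ups": 30,  "downs": 5},
    {"text": "dissenting view", "ups": 12,  "downs": 40},  # net -28: buried
]

seen, buried = rank_comments(comments)
print([c["text"] for c in seen])    # the dissenting view never appears
print([c["text"] for c in buried])
```

Casual readers only ever scan the `seen` slice, so the dissenting comment effectively vanishes – the spiral of silence in four lines of sorting logic.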
Reddit is at the whims of the masses. Is Google?
Trishan Mehta, co-founder of WPBizBlog, argues that it isn’t. Instead, he states,
“Search has changed tremendously over the past 5 years. Earlier, it was true that only pages with the highest backlinks could rank on the 1st page of Google. But now, it is not uncommon to spot pages with in-depth content of more than 3000 words but with just one or two backlinks ranking on page one. So, it cannot be said Google is promoting only popular opinions on its platform. If you can write comprehensive content, replete with interesting and relevant facts and figures, you can hope to rank on the first page with zero or very few backlinks. Even if your content runs counter to the popular opinion in vogue, Google will rank it higher on SERPs provided that it fulfills the requirements of EAT (Expertise, Authority, and Trust) as laid down by Google.”
Andrew Selepak from the University of Florida agrees in a sense, noting that mainstream (and thus better-known) sites are seen as more credible due to their reputation. He explains,
“People are looking for accurate and reliable information to keep them and others safe. Although this means that users are more likely to find the same results in their searches, the higher ranking results are most likely to be the most reliable as a page’s higher ranking has been earned from age of site and number of inbound links. At the same time, while Google’s algorithm prioritizes more mainstream sites and those deemed more credible by placing them higher in the user’s search results, it does not prevent users from finding other information on Google.”
On the other hand, Ryan Satterfield from Planet Zuda believes that the search engine company doesn’t do it intentionally; it simply provides what people want to see, perhaps suggesting that we are a product of our own ignorance. He states,
“It is understandable that when you’re a large company that you will have to make some compromises, but when it comes to the status quo, that’s where it is interesting. Google’s rules on search have changed drastically in the last few years which a few of the rules do benefit corporations which have a lot of people talking about them and linking to them, which is easier for those who are already considered a status quo company, however that is also a good way to measure what people want to see.”
How The Algorithm Works
Google’s algorithm works in a similar manner. It ranks a page based on how many links it gets from OTHER websites. These links are like votes, and as with anything, the more votes a page gets, the higher it can rank. Because unpopular opinions have fewer sites and fewer proponents, they get fewer votes in Google’s eyes, while popular opinions end up with more votes and thus more exposure. Sites with unpopular opinions already have less visibility because they’re being shared less, and the search engine compounds that lack of visibility by ranking them lower since people aren’t linking to them.
Admittedly, this is likely not Google’s intent; it’s simply one imperfect ranking system. Unfortunately, it’s also how Google accidentally reinforces the status quo.
Google has always ranked sites by their popularity through a combination of pageviews and “votes” in the form of backlinks. One of Google’s primary advantages over early spam-infested rivals was its use of PageRank – an algorithm developed by Larry Page and Sergey Brin – to determine which sites and pages were being linked to most often. Sites with more links pointing to them tend to rank higher, and their outgoing links count for more than those from less-popular sites. The logic for using popularity metrics to rank results is simple: if other legitimate sites like a page enough to link to it, it’s probably not spam. And if hundreds of sites are linking to a page, it’s more likely to appeal to you, too. That logic is the core of Google’s PageRank algorithm.
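The idea can be illustrated with a short power-iteration sketch – a textbook-style simplification, not Google’s production code. Each page’s score flows along its outgoing links, so pages with many inbound links from high-scoring pages end up on top, while a page nobody links to stays at the bottom:

```python
# Simplified PageRank via power iteration (illustrative only; not Google's
# production algorithm). `graph` maps each page to the pages it links to.

def pagerank(graph, damping=0.85, iterations=50):
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, links in graph.items():
            if links:  # distribute this page's rank across its outgoing links
                share = damping * rank[page] / len(links)
                for target in links:
                    new_rank[target] += share
            else:      # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# "popular" receives links from both other pages; "fringe" receives none.
graph = {
    "popular": ["neutral"],
    "neutral": ["popular"],
    "fringe":  ["popular"],
}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # the heavily linked page wins
```

Notice that the unlinked “fringe” page can never rise above the damping-factor floor, no matter what it says – exactly the compounding visibility problem described above.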
Google has become more and more sophisticated in the way it ranks sites, but it has never stopped relying on popularity metrics. A recent study from Backlinko found that popularity metrics – links, shares, etc. – remain the most highly correlated factor in earning a top spot in Google’s results. Google representatives have said they tried excluding links as a ranking factor, but doing so made search results far worse. Pablo Lopez, CEO and Founder of Topflight Agency, agrees with the current ranking factors. He believes,
“It’s true that Google follows the patterns of mass thinking when selecting the search results for each specific search term. This is because Google, as well as the rest of the search engines, wants to show the most accurate, helpful result possible so users will easily find whatever they’re looking for. And, in general, mass thinking aligns with the general thinking, which is very likely to provide the most relevant information, and that’s why Google tends to choose common ways of thinking when it comes to search results pages.”
Google’s algorithm remains fundamentally biased towards the majority view: the less popular your viewpoint, the less likely it is to show up. The less visibility you receive, the less likely you are to get links or shares, and the spiral continues downward. However, not all links are created equal. David Zimmerman from Curious Ants claims Google,
“doesn’t just use the number of links to determine who the best plumber’s website is. It evaluates the quality and objectivity of those links. Does this promote the status quo? Yes – but that’s a good thing. Let’s say there’s a new plumber coming into business. Maybe this plumber has a new idea or a better company than the established leader. Who says? Well, he’ll have to build credibility in his area to establish this. That will take time. As he does, he’ll eventually get more recommendations than the other leading plumber and will beat him out. However, that shouldn’t be a quick process. It takes time to establish credibility.”
How it Affects Real Search Queries
Consider a search on Google.com for “does god exist?” The results certainly favor that God does, with a split of three results arguing that God exists, one arguing against it, and one posing the question as mere philosophy. Google’s algorithm has come a LONG way since 2014, and while links remain a large ranking factor, Google also analyzes other metrics (bounce rate, time on page, and search intent). These metrics still reinforce the status quo and promote more popular ideologies, but it’s a little more complex than that.
Not every search will reflect majority views, of course. The words you use can easily skew a search towards what you’re looking for. “Is abortion wrong,” for example, yields a lot more pro-life results, and “is abortion right” yields more pro-choice results. Google will also personalize and localize results it thinks you’ll want: the search above may look different to you based on your location, search history, and friends, so that it appeals to YOU – something that only reinforces existing ideologies.
Google justifies showing you popular and personal results because the results are more relevant: these are things you are more likely to be interested in. Google says, essentially, “We’re just giving the people what they want.” I understand the business need to increase personalization, but I also worry that feeding the majority their own view on important social issues could lead to a culture with increasingly stagnant opinions.
Google certainly isn’t the only site silencing minority views as an algorithmic side-effect. Nearly all search engines prioritize results by popularity. Facebook and Twitter are more likely to show you well-liked posts. Among sites intended for information discovery, all of the most popular use popularity in some form to rank what you see. This realization should be a little frightening, yet search engines have so far escaped the scrutiny of social observers.
The trend today is toward more popularity data and deeper personalization, resulting in more clicks. Marketers realized long ago that tailoring what you see to match your current interests and beliefs means you’ll be more likely to click, read, and return. This is why Google gathers data about you and Facebook personalizes your news feed. If showing you information you already agree with is as profitable as it seems, we’re headed for a world where our most-used online services are afraid to offend us by disagreeing.
We’re in a unique time, not just because of COVID-19 but also because of the Black Lives Matter movement. No matter where you stand on this issue, it’s interesting to see what a search for “all lives matter” returns:
The results show a mixture of news stories and articles. In this case, the SERPs are 100 percent against All Lives Matter. Again, this post isn’t political, but the results don’t include any articles that may be in favor of All Lives Matter. However, Dr. Julio Viskovich believes that getting links isn’t just a vote of popularity but potentially a trust factor. He cites a Pew Research Center study where,
“after the 2016 election found 64% of adults believe fake news stories result in mass confusion, and 23% said they’d accidentally shared fabricated stories themselves. Since it relies heavily on links, their algorithm needs to analyze the authority of these links much better. However, this puts them in a conundrum of becoming a publisher rather than a platform, which may exclude them from Section 230 protections. Radical thinking needs to be promoted as much as popular beliefs to shed light on these ideas, while perhaps providing a “truthfulness score” determined by an independent company with a diverse group of members from both sides of the political spectrum.”
Responsible web search
Limiting our exposure to conflicting views is dangerous. When journalism was booming in the 1920s, many realized that what newspapers wrote and published had a great influence over society and culture. It was easy to write what readers wanted to read – playing to their existing biases and beliefs. Many journalists worried that papers were neglecting less-common beliefs and avoiding difficult criticism, and so the “opinion” section rose in popularity.
Neal Taparia, entrepreneur and Founder of Solitaired, doesn’t think Google should necessarily be held to the same standard, and he brings up a good point. Taparia says,
“Interestingly, controversial and misleading topics can drive a lot of links to a site. They are more sensational and share-worthy, all of which are ranking factors to Google. EAT, however, encourages sites to cite expert sources and to produce in-depth and credible research to rank better. The best research often presents a diversity of perspectives as part of making a strong argument. This, in my opinion, serves their users best. They become better informed when Google surfaces information that is authoritative and well rounded, as opposed to those based on conspiracies or lies or conjecture.”
Movements like civic journalism grew out of the idea that journalists and editors had a responsibility to do more than report the facts. Socially responsible journalists actually tried to improve public discussion, and in many cases, they succeeded. Their coverage of the civil rights movement was highly influential and led many to reconsider long-held biases.
Social networks such as Facebook are unfortunately in a similar boat: you see the posts with more likes and comments from people you are friends with – friends who are more likely to share your beliefs than strangers are. The responsibility is ultimately ours to honestly and openly challenge our own beliefs, but maybe our search engines have a social responsibility, too.
What do you think? Do tech giants such as Google and Facebook have a responsibility to serve ALL content, or ONLY what people are looking for? Maybe they should decide on a per-query basis to promote fresh thinking. Let us know in the comments below.