It is seemingly impossible to use Google Search without the underlying software attempting to personalize the results. That is the conclusion of a new report from DuckDuckGo, which draws on data the company compiled from Google Search users.
While the idea that Google returns personalized results in Search is hardly newsworthy, the report focuses on the discrepancy between searches performed when logged in and those performed in "incognito" mode. The latter is supposed to offer a more private browsing experience and, in theory, should neutralize the effect of user data. In other words, when users make what they believe to be anonymous searches, the results returned by Search should be standardized: the same for everyone. DuckDuckGo argues the reality is very far from the theory.
Incognito doesn't mean you're alone
The main conclusions from the study focus on two clear points. The first: when a user searches in incognito mode and again while logged in, the expectation is that the results will differ, since logged-in searches draw on the user's known information and preferences while incognito searches should return neutral results. DuckDuckGo states the data collected does not support this notion; instead, the results showed very little variation regardless of whether the user was logged in or not.
This represents the within-subject element of the study. The second main observation makes the same point from a between-subject angle. Here, one would assume that all users searching in incognito mode should see the same results. The study suggests this was also not the case, with participants seeing results that varied significantly. DuckDuckGo takes this as further evidence that the results are anything but standard or neutral, and are instead tied to individual users: as the users varied, so did their incognito results.
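The two comparisons can be sketched with a simple set-overlap metric. The sketch below is purely illustrative: the function, the result lists, and the URLs are hypothetical and are not DuckDuckGo's actual methodology (the study compared full result pages, including ordering, which a set-based measure ignores).

```python
def jaccard(a, b):
    """Overlap between two sets of result URLs (1.0 = identical, 0.0 = disjoint)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Hypothetical result lists for one query (illustrative only).
logged_in   = ["nyt.com/a", "cnn.com/b", "wiki/gc", "nra.org/c", "vox.com/d"]
incognito   = ["nyt.com/a", "cnn.com/b", "wiki/gc", "nra.org/c", "bbc.com/e"]   # same user, incognito
incognito_2 = ["fox.com/f", "wiki/gc", "hill.com/g", "cnn.com/b", "msn.com/h"]  # different user, incognito

# Within-subject: logged-in vs incognito for one user. High overlap would
# suggest incognito is not neutralizing the user's data.
print(f"within-subject overlap:  {jaccard(logged_in, incognito):.2f}")   # 0.67

# Between-subject: two different users, both in incognito. Low overlap would
# suggest incognito results are not standardized across users.
print(f"between-subject overlap: {jaccard(incognito, incognito_2):.2f}")  # 0.25
```

On this toy data, the within-subject overlap (0.67) far exceeds the between-subject overlap (0.25), which is the pattern the study reports: results tracked the user more than the mode.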
Google's secret Search sauce under fire again
This is not the first time Google's approach to searching the internet and ranking results has come under fire. While Google's search engine is easily the most widely used, it also has the potential to exert massive influence, and that is a thin line for any search engine to walk. This was most prominently seen during previous election cycles, when Google was accused of political favoritism based on the way results were returned and ranked for searches on select politicians and political topics. The company has denied these claims and has repeatedly pointed out the various ways in which it has improved the service to make results fairer. The political bias accusations have not subsided, however, with fresh claims this year suggesting Search has an inherent left-leaning bias.
In fact, this latest study subtly draws on this point, as its methodology used politically charged topics such as "gun control" and "immigration" to generate the results it examined. Though the study does not comment on this specifically, the choice of keywords is interesting, especially considering how divisive they are. On the one hand, if Search has a left-leaning bias, as some suggest, this keyword selection would automatically produce less neutral results. On the other hand, even if the results are bias-free, the topics themselves are so divisive that they may affect how a natural ranking of results can be displayed: what constitutes the right ranking for such controversial keywords? This in itself may be a problem for Google's algorithm, as identifying authoritative results on a controversial issue might not be as easy as the study assumes, and could itself produce different results for different users. If so, that is certainly a problem, but it is not quite the same problem as personalized results penetrating incognito mode.
Google's user bias or biased study?
While the study is designed to highlight a user bias that is apparently always in play, regardless of whether the user has identified themselves to Google, it is open to criticism because of its author. DuckDuckGo is not an independent party here: it offers a search engine that directly competes with Google Search. What's more, the main selling point of DuckDuckGo's product is the level of privacy it offers, with the company keen to highlight that the searches it performs are not based on your personal data. The results of this study therefore chime directly with DuckDuckGo's pre-existing assumptions about how Google Search operates and why its own service is the better option for a privacy-conscious landscape.
Of course, this does not necessarily mean the results, or the study itself, are biased or cannot be taken at face value. The conclusions drawn, regardless of motive, are based on collected data and do raise valid points, although on their own they probably do not amount to confirmation that personal data is used in incognito mode. Instead, the results should be viewed as a starting point and a justification for further research on the topic, especially since they may not be as clear-cut as suggested. The study implies causation: that the presence of an identifiable user affects the results shown. That might not actually be what is happening. The data could, in theory, reflect other, less intentional factors. For example, the high within-subject consistency might be down to additional variables that were not accounted for (although DuckDuckGo does state it controlled for time and location), while the between-subject variation may be the result of natural fluctuations rather than a direct effect of personalization. Regardless of whether Google's Search technology is personalizing results, the very fact that they are not standardized in these instances is a concern.