Rethinking freedom in the algorithmic age

How was your life this year? Do you want to look back at what you have achieved or lost? Some of us might, but I believe many of us have moments we would rather not relive. Yet even those who want to leave such moments behind may not be able to escape them in the new age of algorithms. In 2014, Facebook incorporated the app “Year in Review” into its social network. The app was designed to help users relive their favorite memories by compiling a selection of highlights of their year, mostly photos pulled from their profiles, into a video. Eric Meyer, despite seeing the videos created by others popping up in his Facebook News Feed, avoided making one of his own (Meyer). Then, one afternoon, the app showed him what his year looked like, built around a single featured photo. There, at the top of his News Feed, surrounded by celebratory animations, was the face of his daughter, who had died that year.

The app’s algorithms were designed to push a picture into the user’s timeline to urge them to use the app. The pictures were probably selected algorithmically based on their interaction scores, and the code was clearly not designed with all scenarios in mind. However, the aim of this essay is not to denounce algorithms or programmers, but to reflect on user freedom on the world wide web. Meyer’s story is painful, yet it shows how important it is to ask, and to try to answer, “What is the nature of freedom in the algorithmic age?” This essay draws on the conceptual and historical work of the philosophers Michel Foucault, Gilles Deleuze, and Isaiah Berlin to theorise personal freedom through the analysis of several digital platforms. Foucault’s and Deleuze’s readings of disciplinary power and the society of control are anticipatory and reflect our realities today; however, it is also important to approach them from the perspective of freedom. In what follows, I first establish the foundation of the analysis by defining algorithms and reviewing the concepts of freedom and neoliberalism, then analyse the key dimensions of freedom, and finally conclude with what freedom is when algorithms are at work.

Algorithms

Algorithms are systems or processes consisting of sets of rules and grammars of action that describe how to perform a task. While algorithms have no consciousness of their own, they are by no means unbiased or neutral (Cheney-Lippold 166). Ultimately, some bias in design is inevitable. The “architects” (programmers, product owners, project managers, etc.) either deliberately or unintentionally embed their own ideology, their own perception of the world, or their own ways of doing things into the system, and these choices are usually opaque to users unless the system is open source. Moreover, algorithms, especially self-learning machines, need initial data to train on. The historical data fed to the algorithms inevitably embeds historical practices and patterns; algorithms simply pick up on these patterns and propagate them (Agre 745). Going further, the operation of personalisation algorithms is a constant feedback loop that continuously aggregates user data, such as consumption behavior, to classify users into dynamic categories and then modulate their online experiences (Cheney-Lippold 168).
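To make this feedback loop concrete, the following is a minimal sketch in Python of how such a personalisation cycle might operate. It is an illustration of the general mechanism Cheney-Lippold describes, not the implementation of any real platform; the event weights, category names, and scoring are invented for the example.

```python
from collections import defaultdict

# Illustrative interaction weights; these values are assumptions for the
# example, not taken from any real platform.
EVENT_WEIGHTS = {"view": 1.0, "like": 3.0, "share": 5.0}

class PersonalisationLoop:
    """A toy feedback loop: observe behaviour, update the user's category
    scores, then use those scores to filter what the user sees next."""

    def __init__(self):
        # Dynamic categories inferred from behaviour.
        self.category_scores = defaultdict(float)

    def observe(self, event_type, content_categories):
        """Aggregate a single interaction into the user's profile."""
        weight = EVENT_WEIGHTS.get(event_type, 0.0)
        for category in content_categories:
            self.category_scores[category] += weight

    def rank(self, candidate_items):
        """Modulate the experience: order candidate content by how well it
        matches the categories the user has been sorted into."""
        def score(item):
            return sum(self.category_scores[c] for c in item["categories"])
        return sorted(candidate_items, key=score, reverse=True)

# Each observe() -> rank() cycle narrows what is shown, which in turn shapes
# the next round of behaviour that gets observed.
loop = PersonalisationLoop()
loop.observe("like", ["travel", "photography"])
feed = loop.rank([
    {"id": 1, "categories": ["travel"]},
    {"id": 2, "categories": ["politics"]},
])
print(feed)
```

The point of the sketch is the circularity: what the loop has already observed determines what it shows, and what it shows determines what it can observe next.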

I agree with Foucault that a conception of power should not be simply negative or juridical, and that algorithms can be analysed as a technology of power that is productive because it operates by producing knowledge and desire (Foucault, “The Meshes of Power” 154). An important note here is that in Foucault’s concept of power, knowledge is inseparable from power (Foucault, Discipline and Punish 27). Algorithms indirectly produce knowledge of us by capturing and analysing our activities and by categorising us. Based on this knowledge, algorithms show us advertisements they predict we will find relevant and thereby create desire for the products in those advertisements. They also produce new forms of desire at a higher level, such as the desire to expose ourselves and to see others expose their lives and thoughts, or the desire to create an online persona, as in the case of social networks. If we analyse the power of algorithms critically, we can understand its relationship with human freedom and then contemplate the dimensions of that freedom. Before conceptualizing freedom in the algorithmic context, it is useful to review and reflect on the two concepts of liberty identified by the liberal philosopher Isaiah Berlin in 1958.

The two concepts of liberty/freedom

Negative freedom, advocated by classical liberals such as John Locke, John Stuart Mill, and Benjamin Constant, is freedom from outside interference (Berlin 371). The concept of negative freedom can be summarized as: “I am no one’s slave. Freedom is my ability to operate within a certain sphere where no one else interferes with me.” The classical liberal thinkers believed that there should be a minimum range of personal freedom and a clear division between the public and the private sphere (371). “Freedom in this sense is not, at any rate logically, connected with democracy or self-government” (373). Governing in the sense of negative freedom therefore does not attempt to model and impose an overarching sense of the common good, but to make sure that one’s autonomy is free from constraints.

Positive freedom, advocated by socialists and some liberals, can be summarized as: “I am my own master. Freedom is my ability to achieve my goals regardless of interference” (373). The concept of positive freedom entails being a rational, active, and responsible being. But human beings can also be passionate, irrational, and ignorant. Democratic optimists such as Fichte and T. H. Green therefore believed that democratic governing should involve creating social conditions that would provide each individual with the means to exercise their free will. It became legitimate, in their view, to impose constraints or obligations on individuals so that higher forms of freedom could flourish (381). Berlin rejects this argument, which seems to defend authority and has been used by those who needed justification for imposing their doctrines on society (382). For Berlin, those who impose positive freedom may in fact act as enemies of personal freedom.

Neoliberalism adopts the strict definition of negative freedom and maintains that individuals are free to pursue their self-interest and participate in a competitive market (Read 5). It should be noted that neoliberalism rejects the positive-freedom tradition and advocates reducing the role of government to a minimum. Neoliberal governmentality operates through choice architecture: shaping environments and the rules of the game so as to intensify individual freedom and responsible choice (Thaler and Sunstein 6).

Self-determination and Interference

What the two concepts of freedom have in common is self-determination: in both, one must ultimately be able to exercise one’s free will. It is therefore important to analyse the self-determination of individuals in the algorithmic context. To understand how our self-determination takes place in the digital space, we can examine Google’s search algorithm, of which PageRank is the best-known component. Google explains its search algorithms as follows: when a search query is entered into Google Search, the algorithms look for clues (such as the user’s search history) to understand what the user means, rather than just the literal content of the query. They then match the query with information on web pages and use a formula to decide how relevant each page is to what the user is looking for. They also examine different aspects of the web pages to decide the order of the search results. Within this system, our knowledge is, to some extent, dictated by the various signals the algorithms take into account. Furthermore, the algorithms keep evolving to better understand what we mean, sometimes better than we know ourselves. For example, if you search for “how to go to Schiphol Airport”, you see a map with directions, not just links to other sites. By optimizing the search results to best match users’ search intentions and deciding what is more relevant to the query, the algorithm becomes a shaper of knowledge and therefore influences users’ self-determination.
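The following Python sketch illustrates this idea in a deliberately simplified form: an ordering produced by blending query relevance with a separate page-importance signal (in the spirit of PageRank). It is not Google’s actual algorithm; the example pages, their scores, and the 0.7/0.3 weighting are assumptions made purely for illustration.

```python
# A simplified ranking sketch: combine "relevance to the query" with a
# page-importance signal into a single score, then sort by it.
# All data and weights below are invented for the example.

PAGES = [
    {"url": "schiphol.nl/directions",
     "text": "train and bus directions to schiphol airport",
     "importance": 0.9},
    {"url": "blog.example/trip",
     "text": "my trip to schiphol last summer",
     "importance": 0.2},
]

def relevance(query, page_text):
    """Fraction of query terms that appear in the page text."""
    terms = query.lower().split()
    return sum(term in page_text for term in terms) / len(terms)

def rank(query, pages, w_relevance=0.7, w_importance=0.3):
    """Order pages by a weighted blend of query relevance and importance."""
    def score(page):
        return (w_relevance * relevance(query, page["text"])
                + w_importance * page["importance"])
    return sorted(pages, key=score, reverse=True)

results = rank("how to go to schiphol airport", PAGES)
print([page["url"] for page in results])
```

Even in this toy version, the choice of signals and weights, which users never see, decides which page counts as the “relevant” answer.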

Optimizing for success is at the core of these algorithms, and this is most evident in the recommendation engines that have become ubiquitous. Netflix, Amazon, and YouTube gently suggest content or products they think we will like. While one can argue that users are free to choose what they watch on YouTube and Netflix or what they buy on Amazon, the space to explore options other than those presented to us is clearly very limited. In a letter to shareholders in April 2015, Jeff Bezos, CEO of Amazon, declared that the company generates “a steady stream of automated machine-learned nudges (more than 70 million in a typical week)” and that “these nudges translate to billions in increased sales to sellers” (Mac 17). It is hard to imagine that these 70 million weekly nudges do not somehow manipulate consumers’ self-determination, and this manipulation is a subtle form of interference with personal freedom.
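To see how such nudges can be generated at scale, here is a minimal Python sketch of the general mechanism: for each user, surface the highest-affinity item they have not yet bought. This is not Amazon’s machine-learned system; the users, items, and affinity scores are hypothetical and chosen only to illustrate the idea.

```python
# A toy "nudge" generator. The affinity scores stand in for predictions
# a real system might learn from past behaviour; here they are made up.

AFFINITY = {
    "user_1": {"kettle": 0.91, "novel": 0.40, "headphones": 0.75},
    "user_2": {"kettle": 0.10, "novel": 0.88, "headphones": 0.30},
}
ALREADY_BOUGHT = {"user_1": {"kettle"}, "user_2": set()}

def nudge(user_id):
    """Pick the not-yet-purchased item the model predicts the user wants most."""
    candidates = {
        item: score
        for item, score in AFFINITY[user_id].items()
        if item not in ALREADY_BOUGHT[user_id]
    }
    if not candidates:
        return None
    return max(candidates, key=candidates.get)

# Run across millions of users per week, suggestions like these quietly
# shape which options people ever see, and therefore what they choose.
for user in AFFINITY:
    print(user, "->", nudge(user))
```

The design choice worth noticing is that the user is never shown the full catalogue of options, only the single item the system predicts will convert.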

Control and Freedom 

In this final section, I want to discuss the relationship between control and freedom. From a Foucauldian perspective, freedom and control are not opposed to each other; rather, they are mutually constitutive. In other words, they cannot exist without one another. “Power is exercised only over free subjects” (Foucault, “The Subject and Power” 221). People are controlled through freedom, not in conflict with freedom. This is certainly true in the algorithmic age, which bears a strong neoliberal influence. For example, Google’s ad auction lets the market set the price of advertising, and the dating website OkCupid finds your match by analysing your answers to the site’s survey questions.

GPS technology improves our ability to find our way in an unknown location and provides a means to exercise our free will in a strange environment. On the other hand, it also takes away our freedom in opaque ways: our movements become data that we give away in exchange for directions, and we come to depend on the directions the technology gives us, losing some of our autonomy in the process. This example illustrates the nuances of control and freedom well. We are left with as much freedom as possible, because any attempt to overtly control the subjects (for example, social media users) would undermine the system’s ability to study them. At the same time, algorithms have optimized the environment not only to ensure that users benefit from the services but also to accommodate the goals of the platform owners.

Another example: Google’s search results are not the same for everyone, but depend on various factors, including location. When someone in Germany searches for “4th of June 1989”, they are shown the historical event “Tiananmen Square protests of 1989”. By contrast, the same query on the same platform in China yields different results; for instance, “famous birthdays” appeared at the top of the list.

As individual autonomy increases, algorithmic control can be extended alongside it. Such practices rest on as little coercion as possible, so that one can exercise one’s freedom according to current social norms. Without forcefully imposing the current norms of freedom, civility, and industriousness on individuals, platform owners can act as public arbiters of social values and knowledge through the use of algorithms.

Conclusion 

The exercise of power and freedom in the West has taken new forms. Deleuze anticipated and explained the workings of control in his essay “Postscript on the Societies of Control”, which is a foundation of this essay. But what about freedom? In the algorithmic context, we exercise a form of negative freedom, though not in its strict definition, as we are free to act and to choose within the options filtered for us on the basis of our previous actions. Power is embedded in the algorithms, with economic and/or political incentives behind it. This is not to say that we are suffocated by the algorithmic mechanism of control. Rather, algorithms enable a kind of freedom that I would call “filtered freedom”: we are free from constraints and have a sense of full self-determination, yet we remain subject to surveillance and potential manipulation.

Ideally, this essay would have provided a more comprehensive review of the debate on freedom by including the work of Nikolas Rose, who challenged Foucault’s view of freedom. Further research on freedom in the algorithmic age could take up that work and explore the ethical value of this freedom. Moreover, in any society resistance to control always exists; it would therefore be important to research how algorithmic forms of resistance affect our understanding of freedom.

 

References

Agre, Philip E. “Surveillance and Capture: Two Models of Privacy.” The Information Society 10.2 (1994): 101-127.

Berlin, Isaiah. “Two Concepts of Liberty.” The Idea of Freedom. Oxford: Oxford University Press, 1979. 175-93.

Cheney-Lippold, John. “A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control.” Theory, Culture & Society 28.6 (Nov. 2011): 164-81.

Deleuze, Gilles. “Postscript on the Societies of Control.” October 59 (Winter 1992): 3–7.

Foucault, Michel. Discipline and Punish: The Birth of the Prison. Trans. Alan Sheridan. New York: Vintage Books, 1995.

Foucault, Michel. “The Meshes of Power.” Space, Knowledge and Power: Foucault and Geography. Eds. Jeremy W. Crampton and Stuart Elden. Aldershot: Ashgate, 2007. 153-162.

Foucault, Michel. “The Subject and Power.” Critical Inquiry 8.4 (1982): 777-95.

Google. “How Google Search Works | Search Algorithms.” Google Search, https://www.google.com/search/howsearchworks/algorithms/. Accessed 24 Oct. 2017.

Mac, Ryan. “Jeff Bezos’ Letter To Shareholders: ‘Don’t Just Swipe Right, Get Married (A Lot).’” Forbes (2015), https://www.forbes.com/sites/ryanmac/2015/04/24/jeff-bezos-letter-to-shareholders-dont-just-swipe-right-get-married-a-lot/. Accessed 25 Oct. 2017.

Meyer, Eric. “Inadvertent Algorithmic Cruelty.” Thoughts From Eric, 24 Dec. 2014, http://meyerweb.com/eric/thoughts/2014/12/24/inadvertent-algorithmic-cruelty/.

Read, Jason. “A Genealogy of Homo-Economicus: Neoliberalism and the Production of Subjectivity.” Foucault Studies 6 (Feb. 2009): 25-36.

Thaler, Richard H., and Cass R. Sunstein. Nudge: Improving Decisions About Health, Wealth, and Happiness. New Haven and London: Yale University Press, 2008.
