Algorithmic Cultures

Organizers
Robert Seyfert, Konstanz; Jonathan Roberge, Québec
Location
Konstanz
Country
Germany
Date
23.06.2014 - 25.06.2014
Report by
Nicolai Alexander Ruh, Department of Sociology, University of Konstanz

The interdisciplinary conference “Algorithmic Cultures”, organized by Robert Seyfert (Konstanz) and Jonathan Roberge (Québec), addressed issues concerning the ubiquitous emergence of algorithms in diverse realms of social life. The plural “cultures” in the conference title was chosen to emphasize both the increasing importance of algorithms as a new epistemic and organizational paradigm in manifold social spheres and the multifaceted analysis of algorithms in various fields of research. The goal was not just to understand different algorithms in their different settings but also to find commonalities among them that could be useful for further scientific reflection.

In his talk, LUCAS D. INTRONA (Lancaster) focused on the disempowering quality of algorithms in the becoming of what he termed “the impressionable subject”. Through the case study of real time bidding in online display advertising, Introna described the “dance of agency” that takes place between internet users and the machine learning algorithms that are fed with the digital information users leave behind while browsing the web. According to Introna, this socio-technical entanglement can be characterized as a performative conversion of the subject. By tracking and responding to an individual’s online behavior, machine learning algorithms aim at channeling the user’s attention. Along with the insight that advertisers themselves are the product of this dance of agency, Introna discussed questions of responsibility and challenged the justification strategies used by the advertising industry to outsource (ethical and moral) responsibility to allegedly neutral algorithms.

DOMINIQUE CARDON (Paris) took a closer look at the internal logic of algorithms in order to understand their moral and political effects on the structure of the web. Through an analysis of Google’s PageRank, social network metrics, and machine learning algorithms, he revealed the different organizational principles and statistical epistemologies behind the multiple web metrics. Cardon highlighted the fact that the calculations of the web, which are carried out from different angles (beside, above, inside, and below), are meant to access meanings produced on the web. The particular perspectives that result from such calculation foster different ideas that at times run counter to democratic claims made about the web. Google’s PageRank algorithm, for example, introduces a meritocratic element in its implementation of the idea of a patron. This can be understood as an attempt at establishing a form of instrumental objectivity by replacing the bottom-up ratings of users with meritocratic top-down ratings, where algorithms compare texts with texts in order to separate good links from bad links.

ESTEVE SANZ (New Haven) offered a “performative critique” of five algorithmic cultures through their underlying ontologies. Sanz compared Google to an “existential therapist” whose task is to provide its users with ontological security. To make his point, he introduced a temporal element into the discussion by distinguishing between “authentic time” and “vulgar time”, with algorithms being “the quintessential contemporary instantiations” of the latter. He illustrated the pervasiveness of algorithms in our lives by examining the metaphysical assumptions on which the delusion of technical determinism is founded. Algorithms thereby function as time-based calculated promises that have the seductive quality of alienating people from authentic (sequential) time, while at the same time offering a therapeutic solution to overcome this estrangement. For example, “posteriority algorithms” – the idea that the web does not forget – seduce people into a denial of death, in the sense that these algorithms democratize the notion of not being forgotten by virtue of the digital artefacts one leaves on the web.

JONATHAN ROBERGE (Québec) continued the performative critique by analyzing how the web is semantically organized. He observed a new kind of performativity involved in the semantic structuring of the web, characterized by the bending of numbers and letters through the use of machine learning algorithms. Here, algorithms function as a means to cope with complexity by outsourcing disambiguation to numeric tools. According to Roberge, the fragmented environment in which the algorithms operate intensifies the efforts of disambiguation, because the same term can carry different meanings and different meanings can be expressed by the same term. Subsequent attempts at disambiguation would create performative loops that increase the complexity and thereby the ambiguity of the net. Such loops are disguised by efforts to “clean up” a “messy” net, for instance, by shifting from the categorization of key words to the organization of spoken language. He concluded that cultural sociology should focus more on deciphering this rhetoric of cleansing by unveiling how the algorithmic indexing of the web creates the Internet as an empty signifier.

JOSEPH KLETT (New Haven) described the loss of meaning and quality that arises when sensorial experiences are transformed into data by algorithms. He focused on the phenomenological question of how algorithms shape the way we experience the world. As a case study, he chose the analysis of digital signal processors (DSPs). In this new technology, audio engineers encode layers of meaning into algorithms in order to make a certain auditory experience identically sharable with listeners who are situated in different parts of a room. Sound sources therefore become objectified, so that they may be personalized for the particular listener. According to Klett, cultural assumptions are implicitly mobilized in the programming process of these algorithms (e.g., the idea that acoustic experiences are a bodily and not a mental phenomenon). Echoing Roberge, Klett came to the conclusion that algorithms are too trivial, and not sophisticated enough, to capture the “deeper, aesthetic or moral dimension” that is entailed in the experience of listening.

SHINTARO MIYAZAKI (Aarau) also addressed the loss of quality that takes place when a physical phenomenon is algorithmically transformed into sequential step-by-step instructions. For deeper insights into the analysis of algorithmic cultures, he suggested introducing the concept of the “algorhythm”, which combines the ideas of algorithm and rhythm. Attending to rhythm, understood as an “elementary movement of matter, bodies and signals”, would allow research to gain new insights into both the time-related and the material aspects of contemporary digital culture.

ELIZABETH VAN COUVERING’s (Karlstad) research was inspired by the observation of a different sort of resistance to Google’s and Facebook’s market acceptance – namely, that of state actors in Russia and China. Drawing on Lefebvre’s concept of the social construction of space, she theorized that the spread of these US-American companies from the hegemonic center to the periphery is a specific form of US-American imperialism. Van Couvering developed the idea that web companies export an augmented space organized around certain US-American values, a space that is layered atop foreign nation states. In the cases of Russia and China, these dynamic virtual spaces, which favor ideas of Western market capitalism, encounter resistance from systems of strong state control.

Elaborating on empirical work on the effects of recommendation systems in the field of restaurant reviews, JEAN-SAMUEL BEUSCART (Paris) analyzed the performative quality of algorithms and the ways in which they may transform, constrain, manipulate, narrow or broaden our tastes and cultural experiences. According to Beuscart, online reviews differ qualitatively from classical restaurant reviews and challenge the conventions of quality that restaurant managers had previously internalized as legitimate. Restaurant managers observe their market and manage their business by paying close attention to online recommendations. Like Cardon, Beuscart observed the undermining of the democratic claims made about the web: in order to model online recommendation systems after classic evaluation criteria, restaurant evaluation algorithms are manipulated in a way that attaches weight to some user ratings while ignoring others.

YUVAL MILLO (Leicester) described the sophisticated historical and socio-technical processes that led to the transition from traditional exchanges to electronic markets. Millo showed how complex interactions between regulatory discourses and technological materialities led to a reconfiguration of competition among U.S. exchanges. He observed an ontological change in political regulation that favored the automation of exchanges by connecting buyers and sellers through matching algorithms. In this case, algorithms functioned as a means to cope with the increasing complexity of a rapidly growing market. Instead of treating the algorithmization of markets as a logical consequence of dealing with complexity, Millo reconstructed a path-dependent history in which the idea of applying data principles to markets gradually evolved.

ANN-CHRISTINA LANGE (Copenhagen) further advanced our understanding of market algorithms beyond the common image of dehumanized, neutral black boxes. Her presentation investigated the relation between crowd dynamics and financial markets in light of the recent development of algorithmic and high-frequency trading techniques. Highlighting the complex interdependencies between different financial actors and instruments, Lange emphasized that algorithms are not neutral techniques but are implicated in a changing psychology of markets. By evoking positive and negative feedback loops, algorithms do not eliminate human bias; rather, in their imperfection and the constant tweaking they require from human actors, they give rise to new forms of sociality.

ROBERT SEYFERT (Konstanz) further deconstructed the bifurcation between neutral algorithms and emotional humans. He demonstrated that fully automated trading in fact intensifies complex affective human-machine relations through different frequential entanglements between human actors and trading algorithms. According to Seyfert, fully automated trading floors that operate virtually in real time create an environment that requires permanent human surveillance. The algorithms therefore engage the complete sensuous attention of their human supervisors, resulting in a symbiotic relationship between the two so fundamental that a decoupling would be experienced by the trader as a trauma.

VALENTIN RAUER (Frankfurt am Main) related his definition of algorithms to Bruno Latour’s concept of interobjectivity, portraying algorithms as intermediaries “between engines, cameras and other digital devices”. The increase in interactions between objects and the exclusion of human actors is accompanied by the outsourcing of responsibility: there is no subject left that can be addressed. Furthermore, loops of prescriptions and benchmarking allow for a flexibility that creates a surplus of meaning, making digital technologies adaptable to multiple purposes (e.g. drones).

OLIVER LEISTERT (Paderborn) analyzed the discourse on social bots, software programs that control social networking accounts and are able to execute and react to social actions, in an environment whose existence relies on the monetization of consumer data. Leistert pointed to the high acceptance rate of social bots among users of social networking sites and argued that, in an algorithmically operating environment, there is no difference between a bot and an avatar. His talk attempted to clarify the qualities that participants in this discourse ascribe to these software programs.

The conference closed with a keynote address by TARLETON GILLESPIE (Ithaca) on the production of calculated publics by algorithms. Gillespie pointed out that mapping public attention through trending is not a new phenomenon and that it is necessary to see the actual manifestation of this instinct in its historical context in order to understand how algorithms “are relevant to our collective efforts to know and be known”. In his keynote, he formulated a critique that had been raised repeatedly over the course of the conference: that we ascribe meaning to algorithms beyond what their programmers intended, credit them with too much explanatory power, and are tempted to reduce social problems to computational terms in order to solve them by computational rather than social means.

The lesson of the conference is clear: more in-depth analyses of computational cultures and algorithms by disciplines across the Humanities and Social Sciences are necessary for a more complex and nuanced understanding of their implications and consequences. This is all the more urgent as advocates of algorithmic epistemologies, such as Google’s Eric Schmidt, elevate the outcomes produced by machine learning software to the sphere of pure knowledge, seemingly unbiased by human interests. The conference presenters demonstrated this renewal of a positivistic approach in their discussions of the different fields in which algorithms are applied: the positivism was either implicit in the algorithms’ functional logic or explicitly formulated by their apologists. Uncovering the cultural principles that underlie algorithmic ontologies must be considered a fundamental tool in formulating an in-depth critique of these positivistic epistemologies, which hide their “impurity” and systemic inconsistencies behind their complexity.

Conference Overview:

Lucas D. Introna (Lancaster), Learning algorithms and the sociomaterial production of the impressionable subject: the case of real time bidding in online display advertising

Dominique Cardon (Paris), Objectivity, benchmark and predictivity: Three statistical worlds embedded in web metrics

Esteve Sanz (New Haven), Five algorithmic cultures and their ontologies: a performative critique

Jonathan Roberge (Québec), From numbers to letters and back: On the algorithmic construction of a semantic web

Joseph Klett (New Haven), The reflective algorithm

Louis Melancon (Montreal), Project Glass: a Google incursion into algorithmic culture

Elizabeth van Couvering (Karlstad), Capitalist algorithms abroad: Google and Facebook in China and Russia

Jean-Samuel Beuscart (Paris), Do algorithms shape our tastes? A Sociology of online recommendation systems

Yuval Millo (Leicester), Where do electronic markets come from? Regulation and the transformation of financial exchanges

Ann-Christina Lange (Copenhagen), Algorithmic trading and swarm theory

Robert Seyfert (Konstanz), Intensified socio-artificial interactions: affects in algorithmic trading

Shintaro Miyazaki (Aarau), The concept of ‘algorithmic agencement’: media archeological inquiries to computational cultures

Valentin Rauer (Frankfurt am Main), Interobjective algorithms: The case of security infrastructures

Oliver Leistert (Paderborn), The botherder’s lure and the figure of the pirate: trespassing enclosures and tweaking affordances

Tarleton Gillespie (Ithaca), #Trendingistrending: a look at algorithmic measures of public discourse

