RGS-IBG Annual International Conference 2015



36 Algorithmic Practices: Emergent interoperability in the everyday (2)
Affiliation: History and Philosophy of Geography Research Group
Convenor(s): Chris Speed (The University of Edinburgh, UK)
Eric Laurier (The University of Edinburgh, UK)
Monika Buscher (Lancaster University, UK)
Chair(s): Eric Laurier (The University of Edinburgh, UK)
Timetable: Wednesday 02 September 2015, Session 2 (11:10 - 12:50)
Room: Forum - Seminar Room 8
Session abstract: An ever-increasing proportion of the interactions that we have with digital platforms, apps and devices are mediated according to complex algorithms. Whether it be the real time analytics that draw us into playing a game on our phone, or tailored recommendations built from our historical searching and buying habits, we structure our daily lives in response to ‘performative infrastructures’ (Thrift, 2005: 224), most of them hidden deliberately by their makers.

Yet, in responding to the summons, the predictions, the recommendations, the help, the calculations that occur as platforms try to anticipate our next actions, we are learning how they work and don’t work. In our ad hoc assemblies of devices, apps and screens we short cut and re-make algorithms. For instance, in disaster response, ad hoc interoperability and agile response are creating incentives for ‘systems of systems’ that allow locally accomplished convergence of diverse information systems, with implications for data surge capacity as well as protection and privacy (Mendonça et al 2007).

Described as “a structured approach to real-time mixing and matching of diverse ICTs to support individuals and organizations in undertaking response”, emergent interoperability may be becoming commonplace in less dramatic daily practices as individuals negotiate the range of algorithms that “react and reorganize themselves around the users” (Beer 2009).

This panel contains papers and presentations that provide insight into the conditions and settings in which emergent operability and interoperability occur within society.
Linked Sessions: Algorithmic Practices: Emergent interoperability in the everyday (1)
Language in the Age of Algorithmic Reproduction
Pip Thornton (Royal Holloway, University of London, UK)
Using Walter Benjamin’s essay The Work of Art in the Age of Mechanical Reproduction (1936) as a theoretical springboard, this paper seeks to examine what happens to writing, language and meaning when processed by algorithm, and in particular, when reproduced through search engines such as Google. Reflecting both the political and economic frame through which Benjamin examined the work of art, as mechanical reproduction abstracted it further and further away from its original ‘aura’, the processing of language through the search engine is similarly based on the distancing and decontextualization of ‘natural’ language from its source. While all algorithms are necessarily tainted with the residue of their creators, the data on which search algorithms can work is also not necessarily geographically or socially representative and can be ‘disciplined’ (Kitchin & Dodge, 2011) by encoding and categorisation. Added to this is the underlying and pervasive power of commerce and advertising. When a search engine is fundamentally linked to the market, the words on and through which it acts become commodities, stripped of material meaning, and moulded by the potentially corrupting and linguistically irreverent laws of ‘semantic capitalism’ (Feuz, Fuller & Stalder, 2011). This paper therefore begins to question what is gained and what is lost when we entrust language to search engines, and will suggest that the algorithmic processing of data based on contingent input, commercial bias and unregulated black-boxed technologies is not only reducing and recoding natural language, but that this ‘reconstruction’ of language has far reaching societal and political consequences, re-introducing underlying binaries of power to both people and places. Just as mechanical reproduction ‘emancipated’ art from its purely ritualistic function, the algorithmic reproduction of language is an overtly political process.
Assembling spatio-algorithmic collectives while Cruising on Grindr
Michael Liegl (Lancaster University, UK)
Contemporary sociological research on intimate relations is specifically interested in the changes in forms and styles of intimate communication brought about by the use of New Media. More often than not, New Media are conceptualized as ‘parasites’ that threaten to undermine a sequential logic of intimate encounters which privileges the ‘pure interaction’ of two co-present bodies as a natural beginning of relationships. In this paper, I am concerned with algorithmically enabled hook-up practices through the smartphone app Grindr, which combines the possibilities of social network dating sites with geo-positioning technology. Using the built-in GPS of a smartphone to scan the geographic vicinity for other users and ranking their profiles according to proximity, Grindr creates an emergent ego-centric cruising space of ‘addressables’, which form a local collective of men interested in sexual encounters with other men. The collective is fluid in that it changes its personnel with the user’s movement. I am particularly interested in the (spatio-)algorithmic practices of assembling such fleeting collectives. These collectives, rather than simply being accessed through a smartphone interface, are produced and performed by the seeking individual with the help of the location algorithm. There are two kinds of algorithms involved in this context: proper location algorithms, which take the data from sensors and extrapolate location, and the algorithm which arranges the user’s location in relation to those of other users. The assemblage of local collectives relies on the affordances of the city, especially its heterogeneity, density and differentiation, together with its mostly advanced digital infrastructure, a nexus Kitchin and Dodge (2011) call Code/Space. This paper explores the nature and performance of such collectives as assemblages of ‘addressables’, aesthetic-affective landscapes or ‘faces of spaces’, and the algorithmic practices that produce them.
Describe in single words, only the good things that come in to your mind…
Mike Phillips (Plymouth University, UK)
Davide Marocco (Plymouth University, UK)
Christos Melidis (Plymouth University, UK)
Birgitte Aga (Plymouth University, UK)
Christopher Hunt (Plymouth University, UK)
This paper describes the design, construction, application and evolution of instruments and analytical tools framed as ‘Operating Systems’ (op-sy.com).
These projects employ a range of digital processes coupled with ethnographic practices, which can be described as a ‘techno-ethnography’. They challenge traditional methods for evaluating cultural and social impact, which are all too often based on quantitative measurements of parameters not directly related to the ‘quality’ of life itself, i.e. demographics that measure economic impact, etc.
Built around specific contexts, such as Architecture (Arch-OS), the environment (Eco-OS) and the body (Bio-OS), the Operating Systems act as ‘provocative prototypes’ which offer opportunities for creative interventions. Recently these activities have focused on the cultural sector with the production of a Social Operating System called ‘Qualia’ and its integration into projects such as Artory.co.uk.

These activities are currently being consolidated with new research to explore the formulation of a single cognitive model for subjective and objective experiences. The processes will endeavour to deliver a syncretic model for the incorporation of cultural evaluation data and a variety of data sets from the existing Operating Systems.
In addition to the more easily measurable metrics and indices, the authors are working on designs that will provide a more holistic approach to capturing the intangible impacts of social and cultural activity, such as mood, feelings, participation and engagement.

These more qualitative data sets have been harvested from Social Networking APIs (such as Facebook and Twitter) through sentiment analysis, location and flow (GPS tracking, CCTV footfall monitoring) and smile analysis, as well as the feedback and mood feeds from mobile phone apps (Qualia and Artory). The collection of this data is enhanced through the use of direct ‘push’ incentivisation and ‘gamification’ of user experiences.

The new analytical engine incorporates a range of data scraping and analytical tools, but uniquely incorporates analytical models based on modern integrative, sub-symbolic, computational techniques (Artificial Neural Networks, Self Organising Maps and Deep Learning Networks) that can innovatively integrate subjective and objective data, considering their temporal and predictive aspects, variety and quality, and correlations in purely statistical terms.
Technical debt and the role of auto-disciplinary algorithms in San Francisco’s digital media sector
Daniel Cockayne (University of Kentucky, USA)
It is commonly observed that algorithms (such as Google’s search) have both a regulatory function and the potential to monitor and direct the behavior of users. Within the sphere of algorithm and digital media production itself, algorithm-based applications and software are also used in an attempt to monitor the work undertaken by engineers and other workers. Based on interviews and participant observation-based fieldwork currently being undertaken with startup firms and workers in San Francisco’s digital media sector, in this paper I consider how algorithms are used in software and digital applications targeted inwards at the sector itself, performing an auto-disciplinary function that monitors worker performance through algorithmically mediated metrics. Framed discursively in terms of best practice, transparency, the maintenance of trust, and relationship building, I argue that algorithms mark and reform relations of power between workers. I examine the term ‘technical debt’, a discourse used in software engineering to refer to the deleterious consequences of deliberately poor software design for the purposes of pushing a product to an audience, client, or investor more quickly for financial gain. Getting into (and out of) technical debt is not something to be avoided altogether (which may be considered too time-consuming, expensive, or even impossible), but to be mitigated, managed efficiently, and kept within limits of acceptable debt. To conclude, I consider how auto-disciplinary algorithmic practices such as technical debt and its surrounding discourses may be indicative of a broader normative injunction to submit oneself to digitally mediated disciplinary practices.
"a cybernetic ecology where we are free of our labours"?: Practices of autonomy and heteromation
Sam Kinsley (University of Exeter, UK)
The stories we tell about algorithms in relation to labour are, frequently, binary: we are either the networked proletariat, blindly staggering into algorithmic servitude (e.g. the microwork of the crowd-sourced taxi system 'Uber' or Amazon's Mechanical Turk); or, we are the heirs of a new 'wealth of networks' (following Yochai Benkler), set to be relieved of monotonous work and freed to exercise creativity. These are, of course, familiar stories, which were told of the automation of mass production. At the heart of both stories is the attribution of a mythical autonomous agency to the unseen machine. As Lucy Suchman (2006) has argued, this is the enchantment of technologies through the obfuscation of the labour of production. However, neither the algorithm nor the worker is autonomous. Rather, they are bound into a quasi-autonomous system (what Rob Kitchin calls a 'coded assemblage') that distributes work tasks to a labour force and vice versa. The rules of such systems may be opaque, but they have authors (programmers, managers, service designers) and they are maintained, both of which involve labour. Not only, then, are the outcomes of algorithmic labour practices distributed, but so are their creation and maintenance. This is perhaps a muddying of 'immaterial labour', but is nevertheless a powerful illustration of the new kinds of proletarianisation identified by the philosopher Bernard Stiegler. To understand this algorithmic 'agencement', this paper argues that what is required is a rethinking of agency, not in terms of automation but in terms of what Bonnie Nardi has called 'heteromation': technical systems that push critical tasks to end users as indispensable mediators. The aim of this paper, therefore, is to interrogate this distributed performance and these practices of knowledge and work, the cost of which is arguably a loss of individual autonomy and skill, in the rise of a new 'proletariat' (following Stiegler) of technocultural 'composite workers'.