2018-2019 HTDTWT Seminar Ana Teixeira Pinto: FEEDBACK FORMS - from month to month


Seminar 3: January

Samsung’s Smart TVs come with a fine-print warning: if you enable voice recognition, your spoken words will be ‘captured and transmitted to a third party’, so you might not want to discuss personal or sensitive information in front of your TV. Even if voice recognition is disabled, Samsung will still collect your metadata – what you watch and when, including facial-recognition data – though you won’t be able to use the interactive features. The SmartSeries Bluetooth toothbrush from Oral-B, a Procter & Gamble company, connects to a brushing app on your smartphone, which keeps a detailed record of your dental hygiene. The company advertises that you can share such data with your dentist, though, in a privatized health market, it is more likely that the purpose of such technology is to share data with your insurance company.

The cultural logic of the information age is predicated on an inversion of the gaze: within this fusion of surveillance and control, the screen, as Jonathan Crary has noted, ‘is both the object of attention and (the object) capable of monitoring, recording and cross-referencing attentive behaviour.’1 Data processing – whose reach extends from the NSA, credit rating agencies, and health insurance providers to the sorting algorithms used by Google or Instagram – is predictive, modeling future actions on previous behaviour. As such, as Orit Halpern argued in her 2015 book Beautiful Data, data processing implies a model of temporality in which the past is a standing reserve of information, waiting to be mined.


mandatory reading:

Stalder, Felix. “The Fight over Transparency: From a Hierarchical to a Horizontal Organization.” Open! Platform for Art, Culture and the Public Domain. Nov 18, 2011. https://www.onlineopen.org/the-fight-over-transparency#contributor-bio-0

Additional Reading:

Jameson, Fredric. "Cognitive Mapping." In Marxism and the Interpretation of Culture, edited by Cary Nelson and Lawrence Grossberg. Urbana: University of Illinois Press, 1988.

Buck-Morss, Susan. "Envisioning Capital: Political Economy on Display." Critical Inquiry 21, no. 2 (1995).

Galloway, Alexander. "Are Some Things Unrepresentable?" Theory, Culture & Society 28, no. 7-8 (2011): 85-102.


Seminar 2: December

When, in 1913, John B. Watson gave his inaugural address at Columbia University, “Psychology as the Behaviourist Views It,”(1) he described psychology as a discipline whose “theoretical goal is the prediction and control of behaviour.” Strongly influenced by Ivan Pavlov’s study of conditioned reflexes, Watson aimed to anchor psychology firmly in the field of the natural sciences. By black-boxing mentation, or internal states, behaviourism could operate on observable behaviour alone, creating a psychology completely devoid of subjectivity.

Following Watson’s lead, American psychologists began to treat all forms of learning as skills—from “maze running in rats […] to the growth of a personality pattern.”(2) For the behaviourist movement, both animal and human behaviour could be entirely explained in terms of reflexes, stimulus-response associations, and the effects of reinforcing agents upon them. Burrhus Frederic Skinner researched how specific external stimuli affected learning using a method that he termed “operant conditioning”. While classical—or Pavlovian—conditioning simply pairs a stimulus and a response, in operant conditioning the animal’s behaviour is initially spontaneous, but the feedback that it elicits reinforces or inhibits the recurrence of certain actions.

Like Watson’s, Skinner’s method black-boxes the animal’s internal states to operate on observable behaviour alone. From this perspective an animal is just like a machine because it can be made to behave like a machine. But this equivalence also invites its reversal: producing an identical type of behaviour could also be construed as creating an identical being.

(1) This was the first of a series of lectures that later became known as the “Behaviourist Manifesto”.

(2) John A. Mills, Control—A History of Behavioral Psychology (New York: NYU Press, 1998), 84.


Seminar 1: November 

C: Will X please tell me the length of his or her hair?

– A. M. Turing, “Computing Machinery and Intelligence”, 1950

In 1950, Alan Turing described a thought experiment which became widely known as the ‘Turing Test’, though Turing himself termed it the ‘Imitation Game’. The Imitation Game was conceived to tackle the question of artificial intelligence, at that time known as machine intelligence, but the first experiment does not involve machines. Instead, Turing asks the reader to imagine two rooms, connected via computer screen and keyboard to a third room, in which the person acting as judge sits. In the first room one finds a man, in the second a woman; both are hidden from view but able to communicate via the computer terminal. The judge’s job is to determine which player is the man and which is the woman, whereas the woman’s job is to deceive the judge into misidentifying her as the male player. The second experiment involves a variation of the same game, this time replacing one of the players with a machine. Now the judge’s job is to decide which of the contestants is human. If the judge guesses wrong often enough, the computer must be a passable simulation of a human being and, hence, intelligent.

The Imitation Game is usually misunderstood as proof of the converse claim: that if a computer passes the test it must be a passable simulation of intelligence and, hence, human. Whilst the queer dimension of Turing’s thought experiment was overlooked, the gendering of technology became a central theme in pop culture and pulp science. In spite of the feminist appropriation of the cyborg body as a means to depart from gender dichotomies, the cyborg became a reactionary figure (humanoid robots or commercial personifications of AI are gendered according to traditional roles: military technology is male, service technology female) predicated on the relation between the male gaze and the female body rather than on a double articulation of difference – sexual difference and machinic difference.


mandatory reading:

Hayles, N. Katherine. “Boundary Disputes: Homeostasis, Reflexivity, and the Foundations of Cybernetics.” Configurations 2, no. 3 (1994): 441-467.

recommended reading: 

Halpern, Orit. “Inhuman Vision.” Journal of the New Media Caucus (Fall 2014). http://median.newmediacaucus.org/art-infrastructures-information/inhuman-vision/.

Jackson, Zakiyyah Iman. “Outer Worlds: The Persistence of Race in Movement ‘Beyond the Human’.” GLQ: A Journal of Lesbian and Gay Studies 21, no. 2-3 (2015): 215–46.

Halpern, Orit. “Schizophrenic Techniques: Cybernetics, the Human Sciences, and the Double Bind.” S&F Online 10, no. 3 (2012). http://sfonline.barnard.edu/feminist-media-theory/schizophrenic-techniques-cybernetics-the-human-sciences-and-the-double-bind/.

Theweleit, Klaus. Male Fantasies. Minneapolis: University of Minnesota Press, 1987.

