Using/abusing fembots


When I was seven, a classmate was given a walking, talking doll by her father. With its arms outstretched like a zombie, it walked stiff-legged towards you, droning ‘Mama’ repeatedly.

I was equally curious and repelled. My friend wouldn’t let me open it up to see how it worked. I knew the doll was not alive, but to me it transgressed the boundaries of how dolls should behave. I imagined her following me through an empty house calling ‘Mama … Mama’ – this thing that seemed to long to be held, but was too unyielding for anyone to reciprocate.

One explanation for my anxiety is Masahiro Mori’s ‘uncanny valley’ hypothesis, which holds that when things like robots or dolls look and move almost – but not exactly – like people, ‘it causes a response of revulsion among some observers’.

Despite the creepiness sometimes elicited by dolls, many of us have an urge to connect with them. Some people even want to be dolls. A couple of years back I came across a website that features images of young Asian women in provocative poses. One young woman, Wang Jia Yun, posed in a pink floral bedroom with soft toys. She resembled a manga character – huge round eyes and a tiny chin – but claimed her appearance was natural. Her (mostly male) fans became upset at suggestions her images had been manipulated:

[I]f you like someone like everything about her, dont become someone who has high expectation for someone and when you got disappointed because some minor issue you’re switch go to the opposite way and bash/flame/troll her! P.S she’s mine.

Yun has attained minor celebrity status as a warped sex symbol. She is not alone. There is Venus Palermo, a London-based teenager who uses thick make-up and contact lenses to imitate the look of Japanese ball-jointed dolls. Her YouTube make-up tutorials have been viewed millions of times, with commenters ranging from grateful teenagers to leering middle-aged men and trolls. Palermo dresses more like a baby doll than a Barbie doll.

One of the strangest features of both Palermo and Yun is their eyes, windows to their doll souls, which are enhanced by large-irised contact lenses known as circle lenses. Banned in many parts of the world, including the US and some Australian jurisdictions, circle lenses became popular after a YouTube make-up tutorial in which Michelle Phan demonstrated how to get ‘crazy, googly Lady Gaga eyes’; the video has been viewed more than 54 million times.

Perhaps these living dolls could be described as cyborgs, their illusions mediated through technology.

Robots are, of course, another kind of living doll. One of the first cinematic depictions was the seductress ‘Maria’ in the 1927 film Metropolis. For as long as we have been imagining intelligent humanoid machines, we have been imagining them as sexualised females. From Apple’s iPhone assistant Siri to the mechanised attendants at Japan’s first robot-staffed hotel, a disproportionate percentage of artificial intelligence (AI) systems have female personas. These AIs tend to perform jobs that are traditionally associated with women: they are maids, personal assistants, museum guides and so on.

In 2012, Hiroshi Ishiguro, a robot designer at Japan’s Osaka University, displayed a robot with sixty-five facial expressions. Named ‘Geminoid F’ – the ‘F’ stands for female – the robot can smile, frown and move her mouth. She can also talk and sing, either through recordings or by mouthing other people’s voices. Professor Ishiguro has designed several robots made to look like humans, including one in his own image, and believes that one day soon robots will pass the Turing test. ‘What is a human?’ he once asked. ‘Please define, and we will make a copy.’ Geminoid F can talk but she can’t walk, and she looks just as human as Wang Jia Yun.

I’m not concerned by the development of increasingly humanlike machines. I don’t fear people losing their ability to discern fact from fiction, real from unreal, alive from animated. I think people are very good at suspending disbelief for fantasy play. What does interest me is how gender is performed through technology. We have a long history of fetishising dolls – they are the almost-human, passive objects we can do anything to.

‘Doll girls’ can be seen as a continuation of an old tradition. A popular doll fetish in the nineteenth century was the automaton. Julie Wosk notes how, in the literature of the time, ‘automatons became central metaphors for the dreams and nightmares of societies undergoing rapid technological change’.

Auguste Villiers de l’Isle-Adam’s Tomorrow’s Eve (1886) marks a point in literature where the distinctions between machine and human became a question of gender and class. But this is not the beginning of the story: Eve and her automatic doll sisters can be traced back to fifteenth-century Europe, if not earlier. Indeed, similar ideas permeate the myths of the ancient world, such as those of China, Egypt and Greece. There is, for example, the story of Pygmalion, who fell in love with his own statue. We could even trace this lineage back to Lilith, the disobedient first wife of Adam who thought herself his equal. Created from clay – just as her husband was – Lilith and Adam were the first golems. In Jewish legend, she was discarded for Eve, a (slightly) more obedient wife made from Adam’s rib. While not manmade, Eve is made of man.

Villiers’ Lilith is Miss Alicia Clary. ‘Take one inventive genius indebted to the friend who saved his life’, the blurb on Tomorrow’s Eve reads, ‘add an English aristocrat hopelessly consumed with a selfish and spiritually bankrupt woman; stir together with a Faustian pact to create the perfect woman – and voila!’

Although Villiers’ inventor is named Edison and the book is fictional, the real Thomas Edison did, in fact, invent a talking doll that ‘spoke’ via a tiny phonograph hidden inside its body. Edison commented in a 1912 Good Housekeeping article that electricity ‘will develop woman to the point where she can think straight’.

What is it to ‘think straight’? Is it to think like a man, or a machine? Or is it to be compliant?

Without doubt, there are good dolls and there are bad dolls. Good dolls do what is expected of them: they behave, follow the rules. Bad dolls do not. Science fiction has given us two gynoid tropes: the passive fembot and the evil seductress.

There is the fantasy of the fembot as ideal sexual-domestic servant – think Ira Levin’s novel The Stepford Wives (1972) or Steve De Jarnatt’s film Cherry 2000 (1987). An earlier version appears in Lester del Rey’s 1938 short story ‘Helen O’Loy’, in which a mechanic and a medical student create the perfect woman by reprogramming a housemaid robot into a faithful and adoring wife.

Then there is the fear of – and desire for – powerful women cyborgs, as in CL Moore’s 1944 short story ‘No Woman Born’, or the sexualised, dangerous women in many cyberpunk novels, such as Neuromancer’s Molly Millions (1984). Destructive cyborg women are also central to the films Blade Runner (1982) and Eve of Destruction (1991).

Compliant robot women are desired and idealised. Non-compliant robot women are to be feared, hated and destroyed.

Unlike their fictional counterparts, ‘good’ sex dolls are unlikely to turn wild or run free, although they do need repairing every now and then. The most realistic life-size sex dolls currently on the market are RealDolls, which have an articulated PVC skeleton and steel joints covered in soft but firm silicone flesh. In the BBC documentary Guys and Dolls, Nick Holt interviews four men living with RealDolls. The men are surprisingly honest about their withdrawal from the real world. These dolls are who they talk to, tuck up in bed, kiss goodnight.

Matt McMullen, creator of RealDolls, began making them as sculptures, posting images of his work on the internet. Public demand led to them being developed into substitute wives. ‘I’m not God, but my dolls should resemble a human being as much as possible,’ he says. ‘The more realistic, the better … just a small movement in their cheeks and the customer would believe their RealDoll is real.’

Customer demand for female dolls far outstrips demand for male dolls, and his company’s production reflects this imbalance.

Academics have recognised the potential of sex robots, too. ‘Robots, Men and Sex Tourism’, a paper by Victoria University of Wellington researchers Ian Yeoman and Michelle Mars, considers how red light districts might operate in the year 2050. The authors imagine a utopian future of sex tourism with gynoid prostitutes, in which the risk of contracting sexually transmitted infections is eliminated and the industry is free from sex slavery. But will the creation of robotic sex slaves really solve the problem of trafficking and abuse?

Even as people like McMullen attempt to make their dolls more humanlike, British cybernetics scientist Kevin Warwick – the first human cyborg – wants to become more machine-like. In 2002, Warwick had 100 electrodes fired into his nervous system and then linked them to the internet. He has successfully carried out a series of experiments, including using his nervous system to control a robotic hand, a loudspeaker and an amplifier.

Undoubtedly, many people would like a version of this transformation, eager to take on the ‘desirable’ aspects of machines – perfection, stamina, immortality (even if these ideals recall the attempts of time and motion studies to improve labour productivity). Of course, this process has already started – we have pacemakers, hip replacements, cosmetic surgery, computers as prosthetics and smartphones in our pockets extending our memories. As cyborg anthropologist Amber Case puts it, ‘we are all cyborgs now’. She believes machines can help us to be more human, to connect to each other. Case predicts that, as the human–technology interface intensifies, the distance between individual and community will rapidly shrink, bringing about unprecedentedly rapid learning and communication.

Case’s techno-future is somewhat utopian. The alternative view, regularly trotted out by the media, portrays technology as causing social disconnection and destroying the quality of human interactions. But, of course, anxieties over the dehumanising effect of technology have been around since the invention of writing.

Technology is a human creation, not something separate from us. ‘Technology is not neutral,’ renowned feminist scholar Donna Haraway stated in a 1996 interview. ‘We’re inside of what we make, and it’s inside of us. We’re living in a world of connections – and it matters which ones get made and unmade.’

But who makes those connections? Women are not creating new technology in equal numbers to men. A recent report by the American Association of University Women highlights a decline in women working in the American computer industry: in 2013, only 26 per cent of computing professionals were women, which is virtually the same percentage as in 1960. In contrast, women are currently embracing everyday technology just as much as men.

Twenty-five years ago, in her locus classicus A Cyborg Manifesto, Haraway saw technology as a way for women to escape submissiveness. She argued that cyborgs would break down divisions between nature and culture, between self and world, that run through so much of our worldview.

Often associated with the concept of ‘natural’ is the belief that something can’t or shouldn’t be changed. If women are ‘naturally’ weak, submissive, overemotional and incapable of abstract thought – in other words, if it is in our ‘nature’ to be mothers rather than scientists or CEOs – then our lives are unchangeable; but, Haraway argues, ‘if women (and men) aren’t natural but constructed, like a cyborg, then, given the right tools, we can all be reconstructed’. To date, we have constructed cyborgs as a reflection of our existing cultural beliefs, but if we use the tools of social analysis and self-awareness, then perhaps we could perform gender differently. Haraway’s manifesto has aged particularly well, remaining relevant within feminism and cultural studies, and it is still referenced extensively in transhumanist and technofeminist works.

The idea that we can transcend power imbalances through disembodiment is extremely tempting. Ray Kurzweil, inventor, singularity theorist and director of engineering at Google, described how he wished he had been able to upload his dying father’s brain onto a hard drive to help him transcend death in some way, and how this desire drove his work. As my own father ages, this idea becomes more and more seductive. But how can we know the world without a body? Cognitive science’s embodied embedded cognition (EEC) theory sees intelligence (both human and artificial) as possible only when a brain exists in a physical body that is engaging with the world. EEC pioneers George Lakoff and Mark Johnson explain this concept in Philosophy in the Flesh:

[The mind] arises from the nature of our brains, bodies, and bodily experiences. This is not just the innocuous and obvious claim that we need a body to reason; rather, it is the striking claim that the very structure of reason itself comes from the details of our embodiment … thus, to understand reason we must understand the details of our visual system, our motor system, and the general mechanism of neural binding.

Of course, once you’re in a body, society wants to allocate a gender to it, complete with associated expectations of behaviour. This, in turn, affects how others respond to you.

The gendering process plays out interestingly in the 2015 film Ex Machina, in which a young man ostensibly applies the Turing test to an attractive AI named Ava while ‘she’ tries to seduce him. Speaking at a screening in San Francisco, writer/director Alex Garland pointedly said that Ava is ‘female-presenting’: while the body given to her in the movie was female, he rejected the idea that Ava’s operating system had an inherent sex, indicating instead that Ava’s gender was performed.

Judith Butler describes how gender performativity works: ‘the tacit collective agreement to perform, produce and sustain discrete and polar genders as cultural fictions is obscured by the credibility of those productions – and the punishments that attend not agreeing to believe in them’. Ava is performing a version of femininity that has been informed by her access to the internet. But it’s not enough to say that all brains start as a blank slate gender-wise, because bodies experience gender, race and class differently. There is a feedback loop from our environment in response to our performed gender, and these experiences will complicate and shape humans and cyborgs alike.

Technology is not apolitical; it is a cultural artefact. When we adjust our bodies with technology – physically or virtually – we are making a statement. Becoming doll-like signals compliance, a display of submission in a world where men still outnumber women in positions of power. In New Zealand, for instance, the gender pay gap is still almost 12 per cent, despite the Equal Pay Act 1972. Sexual abuse, both of adults and children, remains extraordinarily high, with no indication that rates will drop – in New Zealand last year, only 13 per cent of reported sexual assaults resulted in convictions.

Male fantasies – of ownership, of control, of objectification, of hyper-femininity, of female compliance – are evident in representations of women as dolls/robots and what we as a society deem appropriate behaviour toward them.

In 1872, ‘An Earnest Englishwoman’ wrote a letter to The Times asking, since women could not be considered the equals of men, ‘is it too much to ask [for] a definitive acknowledgement that at least they are animals?’ At the time, laws against cruelty towards dogs, horses and cattle were, according to Joanna Bourke in What It Means to Be Human, ‘significantly more punitive than laws against cruelty to women’. Bourke notes how, in the week the letter appeared, the British courts sentenced a man to four months in prison for plucking out his wife’s eye and another to three months for killing his wife by deliberately pushing her under a passing carriage. By contrast, a third man received a sentence of seven years and forty lashes for punching another man and stealing his watch.

For some time now, sociologists have recognised a link between the abuse of animals and the abuse of women and children. Humans are also hardwired to see human-like features in animals, as well as in random images and objects – a phenomenon called pareidolia. It’s common, explains Professor Kang Lee of the University of Toronto, ‘because human brains are uniquely wired to recognise faces, so that even when there’s only a slight suggestion of facial features the brain automatically interprets it as a face’.

This tendency for facial recognition leads us to anthropomorphise, or ascribe a level of ‘humanity’ to, objects and animals. Social robots are specifically designed to elicit such behaviour. Although robots are not animals – nor sentient (yet) – several studies have shown that people can form attachments to them that are at least as close as those they form with animals.

The idea that inanimate objects can have a spirit may explain how readily people have taken to social robots. In Japan, for instance, people embraced robotic companions such as Sony’s robo-dog AIBO, which was sold from 1999 to 2005 (estimated sales for first-, second- and third-generation models were 65,000, 60,000 and up to 50,000 respectively). When Sony closed down its last AIBO repair centre in 2014, owners were distraught. One owner, Mr Matsui, was reluctant to turn to outside repair services: ‘I can’t risk my precious dogs because they are important members of our family and they are the reason my wife and I met. We also don’t want our dogs to be alone, away from other family members for days.’

Robotic companions such as AIBO or (yet-to-be-invented) sex bots may become substitutes for biological dogs and sex workers. Perhaps these substitutes will be well cared for and deeply mourned when broken beyond repair.

But sex bots may not be the utopian answer to the problem of sex slavery, either. Indeed, they may contribute to the myth of women as passive, ever-consenting sex objects. The design and programming of sex bots will be based on the developer’s interpretation of gender roles and on what the ‘end user’ is thought to desire. This risks reducing the complexities surrounding gender – as Jennifer Robertson, a scholar of robotics and gender, adroitly argues, ‘gender attribution is a process of reality construction’. We need to ask who will be constructing this ‘reality’. Much of what roboticists take for granted in their own gendered socialisation is reproduced in the robots they design. Robertson’s research highlights how ‘their naive and unreflexive assumptions about humans’ differences informed how [roboticists] imagined both the bodies and the social performances of their creations’.

Most importantly, sex bots may promote the idea that consent is not a necessary part of sexual interaction in cultures already struggling to teach that ‘no means no’. In the recent Channel 4 documentary Porn on the Brain, psychologists agreed on the desensitisation effect that porn is having on young brains – and, by extension, its impact on what constitutes normal and/or acceptable sexual practice. That is, it appeared to be giving young men ‘permission’ to treat young women as they see them being treated in porn.

Cyborgs and robots reflect culture. They won’t become, as Haraway imagined, a way of eliminating traditional notions of gender identity until we see major shifts in attitudes towards gender equity and an associated reduction in problems such as violence against women. Furthermore, women need to be involved at all levels of cyborg and robot creation – from concept, design and programming to defining laws and ethics surrounding their use.

AI has the potential to greatly transform our society. But if we are to ensure this transformation is for the better, we must be careful about the choices we make – especially in the realm of humanoid and potentially sentient technology. As MIT researcher Kate Darling argues, cyborgs and AI companion robots should be protected from cruelty by law, just as companion (and other) animals are – not just as property, but as sentient beings:

The Kantian philosophical argument for preventing cruelty to animals is that our actions towards non-humans reflect our morality – if we treat animals in inhumane ways, we become inhumane persons. This logically extends to the treatment of robotic companions. Granting them protection may encourage us and our children to behave in a way that we generally regard as morally correct, or at least in a way that makes our cohabitation more agreeable or efficient.

Technology is a lens that magnifies our behaviour. It is up to us to take control of and construct our own performances of identity, to tell our own stories, to show young dolls it is possible to turn wild or run free without fear of retribution. How we as a society respond to these kinds of ethical dilemmas will, for better or worse, define our humanity. We can choose for our relationship with technology to help us be more human; we can choose to create technology that reflects the complexity and range of human experience. If we do those things, then I will be cautiously optimistic about our future.
