What’s in a frame? Misogyny/hate

‘Women’, wrote Germaine Greer in 1970, ‘have very little idea how much men hate them’. Fifty years later, it seems we have woken up. The problem of woman-hatred is now widely acknowledged and discussed; in Britain there’s growing pressure for misogyny to be legally recognised as a form of hate. Campaigners have presented this as a question of parity, saying that the law should ‘treat misogyny like racism or homophobia’ (which are already covered, along with religious hatred, transphobia and hostility to people with disabilities). It’s an argument that has resonated with many feminists, and it’s now under serious consideration. Though the Scottish Parliament recently rejected a proposal to include women in new hate crime legislation, a working party has been set up to examine the issue further. Meanwhile in England and Wales, the Law Commission issued a consultation paper last year which did recommend that the law should be extended. Since the outcry that followed the murder of Sarah Everard this proposal has attracted more mainstream political support.

So, it looks as if change is coming; but will that be a step forward for women? On reflection I have my doubts, and in this post I’m going to try to explain them.

In England and Wales currently there isn’t a specific hate crime law, but rather a patchwork of provisions threaded through other laws. One key provision is in the Criminal Justice Act 2003, which says that if someone who committed a criminal offence ‘demonstrated, or was motivated by, hostility on the grounds of race, religion, sexual orientation, disability or transgender identity’, the court should treat that as an aggravating factor and consider whether to impose a harsher penalty. This also indirectly brings what is popularly known as ‘hate speech’ into the picture (though the term itself has no status in English law), in that the language someone used may be treated as evidence of hostile motivation. Other legal provisions target verbal behaviour more directly. The Public Order Act 1986 includes an offence of ‘stirring up hatred’, which will often be done by way of language (one recent case involved a series of anti-Muslim posts on Gab), and also one of using ‘threatening words and behaviour with intent to cause harassment and distress’.

The Law Commission has recommended that these provisions should be extended to cover hostility on the grounds of sex, or hostility to women (which of these options to prefer is one of the questions posed in the consultation). To reach that conclusion, it explains that it applied three tests:

  1. Demonstrable need: whether there is evidence that crimes against women are (a) prevalent and (b) linked to hostility and prejudice;
  2. Additional harm: whether women victims are more severely impacted by crimes which are motivated by hostility/prejudice, and whether these also cause harm to other members of the target group (‘secondary victims’);
  3. Suitability: whether an extension of the hate crime framework to crimes against women would be workable in practice and compatible with the rights of other groups.

The Commission concluded that the first two tests were met. Crimes which disproportionately target women (e.g. rape and sexual assault, domestic violence, forced marriage, FGM, street harassment, online abuse) are prevalent, rooted in prejudice, and have an impact on women in general. But some questions remain unresolved. One is the practical feasibility of extending the law, given the high number of crimes against women and the fact that the justice system is already overstretched. Another concerns the status of domestic violence/abuse, which some argue should be excluded because it isn’t motivated by hostility to women as a group; rather it arises within specific intimate relationships, which could be same-sex partnerships, or heterosexual ones where the abusive partner is the woman. The consultation paper does suggest that sex (more specifically, femaleness) should become ‘a protected characteristic for the purposes of hate crime law’, but it asks if there should be a ‘carve out’ for domestic violence.

This is one reason why some feminists are concerned about the Commission’s proposals. They fear the effect will be to create a new hierarchy of crimes against women, taking us back to the days when attacks carried out by strangers were seen as ‘worse’ than violence perpetrated by someone the victim knew. Feminists have also drawn attention to an even more basic problem, namely the failure of the criminal justice system to enforce the laws we already have. What good, they ask, is creating new offences, or giving the courts power to impose harsher penalties, when most of the crimes women currently report do not lead to a prosecution, let alone a conviction? And that’s not only because the system is under-resourced. Women are also denied justice because of longstanding biases, both in the system and in the surrounding culture. How can we trust institutions which are themselves riddled with misogyny to enforce new anti-misogyny laws effectively and fairly?

Campaigners for new legislation often argue that it will help to drive institutional and cultural change, by sending the message that ‘this is serious and will no longer be tolerated’. But in the case of crimes against women, this message often turns out to be no match for the prejudice it was meant to shift. For instance, this month the media reported on a school in Liverpool where girls had been told to wear shorts under their uniform skirts after several of them were ‘upskirted’ (i.e., boys took pictures of their underwear) on a transparent staircase in the sixth-form building. This story caught my eye because upskirting was recently the subject of a successful campaign to make it a criminal offence (it became one in 2019). The Liverpool boys, who were over 16, could in theory have been reported to the police. I’m not saying that would necessarily have been the right thing to do. I’m sympathetic to the argument that where possible we should try to educate young people rather than criminalising them. But it’s telling that this school did neither. Instead it chose to punish the girls, by imposing a dress-rule that would make them feel uncomfortable, undignified and as if they were the ones at fault.

Even if I had more faith in legislation as a remedy for social ills, I would still want to ask whether extending hate crime laws sends the right message about misogyny. My doubts on that score reflect my interest in language–in words and meanings and what might be called ‘discursive framing’. Treating misogyny ‘like racism and homophobia’ means slotting women into a pre-existing frame which was not originally designed for them. And that raises the question of how well the frame fits.

Categories have their prototypical members, the examples that spring to mind first when we encounter their generic label. Our prototype for the category ‘bird’, for instance, the kind of bird we’ll draw if we’re instructed simply to ‘draw a bird’, is something that looks like a robin or a sparrow, not an ostrich or a penguin. In the case of hate crime/hate speech the prototype is hatred of a racial or ethnic Other. This is where it began in the UK, with the outlawing of ‘incitement to racial hatred’ in the 1960s. Later religious hatred was added, and this was not a big stretch because it’s close to the prototype: often it’s as much about race/ethnicity as religious belief per se. The other types of hatred now covered by the law—homophobia, transphobia, hostility to disabled people—share some features with the prototype, in that they target minorities who are perceived as ‘different’, as outsiders. And there’s another thing these target groups have in common. Hatred of them is linked, historically and in our minds, to right-wing extremism. The prototypical (western) right-wing extremists, the Nazis, regarded Jews, homosexuals and disabled people as inferior and impure, and they did their best to exterminate them.

But this prototypical form of hate, the kind that motivates genocides and pogroms, that calls for the ‘repatriation’ of Black British people to ancestral homelands they have never set foot in or advocates the involuntary sterilisation of the ‘unfit’, is not what (most) misogyny is about. Though misogynists do see women as Other and lesser beings, who exist only in relation to men and for men’s benefit, few of them wish for a world in which women are not available to meet their emotional, domestic, sexual and reproductive needs. What they want is not to eliminate women, or to live entirely apart from them, but to exploit, dominate and control them. Misogyny, in short (as the philosopher Kate Manne has argued), is not a generalised hatred of women, but rather the punishment of women who refuse to stay in their subordinate place or to meet what men regard as their obligations. The extreme right has no monopoly on that kind of punishment, nor on the belief system which justifies it. Some forms of misogyny are so common and unremarkable, it hardly makes sense to label them ‘extreme’.

Because misogyny is so different from the prototype which hate crime laws were designed for, it’s difficult to just ‘add women and stir’. The Law Commission’s question about whether there should be a ‘carve out’ for domestic violence is one illustration of this difficulty: violence against an intimate partner is commonly understood as the consequence not of hate, but of its opposite, love, ‘gone wrong’. Murderers and family annihilators are said to have killed their ex-partners and sometimes their children because they couldn’t bear the pain of separation, rejection or ‘betrayal’.

I would have no hesitation in calling this behaviour misogyny, but I think what’s behind it is less a hatred of women than a sense of entitlement in relation to women. I would apply the same reasoning to, for instance, child abuse and elder abuse: what motivates these forms of violence is surely not a generalised hatred of children or old people, but rather a feeling of entitlement to use and abuse them, to exploit their relative powerlessness for your own gratification, or to punish them for making what you see as unreasonable demands. We should be able to recognise the seriousness of these forms of abuse, and to punish them as they deserve, without having to put them into a frame that doesn’t fit.

The notion of misogynist hate speech raises similar questions. According to the philosopher Alexander Brown, a typical legal definition of hate speech looks something like this:

(1) Speech [or other expressive conduct] (2) concerning one or more members of a protected group or class (3) that involves [expresses, incites, justifies] feelings of hatred toward group members.

Brown argues that this is too narrow, and that a better definition would reflect the way the term ‘hate speech’ is used/interpreted in ordinary language—which, as he points out, does not always treat ‘feelings of hatred’ as central. He goes on to offer a list of the types of speech (or writing) which in his view would ‘intuitively fall under the ordinary concept [of] hate speech’:

  1. Slurs, epithets or insults vilifying members of historically victimized groups (e.g. the N-word, ‘dirty Jew/faggot’)
  2. Forms of speech that assert or imply a group’s inferior or sub-human status (e.g. ‘these people [asylum seekers] are cockroaches’)
  3. Group defamation or negative stereotyping: the false/overgeneralized attribution of qualities/behaviour to a group (e.g. the blood libel; ‘homosexuals abuse children’)
  4. Incitement: advocating, justifying or glorifying hatred, violence or discrimination against a group (e.g. ‘kill all Xs’; symbols used to intimidate, e.g. burning crosses/nooses/swastikas)

Although this list makes no explicit reference to women–all the examples relate to race/ethnicity, religion and sexual orientation–it’s not hard to see how the framework might be applied to them. Clearly, there are slurs vilifying women (‘bitch’, ‘cunt’, ‘whore’); assertions of female inferiority and subhumanity are staples of online discussion among incels, MGTOWs et al.; negative stereotyping of women is commonplace; and under the heading of incitement/intimidation we could include the threats with which women are bombarded online, often expressed in the linguistic register to which Emma Jane has given the label ‘rapeglish’. Maybe we could even consider flashing, or sending unsolicited dick pics, as the misogynist analogue of the noose and the swastika. The problem with Brown’s taxonomy, then, isn’t that women can’t be slotted in at all. The problem is how much that leaves out.

One thing it leaves out is a feature of many kinds of misogynist discourse: the use of, specifically, sexualised speech to enact power and domination over women. A great deal of what women experience as intrusive, degrading or intimidating male behaviour is couched not in the language of hate, but ostensibly in the language of desire or sexual interest. Everyday street remarks like ‘nice tits’, or ‘give us a smile’, certainly don’t ‘intuitively fall under the ordinary concept of hate speech’: on the surface they seem appreciative rather than hostile, and men are quick to exploit that if women object (‘what’s the matter, can’t you take a compliment?’). But these comments are not innocent or harmless. As well as underlining women’s status as sexual objects, they are pointed reminders that women in public space are under constant male surveillance and must conduct themselves accordingly.

Other kinds of misogynist speech, like ‘rapeglish’, are closer to the ‘ordinary concept of hate speech’ because they’re explicitly violent and threatening. But even rapeglish tends not to be put in the same conceptual box as, say, racist or anti-semitic rhetoric, because its graphic sexual content prompts people to read it as a display of individual pathology rather than the expression of a hateful ideology. The same is true of indecent exposure, which is viewed more as a compulsion afflicting some (inadequate or disturbed) men than as an intentional form of expressive behaviour which is meant to humiliate and intimidate. Once again, the sexualised nature of the behaviour obscures the political purpose it serves. The philosopher Rae Langton has made a similar point about pornography, arguing that its sexual content tends to disguise its ‘status…as propaganda’. ‘For racial hate speech’, she writes, ‘hierarchy and subordination look like what they are… For pornography [they] look like what they are not–namely, the natural sex difference’.

Our belief in ‘the natural sex difference’ also makes it possible for certain non-pornographic messages that might otherwise be judged as hate speech to escape that categorisation. Consider the greeting card below, which was photographed in a bookshop: the fact that it was openly on display suggests that most people wouldn’t consider it hateful, even if some might find it tasteless.

Why not, though? Because it’s saying you can’t ‘shoot [women] and bury them in the garden’ rather than advocating that course of action? Because it’s clearly meant to be a joke? Maybe; but if the word on the card were not ‘women’ but, say, ‘Jews’ or ‘gays’, neither of those considerations would make it acceptable. Animosity between men and women (aka the eternal ‘battle of the sexes’) is understood to lie beyond the realm of politics and even culture: it’s seen as natural, universal and—crucially—reciprocal (just like the desire which draws the warring parties together). That’s why the one word you could replace ‘women’ with and still have an acceptable product is ‘men’—though you’d be glossing over the fact that in reality women very rarely kill men, whereas (in Britain) men kill women at a rate of 2-3 a week.

I’m not using these examples to argue that more kinds of speech should be legally defined as hate speech. I’m suggesting that ‘hate’ may not be the right frame for understanding or addressing the issue of misogyny. Feminists who favour that frame argue that equality requires inclusion: the exclusion of woman-hatred from existing provisions sends the message that women are less important than other groups, and that misogyny is less serious than other hatreds. But while I agree that misogyny is a real and serious problem, I don’t think that means it is, or should be treated, exactly like racism or homophobia. To me, taking it seriously means considering it on its own terms. Women need to be able to frame a response that begins from our experiences, our needs, and our ideas about what would truly make a difference.    

The fembots of Ashley Madison

Content note: this post includes some explicit sexual material which readers may find offensive and/or distressing.

‘Life is short. Have an affair’.

That was the sales pitch for Ashley Madison, the website for people seeking ‘discreet’ extra-marital sex that recently came to grief after hackers dumped a load of its users’ personal data on the web. It turned out that the website was basically running a scam. Straight men, the majority of site-users, were paying to hook up with women who did not, for the most part, exist. Real women did use the site, but they were massively outnumbered by fake ones. Profiles were cobbled together by employees, and then animated by an army of bots which bombarded male subscribers with messages.

The bots’ opening gambits were merely banal: ‘hi’, ‘hi there’, ‘hey’, ‘hey there’, ‘u busy?’, ‘you there?’, ‘hows it going?’, ‘chat?’, ‘how r u?’, ‘anybody home? lol’, ‘hello’, ‘so what brings you here?’, ‘oh hello’, ‘free to chat??’. But if a man responded (using his credit card as instructed), they started to sound distinctly bottish. ‘Hmmmm’, they would confide, ‘when I was younger I used to sleep with my friend’s boyfriends. I guess old habits die hard although I could never sleep with their husbands’. Or: ‘I’m sexy, discreet, and always up for kinky chat. Would also meet up in person if we get to know each other and think there might be a good connection. Does this sound intriguing?’ No, actually—it sounds like you’re a bot.

Some men did suspect fraud. In 2012, one site-user complained to the California state authorities, though nothing came of it at the time. What tipped him off wasn’t, however, the bots’ clunkily-scripted lines. It was being contacted in a short space of time by multiple women who supposedly lived in his area, who hadn’t looked at his profile, and who sent him identical messages. All things that might have passed unnoticed if the bots hadn’t been operating on such an industrial scale.

The Ashley Madison bots were pretty basic. But the sex industry is a serious player in the world of AI bots—more sophisticated programs that can learn from their interactions with humans, and produce novel, unscripted messages. David Levy, who has twice won the Loebner Prize (a competition based on the Turing test for machine intelligence, in which a computer has to convince human judges it is also human) is the author of a book called Love and Sex with Robots, and president of Erotic Chatbots Ltd, a company whose name is self-explanatory. Recently it has gone into business with an enterprise that makes high-end sex dolls. At the moment, sex dolls are designed to satisfy their owners’ physical and aesthetic requirements: few of them talk, and none of them could be said to converse. Chatbots, on the other hand, talk, but they’re not usually physically embodied. Bringing the two things together in one package—a doll that looks and feels realistic and can also make human-like conversation—seemed like an obvious (though technologically ambitious) business proposition.

When I first read about this I was sceptical, for reasons that are succinctly summarized in this comment left by a man:

Don’t you realize, the whole reason to get a doll is so we DON’T have to listen to them talk after sex?

But while this may be the prevailing attitude among the minority of men who regularly fuck inanimate objects, there are reasons to think it is not how most men feel. In surveys of men who buy sexual services, a high proportion typically claim to want some kind of human relationship. Silent, sullen prostitutes who make no effort to get to know the client, talk to him or pretend the encounter is enjoyable for them are apt to prompt complaints from punters, even if those punters also describe them as physically attractive and compliant.

You might think this issue would also deter men from having sex with robots: you can’t have a human relationship with a non-human entity. However, many experts believe otherwise. Studies of people who work with robots in other contexts have found a strong tendency to anthropomorphize them, projecting personality traits and feelings onto them which, outside fiction, they do not have. The military sometimes uses robots to do dangerous tasks like disarming bombs, and sometimes the robots get blown up. Human soldiers reporting these incidents say things like ‘poor little guy’. One group whose robot got blown up held a funeral for it.

So, there could be a market for talking sex robots. But what kind of conversation will they make?

The less ambitious developers are just hoping to improve on the current generation of ‘unintelligent’ sex chatbots, programmed to spew out the sort of random messages Ashley Madison’s subscribers got. You could do this by giving them a sexed-up version of the capabilities displayed by Virtual Assistants like Siri and Cortana. They wouldn’t pass a Turing test, but they’d be able to, as one developer puts it, ‘follow simple instructions’.

Erotic Chatbots Ltd. has more ambitious plans. At the moment it’s developing a bot that can ‘talk dirty’. Levy explained in an interview how you train a bot to do that:

You give them lots and lots of examples and they generalize from those examples and they can make the whole of their conversation sound like somebody who talks dirty in a loving way. We teach [the bot] and it generalizes, but it will talk about any subject. You can talk to it about Italian food and it will interject about lasagna. “I could have a great time with lasagna!”

His business partner Paul Andrew chipped in:

We’ll be using erotic writers to help us program the language, so we’re actually going to work with people who do this for a living, as it were. That way we can give the chatbot a good understanding of the vocabulary and the… talk. I’m trying to think of a good word to use there. Basically, we will give them a really good grounding, and then the chatbot learns. Once they have a vocabulary, once they have a basic brain, they grow themselves. They’re quite competent. We also work with some people who do [sex] chat lines; we’re going to pick their brains, too.

In other words: ‘we’re going to teach our bot to emulate the linguistic characteristics of porn’. In the circumstances that’s not a big surprise. But there is, perhaps, a certain irony in it. Levy and Andrew want to use cutting-edge science and technology to make machines capable of producing one of the most predictable and stereotypical linguistic registers in existence—so clichéd that its human users often sound like bots themselves.

Paul Andrew mentions ‘picking the brains’ of people who work on sex-chat phone lines. Back in the mid-1990s, the linguistic anthropologist Kira Hall did some research on the language used by phone sex workers (their own term was ‘fantasy makers’) around San Francisco. She found their performances traded heavily on stereotypes about women’s language. Like speaking in a lilting, breathy voice, ‘using lots of adjectives’ when describing yourself, and dropping in plenty of elaborate rather than basic colour-terms (your imaginary underwear wouldn’t be ‘pink’ or ‘black’, it would be ‘peach’ or ‘charcoal’). The workers knew these were stereotypes: the language they produced on the phone was nothing like the way they talked when they weren’t taking calls. But stereotypes, in their experience, were exactly what their customers wanted.

Since different customers were into different stereotypes, a skilled fantasy maker needed to be able to produce a range of female personae on the phone—schoolgirl, southern belle, dominatrix, bimbo, Asian woman, Black woman, etc. They prided themselves on being able to ‘do’ personae which were remote from their own real-life identities. One of the individuals Hall spoke to wasn’t even a woman, he was a man who could pass for a woman on the phone. On the question of race, the view was widely held that white women made the best Black women, and vice versa. As one worker explained to Hall, the Black woman of the (mainly white) callers’ dreams was a two-dimensional racist stereotype which white women were actually better at producing (not to mention less uncomfortable with).

Women (and men) who work the fantasy lines are like human fembots, performing a version of femininity that callers will pay to spend time with. Not only does this performance not have to be authentic to be convincing; in the context of commercial sex, an authentic (i.e., non-stereotypical) performance of femininity would risk destroying the illusion which is the real object of desire.

But you might wonder, what is sex-talk like when the parties are not in a commercial relationship? Is the language less clichéd? Are the personae constructed less stereotypical? The short answer is, not necessarily. The researcher Chrystie Myketiak has analysed cybersex encounters between peers in a virtual environment which those who study it refer to as ‘Walford’. (It’s an online community which did a deal with a university: the university would host and maintain it in exchange for being able to observe and analyse what went on in it. Its members all consented, and their consent is sought again every time they log on.) Here’s a typical extract from Myketiak’s data, in which the two parties have taken the roles of a male and a female (most likely this reflects their offline identities, but we don’t know for sure). In the transcript ‘F’ and ‘M’ identify the female and the male participant.

(M) [his] hot seed fills every crevice of your womanhood…
(M) Keeps fucking you hard, jolting your entire body with each thrust.
(F) Grinds you by twisting and turning, faster and faster… she really wants it rough.
(M) Gives it to you so hard your ancestors feel it.
(F) Is pleasured senseless, she has tears coming to her eyes.
(M) Reaches around and rubs your hardened clit, violently.
(F) Whispers “Know any other wild positions? Hehe…”
(M) Whatever comes to mind is good for me.
(F) Same here… surprise me…

Linguistically, what stands out about this extract is the way the participants mix third-person narrative, second-person address and occasional use of the first person. You don’t get that in other kinds of porn. But the narrative itself is full of porn clichés, and the whole thing is organized around the heteropatriarchal proposition that in sexual encounters, men lead and women follow. Men are dominant, women submissive: whatever men desire is also pleasurable for women. If he’s violent, that’s OK, because ‘she really wants it rough’.

In a paper she gave at a 2012 conference on robots (there’s a written draft version available here), the lawyer Sinziana Gutiu argued that if AI sexbots are successfully developed they will further entrench these ideologies of gender and sexuality. She thinks this will be a serious problem, because the combination of verbal and physical interaction which intelligent sex robots permit will have an even more powerful effect than porn does now on men’s real-world interactions with human women.

Gutiu points out that the advanced capabilities designers hope to give future robots will make them seem human, and that perception will be reinforced by the anthropomorphizing tendency mentioned earlier. However, some key human qualities will be deliberately left out of their design—like the ability to verbalize pain or emotional distress. Above all, there will never be any question about whether a robot consents to sex. It is there for its user to have sex with as and when he wishes. She goes on:

By circumventing any need for consent, sex robots eliminate the need for communication, mutual respect and compromise in the sexual relationship…allowing men to physically act out rape fantasies and confirm rape myths.

And she believes men’s experience with these nearly-but-not-quite human entities will lead at least some of them to assume that real women can legitimately be treated in the same ways.

Her paper also hints, however, that intelligent sex robots could in principle be designed to do the opposite of what she fears. If they were trained to engage their human user in talk which emphasizes negotiating consent, communicating your desires and feelings, respecting others’ boundaries and being willing to compromise, they could be used to teach a different way of interacting from the one which is modelled in porn. I’m no AI expert, but that sounds to me a lot more difficult than making a bot that ‘talks dirty’. And also, of course, a much less attractive proposition for investors whose aim is to make a profit.

It’s not only the money angle which makes me think that Gutiu’s educational sexbot is less likely to materialize than the pornified fembot of her nightmares. Therapists tell us that the most important reason why intimate relationships fail is a lack of open and honest communication, particularly about sex. But intimacy with another person doesn’t seem to be what a lot of men are looking for. If what they wanted was an intimate encounter with a female human being—a unique, complex individual with her own thoughts, feelings and desires—how could so many men have fallen for the fembots of Ashley Madison?