The concept of AI—specifically of the foxy, sexualized persuasion—has permeated pop culture for a very long time, most recently exemplified by Alex Garland's Ex Machina. Technology, as it is wont to do, continues surging forward, simultaneously beckoning and threatening (depending on personal outlook) with the potential of true artificial intelligence. And should these AI rise up, what kind of role would sexuality and sexual identity play in their existence—if at all? Hopes&Fears corralled a varied group of experts for a panel discussion on what the future holds for us, the AI... and our respective crotch parts.

Artificial Sexuality: a roundtable discussion on screwing robots

Beca Grimm




What do we mean by A.I.?

Gareth Stoneman: I've thought about this [and how AI relates to sexuality] and there are two camps here. The first is creating machines that have an intelligent response to sexual input and output—creating pleasure simulacra for humans to experience. The second would be to create some kind of model of sexuality inside the artificial intelligence that it would use for its own reproduction. You could use a genetic algorithm, where it breeds and mutates over generations to select the fittest solution to a problem—like how evolution works through natural selection.

Ian Kerner: To be honest, Gareth, I didn't understand a lot of what you said, but it sounds almost as if algorithms could "court each other"? Like, they'd almost be in a courtship model.

Gareth Stoneman: The idea with [genetic algorithms] is [for instance] if you wanted to figure out something boring like the best recipe for cooking a pizza. You could write a genetic algorithm that would test different inputs. "Let's try 15 minutes in the oven and three under a broiler." "Let's try seven minutes in the oven and 15 under the broiler." Then you break apart each part of the initial inputs and try the alternatives until you reach the optimal solution. It's like mutating genes. You put two genes together and you end up with a mutation which is a combination. So, with a mother and a father, you'd end up with a combination of the mother's genes and the father's genes. With a genetic algorithm, you're splitting an electronic gene in half, recombining it, and trying thousands and thousands of permutations to try to come up with the optimal solution to a problem.
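The genetic-algorithm idea Stoneman describes can be sketched in a few lines of Python. The pizza-timing "fitness" function below is a made-up stand-in with a known best answer (a real application would measure actual outcomes); the breeding loop itself follows his description: select the fittest, split and recombine "genes," mutate, repeat.

```python
import random

# Toy fitness function: a stand-in with its best score at
# (oven=12, broiler=5) minutes. In a real application this would
# measure an actual outcome rather than distance to a known target.
def fitness(genes):
    oven, broiler = genes
    return -((oven - 12) ** 2 + (broiler - 5) ** 2)

def crossover(a, b):
    # "Splitting a gene in half and recombining": take the oven time
    # from one parent and the broiler time from the other.
    return [a[0], b[1]]

def mutate(genes, rate=0.3):
    # Randomly nudge each value, like a genetic mutation.
    return [g + random.uniform(-1, 1) if random.random() < rate else g
            for g in genes]

def evolve(generations=200, pop_size=30):
    # Start with random candidate "recipes" (oven, broiler minutes).
    pop = [[random.uniform(0, 30), random.uniform(0, 30)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # natural selection: keep fittest half
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()  # converges near the optimum after many permutations
```

Trying "thousands and thousands of permutations," as Stoneman puts it, is exactly what the loop does: each generation keeps what worked and recombines it with variation.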


Gareth Stoneman is a Manchester University grad with a degree in AI. He currently works as technical director at an NYC-based digital agency and recently spoke at SXSW about affective computing (the computational detection of, and reaction to, emotions). Stoneman also spends a lot of time researching EEG biofeedback and its application to communication.


Dorothy Howard is a writer and activist with bylines in The Daily Beast, The New Inquiry, and others. Howard is also associate editor of the Brooklyn-based small press WONDER. Her areas of expertise include gender, digital labor, and Wikipedia.


Kate Weinberg is a sex educator for adults and college students. She is also a co-presenter for the Sex Discussed Here! Female Orgasm program (which was discussed at Jezebel recently).


Dr. Ian Kerner is a therapist specializing in sexuality and human relationships. He works as a sexuality counselor for Good In Bed.




Kate Weinberg: I feel like my brain is exploding a little bit. I just think of sex and intimacy and these sorts of things—even the language of computers and machines feels so counterintuitive to me when applied to something as primal and ancient as sex. Even words like "input" and "output" and "algorithm"—it feels so separated. Clinical and cold.

Gareth Stoneman: I would agree with that. I feel like that approach, that facet of it, is [us] taking these primal terms and applying them to technology. So it's not necessarily a true representation of what that human experience is. So I wonder if that whole topic is a little off track. Should we go back to AI as a simulation for creating [technology] and finding how humans would interact with it? Does anyone have feelings about this?

Ian Kerner: I started to think about it a little bit and two things occurred to me. One is we are starting to see sex toys—vibrators, specifically—that are trying to be more intelligent about how they respond to the process of arousal. I came across one vibrator in particular—they're calling it the first AI vibrator. It's crowd-funded… I was listening to a little bit [about it] and the idea was that a typical vibrator just operates on its own, and can only be adjusted to a limited extent. But what about a vibrator that could work really intelligently with the feedback it's getting and create a much more pleasurable experience? So that seemed to me very, very useful—the idea of artificially intelligent sex toys. I also find, as a sex therapist, that so many people are struggling with sexual problems—whether it's men dealing with rapid ejaculation or women grappling with orgasm issues—and being able to engage with an AI, especially if you're single, in order to work on these issues would be really great. You know, sex surrogacy is effectively illegal. As a therapist, I can't really legally recommend a sex surrogate, but there is a place for surrogacy and I wonder if AI could occupy that place.

I feel like my brain is exploding a little bit ... the language of computers and machines feel so counterintuitive to me when applied to something as primal and ancient as sex.

Hopes&Fears: Could you explain to us what sex surrogacy is?

Ian Kerner: Take, for example, somebody who is single and suffers from premature ejaculation, has had very, very limited sexual experience, or suffers from trauma—a surrogate would be somebody who [the patient] could effectively have sex with and learn how to relax, learn how to participate in sex, and learn more about their own arousal. I mean, surrogacy is something that's important for people who are disabled and dealing with sexual issues, people who have chronic sex problems, and sometimes people who have suffered intense sexual trauma. That's surrogacy in a nutshell. It's legally confused with prostitution, so there's no legal way for a licensed clinician to recommend surrogacy. Sex is in the margins. It's a gray zone. But I could see AI—maybe a limited use but a very important use—occupying that space.

Gareth Stoneman: There [has been] a lot of talk about vibrators that are "smart" devices. For example, a vibrator that has sensors and pre-programmed responses in it. That's something I don't know if I'd consider to be intelligent, though. Does a sex toy have to actually be enjoying the experience to be considered intelligent? Or is a sensor that can read and respond—tracking [things like] your breathing—an intelligent device? What do we define as AI here? And what do we define as sexuality in the device itself?



Dorothy Howard: Another type of use for AI would be for care robots. I've been really interested in reading about robots that sit with patients who are in a coma or aren't fully conscious. Robots that provide a basic level of sociality that a nurse might otherwise. And then there are also the ways that we're seeing… like Hatsune Miku for entertainment, right? One of the ways that robots are being used is in pop culture, as celebrities, as singers. There are a lot of different reasons why we would want a variety of forms of identity to be replicated when introducing robots into pop culture. You don't want any one thing. At the same time, I worry that when we try to model gender we're using a limited set of understandings that's often based on biological stereotypes or biological readings. It makes me worry just based on—maybe this is very cynical, but—the way the humanities have been devalued recently. We've already discovered that machine vision algorithms detecting skin color are racist—they much more easily detect white skin than other skin colors. Part of the way that's been explained is: well, the engineers making them happen to come from the US. When people with a limited understanding of diversity make algorithms that model things like gender or race and try to detect them, those algorithms might be problematic. They might represent a limited understanding of things that are highly complicated. I mean, I'm someone who approaches gender from the spectrum—the gender spectrum, the non-binary level. One of my main worries is just that this replication of gender is going to be very binary.


H&F: Dorothy, you were talking about how the humanities might be devalued as technology becomes a much larger influence. And Kate was talking, too, about how the inclusion of machines makes sex—something that's very natural and primal—seem cold. Is there any way this mechanical quality could be stripped from AI's place in sexuality? Is that possible at all?

Kate Weinberg: Gareth, can we talk about how we are defining AI? Does it have to represent an artificially intelligent being with some aspect of consciousness or independence? Or can it be on the level of vibrators that read into body signals?

Gareth Stoneman: Artificial intelligence is the science of creating intelligent machines. It's trying to simulate (or even create) intelligence through technology. So is this something where we're talking about simulated intelligence or actual, artificial consciousness—

Ian Kerner: Gareth, just to add into that, when we're talking about sexuality, are we talking about touch and physical contact and emotional impact?

Gareth Stoneman: I don't know that intelligence or emotional impact requires a physical body. I understand intelligence as the capacity to infer things from previous learning—the ability of a being to take a scenario it hasn't seen before and respond to it in a considered way. It's more than a vibrator that's saying, "Hey, you're breathing heavier, so I'll speed up." That's not an intelligent response; that's pre-programmed behavior.

Kate Weinberg: I guess on another level, aren't we all just responding to influences? Aren't all our responses based on some information? Perceiving, then reacting. I don't know the answer, but I guess it's another question: What does intelligent mean? And at what level are we talking about intelligence? And why isn't it something that reacts to the way you're breathing or the way your body responds—why isn't that as intelligent as something with a verbal response?

It's more than a vibrator that's saying, "Hey, you're breathing heavier." That's not an intelligent response, that's pre-programmed behavior.

Gareth Stoneman: I think it's because—for example—if I asked everyone in this conversation to visualize a blue elephant: none of you has ever seen a blue elephant before. There's no such thing as a blue elephant, [however] you have a very clear picture in your head of one. Whereas a computer [might be] trained to picture elephants by being shown 5,000 pictures of elephants. If you tell it to look for a blue elephant and it doesn't have one in its data set, it's not able to come up with that idea.

So I think intelligence is the ability to come up with things from inputs that aren't existent in the set previously. Something like a vibrator that, for example, would be able to say, "Hey, this is your breathing pattern. I haven't seen this pattern before but I know you're approaching orgasm right now." That would be an intelligent inference. But then something that says, "Hey, looks like your heartbeat is up from five minutes ago." That's just reacting to input.

Kate Weinberg: So it's the ability to draw conclusions. To take information and make further inferences.

Dorothy Howard: I think that's the scientific terminology. But I wonder how else we can or should talk about intelligence. Because that's one angle… when I say "intelligent design," know that I'm not talking about the creationist sense of intelligent design. There's design like the design of objects, which predicts an action will happen. The object—I don't want to say wants something to happen—but it's designed so something will happen. Like a grab-bar for a bathtub. It's positioned in the right place so that when you're in the bathtub, you'd automatically lurch toward it. The grab-bar isn't taking an input, learning from it, and responding. I want to say we can get away from the brain, really, when we're talking about intelligence. I don't know if we actually can, but I want to resist, at least personally, the scientific definition of what that means.

I want to say we can get away from the brain really when we're talking about intelligence. I don't know if we actually can but—I want to resist, at least personally, the scientific definition of what that means.



Sex and A.I. in Ex Machina

Kate Weinberg: It seemed in the film it was about manipulation… it seems [sexuality] is Ava's primary tool in achieving what she wants, which is to escape. Not a strange, romantic creature—thing. I don't know what to call her. Well—I don't want to ruin it for those who haven't seen it—

Ian Kerner: But at one point doesn't [the protagonist] ask Nathan, the creator, why he created Ava? What's the purpose? I found his response a little disappointing. He said, "Well, it's inevitable, because we've been focused on AI for however-many decades and so Ava is ultimately inevitable." I have to agree. I didn't see a clear purpose for Ava or any real purpose for her sexuality, except that she could be some sort of proxy for a person… maybe more fun than just masturbating to porn. What do they call it? Haptics? When there's physical feedback happening in conjunction with something. So like, a version of masturbation. Nathan's sort of this loner, by himself. He doesn't know how to have a relationship with a person. He has this closet full of robots of different ethnic varieties and different body types. You really just get this impression of this guy with his advanced fuck dolls that can't really [offer] him real intimacy. Is that where AI goes for people who can't handle genuine intimacies? Do we do an AI version of intimacy which is a little better than masturbation but not the real thing? That's kind of what I got from Nathan in that movie. He did it to avoid dating.

Gareth Stoneman: There was a really interesting case study of an IRC bot called Jenny18, based on ELIZA, an early chatbot that could half-hold a conversation. Someone took an ELIZA-style algorithm, filled it full of a bunch of kind of air-headed sex comments, and set it running wild on a public chat network. And basically a lot of guys kind of went with it and wanted to have sex with it. It was quite obvious that it wasn't a human being, but that didn't really stop a lot of people from having a go. I think there's a fairly low barrier for suspension of disbelief when it comes to creating something that men, particularly, would try to have sex with as a masturbation substitute.

Is that where AI goes for people who can't handle genuine intimacies? Do we do an AI version of intimacy which is a little better than masturbation but not the real thing? 

H&F: Is that where we're heading with the creation of AI in general? Why would we create a robot that thinks like a human? How would that benefit us anyway?

Gareth Stoneman: Pornography is always the first industry to take advantage of new technology… going back to engraving naked ladies on the sides of vases thousands of years ago. It's a field where a lot of interesting things happen first, especially because the interactions with it tend to be fairly basic. There's a high amount of demand, and it's a fairly simple response that you're trying to elicit from consumers.

Kate Weinberg: First of all, it seems pretty classist. Like who would be able to afford an AI? What kind of person is this being created for?

Gareth Stoneman: …technology, if there's a demand for it, will become [more widely available].

Kate Weinberg: For instance, people whose sexualities have been formed by porn. Which seems to be more and more people, since we have such poor sex education—especially in this country, but in a lot of countries. A lot of the sex education people are getting is coming from porn. Not all porn is bad or wrong or illustrates types of sexuality that aren't accurate. But most mainstream porn is [educating] people in a way where they may enter the world interacting with real human beings and think that's how they actually behave or that's actually what they want. So I wonder how this kind of technology, too, could influence people's sexuality. If they are only ever interacting with machines and that's what they want and that's what they're happy with, then OK. But if they do come to interact with real human beings, then what happens? What do they expect of those real human beings if their sexuality is formed from these creations designed to be masturbation aids or that kind of thing at their disposal?


Doing the right thing

Gareth Stoneman: If there's [a generation] growing up with intelligent sex devices that learn your preferences, and you can fulfill your deepest, darkest desires without any risk of exposure or disease or having to learn how to discuss consent and experience intimacy with a willing partner, it could be extremely disturbing when those people do come into contact with other human beings. I completely agree with you.

Dorothy Howard: There's also one thing that I've found really interesting about the manga Chobits. One of the characters feels jealous of the machine that the main character is starting to love. So it's not just this question of what happens when you emerge from your cell, where you've been having your romance with your bot for a few years—how do you interact with humans? Do you have unrealistic expectations for them? It's also a question of how people in the family, or previous lovers, or friends would measure themselves in comparison to that thing. I think it's also about this feeling of inadequacy in the face of technology. It's not just something that can be applied to AI, but in general: it's a feeling that we're being replaced and that these things do life better than us. It's kind of an existential crisis, or a potential crisis of the imagination.

Gareth Stoneman: Yeah, I think if we take that dystopian conclusion—we talked earlier about human beings having these inputs and outputs—where your needs are taken care of. What does it mean to feel love? What are the inputs that cause you to have a response that equates to love? If you can have this machine that's intelligently anticipating your every need, giving you these feelings of warmth and security and love, that knows everything you want and does it in a way that's far beyond what any human could do, it'd certainly create a sort of drug experience—this kind of cocoon of perfect love with a machine. Then what is the role for a messy, flawed human relationship? You're a rat in a cage hitting the lever over and over, having cocaine injected straight into your vein, until you drop dead. If you have this machine simulating those inputs, you can imagine a dystopian future where no one has real relationships and real sex anymore. Everybody just goes and sits in their little cocoon with their machine that holds them and pets them and tells them they're beautiful, and that's that.

It's a feeling that we're being replaced and that these things do life better than us. It's kind of an existential crisis or a potential crisis of the imagination. 

H&F: How could this presence potentially create fatal flaws in a young person's developing expectations of human relationships? Could it? Is there any way this love with a machine may be supplemental to existing relationships? Ian, do you have thoughts on that?


Ian Kerner: Well, no. That's where I'm scratching my head—beyond surrogacy, sex aids, masturbation… I could imagine for somebody who's, say, a pedophile—someone with a sexual imprint that really runs counter to society's values—an AI might be [a way] for that kind of sexuality to be expressed. But I'm having a hard time imagining, in a perfect world, an AI that is truly intelligent—what would we do with it? And how would it enhance our relationships and our intimate lives? I'd love to hear it. I'd love to get thinking about that.

Gareth Stoneman: Again, going back to stuff that's probably not intelligent, there's the whole wearable technology thing, where you have sensors reading heart rate, galvanic skin response, things like that. There's emerging research about reading emotions through that, with first-order sensory data being processed to uncover the emotions you are currently feeling. If we could create an intelligent, viable system for reading emotions from people beyond what they're saying—because people aren't always able or willing to talk about what those emotions are… [When] you can read those emotions through technology, it creates this whole other medium for enhancing communication. By connecting to people that way, you can create this whole… emotional language. A way of communicating with each other that's nonverbal. You could have a device that would read, for example, sensory input from your partner… and you'd have a device in your ear saying, "Hey, go faster. Go slower," that kind of thing.

Ian Kerner: You know, Gareth, the other thing that occurs to me is people aren't always educated about sex. They're not always very good at sex. They certainly don't communicate about sex. You could be lying in bed next to somebody and be a million miles apart. All you need to do is say something but you're not gonna say something. So people are actually not very sexually intelligent. There was that movie Her, with Scarlett Johansson, she was an artificially intelligent… something, right? [There] is a very powerful sex scene and it happens in the verbal aspect. It reminds you… so much of arousal, so much of sexual fulfillment is not just physiological stimuli. …It's mental stimuli. And I wonder if, in a way, it would take artificial, sexual [beings] to help us be more sexually intelligent as people.

H&F: So in that sense, could it be used possibly as an educational tool? To get closer to your partner or learn how to read people better? Or even how to learn yourself. Because like Gareth was saying, a lot of these machines go off of physical cues to gauge someone's emotions when I don't know if people are necessarily as in touch with their own emotions.

Ian Kerner: The teenage boys would go off to sex camp for a couple of weeks, with the artificially intelligent sex bots… they'll learn about "cliteracy."

H&F: Would we ever go there as a society? Would that be a good thing or a bad thing to actually create—in a broader sense, sexual education through AI?

The teenage boys would go off to sex camp for a couple of weeks, with the artificially intelligent sex bots… they'll learn about "cliteracy."

Kate Weinberg: I think we should get to a place where we have sexual education in general. Whether or not we get to a place where we have robots with great clitorises—which, that'd be great, too, as an additional tool. But I think so many of our sexual problems as a culture and a society result from that basic level of sex education being missing.

Dorothy Howard: I'm also interested in [focusing] on LGBT communities and the use of technology. Often, people who aren't necessarily socially accepted in their day-to-day lives turn to online communities to really be themselves. To feel like, "OK, this is a space where I can find people and feel protected, and I wouldn't be able to do that on the street." People with types of social trauma, as mentioned earlier, might feel more comfortable in chat rooms or interacting online with other humans for any number of reasons. And people who don't want or like to interact with other humans have another option.

H&F: Would this be more beneficial in a training wheels situation so then they can go out into the world and feel confident and comfortable navigating human relationships? Or would that be a potentially acceptable end-game situation? Like, someone's sexual preference could be AI.

Dorothy Howard: I think it's a really interesting question. A lot of the points we've mentioned touch on whether the responsible use of these technologies is a question of not losing our humanity—trying to keep our humanity and human sociality because it's preferable to relationships with machines. That's an open question.

Gareth Stoneman: It's a "should" over a "what would happen" sort of situation. We're all pretty intelligent. We're sexually aware… so yes, these things should be used in the service of humanity. They could be used for good; they should be used to open people's minds and prepare people for relationships. But market forces are [powerful], so what will probably happen is we'll just get fuck bots… because that's what the marketplace will find a use [for]. In all these artificial intelligence conversations, you can have high ideals about what should be done. As it actually happens, it's often very different from that, because the technology is used to produce something very base-level. It's simpler and easier to harness. So what will happen first? Probably not the high-minded things we're discussing here. I think all you can really do, as part of the kind of people who are creating this technology, is participate in the advocacy that needs to happen to make sure the technology develops in a responsible and beneficial way. Advocacy for doing the right thing when building these things.