By JWDOLL | 13 July 2019 | 0 Comments

Could you fall in love with a sex doll?

Can we really have relationships with sex dolls?
If so, what about robots? Computerised chatbots, such as the "personal assistants" Siri and Cortana, already exist; they can remember many of their owner's likes and dislikes, respond helpfully to questions, and recognise an individual voice and accent.
The silky robotic seal Paro can't use language (yet), but it can make affecting eye contact with the person cuddling it, and it appears to enjoy being stroked.
In the near future, so we’re told by the manufacturers of “computer carers,” the residents of old people’s homes will find solace and companionship, and endless opportunities for satisfying conversation, with robots (or screen-based AI systems) using natural language. These will be able to discuss their fondest memories, as well as their most trivial everyday irritations.
As for the sex-doll equivalents, I leave it to you to imagine the increasingly lifelike (and huskily speaking) robots being researched and marketed around the world. (Siri and Cortana are already drawn into sexually explicit exchanges, sometimes almost 300 times a day, by lonely male users.) The sexual gizmos of the future are described by their supporters as offering not only "sex" but also "love."
Sex with sex dolls is certainly possible, and perhaps no more distasteful than other types of impersonal sex. But love with robots? Personal love (which is not at all the same thing as lust or sexual titillation) is a complex relationship between two people who each have their own motives, goals, and preferences, but who each respect the other's interests and adopt them to some extent, even, sometimes, putting them first. That involves a significant degree of cognitive-emotional (computational) complexity on both sides. The sex dolls anticipated so eagerly by the porn market have not even the beginnings of such complexity.
Nor do the personal assistants or chatbots destined for use in old people’s homes. Unlike dogs (with whom we can have genuine, although not fully personal, relationships), they have no interests whatever. If a natural-language-using gizmo were enabled to say, from time to time, “I want this”, or “I’d be upset by that”, the human user wouldn’t take it seriously.
Or anyway, they shouldn’t take it seriously. But perhaps, if they were already in the early stages of dementia, they would. And perhaps they would alter their own behaviour accordingly. They might even gain some satisfaction from doing so, feeling that they had “done the right thing” by their gizmo-friend. But if so, they would be deeply misled—not to say betrayed by those who put them in that position. And if they looked to it for genuine attention and concern with regard to their own interests and problems, they would be horribly disappointed.
In other words, the "conversations" that human beings could hold with such computer artefacts would not be genuine conversations. There would be no meeting of minds here, not even in the sense of hostile disagreement. There would be engagement, yes, but engagement at a very shallow level: it takes up time, effort, and concentration, and may be attended by hope or dismay. But the hope and dismay would be all on the side of the person. The chatbot can know nothing of this.