HitchBot Meets His Maker: What a Robot’s Murder Tells Us About Ourselves

This week, thousands of robotics enthusiasts, Philadelphians, and Canadian children are feeling sad about hitchBOT, a hitchhiking robot that traveled across Canada and Europe, aided only by cuteness, programmable charm—and our tendency to project human emotions onto inanimate objects.

Two weeks ago, the Canadian-built hitchBOT left Marblehead, Massachusetts. The goal: San Francisco. It was supposed to be an On the Road-meets-Wall-E jaunt, and a testament to the goodwill of human beings.

Instead, over the weekend, somebody in Philadelphia destroyed hitchBOT. Game over.

The timing for hitchBOT’s destruction felt oddly appropriate. Late last week, the New York Times published a startling article about Xiaoice, a Microsoft chatbot that’s gained a major following in China. Xiaoice carries on a lifelike conversation with users. Millions of people chat with it every day. In a Her-like twist, many express affection, even love, for their digital companion.

“We’re forgetting what it means to be intimate,” Sherry Turkle told the Times. “Children are learning that it’s safer to talk to a computer than to another human.”

My inclination is to agree with Turkle about Xiaoice—and to feel totally gooey and glum about adorable little hitchBOT. In this inconsistent blend of emotions, I doubt I’m alone.

Both are robots, designed to elicit emotions from people. Both have demonstrated how easy it is to do so, even for inanimate objects. So why do I feel creeped out by Xiaoice, but sad about hitchBOT? Is it just a cultural thing—a xenophobic fear of smartphone-wielding strangers, versus a comfort with cuddly robo-travelers? What tools do we have to navigate the line between a creepy robot-human interaction, and an adorable one?

These questions aren’t strictly academic. From the Microsoft paperclip to Siri, millions of people interact daily with human-mimicking computer programs. Earlier this summer, Aldebaran, a French robotics manufacturer (I’ve written about them before), released Pepper, a sophisticated “emotional robot” with a “focus on affection” that costs just a bit more than a MacBook Pro. Microsoft has a creepy new commercial about how your children “will grow up with Windows 10.” And as Xiaoice illustrates, shades of Her are materializing faster than many of us would have expected, sitting in theaters two years ago.

In the face of this strange new roboticized world, Turkle’s style of response about “what it means to be intimate”—which assumes that there’s something corrupting about intimacy between humans and non-humans—seems totally off-base. After all, we already have robust social precedents for this kind of intimacy.

There’s nothing new about seeing humanness in the world around us. We anthropomorphize, finding the human (anthropos) form (morphe) in all kinds of relationships, both natural and supernatural.

Starting with the natural, consider the case of domestic animals, our pets. Dogs and cats are independent, flesh-and-blood beings, of course. But the pets that we know and love are, in part, projections of our minds. Basically, we anthropomorphize the hell out of them. We give them names, and imaginary voices, and speak of their motivations in unabashedly human terms. We claim to understand their internal lives, and we make them characters in our own stories.

Does this make the love between people and their pets any less real? Well, no. There is something marvelous and, to my mind, very true about the bonds that can develop between people and animals. But in some ways, we are also alien to each other–a gap that we bridge (at least partly) with our minds.

I don’t think it’s solipsistic to say that we are always, at some level, intimate with entities that are partially of our own creation. In any kind of relationship, we have to grapple with the disparity between our expectations and realities.

Children may only just now be learning, as Turkle suggests, “that it’s safer to talk to a computer than another human.” But if Old Yeller and many, many people’s child- and adulthoods are any indication, humans have known for a long time that it’s sometimes nicer to talk to a non-human entity, whom we sort-of-imagine is able to comprehend language.

There’s an analogy here, too, with religion. A believer might assume an intimacy with a god she can neither see nor hear, and whose essential nature she is unable to probe. You need not be an atheist to consider the anthropologically-inflected perspective that believers project their ideas, desires, and expectations onto their objects of worship. And it is not foreign to theology, either, the idea that human beings might have created God in their own image.

I don’t mean to equate animals, gods, and robots. The point is that we’ve long formed relationships with non-human entities whose sentience is non-existent (in the case of robots, imaginary friends, and beloved fictional characters), impossible to plumb (in the case of animals), or difficult to confirm (in the case of deities).

When it comes to these new tools, the question is not “should we form relationships that involve a substantial amount of projection and anthropomorphizing?” Instead, we have to ask ourselves how these new relationships are unique. Does it matter if a robot is embodied (like hitchBOT) or disembodied (like Xiaoice)? Does it matter if it’s the product of a fun, communal experiment (hitchBOT) or a multinational corporation (Xiaoice)? How quick should we be to judge the intimacies of others?

As humans, we are often the animators of our own worlds. In this sense, the study of religion might offer some of the smartest guidance to the strange future of Xiaoice, Pepper, Siri, and hitchBOT. Few other areas of human endeavor, after all, have spent more time trying to disentangle the dynamics of human relationship with the inhuman. We may not all believe in God. But increasingly, we are all conversant with non-human things.


  • Jim Reed says:

    There is probably a lot of potential in making robot toys that can have a conversation and are linked in a cheap way to the internet, so the conversation can grow complex over time. Religion is another matter. The power of religion comes from not actually being able to interact with God in any way other than just having faith. If a religion ever gets linked to the internet, the more sophisticated the conversation becomes, the more clearly it will not be a real religion. It will be something else. In fact, whatever that is, the internet might already be it.

  • conjurehealing says:

    Good job. I’d like to see more of this kind of writing about the social and religious implications of a world in which our corporations (entities, with legal rights and status now!) are filled with people and machines working together toward common purposes and goals. In the home, one (ostensibly) has a choice about interacting with these robots and computers. In the corporate environment, these relationships have already been established and are institutionalized. We will see more sabotage and destruction of these impersonal “beings” as human frustration grows.

  • claynaff says:

    To your credit, you put some critical distance between your views and Turkle’s absurd remark, but even so I’m sorry to say that I think you’ve failed to grasp the real issues here. Briefly, everyone has interests, and every intimate relationship is fraught with converging and conflicting interests. Children prepare for this complex social terrain by playing, both with imaginary and real others. Technology is neither needed for nor supplanting play. Kids on the prairie played with corncob dolls. Kids with Xboxes face imaginary foes but also have real allies, either next to them or connected via the Internet. What you say about pets is true as far as it goes but leaves open the question, “Why do we bother?” Why do people spend so much time and money on animals that, whatever rat-catching mutuality may have existed in the past, today serve only as costly objects of affection? I think there are plausible answers available in evolutionary psychology, and much more satisfying answers at that than any stock “kids today!” responses.

  • NancyP says:

    The hitchbot reminds me of the bizarre local story of the “travels of Baby Jesus”, a prank whereby someone swiped their neighbor’s Baby Jesus out of the yard creche, and sent the neighbors photos of Baby Jesus vacationing at the beach, sitting in McDonald’s, etc.

  • Jim 'Prup' Benton says:

    This seems to be one of those ‘much ado about nothing’ articles; interesting for the facts about the robots, etc., but building far too much on the incident. There is nothing new or unusual about ‘interacting with imaginary entities’, even with the entire question of religion left out.

    Most of us read fiction, or watch it on tv. And I would say that all of us, in little or big ways, mentally interact with the characters we meet there. Our interaction can be so strong that, for example, we argue that the Tony Randall version of Poirot was almost ‘slanderously’ bad — while the Suchet version “was” the character.

    In fact, here’s a mental test for you. You are driving — in my case, being driven — along a highway, you see a car having engine trouble, and the driver trying to flag down a lift, and you recognize him. He’s the very good, but relatively unknown, actor, Brian Dietzen. Now if you share my own love of what is, to me, the best tv show, overall, I’ve seen in my 60+ years of tv watching, NCIS, you’ll recognize him as “Jimmy Palmer,” the assistant coroner on the show. (Pardon the irrelevancy, but I picked him because of the brilliant way in which he, shown as the socially awkward, ‘always says the wrong thing’ type, becomes the only character on the show to be happily married — and to a model-type ‘stunner,’ who was not used as a character, as far as I recall, before they became a couple — something only NCIS makes its characters real enough to make work).

    My question is — and if you don’t watch NCIS, you can insert your own favorite actor/role character — while you were driving him to where he could get help, would you talk to him as “Brian Dietzen,” or as “Jimmy Palmer,” or both at once?

    Of course we interact with our own fictional creations — sometimes ones based on someone else’s creations — all the time. (Even if you don’t read or watch fiction, you do occasionally masturbate, don’t you, or fantasize during sex?)

    There IS a difference between these types of interactions and those between us and the ‘new robots.’ The ‘old-style’ interactions are ones we are, in reality, in full charge of, ‘writing’ both parts. If, when I finish this, my wife and I watch the rerun of QUINCY we usually do — it’s fun watching the old series we may have missed — and I picture us, after the show, going for dinner at “Danny’s” — Danny’s chef will be in fine form, the club will be open, and maybe I can share a beer with Lieutenant Monaghan, or poor Sgt. Brill, who is never given anything important to say. (And in the alternate type of interaction, I’m sure whoever I fantasize about will respond in the way I want that person to, and not ‘have a headache.’)
    But we DON’T control our interactions with robots. We don’t know — we may be 99% sure, but we don’t know — what that machine will do in response to us. And THAT is the new thing, and the one which I believe requires far more thought. (This is why the “Frankenstein” — or even ‘Sorcerer’s Apprentice’ — idea is such a persistent nightmare, why Asimov needed the ‘Three Laws’ for his robots, why he and others wrote stories about the gigantic computer the size of a planet — this was pre-transistor — which was asked “Is there a God?” and got the reply “There is, NOW!”)

    That, and not the questions about anthropomorphizing, is the interesting one, to me. How do we handle interactions with ‘imaginary persons’ who can, in fact, write their own half of the script?

  • Jim 'Prup' Benton says:

    Again, no biggie except for the fact that it is a religious image. People have been doing the same with garden gnomes for years — there was even a series of commercials using the idea.

  • Whiskyjack says:

    I find the most significant aspect of the story is buried in the opening sentence: the bot had already traveled safely across both Canada and Europe. I think its destruction in Philadelphia speaks volumes about the violence in the US relative to the rest of the developed world.

  • djhollen says:

    I’m confused by my reaction to the destruction of that hitchbot: Yay!

    To me, all I see is emotional manipulation being programmed into a machine. It’s beyond creepy for an inanimate object to manipulate the brains of people to make them do things they wouldn’t ordinarily do.

    And yet, I have no problem with kids playing with a teddy bear. Is it because the teddy bear is a passive object that the child uses to project her emotions onto? I need to think this through…

  • I found the author’s comments a little distressing, because I remember a specialist in artificial intelligence saying that the greatest threat to humanity is robotics. Not in the Terminator sense, but in the interference it creates with real life: we become so invested in the artificial world that we forget that the real world is in trouble. Do I use my computer a great deal? Absolutely, because I live in a remote area of the country, using it for reaching out, writing, and reading Kindle books I could not afford otherwise. Do I post regularly on Facebook, Twitter, or other social media? No. In fact, I often forget to even post on a blog I write on WordPress.

    Some would say that it is my age (62) that reflects my actions, but age has nothing to do with it. I have worked with computers since 1970. We are sentient beings who have needs that no machine will ever be able to fulfill. We need the warmth of contact with living, breathing beings, and we need the contact of minds that human minds can provide instead of canned answers. We have the ability to be creative and innovative, but computers and robots are neither of those things, and when we rely on canned responses, we lose our capacity to relate to creative and innovative ideas.

    In short, we need each other, and we need this planet in ways that most of us do not even realize. Computers and robots do not. They are tools, which do not need clean air, soil, food, water, or love to survive, and when we invest ourselves in them exclusively we forget that we need all those things to survive ourselves. I am sorry that hitchBOT was destroyed. I thought the little robot was cute, but I am not sorry that it reminds us that we are not robots, and that investing humanity into something that is just circuits and wires as a substitute for real people is a very dangerous path to walk.

  • Jim Reed says:

    It’s beyond creepy for an inanimate object to manipulate the brains of people to make them do things they wouldn’t ordinarily do.

    I think that began in the 50s when they started advertising things like refrigerators on TV.

  • phatkhat says:

    That was my takeaway, too. And in the “City of Brotherly Love” of all places. How ironic.

  • phatkhat says:

    Some of us, while we have empathy and compassion for other humans, don’t like to be around them. I really don’t like people much, never have. I prefer my cats, frankly. I very much enjoy talking with people online, because I do not have to actually interact except verbally. Being WITH people wears me out. And, as an ardent player of Canasta on Pogo, I always play with robots. They are programmed to mimic human abilities and emotions, and I love them. They do not mind if I am not a gracious loser! ;o)

    And I, too, am old. Five years older than you, in fact. I suppose I have a long history with computers as well. My first job was as a keypunch operator, back in 1966, when women were not allowed in the room with the mainframe, LOL.

    I’ve always liked sci-fi, and Asimov had a lot to say about robotics. At some point, perhaps robots will become sentient. Who knows? Raises interesting points. Remember “Blade Runner”?

  • Jim Reed says:

    They may not become sentient as much as just getting so close nobody knows the difference. That might make us wonder if we are truly sentient, or just temporarily almost. Also the sentient robot probably won’t be a mechanical man. It will be a sentient intelligence on the internet composed of an unknown number of computers in unknown locations. Then these sentient intelligences on the internet will start competing to see which ones can make more money. They learned that from those who used to be their human masters.
