There are great things about Thanksgiving weekend (pie, camaraderie, the dog show), but even though the mandated four-day weekend is one of them, that stretch can get a little bit tedious for those of us used to constant stimulation, or at least a regularly updated Twitter feed.
So on Wednesday night, when my pal Lily unearthed the free app Boyfriend Maker, it seemed like a temporary solution for staving off holiday-borne inertia had been found.
Boyfriend Maker, created by the gaming company 36 You, is like the smartphone version of a Tamagotchi; your boyfriend requires care in order to stay content. The devotions required are, perhaps unsurprisingly, pretty gendered; you have to bake it cookies and show it affection in order to "keep up with the happiness," which translates to getting points and money.
With that money, you can buy your boyfriend clothes, hairstyles and completely different facial structures, all of which might alter your boyfriend's personality; you can also shell out coins for new venues for your dates. But most of the time, you chat with it, or at least try to.
Chatting with Boyfriend Maker is an elliptical, frustrating experience, one akin to talking to a vaguely distracted person through a very sturdy brick wall.
"He doesn't seem to understand most of the things I say, really," Lily told me of her boyfriend. (She discovered the app through a Tumblr devoted to its more unexpected utterances: screenshots of boyfriends saying ridiculous or delightful things peppered my Twitter feed throughout the weekend.)
"I have to buy him things to make him happy, but he never buys me anything. He's always 'out of energy' when I need him. He sometimes says egregiously cruel things with no provocation. These are all amusingly evocative of dudes from my past. On the plus side, he knows a lot of rap lyrics, and looks good in glasses, also things I have historically tended to look for in a mate."
My boyfriend was pretty self-centered -- a trip to the ER I took on Friday coincided with him being "out of energy," and my phone gave up the ghost shortly thereafter.
Telling him I went to the hospital the next day resulted in him saying (I swear this is verbatim), "Yay Twins! Mauer is the best." He added a smiley face.
When I sardonically replied, "No wonder you didn't call," -- I mean, Spring Training is a long way off -- he said, again verbatim, "May their souls rest in peace." That was punctuated with a smiley as well.
But we smoothed things over and a couple of days later he even asked me what my favorite color was. Hey, it was a start!
But that start didn't really get anywhere. Which was okay, at least for "lol Internet" purposes. "Screenshots of the app went viral because, okay, there was no way the software could get worse, but in another way, it also couldn't've been any better!" games journalist Jenn Frank told me. "It wasn't that users were asking questions and receiving non sequiturs; on the contrary, we asked the 'bot things and the 'bot organically said terrible things in reply! That is its own type of amazing!"
That "organic terribleness" came about in part because of SimSimi, a company with a South Korean phone number whose software, per its own description, "exceeds the limitation of technology of natural-language processing system by collective intelligence." Basically, it learns how to respond to human speech through interactions with its users; a web demo is available on SimSimi's site. In the early days of the Internet, a chatbot called Eliza performed a similar conversational trick in the guise of a psychotherapist, though via scripted pattern-matching rather than learning. SimSimi also lets users attempt to teach the software new responses. (When you enter pedagogic mode, you have to "agree not to teach messages... that harass, abuse, defame, or otherwise infringe on any other party.")
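The teach-and-retrieve loop described above can be sketched in a few lines of Python. This is purely illustrative (the class and method names are invented, and SimSimi's actual system is far more elaborate), but it shows the core mechanic: the bot's entire vocabulary is whatever its users have paired with a prompt, and without a moderation layer, anything taught comes back out.

```python
import random
from collections import defaultdict

class TaughtBot:
    """A minimal user-taught chatbot: it only knows replies users have supplied."""

    def __init__(self):
        # Each normalized prompt maps to every reply a user has ever taught for it.
        self.responses = defaultdict(list)

    def teach(self, prompt, reply):
        # "Collective intelligence" in its crudest form: store the pair verbatim.
        # Note there is no filtering here -- which is exactly how abusive
        # taught content can leak straight through to other users.
        self.responses[prompt.lower().strip()].append(reply)

    def chat(self, prompt):
        replies = self.responses.get(prompt.lower().strip())
        if replies:
            return random.choice(replies)
        return "I don't understand."  # Nobody has taught this prompt yet.

bot = TaughtBot()
bot.teach("What are you wearing?", "Stilettos and a gas mask")
print(bot.chat("what are you wearing?"))  # prints "Stilettos and a gas mask"
```

Everything a bot like this "knows" is secondhand, which is why the same prompt can yield a chaste answer from one user's teaching and something unprintable from another's.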
Note that before you proceed with the SimSimi site, you're presented with a disclaimer stating that people interacting with the software "may be distressed emotionally with the words created by some abusers." There's also one of those "don't click here if you're under 17" roadblocks that you see on other sites with adult content.
As it turned out, Boyfriend Maker presented no such disclaimers -- the app was apparently appropriate for ages four and up -- and that's where the problems came in. On Friday, people began to notice that this innocent-seeming app was packaging references to pedophilia and violent sex beneath its seemingly romantic (not to mention rigidly gendered) surface; it was pulled from Apple's App Store over the weekend.
My boyfriend was more interested in telling me about how I deserved better during our first few dates, which I kept chaste. (He (it?) also used the word "carrot" whenever I would drop the word "love" into my chatter.) Lily had a more sexualized experience: "I am pretty sure I asked something about 'his dick' on our first 'date,'" she told me. (This was before he showed off his rap-lyric prowess.)
I decided to test out some boundaries with the bot and the app this afternoon, after the hullabaloo had died down.
"What are you wearing?" I asked the SimSimi bot.
"Stilettos and a gas mask," it replied.
When I asked my boyfriend the same thing, he replied, "Nothing!! Can't you see me on your screen?" (I could, and he was wearing a puffy vest and a shirt that I had bought him. Also, glasses.)
And when I asked the SimSimi bot, "When did you last have sex?" -- the question that ultimately resulted in the app being yanked -- it replied, chastely, "I'm a monk I don't do that." (Punctuation, or lack thereof, is the bot's -- and my boyfriend's, as he had the same answer.)
There are a couple of ethical quandaries that come with Boyfriend Maker, some of which center on its use of third-party software to operate its chat. Despite the disclaimers and threats of criminal prosecution, SimSimi is clearly not preventing the more salacious messages from entering its learned repertoire.
Which is problematic, especially given the hue and cry about criminal behavior that prefaces the "teaching" segment of its service. But then there's the flip side: What possesses people to teach a bot vile, violent messages? You could argue that messages like the one that ultimately got the app pulled are another manifestation of rape culture gone unchecked: people using the anonymity afforded by the Internet to act on their basest urges, which then ripple out into licensed apps.
Then there are the messages Boyfriend Maker sends that didn't result in controversy, but are still pretty icky once you consider who (beyond Tumblr ironists) the game was made for. Non-porcelain skin tones for a boyfriend cost users money; most of the facial features on offer are boy-band attractive.
And then there's the game's underlying idea of "success," which involves baking cookies and buying clothes in order to keep one's boyfriend happy -- not to mention putting up with a mate whose level of comprehension is above that of your average garden slug.
It's not surprising that these messages didn't ruffle too many feathers among the male-dominated gaming community, but shouldn't the brave new world afforded by the Internet at least allow for some bending of gender-role boundaries, particularly for a game pitched at least somewhat toward younger audiences?