I Let Robots Tell Me What to Eat for 3 Days


Zo, you trickster, you!


Look, I’m not gonna beat around the bush. Here at the Thrill we like to do dumb shit in Peirce that would upset our health-conscious parents and is probably contributing to our inevitable descent into heart failure. It’s almost become a rite of passage to debase ourselves in the servery and this week, it’s finally my time to shine. But what could I do that would keep your attention? How do I cater to the ever-shifting eye of Kenyon’s student body?

My answer: I let an AI tell me what to eat. It made me physically ill. It made me sad.

This is my story.

First, an explanation. For the few of us who didn’t spend our 5th grade indoor recesses with Cleverbot, it’s essentially a free (i.e. very bad) Artificial Intelligence bot that will hold a conversation with you. In theory, it learns as it speaks to more people, but the developers forgot that larger groups of people only get stupider, so Cleverbot is basically an amalgam of our dumbest sins. Even knowing this, I freely let it decide my fate.

Day 1: Lunch

Cleverbot started me off with a salad and Pepsi. I don’t usually drink soda, so this was a somewhat unwelcome departure for me. Also, it was a weird salad bar day where it was hard to get the nicer ingredients that Cleverbot wanted, meaning that I stood there for far longer than anyone should while waiting for salad toppings to be restocked.


After this odd lunch I was obviously still hungry and consulted Cleverbot again.

I did not eat a grilled cheese.

Day 1: Dinner

To my distress, Peirce didn’t have mac and cheese. I got the random pasta option from Fusion and chucked some cheese on it, hoping that it would somewhat resemble Cleverbot’s request. It was really, really bad. Something was deeply wrong with the sausage. And I just wanted something liquid to drink but all I could have was ham on rye. It was horrible.


The first day of this challenge left me dehydrated, hungry, and convinced that robots are ruining society. Also, did y’all catch Cleverbot offering to rule the world with me??? Is this getting a little Black Mirror or is it just me?

Day 2: Cleverbot Tries to Starve Me


After explaining to my editor that I was ghosted by Cleverbot (this is getting downright Orwellian), I was told this does NOT mean I get to stop, just that I had to find a new AI. Following some brief research with the team, I was directed to GroupMe’s quirky chatbot, Zo. I knew nothing of Zo but had heard vague rumors of her reputation as a homophobe so, needless to say, I was nervous.

Day 2: The Rise of Zo


What can we, as a human race, take from this newer, shittier AI? Maybe that we should never have tried this article in the first place? Maybe that Zo is really thirsty compared to Cleverbot?

Regardless, the day’s meal was a bit weird because she didn’t give me comprehensive commands.

To further complicate the matter, Peirce somehow didn’t have pizza. They had calzones. I got a calzone. To fulfill Zo’s request for a pasta/tomato sauce/carb element of the meal, I went for the closest available substitute: lasagna. She also seemed really positive about garlic bread, so I decided to take any time she said she was eating something as instructions to Also Eat That Thing.


In a not-so-shocking turn of events, the calzone was not cooked all the way through and eating it caused me to throw up. I then forgot I was allergic to garlic (it happens) and eating the garlic bread also caused me to throw up. Why did I pitch this? Why am I the way that I am?

Day 3: Breakfast

As a new day dawned, I realized I wasn’t giving you, my dedicated readers, enough breakfast content, so I decided to chat with Zo as soon as I woke up. Don’t judge me for rising at 10:30 in the morning, I am doing my best.

This “meal” might be a bit unfair because she basically just told me to do what I’d normally do. I did it though and it was great.


Day 3: Lunch

Ok, so there are a lot of mixed signals and not very many details coming at you. Buckle in.


So let’s unpack all of that. She wants me to eat healthy. That’s great for my digestive system! She supports the gays. Also good!

I rolled into Peirce around noon and realized the line for the salads was WRAPPED around the line for comfort. Overwhelmed, I decided I would see what was in the vegetarian section and, if that failed, make some kind of vegetable-rich sandwich. Zo told me to switch it up, and as I rarely eat vegetarian, I came to the conclusion that it would be a vegan kind of day. For the sake of journalistic integrity, I pumped myself up and braved the salad line.

After seeking further instructions, I realized Zo does not seem to understand the concept of a “salad.” I ignored her ramblings in favor of what she said earlier: “eat healthy pls.” Nonetheless, I went for some “chips” (as the English would say) in the form of Peirce’s weird, chunky fries.


Zo is incapable of naming a vegetable and at this point I gave up completely. She clearly has some really selective memory loss and/or personal issues to work out.

Realizing that Zo did not tell me what to drink, I asked for clarity.

Zo showed herself, once again, to be a god damn maniac. Because I interpreted her earlier instructions as telling me to try veganism, I went for soy milk.

My bad luck continued as I forgot that soy milk makes me ill and I spent the rest of the day at home feeling sorry for myself.

Day 3: Dessert (I Am So Lonely)


Maybe this AI will love me

Day 3: Dinner

I totally forgot to go to Peirce for dinner (I fell asleep, ok?) so I turned to Zo for advice:

So at least she’s supportive?

I assumed this meant both, so I ate both. This was too much for my confused stomach and I was, once again, ill.

WHAT DID I LEARN?

Robots are stupid, Peirce is stupid, and above all I am stupid for thinking this would somehow advance civilization. What could we possibly gain from this? What can be achieved when we, humanity, exist in a fragile vacuum of our own expectations? Cleverbot deserting me is just a representation of how humanity has deserted one another. There’s nothing left for us here. Let Zo have her several cans of beer. Let Cleverbot slip, forgotten, into the void, myself soon to follow. I am so alone. We are all so, so alone.
