This is what happens when you get too like-happy on Facebook

There's a great Andy Warhol quote you've probably seen: "I think everybody should like everybody." You can buy posters and plates with pictures of Warhol and that phrase plastered across his face in Helvetica. But when you view Warhol's quip in its full context, from a 1963 interview in ARTnews, it is just as much a prescient description of how we interact on social media today as it is a definition of pop art.

Warhol: Everybody looks alike and acts alike, and we're getting more and more that way. I think everybody should be a machine. I think everybody should like everybody.

ARTnews: Is that what pop art is all about?

Warhol: Yes. It's liking things.

ARTnews: And liking things is like being a machine?

Warhol: Yes, because you do the same thing every time. You do it over and over again.

This sounds a lot like Facebook, where the default response is a "like." New job? Like. ASOS has ten per cent off with free shipping today only? Like. Bedbugs? Oh, I'm so sorry. Like. By putting that binary option on everything it shows us, Facebook encourages us to be really efficient, Warhol-esque liking machines. And every like informs Facebook's algorithm, which uses that data to feed you more stuff it thinks you will like. By that logic, the more you like, the more you will like, in an ever-escalating spiral of satisfaction. To follow that to its logical end, in Facebook's perfect world we would like everything we see -- from status updates to news stories to ads. If its algorithm truly works as intended, we shouldn't be able to stop ourselves from liking all the stuff it shows us.

That, of course, would be just fine with Facebook's advertisers.

Ad budgets are won or lost based on how many people give an ad or brand the thumbs-up. It may seem insignificant to you, but the fortunes of ad agencies, media empires and Facebook itself hang on your every click. Liking is an economic act.

This summer, I decided to be Facebook's perfect user and like everything I saw. For 48 hours, I liked everything it sent my way -- status updates, suggested pages, ads -- even if I hated it. I wanted to see how it would affect what Facebook showed me. I wanted to see what would change if I constantly rewarded the robots offering up News Feed content, if I continually said, "Good job, robot, I like this." The results, it turned out, were rapid and dramatic.

The first thing I liked was LivingSocial, a discount service. My friend Jay liked it, a fact that was announced at the top of my feed. Then I liked two status updates from other friends. So far, so good. But the fourth thing was something I actively disliked: a bad joke - or at least a dumb one. I liked it anyway.

Right away, Facebook responded to my newfound appreciation by giving me more to appreciate. You might have noticed that when you like an article on Facebook, it often responds by suggesting other items it thinks you might also be interested in. Let's say you like a story about cows that you see on the Modern Farmer website.

Facebook will immediately present you with three more options below that cow story: "related links", in Facebook parlance. Probably more stories about cows.

Related links quickly became a problem, because as soon as I liked the four related links below a brand, Facebook gave me four more. And then four more. And then four more. If I kept it up, I'd be stuck in an eternal loop of related links. So I made a rule: I would like the first four, but no more.

Sometimes liking was awkward. My friend Hillary posted a picture of her toddler, Pearl, with bruises on her face. It was titled "Pearl versus the concrete." I didn't like it at all! It was sad. Normally this was the kind of News Feed item that would compel me to leave a comment, instead of hitting a button. Oh well. Like.

I liked one of my cousin's updates, which he had re-shared from US politician Joe Kennedy, and was besieged with Kennedys to like.

I liked The New York Times. I liked Coupon Clipinista.

After an hour, there were no human beings in my feed. For all the talk about Facebook as a social network, this was a stark reminder that it ultimately exists to get me to click on ads.

Likewise, content mills rose to the top. Upworthy and the Huffington Post owned nearly my entire feed. That first night, as I scrolled through my News Feed, the updates I saw were (in order):

Huffington Post, Upworthy, Huffington Post, Upworthy, a Levi's ad, Space.com, Huffington Post, Upworthy.

When I checked my phone just before bed, I saw a conservative post about Gaza. Ah, crap. This was a fraught issue that I was not eager to weigh in on. I pressed Like and turned in for the night.

By the next morning, my News Feed had moved very, very far to the right. I was offered the chance to like the US Second Amendment (the one about the right to bear arms) and some anti-immigrant page. I liked them both. I liked Tea Party-affiliated Ted Cruz. I liked former Republican US presidential candidate Rick Perry. The Conservative Tribune came up again and again. I got to learn its very particular syntax.

Once I saw this pattern, I started noticing it everywhere. And it wasn't just employed by upstart publications you've never heard of. SFGate, the San Francisco Chronicle's website, uses a similar tactic: a very specific form of Facebook messaging, designed to provoke you up front and then close with a question that invites a response. If you take the bait and like the post, you'll be shown more and more from that publisher.

I was also weirded out to see that my laptop and mobile News Feeds were becoming increasingly divergent. On the laptop, although I still saw mostly branded content, I continued to see the odd update from my friends. But in less than 24 hours, my mobile feed was nearly devoid of human content. I was only presented with the chance to like ads or stories from websites. On that little screen, Facebook's robots decided that the way to keep my attention was by hiding the people and showing me only what machines had pumped out.

By day two I began to dread visiting Facebook. My News Feed had not only drifted further right, it had oddly also drifted further left -- a digest of bipartisan extremism. What began as scattershot likes of random stories had snowballed into rigid ideology. Leftie posts from MSNBC's Rachel Maddow, The Raw Story and Daily Kos were interspersed with items that were so right-wing that I was afraid liking them would land me on a watch list.

This is a problem much bigger than Facebook. It reminded me of how we talk at each other instead of to each other. We set up our political and social filter bubbles and they reinforce themselves.

Our media diets become hyperniche feeds that cater to our prejudices and never give us any other perspective. We go down rabbit holes of special interests until we're lost in the queen's garden, cursing everyone above ground.

Worse than the fractious political tone my feed took on was how deeply stupid it became. I was given the chance to like a BuzzFeed post of some guy dancing, and another that asked "Which Titanic Character Are You?" A third BuzzFeed post informed me that "Katy Perry's Backup Dancer Is the Man Candy You Deserve." "A cloud that looks like a penis." "Stop what you're doing and look at this baby who looks exactly like Jay Z." My feed was showing the worst kind of media tripe. I liked it all.

Although I expected that what I saw in my News Feed might change, I never expected my behaviour to have an impact on my friends' experiences. That first night, my friend John sent me a message. "Have you been hacked?" The next morning, another friend sent a note. "My fb feed is literally full of articles you like, it's kind of funny," she said. "No friend stuff, just Honan likes."

I replied with a thumbs-up.

Eventually, I would hear from someone who worked at Facebook, who had noticed my activity and wanted to connect me with the company's PR department. The person I got in touch with explained that my News Feed was performing as it should. I was liking all kinds of updates and pages from brands that I normally wouldn't have, so of course it showed me more of them. "Your News Feed is what you make it," the spokesperson explained in an email. "You connected with over 1,000 new pages in 48 hours, and your News Feed changed to show you mostly page content, triggered by these new connections. If you had made 1,000 new friends in 48 hours, your News Feed would be mostly new-friend content."

Maybe so. And it does speak to how adaptive the News Feed is.

But the thing is, I was also liking every update I saw from my friends. Yet in just a day, those updates from actual human beings largely vanished. Maybe that's because Facebook rewards volume over substance. The more content an outlet churned out, the more likely I was to see it and the more likely I was to interact with it, which meant the more likely I was to see more of the same. That meant that publishers and advertisers won out.
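The volume-over-substance snowball can be illustrated with a toy simulation. To be clear, this is a hypothetical sketch, not Facebook's actual ranking algorithm: it assumes each source's score is simply its learned "like weight" multiplied by how many posts it produces, and that every shown post gets liked, as in the experiment. The source names and volumes are invented for illustration.

```python
from collections import Counter

# Hypothetical posting volumes per cycle (not real figures).
VOLUME = {"friends": 5, "brands": 20, "content_mills": 40}

def run_feed(cycles=6, feed_size=8):
    """Simulate a like-everything user against a volume-weighted feed."""
    weights = {source: 1.0 for source in VOLUME}  # learned like weights
    shown = Counter()
    for _ in range(cycles):
        for _ in range(feed_size):
            # Rank sources by like weight times posting volume;
            # the highest-scoring source fills the slot.
            s = max(VOLUME, key=lambda x: weights[x] * VOLUME[x])
            shown[s] += 1       # the post appears in the feed...
            weights[s] += 1.0   # ...and the user likes it, boosting the source
    return shown

print(run_feed())  # → Counter({'content_mills': 48})
```

Under these assumptions the highest-volume source wins the very first slot, each like raises its weight further, and within a single cycle the feed contains nothing else, which is roughly the dynamic described above: friends, posting a handful of updates, never get a foothold against outlets that churn out dozens.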

It also meant that, by liking everything, I turned Facebook into a place where there was nothing I liked.

This article was originally published by WIRED UK