Monday, September 1, 2014

The Hidden Price of Personalization

The scandal over Facebook’s “Emotional Contagion” experiment, in which Facebook suppressed posts with positive or negative sentiment in users’ feeds to gauge the effect on their moods, shined a spotlight on Facebook’s algorithm, and that’s a good thing. The more we realize that algorithms built by corporations control the information we see, the more likely we are to remain vigilant consumers of that information. This is no easy task.

When it comes to using technology to maintain an objective view of the world, we can be our own worst enemies. Even though technology gives us access to an incredible array of opinions and news sources, we tend to engage with worldviews similar to our own. Rather than broadening our perspective, our ability to personalize our newsfeeds can limit our exposure to different politics and cultures. It’s easy to construct “echo chambers” for ourselves that reinforce our existing biases.

So when it comes to using technology to stay informed, we don’t exactly have a good track record. That said, our awareness of the “echo chamber” phenomenon is high enough that we have a term for it. Some people even make a concerted effort to undo the effect by following folks with opposing views, as excruciating as that can be. It’s safe to say we are aware of our own biases when we gather news.

But we haven’t been as vigilant about understanding how the tools we use to gather that information, namely personalization algorithms, carry their own bias. This bias is not of our own making; it reflects the agenda of the corporation that creates the algorithm.

This is the revelation of the Emotional Contagion experiment. Simply put, Facebook was willing to manipulate the moods of nearly 700,000 people in order to understand its product better. This is bias in the sense that Facebook will filter the information a user receives in order to advance its own agenda.
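To see how little machinery such an intervention requires, consider a minimal sketch in Python. This is a hypothetical reconstruction, not Facebook’s actual code; the `Post` structure, the sentiment scores, and the omission rate are all invented for illustration.

```python
import random
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    sentiment: float  # -1.0 (very negative) to 1.0 (very positive), from some classifier

def build_feed(posts, suppress="positive", omit_rate=0.3, seed=None):
    """Return a feed with a fraction of the targeted posts silently removed."""
    rng = random.Random(seed)
    feed = []
    for post in posts:
        targeted = post.sentiment > 0 if suppress == "positive" else post.sentiment < 0
        if targeted and rng.random() < omit_rate:
            continue  # dropped without a trace; the user never knows
        feed.append(post)
    return feed
```

The user sees a normal-looking feed. The omission itself leaves no visible evidence, which is exactly what makes this kind of manipulation so hard to detect from the outside.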

I’m conflicted about this experiment. One part of me hopes that Facebook, under Zuckerberg, is genuinely interested in the social good it can create. Even though the experiment failed to properly obtain consent from its participants (a clear ethical breach), if the end goal is to send ripples of happiness across Facebook’s network of a billion people, that sounds like a worthwhile sociological pursuit. On the other hand, Facebook is a business operating in a capitalist society, and it’s naïve to think it is motivated by anything other than profit. Would Facebook experiment with users’ emotions in order to learn which moods make people more receptive to ads? I’m not sure we know the answer to that question. Either way, the fact that Facebook is even capable of manipulating your mood without your knowledge is a massive wake-up call. To state it clearly: we do not know how personalization algorithms filter information for us, and therefore those algorithms can carry an agenda we’re unaware of.

Algorithmic bias is hard to detect for a number of reasons, but mostly because we tend to think of our newsfeeds as our own. It’s almost impossible not to. Our feeds are overflowing with content from friends and family, and we’ve hand-picked the publishers, celebrities, and brands we want to follow. But this is only true on a superficial level. Beneath the surface, the engineers of personalization algorithms like the newsfeed manage a tricky paradox: the algorithm must feel as if it is in the service of its end users, while also being in the service of the corporation that created it.
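A toy ranking function makes the paradox concrete. Everything here is invented for illustration — the signals, the weight, the `Post` fields; no platform’s actual ranking code is public.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    predicted_engagement: float  # how likely this user is to click or comment
    business_value: float        # ad revenue, upsell potential, experiment priority

def rank_feed(posts, corporate_weight=0.25):
    """Order posts by a blend of user relevance and platform interest.

    Relevance dominates, so the feed feels personal; the second term
    quietly advances the platform's goals, and the weighting is
    invisible to the user.
    """
    def score(p):
        return ((1 - corporate_weight) * p.predicted_engagement
                + corporate_weight * p.business_value)
    return sorted(posts, key=score, reverse=True)

# Nudging corporate_weight upward reorders the feed without any visible change.
feed = rank_feed([
    Post("close_friend", predicted_engagement=0.9, business_value=0.0),
    Post("sponsored_brand", predicted_engagement=0.6, business_value=0.9),
])
```

The point of the sketch is the single tunable weight: the feed can be shifted toward the corporation’s interests by turning one dial, and from the user’s side the result still looks like “my feed.”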

This issue is broader than Facebook. Algorithms control the stories you see on Yahoo’s homepage feed, the results of a Google search, and the offers you see on online travel sites like Orbitz. Even Twitter, the bastion of the algorithm-free newsfeed, is reportedly changing course. Countless other web services use algorithms to personalize the user experience, and all of them are pre-baked with the agendas of their creators. Mostly, those creators tinker with the algorithms to maximize the volume and effectiveness of advertising, to upsell, or to improve their products, but that does not mean the algorithms can’t be used to advance social or political agendas.

For example, despite being widely viewed as a neutral platform, Facebook has a history of introducing its own agenda into the newsfeed. In 2010, it placed a message at the top of the newsfeed encouraging users to vote in the congressional midterm elections. In 2012, Facebook urged users to register as organ donors. In both instances, Facebook’s tactics were highly visible and advanced two widely accepted social goods.

But imagine a situation where the “get out the vote” message appeared only to users Facebook had identified as Democrats, or where negative news stories about Republicans were displayed in more newsfeeds than those about Democrats. With well over 100 million Facebook users in the US, such tinkering could sway an election. I’m not suggesting Facebook did that, or that it would, only that it can. Doing so would be neither illegal nor a violation of the user agreement. The main deterrent is the risk of detection, which would discredit the platform’s perceived neutrality and presumably cause a massive drop in users. Fox News delivers its bias with in-your-face transparency; MSNBC does the same. Algorithmic bias is far more subtle.
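It’s worth spelling out how small the engineering distance is between a neutral reminder and partisan targeting. In a hypothetical sketch (the `inferred_party` label is invented; no platform is known to do this), the difference is a single conditional:

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    inferred_party: str  # hypothetical output of a political-leaning classifier

def should_show_vote_banner(user, targeted_party=None):
    # targeted_party=None is the neutral version: everyone sees the banner.
    # Passing "Democrat" or "Republican" turns it partisan with one argument.
    return targeted_party is None or user.inferred_party == targeted_party
```

That is the whole change. No user-facing pixel moves; only the audience shifts.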

It’s early innings in the world of personalization, but the stakes are about to get higher. Soon the world will be blinking with an interconnected network of “smart” screens that know, with a high degree of certainty, who you are. This ubiquity of Internet-connected devices (the so-called Internet of Things) means we will soon interact with more and more “personalized” services. What if your TV, your work computer, your phone, your home computer, your tablet, your car’s dashboard, the screen at the shelf in Walgreens, the one in the back of the taxi, and the one in your workplace elevator all acted in concert?

At first, all this personalization will benefit the end user. Imagine if the screen in the back of your taxi could pull your home address from your phone, if Starbucks started whipping up your favorite drink as soon as you walked in the store, or if Walgreens knew that your “smart” fridge was out of milk. Early on, these services will genuinely serve their users, because attracting loyal users is essential for the businesses behind them to survive. Yet as these services become indispensable and the novelty hardens into expectation, we must remember that personalization always comes with a price.
