Algorithms: Based on your preferences, you may also enjoy this column

June 8, 2015
A 3D-printed Facebook logo in front of the logos of news publishers Facebook has partnered with to launch ‘Instant Articles,’ in this picture illustration. REUTERS/Dado Ruvic

One key buzzword these days is “algorithm,” which technically means any computational formula but which has come to mean a formula that predicts our behavior. Amazon and Netflix have algorithms that predict what books a user is likely to want to read or what movies and TV shows he or she is likely to want to watch.

Facebook has an algorithm that predicts the news a user is likely to want. Dating sites like Match.com and OkCupid use algorithms to predict with whom we would fall in love. Google, with the most famous algorithm of all, predicts what we want when we type a search term. As one scientist put it, “Algorithms rule the world.”

But there is a problem with that rule. Because algorithms are based on the past, they only satisfy preconditioned responses. They cannot give us what is new, surprising, challenging or different. Difference is what they are designed to dismiss. In effect, they hollow out life.

A neon Google logo at the company’s new office in Toronto, November 13, 2012. REUTERS/Mark Blinch

As far as businesses are concerned, this is not only perfectly fine, it is perfect. Algorithms are the holy grail of marketing. The whole idea behind them is to give each of us exactly what we already know that we want — to get rid of the guesswork for companies and for us.

When Netflix was deciding on House of Cards, for example, it devised an algorithm that showed, among other things, users liked director David Fincher, liked actor Kevin Spacey and liked political intrigue. Voila! Just mix these ingredients and you’ve got a successful program — tailored to the audience’s taste.
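The ingredient-mixing logic described here can be sketched as a toy scoring rule in Python: rank candidate shows by how many of a viewer’s already-known tastes they match. The tags and titles below are invented for illustration; this is not Netflix’s actual model.

```python
# Toy preference-matching: score each candidate by how many of the
# viewer's known tastes it satisfies. Purely illustrative.

def score(candidate_tags, liked_tags):
    """Count how many already-observed preferences a candidate matches."""
    return len(candidate_tags & liked_tags)

liked = {"David Fincher", "Kevin Spacey", "political intrigue"}

catalog = {
    "House of Cards": {"David Fincher", "Kevin Spacey", "political intrigue"},
    "Mad Men":        {"Matthew Weiner", "Jon Hamm", "advertising"},
}

# Rank the catalog by overlap with past preferences.
ranked = sorted(catalog, key=lambda title: score(catalog[title], liked),
                reverse=True)
print(ranked[0])  # House of Cards: it matches all three known tastes
```

Note what the rule cannot do: a show sharing nothing with the viewer’s history scores zero, however good it might be.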

One mathematician devised an algorithm that identified the various components of hit songs. Though no one has yet written songs to that prescription, the formula has been able to predict which songs would become hits.

Yet another algorithm, the one computed by Facebook, sends you stories via News Feed that are designed to match your predetermined interests. Another, from IBM, called CRUSH, enables police departments to predict where the next crimes are likely to be committed, echoing Steven Spielberg’s futuristic sci-fi film Minority Report, in which Tom Cruise’s character anticipated crimes before they happened.

So what’s wrong with taking guesswork out of life? Doesn’t it make it easier for us not to have to wade through TV shows, albums and news stories we probably wouldn’t enjoy, or for police to focus their attention on the places where the next crime is going to occur?

An employee demonstrates a ‘Police Pad’ at the Algorithm factory in Tbilisi, January 11, 2012. REUTERS/David Mdzinarishvili

What is wrong is just this: They provide us with a closed loop that keeps feeding us what we have already experienced. It puts each of us, and the larger culture, in the position of a boat that runs in circles.
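The closed loop can be made concrete with a few lines of Python: if the feed only ever draws from what the user has already consumed, the pool of experiences never grows. The genres and numbers are invented for illustration.

```python
# Toy feedback loop: recommendations are drawn only from the user's
# own history, so nothing outside it can ever be surfaced. Illustrative.
import random

random.seed(0)
history = ["drama"]  # everything the user has watched so far

for _ in range(20):
    # each "recommendation" is sampled from past behavior alone...
    pick = random.choice(history)
    # ...and consuming it feeds the same preference back in
    history.append(pick)

print(set(history))  # {'drama'} -- twenty rounds later, nothing new
```

However long the loop runs, the set of genres it can recommend is exactly the set it started with: the boat keeps circling.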

Take Mad Men. We couldn’t possibly have known whether we would like the work of the show’s creator and frequent writer, Matthew Weiner, or the actor Jon Hamm, or a series about advertising, because we had never seen a Weiner series, or much of Hamm, or any show about advertising. Algorithmically, at least, past cannot be prologue to something unprecedented, as Mad Men was.

Similarly, if you opt to listen only to the music or the musical configurations you already like, you deprive yourself of departures from your routine. In the 1950s, that would have meant cutting yourself off from rock ‘n’ roll; in the 1960s, from the Beatles; in the 1990s, from hip hop.

The predictive police work of IBM’s CRUSH could even build prejudice into the system. “If you have a group that is disproportionately stopped by the police,” Ian Brown, associate director of Oxford University’s cybersecurity center, told the Guardian, “such tactics could just magnify the perception that they have been targeted.” We are already seeing the results of that kind of thinking.

Amit Singhal, senior vice president of search at Google, explains that the company has overhauled its search algorithm to address longer, more complex queries, at the garage where the company was founded, in Menlo Park, California, September 26, 2013. REUTERS/Stephen Lam

A recent Pew Research survey on millennials and the news found that 61 percent of millennials on the Internet use Facebook as their primary news source — a staggering figure. But here’s the rub. Because News Feed is algorithmically generated, users are only getting the news that meets their own predispositions.

Though Facebook has been at pains to deny it, even commissioning a study of the effect of its News Feed, scholars say that Facebook users are not only unlikely to be exposed to any opinion outside their safety zone, but also that individuals who receive their news this way are likely to be more polarized from one another. One of the scholars, Christian Sandvig, told the Washington Post, “Selectivity and polarization are happening on Facebook, and the News Feed curation algorithm acts to moderately accelerate” those things. Algorithms don’t create community that breaks through preconceptions. They create isolation that reinforces them.
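The filtering the scholars describe can be sketched as a simple predicate in Python: only stories matching a user’s recorded leanings get through. The headlines, topics and function names below are invented for illustration; this is not Facebook’s code.

```python
# Toy news-feed filter: a story reaches the user only if its topic
# matches an interest already on record. Purely illustrative.

def feed(stories, interests):
    """Return only the stories whose topic matches a known interest."""
    return [s for s in stories if s["topic"] in interests]

stories = [
    {"headline": "Tax plan advances",  "topic": "politics-left"},
    {"headline": "Markets rally",      "topic": "business"},
    {"headline": "Border bill stalls", "topic": "politics-right"},
]

shown = feed(stories, interests={"politics-left"})
print([s["headline"] for s in shown])  # ['Tax plan advances']
```

Two users with opposite recorded leanings would each see a feed the other never encounters, which is the polarization mechanism the scholars point to.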

As algorithms continue to expand exponentially, this self-satisfied isolation is no small concern. Algorithms may take the guesswork out of marketing, crime prevention and even romance. But they also take the guesswork out of life itself. Life, at least as most of us think of that adventure, is dependent on guesswork, on uncertainty, on the new and the unknown. Winnowing is much of what life is about. Without all those contingencies, both wonderful and awful, we are trapped in a cultural Groundhog Day — sentenced to some variation of the same thing again and again and again.

There is an adjective for a world that is tailored to our specifications without anything that need disturb us. The word is “narcissistic.” Algorithms may be a way to give us what we want, and nothing else. But they are also a way for us to bend the world to our own image. That makes for predictability and for the predictably dull.

Our boat keeps circling, letting us see the scenery we have already seen and nothing else. So while algorithms may ultimately rule the world, the world they rule is shrunken.

2 comments


I am not sure what statistical survey they used to arrive at “61% of Millennials use FACEBOOK for news,” but that’s completely inaccurate.

Nobody I know uses FACEBOOK for news. It’s too easy to just get text updates from news services such as this site.

Posted by ReutersNews2 | Report as abusive

http://www.pewresearch.org/fact-tank/2015/06/01/political-news-habits-by-generation/ They used this ^ survey.

Posted by A.J.113 | Report as abusive