Algorithms: Based on your preferences, you may also enjoy this column
One key buzzword these days is “algorithm,” which technically means any step-by-step computational procedure but which has come to mean a formula that predicts our behavior. Amazon and Netflix have algorithms that predict what books a user is likely to want to read or what movies and TV shows he or she is likely to want to watch.
Facebook has an algorithm that predicts the news a user is likely to want. Dating sites like Match.com and OkCupid use algorithms to predict with whom we would fall in love. Google, with the most famous algorithm of all, predicts what we want when we type a search term. As one scientist put it, “Algorithms rule the world.”
But there is a problem with that rule. Because algorithms are built on the past, they can only feed back responses we have already given. They cannot give us what is new, surprising, challenging or different. Difference is what they are designed to dismiss. In effect, they hollow out life.
As far as businesses are concerned, this is not only perfectly fine, it is perfect. Algorithms are the holy grail of marketing. The whole idea behind them is to give each of us exactly what we already know that we want — to get rid of the guesswork for companies and for us.
When Netflix was deciding on House of Cards, for example, it devised an algorithm that showed, among other things, that users liked the director David Fincher, liked the actor Kevin Spacey and liked political intrigue. Voilà! Just mix those ingredients and you’ve got a successful program, tailored to the audience’s taste.
A mathematician has devised an algorithm that identifies the components of hit songs. Though no one has yet written songs to this prescription, the formula has been able to predict which songs would become hits.
Yet another algorithm, the one behind Facebook’s News Feed, sends you stories designed to meet your predetermined interests. Another, from IBM, called CRUSH, enables police departments to predict where crimes are likely to be committed, recalling Steven Spielberg’s futuristic sci-fi film Minority Report, in which Tom Cruise’s character anticipated crimes before they happened.
So what’s wrong with taking the guesswork out of life? Doesn’t it spare us from wading through TV shows, albums and news stories we probably wouldn’t enjoy, and let police focus their attention on the places where the next crime is likely to occur?
What is wrong is just this: Algorithms provide us with a closed loop that keeps feeding us what we have already experienced. They put each of us, and the larger culture, in the position of a boat that runs in circles.
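The closed loop is easy to see in miniature. Here is a deliberately simple sketch, with an invented catalog and a toy scoring rule rather than any real service’s system: a recommender that ranks unwatched titles purely by how often their genre appears in a user’s history. Once the history leans one way, the loop can only lean further.

```python
# Hypothetical toy example of a recommendation feedback loop.
# The catalog, titles, and scoring rule are invented for illustration;
# no real recommender works exactly this way.
from collections import Counter

catalog = {
    "house-of-thrones": "political",
    "capitol-games": "political",
    "state-secrets": "political",
    "nature-doc": "documentary",
    "standup-special": "comedy",
}

def recommend(history):
    """Pick the unwatched title whose genre appears most often in history."""
    genre_counts = Counter(catalog[title] for title in history)
    unwatched = [t for t in catalog if t not in history]
    # Genres the user has never sampled score zero, so they are never chosen.
    return max(unwatched, key=lambda t: genre_counts[catalog[t]])

history = ["house-of-thrones"]  # one political thriller watched
for _ in range(2):
    history.append(recommend(history))

print(history)
# → ['house-of-thrones', 'capitol-games', 'state-secrets']
```

After three rounds, the user has seen nothing but political thrillers: the documentary and the comedy never score above zero, because the only signal the system consults is the past. That is the boat running in circles.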
Take Mad Men. We couldn’t possibly have known whether we would like the work of the show’s creator and frequent writer, Matthew Weiner, or the actor Jon Hamm, or a series about advertising, because we had never seen a Weiner series, or much of Hamm, or any show about advertising. Algorithmically, at least, past cannot be prologue to something unprecedented, as Mad Men was.
Similarly, if you opted to listen only to the music and musical styles you already liked, you would deprive yourself of any departure from your routine. In the 1950s, that would have meant cutting yourself off from rock ’n’ roll; in the 1960s, the Beatles; in the 1990s, hip-hop.
The predictive police work of IBM’s CRUSH could even build prejudice into the system. “If you have a group that is disproportionately stopped by the police,” Ian Brown, associate director of Oxford University’s cybersecurity center, told the Guardian, “such tactics could just magnify the perception that they have been targeted.” We are already reaping the results of that kind of thinking.
A recent Pew Research survey on millennials and the news found that 61 percent of millennials on the Internet use Facebook as their primary news source — a staggering figure. But here’s the rub: because News Feed is algorithmically generated, users get only the news that meets their own predispositions.
Though Facebook has been at pains to deny it, even commissioning a study of the effect of its News Feed, scholars say that Facebook users are not only unlikely to be exposed to opinions outside their comfort zone, but that individuals who get their news this way are likely to grow more polarized from one another. One of those scholars, Christian Sandvig, told the Washington Post, “Selectivity and polarization are happening on Facebook, and the News Feed curation algorithm acts to moderately accelerate” those things. Algorithms don’t create community that breaks through preconceptions. They create isolation that reinforces them.
As algorithms continue to spread into ever more corners of life, this self-satisfied isolation is no small concern. Algorithms may take the guesswork out of marketing, crime prevention and even romance. But they also take the guesswork out of life itself. Life, at least as most of us think of that adventure, is dependent on guesswork, on uncertainty, on the new and the unknown. Stumbling into the unexpected is much of what life is about. Without all those contingencies, both wonderful and awful, we are trapped in a cultural Groundhog Day — sentenced to some variation of the same thing again and again and again.
There is an adjective for a world tailored to our specifications, with nothing in it that need disturb us. The word is “narcissistic.” Algorithms may be a way to give us what we want and nothing else, but they are also a way for us to remake the world in our own image. That makes for predictability, and for the predictably dull.
Our boat keeps circling, letting us see the scenery we have already seen and nothing else. So while algorithms may ultimately rule the world, the world they rule is shrunken.