
Did You Know Algorithms Are Automating Our Thinking?

Photo by Ryoji Iwata on Unsplash

By Franklin Foer

Aug. 21, 2018, Published 3:18 a.m. ET


Mark Zuckerberg disingenuously poses as a friendly critic of algorithms. That’s how he implicitly contrasts Facebook with his rivals across the way at Google. Over in Larry Page’s shop, the algorithm is king, a cold, pulseless ruler. There’s not a trace of life force in its recommendations and very little apparent understanding of the person keying a query into its engine. Facebook, in his flattering self-portrait, is a respite from this increasingly automated, atomistic world. “Every product you use is better off with your friends,” he says.

What he is referring to is Facebook’s News Feed. Here’s a brief explanation for the sliver of humanity who have apparently resisted Facebook: The News Feed provides a reverse chronological index of all the status updates, articles, and photos that your friends have posted to Facebook. The News Feed is meant to be fun, but also geared to solve one of the essential problems of modernity—our inability to sift through the ever-growing, always-looming mounds of information. Who better, the theory goes, to recommend what we should read and watch than our friends?

Zuckerberg has boasted that the News Feed turned Facebook into a “personalized newspaper.” Unfortunately, our friends can do only so much to winnow things for us. Turns out, they like to share a lot. If we just read their musings and followed links to articles, we might be only a little less overwhelmed than before, or perhaps even deeper underwater. So Facebook makes its own choices about what should be read. The company’s algorithms sort the thousands of things a Facebook user could possibly see down to a smaller batch of choice items. And then within those few dozen items, it decides what we might like to read first.
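To make that mechanics concrete, here is a deliberately toy sketch of signal-weighted feed ranking. Everything in it is an assumption for illustration: the signal names, the hand-picked weights, and the time-decay factor are invented, and the real system combines vastly more signals with machine-learned weights.

```python
# Hypothetical feed-ranking sketch. Signal names, weights, and decay
# are invented for illustration; they are not Facebook's actual model.
from dataclasses import dataclass

@dataclass
class Post:
    author_affinity: float       # how often the viewer interacts with this friend, 0-1
    predicted_engagement: float  # estimated chance the viewer clicks or likes, 0-1
    age_hours: float             # time since the post was created

WEIGHTS = {"affinity": 0.6, "engagement": 0.4}
DECAY = 0.05  # older posts score progressively lower

def score(post: Post) -> float:
    """Fold a few signals into one ranking score."""
    base = (WEIGHTS["affinity"] * post.author_affinity
            + WEIGHTS["engagement"] * post.predicted_engagement)
    return base / (1.0 + DECAY * post.age_hours)

def rank_feed(candidates: list[Post], limit: int = 25) -> list[Post]:
    """Sort the thousands of candidate posts and keep only a choice batch."""
    return sorted(candidates, key=score, reverse=True)[:limit]
```

The point of the sketch is the structure, not the numbers: whoever sets those weights decides what a billion feeds look like.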


Algorithms are, by definition, invisibilia. But we can usually sense their presence—that somewhere in the distance, we’re interacting with a machine. That’s what makes Facebook’s algorithm so powerful. Many users—60 percent, according to the best research—are completely unaware of its existence. But even if they knew of its influence, it wouldn’t really matter. Facebook’s algorithm couldn’t be more opaque. When the company concedes its existence to reporters, it manages to further cloud the algorithm in impenetrable descriptions.

We know, for instance, that its algorithm was once called EdgeRank. But Facebook no longer uses that term. It’s appropriate that the algorithm doesn’t have a name. It has grown into an almost unknowable tangle of sprawl. The algorithm interprets more than one hundred thousand “signals” to make its decisions about what users see. Some of these signals apply to all Facebook users; some reflect users’ particular habits and the habits of their friends. Perhaps Facebook no longer fully understands its own tangle of algorithms—the code, all sixty million lines of it, is a palimpsest, where engineers add layer upon layer of new commands. (This is hardly a condition unique to Facebook. The Cornell University computer scientist Jon Kleinberg cowrote an essay that argued, “We have, perhaps for the first time ever, built machines we do not understand. . . . At some deep level, we don’t even really understand how they’re producing the behavior we observe. This is the essence of their incomprehensibility.” What’s striking is that the “we” in that sentence refers to the creators of code.)


Pondering the abstraction of this algorithm, imagine one of those earliest computers with its nervously blinking lights and long rows of dials. To tweak the algorithm, the engineers turn the knob a click or two. The engineers are constantly making small adjustments, here and there, so that the machine performs to their satisfaction.

With even the gentlest caress of the metaphorical dial, Facebook changes what its users see and read. It can make our friends’ photos more or less ubiquitous; it can punish posts filled with self-congratulatory musings and banish what it deems to be hoaxes; it can promote video rather than text; it can favor articles from the likes of the New York Times or BuzzFeed, if it so desires. Or if we want to be melodramatic about it, we could say Facebook is constantly tinkering with how its users view the world—always tinkering with the quality of news and opinion that it allows to break through the din, adjusting the quality of political and cultural discourse in order to hold the attention of users for a few more beats.


But how do the engineers know which dial to twist and how hard? There’s a whole discipline, data science, to guide the writing and revision of algorithms. Facebook has a team, poached from academia, to conduct experiments on users. It’s a statistician’s sexiest dream—some of the largest datasets in human history, the ability to run trials on mathematically meaningful cohorts. When Cameron Marlow, the former head of Facebook’s data science team, described the opportunity, he began twitching with ecstatic joy. “For the first time,” Marlow said, “we have a microscope that not only lets us examine social behavior at a very fine level that we’ve never been able to see before but allows us to run experiments that millions of users are exposed to.”

Facebook likes to boast about the fact of its experimentation more than the details of the experiments themselves. But there are examples that have escaped the confines of its laboratories. We know, for example, that Facebook sought to discover whether emotions are contagious. To conduct this trial, Facebook attempted to manipulate the mental state of its users. For one group, Facebook excised the positive words from the posts in the News Feed; for another group, it removed the negative words. Each group, it concluded, wrote posts that echoed the mood of the posts it had reworded. This study was roundly condemned as invasive, but it is not so unusual. As one member of Facebook’s data science team confessed: “Anyone on that team could run a test. They’re always trying to alter people’s behavior.”
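The shape of that experiment is easy to reconstruct in miniature. In the hypothetical sketch below, two cohorts see feeds with some positive or some negative posts suppressed, and each user’s later writing is scored for mood; the tiny word lists and the suppression rate are stand-ins (the published study relied on the LIWC lexicon and hundreds of thousands of users).

```python
# Toy reconstruction of the emotional-contagion experiment's shape.
# Word lists and suppression rate are stand-ins, not the study's actual setup.
import random

POSITIVE = {"happy", "great", "love", "wonderful"}
NEGATIVE = {"sad", "awful", "hate", "terrible"}

def mood(text: str) -> int:
    """Crude sentiment: count of positive words minus negative words."""
    words = set(text.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

def filtered_feed(posts: list[str], suppress: set[str], rate: float = 0.5) -> list[str]:
    """Randomly drop a share of posts containing the suppressed emotion words."""
    return [p for p in posts
            if not (set(p.lower().split()) & suppress and random.random() < rate)]

feed = ["I love this wonderful day", "what an awful terrible meeting", "lunch was fine"]
cohort_a = filtered_feed(feed, suppress=POSITIVE)  # sees fewer positive posts
cohort_b = filtered_feed(feed, suppress=NEGATIVE)  # sees fewer negative posts
# The measured outcome: does the mood() of what each cohort writes afterward
# drift toward the emotional tilt of the feed it was shown?
```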


There’s no doubting the emotional and psychological power possessed by Facebook—at least Facebook doesn’t doubt it. It has bragged about how it increased voter turnout (and organ donation) by subtly amping up the social pressures that compel virtuous behavior. Facebook has even touted the results from these experiments in peer-reviewed journals: “It is possible that more of the 0.60% growth in turnout between 2006 and 2010 might have been caused by a single message on Facebook.” No other company has boasted so precisely about its ability to shape democracy—and for good reason. It’s too much power to entrust to a corporation.

The many Facebook experiments add up. The company believes that it has unlocked social psychology and acquired a deeper understanding of its users than they possess of themselves. Facebook can predict users’ race, sexual orientation, relationship status, and drug use on the basis of their “likes” alone. It’s Zuckerberg’s fantasy that this data might be analyzed to uncover the mother of all revelations, “a fundamental mathematical law underlying human social relationships that governs the balance of who and what we all care about.” That is, of course, a goal in the distance. In the meantime, Facebook will probe—constantly testing to see what we crave and what we ignore, a never-ending campaign to improve Facebook’s capacity to give us the things that we want and things that we don’t even know we want. Whether the information is true or concocted, authoritative reporting or conspiratorial opinion doesn’t really seem to matter much to Facebook. The crowd gets what it wants and deserves.
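The recipe behind those “likes” predictions is, at least in outline, standard machine learning: represent each user as a binary vector over the pages they have liked, then fit an off-the-shelf classifier. The sketch below uses random stand-in data purely to show the shape of the approach; the real research behind such predictions drew on millions of like records, typically with dimensionality reduction before the regression step.

```python
# Minimal likes-to-traits sketch with random stand-in data.
# Real studies used millions of like records, not noise like this.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_users, n_pages = 1_000, 200
likes = rng.integers(0, 2, size=(n_users, n_pages))  # 1 = user liked that page
trait = rng.integers(0, 2, size=n_users)             # a binary attribute to predict

model = LogisticRegression(max_iter=1000).fit(likes, trait)
trait_probability = model.predict_proba(likes)[:, 1]  # per-user predicted likelihood
```

On real data the same few lines recover race, orientation, and drug use with unsettling accuracy, which is the whole point: the method is ordinary; the dataset is not.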


The Automation of Thinking

We’re in the earliest days of this revolution, of course. But we can see where it’s heading. Algorithms have retired many of the bureaucratic, clerical duties once performed by humans—and they will soon begin to replace more creative tasks. At Netflix, algorithms suggest the genres of movies to commission. Some news wires use algorithms to write stories about crime, baseball games, and earthquakes, the most rote journalistic tasks. Algorithms have produced fine art and composed symphonic music, or at least approximations of them.

It’s a terrifying trajectory, especially for those of us in these lines of work. If algorithms can replicate the process of creativity, then there’s little reason to nurture human creativity. Why bother with the tortuous, inefficient process of writing or painting if a computer can produce something seemingly as good and in a painless flash? Why nurture the overinflated market for high culture, when it could be so abundant and cheap? No human endeavor has resisted automation, so why should creative endeavors be any different?


The engineering mindset has little patience for the fetishization of words and images, for the mystique of art, for moral complexity and emotional expression. It views humans as data, components of systems, abstractions. That’s why Facebook has so few qualms about performing rampant experiments on its users. The whole effort is to make human beings predictable—to anticipate their behavior, which makes them easier to manipulate. With this sort of cold-blooded thinking, so divorced from the contingency and mystery of human life, it’s easy to see how long-standing values begin to seem like an annoyance—why a concept like privacy would carry so little weight in the engineer’s calculus, why the inefficiencies of publishing and journalism seem so eminently disruptable.

Facebook would never put it this way, but algorithms are meant to erode free will, to relieve humans of the burden of choosing, to nudge them in the right direction. Algorithms fuel a sense of omnipotence, the condescending belief that our behavior can be altered, without our even being aware of the hand guiding us, in a superior direction. That’s always been a danger of the engineering mindset, as it moves beyond its roots in building inanimate stuff and begins to design a more perfect social world. We are the screws and rivets in the grand design.

From WORLD WITHOUT MIND: The Existential Threat of Big Tech by Franklin Foer. Reprinted by arrangement of Penguin Press, part of the Penguin Random House company. Copyright (c) 2017 by Franklin Foer.


