Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy

Author: Cathy O'Neil


The money saved, naturally, comes straight from employees’ pockets. Under the inefficient status quo, workers had not only predictable hours but also a certain amount of downtime. You could argue that they benefited from inefficiency: some were able to read on the job, even study. Now, with software choreographing the work, every minute should be busy. And these minutes will come whenever the program demands it, even if it means a “clopening” from Friday to Saturday: closing the store late at night and returning just a few hours later to open it again.

In 2014, the New York Times ran a story about a harried single mother named Jannette Navarro, who was trying to work her way through college as a barista at Starbucks while caring for her four-year-old. The ever-changing schedule, including the occasional clopening, made her life almost impossible and put regular day care beyond reach. She had to put school on hold. The only thing she could schedule was work. And her story was typical. According to US government data, two-thirds of food service workers and more than half of retail workers find out about scheduling changes with notice of a week or less—often just a day or two, which can leave them scrambling to arrange transportation or child care.

Within weeks of the article’s publication, the major corporations it mentioned announced that they would adjust their scheduling practices. Embarrassed by the story, the employers promised to add a single constraint to their model. They would eliminate clopenings and learn to live with slightly less robust optimization. Starbucks, whose brand hinges more than most on fair treatment of workers, went further, saying that the company would adjust the software to reduce the scheduling nightmares for its 130,000 baristas. All work hours would be posted at least one week in advance.

A year later, however, Starbucks was failing to meet these targets, or even to eliminate the clopenings, according to a follow-up report in the Times. The trouble was that minimal staffing was baked into the culture. In many companies, managers’ pay is contingent upon the efficiency of their staff as measured by revenue per employee hour. Scheduling software helps them boost these numbers and their own compensation. Even when executives tell managers to loosen up, they often resist. It goes against everything they’ve been taught. What’s more, at Starbucks, if a manager exceeds his or her “labor budget,” a district manager is alerted, said one employee. And that could lead to a write-up. It’s usually easier just to change someone’s schedule, even if it means violating the corporate pledge to provide one week’s notice.

In the end, the business models of publicly traded companies like Starbucks are built to feed the bottom line. That’s reflected in their corporate cultures and their incentives, and, increasingly, in their operational software. (And if that software allows for tweaks, as Starbucks does, the ones that are made are likely to be ones that boost profits.)

Much of the scheduling technology has its roots in a powerful discipline of applied mathematics called “operations research,” or OR. For centuries, mathematicians used the rudiments of OR to help farmers plan crop plantings and help civil engineers map highways to move people and goods efficiently. But the discipline didn’t really take off until World War II, when the US and British military enlisted teams of mathematicians to optimize their use of resources. The Allies kept track of various forms of an “exchange ratio,” which compared Allied resources spent versus enemy resources destroyed. During Operation Starvation, which took place between March and August 1945, the Twenty-first Bomber Command was tasked with destroying Japanese merchant ships in order to prevent food and other goods from arriving safely on Japanese shores. OR teams worked to minimize the number of mine-laying aircraft lost for each Japanese merchant ship that was sunk. They managed an “exchange ratio” of over 40 to 1—only 15 aircraft were lost in sinking 606 Japanese ships. This was considered highly efficient, and was due, in part, to the work of the OR team.
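For readers who want the arithmetic spelled out: assuming the ratio here simply means enemy ships sunk per Allied aircraft lost, the figures above give

\[ \text{exchange ratio} = \frac{\text{ships sunk}}{\text{aircraft lost}} = \frac{606}{15} \approx 40.4, \]

which is where the “over 40 to 1” comes from.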

Following World War II, major companies (as well as the Pentagon) poured enormous resources into OR. The science of logistics radically transformed the way we produce goods and bring them to market.

In the 1960s, Japanese auto companies made another major leap, devising a manufacturing system called Just in Time. The idea was that instead of storing mountains of steering wheels or transmission blocks and retrieving them from vast warehouses, the assembly plant would order parts as they were needed rather than paying for them to sit idle. Toyota and Honda established complex chains of suppliers, each of them constantly bringing in parts on call. It was as if the industry were a single organism, with its own homeostatic control systems.

Just in Time was highly efficient, and it quickly spread across the globe. Companies in many geographies can establish just-in-time supply chains in a snap. These models likewise constitute the mathematical underpinnings of companies like Amazon, Federal Express, and UPS.
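Toyota’s and Honda’s actual systems are far richer than anything that fits here, but the pull-based logic described above can be sketched in a few lines of code. Everything below (the class, the numbers, the reorder rule) is a hypothetical illustration, not a description of any real supplier network:

# A toy pull-based replenishment loop: the line consumes parts each shift,
# and a resupply order goes out only when the bin falls to its reorder point,
# sized so the parts arrive just as they are needed.
from dataclasses import dataclass, field

@dataclass
class PartBin:
    on_hand: int           # parts sitting at the assembly line right now
    lead_time: int         # shifts between placing an order and its arrival
    usage_per_shift: int   # parts the line consumes each shift
    pipeline: list = field(default_factory=list)  # (arrival_shift, quantity)

    def reorder_point(self) -> int:
        # Order just early enough that new parts land as the bin empties.
        return self.lead_time * self.usage_per_shift

def simulate(bin_: PartBin, shifts: int) -> None:
    for t in range(shifts):
        # Receive anything due this shift, then consume.
        bin_.on_hand += sum(q for due, q in bin_.pipeline if due == t)
        bin_.pipeline = [(due, q) for due, q in bin_.pipeline if due != t]
        bin_.on_hand -= min(bin_.usage_per_shift, bin_.on_hand)
        # Pull signal: reorder only when stock reaches the reorder point.
        if bin_.on_hand <= bin_.reorder_point() and not bin_.pipeline:
            bin_.pipeline.append((t + bin_.lead_time, bin_.lead_time * bin_.usage_per_shift))
        print(f"shift {t}: {bin_.on_hand} on hand, {sum(q for _, q in bin_.pipeline)} on order")

simulate(PartBin(on_hand=12, lead_time=2, usage_per_shift=4), shifts=8)

The point of the toy is only that stock hovers near zero instead of piling up in a warehouse; nothing is stored beyond what the next couple of shifts will consume.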

Scheduling software can be seen as an extension of the just-in-time economy. But instead of lawn mower blades or cell phone screens showing up right on cue, it’s people, usually people who badly need money. And because they need money so desperately, the companies can bend their lives to the dictates of a mathematical model.
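The commercial scheduling systems are proprietary, so what follows is only a minimal sketch of the demand-follows-forecast idea in the same spirit. Every number, name, and threshold is invented, including the 29-hour cap, which mirrors the benefits cutoff discussed later in this chapter:

# A bare-bones version of demand-driven staffing: forecast customer traffic
# by the hour, then cover each hour with the fewest workers possible,
# handed out greedily to whoever has the fewest hours so far.
from collections import defaultdict

CUSTOMERS_PER_WORKER = 20   # assumed hourly service capacity per person
MAX_WEEKLY_HOURS = 29       # hypothetical cap, just under benefits eligibility

forecast = {                # invented predicted customers by (day, hour)
    ("Fri", 21): 35, ("Fri", 22): 18,
    ("Sat", 5): 24, ("Sat", 6): 41, ("Sat", 7): 60,
}
workers = ["Ana", "Ben", "Cara", "Dev"]
hours = defaultdict(int)
schedule = defaultdict(list)

for (day, hour), demand in sorted(forecast.items()):
    needed = -(-demand // CUSTOMERS_PER_WORKER)   # ceiling division
    eligible = sorted((w for w in workers if hours[w] < MAX_WEEKLY_HOURS),
                      key=lambda w: hours[w])
    for w in eligible[:needed]:
        schedule[w].append((day, hour))
        hours[w] += 1

for w in workers:
    print(w, schedule[w])

Notice that nothing in this objective prevents assigning the same person to the Friday close and the Saturday open; ruling that out requires adding a constraint by hand, which is exactly the tweak the chains promised after the Times story.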

I should add that companies take steps not to make people’s lives too miserable. They all know to the penny how much it costs to replace a frazzled worker who finally quits. Those numbers are in the data, too. And they have other models, as we discussed in the last chapter, to reduce churn, which drains profits and efficiency.

The trouble, from the employees’ perspective, is an oversupply of low-wage labor. People are hungry for work, which is why so many of them cling to jobs that pay barely eight dollars per hour. This oversupply, along with the scarcity of effective unions, leaves workers with practically no bargaining power. This means the big retailers and restaurants can twist the workers’ lives to ever-more-absurd schedules without suffering from excessive churn. They make more money while their workers’ lives grow hellish. And because these optimization programs are everywhere, the workers know all too well that changing jobs isn’t likely to improve their lot. Taken together, these dynamics provide corporations with something close to a captive workforce.

I’m sure it comes as no surprise that I consider scheduling software one of the more appalling WMDs. It’s massive, as we’ve discussed, and it takes advantage of people who are already struggling to make ends meet. What’s more, it is entirely opaque. Workers often don’t have a clue about when they’ll be called to work. They are summoned by an arbitrary program.

Scheduling software also creates a poisonous feedback loop. Consider Jannette Navarro. Her haphazard scheduling made it impossible for her to return to school, which dampened her employment prospects and kept her in the oversupplied pool of low-wage workers. The long and irregular hours also make it hard for workers to organize or to protest for better conditions. Instead, they face heightened anxiety and sleep deprivation, which causes dramatic mood swings and is responsible for an estimated 13 percent of highway deaths. Worse yet, since the software is designed to save companies money, it often limits workers’ hours to fewer than thirty per week, so that they are not eligible for company health insurance. And with their chaotic schedules, most find it impossible to make time for a second job. It’s almost as if the software were designed expressly to punish low-wage workers and to keep them down.

The software also condemns a large percentage of our children to grow up without routines. They experience their mother bleary-eyed at breakfast, or hurrying out the door without dinner, or arguing with her mother about who can take care of them on Sunday morning. This chaotic life affects children deeply. According to a study by the Economic Policy Institute, an advocacy group, “Young children and adolescents of parents working unpredictable schedules or outside standard daytime working hours are more likely to have inferior cognition and behavioral outcomes.” The parents might blame themselves for having a child who acts out or fails in school, but in many cases the real culprit is the poverty that leads workers to take jobs with haphazard schedules—and the scheduling models that squeeze struggling families even harder.

The root of the trouble, as with so many other WMDs, is the modelers’ choice of objectives. The model is optimized for efficiency and profitability, not for justice or the good of the “team.” This is, of course, the nature of capitalism. For companies, revenue is like oxygen. It keeps them alive. From their perspective, it would be profoundly stupid, even unnatural, to turn away from potential savings. That’s why society needs countervailing forces, such as vigorous press coverage that highlights the abuses of efficiency and shames companies into doing the right thing. And when companies come up short, as Starbucks did, the press must expose them again and again. Society also needs regulators to keep them in line, strong unions to organize workers and amplify their needs and complaints, and politicians willing to pass laws to restrain corporations’ worst excesses. Following the New York Times report in 2014, Democrats in Congress promptly drew up bills to rein in scheduling software. But facing a Republican majority fiercely opposed to government regulations, the chances that any of them would become law were nil. The legislation died.

In 2008, just as the Great Recession was approaching, a San Francisco company called Cataphora marketed a software system that rated tech workers on a number of metrics, including their generation of ideas. This was no easy task. Software programs, after all, are hard-pressed to distinguish between an idea and a simple string of words. If you think about it, the difference is often just a matter of context. Yesterday’s ideas—that the earth is round, or even that people might like to share photos in social networks—are today’s facts. We humans each have a sense for when an idea becomes an established fact and know when it has been debunked or discarded (though we often disagree). However, that distinction flummoxes even the most sophisticated AI. So Cataphora’s system needed to look to humans themselves for guidance.

Cataphora’s software burrowed into corporate e-mail and messaging in its hunt for ideas. Its guiding hypothesis was that the best ideas would tend to spread more widely through the network. If people cut and pasted certain groups of words and shared them, those words were likely ideas, and the software could quantify them.

But there were complications. Ideas were not the only groups of words that were widely shared on social networks. Jokes, for example, were wildly viral and equally befuddling to software systems. Gossip also traveled like a rocket. However, jokes and gossip followed certain patterns, so it was possible to teach the program to filter out at least some of them. With time, the system identified the groups of words most likely to represent ideas. It tracked them through the network, counting the number of times they were copied, measuring their distribution, and identifying their source.
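Cataphora never published its method, so the sketch below is only an illustration of the copy-counting hypothesis described above: break each message into short word windows, count how often a window reappears, how many different people pass it along, and who used it first. The messages, names, and four-word window size are all invented.

# A stripped-down version of snippet tracking: phrases that get copied by
# many different people are treated as candidate "ideas".
from collections import Counter, defaultdict

messages = [  # (timestamp, sender, text): hypothetical corporate traffic
    (1, "aru", "what if we let customers schedule their own pickups"),
    (2, "bea", "fwd: let customers schedule their own pickups, thoughts?"),
    (3, "chen", "+1 on let customers schedule their own pickups"),
    (4, "bea", "lunch on friday?"),
]

def shingles(text, n=4):
    # Break a message into overlapping n-word windows.
    words = text.lower().replace(",", "").split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

copies = Counter()            # how many messages contain each phrase
spreaders = defaultdict(set)  # distinct people who passed it along
origin = {}                   # who used the phrase first

for ts, sender, text in sorted(messages):
    for phrase in shingles(text):
        copies[phrase] += 1
        spreaders[phrase].add(sender)
        origin.setdefault(phrase, sender)

for phrase, n in copies.most_common():
    if n >= 3 and len(spreaders[phrase]) >= 3:   # widely copied and widely spread
        print(f"idea? '{phrase}' | {n} copies | first from {origin[phrase]}")

The real system, per the passage above, then had to learn to filter out jokes and gossip, which spread just as readily; that harder filtering step is not attempted here.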

Very soon, the roles of the employees appeared to come into focus. Some people were idea generators, the system concluded. On its chart of employees, Cataphora marked idea generators with circles, which were bigger and darker if they produced lots of ideas. Other people were connectors. Like neurons in a distributed network, they transmitted information. The most effective connectors made snippets of words go viral. The system painted those people in dark colors as well.

Now, whether or not this system effectively measured the flow of ideas, the concept itself was not nefarious. It can make sense to use this type of analysis to identify what people know and to match them with their most promising colleagues and collaborators. IBM and Microsoft use in-house programs to do just this. It’s very similar to a dating algorithm (and often, no doubt, has similarly spotty results). Big Data has also been used to study the productivity of call center workers.

A few years ago, MIT researchers analyzed the behavior of call center employees for Bank of America to find out why some teams were more productive than others. They hung a so-called sociometric badge around each employee’s neck. The electronics in these badges tracked the employees’ location and also measured, every sixteen milliseconds, their tone of voice and gestures. It recorded when people were looking at each other and how much each person talked, listened, and interrupted. Four teams of call center employees—eighty people in total—wore these badges for six weeks.

These employees’ jobs were highly regimented. Talking was discouraged because workers were supposed to spend as many of their minutes as possible on the phone, solving customers’ problems. Coffee breaks were scheduled one by one.

The researchers found, to their surprise, that the fastest and most efficient call center team was also the most social. These employees pooh-poohed the rules and gabbed much more than the others. And when all of the employees were encouraged to socialize more, call center productivity soared.
