In Praise of Short-Term Thinking

For hundreds of years, economic observers have feared that machines were making human workers obsolete. In a sense, they’ve been right.


Pretty much from the moment machines could handle any sort of repetitive task, humans have worried about their own impending uselessness. For example, it was in 1772 that the British writer Thomas Mortimer decried “those [machines] which are intended almost totally to exclude the labor of the human race.” His particular concern was sawmills, which “if introduced into our dockyards etc. would exclude the labor of thousands of useful workmen.”

These sorts of anecdotes are often trotted out for stories about the current unease over technology that might replace human labor. The standard telling is straightforward: Early economic and political theorists such as Mortimer simply lacked the imagination to envision how the economy would grow and grow and grow, how technology would generate new industries and jobs, and how leisure would diversify, providing humans with essentially endless new ways to spend all the money they would earn. Technological advances often appeared to take away jobs, yes, but in the long run they led to more jobs, and all was well. This is then followed by a giant “but,” and the question is posed: “Is this time different?”

A new paper by three economic historians is a welcome departure from that narrative (despite its title, “The History of Technological Anxiety and the Future of Economic Growth: Is This Time Different?”). Instead of using history’s cute examples to invite considerations of the present, the authors—Joel Mokyr, Chris Vickers, and Nicolas L. Ziebarth—illuminate how technology has, repeatedly, brought about painful upheavals for workers in industrial society.

Because the thing is, although jobs have continued to proliferate in the long run, these predictions weren’t exactly wrong in the short run: Machines do replace humans. In fact, replacing humans is often entirely the point. As the economic historian Robert C. Allen has shown, the spinning jenny was invented in England precisely because wages were high, and thus it was worth it to mill owners to invest in a machine that would allow them to reduce the number of workers needed to make yarn. Even once the spinning jenny was invented, mill owners in France and India (where wages were relatively low) at first did not invest in the machines; they could just have the cheap humans do the work. Meanwhile, many of Thomas Mortimer’s sawyers surely lost their livelihoods. (As Keynes accurately and succinctly put it, “In the long run we are all dead.”)

But there is more to the upheavals than mere job loss, and even workers who managed to remain employed have found themselves affected. As Mokyr, Vickers, and Ziebarth describe, the concerns about the ways that technology was reshaping work were often not so much about the quantity of work available (with shortages leading to unemployment) but about the quality of that work—whether it was safe, whether it afforded workers sufficient autonomy, and whether it enabled them to have good lives. For example, they write of the “great anxiety” that people felt “in moving to factory work,” which for the first time separated workplace from home on a mass scale. Contrast that, they write, with today, when “people worry about the exact opposite phenomenon with the lines between spheres of home and work blurring.”

This history is much more valuable than some pat lesson about the foolhardiness of prognosticators. Even for those who were right, the fact that they saw the future accurately mattered very little, as no one could have known they were right at the time. What matters, then, isn’t whether early observers were right or wrong about the long term, but whether they were sufficiently empathetic in the short term. As the authors write, “While the predictions of widespread technological unemployment were, by and large, wrong, we should not trivialize the costs borne by the many who were actually displaced.”

Today, we are not much better at making reliable long-term predictions about the future of the economy than 18th-century onlookers were. But there is plenty we can and do know about how technological change is shaping work today—who gets it, how they are compensated, where the work gets done.

Perhaps the biggest change, one that Mokyr, Vickers, and Ziebarth highlight, is the growth in so-called “nonemployer businesses,” which sometimes go by the buzzier (and misleading) term “sharing economy,” and refer, in part, to gigs coordinated online via apps such as Uber, Airbnb, or Amazon’s Mechanical Turk. Other shifts include the greater “flexibility” many employees throughout the economy have with regard to where they work and their hours (which can mean unpredictable schedules based on demand and, as a result, unpredictable wages), and a recent doubling of the percentage of workers who primarily work from home (from 2.3 percent to 4.3 percent between 1980 and 2010). Many of these people—untethered from employment in the traditional, legal, W-2 sense—struggle, because the systems in place for protecting workers (unemployment insurance, workers’ comp, minimum-wage rules, Social Security taxes) don’t apply to them.

In other words, there is plenty to figure out about how technology is reshaping the economy without looking to the distant future. The question isn’t whether the machines are coming; they are already here, and have been for a long time.

Rebecca J. Rosen is a senior editor at The Atlantic, where she oversees coverage of American constitutional law and government in the Battle for the Constitution series.