On Working from Home
Since 2020, the disruption in commuting patterns stemming from the Covid-19 pandemic has rapidly increased the number of people working from home. This may feel like an unprecedented shift in human history. But viewed from a broader perspective, it quickly becomes apparent that we commuters are the outliers: From the beginnings of human civilization until around the late 19th century, most people in most places worked where they lived.
Americans in the workforce today will probably never return to the commuting rhythms they followed until the first few months of 2020. Indeed, many just entering the labor force may spend their entire careers working from nearly any place they choose. A look at the history of office work and the nature of the recent transition suggests that a combination of efficiency gains, worker preference, and technological innovation will likely make working from home the norm for large swaths of Americans going forward. This may be beneficial not only for employers and employees, but for the country as a whole.
THE HISTORICAL NORM
For most of human history, people had little choice but to live more or less where they worked. Skilled tradesmen in urban areas worked from home-based shops. Most of the non-farming, non-trade workforce of those periods lived in military citadels and religious institutions that also functioned as centers for training, service, and worship. Even in sophisticated societies like second-century Rome, Elizabethan England, and 1850s America, a majority of working people lived and worked on farms.
In fact, before railroads and steamships came into widespread use, living far from a workplace was nearly impossible: A trip to work of more than a mile simply took too long. And even if faster transportation had been available, pre-modern agricultural techniques meant that even relatively advanced civilizations needed a sizable number of agricultural workers — most of whom lived where they worked.
Work and home life first began to diverge during the second half of the 18th century. Since early industrial-age technologies could only exist at specific locations, laborers needed to gather at job sites to carry out their work. Early steam engines, for instance, were used almost entirely in mines (their first commercial use in mining predated Robert Fulton’s steamboat patent by over a century), and they required massive numbers of workers who could hardly be expected to live at their work sites. Likewise, the looms and spindles of early textile mills required enormous labor forces to operate. The result, again, was thousands of workers laboring in a centralized location too small for most of them to live in.
Such necessities developed quickly: In 1840, the textile mills in downtown Lowell, Massachusetts, employed over 8,000 workers in a small area. To put this in perspective, George Washington’s Mount Vernon — a massive commercial enterprise that made Washington one of the richest men in America in the late 18th century — had a little over 300 enslaved laborers and dozens more free people and family members working there at any given time. Mount Vernon comprised nearly 8,000 acres in Washington’s day, meaning there was ample room for the people who worked there to live.
The need to bring considerable numbers of workers together intensified as industrialization continued. Economically efficient use of the Second Industrial Revolution’s main production technologies — machine tools and interchangeable parts — relied on two other technologies, steamships and railroads, to bring large quantities of raw materials to factories and then distribute the finished goods around the world. For this reason, factories had to be situated near rail sidings, rivers, and ports. Similarly, the machines necessary for early industrial-age manufacturing were costly, physically massive, and often bespoke, meaning most businesses couldn’t afford to purchase them in large quantities and distribute them across various locales.
Still, the transition from home-based work to today’s commuter culture took time. In the United States, it wasn’t until around 1900 that a clear majority of people worked outside the family farm. Even then, it wasn’t until the 1930s that the percentage of people working from home-based farms fell below a quarter of the U.S. population. If one includes in the labor-force count the massive number of women engaged in unpaid, value-creating activities like child care, cooking, cleaning, laundering, food preservation, and sewing, most work in the United States, in the broadest sense of the term, took place in the home until women began staffing factories in the early 1940s.
As the century wore on, the need to travel to a centralized workplace each day became nearly universal. In 1948, when data measurement in its current form began, about 40% of the labor force was employed in mining, manufacturing, and agriculture — industries that generally require daily presence at a job site. Yet even as the percentage of the labor force working in sectors requiring a physical presence fell to less than 15%, the commuter culture persisted.
Today’s white-collar employers may cite difficult-to-measure factors like creativity, synergy, and corporate culture as reasons for opposing remote work, but initially, offices were also necessary for purely practical reasons. Before computer databases revolutionized legal research in the 1980s, for instance, lawyers needed to work near courthouses and physical law libraries — meaning an attorney working for a major law firm in 1980 had just as much need to travel to the workplace as a woman laboring in a textile mill in the 1840s. At the same time, long-distance, real-time communication, while possible, was too expensive for most firms to use on a daily basis. In 1985 — the year after AT&T’s long-distance monopoly ended — the average cost of an hour-long domestic long-distance phone call for business was $25.60 (equivalent to $68.40 in 2022).
Even when computers entered the scene, mainframes and minicomputers — the mainstays of large companies of the 1970s and 1980s — could only be used via terminals that were physically wired to the central machine. Local area networks rarely extended beyond a single building, or perhaps a corporate campus. And until 2000, nearly all American legal documents required a physical signature. By then, only 52% of Americans were using the internet regularly, meaning delivering a written message to someone else — say, an inter-office memo, or a draft of a report — often required putting it on paper and distributing physical copies.
Given the office’s long-standing importance to such basic tasks, it’s no surprise that the commute to work proved enduring. As recently as 2019, only about 6% of the American labor force worked entirely from home. While this represented an increase of about two percentage points over the preceding two decades, it still meant that 94% of workers went to an office or job site almost every work day. The rise of the internet may have facilitated international sales of every moveable item, greater access to information, and nearly costless, instantaneous global communication, but it only modestly reversed the long-term trend away from working where one lived.
Then, very suddenly, in March 2020, the bulk of America’s workforce was sent home.
A WORLD MADE NEW
During the pandemic, stay-at-home orders, masking requirements, and mandated social distancing ruled the day. According to the Bureau of Labor Statistics, more than one in three Americans were working remotely at the height of the lockdowns.
As case numbers and death rates climbed and fell, state leaders imposed, lifted, and reimposed lockdown measures, while firm after firm pushed back mandatory return-to-work guidelines. By the one-year mark, things were finally looking up: Remarkably effective vaccines had been developed and were becoming more widely available, a variety of treatments were showing promise against Covid-19, firms were reopening, and states were easing up on social-distancing measures. Yet even before the Delta variant triggered a wave of new cases in the fall of 2021, the trend away from in-person work endured: As late as September of that year, according to a Gallup survey, 45% of all workers were working at least partly from home.
Even after the last major mask mandates were lifted at the end of April 2022, data kept by the office-access and security firm Kastle Systems showed that office-attendance rates nationally were at only 40% of their pre-pandemic levels. In the San Jose office-space market — i.e., Silicon Valley — attendance remained at about a third.
The shift to remote work appears to be primarily a white-collar one. Data from major urban rail systems, which can count their riders directly, offer more evidence of this pattern. In New York, Friday loads on the city’s Metro North commuter rail lines (which serve mostly wealthy, white-collar areas of Connecticut and New York) remain at about 50% of their pre-pandemic levels. In Washington, D.C., even as ridership on rail systems has recovered, parking use at Metro lots remained at 16% of its pre-pandemic level in the spring of 2022, with only a handful of facilities in the entire system reaching close to 50% most days (pre-pandemic, almost all of them filled up). This suggests that a larger percentage of those taking trains are “transit dependent” riders — generally less-well-off individuals who don’t have cars and are less likely to work at desk jobs that can be done from home.
As of April 2022 — with vaccines widely available, mask mandates rescinded in most areas, and a variety of treatments showing promise against Covid-19 — nearly 8% of the labor force was still working entirely from home. Additional data suggest that, rather than returning to pre-pandemic levels, telecommuting will become more common in the future. In a 2020 working paper for the National Bureau of Economic Research, economists Jonathan Dingel and Brent Neiman found that 37% of all jobs (representing 46% of wages) can be performed entirely at home. McKinsey’s numbers are lower: It estimates that around 20% of jobs globally, and about a third in heavily white-collar economies like the United States, can be done remotely. However, a survey by the Pew Research Center found that 62% of workers with at least a bachelor’s degree say that their jobs can be done entirely from home. The percentage of workers performing white-collar jobs in America has never fallen, so we can expect these numbers only to rise.
What’s more, jobs that current studies assume are not doable from home may become so over time. McKinsey’s study, for example, assumes that most “equipment operation” jobs cannot be performed remotely. While this is true for the majority of such jobs today, the finding may come as a surprise to people who operate rail and aerial vehicles remotely — not to mention engineers working on far more ambitious plans for automobile and truck autonomy.
Of course, trends through early 2022 will not determine the future by themselves. No major public-health entity has declared the pandemic over and, until April 2022, mask mandates remained in place for many service workers and on public transportation. Things could change further, and work-from-home patterns could shift in response. But for reasons of efficiency, employee preference, and technological capabilities, such a reversal appears unlikely.
Working from home appears to bring advantages that nearly every employer desires. A 2021 meta-analysis published in the International Journal of Environmental Research and Public Health found significant advantages that would be hard to replicate at the office. In particular, the researchers observe:
Three factors represent the main advantages of [working from home]: (i) work-life balance, (ii) improved work efficiency and (iii) greater work control. The main disadvantages were (iv) home office constraints, (v) work uncertainties and (vi) inadequate tools.
If the list is even close to correct, the case for remote work is strong. Work-life balance, efficiency, and work control are things that many, if not most, employers strive to achieve. They also relate to both profits for employers and work satisfaction for employees — two key metrics of a firm’s success.
Of the three primary advantages outlined, worker efficiency is the most directly related to a company’s bottom line. And by most accounts, at-home workers tend to be more productive than their office-based counterparts. Though results will inevitably differ from person to person, job to job, and firm to firm, studies indicate that employees working from home skip about an hour of commuting and work somewhere between 18 and 48 minutes longer than they would have at the office. Depending on the estimates used, this amounts to between two and six additional 40-hour weeks of work per year.
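A quick back-of-the-envelope calculation makes the arithmetic concrete. The sketch below uses the 18-to-48-minute range cited above and assumes roughly 250 workdays per year; a different workday count, or crediting part of the saved commute hour as work, pushes the total toward the higher end of the two-to-six-week range.

```python
# Back-of-the-envelope check of the extra-work figures above.
# Inputs are the estimates cited in the text; the workday count is an assumption.
WORKDAYS_PER_YEAR = 250  # assumed: ~50 working weeks of 5 days

def extra_weeks(extra_minutes_per_day: float) -> float:
    """Convert extra daily work minutes into equivalent 40-hour weeks per year."""
    extra_hours_per_year = extra_minutes_per_day / 60 * WORKDAYS_PER_YEAR
    return extra_hours_per_year / 40

low = extra_weeks(18)   # about 1.9 extra 40-hour weeks
high = extra_weeks(48)  # about 5.0 extra 40-hour weeks
print(f"{low:.1f} to {high:.1f} additional 40-hour weeks per year")
```

Run as-is, the figures land at roughly two to five extra weeks, consistent with the range cited in the studies.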
One of the few randomized, controlled trials of the question that meets scientific standards for sample size and replicability found truly extraordinary results: Working from home increased productivity and reduced absenteeism across the firm’s entire workforce. (The researchers examined call-center workers in China.) Likewise, a study for the University of Chicago’s Becker Friedman Institute showed that a majority of employees believed they were more productive at home than they expected, while only a trivial percentage believed they were less productive. Overall workforce productivity — which seesawed throughout the pandemic as dislocations forced workers to take all sorts of steps they wouldn’t have otherwise — is higher than it was before March 2020 and saw near record growth at the end of 2021. Though it did appear to decrease in the first quarter of 2022, to argue for a full-time, in-office schedule, one has to believe that the advantages of being in the office on any given day are so great that they outweigh the benefits of this relative increase in work time.
The three problems that the meta-analysis identified — constraints (which mostly refer to spatial constraints), uncertainties (which refer mainly to assurances from bosses and information about progress), and tools — are certainly worth taking into account. However, most of them can be mitigated, especially in the United States.
Two of the factors — the lack of space and tools — may be less of a problem in America than in Europe, where most of the studies in the meta-analysis were conducted. When it comes to space, American homes average 2.4 rooms per person, as compared to an average of 1.9 in the other six G-7 countries. Likewise, American home internet — likely the most important tool for most people working from home — is faster than the internet in most of Europe.
On the other hand, the “uncertainties” problem may be more difficult to surmount in the United States than in much of the rest of the developed world, since most European countries provide a degree of legal job security that’s absent in America’s default system of at-will employment. That said, greater acceptability of working from home may reduce the worries of some employees over time as they see that they are not risking their jobs. Even so, the fact remains that the problems related to working from home have realistic remedies, while the benefits are highly desirable.
Of course, nobody doubts that some work tasks are best carried out in a face-to-face environment. An article in Harvard Business Review argues that work from home hurts “creativity, innovation, teamwork, trust, and empathy.” And there is certainly some evidence for some of these claims. A study of Microsoft, for instance, found that creativity suffered when employees worked entirely from home. However, evidence gathered by Deloitte suggests that, with the right structures in place, innovation is perfectly possible in a remote environment.
On the matter of empathy, there doesn’t seem to be an academic consensus as to what “workplace empathy” consists of. While relationships do require face-to-face contact eventually, they may not require it constantly. In fact, meeting co-workers online may make it easier for some to form relationships initially. Today, a plurality of romantic relationships begin online. If so many people can find romantic partners remotely, it stands to reason they can also get to know their co-workers online, so long as they eventually meet in person.
All this suggests that firms can allow their employees to work from home while ensuring that relationships form and creativity flourishes — they just have to go about it deliberately. And when employees do need to gather, employers have to take steps to facilitate the opportunity. What this looks like in practice will differ a great deal from job to job and firm to firm, meaning employers will likely have to go through a period of trial and error before finding a solution that best suits their firm’s circumstances.
Some evidence indicates that they may already be doing so. Even as urban transit systems struggle to recover half of their pre-pandemic daily ridership, air-travel rates are down only slightly (and less than they would be during a recession), while U.S. travel revenues in March 2022 were just 5% below their 2019 levels. The travel industry’s global trade association projects that 2022 revenues will exceed 2019’s. This recovery in long-distance travel suggests that people working remotely may be doing more to get together intentionally.
Commuting by plane (on occasion) may seem like an expensive alternative, but for some, it might actually be cost effective. In Washington, D.C., for instance, the average total costs of a daily commute by car are around $12,000 per year. This is more than enough to pay for an average domestic airplane ticket ($300) twice a month for an entire year. Eliminating the daily commute, therefore, could free up enough money to fund plane tickets for the intermittent in-person office visit. And if employers are able to rent smaller office spaces to accommodate a smaller in-person workforce, they might be able to pick up the tab for periodic employee air travel.
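The cost comparison above is simple enough to check directly. The snippet below uses the figures cited in the text ($12,000 per year for a daily D.C. car commute, $300 for an average domestic ticket); the twice-monthly trip frequency is the article's illustrative assumption.

```python
# Comparing the cited annual car-commute cost with occasional air travel.
# Dollar figures come from the text; trip frequency is the article's example.
annual_car_commute = 12_000   # estimated yearly cost of a daily D.C. car commute
ticket_price = 300            # average domestic airfare cited in the text
trips_per_year = 2 * 12       # two office visits per month, year-round

annual_flying = ticket_price * trips_per_year
savings = annual_car_commute - annual_flying
print(annual_flying, savings)  # 7200 4800
```

Even with two flights a month, the former commuter comes out $4,800 ahead, before counting any reduction in the employer's office-space costs.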
Of course, even if working from home were not quite so efficient, it still may become more common by virtue of the fact that many workers prefer to work remotely.
The polling data are unequivocal on the matter: A Gallup poll from October 2021 shows that only 9% of people who previously worked daily 9-to-5 office jobs want to return to that schedule. Fifty-four percent of workers prefer a hybrid office, while 37% want to work entirely from home. Indeed, the percentage of employees who want to work remotely exceeds some estimates of the percentage of jobs that can be done entirely from home, meaning it's entirely possible that most of the workers who prefer full-time, in-person work hold jobs in fields like massage therapy, emergency medicine, and construction, all of which require regular physical presence at a job site.
Employers may not really have a choice if they wish to retain their labor force: One in six workers now say they will quit their jobs if they don’t have a fully remote option, while more than half say they will quit if they can’t have, at minimum, a hybrid option. Nearly half of workers also report they would be willing to take a pay cut in order to work from home, while 36% say they would prefer additional remote work to a raise. Since replacing a worker typically costs between 50% and 200% of that worker’s annual salary, employers that prohibit extensive work from home stand to face high costs.
In short, workers tend to value flexibility — roughly the same percentage want the ability to work from home as want a 401(k) contribution and more paid time off, for example. And working remotely, unlike many other benefits employees may desire — higher pay, greater benefits, a promotion, etc. — doesn’t cost employers anything directly. In fact, in many cases it may reduce overhead costs.
In the long run, most employers will probably have to listen to their white-collar employees — who are the most likely to be able to work from home — by virtue of their short supply. White-collar, office-based jobs are among the easiest to perform remotely, and while there’s no one-to-one correlation between skill level and the ability to work from home (cardiothoracic surgeons must report to operating rooms, for instance, while call-center employees can work at home), internet-based positions tend to require more skill than most in-person work. Given that the American economy may well be suffering from what Marianne Wanamaker, an economist at the University of Tennessee, describes as a “perpetual” labor shortage — especially as the Baby Boom generation retires and workforce-participation rates for males continue their decades-long decline — worker power is poised to rise. Employers who stand to lose current and prospective employees to competitors who offer greater remote-work flexibility would do well to take these trends into account when setting their work-from-home policies.
Employers who resist such trends may put themselves at risk of well-grounded charges of unfairness. In a typical year, just under 10% of Americans move to a new residence. About half of these moves are carried out over a significant distance, and about a quarter consist of moves of more than 500 miles. Given these numbers, it’s likely that most firms with over 25 employees had an employee move a decent distance away from the office during the pandemic. These workers were almost always allowed to keep their jobs, since most everyone who could continue working under lockdown conditions was already doing so from home.
As offices reopen, employers cannot require daily (or even twice-a-week) attendance from people who may now live thousands of miles from the office. Will employers suddenly fire workers who have done a fine job at home simply because they’ve moved? Some will, of course. But doing so might come off as both foolish and cruel. It will also impose direct costs on employers in the form of severance, unemployment, and the process of finding and training a replacement.
Meanwhile, employers who choose not to fire such workers will have to explain why they require office attendance for people who perform similar work but didn’t move during the pandemic. Few will have a compelling rationale. In a massive, 20,000-person survey conducted by Georgetown’s Christine Porath, workers reported that what mattered most to them was feeling respected by their superiors. It’s hard to imagine how employees who cannot work from home when others can would feel they were being treated with respect. While the appearance of unfairness itself isn’t illegal, it is something most employers will want to avoid.
Even if well-founded charges of unfairness don’t convince employers to allow remote work for most everyone who can, they may reconsider in light of the potential legal risk. An employer who allowed some employees to work from home but not others could be subject to investigations and lawsuits if the policies appear to discriminate against protected groups of employees. No doubt some employers will violate the laws out of bigotry or stereotyping, but even those who don’t may end up with facially evenhanded policies that have a disproportionate impact on legally protected groups solely by chance. It only takes a few employers losing lawsuits (rightly or wrongly) for lawyers and human-resources departments to decide that allowing anything other than a high level of flexibility isn’t worth the risk. And it would be hard to argue against them, particularly since allowing employees to work from home seems to be a net advantage for many employers.
Thus, given the overwhelming nature of the preference, the ease with which people can now work from home, the short supply of white-collar workers, and the existence of moral and legal barriers to prohibiting employees from working remotely, worker preference is very likely to win out in the end.
Of course, all of this leads to an important question: If working from home is so desirable and efficient, why weren’t employers already offering it in much larger numbers before the pandemic? The answer is simple: The technology that made it possible for massive numbers of people to work from home is much newer than most would assume. In fact, the internet of the early 2010s could not have supported remote work in the contemporary sense. Three of the key technologies that enabled it — modern video conferencing; cheap, cloud-based collaboration software; and gigabit home internet — are all less than a decade old.
While antecedents to video conferencing and recognizable video phones date back decades, the technology only became practical and pervasive in the relatively recent past. The front-facing cell-phone camera (key to making video conferencing possible from any location) didn’t become common until the introduction of the Apple iPhone 4 in 2010. The LTE data standard — which provides the bandwidth for video conferencing with reasonable reliability anywhere a phone signal is available — didn’t cover 50% of all mobile-phone traffic until five years later. Zoom — a staple of nearly all at-home white-collar jobs — didn’t post a profit until 2019. Zoom’s software was always a leader in the industry, but before the company was profitable, it was too small to attract large enterprise clients. After all, before 2015, most people didn’t have a way to video conference from their homes or elsewhere. Few large companies would have wanted to risk investing in unproven internet technologies before 2019.
Useful, inexpensive, cloud-based collaboration software is also a relatively new development. Google Docs, the most popular cloud-based platform, added the ability to comment on someone else’s document — the key to collaboration — in 2014. The cloud-based versions of Microsoft’s Word and Excel programs came to market in 2017. Before these apps became commonplace, collaborating required users to have the same software installed on their computers. If workers wanted to share a marked-up document widely and with a reasonable level of security, a special server or website created and owned by their company was necessary. So while employees of enterprises that could afford dedicated servers and IT departments were often able to share and collaborate on documents beginning in the 1990s, others weren’t able to do so until less than a decade ago.
Truly high-speed residential internet also arrived just in time for the pandemic. In 2015, a Federal Communications Commission (FCC) “Open Internet” order resulted in declining real investment in the internet, even as the growth of streaming video resulted in much greater demand. As a result, advertised internet speeds barely advanced and, as websites became more complicated, performance likely suffered for many users. But when Donald Trump’s FCC appointees reversed the order in 2018, investment in the internet soared. Gigabit internet with the bandwidth to support simultaneous video streaming and conferencing with ease became available in over 88% of U.S. homes — up from 5% just two years earlier.
In short, the internet of 2017 would have had a difficult time supporting a massive number of remote jobs. But at almost exactly the right moment, practical video conferencing, cheap or free cloud-based software, and faster residential internet made working from home possible for millions of Americans nationwide.
THE FUTURE OF WORK
Throughout American history, changes in the patterns of daily life have often lagged behind the changes in technology, norms, and habits that make them possible. Nearly a century passed between the invention of technologies that first made widespread commuting feasible and the near-universal adoption of the daily commute. A shift away from commuting may take a similarly long time.
Still, there are reasons to believe that widespread increases in remote-work opportunities will emerge in the not-too-distant future. The result could very well be a labor market that does a better job serving employers and employees alike. It may even offer a partial solution to the problems of regional inequality and declining mobility that continue to divide our populace.
Over the past several decades, a handful of large metro-area hubs of innovation and globalization have generally prospered. Most of the country’s rural areas, meanwhile, have seen little real economic growth. Yet in 2021 — a full year into our nation’s unplanned experiment in mass-scale remote work — small towns that combine beautiful environments with cultural amenities, like Meridian, Idaho, and Missoula, Montana, saw some of the fastest rising housing prices in the country.
At the same time, Washington, D.C. — an urban area with a heavily white-collar economy that had grown quickly for decades — experienced the country’s largest percentage drop in population. San Francisco, with its famously expensive real estate, enormous wealth, and dysfunctional housing market, saw rents rise more slowly than inflation for the first time in decades — in fact, it’s no longer America’s least affordable large city. And while 2021 saw a decline in residential moves relative to 2020, many Americans may have been waiting for the pandemic to recede before relocating. It will be some time before more data become available, but the early evidence indicates that a migration of high-skilled workers out of major metropolitan areas may be accelerating.
The Covid-19 pandemic led Americans to work from home in numbers not seen since the early 20th century. While the long-term consequences of this shift will take years to manifest, the results for employers, workers, and the country at large seem to offer more good than bad.
In any case, the trend toward working from home appears unstoppable. Before long, the 20th-century pattern of commuting daily to the workplace may be looked upon as a brief departure from the historical norm.