The Last Human Job?
Weekly Article
July 27, 2017
What does it mean to be human? It’s the question at the center of decades of artificial intelligence stories—from 2001: A Space Odyssey to Ex Machina to Star Trek.
The answer, time and again, is that to be human is to have empathy. To be human is to care.
Now, data seems to back the science fiction. The latest research predicts that many care professions will be mostly safe from imminent robot replacement.
It reflects an emerging consensus about the characteristics that make humans distinct from machines, and that will therefore be necessary for success in an increasingly automated labor market: care and empathy. Though artificial intelligence may be making leaps and bounds that will allow it to take on repetitive, predictable, and production-heavy jobs, it hasn’t done as well with tasks that require constant adaptation, context, nuance, and emotional intelligence, like being an early childhood educator, a psychiatrist, or a nurse. These are the kinds of jobs that require a human touch, the thinking goes, and so they will be the last to be fully automated.
As a growing trove of think pieces demonstrates, there is sometimes a degree of forced reassurance about human superiority in this line of thinking. We fear the future in which, by the World Bank’s estimate, 57 percent of current work could be automated in the next 20 years. So we cling to our capacity for care as a strength humans will long, or perhaps always, have over artificial intelligence.
And yet, the idea that humans will own all care or empathy-centric professions is only partly true. Acknowledging this care gray area may be essential to better understanding what care is, how it works, and why we should value it more.
Care work is one of the only forms of labor in which time often doesn’t equal money. Women do disproportionately more care work—paid and unpaid—than men. Economists estimate that women around the world work about 39 days more than men each year because of the unequal distribution of work like cooking, cleaning, gathering water, and caring for children, the ill, and the elderly. One report from ActionAid found that those additional hours of unpaid care work, spread out over the average life expectancy for women globally, amount to a staggering extra 23 years of work. All of those years amount to what is essentially an enormous subsidy: According to the AARP, the value of unpaid caregiving for adults in the United States was $470 billion in 2013—nearly matching the sales of the world’s largest company, Wal-Mart.
This work isn’t only undervalued; it’s also largely invisible when it comes to calculating economic productivity measures. That’s wild when you consider that the McKinsey Global Institute found that if all of this unpaid work were compensated at a rate approximating minimum wage, it would add about $10 trillion to global economic output, or 13 percent of global GDP. What’s more, devaluing care doesn’t just apply to unpaid labor; it applies to paid care work, too. The care economy workforce—made up mostly of women of color—often earns close to or less than minimum wage. This low compensation traps them in “the cycle of working poverty,” explains Patrice Willoughby, a former Obama administration official and current lawyer who works on technology and care policy issues. These workers include home health aides, registered nurses, and personal care aides. Though we may be socialized to assume that the more you get paid, the more important your job must be, don’t be fooled: Without these jobs, our economy would grind to a halt. Care work enables all other work to happen. In a world without care workers, many of us would be forced to leave our jobs to care for an aging or ill family member.
So if care jobs become the last human jobs, could that encourage employers and policymakers to recognize and value care as the economically critical work that it is?
Whether or not AI and automation could change the way we value this work will depend on how much care work—and what kind of care work—humans continue to do. Though we’re far from building machines that can mimic human emotional intelligence, Albert “Skip” Rizzo, the director for Medical Virtual Reality at the University of Southern California’s Institute for Creative Technologies, doesn’t think we can rule it out.
Rizzo sees the processes behind certain kinds of emotional intelligence and empathy-signaling less as magic and more as an advanced data-analysis process. In other words, people with high emotional intelligence are better at taking in lots of data points from another person at once—eye contact, the tone, inflection, and cadence of their voice, and breathing patterns—and feeling a desire to empathize, help, or mentor that person. Noting that he still sees the future of care “very much in the domain of humans,” and that there’s still something unique about a human touch, Rizzo does see potential to bring in AI for the overworked or burned-out care worker.
“We want to fill gaps where there isn’t a live provider, or where there is but they don’t have the time or patience to be able to do it perfectly all the time—to take a load off them,” he said. “AI is here to help people be better carers.”
That’s a common refrain among developers and proponents of automated care: They frame AI as an enhancer, rather than a displacer, of human capabilities. It’s a tool that will help people provide better care, fill in large and growing care gaps, and empower some populations—like the elderly—to be independent for longer.
“It is a way of augmenting the person, just as your mobile phone augments your ability to remember important things,” explains Tandy Trower, the founder of Hoaloha Robotics, in an email. The technology, he says, isn’t intended as a substitute for human interaction, but as a facilitator of connection with people. It’s intended to help address the approaching gray tsunami—a rapidly aging population with a dearth of people to meet its needs.
The choice between automated care and no care at all may be a false one, says University of Virginia Professor and sociologist Allison Pugh. There are, she says, two paths before us: “There’s the possibility that we would revalue care work, recognize it, bring it out of invisibility, and call it what it is to be truly human. On the other hand, I can see the way we could continue devaluing it by relegating it to algorithms and robotics or machines.”
Pugh isn’t saying that automated care lacks value, but rather that quality care requires people, and that it isn’t always clear exactly how certain kinds of care work depend on that human presence, or what processes lie behind them.
“If you talk to actual care providers, most of them say the core processes at the base of care require face-to-face human interaction—it involves seeing the other person, bearing witness to who they are and what they’re going through, and responding appropriately,” she said.
Pugh sees providing a halfway version of care to the masses not as a solution to the care gap, but as a slippery slope that may exacerbate health care inequality. There is one version of the future in which “the affluent buy their way to a very personalized, customized version of human care. Another version is where we actually value care for what it is, and see it as a human right—augmented by machines, but not replaced.”