Few topics have generated as much conversation in recent years as the Future of Work. That might reveal something about the recent past: job loss through economic disruption feels especially real after the global financial crisis, and technological innovations like social media provide fertile ground for speculation. But all this needn’t lead to fear or to a sense of resignation to technological changes. Adam Seth Litwin can prove it.
The Cornell University School of Industrial and Labor Relations professor won an Ideas Worth Teaching Award for Technological Change at Work (aka TΔ@W), a course that illuminates how technology’s social impact is a matter of choice—starting in the classroom itself.
The last few years have been heavy with talk of the “Future of Work” and anxieties about automation and education. Taken against this landscape, what do you think the impact of your course has been?
I actually conceived of Technological Change at Work (TΔ@W) over a decade ago while a PhD student at MIT. Being a social scientist among technologists opened my mind to the juncture of these two worlds, and this intersection soon came to describe my research interests. At the same time, the impact of technology on the “future of work” became a teaching interest, too. I began gathering notes, thoughts, and readings to construct the course. Still, despite the swelling box of materials and encouragement from my colleagues, the demands of writing a dissertation and then being a junior faculty member at Johns Hopkins led me to defer the actual teaching of the course again and again.
Much of the appeal of Cornell for me was the interest my soon-to-be senior colleagues took in the interplay of technological change and work. After I was here for a year or so, one of them, in particular, insisted that I bring the course to fruition. When I tore open the TΔ@W box about three years ago, I realized that the “future of work” I had been preparing for had largely arrived. To paraphrase the novelist William Gibson, the future was already here. It just wasn’t very evenly distributed. So, the first lesson I want my students to take from the course is that they must stop thinking that we are still waiting for something.
The second lesson is that people—managers, policymakers, workers, consumers, and citizens—drive change. Technology merely serves them as a tool for doing so. With this realization, the more Pollyannaish among my students conclude that it was not technology per se that delivered us vaccines and airplanes. The more cynical likewise discover that robots are at best scapegoats for the apparent economic dislocation of blue- and white-collar workers alike.
Over the course of the semester, the students build an understanding that management decision-making and the institutions that we ourselves build fully mediate the impact of technological change on work and workers. In other words, we decide whether technology assumes the most treacherous tasks, freeing us up to better leverage our cognitive capacities, and making us all richer, or whether it gnaws away at a fixed lump of work, pushing wages below subsistence levels where it’s not driving workers out of employment altogether.
What immediately stands out about your syllabus are the icons used to denote different forms of multimedia content. What considerations went into the selection of these multimedia assets in the course design?
My natural inclinations are old school. I still enjoy reading actual books and articles, and my mind remains oriented toward print as the best medium for conveying complex ideas and relationships. Multimedia, nonetheless, offers some advantages that text does not.
First, to ignore podcasts, infographics, blog posts, or tweets is to exclude some of the richest thinking on technological change over the last few years. Even if I could construct a syllabus by relying entirely on conventional, print sources, I would be shortchanging the students—essentially teaching them in analog in a digital age.
Second, having grappled with learning disabilities of my own, I am always looking for new ways to connect with my students. When I first started teaching, I had very limited options. I could assign an entire film, but there was no obvious way to ask students to listen to an archived episode of a radio show, especially when some students didn’t even have broadband. Podcasts technically existed, but they were not that accessible and the selection was limited. Add to that the fact that I was only a few years older than my students, so naturally we had fairly similar learning preferences.
Fast forward a decade, and even my in-laws (I have in-laws now?) are listening to podcasts. Friends and colleagues are blogging and producing mash-ups. Consulting firms, in particular, are leveraging new, interactive ways of examining data online. Believe me, when I have a student who wants to dive into McKinsey data, generating different visualizations to uncover relationships between skill development and unemployment, I’m not going to tell her to shut her laptop and read an article. I’m going to ask if we can share her screen with the class, project it on the board at the front of the lecture hall, and have her walk us through it!
In general, I think we should embrace new technology where it improves our economic and social lot, which—under the right institutional arrangements and incentives—should be quite frequently. Given that, I would be remiss to rely solely on printed resources, and I would be wrong to promulgate a run-of-the-mill “no devices” rule in class. As in society at large, we need to make these choices together, especially in a classroom.
In fact, one of the first tasks we undertake as a group is the development of a “technology-in-class” policy. I facilitate the in-class discussion around it and oversee (and only occasionally intervene in) the wiki that students use to develop it. Believe it or not, at the start of the discussion, there is usually a small cadre of students who think devices should be verboten. But, there is always a plurality that foresees all sorts of ways that the constructive use of devices could add value to class discussions and exercises.
I try to walk the students through the process of deciding how we can exploit the benefits of laptops, tablets, and smartphones while preventing their abuse or use in ways that promote incivility or corrode the educational experience. Encouraging students to develop their own rules of engagement with and disengagement from technology in the classroom helps build community. This bottom-up policy benefits the class in countless ways, one of which is that the students enforce the tech-in-class policy themselves over the course of the semester.
You just mentioned that your students leave the course with the notion that they will decide which technologies take hold and whether each will be deployed to the economic and social benefit of just a few or to the benefit of many. What makes you think they or we can alter the path that technology takes?
I’ll give you an example from the healthcare sector, one I think about a lot. Digital communication and telepresence—smartphones, really—are changing the way frontline workers care for patients. Right now, home care work is about as lousy a job as it gets—underpaid, overworked. Policymakers and others wonder why there is a supposed “shortage” of folks willing to take on this work, yet they are unwilling to increase funding to the agencies that pay home health aides—potentially because they are disproportionately women, immigrants, people of color, and others without much economic power or political clout.
Bring in smartphone technology, and a number of states want to institute something called electronic visit verification (EVV). This accelerates the “Taylorization” of home care work, tracking these workers via GPS, giving them step-by-step instructions regarding what they must do for which patient and in what order. This allows—or really, forces—home care agencies to micromanage to the extreme—all predicated on the belief that since these jobs are so poor, the people doing them must be incompetent and undependable.
Alternatively, one can acknowledge that home health aides spend more time with patients than anyone else, and that they see patients going about their daily lives. Rather than using smartphones to micromanage home care workers, we could instead use smartphones to connect these workers to the rest of the care team and to let them serve as case managers of sorts—a conduit to all of the clinicians and social workers involved in the patient’s care. Would this require that this work be upskilled? Yes, it would, and it would improve patient outcomes, too. Of course, if you’re expecting more from these workers, you will need to pay them more, too, which will probably go a long way toward addressing the supposed “shortage.”
So, why is this model not more prevalent? Because it requires active policymaking and unconventional managers and tech developers—but not purely altruistic ones. With the right policies in place, those taking these steps would make money for their shareholders, too.
So, profitability, better jobs, better patient outcomes, and without spending any more money than we do now because of higher productivity…
There’s a proverb that I have seen attributed to both Confucius and Vint Cerf, so whether you want to go Eastern or Western, ancient or modern, I’ve got you covered. Put succinctly, “The man who moves a mountain begins by carrying away small stones.” More crudely, if a boulder atop a mountain looks like it is coming dislodged, poised to crush your entire village, you alone might not be able to stop it. But, if you move a few stones and you can mobilize those around you and those around them to do the same, you can redirect that boulder.
My students are all going to be managers, policymakers, professionals, parents, teachers, workers, citizens. Take your pick. Every single one of them is in a position to carry some stones. All of us can help direct that boulder along the path we really want it to go. Indeed, we must. I am confident that we will.