The New Digital Divide of Emerging Technologies

This piece first appeared on September 26th in icrunchdata news.

At the turn of this new century – as the Internet came into its own – there was considerable angst about the gap between the nation’s technology “haves” and “have-nots.” Back then, only about one in three people was active online, largely because many of the rest lacked the means to be. Fast forward, and the situation has improved significantly, with 85 percent of Americans now regularly using the Internet via desktop computers, televisions, game consoles and a host of mobile devices. Yet despite that improvement, and in part because of it, a new, more problematic digital divide has emerged.

Recent data from the U.S. Census Bureau tells an all-too-familiar story: the economic recovery has been feeble and uneven. Real median household income for most Americans was flat in 2012, while the top 10 percent of earners took home more money than at any other time since records have been kept. There are any number of reasons for the disparity, including the global financial crisis, outsourcing, changes in tax laws and government’s inability or unwillingness to deal with the issue. But the primary culprit, according to Erik Brynjolfsson, director of MIT’s Center for Digital Business and co-author of Race Against the Machine, is rapid technological change.

The pace of such change is debatable. Some, like engineer and entrepreneur Peter Diamandis, contend that digital systems are expanding exponentially. Others, such as author and computer scientist Bob Seidensticker, argue that the rate is no greater than at other times in history. Whatever the actual speed, Brynjolfsson believes that technology is moving faster – and permeating society more deeply – than the ability of either individuals or organizations to keep up, and is thus destroying more jobs than it creates.

To be sure, smart machines have displaced millions of middle-income manufacturing and clerical workers. Ordinary citizens wielding new technologies have also replaced professionals such as journalists and publishers. This has forced many to seek employment at much lower skill and pay levels. On the other hand, technical specialists like software developers, data analysts and cloud architects seem to be doing just fine. The upshot is what economists have termed “job polarization.”

But whereas the previous divide could be narrowed by making technology more available to everyone, such pervasiveness may create as many problems as it solves, even for those who believe they are on the safe side of the chasm.

What technology gives, it can also take away. Just ask past generations of word processing professionals, desktop publishers and high-tech masters of every ilk whose once-unique talents lost value in the wake of cheap computing power, expanded storage and user-friendly systems. Indeed, the more people who acquire technology resources and skills, the less valuable those skills often become. To that end, a number of companies are already working to make data science more accessible to the general population. Expertise such as this can also be readily transported to any point on the globe. What is more, working for a leading-edge firm no longer guarantees lasting prosperity, as 40,000 BlackBerry employees have learned the hard way.

What, then, may be the fate of those caught up in this new, ever-changing digital divide? The good news, according to Harvard economist Lawrence Katz, is that employment rates have historically been fairly stable over the long term because nations have always been able to create new jobs, many of which may be completely unforeseen. The bad news is that it can take decades for individuals and organizations to acquire the appropriate knowledge and skills, during which time it is estimated that nearly half of all U.S. jobs could be susceptible to displacement by digital technologies.

It goes without saying that just about everyone not already well-heeled will have to continually upgrade their capabilities if they hope to do better than simply tread water or sink, including technologists, who must keep up not only with the pace of change but with all of the hype that comes with it. That is the “grand challenge,” says MIT’s Brynjolfsson, and it will require humans and machines to learn to work together.

Gartner Inc. has come to the same conclusion. In the 2013 edition of its Hype Cycle for Emerging Technologies, the research and advisory company identified three likely trends: humans using technology to enhance their own qualifications; machines replacing humans; and, in a “can’t beat ‘em, join ‘em” scenario, the two working alongside each other across a broad range of physical and intellectual tasks. Just how such a partnership plays out may determine the future of the new digital divide.