The New Digital Divide of Emerging Technologies

This piece first appeared on September 26th in icrunchdata news

At the turn of this new century – as the Internet came into its own – there was considerable angst regarding the gap between the nation’s technology “haves” and “have-nots.” Back then only about one in three people were active online, largely because many of the rest lacked the means to do so. Fast forward and the situation has since improved significantly, with 85 percent of Americans regularly using the Internet via desktop computers, televisions, game consoles and a host of mobile devices. Yet despite the improvement, and in part because of it, there has emerged a new, more problematic digital divide.

Recent data from the U.S. Census Bureau is all too familiar: the economic recovery has been feeble and uneven.  Real median household income for most Americans was flat in 2012, while the top 10 percent of earners took home more money than at any other time since records have been kept. There are any number of reasons for the disparity, including the global financial crisis, outsourcing, changes in tax laws and government’s inability or unwillingness to deal with the issue. But the primary culprit, according to Erik Brynjolfsson, the director of MIT’s Center for Digital Business and co-author of Race Against the Machine, is rapid technological change.

The pace of such change is debatable. Some, like engineer and entrepreneur Peter Diamandis, contend that digital systems are expanding exponentially. Others, such as author and computer scientist Bob Seidensticker, argue that the rate is no greater than at other times in history. Whatever the actual speed, Brynjolfsson believes that technology is moving faster – and permeating society more deeply – than the ability of either individuals or organizations to keep up, thus destroying more jobs than it is creating.

To be sure, smart machines have displaced millions of middle income manufacturing and clerical workers. Ordinary citizens effectively using new technologies have also replaced professionals like journalists and publishers, among others. This has forced many to seek employment at much lower skill and pay levels. On the other hand, technical specialists like software developers, data analysts and cloud architects seem to be doing just fine. The upshot is what economists have termed “job polarization.”

But whereas the previous divide could be narrowed by making technology more available to everyone, such pervasiveness may actually create as many problems as it solves; even for those who believe they are on the safe side of the chasm.

What technology gives it can also take away. Just ask past generations of word processing professionals, desktop publishers or high-tech masters of various ilks whose once unique talents abated in the wake of cheap computing power, expanded storage and user-friendly systems. Indeed, the more people able to acquire technology resources and skills, the less valuable they often become. To that end, a number of companies are already working to make data science more accessible to the general population. Expertise such as this can also be readily transported to any point on the globe. What is more, working for a leading-edge firm no longer guarantees lasting prosperity, as 40,000 BlackBerry employees have learned the hard way.

What then may be the fate of those caught up in this new, ever-changing digital divide? The good news, according to Harvard economist Lawrence Katz, is that, historically, employment rates over the long term are fairly stable because nations have always been able to create new jobs; many of which may be completely unforeseen. The bad news is that it can take decades for individuals and organizations to acquire the appropriate knowledge and skills, during which time it is estimated that nearly half of all U.S. jobs could be susceptible to digital technologies.

It goes without saying that just about everyone not already well-heeled will have to continually upgrade their capabilities if they hope to do better than simply tread water or sink; including technologists who must keep up not only with the pace of change, but with all of the hype that comes with it. That is the “grand challenge,” says MIT’s Brynjolfsson, and it will require humans and machines to learn to work together.

Gartner Inc. has come to the same conclusion. In the 2013 edition of its Hype Cycle for Emerging Technologies, the research and advisory company has identified three likely trends: humans using technology to enhance their own qualifications; machines replacing humans; and, in a “can’t beat ‘em, join ‘em” scenario, the two working alongside each other across a broad range of physical and intellectual tasks. Just how such a partnership plays out may determine the future of the new digital divide.

What Hollywood Can Teach Social Media

This piece first appeared on September 9, 2013 in icrunchdata news.


It may come as a surprise to film fans and critics, with the number of recent flops like The Lone Ranger, White House Down and R.I.P.D., that this has been a banner summer for Hollywood, thanks in no small part to audiences outside the United States. Indeed, global moviegoers, particularly those in emerging markets, are increasingly influencing how motion pictures get made; and there is a lesson here for social networks.

American movie attendance has steadily declined over the past decade, even as it has grown just about everywhere else. Accordingly, international audiences for Hollywood films now account for nearly 70% of global box-office. Thus, major studios are increasingly inclined to churn out variations of franchises in formats that play well in rising foreign markets, even if they may bomb at home. Sequels, for instance, sell a great many more tickets outside the U.S., which helps explain why Pirates of the Caribbean 5 is now in pre-production. Moreover, while the popularity of 3-D is softening here, the biggest hits abroad this summer were all three-dimensional.

So far at least one social network finds itself in a similar situation. Though Facebook’s membership surpassed one billion in 2012, it actually lost 10 million American visitors. What is more, its revenue from new markets is growing at more than twice the rate of North America and Europe. Yet Twitter, LinkedIn, Pinterest and Google+ are also experiencing considerable growth among active global users because nearly one-in-four people on the planet now use social media. The largest group lives in Asia Pacific, home to more than four times the number of social inhabitants as North America. Greater numbers of users also reside in Latin America, the Middle East and Africa; and their first encounters with social networking often take place on mobile devices.

In fact, this year marks the first time mobile internet users in developing countries will outnumber those in industrial nations. In February, China surpassed the U.S. as the world’s largest iOS and Android market for smartphones and tablets. A significant segment of its online population has leapfrogged personal computers and gone straight to social via mobile systems such as the cross-platform instant messaging services WeChat, WeMeet and Weixin.

Consequently, Mark Zuckerberg recognizes that future users will come almost entirely from emerging economies, and like American movie studios, social networks will have to balance diverse needs and interests.

The most immediate challenge is the fact that the majority of the world’s seven billion mobile phones are not yet smart so social networks must find less elaborate ways to engage their owners. To that end, Facebook recently launched Internet.org, its partnership with several handset and infrastructure manufacturers committed to “shaping the networked society” by improving the cost and efficiency of delivering Internet services to even the cheapest phones.

Should they succeed, they will still have to adapt to significant differences in online behavior. Asians, for example, are less protective of their privacy than Westerners. The research firm Forrester has found that while 75% of American social media users are passive spectators, or “lurkers,” the opposite is true in markets like China and India, where 76% and 80% of their respective social populations are active content creators.

Then there is the matter of censorship. Like many American films, Facebook and Twitter are currently prohibited in China. YouTube is outlawed in Pakistan. Saudi Arabia has banned WhatsApp. Plus Vietnam has decreed that its citizens can no longer share or discuss news and current affairs online.

As they attempt to balance these differences, will social networks shift their allegiances to adhere to the manners and mandates of growing emerging markets, even at the expense of their domestic users?

It is still too early to tell. Once-robust developing economies are slackening and their purchasing power has fallen. Nonetheless, their rate of consumption is expected to grow six times faster than developed markets, and the middle classes in these countries are advancing at record rates, even as their opposite numbers in the U.S. and Europe are in retreat. This is especially significant since the middle class has historically been the most likely to adopt new media and technologies.

Furthermore, Facebook and Twitter still dominate global markets while most new social systems and services are essentially variations of the current leaders. But according to a report by the Royal Society, China is already years ahead of the West in building the next generation Internet, with new means to both enhance and limit online traffic.

Most importantly, American social networks have been successful, in large part, because they have communicated American values. “America’s brand,” as Jonathan Berman, a senior fellow of Columbia University’s Vale Center has labeled it, has promised three distinguishing qualities: opportunity, individuality and liberty. So far, no other country has made a more appealing offer.

Whatever the ultimate outcomes, there is little doubt social media will change as it reaches new and ever-widening audiences; and social networks, like their motion picture counterparts, will have to continually adjust. Perhaps Mark Zuckerberg expressed it best during a recent interview when he said: “People often talk about how big a change social media had been for our culture here in the U.S. But imagine how much bigger a change it will be when a developing country comes online for the first time ever.”

Article written by Howard Gross for icrunchdata news New York, NY

The Science of Crisis Communication

This piece was first published on July 15th in Smart Data Collective

Early in the film Minority Report, police arrest a man for murder. It is not a killing he has actually carried out, but rather a “future murder” that has been predicted he will soon commit. The premise of the 2002 movie, and the short story on which it is based, is that society in the mid-twenty-first century has the means to portend and prevent such unwanted events; a capability many businesses and organizations today would surely welcome. But while this may have seemed like science fiction just a decade ago, it is fast becoming science fact. The question is: what science?

Past (and Present) as Prelude

On-screen, the source of this extraordinary foresight is a trio of young psychics called “pre-cogs” who can envision wrongdoings before they occur. Off-screen, it takes a combination of data mining techniques, text analysis and predictive modeling. Leveraging these tools to dissect vast amounts of information about the past and present, data scientists are increasingly able to make their own astute projections about the future.

Researchers who work in the emerging field of “culturomics” – a form of computational linguistics that scrutinizes digitized text to study human behavior – claim to have correctly augured the Arab Spring after applying geographic and tone analysis to 30 years of global news archives. IBM’s Smarter Cities unit has built a system in Rio de Janeiro that exploits real-time data to anticipate an array of urban problems. And in an example of life imitating art, police in Santa Cruz, California have adapted models, originally designed to forecast earthquake aftershocks, to foresee potential crimes.

It is only a matter of time then before other enterprises use the same systems and technologies to divine all kinds of latent troubles. Yet crises rarely, if ever, happen in hermetically sealed environments in which the type and amount of available data can be easily controlled. Given the ever-expanding size of the digital universe and the fact that information comes from more sources and at greater speeds than ever before, successfully managing emergencies requires a working knowledge of at least one other science: complexity.

Complexity is Everywhere

In complex systems, writes Microsoft researcher Duncan Watts in his book Everything Is Obvious, “the best we can hope for is to correctly predict the probability that something will happen.” This is because such systems are composed of many separate yet interdependent parts. The more parts there are, the more complex a system becomes. Likewise, since these multiple components interact in nonlinear ways, the consequences are often unexpected.

The 2008 global financial crisis has become the poster child of complexity gone bad. In the years leading up to the disaster, traders introduced an array of new algorithms, formulas and models; some of which they barely understood. Because the world’s banks and related institutions had become so densely interconnected, when things went wrong the chain reaction happened too rapidly for analysts to prevent an international meltdown.

To make matters worse, changes in nonlinear dynamics can be exponential. Even the smallest error at the start of a process (such as a misplaced digit or decimal) can result in outsized outcomes that may have once seemed unimaginable. This is the basis of Chaos Theory and, as Nate Silver notes in his book The Signal and The Noise, “in complex systems, mistakes are not measured in degrees, but in whole orders of magnitude.”
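Silver’s point about orders of magnitude is easy to demonstrate with a toy model. The sketch below uses the logistic map, a textbook example of nonlinear dynamics (an illustration of the general principle, not an example drawn from his book): two starting values that differ by a single digit in the sixth decimal place soon produce entirely different trajectories.

```python
# Illustrative only: the logistic map, a standard example of chaotic behavior.
# A starting value that is "off" by 0.000001 ends up on a wholly different path.
def logistic_map(x0, r=3.9, steps=30):
    """Iterate x -> r * x * (1 - x) and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_map(0.400000)   # the "correct" starting value
b = logistic_map(0.400001)   # the same value with a tiny error in the last digit

for step in (5, 15, 30):
    print(f"step {step:2d}: {a[step]:.4f} vs {b[step]:.4f} "
          f"(difference {abs(a[step] - b[step]):.4f})")
```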

Thus, it is ill-advised to rely too heavily on machines and massive amounts of data to foretell the future. As Silver points out, “in data analysis, humans improve the accuracy of forecasts.” But in doing so, they also create the need for yet another body of knowledge: cognitive science.

The Future is a State of Mind

Cognition is how people process information, and it significantly determines the ways in which we gather, evaluate and understand data. Underlying much of cognition are a host of biases – flaws in perception – that color how we define the world and influence many of our most important strategic decisions.

Cognitive biases can result from relying on limited intelligence, as was the case in Japan in 2011 when government officials failed to go back far enough in time to find geological evidence of past tsunamis on the scale of the one that damaged three nuclear power plants in Fukushima. Similarly, preset beliefs can inhibit impartial judgment, which may account for why several Republican pundits misread the data from the last presidential election. Whatever form it takes, misguided reasoning can present the same serious risks as misused data.

One possible solution to both problems is the advent of cognitive computing systems like IBM’s Watson, which in 2011 beat its flesh and blood opponents in the Jeopardy! Challenge by studying and extracting meaning from myriad books, newspapers, magazines, web sites and social media. Future Watsons will discern interactions between people and machines and draw insights from them; regularly reprogramming themselves accordingly. More than simply execute stored software, they will sense, learn and, at least in theory, provide sage advice.

In the meantime, humans will continue to handle that responsibility. And though it is unlikely communication practitioners will ever don lab coats, expanding their knowledge across sciences such as data, complexity and cognition, among others, will enable them to address ever more daunting crises.

Big Data and the Myth of Consumer Control

This piece was first published on June 25 in Smart Data Collective

One of the more probable victims of the ongoing National Security Agency (NSA) scandal is also one of the most prominent memes of the 21st century. For more than a decade the idea that “consumers are in control” has permeated marketing thought, often driving strategies and significantly enhancing the popularity of social media. Recent disclosures, however, are proving that the notion is not only wrong but, more importantly, wrong-headed.

How Much Data?

Content may be king in the realm of marketing, yet as in many monarchies real power resides elsewhere. In this case, it is within the massive troves of consumer data companies have been compiling. Last week, Bloomberg reported that thousands of technology, finance and manufacturing firms – the government’s so-called “trusted partners” – voluntarily provide it with customer communications. No doubt there are myriad other less intimate enterprises that also gather and stash similar information.

In truth, no one knows precisely how much data has been vacuumed up over the years. The NSA has repeatedly told Congress it can’t keep track of its surveillance operations; and one of the biggest problems many companies face is their failure to use most of the data they capture. But according to Viktor Mayer-Schonberger and Kenneth Cukier, co-authors of the best-seller Big Data: A Revolution That Will Transform How We Live, Work and Think, this much is for sure: “more data is being collected and stored about each one of us than ever before.”

Knowing ≠ Understanding

Many consumers already know this. (If they didn’t before the scandal, they probably do now). Still, knowing is not always the same as understanding. In his research, Joseph Turow, a professor at the University of Pennsylvania’s Annenberg School for Communication, has found that most Americans are unfamiliar with concepts like data mining and behavioral targeting. They are also limited in their ability to use technologies that protect their privacy. Moreover, few grasp the fact that by simply “liking” something on Facebook, they may inadvertently reveal their ethnicity, economic status, political views, religious beliefs, mental health or sexual preferences.

Nor do most consumers realize how companies can exploit their data not only to learn more about them, but also to restrict what they may see in return. Facebook’s EdgeRank, for example, is an algorithm that determines whether and where posts appear in its users’ feeds. Yelp has come under fire from several small business owners for allegedly using its algorithm to selectively filter customer reviews. The problem, notes journalist and researcher Doc Searls, is that “by focusing on you, and by getting personal with you, the content managers (these managers are actually algorithms) narrow your view to what they think you should see.”
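Facebook has never published EdgeRank’s exact mechanics, but public descriptions characterize it as weighting a viewer’s affinity for the poster, the kind of engagement a post attracts and how recently it was created. The toy scorer below sketches that general idea – every weight and parameter is invented for illustration, and it is not Facebook’s actual algorithm – yet it shows how a few lines of arithmetic can decide what a user never sees.

```python
# A toy feed-ranking score in the spirit of public descriptions of EdgeRank:
# affinity (closeness of viewer and poster) x engagement weight x time decay.
# All weights and parameters here are invented; the real algorithm is proprietary.
INTERACTION_WEIGHTS = {"comment": 4.0, "share": 3.0, "like": 1.0}

def post_score(affinity, interactions, age_hours, half_life_hours=24.0):
    """Higher scores surface a post in the feed; lower scores bury it."""
    engagement = sum(INTERACTION_WEIGHTS.get(kind, 0.5) for kind in interactions)
    decay = 0.5 ** (age_hours / half_life_hours)   # older posts count for less
    return affinity * engagement * decay

# Two posts competing for the same slot in one user's feed
print(post_score(affinity=0.9, interactions=["comment", "like"], age_hours=2))
print(post_score(affinity=0.2, interactions=["like"] * 5, age_hours=20))
```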

Power Struggle

Circumstances such as these belie the kinds of engagement Big Data are supposed to promote. Indeed, the fact that marketers describe their relationships with consumers in terms of control suggests they are often adversarial affairs. Control is defined as “exercising dominating influence or power over” another. To quote a popular 20th century meme: “information is power;” and those who have it are loath to give it up.

Not surprisingly, the Interactive Advertising Bureau (IAB) has consistently opposed machine-driven “Do Not Track” standards for Internet browsers that could prevent marketers from monitoring people’s online activities. For their part, browser owners Google and Microsoft are among the technology companies that have lobbied against California’s Right to Know Act of 2013, which would require businesses to make customers aware of any personal data they hold or share.

Control vs. Competence

Even so, many companies’ cravings for vast amounts of data do not jibe with their capacity to effectively manage them. More than 70 percent of chief marketing officers surveyed by IBM have admitted to being unprepared to deal with the current data explosion. The company’s research has also concluded that fewer than one-in-four marketers have advanced analytic capabilities. Plus Facebook’s most recent privacy breach once again raises questions about whether any organization that handles so much information can keep it secure.

Accordingly, the average marketer may not be much more savvy about Big Data than the average consumer. The difference, however, is that she has the resources to hire people who are. Which is why traditional marketers are handing over some of the control to third parties that include data scientists, mathematicians and even physicists. At the other end, consumers must still rely primarily on legislators, the courts and the occasional whistleblower.

Meeting Half Way

Ideally, the two sides might someday meet half way and redefine their relationship in terms other than control. To that end, the World Economic Forum issued a report earlier this year based on what it describes as “a nine month, multistakeholder, global dialogue on how the principles for using personal data may need to be refreshed…” Among its recommendations is the necessity to find new ways to engage individuals beyond current notice and consent policies.

It is an ambitious undertaking, and a good first step would be to get all sides on the same page; though that may take a while. In a much cited survey of 409 consumers and 257 marketing executives by the Economist Intelligence Unit and digital marketing firm Lyris, only 23 percent of marketers believe their customers are worried about privacy. That falls far short of the 49 percent of consumers who are “very concerned” about who scrutinizes their online activities and why.

Social Media, Surveillance and Censorship

This piece was first published on June 12 in Social Media Today

Revelations about the National Security Agency’s (NSA) surveillance program have highlighted how much data the government collects on people both here and abroad. At the same time, however, the leaks have eclipsed current events such as those in Turkey, which should remind us just how far some governments will also go to restrict their citizens’ own access to information.

Remnants of the Arab Spring

It has been more than two years since Tunisian street vendor Mohammed Bouazizi set himself on fire after being harassed by municipal officials. In a fateful demonstration of Chaos Theory, videos of subsequent protests spread quickly via Facebook and YouTube, ultimately inflaming much of the Middle East. And though the link between social media and the Arab Spring is still being debated, several governments in the region seem to be convinced.

Not long after Turkey’s besieged prime minister Recep Tayyip Erdogan declared that “social media is the worst menace to society,” police detained 25 people on suspicion of stirring insurrection with Twitter. Saudi Arabia’s Communications and Information Technology Commission has closed access to the popular messaging app Viber, and has threatened to do the same with WhatsApp and Skype. Iran, meanwhile, continues to build its own version of the Internet, with extreme restrictions on how users will be able to connect to the outside world.

Behind China’s Great Firewall

Yet these efforts pale in comparison to the technological dominance of China’s infamous Great Firewall. Earlier this month authorities blocked the encrypted version of Wikipedia ahead of the anniversary of the Tiananmen Square massacre, forcing users to rely on the less secure HTTP version where content can be banned. (Ironically, Chinese scientists have taken the lead in developing a quantum communication system that will make it possible to send completely secure messages anywhere in the world.) China is also alleged to maintain a staff of more than 40,000 censors who regularly monitor its microblogging site Weibo. Each watchdog can scan as many as 50 posts a minute, and it is estimated that as many as 30 percent of all posts can be deleted as soon as they go public.

Unintended Consequences?

What can be more effective than immediately eliminating unwanted information? Preventing it from appearing in the first place, which may be one of the consequences of the NSA disclosures. Some journalists have warned that the Justice Department’s controversial investigation of reporters’ sources is having a “chilling effect” on their work. The same might be expected of lay persons when they realize their emails, posts, chats or comments are being scrutinized.

To be fair, the government is hardly alone in possibly curbing open expression by capitalizing on the vast amounts of data people generate on social sites. A survey by employee intelligence firm HireRight, for example, found that 61 percent of employers either use, or plan to use, social networks to help screen candidates. Such lack of privacy is one reason users are apparently moving off of Facebook, while those who remain are becoming more protective of their identities, according to a seven-year study by Carnegie Mellon University.

Surveillance = Censorship

Like most things in life, managing security – whether national or corporate – and the right to speak freely is a balancing act. The majority of Americans who are not troubled by the NSA leak may not equate surveillance with censorship. But as Frank La Rue, the United Nations’ Special Rapporteur on Freedom of Expression, noted in a report last week, “privacy and freedom of expression are interlinked and mutually dependent; an infringement upon one can be both the cause and consequence of an infringement upon the other.”

Perhaps more than anything else, it is discourse that separates social media from most conventional forms of communication. Take that away and its value is significantly diminished. The ability to gather and analyze data is becoming essential to the operations of both government and business; and to the safety and satisfaction of citizens and consumers. But limiting peoples’ capability to converse – whether before or after the fact – is tantamount to killing the digital golden goose. Thus, discretion should prevail.

That said, the challenge may be even greater overseas. In the marketplace of the future, two things are seemingly inevitable. First, companies will do more business in countries other than their own. Second, they will conduct more of that business across social media. At the end of last December’s World Conference on International Telecommunications in Dubai, 89 nations signed a new treaty that would grant them more authority over citizens’ Internet usage. Among the signatories were up-and-coming economies such as China, Brazil, Russia, Indonesia and South Africa. And although the pact failed to gain support from the majority of member countries, it reflects broad apprehension about digital and social media.

Historical Perspective

The circumstances, however, are not entirely unique. Like the Internet, Gutenberg’s printing press gave rise to myriad new voices and was instrumental in the Reformation and both the American and French revolutions. Just 50 years after the invention of moveable type in 1450, printing offices across Europe were turning out books at an extraordinary rate of 10 million volumes a year. But the backlash was just as dramatic. By the 16th century, the Catholic Church decreed that no book could be printed or sold without its permission; and monarchies throughout the continent placed highly restrictive licensing requirements on all publications.

Nonetheless, books today are everywhere; though some are still banned in parts of the world where social media is censored as well.

Social Media is a Baby Boomer

This piece was first published on May 28 in Social Media Today.

The last of the Baby Boomers turns 50 next year. A generation that once exhorted its contemporaries to not trust anyone over the age of 30 passed that milestone a long time ago. Now they find themselves targets of doubt and skepticism, especially with respect to their ability to handle advanced technologies. But like so many other stereotypes, this one is based on at least two misconceptions. The first is that older workers are largely incapable of effectively adopting and managing new systems such as social media. The second is that social media is, in fact, new.

Older Minds Still Work Fine

During the past several years, numerous studies have challenged the conventional wisdom about the performance of older employees. Work by the Stanford University-based Scientific Research Network on Decision Neuroscience and Aging, for example, has found that older people often make better decisions than younger ones. It seems that when persons age they selectively remember more meaningful information and are more inclined to distinguish between what is important and what is not.

Moreover, according to Barbara Strauch, the New York Times’ deputy science editor and author of The Secret Life of the Grown-up Brain, the middle-aged mind, though slower to assimilate new information, can nonetheless recognize patterns more quickly and reach conclusions more efficiently than can its less mature counterpart. Plus many Baby Boomers have been witness to a seemingly unprecedented explosion of technology since their mid-30s, says Dr. Karen Riggs, a media studies professor at Ohio University and author of Granny @Work: Age and Technology on the Job in America. So they are often more technologically adept than they are given credit for; which gets at the heart of the second myth.

Social is Middle-Aged

Despite various claims that social media is still too new to be fully understood, it has already crossed the half-century mark, having been conceived in 1962 by J.C.R. Licklider in a paper entitled On-Line Man-Computer Communication. As head of the Information Processing Techniques Office at the Department of Defense Advanced Research Projects Agency (DARPA), Licklider was 47 when he championed the principle of social interaction across a network of computers. Seven years later, a team of scientists and engineers took his idea and produced the ARPANET, the precursor of the global Internet.

Its first participants comprised a smattering of individuals at four universities – Stanford, the University of California, Los Angeles (UCLA), the University of California, Santa Barbara and the University of Utah. But as new networks were added, hundreds and then thousands of users followed and began tinkering with the notion of interactivity. In 1972, the first computer-to-computer chat took place at UCLA. A year later the first public bulletin board system (BBS) was established up the coast in Berkeley. And the first dial-up BBS appeared in Chicago during the Great Blizzard of 1978, about the same time CompuServe launched its consumer information service featuring online forums.

The following decade saw the creation of Usenet, the global discussion system; the Whole Earth ‘Lectronic Link (WELL), a popular online hangout for “Deadheads,” avid fans of the jam band the Grateful Dead; and America Online, which by the turn of the century surpassed all other forms of “virtual communities” to become the Internet’s first 800-pound gorilla, with more than 30 million paying subscribers worldwide. By the time Friendster, MySpace, Facebook et al. came along, their founders simply did what innovators have been doing since the origin of invention – extending and building on the work of others.

Innovation is Integration

Innovation is an ongoing process, and augmentation is one of its fundamental principles. It can be decades “between the birth of an idea and when its implications are broadly understood and acted upon,” says Tom Agan, co-founder of the innovation and brand consulting firm Rivia. Hence, Millennials didn’t invent social media any more than Baby Boomers invented sex, drugs or rock and roll. Yet they have significantly advanced and enhanced it; and they will continue to drive future variations.

It would be a mistake, however, to deny a role for their elders. Granted, those with extensive social media expertise may be relatively few and far between, but their knowledge and experience can be extremely valuable. Just as important, there are nearly 100 million Americans over the age of 50 and their numbers are growing, as is their use of social media, particularly on Facebook, which younger users are apparently abandoning. In addition, Boomers control 70 percent of the nation’s total net worth and have more discretionary income than any other age group. Who better then to interact with them across social networks than their peers?

Developing and managing a successful social media strategy is not an either/or competition between generations. Not every 50- or even 60-year old is an antiquated technophobe. Nor is every 25-year-old a savvy digital native. Indeed, some studies have shown that many college students are far too trusting of what appears online, fostering the cliché that “if it’s on the Internet it must be true.” Other research suggests that the real differentiator is not age but income and education.

Thus, organizations will be best served if they look beyond age – or gender, race and ethnicity for that matter – for the most capable individuals. After all, diversity is another fundamental principle of innovation, because the best ideas usually don’t come from any single person or position, but where people and possibilities intersect.

 

Content to Commerce


A Matter of Perspective


A systems approach to communication and complexity

Contents

 Introduction
 A change in the weather
 Countless conversations
 Model behavior
 Organization: Open vs. closed
 Audience: Making an impression
 Environment: It’s chaotic out there
 Channels: Old vs. new
 Embracing the elephant
 Notes

 


Google searches for the term “cloud computing” have increased about 150% since 2010; but have grown more than 200% for “social media,” and nearly 300% for “big data.”

Introduction

There are times when, in trying to explain something complicated and hard to understand, we simply label it “complex” and move on. But it is a mistake to reduce as important a concept as complexity to a throwaway line. A complex system is anything made up of many different parts which interact in often unpredictable and unplanned ways. By that measure just about every social, cultural, political and economic institution is complex, as is our entire natural environment. Nowhere, perhaps, is this more evident these days than in the realm of communications.

Communication has become the epitome of complexity, especially those mechanisms that have caught our attention of late. Social media bring together countless individuals whose unending conversations can beget unexpected results. To capitalize on “big data,” analysts must examine and extract value from the trillions upon trillions of bits of information generated by people and machines. And much of this now occurs in the “cloud,” the universal term for distributed systems that are made possible by the collaboration of vast numbers of computers.

For communication professionals this presents two formidable challenges. The first is to explain ever more complex ideas and issues to broadly diverse audiences. The second is to do so through processes that are, themselves, becoming more complex. In the past we may have handled such matters by dealing with them one at a time. But just as we can’t recognize the workings of an ant colony by scrutinizing just one ant, we won’t uncover solutions to intricate communication problems by zeroing in on a single message, medium or outcome. Instead, we must move in the opposite direction and explore how all elements come together under continually changing circumstances.

This is a systems approach to communication, and it involves understanding situations in terms of their relationships, connections and context. The following report provides an introduction to communication in complex systems and the value of systems thinking. It is the first in a series of articles, blog posts and white papers on the subject.

Yet systems thinking is neither a discipline like public relations or marketing communications, nor a new technique such as social media, big data analytics or cloud computing. Rather, it is a distinct way to consider things with a more open mind. That is essentially a matter of perspective: being able to appreciate information and audiences from various points-of-view; and in the process, practicing all of the above more effectively.


Emergence is the relatively simple interaction of components that creates complex systems which are different from their constituent parts. For example, water (H2O) is the result of the merger of two gases, hydrogen and oxygen.

A change in the weather

In an increasingly complex world communication matters more than ever


Complexity is everywhere. It permeates our existence from the outer reaches of space to the inner workings of our bodies. Our brains, in particular, are highly complex organs. Not only do they manage the billions of electrical connections that keep us alive, but they store the myriad facts, experiences, impressions and memories that variably combine to form ideas. Our ideas, in turn, coincide with those of others to create equally elaborate social, political and economic systems.

Despite their intricacies, most complex systems exhibit common characteristics. Take the weather. Its simple elements – gases, solids and liquids – perpetually collide to produce powerful atmospheric disturbances. Scientists call this “emergence,” whereby the whole is greater than the sum of its parts and cannot be predicted from its individual components. Likewise, cold fronts and warm fronts are systems within systems, each the result of its own combination of meteorological ingredients. And anyone who has witnessed a tornado or hurricane knows they rarely move along straight lines. Such unpredictability is a hallmark of complexity.

As for complex man-made systems, they often exist in the form of networks; the most familiar probably being the Internet, which is actually a network of networks. Every point on a network is known as a node. A web site such as Facebook is a node. So too is every user on Facebook. But what defines these systems is not so much the nodes themselves as how they interact.

Most interactions across networks are based on the flow of information. So understanding how information is produced, shared and perceived – in other words, communication – is vital to operating in complex systems. As systems grow more complex, communication becomes more essential. But it too gets more complicated as audiences fragment, the means to reach them expand and the amount of available data continues to swell.

Organizations, for example, must now produce content in multiple formats – and share it across a wide array of media – to connect with increasingly global stakeholders. These target audiences are dividing and subdividing themselves along geographic, economic, social, cultural, political, gender, age, ethnic and religious distinctions. Such disparities affect what kinds of content they access, how they access it, and how they interpret it.

Consumers are also finding divergent ways to handle information overload. They are abridging their sources of news through tactics like aggregation and personalization, while conversely multitasking their way across more and more media platforms.

Of course none of this happens in a vacuum. Communication is part of larger systems which constantly entangle it in issues and events that wreak havoc on even the most deliberate strategies.

We are, however, developing means to not only become more aware of complexity, but to cope with it as well. The spread of social media, improvements in data analytics and advances in the cognitive sciences are introducing more accurate tools and techniques. We are still learning to use them and, if history is any guide, hype will exceed reality. Nonetheless, they will ultimately prove their worth.

At the same time, we continue to enhance the capacity of traditional media by finding better ways to use text, video, audio and graphic design. Yet simply developing new skill sets will not suffice. Taking full advantage of both contemporary and conventional methods also demands adopting new mind sets. We have to think differently about every part of the communication process.

 

Countless conversations

These days communication professionals must be able to see both the forest and the trees


Communication is a complex system even in its most basic form: conversation. Like all systems, conversations are mutual interactions. A says something to B. B responds. And so begins a dialogue. If they know each other well and the parameters are clearly marked their discourse will go as expected. If not, then it is apt to be unpredictable; possibly uncontrollable; and almost always self-adjusting.

Now multiply that infinitely. With the growing dominance of social media and the sheer amount of data it throws off, the notion of a conversation is becoming ever more complicated. To be sure, no single person or organization can adequately converse with hundreds of friends or millions of followers. But those friends and followers are also nodes on assorted networks. Currently, more than 60 percent of the world’s online population connect through social networks, while 85 percent regularly send and receive emails.1 Thus, the ability to engage in these innumerable exchanges largely defines today’s communication.

Successfully managing such conversations, or any other form of communication, means being able to step back and scan all of the critical components – ideas, issues, audiences and technologies, among others. They will vary depending on the situation. Not everything will be apparent; certainly not right away. And some are bound to change throughout the process. The challenge is to recognize how, when and why the elements intersect, and with that knowledge continually build appropriate strategies and content.

Doing so requires a systems approach. This involves dealing with issues holistically rather than concentrating on their separate parts. By viewing problems in broad context – and observing how the different pieces interact and influence each other – we can look beyond what is immediately obvious to pinpoint all possible causes of a problem, and to anticipate all potential consequences.

Furthermore, thinking systemically entails seeing audiences the way they see themselves so as to identify what kinds of content are important to them. What do they need or want? How much do they already know and understand? What will they do with information once they have it?

Lastly, it means accepting the fact that as circumstances change so do outcomes; and both can be extremely uncertain. Swift and seemingly endless changes make it impractical, if not impossible, to codify communication in hard-bound rules or templates. There are too many variables and too much volatility to effectively keep reapplying even the best practices.

Still, it is quite possible to diagnose problems and determine solutions by understanding them in terms of the systems they create. Only then can we really know how best to communicate.

 

Model behavior

One way to better understand a complex system is to make it appear less complex


It was Albert Einstein who advised that “everything should be made as simple as possible, but not simpler.” Since then we have learned to build models of complex systems that simulate the conditions, operations and interactions among various independent components, or agents. Such “agent-based” models are primarily used in computational research, but simple versions can be developed manually; and while barely as intricate as the actual systems they embody, they can produce blueprints from which to construct practical strategies.

In the case of disciplines like public relations or marketing communications, a workable prototype [Figure 1] can be built around four key agents: the organization, which is often the original source of information; the audience, who access, interpret and act on resulting content; the environment, where issues and events provide requisite context; and the multiple channels of communication.

[Figure 1: A communication model built around four key agents – organization, audience, environment and channels]

As noted previously, the details may change from situation to situation, and just about every new strategy will rely on putting them together in unique combinations. But the model itself offers a manageable framework.
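As a rough illustration of how such a prototype might be exercised, the sketch below encodes the four agents as simple classes and steps them through a single campaign. Every name, probability and threshold is invented for illustration; a working model would need far richer rules, but even this skeleton shows how organization, channels, environment and audience interact to produce an outcome no single agent controls.

```python
import random

# A deliberately simple agent-based sketch of the four-agent model in Figure 1:
# an organization issues messages, channels carry them, the environment adds
# noise, and audience members decide whether the message registers.
random.seed(7)

class Organization:
    def message(self):
        return {"clarity": random.uniform(0.4, 1.0)}   # how well-crafted the message is

class Channel:
    def __init__(self, reach):
        self.reach = reach                             # share of the audience it touches
    def deliver(self, msg, noise):
        msg = dict(msg)
        msg["clarity"] *= (1 - noise)                  # environmental noise degrades clarity
        return msg

class Environment:
    def noise(self):
        return random.uniform(0.0, 0.5)                # competing issues and events

class AudienceMember:
    def __init__(self, attention):
        self.attention = attention
    def receives(self, msg):
        return msg["clarity"] * self.attention > 0.3   # arbitrary threshold for "it registered"

org, env = Organization(), Environment()
channels = [Channel(reach=0.8), Channel(reach=0.3)]
audience = [AudienceMember(random.uniform(0.2, 1.0)) for _ in range(1000)]

reached = 0
for person in audience:
    channel = random.choice(channels)
    if random.random() < channel.reach:                # did the channel even touch them?
        msg = channel.deliver(org.message(), env.noise())
        reached += person.receives(msg)

print(f"{reached} of {len(audience)} audience members registered the message")
```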

 


In his seminal 1945 work The Use of Knowledge in Society, Nobel prizewinning economist Friedrich Hayek was the first to argue that information at the center of organizations is neither as accurate nor as timely as information on the fringes

Organization: Open vs. closed

Communication in organizations is influenced by the way organizations are designed


In business, most communication originates from within organizations, which generally exist in two forms – closed or open. To some degree every organization is closed, in that much of its information remains behind its walls. Though sometimes necessary, it can lead to what psychologist and Nobel prize winner Daniel Kahneman terms the “inside view,” wherein decisions and forecasts are based entirely on specific circumstances and supported only by a company’s own experiences. 2

Case in point: the decision by the Susan G. Komen Foundation to defund Planned Parenthood. Motives aside, the resolution by Komen’s board of directors was apparently derived from a narrow perspective. According to Laura Otten, Director of The Nonprofit Center at La Salle University’s School of Business, the board was filled with family and friends. “When we build a board of people who are alike in terms of background and thinking,” she told the Nonprofit Business Advisor 3 newsletter, “[it] tends to be more about affirming what someone else wants, not about asking the serious questions or having a thorough discussion.” In fact, the board came to its decision despite recommendations to the contrary by the foundation’s professional staff.

Nearly every big enterprise – and lots of smaller ones – is also still hierarchical, so a good deal of their knowledge is further imprisoned within silos. And although information confined to individual departments or operations is easier to coordinate – and therefore more cost-effective – real innovation seldom occurs there. Rather, it emerges at points where separate ideas intersect.

According to research 4 at the University of California, Berkeley, exposure to unfamiliar perspectives fosters creativity. Moreover, debate and criticism do not inhibit imagination but actually stimulate it. Studies 5 of more than 200 public firms in the United Kingdom found that those most able to radically change their entrenched ways of doing business frequently promote creative tension. That is why many resourceful companies encourage employees to expose themselves to a diversity of information, including ideas from the outside. Which makes them open organizations.

Granted, this concept of openness is hardly novel. But when businesses do venture beyond their walls for additional insights they may not go far enough. In a study 6 of the relationship among 43,000 global corporations, systems scientists at the Swiss Federal Institute of Technology identified a network of about 1,300 blue chip companies with interlocking ownership. These include a core group of 147 tightly knit firms, all of whose ownership is held by other corporate members. Most are financial institutions, which may account for why several major American banks made the same ill-fated decision in 2011 to charge customers added debit card fees.

Conformity may be comforting, but it can also be a drawback. Three decades ago Arie de Geus, then Corporate Planning Director at Royal Dutch Shell, sought to answer the question: “What distinguishes long-lived companies?” At the time the average lifespan of Standard & Poor’s 500 firms was about 40 years, and de Geus discovered that long-lived companies (those that survived for as much as a century or more) were sensitive to their environments. Despite changes that surged and ebbed, he later wrote in his book The Living Company 7 , “they always seemed to excel at keeping their feelers out, tuned to whatever was going on around them.” In addition, “these companies were particularly tolerant of activities on the margin: outliers, experiments, and eccentricities within the boundaries of the cohesive firm, which kept stretching their understanding of possibilities.”

For de Geus, the fundamentals of corporate longevity are as valid today as they were back when he wrote those words. Meanwhile, the average S&P 500 lifespan has since plummeted to just 15 years.

This speaks to the mounting reliance on social media as a means to access greater amounts of information; and on the capacity of data analytics to collect and process it. So far the jury is still out on whether social networks are a viable marketing medium. Yet there is a growing body of evidence to indicate that communing with consumers can play a role in enhancing innovation.

A series of related studies 8 in the United States, United Kingdom and Japan found that consumers in these countries come up with untold numbers of ideas that can be used to improve products. This has led researchers to suggest that in lieu of viewing customers only as passive recipients of merchandise, companies should also collaborate with them as significant sources of innovation.

Capturing and capitalizing on those ideas, though, can be problematic. According to a report 9 from the New York American Marketing Association and Columbia Business School’s Center on Global Brand Leadership, the number one obstacle to taking advantage of consumer data is the lack of sharing between departments.

Indeed, internal barriers to communication have long been a complication for most organizations. But should they overcome the problem they must still be able to recognize which ideas – whether from inside or outside – are really valuable. That starts with becoming more sensitive to their audiences.

 


A zettabyte is equivalent to the information contained in 100 million Libraries of Congress. A yottabyte is one thousand times larger than a zettabyte.

Audience: Making an impression

Much of what drives successful diffusion lies outside of our control

For as long as anyone can probably remember the statistical standard for defining societies has been demographics. These generic attributes, ranging from age, gender and race to education, employment and even home ownership, have been used to characterize what social scientists refer to as “representative agents:” factitious persons or groups who typify the behavior of broad swaths of the population. For marketers they may be women 18 to 49; for economists, the one percent; and for politicians, “the American people.”

The problem is, as populations atomize into ever smaller, self-defined segments whose needs and concerns overlap, demography is losing its appeal. In its place some practitioners have turned to psychographics to interpret consumers’ beliefs, personalities and lifestyles. Others though are captivated by the aura of affinity groups that coalesce around shared interests or objectives; and for which they have anointed a new version of the representative agent – the influencer.

In brief, the lore of the influencer goes something like this: certain individuals who are especially authoritative or passionate about a subject garner substantial numbers of friends, followers or connections who value their opinions. Organizations then promote and market themselves through these virtual persuaders to induce desired behaviors on the part of target audiences. Not surprisingly, however, the reality is more complex.

Among the first to challenge this conventional wisdom was Columbia University psychology professor Duncan Watts, who is also a principal research scientist at Yahoo. As early as 2001, he began questioning the notion that one person or a small group can drive collective behaviors online.10 Subsequent findings11 by social platforms Buzzfeed and StumbleUpon assert that when influential people do reach a wide audience their impact is short-lived. Moreover, in every instance the analyses concluded that content is more likely to spread when large numbers of ordinary people share it with small groups of other ordinary people, instead of when it comes from someone “special.”

But Watts has taken it a step further. He maintains that regardless of who the sender is, the flood of ideas will only flow if the receivers comprise a critical mass of easily influenced people, who then pass the information on to other easy-to-influence people. Without them, he notes in his book Everything is Obvious, 12 “not even the most influential individual could trigger any more than a small cascade.” By that reckoning the task before communicators is to identify who is truly impressionable.
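Watts’s argument lends itself to a simple thought experiment. The sketch below is a loose, invented illustration of threshold-cascade models in general – not a reproduction of his research – that seeds a single “influencer” into random networks differing only in their share of easily influenced members, then reports how far the idea travels.

```python
import random

# Toy threshold cascade on a random "who listens to whom" network. A node adopts
# an idea once enough of the people it listens to have adopted it. "Easily
# influenced" nodes need only one adopter; everyone else needs three.
# All parameters are invented for illustration.
random.seed(42)

def cascade_size(n=2000, contacts=6, share_easily_influenced=0.3):
    thresholds = [1 if random.random() < share_easily_influenced else 3
                  for _ in range(n)]
    listens_to = [random.sample(range(n), contacts) for _ in range(n)]
    adopted = [False] * n
    adopted[0] = True                      # a single "influencer" seeds the idea
    changed = True
    while changed:                         # keep sweeping until no one new adopts
        changed = False
        for j in range(n):
            if not adopted[j] and sum(adopted[k] for k in listens_to[j]) >= thresholds[j]:
                adopted[j] = True
                changed = True
    return sum(adopted)

for share in (0.05, 0.20, 0.40):
    print(f"{share:.0%} easily influenced -> {cascade_size(share_easily_influenced=share)} adopters")
```

In runs of this toy model, a population with few easily influenced members typically leaves the idea stalled near its source, while past a critical share the same lone sender can set off a population-wide cascade.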

There is certainly no shortage of material to sort through. To the contrary, website traffic, online searches, banner advertising, social media and smartphone use leave behind vast trails of personal information. It is estimated, for example, that 34,000 tweets are sent every minute. That comes to about one billion tweets per month;13 still far less than the 30 billion pieces of content posted on Facebook. Add to this the endless data churned out by the so-called “Internet of Things” – millions of objects embedded with readable sensors – and it has engendered a new lexicon to calculate it all with terms like gigabyte, petabyte, zettabyte, and most recently yottabyte, which is designated by the number 1 followed by 24 zeroes.

But data is just data without the right tools to analyze it; ergo the current allure of big data analytics. By examining massive amounts of digital information simultaneously across hundreds or thousands of parallel servers, organizations can discover once-hidden patterns in consumer behavior; then try to respond to, and predict, outcomes in real time. As part of this process, companies are also attempting to translate much of this data in ways that will allow them to appreciate consumers in more subjective terms. Beyond simply counting likes, follows or retweets, they hope to uncover and exploit genuine attitudes, emotions and intent.

This sentiment analysis represents a major step forward for social metrics. Nonetheless, it has its limits. For one thing, it has a hard time handling sarcasm or cynicism. For another, persons of various ages, ethnicities, genders and geographies can use the same words differently, which further flusters machines unable to pick up on nuances. Most importantly, people filter their judgments and beliefs through a host of perceptions and cognitive biases that computers alone cannot infiltrate.
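A bare-bones lexicon scorer makes the limitation concrete. The example below uses an invented word list and is not any production sentiment tool: it simply counts positive and negative words, so sincere praise and sarcastic praise earn the same score.

```python
# Toy lexicon-based sentiment scoring: positive word count minus negative word count.
# The word lists are invented; real tools are far richer but share the same blind
# spot when the words and the intent point in opposite directions.
POSITIVE = {"great", "love", "fast", "wonderful"}
NEGATIVE = {"broken", "hate", "slow", "awful"}

def sentiment(text):
    words = text.lower().replace(",", " ").replace(".", " ").split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment("Love this phone, great battery and fast shipping"))        # scores +3
print(sentiment("Oh great, another update. I just love waiting an hour."))  # also scores
# positive (+2), even though a human reader hears the sarcasm immediately
```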

Enter new methods like neuroscience, which melds the study of the brain with fields as varied as computer science, engineering, math, chemistry, physics, psychology and philosophy. But the impact of neuroscience on inferring consumer attitudes and behaviors remains open to debate. In the meantime, the ability to understand how humans process information is still mainly the domain of other humans.

 


In chaos theory, there is the phenomenon known as the “butterfly effect,” whereby a butterfly flapping its wings in China might create tiny changes in the atmosphere that can ultimately alter the weather in New York.

Environment: It’s chaotic out there

What we don’t know can hurt us


The classic science fiction film Alien was originally promoted with the tagline “in space no one can hear you scream.” Not so in cyberspace, where just about everything is audible, which makes it possible for organizations to constantly monitor their environments for any signs of opportunity or misfortune. Scores of communication managers thus believe that, thanks to social media, all they have to do is sit back and passively listen to the conversations of target audiences. With any luck, everything they need to know will eventually cross their paths.

Complex systems, however, aren’t nearly so accommodating. For every situation that arises, there can be multiple causes; some so minute they are, at first, imperceptible. To further confound matters, equally small and barely visible changes can produce an erratic chain of events that ultimately results in problems that are too big to ignore. This is the basis of chaos theory, a branch of mathematics that was highlighted in another science fiction classic, Jurassic Park.
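Chaos theory’s central claim – that differences too small to perceive can snowball into radically different outcomes – is easy to demonstrate with the logistic map, a standard textbook example of a chaotic system. The sketch below is purely illustrative and models no real audience or market.

```python
# The logistic map x -> r * x * (1 - x) in its chaotic regime (r = 3.9).
# Two starting values differ by one part in a million.
r = 3.9
x, y = 0.500000, 0.500001

for step in range(1, 41):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: x = {x:.6f}   y = {y:.6f}   gap = {abs(x - y):.6f}")

# Within a few dozen steps the two trajectories no longer resemble each
# other: an imperceptible difference in causes, a very visible difference
# in effects.
```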

On those occasions when organizations accurately trace the source of their problems, they still may not be able to correctly gauge the outcome, since cause and effect are not always closely related in time or space. It may take a while before the actual consequences are identified, sometimes only after reaching several false conclusions. Under these conditions, successful communication is a process of trial and error.

Even when an organization’s observations are spot on, its reading of a situation may not square with that of its audience. Sundry studies have shown that two or more people can experience the same event and come away with very different impressions. This is a variation of a phenomenon known as “selective perception,” which can sometimes pit companies against consumers.

Consider Netflix’s hapless attempt during the summer of 2011 to raise its prices. While there is probably never a good time to jack up customers’ subscription fees by as much as 60 percent, doing so in the midst of a national debate over the debt ceiling and government default was especially untimely. The political ruckus subjected constituents to a heightened sense of economic uncertainty; and when people are uncertain they are much more resistant to change. So the increase was deemed unacceptable by a great many subscribers.

Meanwhile, Netflix was focused on a separate economic dilemma. Forecasts revealed that its content costs were slated to increase by a billion dollars by the end of the year, making it obvious, at least to the company, that the $8 a month it was charging subscribers could no longer support the ongoing delivery of high-quality streaming video. Hence, the new fees probably seemed quite reasonable to Netflix. But in a series of bewildering explanations, CEO Reed Hastings failed to make the case and was unable to reconcile customers’ concerns with the company’s financial bind.

Hastings might have had more success had both he and his subscribers perceived the big picture and grasped each other’s predicaments. But the truth is that there are a great many things going on in the world, locally, nationally and globally; some quite discernible, the rest initially taking place mostly out of earshot. Simply listening to conversations may not be enough. For example, before the Arab Spring erupted, Middle Eastern governments regularly tracked citizens’ communications. Yet no one foresaw that the death of a vegetable vendor in a Tunisian town would start toppling regional dominoes, some of which are still teetering.

In complex systems, context matters. Just as in journalism, the who, what, when, where, why and how must all be taken into account. Individually, they send very different messages than the story they tell as a whole.

 


The world is home to seven billion people, one third of whom use the Internet. Forty-five percent of Internet users are under 25 years old. ITU ICT Facts and Figures 2011

Globally, one in ten Internet users is a Muslim living in a populous Muslim community. Prof. Philip Howard, University of Washington

Channels: Old vs. new

It’s not an either/or proposition


Versatility is paramount in complex systems because the rules of engagement keep changing. More communication channels are competing for the public’s time and attention than ever before; but it is clearly not a zero sum game. According to Riepl’s Law (coined by German newspaperman Wolfgang Riepl back in 1913), existing media do not disappear when something newer, and possibly better, comes along. Instead, they survive by adopting different formats. Early television was filled with the kinds of situation comedies, variety entertainment and game shows that had previously been staples on radio. As a result, radio became home to talk and a new sensation dubbed “rock-and-roll.” For its part, the Internet has not extinguished any of its predecessors.

What a new medium can gain at the expense of its older competitors is attention. Organizations around the world are presently agog over social media, ascribing to it practically every desirable communication function. Is the adulation deserved? True, more than 60 percent of online users are on social networks. Still, at the end of 2011, some 65 percent of the planet’s population did not yet have regular access to the Internet. On the flip side, nearly 75 percent of global citizens have at least one television set in their homes.14 In the United States, many households now have more TVs than breathing occupants.15

Credit has also been given to social media that may, in part, belong elsewhere. There is little doubt Facebook and Twitter were crucial in helping to organize the efforts that led to the Arab Spring. Yet, as Marc Lynch, the director of the Institute for Middle East Studies at George Washington University, has observed, it was the videos aired on the independent television network Al Jazeera that mobilized the masses into the streets. That said, Lynch advises us to “not think about the effects of the new media as an either/or proposition (‘Twitter vs. Al Jazeera’), but instead think about new media (Twitter, Facebook, YouTube, SMS, etc) and satellite television as collectively transforming a complex and potent evolving media space.”16

That is good advice even for those not engaged in regional insurgencies. But while many organizations are still learning to deliver and measure content across multiple technologies, audiences have come up with a simple solution – they choose the most appropriate medium at any given time. According to the market research company Nielsen (for which this writer was formerly Senior Director of Global Communications), consumers base their decisions on several factors, including convenience, the availability and relevance of content, and the quality of the experience.17

Recognizing these attributes can benefit communicators considerably. For instance, television’s rich and compelling content enables it to capture attention and create awareness even in a cluttered environment. Information online may not be as compelling, but the Internet’s interactive capabilities make it ideal for impelling users to take action. And mobile’s ubiquity and portability, enhanced by an array of applications, make it the most popular media platform on Earth.

Although the passage of time generally favors digital devices, they are unlikely to break Riepl’s Law. Old, new and not yet imagined media will probably continue to co-exist and compete into the foreseeable future. Individuals and organizations must adjust communication strategies accordingly.

 


Dealing with problems in a mechanized way is known as the Einstellung Effect (from the German word for “attitude”), which arises when we rely on solutions that worked before instead of handling each new problem on its own terms.

 

Embracing the elephant

People generally accept information which confirms their beliefs and dismiss ideas that don’t


A timeless testament to the puzzle of complexity is the tale of the blind men and the elephant. In this enduring Indian parable, several sightless wanderers come upon an elephant for the first time; and in trying to determine what it is, each touches a different part of the animal. To one fellow who grabs the trunk, it is like a squirming snake. To another who holds the tail, it is the same as a rope. For a third who falls against its side, it is akin to a wall; and so on with every contact. But not only do their various perceptions lead them to quarrel, they fail to accurately identify the beast.

Over the centuries the allegory has transcended multiple theologies, including Hinduism, Buddhism and Sufism, and has even made its way into modern psychology. In his best-selling book Thinking, Fast and Slow,18 Nobel laureate Daniel Kahneman says that individuals and organizations similarly jump to conclusions on the basis of limited evidence. His acronym for this condition is WYSIATI – what you see is all there is – and it captures the fact that we base our judgments on our experiences, and on the stories we make up to explain them. Unlike the men in the fable, cautions Kahneman, we are not merely blind. “We’re blind to our blindness. We have very little idea of how little we know.”

Nowadays, elephants reside in places like politics and the economy, and the blind include elected officials and the sorts of pundits who populate cable television news. It would be naïve to suggest that everyone is equally right and equally wrong. Life is rarely so finely balanced. But it would be fair to say that we all share common causes for our lack of (in)sight.

Of all the technologies we encounter, few may be more hardwired than the brain, which was initially programmed when our ancestors still dwelled in caves. Consequently, many of the ways we process information are inherently ingrained. When presented with new input, we instinctively decide what to admit and what to dismiss. When someone challenges our most cherished convictions, we double down on our beliefs. This “backfire effect” is the cognitive equivalent of Newton’s third law of motion, to wit “for every action, there is an equal and opposite reaction.”

This is where systems thinking comes into play. It is a more comprehensive way of seeing things. With regard to the communication model, it involves being able to discern the interactions of all four agents. [Figure 2] That is the sweet spot where we can correctly identify the elephant and act accordingly. But getting to that point requires changes in both attitude and approach.


For starters, it is important to remember that complex systems encompass smaller systems, each with its own elements, interactions and emergent properties. For instance, organizations are composed of departments; audiences are becoming more diverse; external events like the global economic meltdown involve so many different agents as to seem almost incomprehensible; and the Internet is an amalgam of practically every medium that has preceded it.

Rather than a set of static, interlocking circles, the communication model is more like four wheels of fortune spinning in separate directions and at different speeds, so that the point of mutual convergence can change from moment to moment. The outcome may not always be an elephant but a different creature altogether, as elegant as a cat or as motley as a platypus.

Imperative, too, is the realization that complexity is nonlinear. As much as we prefer to present ideas in logical sequence – as is the style of this report – complex systems operate through feedback loops. Every decision we make, or action we take, produces new information that may either substantiate or undermine our original assumptions. This makes it difficult to definitively measure outcomes, since every effect can loop back and possibly alter strategies. [Figure 3]


Worse yet, it can lead to what are known as “wicked problems”: situations about which our knowledge is incomplete, contradictory or constantly changing. They are the ultimate elephants, with as many different interpretations as interpreters. What is more, they are so tightly interconnected with other issues that, like a game of whack-a-mole, no sooner do we resolve one than another emerges.

As a result, numerous conventional models of communication no longer measure up to the circumstances they are meant to address; though many practitioners continue to rely on fixed and familiar routines. Why else, for example, would public relations professionals mechanically resort to vintage tools like talking points and Q&As when they have a diminishing impact on an equally dwindling and overworked corps of journalists?

An alternative is a systems approach, which requires that we confront each quandary from a fresh perspective. It is a combination of small steps and big leaps, and some are liable to fail along the way. Still, it can serve as a starting point for new and possibly better means to deal with the accruing complexity of communication. At the very least it is a way of thinking differently. After all, as Einstein once said, “you can’t solve a problem with the same mind that created it.”

 

Notes


1   “Interconnected World: Communication & Social Networking,” Ipsos Press Release (March 27, 2012)
2   D. Kahneman, “Thinking, Fast and Slow” (New York: Farrar, Straus and Giroux, 2011)
3   K. Sullivan, “Susan G. Komen Foundation’s Troubles a ‘Teaching Moment,’” Non-Profit Business Advisor (March 22, 2012)
4   C.J. Nemeth and B. Nemeth-Brown, “Better than Individuals? The Potential Benefits of Dissent and Diversity for Group Creativity,” in P. Paulus and B. Nijstad (eds.), Group Creativity (Oxford: Oxford University Press, 2003)
5   G. Johnson, G.S. Yip and M. Hensmans, “Achieving Successful Strategic Transformation,” MIT Sloan Management Review (March 20, 2012)
6   “Revealed – the capitalist network that runs the world,” New Scientist (October 2011)
7   A. de Geus, “The Living Company” (Boston: Harvard Business Review Press, 2002), 6
8   E. von Hippel, S. Ogawa and J. de Jong, “The Age of the Consumer-Innovator,” MIT Sloan Management Review (September 21, 2011)
9   “Data Marketers Struggle to Link Digital Data to ‘Big Data’ Picture,” eMarketer (March 19, 2012)
10  C. Thompson, “Is the Tipping Point Toast?” Fast Company (February 1, 2008)
11  J. Krawczyk and J. Steinberg, “How Content Is Really Shared: Close Friends, Not ‘Influencers,’” Ad Age (March 7, 2012)
12  D. Watts, “Everything is Obvious” (New York: Crown Business, 2011), 3-29
13  S. Mills, “Big Data: The New Natural Resource,” March 20, 2012, http://asmarterplanet.com/blog
14  “The World in 2011: ICT Facts and Figures,” International Telecommunication Union (2011)
15  “State of the Media: Consumer Usage Report,” Nielsen (2011)
16  M. Lynch, “Tunisia and the New Arab Media Space,” Foreign Policy (January 15, 2011)
17  S. Whiting, “Digital Is Changing the World of Media” (presentation at Peking University, Beijing, April 16, 2009)
18  D. Kahneman, “Thinking, Fast and Slow” (New York: Farrar, Straus and Giroux, 2011), 85-88