Talking Politics is Right for Business

This piece first appeared on November 1st in Medium

I recently came across a post on LinkedIn urging members to refrain from talking politics on the site. The logic is that political discourse has no place on a social network targeting business professionals. Judging from the number of likes and positive comments, it is a popular view, though a seriously myopic one. Conventional wisdom holds that business and politics are the oil and water of free markets — they don’t mix. The uncertainties of Brexit in Great Britain and America’s presidential election are evidence of that. But while they may not coalesce, these two worlds regularly collide, and business people who choose to ignore this fact do so at their own peril.

Consider a study out of Harvard Business School, which concluded that the U.S. political system is the nation’s “single biggest barrier to competitiveness.” Nearly two-thirds of the business leaders queried believe the current political environment is obstructing economic growth, and there is general agreement that the federal government “has made little or no progress on our most important economic policy priorities.” Yet concerns such as these are hardly confined to the United States. According to a recent McKinsey and Company survey on globalization, “executives are likelier than ever to believe that geopolitical and domestic political instability will affect global business and their own companies in coming years.” That is the largest increase in any trend McKinsey has measured in more than a decade.

For those who have been paying attention, this should come as no surprise. Most enterprises are, after all, social systems — heterogeneous aggregations of individuals and groups who both influence, and are influenced by, each other’s behaviors, as well as their shared environments. As such, they are also apt to be hierarchical. Like Russian nesting dolls, they not only contain smaller subsystems, but are also parts of larger social structures. A company, for example, is an association of people who work together for a common purpose. Its subsystems may include finance, marketing, and human resources departments. At the same time it is a subsystem of an industry that, itself, operates within the context of economic, political, and cultural structures. So companies must be prepared to handle all sorts of disruptions that can emerge on various levels.

If all of this seems complex, that is because it is. Social systems are essentially complex systems, which comprise multiple parts that interact in unplanned and unpredictable ways. This was recently the case when Mars, Incorporated, maker of the candy Skittles, and Italian confectioner Ferrero, which manufactures Tic Tac, took to Twitter after each was, through no fault of its own, abruptly drawn into separate controversies surrounding Donald Trump. Indeed, what may disturb some members of LinkedIn is not the actual clashes between business and politics, but the complexity they engender. Such intricacies can be tough to maneuver since they don’t conform to most people’s desire for control, nor to their need for clear lines between cause and effect. Consequently, many businesses try to avoid what they cannot readily manage.

The bad news is they may have little choice but to deal with such matters. The impact of politics, particularly the ugly parts, is getting harder to evade. Social media makes it easy for disgruntled citizens to register outrage and rally like-minded critics around punitive actions. Watchdog groups on both sides of the political spectrum routinely monitor business activity to gauge whether firms are toeing their preferred lines. And there is an assortment of mobile apps for keeping tabs on companies, such as BuyPartisan, which enables consumers to scan products’ bar codes and see which political parties their manufacturers have donated to.

The better news is that recognizing political realities, and at times even embracing them, may not be as onerous as once thought. According to the 2016 Edelman Trust Barometer — an annual audit of institutional credibility — participants believe corporations are more capable than government of keeping up with complex times. Moreover, 80 percent of those surveyed want companies to take the lead in tackling pressing problems. Plus, several studies from Drexel University have found that consumers can be quite tolerant of corporate activism — even when they disagree with it — as long as they believe an organization is upfront and honest about its motives.

But effectively addressing political controversies cannot be achieved by dodging them, since they are not going away any time soon. To the contrary, several prominent Republicans are already signaling that if Hillary Clinton makes it to the White House it will be business-as-usual on Capitol Hill. Additional turmoil arises from the fact that political tensions are tightly linked to social and economic anxieties. Thus, many companies will have to concurrently confront issues of racism, sexism, homophobia, and inequality. And it is only a matter of time before these difficulties are augmented by the rise of machines, the graying of populations, and the warming of the planet.

Granted, it is easier to think of these and similar predicaments as if they are distinct and detached. Yet that is not only naive, it is dangerous. “Global risks are interconnected, and that can create unexpected consequences,” notes the World Economic Forum’s (WEF) latest report on the subject. Such risks include damage to corporate reputations and loss of market share. And with a faster pace of change and more complex interconnections, the WEF warns that “the stakes have never been higher.”

Why Communication Professionals Must Understand Complexity

This piece first appeared on April 5th in Medium


It is fashionable these days to think of communication as both art and science; albeit with emphasis increasingly on the latter. No surprise, what with ongoing innovation in the digital space. Expertise in coding, search engine optimization, and data analysis and visualization, among other areas, is elbowing out more conventional knowledge and skills, further tipping the scale. Yet few communication professionals truly understand one of the most essential sciences: complexity.

Complexity is different from other sciences. In physics and chemistry the object is to examine conditions in isolation in hopes of achieving reproducible outcomes. Complexity, on the other hand, explores systems whose different elements interact in multiple ways, with the realization that certainty is nearly impossible to attain. This is because when even the simplest components converge, what ultimately emerges may be quite unlike its original parts (as when the gases hydrogen and oxygen combine to form water). Moreover, this process is non-linear, so input and output are not proportional to each other. What first appears small and inconsequential can quickly become big and burdensome. For every event that occurs there are any number of possible causes, with no clear link between them. And since such systems learn from, and adapt to, their environments, they frequently change. All of which means they are difficult to control, and their consequences are too precarious to predict.
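The disproportion between input and output described above can be made concrete with a small sketch. This illustration is mine, not the author’s: it uses the logistic map, a textbook toy model in complexity science, to show how two starting points that differ by only one part in a million soon produce trajectories that bear no resemblance to each other.

```python
def trajectory(x, r=4.0, steps=50):
    """Iterate the logistic map x -> r * x * (1 - x), recording every step."""
    xs = []
    for _ in range(steps):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

a = trajectory(0.200000)
b = trajectory(0.200001)  # the input nudged by one millionth
gap = max(abs(p - q) for p, q in zip(a, b))
print(f"largest divergence over 50 steps: {gap:.3f}")
```

A tiny nudge to the input, amplified at every step, ends up dominating the outcome — the nonlinearity the paragraph describes.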

For proof, look no further than the current presidential campaigns. Not only is the nation ostensibly split between two warring political parties, these same bodies are beset by their own discordant and disorderly factions. This is one reason pundits failed to foresee that a reality-TV star billionaire or a self-proclaimed democratic socialist would be taken seriously as a potential commander-in-chief. But politics isn’t the only field to which principles of complexity can be applied. Scientists already employ complexity theory to forecast weather. Epidemiologists have developed means to identify “super spreaders” of disease to prevent or limit contagion. The Financial Stability Board — a panel of central bankers, finance officials, and regulators — has built a framework to audit tightly coupled ties among banks, to gauge risks to the world’s financial system.

Communication is a complex system

As for communication, its association with complexity reaches back more than half a century to the first, and one of the most influential, paradigms of its kind. Originated by Claude Shannon, an engineer and mathematician at Bell Labs, and his colleague Warren Weaver, a scientist and pioneer in complexity theory, the communication model was designed to enable engineers to find the most efficient way to transmit electrical signals from one place to another. Since then it has been adapted to assorted modes of human interaction. Our brains, for example, are highly complex organs that manage countless electrical connections among billions of neurons to keep us alive. Beyond that, they store myriad facts, experiences, impressions, and memories that coalesce to produce ideas. These ideas, in turn, traverse robust digital networks in the form of news, entertainment, social conversations, and data.

Indeed, the proliferation of communication networks has dramatically escalated global interdependence. Today’s most pressing issues, whether immigration, climate change, dysfunctional governments, or a fragile worldwide economy are not only intricate in and of themselves, they are deeply entangled with each other. “Global risks are interconnected, and that can create unexpected consequences,” warns the World Economic Forum (WEF) in its latest report on the topic. “The world has navigated previous eras of profound transitions resulting from converging economic, technological, and geopolitical developments. But with a faster pace of change and more complex interconnections, the stakes have never been higher.” According to the WEF, businesses risk damage to their reputations, loss of market share, and disruption of established models.

If these aren’t enough to compel companies to deal with such problems, growing numbers of consumers demand they do so as well. A recent survey by public relations and marketing agency Cone Communications found that nine in 10 respondents expect firms to do more than simply make a profit. They also want them to address such concerns. Thus, it falls to PR pros and growing legions of content marketers to help stakeholders make sense of a seemingly chaotic world. But this presents two formidable challenges. First, they must comprehend and coherently explain an array of interlocking issues. Second, they have to do this through processes that are, themselves, becoming more complex.

Traditionally, enterprises have relied on linear, or analytical, thinking to diagnose a problem by zeroing in on specific aspects and concocting a strategy to resolve it, assuming the situation is static and merely the sum of its various parts. But successfully managing complex communication entails moving in the opposite direction: stepping back and recognizing what happens when divergent elements come together under constantly changing circumstances. This is complexity, or systems, thinking, which is largely a matter of perception. It is a distinctive way of approaching questions with a more open mind, and requires communication professionals to work at the intersection of diverse knowledge and skills. The following should be on every communicator’s shortlist.

Digitization

Nothing in this new century has engendered as much opportunity, and angst, as the ability to join two simple digits in innumerable combinations. Like pairs of molecules in DNA strands that determine the structure and function of almost all living things, strings of ones and zeroes increasingly dictate the workings of modern communications. Digital systems are taking down existing barriers to entry, forcing formerly detached businesses to directly compete, collaborate, or both in newly defined markets. Where the New York Times, CNN, and National Public Radio once operated in substantially separate domains, they now go head-to-head online, relying on comparable sets of text, sounds, and images. Advertising agencies feel the breath of technology companies, talent firms, and publishers on their necks as those players look to disintermediate them when reaching out to consumers. Something similar is happening on a personal level too, as those in public relations assume roles that were formerly the exclusive province of journalists, while defending their own turf against an onslaught of content marketers.

To stay ahead, individuals and organizations will have to master the latest devices and applications; but that will be a Sisyphean task. “Novel technologies make possible other novel technologies,” says complexity theorist W. Brian Arthur. These beget still newer technologies, and so on. Adds Arthur: “It follows that a novel technology is not just a one-time disruption to equilibrium, it is an ongoing generator and demander of further technologies that themselves generate and demand still further technologies.”

Consequently, competitive advantages gained from any particular tool or technique are likely to be short-lived since digitization allows upstarts to continually link people, processes, and things in entirely new ways. Little wonder that, in a survey of almost 4,000 technology officers in 30 countries, 75 percent of participating CIOs in broadcasting and media named digital disruption as a significant threat; as did 62 percent of those in PR and advertising. Thus, capitalizing on future opportunities will involve more than merely grasping technology. It will warrant deciphering complexity as well.

Data

In the same way astronomers use telescopes to explore the intricacies of the universe, and biologists rely on microscopes to reveal objects once too minute to imagine, data scientists hope to uncover new insights by collecting, dissecting, tabulating, analyzing, and extracting value from trillions upon trillions of bits of information. Complexity, however, isn’t always accommodating. The Internet and the slew of apparatus attached to it have generated massive databases, whether residing on single servers or distributed within “clouds” of computing resources. Every minute of every day, users “like” more than four million Facebook posts; send nearly 350,000 tweets; upload 300 hours of YouTube videos; and download 50,000 apps. What is more, as much as 80 percent of the resulting data is unstructured, meaning it lacks any predefined format. And because data regularly change, scientists have to continuously create new models just to keep up.

Further compounding the situation is the fact that companies capture data from disparate sources. It is estimated that more than 70 percent of firms rely on at least six separate sources, with 23 percent using more than twenty. In a survey of 1,500 global executives by digital consulting agency Bluewolf, roughly half of all participants admitted having a hard time reconciling data with different origins. Worse yet, a study by PricewaterhouseCoopers found that two-thirds of companies obtained little or no tangible benefit from the information they gathered.

Given the speed and perpetual glut of data, communicators must acknowledge that the responsibility for realizing actual benefits is no longer theirs alone, but one they share with machines. There are algorithms that already gather, organize, and analyze data, turning it into rudimentary press releases and quarterly reports. Other programs can automatically edit hours of video footage for images with ideal artistic qualities. Plus, a team of Chinese and American engineers is developing the means to anticipate users’ needs and deliver only the most relevant content. Complex in their own right, algorithms are eminently versatile; so the more they learn, the more they do.

Cognition

To genuinely appreciate what complexity is, it helps to know what it is not. Arcane language like the kind that fills legal documents is not complex. It is detailed and often convoluted, but nothing some good wordsmithing can’t clarify. Neither is a PowerPoint presentation complex, no matter how elaborate its bells and whistles. Every slide is designed, organized, and presented in a manner that is readily controlled and repeatable.

A conversation, however, is complex. Person A says something to person B. B responds, and a dialog begins. If they know each other and the parameters are clearly marked, their discourse may go as expected. If not, it is apt to be unpredictable, possibly uncontrollable, and almost always self-adjusting. Now multiply that several billion fold.

Audiences of all kinds are dividing and sub-dividing into smaller, more distinct groups, each with the capacity to send as well as receive information. “In this digital age almost anyone can be a stakeholder or commentator, both inside and outside their organization,” notes David Broome, an executive director at the VMA Group, whose Business Leaders in Communications Study 2014/15 concluded that the number and complexity of key audiences was the single most important challenge to communicators.

While unbundling audiences can unleash a bounty of new and different ideas, many people choose to shut them out. Engulfed in a deluge of content, they seek to narrow their options, primarily to those world views that confirm their own. Such biases serve as lenses through which to buttress their attitudes and assumptions; and once people have settled on their opinions, they will double down on their decisions if directly challenged.

To penetrate these filters it is critical to understand how different groups access, process, interpret, and use information. Accordingly, advances in the cognitive and behavioral sciences merge disciplines like psychology, sociology, anthropology, philosophy, linguistics, economics, and political science to explore how humans reason and respond in the context of their sundry states of affairs. Incorporating their findings within powerful databases can reveal previously undisclosed patterns, peculiarities, and even sentiments around which to design and deliver more meaningful messages.

While they are at it, communicators should take into account how their own beliefs and experiences cloud their judgments. Another reason journalists were slow to pick up on Donald Trump’s popularity was that few actually knew any Trump followers. “We were not socially intermingled with his supporters and did not listen carefully enough,” concedes New York Times columnist David Brooks, who vows to do better in the future.

Diffusion

As the connectivity within a system increases, it is the relationships between components, rather than the components themselves, that define it as a network; and what then becomes important is how information circulates. Complexity — particularly nonlinearity — is inherent to almost all networks. Though the number of elements grows in linear fashion, the number of connections between them grows far faster: quadratically for pairwise links, and exponentially for the subgroups those links can form. Online ad markets, for example, generate 100 billion impressions daily, with each one measured against as many as 100 separate variables, culminating in countless possible outcomes. But such aggressive increases can reach a point at which the entire system risks breaking down or completely collapsing under its own weight. According to research by Accenture Digital, 50 percent of marketers already deal with more content than they can effectively handle. Even so, 83 percent expect the volume to expand unchecked over the next two years.
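To put rough numbers on that mismatch between elements and connections, here is a back-of-the-envelope sketch (the figures are my illustration, not the article’s): among n elements there are n*(n-1)/2 possible pairwise links, and 2**n possible subgroups.

```python
def pairwise_links(n):
    """Number of distinct pairs among n elements: n choose 2."""
    return n * (n - 1) // 2

def possible_subgroups(n):
    """Every subset of n elements is a potential interacting group."""
    return 2 ** n

# Elements grow tenfold at each step; links grow roughly a hundredfold.
for n in (10, 100, 1000):
    print(f"{n:>5} elements -> {pairwise_links(n):>7} possible links, "
          f"2**{n} possible subgroups")
```

Even for modest networks, the interactions quickly dwarf the parts, which is why connectivity, not component count, drives the complexity.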

This condition is especially dangerous in hyperconnected networks where trivial events can suddenly turn into full-blown crises. Here, problems can spread as easily as solutions — sometimes more so — through processes that are difficult, if not impossible, to stop. To make matters worse, the Internet of Things may soon bring an additional 25 billion devices online at speeds up to 100 times faster than current platforms.

Design

In light of these circumstances, communication professionals are wedged between a rock and a hard place. Complexity doesn’t conform to most people’s desire for control, nor to their need for direct lines between cause and effect. Likewise, managers want assurance that once a plan is in place it can be used repeatedly; hence the popularity of templates and best practices. Nonetheless, communicators have little choice but to at least understand complexity, if not embrace it, to successfully tackle the vicissitudes of 21st century society.

In part, this is because value is shifting from technology — be it hardware, software, or the platforms on which they reside — to how technology interacts with its environment. To that end, complex systems design is about connecting people to people, people to machines, and machines to machines, focusing on both the interactions and their implications.

Additionally, since complex systems comprise multiple interacting elements, they require comprehensive strategies. Bringing together people with varied backgrounds and experiences to consider a problem from diverse perspectives enables them to perceive it in ways no single individual can.

Most important, complex systems are dynamic and forever fluid. “You do not compete against competitors,” says Cisco’s executive chairman John Chambers. “You compete against market transitions.” Therefore, complex strategies must be extremely adaptable. They are patterns of action that evolve from the juncture where an organization’s best-laid plans collide with the ever-changing realities of the marketplace. Essentially ongoing works in progress, they develop in the absence of specific policies, or despite them, and instead require a willingness to put aside assumptions and learn what can actually be achieved in practice. They will often vary depending on the circumstances. Not everything will be apparent; certainly not right away. And they are bound to change throughout the process. But the goal will always be to recognize how a system’s different pieces interact and influence each other; and with that, continually build ever better strategies.

A complex path to the digital future

This piece first appeared on March 15th in ParisTechReview


Scores of businesses have spent much of the past year obsessing about digital transformation. For good reason. According to one survey of 941 leaders across 12 industries, by the Global Center for Digital Transformation, respondents believe roughly four of the top 10 companies in each industry will be displaced by digital disruption within five years. That is a sobering prospect; and yet their apprehension is somewhat misdirected. Granted, the world is being digitized as billions of individuals, organizations, and devices link online. But those same connections are also making life ever more complex, which presents an equally formidable challenge.

There are times when, trying to explain something confusing or hard to understand, we casually label it “complex” and move on. But it is a mistake to reduce such an important concept to a throwaway line. Complex systems comprise multiple interacting parts. When even the simplest components converge, they produce new phenomena that are qualitatively different from their original ingredients, as when the gases hydrogen and oxygen combine to form water. Moreover, the process is nonlinear, so input and output are not proportional to each other. What may first appear small and inconsequential can quickly become big and burdensome. For every event that occurs, there are any number of possible causes, with no clear connections between them. And such systems can learn and change in response to their environments. All of which means they are extremely difficult to control, and their outcomes are often too uncertain to predict.

By these measures, our social, political, and economic institutions have always been complex; as has society as a whole. So what is different now? To start, there are a great many more of us with whom to interact. The planet’s population has grown three times as much over the past century as during the previous nineteen hundred years combined. An increasing majority also live in cities, which by their very nature engender complexity. Further, today’s most pressing issues, whether immigration, climate change, dysfunctional governments, or a fragile worldwide economy, are not only intricate in and of themselves, but are deeply entangled with each other. “Global risks are interconnected, and that can create unexpected consequences,” warns the World Economic Forum’s latest report on the subject. Plus the situation will surely intensify, thanks to our ability to translate just about anything into strings of ones and zeroes.

In a world in which information, capital, and labor are no longer confined by time nor distance, nearly everything has some sort of impact on practically everything else. As traditional barriers to entry crumble, organizations that once operated in separate universes now bump up against each other, competing, collaborating, or both in newly defined markets. For evidence, look no farther than the recent machinations between Ford and Google as to the future of the automobile. The digitization of products and services also enables disrupters to deliver the same or better value while dodging many of the obstacles incumbents had to confront. Thus, it is the tag team of digital and complex systems that is slamming business models and upending corporate cultures. Nowhere more so than in the realm of communications.

Communication is at the core of complexity
“Every time there is an improvement in the technology with which people and ideas come together, major change ensues.” British science historian James Burke uttered those words some thirty years ago, and time has only amplified the process. As the connectivity within systems expands, it is the interactions rather than the parts themselves that ultimately define those systems. Since all complex systems involve the exchange of varied types of information, these interconnections are actually modes of communication, and digital networks are the high-speed conduits through which disparate elements travel in the form of news, entertainment, social conversations, and data. Lots of data. The Internet and the slew of apparatus attached to it have spawned massive databases, whether residing on single servers or distributed within “clouds” of computing resources. Currently, cross-border data flows are 45 times higher than in 2005, and are expected to balloon another ninefold by the end of the decade.

No doubt the rate of “datafication” — converting objects into data form — is accelerating. Much of the transition is being driven by people across social media, soon to be substantially augmented by myriad gadgets arriving via the Internet of Things. Like the telescope, which opened the universe to intellectual exploration, and the microscope that revealed organisms once too minute to imagine, this deluge makes it possible for data scientists to discover new insights by collecting, dissecting, tabulating, and analyzing trillions upon trillions of bits of information.

But more and more detailed data bring with them added complexity, especially as companies seek to extract value from numerous sources. In a study of 1,500 global executives by digital consulting agency Bluewolf, close to half admitted having a hard time reconciling data with different origins. Not surprising considering analysis by the Aberdeen Group and Ventana Research, which found that over 70 percent of firms rely on at least six sources, with some exceeding twenty. What is more, though the number of data sources may grow in linear fashion, how they interact increases exponentially. Online ad markets, for example, are estimated to generate 100 billion impressions daily, with each one measured against as many as 100 separate variables, culminating in countless possible outcomes. And because variables frequently change, so do outcomes, forcing marketers to regularly rework their models.

Indeed, the fact that content traverses networks in various formats — and audiences can receive it when, where, and however they want — is reshaping the entire media industry. Where the New York Times, CNN, and National Public Radio previously operated in ostensibly separate domains, these days they directly compete for attention online, relying on comparable combinations of text, sounds, and images. The same goes for communication professionals, as those in public relations assume roles that were formerly the exclusive province of journalists, while defending their own turf against an onslaught of content marketers.

For their part, consumers are both adapting and contributing to the intricacies of a digital world. Once generally passive and apart from each other, they now readily interact, influencing the manner in which they access, understand, and use information. Cheap yet capable systems have also made it possible for virtually anyone to produce content and distribute it, either on their own or in collaboration with others. The resulting complexity, according to research by corporate and marketing communications recruiter, VMA Group, “is the single most important challenge for the communication function.”

Still, the problem is hardly unique to individuals or organizations. For developed nations, whose economic and political hegemonies are already in question, this liberation of information and communication technologies is further eroding the status quo. At the first BRICS Media Summit, held in Beijing at the end of 2015, leaders of media establishments from the five emerging economies declared they were no longer willing to allow the world’s mass media to be monopolized by the West. They vowed instead to pool their resources and strengthen ties among their countries so they “may come together and tell stories that truthfully reflect BRICS’ cooperation.”

Not all transformations are equal
Digital transformation and the complexity it provokes are subverting conventional business practices, while opening opportunities for aggressive start-ups, upstarts, and veterans alike. But the conversions aren’t occurring in equilibrium. A study out of the Fletcher School, the graduate school of international affairs at America’s Tufts University, found significant differences in how countries digitally adapt. In lieu of a single grand pattern, researchers identified several key drivers — the interplay between supply and demand, institutional environments, and the capacity for innovation — that determine both the direction and momentum of advancement. The extent to which the drivers do or do not correlate dictates whether their trajectories are consistent or nonlinear.

Taking a slightly different tack, an investigation by the McKinsey Global Institute focused on just one location — the United States — gauging the degree to which both corporate America and the general population are moving into the digital age. Dividing the nation into digital “haves” and “have-mores,” it defined the latter as the minority of companies using contemporary means and tools to successfully update their core processes, and as elite workers whose highly marketable skills garner wages well above the national average. After examining more than two dozen key indicators across 30 industries, McKinsey concluded that in the U.S., as elsewhere, progress is uneven, causing a critical imbalance between the most digitized members and the rest of the economy.

Inequality like this “brings a number of potential consequences including the rise of populist politicians, the blocking of innovation and the onset of protectionism and nativism,” cautions Richard Edelman, president and CEO of the communications marketing firm bearing his name. The company’s annual trust and credibility survey uncovered an unprecedented gap between the global elite and the rest of the world’s inhabitants with respect to their confidence in government, non-governmental organizations, business, and the media. Adds Edelman: “the trust of the mass population can no longer be taken for granted.”

Under these circumstances, every institution finds itself firmly wedged between the proverbial rock and hard place. Complexity, after all, doesn’t conform to most people’s desire for control, nor to their need for direct lines between cause and effect. Likewise, managers want assurance that once a strategy is put in place it can be used repeatedly, hence the appeal of playbooks and best practices. When they don’t get that, says Yves Morieux, senior partner and managing director at the Boston Consulting Group, they respond with a “proliferation of cumbersome structures, interfaces, coordination bodies and committees, procedures, rules, metrics, key performance indicators, and scorecards.” This hodgepodge of initiatives, however, can unwittingly aggravate the problem. In a survey by the Economist Intelligence Unit of 331 executives from companies with revenues over $500 million, more than half confirmed that organizational complexity actually lowered profits.

Thinking differently
Some enterprises, on the other hand, are pursuing a considerably different line of attack. While not necessarily embracing complexity, they are learning to adapt to their environments and cope with the uncertainty of constantly changing conditions. The scientific community, for instance, has applied complexity thinking to a range of climate, food, and security matters. Epidemiologists have developed tools to identify super-spreaders of diseases and prevent or limit contagion. Most recently, a group of experts from fields as diverse as physics, sociology, ecology, computer science, public health, economics, and banking co-authored an article urging financial regulators and central bankers to incorporate complexity theory into new models that could enable them to foresee and possibly mitigate future crises.

What these endeavors exemplify is the capacity to tackle the vicissitudes of 21st century life through what are known as emergent strategies: patterns of action that evolve from the juncture where an organization’s best laid plans collide with the realities of the marketplace. Essentially ongoing works in progress, they often develop in the absence of specific policies, or despite them, and instead require a willingness to put aside assumptions and learn what works in practice. Just as important, emergent strategies arise through the convergence of ideas from discrete sources.

This is especially important to communication professionals, for whom the rules of engagement are changing. With the upsurge of social media platforms, communication is becoming more like conversation. Yet conversations are sometimes uncontrollable, often unpredictable, and almost always self-adjusting. In short, they are complex. And because people perceive their lives through multiple lenses, it is critical to recognize how different groups access, process, interpret, and use information. To that end, advances in cognitive and behavioral sciences bring together distinct disciplines like psychology, sociology, anthropology, philosophy, linguistics, economics and political science to explore how humans reason and respond in the context of their sundry states of affairs. Incorporating these variables within powerful databases reveals previously undisclosed patterns, peculiarities, and even sentiments around which to design and deliver more meaningful messages. Yet communicators must understand that the task is no longer theirs alone, but one they will increasingly share with machines.

All technologies are complex
There are algorithms, for example, that already gather, organize, and analyze data that they turn into rudimentary documents such as press releases and quarterly reports. More advanced designs are underway that can automatically edit hours of video footage for images with the ideal artistic qualities. And a team of Chinese and American engineers is developing the means to anticipate users’ needs and deliver only the most relevant content. Little wonder then that, in a survey of more than 3,000 technology officers in 30 countries, 97 percent of participating CIOs in broadcasting and media acknowledged digital disruption as a significant threat, as did 90 percent of their counterparts in public relations and advertising. These innovations and others like them will add still newer layers of complexity.

In fact, complexity is inherent in all technologies because they are basically combinations of elements. “Novel technologies call forth further novel technologies,” notes W. Brian Arthur, an American economist and a pioneer in complexity theory. They, in turn, make possible still newer applications, and so on. Adds Arthur: “It follows that a novel technology is not just a one-time disruption to equilibrium, it is a permanent ongoing generator and demander of further technologies that themselves generate and demand still further technologies.” Accordingly, what it takes to be effectively digitized today won’t be the same, one, two, or five years down the road, rendering digital transformation a moving target.
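
Arthur’s combinatorial view can be made concrete with a bit of arithmetic. The sketch below is a toy illustration, not drawn from the article: if each technology is treated as a non-empty combination of existing components, the space of possible new technologies grows exponentially as components accumulate.

```python
# Toy model of technology as combinations of components (an illustrative
# assumption, not Arthur's actual analysis): a pool of n components
# admits 2^n - 1 possible non-empty combinations.
def possible_combinations(n_components: int) -> int:
    """Count the non-empty subsets of n components."""
    return 2 ** n_components - 1

# Adding components multiplies, rather than merely adds to, the design space.
for n in (10, 20, 40):
    print(f"{n} components -> {possible_combinations(n):,} possible combinations")
```

Each new component doubles the space of possible combinations, which is one way to see why novel technologies call forth further novel technologies faster than any planner can track.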

In time, most businesses will likely learn to cope with the current crop of systems and technologies. Nonetheless, their successes will be short-lived. A lot of the world has yet to come online. The Internet is expected to connect four billion users, and as many as 25 billion devices, before the close of the decade. By then, ultrafast 5G mobile broadband may deliver data and content at speeds possibly up to 100 times faster than current 4G platforms. Add enhancements to machine learning, virtual reality and, on the horizon, biotechnology to the mix, and these vast swarms of elements will interact in trillions of different ways, increasing uncertainty but unleashing a bounty of unpredictable opportunities as well.

Capitalizing on these will require that individuals and organizations do more than merely manage technology. It will demand that they also master complexity. That means being able to step back and consider all of the critical components of a system. They will vary depending on the circumstances. Not everything will be apparent; certainly not right away. And some are bound to change throughout the process. But the goal will be to recognize how all of the different pieces interact and influence each other; and with that knowledge, continually build appropriate strategies.

A modern myth debunked: no, consumers are not in control

This piece first appeared on May 7th in ParisTech Review

A myth, the late British philosopher Alan Watts once wrote, “is an image in terms of which we try to make sense of the world.” To that end, marketers have imagined any number of such illusions to explain the current state of their disrupted profession. The most fashionable of these traverse the Internet as memes. None more so than the fiction that “consumers are in control.” The logic behind this lore goes something like this: digital systems have shifted the balance of power from sellers of goods and services to buyers, who now dictate what, when, and how business gets done. Though this may make sense to marketers, it is news to many consumers. Certainly marketers no longer exercise the leverage they once did. But it is a delusion to assume it has been transferred to consumers; especially in an increasingly complex world where the very notion of control is, itself, becoming a myth.

First, we should pay attention to what consumers themselves say. According to last summer’s Corporate Perception Indicator by U.S. cable television network CNBC and the public relations firm Burson-Marsteller, more than 40 percent of the over 25,000 international consumers queried think corporations have too much influence over their economic futures. In addition, 67 percent of those same respondents—and 66 percent of business executives—believe corporations sway public opinion far more than the public motivates corporations.

The concern is even more acute with respect to privacy, most notably in the United States, where a study by the Pew Research Center found that 91 percent of participants agree consumers have lost control over how companies collect and use their data.

Purchasing vs. subscribing
So why this disconnect between marketers and their intended audiences? One reason is that the former choose to characterize the relationship in terms of control; which implies they see it as adversarial. The dictionary defines control as “having power over someone”; and in an age when information is power, those who have it are loath to give it up.

For instance, when consumers buy digital books from Amazon or music from iTunes, they don’t actually own what they purchase. Rather, they acquire a license agreement that is basically a long-term lease – set forth under the Terms and Conditions – allowing the seller to revoke the privilege and remove the content at its discretion. Apple did essentially that, between 2007 and 2009, when it deleted songs on iPods that had been downloaded from services other than its own. After scanning the devices for items not bought through the iTunes Music Store, the firm required a factory reset, after which all music obtained from rivals inexplicably disappeared.

Companies such as Adobe and Microsoft have taken a somewhat different tack by borrowing a page from content services like Netflix and replacing boxed software with annual online subscriptions, so customers must re-up every year if they want to keep using their programs. There was a time when makers of books, music, and other works had little choice but to package their creations in physical form and sell them to buyers outright. Digital networks have changed that. Producers can now convert owners to perpetual users and extend the value of their intellectual property ad infinitum. Interestingly though, many of these same enterprises balk at any suggestion they pay consumers for the personal data they collect in the process.

More traditional companies too, have tried to enhance their jurisdictions. Last year, U.S. cereal giant General Mills issued a policy informing customers they gave up the right to sue the brand if they engaged in activities like joining its online community or signing up for its email newsletters. The corporation subsequently withdrew the mandate after considerable backlash, but that hasn’t stopped others from attempting to intimidate consumers.

Businesses such as hotels, dentists, and wedding photographers have begun burying non-disparagement clauses within contracts to prohibit customers from writing and posting negative reviews, even if they are entirely true. Southwest Airlines ejected a passenger and his children from a flight when he tweeted about a rude gate agent. It later allowed them back on board, but only after forcing him to delete the post.

To the extent consumers are aware of them, they find these practices offensive, to say the least. Scores of iPod users have filed an antitrust lawsuit that could cost Apple as much as a billion dollars. Moreover, the state of California has made it illegal for companies to retaliate against customers who voice their opinions. Which is why many organizations are taking a more benign approach, adopting the mantra of customer service in the hope that, in the words of the ParisTech Review, “a satisfied customer will naturally become, at virtually no cost, a commercial agent of unequaled efficiency.”

Although this may be true of the most passionate brand advocates, consumers as a whole are unlikely to buy into this type of arrangement. Indeed, a survey by WPP agency Geometry Global has found that 40% of respondents from around the world say they see no point in even friending a brand online; while a majority of those questioned by the PR firm Edelman believe their relationships with companies are “one-sided,” with the latter interacting with them only out of “a desire to increase profits.”

Choice vs. control
Regardless, quality customer service does have merit, particularly when patrons have the advantage of taking their business elsewhere. This is often the case in markets where digital technologies have dismantled barriers to entry and new competitors can come from just about any place. Yet having alternatives is not the same as having control; and when choices are few and far between, there may be little need to curry favor with consumers.

Consider some of the businesses that reside at the bottom of the American Customer Satisfaction Index (ACSI). Among air carriers rated by the World Airline Awards, only one that is native to the U.S.—Delta—has landed in the top 50 (at number 49). The nation’s airlines, however, lead all others in terms of operating income and market value, despite subjecting fliers to more fees and less legroom.

For their part, Comcast and Time Warner Cable are the largest American cable television and Internet service providers, and the worst-rated. But they may still have a way to go before hitting bottom as they attempt to merge, since ACSI data show that such consolidation usually results in even lower customer satisfaction. That said, these companies, like airlines, operate in industries where size matters and the barriers remain fairly high; so there is limited downside to not pleasing users.

In light of these circumstances, by what other measure can marketers ascribe control to consumers? Not their technological prowess. Contrary to conventional wisdom, research over the years has shown that only a minority of people online go public with their opinions, whether negative or positive. And a meager one percent of them generate genuine engagement.

Recent headlines notwithstanding, consumers also lack the skills, means, or inclination to bend a company to their will as hackers did to Sony. The fact is, most still don’t truly comprehend how those little people get inside their television sets, let alone how companies track and exploit their personal information. In surveys dating back to 1999, Dr. Joseph Turow, a professor at the University of Pennsylvania’s Annenberg School for Communication, has consistently found that the majority of Americans have little understanding of techniques like data mining or behavioral targeting, and are without the knowledge to adequately protect their privacy. Which may account for why the tool most often credited with empowering its users – social media – is the ACSI’s fourth lowest-scoring category; and Facebook specifically is the company most feared with respect to privacy.

Complexity vs. control
Granted, marketers no longer exercise the leverage they once did, but it is a delusion to assume it has been transferred to consumers; especially in an increasingly complex world where the very notion of control is, itself, becoming a myth. Complex systems are networks of many different elements that continually interact in unplanned and unpredictable ways. What is more, small or almost invisible occurrences can beget ever larger chains of unexpected events. Thus, as situations change so do outcomes; and both can be extremely difficult to command. Nowhere is this more evident than in today’s global digital marketplace.
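
That sensitivity can be demonstrated with a classic toy model, the logistic map; it illustrates the principle, not any particular market. Two starting points that differ by one part in a billion soon follow entirely different trajectories.

```python
# Logistic map in its chaotic regime (r = 4): a stand-in for any system
# whose elements continually feed back on one another. All values here
# are illustrative only.
def max_divergence(x0: float, dx: float = 1e-9, r: float = 4.0, steps: int = 60) -> float:
    """Run two trajectories from nearly identical starting points and
    return the widest gap that opens up between them."""
    a, b = x0, x0 + dx
    widest = 0.0
    for _ in range(steps):
        a = r * a * (1 - a)
        b = r * b * (1 - b)
        widest = max(widest, abs(a - b))
    return widest

# A one-in-a-billion difference at the start becomes macroscopic.
print(max_divergence(0.4))
```

Within a few dozen iterations the microscopic difference is amplified into a gap of ordinary size, which is precisely why outcomes in such systems are so difficult to command.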

For one thing, marketers are fond of referring to consumers in the form of what social scientists call “representative agents” – factitious persons or groups who typify the behavior of broad swaths of the populace. But as populations atomize into smaller self-defined segments whose interests simultaneously diverge and overlap, such generalities can be misleading.

In truth, there are significant differences among consumers worldwide. Those in North America and Europe, for example, mainly believe big business has too much control over their lives; whereas their counterparts across much of Asia think corporations exercise just the right amount of influence. In developed nations, citizens also tend to be more guarded about their privacy than residents of emerging markets.

But even within the same markets, consumers send mixed signals. According to global IT services firm EMC Corporation, people exhibit what it calls a “We Want It All” paradox. In its study of 15,000 participants in 15 countries, 91% of respondents say they value the benefit of “easier access to information and knowledge” that digital technology affords. Nevertheless, only 27% say they are willing to trade their privacy for the convenience of being online.

Juggling these disparities can be overwhelming, but complicating the issue further is an array of intricate technologies for which many marketers are ill-prepared. As part of its report on the 2014 State of Digital Transformation, analyst firm Altimeter Group determined that 88 percent of companies surveyed had established a formal set of conversion procedures; though 42 percent were investing in their initiatives without really understanding what it is they plan to achieve.

For sundry businesses, the process starts with mastering “big data.” But to capitalize on this phenomenon they must extract and examine trillions upon trillions of bits of information created by people across multiple devices. Add to this the seemingly infinite output generated by the “Internet of Things”—billions of objects embedded with readable sensors—and the task is truly staggering as companies have to constantly update operating models just to keep pace. Furthermore, their cravings for vast amounts of data do not always jibe with their capacity to manage it.

In a survey last year by IBM, 70 percent of chief marketing officers admitted they were unprepared to handle the relentless surge of data. A more recent study by consulting firm Capgemini found that although 74 percent of covered organizations have managed to launch big data projects, just over a third of them describe these endeavors as “successful.” Only eight percent deem their efforts as “very successful.” Plus the more than 42 million security breaches over the past year raise serious questions about companies’ abilities to safeguard what information they gather.

Still, as overpowering as big data may appear to be, it pales in comparison to how much control marketers relinquish to the dominance of algorithms, and to those who can effectively manipulate them.

The power of algorithms
Algorithms are highly sophisticated sets of instructions for solving complex problems in a mere fraction of the time it takes humans to do so. In addition, they are capable of learning from their environments. So they can readily upend even the most meticulous marketing strategies. Google has proven this time and again by altering its search algorithms without warning, and sending numerous web site rankings into freefall.
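
A minimal sketch shows why such changes sting. The pages, factor scores, and weights below are all invented for illustration and bear no relation to Google’s actual formula: a modest shift in how ranking factors are weighted is enough to reverse the order of results.

```python
# Hypothetical ranking data -- every name, score, and weight is invented;
# this is not any real search engine's algorithm.
pages = {
    "site-a": {"keywords": 0.9, "links": 0.2},  # optimized for keywords
    "site-b": {"keywords": 0.3, "links": 0.8},  # rich in inbound links
}

def rank(pages: dict, w_keywords: float, w_links: float) -> list:
    """Order pages by a weighted sum of their factor scores."""
    def score(p):
        return w_keywords * p["keywords"] + w_links * p["links"]
    return sorted(pages, key=lambda name: score(pages[name]), reverse=True)

print(rank(pages, w_keywords=0.7, w_links=0.3))  # site-a ranks first
print(rank(pages, w_keywords=0.3, w_links=0.7))  # the order flips
```

A site that invested heavily in one factor can fall from the top of the results through no action of its own, which is exactly the freefall marketers experience when a platform reweights its formula without warning.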

Facebook too, has fiddled with its formulas. After years of urging brands to aggressively collect “likes,” it has substantially limited their ability to reach fans solely by means of organic marketing. Reducing the types and number of their posts that appear in users’ newsfeeds, the social network is coercing brands into buying ads instead.

Yet even when marketers manage to navigate these obstacles, they still come up against an onslaught of mathematically conceived fraudsters known as bots, which are designed to masquerade as online users. It is estimated that as much as half of all publisher traffic online is in the form of these automated applications; accounting for 11 percent of display ad views, and 23 percent of video ad views. This year alone they are expected to swindle businesses out of more than six billion dollars.

These and comparable challenges pose far greater threats to marketers’ authority than almost anything consumers can conjure up. Yet marketers can hand off some of the responsibility for tackling such matters to expert third parties like data scientists, mathematicians, physicists, and teams of lawyers and lobbyists. Consumers, on the other hand, must rely principally on legislators, the courts, and the occasional whistleblower to look after their interests.

In Europe, officials are drafting a new privacy directive slated for 2016, which will introduce extremely stringent rules on data collection that will apply throughout the European Union. Across the Atlantic, similar efforts have been more equivocal. Although consumer advocates mostly cheered the Federal Communications Commission’s new regulations for net neutrality, the White House’s proposed Consumer Privacy Bill of Rights Act has left many wanting more.

So no, despite what marketers might believe, consumers are not in control. They never have been and probably don’t want to be. It would take too much of their time and effort. What they more likely desire are simple, transparent interactions. The kind where they get what they pay for without having to fight or shill for what is rightfully theirs. Brands that are willing to abandon the meme and acknowledge this mindset will benefit by building relationships on the basis of something other than a myth.

Content, Complexity, and Why Strategies Don’t Work

Strategy works until it doesn’t. So goes the old management saw that is especially relevant of late to content marketers. According to the most recent North American B2B Content Marketing survey, 83 percent of respondents have devised some form of strategy. Yet fewer than one-in-ten believe their efforts are “very effective.” It is not for lack of trying. To the contrary, there have been any number of attempts to define, codify, and document best practices for content strategy; and that is a big part of the problem.

Content has become the lifeblood of modern marketing as brands apply an infusion of trendy techniques. But much of what passes for original thinking these days is generally an operational retread. Content marketing, for example, borrows liberally from public relations. Native advertising replicates past practices of print advertorials and video news releases. And the doctrine of “paid, owned and earned” is rooted in decades-old principles of integrated marketing communications. In each case it is less a matter of reinventing the wheel than simply renaming it. But contemporary labels belie the outdated methods behind them.

The Fallacy of Documented Strategies

Content marketers are endorsing conventional strategy at a time when others have begun to question its value. “In a world where a competitive advantage often evaporates in less than a year,” writes Rita Gunther McGrath, a professor of management at the Columbia Business School, “companies can’t afford to spend months at a time crafting a single long-term strategy.” Nonetheless, many content marketers are doing just that. Browse the myriad how-to articles, books, and blogs online and you are bound to come upon numerous prescriptions for “repeatable frameworks” that can be used, as is, again and again. For its part, the Content Marketing Institute urges brands to “take the time to record your strategy and follow it closely.”

A generation ago, management scholars Henry Mintzberg and James A. Waters coined the term “deliberate strategy” to describe this approach, and little has changed since. Conceived primarily from a company’s perspective – and usually mandated from the top down – such schemes are still designed, developed, and executed with explicit directions; and with the expectation that results will play out as intended. Mintzberg, however, has noted that their success rests on two questionable assumptions: first, that an organization has the most topical information to formulate workable strategies; and second, that the external environment is predictable enough to ensure the strategies remain viable after they are implemented.

Given the unprecedented volume and velocity of new data, marketers must, at the very least, constantly rework their models just to keep pace. Moreover, even the most conscientious among them are unlikely to consistently foresee correct outcomes in a complex world. That is because a complex system is one in which many distinct parts interact in spontaneous and frequently unexpected ways. This is certainly true with respect to marketing, where the interconnection of networks, the proliferation of devices, and the fragmentation of audiences multiply complexity exponentially.

Beyond the Capacity of Mere Mortals

Yet what may confound content marketers most is a mode of strategy far more advanced than their own. Not long ago, content plans mainly favored short prose studded with keywords and targeted by sundry links. That was, until Google’s menagerie of algorithms – Panda, Penguin, and Hummingbird – turned the logic on its head and certain sites saw their search rankings plummet practically overnight. More recently, Facebook and Twitter have fiddled with their own formulas to determine what content their users do and don’t see.

An algorithm, after all, is a calculated series of steps for solving a problem, though in an infinitesimal fraction of the time it takes a person to do so. These increasingly ubiquitous and highly sophisticated instructions are also capable of learning from their surroundings. So whether managing access to information; masquerading as humans in the form of bots; or creating content themselves, algorithms are getting technologically smarter, says physicist Sean Gourley. More so than mere mortals. Furthermore, when processing big data within complex systems they generate surprising phenomena. This is known as “emergence,” and can wreak havoc on the most meticulous methodologies.

Which is why professors Mintzberg and Waters introduced the notion of “emergent strategy.” Unlike its premeditated counterpart, an emergent strategy develops at the juncture where an organization’s objectives collide with the environment’s realities. It is analogous to a conversation, since conversations are sometimes uncontrollable, often unpredictable, and almost always self-adjusting. Consequently, dealing with emergence requires considerable flexibility, enabling strategies to handle ever-changing circumstances.

Buddhists Have a Word for That

Content marketers, too, must be more versatile. Although pundits are fond of exhorting brands to think like publishers, this is dubious advice considering how many publications have fumbled the transition to digital. Rather, marketers ought to adopt a more comprehensive mindset. What they can learn from publishers is how to listen to their audiences and respond accordingly. Yet they can also take their cues from information designers, who manage various platforms and techniques to translate complicated concepts into intelligible ideas. And from data scientists who use metadata to put facts in context to enhance their value. Indeed, there are a host of disciplines from which content marketers can draw inspiration for emergent styles of strategy.

Complex situations call for diverse solutions. Swift and seemingly infinite changes make it impractical, if not impossible, to contain communication systems within hardbound rules or templates. What is more, genuinely successful strategies are rarely transferable because they address unique combinations of people, ideas, and events. Thus, codifying best practices makes it that much harder to learn and innovate.

Instead, content marketers can jettison inhibiting parameters and first think of every strategy at its most basic level: as simply a way to achieve a goal. From there they can proceed with what social scientists call “systems thinking,” and Zen Buddhists refer to as Shoshin or Beginner’s Mind. That is the capacity to set aside assumptions and understand how all of the separate parts of a problem intersect and influence each other. Then they can incorporate whatever components are most appropriate. So no matter how circumstances change, every strategy can effectively adapt.

This piece first appeared on March 3rd in Medium

Content, Complexity, and Why Strategies Don’t Work (original version)

This piece first appeared on March 3rd in Medium

Content Strategy2Strategy works until it doesn’t. So goes the old management saw that is especially relevant of late to content marketers. According to the most recent North American B2B Content Marketing survey, 83 percent of respondents have devised some form of strategy. Yet fewer than one-in-ten believe their efforts are “very effective.” It is not for lack of trying. To the contrary, there have been any number of attempts to define, codify, and document best practices for content strategy; and that is a big part of the problem.

Content has become the lifeblood of modern marketing as brands apply an infusion of trendy techniques. But much of what passes for original thinking these days are generally operational retreads. Content marketing, for example, borrows liberally from public relations. Native advertising replicates past practices of print advertorials and video news releases. While the doctrine of “paid, owned and earned” is rooted in decades-old principles of integrated marketing communications. In each case it is less a matter of reinventing the wheel than simply renaming it. But contemporary labels belie the outdated methods behind them.

The Fallacy of Documented Strategies

Content marketers are endorsing conventional strategy at a time when others have begun to question its value. “In a world where a competitive advantage often evaporates in less than a year,” writes Rita Gunther McGrath, a professor of management at the Columbia Business School, “companies can’t afford to spend months at a time crafting a single long-term strategy.” Nonetheless, many content marketers are doing just that. Browse the myriad how-to articles, books, and blogs online and you are bound to come upon numerous prescriptions for “repeatable frameworks” that can be used, as is, again and again. For its part, the Content Marketing Institute urges brands to “take the time to record your strategy and follow it closely.”

A generation ago, management scholars Henry Mintzberg and James A. Waters coined the term “deliberate strategy” to describe this approach, and little has changed since. Conceived primarily from a company’s perspective – and usually mandated from the top down – such schemes are still designed, developed, and executed with explicit directions; and with the expectation that results will play out as intended. Mintzberg, however, has noted that their success rests on two questionable assumptions: first, that an organization has the most topical information to formulate workable strategies; and second, that the external environment is predictable enough to ensure the strategies remain viable after they are implemented.

Given the unprecedented volume and velocity of new data, marketers must, at the very least, constantly rework their models just to keep pace. Moreover, even the most conscientious among them are unlikely to consistently foresee correct outcomes in a complex world. That is because a complex system is one in which many distinct parts interact in spontaneous and frequently unexpected ways. This is certainly true with respect to marketing, where interconnected of networks, the proliferation of devices, and fragmenting audiences multiply complexity exponentially.

Beyond the Capacity of Mere Mortals

Yet what may confound content marketers most is a mode of strategy far more advanced than their own. Not long ago, content plans mainly favored short prose studded with keywords and targeted by sundry links. That was, until Google’s menagerie of algorithms – Panda, Penguin, and Hummingbird – turned the logic on its head and certain sites saw their search rankings plummet practically overnight. More recently, Facebook and Twitter have fiddled with their own formulas to determine what content their users do and don’t see.

An algorithm, after all, is a calculated series of steps for solving a problem; though in an infinitesimal fraction of the time it takes a person to do so. These increasingly ubiquitous and highly sophisticated instructions are also capable of learning from their surroundings. So whether managing access to information; masquerading as humans in the form of bots; or creating content themselves, algorithms are getting technologically smarter, says physicist Sean Gourley. More so than mere mortals. Furthermore, when processing big data within complex systems they generate surprising phenomena. This is known as “emergence,” and can wreck havoc on the most meticulous methodologies.

Which is why professors Mintzberg and Waters introduced the notion of “emergent strategy.” Unlike its premeditated counterpart, an emergent strategy develops at the juncture where an organization’s objectives collide with the environment’s realities. It is analogous to a conversation, since conversations are sometimes uncontrollable, often unpredictable, and almost always self-adjusting. Consequently, dealing with emergence requires considerable flexibility, enabling strategies to handle ever-changing circumstances.

Buddhists Have a Word for That

Content marketers, too, must be more versatile. Although pundits are fond of exhorting brands to think like publishers, this is dubious advice considering how many publications have fumbled the transition to digital. Rather, marketers ought to adopt a more comprehensive mindset. What they can learn from publishers is how to listen to their audiences and respond accordingly. Yet they can also take their cues from information designers, who manage various platforms and techniques to translate complicated concepts into intelligible ideas. And from data scientists who use metadata to put facts in context to enhance their value. Indeed, there are a host of disciplines from which content marketers can draw inspiration for emergent styles of strategy.

Complex situations call for diverse solutions. Swift and seemingly infinite changes make it impractical, if not impossible, to contain communication systems within hardbound rules or templates. What is more, genuinely successful strategies are rarely transferable because they address unique combinations of people, ideas, and events. Thus, codifying best practices makes it that much harder to learn and innovate.

Instead, content marketers can jettison inhibiting parameters and first think of every strategy at its most basic level: as simply a way to achieve a goal. From there they can proceed with what social scientists call “systems thinking,” and Zen Buddhists refer to as Shoshin or Beginner’s Mind. That is the capacity to set aside assumptions and understand how all of the separate parts of a problem intersect and influence each other. Then they can incorporate whatever components are most appropriate. So no matter how circumstances change, every strategy can effectively adapt.

No, Consumers Are Not In Control (Short Version)

This piece first appeared on February 17th in Pulse

A myth, the late British philosopher Alan Watts once wrote, “is an image in terms of which we try to make sense of the world.” To that end, marketers have imagined any number of illusions to explain the current state of their disrupted profession. The most fashionable of these traverse the Internet as memes. None more so than the fiction that “consumers are in control.”

The logic behind this lore goes something like this: digital systems have shifted the balance of power from sellers of goods and services to buyers, who now dictate what, when and how business gets done. And though this may make sense to marketers, it is news to many consumers.

According to last summer’s Corporate Perception Indicator survey by CNBC and public relations firm Burson-Marsteller, nearly half of American respondents believe corporations have too much influence over their economic futures. This is particularly true among millennials, 40 percent of whom see big business as something to fear. The concern is even more acute with the issue of privacy. A study by the Pew Research Center found that 91 percent of participants agree that consumers have lost control over how companies collect and use their personal information.

Information Is Still Power

So why this disconnect between marketers and their intended audiences? One reason is that the former choose to characterize the relationship in terms of control; which suggests they see it as adversarial. The dictionary defines control as “having power over someone”; and in an age when information is power, those who have it are loath to give it up.

For instance, when consumers buy digital books from Amazon or download music from iTunes, they don’t actually own what they purchase. Rather, they acquire a license agreement, which is essentially a long-term lease allowing the seller to revoke the privilege and delete the content at its discretion. Companies like Adobe and Microsoft take a somewhat similar tack by replacing boxed software with annual online subscriptions, so customers must re-up every year if they want to keep using their programs.

More traditional retailers, too, have tried to expand their jurisdiction. Last year, cereal giant General Mills issued a policy informing consumers they gave up the right to sue the brand if they engaged in activities like joining its online community or subscribing to its email newsletters. (The firm has since withdrawn the mandate.)

In The Service Of The Customer?

To the extent consumers are aware of them, they find these practices offensive; which is why many enterprises take a more benign approach, adopting the mantra of customer service in the hope that, in the words of the ParisTech Review, “a satisfied customer will naturally become, at virtually no cost, a commercial agent of unequaled efficiency.”

While this may be true of the most passionate brand advocates, most consumers are unlikely to buy into this type of arrangement. Indeed, a survey by WPP agency Geometry Global found that more than half of respondents in this country said they saw no point in even friending a brand online; while seven-in-ten consumers queried by the PR firm Edelman believe companies interact with them only out of “a desire to increase profits.”

Nonetheless, quality customer service does have value, particularly when patrons have the advantage of taking their business elsewhere. This is often the case in markets where digital technologies have dismantled barriers to entry and new competitors can come from just about anywhere. Yet having alternatives is not the same as having control; and when choices are few and far between, there may be little need to curry favor with consumers.

Consider some of the businesses that reside at the bottom of the American Customer Satisfaction Index (ACSI). America’s airlines, for example, are the most profitable in the world despite subjecting fliers to more fees and less legroom. Comcast and Time Warner Cable are the nation’s largest cable and internet service providers, and the worst-rated. These companies operate in industries where size matters and the barriers remain fairly high.

Looking In The Wrong Direction

Granted, marketers no longer exercise the leverage they once did, but it is a delusion to assume it has largely been transferred to consumers. Despite conventional wisdom, only a minority of people online ever go public with their opinions, whether negative or positive. A mere fraction of those have the skills, means, or inclination to bend a company to their will as hackers did to Sony. The fact is, most consumers still don’t truly understand how those little people get inside their television sets, let alone how sophisticated algorithms track and exploit their personal data. Which may account for why the tool most often credited with empowering its users – social media – is the ACSI’s fourth lowest-scoring category.

Thus, marketers would be better served – as would their customers – if they abandoned the meme and turned their attention in other directions. They already relinquish considerable control to the likes of Facebook and Google, which continually alter the rules that determine what consumers do and don’t see. Moreover, brands must combat the seemingly endless onslaught of fraudsters – both human and robotic – which cost them as much as $6 billion last year alone.

These and comparable challenges pose far greater threats to marketers’ authority. Consumers, on the other hand, don’t exert such dominance. They never have and probably don’t want to. It would take too much of their time and effort. What they more likely desire are simple, transparent interactions. The kind where they get what they pay for without having to fight or shill for what is rightfully theirs. Brands that are willing to acknowledge this mindset will benefit by building relationships on the basis of something other than a myth.

The Conflicting Science Behind Viral Headlines

This piece first appeared on February 6th in icrunchdata news

There has been some good news and more bad news of late for journalists. A study of the state of the news media by Michael Mandel, the former chief economist at Business Week, found that the number of reporters, correspondents and analysts in 2013 was on the upswing. Editors, on the other hand, keep disappearing. Although Mandel offers no explanation for why fewer traditional blue-pencilers are processing content before it goes online, one likely reason is the belief that technology can do it at least as well, if not better.

During the past decade, publishers and a growing contingent of marketers, have regularly relied on devices like keywords to enhance their content. By gaming search algorithms through various SEO tactics, they have sought to make it easier to find. But as audiences migrate to social media and search engines change the rules, content providers of all stripes have been forced to look elsewhere for new ways to entice readers.

Enter clickbait headlines. These catchy and often upbeat titles are frequently framed as questions, or offer lists of easy-to-read tools, tips and advice (e.g., “12 Roles Essential to the Future of Content Marketing”). For critics, however, a more apt label might be bait-and-switch headlines, since such banners at times promise more than their accompanying articles deliver. Moreover, what is served up instead may actually be inaccurate or totally misleading. Yet that hasn’t deterred content producers who have utterly bought into the concept, since this style of rubric is decidedly viral. And they believe they have the science to prove it.

For instance, a survey of 7,000 New York Times articles by two professors from The Wharton School at the University of Pennsylvania found that stories which evoke “high-arousal positive” emotions are, not surprisingly, more readily shared than those which produce “low-arousal, or deactivating” sentiments. Likewise, research published in the journal Social Influence determined that question headlines are, on average, as much as 175 percent more successful in generating readership than declarative captions.

For their part, so-called “listicles” are to the human brain what keywords are to algorithms. Cognitive scientists have shown that lists are able to garner attention because numbers more easily catch the eye in a field of words; because the mind prefers to process information spatially; and because lists save time and effort in delineating and categorizing key ideas. Such benefits are hard to refute, especially when a list-heavy site like Buzzfeed attracts an audience fourfold greater than the New York Times.

Nonetheless, what works for content providers may not necessarily serve the best interests of consumers. Headlines that belie the information within a story can leave readers frustrated, confused and ultimately distrustful of a source if it continually practices the technique. What is more, principles of cognition also raise doubts about the efficacy of lists.

Until recently, conventional scientific wisdom held that working memory – the mind’s ability to remember multiple bits of transitory information – leveled off at about seven items. But after applying more rigorous mathematical models, newer research has winnowed the number down to three or four. That makes it difficult for readers to take advantage of headlined articles that offer 10, 20 or 50 suggested solutions to a problem. And the task becomes still more daunting when trying to navigate several similar headlines on the same web site.

Recollection and, more importantly, comprehension suffer even more when addressing highly complex issues such as technology, marketing, management or finance. According to David Rock, author of Your Brain at Work, relational complexity studies – which examine the difficulty of a task based on the number of variables – have repeatedly shown that fewer variables result in better decisions. Indeed, the optimal number of different thoughts the mind can effectively hold is just one.

Thus, with a viral headline you can, to paraphrase an old proverb, lead a reader to an article, but you can’t make him comprehend it. That takes more than simply stringing together a series of ideas and adding a question mark. Rather, it requires crafting an irresistible header that introduces equally compelling and intelligible information, which can be a win-win for content providers and their audiences.

Article written by Howard Gross for icrunchdata news New York, NY

Technology Forecasts – Hyping the Future

This piece first appeared on December 20th in icrunchdata news

Another year is coming to a close and once again predictions about the future of technology are grabbing headlines. Scores of pundits, prognosticators and high-tech prophets are divining what will be hot – or not – in the days and months ahead. So in keeping with the spirit of the occasion I will offer my own projection. That is, most other predictions will be wrong.

It is not that I’m more prescient than the next person. I just have history on my side. A significant majority of technology forecasts over the past century have missed their marks. Among the most notable were Albert Einstein’s skepticism that nuclear energy was obtainable; Bill Gates dismissing the likelihood of anyone ever building a 32-bit operating system; and countless dire warnings about Y2K.

Certainly, technology seers are not the only crystal ball gazers who under or over-shoot. In one of the most comprehensive tests of prognostic accuracy, University of Pennsylvania professor Philip Tetlock found the average political expert scored barely above a dart-throwing chimpanzee. And during a speech to last spring’s Princeton grads, Federal Reserve Chairman Ben Bernanke admitted that “Economics is a highly sophisticated field of thought that is superb at explaining to policymakers precisely why the choices they made in the past were wrong…about the future, not so much.”

Yet forecasts about technology seem especially prone to miscalculation. There are any number of reasons, from self-interest to self-delusion; from wishful thinking to out-and-out fantasizing. Sometimes it is a matter of timing. New systems arrive too early, only to vanish and reappear years later in some other configuration. Plus a host of economic, social and political issues can skew results. Consequently, much of the speculation about emerging technologies is little more than a pastiche of promises, hopes and, increasingly, hyperbole.

Not surprisingly, hype has become a mainstay of technology forecasting. In an ever-expanding universe of information, attention eclipses accuracy, and exaggeration seems a surefire way to be seen and heard. What is bewildering though is why so many people – including those who should know better – buy into the hoopla, expending substantial time, effort and money in pursuit of the current Holy Grail.

To some degree, it may be a biological response. These are uncertain times and, according to Nobel prize winning psychologist Daniel Kahneman, people tend to be more impulsive and emotional in situations about which they have little knowledge or experience. In other words, they rely on their gut; or on the guts of others whom they perceive as authorities. Moreover, some neuroscientists claim that our brains are designed to actually take pleasure in positive predictions, secreting a suitable chemical whenever we encounter one.

But overreliance on bodily signals can lead some to take unwarranted risks, since our brains are also wired to minimize the mental effort required for deliberate decision-making. Which is why Kahneman believes that reasoning, or “thinking it through,” is a more effective approach to dealing with the future. Many companies, on the other hand, prefer confidence (and too often, overconfidence) to caution, and reward the kinds of aggressive behavior that may ignore uncertainty. Unfortunately, the credence most people put in their own intuition frequently belies its validity.

Despite the fallibility of hyped predictions, there is no shortage of consultants, gurus, PR hacks and industry publications at the ready to boost the “latest-thing.” A decade ago, the practice was dubbed “management fashion,” whereby initiators identified potential trends, developed the appropriate rhetoric and disseminated it to susceptible individuals and organizations. Little has changed since, except perhaps that today’s technology fashionistas are far more adept at appearing to reduce complexities to easily digestible memes.

That, above all else, may be the reason so many predictions go awry. The best hype, after all, is simple and exciting. But complex systems can’t be simplified, as they are comprised of myriad separate parts that interact in unplanned and unpredictable ways. The larger the system, say a global digital network, the more complex it becomes.

Although many forecasters are experts in their particular fields, they don’t necessarily understand complex systems. Thus, they fail to recognize the influence external factors may have on outcomes. What is more, because the distance between cause and effect in such systems can be sizeable, it is tough to determine if and when a prophecy will come to pass. This is generally true of disruptive technologies, which are rarely revolutionary. Rather, they are subversive, slowly becoming apparent only when they are near-irreversible forces in society.

Accordingly, the cost of over-hyped predictions is hard to quantify, but it must surely be considerable; and while there are no standard guidelines for managing them, there are some advisable approaches:

• Consider the source of any prediction, since the rosiest pictures are usually painted by those with the most vested interest.
• Distinguish between what a technology is capable of doing and what users actually need it to do.
• Actively seek other points of view.
• Don’t ignore the possible impact of economic, political or social uncertainties.
• Recognize the durability of existing technologies.
• Give innovations time to integrate themselves into society.

Someday, hyped technologies and techniques such as the cloud, Big Data and predictive analysis may make it possible to successfully foresee the future. But for now, as Microsoft researcher Duncan Watts contends, “The best we can hope for is to correctly predict the probability something will happen.”

Article written by Howard Gross for icrunchdata news New York, NY

Technology is Devaluing the Most Important Currency, Trust

This piece first appeared on October 8th in icrunchdata news

It comes as no surprise that in a global economy, one of the earliest casualties of the U.S. government shutdown was the dollar. In the first hours after the closure, America’s currency hit a one-and-a-half year low as international investors began losing confidence in Washington’s ability to manage its affairs. But ours is also becoming a networked economy where, as Google’s executive chairman Eric Schmidt has put it, “trust is the most important currency.” And that too is being devalued.

Although the current battle across Capitol Hill has eclipsed most other government-related matters, including the ongoing NSA controversy, the latter has already done damage. Within weeks of initial news reports about the agency’s surveillance activities, privacy jitters among Internet users more than doubled to over 50%, prompting no less than Mark Zuckerberg to acknowledge that the credibility of a number of technology firms had taken a hit.

Accordingly, the Information Technology and Innovation Foundation estimates that the annual cost to American businesses could be as high as $35 billion, with much of that coming from losses overseas. Hoping to capitalize on this “trust gap,” several countries are creating national or regional havens for privacy, which would permit data access only through their own domestic technology companies; ostensibly locking out the likes of Google, Facebook and Microsoft.

Yet regardless of how staggering the amount of information the government gathers may appear, it pales in comparison to what Corporate America stockpiles. The marketing technology and services firm Acxiom is amassing a seemingly infinite amount of data. It compiles details on half a billion consumers worldwide, including a majority of American adults. At the same time, it processes more than 50 trillion data interactions a year.

What all of this information is actually worth is open to debate, but even among those who do not worry about how much data is being collected, there are concerns about how well it is being protected. Since 2005, the number of “reported” U.S. security breaches into records containing personal information has ballooned beyond 600 million; nearly twice the population. Some studies have also shown that the decline in trust these break-ins engender exceeds anything experienced as a result of lawful data collection.

Of course, while people may be losing confidence in government and business, they can always count on each other. Or not. Conventional wisdom has it that, with respect to recommendations about products and services, consumers rely on their friends and family first. The Nielsen Company has continually underscored this notion for years through its “Global Survey of Trust in Advertising,” the latest of which shows that 84% of consumers worldwide trust such word-of-mouth guidance. But even the most popular person has a limited number of confidants from whom to garner information and insights, and must, at some point, turn to strangers. The problem is knowing just who – or what – those strangers really are.

Recently, New York regulators cracked down on 19 companies providing phony online reviews for businesses. In what can be deemed “digital sweatshops,” they were paying reviewers from Bangladesh and the Philippines a dollar per faux rating. Similar firms that mass produce Facebook “likes” pay their workers as little as $15 per thousand and Gartner estimates that by next year, 10-15% of all social media reviews will be fake.

Nonetheless, these establishments may not be long for this world. Not because of any legal maneuvers, but because, as in so many other instances, machines are replacing people. According to media monitoring firm Fisheye Analytics, as much as half of all web traffic already comes from algorithms or bots. Using software applications and procedures, hackers can fashion online personas that closely imitate humans. By Twitter’s own count, as many as 10 million users are neither flesh nor blood. The total is higher on Facebook.

Not surprisingly, growing numbers of consumers are losing confidence in online ratings and reviews. In a study by Maritz Research, upwards of 40% of participants indicated they did not trust “most or all” of the content on sites like TripAdvisor, Yelp or Zagat.

To make matters worse, it is not just consumers who are being duped. Digital security company White Ops claims to have found thousands of web sites that use bots to elicit money from advertisers for fraudulent traffic. More sophisticated than many of their consumer-focused counterparts, some can mimic consumer behaviors such as watching videos, clicking on ads and putting items in shopping carts.

Clearly then, advertisers will have to get a lot better at distinguishing legitimate sites from those that unleash scores of web robots. For their part, government and technology companies must prove that concepts like “transparency” are more than mere buzzwords. And consumers will have to become better informed about such issues, though few have either the desire or inclination to do so, especially since they can likely find alternatives to most afflicted ecommerce sites and social networks.

As for the offenders, they have every incentive to continue what they are doing. The cost of building bots is hardly prohibitive, and it is possible to construct thousands in a matter of days. It is also feasible to infect hundreds of computers with malware so they become unwitting distributors. The results are highly lucrative. Research shows that the market for fake Twitter followers alone may be as much as $360 million. Little wonder that the amount of social media spam has skyrocketed 355% during the first half of this year.

Article written by Howard Gross for icrunchdata news New York, NY