Can You Trust Social Media?

Interesting article from the BBC about a project underway to develop a system that will rate information on the Internet as trustworthy or not.

Considering how quickly we get information from the Net and how easy it is to start crazy rumors, manipulate financial investors, or even cause a near panic, it would be good to know whether the source is legitimate and the information has been validated.

Are we simply getting someone mouthing off their opinions or what they think may happen? Or perhaps they are unknowingly spreading false information (misinformation) or even purposely doing so (disinformation)?

Depending on how the Internet is being used, someone may be trying to get the real word out to you (e.g. from dissidents in repressive regimes) or they may be manipulating you (e.g. hackers, criminals, or even terrorists).

A reliable system that tells us whether information being promulgated is good or not could add some credibility and security online.

What if that system though itself is hacked? Then lies can perhaps be “verified” as truth and truth can be discredited as falsehood.

The Internet is dangerous terrain, and as in life in general, it is best to take a cautious approach and verify both source and message.

The next cyber or kinetic attack may start not with someone bringing down the Internet, but rather with using it to sow confusion and disarm the masses with chaos. 😉

(Source Photo: Andy Blumenthal)

Big Data, Correlation Or Causation?

Gordon Crovitz wrote about Big Data in the Wall Street Journal (25 March 2013) this week.

He cites from a book called “Big Data: A Revolution That Will Transform How We Live, Work, and Think,” an interesting notion that in processing the massive amounts of data we are capturing today, society will “shed some of its obsession for causality in exchange for simple correlation.”

The idea is that in the effort to speed up decision processing and decision-making, we will, to some extent or to a great extent, not have the time and resources to apply the scientific method and actually determine why something is happening, but instead will settle for knowing what is happening–through the massive data pouring in.

While seeing the trends in the data is a big step ahead of just being overwhelmed and possibly drowning in data and not knowing what to make of it, it is still important that we validate what we think we are seeing by scientifically testing it and determining whether there is a real reason for what is going on.

Correlating loads of data can make for interesting conclusions, like when Google Flu predicts outbreaks (before the CDC) by combing through millions of searches for things like cough medicine. But correlations can be spurious when, for example, a new cough medicine comes out and people are just looking up information about it–hence, no real outbreak of the flu. (Maybe not the best example, but you get the point.)
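A spurious correlation like this is easy to demonstrate. In the sketch below (all numbers invented for illustration), two series that merely share a seasonal pattern correlate strongly, even though neither causes the other:

```python
import numpy as np

# Two hypothetical weekly series driven by the same winter seasonality,
# plus independent noise. Neither causes the other.
rng = np.random.default_rng(42)
weeks = np.arange(52)
season = np.sin(2 * np.pi * weeks / 52)  # shared seasonal peak

cough_searches = 100 + 40 * season + rng.normal(0, 5, 52)
flu_cases = 500 + 200 * season + rng.normal(0, 25, 52)

# Pearson correlation between the two series
r = np.corrcoef(cough_searches, flu_cases)[0, 1]
print(f"correlation: {r:.2f}")  # strong, yet driven purely by the shared season
```

The high correlation here reflects a common cause (the season), not a causal link between searches and cases, which is exactly why correlation alone is a shaky basis for action.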

Also, just knowing that something is happening like an epidemic, global warming, flight delays or whatever, is helpful in situational awareness, but without knowing why it’s happening (i.e. the root cause) how can we really address the issues to fix it?

It is good to know if data is pointing us to a new reality; then at least we can take some action(s) to prevent ourselves from getting sick or having to wait endlessly in the airport. But if we want to cure the disease or fix the airlines, then we have to go deeper, find out the cause, and attack it–to make it right.

Correlation is good for a quick reaction, but causation is necessary for long-term prevention and improvement.

Computing resources can be used not just to sift through petabytes of data points (e.g. to come up with neighborhood crime statistics), but to actually help test various causal factors (e.g. socio-economic conditions, community investment, law enforcement efforts, etc.) by processing the results of true scientific testing with proper controls, analysis, and drawn conclusions.
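As a minimal sketch of that kind of causal testing (all figures invented): suppose a community investment program ran in some neighborhoods and not others, and we want to know whether the difference in crime incidents is a real effect rather than noise. A simple two-sample test on the means gets at this:

```python
import numpy as np

# Hypothetical monthly incident counts: 200 neighborhood-months without
# the program (control) and 200 with it (treatment). Numbers are invented.
rng = np.random.default_rng(0)
control = rng.normal(50, 8, 200)
treatment = rng.normal(45, 8, 200)

# Welch's t-statistic for the difference in means
diff = treatment.mean() - control.mean()
se = np.sqrt(treatment.var(ddof=1) / 200 + control.var(ddof=1) / 200)
t = diff / se
print(f"t-statistic: {t:.2f}")  # a large |t| suggests a real effect, not noise
```

In practice this requires proper controls (random or matched assignment of neighborhoods) so that the difference can be attributed to the program rather than to pre-existing socio-economic conditions.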

Decoding Decision-Making

Decision-making is something we have to do every day as individuals and as organizations, yet often we end up making some very bad decisions and thus some costly mistakes.

Improving the decision-making process is critical to keeping us safe, sound, and stably advancing toward the achievement of our goals.

All too often decisions are made based on gut, intuition, politics, and subjective management whim. This is almost as good as flipping a coin or rolling a pair of dice.

Disciplines such as enterprise architecture planning and governance attempt to improve on the decision-making process by establishing a strategic roadmap and then guiding the organization toward the target architecture through governance boards that vet and validate decisions based on return on investment, risk mitigation, alignment to strategic business goals, and compliance to technical standards and architecture.
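One common way governance boards make such vetting concrete is a weighted scoring model. The sketch below is purely illustrative; the criteria follow the ones named above, but the weights and the 0–10 rating scale are invented assumptions:

```python
# Hypothetical weights for the vetting criteria a governance board
# might use; these values are invented for illustration.
WEIGHTS = {
    "roi": 0.35,
    "risk_mitigation": 0.25,
    "strategic_alignment": 0.25,
    "standards_compliance": 0.15,
}

def score_investment(ratings: dict[str, float]) -> float:
    """Weighted average of 0-10 ratings across the vetting criteria."""
    return sum(WEIGHTS[k] * ratings[k] for k in WEIGHTS)

# Example proposal rated by the board on each criterion
proposal = {
    "roi": 8,
    "risk_mitigation": 6,
    "strategic_alignment": 9,
    "standards_compliance": 7,
}
print(f"score: {score_investment(proposal):.2f}")
```

The point of such a model is less the arithmetic than the discipline: every proposal is rated against the same explicit criteria instead of management whim.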

In essence, decisions are taken out of the realm of the “I think” or “I feel” phenomenon and into the order of larger group analysis and toward true information-based decision-making.

While no decision process is perfect, the mere presence of an orderly process with “quality gates” and gatekeepers helps to mitigate reckless decisions.

“Make Better Decisions,” an article in Harvard Business Review (HBR), November 2009, states, “In recent years, decision makers in both the public and private sectors have made an astounding number of poor calls.”

This is attributed to two major drivers:

Individuals going it alone: “Decisions have generally been viewed as the prerogative of individuals-usually senior executives. The process employed, the information used, the logic relied on, have been left up to them, in something of a black box. Information goes in [quantity and quality vary], decisions come out—and who knows what happens in between.”

Non-structured decision-making processes: “Decision-making has rarely been the focus of systematic analysis inside the firm. Very few organizations have ‘reengineered’ the decision. Yet there are just as many opportunities to improve decision making as to improve other processes.”

The article’s author, Thomas Davenport, who has a forthcoming book on decision-making, proposes four steps (four I’s) organizations can take to improve this process:

Identification—What decision needs to be made and which are most important?

Inventory—What are the factors or attributes for making each decision?

Intervention—What is the process, roles, and systems for decision-making?

Institutionalization—How do we establish sound decision-making on an ongoing basis through training, measurement, and process improvement?

He acknowledges that “better processes won’t guarantee better decisions, of course, but they can make them more likely.”

It is interesting that Davenport’s business management approach is so closely aligned with IT management best practices such as enterprise architecture and capital planning and investment control (CPIC). It shows that the two disciplines are in sync and moving together toward optimized decision-making.

One other point I’d like to make is that even with the best processes and intentions, organizations may stumble when it comes to decision-making because they fall into various decision traps based on things like: groupthink, silo-thinking and turf battles, analysis paralysis, autocratic leadership, cultures where employees fear making mistakes or where innovation is discouraged or even frowned upon, and various other dysfunctional impediments to sound decision-making.

Each of these areas could easily be a discourse in and of itself. The point, however, is that getting to better decision-making is not a simple thing that can be achieved through articulating a new process or standing up a new governance board alone.

We cannot delegate good decision-making or write a cursory business case and voila, the decision is a good one. Rather, optimizing decision-making processes is an ongoing endeavor and not a one-time event. It requires genuine commitment, participation, transparency, and plenty of information sharing and collaboration across the enterprise.

A Healthy Dose of Skepticism

To believe or not to believe, that is the question.

Very frequently in life, we are confronted with incomplete information, yet by necessity, we must make a decision and act—one way or the other.

Sure, we can opt to wait for more information to trickle in, but delay in action can mean lost opportunity or more painful consequences.

The turmoil over the last couple of years in the financial markets is one example. Many people I talk with tell me they felt something was going to happen – that the stock market was heading off a cliff. Those who acted on what they perceived, and moved their financial assets to safety (out of the stock market), are certainly glad they did. Those who hesitated and waited for the painstaking days of 500 point drops wish they had acted sooner.

In relationships, sometimes those who fear commitment and don’t make a move to “tie the knot” may end up losing the ones they love to the arms of another. On the other hand, making serious commitments before really knowing another person can end up in painful heartache and divorce.

Similarly, with technology, we often hear about the “first-mover advantage.” Those who recognize the potential of new technology early and learn how to capitalize on it can gain market share and profitability. Those who jump aboard only once the train is moving may end up just trying to keep up with the Joneses.

These days, not only must we make decisions based on incomplete information, but sometimes we don’t even know if the situations we face are actually real!

Daniel Henninger, writing in last Thursday’s Wall Street Journal (22 October 2009) wrote: “The ‘balloon boy’ floating over Colorado last week got me thinking of…[Air Force One] photo-op airliner flying around lower Manhattan last April at 9/11 altitude…it’s getting harder to know what’s real and unreal in a world that always seems to be slipping slightly out of focus.”

He reminds us of the “sensation” that Orson Welles caused in 1938 when he announced, on a “‘reality’ radio broadcast,” a Martian invasion. The response was widespread panic.

With all the chicanery and half-baked ideas and products and services being marketed and sold—whether to get on reality shows or for salespeople to “hit their numbers”—people have become cynical about everything and everybody. Mistrust is no longer just for New Yorkers.

The realization has hit us that most things we confront are not simple fact or fiction, but rather shades of gray, and so we shy away from taking a definitive stand, preferring instead perpetual limbo or “analysis paralysis.”

This cynicism was embodied in a CFO I used to work for who was mistrustful of everyone—almost habitually, he called people inside and outside the organization “snake oil salesmen.”

In corporate America, we often call the art of shaping people’s perceptions “marketing” or “branding”. In politics, we typically call it “spin”. And a good marketer, or “spinmaster,” can overcome people’s skepticism and actually influence their perception of the truth.

As IT leaders, though, we can deal with incomplete information as well as with “spinmasters” effectively. And that is by treading carefully, gathering the facts, and testing reality. This is what market and competitive analysis, pilots, and prototypes are all about. Test the waters before making a full forward commitment.

We can’t be swayed by emotion, but rather must vet and validate—do due diligence, before we either duck out of the fray or leap into action.

Leadership lesson: Act too hastily, and you risk making serious mistakes. React too slowly or with too much skepticism, and you will not be leading but following, and your legacy will be truly unimpressive. Listen to what others have to say, but make your own call. That is what true leadership, in an IT context or anywhere, is all about.

A Postmodernist Approach to Enterprise Architecture

It’s a crazy world out there. Just about every argument has a counter-argument, every theory a contrary theory, every point a counter-point: whether we are watching the Presidential debates, open sessions on Capitol Hill or at the United Nations, the Supreme Court or Court TV, bull or bear calls on the stock market, religious and philosophical beliefs, and more. Even what some deemed scientific “facts” or religious doctrine are now regularly debated and disputed. Some examples: Creation or “Big Bang,” global warming or cyclical changes in the Earth’s atmosphere, peak oil theory or ample supplies, right to life at conception or at some point of maturation, cloning and stem cells or not, criminal punishment or rehabilitation—it seems like there is no end to the argument.

So what are we to believe?

For those who are religious, the answer is simple: you believe what you have been taught and accept as the “word of G-d.” This is straightforward. However, the problem is that not everyone adheres to the same religious beliefs.

For those who believe in “following their hearts,” gut, or intuition, we have very subjective belief systems.

For those who only “believe what they see” or what is “proven,” we are still left with lots of issues that can’t be proved beyond a doubt and are open to interpretation, critical thinking, teachings, or which side of the bed you woke up on that day.

Fortune Magazine, 27 October 2008, has an article on Saudi Arabia’s Aramco, “the state-owned national oil company…its daily profit is 2½ times that of Exxon Mobil, the world’s most profitable publicly held company.” Aramco’s annual report contains details about “daily oil and natural gas production, the number of wells it drills, exports, refinery outputs, and more. The only problem: Since it contains no audited financial numbers, it’s hard to know what to believe.”

Similarly, “it warns readers not to listen to scaremongers spreading the ‘misperception’ that future supplies may not meet global demand. The world’s resources, it says, are sufficient for well over a century—and when technological advances are factored in, for another 100 years after that.”

It’s hard to know who and what to believe, because usually those presenting “a side” have a vested interest in the outcome of a certain viewpoint. Hence all the lobbyists in Washington!

For example, Aramco and OPEC have a vested interest in everyone believing that there is plenty of oil to go around, staving off research and development into alternative energy sources. At the same time, those who preach peak oil theory may have an interest in this country’s energy independence, or they may just be scared (or they may be right on!).

From an enterprise architecture perspective, I think the question of getting to the truth is essential to us being able to plan for our organizations and to make wise investment decisions. If we can’t tell the truth of a matter, how can we set a direction and determine what to do, what to invest in, and how to proceed?

Of course, it’s easy to say trust but verify, validate from multiple sources, and so on, but as we saw earlier not everything is subject to verification at this point in time. We may be limited by science, technology, oppositional views, or our feeble brain matter (i.e. we just don’t know or can’t comprehend).

When it comes to IT governance, for example, project sponsors routinely come to the enterprise architecture board to present their business cases for IT investments, and to them, each investment idea is the greatest thing since Swiss cheese, and of course, they have the return on investment projections and subject matter experts to back them up. Yes, their mission will fail without an investment in X, Y, or Z, and no one wants to be responsible for that, of course.

So what’s the truth? How do we determine truth?

Bring in the auditors? Comb through the ROI projections? Put the SMEs on the EA Board “witness stand,” or take them to a back room and interrogate them? Perhaps some waterboarding will “make them talk”?

Many experts, for example, Jack Welch of GE and other high-profile Fortune 500 execs, including the big Wall Street powerhouses (Bear Stearns, Lehman Brothers, AIG, Fannie/Freddie), call for asking lots of questions—tearing this way and that way at subordinates—until the truth be known. Well, look at how many of these companies are doing these days. “The truth” was apparently a lot of lies: the likes of Enron, WorldCom, HealthSouth, and on and on.

So we can’t underestimate the challenge of getting to truth and planning and governing towards a future where argument and questions prevail.

Perhaps the answer is that truth is somewhat relative, and is anchored in the worldview, priorities, assumptions, and constraints of the viewer. From this perspective, the stock market crashes not necessarily or only because businesses are poorly run, but because investors believe that the bottom has fallen out and that their investments are worthless. Similarly, from an enterprise architecture perspective, the baseline/target may not be an objective set of “where we are” vs. “where we want to be,” but “how we perceive ourselves and are perceived by others now” vs. “how we want to be perceived in the future.”

This approach is not wholly new—this is the postmodern attitude toward the world that academics have been preaching for decades. What is unique is the application of postmodernism and relativism to the IT/business world and the recognition that sometimes to understand reality, we have to let go of our strict addiction to it.