With IBM’s Watson beating the pants off Jennings and Rutter on Jeopardy!, a lot of people want to know: can computers really think?
Forget waiters and waitresses: the new Japanese Hajime Robot restaurant in Bangkok, Thailand invested almost $1 million in four robotic waitstaff.
You order your food by touch-screen computer, a countdown on the screen shows when the food will be ready, and a robot brings it out to you.
While the samurai-clad robots are not the best looking—their huge eyes are a little cartoonish—they are certainly dexterous and able, nimbly serving the food in this restaurant and dancing for the customers between courses without missing a beat.
Initially, automation affected the jobs of blue-collar workers in manufacturing and mechanical work as robots displaced people on the assembly line. Now the trend is continuing and expanding as automation enters the service industry, affecting jobs involving customer interaction, entertainment, and retail. This is happening not only in restaurants, but also in elder care (like the uBot5 robot being developed at the University of Massachusetts) and in major retail operations such as warehouse automation, where Kiva Systems robots are employed by companies like Gap, Staples, and Zappos.
Further, the expansion of robots into traditionally human work is also happening in our military—think Unmanned Aerial Vehicles (UAVs, or drones) like the Predator and Reaper, robotic pack animals that can carry hundreds of pounds of gear (like BigDog), and various bomb-disposal robots. This is just the beginning.
We are witnessing the transformation of our workforce from traditional blue-collar—and now even white-collar—jobs to those with an emphasis on knowledge management (think engineers and technology professionals working at companies like iRobot, Intel, and Apple). This has obvious implications for the choice of educational pursuits and the availability of professional opportunities for our children and grandchildren.
The robots are coming. The robots ARE coming!
There is an old saying that rings true for basic leadership: “Keep It Simple, Stupid” (or KISS). Yet for various reasons, people and organizations opt for, or are compelled toward, complexity.
And when things are complex, the organization is more prone to mistakes, people to misunderstandings, and leadership to mismanagement–all are points of failure in the dynamics of running an organization.
Mistakes can be costly from both a strategic and an operational standpoint; misunderstandings between people are a cause of doubt, confusion, and hostility; and mismanagement leads to the breakdown of solid business processes, and eventually everything goes to pot.
An interesting article in the Wall Street Journal, 26 October 2009, defines four types of complexity:
Dysfunctional—This is the de facto complexity. It “makes work harder and doesn’t create value…research suggests that dysfunctional complexity creeps into a company over years through the perpetuation of practices that are no longer relevant, the duplication of activities due to mergers or reorganizations, and ambiguous or conflicting roles.”
Designed—This is an odd one…why would you design in complexity? “Executives may deliberately increase the complexity of certain activities or they may broaden the scope of their product offering, because they expect the benefits of those changes to outweigh the costs.” Example cited: “Dell believes that configuring each product to individual specs, rather than creating them all the same, makes customers more likely to buy from the company.”
Inherent—I guess this is the “nothing I can do about it” category; it just is hard! “The difficulty of getting the work done.” Plain and simple, some jobs are highly complex, Mr. Rocket Scientist.
Imposed—This is the “why are they doing this to us” category—external factors. This “is largely out of the control of the company. It is shaped by such entities as industry regulators, non-governmental organizations and trade unions.” I would assume competitors’ misdeeds fall into this one as well.
Whatever the reason for the complexity, we know implicitly that simplification, within the realm of what’s possible, is the desired state. Even when the complexity is, so to speak, “designed in” because of certain benefits, as in the Dell example, we still want to minimize it to the extent that we can still achieve the organization’s goals.
I remember years ago reading about the complexity of some companies’ financial reports (income statements, balance sheets, statements of cash flows…) and news commentators questioning the authenticity of their reporting. In other words, if you can’t understand it, how do we know whether it is truthful, accurate, or the full story? Well-publicized accounting scandals like Enron, HealthSouth, and many others since around the mid-1990s come to mind.
Generally, we know that when something is veiled in a shroud of complexity, there is often mismanagement or misconduct at play.
That is not to say that everything in life is simple—it isn’t. Certainly advances in the sciences, technology, and so on are not simple. Knowledge is incremental, and there is certainly lots of it out there to keep us all occupied in the pursuit of lifelong learning. But regardless of how complex things get—whether the complexity is dysfunctional, designed, inherent, or imposed—we should strive to make things easier, more straightforward, and as effortless and trouble-free as possible.
Will simplification get more difficult as a goal as our society continues to advance beyond the common man’s ability to understand it?
Yes, this is going to be a challenge. It used to be that graduating from high school was the farthest most people went with their education. Then college became the goal and norm for many. And now graduate and post-graduate studies are highly desirable and expected for many professional careers. It is getting difficult for people to keep up with the pace of change, the breadth and depth of knowledge, and the advancement of technical fields.
One of the antidotes to the inherent complexity seems to be greater specialization such as in medicine, technology, engineering and so forth. As knowledge advances, we need to break it up into smaller chunks that people can actually digest and handle. The risk is that the pieces become so small eventually that we can lose sight of the bigger picture.
Complexity is here to stay in various forms, but we can and must tackle at the very least the dysfunctional complexity in our organizations. Some ways we can do this include: breaking down the silos that impede collaboration and information sharing; architecting simplification into our strategic, operational, and tactical plans; building once and reusing many times (i.e., through enterprise and common solutions); filling gaps, reducing redundancies, and eliminating inefficiencies; reengineering our business processes as a regular part of “what we do”; constantly innovating better, faster, and cheaper ways of doing things; thinking and acting user-centric; improving the way we treat our people; and, of course, being honest, transparent, and upright in our dealings and communications.
Every day as leaders, we are called upon to make decisions—some more important than others—but all having impacts on the organization and its stakeholders. Investments get made for better or worse, employees are redirected this way or that, customer requirements get met or are left unsatisfied, suppliers receive orders while others get cancelled, and stakeholders far and wide have their interests fulfilled or imperiled.
Leadership decisions have a domino effect. The decisions we make today will affect the course of events well into the future–especially when we consider a series of decisions over time.
Yet leadership decisions span the continuum from being made in a split second to those that are deliberated long and hard.
In my view, decision makers can be categorized into three types: “impulsive,” “withholding,” and “optimizers.”
So it is clear that whichever mode decision makers assume, many decisions are still wrong. In my view, this has to do with the dynamics of the decision-making process. Even if they think they are being rational, in reality leaders too often make decisions for emotional or even unconscious reasons. Even optimizers can fall into this trap.
CIOs, who are responsible for substantial IT investment dollars, must understand why this happens and how they can use IT management best practices, structures, and tools to improve the decision-making process.
An insightful article that sheds light on unconscious decision-making, “Why Good Leaders Make Bad Decisions,” was published this month in Harvard Business Review.
The article states: “The reality is that important decisions made by intelligent, responsible people with the best information and intentions are sometimes hopelessly flawed.”
The authors cite two reasons for poor decision making: errors in pattern recognition and in emotional tagging.
The authors note some red flags in decision making: the presence of inappropriate self-interest, distorting attachments (bonds that can affect judgment—people, places, or things), and misleading memories.
So what can we do to make things better?
According to the authors of the article, we can “inject fresh experience or analysis…introduce further debate and challenge…impose stronger governance.”
In terms of governance, the CIO certainly comes with a formidable arsenal of IT tools to drive sound decision making. In particular, enterprise architecture provides structured planning and governance; it is the CIO’s disciplined way to identify a coherent, agreed-to business and technical roadmap and a process to keep everyone on track. It is an important way to create order out of organizational chaos by using information to guide, shape, and influence sound decision making, instead of relying on gut, intuition, politics, and subjective management whim—all of which are easily biased and flawed!
In addition to governance, there are technology tools for information sharing and collaboration, knowledge management, business intelligence, and yes, even artificial intelligence. These technologies help ensure that we have a clear frame of reference for making decisions. We are no longer out there making decisions in a vacuum; rather, we can now reach out far and wide to other organizations, leaders, subject matter experts, and stakeholders to give and get information, to analyze, to collaborate, and to take what would otherwise be sporadic and random data points and connect the dots, leading to a logical decision.
To help safeguard the decision process (no, it will never be failsafe), I would suggest greater organizational investment in enterprise architecture planning and governance, and in technologies that make heavily biased decisions largely a thing of the past.
Well this is interesting to write: a blog about blogging 😉
Blogs are becoming a great new tool for enterprise communications and an alternative to clogging up already-full email boxes.
CIO Magazine, 15 January 2008, states that “enterprise users can get lost in storms of ‘reply-all’ e-mails while trying to manage projects. Blogs offer a better way.”
The group president of systems and technology for Bell Canada says that “email, used by itself, just doesn’t cut it anymore for project management and interoffice communication.”
What’s the interest level and use of blogs?
Forrester Research reports that “54% of IT decision makers expressed interest in blogs. Of companies that had piloted or implemented blogs, nearly two-thirds (63%) said they used them for internal communications. Fifty percent said they used blogs for internal knowledge management—and these companies are leading the way of the future.”
A social software consultant says that “traditional enterprise solutions were designed to keep IT happy. They’re not usually designed with any thought to the user, like a blog is.” What a nice user-centric EA concept: design technical solutions that meet user requirements; let business drive technology, rather than doing technology for technology’s sake.
Why do people resist blogs?
“People are hung up on this concept of the blog as a diary and as an external marketing medium,” rather than understanding its criticality as a tool for communications and knowledge management.
How can you advance the use of blogs in your organization?
From an EA perspective, blogs are not a substitute for email; we need email (some of us desperately, like a morning cup of joe), but blogs are a great complementary tool for participatory communications involving discussion-type interaction among more than two users, or for capturing enterprise knowledge and making it available for discovery. Also, blogs give a voice to people who might otherwise remain part of the silent masses; people feel freer to express themselves in blogs, and through freedom of expression come the advancement of ideas, greater buy-in, and better enterprise decision-making.
User-centric enterprise architecture is about capturing, processing, organizing, and effectively presenting business and technology information to make it valuable and actionable by the organization for planning and governance.
Google is a company that epitomizes this mission.
After reading a recent article in Harvard Business Review, April 2008, I came to really appreciate its amazing business practices and found many connections with User-centric EA.
In short, Google is a highly User-centric EA-driven organization and is a model for many of its core tenets.
Enterprise architecture plays an important role in corporate knowledge management. EA captures, analyzes, catalogues, and serves up information to end-users. In many cases, where more general KM endeavors fail, User-centric EA succeeds because it is a focused effort, with a clear value proposition for making information useful and usable.
Now, KM is being taken to a whole new level. Rather than capturing information with clearly defined users and uses, the aim is total recall.
ComputerWorld, 6 April 2008, reports on an initiative for “storing every life memory in a surrogate [computer] brain.”
“Gordon Bell, a longtime veteran of the IT industry and now principal researcher at Microsoft Corp.’s research arm, is developing a way for everyone to remember those special moments. Actually, Bell himself wants to remember—well, everything…he wants the ability to pull up any picture, phone call, e-mail, or conversation any time he wants.”
“The nine-year project, called MyLifeBits, has Bell supplementing his own memory by collecting as much information as he can about his life. He’s trying to store a lifetime on his laptop.”
“The effort is about not forgetting, not deleting, and holding onto all the bits of your life. In essence, it’s about immortality.”
What about privacy of your personal information?
It “isn’t about plastering a MySpace or Facebook page with information…[It’s] immensely personal…you will leave a personal legacy—a record of your life [on a personal computer].”
And Bell is not discerning: he stores painful memories as well as happy ones; this “would actually let people see who he was as a person.”
Certainly people have striven for eternal life since the time of the first man and woman—Adam and Eve eating of the forbidden apple in their quest for immortality—and since then through the search for the “fountain of youth” and other elixirs to prolong life. Similarly, people have sought to live on eternally by leaving a legacy—whether great men or nefarious ones—from rulers and inventors to conquerors and hatemongers. The desire to influence and be remembered everlastingly is as potent as man’s most parched thirst.
Bell has gone to extremes collecting and storing his memories—good and bad—from “every webpage he has ever visited and television shows he has watched…videos of lectures he’d given, CDs, correspondence and an avalanche of photos…he has also recorded phone conversations, images and audio from conference sessions, and his e-mail and instant messages.”
In fact, Bell wears a SenseCam around his neck, a digital camera that automatically takes a photo every 30 seconds or whenever someone approaches.
“Bell figures that he could store everything about his life, from start to finish, using a terabyte of storage.”
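Bell’s terabyte figure is easy to sanity-check with back-of-envelope arithmetic. The capture rates below are purely illustrative assumptions of mine—not numbers from the MyLifeBits project—but they show how a lifetime of continuous capture lands on the order of a terabyte:

```python
# Rough lifetime storage estimate for a MyLifeBits-style archive.
# Every rate below is a hypothetical assumption for illustration.

YEARS = 60                          # assumed span of active capture
days = YEARS * 365

# SenseCam: one photo every 30 seconds over ~16 waking hours
photos_per_day = 16 * 3600 / 30     # 1,920 photos per day
photo_kb = 20                       # small, low-resolution images

email_mb_per_day = 5                # assumed mail and messaging volume
audio_mb_per_day = 20               # compressed speech from calls and meetings

daily_mb = photos_per_day * photo_kb / 1024 + email_mb_per_day + audio_mb_per_day
total_gb = days * daily_mb / 1024

print(f"~{total_gb:,.0f} GB over {YEARS} years")  # roughly on the order of a terabyte
```

Under these assumed rates the total works out to a bit over 1,300 GB, so a one-terabyte estimate for “start to finish” is entirely plausible.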
“In 20 years, digitizing our memories will be standard procedure, according to Bell. ‘It’s my supplemental memory and brain. It’s one of my most valuable possessions. I look at this thing and think, “My whole life is there.”’”
So is that what a human life comes down to—a terabyte of stored information?
While maybe a noble effort at capturing memory, this seems to miss the mark on what a human being is really about. A person is much more than what can be captured in a photo or sound bite of the external circumstances and events that take place around us. The essence of a person lies in the deep challenges that go on inside us—the daily struggles and the choices we make through our inner conscience to choose right from wrong and to sacrifice for our creator, our loved ones, our nation, and our beliefs. Yes, you can see the resulting actions, but you don’t see the internal struggles of heart, mind, and soul.
Also, while capturing every 30 seconds of a person’s life may be sacred to the person whose life is being stored, who else really cares? The highlights of a person’s life are a lesson for others; the minutiae of their days are personal, for their own growth and reckoning.
From a User-centric EA perspective, I believe we should shift KM initiatives—for both organizations and individuals—from being wholesale data dumps to being truly meaningful endeavors that have a clarity of purpose and a dignity for the human beings being recorded.
Enterprise architecture is a major contributor to knowledge management.
The Wall Street Journal, 10 March 2008, reports that “knowledge management [KM] can make a difference—but it needs to be more pragmatic.”
What is KM?
“A concerted effort to improve how knowledge is created, delivered, and used.”
“Over the past 15 years or so, many large organizations have embraced the idea that they could become more productive and competitive by better managing knowledge—the ideas and expertise that originate in the human mind.”
But many KM programs have failed miserably or just gone nowhere—why?
“Some firms stumbled by focusing their knowledge management efforts on technology at the expense of everything else, while others failed to tie knowledge programs to overall business goals or the organization’s other activities.”
Here’s how to do KM right:
EA contributes to all three areas:
EA can be a shining example of KM, when it is User-centric!