From Cradle To Grave


It’s funny how organizations talk about the lifecycle of people. 

From a full lifecycle perspective, it’s “cradle to grave!”

In terms of lifecycle on the job, it’s “hire to retire (or to fire).”

Really the lifecycles are intertwined. 

It starts with the cradle…we are born and go through a maturation process that focuses on our education and preparation for life. 

Then we get hired into our (hopefully) dream jobs, where we spend our careers until we retire. Or, if you mess up badly and get fired, or decide to change career course, you may have to go back to “go” (“do not collect $200”) and get hired again for another career round. 

Eventually you retire and start your second life in retirement, where, please G-d, you have the health and prosperity to enjoy the fruits of your labor with your family. 

Ultimately, our lifecycle ends at the grave with the death of our bodies–our souls go on to Heaven and live forever basking in the light of the Almighty. 

Thus, the human capital lifecycle. 😉

(Source Graphic: Andy Blumenthal)

See Yourself In The Future


Seeing how you will look in the future is no longer just theoretical. 

Merrill Edge (Merrill Lynch investing + Bank of America banking) has an online digital program that shows how you will look aged over time. 

They developed this as a tool to encourage people to save more money for retirement by bringing home the message that you will not be young (and beautiful) forever. 

The Face Retirement tool asks for your age and gender, takes your picture, and then displays snapshots of how you will look over the course of your lifespan. 

I tried it, and my smiling face was quickly transformed into that of an old man, with sagging skin, wrinkles, and more. 

My wife, seeing those pictures, said to me (even though we already save for retirement), “We better really start investing seriously for retirement!” — gee, thanks! 😉

And thanks, Merrill Edge: you scared us straight(er) by making us look at our own mortality, face-to-face. 

(Source Photo: here with attribution to Judy Baxter)

>A Vision of User-centric Communication Design


[Authored by Andy Blumenthal and published in Architecture and Governance Magazine November 2009]

As technology has advanced in leaps and bounds over the last 30 years, so has the number of information devices—from phones to faxes, pagers to PDAs, desktops to Netbooks—and it goes on and on.

Some devices, despite having outlived their useful lives, have been slow to disappear from the scene completely. For example, fax machines are still in our offices and homes, although now often combined with other devices such as the “all-in-one” copier, printer, scanner, and fax. However, with the ability to scan and e-mail attachments, why do we even need to fax at all anymore?

Similarly, at one time, pagers were all the rage for reaching someone urgently. Then cell phones and PDAs took over the scene. Nevertheless, paging never fully went away; instead, it was repackaged as “press 1 to send this person a page.” However, why do we need to page them at all anymore, if we can just leave them a voice mail or instant message?

It seems as if legacy technology often just doesn’t want to die, and instead of sun-setting it, we just keep packaging it into the next device, like the phone that comes with e-mail, instant messaging, texting, and more. How many ways do we need to say hello, how are you, and what time will you be home for dinner?

When is technology enough and when is it too much?

Of course, we want and love choice—heck, we’re consumers to the core. Technology choice is like having the perfect outfit for every occasion; we like to have the “right” technology to reach out to others in a myriad of different ways for every occasion.

Should I send you an e-mail on Facebook or should I “poke” you or perhaps we should just chat? Or maybe I should just send you a Tweet or a “direct message” on Twitter? No, better yet, why don’t I send you a message on LinkedIn? Anyway, I could go on for about another three paragraphs at least on how I should/could contact you. Maybe I’ll hit you up on all of them at the same time and drive you a little nuts, or maybe I’ll vary the communications to appear oh so technically versatile and fashionable.

Yes, technology choice is a wonderful thing. But it comes at a price. First, all the communication mediums start to become costly after a while. I can tell you from my cell phone bill that the cost of all these options—e-mail, texting, Internet, and so on—definitely starts to add up. And don’t forget all the devices that we have to schlep around on our belts (I have one cell phone on each side—it’s so cool, like a gunslinger from the Wild West), pockets, and bags—where did I leave that device? Let’s not forget the energy consumption and eco-unfriendliness of all these gadgets and all the messy wires.

Additionally, from a time-is-precious perspective, consider the time sinkhole we have dug for ourselves by trying to maintain a presence on all of these devices and social networking sites. How many hours have we spent trying to keep up and check them all (I’m not sure I can fully remember all my e-mail accounts anymore)? And if you don’t have single sign-on, then all the more hassle— by the way, where did I hide my list of passwords?

Next out of the gate is unified communications. Let’s interoperate all those voice mail accounts, e-mail accounts, IM, presence, and social media communications. Not only will your phone numbers ring to one master, but also your phone will transcribe your voice mails—i.e., you can read your voice mail. Conversely, you can listen to your e-mail with text-to-speech capability. We can run voice-over-IP to cut the traditional phone bill and speed up communications, and we can share non-real-time communications such as e-mail and voice mail with real-time communication systems like our phone.

So, we continue to integrate different communication mediums, but still are not coalescing around a basic device. I believe the “communicator” on Star Trek was a single device to reach someone on the Enterprise or on the planet surface with just the tap of a finger. Perhaps our reality will someday be simpler and more efficient, too. When we tire of playing with our oodles of technology “toys” and signing up for myriad user accounts, we will choose elegance and simplicity over disjointed—or even unified—communications.

As the founder of User-centric Enterprise Architecture, my vision is to have one communicator (“1C”) device, period. 1C is an intelligent device. “Contact John”—okay—no phone number to dial and no e-mail to address. 1C knows who John is, how to reach him, the best way to contact him, and whether he is available (“present”) at the moment or not. 1C can take a message, leave a message, or communicate in any way (voice, text, video, virtual) that an individual prefers and that is appropriate for each portion of a particular communication, to ensure that the communication intended is the communication received. 1C is not limited to one-on-one communications, but is open to conferencing—as needed. Mention the need for Cindy to be in on the communication and, instantaneously, Cindy is on and then off again. 1C is ubiquitous in time and space—I can send you a communication to arrive now or next week, when you’re here or there, when you’re in country or out, in a car, on a flight, on a ship, or underwater—it doesn’t matter. Like telepathy, the communication reaches you effortlessly. And, of course, 1C translates languages, dialects, acronyms, or concepts, as needed—truly it’s a “universal communicator.”

The closest we’ve come so far is probably the Apple iPhone, but with some 50,000 apps and counting, it is again too focused on the application or technology to be used, rather than on the user and the need.

In the end, it’s not how many devices or how many accounts or how many mediums we have to communicate with, but it is the communication itself that must be the focus. The 1C of the future is an enabler for the communication—anytime, anywhere, the right information to the right people. The how shouldn’t be a concern for the user, only the what.

>The Three I’s and Enterprise Architecture


One question that is frequently asked in enterprise architecture is whether new technologies should be adopted early (more cutting edge) or later (more as quick followers). Of course, the third course of action is to close one’s eyes, resist change, and simply “stay the course.”

The advantage of bleeding-edge technology adoption is the early lead over competitors in the marketplace; this head start provides the ability to incorporate innovation into products early and to capture a hefty market share, and quite possibly dominance. The advantage for quick followers is learning from the mistakes of others, building on their initial investments and on a more mature technology base (for example, with software, one where the bugs have been worked out), thereby potentially enabling a leapfrog over competitors. The advantage of staying the course is organizational stability in the face of market turmoil; however, this is usually short-lived, as change overwhelms those who resist it, like flood waters overflowing a levee.

The Wall Street Journal of 5–6 July 2008 has an interview with Theodore J. Forstmann, a billionaire private-equity businessman, who tells of Warren Buffett’s “rule of the three ‘I’s,” which is applicable to the question of timing on technology adoption.

“Buffett once told me there are three ‘I’s in every cycle. The ‘innovator,’ that’s the first ‘I’. After the innovator comes the ‘imitator.’ And after the imitator in the cycle comes the idiot. So when…we’re at the end of an era it’s another way of saying…that the idiots have made their entrance.”

I relate the innovator to the early adopter, in their quest for performance improvement and their sharing in the early competitive advantage of innovation. 

Similarly, I associate the imitator with the quick followers in their desire to learn from others and benefit from their investments. They recognize the need to compete in the marketplace with scarce economic resources and adapt mindfully to changes.

Finally, I relate the idiots that Warren Buffett refers to with those that ignore or resist change. Often these organizations mistake their early market success for dominance and, in their arrogance, refuse to concede the need to adjust to changing circumstances. Alternatively, these enterprises are truly ignorant of the need to adapt, grow, mature, and transform over time, and they mistakenly believe that simply sitting behind the cash register and waiting for customers is the way to run a business (versus a Costco, whose warehouse wholesale model has turned the nature of the business on its head).

In architecting the enterprise, innovation and imitation, while not without cost and risk, will generally be rewarded with superior products and services, greater market share, more loyal customers, and a culture of success in the face of constant change. You don’t need to look far for examples: Apple, 3M, P&G, Intel, Toyota, Amazon, and more.

>Employee Onboarding and Enterprise Architecture


The human capital perspective of the enterprise architecture is often overlooked, and is not yet included in the Federal Enterprise Architecture (FEA), but I’m still hopeful.

A recent article in Federal Times, 9 June 2007, called “First Day on the job, first step to retention,” demonstrates that human capital architecture is alive and well, although not consistently applied.

A report by the Partnership for Public Service found that “the government has no consistent approach to bringing new employees on board; new employees often aren’t taught about and initiated into their new agency’s culture; and technology that could ease the process and improve collaboration is underused.”

However, some agencies are coming up with new ways to welcome new employees, make them feel “at home” on the job, get them situated, acclimated and trained, so they quickly become productive employees with longevity at the agency.

For example, in setting up the logistics for a new employee, “GAO and NASA have each developed case management systems to track new hires from the moment they accept the offer. The system alerts information technology offices to set up a computer network access and email account for a new employee; facilities staff will prepare an office and desk; and security staffs work in advance to arrange needed clearances.”
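The case-management pattern described above can be sketched as a simple fan-out of tasks triggered by offer acceptance. This is only an illustration; the office names and task lists below are assumptions for the sketch, not details of the actual GAO or NASA systems.

```python
from dataclasses import dataclass, field

@dataclass
class NewHire:
    name: str
    start_date: str
    tasks: list = field(default_factory=list)

# Hypothetical checklist: each office gets its tasks queued the moment
# a candidate accepts the offer, so nothing waits until day one.
ONBOARDING_TASKS = {
    "IT": ["create network account", "set up e-mail", "issue computer"],
    "Facilities": ["assign office", "prepare desk"],
    "Security": ["initiate clearance paperwork"],
}

def on_offer_accepted(hire: NewHire) -> list:
    """Fan out onboarding tasks to each office for tracking."""
    for office, tasks in ONBOARDING_TASKS.items():
        for task in tasks:
            hire.tasks.append((office, task, "open"))
    return hire.tasks

hire = NewHire("Jane Doe", "2007-07-01")
open_tasks = on_offer_accepted(hire)
print(len(open_tasks))  # 6 tasks queued across three offices
```

The point of the pattern is the trigger: a single event (offer accepted) drives parallel preparation across offices, which is what lets the new employee be productive on day one.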

The Office of the Comptroller of the Currency (OCC) goes above and beyond when it comes to onboarding. OCC sends new hires “a welcome basket with an agency-branded T-shirt, umbrella, luggage tags, and chocolates.” Not a bad introduction for someone starting out in a new place.

GAO, NASA, and OCC are all good examples of what human capital enterprise architecture is all about. I would suggest growing this list and building it into an FEA human capital perspective.

We cannot take our employees for granted. Their enthusiasm, determination, innovation, empowerment, and leadership are what will ultimately drive the organization to succeed or not.

The enterprise architecture human capital perspective can look at the lifecycle of the employee from recruiting, hiring, and onboarding all the way through their 30+ year careers and into retirement. The enterprise architecture can assess how we manage the human capital lifecycle in the enterprise today, establish a future target state, and develop the transition plan to move the organization towards best practices for managing our most critical asset, our people.

>TEOTWAWKI and Enterprise Architecture


TEOTWAWKI stands for “the end of the world as we know it.” It is a term used in the survivalist movement and is sometimes used as a reference to the apocalypse. (The apocalypse, though, has religious connotations, in that the end of the world has greater meaning in terms of revealing G-d’s ultimate purpose for mankind.)

The end of the world—is there such a thing?

As mortal human beings, we know that all living things have a beginning and an end of life. Even inanimate objects are recognized as having a lifecycle, and this is often talked about from a management perspective in terms of administering “things” from their initiation through their ultimate disposition. Some common lifecycles frequently referred to are: organizations, products, projects, assets, investments, and so on.

So how about the world itself?

Well, the answer is, of course, yes—even the world will one day come to an end. Astronomers have long witnessed the collapse and explosion of stars at the end of their lives—these are called supernovas. And our world is a lot smaller than a star; in fact, you could fit about a million Earths inside our sun (which is a star).

When times get tough, perhaps we ponder TEOTWAWKI more and wonder whether this is it!

For example, during the Cold War and the buildup of the nuclear arsenals of the Soviet Union and the United States, there were enough nukes to destroy the world ten times over. And people wondered when the button would actually be pushed.

Nowadays, we wonder less about nuclear holocaust and more about overpopulation (currently at 6.3 billion and expected to reach 9 billion by 2042), depletion of world energy resources like oil (currently at $140 a barrel and up 44% in cost YTD), demand outstripping supply for silver, copper, aluminum, and many other commodities, and shortages of food (the UK Times reported in February that “the world is only ten weeks away from running out of wheat supplies after stocks fell to their lowest levels for 50 years”).

Further, while the population continues to explode and resources continue to be depleted, we continue to overflow the world’s dumps with garbage so much so that there has even been talk of sending garbage into space, just to get it the heck out of here!

And let’s not forget global warming and pollutants that stink up our cities, cause acid rain, asthma, and so many other unfortunate effects on the ecosystem and human health.

The good news is that TEOTWAWKI talk is often just fear and occasional panic, and the end is not imminent. The bad news is that there are some very real problems in the world today. 

The problems are so big that leaders and governments are having a difficult time trying to tackle them. All too often, the problems get passed to the next generation, with the mantra, “Let it be someone else’s problem.”

As an enterprise architect, my frame of reference is to look at the way things are (the baseline), try to come up with a better state for the future (the target), work up a transition plan, and basically get moving. 

We all know that it is extremely difficult to see our way through these extremely complex problems of global magnitude. But if enterprise architecture has taught me anything, it is that we must create a roadmap for transformation; we must forever work to change things for the better. We must do whatever we can to prevent TEOTWAWKI.

Perhaps the field of enterprise architecture can be expanded from one that is IT-focused and now becoming business and IT-focused to ultimately becoming a discipline that can drive holistic change for major world problems and not just enterprise problems. Does this mean that enterprise architecture at some point becomes world architecture?

>Frameworks and Enterprise Architecture


One of the issues facing IT professionals these days isn’t a scarcity of IT frameworks, but rather many good frameworks that can be in conflict if we do not sort them out.

Usually each framework is focused on a certain lifecycle, which on one hand helps align it to a certain view of things, but on the other hand complicates matters further, because each framework attempts to be all things to all people by covering an entire lifecycle.

Here are some examples of the frameworks and their associated lifecycles:

  • Enterprise architecture—the IT investment lifecycle (Capital Planning and Investment Control, CPIC)
  • SDLC—the systems development lifecycle
  • ITIL—the service lifecycle
  • PMBOK—the project lifecycle
  • CMMI—the process lifecycle
  • Configuration management—the asset lifecycle

Each framework has plans that need to be developed, processes to be followed, reviews that get conducted, and go/no-go decisions issued.

If an organization attempts to introduce and utilize all these frameworks, the result can be a major overlap of structures, processes, and reviews that can literally drive a project manager or program sponsor nuts!

The organization could grind to a standstill trying to comply with all these frameworks.

I believe that each framework has useful information and guidance for the organization to follow, and that the key to implementing them is to determine which is the PRIMARY framework at each stage of the IT lifecycle. The other frameworks may overlap in the stages, but it is the primary framework that guides each stage.

Here’s an example using a few of these:

  1. Enterprise architecture is the decision-making framework for IT systems. It has a core function in influencing IT investment decisions, based on the target architecture and transition plan and on EA reviews of proposed new IT projects, products, and standards. Therefore, EA is the primary or dominant framework in the planning stage of IT (up to the IT investment decision).
  2. SDLC is the development framework for IT systems. Its core function is guiding the software development process. As such, SDLC is the primary framework for the development phase of the system (which comes after the IT investment decision and before operations and maintenance). Of course, SDLC has planning phases and O&M and disposition phases as well, but SDLC is the primary framework only in the development stage.
  3. ITIL is the service framework. It has a core function in determining how service will be delivered and supported. ITIL is the primary framework for the O&M stage of the system (this comes after the development and before the disposition of the system), since that is when the system needs service delivery and support. Again, ITIL has other stages that overlap with other frameworks, for example planning and configuration management, but ITIL is the dominant framework only in the O&M phase.

The other frameworks should conceptually also assume a primary role for specific phases of IT and then pass off that role to the next framework that is dominant in that particular area.

4. For example, maybe PMBOK would also have a dominant framework role in the planning phase, looking at cost, schedule, performance, resources, risks, and so on (this would be after the IT investment decision of EA and before the development or acquisition phases). Again, that is not to say that PMBOK doesn’t shed light on and provide requirements for other stages of IT—it does—but it is just not the primary framework for those other stages.
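The idea of a single primary framework “owning” each lifecycle stage can be expressed as a simple lookup. The stage names and assignments below just restate the examples in this post; they are one possible division under the assumptions here, not a standard mapping.

```python
# One possible assignment of a single primary framework per IT lifecycle
# stage, following the examples above. Other frameworks still contribute
# at every stage, but only one "owns" (guides) each stage.
PRIMARY_FRAMEWORK = {
    "planning": "Enterprise Architecture",   # up to the IT investment decision
    "project initiation": "PMBOK",           # after the investment decision
    "development": "SDLC",
    "operations & maintenance": "ITIL",
    "disposition": "Configuration Management",  # illustrative assumption
}

def primary_framework(stage: str) -> str:
    """Return the one framework that guides a given lifecycle stage."""
    stage = stage.lower()
    if stage not in PRIMARY_FRAMEWORK:
        raise ValueError(f"No primary framework assigned for stage: {stage}")
    return PRIMARY_FRAMEWORK[stage]

print(primary_framework("Development"))  # SDLC
```

The design point is that the mapping is total and unambiguous: every stage resolves to exactly one guiding framework, so a project manager is never answering to two “kings” at once.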

I believe it is only by developing a unified lifecycle and assigning primary framework “ownership” to each stage that we will be able to develop a truly workable IT structure and process for our organizations. As the saying goes: “Two kings cannot sit on one throne.” So too, two frameworks cannot simultaneously be guiding our project managers and program sponsors, nor taking up valuable IT resources.

The end goal is a single, simple, step-by-step process for our projects to follow with clear actions, milestones, and reviews, rather than a web of confusion and siloed, redundant governance processes in play.