Decision Effectiveness

A few years ago, I had a custom set of wheels made for my bike. I had the rims specifically weighed and picked out of a set of about 10 rims. I had the spokes weighed and balanced to make sure they were the lightest ones. The spoke nipples (the threaded nuts that attach the spokes to the rims) were color matched to the paint on my bike. All told, the wheels weighed about 1435 grams. Not crazy light, but pretty darn light. And with decently deep rims and bladed spokes, they were fairly aerodynamic, so they cut through wind pretty well, and being light, they accelerated and climbed well. But custom wheels cannot be laced as tight as some of the manufactured wheels out there. The one thing I lacked was the stability that comes from an incredibly tightly laced wheel.

I decided to give up my beloved wheel set and get a mass-produced one (OK, so I have not yet seen another set of my wheels on the road, but still, they are not custom wheels). They happen to be just as light, almost as aerodynamic, and insanely sturdy. There is so little flex in my wheels that on hard corners going downhill at 45 mph, I have absolute confidence in them and I know exactly what they are going to do. Nonetheless, it was a hard decision to make, to replace my perfectly good older wheels.

I’ll admit it: even I talk too much about governance and the structure and network it takes to have a good governance model. But regardless of the model, it is not about your governance model, it’s about the effectiveness of your decisions. Do you make the right decisions? How fast do you make a decision? How often do you execute your decisions as planned?

You can have a great governance model. You can be totally well informed about what goes on in the organization, with working groups that keep you current on the state of HR. You can be well networked and fully briefed on status. And with all of that, you can still make the wrong decisions or avoid making decisions.

I have seen organizations where the governance model was to include so many people in the decision that, at the end of the day, nobody wanted to be accountable for the final decision. The group would reach a point of consensus so that if anything went wrong, nobody had to take accountability; they were all both blameless and at fault. It was also an environment where, when the group was close to consensus, if someone saw something was clearly wrong, nobody would stand up for fear of being the one accountable for a different decision. It was a governance model. It was inclusive and well networked, but it turned out to be a bad model. Either nothing got done, or often, the wrong things got done. When it comes right down to it, you need to be inclusive and networked in the governance model, but you also need to be able to react quickly and authoritatively when the circumstances call for it. You need accountability for the decision that is separate from accountability for the execution and implementation of that decision. And you need the ability and the willingness to switch gears in the middle when you realize that something is either wrong, or just that something could be better.

I had a perfectly good bike, with perfectly good wheels. I’m continuously amazed at the quality of my new ride. It feels smoother when I ride over bumps, more solid when I ride down a hill, but just as light and aerodynamic. It was the right decision to make, even though it pained me to make it.

Serendipity versus Decision Support

Would I be where I am today if I had all the facts every time? I’m actually confident that if it were up to me, I would be digging ditches for a living somewhere (not to demean ditch diggers). Let’s face it, I started off on the wrong foot. Being an Asian American kid with a prodigy brother, I was definitely the stupid one (I’ll assume you’ve all read about the Asian “Tiger Mom” thing lately). I was the kid who, at the age of 6, was told by my piano teacher to quit. I was the kid who was told by my 5th grade teacher, “too bad you’re not your brother.” I was the Asian kid who graduated high school with only a 4.2 GPA. (All of that is true, btw.) I was also the kid who, by some miraculous stroke of good fortune, managed to get accepted to my first-choice college. Since we were a relatively low-income family, my parents were quite pleased when I got a significant financial aid package (nothing compared to the brother, who incidentally got into every single Ivy League school – also true). At some point in the summer, my college of first choice sent me a letter informing me that they would no longer be able to offer the amount of aid that I required. With quite a large amount of desperation, I called around to various colleges, and was re-admitted to my (I think) 4th-choice school with the financial aid that I needed.

It was at this school (one of the Claremont Colleges in Southern California) where, rather than hordes of students in large auditoriums being lectured to (a system that had clearly failed me so miserably to this point), I was instead surrounded by classes of maybe 15. OK, maybe 20 max. Rather than being lectured to, we sat around a table and talked about the book we had read the prior week. I sat on committees where, as a student, I literally had a vote in deciding whether professors got tenure. Rather than simply learning, I began understanding. I really do consider this to be the first of several unplanned turning points. Listen, I’m serious when I say that I was not a good student. But learning for me happened a different way than for most.

We often talk about analytics and how it changes how we operate in HR. High-quality data leads to high-quality choices – and often that is true. But it is also true that we don’t always have all of the data we need at any specific point in time – if we had everything we needed to know, we might make vastly different choices.

I’ll take succession planning as an example. We know who the top 10 succession candidates are for top positions (hopefully). We know when they will be ready, what their relative skills and competencies are, and how their strengths compare to one another. But we don’t know which of them are going to jump ship and go to another company before the position becomes vacant. We don’t know which of them are going to stop growing, regardless of our best efforts to continue developing them. The best that we can do is to invest in a pool of candidates and hope that one of them, the right one, is ready when the time comes.
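The arithmetic behind investing in a pool rather than a single heir apparent is worth making explicit. As a minimal sketch, assume each candidate independently has some probability of still being ready and available when the role opens (the 40% figure below is purely illustrative, not from any real dataset):

```python
def pool_coverage(p: float, n: int) -> float:
    """Probability that at least one of n candidates is ready,
    if each is independently ready with probability p."""
    return 1 - (1 - p) ** n

# Illustrative odds: even modest per-candidate probabilities
# compound quickly across a pool.
for n in (1, 3, 5, 10):
    print(f"pool of {n:2d}: {pool_coverage(0.4, n):.0%} chance someone is ready")
```

The independence assumption is generous (candidates often leave for the same reasons), but the direction of the effect is the point: a pool of three at 40% each already gives better than 75% coverage.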

We use decision support and analytics to crunch the numbers for us, but at the end of the day, it’s still serendipity – it’s still luck. The hope here is that while analytics and decision support can’t be a perfect predictor, we can in fact “make our own luck.” We can improve our odds of getting the best outcomes. At the end of the day, it is not serendipity versus decision support, but a combination of the two that will make our best data work for us.

Misinterpreting Apple and Microsoft

As a global community, we all hate Bill Gates. Actually, we all hate Microsoft. While we might depend on things like Microsoft Outlook and the MS Office suite of products, most of us probably believe that they have a bit too much of a monopoly and have probably bullied other companies around to ensure a strong foothold in their industry. Alternatively, we all love Steve Jobs and Apple. This guy has reinvented Apple and given us some great products with amazing usability. Instead of the “blue screen of death” from Microsoft, we have the incredibly usable iPhone. Apple is also all about community, and its technologies have brought us closer together, creating now-ubiquitous applications (also on Google Android phones) that help us better connect in real time.

But sometimes image and marketing are everything. The Bill and Melinda Gates Foundation is one of the largest philanthropic institutions in the world, providing funding for diverse programs in health, global economic development, and education. Steve Jobs, on the other hand, had a foundation for about 15 months, but it was shut down after never doing much of anything. In comparing the two business leaders, it appears that Gates is rather selfless in his charitable intentions, while Jobs (in the rare circumstances that he endorses a cause) mentions it only when it serves the purposes of Apple and the growth of his personal wealth.

It’s easy to look at something, especially a set of data, and be swayed by our own personal experiences with it. Events, or our relationships with the business, often hand us specific opinions that may or may not be close to the truth. In the Gates and Jobs example, we even have clean, quantitative data to “prove” that Jobs is the better guy. Apple has overtaken Microsoft in market capitalization, and therefore the consumers have voted – not for the big corporate giant providing software to big corporate giants, but for the provider of tools to everyday individual consumers.

I’m not arguing about which set of technologies is more deserving of our approval in terms of market cap, but about how these images inform our ability to interpret data in the absence of external influences. The reason I’m an advocate of business intelligence analysts who view data and operate in a function untouched by the externalities of the “real world” is that they can touch and feel the data with an objective eye. Those of us who operate in business and business process can often be blinded by prior results that are not directly related to the data, though we think they are. I’ve seen organizations completely disregard employee engagement surveys that identified terrible managers simply because production or sales happened to be “hot.” We’ve been influenced by circumstantial evidence that very senior managers can’t be “messed” with, or that diversity data is better than it really is. We’ve completely missed the mark on interpreting trend lines because we don’t have analysts who know how to look at data from a mathematical perspective.

At the end of the day, situational evidence is critical in how we interpret our version of reality.  In no way can we ignore this, but at the same time, we have to be able to see through it and objectify the data before we can reach a conclusion.  We can’t let our judgment around data be clouded by only what our perceived reality already is, because if we do, our role to the business as part of the decision support chain is completely irrelevant.

Dynamical Systems

Forget for a moment that I’m talking about a mathematical theory.  At the core of this post is how we apply some quantitative reasoning to our ability to look at data and predict the future.  Personally, I don’t think we do enough to extend and create meaning from our data.  If we looked at data with an increasing view of the quantitative sciences, we would have a proportional increase in our ability to apply “art” to our interpretation.

We predict things intuitively every day. Driving down the road, we watch other cars, measure their velocity, and predict whether we are going to collide – and based on those predictions we often change course. In Human Resources, we should be predicting the direction and velocity that our workforce is travelling in a number of ways. We should watch not only the size of the workforce, but also productivity, engagement, competency growth, and any number of other factors.

A dynamical system is a mathematical construct that predicts the state of a particular object at a given point in the future. Dynamical systems are useful in a limited way – through them we can predict future outcomes for a limited number of variables. But the point is that we can predict direction and velocity – in other words, whether the workforce is travelling in the right direction and at the right speed.
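To make the idea concrete, here is a minimal sketch of a discrete dynamical system for a workforce: the state is a pair (junior headcount, senior headcount), and a simple update rule advances it one period at a time. Every rate and headcount below is invented for illustration; a real model would fit these from HR data.

```python
# Hypothetical per-period rates -- illustrative only, not real data.
PROMOTE = 0.10   # share of juniors promoted to senior each period
ATTRIT_J = 0.15  # junior attrition rate
ATTRIT_S = 0.08  # senior attrition rate
HIRES = 20.0     # new junior hires per period

def step(junior: float, senior: float) -> tuple[float, float]:
    """Advance the workforce state by one period."""
    promoted = junior * PROMOTE
    junior_next = junior - promoted - junior * ATTRIT_J + HIRES
    senior_next = senior + promoted - senior * ATTRIT_S
    return junior_next, senior_next

# Iterate the system to see where the current rules are taking us.
state = (100.0, 50.0)
for t in range(1, 9):
    state = step(*state)
    print(f"period {t}: junior={state[0]:.1f}, senior={state[1]:.1f}")
```

Even a toy like this answers the "right direction, right speed" question: iterating the rule shows whether the current promotion, attrition, and hiring rates carry the pipeline toward the shape you need, and how fast.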

First off, I should absolutely admit that HR is not made up of a bunch of mathematicians. This should be no surprise to anybody. However, the analytical arms of the HR organization, those who generate analytics and create meaning out of data, should have some ability to view the data quantitatively and understand trending, directionality, velocity, and the general parameters of the vector. I hate taking us all back to high school calculus, but remember that the first derivative of a curve is its slope. This is the general trend of what our workforce metric is doing. The second derivative, however, tells us whether that trend itself is shifting. In other words, we might know that our overall turnover rate is dropping, and that is a positive sign. But do we know whether the turnover rate is dropping more slowly than it was last quarter, even though the rate is better than it was before?
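On quarterly data the two derivatives are just first and second differences. The sketch below uses a made-up turnover series to show exactly the situation described above: a rate that falls every quarter while the improvement itself slows down.

```python
# Quarterly turnover, % of workforce -- figures are illustrative.
turnover = [12.0, 10.5, 9.5, 9.0, 8.8]

# First difference ~ first derivative: the trend.
first_diff = [b - a for a, b in zip(turnover, turnover[1:])]
# Second difference ~ second derivative: change in the trend.
second_diff = [b - a for a, b in zip(first_diff, first_diff[1:])]

print("trend (first difference): ", first_diff)
print("trend change (second diff):", second_diff)
# Every first difference is negative: turnover drops each quarter.
# Every second difference is positive: each drop is smaller than
# the last, so the improvement is decelerating.
```

Reporting only the headline number ("turnover is down again") would miss the second line entirely, and that line is where the early warning lives.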

While most of HR is not in the mathematics field, and most of us don’t want to be anywhere near it, we should be applying some of these ideas to our analytics. If we look at any of our programs, we can see acceleration of desired outcomes after specific events, and deceleration of outcomes after we have gone a while without any change management or communications. We should be looking at competencies not just as a growth metric, but as an acceleration curve over time. Growth is good, don’t get me wrong, but acceleration is better.

I’d like to think that all the HR data analysts out there are taking a serious look at the data they present, trying to create and extend meaning beyond what is being requested, but today’s reality is quite mundane, even with all the cool business intelligence and dashboard technologies out there.