Actionable Data

It seems like everyone is talking about “Data” and “Data Analysis”. Of course, what everyone wants is actionable data!
A “play it where it lies” approach treats any data set as a pure math problem: numbers to be crunched and fed into statistical algorithms, with only a loose reminder of their connection to the real world. The result is answers without conviction, UNactionable data.
Actionable data always has an explicit logical chain of thought beginning with first-hand observation of something. Then we sharpen our observations with attribute or variable measurement by making a systematic comparison to an objective standard. (We count and quantify.)
With physical characteristics this is often straightforward, but do we all agree on the objective definition of: a NEW customer? an EXCELLENT customer service call? full COMPLIANCE to policy or contract deliverables? a DEFECTIVE part? an IMPROVED process? an OUTAGE in our IT systems?
With even the best measurement system in place, we still have two recurring measurement quality issues: outliers and missing data. Have we investigated outliers as opportunities to learn and integrate new information, or do we pretend that they don’t exist? And what about missing data? Some missing values can be interpolated, but others should be treated as outlier information in their own right.
GPS position and location tracking data might be used to extrapolate an in-between location and time, but missing refrigerator temperature data might indicate an equipment malfunction or failure!
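To illustrate the distinction, here is a minimal Python sketch. The readings, thresholds, and function names are all invented for illustration: a single gap in a smooth position track can reasonably be filled in, while gaps in monitoring data are flagged as events to investigate rather than hidden.

```python
# Hypothetical readings; None marks a missing data point.
positions = [0.0, 1.1, None, 3.0, 4.2]       # smooth track: safe to interpolate
fridge_temps = [3.8, 4.0, None, None, 4.1]   # monitoring data: gaps may signal a fault

def interpolate(series):
    """Fill single-point gaps by averaging the neighbors (linear interpolation)."""
    out = list(series)
    for i, v in enumerate(out):
        if v is None and 0 < i < len(out) - 1 \
                and out[i - 1] is not None and out[i + 1] is not None:
            out[i] = (out[i - 1] + out[i + 1]) / 2
    return out

def flag_gaps(series):
    """Treat missing readings as events to investigate, not numbers to fill in."""
    return [i for i, v in enumerate(series) if v is None]

print(interpolate(positions))   # the GPS-style gap is filled in between its neighbors
print(flag_gaps(fridge_temps))  # the temperature gaps are reported, not papered over
```

Note that the two consecutive missing temperatures would not even qualify for simple interpolation, which is one more hint that something other than a data-entry slip is going on.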
If, without grouping errors, we correctly begin to reduce good data with descriptive statistics, then human ideas and abstractions will emerge from the numbers. We begin to speak confidently and describe accurately the typical “new customer”, “service call”, “warranty return” and so on; and we can graphically illustrate the analytical results that back up these sweeping generalizations.
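As a small illustration of a grouping error, the call durations below are invented: pooling two different kinds of support calls produces a “typical call” that describes nothing real, while separating the groups recovers a meaningful summary of each.

```python
from statistics import mean, median

# Hypothetical call durations in minutes; two kinds of calls pooled into one data set.
quick_calls = [2, 3, 2, 4, 3]          # quick feature-training calls
install_calls = [28, 32, 30, 29, 31]   # installation and configuration calls

pooled = quick_calls + install_calls
# The pooled mean (16.4 min) describes no actual call: a grouping error.
print(f"pooled mean: {mean(pooled):.1f} min")

# Separated, each group has a believable "typical" call.
for name, group in [("quick", quick_calls), ("install", install_calls)]:
    print(f"{name}: mean {mean(group):.1f} min, median {median(group):.1f} min")
```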
Our confidence in these conclusions rests on our ability to trace all the way back to what could be seen, heard and experienced, and that is what makes this data actionable.
Changing How We Think About Work
There are many excellent points in this LinkedIn slide presentation, and I especially like the arrangement of these popular methods across time.
(See/Search: “Continuous improvement, encompassing Lean, Kaizen, and Agile methodologies, advocates ongoing excellence through gradual refinement….” on LinkedIn)
There are three things that I would add from my experience.
First, there is a tendency to equate quality with performance. Quality is value as seen by the customer. I have made this distinction in other articles and there are several ways to see this.
a. The project management connection: value = function(performance, time, cost, risk). This integrates project management and process management as the two basic skills of “tactical management”. (Management of available means to an end.)
b. There is the relationship of “performance”, “specs”, and “targets” to sales and marketing. “People don’t want to buy a quarter-inch drill. They want a quarter-inch hole!” –https://quoteinvestigator.com/2019/03/23/drill/
c. The measure is not the metric. Space inside a cardboard box can be measured by length, width, and depth in cubic inches; or one can count the number of unbroken cookies that will fit securely in the box. (“Cookies or crumbles?”)
Second, these various “schools” of quality are just that. They are different approaches to teaching the same or similar mental skills: experimentation, basic measurement and statistics, cause-and-effect thinking, organizing and planning, conceptualizing and then drilling down to details, and so on.
Whether one teaches root cause analysis with fishbone diagrams, 8 steps, 5 whys, deductive logic, or Aristotelian causality, the end skill is the ability to trace a causal chain back to choices one could have made and can make in the future.
Finally, when strategic decision makers provide new facilities, new tools, and new processes, performance expectations go up.
For tacticians, the driver of performance improvement is new knowledge about the existing facility. Experimentation techniques are “hunting” for the causes that keep us from hitting the target every time. Control chart techniques are patiently “fishing” for causes. In a sea of countless tiny independent causes of variation, we watch and wait for a cause that surfaces above the noise. That “out of control outlier” tells us when and where to look and learn, share, teach and improve.
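A minimal sketch of the “fishing” idea, using an individuals (X) control chart with the conventional moving-range estimate of sigma; the measurements below are invented for illustration.

```python
from statistics import mean

# Hypothetical individual measurements; most variation is noise from countless tiny causes.
samples = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 9.7, 10.0, 12.5, 10.1]

# Individuals chart: estimate sigma from the average moving range (MR-bar / d2, d2 = 1.128).
moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
center = mean(samples)
sigma_hat = mean(moving_ranges) / 1.128
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

# A point beyond the limits "surfaces above the noise": it tells us when and where to look.
signals = [(i, x) for i, x in enumerate(samples) if not (lcl <= x <= ucl)]
print(f"limits: [{lcl:.2f}, {ucl:.2f}]  signals: {signals}")
```

The moving-range estimate is the standard choice here because an overall standard deviation would be inflated by the very outlier we are fishing for.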
We naturally expect that large capital outlays and clever engineering should result in better product performance. What one hundred years of these quality management tools teach us is that changing how we think about work can result in just as big an improvement.
It’s Hard To Make Things Easy…

I have often been asked how I came into the field of consulting, and how that career path forked and diverged into successful projects in several manufacturing and service industry companies. It began when I went to work with James Abbott. I was an early hire for his training organization and continued to work with him as we transitioned into a training and consulting group.
James had a vision. He saw the training and professional development need that every company has, and he decided to address that need with in-depth training seminars on a wide variety of subjects.
I had an academic background in physics and math and a strong work background in Information Technology. I have always had an appetite for mastering new material, and I told James that I would come up to speed in any training material he wanted to offer. This turned out to be a perfect fit for my personality and skill set!
The Secret To Great Training
We worked hard to develop and organize the training materials in proper order from the basic to the advanced. Our course design approach was to always establish a context for new ideas. We often reminded each other that counting comes before arithmetic, arithmetic before algebra, and algebra before calculus.
This conceptual approach allowed us to effectively present a large amount of material in a compressed period of time. When community colleges began teaching computer skills classes, their instructors took two days of our material as a template for a semester-long course.
What Students Said
James has a saying, “It is easy to make things hard and complex. It is hard to make things easy and simple!” This principle was obvious to me after every seminar that I taught. The student reviews typically fell into two categories.
I was disappointed with reviews like this: “The material was very advanced. I learned a lot. The instructor was an expert in the material and presented it very well.” Instead, I wanted to see reviews like this: “The class was easy. The time went by fast. I think I may have already known a lot of the material beforehand.”
With the same subject, similar audiences reported different experiences. I could always account for this particular difference by how I organized the material. I could tell that proper order was one important key to participants “getting” the material. (Arithmetic Before Algebra!)
The Principle Applied to Process Improvement
Any business, service industry, or manufacturing process can improve if we learn something new and integrate it into the context of our existing knowledge. New process tools, knowledge, and metrics are the principal drivers of improvement. Every process will degrade when we start to forget and let known, manageable factors return to the dark unknown.
Learning and sustainment begins by conceptually building that contextual foundation. What do we know? When we start a consulting project, we often encounter this: “It’s all the same. A call is a call. An order is an order. A cast iron part is a part. It’s pretty obvious.”
When pressed for details on how things work and how tools are used, this posture then flips and becomes: “Our processes are so complex. You wouldn’t understand. No two calls in the call center are the same. No two customer manufacturing orders are the same. No two system failures are the same.”
We call this retrenching position the “no two snowflakes are alike” response, and we then explain what we are looking for:
(1) A difference in degree that is a difference in kind: For example, 2- or 3-minute calls in the support center versus 30-minute calls. “Short calls” versus “extended calls”, or “quick feature training calls” versus “software installation and configuration calls”. These might be approached as two different processes.
(2) In manufacturing, we might want to think about similar products made from copper or aluminum versus wood. These might be considered as two processes if the end result differs only in material. “Soft Metal” versus “Woodworking”.
Is This Too Simple?
When beginners are first taught these process concepts, they object that this seems too simple. It is simple, once this “factor analysis” has been done!
Is This Too Hard?
Charles Hobbs was an important thought leader in the field of Time Management, and he once told a parable of a woman who found purpose when she was encouraged to start by examining what was in her own back yard. She found a stepping-stone rock near her back door and began to ask herself, “What kind of rock is this? What else can it be used for? What is its mineral content?” After a time, she became quite knowledgeable about minerals, and when she felt that she had reached a goal she asked for advice again. Her mentor replied: “What was under the rock?”
When even experienced engineers grasp the implications of recursively drilling down into processes to learn more, to “look under the rock”, they often experience a moment where it all seems never-ending and too hard.
Simple is not the same as easy. Creating a culture of learning in yourself and in an organization is hard work. However, a culture of learning results in better decisions, better products and services, lower costs, and hopefully a virtuous circle of continuous improvement.