Collecting data so you can use it

By Nicholas Zeisler, CCXP posted 09-13-2022 10:52 AM


I once worked with a client who had a somewhat complicated concern about survey data:  Every time they interacted with a Customer, they’d send out a survey invitation.  That was all well and good, but they never knew (does anybody?) when they’d get a response.  They put a time limit on the survey (the invitation expired after 30 days), but they knew their Customers were busy, so it might be a couple of weeks before someone got around to filling it out.

That dynamic made reporting a bit trickier because the results coming in on any given day didn’t necessarily reflect their performance on that day.  Because responses would trickle in over the course of several weeks to a month, the picture and insights from one day’s experiences would be diluted and confused with those of later days.  Now, while in some situations this may seem a trivial concern (what difference would it make, for example, if the Monday results got mixed in with the Tuesday and Wednesday results?), I’d been working with this group for a while, and they’d embraced my admonition to act on the insights they were gathering from their Customers.  So in this case, the changes they were making should show up in the results of their VoC program...keeping each survey result aligned with what was going on at the time of the encounter it represented was important.

The solution was simple:  Keep sending out surveys the same way, but when reporting and analyzing the results, always tie them to the event that triggered the survey rather than categorizing them by when the responses came in.  In practice, this meant that the actual results (“How are we doing?”) would be a bit fluid for a while as survey responses continued to arrive.  For any given day, the results for those incidents would be pretty well established about a week later (once a big chunk of the responses had been received), but they could still vary a little for a few more weeks as more of that day’s surveys continued to come in.  Most responses came in sooner, but for a clearer picture, they also wanted to include the ones that came in later.  The more comprehensive view was worth the little bit of temporary fluctuation.  (We also considered changing the 30-day time-out on the survey, but decided against it so as not to limit the Customers’ options in responding.)
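
To make that concrete, here’s a minimal sketch in Python of what bucketing responses by the triggering interaction’s date looks like.  The dates, scores, and field names are all made up for illustration; this isn’t the client’s actual data or schema.

```python
from collections import defaultdict
from datetime import date

# Hypothetical survey responses: each one records the date of the original
# Customer interaction (the survey trigger) and the date the completed
# survey actually arrived.  Dates, scores, and field names are made up.
responses = [
    {"interaction_date": date(2022, 9, 1), "received_date": date(2022, 9, 3),  "score": 9},
    {"interaction_date": date(2022, 9, 1), "received_date": date(2022, 9, 8),  "score": 7},
    {"interaction_date": date(2022, 9, 1), "received_date": date(2022, 9, 25), "score": 4},
    {"interaction_date": date(2022, 9, 2), "received_date": date(2022, 9, 4),  "score": 8},
]

def scores_by_interaction_date(responses, as_of):
    """Average score per interaction date, counting only responses that
    have been received by `as_of`.  Re-running with a later `as_of`
    shows a day's number settling as the stragglers come in."""
    buckets = defaultdict(list)
    for r in responses:
        if r["received_date"] <= as_of:
            buckets[r["interaction_date"]].append(r["score"])
    return {day: sum(s) / len(s) for day, s in sorted(buckets.items())}

# The September 1 figure shifts as the late response lands:
print(scores_by_interaction_date(responses, as_of=date(2022, 9, 10)))
# Sept 1 average = 8.0 (two responses in so far)
print(scores_by_interaction_date(responses, as_of=date(2022, 9, 30)))
# Sept 1 average ≈ 6.67 (all three responses now counted)
```

With a later as_of date, the September 1 average shifts as the late response lands...exactly the temporary fluidity the team was prepared to live with.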

Then a revelation occurred that really set me back on my heels:  The boss didn’t like it.  In fact, some of the leadership team was leery of even broaching the subject of changing the way the metric was reported, knowing he wouldn’t like it.  Without getting too far into it, the boss was one of those bosses with particular proclivities when it came to how he was shown data.  The idea that the number we showed him one week (that’s how frequently they were reporting to him) might look different the next was a complete non-starter.  In short, the leader was an idiot.  Sorry, but it’s true.  (Fair disclosure here:  There were many other data points I won’t get into here that pointed toward that assessment of him.)  Worse, his staff was enabling his idiocy by not pressing the case for making this change.

What was going on here was a very dramatic demonstration of people using their CX KPIs for no reason other than reporting.  In fact, since the original way of reporting the metric made it impractical to use from an operational improvement perspective, the data wasn’t even actionable.  Never mind the culture the boss was promoting, which kept the numbers from being used properly (and effectively); the actual day-to-day work was being impacted.

In the end, the team decided simply to record the original date of the interaction with the Customer alongside the received date of the survey response in the database.  That allowed them to sort easily and (deep breath) keep a sort-of off-the-books record of what was really going on so they could get their jobs done.  It was an imperfect solution (they kept reporting the ‘wrong’ way to the boss, but kept things straight when they used the insights to make improvements), but it worked for them for as long as my engagement with them lasted.
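
I wasn’t privy to their actual database, but the dual-date idea looks something like this sketch (SQLite via Python’s standard library; the table and column names are hypothetical):  one record carries both dates, and the same table feeds both reporting views.

```python
import sqlite3

# A sketch of the dual-date record (hypothetical table and column names):
# every response carries both the interaction date and the received date,
# so one table serves both reporting views.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE survey_responses (
        interaction_date TEXT,   -- when the triggering encounter happened
        received_date    TEXT,   -- when the response actually came in
        score            INTEGER
    )
""")
conn.executemany(
    "INSERT INTO survey_responses VALUES (?, ?, ?)",
    [
        ("2022-09-01", "2022-09-03", 9),
        ("2022-09-01", "2022-09-25", 4),
        ("2022-09-02", "2022-09-04", 8),
    ],
)

# The 'official' report the boss saw: results grouped by received date.
official = conn.execute("""
    SELECT received_date, AVG(score) FROM survey_responses
    GROUP BY received_date ORDER BY received_date
""").fetchall()

# The operational view the team used: the same rows grouped by the
# date of the interaction that triggered each survey.
operational = conn.execute("""
    SELECT interaction_date, AVG(score) FROM survey_responses
    GROUP BY interaction_date ORDER BY interaction_date
""").fetchall()

print(official)     # [('2022-09-03', 9.0), ('2022-09-04', 8.0), ('2022-09-25', 4.0)]
print(operational)  # [('2022-09-01', 6.5), ('2022-09-02', 8.0)]
```

The nice part of this arrangement is that nobody has to maintain two datasets; the ‘second set of books’ is just a different GROUP BY over the same records.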

I’m not working with that group anymore (not because of that; the engagement simply ended), but I keep in touch with some of the team members from time to time.  The boss is still the boss, the ‘second set of books’ is still there, and improvements are still being made based on those insights.  All’s good in the world, in some sense.  But the funny part is, the boss isn’t really aware why.  I’m not sure how much longer this can last...one team member described it as rearranging deck chairs on the Titanic...but at least those responsible for getting things done are, well...getting things done.

(Originally Published 20220913)

– LtCol Nicholas Zeisler, CCXP, LSSBB, CSM
– Fractional Chief Customer Officer/Principal, Zeisler Consulting
#2022
#VOCCustomerInsightUnderstanding