Here is a warning about customer service surveys: if you pipe up, you might want to expect a follow-up phone call! Unfortunately, most surveys are collected and not acted upon. They are a measure of progress or service that most businesses and organizations perform as part of their ongoing attempts at improvement. I know; I spent the last two semesters of graduate school reading and thinking about evaluation – much of which included the use of surveys. I even designed three of them.
But what I found out from my classmates, many of whom work for the USFWS, a park system, or a school, is that survey data is often overlooked and rarely used to make decisions, let alone improvements. This, in fact, is something I have championed as a cause in our school district: if you collect the data, you must look at the data and use it to improve. For years, this was not done; in fact, the data was kept from the very teachers who were asking questions about their students, their progress, and potential system-wide improvements. Just last year (the fall of 2016), I asked about data our district kept on students dropping or failing courses taught in particular departments or by particular teachers. I strongly got the impression that this data was not looked at, even though it was collected and kept. Certainly, it was not available as general information to the public or even to the teaching staff. It seems that data is kept under lock and key until, perhaps, the “appropriate spin” can be put on it. As any statistician will tell you, the numbers can be made to look like anything one wishes them to resemble. My husband experienced this with a research job he had in medical school, and my oldest son had some experience with it recently in graduate school.
If one is careful enough to design a well-thought-out evaluative survey – and to be sure, there are many components to consider – one should be able to put the data to good use. In other words, the data should be aimed at improvements, whether that be improved student learning, improved teaching strategies, or improved customer service.
I had an experience with a customer service survey this week. One day after picking up contact lenses at a new local optometry business, I received a customer service survey. It was short. Brevity is a key component to getting people to answer surveys – one for which I am still trying to find the right balance between gathering information and keeping things short. Anyway, their survey was short: one question with a Likert scale and a drop-down box for feedback. Due to some issues, I rated my experience as a six! In the drop-down box, I explained why I gave such a score and what could have been done differently. A few hours later, I received a phone call from the business apologizing for my inconvenience and offering some explanations, but not excuses, as I was told. I was shocked! And totally unprepared for this call!
After a brief moment of discomfort, I recounted the experience that had prompted my lower-than-desired satisfaction score. In the end, it was a good conversation. The business had asked for feedback, and I had given it – honestly.
So, I am offering just a small word of caution: some organizations take their surveys seriously and do act upon the feedback. If you open that Pandora’s box, be prepared to stay afloat and justify your feedback. And kudos to this business for taking action on its evaluative survey!