Insights from Dennis – 11/21
In today’s Insight, Dennis walks you through the analysis of one of our real surveys and shows you how we interpret the results. First, some basic information about the survey in question:
Key data about the survey
Topic of the survey: the reasons why our customers cancel
Key questions: Was the customer satisfied? How long did the customer use the tool? Did the customer achieve their goal?
Channel used: the invitation to the survey is sent automatically by e-mail approx. 30 minutes after cancellation
Period: since August 2020
Target group: all German- and English-speaking customers across all pricing plans
Statistics
Now let’s take a look at the survey statistics together. The participation rate is particularly interesting here. In the selected example, 521 invitations were sent out, and 67 people took part in the survey, which corresponds to a response rate of 12.86 percent. This is a moderate response rate, but entirely within the norm for a survey on this topic.
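The response-rate arithmetic above can be sketched in a few lines of Python. The numbers come from the example; the helper function is our own illustration, not part of any easyfeedback API:

```python
def response_rate(invitations: int, responses: int) -> float:
    """Share of invited people who completed the survey, in percent."""
    if invitations <= 0:
        raise ValueError("invitations must be positive")
    return responses / invitations * 100

# Figures from the example above: 521 invitations, 67 participants
rate = response_rate(invitations=521, responses=67)
print(f"{rate:.2f} %")  # → 12.86 %
```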
In addition, the low dropout rate shows that the length and complexity of the survey were well chosen. No adjustment is needed in this respect.
Results
Now we move on to interpreting the results. The NPS question, i.e. the likelihood-to-recommend score, shows that it was mainly satisfied customers who took part in the survey. An interesting starting point would be to motivate dissatisfied customers to participate as well, for example by changing the invitation text, because negative feedback in particular points to opportunities for improvement.
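For readers unfamiliar with how the recommendation rate is computed: the standard NPS formula subtracts the share of detractors (scores 0–6) from the share of promoters (scores 9–10). A minimal sketch, with made-up scores for illustration:

```python
def nps(scores: list[int]) -> int:
    """Net Promoter Score on a 0-10 scale:
    % promoters (9-10) minus % detractors (0-6); passives (7-8) count
    toward the total but toward neither group."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round((promoters - detractors) / len(scores) * 100)

# 3 promoters, 2 detractors out of 7 answers → (3 - 2) / 7 ≈ 14
print(nps([10, 9, 9, 8, 7, 6, 3]))  # → 14
```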
The next step is to find out why these largely satisfied customers cancelled. For this purpose, it also makes sense to look at the duration of use. Doing so, we can see that it was mainly customers who used easyfeedback only for a short time who responded. This is positive information for us, because most of our customers are long-term customers and there are rarely any cancellations at that level.
The evaluation of the question on target achievement also gives us further important information. In the example chosen, 80 percent of customers stated that they had achieved their goal. One possible conclusion from this would be that the customers are satisfied with the tool and the results, but currently have no need for further surveys.
Furthermore, we are interested in whether users on monthly and annual plans differ. Looking at the results in this regard, we notice that annual plans are used twice as often for employee surveys.
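A comparison like this is essentially a cross-tabulation of plan against use case. One way to sketch it in plain Python — the response data below is invented for illustration, not the real survey export:

```python
from collections import Counter, defaultdict

# Invented example responses as (plan, use_case) pairs
responses = [
    ("annual", "employee survey"), ("annual", "employee survey"),
    ("annual", "customer survey"),
    ("monthly", "employee survey"), ("monthly", "customer survey"),
    ("monthly", "market research"),
]

# Count how often each plan appears per use case
crosstab = defaultdict(Counter)
for plan, use_case in responses:
    crosstab[use_case][plan] += 1

for use_case, counts in sorted(crosstab.items()):
    print(f"{use_case}: {dict(counts)}")
# In this toy data, "employee survey" appears twice as often on
# annual plans as on monthly ones, mirroring the pattern above.
```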
66 percent of users on annual plans said they had achieved their goals. Only one customer reported not meeting their goals. When asked, “Why didn’t you meet your goals?”, the customer indicated that the survey did not receive enough answers. The “why” question is important to us, because we can only intervene if we know the customer’s motivations. In this case, we might have been able to retain the customer if we had managed to build their trust in surveys. Another option would have been to provide them with valuable tips on how to increase their response rate.
Now let’s take a closer look at users on monthly plans. Here, the focus is more on customer surveys and market research. Five of the customers surveyed were students who used the tool for their bachelor’s or master’s thesis and classified themselves under the “other” usage area. It would be worth considering whether to add another answer option, “Studies”, to this question in order to capture this group more precisely.
Only one person stated that they had not achieved their goals, giving the reason that the tool was not flexible enough to use. This negative feedback is actually good feedback, because it gives marketing a concrete starting point for keeping users longer.
Customers who stated that they had only partially achieved their goals said this was not due to the tool but to external circumstances, such as an insufficient participation rate. Unfortunately, a high participation rate is often difficult to achieve in market research. But this information is also important for us, as it gives us a new starting point for providing further guidance, for example tips on how to increase the participation rate.
Interpretation
This confirms the thesis that people participate when they feel a bond with the survey author (satisfaction). Variants of the invitation should therefore be created to also receive negative feedback, and a plan should be developed to reach the people who were not satisfied, because that is where the gold of a survey lies.
Given the low dropout rate, the length and wording of the survey seem well chosen, so there is no need for adjustment here.
Fundamentally, customers cancel because they no longer have a need. An important learning for us is therefore to awaken that need in our customers.
Employee surveys take place mainly on annual plans. An important question is therefore how we can move more customers onto annual plans. One possibility could be to develop new employee-survey templates or to promote the benefits of employee surveys more strongly.
Only 2.98 percent of customers did not reach their goal. This shows that we are well positioned in terms of the information we provide and the functions we offer. Customers who only partially achieved their goals were held back not by the tool but by external influences, which also explains why they were mostly satisfied. One way to intervene here could be to provide even more tips on how to conduct surveys.
Conclusion
The tool seems to meet the functional requirements. Providing more information on how to run surveys would potentially have a positive impact on reputation and contract length.
The goal should be to promote existing use cases, such as employee surveys, more prominently and to communicate their benefits more strongly. Another idea is to create text variants for the survey invitation in order to obtain more negative feedback.
That was today’s Insight with a look behind the scenes of easyfeedback. We hope you enjoyed it and were able to take away a few ideas for evaluating your next survey. For more tips on evaluating and analyzing your next customer survey, feel free to read our blog article.
More on the topic of tips & tricks for surveys:
- Survey Tool
- Formulating texts and questions when creating your questionnaire
- 7 proven practical tips for creating your next questionnaire
- Negative feedback after a survey: how to best deal with it!
- Why you shouldn’t ask as many questions as you like in your survey
- Tips to motivate participants to take part in the survey
- Video: 8 tips for building your questionnaire
- Video: Win unmotivated participants for your survey
- Video: How many questions are useful in a survey
- Video: Derive measures from the results of a survey
- Video: How do I interpret results from a survey
- Webinar: How to recruit suitable survey participants and motivate them to participate