Problem: Unclear objectives
Data collection projects often start without clearly defined objectives or with vague questions.
In such cases, either too much, too little, or the wrong data is collected.
This leads to confusion, additional work during analysis, or even completely unusable results.

Solution:
• Formulate specific goals and questions, e.g.: “How satisfied are customers with the support process?”
• Create a survey concept (see the sketch after this list) that defines:
  • What exactly should be measured?
  • Who is the target group?
  • How will the success of the survey be evaluated?
• Use methods such as decision trees or hypothesis models to sharpen the focus.
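One lightweight way to pin such a concept down is to write it out as structured data before building any questionnaire. The sketch below is purely illustrative; the class and field names are assumptions, not a prescribed format:

```python
from dataclasses import dataclass, field

@dataclass
class SurveyConcept:
    """Illustrative survey concept; all field names are hypothetical."""
    objective: str       # what exactly should be measured?
    target_group: str    # who is the target group?
    success_criteria: list[str] = field(default_factory=list)  # how is success evaluated?

concept = SurveyConcept(
    objective="Customer satisfaction with the support process on a 5-point scale",
    target_group="Customers with a closed support ticket in the last 6 months",
    success_criteria=[
        "At least 300 completed questionnaires",
        "Average satisfaction score reported with its margin of error",
    ],
)
print(concept)
```

Writing the concept down in this form forces the three questions above to be answered explicitly and makes them easy to review before any data is collected.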
Problem: Unsuitable survey methods
If the survey method does not suit the objective or the target group, gaps and biases creep into the data.
For example, an online survey of people with low digital literacy may fail to deliver the desired data, not because of a lack of interest, but because of the method chosen.
Solution:
• Choose the method that suits your target group:
  • Online surveys for digitally savvy individuals
  • Telephone or face-to-face interviews for more complex topics
  • Observations in behavioral studies
• Use a mixed-methods approach to draw on both quantitative and qualitative strengths.
• Test the method in a pilot study before conducting a broad survey.
Problem: Low participation rates
Many surveys fail due to low response rates.
Reasons for this include lack of motivation, questionnaires that are too long or complicated, lack of anonymity, or simply disinterest.
The result: non-representative data.
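Low participation hurts in two ways: respondents may differ systematically from non-respondents (non-response bias), and the smaller sample widens the statistical uncertainty around every result. The rough sketch below, using purely hypothetical numbers and assuming a simple random sample, illustrates the second effect:

```python
from math import sqrt

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion from a simple random sample."""
    return z * sqrt(p * (1 - p) / n)

invited = 2000                   # hypothetical number of invitations
for rate in (0.30, 0.10, 0.05):  # illustrative response rates
    n = int(invited * rate)
    print(f"response rate {rate:.0%}: n = {n}, margin of error ±{margin_of_error(n):.1%}")
```

With 2,000 invitations, a 30% response rate still gives roughly ±4 percentage points of uncertainty, while a 5% response rate pushes it close to ±10 points, and that is before any non-response bias is considered.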

Solution:
• Make the questionnaire user-friendly: a maximum of 5–10 minutes to complete, plain language, and a clean design.
• Communicate transparently: Why is the survey being conducted? What happens to the data?
• Offer incentives (e.g., participation in prize draws, vouchers).
• Timing and channel selection are crucial: for example, send surveys shortly after customer contact or via the target group's preferred communication channels.
Problem: Distorted or incorrect answers
Even when people participate, they do not always give correct or honest answers.
Reasons for this include social desirability (e.g., on the topic of sustainability), misunderstandings, or simply a lack of concentration.
Solution:
• Use neutral and unambiguous wording; avoid leading or double-barreled questions.
• Use scales with clear anchor points (e.g., “1 = very dissatisfied” to “5 = very satisfied”).
• Add anonymous response options for sensitive topics.
• Conduct pre-tests to identify any ambiguities in the questionnaire in advance.
• Randomize the order of answer options in multiple-choice questions to counteract response patterns (see the sketch below).
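A per-respondent shuffle of the answer options is usually something the survey tool can do for you; if you need to do it yourself, a minimal sketch could look like the following (question and option texts are hypothetical, and ordered rating scales such as the anchored scale above should keep their fixed order):

```python
import random

# Hypothetical, unordered answer options for "How did you contact our support?"
OPTIONS = ["E-mail", "Phone", "Live chat", "Contact form"]

def shuffled_options(respondent_id: str) -> list[str]:
    """Return the options in a random but reproducible order for each respondent."""
    rng = random.Random(respondent_id)  # seed with the respondent ID for reproducibility
    return rng.sample(OPTIONS, k=len(OPTIONS))

print(shuffled_options("resp-001"))
print(shuffled_options("resp-002"))
```

Seeding with the respondent ID keeps the order stable if the same person reopens the questionnaire, while still varying it across respondents.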
Problem: Technical issues & privacy concerns
Poorly optimized tools, loading problems, or lack of accessibility quickly lead to dropouts.
Likewise, uncertainty about data protection (“What happens to my answers?”) can reduce willingness to participate.

Solution:
• Choose reliable, GDPR-compliant tools such as easyfeedback that work on all devices.
• Test the technical implementation in advance: mobile devices, different browsers, low bandwidth, etc.
• Explain clearly and transparently how the data will be used, stored, and protected.
• Offer the option of anonymous participation if no personal data is required.
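If participation is meant to be anonymous, it helps to strip identifying fields before responses are stored or evaluated. The following is only a sketch with hypothetical field names, not a substitute for a proper data protection review:

```python
# Fields that could identify a respondent (hypothetical names).
PERSONAL_FIELDS = {"name", "email", "ip_address", "customer_id"}

def anonymize(response: dict) -> dict:
    """Return a copy of a survey response with personal identifiers removed."""
    return {key: value for key, value in response.items() if key not in PERSONAL_FIELDS}

raw = {
    "email": "jane.doe@example.com",
    "ip_address": "203.0.113.7",
    "satisfaction": 4,
    "comment": "Quick and friendly support.",
}
print(anonymize(raw))  # only 'satisfaction' and 'comment' remain
```

Dropping identifiers at the point of storage is the simplest option; if responses must later be linked, a pseudonymous key stored separately from the answers is the usual alternative.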
Conclusion
Data collection is a demanding but crucial process.
Only when goals are clearly defined, methods are well chosen, and participants are actively involved can usable and reliable data be generated.
Companies and organizations that are aware of typical problems and take steps to prevent them gain a real knowledge advantage.
With professional planning and regular evaluation, data quality can be significantly improved—and with it, the success of data-based decisions.