Customers do not ignore surveys because they do not want to share feedback. They ignore them because surveys have trained them that feedback goes nowhere. Here is how to build surveys worth completing — and a feedback loop worth trusting.
Why Most Customer Surveys Get Ignored
A survey that arrives two weeks after a purchase. A survey with 20 questions when you need five. A survey asking about things you have already decided. A survey whose results are never communicated back. Customers learn the pattern quickly: surveys are theater, not genuine listening.
The antidote is simple in principle and hard to execute consistently: send surveys at the right moment, keep them short enough to complete in two minutes, ask only what you will genuinely act on, and then close the loop by showing customers their feedback changed something.
Timing: When to Ask
The 48-hour rule
Send surveys within 24–48 hours of the experience you are measuring. Purchase? Send within 24 hours. Support ticket closed? Send within an hour. Event ended? Send the next morning. The experience needs to be vivid and emotions still engaged. Response rates fall 20–30% for every day of delay past 48 hours.
Best practice — trigger-based sending: Do not send surveys on a schedule. Send them triggered by customer actions. When a support ticket closes, queue a survey automatically. When a purchase completes, send the next morning. Triggered surveys consistently get 2–3x better response rates than batch sends.
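Trigger-based sending is, at its core, an event-to-delay mapping. A minimal sketch (the event names, delays, and job fields here are illustrative, not any specific survey tool's API):

```python
from datetime import datetime, timedelta

# Delay per trigger event, following the timing guidance above.
# These names and values are illustrative placeholders.
SURVEY_DELAYS = {
    "support_ticket_closed": timedelta(hours=1),
    "purchase_completed": timedelta(hours=18),  # roughly "next morning"
    "event_ended": timedelta(hours=12),
}

def schedule_survey(event_type, customer_email, now=None):
    """Return a send job for a survey triggered by a customer action,
    or None if no survey is configured for this event type."""
    now = now or datetime.now()
    delay = SURVEY_DELAYS.get(event_type)
    if delay is None:
        return None
    return {"to": customer_email, "send_at": now + delay, "trigger": event_type}
```

The point of the structure: the survey is queued by the event itself, never by a calendar, so every customer is asked while the experience is still vivid.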
Times to avoid:

- Monday mornings — inbox overload
- Friday afternoons — mentally checked out
- Holidays and quiet periods
- During known product issues
- Right after complaints or escalations
- Right after a price increase

Times that work:

- Tuesday through Thursday
- Mid-morning (9–11am) or mid-afternoon
- Immediately after a positive interaction
- Same day as or morning after purchase
- One hour after a support ticket closes
Length: How Much Is Too Much
The 5-minute ceiling
Response rates fall sharply beyond 5 minutes. A 3-question survey might see 40% completion. A 10-question survey drops to 15%. A 20-question survey to 5%. Length kills participation more reliably than any other factor — more than topic, incentives, or channel.
This does not mean avoiding depth. It means being ruthless about what makes the cut. If a question will not change a decision or inform an action, it should not be in the survey.
The wrong approach: creating a survey because you have 30 questions you are curious about, then adding a progress bar to make it feel shorter. Customers feel the length regardless.

The better approach: starting with the two questions you cannot live without. If you later discover you need more depth, send a second, separate survey to customers who indicated willingness to help.
Channel: Where to Reach Customers
Different channels suit different moments in the customer journey. Use the one that matches context — not the one that is easiest to set up.
| Channel | Response Rate | Best For | Key Tip |
|---|---|---|---|
| Email | 5–20% | Most businesses, post-purchase | Subject line matters. Be specific: "3 quick questions about your order" beats "We value your feedback." |
| In-app / website | 20–40% | SaaS, apps, any digital product | Survey after the user completes an action, not during. Maximum 2–3 questions. Always dismissable. |
| SMS | 30–60% | Opt-in SMS audiences, high-value customers | Only for opt-in lists. Keep to 1–2 questions maximum. Use a shortened survey link. |
| Post-purchase QR | 5–15% | Physical products, packaging inserts | QR code plus a very short URL. Maximum 2 questions. Small incentive helps (10% off next order). |
| Social media poll | Varies widely | Quick pulse checks, brand engagement | Treat as engagement tool, not primary research. One question only. Data skews toward active followers. |
Do not rely on a single channel. Email works for broad coverage. In-app catches customers in the moment. SMS reaches high-value customers immediately. Layer them based on context; each has different strengths and optimal use cases.
Questions: What to Actually Ask
The one-question survey
If you ask only one question, ask: "How likely are you to recommend us to a friend or colleague?" on a 0–10 scale. This single question correlates with retention, churn risk, and revenue growth more reliably than longer surveys. Customers who score 9–10 are promoters. Those who score 0–6 are at risk of churning or recommending against you.
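The arithmetic behind this question is the Net Promoter Score: the percentage of promoters (9–10) minus the percentage of detractors (0–6), with 7–8 counted as passives. A minimal calculator:

```python
def nps(scores):
    """Compute Net Promoter Score from a list of 0-10 ratings.

    NPS = % promoters (9-10) minus % detractors (0-6).
    Scores of 7-8 are passives: counted in the total, ignored otherwise.
    Returns an integer in the range -100 to 100.
    """
    if not scores:
        raise ValueError("no responses to score")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))
```

For example, `nps([10, 10, 9, 8, 7, 6, 5])` gives 3 promoters and 2 detractors out of 7 responses, for a score of 14.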
The three-question survey
For most customer survey use cases, three questions are optimal. One core metric — satisfaction or likelihood to recommend. One follow-up for context — "What is the main reason for your rating?" One action prompt — "What could we improve?" This structure takes a minute or two to complete and gives you quantitative data plus qualitative direction.
Question patterns to avoid

Leading questions ("How much did you love our fast shipping?"), double-barreled questions ("Was the product affordable and well made?"), internal jargon the customer may not share, and hypotheticals about features they have never used all distort answers. Ask about one thing at a time, in neutral language, about experiences the customer actually had.
Incentives: Should You Offer Them?
Incentives — discounts, prize draws, gift cards — boost completion rates by 10–30%. A $5 credit can push response from 15% to 25%. Whether that is worth it depends on your margins and survey value.
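Whether the incentive is worth it comes down to the cost of each additional response it buys. A quick sketch of that arithmetic, using the $5-credit example above (and assuming the incentive is paid to every respondent, as with a credit rather than a prize draw):

```python
def incentive_cost_per_extra_response(n_sent, base_rate, incentive_rate, incentive_cost):
    """Cost of each *additional* response bought by the incentive.

    Assumes every respondent receives the incentive (a credit or
    gift card, not a prize draw). Rates are fractions, e.g. 0.15.
    Returns infinity if the incentive buys no extra responses.
    """
    base_responses = n_sent * base_rate
    incentivised_responses = n_sent * incentive_rate
    extra = incentivised_responses - base_responses
    if extra <= 0:
        return float("inf")
    return (incentivised_responses * incentive_cost) / extra
```

With 1,000 surveys sent, a lift from 15% to 25%, and a $5 credit, you pay 250 respondents $1,250 for 100 extra responses: $12.50 per additional response. Whether that is a bargain depends on what a response is worth to you.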
Offer an incentive when: the survey is longer than 5 minutes, you are surveying B2B customers whose time is expensive, the topic is sensitive or involves complaints, or response rate is genuinely critical to the survey's validity.

Skip the incentive when: the survey is sent immediately after a positive interaction, it is under 3 minutes, you are surveying loyal customers, or it is an in-app survey where the customer is already engaged.
The most reliable response rate booster is not an incentive — it is timing. A well-timed 2-minute survey sent within an hour of a positive interaction will outperform an incentivised survey sent a week later.
Closing the Loop
This is where most businesses fail. They collect feedback, review it internally, and move on. Customers notice when nothing changes. They stop completing future surveys.
The fix is a simple cycle: collect feedback, act on at least one thing customers raised, then tell them what changed: "You said checkout was confusing, so we redesigned it." This cycle builds trust, and response rates improve with each iteration. Customers who saw their previous feedback result in a change complete the next survey with genuine engagement rather than resignation.
Ready to Survey Your Customers?
Create a free customer survey in under a minute. No signup required.
Create Your Survey Free →