Customer Interview
Purpose: Capture customer needs, pain points, and behaviors through structured conversation
How to run this meeting
The most important discipline in a customer interview is separating observation from interpretation. Your job is to understand the customer's world as they experience it — not to validate your assumptions. Resist the urge to pitch, explain, or defend your product. When a customer says something surprising, lean in with curiosity rather than correction. Aim for a ratio of 80% listening to 20% talking.
Ask open-ended questions that begin with "tell me about," "walk me through," or "describe a time when." Avoid leading questions like "Would you find it useful if…" — these prime customers to agree with you rather than share their actual experience. When a customer gives a short answer, use silence or a neutral prompt like "say more about that" to draw out richer detail. The most valuable insights often come after the first answer.
Designate one person to interview and one to take notes — the interviewer should not be split between listening and typing. Capture verbatim quotes whenever possible; paraphrasing introduces your own interpretation. With permission, record the session so you can return to exact wording. Save synthesis and pattern-matching for after the interview, not during it.
Before the meeting
- Define the research question this interview is designed to answer
- Write a discussion guide with 5-8 core questions (not a script — a guide)
- Share context with the note-taker: what to listen for, what terminology matters
- Confirm recording permission and set up a transcription tool if using one
- Review any prior interactions with this customer (support tickets, NPS responses, sales notes)
- Block 15 minutes after the call for an immediate debrief while details are fresh
Meeting Details
- Date:
- Facilitator:
- Attendees:
- Duration: 45–60 minutes
Interviewee Info
Background on who this person is and what context they bring. Capture role, company size, how long they've been a customer, and how they use the product.
- Name: Priya Nair
- Title: Engineering Manager
- Company: Foxridge Health (Series B, ~200 employees)
- Customer since: 14 months
- How they use the product: Daily — manages 3 squads, uses primarily for sprint planning and retrospectives
Goals
What specific questions are you trying to answer with this interview? Be concrete — "understand onboarding friction" is better than "learn about their experience."
- Understand how engineering managers track cross-team dependencies today
- Learn what triggers a decision to escalate a blocking issue
- Identify gaps between what they expect from tooling and what they actually use
Questions
Your discussion guide. These are prompts, not a rigid script — follow the conversation where it goes.
- Walk me through the last time a dependency between teams caused a delay. What happened?
- How do you currently keep track of what other teams are working on?
- When something is at risk of slipping, how do you decide when to escalate?
- What does your current process look like for the week before a release?
- Tell me about a time when cross-team coordination went really well. What made it work?
- If you could change one thing about how your teams communicate status today, what would it be?
Key Quotes
Verbatim quotes from the customer. Don't paraphrase — exact words preserve the customer's frame, not yours.
"I spend probably 40% of my week just pinging people to find out if something is blocked. It's exhausting."
"By the time something shows up in the standup, it's already a crisis."
"I don't trust the dashboards because I know people only update them when they're asked to."
Insights
What did you observe? Stay close to the facts here — what did they say, do, or show you? No interpretation yet.
- She tracks cross-team dependencies in a shared Notion doc she created herself; no one else on her teams uses it
- She mentioned checking Slack at 7am specifically to catch blockers before standup
- She described status tracking as "exhausting" or "tiring" three separate times
- She did not mention the product's dependency tracking feature until directly asked; she was unaware it existed
Opportunities
Now interpret. What might this mean for the product? Surface themes, tensions, and design implications.
- Early visibility is the core pain — the problem isn't that blockers go unknown, it's that they surface too late to act on
- Trust in data is a prerequisite: if the system isn't reliably updated, people will work around it
- The 7am Slack behavior suggests there's a real workflow around "start of day review" that isn't currently served
- Discoverability gap: the dependency feature may need a stronger entry point from the places she already works
Action Items
| Owner | Action | Due Date | Status |
|---|---|---|---|
| @priya.chen | Share interview recording and transcript in #research Slack channel | 2025-04-18 | Open |
| @marcus.webb | Add insights to the dependency tracking research synthesis doc | 2025-04-20 | Open |
| @priya.chen | Schedule follow-up session to show dependency feature and get reaction | 2025-04-25 | Open |
Follow-up
Send the customer a brief thank-you note within 24 hours — no need to share your notes, but acknowledge what you heard and let them know it was valuable. Post the full transcript and notes to your team's research repository so future team members can reference it. If this interview is part of a research sprint, debrief with the team within 48 hours to pool observations before individual memories start to fade.