Sprint Review & Demo

Purpose: Show working software to stakeholders and gather feedback at the end of each sprint

How to run this meeting

Demo working software, not slides. The only acceptable artifact for a sprint review is a live product or a recorded demo of the live product. Screenshots of designs, lists of completed tickets, and status updates are not demos — they deprive stakeholders of the feedback loop that makes sprint reviews valuable. If something isn't demoable, it wasn't done in any meaningful sense.

Invite stakeholders who have context on the goals but aren't in the day-to-day. Product managers, customer success leads, sales engineers, and occasionally customers make for the best sprint review audiences. Their feedback is different from the team's internal perspective and often surfaces misalignments early. Keep the attendee list focused — a 20-person sprint review becomes a presentation, not a conversation.

Connect each demo explicitly to the sprint goal. Before showing a feature, say: "This sprint we set out to [goal]. Here's what we built." After the demo, ask: "Does this achieve what we intended?" This makes the review a real assessment, not a show-and-tell. Capture feedback in the notes during the demo, not after — real-time capture prevents good insights from being lost.

Before the meeting

  • Confirm the sprint goal is written down and visible to all attendees
  • Verify each demo item works in a demo or staging environment — never demo in production
  • Assign a presenter for each demo item (the engineer who built it, ideally)
  • Prepare a short intro for each item: what problem it solves and who it's for
  • Pull the sprint metrics (velocity, burndown, completed vs. planned) before the meeting

Meeting Details

  • Date:
  • Facilitator:
  • Attendees:
  • Duration: 45–60 minutes

Sprint Goal Recap

Restate the sprint goal as it was set at the start of the sprint — not a post-hoc summary of what shipped. Did we accomplish it? Be honest.

Sprint 34 Goal: Give admins a complete view of workspace permissions so they can prepare for compliance audits without exporting data manually.

Outcome: Partially achieved. The permissions overview table and role filter panel shipped. The bulk action bar was scoped down mid-sprint due to a timezone data issue and moved to Sprint 35. Admins can now view and filter permissions; bulk changes are still manual.


Demo Items

For each item, capture: what was built, who presented, and what the key design decisions were. Link to the relevant issue or PR.

1. Permissions Overview Table — presented by @james. The new Settings → Permissions page shows all workspace users in a table with their role, assigned projects, and last-active date. Admins can sort by any column. Role badges use color coding for fast scanning. → Issue #1142

2. Role Filter Panel — presented by @james. A collapsible filter panel lets admins filter by role, project, or last-active date range. Filters are composable (you can apply multiple at once). An empty state is shown when filters return no results. → Issue #1143

3. Permissions Export (CSV) — presented by @ana. Admins can now export the current filtered view to CSV with a single click. The export respects active filters — export only what you see. The file name includes the workspace name and export date. → Issue #1144
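The export's file-naming convention (workspace name plus export date) could be implemented along these lines — a minimal sketch in which the `export_filename` helper and the slug format are illustrative assumptions, not the product's actual code:

```python
import re
from datetime import date

def export_filename(workspace: str, export_date: date) -> str:
    # Slugify the workspace name so it is safe to use in a file name
    # (lowercase, non-alphanumerics collapsed to hyphens).
    slug = re.sub(r"[^a-z0-9]+", "-", workspace.lower()).strip("-")
    return f"{slug}-permissions-{export_date.isoformat()}.csv"

print(export_filename("Acme Corp", date(2024, 12, 6)))
# acme-corp-permissions-2024-12-06.csv
```

Embedding the date in ISO format keeps exported files sortable, which matters when admins generate one export per audit cycle.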


Feedback

Capture stakeholder feedback in real time. Note who said it and whether it's a blocker, a suggestion, or validation.

  • @customer-success (Rena): "The filter by last-active date is going to be huge for deprovisioning workflows — customers ask about this constantly." ✓ Validation
  • @sales (David): "Enterprise prospects are going to ask whether this export is tamper-proof. Can we add a hash or signature?" → Suggestion — log for future consideration
  • @product (Priya): "The empty state copy is a bit generic — 'No results found' doesn't tell admins what to do next." → Blocker for shipping; needs update before launch
  • @engineering (Marcus): "Should we show a warning if the workspace has users with no role assigned?" → Good edge case — add to Sprint 35 backlog

Metrics

Report on sprint-level metrics: velocity, planned vs. completed, and any product metrics that are already trackable.

| Metric | Target | Actual |
| --- | --- | --- |
| Story points completed | 34 | 28 |
| Issues closed | 11 | 9 |
| Bulk action bar (carry-over) | Shipped | Moved to Sprint 35 |
| Time to generate CSV export (p95) | < 3s | 1.4s ✓ |
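The planned-vs-completed and p95 figures are straightforward to compute from raw sprint data. A minimal sketch — all variable names and sample numbers below are illustrative, not the sprint's actual measurements:

```python
import math

# Hypothetical sprint data; values are illustrative only.
planned_points = 34
completed_points = 28

# Planned-vs-completed, expressed as a completion rate.
completion_rate = completed_points / planned_points
print(f"Completed {completed_points}/{planned_points} points ({completion_rate:.0%})")

# p95 latency from observed CSV export durations (seconds),
# using the nearest-rank method: the value at position ceil(0.95 * n).
export_times = sorted([0.8, 0.9, 1.0, 1.1, 1.2, 1.3, 1.4, 1.5, 0.7, 1.0])
p95 = export_times[math.ceil(0.95 * len(export_times)) - 1]
print(f"CSV export p95: {p95:.1f}s")
```

Computing these before the meeting (rather than reading them off a dashboard live) keeps the metrics segment short and avoids demo-environment surprises.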

Lessons Learned

What would the team do differently? What went well? Keep this honest — it feeds into the retrospective.

  • Went well: Splitting the permissions table and filter panel into separate issues made it easy to parallelize — @james and @ana worked in parallel without conflicts
  • Could improve: The timezone data inconsistency wasn't discovered until mid-sprint; an earlier spike would have caught it. Consider a 1-day investigation task at sprint start for any feature touching timezone data
  • Process note: Demo environment was unstable at the start of the meeting — add "verify demo env" to the sprint review prep checklist

Action Items

| Owner | Action | Due Date | Status |
| --- | --- | --- | --- |
| @design | Update empty state copy for no-results filter view | 2024-12-09 | Open |
| @priya | Add tamper-proof export to future roadmap considerations | 2024-12-13 | Open |
| @marcus | Add "users with no role" edge case to Sprint 35 backlog | 2024-12-06 | Open |
| @james | Move bulk action bar issue to Sprint 35 with updated scope | 2024-12-06 | Open |

Follow-up

Post the sprint review notes to the team channel within 24 hours. Tag any stakeholders whose feedback is being acted on. Link the notes from the sprint tracking board. Unresolved feedback items should be triaged into the backlog before the next sprint planning session. The PM owns communicating sprint outcomes to leadership.
