How to Survey Conference Attendees in Real Time
2026-03-29
Why post-conference surveys fail
The response rate for post-conference surveys sent by email is typically 10–20%. That means 80–90% of attendee experience data is never captured. What you receive is self-selected — usually from the most satisfied and the most dissatisfied, with the median experience underrepresented.
Real-time conference surveys capture data at the moment when experience is freshest and attendees are still present. Response rates for in-session QR code surveys typically reach 50–70%, giving you a much more representative sample.
Setting up per-session QR codes
Create a separate survey on rifts.to for each session or track. A standard session feedback survey needs just two questions: an overall rating (1–5) and one open question ("What was most valuable about this session?"). Create them in advance, print the QR codes, and include them in the session signage or on table tents.
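If you are preparing many sessions, the per-session URLs can be generated programmatically and then fed to any QR code generator. A minimal sketch, assuming a hypothetical URL pattern for illustration (the actual rifts.to survey URL format may differ):

```python
# Sketch: build one survey URL per session so each QR code can be
# generated and printed in advance.
# ASSUMPTION: the "https://rifts.to/s/<slug>" pattern is illustrative,
# not the tool's real URL scheme.
import re

def slugify(title: str) -> str:
    """Lowercase the title and replace non-alphanumeric runs with hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

def session_survey_urls(sessions, base="https://rifts.to/s"):
    """Map each session title to its (assumed) survey URL."""
    return {s: f"{base}/{slugify(s)}" for s in sessions}

sessions = ["Opening Keynote", "Scaling Postgres", "Q&A: AI in Production"]
for title, url in session_survey_urls(sessions).items():
    print(f"{title}: {url}")
```

Each printed URL can then be turned into a QR code with any standard generator and dropped into the session signage template.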
At the end of each session, the speaker or moderator asks attendees to scan the code and fill in the two-minute anonymous form before leaving. Results appear in your admin dashboard in real time — you can review session scores during breaks to identify problems while you can still address them.
What to measure at a conference
Per-session satisfaction ratings let you rank content quality across speakers and topics. This data is valuable for future programming decisions: high-scoring session formats or topics should be expanded; low-scoring ones should be replaced or restructured.
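The ranking itself is straightforward once ratings are exported. A minimal sketch with made-up sample data (the ratings shown are illustrative, not real survey output):

```python
# Sketch: rank sessions by mean 1-5 rating to guide future programming.
# The ratings dict is illustrative sample data, not real survey output.
from statistics import mean

ratings = {
    "Opening Keynote":  [5, 4, 5, 4, 5],
    "Scaling Postgres": [3, 4, 3, 2, 4],
    "Panel: Careers":   [2, 3, 2, 3, 2],
}

# Sort sessions from highest to lowest average score.
ranked = sorted(ratings.items(), key=lambda kv: mean(kv[1]), reverse=True)
for session, scores in ranked:
    print(f"{mean(scores):.1f}  {session}  (n={len(scores)})")
```

Including the response count alongside the mean matters: a 4.8 from three respondents is weaker evidence than a 4.2 from eighty.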
Open-text responses on "most valuable" surface what attendees actually came for. This often differs from what organizers think they came for. Patterns in free-text responses — themes that appear across multiple sessions — reveal the conference's unmet needs and future programming opportunities.
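Spotting those cross-session patterns can start with something as simple as a word count over all free-text answers. A minimal sketch with invented responses; real analysis might use manual tagging or clustering, but this illustrates the idea:

```python
# Sketch: surface recurring themes in "most valuable" free-text answers
# with a simple keyword frequency count. The responses are invented
# sample data.
from collections import Counter
import re

STOPWORDS = {"the", "a", "and", "of", "to", "was", "it", "i", "more"}

responses = [
    "The hands-on examples were great",
    "Loved the practical examples",
    "Networking during breaks",
    "More examples please",
]

# Count every non-stopword across all responses.
words = Counter(
    w for r in responses
    for w in re.findall(r"[a-z']+", r.lower())
    if w not in STOPWORDS
)
print(words.most_common(3))
```

A word that recurs across sessions ("examples" above) is a stronger programming signal than a word that spikes in only one.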
Acting on data between sessions
Check in-session scores during lunch or breaks. If a session scores below expectations, brief the next speaker with specific feedback: "Attendees in the morning session wanted more practical examples." Real-time data enables real-time program adjustment in a way that post-event surveys never can.
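The break-time check can be reduced to a one-line filter over the morning's scores. A minimal sketch, where the 3.5 threshold and the score data are illustrative assumptions:

```python
# Sketch: flag sessions scoring below a threshold during a break so
# organizers can brief upcoming speakers with specific feedback.
# ASSUMPTION: the 3.5 threshold and morning_scores are illustrative.
from statistics import mean

THRESHOLD = 3.5
morning_scores = {
    "Opening Keynote": [5, 4, 5],
    "Vendor Demo":     [2, 3, 3, 2],
}

# Any session averaging below the threshold gets attention before the
# afternoon block starts.
flagged = [s for s, scores in morning_scores.items() if mean(scores) < THRESHOLD]
print("Needs attention:", flagged)
```

Pair each flagged session with its open-text responses when briefing the next speaker, so the feedback is concrete rather than just a number.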
Share aggregate scores with speakers after the conference, not during. Individual scores during the event can create anxiety that harms subsequent performance. Aggregate results after the event give speakers useful development feedback with enough distance to process it constructively.