Tell me about a time when you evaluated the customer experience of your product or service.
Asked at:
Amazon
What is this question about?
Interviewers use this question to see whether you treat customer experience as something to be understood directly rather than assumed from inside the team. They want evidence that you can gather signals, interpret them thoughtfully, and turn what you learn into meaningful product or service improvements. At higher levels, they are also looking for whether your evaluation approach matched your scope, influenced others, and led to durable changes.
“Describe a time when you dug into how customers were actually experiencing a product or feature you worked on.”
“Can you give me an example of when you looked closely at the user journey and learned something important?”
“Tell me about a situation where you tried to understand whether your product was working well for customers.”
“Have you ever uncovered a customer pain point by evaluating the experience directly? What did you do?”
“Walk me through a time you assessed the end-user experience and used that information to make improvements.”
Key Insights
- You should not frame this as "we looked at a dashboard and saw usage was down." Strong answers show how you got closer to the real customer experience, such as observing behavior, reviewing support themes, talking to users, or tracing pain points end to end.
- Do not stop at collecting feedback. Interviewers want to hear your judgment: how you separated symptoms from root causes, decided what mattered most, and acted on incomplete or messy signals.
- You do not need a dramatic redesign story. A strong answer can be about a smaller evaluation effort if you clearly show ownership, depth of thinking, and how what you learned changed the product, process, or team behavior.
What interviewers probe at junior level
Top Priority
You do not need perfect analysis at junior level, but you should show that you distinguished the main pain point from secondary noise.
Good examples
🟢 I noticed there were a few cosmetic complaints, but the bigger pattern was that users could not complete the setup flow. I prioritized understanding that blocker first.
🟢 There were multiple suggestions from users, and I worked with my teammate to separate preference feedback from the steps where users were actually getting stuck.
Bad examples
🔴 Users mentioned several issues, so I focused on polishing the UI because it was the easiest part to change quickly.
🔴 I treated every piece of feedback as equally important and tried to address all of it in the same update.
Weak answers chase visible or convenient issues; strong answers identify the highest-impact customer pain point.
A good junior answer does not end with learning something; it shows that you helped turn that learning into an actual improvement and checked whether it worked.
Good examples
🟢 I proposed a small change to the confusing step, implemented it with support from my teammate, and then checked whether users were getting through that part more often.
🟢 Once we saw where users were dropping off, I helped make the wording clearer and monitored support questions afterward to see if the confusion decreased.
Bad examples
🔴 After identifying the issue, I shared my notes with my lead and assumed the team would take it from there.
🔴 I updated the interface based on the feedback and moved on because the problem seemed straightforward once the change was shipped.
Weak answers stop at observation or hand-off; strong answers carry the learning through to an actual change and verify the result.
Valuable
For junior candidates, a narrow but concrete story is fine as long as you stayed close to the work and made a real contribution.
Good examples
🟢 I focused on one part of the user flow I worked on directly, explained how I gathered evidence, and showed the specific improvement I helped make.
🟢 My example was limited to a small feature, but I was clear about what I owned, who helped me, and how we checked whether the change improved the experience.
Bad examples
🔴 I described how our whole product strategy improved customer experience, but my actual role was mostly implementing one ticket from the plan.
🔴 I chose a story about watching one friend use the product once and presented it as a full evaluation of customer experience.
Weak answers either overstate ownership or pick an insignificant example; strong answers match the story's scale to actual junior-level contribution.
Example answers at junior level
Great answers
In my last internship, I worked on a signup flow for a small business tool, and after launch we saw fewer people finishing account setup than we expected. I asked my mentor if I could help investigate, and I reviewed support messages and watched a few recordings of new users going through the flow. I noticed several people got stuck on one step because the button label made it sound optional when it was actually required. I updated that wording and added a short explanation with help from another engineer on the team. After we released it, the completion rate for that step improved and support questions about setup dropped over the next couple of weeks. What I liked about that experience was learning that the product can feel clear to the team but still be confusing to someone using it for the first time.
At my first job after bootcamp I was assigned to a small team maintaining a medication reminders app for older adults. I wanted to understand real-world use, so I spent two afternoons at a local senior center running short usability sessions where people tried key tasks while I took notes and asked simple questions. I discovered problems like tiny touch targets, low contrast, and icons that didn’t mean anything to our users, so I worked with the designer to increase button size, improve color contrast, and replace icons with plain text labels. After we shipped those changes, support calls about navigation dropped noticeably and several participants emailed to say the app finally felt easy to use — that made me proud because it showed how small, empathetic adjustments can make technology genuinely helpful.
Poor answers
I usually evaluate customer experience by using the product myself and seeing whether the flow feels smooth. On one project, I clicked through the onboarding flow and thought it was a little busy, so I cleaned up some spacing and reorganized the page. The result looked much nicer and the team agreed it was easier to follow. I think that's a good example because sometimes you can tell what customers need just by putting yourself in their place.
Question Timeline
See when this question was last asked and where, including any notes left by other candidates.
Mid-August 2024
Amazon
Mid-level