
Tell me about a change you made that was customer-focused.

Asked at:

Amazon

Microsoft



What is this question about?

Interviewers use this question to see whether you can recognize customer pain, translate it into an actionable change, and follow through to improve the user experience. They are also testing whether your idea of "customer-focused" is grounded in evidence, rather than a label attached to a feature you happened to build. At higher levels, they want to hear how broadly you influenced the product, team, or organization around customer needs.

Common variations of this question include:
  • Can you give me an example of a time you changed something because of what customers were experiencing?

  • Tell me about a product or engineering decision you made with the customer in mind.

  • What's a concrete improvement you've driven that made the user experience better?

  • Describe a time when you noticed customer pain and acted on it. What did you change?

  • Have you ever pushed for a change based on customer feedback or behavior? What happened?

Ownership
Ambiguity
Scope
Leadership

Key Insights

  • You should make the customer problem concrete. "We improved onboarding" is much weaker than explaining what customers were struggling with, how you knew, and why it mattered.
  • Customer focus is not the same as saying yes to every request. Strong answers show judgment: you understood the underlying need, weighed tradeoffs, and chose a change that meaningfully improved the customer experience.
  • Don't stop at shipping. You should explain how you validated that the change actually helped customers, even if the signal was qualitative or imperfect.

What interviewers probe at junior level

Top Priority

At junior level, the strongest answers show that you understood a specific user pain point instead of just implementing a task someone handed you.

Good examples

🟢I was working on a support workflow and noticed several tickets from users who couldn't tell whether their upload had finished. I traced that to the UI not showing progress clearly, so I proposed adding a status indicator.

🟢While testing a small feature, I sat in on a few support reviews and saw customers repeatedly misinterpret an error message. That made it clear the issue wasn't just technical correctness; the message itself was causing confusion.

Bad examples

🔴My manager asked me to add a new filter to the dashboard, so I built it. Customers like having more options, so it was definitely customer-focused.

🔴We got a request to change the button color because users were confused, and I updated it. I assumed that would help because the old one didn't look great.

Weak answers label work as customer-focused without showing how the customer problem was known; strong answers tie the change to observed pain, feedback, or clear user friction.

You do not need to have invented the whole project, but you should show what you personally drove beyond just coding assigned tasks.

Good examples

🟢After noticing the issue, I wrote up a simple proposal, checked it with my mentor, and implemented the change end to end for my part of the product. I also followed up after release to make sure support wasn't still seeing the same problem.

🟢I didn't own the whole roadmap, but I took responsibility for clarifying the problem, suggesting a low-risk fix, and making sure the updated behavior matched what users actually needed.

Bad examples

🔴Once the change was approved, I implemented the ticket exactly as written and handed it off. The rest of the rollout was handled by others.

🔴I flagged that users were confused and then waited for design and product to decide what to do. After they gave me the final version, I built it.

Weak answers show task completion; strong answers show initiative and follow-through within the candidate's level of responsibility.

You don't need perfect data, but you should show that your change was based on some reasoning or evidence rather than guesswork.

Good examples

🟢Before changing the form, I looked at where users most often made mistakes and saw it was around two specific inputs. I focused the change there instead of rewriting the whole page.

🟢We weren't sure whether the issue was missing guidance or unclear wording, so I compared recent support cases and found most confusion centered on one message, which guided the fix.

Bad examples

🔴I thought the page looked cluttered, so I removed several fields to make it easier for customers. Simpler usually means better.

🔴Users seemed frustrated, so I added more help text everywhere. That felt like the safest customer-focused improvement.

Weak answers rely on taste or assumptions; strong answers show some investigation and a targeted response to the actual problem.

Valuable

Even at junior level, you should close the loop somehow; don't imply that shipping automatically means success.

Good examples

🟢After release, I checked with support a week later and saw that tickets about that confusion had dropped. I also tested the flow myself to make sure the new message was clear in the real product.

🟢We didn't have a formal metric for this, so I used a practical check: I reviewed recent user sessions with my mentor and saw fewer people getting stuck at the step we changed.

Bad examples

🔴We released the change and didn't hear many complaints after that, so I considered it successful.

🔴The feature went out on time and nobody asked for revisions, which showed it was the right customer-focused improvement.

Weak answers treat absence of noise as proof; strong answers make a deliberate effort to validate customer impact.

A strong junior story is usually a focused but real improvement; claiming ownership of a company-wide customer transformation often sounds inflated.

Good examples

🟢It was a small but meaningful change in one workflow that customers used frequently, and I can explain exactly what part I owned.

🟢I focused on one pain point in a larger product area and made a concrete improvement there rather than claiming to solve the whole customer journey.

Bad examples

🔴I changed the entire onboarding strategy for our product by suggesting we simplify the first screen. That really transformed the customer experience.

🔴I was an intern, and I basically fixed our customer retention problem by updating a settings page.

Weak answers overclaim impact relative to role; strong answers are credible, scoped, and clear about personal contribution.

Example answers at junior level

Great answers

In my last role, I was working on a file upload page and noticed support tickets from users who thought uploads were failing when they were actually still processing. I dug into a few of the tickets and saw that the page gave almost no feedback after you clicked submit, so I suggested adding a visible progress state and a clearer success message. I checked the idea with my mentor, implemented it, and made sure it worked well on slower connections too. After we released it, support told us they were seeing fewer questions about "missing" uploads, which was a good sign we had fixed the confusion rather than just changed the UI. I liked that project because it taught me that a small engineering change can matter a lot when it's tied to a real user problem.

While I was at a small ed-tech startup, I spent a week answering support chats and realized many teachers were getting stuck in our "create class" flow because terms like "CSV export" and "synchronization" confused them. I proposed simplifying the labels, adding plain-language helper text, and putting a short example CSV file directly on the page. I implemented the text changes and the example file myself, then tested the flow with two teachers from our pilot program and adjusted the wording based on their feedback. After we shipped the changes, activation for new classes rose noticeably and support saw far fewer roster-related questions. It felt great to make a small change that lowered the barrier for people who aren't technical and let them use the product right away.

Poor answers

One customer-focused change I made was updating the labels on a settings page because I felt the old wording was too technical. I changed several terms to simpler ones and sent it out in the next release. People didn't raise any issues afterward, so I consider that a success. It also showed I can think from the customer's perspective instead of just coding.

Question Timeline

See when this question was last asked and where, including any notes left by other candidates.

Mid March, 2026

Amazon

Senior

Mid January, 2026

Amazon

Mid-level

Late January, 2025

Microsoft

Senior
