
Tell me about a time when you were unsatisfied with the status quo.

Asked at:

Amazon




What is this question about?

Interviewers use this question to assess whether you merely notice problems or actually improve them. They want to see your standards, your initiative, and how you turn dissatisfaction into constructive action rather than complaint. At higher levels, they are also testing whether your dissatisfaction is aimed at consequential problems and whether you can change systems, not just patch local annoyances. Common ways interviewers phrase this question include:

  • Describe a situation where you felt the existing way of doing things wasn't good enough. What did you do?

  • Tell me about a time you challenged an established process or approach.

  • Have you ever looked at how your team was working and thought, 'we should be doing this differently'? What happened?

  • Give me an example of when you saw a better way than the current default and tried to change it.

Ownership
Leadership
Ambiguity
Scope

Key Insights

  • You should pick a status quo that actually mattered. If the story is just about a minor annoyance, you may accidentally signal poor judgment about what is worth changing.
  • Don't stop at 'I saw a problem.' The strongest answers show how you diagnosed why the status quo existed, because that reveals whether you can change root causes instead of reacting to symptoms.
  • Unsatisfied does not mean cynical or combative. Show that you respected the existing constraints and still pushed for something better in a constructive, credible way.

What interviewers probe at junior level

Top Priority

Even at the junior level, interviewers want to see curiosity: don't just say what was bad; show how you learned why it was happening.

Good examples

🟢Before proposing a fix, I asked teammates where they were getting stuck and found that unclear setup steps, not the tooling itself, were the main issue.

🟢I spent a day tracing where test failures came from and realized the real problem was flaky shared data rather than the test framework.

Bad examples

🔴The build was slow, so I assumed our tools were the problem and started changing scripts right away.

🔴I thought the bug process was messy, so I created my own tracking sheet without asking how others were using the existing one.

Weak answers jump from irritation to solution; strong answers show a small but real diagnostic step that changes the quality of the fix.

A strong junior answer shows initiative within your lane: you don't need to own everything, but you should have taken concrete steps rather than waiting passively.

Good examples

🟢I drafted clearer setup steps, asked a teammate to review them, and then updated the team guide so the next person would have a better experience.

🟢I put together a simple comparison of failure cases, suggested a small fix we could test, and volunteered to implement it.

Bad examples

🔴I told my lead the process was bad and waited for them to decide what to do next.

🔴I complained to teammates about the issue for a few weeks until someone else finally fixed it.

Weak answers outsource ownership upward; strong answers show appropriately scoped initiative and follow-through.

At the junior level, the best stories focus on a real pain point in your day-to-day work and show that you can distinguish between inconvenience and actual inefficiency or risk.

Good examples

🟢I noticed that a manual testing step was causing us to miss regressions because people skipped it under time pressure, so I focused on improving that workflow.

🟢I was unhappy that onboarding to our service took new engineers several days because setup instructions were incomplete and outdated, which was affecting productivity.

Bad examples

🔴I didn't like how our team formatted comments in code, so I rewrote a bunch of files to make them more consistent.

🔴Our internal tool took a few extra clicks, and that bothered me, so I built my own shortcut without really checking whether others cared.

Weak answers center on personal annoyance; strong answers identify a problem with visible impact on quality, speed, or team effectiveness.

Valuable

You do not need a massive result, but you should show that something actually got better and that you can reflect on what made the change work.

Good examples

🟢After we updated the guide, the next new team member finished setup the same day, and I learned that small documentation gaps can have outsized cost.

🟢The test fix reduced repeat failures over the next few weeks, and I learned to validate the real cause before proposing automation.

Bad examples

🔴I changed the process and people seemed happier, so I considered it a success.

🔴We tried my idea for a little while and I assumed it helped because no one complained.

Weak answers assert success without evidence or learning; strong answers show a believable outcome and a takeaway that changes future behavior.

Interviewers want to hear that you can push for improvement without implying that everyone before you was careless or incompetent.

Good examples

🟢I understood the process existed for a reason, so I asked what constraints mattered before suggesting a simpler version.

🟢I framed the issue as something we could improve together rather than as a mistake others had made.

Bad examples

🔴The old way made no sense, so I just replaced it because it was obviously wrong.

🔴I was pretty direct that the team had been doing it inefficiently and needed to catch up.

Weak answers sound dismissive of prior decisions; strong answers show humility and respect while still advocating change.

Example answers at junior level

Great answers

In my first few months on a backend team, I was unsatisfied with how long it took new engineers to get the service running locally. It had taken me almost three days, and I noticed people were asking the same setup questions repeatedly in chat. Instead of assuming the toolchain was the problem, I wrote down where I had gotten stuck and compared notes with another new hire; the bigger issue was that our setup guide was outdated in a few critical places. I updated the guide, added a short troubleshooting section, and asked one of the more experienced engineers to review it before we published it. When the next intern joined, she finished setup the same day, and the number of repeated questions dropped a lot. What I took from that was that small process improvements can save the team real time if you validate the actual pain point first.

I kept seeing support tickets from users who couldn't complete the checkout using only a keyboard, and as someone who cares about accessible products I was unsatisfied that accessibility seemed like an afterthought. I did a short manual audit of the checkout page, created a handful of targeted fixes (missing form labels, focus outlines, and a contrast variable), and submitted small, well-documented PRs so they could be reviewed quickly. I also added a three-item accessibility checklist to our PR template and ran a 20-minute demo showing how to do quick checks before merging. In the following weeks similar support tickets decreased and reviewers began flagging accessibility in code reviews, which felt like a real shift in the team's priorities toward more inclusive design.

Poor answers

I was unsatisfied with how our team wrote comments in code because everyone had a slightly different style. I decided to clean that up and spent a few days standardizing comments in a bunch of files. It made the code look a lot more consistent, and I think that kind of polish matters. After that, I started pointing it out in reviews so people would follow the same approach.

Question Timeline

See when this question was last asked and where, including any notes left by other candidates.

Early August 2024

Amazon


Mid-level
