Tell me about a time when you didn't know what to do next or how to solve a challenging problem.

Asked at:

Amazon



What is this question about?

Interviewers use this question to see how you behave when the path is unclear, not just when you already know the answer. They want evidence that you can turn uncertainty into progress through reasoning, learning, and action rather than freezing, thrashing, or waiting passively for rescue. At higher levels, they are also looking at whether you create clarity for others, not just for yourself. Common variants of this question include:

  • Describe a time when you were stuck on a problem and had to figure out how to move forward.

  • Tell me about a situation where the path forward wasn't clear. What did you do?

  • Can you give me an example of a difficult problem where you didn't have the answer at first?

  • What's a time you faced a lot of uncertainty in your work and had to decide on the next step?

  • Have you ever been handed something challenging with no obvious solution? Walk me through how you approached it.

Ambiguity
Ownership
Perseverance
Growth

Key Insights

  • You do not need a story where you had a brilliant answer from the start. A strong response often begins with genuine uncertainty, then shows how you reduced that uncertainty step by step.
  • Do not make the story about being stuck until someone senior told you exactly what to do. Even when you ask for help, show how you framed the problem, tested options, and drove the next move.
  • Name the decision process, not just the ending. Interviewers learn a lot from how you narrowed possibilities, chose what to try first, and managed risk while the answer was still unclear.

What interviewers probe at junior level

Top Priority

For junior candidates, interviewers are looking for a simple but disciplined approach: observe, narrow, test, and learn.

Good examples

🟢 I started by reproducing the problem consistently, then changed one variable at a time so I could tell which assumption was wrong.

🟢 I split the task into smaller questions, like whether the data was wrong or the display logic was wrong, and tested those separately.

Bad examples

🔴 I wasn't sure where the issue was, so I tried a few changes that seemed reasonable and kept going until something passed.

🔴 I read a lot of documentation and examples, but I didn't really have a plan for how that reading would help me decide what to do next.

Strong answers show an intentional learning loop; weak answers sound like activity without a method.

Even at junior level, do not stop at "then it worked"; show what changed in your future approach because of the experience.

Good examples

🟢 We fixed the issue, and afterward I started using the same reproduce-narrow-test approach on later bugs, which made me faster and less dependent on ad hoc help.

🟢 The immediate problem was solved, and I also documented what I learned so the next person onboarding to that area would not hit the same confusion.

Bad examples

🔴 In the end my teammate helped me solve it, and after that I moved on to the next task.

🔴 We eventually fixed the problem, so it turned out fine and there wasn't much else to do.

Strong answers show both resolution and behavior change; weak ones treat success as the end of thinking.

At junior level, interviewers want to see that you can admit confusion early, stay engaged, and seek direction productively instead of hiding the problem.

Good examples

🟢 I realized pretty quickly that I didn't understand the failure mode well enough, so I wrote down what I knew, what I didn't know, and asked a teammate to sanity-check my understanding before I went further.

🟢 I was new to that part of the system, so instead of guessing, I reproduced the issue locally and narrowed the uncertainty to two likely components before asking for help on the specific gap.

Bad examples

🔴 I was blocked because the codebase was unfamiliar, so I kept reading files for a few days until my teammate eventually explained the right area to change.

🔴 I didn't know how to solve the bug, but I assumed it would become clear once I tried enough random fixes, so I just kept changing things until one worked.

Strong candidates are candid about not knowing, but they convert confusion into a structured understanding; weak candidates either hide the uncertainty or stay vague and reactive inside it.

Valuable

Persistence matters, but only if you are learning from each attempt instead of repeating effort blindly.

Good examples

🟢 When my first idea failed, I documented what that ruled out and changed my approach instead of repeating the same kind of test.

🟢 I kept at the problem over a few days, but each step was based on what I learned from the last one, so the search space got smaller.

Bad examples

🔴 I spent a long time trying different fixes because I wanted to prove I could solve it on my own.

🔴 The issue kept coming back, so I reran the same checks a few times in case I had missed something.

Strong candidates persist intelligently; weak ones equate effort or time spent with progress.

Junior candidates should absolutely ask for help, but the signal is whether you asked thoughtfully and kept ownership of execution.

Good examples

🟢 After narrowing the issue, I asked a teammate a very specific question about one subsystem, then used that answer to continue the investigation myself.

🟢 I came to office hours with what I had tried, what I learned, and the two things I was still unsure about, which made the conversation short and useful.

Bad examples

🔴 Once I got stuck, I messaged my mentor and asked what I should do, then I followed their steps exactly.

🔴 I escalated quickly because I didn't want to waste time, and after that I mostly waited for guidance before making changes.

Strong junior candidates use help to accelerate learning; weak ones use help to replace their own problem-solving.

Example answers at junior level

Great answers

During my first few months in my last role, I was asked to fix a bug where some users were seeing stale profile data. I honestly didn't know what to do next at first because I was new to that service and there were several places the issue could have been coming from. I started by reproducing the bug consistently in a test environment, then I split the problem into smaller questions: was the data wrong in storage, or was the app showing old data? After narrowing it down, I asked a teammate one specific question about how the caching layer worked, and that helped me confirm the real issue without handing the problem over. I made the fix, added a small test for the case we missed, and wrote a short note for the team because I realized my biggest lesson was to make the unknowns explicit early instead of just reading code and hoping the answer would appear.

On a project where I was building a simple referral flow for our mobile app, I hit a point where I honestly didn’t know what to do next because the product requirements were vague and different stakeholders had conflicting ideas about privacy and user experience. I stopped trying to guess the right implementation and set up a short meeting with the designer and product manager to list the concrete outcomes we needed and the restrictions we couldn’t break. With those constraints clarified, I made a quick runnable prototype of the UI and a stubbed backend so we could try the flow on a device and get feedback from QA and support. That prototype revealed two edge cases we hadn’t considered, so I implemented a minimal, test-covered solution behind a feature flag and asked for a small beta rollout. The release went smoothly, and I learned how much time I save by turning uncertainty into a tiny experiment and involving the people affected early instead of coding in isolation.

Poor answers

I had a ticket once where an API call was failing and I wasn't sure why. I spent a couple of days going through different files and trying changes until I couldn't think of anything else, so I asked a more senior engineer and they showed me the problem. After that I updated the code and it worked. It was a good experience because sometimes you just need to keep digging until someone who knows the area can point you in the right direction.

Question Timeline

See when this question was last asked and where, including any notes left by other candidates.

Mid-August 2024

Amazon

Mid-level
