Tell me about a time when you did some POC but the product decided to drop it
Asked at:
Meta
What is this question about
This question assesses how you behave when your work does not ship, especially when a thoughtful technical exploration is ultimately not pursued. Interviewers want to see whether you can separate the value of learning from the value of launch, and whether you respond with maturity rather than defensiveness. For more senior candidates, it also tests judgment: did you run the right experiment, communicate the findings clearly, and help the team make a good decision even if it meant stopping your own idea?
“Describe a time you built a prototype or ran a short investigation, but the team decided not to pursue the idea.”
“Have you ever done exploratory work that answered the question, but the feature still didn't get built? What happened?”
“Tell me about a situation where you proved something was possible, and the product direction still changed.”
“What's an example of a prototype or early experiment you worked on that was ultimately shelved?”
“Can you walk me through a time when you invested in validating an idea and the final decision was to stop?”
Key Insights
- You do not need to make the product team look wrong. Strong answers show that a dropped proof of concept can still be a success if it reduced uncertainty, saved future cost, or clarified priorities.
- You should explain what decision was being informed, not just what you built. Interviewers care whether your experiment was well-shaped for the ambiguity, not whether the prototype was technically impressive.
- Name what you did after the idea was dropped. The strongest candidates show they closed the loop, shared the learning, and adapted their behavior or roadmap instead of treating the cancellation as wasted effort.
What interviewers probe at junior level
Top Priority
At junior level, show that you understood the purpose of the investigation and were not just building something because it seemed interesting.
Good examples
🟢The goal of the short investigation was to learn whether we could support a new workflow without adding too much complexity, so even though it was dropped, we answered that question quickly.
🟢I treated the prototype as a way to test feasibility and user impact early, not as a commitment to ship, and that made it easier to accept the final decision.
Bad examples
🔴I built a quick version because I thought it would be cool for the product, but they decided not to use it, so we just moved on.
🔴My lead asked me to try something out, and when product dropped it, I mostly felt the work was wasted since the prototype itself worked.
Weak answers center on attachment to the artifact; strong answers center on the decision the artifact was meant to inform.
A strong junior answer shows you handled the disappointment professionally, stayed engaged, and took responsibility for making the work useful.
Good examples
🟢I was disappointed, but I summarized what we had learned and asked my lead how to make sure the investigation helped future planning.
🟢I accepted the decision and focused on handing off the key findings clearly so the team could benefit from the work even though we were not shipping it.
Bad examples
🔴Once product decided not to move forward, there wasn't much else for me to do, so I set it aside and focused on other tickets.
🔴I disagreed with the decision because the prototype worked, but since it wasn't my call, I just stopped working on it.
Weak answers sound passive or resentful; strong answers show professionalism, ownership, and constructive follow-through.
Valuable
You do not need to agree with the decision, but you should show that you tried to understand why others made it.
Good examples
🟢When I asked follow-up questions, I learned the main issue was that the user benefit was not large enough compared with other work already committed.
🟢I understood that feasibility was only one input, and product was also weighing support cost and how much change users would actually see.
Bad examples
🔴Product dropped it even though the technical part was working, so I think they were just being overly cautious.
🔴From my point of view it was obviously useful, but they had different priorities and that's just how it goes.
Weak answers flatten the other side into arbitrary blockers; strong answers show curiosity about the real tradeoffs.
A junior story can be small, but it should still involve a real product or technical decision, and your own contribution should be clear.
Good examples
🟢I owned a focused investigation within a team project, with guidance from my lead, and my findings contributed to a real decision about whether to continue.
🟢The effort was small in implementation but meaningful for the team because it answered whether a user-facing change was worth additional work.
Bad examples
🔴I tried a small UI tweak in my local environment and product decided not to use it, so that was my proof of concept story.
🔴The example was a hackathon idea I worked on by myself, and later the product team did not pick it up.
Weak answers are too trivial or disconnected from actual product decisions; strong answers are appropriately scoped and clearly consequential for a junior engineer.
Example answers at junior level
Great answers
In my first year, I was asked to do a short investigation on whether we could add a bulk edit flow to an internal tool our support team used. I built a small prototype that covered the core interaction and found that the technical part was manageable, but it also exposed a lot of validation rules that would make the user experience more confusing than we expected. In the review with my lead and product manager, they decided not to move forward because the support team had a bigger pain point elsewhere and this one would take more effort than the impact justified. I was a little disappointed, but I wrote up the edge cases and what we learned so we would not have to rediscover them later. That helped on a later project because we reused part of the validation approach and I also got better at keeping these investigations tightly scoped.
In my second month at a small consumer app startup I volunteered to prototype an offline caching and sync flow because I’d spoken with customer support about users in areas with flaky internet. Over a week I built a basic prototype that stored edits locally and replayed them when the device came online, including a simple rule for handling conflicts. In the roadmap review the product manager chose not to greenlight it: analytics showed most of our active users were on stable connections and the team couldn't commit the extra QA and monitoring work required for a safe rollout. I was disappointed — I really wanted to make the app more reliable for those users — but I packaged the prototype into a reusable module, wrote a short design note and a test plan, and shared the learnings with the team. A few months later someone used the module in a hack day and the design note made it much faster to build a more robust version. The experience taught me to tie POCs more explicitly to measurable business signals and to structure prototypes so they can be salvaged if priorities change.
Poor answers
I made a prototype for a new dashboard widget that I thought users would really like. It worked pretty well and I showed it to product, but they decided not to include it because they had other priorities. At that point there wasn't much else to do, so I just saved the code and moved on to my next task. I still think it was a good idea because the implementation part was already solved.
Question Timeline
See when this question was last asked and where, including any notes left by other candidates.
Mid September, 2024
Meta
Senior
Tell me about a time when you did some POC but the product decided to drop it