Image Transform Pipeline with Scaling

Build a pipeline to apply JSON-defined transformations to 10k images, then optimize for performance using concurrency and design for distributed processing at million-image scale. Complete implementation, optimization, and architecture discussion within 45 minutes.
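A minimal sequential sketch of such a pipeline, assuming Pillow for the image work and the folder names mentioned in the candidate note below (`IMAGES`, `TRANSFORM`); the `OUTPUT` folder, the file extensions, and all transform names other than `mirror` are illustrative guesses, not part of the prompt:

```python
import json
from pathlib import Path

from PIL import Image, ImageOps

# Hypothetical layout: IMAGES/{Large,Small}/*.png, TRANSFORM/*.json, OUTPUT/
IMAGES_DIR = Path("IMAGES")
TRANSFORM_DIR = Path("TRANSFORM")
OUTPUT_DIR = Path("OUTPUT")

# Map transform names to Pillow operations. 'mirror' is the one named in the
# prompt; 'rotate' and 'resize' are plausible extras.
OPS = {
    "mirror": lambda img, args: ImageOps.mirror(img),
    "rotate": lambda img, args: img.rotate(args[0], expand=True),
    "resize": lambda img, args: img.resize((args[0], args[1])),
}

def apply_transforms(img, transforms):
    # transforms is the parsed JSON array: [{"transform": ..., "arg": [...]}, ...]
    for t in transforms:
        img = OPS[t["transform"]](img, t.get("arg", []))
    return img

def run():
    # Deliberately naive baseline: every (image, transform-file) pair is
    # processed one at a time, and each JSON spec is parsed once up front.
    specs = {tf.stem: json.loads(tf.read_text()) for tf in TRANSFORM_DIR.glob("*.json")}
    for image_path in IMAGES_DIR.rglob("*"):
        if image_path.suffix.lower() not in {".png", ".jpg", ".jpeg"}:
            continue
        for name, spec in specs.items():
            with Image.open(image_path) as img:
                out = apply_transforms(img, spec)
            out_path = OUTPUT_DIR / name / image_path.relative_to(IMAGES_DIR)
            out_path.parent.mkdir(parents=True, exist_ok=True)
            out.save(out_path)
```

At roughly 10k images times 200 transform files this is millions of decode/transform/encode cycles, which is why the baseline is slow and the follow-up steps ask for concurrency and then a distributed design.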

Asked at:

Anthropic


Question Timeline

See when this question was last asked and where, including any notes left by other candidates.

Mid October, 2025

Anthropic

Staff

Given a set of images in an IMAGES folder (with two subfolders, Large and Small), apply the image transformations defined in another folder (TRANSFORM). The transform files are JSON files in this format: `[{"transform": "mirror", "arg": []}, ...]` — there can be many transform objects in that array. I did not count, but there were probably 10k images, with 200 JSON files to apply to each image. I mentioned that I had never worked with image processing before, and the interviewer said I could use any library; I spent 10 minutes finding one and understanding how to use it. The first step was simply to load all the images and transform files, apply the transforms to each image, and save the transformed images to an output folder — given the numbers, that takes quite a bit of time to complete. The second step was to add whatever you want (threading, concurrency, etc.) to process more efficiently. The third step was to discuss doing it at a larger scale (a million images) and making it distributed — images live in different places, output goes to different places, and there are many workers (adding queues, Kafka, etc.). You get 45 minutes to complete the task.
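For the second step, one common approach (a sketch, not necessarily what the interviewer expected) is to fan the (image, transform-file) pairs out over a process pool: the decode/transform/encode work is CPU-bound, so Python threads would mostly serialize on the GIL, while processes run in parallel. Pillow, the job-tuple shape, and the `mirror`-only op table are assumptions carried over from the prompt:

```python
import json
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

from PIL import Image, ImageOps

# Same op table idea as the sequential baseline; only 'mirror' is from the prompt.
OPS = {
    "mirror": lambda img, args: ImageOps.mirror(img),
}

def process_one(job):
    # job is a hypothetical (image_path, transform_path, out_path) tuple.
    # Each task is self-contained, so it can run in any worker process.
    image_path, transform_path, out_path = job
    spec = json.loads(Path(transform_path).read_text())
    with Image.open(image_path) as img:
        for t in spec:
            img = OPS[t["transform"]](img, t.get("arg", []))
        Path(out_path).parent.mkdir(parents=True, exist_ok=True)
        img.save(out_path)
    return out_path

def run_parallel(jobs, workers=8):
    # chunksize batches tasks per IPC round-trip, which matters when each
    # task is short relative to the pickling overhead.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        for _ in pool.map(process_one, jobs, chunksize=32):
            pass  # progress logging / error collection could go here
```

The same `process_one` unit of work also maps naturally onto the third step: each job becomes a message on a queue (Kafka, SQS, or similar), workers pull jobs and write results to object storage, and a coordinator enumerates images and transform files to produce the jobs — though the specifics there are architecture discussion rather than code.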
