I get the point of this but I'm slightly annoyed that it dinged me for not telling it to pick up the knife to cut the sandwich in half when I didn't tell it to cut the sandwich in half. I don't want my PBJ cut in half. It gave me two different options for how to cut it in half and I didn't select either, so it shouldn't need the knife at that phase.
Demonstrations like this are a regular feature of the Japanese educational TV show "Texico", which teaches logical thinking with the specific goal of preparing young children for programming.
I highly recommend it. It's extremely well made, and quite entertaining even for adults.
It's available in English, 10 minutes per episode, no subscription required:
https://www3.nhk.or.jp/nhkworld/en/shows/texico/
Please submit this link to Texico. I think it deserves a broader audience.
Texaco + Mexico = Texico? The Japanese never fail to amuse foreigners with their naming.
Texas?
I've spent some time thinking about this before, as this is indeed one way a teacher would introduce a young child to programming (but using actual bread, peanut butter, and jelly). An important underlying question is why kids should learn programming in the first place if they're not going to become programmers... one answer, which applies to math as well, is that it teaches another way to think. The whole point is that it is difficult to specify exact behavior, especially when you can't lean on someone's already established understanding of the world.
Another related idea (if I don't misremember) is brought forth in the book "Program or be Programmed": that it's not the programming itself but learning that things powered by software are intentionally (by meticulous instruction, like above) made to work like they do rather than just happen to work a specific way. Which hopefully leads to the realization that we have agency and can change how things work in the world, should we want to.
Now, some people are arguing for teaching kids programming via vibe coding, and on the one hand I can see their point, but on the other hand, it was never about the programming in the first place. Vibe coding is kind of the opposite of those two ideas if you don't teach them first. It's making the PBJ-making teacher/robot go "oh, so you want a PBJ, here's one". There's no learning of new ways of thinking. It also makes it seem like things are not intentionally made to work a specific way but just happened to become that way. Some of that empowerment and agency is lost, I feel, although I can see that there is agency in creating things too.
I saw a YouTube Short of a teacher demonstrating this to her young students. Of course the kids laughed a lot at the results of her literally enacting their instructions and exaggerating the missing necessary info. But I bet they came away with a far more technical thought process.
This should be part of the curriculum.
I once had this "make a PB&J" as part of a written take-home interview.
I knew the schtick -- no matter how precise and complete you are, there is always the possibility of another little gotcha. And that makes it absolute rubbish for a take-home, because... how much detail do I need to go into to satisfy the manager reviewing this? I think I wrote a couple of paragraphs and ended with a little rant about how I know how this problem works and that it'd work better in person. I don't know how much they expected somebody to write.
> how much detail do i need to go into to satisfy the manager reviewing this?
It would've been fun to troll by writing the instructions in terms of exactly which muscle to contract and extend for X seconds, and which to move through an arc of Y minutes.
It'd be like writing assembly code for your skeleton and muscles.
It's kind of interesting relating this to LLMs. With a chef in a kitchen, you can just say you want a PB&J. With a robot: does it know where things are? Once it knows that, does it know how to retrieve them, open and close them? It's always a mystery what you get back from an LLM.
Also true of specifications. Anything not explicitly stated will be decided by the implementer, maybe to your liking or maybe not.
I'm reminded of wish-granting genies, and then of 'undefined behavior' and compilers...
Although this is a facetious take, instructing a robot to follow recipes is a fantastic introduction to coding. I added a visual scripting layer to Overcooked so kids can program robots to make all sorts of dishes (Sushi, Pasta, Cakes etc.)
https://youtu.be/ITWSL5lTLig
This is part of a club to teach kids coding, creativity and digital literacy.
I think this is a great introduction to logical thinking and coding. The overcooked scripting layer looks awesome and very polished. Reminds me a bit of Scratch (the programming language). Are you going to make it available to others?
There are also video games based on this concept, e.g. Bots are Dumb. So maybe your scripting layer could even become its own commercial game.
Thanks!
Breaka Club is still very early days. Current focus is in person, but the plan is to offer an online club experience also. I'm not quite sure what that will look like just yet. Ideally yes, I'd love to make this available to others.
We're also currently building Breaka Club's own game, which is where the majority of development efforts are focused. However, since we already have the Overcooked coding experience, we haven't prioritized the visual script layer for this game just yet - it's on our roadmap.
Presently, our game is more of a cozy farming RPG / world building sandbox, with a no-code solution for world building:
https://breaka.club/blog/why-were-building-clubs-for-kids
I did this exercise in school, over 30 years ago. Of course, with today's multimodal models, it's more like "hey, robot, make me a PBJ sandwich."
This feels diluted compared to what it could have been. It would be better if you had a bunch of instructions and could drag them into sequence on each screen.
It's great marketing! But yes. He considers it an error to specify "Use the same knife for the jelly" even though it's considered correct to state "Wipe the knife clean before using it for jelly". The latter statement implies the former, and if you follow all the instructions, neither is wrong.
I also consider some of the instructions to be underspecified. For example, a piece of bread could be said to have 6 sides, but only 2 of those are helpful for making a sandwich.
>The latter statement implies the former
It does not, unless you have previously instructed it not to intermix ingredients in their containers.
I'm just annoyed I lost points for specifying both "Open the peanut butter jar" and "unscrew the lid of the peanut butter jar". The first one's context! The second "more precise" version doesn't specify what you should do with the now-unscrewed lid and the obvious solution, for a robot that takes things literally, is to leave it sitting on top of the jar.
It’s funny, when I’ve seen this demonstrated, it’s basically literally impossible to get the right result because the test maker doesn’t define an instruction set that you can rely on. They will deliberately screw up whatever instructions you give them no matter how detailed. A computer has a defined ISA that is specified in terms of behavior. A compiler transforms a language with higher level abstractions into this low-level language. I’ve never seen this “test” done with any similar affordance, which doesn’t really teach anything.
Oh I think this lesson teaches quite a lot. Maybe your instructor is deliberately screwing up, but perhaps other end users are just not paying attention, or are missing assumed knowledge, or are feeling particularly adversarial on the day they need to follow your instructions.
One of many lessons that can be taken away from this exercise is to understand your audience and challenge the assumptions you make about their prior knowledge, culture, kind of peanut butter, et cetera.
In the early 1980s I read an Usbourne (sp?) introduction to programming book for kids that had a picture of a robot walking through a brick wall while following its programming to ‘take a letter to the letterbox’.
At this rate, it looks like we’ll solve that problem by not having letters/letterboxes.
Usborne. Here it is, page 8.
https://archive.org/details/computer-programming/page/n7/mod...
My "related" past threads fu is failing me just now but I know there have been threads with this theme in the past, including the video with the dad carrying out his kids' literal instructions in a cute but also borderline uncomfortable way.
PB&J AI (3 points, 1 year ago, 2 comments) https://news.ycombinator.com/item?id=42222009
Dad Annoys the Heck Out of His Kids by Making PB&Js Based on Their Instructions (2017) https://news.ycombinator.com/item?id=13688715 https://news.ycombinator.com/item?id=41599917
& infamous: sudo make me a sandwich (2009) https://news.ycombinator.com/item?id=530000
When I was about 7 or 8 years old, my elementary school music teacher did this same exercise with us, except the goal was to draw a musical staff and the first 3 notes of Jingle Bells (or something along those lines). I can still remember how much fun I thought it was.
Of course, we need to give the robot a cognitive architecture so that it understands the task and the context and can correct its own actions; then it will autonomously make such sandwiches every morning for breakfast.
There is an alternative to describing the (subjective) "process": describe a model of the sandwich instead - the parts and how they relate. The issue is that how to do that has been forgotten and is unfashionable.
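A minimal sketch of that model-first idea (all class names here are hypothetical, just to illustrate the contrast with step-by-step instructions), in Python:

```python
from dataclasses import dataclass, field

@dataclass
class Spreadable:
    """An ingredient that can be applied to a bread surface."""
    name: str

@dataclass
class BreadSlice:
    """A slice with one face that can hold spreads."""
    spreads: list = field(default_factory=list)

    def spread(self, ingredient: Spreadable) -> None:
        self.spreads.append(ingredient.name)

@dataclass
class Sandwich:
    """Two slices pressed together, spread faces inward."""
    top: BreadSlice
    bottom: BreadSlice

# The "recipe" is the parts and their relationships; no
# knife-by-knife process is specified anywhere.
top, bottom = BreadSlice(), BreadSlice()
top.spread(Spreadable("peanut butter"))
bottom.spread(Spreadable("jelly"))
pbj = Sandwich(top, bottom)
```

The point is that correctness comes from the structure of the model (a sandwich simply is two prepared slices), not from getting an imperative sequence exactly right.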
This feels like a BuzzFeed quiz for developers. If you think about each step long enough, you can't really get a wrong answer.
As always, there's an XKCD [1] for this!
[1] https://xkcd.com/149/
It’s almost like we need some deterministic set of instructions that can be fed to a machine and followed reliably? Like… I don’t know… a “programming language”?
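The comment above can be made concrete with a toy "instruction set" for the sandwich robot: each opcode has one fixed, documented behavior, so a program either works or fails the same way every time. (The opcodes and state layout here are invented for illustration; Python.)

```python
def run(program):
    """Execute a list of (opcode, arg) pairs against a tiny fixed ISA."""
    state = {"held": None, "bread": [], "log": []}
    ops = {
        "GRAB":   lambda arg: state.update(held=arg),
        "PLACE":  lambda arg: state["bread"].append(arg),
        "SPREAD": lambda arg: state["log"].append(
            f"spread {arg} with {state['held']}"
        ),
    }
    for opcode, arg in program:
        ops[opcode](arg)  # an unknown opcode fails loudly, like a real ISA
    return state["log"]

log = run([
    ("PLACE", "slice 1"),
    ("GRAB", "knife"),
    ("SPREAD", "peanut butter"),
    ("SPREAD", "jelly"),
    ("PLACE", "slice 2"),
])
```

Because the behavior of every opcode is pinned down, the ambiguity moves out of the interpreter and into the program, which is exactly where a programmer can fix it.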
I would say that's exactly not the solution, since the surface area is too large to hard code (which is somewhat the point of this). Evidence being, it's 2026 and there are exactly 0 robots that can do this simple task reliably, in any kitchen you put it in.
You need something general and flexible, dare I say "intelligent", or you'll be babysitting the automation, slowly adding the thousand little corner cases that you find, to your hard coded decision tree.
This is also why every company with a home service robot that can do anything even remotely as complex as a sandwich is doing it via teleoperation.
What's the point? No matter how detailed and comprehensive the instructions and steps by the AI, you still don't get a PBJ sandwich to eat. You have to go to the kitchen and do it yourself.
It’s a reference to a famous YouTube video[0] about how to write instructions that can be followed.
One of the most important things a programmer needs to learn is how to tell a computer to do something. It's a surprisingly hard skill because each step is way more complicated, with way more variables to work through, than you'd expect.
https://youtu.be/FN2RM-CHkuI