A car that will not start on a cold winter day and one that will not start on a hot summer day usually indicate two very different situations. When pressed to explain the difference, we would give a winter account ("Oil is more viscous in cold conditions, and that causes . . .") and a summer story ("Vapor lock is a possibility in hot weather and is usually caused by . . ."). How do we build such explanations? One possibility is that understanding how the car works as a device gives us a basis for generating the explanations. But that raises another question: how do people understand devices?

Model-based reasoning is a subfield of artificial intelligence focusing on device understanding issues. In any model-based-reasoning approach, the goal is to "model" a device in the world as a computer program. Unfortunately, "model" is a loaded term; different listeners understand the word to mean very different concepts. By extrapolation, "model-based reasoning" can suggest several different approaches, depending on the embedded meaning of "model."
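To make the idea concrete, here is a minimal sketch, in Python, of one sense of "model": a device represented as component behaviors a program can run, with explanations generated by comparing the model's predictions against the observed failure. Every name and behavior here (battery, oil_viscosity, engine_cranks, explain_no_start, and the threshold values) is invented for illustration; none of it comes from any particular model-based-reasoning system.

```python
# Toy device model: a car's starting behavior as runnable component models.
# All components and values are illustrative assumptions, not real physics.

def battery(temp_c):
    """Predicted battery voltage; cold weather weakens output (assumed values)."""
    return 12.6 if temp_c > 0 else 11.0

def oil_viscosity(temp_c):
    """Qualitative viscosity: cold oil is thick and resists cranking."""
    return "high" if temp_c < 0 else "normal"

def engine_cranks(voltage, viscosity):
    """The engine turns over only with enough voltage and thin-enough oil."""
    return voltage >= 12.0 and viscosity == "normal"

def explain_no_start(temp_c):
    """Compare the model's prediction against the observation 'car will not
    start' and report which component behaviors account for the mismatch."""
    v = battery(temp_c)
    visc = oil_viscosity(temp_c)
    if not engine_cranks(v, visc):
        causes = []
        if v < 12.0:
            causes.append("battery output low in the cold")
        if visc == "high":
            causes.append("oil too viscous to crank against")
        return causes
    return ["model predicts a start; the fault lies in something unmodeled"]

print(explain_no_start(-10))  # winter account: weak battery, thick oil
print(explain_no_start(35))   # summer: this model has no vapor-lock component
```

Note what the summer case exposes: the toy model contains no fuel system, so it cannot produce the vapor-lock story at all. Which explanations a reasoner can generate depends entirely on what its model includes, which is exactly why the embedded meaning of "model" matters.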