V = I*R, where V is the voltage drop across the resistor, I is the current drawn by your meter, and R is the value of the resistor in ohms.
It depends on how much current your multimeter draws over the voltage range it will see. If I were determined to use just a dropping resistor, I would find out how low a voltage the meter can take and still function - since I don't want to failure-test it at the high-voltage end - and size the resistor for that. Then I would calculate the voltage across the meter at the highest system voltage, assuming it draws the same current, and see whether the result stays reasonably within 10% or so of the meter's rated voltage. The problem is that if conditions existed where the meter drew almost no current, there would be almost no voltage drop across the dropping resistor, and the meter would see nearly the full system voltage.
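To make that sizing procedure concrete, here is a rough sketch of the arithmetic. All the specific numbers are assumptions for illustration, not from the original question: a meter that draws about 1 mA, works down to 8 V, and a "12 V" system that actually swings between roughly 11 V and 14.4 V.

```python
# Hedged sketch: sizing a dropping resistor for a meter on a 12 V system.
# All numbers below are assumed for illustration, not from the post.

I_METER = 0.001      # assumed meter current draw, amps
V_MIN_SYS = 11.0     # assumed lowest system voltage, volts
V_MAX_SYS = 14.4     # assumed highest system voltage, volts
V_METER_MIN = 8.0    # assumed lowest voltage the meter tolerates, volts

# Size R so the meter just reaches its minimum voltage at the lowest
# system voltage: V = I*R  ->  R = (V_sys - V_meter) / I
R = (V_MIN_SYS - V_METER_MIN) / I_METER   # 3000 ohms

# Check what the meter sees at the highest system voltage, assuming
# it keeps drawing the same current.
v_meter_high = V_MAX_SYS - I_METER * R    # 11.4 V

print(f"R = {R:.0f} ohms")
print(f"Meter voltage at {V_MAX_SYS} V system: {v_meter_high:.1f} V")
```

With these assumed numbers the meter would swing from 8 V to 11.4 V, far outside a 10% band, which illustrates why a plain dropping resistor struggles: the drop depends entirely on the current the meter happens to draw.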
You might pick up a 3-pin 7809 voltage regulator from Radio Shack, eBay, or some other place and get better results for your meter. The standard 7809 should handle the meter's low current draw up to the maximum voltage your 12 V system will see with no problems, and in a TO-220 package likely won't need additional heat sinking, but read the specifications. You can also add a switch so the regulator is only powered when you need it, cutting the few milliamps of draw when the meter is not in use. Rich
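As a quick sanity check on the regulator approach, the two things worth computing are the minimum input voltage the 7809 needs and the worst-case power it must dissipate. The figures below are assumptions for illustration (check the actual datasheet): roughly 2 V of dropout and the same hypothetical 1 mA meter as above.

```python
# Hedged sketch: sanity-checking a 7809 for this job.
# Numbers are assumed for illustration; consult the 7809 datasheet.

I_METER = 0.001    # assumed meter current draw, amps
V_IN_MAX = 14.4    # assumed peak system voltage, volts
V_OUT = 9.0        # 7809 nominal output voltage
DROPOUT = 2.0      # assumed typical 7809 dropout, volts

# The regulator only holds 9 V while the input exceeds output + dropout.
v_in_min_needed = V_OUT + DROPOUT             # 11.0 V

# Worst-case power dissipated in the regulator: P = (Vin - Vout) * I
p_dissipated = (V_IN_MAX - V_OUT) * I_METER   # 0.0054 W

print(f"Input must stay above {v_in_min_needed:.1f} V")
print(f"Worst-case dissipation: {p_dissipated * 1000:.1f} mW")
```

At a few milliwatts of dissipation, a bare TO-220 is far from needing a heat sink, which is consistent with the advice above; the main thing to verify is that the system voltage never sags below the roughly 11 V the regulator needs to stay in regulation.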