The best way to think about how much voltage and current a device will use is actually to use both terms together. As was pointed out earlier, you have to think of it in terms of watts. The mathematical relationship between voltage and current for any device depends on its power requirement, in watts or kilowatts.

Now, for anyone versed in basic algebra this is really easy: voltage x current x a dimensionless factor called "power factor" = watts (power). For a DC load like a wiper motor, the power factor is simply 1.
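To make the arithmetic concrete, here's a quick sketch in Python (the function name and the 0.85 example power factor are just illustrations, not anything standard):

```python
def power_watts(volts, amps, power_factor=1.0):
    """Power drawn by a load: P = V x I x PF (PF = 1 for DC loads)."""
    return volts * amps * power_factor

# A 12 VDC wiper motor drawing 3 A:
print(power_watts(12, 3))         # 36 W
# A hypothetical 120 VAC load drawing 2 A at a 0.85 power factor:
print(power_watts(120, 2, 0.85))  # 204 W
```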

If you cut the voltage down, the motor obviously has to pull more current to maintain the same power... and under the same mechanical load, the motor WILL try to maintain the same power.
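In other words, for a fixed power draw, current scales inversely with voltage. A rough, ideal-case sketch (real motors lose some efficiency at reduced voltage, so treat this as the textbook version):

```python
def current_for_power(watts, volts):
    """Current needed to deliver a given power at a given voltage: I = P / V."""
    return watts / volts

# A 36 W motor load at two different supply voltages:
print(current_for_power(36, 12))  # 3.0 A at 12 V
print(current_for_power(36, 6))   # 6.0 A at 6 V -- halve the volts, double the amps
```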

Try a RadioShack power supply... they sell AC-to-DC converters that put out 12 VDC (which is what your motor is used to running off of anyway) for anywhere from $10 to $100, with varying current capacities. I bought a 6 A power supply that handles the two wiper motors driving my hanging corpses, and they run fine all night. (And if you check the equation above, you'll notice that running your motors at 6 VDC and pulling 6 A is mathematically equivalent to running at 12 VDC and 3 A; both come to about 36 W.)

I ran my power supply off a motion detector, so the props are stationary as people approach; then, as they get close, the motion detector trips, the power supply gets its juice from the wall, and voila... the props get moving.

Happy haunting... and I hope this helped,

DannyK

Oh, and also: there is no way to buy a power supply with too many amps. Amps are drawn from the power supply as needed by the total system load. Think of voltage as potential energy and current as flow: you ALWAYS have water pressure in the pipes in your house, but you don't actually SEE any water flow until you ask for it. Electricity works the same way.

Once again, hope this helped.

DannyK