At last, something I can talk about.
There has been some conversation here about the current through a lamp
increasing as the voltage decreases, on the grounds that the lamp is rated at
a particular wattage; Ohm's law was used (abused) to show that this was true.
However, that isn't quite the way it works.
The wattage rating on a lamp indicates the maximum wattage it will handle
without releasing excess smoke. A lamp can be thought of as a
constant-resistance device: no matter how much smoke you force through it,
the resistance stays essentially the same. (It will, in fact, change as the
filament heats up, but that is inconsequential here.)
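As a concrete (and purely illustrative) example: suppose the hot filament
measures about 240 ohms, which is roughly what a 60 watt, 120 volt household
bulb works out to when lit. The constant-resistance assumption says it stays
near 240 ohms whether you feed it 100 volts or 130 volts.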
Wattage is calculated as volts times amps, W = V x I. Since the lamp is a
fixed resistance, you find out how much current will flow through it by
applying Ohm's law, V = I x R, or I = V/R. Therefore, as the voltage drops,
so does the current, and as they both drop, so does the wattage. If the
voltage increases, so does the current. If you increase the voltage too
much, you will have too much wattage, exceed the rating for the bulb, and
the filament will break and spill smoke all over the inside of the bulb,
which will keep any more light from coming out of it.
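If it helps to see the arithmetic in one place, here is a minimal sketch in
Python (the 240 ohm figure is the same illustrative number as above, treated
as constant just as the argument does):

    R = 240.0  # ohms; assumed constant, an illustrative figure only

    for volts in (100, 110, 120, 130):
        amps = volts / R      # Ohm's law: I = V / R
        watts = volts * amps  # wattage: W = V x I (equivalently V^2 / R)
        print(f"{volts} V -> {amps:.3f} A, {watts:.1f} W")

Running it shows the wattage climbing from about 42 watts at 100 volts to
about 70 watts at 130 volts: for a fixed resistance, lowering the voltage
lowers both the current and the wattage, and raising it raises both until
the rating is exceeded and the smoke gets out.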
I hope that this helps to clarify the situation. If more is needed, or
wanted, I will be happy to hold forth at great length on this and related
topics on the study of smoke.
Ted
Edward B. (Ted) Weiler, tweiler@eskimo.com
Engineering Manager, Olympic Medical
Director, Volunteers NorthWest
http://www.eskimo.com/~tweiler/vnw.html
Membership, MG Car Club NorthWest Centre
http://www.eskimo.com/~tweiler/mgccnwc.html