I’m having a perplexing situation here. Wifey and I are heavy-duty boondockers with our RV. Our trailer is a 2012. The TV that came with it was a Jensen (built for traveling, so they say) and was 120 volts AC. We would power it with our older Cobra 2500-watt inverter. We really don’t watch a lot of TV but we use it to set up our Dish receiver so we can listen to the music it provides. However, once in a while it’s nice to watch some TV when the weather is bad outside. We’ve changed out all our lights in the trailer to LEDs, and I figured if we swapped our TV to a 12-volt LED as well, we’d be using less power since the inverter wouldn’t have to supply the power for the TV, and it’s an LED like all the lights.
Well, after the swap (to another Jensen TV) I hooked up my multimeter to see how much of a voltage drop there would be when I turned on the TV. To my surprise, it seems like this new TV pulls more volts than the older one did, even on the inverter. Can this be?
Here are the specs on the two TVs. The old AC TV is listed at 125 watts. I put the info into an electrical calculator I found online. For 120 volts at 125 watts it shows a 1 amp draw. For the new 12-volt TV, the specs say a 3.17 amp draw max. Using the calculator, that works out to about 38 watts. Since I’m a bit out of my league here, do these results point to a larger amperage draw with the 12-volt TV than the 120-volt TV using the inverter? Can you offer any suggestion as to why my fancy new 12-volt TV seems to be drawing more power than the older AC model through the inverter? —Tommy Molnar
That’s an easy one once you understand the basics of voltage, amperage and wattage. Wattage is really the only thing that counts in terms of paying your home electric bill or managing your RV battery usage. I like to say that Watts is Watts, and it really doesn’t matter how you get there. The most basic equation to know is that volts times amperes equals watts, which looks like V x A = W.
Let’s assume a really basic power system with a 12-volt battery drawing 10 amperes of current. That’s 120 watts of power because 12 volts x 10 amps = 120 watts. If we then run the 12 volts from your battery through an inverter it’s changed to 120 volts, but the actual power (watts) doesn’t increase at all. In fact, ignoring the 5% or so of efficiency loss in the inverter electronics, you’ll have the same 120 watts of power on the 120-volt side. But because you’ve boosted the voltage by a factor of 10 (12 volts up to 120 volts), the amperage goes down by the same amount (10 amps down to 1 amp). Looking at the equation again, you can simply multiply 120 volts x 1 amp = 120 watts.
To recap, the battery side is supplying 12 volts x 10 amps = 120 watts, while the AC side of your inverter is supplying 120 volts x 1 amp = 120 watts. You’re simply trading volts for amps on each side of the equation. So if the voltage goes up by a factor of 10, then the amperage goes down by a factor of 10, and vice versa.
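If you like to see the arithmetic laid out, the volts-for-amps trade above can be sketched in a few lines of Python. The numbers are the same ones from the example, and the inverter is treated as lossless for simplicity:

```python
# A quick sketch of the volts-for-amps trade: same watts on both
# sides of an (idealized, lossless) inverter.

def watts(volts, amps):
    """Power in watts is simply volts times amps (V x A = W)."""
    return volts * amps

battery_volts, battery_amps = 12, 10
battery_watts = watts(battery_volts, battery_amps)  # 120 W on the DC side

inverter_volts = 120                                # inverter output voltage
# Ignoring inverter losses, the watts stay the same, so when the
# voltage goes up by a factor of 10, the amps drop by a factor of 10.
inverter_amps = battery_watts / inverter_volts      # 1.0 A on the AC side

print(battery_watts)   # 120
print(inverter_amps)   # 1.0
```

Boost the voltage tenfold and the current falls tenfold; the power never changes.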
Circling back to your original question, your old television draws 125 watts of power from the battery via the inverter, while your new television draws only about 38 watts directly from the battery. That’s roughly a 70% savings in power, which means you can run your new television more than 3 times as long on the same battery charge as you could the old TV through the inverter. So you made a great upgrade!
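To put rough run times on that savings, here’s a quick back-of-the-envelope sketch. The 100 amp-hour battery size and the 90% inverter efficiency are illustrative assumptions of mine, not figures from Tommy’s setup:

```python
# Rough run-time comparison of the two TVs on the same battery.
# ASSUMPTIONS (not from the original question): a 100 Ah battery at
# 12 V, and an inverter that is about 90% efficient.

BATTERY_WATT_HOURS = 12 * 100       # assumed 100 Ah battery at 12 V
INVERTER_EFFICIENCY = 0.90          # assumed inverter efficiency

old_tv_watts = 125 / INVERTER_EFFICIENCY  # AC TV: battery also covers losses
new_tv_watts = 12 * 3.17                  # 12-volt TV wired straight to DC

old_hours = BATTERY_WATT_HOURS / old_tv_watts
new_hours = BATTERY_WATT_HOURS / new_tv_watts

print(round(old_hours, 1))               # ~8.6 hours
print(round(new_hours, 1))               # ~31.5 hours
print(round(new_hours / old_hours, 1))   # ~3.7x longer
```

The exact hours depend on the battery and inverter, but the ratio is the point: the 12-volt TV runs several times longer per charge than the AC model through the inverter.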
Feel better now?
Let’s play safe out there….
Mike Sokol is an electrical and professional sound expert with 40 years in the industry. Visit NoShockZone.org for more electrical safety tips. His excellent book RV Electrical Safety is available at Amazon.com. For more info on Mike’s qualifications as an electrical expert, click here.