I bought a Panasonic 42" plasma HDTV in 2008. It weighs about 40 pounds and cost $900. The picture is still bright and clear, but the bottom 3" of the screen has gone black, which hides subtitles and closed captioning.
I bought a 43" Amazon Fire Omni TV on Prime Day (July 13) for $240. It weighs about 15 pounds.
My electric bill just came in. Last month, it was $134.70. This month it’s $78.62. I can’t think of anything that changed except for the TV.
Does that make sense? Could a plasma TV use that much electricity?
If you still own the plasma TV, you could try an electrical use meter like Kill-A-Watt to measure the electricity usage of each TV. For my area, there is too much variability in usage each month to just compare electric bills between months.
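If you do meter both sets, converting the readings into a monthly dollar figure is just watts × hours × rate. Here's a rough sketch of that conversion; every number in it is a placeholder to be replaced with your own Kill-A-Watt readings, viewing hours, and your utility's rate:

```python
# Rough monthly-cost comparison from measured wattages (all numbers are placeholders).
PLASMA_WATTS = 350      # hypothetical meter reading for the old plasma
NEW_TV_WATTS = 60       # hypothetical reading for the new set
HOURS_PER_DAY = 5
RATE_PER_KWH = 0.15     # your utility's rate in $/kWh

def monthly_cost(watts, hours_per_day=HOURS_PER_DAY, rate=RATE_PER_KWH, days=30):
    kwh = watts / 1000 * hours_per_day * days
    return kwh * rate

print(f"Plasma:     ${monthly_cost(PLASMA_WATTS):.2f}/month")
print(f"New TV:     ${monthly_cost(NEW_TV_WATTS):.2f}/month")
print(f"Difference: ${monthly_cost(PLASMA_WATTS) - monthly_cost(NEW_TV_WATTS):.2f}/month")
```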
Seems like a lot…depends on how much TV you watch, though.
Guesstimating some numbers…
$56 difference in the bill
Divide by your rate in $/kWh (assume 0.15): gives 374 kWh decrease
You should be able to check this on your bill
Assume 5 hrs per day times 30 days = 150 hrs/month
That gives you 374 kWh / 150 h ≈ 2.5 kW, or about 2500 watts
That means your Plasma is consuming 2500 watts more than your new TV.
Your new TV is probably 100-200 watts, total.
So your plasma would need to be drawing 2600-2700 watts.
This is not possible. A 15 amp circuit, at most, can have a load of about 1800 watts.
And realistically I doubt the Plasma is more than 1000w.
So you need to rerun these calculations with your actual electric rate and actual viewing hours (there's a runnable sketch of the math just below).
If you have 300 viewing hrs/mon, then you’d be at 1250 watts increase…possible but still a lot.
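Here is the same back-of-the-envelope math in runnable form, using the assumed $0.15/kWh rate and 5 hours/day from above so it's easy to swap in actual numbers:

```python
# Work backwards from the bill change to the implied extra draw of the old TV.
bill_difference = 56.00   # $ decrease in the monthly bill
rate_per_kwh = 0.15       # assumed rate in $/kWh -- use the one printed on your bill
hours_per_day = 5         # assumed viewing time
days_per_month = 30

kwh_saved = bill_difference / rate_per_kwh        # ~373 kWh
viewing_hours = hours_per_day * days_per_month    # 150 h
extra_kw = kwh_saved / viewing_hours              # ~2.5 kW

print(f"Implied extra draw of the old TV: {extra_kw * 1000:.0f} watts")
```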
This is a good reminder for everyone to know how much electricity your heavily used appliances consume!
On average, a later-model plasma TV will use about twice as much power as a comparable LCD or LED, but the number is still quite small. Using 100 watts and running it 20 hours per day, it would cost about 50¢ a day to leave the plasma on at average US prices. You'd save 25¢/day with another flat screen; you can do the ratios to figure savings for more/fewer hours, higher/lower rates, etc.
The earlier model plasma screens were much less efficient (about 5x) so if yours is of that vintage (I forget when they stopped making them) then the savings could be more meaningful.
Not that it’s relevant to the discussion, but an old CRT TV of half the screen size would use about the same. It was hard to get a CRT screen size of 42”, but I had an old Sony Trinitron that was close.
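Since a few per-day cost figures have come up, here's the underlying watts × hours × rate arithmetic as a tiny sketch; the wattage and rate in the example call are placeholder assumptions, not anyone's measured numbers:

```python
def daily_cost(watts, hours_per_day, rate_per_kwh):
    """Daily running cost in dollars for a device with the given draw."""
    return watts / 1000 * hours_per_day * rate_per_kwh

# Example: a hypothetical 100 W difference between two sets, 20 hours/day,
# at a hypothetical $0.15/kWh rate.
print(f"${daily_cost(100, 20, 0.15):.2f} per day")   # ~$0.30/day
```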
"assume 0.15"
This Californian laughs at your number:-)
Off peak is $0.33, and peak is $0.47 for me.
I look longingly back at the days of $0.15.
I know Wendy is in Washington so likely not as bad as here.
Alan
Is your meter only read every second month, with the bill estimated in between?
Are there any utilities left that are still using meters that must be read in person? Hasn’t everyone pretty much switched to smart meters that send a boatload of data back to the utility for time of use billing, and in the process eliminated meter reading?
" I can’t think of anything that changed except for the TV."
Check for a possible hot water leak (drips or a leaking water heater) if you use electric for that service. An air conditioner can also add a significant load depending on weather changes or window and door air leakage, and wet insulation might cause issues too. Dust buildup on the refrigerator coils can make the compressor run more often as well.
Our plasma TV generated so much heat that it could warm that section of the living room. When we switched, we noticed not only a lower electric bill but a lower cooling bill. Heating season was another story.
I have the same TV!!! Or at least a very similar 42" Panasonic plasma bought in the same year.
It could use just about $50 per month's worth of electricity if you're in California and leave it on literally 24 hours a day, seven days a week. And that wouldn't be $50 more than another TV…we're talking about its total use all on its own.
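As a rough sanity check on that figure (the wattage and rate below are assumptions for illustration, not measurements of this particular set):

```python
# Sanity check on the "~$50/month if left on 24/7" estimate.
PLASMA_WATTS = 300        # assumed typical draw for a 2008-era 42" plasma, not a spec
RATE_PER_KWH = 0.25       # assumed rate for illustration
hours = 24 * 30           # one month, on around the clock

kwh = PLASMA_WATTS / 1000 * hours     # 216 kWh
print(f"~${kwh * RATE_PER_KWH:.0f} per month")   # ~$54
```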
Something changed besides the TV, despite you not being able to think of what it might be.