Vampire Devices Are Real, and How Much Power You're Wasting Will Shock You


Most modern smart devices and appliances sip power when not in use. But if you have a house full of them, it can certainly add up.

Stop me if you’ve heard this one: a TV in standby mode uses hundreds of dollars’ worth of electricity a year just to keep that little LED lit. And it’s not just your TV—there are devices all over your home quietly burning the dollars in your pocket simply by being left plugged in. They’re electrical vampires!

Well, that sounds quite scary. But how bad is the problem really? Today we’re going to measure some household devices on standby and finally answer the question of whether or not “vampire devices” are really a problem.

Spoiler: unfortunately, they very much can be, but it varies a lot. How much are you willing to pay for convenience, and what compromises can you make?

When we talk about "standby mode," we're referring to appliances and devices that are ready to perform some function on demand: a TV waiting for you to press the power button on the remote, a Qi charger waiting for you to drop your phone on it, or a printer ready to print wirelessly whenever you want. The actual power usage when the device is in use will be higher, but that’s intentional usage we won’t be covering here.

We want to know only about power being wasted when we're not using something.

Of course, devices will use some power when they’re in standby mode—that little red LED isn’t powered by fairy dust. The question is whether it’s a significant amount or not. In most cases, it's not just an LED. A smart speaker has billions of electrons flowing around every second, listening for your wake word just so it can finally answer the question, “How many grams is one cup of flour?”

But whether that uses a significant amount of energy is subjective. Could you be bothered to unplug a device if it saved you a whole dollar per year? How about ten dollars, or a hundred?

It’s also worth noting that the answer has changed over time. When the concept of standby modes first came around, circuits were inefficient and did in fact use an awful lot of power to do mostly nothing. Energy was cheap and plentiful. But as new regulations were introduced, electrical components were miniaturized, and their power requirements went down. A TV bought in 2022 is completely different from one bought in 1990. Circuits today are incredibly efficient.

In this article, I measure a number of household appliances, electronics, and smart home devices—but you should confirm your own numbers with a low-cost watt meter.

Before we delve into some real-world numbers, though, we should learn a little about how electricity is measured.

We’re going to use the kilowatt-hour (or kWh) as a common measurement because that’s how you pay your bill. One kilowatt-hour—also known as one unit of electricity—is the amount of energy that a device drawing 1kW of power consumes if left running for one hour. Most people pay a set rate per kilowatt-hour (per unit) that they consume.

In the UK, we’re currently paying around 40 cents per unit on a standard variable rate tariff. That’s absurdly high and probably double what the average consumer might be paying in the US. But it is what it is, and your prices may rise soon too.

In real terms, a small space heater might be rated at 2kW. That means if we left it on for one hour, that space heater alone would have used two kilowatt-hours of energy (2kWh). Or, in other words, two units of electricity. At a rate of 40 cents per unit, it would cost us 80 cents to run that heater for an hour!

Most electrical devices are not measured in kilowatts, however, but in watts (W). 1000 watts (W) is 1 kilowatt (kW), so you can just divide the power in watts by 1000 to get the number of kilowatts.

For instance, a laptop might use 65W when in use, or 0.065kW. Multiply that by the number of hours it’s on to get a unit consumption, or kWh. If you used that laptop for eight hours a day, every day of the year, it would cost: 8 (hours) x 0.065 (kW) x 365 (days) x 0.40 ($ per unit) = $75.92. So it would cost me $76 per year to use that laptop (that is, screen on) for eight hours every day.

Hate math? Use a free online power cost calculator, and be sure to set your units correctly as watts or kilowatts.
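If you’d rather keep the math in a quick script, the same arithmetic fits in a few lines of Python. This is only a sketch of the formula above; the $0.40 unit rate and the heater and laptop wattages are the example figures from this article, so swap in your own tariff and measurements.

```python
UNIT_COST = 0.40  # $ per kWh; the example rate used in this article


def annual_cost(watts, hours_per_day, cost_per_kwh=UNIT_COST):
    """Estimate the yearly cost of a device drawing a steady wattage."""
    kwh_per_year = (watts / 1000) * hours_per_day * 365
    return kwh_per_year * cost_per_kwh


# The 2kW space heater, run for one hour each day:
print(f"Heater: ${annual_cost(2000, 1):.2f} per year")  # $292.00

# The 65W laptop, used eight hours a day:
print(f"Laptop: ${annual_cost(65, 8):.2f} per year")    # $75.92
```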

Here are my results for a selection of the home devices I measured, in order of usage (standby wattage / annual units):

The TV standby usage shocked me but could be due to additional smart circuitry for the Ambilight, built-in Google TV, or Wi-Fi connectivity. Whatever the reason, it's completely unjustifiable. Most TVs should use around 3W on standby, so the key takeaway is to measure for yourself. I’ll be unplugging that immediately.

I’ve also listed above just one of the ten or so smart Wi-Fi-connected LED lights that I have plugged in around the home. Taken in isolation, 2.8W isn’t a lot. But it all adds up. With ten of these plugged in, I’m paying about $100 a year to power them when they’re not even turned on! That’s a convenience I think I could do without. And how many random phone chargers do you have plugged in all the time?
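To put a rough number on "it all adds up," here’s a quick sketch of the fleet-of-devices math. The 2.8W-per-bulb figure is my measurement above; the idle phone charger draw is a hypothetical placeholder, so measure your own before trusting the total.

```python
UNIT_COST = 0.40  # $ per kWh, the example rate from earlier

# Always-plugged-in standby loads. The bulb figure is measured above;
# the idle phone charger figure is purely a hypothetical example.
standby_watts = {
    "smart LED bulbs (10 x 2.8W)": 10 * 2.8,
    "idle phone chargers (3 x 0.3W)": 3 * 0.3,
}

total_w = sum(standby_watts.values())
annual_kwh = total_w / 1000 * 24 * 365  # drawing power around the clock
print(f"{total_w:.1f}W of standby -> {annual_kwh:.0f}kWh -> ${annual_kwh * UNIT_COST:.0f} a year")
```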

Next, let’s look at devices that aren’t necessarily on standby, but are generally on all the time to perform some function. These are things I would personally never consider turning off, but it’s important to know how much power they do actually use.

Here we can see even more interesting data. I had no idea that Starlink used so much power, though it makes sense given it's literally beaming a signal to space and back. I'm impressed that my office desktop is so efficient, though. Even with the monitor on, it still only uses 65W to power everything. That’s a convenience I’m more than happy to pay for, as I regularly dip in and out of writing and web browsing every day. Having to turn it on and off would kill my productivity.

To put this all in context, boiling an 8-cup 3.2kW-rated kettle uses 0.17kWh each time. Once a day, that adds up to 62kWh annually. You probably have more than one cup of coffee a day, of course.
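As a quick sanity check on those kettle numbers: 0.17kWh from a 3.2kW element works out to roughly three minutes of heating per boil, and the annual figure is just that multiplied by 365. The values below are simply the ones quoted above.

```python
kettle_kw = 3.2       # the kettle's rated power
kwh_per_boil = 0.17   # energy used for one full boil (figure quoted above)

minutes_per_boil = kwh_per_boil / kettle_kw * 60
annual_kwh = kwh_per_boil * 365  # one boil per day

print(f"~{minutes_per_boil:.1f} minutes of heating per boil")  # ~3.2 minutes
print(f"{annual_kwh:.0f}kWh per year at one boil a day")       # ~62kWh
```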

I'm going to avoid the obvious answer of "just unplug stuff" because it's both patronizing and hypocritical when I can't be bothered to physically pull a plug out either. But there are some smarter ways of doing this.

If you have a number of devices sitting on standby for most of the day, it might be worth putting a smart plug on an extension lead to power them all. The idea sounded absurd when I first heard about it: use a smart device that itself uses power to turn off some other smart devices so they don't waste power!

But as I found above, a typical smart plug will use less than 1W of power when idle. So if you have a total of 10W from an entertainment system in standby, switching it to power on with a smart plug when needed will save you at least 90% of that power usage. Bear in mind that not all smart plugs are the same.
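Here’s that trade-off in numbers, assuming the 10W entertainment system from the example above and a smart plug that idles at around 1W:

```python
standby_w = 10.0  # entertainment system left in standby around the clock
plug_w = 1.0      # the smart plug's own idle draw (typically less than this)

saving = (standby_w - plug_w) / standby_w
print(f"At least {saving:.0%} of the standby power saved")  # 90%
```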

I use Philips Hue, which runs on a separate Zigbee mesh network. Anything based on Wi-Fi will use a little more. There are other ways a smart plug can help you save energy, too.

Our TV and smart lighting in the bedroom, for instance, will benefit immensely from a smart plug. Since we only watch TV in the evenings, I’ll compromise by putting it on a schedule as well as maintaining some level of smart control. From 9pm, the TV and smart lighting will be ready to use as normal and automatically turn off at midnight. That cuts down on power usage by almost 85%, even when we account for the smart plug itself.
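Here’s a rough version of that bedroom schedule, with the plug powering everything only from 9pm to midnight. The 15W TV standby figure comes from my measurements above, but the two-bulb lighting load and the 1W plug draw are assumptions, which is why this sketch lands a little under the ~85% quoted:

```python
tv_standby_w = 15.0         # TV standby draw (measured above)
lights_standby_w = 2 * 2.8  # two idle smart bulbs (assumed count)
plug_idle_w = 1.0           # the smart plug itself, powered 24 hours a day

always_on_wh = (tv_standby_w + lights_standby_w) * 24                    # in standby all day
scheduled_wh = (tv_standby_w + lights_standby_w) * 3 + plug_idle_w * 24  # 9pm-midnight only

saving = 1 - scheduled_wh / always_on_wh
print(f"Daily standby energy cut by about {saving:.0%}")  # ~83% with these numbers
```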

Another option is to use a low-cost mechanical timer plug. I’ve now created a “charging station” shelf with various battery charging devices plugged in (power tools, AA batteries, cameras, etc.). The timer activates only overnight, from 12:30am to 4:30am, when our power cost drops to 9 cents per unit. So rather than keeping this multitude of chargers on standby all the time for convenience, we plug in whichever batteries are empty and collect them the following morning. There’s no messing around with plugs, and we rarely find ourselves without a spare battery anyway.
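Part of the saving here is simply the tariff: any charging done in that window is billed at 9 cents per unit instead of 40. A hypothetical illustration (both rates are from this article, but the 0.5kWh of weekly charging is an invented figure):

```python
peak_rate = 0.40           # $ per kWh, standard rate from this article
offpeak_rate = 0.09        # $ per kWh, overnight rate from this article
weekly_charging_kwh = 0.5  # hypothetical amount of battery charging per week

peak_cost = weekly_charging_kwh * peak_rate * 52
offpeak_cost = weekly_charging_kwh * offpeak_rate * 52
print(f"${peak_cost:.2f} vs ${offpeak_cost:.2f} a year "
      f"({1 - offpeak_rate / peak_rate:.0%} cheaper per unit)")
```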

If I add up all the smart lighting, TV, chargers, and other things in standby mode, it comes to about 20% of our total resting power usage (that is, the lowest the house will use without us actively doing anything).

Realistically, I could live without most of those devices being on all the time without too much inconvenience. It won't save me thousands of dollars on the annual bill, but it might save a hundred or two. If you're less concerned with cost savings and more about your environmental impact, other smart home devices can help reduce your carbon footprint.

What I found by investigating my own multitude of devices was that there's an enormous variation in power usage and efficiency—even among modern devices. My entire work desktop setup uses around 65W—and that's when I'm actively using it. Most of the time, it uses 30W with the screen off, and the convenience of being able to access my work quickly is something I'm more than happy to pay for.

Our TV, on the other hand, uses 15W to do absolutely nothing. And we don't even watch that much TV. What a waste!

So, the answer to the question of whether “vampire devices” are real is: it depends. Modern smart devices sip power when not in use. But if you have a house full of them, that quickly adds up to a torrent of wasted power. Check your own usage, and you may be surprised.

Then it’s up to you to decide: how much is that convenience worth to you?

James has a BSc in Artificial Intelligence and is CompTIA A+ and Network+ certified. When he's not busy as Hardware Reviews Editor, he enjoys LEGO, VR, and board games. Before joining MakeUseOf, he was a lighting technician, English teacher, and data center engineer.
