Does A PC Use More Electricity Than A TV?

As a PC user, I was curious if my PC used more electricity than my TV. After doing some research, I found that the answer to this question isn’t as simple as it may seem. In this blog post, I’ll be sharing what I’ve learned about the electricity usage of PCs and TVs, as well as some tips and tricks to help you save energy and money.

An office PC uses about half the electricity of a modern TV. However, gaming PCs consume more electricity than TVs when running graphically intensive games, and roughly match TVs when running low-intensity games. When idle, a PC uses less electricity than a TV, whether the TV is idle or in use.

Tips:
PCs generally use less electricity than TVs when they idle, but more when they run graphically-intensive games.
TVs typically consume between 100 and 250 watts, or roughly 0.1 – 0.25 kWh (kilowatt-hours) per hour of use.
TVs with bigger screens, older screens, 4K resolution, and high brightness levels consume more electricity.
A PC with an i5-7600K CPU and a 1080 Ti graphics card will use about 25 watts at idle, 50 watts when watching videos or browsing the internet, 115 watts when playing low-intensity games, and 370 watts when playing graphically intensive games.
You can reduce the power usage of your TV by turning it off when not in use, disabling the auto-brightness feature, dimming the backlight, and setting an auto-sleep feature.

Does A PC Use More Electricity Than A TV?

While your PC and TV are not the biggest offenders of electricity usage (we’re looking at you, heating), knowing which one consumes more electricity can make purchasing one or the other a little easier.

On average, a standard 55-inch TV consumes the same electricity as an everyday office desktop PC. 

What Can Make A TV Use More Electricity Than A PC?

Most modern TVs draw between 100 and 250 watts, which works out to roughly 0.1 – 0.25 kWh (kilowatt-hours) per hour of use. They come in varying sizes and technologies that drastically impact their power usage.
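The watts-to-kWh conversion is simple arithmetic: energy (kWh) = power (W) × hours ÷ 1000. Here's a quick Python sketch of that calculation; the electricity price used in the example is a placeholder, not a real tariff:

```python
def energy_kwh(power_watts: float, hours: float) -> float:
    """Convert a device's power draw and runtime into kilowatt-hours."""
    return power_watts * hours / 1000

def running_cost(power_watts: float, hours: float, price_per_kwh: float) -> float:
    """Estimate the cost of running the device at your local tariff."""
    return energy_kwh(power_watts, hours) * price_per_kwh

# A 250 W TV running for 4 hours uses 1 kWh.
print(energy_kwh(250, 4))  # → 1.0
# At a hypothetical $0.15/kWh tariff, that hour-block costs 15 cents.
print(running_cost(250, 4, 0.15))
```

Swap in your own wattage (from the label on the back of the TV) and your utility's price per kWh to estimate what a device actually costs you per day.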

The make and model also influence electricity usage because manufacturers use different technologies. TVs with bigger screens consume more electricity because a larger surface area needs light. For instance, a 50-inch screen uses an average of 70 watts, while a 75-inch TV will use 115 watts.  

TVs with older screen technologies like cathode-ray tube (CRT) and plasma are less energy efficient, often consuming twice as much power as modern LED, LCD, and OLED TVs. LED TVs consume the least power, followed by LCDs and OLEDs.

Similarly, 4K TVs use twice as much power as a TV that only supports 1080p. However, electricity usage also scales with brightness levels as it does with size. You may still use more power with an LCD TV than an OLED if you watch everything in 4K or HDR quality and on maximum brightness. 

Lastly, check your TV manual or look at the label on the back of the TV to know how many watts your TV consumes. 

What Can Make A PC Use More Electricity Than A TV?

The average office computer uses 35 – 50 watts when browsing, with about 5 percent load on the central processing unit (CPU) and graphics processing unit (GPU). Pushing the CPU to 100 percent usage raises that to approximately 150 watts of electricity.

On the other hand, it’s typical for gaming PCs to use between 150 and 500 watts. Consider a decent gaming PC – something like an i5-7600K with a 1080 Ti graphics card – it will use about 25 watts at idle.

Watching videos or browsing the internet might push it up to 50 watts. Playing a graphically intensive game will consume around 370 watts, while a low-intensity 2D game will use 115 watts of electricity.

When you add the peripherals like the headset, mouse, and keyboard, the idling number climbs to 60 watts, and the gaming number, when on a full load, to 440 watts. Monitors can also use a lot of power, usually adding 30 – 70 watts to the power consumption of a PC. 

Remember that having a 1000-watt power supply in a PC does not mean it will constantly draw that many watts. Most PCs have features to reduce power usage when idle, meaning someone browsing the internet will use much less electricity than someone mining cryptocurrencies.

In other words, the 1000-watt figure refers to the maximum wattage intended for the power supply. 

The CPU and GPU are the most power-hungry components in a PC. Your CPU can use 55 – 150 watts of power, and the GPU 25 – 150 watts, depending on whether you browse, watch videos, or play graphically-intensive games. 
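The figures above can be folded into a rough per-workload power budget. The sketch below uses the article's example numbers for the i5-7600K / 1080 Ti build and its peripheral additions (idle climbing from 25 W to 60 W, full gaming load from 370 W to 440 W); the dictionary layout and function names are my own illustration, not measurements:

```python
# Whole-system draw per workload, from the article's example build.
PC_DRAW_WATTS = {
    "idle": 25,
    "browsing": 50,
    "light_gaming": 115,
    "heavy_gaming": 370,
}

# Extra draw from headset, mouse, keyboard, etc. (article's figures:
# +35 W at idle, +70 W under a full gaming load).
PERIPHERALS_WATTS = {"idle": 35, "heavy_gaming": 70}

def total_draw(workload: str, with_peripherals: bool = False) -> int:
    """Sum the PC's draw with any peripheral overhead for a workload."""
    watts = PC_DRAW_WATTS[workload]
    if with_peripherals:
        watts += PERIPHERALS_WATTS.get(workload, 0)
    return watts

print(total_draw("idle", with_peripherals=True))          # → 60
print(total_draw("heavy_gaming", with_peripherals=True))  # → 440
```

A monitor would add another 30 – 70 watts on top of these totals, per the figures above.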

How Do You Reduce Power Usage Of A TV?

You can reduce your TV’s power usage by dimming the backlight, disabling the ambient light control feature, turning the TV off when you’re not using it, and setting an auto-sleep timer for when you fall asleep in front of it.

You can do a couple of practical things to reduce how much electricity your TV consumes. 

  • Turn off your TV – it’s the most hands-off approach to reducing your TV’s power usage. Leaving it on also places unnecessary wear on the hardware.
  • Disable auto-brightness – most modern TVs have features like ambient light control that use sensors to detect the light in the room and automatically adjust the screen’s brightness. This can make the picture unnecessarily bright and consume more power.
  • Dim the backlight – the backlight is the most significant contributor to a TV’s power usage. Turn it down in your TV’s lighting settings and place the TV in a well-lit room to compensate.
  • Set an auto-sleep feature – if you tend to doze off in front of the TV, check whether it has a sleep timer. The TV will switch itself off after you fall asleep, saving electricity.

Conclusion 

A TV uses more electricity than a PC in the middle ground – everyday viewing. A PC, on the other hand, uses less electricity for light tasks, about the same as a TV when you play low-intensity games, and more than a TV when you play graphically intensive games.

Read more: PC gaming on TV