I thought this video was rather interesting, because at 12:27 the presenter crunches the numbers to find out how many years it would take for a new computer purchase to become more environmentally friendly (in terms of total CO2 emitted) than keeping a less efficient used model.
Depending on the specific use case, it could take as little as 3 years to break even in terms of CO2 if both systems were at max power draw forever, and as long as 30 years if the systems are mostly at idle.
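As a back-of-the-envelope sketch (every constant here, embodied CO2, grid intensity, and the wattages, is an illustrative assumption on my part, not a figure from the video), the break-even works out like this:

```python
# Rough CO2 break-even for replacing an old machine with a new one.
# All constants are illustrative assumptions, not figures from the video.

EMBODIED_CO2_NEW_KG = 300    # assumed manufacturing footprint of the new box
GRID_KG_CO2_PER_KWH = 0.4    # assumed grid carbon intensity
HOURS_PER_YEAR = 24 * 365

def breakeven_years(old_watts: float, new_watts: float) -> float:
    """Years until the new machine's lower draw pays back its embodied CO2."""
    saved_kwh_per_year = (old_watts - new_watts) * HOURS_PER_YEAR / 1000
    saved_kg_per_year = saved_kwh_per_year * GRID_KG_CO2_PER_KWH
    return EMBODIED_CO2_NEW_KG / saved_kg_per_year

# A large gap (both boxes under constant load) pays back in a few years:
print(round(breakeven_years(old_watts=100, new_watts=75), 1))   # → 3.4
# A small idle-power gap takes decades:
print(round(breakeven_years(old_watts=25, new_watts=22), 1))    # → 28.5
```

The takeaway matches the video: the more the machines actually sit at idle, the smaller the power gap and the longer the payback on new hardware.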
This should be well known, but it’s often misunderstood. Lots of Reddit self-hosting threads urge people to buy a new mini PC for its “low power draw”, when it usually draws the same as, or 1-2 watts less than, a laptop from 2012. Performance per watt is much higher on new hardware, though, so if you need massive performance, new is much better; if your system is idling most of the time anyway, there’s basically no difference in buying old.
You just can’t buy too old or the inverse happens and the performance per watt drops. I think you’re right that 2012 is about the cutoff. Maybe 2007 for certain items, like my 2007 iMac. But if you’re getting back to the Pentium 4 era you’ve gone too far and need to turn back around.
Oh god, P4? Yea, those were just 100 watt light bulbs.
My first computer was 33htz. Ran Windows 3.1. And Warcraft 2.
So yeah. The perfect computer.
Well, I would hope for 33 MHz at least… ;-)
No, the graphics from Intel back in 07-10 were crap. 2012-2013 would be my bare minimum, plus USB 3, if only for loading a new OS.
What are you using graphics on a server for instead of just CLI?
Transcoding video for streaming.
How much video card do you really need for transcoding?
I ask because I need to get a video card for transcoding to a 65" 4k TV. I’m converting all my DVDs to MKV and using Jellyfin as my server and client. It transcodes lighter stuff fine (cartoons, old TV shows), but better movies get some artifacts that don’t occur if I have the TV play the same file from a thumb drive.
I’ve read Jellyfin’s recommendation, but it’s really just “use at least this video chipset”, not a particular card, so I’m trying to determine what card I should get.
Server to TV should be local, why are you transcoding? I watch 4K files on my 4K TV without issues, with Kodi because I don’t need Jellyfin for that.
I use Jellyfin to stream when I’m outside my home, and transcoding 4K is what takes a lot of resources.
It’s transcoding because Jellyfin decided it needs to transcode for some reason, frustratingly. I’ve converted to formats/codecs I know the TV supports, and yet Jellyfin still transcodes, with a message about the TV not supporting the codec (yet if I play the same file on the TV from a thumb drive, it works fine in the crappy built-in media player). I’m using the Jellyfin client on the TV because it’s easy to install without a Samsung account, and I don’t think I can get Kodi on it (besides, my experience with Kodi is not great: it’s sluggish on real hardware, so I can only imagine how bad it would be on an underpowered garbage TV, and I don’t know if a client even exists).
From a bigger picture perspective, I think Jellyfin as a client will be better for my family. It’s a simpler interface with less to get them in trouble.
I’ll need transcoding for other/non-local devices anyway, so I still have to address the issue (annoying iPad for example).
If you have any advice about troubleshooting why it’s transcoding, I’m all ears. This is the first time I’ve gotten Jellyfin to work after multiple attempts over the years, across multiple servers and clients, so my experience with it is limited. I’m just glad it works at all - other than Plex, it’s the first media server I’ve gotten working.
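One thing worth checking: Jellyfin will also transcode when the *audio* codec (DTS or TrueHD are common culprits) or a burned-in subtitle isn’t supported, even if the video codec is fine. If you have ffmpeg installed, `ffprobe -v error -print_format json -show_streams file.mkv` dumps every stream’s codec, and a small script can flag what’s forcing the transcode. The supported-codec sets below are hypothetical examples; substitute whatever your TV and client profile actually handle:

```python
import json

# Codecs the client claims to direct-play. These sets are hypothetical
# examples -- substitute what your TV / Jellyfin client profile supports.
SUPPORTED_VIDEO = {"h264", "hevc"}
SUPPORTED_AUDIO = {"aac", "ac3"}

def transcode_reasons(ffprobe_json: str) -> list[str]:
    """Given `ffprobe -print_format json -show_streams` output,
    list the streams that would force a transcode."""
    reasons = []
    for s in json.loads(ffprobe_json)["streams"]:
        codec, kind = s.get("codec_name"), s.get("codec_type")
        if kind == "video" and codec not in SUPPORTED_VIDEO:
            reasons.append(f"video codec {codec} unsupported")
        elif kind == "audio" and codec not in SUPPORTED_AUDIO:
            reasons.append(f"audio codec {codec} unsupported")
    return reasons

# Example: HEVC video would direct-play, but DTS audio triggers a transcode.
sample = json.dumps({"streams": [
    {"codec_type": "video", "codec_name": "hevc"},
    {"codec_type": "audio", "codec_name": "dts"},
]})
print(transcode_reasons(sample))   # → ['audio codec dts unsupported']
```

The Jellyfin playback dashboard also shows a reason string for the transcode; comparing that against the ffprobe output usually narrows it down fast.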
Thanks - at least now I know it shouldn’t be transcoding.
You don’t really want to live-transcode 4K; that takes a tremendous amount of horsepower to do in real time. When you rip your movies, make sure they’re in some format that whatever player you’re using can handle. If that means using a streaming stick in your TV instead of the app on your TV, that’s what you do. I think you could technically do it with a 10th+ gen Intel with embedded video. I know that an Nvidia 2070 Super on a 7th-gen Intel will not get the job done for an upper-end Roku. So all of my 4K video is either H.264 or HEVC, so it all direct-plays on my flavor of Roku.
For my first server, after moving on from two Raspberry Pis to a Proxmox host, I went with an embedded ASRock board, passively cooled, so you know it wasn’t drawing much power. It still had multiple SATA ports, and with the right sticks I could get 32 GB of RAM in.
Seems better to me than a mini PC, where you have no expandability, and especially no chance for RAID.
I use a 2011 ThinkPad X120e as an FTP/Syncthing server. It was underpowered as a laptop from day one, but still works fine as a lightweight server. The best thing about ThinkPads is that TLP allows you to set min/max charging thresholds, so that you can keep an old battery in good shape for … well, I’ll let you know. This one’s 14 years old and still has a four-hour run time.
One thing I’d like to try is “Wake My Potato” for shutdown / automatic restart when a power outage occurs.
Links:
TLP - https://linrunner.de/tlp/index.html
Wake My Potato - https://github.com/pablogila/WakeMyPotato
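For anyone wanting to try the threshold trick: with TLP it’s two lines of config (the 75/80 values below are just an example; pick whatever suits your battery):

```shell
# /etc/tlp.conf (TLP 1.3+; older versions use /etc/default/tlp)
# Start charging below 75%, stop at 80% (percent of full charge)
START_CHARGE_THRESH_BAT0=75
STOP_CHARGE_THRESH_BAT0=80
```

Then apply the settings with `sudo tlp start`.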
@BackYardIncendiary @ProdigalFrog If you have an old Latitude, newer kernels also allow you to set min/max charging thresholds. My Syncthing server (and NAS, and a few other things) is an old 2013/2014 Dell Latitude E7240. It’s not the original battery, but I do keep it in decent shape via charging thresholds.
Yeah, this is why I reuse my old PC parts. Here’s my rough history:
- Built PC w/ old AM3 board for personal use
- Upgraded to AM4, used AM3 build for NAS (just bought drives)
- Upgraded CPU and mobo (wanted mini-ITX), and upgraded NAS to AM5 (did need some RAM)
My NAS power draw was cut in half going from step 2 to step 3 above, and it’ll probably be cut again when I upgrade my PC again.
Old PC parts FTW!
I am currently building a home server; the project timeline has been extended because I had no idea hard drives would be THAT expensive at the capacities I want…
I do have an old computer that is not in use, but I don’t want to run a Bulldozer platform…
So I am basing my new server on the AMD Ryzen 4600G, should be fine
I have a 4-node heater here, but only 2 nodes are in use currently. I got it because it was exceedingly cheap (£75 here in the UK, and all 4 nodes have 3x E5-2620s and 48 GB RAM), but in reality it’s overkill. I’m tempted to build a solar-powered RPi 5 + M.2 server with battery backup just because I can, but it would be for serving websites, both static and WordPress.
I have an i7-2600 prebuilt for my NAS— is idle most of the time, bought it for $100 4 years ago. Have pretty cheap internet at like $0.12 per KWh, but again mostly idle so probably doesn’t cost much anyway.
I did lol at cheap kwh internet - sounds way better than talking energy costs
Older desktops can have a somewhat hefty idle power draw, because parts of the overall system you might not think about, such as the southbridge, contribute more than expected. According to this old review of the i7-2600K, the system idles at 74 W, which at $0.12 per kWh would cost you roughly $77 per year. Though you might want to confirm that with a Kill A Watt meter if you can (libraries sometimes lend them out), since I’m pretty sure that total-system-power chart includes a discrete GPU, so the real number for a GPU-less system is probably around 40-50 W at idle.
If that is accurate, you could potentially replace your i7-2600 with a used Dell Wyse 5070 thin client from eBay for about $40 (in the US); that idles at 5 W, which would only cost you $5 a year at the same rate.
Older thin clients and laptops tend to have much better idle power draw than desktops. For anyone else reading this: if you’re using a desktop for a low-power use case, it’s probably worth measuring its idle power consumption and doing the math to see whether it’d be worth replacing with a more efficient used thin client or office mini PC.
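The arithmetic above is easy to sanity-check yourself (the rate defaults to the $0.12/kWh figure from the comment; swap in your own):

```python
# Annual electricity cost of an always-on box: watts in, currency out.
def annual_cost(watts: float, price_per_kwh: float = 0.12) -> float:
    kwh_per_year = watts * 24 * 365 / 1000
    return kwh_per_year * price_per_kwh

# Figures from the comment above: 74 W idle desktop vs 5 W thin client.
print(round(annual_cost(74), 2))   # → 77.79
print(round(annual_cost(5), 2))    # → 5.26
```

At a higher rate like 0.40/kWh, the same 74 W box runs to roughly 259 a year, which shifts the replace-or-keep math considerably.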
In Germany consumer power is something like 0.40 EUR/kWh, so the economics of running power-hungry hardware might be different. Solar PV might change the equation once again.
I’m wanting something mini at home; I’ve been looking at the GMKtec G9 for its NVMe slots.