Does older hardware cost MORE than newer hardware when you factor in power costs?
Yes, in most cases
You see this on Reddit all the time: people post saying they just picked up a Dell PowerEdge 2950 (which is 14 years old) and ask for advice on what to do with it. Generally people will tell them it's terrible and to get something newer, but some will tell them to use what they have because it works out cheaper.
For something like a 2950 it seems to be common knowledge that it's not worth it. Yet when it comes to something like a Dell R710, people seem to be in denial about what a waste of money they actually are, even though you can work it out for yourself with some basic maths. I don't know if it's because a large number of people are running them, or because people are just stuck in the past.
For this comparison I am using the USA average electricity cost of $0.12 per kWh. Note that some places are much cheaper and some much more expensive. I am in Houston, TX, where it currently costs me $0.05 per kWh, but some people, especially in other countries, can see much, much higher electricity costs. So keep that in mind for this article: if your power is super cheap, it might make sense to keep old hardware, but if you pay more than the USA average, you really are better off ditching the old stuff.

You should also factor in cooling. If it is cold most of the time where you live, the heat put off by these systems is benefiting you, but if you live somewhere like I do, you need to run a huge air conditioning unit for the majority of the year, and any extra heat is the enemy. The extra cooling means your real electric cost for the server is actually higher.
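The raw electricity math here is simple: watts ÷ 1000 × hours per year × rate. Here's a minimal Python sketch of what a box running 24/7 costs per year (the 150W figure is just an example wattage):

```python
def annual_power_cost(watts, usd_per_kwh):
    """Electricity cost of running a constant load 24/7 for one year."""
    return watts / 1000 * 24 * 365 * usd_per_kwh

# Example: a 150W idle server at the US average rate vs a cheap Texas rate
print(round(annual_power_cost(150, 0.12), 2))  # -> 157.68
print(round(annual_power_cost(150, 0.05), 2))  # -> 65.7
```

Swap in your own wattage (a kill-a-watt meter will tell you) and your local rate from your power bill.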
Because all servers are configured differently this isn't an exact science, but I looked at 30 different posts where people shared their config and their power usage for two different servers. I went with the Dell R710 and Dell R720, as they are both very popular and very similar apart from age. If your numbers vary greatly from these, please let me know. This is all user-generated data from Reddit, ServeTheHome and random forums.
The average IDLE power usage for the R710 in this sample was right around 150W. At $0.12 per kWh, that works out to about $158 per year in electricity alone (0.150 kW x 8,760 hours x $0.12), or roughly $473 over 3 years. The average system had dual Xeon E5620s and 96GB to 128GB of RAM. Those CPUs get you a 7977 total Passmark score for both, which, for comparison, is less than a mid-range laptop CPU, an Intel Core i5-8350U @ 1.70GHz.

The average idle power usage for the R720 was just 80W, which comes to about $84 per year, or $252 over 3 years. The average config was dual Xeon E5-2650 v2 CPUs and 128GB of RAM. That gets you an 18787 Passmark score, which is MUCH higher than the R710. It's actually slightly higher than a brand new high-end desktop CPU, the Intel Core i9-9900.

So now let's add the purchase cost. I found the average cost of a decent R710 with good specs was just $150, while the average cost of an R720 with at least as much RAM was a lot more: $350. But what happens when we add in the electrical cost?

Total 3-year cost for the R710 = $150 + $473 = $623
Total 3-year cost for the R720 = $350 + $252 = $602
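You can check these totals yourself, and also find the break-even electricity rate at which the two servers cost the same over 3 years. A quick Python sketch using the average purchase prices ($150 / $350) and idle wattages (150W / 80W) from above, at the stated $0.12/kWh:

```python
HOURS = 24 * 365  # hours in a year (8760)

def three_year_cost(purchase_usd, idle_watts, usd_per_kwh, years=3):
    """Purchase price plus electricity for an always-on box sitting at idle."""
    kwh = idle_watts / 1000 * HOURS * years
    return purchase_usd + kwh * usd_per_kwh

r710 = three_year_cost(150, 150, 0.12)
r720 = three_year_cost(350, 80, 0.12)

# Break-even rate: where the R710's 70W power penalty exactly
# cancels the R720's $200 higher purchase price over 3 years.
breakeven = (350 - 150) / ((150 - 80) / 1000 * HOURS * 3)

print(f"R710: ${r710:.0f}, R720: ${r720:.0f}")   # R710: $623, R720: $602
print(f"Break-even: ${breakeven:.3f}/kWh")       # Break-even: $0.109/kWh
```

Notice the break-even sits around $0.11/kWh: at a cheap rate like my $0.05/kWh the R710 actually comes out ahead over 3 years, which is exactly the caveat from earlier about cheap power.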
So there you have it: not only will you end up paying more in the long run for the R710, but you also:
- Get less than half the raw performance
- Use more power, hurting the environment
- Have to deal with compatibility issues, like ESXi 6.7 being technically unsupported
- Run older hardware, which should in theory fail sooner
- Have no warranty (some R720s sold are still in warranty)
- Deal with more fan noise (more power = more heat = more noise, generally)
- Have fewer upgrade paths for things like memory and higher-core-count CPUs
- Lack some useful features, like booting from NVMe and USB 3.0
- Usually have to deal with an older BMC (iLO, iDRAC, IPMI, etc.), which is often unsupported and unpatched, and may require versions of Java that went out of support years ago, causing headaches and security issues
On top of all of the above, you also need to deal with the increased heat in the room the server is in. If you are putting it in a small closet, you may end up producing more heat than you can remove from that space, making it impractical.
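For sizing cooling, a useful rule of thumb is that essentially every watt a server draws ends up as heat in the room, and air conditioners are typically rated in BTU/hr, where 1 watt is roughly 3.412 BTU/hr. A quick sketch using the idle figures from earlier:

```python
def watts_to_btu_hr(watts):
    # Nearly all electrical power drawn is dissipated as heat;
    # 1 W running continuously ~= 3.412 BTU/hr.
    return watts * 3.412

print(round(watts_to_btu_hr(150)))  # R710 idle: ~512 BTU/hr
print(round(watts_to_btu_hr(80)))   # R720 idle: ~273 BTU/hr
```

Remember these are idle numbers; under load the heat output can easily double or triple.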
You also now need a larger UPS, which may cost more, or you may even push the limits of the electrical circuit in your specific room/location.
Please leave a comment if you think I am completely wrong, but I personally see little reason to sink money into older hardware. My opinion is that you should buy the very newest you can possibly afford.
I do have one exception, though: if you don't plan to actually run it 24/7, just want hands-on enterprise hardware experience, and don't have the money for a newer box.
I hope this was a good read.