Popular Myths When Choosing Components

I often hear the same rumors and half-truths when helping someone build their first PC. They read that X is better than Y in some article or blog, and now it's stuck in their brain. I have brands I tend to favor, but I've been building these things long enough to watch trends change; today's "facts" become tomorrow's untruths fairly regularly. The following are claims I've heard or argued over, directly or in forums, sometimes with some patently wrong people. Try to avoid these when putting together the components for your build.

Intel Processors Are Better Than AMD Processors for Gaming
I've been building PCs long enough to remember when AMD blindsided Intel with the release of the Athlon (K7) in 1999. AMD's architecture did more work per clock than Intel's, making the K7 the fastest CPU available. The Thunderbird core that followed was faster still, and clock speed stopped being the only measure of a CPU's performance. The Athlon 64 that came after kept AMD in the lead. I built gaming systems on AMD CPUs until Intel released its Core 2 parts in 2006. I kept a tiny light lit, hoping it would happen again, and starting in 2017, it seems to have. Right now, in 2024, AMD has the top end.

This is a rumor that toggles between true and false. At the high end in August 2024, AMD's recently released Ryzen 9 9950X beats the Intel Core i9-14900K by a decent margin in high-performance applications like Blender and Photoshop. For gaming, the AMD Ryzen 7 7800X3D still yields higher frames-per-second (FPS) results. When the 9800X3D comes out (if it does), it will probably supplant the 7800X3D. This is one of the rare times that AMD holds both the productivity and gaming CPU crowns. It doesn't help Intel that it has acknowledged its 13th and 14th-generation processors were feeding too much voltage to their cores. The "fix" is a microcode patch that the various board makers are distributing through BIOS updates.

The best approach is to determine the maximum you're willing to spend, then figure out the best CPU - Intel or AMD - within that budget. As noted, AMD currently leads in both gaming and production application performance, depending on which CPU we are talking about.

My favorite CPU and motherboard review sites are the YouTube channels Gamers Nexus and Hardware Unboxed. They cover this hardware in more depth than anyone else; it's all they do. Since the CPU dictates which motherboards you can use, I pick that component first.

Nvidia Graphics Cards Are Better Than AMD Graphics Cards (or Vice Versa)

This is a rumor I wish felt more like a rumor. In sheer number-crunching, Nvidia and AMD have done a pretty good job slotting their video cards into a price/performance ladder - except at the very high end, where the RTX 4090 is priced higher than it should be (in my opinion) because it has no competition. It should be as simple as figuring out your budget and buying whichever manufacturer's card fits within that range. The truth - for me, at least - isn't that simple. If the "correct" choice is an AMD card, I balk. If the budget can stretch to the next-highest Nvidia card from that initial choice, I'll choose Nvidia every time. Why? The answers are video drivers and build quality.

I still have at least one AMD card in one of my systems. It is not, however, a gaming system; it's a Linux box with the kind of AMD card typically used for home theater PCs. My last "gaming" AMD card was an ATI Radeon 9700 (the R300 core, from 2002). It had to be replaced because of bad capacitors, which produced the pink checkerboard of death whenever I tried to play a game. I had bought it directly from ATI, so I had to ship it to Canada for replacement. I remember this because Canada wanted me to pay an import tax on the declared value of a card I was returning for warranty work.

Later, a friend tried to use an MSI card based on AMD's R9 290X GPU. It was the correct card for his budget, and the R9 290X had decent reviews. His first card booted fine but would lock up whenever he tried a game. Any manufacturer can have an occasional build issue, so he got a replacement. The second one had the same problem. This time, we installed it in my gaming rig, thinking the issue might be specific to his system. It got a little farther, but running 3DMark locked the system up within a minute. Eventually, that card died so badly that I couldn't even get back into Windows long enough to uninstall it. The AMD Catalyst drivers were left in such a bad state that the system blue-screened when I tried to put my actual video card (an Nvidia) back in. I had to nuke and pave my OS to get the system working again. Perhaps this was an MSI issue specific to its AMD cards. We replaced that card with an MSI card based on the Nvidia GTX 970. It cost more, but it has given him no problems since. That leaves me somewhat soured on AMD cards.

The bottom line: technically, you shouldn't pick Nvidia or AMD as always being the best. Look at what makes sense for the budget available. My favorite GPU review sites are (again) the YouTube channels Gamers Nexus and Hardware Unboxed. As with CPUs, they cover GPU testing in more depth than anyone else.

Installing a Larger Power Supply Means My System Will Use More Power

This is a question I've answered online on more than one occasion. I have taken the liberty of copying myself. I'll sue myself for infringement later.

Computer power supply units (PSUs) are on-demand current-draw devices. That is, they only supply as much power on the various voltage rails (3.3V, 5V, 12V, etc.) as the components in your PC require. As such, if you were to replace your current power supply with one with a larger rating (without changing any other components in your system), the difference in current draw should be negligible. More than that, if you replace an old, poorly designed 300W PSU with a new, more efficient 550W model, it's even possible the draw at the wall will decrease, not increase, thanks to the higher efficiency. Efficiency is the ratio of the power delivered to the computer components to the power drawn from the wall socket; the loss manifests itself as heat. A PSU that is 85% efficient wastes less electricity in the form of heat than a 70% efficient PSU.
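
To make that concrete, here's a quick back-of-the-envelope sketch in Python. The 250W load and the 70%/85% efficiency figures are just illustrative values, not measurements from any particular unit:

    # Efficiency = DC power delivered to components / AC power drawn
    # from the wall, so AC draw = DC load / efficiency.
    def wall_draw_watts(dc_load_w, efficiency):
        return dc_load_w / efficiency

    dc_load = 250  # watts the components actually ask for (illustrative)

    old_psu = wall_draw_watts(dc_load, 0.70)  # old 70%-efficient unit
    new_psu = wall_draw_watts(dc_load, 0.85)  # newer 85%-efficient unit

    print(f"70% efficient PSU pulls {old_psu:.0f}W from the wall")  # ~357W
    print(f"85% efficient PSU pulls {new_psu:.0f}W from the wall")  # ~294W
    print(f"The old unit burns an extra {old_psu - new_psu:.0f}W as heat")

Same components, same 250W of useful work; the less efficient unit simply pulls more from the wall and sheds the difference as heat.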

Power supplies operate most efficiently when driven at 50-75% of their rated maximum load. Let's say you've been adding hard drives over time (even external ones count if they're powered from your USB ports) and have also upgraded your video card. The 550W power supply was fine when you first got your system, but now your worst case is, say, 460W. (It won't always draw that much, but a video game that drives the graphics card and CPU hard may hold it at that draw for extended periods.) Your 550W PSU is operating at 84% of its rated max. An 800W PSU, on the other hand, would only be operating at 58% of its rated max. The 550W supply loses efficiency converting 120V AC to 3.3V, 5V, 12V, etc. DC, and when loaded above 75%, it may pull more current from the wall to deliver 460W than the 800W supply would. (It's more complicated than this because what matters is how much current each voltage "rail" needs, such as 12V compared to 3.3V, rather than just the total power. I've also ignored thermal design power altogether.)
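
The load-percentage arithmetic is simple enough to sketch, too (same hypothetical 460W worst case as above):

    # Where a given DC load lands in each PSU's operating range.
    # PSUs are typically most efficient at 50-75% of rated load.
    def load_percent(dc_load_w, psu_rating_w):
        return 100 * dc_load_w / psu_rating_w

    worst_case = 460  # hypothetical worst-case system draw

    for rating in (550, 800):
        pct = load_percent(worst_case, rating)
        spot = "inside" if 50 <= pct <= 75 else "outside"
        print(f"{rating}W PSU: {pct:.0f}% of rated max "
              f"({spot} the 50-75% sweet spot)")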

That said, it's a good idea to check the output rating of an existing PSU and upgrade it when adding components with a higher current draw. Some of the very high-end graphics cards now require 350-400W or more when running full blast. Pair that with a CPU pulling 250-300W, and you see why 1000W supplies are needed. If you upgrade your graphics card, you will probably draw more current from the wall. Purchase a cheap Kill-A-Watt power meter to measure the actual power draw. Remeasure after upgrading a component to ensure the PSU is operating within safe margins.
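
If you'd rather size a new PSU on paper first, the same 50-75% guideline works in reverse. The component figures below are placeholders pulled from the ranges above; use real spec-sheet numbers (or a Kill-A-Watt reading) for your own build:

    # Rough sizing: add up worst-case component draws, then check
    # where that lands on a candidate PSU's rating.
    gpu_w = 375   # placeholder: mid-range of the 350-400W figure above
    cpu_w = 275   # placeholder: mid-range of the 250-300W figure above
    rest_w = 100  # placeholder: motherboard, drives, fans, etc.

    worst_case = gpu_w + cpu_w + rest_w  # 750W
    psu_rating = 1000

    load = 100 * worst_case / psu_rating
    print(f"{worst_case}W worst case on a {psu_rating}W PSU = {load:.0f}% load")
    # -> 75% load: right at the top of the 50-75% sweet spot, which is
    #    why 1000W (or larger) units make sense for builds like this.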