How to avoid frying your graphics card playing Amazon's New World

New World, new day, new way to slay. (Screenshot: Amazon Game Studios)

Amazon Game Studios’ New World has been all the rage recently, with big streamers like Shroud playing it instead of their usual first-person shooters.

The game recently amassed close to a million concurrent players on Steam alone, and the momentum doesn’t look like it’s slowing down anytime soon.

But, as is usually the case with any large-scale launch like this, there are bound to be plenty of technical problems, and I have had the “pleasure” of running into most of them while diving into the world of Aeternum, whether on the technical side of things or with server issues.

However, through all of this, one major problem with the game stood out the most for me and made me absolutely paranoid, so I decided to do a deep dive to answer this very important question:

Will my Graphics Card explode?

During the beta, there were reports of New World bricking and frying computer graphics cards. Amazon have claimed that they’ve "fixed" this issue during the beta, but there are still reports of GPUs dying (and even literally smoking up) in the official release.

During my own observation (yes, I monitor EVERYTHING that goes on in my computer, because it’s my baby), I noticed something very peculiar about New World’s power draw that may explain why it is causing GPUs to die.
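
If you want to keep an eye on this yourself, here is a minimal sketch of the idea, assuming an Nvidia card with the nvidia-smi tool available (AMD users would read the same board power figure from Radeon Software or a monitoring tool like GPU-Z instead):

```python
# Hedged sketch: poll GPU board power with nvidia-smi and flag spikes.
# Assumes an Nvidia GPU with nvidia-smi on the PATH; on an AMD card like mine
# you would read board power from the vendor's own tools instead.
import subprocess
import time

SPIKE_THRESHOLD_W = 400.0   # flag anything above the card's expected power limit
POLL_INTERVAL_S = 1.0

def read_power_draw_watts() -> float:
    """Return the current board power draw in watts, as reported by nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return float(out.stdout.strip().splitlines()[0])

samples = []
try:
    while True:
        watts = read_power_draw_watts()
        samples.append(watts)
        if watts > SPIKE_THRESHOLD_W:
            print(f"SPIKE: {watts:.0f} W")
        time.sleep(POLL_INTERVAL_S)
except KeyboardInterrupt:
    if samples:
        print(f"avg {sum(samples)/len(samples):.0f} W, peak {max(samples):.0f} W")
```

Bear in mind that one-second polling will miss the millisecond-scale transients that proper measuring equipment can catch, so treat the numbers as a rough picture rather than the whole story.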

I own an Asus ROG Strix AMD Radeon RX 6900 XT, which comes with three 8-pin power connectors. Each 8-pin connector is rated to deliver up to 150W, so in theory the card could draw a good 450W through its cables, plus an additional 75W through the PCIe slot, for a total of up to 525W from the power supply.
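
That ceiling is just simple addition of the rated figures, so here is the arithmetic spelled out (assuming the usual spec ratings of 150W per 8-pin connector and 75W from the PCIe slot):

```python
# Back-of-the-envelope ceiling for what the cables and slot are rated to deliver.
# Assumes the usual spec ratings: 150 W per 8-pin connector, 75 W from the PCIe slot.
EIGHT_PIN_W = 150
PCIE_SLOT_W = 75

def rated_ceiling(num_8pin_connectors: int) -> int:
    return num_8pin_connectors * EIGHT_PIN_W + PCIE_SLOT_W

print(rated_ceiling(3))  # ROG Strix RX 6900 XT: 3 * 150 + 75 = 525 W
```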

Asus’s Strix cards have always been robust, and the card’s components are well equipped to handle that much power. However, the card is hard-limited by the chip’s power limit, which is set to a maximum draw of 381W.

During my time with the card, I have never seen it spike anywhere beyond the 400W mark, no matter what game or benchmark I ran, thanks to that power limit. Whenever an application pushes the GPU harder, the power limit reins the card’s draw back down to around 380W.

New World has made my GPU’s power draw spike to around 480-500W.

When this happens, I also have random freezes in-game and I have to restart to get it running again.

This is extremely concerning, because it means the game is pushing the GPU to operate well outside the limits it was configured for.

I know my GPU is able to hit those power draw numbers without much of a problem, but for it to spike way past its power limit is something Amazon Game Studios needs to look at ASAP.

I’m lucky enough to have a strong GPU that can handle these spikes, but what if I had something less capable?

What if I had a card with only two 8-pin power connectors, which essentially allows a maximum power draw of only 375W?

This is also, more often than not, why Nvidia cards are the more widespread victims. An RTX 3090 commands a base power draw of 350W, and if the card only has two 8-pin connectors, it is already approaching the maximum 375W its cables and slot are rated to deliver.
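
Running the same arithmetic for that two-connector scenario shows how little headroom is left (the 350W figure is the 3090’s typical board power, so treat this as a rough illustration):

```python
# Same connector arithmetic for the two 8-pin case discussed above.
# 350 W is the article's base board power figure for an RTX 3090.
rated_ceiling = 2 * 150 + 75      # two 8-pin connectors plus the PCIe slot = 375 W
base_power_draw = 350             # typical RTX 3090 board power

headroom = rated_ceiling - base_power_draw
print(f"{headroom} W of headroom ({headroom / rated_ceiling:.0%} of the rated ceiling)")
# -> 25 W of headroom (7% of the rated ceiling), so a 450-500 W spike blows well past it.
```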

GPUs like this may also have components that can only handle that much power. If the game decides to request 450W, it will obviously trip something in the GPU, and if the components are not robust enough, they will be damaged.

This is also why there are more affected units in Gigabyte’s and EVGA’s camps. EVGA has said that a soldering defect was found on a batch of its cards that died while playing the game, while in Gigabyte’s case, it is not surprising that its components may not have been of the highest quality, especially after the whole debacle with its exploding power supplies.

However, from some internet scouring, the failures are not limited to these two brands. A Zotac card also caught fire while running the game, as seen in a video by Jayztwocents (around the 5:50 mark).

Jayztwocents also recently released a video highlighting the game’s power draw, and lo and behold, it backs up my observations of New World’s behaviour. The power draw and spikes in this game are relentless.

Many have argued that this should be no cause for concern as long as the GPU is well-made, and I personally agree.

There are a lot more people running the game without a hitch, and this whole GPU saga has arguably been blown out of proportion. Not that a game blowing up your graphics card should ever be acceptable, though.

The way New World requests power may also be the reason why the game freezes while you’re playing it. Usually, when you overclock a card beyond what it can handle, the programme you are running will freeze. This seems to be the same kind of freezing that is happening in-game as well.

So, how do you take precautions to stop any of this from happening?

I have personally decided to cap my framerate at a maximum of 120 frames per second while playing in windowed 1080p. It is not ideal, since I have a 1440p monitor, but I don’t see the need to stress the card out when these spikes happen more often than not.

I also dropped my card’s power limit to 80 per cent while the game is running, and reduced the overall GPU clocks.
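
For what it’s worth, here is a hedged sketch of how the power-limit part could be scripted on an Nvidia card using nvidia-smi (on an AMD card like mine, the equivalent control is a slider in the Radeon tuning software, and clock reductions are likewise done in your tuning tool of choice rather than from a script):

```python
# Hedged illustration: cap the board power limit to ~80% using nvidia-smi.
# nvidia-smi is Nvidia-only and setting the limit needs admin rights; this is
# just to show the idea, not how it is done on an AMD card.
import subprocess

def current_limit_watts() -> float:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.limit", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return float(out.stdout.strip().splitlines()[0])

target = round(current_limit_watts() * 0.8)          # 80 per cent of the current cap
subprocess.run(["nvidia-smi", "-pl", str(target)], check=True)
print(f"Power limit requested: {target} W")
```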

Now, it still spikes to about 350-400W at times, but my power draw in the game averages out at around 300W, so I have enough headroom to deal with any oddities the game could throw at my GPU (hopefully). The game has been running fine with no crashes since the adjustment, and it’s a very welcome change.

Just to be clear, heat has never been an issue for me because I watercool my card, so I cannot say for sure whether New World also drives up GPU temperatures. But friends who play the game tell me their GPUs haven’t heated up much more than they do in other games, so I would guess that is not a cause for concern.

So, in short, to prevent your GPU from doing anything funky:

  1. Reduce your power limit

  2. Reduce your GPU clocks (if you have any kind of overclock on your card, remove it)

Nonetheless, whichever camp you stand in, whether manufacturers need to make more robust GPUs or the game needs to be fixed, this problem shouldn’t have happened.

Even two 8-pin GPUs like the Asus TUF series are running the game without any kind of trouble, so one could say that GPU manufacturers should be a little more stringent about making durable graphics cards.

That being said, I am also not comfortable with the fact that the game is requesting way more power than it should from my GPU.

Please fix this, Amazon Game Studios, and perhaps I would be able to start playing it at 1440p again to appreciate the graphical glory of Aeternum.

Dominic loves tech and games. When he is not busy getting headshotted in VALORANT or watercooling anything he sees, he does some pro wrestling.

For more gaming news updates, visit https://yhoo.it/YahooGamingSEA. Also follow us on Twitter!
