What Are the Best Graphics Options for PS5 and Xbox Series X/S?

If you're wondering what the best graphics options are for PS5 and Xbox Series X/S games, this will explain everything you need to know. Graphics, resolution and framerate are now more customisable than ever. But are more options better for console gaming?

With the PS5 and Xbox Series X/S now out across the world, it has become clear that graphics options to fine-tune your experience are more important than ever. But with settings relating to graphics, resolution and framerate, what is the best way to get the most out of each title? The new hardware and specifications of the PS5 and the new Xbox consoles allow for more graphics options than ever before.

It’s worth noting that all of these settings could be considered ‘graphics’ options. To break things down, I have separated them into three categories: 1. everything related to how good any individual frame looks, generally in terms of photo-realism, which I’ll call simply ‘graphics’; 2. the resolution of the image itself in terms of pixel count; and 3. the framerate at which the game is displayed.

These settings slowly began creeping their way into console games over the last generation, especially with the launch of the updated PS4 Pro and Xbox One X, and now they’re even more common (but obviously still limited compared to PC gaming). This article will highlight what options should be in all games going forward, what each of them means and how this impacts the gaming experience. Also, are these settings really necessary for console gaming and could they have a negative impact on the games and the player?


Ray tracing is one of the shiny new features (pun intended) of the PS5 and Xbox Series X/S. It means the new consoles can accurately simulate real-life lighting, covering everything from reflections to shadows. It helps to present a more complete image, with shadows that fill in the environments and add depth, and reflections that look exactly as you’d expect in real life. The effect adds a lot to the realism of the image even when only ray-traced shadows are included, as seen in Call of Duty: Black Ops Cold War.

Ray-traced shadows alone add greater depth to the image as seen in Black Ops Cold War.

However, all this beautiful lighting comes at a cost, as ray tracing is very demanding on the hardware. Because each frame must be rendered in a very short amount of time, ray tracing can eat up a large share of the processing budget, resulting in a loss of frames-per-second (FPS) and uneven frame-times. That’s why I believe ray tracing should always be optional on PS5 and Xbox Series X: players can prioritise a solid framerate, or enjoy the incredible lighting as long as the framerate doesn’t take too much of a hit. To continue using Black Ops Cold War as an example, these consoles seem to have no difficulty running consistently at 60 FPS with ray-traced shadows, and the 120Hz mode requires ray tracing to be turned off anyway.
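
To put that frame-time pressure in concrete terms, here’s a small illustrative Python snippet (pure arithmetic, not from any engine) showing how little time each frame actually gets at each target framerate:

```python
# Per-frame rendering budget: every effect, ray tracing included,
# must fit inside this window or the framerate drops.
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available to render a single frame."""
    return 1000.0 / target_fps

for fps in (30, 60, 120):
    print(f"{fps:>3} FPS -> {frame_budget_ms(fps):.2f} ms per frame")
```

Moving from 60 FPS to 120 FPS halves the budget to roughly 8 milliseconds, which is exactly why expensive effects like ray tracing tend to be the first thing switched off in 120Hz modes.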

Ultimately, the inclusion of ray tracing undoubtedly makes games look better, but it isn’t something you’d immediately notice unless it was specifically pointed out to you. It isn’t required to enjoy a game, especially if it comes at the cost of performance, but if it can be included without any negative impact then it definitely should be.

Fortnite with RTX: Ray Tracing Cinematic Trailer | GeForce Community Showcase

The level of detail of game environments and textures is often what gets sacrificed for a higher resolution, framerate or other fancy effects. If the developers of a game want to ensure a solid 60 frames-per-second as well as a high resolution, the amount of detail in the game has to be reduced.

This is evident in the launch titles Assassin’s Creed Valhalla and Demon’s Souls. For Valhalla, Ubisoft targeted 60 FPS at launch on PS5 and Series X with a dynamic resolution. However, not every fan needed 60 FPS, especially considering that all previous Assassin’s Creed games ran at 30 FPS on consoles. It’s clear that to achieve 60 FPS, the detail of the world, especially of objects in the distance, was reduced slightly. Ubisoft has therefore released an update that adds an optional 30 FPS quality mode with a native 4K resolution and an increased level of detail, i.e. better graphics.

The story is similar in regards to the Demon’s Souls remake for PS5. The 60 FPS High Frame Rate Mode runs smoothly and still looks great, but the 30 FPS 4K Mode obviously has a higher resolution plus the ground and wall surfaces have a greater sense of depth to them and the textures are more detailed.

What to sacrifice comes down to each gamer’s personal preference. It’s important for the option to exist, as Ubisoft seems to have realised, because each person wants different things from their games. Some people won’t play at anything less than 60 FPS, while for others image quality and how pretty a game looks is the priority. Going forward, developers should just ensure that the options are easy to find and that each mode is clearly explained in-game. The PS5 makes this easy with the ‘Game Presets’ option in the console’s settings: people can choose what they prefer from the start and all games will apply the setting automatically.


The difference between 1080p and 4K is dramatic when side-by-side.

The most common options for resolution are 720p, 1080p, 1440p, 2160p (4K) or a dynamic resolution. However, based on what we’ve seen from the next-gen launch titles, most games end up at native 4K or a dynamic resolution falling somewhere between 1080p and 2160p, with the Series S generally sitting lower. Of course, as mentioned previously, the resolution often depends on the options the player has selected.
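
The gap between these resolutions is bigger than the names suggest. A quick back-of-the-envelope calculation (plain Python, no game data involved) shows the raw pixel counts the consoles have to push:

```python
# Common console output resolutions and their total pixel counts.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name:>5}: {w * h:,} pixels")

# 4K is four times the pixels of 1080p, which is why dynamic
# resolution is such a common compromise on console hardware.
ratio = (3840 * 2160) / (1920 * 1080)
print(f"4K vs 1080p: {ratio:.0f}x the pixels")
```

Quadrupling the pixel count of 1080p is a huge demand on the GPU, so it’s no surprise that many games only hit native 4K by giving up framerate or detail elsewhere.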

Another issue with resolution is that 4K televisions and monitors are far from universal. Whatever your display, you’ll always be able to see ray tracing and better-quality graphics; however, if you’re still using a 1080p display or lower, you can’t fully take advantage of the power of the new consoles. While the prices of 4K TVs have dropped in recent years, many gamers still may not be able to afford one, especially after saving up for the new console itself and new games.

This again highlights the importance of in-game options. You can cap the resolution in the console settings and get a supersampled image in-game, but you won’t benefit as much from the additional horsepower of the PS5 and Xbox Series X/S. If a game offers the option to lower the resolution in exchange for gains elsewhere, it only makes sense to run at something like 1080p if that’s the best your TV can do.

At the end of the day, is the difference between resolutions really that noticeable? Between a dynamic resolution ranging from 1080p to 4K and native 4K, you’d be hard-pressed to notice any significant downgrade with the dynamic option, especially since the resolution usually only drops to its lower limit during chaotic, action-packed moments when you’re too focused on the gameplay to notice anyway. In my opinion, going back to 1080p after playing at 4K is definitely noticeable, as seen with the performance mode in Marvel’s Avengers. Most games released for PS5 and Xbox Series X will run at higher than 1080p, but unfortunately the Series S has been seen running around that resolution in some games.
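
Dynamic resolution works roughly like this: the engine watches how long recent frames took to render and nudges the render resolution up or down to stay inside the frame budget. The sketch below illustrates the idea in simplified form; the function name, scaling factors and thresholds are my own inventions, not taken from any real engine:

```python
# Simplified dynamic resolution controller: scale the render height
# between a floor (1080p) and a ceiling (2160p) based on how the
# last frame compared to the ~16.7 ms budget for 60 FPS.
BUDGET_MS = 1000.0 / 60
MIN_HEIGHT, MAX_HEIGHT = 1080, 2160

def next_render_height(current: int, last_frame_ms: float) -> int:
    if last_frame_ms > BUDGET_MS:           # over budget: drop resolution
        current = int(current * 0.9)
    elif last_frame_ms < BUDGET_MS * 0.85:  # comfortable headroom: raise it
        current = int(current * 1.05)
    return max(MIN_HEIGHT, min(MAX_HEIGHT, current))

# A busy combat frame forces a drop; a quiet frame lets it climb back.
print(next_render_height(2160, 20.0))  # over budget, scales down
print(next_render_height(1600, 10.0))  # under budget, scales up
```

Because the drops happen precisely when the scene is busiest, they line up with the moments the player is least likely to be scrutinising image quality, which is why the technique is so effective.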



The typical options for framerate are 30 FPS, 60 FPS and now 120 FPS. In the past generation, we generally saw single-player games running at 30 FPS while multiplayer games (or specific multiplayer modes) ran at 60 FPS. Could 60 FPS now become the new standard, with 120 FPS saved for multiplayer? I don’t think that will entirely be the case, but we will see fewer and fewer 30 FPS modes, and where they do appear it will be alongside native 4K and a high level of detail. As developers learn the best ways to use the new consoles, they will take advantage of them more efficiently, making 60 FPS much more common.

120 FPS will most likely be reserved for the most optimised multiplayer games. We’ve seen with Rocket League, among others, that it’s seemingly easier for developers to enable 120 FPS on the Xbox consoles than on PS5. Not that it’s impossible (120 FPS is an option on both consoles for Black Ops Cold War), but games need a specific next-gen patch to take advantage of the PS5.

Is 120 FPS the future? Or just limited to a small number of games and players?

It’s also worth bearing in mind that few people are likely to have access to a 120Hz display, which are probably even less common than 4K displays. The vast majority of console gamers play on televisions rather than gaming monitors, and very few TVs support refresh rates above 60Hz, meaning they can’t display games at 120 FPS. In a few years such displays will be more widely available and prices will come down, but for now the 120 FPS option is only useful to a small group of console gamers.

For a lot of games nowadays it can be a difficult choice between resolution and framerate. For example, after playing Assassin’s Creed Valhalla at 60 FPS for almost 10 hours, I found it very jarring when I tried the 30 FPS quality mode to see how much better it looked. If the quality mode had been available when I started playing I probably would have stuck with it, but after experiencing the smoothness of 60 FPS I simply couldn’t go back.

This raises the issue of whether or not the option should even exist. The developers likely have a preferred way for the player to experience their game and the focus on just one option could improve the stability and polish of the game. The struggle of the players to decide what aspect they want to sacrifice for another could leave them feeling like they’re missing out in some way. On the other hand, at least the option gives the player the freedom to decide or to switch back and forth rather than being stuck with something that they don’t enjoy.

Xbox Series X - Optimized for Xbox Series X Trailer

The Future

As we move into the future, TVs are going to get better, cheaper and more common. The front of the PS5 box has the bold and striking, if potentially overly-ambitious, ‘8K’ logo. The cheapest 8K TVs currently sit around £1500 ($2000) and there’s very little 8K content available. Even if the PS5 and Xbox Series X are technically capable of displaying games at 8K resolution it will still be many years until 8K TVs are economically viable. Even by that point, what sort of sacrifices will we have to make in terms of graphics and framerate for it to be possible for the games to run?

In a year or two we’re likely to see most games releasing exclusively on the PS5 and Xbox Series X/S. These games should be able to take full advantage of the hardware without the limitation of also having to run on last-gen consoles. Developers will also learn better and more efficient ways to use the PS5 and Xbox Series X/S, meaning that overall graphics will improve. The graphics options will therefore change over time, offering more benefits with fewer sacrifices. Perhaps a few games will even run at native 4K and 60 FPS with fancy features such as ray tracing enabled, so you won’t have to give any of them up.

However, I believe we will continue to see an increase in graphics options for console games going forward. This is clearly a welcome addition, as there isn’t always a single best option that developers can simply decide on. It’s important that these options remain accessible to all gamers, aren’t too complicated and don’t disrupt the gameplay experience in any way. Unlike PC gaming, there aren’t hundreds of possible combinations of graphics cards, processors and memory; there are just three configurations (PS5, Xbox Series X and Series S), all of which are functionally very similar. So the vast array of PC-style settings isn’t necessary on console.

PlayStation 5 Showcase - Opening Sizzle

Regardless, I believe that the ‘winner’ of this console generation will partly be determined by whichever company takes full advantage of these graphical options. Many gamers have been hoping for the ability to fine-tune their experience, like on PC, for many years now. This is because if you can customise the graphics, resolution and framerate to your liking you’re much more likely to get more enjoyment from your games and have a more positive response overall.

So ultimately, what are the best graphics options for PS5 and Xbox Series X/S games? Personally, for single-player games, I have no issue playing at 30 FPS (as long as I don’t go straight from playing a game at 60) if it means a native 4K resolution and more detailed environments. For multiplayer modes, however, 60 FPS is absolutely the minimum: you need split-second response times and smooth gameplay to stay competitive, whereas graphics quality, resolution and the addition of ray tracing matter less unless they somehow provide a competitive advantage. You can still get great-looking multiplayer games, but framerate is always the priority.