
The Last of Us Part I PC Features Trailer: Ultra-Wide Support, Left Behind, and More

Mr Moose

Member
Give us DLSS please
Part I will feature AMD FSR 2.2 support*, Nvidia DLSS Super Resolution support*, VSync and frame rate cap options, and a host of features designed specifically for PC, including adjustable Texture Quality, Shadows, Reflections, Ambient Occlusion, and more.
* Compatible PC and graphics card required for enhanced graphics.
 

TrebleShot

Member
Once again hahaha
Console hardware is not like for like with PC hardware.

The PS5 runs this at 1440p/60fps and 4K/40fps, yet according to these reference specs you need a 4080 just to squeeze out another 20fps, with no RT.
 

ChiefDada

Member
Pretty beefy specs for the PS5 remake that doesn't look better than Pt.2 :messenger_smirking:


A 6600 XT is 10.6 TFLOPs and it's only doing 1080p/60fps. Looks like another Uncharted situation.

Yes, but what's the 6600 XT's memory bandwidth? Why are you constantly obsessed with teraflops?

If PS5 can do 4K 30, a 3080 10GB with a reasonable CPU can pull out 60 easily at the same quality. That’s the rule of thumb lately.

Not sure about Ultra, but I doubt the PS5 is doing Ultra, so…

PS5 texture settings are always on the highest preset and TLOU Pt 1 textures are immaculate, particularly in cutscenes. I'd be shocked if cards 10GB and below don't struggle immensely.
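For what it's worth, the bandwidth question above has a concrete answer from the public spec sheets. A rough back-of-the-envelope comparison (the formulas are the standard ones; treat the comparison as a sketch, not a benchmark):

Code:
# Back-of-the-envelope GPU spec comparison (publicly listed figures).
def bandwidth_gbs(bus_bits, gbps_per_pin):
    # Memory bandwidth = bus width in bytes * per-pin data rate
    return bus_bits / 8 * gbps_per_pin

def tflops_fp32(shaders, clock_ghz):
    # FP32 TFLOPs = shaders * clock * 2 ops per cycle (FMA) / 1000
    return shaders * clock_ghz * 2 / 1000

print(f"RX 6600 XT: {tflops_fp32(2048, 2.589):.1f} TFLOPs, "
      f"{bandwidth_gbs(128, 16):.0f} GB/s")  # ~10.6 TFLOPs, 256 GB/s
print(f"PS5 GPU:    {tflops_fp32(2304, 2.23):.1f} TFLOPs, "
      f"{bandwidth_gbs(256, 14):.0f} GB/s")  # ~10.3 TFLOPs, 448 GB/s

Similar TFLOPs on paper, but the PS5 has roughly 75% more raw memory bandwidth (shared with the CPU), which is exactly the point being made here.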
 

RobRSG

Member
Pretty beefy specs for the PS5 remake that doesn't look better than Pt.2 :messenger_smirking:




Yes, but what's the 6600 XT's memory bandwidth? Why are you constantly obsessed with teraflops?



PS5 texture settings are always on the highest preset and TLOU Pt 1 textures are immaculate, particularly in cutscenes. I'd be shocked if cards 10GB and below don't struggle immensely.
Textures are not the only thing consuming VRAM, you know…
 

ChiefDada

Member
Textures are not the only thing consuming VRAM, you know…

What I do know is that the high-quality textures play a large part in making this game look amazing, along with relatively high asset quality in general and the indirect lighting. 10GB-and-below cards will have a hard time reaching console parity for this title, I suspect. The spec listing, specifically around memory and storage, tells you all you need to know.
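To make the "textures aren't the only consumer" point concrete, here is a rough 4K render-target budget. The target list and formats below are illustrative guesses, not TLOU's actual render setup:

Code:
# Rough VRAM sketch at 3840x2160: render targets alone take a
# meaningful slice before a single texture is streamed in.
# Formats/targets are illustrative, NOT the game's real setup.
W, H = 3840, 2160

targets_bpp = {
    "HDR color (RGBA16F)":    8,      # bytes per pixel
    "G-buffer x3 (RGBA8)":    4 * 3,
    "depth/stencil (D32S8)":  5,
    "motion vectors (RG16F)": 4,
}

total_mb = sum(W * H * bpp for bpp in targets_bpp.values()) / 2**20
print(f"render targets: ~{total_mb:.0f} MB")  # ~229 MB

Add shadow maps, TAA history buffers, geometry, and the texture streaming pool on top of that, and a 10GB card has less headroom at 4K than the raw number suggests.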
 

GreatnessRD

Member
1440p with a 5800X3D and 6800 XT is about to be butter, baby. I'll grab it when it's $49.99, or free if I end up buying another AMD GPU for my living room PC. We'll see. Kinda excited to play it, to be honest.
 

Buggy Loop

Member
Once again hahaha
Console hardware is not like for like with PC hardware.

The PS5 runs this at 1440p/60fps and 4K/40fps, yet according to these reference specs you need a 4080 just to squeeze out another 20fps, with no RT.

You're basing that on the recommended specs?

Are you new to PC recommended specs?

With console equivalent settings, it’s a pushover.
 

Jrecard

Neo Member
Looks like I'm about bang on for 1440p/60 with a 6700 XT. Only played the PS3 version, so I'll probably pick this up in the near future once I'm done with Hogwarts.
 

RobRSG

Member
What I do know is that the high-quality textures play a large part in making this game look amazing, along with relatively high asset quality in general and the indirect lighting. 10GB-and-below cards will have a hard time reaching console parity for this title, I suspect. The spec listing, specifically around memory and storage, tells you all you need to know.
Until the game releases, your guess is as good as mine.
 

Utherellus

Member
Interesting to observe Sony's strategy: using this remake as the starting point of their PC-porting campaign for the TLOU franchise, while familiarizing Naughty Dog's engineering team with PC hardware.

I wonder when Part 2 hits PC.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Once again hahaha
Console hardware is not like for like with PC hardware.

The PS5 runs this at 1440p/60fps and 4K/40fps, yet according to these reference specs you need a 4080 just to squeeze out another 20fps, with no RT.
You are assuming the PS5 is using Ultra settings?

The 6750 XT is likely gonna be hitting the 80-90 fps range at 1440p console settings.
Imma bet a 4080 at console settings at 4K runs close to 100fps.
I wouldn't be shocked if an Intel A770 matches the PS5 at 1440p.
 
Those Ultra settings better be super fucking juicy to need a damn 4080...
I have one and plan on buying since there are keys for it and Resident Evil 4 for 40 bucks, but WTF.

I'll just end up using DLSS since I want more frames at 4K, but that shit's ridiculous. I'll be posting here on day 1.
 

Buggy Loop

Member
Can't wait to see how much it would've cost me to play the equivalent PS5 version on PC.

Prediction: ~$2000

$825, without even looking for deals

Digital Foundry's console-equivalent build, with a 6700 XT replacing the A770. Hell, the 6700 XT is already more than equivalent.
 

GymWolf

Member
I have one and plan on buying since there are keys for it and Resident Evil 4 for 40 bucks, but WTF.

I'll just end up using DLSS since I want more frames at 4K, but that shit's ridiculous. I'll be posting here on day 1.
I already played the thing on PS5, so I really don't care about it, but those requirements are a bit strange.
 

Sanepar

Member
Once again hahaha
Console hardware is not like for like with PC hardware.

The PS5 runs this at 1440p/60fps and 4K/40fps, yet according to these reference specs you need a 4080 just to squeeze out another 20fps, with no RT.
The PS5 doesn't run Ultra settings, probably medium-to-high settings from the PC version. A 6750 runs 1440p/60fps, so there's nothing special about the PS5 version.
 

ChiefDada

Member
The PS5 doesn't run Ultra settings, probably medium-to-high settings from the PC version. A 6750 runs 1440p/60fps, so there's nothing special about the PS5 version.

The 6600 XT is more than enough to match PS5 performance.


[GIF: Will Smith smh]
 

yamaci17

Member
The 6600 XT is more than enough to match PS5 performance.
It's not; it often runs into extreme performance bottlenecks at 1440p and above.

It literally says 1080p on the box: it's a GPU targeted at 1080p. It can match the PS5 like for like at 1080p, but once you go beyond that, it starts having massive slowdowns.

Here's how it goes:

at 4K, the 5700 XT is 12% faster
at 1440p, they're matched
at 1080p, the 6600 XT is 8% faster

That's a whopping ~20% relative performance drop going from 1080p to 4K.

Cyberpunk is even more brutal:

at 4K, the 5700 XT is 23% faster
at 1440p, they're matched
at 1080p, the 6600 XT is 10% faster

A whopping 23% relative drop going from 1440p to 4K, and another 10% from 1080p to 1440p.

The 6600 XT is a gimped product that only works to its potential at 1080p.

The problems are even more pronounced with ray tracing:

At native 1080p with RT set to high (the PS5's ray tracing settings), it gets around 50-70 FPS.

The PS5 gets these framerates at native 4K in its fidelity mode.

As I said, the 6600 XT has pathetic bandwidth at 256 GB/s, and Infinity Cache only works well at lower resolutions where cache hits are repetitive. At 4K, at 1440p, or with ray tracing, it falls quite a bit below the PS5.

There really is no GPU that's a proper match for the PS5. The 6700 XT overshoots; the 6600 XT is a situational, gimped card. The best approach is to compare the 6700 XT to the PS5 and see how much of a lead it has; otherwise the comparisons will be moot.

As I said, just because the 6600 XT matches the PS5 in TFLOPs does not mean it will match it in practice. There are other factors, and bandwidth is the most crucial one.
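To make the relative-scaling arithmetic above explicit, here is the calculation spelled out (the percentages are the ones quoted from the charts; how big the "drop" looks depends on which card you treat as the baseline):

Code:
# Relative position of the 6600 XT vs the 5700 XT, per resolution,
# using the percentages quoted above.
relative = {
    "1080p": 1.08,      # 6600 XT ~8% faster
    "1440p": 1.00,      # matched
    "4K":    1 / 1.12,  # 5700 XT ~12% faster
}

swing = relative["1080p"] / relative["4K"]   # 1.08 * 1.12 ~ 1.21
drop = 1 - relative["4K"] / relative["1080p"]
print(f"relative swing 1080p->4K: {swing:.2f}x, drop: {drop:.0%}")
# ~1.21x swing; ~17% drop against its own 1080p standing,
# ~20% if you compound the two percentages directly.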
 
People are still obsessed with chasing Ultra settings? High is plenty good enough; give me those juicy frames. Gotta upgrade my RAM though, 32GB seems fucking massive…
 

Gaiff

Member
You are assuming the PS5 is using Ultra settings?

The 6750 XT is likely gonna be hitting the 80-90 fps range at 1440p console settings.
Imma bet a 4080 at console settings at 4K runs close to 100fps.
I wouldn't be shocked if an Intel A770 matches the PS5 at 1440p.
Depends what kind of magic Naughty Dog does. Uncharted 4, as mentioned before, requires a much stronger GPU to run at settings equivalent to the PS5's. Maybe this will run better? Who knows.
 

Mr Moose

Member
No, it is not. Plenty of screens in my comparison thread to show you just that - https://www.neogaf.com/threads/tlou...olutely-not-fair-to-ps3-original-but.1646395/
It is in fidelity mode.
PS5 in performance mode renders at a native resolution of 2560x1440.
PS5 in both fidelity modes renders at a native resolution of 3840x2160.

The comparisons in the article and the video use fidelity mode unless noted otherwise, as both Part 1 and Remastered on PS4 Pro support native 4K output and it made sense to compare performance at the highest possible pixel count.

Fidelity mode is native 4K at 30fps, while the 60fps performance mode uses a dynamic resolution - which always resolved to 1440p in our tests.

While a full-fat 4K image is appreciated and does look very clean, the loss of fluidity relative to the performance mode is a bit painful. But what if there was an option that split the difference between these two modes? On paper, that's exactly what the 40fps fidelity mode is supposed to offer, with the same 4K resolution and a boosted frame-rate - but while 40fps modes have worked well in other PS5 titles, here there's not enough headroom over the 30fps line to take full advantage, and instead you just get a more inconsistent presentation often in the region of 35fps.
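For reference, the reason 40fps modes target 120Hz displays is simple frame-time arithmetic: 30, 40, and 60fps all divide evenly into 120Hz refresh slots, so each paces cleanly. A quick sanity check (my own illustration, not from the article):

Code:
# Why 40fps modes need a 120Hz display: the frame time has to be
# an integer multiple of the refresh interval to pace evenly.
refresh_hz = 120
slot_ms = 1000 / refresh_hz              # ~8.33 ms per refresh

for fps in (30, 40, 60):
    frame_ms = 1000 / fps
    slots = frame_ms / slot_ms           # refresh slots per frame
    print(f"{fps} fps = {frame_ms:5.2f} ms/frame = {slots:g} slots")
# 30 fps -> 4 slots, 40 fps -> 3, 60 fps -> 2. And 40 fps cuts
# frame time by 25% vs 30 fps (33.3 ms -> 25 ms), which is halfway
# to 60 fps in frame-time terms.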
 
It is in fidelity mode.


Pure BS. There's a ton of DRS or CBR, clearly visible on my screens, plus the overall image is way softer than native 4K. There are plenty of games to test and see for yourself, or just wait for the PC release and I'll show you.
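For readers following the DRS/CBR back-and-forth: dynamic resolution scaling drops the render resolution whenever the GPU misses its frame-time budget. A minimal generic controller sketch, purely to illustrate the technique being alleged (not Naughty Dog's engine):

Code:
# Generic dynamic-resolution controller sketch. Purely
# illustrative of the technique under debate; NOT from
# Naughty Dog's engine.
TARGET_MS = 16.6                     # 60 fps frame-time budget
MIN_SCALE, MAX_SCALE = 0.7, 1.0      # e.g. 0.7 * 2160p ~ 1512p

def update_render_scale(scale, gpu_frame_ms):
    # Nudge the resolution scale toward keeping the GPU on budget.
    error = gpu_frame_ms / TARGET_MS     # >1.0 means over budget
    scale /= error ** 0.5                # damped correction
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for ms in (15.0, 18.0, 21.0, 17.0, 15.5):  # hypothetical GPU times
    scale = update_render_scale(scale, ms)
    print(f"{ms:4.1f} ms -> render scale {scale:.2f}")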
 

Mr Moose

Member
Pure BS. There's a ton of DRS or CBR, clearly visible on my screens, plus the overall image is way softer than native 4K. There are plenty of games to test and see for yourself, or just wait for the PC release and I'll show you.
I'm not looking through 409 images. VGTech is very good with pixel counts.
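For anyone unfamiliar with how pixel counting works: analysts find an aliased edge, measure how many output pixels each stair-step spans, and divide the output width by that step length. A toy sketch of the idea (my own illustration with made-up measurements, not VGTech's actual tooling):

Code:
# Toy pixel-counting sketch: a game rendered at native_w and
# scaled to output_w shows edge stair-steps that each span
# roughly output_w / native_w pixels.
output_w = 3840

steps = [2, 2, 3, 2, 3, 2, 3]        # hypothetical step lengths (px)
avg_step = sum(steps) / len(steps)   # ~2.43 px per step

native_w = round(output_w / avg_step)
print(f"estimated native width: ~{native_w}")  # ~1581
# A true native-4K image would show ~1 px steps on such edges.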
 
You're smart enough to know chasing Ultra is a fool's errand... but you're gonna fall for a 32GB rec spec for RAM?
Yeah, I do a few other bits and pieces where I've seen usage go up, and I typically have other apps open in the background when playing games. I can imagine more games are going to start asking for 32GB of RAM in the not-too-distant future too. No harm in upgrading while it's cheapish!

Also, we’ve not played the game yet, that requirement could actually be accurate. We’ll need to wait and see.
 
I'm not looking through 409 images.
Well, that's not my problem. The PS5 can't even run the Uncharted games at native 4K without visual downgrades (and TLOU Part I is way better visually): the PS4 versions look better and don't have missing effects, lighting, etc., not to mention the abysmal local shadows, which are not fixed in the PS5 versions. I'm not going PC Master Race here; I'm just telling you the PS5 is not running the game at native 4K. They're using plenty of DRS or CBR, or maybe a combination of the two with something else.
 

Mr Moose

Member
Well, that's not my problem. The PS5 can't even run the Uncharted games at native 4K without visual downgrades (and TLOU Part I is way better visually): the PS4 versions look better and don't have missing effects, lighting, etc., not to mention the abysmal local shadows, which are not fixed in the PS5 versions.
What pixel counts do you arrive at then?
 

ChiefDada

Member
Pure BS. There's a ton of DRS or CBR, clearly visible on my screens, plus the overall image is way softer than native 4K. There are plenty of games to test and see for yourself, or just wait for the PC release and I'll show you.

Naughty Dog's engine doesn't support DRS or reconstruction. You are probably seeing film grain, which TLOU uses extensively. Get your facts straight before you call BS on someone else.
 
Naughty Dog's engine doesn't support DRS or reconstruction. You are probably seeing film grain, which TLOU uses extensively. Get your facts straight before you call BS on someone else.
I've said they're maybe using something else; what exactly, I don't and can't know. I'm also not DF and won't lick Sony's ass. But it's not native 4K, nowhere near close. And it's not film grain, c'mon.
What pixel counts do you arrive at then?
It's very hard to tell under close examination, because the game is horribly pixelated at times and doesn't tell you anything; on PC, in some games you can tell with DLSS. I think they may be using some form of internally developed image reconstruction, like in Quantum Break, because I know one when I see it. You'd need to be blind af not to see it on my screens; it ruined plenty of cool screens I made.

That's not native 4K:
[screenshot: LGZiz7h.png]
And it's just one screen out of almost 700.
 

ChiefDada

Member
I've said they're maybe using something else; what exactly, I don't and can't know. I'm also not DF and won't lick Sony's ass. But it's not native 4K, nowhere near close.

Developer, VGTech, NX Gamer, Digital Foundry: "It's native 4K in fidelity and native 1440p in performance, with no DRS"

Guy on NeoGAF: "It's nowhere near native 4K"

[GIF: boombox shut up]
 

rodrigolfp

Gamepads 4 Life
I've said they're maybe using something else; what exactly, I don't and can't know. I'm also not DF and won't lick Sony's ass. But it's not native 4K, nowhere near close. And it's not film grain, c'mon.

It's very hard to tell under close examination, because the game is horribly pixelated at times and doesn't tell you anything; on PC, in some games you can tell with DLSS. I think they may be using some form of internally developed image reconstruction, like in Quantum Break, because I know one when I see it. You'd need to be blind af not to see it on my screens; it ruined plenty of cool screens I made.

That's not native 4K:
[screenshot: LGZiz7h.png]
And it's just one screen out of almost 700.
Maybe it's some post-process anti-aliasing they use that makes things blurry.
 