> It's a ps4 game, with a PS4 based graphics engine.
> Duh!
> Jesus man.....
> That's blizzard's art team for you.....

Are you also saying Unreal Engine 5 is not a next-gen engine because it can scale down to mobile devices?
> But isn't it all your own opinion? No game that has come out has impressed me like a next-gen title should. Just take the title of this thread, for example: it's called "I expect", and that's far from objective.

I think his point is that, with it being top-down, with no fourth wall or geometry behind the camera, the technicality of it isn't impressive enough to warrant praise like "next-gen looking".
> What I meant is that the PS4 can do almost all the graphical features the PS5 does in GoW: Ragnarok. I never said the PS4 can do anything, ok? I was referring to GoW.
> But like I said, there aren't many really next-gen titles yet.
> But ok, let's say Diablo 4 doesn't qualify as a next-gen title; it's nonetheless a pretty-looking game.

It's really pretty, and the art style for once is good.
> Are you also saying Unreal Engine 5 is not a next-gen engine because it can scale down to mobile devices?

Comparing the biggest commercial multiplatform engine, with unprecedented features like Nanite, to a no-name in-house engine originally targeting exactly one platform.
Tiny difference.....
> Comparing the biggest commercial multiplatform engine, with unprecedented features like Nanite, to a no-name in-house engine originally targeting exactly one platform.
> Tiny difference.....

Right. One of the largest development studios in the world is incapable of making an engine that scales.
> Right. One of the largest development studios in the world is incapable of making an engine that scales.

"Scale"... ok, lol.
> Yeah, I fucking hate having dirty IQ, and The Order was the dirtiest game I've played recently. It's not just a matter of low res (and the game is not 1080p, btw, more like 800p if I remember correctly); they use pretty much all the tricks in the book to ruin IQ:
> Heavy blur
> Low-res soft image
> Black bars
> Film grain
> Heavy post-processing
> Thank god for the absence of chromatic aberration (I think).
> The game could look MUCH better.
> I'm waiting for the remaster before replaying Bloodborne; at least that one still has a chance of getting one. And I couldn't care less about driving games.

The game runs at 1920x800, which is ~7% more pixels than Ryse's 900p (1600x900), without any upscaling on a 1080p screen.
If I remember correctly, it also had a slight, high-quality chromatic aberration implementation.
> The game runs at 1920x800, which is ~7% more pixels than Ryse's 900p (1600x900), without any upscaling on a 1080p screen.
> If I remember correctly, it also had a slight, high-quality chromatic aberration implementation.

There is no such thing as high-quality chromatic aberration; it always looks like shit in videogames.
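For what it's worth, the pixel counts in that comparison are easy to check (resolutions exactly as stated in the posts):

```python
# Pixel counts for the two render resolutions quoted above.
order_1886 = 1920 * 800   # The Order: 1886, letterboxed 1920x800
ryse = 1600 * 900         # Ryse: Son of Rome, 900p

print(order_1886)         # 1536000
print(ryse)               # 1440000
print(order_1886 / ryse)  # ~1.067, i.e. about 7% more pixels
```

1,536,000 vs 1,440,000 works out to roughly 7% more pixels, so the two games render very similar pixel counts.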
> The Order: 1886 has terrible image quality, due to the use of film grain, chromatic aberration and motion blur.
> The game might have some nice texture and modeling work, but with all the crap they smear on screen, it ends up looking quite bad.

I thought I was going crazy... It seems like not many people care about clean IQ.
> Games on console rarely have options to disable crap like chromatic aberration, motion blur, depth of field and film grain.
> So most gamers on console don't even know what a clean image looks like. They have always lived in the muck and are used to these effects.

Some games let you disable film grain and other stuff, and I play all the Sony exclusives on console; no game comes close to the visual noise of The Order. Not sure about third parties, since I play all of them on PC.
> Comparing the biggest commercial multiplatform engine, with unprecedented features like Nanite, to a no-name in-house engine originally targeting exactly one platform.
> Tiny difference.....

I must say that you are right, but he is right too. I'm an Unreal Engine user. I work with it every day. It's my job.
> Games on console rarely have options to disable crap like chromatic aberration, motion blur, depth of field and film grain.
> So most gamers on console don't even know what a clean image looks like. They have always lived in the muck and are used to these effects.

Real cameras in the real world record video with at least a bit of motion blur. It's just how physics, light and cameras work. A real video with zero motion blur would look very unnatural, like a weird stuttering effect. Not fluid at all.
So motion blur is NEEDED for a natural look. At least a little bit.
> I must say that you are right, but he is right too. I'm an Unreal Engine user. I work with it every day. It's my job.
> And Unreal Engine 5 is an overpowered version of Unreal Engine 4. 99% of the engine is still exactly the same, with a polished and improved UI. And yes: its new features, especially Lumen and Nanite, are incredibly awesome, and what you can do is just amazing, even unbelievable. It's a next-gen engine for sure. People don't truly know how powerful and incredible this engine is.
> But that means that in-house engines can also be improved with new features that make them next-gen too, even if their base code stays the same.
> So both of you could be correct somehow.

Software is just software; of course it can be improved/built upon.
> The only reason movies need some motion blur is because they still use very low frame rates.
> If movies were filmed at 120 fps, like we can play games at, then motion blur would be unnecessary.

Man. If you know ANYTHING about how cameras work, you will know that it's absolutely IMPOSSIBLE to get no motion blur at all if there are moving objects or the camera is moving.
It's just physics. It's not about being necessary or not.
And just so you know, when we render CGI videos from Unreal or from any renderer, we always, ALWAYS use motion blur, not because we like the "effect", but because not having motion blur makes the video look totally unnatural. Why is that? Because of how cameras in the real world work.
PERIOD.
Even videos recorded/rendered at 120 fps have motion blur. Of course it's a lot less intense, again because of how cameras work and how much light the sensor captures each frame. But eliminating 100% of the motion blur would require an infinitely high shutter speed, which is physically impossible; and even if it were possible, the result would be a black image.
Motion blur is needed for a natural look. And yes, at higher frame rates motion blur is and should be less intense, in the real world, in CGI and in videogames. Of course, you can turn it off in videogames if it annoys you; it's not the real world, so you can do that. But do not underestimate its value for visual fidelity, or the reasons why it can be important for achieving a certain look in a product/videogame.
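The light-capture point above can be put in numbers: each frame gathers light in proportion to its exposure time, so driving the exposure toward zero (to kill all blur) also drives the image toward black. A minimal sketch; the shutter-speed values are my own examples, not from the post:

```python
# Light gathered per frame is proportional to exposure time, so shorter
# shutters mean darker frames (all else equal). Reference here is a
# 24 fps film frame with a 180-degree shutter (1/48 s exposure).
def relative_light(exposure_s: float, reference_exposure_s: float) -> float:
    """Fraction of the reference frame's light captured at this exposure."""
    return exposure_s / reference_exposure_s

ref = 1 / 48  # 24 fps, 180-degree shutter
for shutter in (1 / 48, 1 / 1000, 1 / 8000):
    frac = relative_light(shutter, ref)
    print(f"1/{round(1 / shutter)} s exposure -> {frac:.4f} of reference light")
```

A 1/8000 s shutter captures well under 1% of the reference light, which is the practical version of "infinite shutter speed gives you a black image".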
> Motion blur isn't natural.

Wave your hands in front of you and say that again.

> Motion blur isn't natural.

Neither is flashing increments 30 to 60 times a second to simulate motion. Perfectly sharp images flashed in sequence do not look natural to the eye.

> In film, framerates have nothing to do with motion blur; it's the shutter speed that's responsible for that.

These aren't unrelated, though. High-framerate cameras have to use a much shorter exposure. I get that they're not totally 1:1, but it's misleading to say they have nothing to do with each other.
Motion blur needs to be calibrated to the frame rate, and sometimes it is not. A game running at 60 fps should have half the blur of a game at 30 fps, and a game at 120 fps should have half the blur of that, because it should be simulating a shorter "exposure" time. When done right, motion blur looks very natural, more like how our eyes see; but usually we see it in 30 fps games that are trying to look like movies instead.
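That calibration rule can be sketched directly: a game that derives blur length from frame time and a fixed simulated "shutter" fraction halves the blur every time the framerate doubles. The 600 px/s speed and the 50% shutter fraction below are illustrative assumptions, not anyone's actual implementation:

```python
# Frame-rate-aware motion blur length: blur scales with frame time,
# so doubling fps automatically halves the streak.
def blur_length_px(velocity_px_per_s: float, fps: float,
                   shutter_fraction: float = 0.5) -> float:
    """Length in pixels of the blur streak for an object at this speed."""
    frame_time = 1.0 / fps
    return velocity_px_per_s * frame_time * shutter_fraction

for fps in (30, 60, 120):
    print(f"{fps:>3} fps -> {blur_length_px(600, fps):.1f} px of blur")
# 30 fps -> 10.0 px, 60 fps -> 5.0 px, 120 fps -> 2.5 px
```

Each doubling of the framerate halves the simulated exposure and therefore the blur, which is exactly the "half the blur at 60 fps" relationship described above.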
> Man. If you know ANYTHING about how cameras work, you will know that it's absolutely IMPOSSIBLE to get no motion blur at all if there are moving objects or the camera is moving.
> It's just physics. It's not about being necessary or not.
> And just so you know, when we render CGI videos from Unreal or from any renderer, we always, ALWAYS use motion blur, not because we like the "effect", but because not having motion blur makes the video look totally unnatural. Why is that? Because of how cameras in the real world work.
> PERIOD.

Well, it looks like ass in videogames. I don't play in 4K just to have everything turning into a blurry mess the moment I move the camera.

MENSTRUAL CRAMPS.

> Well, it looks like ass in videogames. I don't play in 4K just to have everything turning into a blurry mess the moment I move the camera.

You are mixing up a blurry mess and natural motion blur.
To be able to disable in-game motion blur and still get natural motion blur, you need something like 300 fps. For me, the 240Hz result was VERY similar to 60Hz with good motion blur, in Doom 2016 at the very least.
Essentially, good motion blur should "blur" the frame to show its movement over the time it exists. So if you play at 60 fps, each frame should show/contain 16 ms of movement, NOT a still, frozen instant.
The collection of 16 ms movements creates realistic movement. A sequence of 60 stills is not the same.
Look at photography. This is what shutter speed is. If you film a waterfall or rotating objects, you would need something like a 1/2000 shutter speed to avoid motion blur, and it looks very jerky and unnatural.
So you set the shutter speed to double your framerate: 1/60 for 30 fps filming is the usual.
Great comparison here. Notice how unnatural a fast shutter speed looks. If you look at these objects in reality, you don't see them as sharp and jerky.
Wave a hand in front of your face, or look at the famous pencil-bending illusion: that is the "natural motion blur" effect. Eyes don't use frames; the photons are continuous. But there is some delay in the processing by your brain, and that's why it happens.
And no, the monitor will not create motion blur for you. You are looking at a static object (the monitor), and it is not a movie; only the contents it displays are changing. So if you animate something at 60 fps on that monitor without motion blur, there is not enough frame data for your brain to make it look as it should. At 300Hz? Yep, that's good.
Where I agree with you is on terrible in-game motion blur implementations, but recently most of them are good, and we are long past RE4's or GTA3's terrible blur.
Stuff like Uncharted 4's per-object motion blur is fantastic. And believe it or not, camera motion blur GREATLY helps on OLED at 30/40 fps. OLEDs are too fast, and 30 fps really looks like ass without good motion blur. Even 60 fps looks kind of dry.
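The "each 60 fps frame should contain ~16 ms of movement" idea is essentially accumulation blur: render several sub-frame instants inside one frame interval and average them, instead of showing one frozen instant. A toy 1-D sketch; the dot "scene", strip width and sample count are all my own illustration, not any engine's actual technique:

```python
# Toy accumulation motion blur on a 1-D strip of pixels.
def render_instant(position: float, width: int = 12) -> list:
    """A single bright dot at one instant in time."""
    frame = [0.0] * width
    frame[int(position) % width] = 1.0
    return frame

def render_with_blur(start: float, velocity_px_per_frame: float,
                     samples: int = 4, width: int = 12) -> list:
    """Average several sub-frame instants so the frame contains the
    whole interval's motion, not a frozen snapshot."""
    acc = [0.0] * width
    for i in range(samples):
        t = i / samples                      # sub-frame time in [0, 1)
        pos = start + velocity_px_per_frame * t
        for x, v in enumerate(render_instant(pos, width)):
            acc[x] += v / samples            # average the sub-frames
    return acc

print(render_instant(0.0))         # one sharp dot
print(render_with_blur(0.0, 4.0))  # the dot smeared over 4 pixels, 0.25 each
```

The blurred frame spreads the same total energy across the pixels the dot crossed during the frame interval, which is what a camera's open shutter does continuously.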
> I just turn that shit down because I hate blurry stuff on my 4K TV; I don't care how natural it looks.

It's not blurry. Otherwise the motion is often jittery, unless you play at a high framerate (not 60).
> Horizon was the first time I disabled motion blur. I honestly don't notice it. People like Alex see issues with native 4K images and I'm like, wtf, I can't even see shimmering or jaggies in 1440p, let alone 4K. I still have no idea what TAA does to a native 4K image. It looks amazing to me in every single game.
> HFW had that insane brightness flickering in its native 4K 30 fps mode that just gave me literal headaches. But for whatever reason, turning down camera acceleration and turning off motion blur fixed it, and I never turned it back on. Fucking John from Digital Foundry told GG that it was a sharpness issue, so GG turned off the sharpness, completely ruining the pristine IQ of the native 4K 30 fps version. God, I hate DF sometimes, and how devs don't think twice before taking their advice.

TAA on PC usually blurs the shit out of everything; this is why people use alternative stuff from the NVCP.
> No seriously GAF.
> WHERE ARE THE NEXT GEN GAMES?!!!!
> How long are we now into this gen, and still there's not a game in sight that we can say for sure is really NEXT GEN. Where's the leap in lighting and physics or other graphics effects that won't be at all possible on PS4?
> I want to see a game and go "Fuck, PS4 and Xbox One will spontaneously combust trying to run this". Hasn't happened yet.

Yep, there aren't really next-gen-looking games right now. I think GTA 6 will be one of the first real next-gen games.
> I'm convinced it's never happening at this point. I've played everything worth playing this gen. I know what the limitations of these systems are by now. The best we can hope for is something like Ratchet, Demon's Souls, Forbidden West or Ragnarok. Only first-party Sony games can look great at both high resolutions and high framerates, and even these are compromised in SOME way, such as lacking ray tracing or fidelity (like Ragnarok, which looks good but isn't pushing graphics).
> Everything that's received a PS5 upgrade from last gen is only able to get around 1440p/60 with a mix of max/high settings... and those are last-gen games. Every cross-gen new game has similar limitations in terms of resolution at 60 fps: Elden Ring, Dying Light 2, Cyberpunk, etc.
> Whenever there's a game that really pushes graphics, there are drawbacks in terms of resolution, like this RE4 demo, which would look stunning if not for the low resolution and poor image quality. Dead Space Remake and Callisto Protocol are only able to do 1440p/60 without RT (and not even "next gen" visuals).
> Ever since the start of this gen up until the present, there have been obvious limitations on everything that's come out for these consoles.

Wasn't one of the biggest points realized, though? The blazing-super-duper-fast SSD? We are back at 1-2 second loading times instead of 50-second load screens, so there's a real next-gen benefit.
> Wave your hands in front of you and say that again.

If you track your hand, it won't blur.
> TAA on PC usually blurs the shit out of everything; this is why people use alternative stuff from the NVCP.

For me the usual target is native 4K with just FXAA. At 4K, FXAA does well enough, and when downsampled the blur isn't noticeable at that resolution.
I straight up don't use AA when I play at 4K, in 90% of cases.
> Zero motion blur at 60fps+ looks correct on a plasma display. When I track objects with my eyes, they become clear every time on it. Same with CRT.
> Not a fan of anything more than subtle object motion blur at 60fps+.

Proper motion blur gets less and less the more fps you've got.
> Proper motion blur gets less and less the more fps you've got.

If I stare straight ahead, it leaves trails just like when I wave my hand in front of my face without tracking it. That's my preference.
> PS4.5 is already too generous, imho. The lighting is atrocious.
> AO seems nonexistent in many cases, shadow rendering distance is laughable, the bright areas are basically just bloom at 500%, and indirect lighting is just permaglow on everything.....
> We've had lots of PS4 games doing this much better. Side by side with something like RDR2, this game looks older, not newer....

Shhhhh! Don't let rofif know you said this. Forspoken is his sweet baby Jesus.
> If you track your hand, it won't blur.
> I get what he is saying. He's saying we should rely on our own vision for the blur and not manually insert it.

Except our vision can't add motion blur to a series of static images. Motion blur should be a standard rendering feature, but so should the option to disable it for those who dislike it. A more advanced implementation should be possible in VR now that some headsets have eye tracking.
> Except our vision can't add motion blur to a series of static images. Motion blur should be a standard rendering feature, but so should the option to disable it for those who dislike it.

Plasma isn't a series of static images; it uses 600Hz sub-field driving to reproduce images.
> Plasma isn't a series of static images; it uses 600Hz sub-field driving to reproduce images.

Well, that's not really the common case. If we had a super-high-framerate sample-and-hold display then sure, motion blur would occur naturally. Though in that case motion blur haters would be screwed, because then they wouldn't be able to turn it off, lol.
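The sample-and-hold trade-off can be quantified with the usual rule of thumb: when your eye tracks motion on a full-persistence display, each frame smears across the retina for the whole hold time, so perceived blur width is roughly speed divided by refresh rate. A quick sketch; the 1000 px/s pan speed is an arbitrary example of my own:

```python
# Persistence blur on a sample-and-hold display: each frame is held for
# 1/refresh seconds, so eye-tracked motion smears by speed * hold time.
def persistence_blur_px(speed_px_per_s: float, refresh_hz: float) -> float:
    """Approximate perceived blur width (px) for eye-tracked motion."""
    return speed_px_per_s / refresh_hz

for hz in (60, 120, 300):
    print(f"{hz:>3} Hz -> {persistence_blur_px(1000, hz):.1f} px of smear")
```

At 60Hz a 1000 px/s pan smears about 17 px, while at 300Hz it drops to roughly 3 px, which lines up with the "you need like 300 fps" figure mentioned earlier in the thread.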