TV-based AI Image Enhancement and gaming: Thoughts and impressions

1 : Anonymous2022/02/16 12:20 ID: stul7j

I bought a Sony X90J TV last year, and one of its (and the rest of the brand's mid-to-upper-tier TVs') much-hyped features was the new XR processor. It is claimed to take Sony's existing AI-driven image processing a step further by breaking down and enhancing elements in real time based on how the human eye works. Hype or not, Sony's TVs were once again hailed as the industry's best in processing, and the X90J has not disappointed me on this front (the low dimming zone count and lack of VRR are a different story).

All this to say that I only recently decided to tinker with some of the TV's image enhancement features specifically for gaming, and I've had some surprising results (I'll share some screenshots at the bottom of the post). I was motivated by the recent talk around the tradeoff between Horizon's Fidelity and Performance modes. The new Uncharted Collection had a similar conundrum, and I thought I'd mess around and see if the TV's features could in any way lessen the gulf between the modes. While tinkering with motion smoothing did enhance Uncharted's perceived motion in the 4K Fidelity mode, it of course came at the expense of massive input lag.

But using the TV's 'Reality Creation' feature to enhance the Performance mode's visual fidelity stopped me cold. Enabling the option and setting its value to 55/100 instantly gave the game back the visual crispness I felt in the 4K mode. I thought I must be mistaken, but I went in and verified the game settings were correct. The TV's menus allow you to adjust the current visual settings in real time with only a small corner overlay, and toggling between my manual preset and Reality Creation turned off plainly demonstrated the increase in sharpness and detail, both on edges and in internal texture detail (walls, clothing, rocks, etc.). Whether the image was 1-to-1 with its 4K counterpart or not, the perceptual enhancement was very apparent, and I knew I needed to experiment further with the feature in other games and in concert with other values and settings.

This led me to try Ghost of Tsushima (PS5 version) next, and the results were decidedly different. At the 55/100 value, the Reality Creation setting turned the high-frequency grass into something far too "crunchy", while blowing out highlight detail on tree leaves. This seemed a bit vexing at first, but it was clear that Reality Creation reacted differently to GoT's checkerboard implementation than it did to Uncharted's native 1440p Performance mode. That led me down a rabbit hole of testing almost a dozen other games (which I could detail further if needed), both old and new, and I came away with some ideas as to how and when this feature seems to perform best.

One major takeaway is that games with mediocre (or worse) anti-aliasing will see a bigger tradeoff with Reality Creation enhancement. The tradeoff always seems to be that, while scene detail will be enhanced, jagged edges are also sharpened. Games with lower texture detail also seem to benefit less, as there doesn't seem to be enough to target for enhancement, but lines and edges still get boosted, giving some a kind of "Borderlands effect".

Both the Legacy of Thieves Collection and the unlocked-for-PS5 Last of Us 2 seemed to be the optimal candidates for enhancement (with RDR2 getting a considerable boost as well), and the best correlation I can make is that they have 1440p presentations with great anti-aliasing and high-quality assets underneath. This means that the edge detection is not vexed by checkerboarding, that enhanced lines and edges aren't crunched thanks to their smooth source material, and that surfaces and materials show a lot of improvement in fine detail due to the soft-but-present texture quality underneath.

The final effect works best when tweaked to each individual game (although going past the 55/100 mark starts looking weird pretty fast on anything), with jaggy / checkerboard source material benefitting from a lighter touch (Tsushima looked best in the 20-25 range). I would describe the overall impression of it as something between a sharpening filter and a basic kind of DLSS, which to many may sound very similar to AMD's Super Resolution software-based upscaling solution.
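For context on that Super Resolution comparison: a spatial upscaler like AMD's FSR 1.0 essentially boils down to "resize the frame, then sharpen it adaptively". Here's a grossly simplified sketch of that shape (not AMD's actual EASU/RCAS code; the plain zoom and blur below are stand-ins for the real edge-adaptive filters):

```python
import numpy as np
from scipy.ndimage import zoom, gaussian_filter

def upscale_then_sharpen(lowres_rgb, scale=1.5, amount=0.5):
    """Toy 'upscale then sharpen' pipeline: spatially enlarge a low-res
    frame, then add back high-frequency detail with a sharpening pass.
    `lowres_rgb` is an (H, W, 3) array in 0..255. Real FSR replaces the
    plain zoom with edge-adaptive upsampling and this unsharp mask with
    contrast-adaptive sharpening."""
    upscaled = zoom(lowres_rgb.astype(float), (scale, scale, 1), order=1)
    blurred = gaussian_filter(upscaled, sigma=(1.0, 1.0, 0))
    return np.clip(upscaled + amount * (upscaled - blurred), 0, 255)
```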

That brings us to the broader topic at hand. While this may sound like an ad for Sony TVs (and to be fair, Sony has pushed the envelope in this space), LG has made it clear that its new TVs will also include Object Enhancement and Scene Detection tech driven by improved AI. TVs and display tech in general have continued to build more machine learning into how they process images, so this will likely only become a more prevalent and powerful feature in televisions of the future.

As for what this means for gaming, I think it provides a potentially important middle ground between the limitations of console specs / game display options and the end user experience. While before, a better TV simply did a better job at delivering the source material to the viewer, TVs of the future will be increasingly capable of actively, intelligently, and instantly adding detail to the content being presented. If gamers can still run higher framerate modes but recoup lost resolution detail thanks to an intelligent TV-based enhancement solution, it lessens the blow of choosing performance vs fidelity, with the only cost being the expense of the television. While a Super Resolution implementation would hit the game's performance (albeit only slightly in most cases), and DLSS is only available to those who can afford / acquire full PC rigs with pricey GPUs, TV-based enhancement isn't subject to either limitation (although obviously pricier TVs will likely do it better).

So as long as you have to have a TV to play the game (all respect to monitor players), and TVs are increasingly filled with high-powered processors and ever-improving software, there's no reason why the TV itself shouldn't be an active participant in the image processing pipeline. Sony is actually in a unique position in this space as the only display company that makes gaming hardware. Although their massive fumbles with VRR implementation may not inspire confidence, there's no reason why they couldn't tune their TVs to accommodate specific display configurations for software featured on their console; essentially 'putting the DLSS in the TV'. Is this pretty ambitious and almost entirely speculative at this point? Absolutely, but it's not out of the realm of possibility, as even the current generic implementation has clear benefits, particularly with certain kinds of content.

I'm curious to see if anyone else has had experience with using 'Reality Creation' (or other forms of AI enhancement or sharpening) in games. I've made a small gallery of screenshots demonstrating some of the differences I described above. It's a less-than-ideal depiction, as it's better seen in motion IRL, but screenshots are still better than words.

While taking the shots, I noticed both that A) dimming the in-game brightness may compensate for the somewhat blown-out highlights caused by Reality Creation, and B) I actually didn't mind the higher 55/100 Reality Creation setting as much on Tsushima the second time around. I think it's mostly that the long green grass blades at mid-distance exhibit the most crunchiness, while most other types of foliage aren't as noticeable. All that being said, I'm looking forward to seeing if Reality Creation adds some pop back into Horizon: Forbidden West's Performance mode. We'll see soon enough!

2 : Anonymous2022/02/16 12:46 ID: hx5ztdd

TV upscaling/reconstruction works fundamentally differently for video than for videogames because of the temporal element. For movies you can interpolate between frames (whether you want to add frames or upscale existing ones), while for games you only have previous frames to work with.

The other thing is that DLSS doesn't just use frames, but data from the engine (like motion vectors) to predict detail where there is none. At its core, DLSS was created to assist RTX, so it is more integrated into the hardware and software of games. And DLSS isn't just a static thing, but uses machine learning and training to optimise for each game (although this process has become a lot easier and nvidia's goal is to make it engine, not game dependent).
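To make the motion-vector point concrete, here's a rough NumPy sketch of the temporal-accumulation idea that DLSS-style upscalers build on: reproject the previous frame using the engine's per-pixel motion vectors, then blend it with the newly rendered low-resolution samples. (This is the generic technique, not Nvidia's actual algorithm; the function names and the blend factor are made up for illustration.)

```python
import numpy as np

def reproject_previous(prev_frame, motion_vectors):
    """Warp the previous frame toward the current one using per-pixel
    motion vectors (dx, dy). `prev_frame` is (H, W[, C]), `motion_vectors`
    is (H, W, 2). Real upscalers also filter, clamp and reject samples."""
    h, w = prev_frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip(xs - motion_vectors[..., 0], 0, w - 1).astype(int)
    src_y = np.clip(ys - motion_vectors[..., 1], 0, h - 1).astype(int)
    return prev_frame[src_y, src_x]

def temporal_accumulate(current_upsampled, history, blend=0.9):
    """Blend the newly rendered (jittered, upsampled) samples into the
    reprojected history; over many frames this accumulates sub-pixel
    detail that no single frame contains."""
    return blend * history + (1.0 - blend) * current_upsampled
```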

I mean, yeah, a good TV with great calibration WILL increase image quality. But outsourcing image reconstruction from the gaming hardware to a TV seems sketchy.

PS: also, tensor cores aren't exclusive to high-end hardware per se. Tensor cores aren't expensive, GPUs are expensive - but every GPU is at the moment. Jensen Huang specifically said that hardware is not the biggest question, software is. He stated Nvidia is just as much a software company as a hardware one. Without software there is nothing to accelerate. That's why AMD is lagging behind Nvidia in most key fields. They have pretty decent hardware, but Nvidia's software, from drivers to DLSS to the RTX API to services to a bazillion other things, is just ahead of the curve by 2-10 years: more sophisticated, easier to use, and working in tandem with their specialised hardware.

ID: hx61m50

The TV is essentially doing something similar to AMD's FSR. It is unlikely that we will ever get something close to DLSS on the PS5, so FSR is a better comparison.

ID: hx7g997

It's not really like FSR or DLSS. Upscaling is always on in the TV, even in game mode, but it's only used for 720p/1080p input signals. The PS5 outputs 4K, so it's never used. When DF talks about games rendering at 1080p/1440p etc., that's internal; the image is upscaled before it leaves the console.

These other things are 2D post-processing effects (ie not types of upscalers like DLSS and FSR) and IMHO aren't really helpful. Game mode gives you what the developer intended. Sure, if you really want you can change settings like sharpness, but generally keeping everything on low/off will give better and more consistent results.

The reason 120Hz looks good is partly because the TV can't do motion interpolation (no extra frames to play with) and likely has to go easy on any 'fancy' enhancements given the high speed.

XSX has ALLM, PS5 doesn't (yet) so you have to set game mode manually, which I'd very much recommend. On my LG C9 I accidentally left it on a default picture mode and couldn't figure out why all the motion looked super jerky compared to XSX. Back to game mode and it was all good again.

"Offloading" upscaling to the TV is a bad idea for a number of reasons - TVs are very weak compared to console GPUs so the benefits would be minimal, and you'd get into all sorts of trouble with the main scene being at a different source resolution to the HUD, and you'd have to test every TV out there to make sure it's not doing anything screwy, etc (you really want to wait for a firmware update for your TV for a new game to work?). And then cheaper TVs won't have this feature, so the game would have to implement it anyway. It's really best left up to the game to do the upscaling and compositing.

IMHO the right thing for a TV to do when attached to a game console is display the image provided with as high fidelity as possible in the minimum possible time.

ID: hx62jdy

Exactly. That was the most direct comparison I made, and that is the closest impression I could get from experiencing it. Offloading a super-resolution style enhancement to the TV in order to preserve (even the slight dip in) performance seems like it could be a future trend.

ID: hx622hv

While I'm not an expert, I have been following DLSS since its inception and I'm familiar with the way it is implemented. My point wasn't to make a direct comparison to the technologies per se, but more about the perceptual appearance of the Reality Creation feature and its relative cost (both computational and monetary) compared to similar options. I take your point about the tensor cores, but I think that it essentially further makes my point ("and DLSS is only available to those who can afford / acquire full PC rigs with pricey GPUs").

DLSS, the technology, is exclusive to Nvidia both in terms of IP and in terms of tech prowess, and many of Nvidia's DLSS-capable products are prohibitively expensive and difficult to acquire at the moment (although perhaps not as difficult as a year ago). As you said, cheap tensor cores alone can't get you to DLSS, so if you want all the benefits that DLSS offers, you have to pay in time and money. Super Resolution et al. may be cheaper, but the results aren't as impactful... nor could they ever be, due to their limitations compared to DLSS.

ID: hx83kpe

DLSS, the technology, is exclusive to Nvidia both in terms of IP and in terms of tech prowess

The principle behind DLSS is not exclusive to Nvidia, Intel has a competitor that works in an identical way with hardware acceleration on their upcoming GPUs, using their own algorithm.

Sure, you're not going to get it on existing consoles, outside of some simplified version that runs without full hardware acceleration.

ID: hx82jhr

And DLSS isn't just a static thing, but uses machine learning and training to optimise for each game (although this process has become a lot easier and nvidia's goal is to make it engine, not game dependent).

You're a few years late...

There hasn't been any game-specific training or optimization for two years now, since DLSS 2.0.

3 : Anonymous2022/02/16 19:16 ID: hx7k224

This is a bad path to go down. Any post-processing the TV is doing could be done by the game for minimal - and I mean barely noticeable - performance loss, not to mention to a higher degree and quality, since the game has more information to work with than just the displayed image, which is all the TV has. So the most important question to ask before doing anything of this sort is: why didn't the developer do this? Because they almost surely could, easily. So it's something to consider. This is no different from using equalizers for music; you are overwriting the original material.

The job of TVs hasn't changed. They should reproduce the data they are fed as accurately as possible, to the best of their ability, because a TV is an empty vessel through which the existing material should come across as close to the real thing as possible. No different from speakers.

Does this mean you can't enhance a game beyond its default state? Of course not, that's what modding is for. But the Sony TV post-processing suite is nothing like ReShade on PC. It is heavy-handed, dumb post-processing that is sure to create sharpening artifacts etc., and it isn't well suited to this purpose.

I don't deny you might truly prefer the change, just like many would enjoy a V-shaped EQ on their music or a cold color balance - but - I do think "purism" is ultimately more rewarding, because the other changes are only gratifying because you were conditioned for them, even though they are in fact less refined than the pure experience.

Bar special cases of course 🙂

4 : Anonymous2022/02/16 13:56 ID: hx684ef

I am using a decent Sony TV with the old X1 processor and also do a lot of tinkering and testing. My personal experience is that most settings are very highly dependent on the person in front of the screen. Some people don't feel input or game lag, whereas I do. Same goes for punchy vs. realistic colors, high gamma vs. a darker calibration, etc.

I also tried using pro calibrations and tested some personal tweaks here and there. A good friend of mine, also a gamer, is a very different story. Although in general we agree on some optimizations, some he would never notice if I didn't point them out specifically. Subjective perception is just very different.

In general I feel some TVs have an edge in post-processing intelligence - Sony is pretty advanced in that field due to other products they make - although I feel the most important thing is the base hardware itself: color reproduction, backlights, etc. If the base hardware is cheap, enhancement through software tweaking often just yields mediocre results.

Another huge issue is post-processing chains, e.g. PS5 to AVR to TV. If each system in the chain alters the image or has poor pass-through, that can also seriously degrade (or sometimes improve) image quality.

All just personal experiences. Everyone's might differ.

5 : Anonymous2022/02/16 14:18 ID: hx6b25l

Got a Bravia XE90; it's a mid-range model from some years ago, but it has the same processing features.

I have dabbled with the "Reality Creation" feature before and see it as a secondary sharpening filter, as it has similar drawbacks to the sharpness setting. Any high-contrast edges will get over-sharpened to the point that they show a visible white pixel outline. Depending on the content of the image, this can dramatically increase the overall image brightness, and that is also the reason I leave it off now in most games and just keep the sharpness setting relatively high. Depending on screen content the effect can visibly pop in, and the brightness shift, combined with a high local dimming setting and enhanced contrast for HDR, becomes even more pronounced. Overall I do like the sharper look, even if its clearer edges mean a bit more shimmer, but the effect makes certain games look distractingly unstable to me sometimes. Still, it's pretty cool the feature is available in game mode; Animal Crossing looks great with it on.
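For anyone wondering where that white outline comes from: it's the classic overshoot of unsharp-mask-style sharpening, where pushing the amount too far makes values clip toward white along high-contrast edges. A minimal sketch of the generic technique (the TV's actual filter isn't public, so this is only the idea):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(luma, radius=1.5, amount=1.0):
    """Classic unsharp mask on a 2D grayscale image (0..255): boost the
    difference between the image and a blurred copy. Large `amount`
    values overshoot at high-contrast edges, which clips toward white
    and shows up as the halo / outline described above."""
    blurred = gaussian_filter(luma.astype(float), sigma=radius)
    return np.clip(luma + amount * (luma - blurred), 0, 255)
```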

6 : Anonymous2022/02/16 12:51 ID: hx60d15

Here's something interesting for you: I found that while TVs' processing capabilities are increasingly awesome, using the PS5 for streaming services instead of the TV's native player offers superior image quality.

I guess the PS5's a raw powerhouse and using that for streaming is overkill, but it's definitely worth checking out if you want the best video playback experience.

ID: hx6mkcm

PS5 doesn't have Dolby Vision support though. So if that's something important to you, then you might be better off streaming from the TV's internal apps.

ID: hx7swpm

HBO, Hulu, and Disney+ apps on PS5 still do not support 4K or HDR.

That's an objectively worse experience.

If you have a smart TV in the mid-to-high range from the last 4 or so years, it's more than good enough for streaming. Those platforms have the benefit of usually getting the latest features in their apps, because that's their primary use.

App support and updates have been disgustingly slow on the PS5, especially compared to the Xbox counterparts.

ID: hx6ejxr

[deleted]

ID: hx6fxmt

GPUs today use hardware decoding.

ID: hx6gvgr

Video image rendering. The data comes in over the internet, but it's translated into an image by the GPU.

I find that it blends HDR colours better and seems to upscale HD content better than the native TV does.

It's easier to see on the loading logo of something like Disney plus.

On my TV's built-in player you can see the background as a series of overlapping circles.

When loaded from the PS5, the loading background is a subtle, continuous fade with no obvious overlapping circles.

I can't remember the technical name for the effect, but the shading apparently requires a separate processing chip that most electronics lack (modern-day cost saving). The PS5 obviously didn't skimp on it, and it makes HDR colours blend and shade better.
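(Side note: the "overlapping circles" sound like colour banding from limited bit depth in a dark gradient; the usual fix is dithering or debanding, i.e. adding a little noise before quantising so the bands break up into grain the eye averages out. A toy sketch of that idea; whether the PS5's video path actually does this is an assumption on my part.)

```python
import numpy as np

def quantize(gradient, levels):
    """Crush a smooth 0..1 gradient down to a few levels -- this is what
    produces visible bands / 'overlapping circles' in dark backgrounds."""
    return np.round(gradient * (levels - 1)) / (levels - 1)

def quantize_with_dither(gradient, levels, rng=np.random.default_rng(0)):
    """Add sub-step noise before quantising; the hard band edges break up
    into fine grain, which reads as a smooth, continuous fade."""
    noise = (rng.random(gradient.shape) - 0.5) / (levels - 1)
    return np.round(np.clip(gradient + noise, 0, 1) * (levels - 1)) / (levels - 1)
```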

There are probably some other post-processing effects - frame interpolation, motion smoothing - that, once again, the PS5 has no trouble calculating.

The OP is talking about using a TV to do some clever post-processing effects to boost the graphics.

I'm saying that I find the opposite to be true: removing all of the TV's post-processing effects (for the lowest input lag) and relying on the PS5 to render video streams offers superior image output.

7 : Anonymous2022/02/16 17:18 ID: hx72hbf

The biggest thing these TVs do is intelligently scale without creating pixelation, while removing artifacts such as banding and pattern repetition. It works pretty dang well for average content, but it can and will frequently get things wrong when it comes to rendered images with a deliberate style (i.e. not aiming for 100% photorealism). Games often use transparency and artistic interpretation to achieve their intended look. GoT is one of those games, with its greenery.

Upscaling and AI can't create realistic detail where there isn't any to begin with. It can't take a leaf modeled with a rectangle and make it look like a detailed maple. It also can't add details in textures if they aren't there to begin with.

8 : Anonymous2022/02/16 15:43 ID: hx6nefp

The first thing I do when I buy a TV is disable most of these features - without question if they introduce any input latency.

I buy a display to do one thing, accurately present the signal it is sent.

ID: hx7zoid

Same here.

I have zero desire for my screen to layer a bunch of photoshop filters over whatever I'm watching.

9 : Anonymous2022/02/16 15:52 ID: hx6otd8

These tricks work until they don’t.

10 : Anonymous2022/02/16 20:45 ID: hx7xxs6

Can someone draw some red circles? I can't tell the difference between the pictures lol

11 : Anonymous2022/02/16 12:42 ID: hx5zee5

Sony does have good tech, but I think their main advantage is that the color palette tends to be the most true to life compared to the real world.

For popping images I'd go with Samsung over them, and possibly still would regardless, just because of how important VRR is.

LG CX is still the gold standard for gaming and nothing really compares imo regardless of the post processing that takes place or how much an image can pop elsewhere.

12 : Anonymous2022/02/16 14:22 ID: hx6bkoo

Interesting read

For years I've been both a competitive and a casual gamer, laid back on the sofa to play video games. I did in-depth research before purchasing my next big TV for the PS5 back in December 2019, upgrading my 10-year-old 50-inch Samsung LCD TV to an LG C9 55-inch OLED.

[Hint] Sony and Samsung get their screen displays from LG…

LG C models are the best on the market for gaming, and if you have a look at the HDTVTest channel for further spec tests, the latest model, the C1, has an input lag of 4ms, which is outstanding for a TV at 55 or 75 inches. Not only that, the picture quality always amazes me.

As for my C9, my expectations were high, and LG updated it to 4K 120Hz with HDMI 2.1.

LG C1 2021 settings for PS5 from HDTVtest.

Rtings are the top reviewers, in my opinion.

If you are new to this or looking to purchase a TV for games, movies, and TV shows, and willing to pay over 1.2 or 1.5k, go for the LG models; you won't regret it.

ID: hx7zb6g

I just got a Sony X85J for the simple reason that it's the ONLY modern TV that doesn't flicker. LCD (PWM) flicker is bad for your eyes and could lead to blindness. OLED doesn't need to flicker, but does because the TV has to baby itself to prevent burn-in, resulting in 240 flickers per second on the C1 specifically. Sony's X85J is the only TV without flicker, according to rtings.com.

I can't handle other screens due to bad migraines. But I also don't want to babysit my TV, which was my reason for avoiding OLED. It IS a problem if you game thousands of hours a year.

13 : Anonymous2022/02/16 12:58 ID: hx617y0

Good read. Thanks for putting in the time to test this out. I have an 85" 900H that I'll mess around with using the same settings, and I'll try to corroborate your findings.

ID: hx6ct1r

The 900H is the exception to this, as it doesn't feature a Sony processing chip, just the MediaTek one.

ID: hx6ec5n

Good point.. oops. Well, I'll just sit here with my half-the-promised-features display, hah. That being said, I've actually been quite okay with the 900H thus far. Got a clean panel from dse, and I haven't updated the firmware in a year, as they have introduced more problems with their updates thus far. Happy gaming everyone.

PS: for anyone reading through this thread trying to decide what to buy, just get an LG C1 and call it a day.

14 : Anonymous2022/02/16 12:41 ID: hx5zajz

Really interesting.

For me, on an LG OLED, it's the motion flow that I use on all games on PS5. Going from 60fps to 120 is awesome: no artefacts, no input lag problems, you just get 120Hz fluidity. On 30fps games it's awful and I recommend deactivating it, but on 60 it's a miracle.

I tried a few 120Hz games (Fortnite, Uncharted LL, Dirt 5), but I can't see any difference between native 120Hz and the TV's 120Hz - at the cost, of course, of drastically lowering resolution in the games' 120Hz modes.

ID: hx60a7t

Yeah, motion interpolation seems better than ever, but it will always come at a response time penalty. I think what you're describing is that, even with a small bit of input lag added, 60fps is still responsive enough that a game remains highly playable with interpolated frames. The added smoothness of the motion plus a playable response time will always feel like a good combination (if people could be impressed by how smooth R&C's 40fps mode felt, then 60-ish will always impress them).

I really do feel the difference in true 120fps vs 60fps in games like Doom Eternal, Rocket League, and Rogue Company, despite the fact that 60 is highly playable. All of those games still look very presentable at their lowered resolutions, but obviously the dream is to have that high framerate with as much detail as possible. I'm curious to see where both TVs and possible mid-gen consoles choose to focus their graphics efforts in the near future. LG's competitor to the XR processor is the best chance yet to take Sony's image-processing crown (LGs already look amazing anyway).

ID: hx63sxg

I really do feel the difference in true 120fps vs 60fps in games like Doom Eternal, Rocket League, and Rogue Company

Yep, on competitive games no doubt you'll feel the difference.

And I'd add that the framerate must be really stable: motion flow works perfectly on 60fps games, but at 40-50 it makes the fps drops worse...

ID: hx63ezx

I believe what you're referring to is OLED Motion Pro? If so, that is black frame insertion, meaning way better motion and no input lag. The only downside is a loss in brightness, and some people who are sensitive to that kind of stuff see some flicker. Any other motion stuff should be off since it causes input lag.

And to OP, I really advise turning any screen enhancement off and keeping the picture as is. It causes tons of input lag, which is why it's usually all disabled when the TV is in game mode. Also, usually there's one thing that gets prettier and others that get uglier. You should take tons of screenshots and compare.

15 : Anonymous2022/02/16 14:25 ID: hx6c1v7

If you have a "Game" picture mode this is the one you should always be using. This gives the lowest possible input lag.

Any post-processing done by the TV will add input lag and make the game feel less responsive.

ID: hx6co38

The feature I described was used exclusively in the Game Mode setting.

ID: hx6d6e5

Ok, but you don't state that in your post.

My comment is important since many people don't understand the importance of using Game mode.

16 : Anonymous2022/02/16 14:23 ID: hx6bsfd

I have an LG CX 48 and I've almost always played with BFI on High (OLED Motion Pro), but this is not enough to make 30fps look smooth. While playing Guardians of the Galaxy, I was eager to have ray tracing, but couldn't stand 30fps.

This is when I tried switching out of GAME mode for the first time. I pushed motion smoothing all the way to the right (both de-judder and de-blur) and got the ultimate picture quality with ray tracing and what felt like 120fps! The only exception was lingering flames: everything else was like 120fps, while the flames looked like 30fps inside the smooth image around them. Yeah, I also hate the soap-opera effect in movies, but for games... this might be a breakthrough!

There was an input delay, yeah, but in Guardians of the Galaxy you don't really care - you don't aim, you just run around smashing buttons. It didn't bother me at all.

Later I tried switching to the smooth mode again in HZD. It felt a bit laggier, but still playable in third person.

I hoped I could do the same with Deathloop, but nope, in a first-person shooter I just couldn't stand the input lag. Unfortunately the performance mode in Deathloop feels almost the same as resolution mode... I don't know why, but it's bad. I think I'll just continue to suffer in quality mode with BFI.

I really hope I'll be able to smooth the resolution mode in HFW up to "120fps" without much of an input lag issue, but we'll see soon.

Anyway, I think this method might work for some game genres. Maybe sometimes you don't even need to put de-blur and de-judder on maximum. Try what fits for you.

As for your approach: I haven't tried it, but I will!
I agree that TVs now have quite a lot of processing power, and we should use it where applicable.

ID: hx75v22

BFI seems like a decent workaround, but the problem is it dims the image, and the LG C TVs aren't the brightest, so the dimming is more pronounced.

17 : Anonymous2022/02/16 14:49 ID: hx6fbgd

I performed similar experiments with the XR processor on my A80J specifically on the Legacy of Thieves Collection and the Nathan Drake Collection (at 1080p). My findings were:

- In general, the processing is good, even better than something like FSR, although high values (40+) seem to introduce some over-sharpening and actually show less detail in textures. I keep it at 20.
- The processor works drastically better when supplied with the game's actual rendering resolution, so playing the Nathan Drake Collection with the PS5 set to output 1080p was vastly better than setting the output to 4K.
- The perceived sharpness and detail changed drastically based on viewing distance; some of the magic disappeared when viewing too close (< 5 ft away for a 77" TV).
- The processing seems to use a lot of micro contrast (localised contrast) to improve perceived detail instead of typical sharpening methods (see the sketch below).
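On that last point, "micro contrast" usually means a large-radius local contrast boost rather than pixel-level edge sharpening - essentially the same unsharp-mask idea as above, just with a much wider radius. A rough, purely illustrative sketch (the XR processor's actual pipeline isn't public, so the function and numbers here are guesses):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def local_contrast_boost(luma, radius=25.0, amount=0.4):
    """Large-radius unsharp mask, often sold as 'clarity' or local/micro
    contrast: instead of boosting pixel-level edges, it exaggerates tonal
    differences over a wide neighbourhood, which reads as extra texture
    detail. `luma` is a 2D grayscale array in 0..255."""
    base = gaussian_filter(luma.astype(float), sigma=radius)
    return np.clip(luma + amount * (luma - base), 0, 255)
```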

Overall, I keep it on but still prefer to play Legacy of Thieves in Resolution mode.

18 : Anonymous2022/02/16 16:41 ID: hx6wk19

I have a very similar TV and the AI up scaling can give some fantastic results, including very convincing frame rate interpolation from 30 to 60fps. The problem, of course, is the extra input lag. It’s much improved over past models, but still unsuitable for any kind of serious gaming.

19 : Anonymous2022/02/16 17:57 ID: hx78j8n

I didn't read all of your post, but I will say I have an 85-inch 900H and found Reality Creation around 40-50 to be great for a lot of games that are sub-4K. It wouldn't be perfect 4K crispness, but it was surprising to me how much better it looked. I made a post less detailed than yours when I found out about the feature and didn't get much response. The first game I tried it on was Deathloop in 60fps mode, and it made a huge difference to my eyes.

For me though I'm 9 feet from an 85 inch screen so I feel it might magnify the benefits quite a bit. I'm curious as well to try Horizon performance mode with this one.

That all being said, the 40fps mode in Rift Apart is still my gold standard. I didn't find the 60fps mode to feel that much better to me, but it looked tremendously worse than the 30/40fps mode. I wish games like Horizon had a 40fps mode at a smidge under 4K resolution, if that's what it takes. I know many people don't have 120Hz displays, but it's going to look dated in 5 years when many people do and are demanding that kind of perfect middle ground between graphics and performance, imho.

20 : Anonymous2022/02/16 19:43 ID: hx7od63

I have the A80J Sony OLED and the reality creation setting is incredible for lower res games. Like you said, it doesn't work well with an already 4k image or in games with lots of thin foliage like GoT. But on performance modes it really does make a huge difference in clarity. I recently played Guardians of the Galaxy in performance mode which is 1080p and was really confused reading comments online complaining about the low res and being extremely blurry before I realized I had reality creation on so I was seeing none of that. I'm hoping it can similarly bridge the gap between performance and fidelity modes in the upcoming Horizon

21 : Anonymous2022/02/16 21:59 ID: hx89ett

This just looks like sharpening with extra steps

22 : Anonymous2022/02/16 22:14 ID: hx8bpnp

talk about a post in need of a TLDR

Source: https://www.reddit.com/r/PS5/comments/stul7j/tvbased_ai_image_enhancement_and_gaming_thoughts/
