- Samsung confirms its next-gen Exynos SoC with AMD RDNA2 GPU will support ray tracing - VideoCardz.com
-
Why tho? I don't see this working at a reasonable frame rate.
ID: hfe3nsg
ID: hfelkl2Unreal 5's Lumen is very much built around ray tracing. It can use hardware ray tracing when available, or very fast software ray tracing with some compromises.
ID: hfe4iwvI'm pretty sure Unreal 5's Lumen uses ray tracing
ID: hfgjg9yIt may be because the hardware and software are already there? I'm not sure why people are overthinking it. It's not like they will go out of their way to specifically disable ray tracing on a ray-tracing-capable GPU.
ID: hfh1ti2It will improve, of course, and in NVIDIA land RTX with DLSS already runs buttery smooth
-
Why are people mad about it? Imagine those 2D shiet games with RT, or some entry-level 3D games with it. No one expects GTA SA, Real Racing 3 or anything like that with RT (but tbh, even those would EASILY work)
ID: hffspdjOri :O
-
Technically yes, but if it's got the same ray tracing design as current RDNA2 GPUs it ain't gonna work well. Even ray tracing at Ampere's level ain't gonna be enough. This is a rumored 3-WGP (or 6-CU) chip
ID: hfcipr7Isn't this for mobile smartphone SoCs? In that case I doubt ray tracing even matters. They should just focus on more frames and leave that stuff for the PCs/consoles where people nowadays spend thousands of dollars on hardware.
ID: hfck25gYea, it's probably just for marketing and ain't gonna work well in use. The ray accelerator units in RDNA2 are really small in die area compared to the CU and other stuff like the TMUs. Samsung probably didn't rip them out because of the extra redesign work and the insignificant area savings
They probably figured they could market it as the first ray-tracing mobile phone on the market
ID: hfckec2It would still be enough for some applications, e.g. RT audio.
ID: hfcllndYea, but not in the gaming graphics that the official Samsung post is advertising
ID: hfdkhiiThis is a rumored 3-WGP (or 6-CU) chip
In what world would you believe 6 CUs @ 1.3 GHz can beat an A14?
ID: hfdnew4We don't have a clue about the benchmarking conditions. We don't know if it was clocked higher than intended for testing purposes, or whether that performance can be sustained over time given thermals. We don't know how the chip was benched (in a phone? on a test bench?)
It ain't gonna matter anyway; 6 CU, 8 CU or 10 CU, the ray tracing performance ain't gonna be great if it's the same RDNA2 design
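For scale, here is a rough back-of-envelope on the rumored configuration. It assumes the standard RDNA2 figures of 64 shader lanes per CU and 2 FLOPs per lane per cycle (one FMA); the 6-CU count and 1.3 GHz clock are the numbers floated in the thread above, not confirmed specs:

```python
# Rough FP32 throughput estimate for the rumored 6-CU RDNA2 mobile part.
# Assumptions: 64 shader lanes per CU, 2 FLOPs per lane per cycle (FMA),
# and the rumored 1.3 GHz clock from the comments above.
cus = 6
lanes_per_cu = 64
flops_per_lane_per_cycle = 2  # fused multiply-add counts as 2 FLOPs
clock_hz = 1.3e9

gflops = cus * lanes_per_cu * flops_per_lane_per_cycle * clock_hz / 1e9
print(f"{gflops:.0f} GFLOPS FP32")  # prints: 998 GFLOPS FP32
```

Roughly 1 TFLOPS of FP32 shading throughput, which frames why commenters doubt the ray accelerators attached to it can carry real-time RT graphics.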
-
I actually think it's pretty cool. As the technology matures, maybe in 10 years or so, we won't be saying it isn't possible anymore.
Ray-traced graphics on a device in your pocket. Evolution-wise, mobile hit roughly PS2 level around 2007, X360 level around 2013, Switch/Wii U level in 2017, and now RT arrives on mobile (2021-). It seems new technologies are going to trickle up sooner now, which is a different mentality.
I'd say it's up to the desktop space to freaking move their asses, lol.
-
The only use case I see for this is not games but accelerating professional applications; definitely not real-time 3D.
Or maybe some more realistic-looking augmented reality tools, like for IKEA and the like? Or some slow-moving games that can rely on cached image data, like that LEGO game?
-
It has VRS (variable rate shading) too. Would love to be able to unfold my Fold 3 and have a nice gaming experience on the inside tablet screen.
ID: hffsmk0The number of games supporting VRS on PC is so low, I bet it will take a whole lot of time before one shows up on mobile. Maybe Diablo Immortal will have something to do with it, as it's the only AAA game I can think of when we talk about Android.
-
Honestly... this ray tracing stuff is getting a bit much. New features are welcome, but come on... at least it'd be nice to see a feature championed that adds some real value.
-
I am trying to teach myself how to build a local AI, and one of the things I have learned is that ray tracing is hugely important in AI and is not necessarily beneficial only for higher graphics performance. So my guess is this will be used to optimize the phone for AI tasks. On the graphics side it can be used to enhance camera functions such as autofocus and low-light denoising. But it can also be used to run local voice-to-text and conversational AI, and to do really crazy selective listening to help with call quality in noisy environments. Honestly, the number of things that can be accomplished with even a small number of parallel cores capable of ray tracing is astounding to me. I am only a few weeks into learning, so please forgive me if I am wrong; I would love more resources if anyone has any. Currently I am working on a license plate recognizer to open my gate, but I hope to do much more. My dream is to have a virtual assistant that doesn't share my data with anyone but me.
ID: hfixe07Currently I am working on a license plate recognizer to open my gate but I hope to do much more.
Just curious...why do you need ray tracing for this?
I wrote one application that used image-to-text, and conceptually I can't see how ray tracing is needed for this. This was about 10 years ago, and my experience was with the tesseract library. I also did some image pre-processing using opencv to improve accuracy before feeding it into tesseract. Accuracy was quite good as long as you had OK resolution... unfortunately I had very low resolution images to work with, but it was still good enough for acceptable accuracy.
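[Editor's note: a minimal sketch of the pre-processing step described above. Pure NumPy is used here so the snippet stays self-contained; in a real pipeline you would do the cleanup with opencv and then hand the binarized image to tesseract, e.g. via `pytesseract.image_to_string`. The mean-based threshold and the synthetic 8x8 image are illustrative, not the commenter's actual code.]

```python
import numpy as np

def binarize_for_ocr(gray: np.ndarray) -> np.ndarray:
    """Crude global threshold (image mean) to clean up a grayscale
    image before handing it to an OCR engine such as tesseract."""
    threshold = gray.mean()
    return np.where(gray > threshold, 255, 0).astype(np.uint8)

# Synthetic example: dark "character" pixels on a light background.
img = np.full((8, 8), 200, dtype=np.uint8)
img[2:6, 2:6] = 30  # pretend this square is a character stroke
clean = binarize_for_ocr(img)

# Real pipeline would continue with something like:
#   text = pytesseract.image_to_string(clean)
print(clean[0, 0], clean[3, 3])  # prints: 255 0
```

None of this touches ray tracing hardware, which is the point being made: OCR is plain array math plus a trained recognizer.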
ID: hfj1rnaLike I said, I am just starting my journey into AI, so I may be way off base on why ray tracing is helpful for AI; take this with a grain of salt. But my rudimentary understanding is that you drop the "ray" and just think about it like path tracing. When a new set of data is received by the brain, it is basically looking for associations, and associations of those associations, until you effectively have a neural path to a conclusion. If that conclusion is useful, the path is positively reinforced; if not, it is negatively reinforced. Ray tracing has let us accelerate the number of paths we can check for usefulness. So while it is absolutely possible to do AI without ray tracing, the same program can be run way faster if you have it available. Currently the GPU I am using is a 980 Ti, so I don't get those benefits, but hopefully soon I will be able to afford a used Tesla card from NVIDIA, or whatever makes the most sense once I have the money. License plate reading is rather rudimentary as far as complexity goes, so I should be able to pull it off with what I have. But if I want to do real-time facial recognition of employees, it will take significantly more power.
-
Cool, but will it have open source drivers ?
ID: hffsx9mSince Android 11 you can upgrade graphics drivers via the Play Store. AMD has prepared its own drivers for Android and will probably update them the same way as its PC drivers.
ID: hfg28e9I don't want to use the Play Store or any closed-source software from Google!
I hate spyware.
I'll be using LineageOS with F-Droid.
-
Supporting it and utilizing it are different things. I don't see this having enough horsepower to utilize it effectively. But who knows, maybe they will surprise us.
-
With every little bit that comes out, I look forward to it more... that launch date inches a little closer, and my Galaxy S3 that's still in use will finally be allowed to sleep.
-
Gimmick
-
Great... ray traced ads.
-
Well, once a computer filled a building; now it's the size of a watch.
Source: https://www.reddit.com/r/Amd/comments/q141v0/samsung_confirms_its_nextgen_exynos_soc_with_amd/
I guess the idea is #1 sell more due to gimmick, #2 hope that in the long term the performance improves and it becomes standard.
Meanwhile Unreal 5 has already deprecated ray tracing in favor of its own faster lighting solution, so, lol... who knows.