ldr_wgpu high performance realtime LDraw renderer


(2023-04-05, 15:48)Philippe Hurbain Wrote: My machine is equipped with a GeForce GTS240 board. That's 2009 hardware. When I say ageing...

That is indeed quite old. It's great that it still runs. Even if I could get ldr_wgpu working on your GPU, I wouldn't expect it to run much faster than existing programs due to hardware limitations.

For those curious, an OpenGL 3.3 compatible way to do occlusion culling is with occlusion queries. A query tells the GPU to draw an object (usually a simplified proxy like its bounding box) and report back to the CPU whether any of it passed the depth test. Issuing all those queries and waiting for the results takes time, and with lots of small objects like in Lego models this adds a lot of overhead. Some game engines wait a few frames before reading the query results to improve performance, but this can cause visible flickering if the camera moves too quickly. You can also "bake" the occlusion checks by removing hidden geometry ahead of time, as some scripts have done, but this won't occlude as much geometry and takes a while to calculate.
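To make the query mechanism concrete, here is a small CPU-side simulation of what a GL_ANY_SAMPLES_PASSED occlusion query computes on the GPU: rasterize a bounding box against the current depth buffer and report whether any fragment would survive the depth test. The depth buffer layout and box representation are hypothetical simplifications for illustration; a real query runs on the GPU and its result only reaches the CPU some time later, which is where the stalling and flickering trade-offs come from.

```python
# Simulation of an occlusion query's "any samples passed" result.
# Assumes a conventional depth convention where smaller values are
# closer to the camera.

def any_samples_passed(depth_buffer, box_min, box_max, box_depth):
    """Return True if any pixel of the box is closer than the stored depth."""
    (x0, y0), (x1, y1) = box_min, box_max
    for y in range(y0, y1):
        for x in range(x0, x1):
            if box_depth < depth_buffer[y][x]:  # closer than the occluder
                return True
    return False

# A 4x4 depth buffer where a large occluder at depth 0.5 covers everything.
depth = [[0.5] * 4 for _ in range(4)]

# A box entirely behind the occluder is reported occluded...
print(any_samples_passed(depth, (0, 0), (4, 4), 0.9))  # False
# ...while a box in front of it passes.
print(any_samples_passed(depth, (0, 0), (4, 4), 0.2))  # True
```

On real hardware, each query like this costs a draw call plus a GPU-to-CPU readback, which is why issuing one per brick in a large model adds so much overhead.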

By "modern" GPUs I mean GPUs with more general purpose computing capabilities like compute shaders. GPUs used to just do rendering; now you can schedule your own computations to run on them. With compute shaders, the GPU can perform a similar visibility check using just a few math operations and run it for many objects in parallel. This scales very well to large scenes with lots of objects and doesn't require the CPU to wait for anything to complete.
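As an illustration of the kind of per-object math such a culling shader runs, here is a bounding-sphere frustum test written on the CPU for readability. Occlusion testing against a depth pyramid follows the same pattern: reduce each object to a cheap bound, do a few dot products or comparisons, and write a visibility flag. The plane layout (each plane as (nx, ny, nz, d) with inward-facing normals) is a hypothetical simplification, not taken from ldr_wgpu itself; on the GPU the same function would run for thousands of objects in parallel.

```python
# Per-object visibility math of the kind a culling compute shader runs.
# A sphere is culled only if it lies entirely outside some frustum plane,
# so intersecting spheres are conservatively kept visible.

def sphere_visible(center, radius, planes):
    """Signed-distance test of a bounding sphere against frustum planes."""
    cx, cy, cz = center
    for nx, ny, nz, d in planes:
        # Distance from the sphere center to the plane; fully outside
        # one plane means the whole object can be culled.
        if nx * cx + ny * cy + nz * cz + d < -radius:
            return False
    return True

# A toy "frustum": six axis-aligned planes enclosing the cube [-1, 1]^3.
planes = [
    (1, 0, 0, 1), (-1, 0, 0, 1),
    (0, 1, 0, 1), (0, -1, 0, 1),
    (0, 0, 1, 1), (0, 0, -1, 1),
]

visibility = [sphere_visible(c, 0.5, planes) for c in [(0, 0, 0), (5, 0, 0)]]
print(visibility)  # [True, False]
```

Because each object's test is independent, the GPU can evaluate the whole list in one dispatch and feed the surviving objects straight into indirect draw calls, with no CPU readback at all.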
RE: ldr_wgpu high performance realtime LDraw renderer - by Jonathan N - 2023-04-08, 0:12