When you create an asset for the GPU, it typically gets populated from the CPU, and a second buffer for the GPU is created. It gets sent to render a frame; in 3-4 frames we will see it. The GPU copy cannot be freed or modified until those 4 frames have passed. If you have static images and geometry, then the same asset is used each frame on the GPU. When you start creating and destroying assets, you end up with 4 copies in memory; each one represents that single frame of dynamic data waiting to be consumed on the GPU.

One thing that exacerbates this is resizing the output buffer, i.e. doing a render at 1080p, then another one at 640. Each time the renderer changes the output buffer size, those buffers get locked for the GPU frame they are used on. Then the next frame, the buffer is resized to 1080p again, but it can't re-use the 640 buffer, and it has forgotten about the previous 1080p buffer, so we end up consuming even more GPU memory than before. The usual offenders are dynamic geometry with a variable number of verts and dynamic textures that are all different sizes.

If you absolutely must be dynamic, then standardize your capture size so you can re-use the off-screen buffers, or dig into the engine and tell it to start reusing buffers of the proper size from a previous frame (see the pooling sketch at the end of this post). This is the work that has been done on several UE4 projects to keep the memory down, remove this issue, and still maintain a high level of fidelity and quality.

It also seems that this issue can occur due to a bug using DX12 (DirectX 12) with Lumen, which is the default Dynamic Global Illumination Method (in the project's DefaultEngine.ini we have DefaultGraphicsRHI=DefaultGraphicsRHI_DX12 and r.DynamicGlobalIlluminationMethod=1). Hence, you need to change either your RHI settings or your dynamic GI method. You can either set DefaultGraphicsRHI=DefaultGraphicsRHI_DX11 in DefaultEngine.ini (to use DirectX 11 rather than 12) or go to "Edit → Project Settings → Platforms → Windows → Default RHI" and select DirectX 11 or Vulkan (I haven't tested it yet).
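Concretely, a minimal DefaultEngine.ini sketch of both fixes might look like this. The two keys are the ones quoted above; the section headers are the standard UE5 ones, so verify them against your own project file:

    [/Script/WindowsTargetPlatform.WindowsTargetSettings]
    ; Option 1: fall back from DirectX 12 to DirectX 11
    DefaultGraphicsRHI=DefaultGraphicsRHI_DX11

    [/Script/Engine.RendererSettings]
    ; Option 2: keep DX12 but move the dynamic GI method off Lumen
    ; (1 = Lumen, the default; 0 = None, 2 = Screen Space GI)
    r.DynamicGlobalIlluminationMethod=0

Picking a Default RHI in Project Settings should write the same DefaultGraphicsRHI entry for you, so the two routes end up in the same place.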
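And to make the buffer-reuse advice above concrete, here is a small, self-contained C++ sketch. This is plain C++ rather than actual Unreal Engine code, and every name in it (BufferPool, kFramesInFlight, and so on) is invented for illustration; it just models the two points from the post: a buffer handed to the GPU stays locked until the frames in flight have passed, and a standardized capture size lets requests be served from a free list instead of allocating fresh memory every frame.

    // Illustrative sketch only -- plain C++, not actual Unreal Engine API.
    // A tiny off-screen buffer pool demonstrating two ideas from the post:
    //  (1) a buffer handed to the GPU stays locked until kFramesInFlight
    //      frames have passed, and
    //  (2) recycling retired buffers of the same size keeps memory bounded,
    //      instead of piling up a fresh allocation every frame.
    #include <cstdint>
    #include <cstdio>
    #include <map>
    #include <memory>
    #include <utility>
    #include <vector>

    constexpr int kFramesInFlight = 4; // "freed once those 4 frames have passed"

    struct Buffer {
        int width = 0, height = 0;
        std::vector<uint8_t> pixels; // stands in for the GPU allocation
    };

    class BufferPool {
    public:
        // Hand out a buffer of the requested size, recycling one retired from
        // a past frame when possible. A size never seen before must allocate.
        std::shared_ptr<Buffer> Acquire(int w, int h) {
            auto& freeList = free_[{w, h}];
            std::shared_ptr<Buffer> buf;
            if (!freeList.empty()) {
                buf = freeList.back(); // reuse a buffer of exactly this size
                freeList.pop_back();
            } else {
                buf = std::make_shared<Buffer>();
                buf->width = w;
                buf->height = h;
                buf->pixels.resize(size_t(w) * size_t(h) * 4); // RGBA8
                ++allocations_;
            }
            inFlight_[frame_ % kFramesInFlight].push_back(buf);
            return buf;
        }

        // Call once per frame. Buffers that have now been in flight for
        // kFramesInFlight frames are no longer referenced by the GPU, so
        // they go back on the free list for their size.
        void EndFrame() {
            ++frame_;
            auto& retiring = inFlight_[frame_ % kFramesInFlight];
            for (auto& buf : retiring)
                free_[{buf->width, buf->height}].push_back(buf);
            retiring.clear();
        }

        int TotalAllocations() const { return allocations_; }

    private:
        uint64_t frame_ = 0;
        int allocations_ = 0;
        std::map<std::pair<int, int>, std::vector<std::shared_ptr<Buffer>>> free_;
        std::vector<std::shared_ptr<Buffer>> inFlight_[kFramesInFlight];
    };

    int main() {
        BufferPool pool;
        // Standardized capture size: after the first kFramesInFlight frames,
        // every request is served from the free list, so the allocation count
        // stops growing instead of climbing forever.
        for (int frame = 0; frame < 100; ++frame) {
            pool.Acquire(1920, 1080);
            pool.EndFrame();
        }
        std::printf("standardized size: %d allocations over 100 frames\n",
                    pool.TotalAllocations());
        return 0;
    }

Because the pool keys its free lists by size, it never "forgets" the previous 1080p buffer when a 640 render happens in between, and with a standardized capture size it settles at 4 live buffers (one per frame in flight) no matter how long it runs.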