RTX 3090 Ti 24 GB VRAM, CUDA out of memory error

#6
by tjohn8888

On my RTX 3090 Ti (24 GB VRAM) I get a CUDA out of memory error when running app.py. How much video memory does this application require?
Using TripoSG on its own (without the Gradio interface), the 3D mesh builds fine; the problem is entirely the texture-mapping step.
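For reference, this is roughly how I check which step blows past 24 GB. It is only a minimal sketch assuming the app is a standard PyTorch pipeline; the pipeline and function names in the comments are placeholders, not the app's actual API:

```python
import torch

def report_peak_vram(step_name, fn, *args, **kwargs):
    """Run one pipeline step and print its peak VRAM usage.

    `fn` is whatever callable app.py invokes for that step; the names used
    in the example below are hypothetical placeholders.
    """
    torch.cuda.empty_cache()
    torch.cuda.reset_peak_memory_stats()
    result = fn(*args, **kwargs)
    peak_gib = torch.cuda.max_memory_allocated() / 1024**3
    print(f"{step_name}: peak VRAM {peak_gib:.1f} GiB")
    return result

# Example usage (placeholder names):
# mesh = report_peak_vram("mesh generation", triposg_pipeline, image)
# textured = report_peak_vram("texture mapping", texture_pipeline, mesh, image)
```

Running the texturing step with `PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True` set did not help either, so it does not look like simple fragmentation.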
