This article explains how to deal with the error "RuntimeError: CUDA out of memory. Tried to allocate 50.00 MiB ... If reserved memory is >> allocated memory", and is intended as a reference for developers who hit the same problem.
The full error message is as follows:
RuntimeError: CUDA out of memory. Tried to allocate 50.00 MiB (GPU 0; 5.80 GiB total capacity; 4.39 GiB already allocated; 35.94 MiB free; 4.46 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
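Before changing anything, it helps to confirm that the situation really is "reserved memory >> allocated memory", i.e. fragmentation inside PyTorch's caching allocator, rather than the model simply not fitting on the GPU. A minimal sketch using PyTorch's memory introspection functions (GPU index 0 is assumed, matching the error message):

import torch

allocated = torch.cuda.memory_allocated(0)  # bytes currently held by live tensors on GPU 0
reserved = torch.cuda.memory_reserved(0)    # bytes reserved by PyTorch's caching allocator
print(f"allocated: {allocated / 2**20:.1f} MiB, reserved: {reserved / 2**20:.1f} MiB")
print(torch.cuda.memory_summary(0))         # detailed breakdown, useful for spotting fragmentation

If reserved is only slightly larger than allocated, fragmentation is not the issue, and the usual remedy is to reduce the batch size or free tensors you no longer need.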
Asking ChatGPT produced the suggestion below. Note that the torch.cuda.set_per_process_memory_growth and torch.cuda.set_max_split_size calls it proposed do not exist in PyTorch; the split-size limit is actually configured through the PYTORCH_CUDA_ALLOC_CONF environment variable mentioned in the error message, so the corrected version is:
import os
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:1024"  # limit allocator block splitting to reduce fragmentation; must be set before CUDA is initialized, adjust 1024 as needed
import torch
torch.cuda.set_per_process_memory_fraction(0.9, 0)  # cap this process at 90% of GPU 0's memory; adjust 0.9 as needed
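The same allocator setting can also be supplied from the shell when launching the script, which avoids touching the code at all (train.py is just a placeholder name here):

PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:1024 python train.py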
That concludes this note on "RuntimeError: CUDA out of memory. Tried to allocate 50.00 MiB ... If reserved memory is >> allocated memory"; I hope it is helpful to other developers.