DEV Community

NirmalSankalana

My PyTorch Model in Kaggle uses 100% CPU and 0% GPU During Training

I'm fine-tuning a PyTorch classification model on plant disease images. I've initialized the CUDA device and moved the model and the train, validation, and test data to it. However, during training the process uses 100% CPU and 0% GPU. Why is this happening?

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
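A common cause of this symptom is that the batches produced by the `DataLoader` are never reassigned after calling `.to(device)` inside the training loop: `Tensor.to()` returns a new tensor rather than moving the original in place. Below is a minimal sketch of a device-correct training step; the tiny `nn.Linear` model and the random batch are stand-ins for the actual fine-tuned classifier and image data, not the poster's code.

```python
import torch
import torch.nn as nn

# Use the GPU when CUDA is available, otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Hypothetical tiny model standing in for the fine-tuned classifier.
model = nn.Linear(16, 4).to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Dummy batch standing in for one DataLoader batch of images/labels.
images = torch.randn(8, 16)
labels = torch.randint(0, 4, (8,))

# .to(device) returns a NEW tensor: it must be reassigned.
# Writing `images.to(device)` alone leaves `images` on the CPU,
# and the forward pass then runs (or fails) on the wrong device.
images, labels = images.to(device), labels.to(device)

# One training step, entirely on `device`.
logits = model(images)
loss = criterion(logits, labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()

# Model parameters and input batch should report the same device.
print(model.weight.device, images.device)
```

If the devices printed at the end differ, the GPU sits idle while the CPU does all the work; checking `torch.cuda.is_available()` inside the Kaggle session also rules out a notebook where the GPU accelerator simply isn't enabled.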
