Inquiry Regarding "CUDA Out of Memory" Error with Spitfire Filter #3

@rohud91

Description

Hello,
I am encountering a "CUDA out of memory" error when applying the Spitfire filter to large images. I was wondering if there is a way to process these images iteratively or by splitting them into smaller parts while maintaining their original size and resolution.
Specifically, is it possible to send smaller batches to CUDA for processing without downscaling the image? I have already tried clearing the CUDA cache and setting max_split_size_mb: 21, but this did not resolve the issue.
I would appreciate any insights you can provide. Thank you for your help and your excellent work in image analysis!
Best regards,
Roman Hudeček
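A common workaround for this kind of out-of-memory error is overlapping-tile processing: split the image into tiles with a margin of context on each side, filter each tile on the GPU, and stitch back only the central region of each result, so the output keeps the original size and resolution. Note this is a generic sketch, not Spitfire's own API: `filter_fn`, `process_in_tiles`, `tile`, and `overlap` are all hypothetical names, and `filter_fn` stands in for whatever call applies the Spitfire filter to a NumPy array. Because Spitfire solves a variational problem over the whole image, tiling is only an approximation; a generous `overlap` reduces, but may not fully remove, seams at tile borders.

```python
import numpy as np

def process_in_tiles(image, filter_fn, tile=512, overlap=32):
    """Apply filter_fn to overlapping tiles and stitch the results.

    The overlap gives each tile context at its borders; only the
    central (non-overlapping) region of each filtered tile is kept,
    so the output has the same shape as the input. `filter_fn` is a
    placeholder for the actual Spitfire call.
    """
    h, w = image.shape[:2]
    out = np.empty_like(image)
    for y0 in range(0, h, tile):
        for x0 in range(0, w, tile):
            # Expand the tile by `overlap` pixels on every side,
            # clamped to the image bounds.
            ys, ye = max(y0 - overlap, 0), min(y0 + tile + overlap, h)
            xs, xe = max(x0 - overlap, 0), min(x0 + tile + overlap, w)
            filtered = filter_fn(image[ys:ye, xs:xe])
            # Copy back only the central region belonging to this tile.
            y1, x1 = min(y0 + tile, h), min(x0 + tile, w)
            out[y0:y1, x0:x1] = filtered[y0 - ys:y1 - ys, x0 - xs:x1 - xs]
    return out
```

As an aside, the `max_split_size_mb:21` setting mentioned above looks like PyTorch's `PYTORCH_CUDA_ALLOC_CONF` syntax; that option only mitigates allocator fragmentation and cannot help when a single tensor is genuinely too large for GPU memory, which is why tiling (or downscaling) is usually needed for very large images.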
