weights_load issue #140

@fredericcervenansky

Description

Hello,

I'm trying to use your code (through your Dockerfile):

bin/C2C spine -i ../data

but I'm having the following problem.

Processing: ../data/1.3.6.1.4.1.14519.5.2.1.3651847619618266745147039566120810943 with 451 slices

Inference pipeline:
(1) DicomToNifti
(2) SpineSegmentation
(3) ToCanonical
(4) SpineComputeROIs
(5) SpineMetricsSaver
(6) SpineCoronalSagittalVisualizer
(7) SpineReport

Starting inference pipeline for:

Running DicomToNifti with input keys odict_keys(['inference_pipeline'])
Finished DicomToNifti with output keys dict_keys([])

Running SpineSegmentation with input keys odict_keys(['inference_pipeline'])
Segmenting spine...

If you use this tool please cite: https://pubs.rsna.org/doi/10.1148/ryai.230024

No GPU detected. Running on CPU. This can be very slow. The '--fast' or the --roi_subset option can help to reduce runtime.
TotalSegmentator sends anonymous usage statistics. If you want to disable it check the documentation.
Downloading model for Task 292 ...
Downloading: 100%|█████████████████████████████████████████████████████████| 234M/234M [00:31<00:00, 7.35MB/s]
Download finished. Extracting...
Resampling...
Resampled in 11.53s
Predicting part 1 of 1 ...
Traceback (most recent call last):
  File "/Comp2Comp/comp2comp/utils/process.py", line 131, in process_3d
    pipeline(output_dir=output_dir, model_dir=model_dir)
  File "/Comp2Comp/comp2comp/inference_pipeline.py", line 73, in __call__
    output = inference_class(inference_pipeline=self, **output)
  File "/Comp2Comp/comp2comp/spine/spine.py", line 51, in __call__
    seg, mv = self.spine_seg(
  File "/Comp2Comp/comp2comp/spine/spine.py", line 141, in spine_seg
    seg = totalsegmentator(
  File "/usr/local/lib/python3.9/site-packages/totalsegmentatorv2/python_api.py", line 293, in totalsegmentator
    seg_img = nnUNet_predict_image(input, output, task_id, model=model, folds=folds,
  File "/usr/local/lib/python3.9/site-packages/totalsegmentatorv2/nnunet.py", line 372, in nnUNet_predict_image
    nnUNetv2_predict(tmp_dir, tmp_dir, tid, model, folds, trainer, tta,
  File "/usr/local/lib/python3.9/site-packages/totalsegmentatorv2/nnunet.py", line 205, in nnUNetv2_predict
    predictor.initialize_from_trained_model_folder(
  File "/usr/local/lib/python3.9/site-packages/nnunetv2/inference/predict_from_raw_data.py", line 86, in initialize_from_trained_model_folder
    checkpoint = torch.load(join(model_training_output_dir, f'fold_{f}', checkpoint_name),
  File "/usr/local/lib/python3.9/site-packages/torch/serialization.py", line 1524, in load
    raise pickle.UnpicklingError(_get_wo_message(str(e))) from None
_pickle.UnpicklingError: Weights only load failed. This file can still be loaded, to do so you have two options, do those steps only if you trust the source of the checkpoint.
(1) In PyTorch 2.6, we changed the default value of the weights_only argument in torch.load from False to True. Re-running torch.load with weights_only set to False will likely succeed, but it can result in arbitrary code execution. Do it only if you got the file from a trusted source.
(2) Alternatively, to load with weights_only=True please check the recommended steps in the following error message.
WeightsUnpickler error: Unsupported global: GLOBAL numpy.core.multiarray.scalar was not an allowed global by default. Please use torch.serialization.add_safe_globals([numpy.core.multiarray.scalar]) or the torch.serialization.safe_globals([numpy.core.multiarray.scalar]) context manager to allowlist this global if you trust this class/function.

Check the documentation of torch.load to learn more about types accepted by default with weights_only https://pytorch.org/docs/stable/generated/torch.load.html.
Exception ignored in: <totalsegmentatorv2.libs.DummyFile object at 0x79b40aaeaf40>
AttributeError: 'DummyFile' object has no attribute 'flush'

What should I do? Is there an additional argument I should pass?
Thanks in advance
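(Editor's note: a possible workaround sketch for option (1), not something the Comp2Comp docs state. PyTorch documents a TORCH_FORCE_NO_WEIGHTS_ONLY_LOAD environment variable that forces torch.load back to weights_only=False process-wide; use it only for checkpoints you trust, such as the TotalSegmentator model zoo here.)

```shell
# Force torch.load back to the pre-2.6 weights_only=False behavior for
# this process tree, then re-run the pipeline unchanged.
export TORCH_FORCE_NO_WEIGHTS_ONLY_LOAD=1

# then re-run, e.g.:
# bin/C2C spine -i ../data
```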
