set_epoch for DistributedSampler #224

@ananyahjha93

Description

Describe the bug
The official PyTorch ImageNet example calls the set_epoch function on the DistributedSampler class before each epoch starts. I could not find this function call in Lightning's Trainer module.

https://github.com/pytorch/examples/blob/master/imagenet/main.py
Lines 232-234

As can be seen from the DistributedSampler source (https://github.com/pytorch/pytorch/blob/master/torch/utils/data/distributed.py), set_epoch sets the seed used by each __iter__ call, so the shuffle order changes from one epoch to the next.

Can you confirm whether this function is called on the DistributedSampler (for the training dataset) at some point in Lightning's Trainer module?
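For context, here is a minimal sketch of the pattern the ImageNet example uses. The num_replicas and rank arguments are passed explicitly here only so the snippet runs without init_process_group; in real DDP training they are inferred from the process group. The dataset and loop below are illustrative, not Lightning code.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler

dataset = TensorDataset(torch.arange(8))
# Explicit num_replicas/rank so this runs standalone (normally inferred from DDP).
sampler = DistributedSampler(dataset, num_replicas=2, rank=0, shuffle=True)
loader = DataLoader(dataset, batch_size=2, sampler=sampler)

for epoch in range(2):
    # Without this call, __iter__ reuses the same seed every epoch,
    # so every epoch sees the samples in the same shuffled order.
    sampler.set_epoch(epoch)
    order = [int(x) for batch in loader for x in batch[0]]
    print(epoch, order)
```

Skipping sampler.set_epoch(epoch) silently repeats the same per-epoch shuffle, which is easy to miss because training still runs.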
