
A multi-GPU bug #136

@DingJuPeng1

Description


If I use batch_bins in ESPnet, it triggers a multi-GPU bug.
For example, if I use two GPUs and the final batch size is 61, data parallel divides it into 31 and 30.
When I then try to use torch-audiomentations, it triggers the error shown below.
[screenshot of the error]
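For context, a minimal sketch (pure Python, no ESPnet or torch-audiomentations specifics) of the arithmetic `torch.nn.DataParallel` uses when scattering a batch across replicas: each chunk gets up to ceil(batch_size / n_gpus) samples, so 61 samples on 2 GPUs become chunks of 31 and 30. `scatter_sizes` is a hypothetical helper written for illustration.

```python
import math

def scatter_sizes(batch_size: int, n_gpus: int) -> list:
    # Mimic DataParallel's scatter: chunks of ceil(batch_size / n_gpus),
    # with the last chunk taking whatever remains.
    chunk = math.ceil(batch_size / n_gpus)
    sizes = []
    remaining = batch_size
    while remaining > 0:
        take = min(chunk, remaining)
        sizes.append(take)
        remaining -= take
    return sizes

print(scatter_sizes(61, 2))  # [31, 30] -- unequal per-GPU batch sizes
print(scatter_sizes(60, 2))  # [30, 30] -- equal chunks
```

This shows why an odd total batch size produces different batch sizes on each card, which is what torch-audiomentations then sees.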

Must the batch size on each card be the same, or is there another way to avoid this bug?
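One possible workaround, assuming the unequal chunks are what triggers the error: trim each batch to a multiple of the GPU count before the forward pass, so every replica receives the same batch size. `trim_to_multiple` below is a hypothetical helper, not part of ESPnet or torch-audiomentations; a plain list stands in for the batch tensor, but the same slice works on a `torch.Tensor`.

```python
def trim_to_multiple(batch, n_gpus: int):
    # Drop trailing samples so the batch size is divisible by n_gpus.
    # Works on anything sliceable along the first dimension:
    # a list here, a torch.Tensor of shape (N, ...) in practice.
    usable = (len(batch) // n_gpus) * n_gpus
    return batch[:usable]

batch = list(range(61))                  # stand-in for a 61-sample batch
trimmed = trim_to_multiple(batch, n_gpus=2)
print(len(trimmed))  # 60 -> splits into two equal chunks of 30
```

The trade-off is that a few samples per step are dropped; padding the batch up to the next multiple (and masking the padded samples) would be the loss-free alternative.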

Looking forward to a reply.
