Soft DTW with ignore_padding_token #515

@anupsingh15

Hello,

I have a batch of sequence pairs. Within each pair the sequences have different lengths, so they are padded to a common length. Is there a way to ignore these padded elements when computing the soft-DTW alignment? For example, https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html provides an `ignore_index` argument to exclude a particular class index from the cross-entropy loss.

Can you suggest a workaround to compute the DTW loss efficiently in this case? The only approach I can think of is to strip the padding from each pair and process the samples individually, but that would be too slow.
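For reference, here is a minimal NumPy sketch of the per-pair workaround described above: a plain soft-DTW recurrence plus a batch wrapper that trims each padded pair to its true length before running the dynamic program. The function names (`soft_dtw`, `batch_soft_dtw`) and the squared-distance cost are my own assumptions for illustration, not part of any library's API; a real implementation would vectorize or run on GPU, which is exactly the efficiency concern raised in the question.

```python
import numpy as np

def softmin(a, b, c, gamma):
    # Smoothed minimum used by soft-DTW:
    # -gamma * log(exp(-a/gamma) + exp(-b/gamma) + exp(-c/gamma)),
    # computed with the usual max-shift for numerical stability.
    vals = np.array([a, b, c]) / -gamma
    m = vals.max()
    return -gamma * (m + np.log(np.exp(vals - m).sum()))

def soft_dtw(x, y, gamma=1.0):
    # Soft-DTW between two 1-D sequences with squared-difference cost
    # (an illustrative choice; any pairwise cost could be substituted).
    n, m = len(x), len(y)
    R = np.full((n + 1, m + 1), np.inf)
    R[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2
            R[i, j] = cost + softmin(R[i - 1, j], R[i, j - 1],
                                     R[i - 1, j - 1], gamma)
    return R[n, m]

def batch_soft_dtw(xs, ys, len_x, len_y, gamma=1.0):
    # Workaround for padded batches: trim each pair to its true
    # lengths before the DP, so padded elements never enter the cost.
    return np.array([soft_dtw(x[:lx], y[:ly], gamma)
                     for x, y, lx, ly in zip(xs, ys, len_x, len_y)])
```

Trimming by true length gives exactly the same value as computing on the unpadded sequences, since the padded positions are simply never visited; the cost is the Python-level loop over the batch, which is the slowness the question is asking to avoid.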
