Concatenate two BPE tokenizers #129

Description

@mackmake

Hello. I saw in a paper that the authors combined two tokenizers: one they trained themselves, and the second was a pretrained one.
Is it possible to do this without having access to their training data?
If yes, how should we order the merges correctly?
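For concreteness, here is a minimal sketch of what I have in mind, assuming both tokenizers are serialized in the Hugging Face tokenizer.json format (the file names and the helper `merge_bpe_tokenizers` are hypothetical, not a library API). Since BPE applies merges in list order, the simplest concatenation I can think of keeps tokenizer A's merge list intact and appends tokenizer B's rules after it:

```python
import json


def as_key(rule):
    # Merge rules are "a b" strings in older tokenizer.json files and
    # ["a", "b"] pairs in newer ones; normalize to a hashable key.
    return tuple(rule) if isinstance(rule, list) else rule


def merge_bpe_tokenizers(path_a, path_b, out_path):
    """Concatenate the BPE vocab and merges of two tokenizer.json
    files, keeping tokenizer A's merge priorities first."""
    with open(path_a, encoding="utf-8") as f:
        tok_a = json.load(f)
    with open(path_b, encoding="utf-8") as f:
        tok_b = json.load(f)

    vocab = dict(tok_a["model"]["vocab"])    # token string -> id
    merges = list(tok_a["model"]["merges"])  # merge rules, in priority order

    # Appending B's rules after A's means A's segmentation wins
    # wherever the two tokenizers disagree; skip duplicate rules.
    seen = {as_key(r) for r in merges}
    for rule in tok_b["model"]["merges"]:
        if as_key(rule) not in seen:
            merges.append(rule)
            seen.add(as_key(rule))

    # Give B-only tokens fresh ids above A's existing range.
    next_id = max(vocab.values()) + 1
    for token in tok_b["model"]["vocab"]:
        if token not in vocab:
            vocab[token] = next_id
            next_id += 1

    tok_a["model"]["vocab"] = vocab
    tok_a["model"]["merges"] = merges
    with open(out_path, "w", encoding="utf-8") as f:
        json.dump(tok_a, f, ensure_ascii=False)


# Hypothetical file names, just to show the call:
merge_bpe_tokenizers("tokenizer_a.json", "tokenizer_b.json", "merged.json")
```

Does appending like this behave correctly, or do the two merge lists need to be interleaved by their original ranks so neither tokenizer dominates?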
