LoRA Adapter Loading Issues in Multi-Stage Model Training #176

@Gbone3176

Description

Thank you for your outstanding work!

I'm new to LoRA fine-tuning and have a question about reproducing your pipeline. My training workflow is MNTP -> SimCSE -> Supervised. After finishing the SimCSE stage, when I start the Supervised training, do I need to load both LoRA adapter checkpoints (from the MNTP and the SimCSE stages) simultaneously? Or is it sufficient to load only the adapter checkpoint produced by the SimCSE training? I'm eagerly awaiting your response, and thank you so much!
