[model] Add bridge support for GLM-4.7-Flash #2972

@jQizhang

Description

Hugging Face repository

https://huggingface.co/zai-org/GLM-4.7-Flash

Architecture family

MoE decoder-only (e.g. DeepSeek V2/V3, OLMoE, Qwen3-MoE, MiniMax-M2)

Required deliverables

  • Model providers
  • HF conversion bridge
  • Unit tests (config and bridge)
  • Model conversion functional tests
  • Optimal pretraining recipe
  • Optimal finetuning recipe
  • Recipe unit tests
  • Recipe functional tests
  • End-to-end CI coverage

Owner if known

No response

Extra context

zai-org/GLM-4.7-Flash uses a new architecture class Glm4MoeLiteForCausalLM (model_type: glm4_moe_lite), which is not covered by the existing GLM45Bridge. Are there any plans to support this model? Thanks.
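To illustrate why the existing bridge misses this model, here is a minimal sketch of a `model_type`-based compatibility check. The config fields mirror what the repository's `config.json` declares, but the supported-type set attributed to GLM45Bridge is an assumption for illustration only:

```python
import json

# Hypothetical excerpt of the model's config.json (fields shown for illustration).
config = json.loads("""
{
  "model_type": "glm4_moe_lite",
  "architectures": ["Glm4MoeLiteForCausalLM"]
}
""")

# Assumed set of model_type values the current GLM45Bridge recognizes;
# the real bridge may key on architecture class names instead.
SUPPORTED_MODEL_TYPES = {"glm4_moe"}

def bridge_supports(cfg: dict) -> bool:
    """Return True if the bridge recognizes this checkpoint's model_type."""
    return cfg.get("model_type") in SUPPORTED_MODEL_TYPES

print(bridge_supports(config))  # → False: glm4_moe_lite needs a new bridge
```

A new bridge (or an extension of GLM45Bridge) would register `glm4_moe_lite` / `Glm4MoeLiteForCausalLM` and map its weights accordingly.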

Metadata

Assignees

No one assigned

    Labels

    area:model — Model implementations and HF bridge logic
    community-request
    feature — New capabilities, enhancements, or enablement work
    needs-author — Author action is required before review or merge can continue

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests