
Conversation


@XianBW XianBW commented Dec 15, 2025

Description

Motivation and Context

How Has This Been Tested?

  • If you are adding a new feature, test it with your own test scripts.

Screenshots of Test Results (if appropriate):

  1. Your own tests:

Types of changes

  • Fix bugs
  • Add new feature
  • Update documentation

📚 Documentation preview 📚: https://RDAgent--1314.org.readthedocs.build/en/1314/

Jensen246 and others added 30 commits November 22, 2025 21:13
* feat: add iterative evolve and evaluation support with partial chain stop

* feat: add FTDataEvaluator and support multiple implement functions in finetune
…1303)

* feat: (1) support multi-layer dataset extraction (2) add category.json for datasets in datasets/

* fix: fix bug in category.json generation

* feat: add get_dataset_folder_desc

* init data proposal and merge qzli/ft

* update data proposal prompts, add max_position_embeddings, and resolve conflicts

* remove sample counts in data proposal

* turn data and train to unified hypo_gen

* refine prompts

* remove category.json and add it to dataset_info

* fix jinja problem and proposal done

* lint

* add ai-generated description and raw readme into dataset_info.json

* update prompt for description

* add datasets

* initial fix for proposal of data

* final version for data proposal

* lint
* refactor(dataset): add stats into dataset_info.json, and remove dataset from gitignore_folder

* feat: enable data coder and run data process
* feat: implement finetune data coding, evaluation, and config improvements

* fix: deepspeed config path

* fix: dataset info columns

---------

Co-authored-by: Young <[email protected]>
@Jensen246 Jensen246 force-pushed the finetune branch 3 times, most recently from 2fca13c to 726ad84 on January 15, 2026 at 15:30

7 participants