Graph Your Own Prompt

Paper PDF | Project Page | arXiv

Xi Ding, Lei Wang, Piotr Koniusz, Yongsheng Gao

Citation

@article{ding2025graph,
  title={Graph Your Own Prompt},
  author={Ding, Xi and Wang, Lei and Koniusz, Piotr and Gao, Yongsheng},
  journal={Advances in Neural Information Processing Systems},
  year={2025}
}

Overview

framework

GCR is a plug-and-play, parameter-free, and lightweight method that works with any model, improving feature quality and generalization without changing the architecture.

Installation

git clone https://github.com/Darcyddx/graph-prompt.git
cd graph-prompt
bash setup.sh

Data Preparation

Before running the experiments, please prepare the datasets as follows:

  1. Download datasets

    • CIFAR-10 and CIFAR-100 are downloaded automatically when you run the training code.
    • Tiny ImageNet can be downloaded from Kaggle.
  2. Organize the data structure

graph-prompt/
└── data/
    ├── CIFAR-10 files (auto-downloaded by torchvision)
    ├── CIFAR-100 files (auto-downloaded by torchvision)
    └── tiny/
        ├── train/
        │   ├── n01443537/
        │   │   └── images/
        │   ├── n01629819/
        │   │   └── images/
        │   └── ... (200 class folders)
        └── val/
            ├── images/
            └── val_annotations.txt

Usage

Training Examples

1. Train on CIFAR-10 with GoogLeNet

python train.py -dataset cifar10 -net googlenet -num_elements 15 \
  -stage_mode early -weight_method linear -use_detach \
  -log train_logs -log_name googlenet.log \
  -best_checkpoint checkpoints -gpu

2. Train on CIFAR-100 with MobileNet

python train.py -dataset cifar100 -net mobilenet -num_elements 6 \
  -stage_mode late -weight_method equal \
  -log train_logs -log_name mobilenet.log \
  -best_checkpoint checkpoints -gpu

3. Train on Tiny ImageNet with MobileViT

python train.py -dataset tiny_imagenet -net mobilevit_xxs -num_elements 6 \
  -stage_mode middle+late -weight_method adaptive -detach_adaptive \
  -log train_logs -log_name mobilevit_xxs.log \
  -best_checkpoint checkpoints -gpu

Note: When switching datasets, make sure to update the num_class of the model's classifier head and the num_elements argument accordingly.
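As an illustration, sizing the classifier head per dataset might look like the following. The TinyNet backbone below is a hypothetical stand-in, not one of the models under ./models; only the idea of passing num_class through to the final linear layer carries over.

```python
import torch
import torch.nn as nn

# Hypothetical toy backbone; the real models live in ./models.
class TinyNet(nn.Module):
    def __init__(self, num_class: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Head sized per dataset: CIFAR-10 -> 10, CIFAR-100 -> 100,
        # Tiny ImageNet -> 200.
        self.classifier = nn.Linear(16, num_class)

    def forward(self, x):
        f = self.features(x).flatten(1)
        return self.classifier(f)

model = TinyNet(num_class=200)  # e.g. Tiny ImageNet
out = model(torch.randn(2, 3, 64, 64))
print(out.shape)  # torch.Size([2, 200])
```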

Evaluation

python eval.py -dataset cifar10 -net mobilenet \
    -weights_path ./checkpoints/cifar10_mobilenet-best.pth \
    -num_elements 6 -batch_size 128 -gpu

Visualization

For t-SNE visualization on the CIFAR-10 dataset:

python tsne.py

Edit the script to point the model path at your trained checkpoint.

Pre-trained Models

Pre-trained models are available at: Google Drive

Applying GCR to Your Own Models

To apply GCR to another model, import gcr.py directly into your model file, then add GCL-wrapped layers inside the forward function, as done in several of the models under ./models.

Integration Steps:

  1. Import the GCR module: from gcr import GCR
  2. Initialize GCR layers in your model's __init__ method
  3. Apply GCR transformations in the forward method at desired stages
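The three steps above can be sketched as follows. This is a minimal, self-contained illustration: the GCR class here is stubbed as an identity module so the sketch runs on its own; in practice you would use `from gcr import GCR` and check gcr.py for the module's actual constructor arguments and behavior.

```python
import torch
import torch.nn as nn

# Stand-in for `from gcr import GCR` (step 1); stubbed as identity so
# this sketch is runnable without the repo. See gcr.py for the real API.
class GCR(nn.Module):
    def forward(self, x):
        return x

class MyModel(nn.Module):
    def __init__(self, num_class: int = 10):
        super().__init__()
        self.stem = nn.Conv2d(3, 32, 3, padding=1)
        self.gcr_early = GCR()  # step 2: initialize GCR layers in __init__
        self.block = nn.Sequential(nn.ReLU(), nn.AdaptiveAvgPool2d(1))
        self.head = nn.Linear(32, num_class)

    def forward(self, x):
        x = self.stem(x)
        x = self.gcr_early(x)  # step 3: apply GCR at a chosen stage
        x = self.block(x).flatten(1)
        return self.head(x)

logits = MyModel()(torch.randn(2, 3, 32, 32))
print(logits.shape)  # torch.Size([2, 10])
```

Since GCR is plug-and-play and parameter-free, the same pattern extends to any stage (early, middle, or late) without changing the rest of the architecture.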

Acknowledgement

We would like to express our gratitude to the authors of pytorch-cifar100 for providing such a valuable resource, and to the contributors of the following great models: MobileNet, ShuffleNet, SqueezeNet, GoogLeNet, ResNeXt, ResNet, DenseNet, Masked Autoencoders, Stochastic ResNet, SE-ResNet, ViT, Swin, MobileViT, CEiT, iFormer, and ViG.

Xi Ding, a visiting scholar at the ARC Research Hub for Driving Farming Productivity and Disease Prevention, Griffith University, conducted this work under the supervision of Lei Wang.

We sincerely thank the anonymous reviewers for their invaluable insights and constructive feedback, which have greatly contributed to improving our work.

This work was supported by the Australian Research Council (ARC) under Industrial Transformation Research Hub Grant IH180100002.

This work was also supported by computational resources provided by the Australian Government through the National Computational Infrastructure (NCI) under both the ANU Merit Allocation Scheme and the CSIRO Allocation Scheme.
