Changing max_seq_length does not update max_length in config.json #97

Open
jackievaleri opened this issue Apr 4, 2023 · 0 comments

jackievaleri commented Apr 4, 2023

I was wondering if this behavior is intended. For instance, when I run run_finetune.py with the following command:

```
python run_finetune.py \
    --model_type dna \
    --tokenizer_name=dna$KMER \
    --model_name_or_path $MODEL_PATH \
    --task_name dnaprom \
    --do_train \
    --data_dir $DATA_PATH \
    --per_gpu_eval_batch_size=32 \
    --per_gpu_train_batch_size=32 \
    --learning_rate 2e-4 \
    --output_dir $OUTPUT_PATH \
    --logging_steps 100 \
    --save_steps 4000 \
    --warmup_percent 0.1 \
    --overwrite_output \
    --weight_decay 0.01 \
    --n_process 8 \
    --max_seq_length 59 \
    --hidden_dropout_prob 0.1 \
    --num_train_epochs 5.0
```

The config.json file still has "max_length": 20. Should I be editing the config.json file prior to finetuning?
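In case it helps clarify what I mean, here is a minimal sketch of how I could inspect and override the value myself before finetuning. This assumes the model directory follows the standard Hugging Face transformers layout; the path and the BertConfig loading call are my assumptions, not something the repo documents.

```python
# Minimal sketch (my own assumption, not from the repo's docs): inspect and
# override max_length in an existing config.json before finetuning.
from transformers import BertConfig

model_path = "path/to/model"  # hypothetical; corresponds to $MODEL_PATH above

# Load the existing config.json from the model directory.
config = BertConfig.from_pretrained(model_path)
print(config.max_length)  # currently prints 20 for me

# Override the value and write the updated config.json back to disk.
config.max_length = 59  # match --max_seq_length
config.save_pretrained(model_path)
```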

Thanks so much for your help!
