From ee5350521ede55e6d64f2fb619e84f99262084b1 Mon Sep 17 00:00:00 2001
From: Junyang Lin
Date: Sun, 8 Oct 2023 10:24:35 +0800
Subject: [PATCH] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index ccb717a..7144841 100644
--- a/README.md
+++ b/README.md
@@ -429,7 +429,7 @@ model = AutoPeftModelForCausalLM.from_pretrained(
 ).eval()
 ```

-If you want to merge the adapters and save the finetuned model as a standalone model, you can run the following codes:
+If you want to merge the adapters and save the finetuned model as a standalone model (you can only do this with LoRA, and you CANNOT merge the parameters from Q-LoRA), you can run the following codes:

 ```python
 from peft import AutoPeftModelForCausalLM