From 8effcb2a76402a0e9caa94e6bd8ec8b68cac89f6 Mon Sep 17 00:00:00 2001
From: Junyang Lin
Date: Fri, 20 Oct 2023 22:04:43 +0800
Subject: [PATCH] Update README_CN.md

---
 README_CN.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/README_CN.md b/README_CN.md
index 6f98599..a38db3a 100644
--- a/README_CN.md
+++ b/README_CN.md
@@ -667,7 +667,8 @@ merged_model.save_pretrained(new_model_directory, max_shard_size="2048MB", safe_
 ### vLLM
 For deployment and fast inference, we recommend using vLLM together with FastChat. First install the required packages:
 ```bash
-pip install vllm fastchat
+pip install vllm
+pip install "fschat[model_worker,webui]"
 ```
 You can also install them from source with `git clone` and `pip install -e .`. If you run into installation problems, please read their official documentation.
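The context line at the end of the hunk mentions installing from source via `git clone` and `pip install -e .`. The sketch below spells that out for both projects; it assumes the upstream GitHub repositories `vllm-project/vllm` and `lm-sys/FastChat` and mirrors the extras used in the patched pip command, so check each project's official documentation for the authoritative steps.

```bash
# Minimal source-install sketch (assumed repository URLs; verify against the official docs).

# vLLM from source
git clone https://github.com/vllm-project/vllm.git
cd vllm
pip install -e .        # builds and installs vLLM in editable mode
cd ..

# FastChat from source, with the model worker and web UI extras
git clone https://github.com/lm-sys/FastChat.git
cd FastChat
pip install -e ".[model_worker,webui]"
```

Installing FastChat with the `model_worker` and `webui` extras matches the patched `pip install "fschat[model_worker,webui]"` line, which pulls in the dependencies needed to serve a model worker and the Gradio web UI rather than only the core package.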