update readme

Branch: main
Author: yangapku
Parent: 32ede85ee2
Commit: cb2c174450

@@ -75,8 +75,8 @@ If your device supports fp16 or bf16, we recommend installing [flash-attention](
 git clone -b v1.0.8 https://github.com/Dao-AILab/flash-attention
 cd flash-attention && pip install .
 # Below are optional. Installing them might be slow.
-pip install csrc/layer_norm
-pip install csrc/rotary
+# pip install csrc/layer_norm
+# pip install csrc/rotary
 ```
 Now you can start with ModelScope or Transformers.
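
For context, a minimal Transformers usage sketch (not part of this commit; it assumes the Qwen/Qwen-7B-Chat checkpoint, whose remote code supplies the chat() helper):

```python
# Minimal sketch, not part of this commit. Assumes the Qwen/Qwen-7B-Chat
# checkpoint; model.chat() is provided by the model's remote code.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen-7B-Chat", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen-7B-Chat", device_map="auto", trust_remote_code=True
).eval()

response, history = model.chat(tokenizer, "Hello!", history=None)
print(response)
```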

@@ -75,8 +75,8 @@ pip install -r requirements.txt
 git clone -b v1.0.8 https://github.com/Dao-AILab/flash-attention
 cd flash-attention && pip install .
 # The installs below are optional and may be slow.
-pip install csrc/layer_norm
-pip install csrc/rotary
+# pip install csrc/layer_norm
+# pip install csrc/rotary
 ```
 Next, you can start using our models with Transformers or ModelScope.

@@ -79,8 +79,8 @@ pip install -r requirements.txt
 git clone -b v1.0.8 https://github.com/Dao-AILab/flash-attention
 cd flash-attention && pip install .
 # The following are optional. Installation may take some time.
-pip install csrc/layer_norm
-pip install csrc/rotary
+# pip install csrc/layer_norm
+# pip install csrc/rotary
 ```
 Now you can get started with ModelScope or Transformers.
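
The ModelScope route looks much the same; here is a hedged sketch (assuming the qwen/Qwen-7B-Chat model id on ModelScope, which re-exports the Transformers-style Auto* classes):

```python
# Minimal ModelScope sketch, not part of this commit. Assumes the
# qwen/Qwen-7B-Chat model id; ModelScope mirrors the Transformers Auto* API.
from modelscope import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("qwen/Qwen-7B-Chat", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    "qwen/Qwen-7B-Chat", device_map="auto", trust_remote_code=True
).eval()

response, history = model.chat(tokenizer, "Hello!", history=None)
print(response)
```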

@@ -161,4 +161,4 @@ with gr.Blocks() as demo:
 if len(sys.argv) > 1:
     demo.queue().launch(**vars(args))
 else:
-    demo.queue().launch()
+    demo.queue().launch()
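
The changed lines sit in the demo's launch logic: vars(args) converts the parsed argparse Namespace into a dict, so every CLI flag is forwarded to launch() as a keyword argument. A self-contained sketch of that pattern (the flag names are illustrative assumptions, not necessarily web_demo.py's actual arguments):

```python
# Sketch of the **vars(args) launch pattern; --server-name/--server-port are
# assumed flag names for illustration, not necessarily web_demo.py's own.
import argparse
import sys

import gradio as gr

parser = argparse.ArgumentParser()
parser.add_argument("--server-name", type=str, default="127.0.0.1")
parser.add_argument("--server-port", type=int, default=7860)
args = parser.parse_args()

with gr.Blocks() as demo:
    gr.Markdown("Hello from a minimal demo")

if len(sys.argv) > 1:
    # vars(args) yields {"server_name": ..., "server_port": ...}, so each
    # flag becomes a keyword argument to launch().
    demo.queue().launch(**vars(args))
else:
    demo.queue().launch()
```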