Update README.md

Junyang Lin committed 2 years ago (commit 12e4c8bda5)
@@ -102,10 +102,8 @@ pip install -r requirements.txt
If your device supports fp16 or bf16, we recommend installing [flash-attention](https://github.com/Dao-AILab/flash-attention) (**flash attention 2 is now supported**) for higher efficiency and lower memory usage. (**flash-attention is optional, and the project runs normally without it.**)
```bash
# Previous installation commands. Now flash attention 2 is supported.
# git clone -b v1.0.8 https://github.com/Dao-AILab/flash-attention
# cd flash-attention && pip install .
pip install flash-attn --no-build-isolation
# Below are optional. Installing them might be slow.
# pip install csrc/layer_norm
# pip install csrc/rotary
```
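Because flash-attention is optional, code that loads the model typically probes for the package at startup and falls back to standard attention when it is missing. A minimal sketch of such a check (the helper name is illustrative, not part of this repo):

```python
import importlib.util

def flash_attn_available() -> bool:
    # Returns True if the optional flash_attn package is importable,
    # without actually importing it (find_spec only inspects metadata).
    return importlib.util.find_spec("flash_attn") is not None

# Example: choose the attention implementation accordingly.
use_flash_attn = flash_attn_available()
```

Probing with `find_spec` avoids the import cost (and any CUDA initialization) when you only need to know whether the package is installed.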
