Update README.md

Branch: main
Junyang Lin committed via GitHub
parent 05577dea69
commit 581512f6b5

@@ -99,11 +99,13 @@ Before running the code, make sure you have setup the environment and installed
 pip install -r requirements.txt
 ```
-If your device supports fp16 or bf16, we recommend installing [flash-attention](https://github.com/Dao-AILab/flash-attention) for higher efficiency and lower memory usage. (**flash-attention is optional and the project can run normally without installing it**)
+If your device supports fp16 or bf16, we recommend installing [flash-attention](https://github.com/Dao-AILab/flash-attention) (**we support flash attention 2 now.**) for higher efficiency and lower memory usage. (**flash-attention is optional and the project can run normally without installing it**)
 ```bash
-git clone -b v1.0.8 https://github.com/Dao-AILab/flash-attention
-cd flash-attention && pip install .
+# Previous installation commands. Now Flash Attention 2 is supported.
+# git clone -b v1.0.8 https://github.com/Dao-AILab/flash-attention
+# cd flash-attention && pip install .
+pip install flash-attn --no-build-isolation
 # Below are optional. Installing them might be slow.
 # pip install csrc/layer_norm
 # pip install csrc/rotary
 ```

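Not part of the commit itself, but since the changed paragraph hinges on "your device supports fp16 or bf16" and on flash-attn being installed, here is a minimal Python sketch to check both conditions. It assumes PyTorch is already installed via requirements.txt; the printed messages are illustrative, not the project's own output.

```python
# Optional sanity check (not from the original README): confirm the device
# supports fp16/bf16 and that flash-attn is importable after installation.
import torch

if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability()
    print(f"GPU compute capability: {major}.{minor}")
    # FlashAttention 2 generally targets Ampere (sm_80) or newer GPUs.
    print("bf16 supported:", torch.cuda.is_bf16_supported())
else:
    print("No CUDA device detected; flash-attention will not be used.")

try:
    import flash_attn
    print("flash-attn version:", flash_attn.__version__)
except ImportError:
    print("flash-attn not installed; the project still runs without it.")
```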