diff --git a/README.md b/README.md
index 5441957..85f2b5e 100644
--- a/README.md
+++ b/README.md
@@ -62,7 +62,7 @@ Before running the code, make sure you have setup the environment and installed
 pip install transformers==4.31.0 accelerate tiktoken einops
 ```
 
-We recommend installing [flash-attention](https://github.com/Dao-AILab/flash-attention) for higher efficiency and lower memory usage. (**flash-attention is optional and the project can run normally without installing it**)
+If your device supports fp16 or bf16, we recommend installing [flash-attention](https://github.com/Dao-AILab/flash-attention) for higher efficiency and lower memory usage. (**flash-attention is optional and the project can run normally without installing it**)
 
 ```bash
 git clone -b v1.0.8 https://github.com/Dao-AILab/flash-attention