diff --git a/README.md b/README.md
index 2ef63c9..9450d29 100644
--- a/README.md
+++ b/README.md
@@ -63,7 +63,7 @@ Before running the code, make sure you have setup the environment and installed
 pip install transformers==4.31.0 accelerate tiktoken einops
 ```
 
-We recommend installing `flash-attention` for higher efficiency and lower memory usage.
+We recommend installing [flash-attention](https://github.com/Dao-AILab/flash-attention) for higher efficiency and lower memory usage.
 
 ```bash
 git clone -b v1.0.8 https://github.com/Dao-AILab/flash-attention