From 34389ecdd6d56db28dceaa9904e397a6583f8af0 Mon Sep 17 00:00:00 2001
From: Wang Peng <36780733+logicwong@users.noreply.github.com>
Date: Fri, 4 Aug 2023 17:08:52 +0800
Subject: [PATCH] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index e3b978e..5441957 100644
--- a/README.md
+++ b/README.md
@@ -62,7 +62,7 @@ Before running the code, make sure you have setup the environment and installed
 pip install transformers==4.31.0 accelerate tiktoken einops
 ```
-We recommend installing [flash-attention](https://github.com/Dao-AILab/flash-attention) for higher efficiency and lower memory usage.
+We recommend installing [flash-attention](https://github.com/Dao-AILab/flash-attention) for higher efficiency and lower memory usage. (**flash-attention is optional and the project can run normally without installing it**)
 ```bash
 git clone -b v1.0.8 https://github.com/Dao-AILab/flash-attention
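The note added by this patch says flash-attention is optional. In practice, Python projects usually make a dependency like this optional via a guarded import: try the fast path, and fall back to the standard attention implementation if the package is absent. A minimal sketch of that pattern (illustrative only; the flag name `HAS_FLASH_ATTN` is hypothetical and not taken from the patched repository):

```python
# Guarded import: use flash-attention when available, otherwise fall back.
try:
    import flash_attn  # noqa: F401  # optional: higher efficiency, lower memory
    HAS_FLASH_ATTN = True
except ImportError:
    HAS_FLASH_ATTN = False  # standard attention path; nothing else required

print("flash-attn available:", HAS_FLASH_ATTN)
```

This runs whether or not flash-attention is installed, which is exactly the behavior the patch's parenthetical promises.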