Add USE_FLASH_ATTENTION flag to setup.py (#92903)

# Summary
Adds documentation to setup.py noting that setting USE_FLASH_ATTENTION=0 disables building flash attention, which decreases build times.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/92903
Approved by: https://github.com/cpuhrsch, https://github.com/bdhirsh
This commit is contained in:
Driss Guessous
2023-01-24 22:59:47 +00:00
committed by PyTorch MergeBot
parent bf1ff4918f
commit 4bc0491752


@@ -95,6 +95,9 @@
# USE_FFMPEG
# enables use of ffmpeg for additional operators
#
# USE_FLASH_ATTENTION=0
# disables building flash attention for scaled dot product attention
#
# USE_LEVELDB
# enables use of LevelDB for storage
#
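A minimal sketch of how the flag would be used, assuming a PyTorch source checkout. The actual build command is commented out here since a full source build is lengthy; the variable is simply an environment flag read by setup.py.

```shell
# Disable building flash attention to shorten build times.
export USE_FLASH_ATTENTION=0
echo "USE_FLASH_ATTENTION=$USE_FLASH_ATTENTION"
# python setup.py develop   # build PyTorch with flash attention disabled
```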