huggingface_hub
einops
sentence-transformers
torch
transformers
openai
python-dotenv
chromadb
langchain-community
langchain-chroma
unstructured[all-docs]
libmagic
gradio
flash-attn==2.6.3 # FlashAttention kernels (requires CUDA and an existing torch install to build)
numpy<2 # Pin below 2.0 to avoid conflicts with NumPy 2.0.1
pybind11>=2.12 # Minimum version needed for NumPy compatibility in packages that build pybind11 extensions
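# Note (assumption): the commented-out entries below are system libraries used by
# unstructured[all-docs] (poppler for PDF rendering, tesseract for OCR, libxml2/libxslt
# for lxml-based parsing). They are not installed by pip; install them via the OS
# package manager (e.g. apt) or a Hugging Face Spaces packages.txt.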
# poppler
# tesseract
# libxml2
# libxslt
# git+https://github.com/xlang-ai/instructor-embedding.git@4721e7375afeb8fcb32400a13057f9348bb69392
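# Install note (assumption): flash-attn compiles against the already-installed torch,
# so on a fresh environment it may need a second pass after torch is present, e.g.
#   pip install flash-attn==2.6.3 --no-build-isolation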