jinaai / jina-bert-flash-implementation
Transformers · bert · custom_code · Inference Endpoints · 🇪🇺 Region: EU

Commit History: configuration_bert.py (at revision d9a681b)
removed num_tasks from config · 6546b2c · Markus28 · committed on Mar 20
feat: make num of loras part of the config · a416a9d · Markus28 · committed on Mar 18
added classifier dropout · 767b681 · Markus28 · committed on Mar 5
feat: moved flash attention code into this repository · 46df05d · Markus28 · committed on Mar 5
feat: added encode method · 32458be · Markus28 · committed on Mar 1
feat: added option for QK normalization · 463061d · Markus28 · committed on Mar 1
feat: implement task type embeddings (#1) · 8adf551 (verified) · Markus28 · committed on Mar 1
feat: added back option not to use flash attention · d4d5621 · Markus28 · committed on Mar 1
Added additional config options · 5b58f09 · Markus28 · committed on Feb 27
changed model_type · c35343d · Markus28 · committed on Feb 23
feat: added dense_seq_output to config · 75a4e4d · Markus28 · committed on Feb 22
feat: changed model_type · eeb05a3 · Markus28 · committed on Feb 22
feat: reverted monkey patch · 3160695 · Markus28 · committed on Feb 22
Revert "feat: added back option to disable flash attention" · b7ee9c4 · Markus28 · committed on Feb 21
feat: added back option to disable flash attention · a2c07ba · Markus28 · committed on Feb 21
initial commit · 87b642a · Markus28 · committed on Feb 21
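
The commits above add several options to this custom configuration (toggling flash attention, QK normalization, number of LoRAs, classifier dropout, dense_seq_output). Below is a minimal sketch of how a config from a custom_code repository like this one is typically loaded with Transformers, assuming the repository (or a model repo that points at this implementation) ships a config.json; the attribute names checked in the loop are inferred from the commit messages and may not match the actual field names in configuration_bert.py.

```python
from transformers import AutoConfig

# trust_remote_code=True lets Transformers run the configuration_bert.py
# shipped with the repository instead of the built-in BertConfig.
config = AutoConfig.from_pretrained(
    "jinaai/jina-bert-flash-implementation",
    trust_remote_code=True,
)

# Inspect options added over the commit history.
# NOTE: these attribute names are assumptions inferred from the commit
# messages above, not confirmed against the actual config class.
for attr in ("use_flash_attn", "qk_normalization", "num_loras",
             "classifier_dropout", "dense_seq_output"):
    print(attr, "=", getattr(config, attr, "<not present>"))
```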