Model card for JOxIE_2x7B_base
This is the macadeliccc/dolphin-mixtral-2x7b model with a customized tokenizer: custom special tokens and a few normal tokens have been added to the base model.
[Original Model](https://huggingface.co/macadeliccc/dolphin-mixtral-2x7b)
Newly added special tokens:
- `<|functions|>`
- `<|gökdeniz|>`
- `<|user|>`
- `<|josie|>`
- `<|assistant|>`
- `<|function_call|>`
- `<|function_response|>`
- `<|image|>`
- `<|long_term_memory|>`
- `<|short_term_memory|>`
- `<|home_state|>`
- `<|current_states|>`
- `<|context|>`
New BOS and EOS tokens:
- BOS = `<|startoftext|>`
- EOS = `<|endoftext|>`
Newly added normal (non-special) tokens:
['Gökdeniz Gülmez', 'Gökdeniz', 'Gülmez', 'JOSIE', 'J.O.S.I.E.', 'Josie', 'josie', 'Just an Outstandingly Smart and Intelligent Entity']
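For reference, the token lists above can be registered on a tokenizer with the standard `transformers` API (`add_special_tokens`, `add_tokens`, `resize_token_embeddings`). The sketch below is illustrative, not the exact script used to build this model; the model download is kept behind a main guard because it is large.

```python
# Token lists from this model card.
SPECIAL_TOKENS = [
    "<|functions|>", "<|gökdeniz|>", "<|user|>", "<|josie|>",
    "<|assistant|>", "<|function_call|>", "<|function_response|>",
    "<|image|>", "<|long_term_memory|>", "<|short_term_memory|>",
    "<|home_state|>", "<|current_states|>", "<|context|>",
]
NORMAL_TOKENS = [
    "Gökdeniz Gülmez", "Gökdeniz", "Gülmez", "JOSIE",
    "J.O.S.I.E.", "Josie", "josie",
    "Just an Outstandingly Smart and Intelligent Entity",
]
BOS, EOS = "<|startoftext|>", "<|endoftext|>"


def extend_tokenizer(tokenizer):
    """Add this card's custom tokens to a Hugging Face tokenizer in place."""
    tokenizer.add_special_tokens({
        "bos_token": BOS,
        "eos_token": EOS,
        "additional_special_tokens": SPECIAL_TOKENS,
    })
    tokenizer.add_tokens(NORMAL_TOKENS)  # added as regular, non-special tokens
    return tokenizer


if __name__ == "__main__":
    # Requires `transformers` and downloads the base model's files.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    base = "macadeliccc/dolphin-mixtral-2x7b"
    tok = extend_tokenizer(AutoTokenizer.from_pretrained(base))
    model = AutoModelForCausalLM.from_pretrained(base)
    # The embedding matrix must grow to cover the new vocabulary entries.
    model.resize_token_embeddings(len(tok))
```

After `resize_token_embeddings`, the new rows of the embedding matrix are freshly initialized, so the new tokens only become meaningful after further fine-tuning.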