{
"<|content|>": 32002,
"<|from|>": 32000,
"<|recipient|>": 32001,
"<|stop|>": 32003
}