ONNX export of [roneneldan/TinyStories-1M](https://huggingface.co/roneneldan/TinyStories-1M), for use with [Transformers.js](https://huggingface.co/docs/transformers.js).
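To run the usage snippet in Node.js, the Transformers.js package needs to be installed first. A minimal setup, assuming the current `@huggingface/transformers` package name (older releases of the library were published as `@xenova/transformers`):

```shell
# Install Transformers.js into the current project
npm install @huggingface/transformers
```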
```js
// Older releases of Transformers.js were published as "@xenova/transformers"
import { pipeline } from "@huggingface/transformers";

// Load a text-generation pipeline backed by this ONNX model
const pipe = await pipeline(
  "text-generation",
  "mkly/TinyStories-1M-ONNX",
);

const response = await pipe(
  "Some example text",
  {
    max_new_tokens: 500,
    // Sampling must be enabled for temperature to take effect
    do_sample: true,
    temperature: 0.9,
  },
);

console.log(response[0].generated_text);
```