Status on inference?
#2 by ZQ-Dev - opened
I see the tag on the model card, but everything required for inference with transformers seems to be there, file-wise. Can anyone share the current status?
I've done what I can to mimic a proper Transformers-supported model, which is why everything looks to be there file-wise, even though inference isn't actually supported yet.
I've had limited time to dedicate to writing and testing a working Transformers modeling architecture, but I think I should be done by this weekend. Someone else may beat me to it, though, which I hope will be the case! ( @bjoernp ?)
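In the meantime, here's a rough sketch of what loading should look like once the modeling code lands, assuming it gets exposed as custom code on the Hub via `trust_remote_code` rather than a built-in transformers class. The repo id below is just a placeholder, not the actual model.

```python
# Sketch only: assumes the custom modeling code ships alongside the weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "org/model-name"  # placeholder, not the actual repo id

# trust_remote_code=True lets transformers pick up the modeling file stored
# in the repo instead of requiring a built-in architecture.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```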