- In-Context Pretraining: Language Modeling Beyond Document Boundaries
  Paper • 2310.10638 • Published • 28
- Magicoder: Source Code Is All You Need
  Paper • 2312.02120 • Published • 79
- Parameter Efficient Tuning Allows Scalable Personalization of LLMs for Text Entry: A Case Study on Abbreviation Expansion
  Paper • 2312.14327 • Published • 6
- WaveCoder: Widespread And Versatile Enhanced Instruction Tuning with Refined Data Generation
  Paper • 2312.14187 • Published • 49