Responsibly building AI also means knowing its impact on the environment and the hidden carbon costs associated with it 🌱 If you're interested in the subject, you can check out my latest community article: https://huggingface.co/blog/as-cle-bert/is-ai-carbon-footprint-worrisome, where I try to unravel AI's carbon footprint and potential solutions to reduce it 💻 Enjoy! 🤗
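And if you want to start measuring your own footprint, here's a minimal sketch using the open-source codecarbon library (the training function is just a placeholder for your own code):

```python
# Minimal sketch: estimating the CO2 emissions of a training run with codecarbon.
from codecarbon import EmissionsTracker

def train_my_model():
    # Placeholder (assumption): your actual training loop goes here.
    for _ in range(1_000_000):
        pass

tracker = EmissionsTracker(project_name="my-training-run")
tracker.start()
try:
    train_my_model()
finally:
    emissions_kg = tracker.stop()  # estimated emissions in kg of CO2-equivalent

print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```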
I have finished writing a blog post about building an image-based retrieval system. This is one of the first-ever approaches to building such a pipeline using only open-source models/libraries 🤗
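Here's a minimal sketch of the core idea, not the full pipeline from the post: embedding images with an open-source CLIP model via sentence-transformers and retrieving by similarity (the image file names are placeholders):

```python
# Minimal image-to-image retrieval sketch using only open-source tools.
from PIL import Image
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("clip-ViT-B-32")  # open-source CLIP checkpoint

# Hypothetical corpus of images to search over.
corpus_paths = ["cat.jpg", "dog.jpg", "car.jpg"]
corpus_emb = model.encode([Image.open(p) for p in corpus_paths], convert_to_tensor=True)

# Embed the query image and retrieve the most similar corpus images.
query_emb = model.encode([Image.open("query.jpg")], convert_to_tensor=True)
hits = util.semantic_search(query_emb, corpus_emb, top_k=3)[0]
for hit in hits:
    print(corpus_paths[hit["corpus_id"]], f"score={hit['score']:.3f}")
```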
Hi everyone! I'm Alex, I'm 16, and I've been doing an internship at Hugging Face for a little over a week. I've already learned a lot about using and prompting LLMs. With @victor as my tutor, I've just finished a Space that analyzes your feelings by prompting an LLM chat model. The aim is to extend it so that it can categorize Hugging Face posts.
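Here's roughly what the prompting part looks like, a simplified sketch rather than the actual Space (the model choice and prompt wording are just examples):

```python
# Simplified sketch: classifying the sentiment of a text by prompting a chat LLM.
from huggingface_hub import InferenceClient

# Model choice is illustrative; any open chat model on the Hub would work.
client = InferenceClient("HuggingFaceH4/zephyr-7b-beta")

def analyze_feelings(text: str) -> str:
    response = client.chat_completion(
        messages=[
            {"role": "system", "content": "Classify the sentiment of the user's "
             "message as positive, negative, or neutral. Answer with one word."},
            {"role": "user", "content": text},
        ],
        max_tokens=5,
    )
    return response.choices[0].message.content.strip()

print(analyze_feelings("I just finished my first Space and I'm so proud!"))
```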
Thrilled to introduce Adam-mini, an optimizer that achieves on-par or better performance than AdamW with a 45% to 50% smaller memory footprint. Adam-mini can also achieve 49.5% higher throughput than AdamW on Llama2-7B pre-training.
The design of Adam-mini is inspired by certain Hessian structures we observed in Transformers.
Feel free to try it out! Switch to Adam-mini with the same hyperparameters as AdamW and it will work with only half the memory. Hope Adam-mini can help save time, cost, and energy in your tasks!
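A sketch of what the swap looks like in practice (the constructor arguments below follow the Adam-mini README as I understand it; for Transformers you may also need to pass model-shape arguments such as dim and n_heads, so please check the repo for the exact, current API):

```python
import torch
from adam_mini import Adam_mini  # pip install adam-mini

model = torch.nn.Linear(1024, 1024)  # stand-in for your Transformer

# Before: torch.optim.AdamW(model.parameters(), lr=1e-4, betas=(0.9, 0.95), weight_decay=0.1)
# After: same hyperparameters, roughly half the optimizer memory.
optimizer = Adam_mini(
    named_parameters=model.named_parameters(),  # assumption: argument name per the README
    lr=1e-4,
    betas=(0.9, 0.95),
    weight_decay=0.1,
)

# The training loop itself is unchanged.
x = torch.randn(8, 1024)
loss = model(x).pow(2).mean()
loss.backward()
optimizer.step()
optimizer.zero_grad()
```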
We have just uploaded a new version of ALERT 🚨 to arXiv, with novel insights into the weaknesses and vulnerabilities of LLMs! https://arxiv.org/abs/2404.08676