davanstrien posted an update Apr 5
TIL: since Text Generation Inference supports the Messages API, which is compatible with the OpenAI Chat Completions API, you can trace calls made to Inference Endpoints using Langfuse's OpenAI integration.

A Hugging Face Pro subscription includes access to many of the models you'll want to test when developing an app (https://huggingface.co/blog/inference-pro). Using these endpoints and tracing your generations during development is an excellent way for the GPU-poor to bootstrap an initial dataset quickly while prototyping. A minimal sketch of the setup follows below.
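
Here's a minimal sketch of what this looks like in practice: swapping in Langfuse's drop-in OpenAI client and pointing it at a TGI endpoint's Messages API. The endpoint URL and environment variable names are placeholders for your own setup, and it assumes Langfuse credentials (LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY) are already set in the environment.

```python
import os

# Langfuse's drop-in replacement for the OpenAI client traces every call automatically.
from langfuse.openai import OpenAI

client = OpenAI(
    # Hypothetical dedicated TGI Inference Endpoint; replace with your own URL.
    base_url="https://YOUR-ENDPOINT.endpoints.huggingface.cloud/v1/",
    api_key=os.environ["HF_TOKEN"],  # Hugging Face access token
)

# With a dedicated TGI endpoint the model is fixed server-side; "tgi" is the conventional name.
response = client.chat.completions.create(
    model="tgi",
    messages=[{"role": "user", "content": "Write a haiku about tracing LLM calls."}],
    max_tokens=128,
)
print(response.choices[0].message.content)
```

Because the Langfuse client is API-compatible with the regular OpenAI one, the only change from an untraced setup is the import: every chat completion then shows up as a trace in your Langfuse project, ready to be reviewed or exported as a starter dataset.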