Static and dynamic facial emotion recognition using the Emo-AffectNet model


This is the Emo-AffectNet model for facial emotion recognition from videos and images.
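
As a rough illustration of static (single-image) inference, here is a minimal sketch. The checkpoint file name, TorchScript loading, input resolution, normalization constants, and emotion label order are all assumptions made for illustration; consult the GitHub repository for the actual preprocessing pipeline.

```python
# Hedged sketch of single-image emotion recognition. The checkpoint name,
# TorchScript loading, 224x224 input size, ImageNet normalization, and the
# seven-class label order below are assumptions, not confirmed by this card.
import torch
from torchvision import transforms
from PIL import Image

EMOTIONS = ["Neutral", "Happiness", "Sadness", "Surprise",
            "Fear", "Disgust", "Anger"]  # assumed label order

# Assumed file name; use the checkpoint actually shipped with the model.
model = torch.jit.load("FER_static_ResNet50_AffectNet.pt", map_location="cpu")
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# The model classifies a face region, so pass in a pre-cropped face image.
face = Image.open("face_crop.jpg").convert("RGB")
with torch.no_grad():
    probs = torch.softmax(model(preprocess(face).unsqueeze(0)), dim=1)[0]

print(EMOTIONS[int(probs.argmax())], float(probs.max()))
```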

To see emotions detected from your webcam, run `run_webcam`. Example webcam result:

[Animated GIF showing webcam emotion recognition results]
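
A webcam demo of this kind typically captures frames, detects faces, classifies each face crop, and overlays the predicted label. The sketch below is an assumption-heavy illustration, not the actual `run_webcam` script: the Haar cascade face detector, the checkpoint name, and the label set are all illustrative choices.

```python
# Hedged sketch of a webcam loop similar in spirit to run_webcam. The Haar
# cascade detector, checkpoint name, and label set are illustrative assumptions.
import cv2
import torch
from torchvision import transforms
from PIL import Image

EMOTIONS = ["Neutral", "Happiness", "Sadness", "Surprise",
            "Fear", "Disgust", "Anger"]                      # assumed label order
model = torch.jit.load("FER_static_ResNet50_AffectNet.pt",   # assumed file name
                       map_location="cpu").eval()
preprocess = transforms.Compose([
    transforms.Resize((224, 224)), transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)                                    # default webcam

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
        crop = Image.fromarray(
            cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2RGB))
        with torch.no_grad():
            probs = torch.softmax(model(preprocess(crop).unsqueeze(0)), dim=1)[0]
        label = EMOTIONS[int(probs.argmax())]
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)
    cv2.imshow("Emo-AffectNet webcam demo", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):                    # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()
```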

For more information, see the project's GitHub repository.

Citation

If you use the Emo-AffectNet model in your research, please consider citing the research paper. Here is an example BibTeX entry:

```bibtex
@article{RYUMINA2022,
  title        = {In Search of a Robust Facial Expressions Recognition Model: A Large-Scale Visual Cross-Corpus Study},
  author       = {Elena Ryumina and Denis Dresvyanskiy and Alexey Karpov},
  journal      = {Neurocomputing},
  year         = {2022},
  doi          = {10.1016/j.neucom.2022.10.013},
  url          = {https://www.sciencedirect.com/science/article/pii/S0925231222012656},
}
```