The aim of this project is to measure tester satisfaction while testing food products.
The system applies deep learning to measure and assess facial emotions while eating. MobileNetV2 was chosen for detecting eating behavior, with an accuracy of 95.7%, and VGG16 was chosen for recognizing facial expressions, with an accuracy of 92.9%.
Satisfaction is divided into nine types, enabling food producers to assess more precisely how satisfied testers are with the food or product.
- The Eating Behavior Detection dataset was collected by our team from 14 university students.
- The Facial Expression Recognition model is trained and tested on a combination of the CK+ and JAFFE datasets.
- opencv
- numpy
- keras
- sklearn
- mlxtend
- matplotlib
- plotly
- pandas
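These can be installed with pip (assuming the standard PyPI package names, e.g. opencv-python for opencv and scikit-learn for sklearn):

pip install opencv-python numpy keras scikit-learn mlxtend matplotlib plotly pandas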
Pattern | Symbol | Meaning
---|---|---
Positive | + + | Easy to chew, tastes good, and feeling satisfied after eating
Neutral | N N | Easy to chew and feeling normal after eating
Negative | - - | Difficult to chew at first and feeling less satisfied after eating
Positive Neutral | + N | Easy to chew, tastes good, and feeling normal after eating
Positive Negative | + - | Easy to chew, tastes good, and feeling less satisfied after eating
Neutral Positive | N + | Easy to chew and feeling satisfied after eating
Neutral Negative | N - | Easy to chew and feeling less satisfied after eating
Negative Positive | - + | Difficult to chew at first and feeling satisfied after eating
Negative Neutral | - N | Difficult to chew at first and feeling normal after eating
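As a rough illustration (not the project's actual code; the function and label names below are assumptions), the two model outputs can be combined into one of the nine patterns like this:

```python
# Hypothetical sketch (not the project's actual code): combining the
# eating behavior label (while chewing) and the facial expression label
# (after eating) into one of the nine patterns from the table above.
SYMBOLS = {"positive": "+", "neutral": "N", "negative": "-"}

def satisfaction_pattern(chewing_label, expression_label):
    """Return a two-symbol pattern, e.g. ('positive', 'neutral') -> '+ N'."""
    return f"{SYMBOLS[chewing_label]} {SYMBOLS[expression_label]}"

# Example: easy to chew and tastes good, but feeling less satisfied after eating.
print(satisfaction_pattern("positive", "negative"))  # prints "+ -"
```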
After installing the libraries, copy the downloaded model folders to the ./models directory.
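To verify the models load correctly from ./models, you can use a minimal check like the sketch below (the file names are assumptions; use the names of the folders/files you downloaded):

```python
# Minimal sketch: check that the downloaded models load from ./models.
# The file names are assumptions; replace them with the actual downloaded names.
from keras.models import load_model

behavior_model = load_model("./models/mobilenetv2_eating_behavior.h5")    # eating behavior detector
expression_model = load_model("./models/vgg16_facial_expression.h5")      # facial expression recognizer
behavior_model.summary()
expression_model.summary()
```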
To run the program, use the command below.
python main.py --video {VIDEO_PATH}
The program will display a satisfaction graph of the video after it has been processed.
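For reference, a graph of this kind could be drawn from per-frame predictions roughly as in the sketch below (the score values are placeholders, not the program's actual plotting code):

```python
# Sketch of the kind of satisfaction graph shown after processing:
# per-frame satisfaction scores plotted over time. The scores below are
# placeholders; in the real program they come from the model predictions.
import matplotlib.pyplot as plt

frame_scores = [0.5, 0.6, 0.65, 0.7, 0.6, 0.55, 0.6, 0.7, 0.75, 0.8]  # placeholder values

plt.plot(range(len(frame_scores)), frame_scores)
plt.xlabel("Frame")
plt.ylabel("Satisfaction score")
plt.title("Satisfaction over the test video")
plt.show()
```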