MIT Develops Fake-Smile Detection

The research could pave the way for computers that better assess the emotional states of their users and respond accordingly. It could also help train those who have difficulty interpreting expressions, such as people with autism, to more accurately gauge the expressions they see.

“The goal is to help people with face-to-face communication,” says Ehsan Hoque, a graduate student in the Affective Computing Group of MIT’s Media Lab who is lead author of a paper just published in the IEEE Transactions on Affective Computing. Hoque’s co-authors are Rosalind Picard, a professor of media arts and sciences, and Media Lab graduate student Daniel McDuff.
