Time: Thursday 3:30-5:30pm.
Place: mostly in DBH4011, except for weeks 2 and 3. I will send out a reminder for this.
At least one week prior to your presentation, please fill in the paper(s)/topic(s) you plan to present at the meeting.
Topic: Facial expression analysis and affective computing
Abstract: Automatic analysis of facial expressions in realistic scenarios is difficult because the 2-D imagery of a human facial expression couples rigid head motion with non-rigid muscle motion. We address this "coupled-motion" problem so that facial expressions can be analyzed in a meaningful manner. First, we propose an image-based representation, the Emotion Avatar Image, for person-independent expression recognition. This method allows us to analyze facial expressions in a canonical space, which makes the comparison of corresponding features more accurate and better justified. Second, we design a real-time registration technique to improve frame-based recognition of facial action units (AUs) in streaming video, since we do not always have the luxury of temporally segmented discrete facial expressions, e.g., joy or surprise. This frame-based registration method not only aligns faces (or objects in general) to a reference, but also guarantees temporal smoothness, both of which are essential for spontaneous expression analysis. Third, we apply the proposed expression recognition techniques to advertising, where facial expressions are shown to correlate closely with audiences' commercial-viewing behavior.
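The registration step described above, aligning a face to a canonical reference before comparing features, can be illustrated with a standard least-squares similarity (Procrustes) fit between corresponding facial landmarks. This is only a minimal sketch of the general idea, not the speaker's actual method; the function name and 2-D landmark setup are illustrative assumptions.

```python
import numpy as np

def similarity_align(points, reference):
    """Fit a similarity transform (scale + rotation + translation) mapping
    `points` onto `reference` by least squares (Procrustes analysis).
    Both inputs are (N, 2) arrays of corresponding landmark coordinates.
    Returns a function that applies the fitted transform.
    NOTE: illustrative sketch only, not the method from the abstract."""
    mu_p, mu_r = points.mean(axis=0), reference.mean(axis=0)
    p, r = points - mu_p, reference - mu_r
    # Optimal rotation comes from the SVD of the cross-covariance matrix.
    u, s, vt = np.linalg.svd(p.T @ r)
    rot = vt.T @ u.T
    if np.linalg.det(rot) < 0:           # guard against a reflection
        vt = vt.copy(); vt[-1, :] *= -1
        s = s.copy(); s[-1] *= -1
        rot = vt.T @ u.T
    scale = s.sum() / (p ** 2).sum()     # least-squares isotropic scale

    def transform(x):
        # Rows are points, so the rotation is applied as x @ rot.T.
        return scale * (x - mu_p) @ rot.T + mu_r

    return transform
```

For example, landmarks detected in an incoming frame would be passed as `points` and the canonical face template as `reference`; warping each frame with the fitted transform puts all faces in one canonical space so that corresponding features can be compared directly.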
Paper: