Automated Representation of Non-Emotional Expressivity to Facilitate Understanding of Facial Mobility: Preliminary Findings

Clawson, K., Delicato, Louise and Young, Shell (2017) Automated Representation of Non-Emotional Expressivity to Facilitate Understanding of Facial Mobility: Preliminary Findings. In: Intelligent Systems Conference 2017, 7-8 September 2017, London. (In Press)

Warning: There is a more recent version of this item available.

CAMERA_READY_ARNEE.pdf - Accepted Version (Download, 900kB)

Abstract

We present an automated method of identifying and representing non-emotional facial expressivity in video data. A benchmark dataset is created using the framework of an existing clinical test of upper and lower face movement, and initial findings regarding automated quantification of facial motion intensity are discussed. We describe a new set of features that combine tracked interest-point statistics within a temporal window, and evaluate their effectiveness for quantifying changes in non-emotional expressivity of movement in the upper face. We aim to develop this approach into a protocol that could inform clinical diagnosis and the evaluation of treatment efficacy for a number of neurological conditions, including Parkinson's Disease.
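The abstract describes features built from tracked interest-point statistics within a temporal window. The paper itself does not specify the exact feature set, so the sketch below is only a hypothetical illustration of the general idea: given point trajectories across frames, summarise per-frame displacement magnitudes with simple statistics over a sliding window as a proxy for motion intensity.

```python
import numpy as np

def windowed_motion_features(points, window=5):
    """Summarise tracked interest-point motion over a sliding temporal window.

    points : array of shape (T, N, 2) -- x, y positions of N tracked points
             across T frames (hypothetical input format; the paper does not
             specify its feature set).
    window : number of consecutive frames per temporal window.

    Returns an array of shape (T - window, 3) holding the mean, standard
    deviation, and maximum of per-point displacement magnitudes within each
    window -- one plausible "motion intensity" summary, not the authors'
    published method.
    """
    points = np.asarray(points, dtype=float)
    # Frame-to-frame displacement magnitude for every tracked point: (T-1, N).
    disp = np.linalg.norm(np.diff(points, axis=0), axis=2)
    feats = []
    for t in range(disp.shape[0] - window + 1):
        w = disp[t:t + window]  # displacements falling inside this window
        feats.append([w.mean(), w.std(), w.max()])
    return np.array(feats)
```

With such a representation, a static face yields all-zero features, while larger or more frequent point displacements in a window raise the mean and maximum, which is the kind of intensity change the abstract aims to quantify.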

Item Type: Conference or Workshop Item (Paper)
Subjects: Computing > Information Systems
Computing
Divisions: Faculty of Computer Science
Faculty of Applied Sciences > Department of Computing Engineering and Technology
Depositing User: Kathy Clawson
Date Deposited: 05 Apr 2017 09:59
Last Modified: 14 Aug 2017 15:41
URI: http://sure.sunderland.ac.uk/id/eprint/7035
