The Role of Facial Action Units in Investigating Facial Movements During Speech.
Investigating how facial movements can be used to characterize and quantify speech is important, in particular, to aid those suffering from motor control speech disorders. Here, we sought to investigate how facial action units (AUs), previously used to classify human expressions and emotion, could be used to quantify and understand unimpaired human speech. Fourteen adult participants (30.1 ± 7.9 years old), fluent in English, with no speech impairments, were examined. Within each data collection session, 6 video trials per participant per phoneme were acquired (i.e., 102 trials total/phoneme). The participants were asked to vocalize the vowels /æ/, /ɛ/, /ɪ/, /ɒ/, and /ʊ/; the consonants /b/, /n/, /m/, /p/, /h/, /w/, and /d/; and the diphthongs /eɪ/, /ʌɪ/, /i/, /a:/, and /u:/. Using the Python library Py-Feat, our analysis identified the AU contributions for each phoneme. The important implication of our methodological findings is that AUs could be used to quantify speech in populations with no speech disability; this has the potential to be broadened toward providing feedback on and characterization of speech changes and improvements in impaired populations. This would be of interest to persons with speech disorders, speech-language pathologists, engineers, and physicians. [ABSTRACT FROM AUTHOR]
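The analysis step described above (aggregating AU contributions across trials for each phoneme) can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes per-trial AU intensities have already been extracted from the videos (e.g., with Py-Feat's detector, which reports AU activations per frame), and the AU labels and intensity values shown are hypothetical.

```python
from statistics import mean

def mean_au_activation(trials):
    """Average each AU's intensity across trials for one phoneme.

    trials: list of dicts mapping an AU label (e.g. "AU12") to an
    intensity value, one dict per video trial. Returns a dict with
    the mean intensity of each AU over all trials.
    """
    labels = trials[0].keys()
    return {au: mean(t[au] for t in trials) for au in labels}

# Hypothetical per-trial intensities for one phoneme:
trials = [
    {"AU12": 0.8, "AU25": 0.9, "AU26": 0.4},
    {"AU12": 0.6, "AU25": 0.7, "AU26": 0.2},
]
print(mean_au_activation(trials))
```

In practice, each trial's dict would come from averaging an AU time series over the frames of that trial before pooling across participants.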
Copyright of Electronics (2079-9292) is the property of MDPI.