TY - GEN
T1 - Estimating self-assessed personality from body movements and proximity in crowded mingling scenarios
AU - Cabrera-Quiros, Laura
AU - Gedik, Ekin
AU - Hung, Hayley
N1 - Publisher Copyright:
© 2016 ACM.
PY - 2016/10/31
Y1 - 2016/10/31
N2 - This paper focuses on the automatic classification of self-assessed personality traits from the HEXACO inventory during crowded mingle scenarios. We exploit acceleration and proximity data from a wearable device hung around the neck. Unlike most state-of-the-art studies, addressing personality estimation during mingle scenarios provides a challenging social context, as people interact dynamically and freely in a face-to-face setting. While many former studies use audio to extract speech-related features, we present a novel method of extracting an individual's speaking status from a single body-worn triaxial accelerometer, which scales easily to large populations. Moreover, by fusing both speech- and movement-energy-related cues from acceleration alone, our experimental results show improvements in the estimation of Humility over features extracted from a single behavioral modality. We validated our method on 71 participants, obtaining an accuracy of 69% for Honesty, Conscientiousness, and Openness to Experience. To our knowledge, this is the largest validation of personality estimation carried out in such a social context with simple wearable sensors.
AB - This paper focuses on the automatic classification of self-assessed personality traits from the HEXACO inventory during crowded mingle scenarios. We exploit acceleration and proximity data from a wearable device hung around the neck. Unlike most state-of-the-art studies, addressing personality estimation during mingle scenarios provides a challenging social context, as people interact dynamically and freely in a face-to-face setting. While many former studies use audio to extract speech-related features, we present a novel method of extracting an individual's speaking status from a single body-worn triaxial accelerometer, which scales easily to large populations. Moreover, by fusing both speech- and movement-energy-related cues from acceleration alone, our experimental results show improvements in the estimation of Humility over features extracted from a single behavioral modality. We validated our method on 71 participants, obtaining an accuracy of 69% for Honesty, Conscientiousness, and Openness to Experience. To our knowledge, this is the largest validation of personality estimation carried out in such a social context with simple wearable sensors.
KW - Personality
KW - Proximity
KW - Speaking turn
KW - Wearable acceleration
UR - http://www.scopus.com/inward/record.url?scp=85016611614&partnerID=8YFLogxK
U2 - 10.1145/2993148.2993170
DO - 10.1145/2993148.2993170
M3 - Conference contribution
AN - SCOPUS:85016611614
T3 - ICMI 2016 - Proceedings of the 18th ACM International Conference on Multimodal Interaction
SP - 238
EP - 242
BT - ICMI 2016 - Proceedings of the 18th ACM International Conference on Multimodal Interaction
A2 - Pelachaud, Catherine
A2 - Nakano, Yukiko I.
A2 - Nishida, Toyoaki
A2 - Busso, Carlos
A2 - Morency, Louis-Philippe
A2 - Andre, Elisabeth
PB - Association for Computing Machinery, Inc
T2 - 18th ACM International Conference on Multimodal Interaction, ICMI 2016
Y2 - 12 November 2016 through 16 November 2016
ER -