
Released

Journal Article

On the classification of emotional biosignals evoked while viewing affective pictures: An integrated data-mining-based approach for healthcare applications

Citation

Frantzidis, C. A., Bratsas, C., Klados, M., Konstantinidis, E., Lithari, C. D., Vivas, A. B., et al. (2010). On the classification of emotional biosignals evoked while viewing affective pictures: An integrated data-mining-based approach for healthcare applications. IEEE Transactions on Information Technology in Biomedicine, 14(2), 309-318. doi:10.1109/TITB.2009.2038481.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0024-C5C3-2
Abstract
Recent neuroscience findings demonstrate the fundamental role of emotion in the maintenance of physical and mental health. In the present study, a novel architecture is proposed for the robust discrimination of emotional physiological signals evoked upon viewing pictures selected from the International Affective Picture System (IAPS). Biosignals are multichannel recordings from both the central and the autonomic nervous systems. Following the bidimensional emotion theory model, IAPS pictures are rated along two dimensions, namely, valence and arousal. Accordingly, the biosignals in this paper are first differentiated according to their valence dimension by means of a data-mining approach, the C4.5 decision tree algorithm. The valence and gender information then serve as input to a Mahalanobis distance classifier, which separates the data into high- and low-arousal classes. Results are described in Extensible Markup Language (XML) format, thereby providing platform independence, easy interconnectivity, and information exchange. The average recognition (success) rate was 77.68% for the discrimination of four emotional states differing in both their arousal and valence dimensions. It is therefore envisaged that the proposed approach holds promise for the efficient discrimination of negative and positive emotions, and it is discussed how future developments may be steered to serve affective healthcare applications, such as the monitoring of elderly or chronically ill people.
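
The two-stage pipeline described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it uses scikit-learn's CART decision tree as a stand-in for C4.5 (scikit-learn does not ship C4.5), synthetic features and labels in place of the real central/autonomic biosignal recordings, and it omits the XML serialization of the results.

```python
# Minimal sketch of the valence-then-arousal pipeline (illustrative only).
# Stage 1: decision tree on biosignal features -> valence.
# Stage 2: Mahalanobis distance classifier on features + predicted valence
#          + gender -> high/low arousal.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic stand-ins: rows = trials, columns = extracted biosignal features.
X = rng.normal(size=(200, 8))
valence = rng.integers(0, 2, size=200)   # 0 = negative, 1 = positive (assumed coding)
arousal = rng.integers(0, 2, size=200)   # 0 = low, 1 = high (assumed coding)
gender = rng.integers(0, 2, size=200)    # 0 / 1 (assumed coding)

# Stage 1: decision tree (CART here, as a proxy for C4.5) predicts valence.
tree = DecisionTreeClassifier(criterion="entropy", max_depth=5, random_state=0)
tree.fit(X, valence)
valence_pred = tree.predict(X)

# Stage 2: augment the features with predicted valence and gender,
# then classify arousal by minimum Mahalanobis distance to each class mean.
Z = np.column_stack([X, valence_pred, gender])

class MahalanobisClassifier:
    """Assign each sample to the class whose mean is nearest in Mahalanobis distance."""

    def fit(self, Z, y):
        self.classes_ = np.unique(y)
        self.means_ = {c: Z[y == c].mean(axis=0) for c in self.classes_}
        # Pooled covariance, lightly regularized so the inverse always exists.
        cov = np.cov(Z, rowvar=False) + 1e-6 * np.eye(Z.shape[1])
        self.cov_inv_ = np.linalg.inv(cov)
        return self

    def predict(self, Z):
        # Squared Mahalanobis distance of every sample to every class mean.
        dists = np.stack(
            [
                np.einsum("ij,jk,ik->i", Z - self.means_[c], self.cov_inv_, Z - self.means_[c])
                for c in self.classes_
            ],
            axis=1,
        )
        return self.classes_[np.argmin(dists, axis=1)]

clf = MahalanobisClassifier().fit(Z, arousal)
arousal_pred = clf.predict(Z)
print("training accuracy (synthetic data, illustrative):", (arousal_pred == arousal).mean())
```

A pooled, regularized covariance is used here only so the sketch runs on arbitrary synthetic data; the feature set, class coding, and any per-class covariance handling in the actual study may differ.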