Search for: emotional-state

    Artificial emotions for artificial systems

    Article, 2008 AAAI Spring Symposium, Stanford, CA, 26 March 2008 through 28 March 2008; Volume SS-08-04, 2008, Pages 46-49; ISBN 9781577353607. Harati Zadeh, S.; Bagheri Shouraki, S.; Halavati, R.; Sharif University of Technology
    2008
    Abstract
    To produce emotional artificial systems in the AI domain, a subset of human emotional states is usually imported into the target domain, and the major differences between natural and artificial domains are often ignored. In this paper we discuss why such an approach is not useful for all possible applications of emotions, and we show how it is necessary and possible to produce artificial emotion systems based on the target system's goals, abilities and needs.

    Using decision trees to model an emotional attention mechanism

    Article, Frontiers in Artificial Intelligence and Applications; Volume 171, Issue 1, 2008, Pages 374-385; ISSN 09226389; ISBN 9781586038335. Zadeh, S. H.; Bagheri Shouraki, S.; Halavati, R.; Sharif University of Technology
    IOS Press  2008
    Abstract
    There are several approaches to emotions in AI, most of which are inspired by human emotional states and their arousal mechanisms. These approaches usually use high-level models of human emotions that are too complex to be directly applicable in simple artificial systems. It seems that a new approach to emotions, based on their functional role in information processing in the mind, can help us construct models of emotions that are both valid and simple. In this paper, we present a model of emotions based on their role in controlling attention. We evaluate the performance of the model and show how it can be affected by some structural and environmental factors. © 2008 The... 
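    As an illustration only, and not the paper's actual model, the minimal Python sketch below shows how a decision tree could act as an emotional attention filter, mapping simple stimulus features to an attend/ignore decision; the feature names (novelty, intensity, goal relevance) and the toy training data are assumptions.

        # Illustrative sketch: a decision tree acting as an emotional attention filter.
        # Feature names and training data are hypothetical, not taken from the paper.
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        # Each row: [novelty, intensity, goal_relevance], all scaled to [0, 1].
        X = np.array([
            [0.9, 0.8, 0.2],   # novel, intense, irrelevant
            [0.1, 0.2, 0.9],   # familiar, weak, relevant
            [0.8, 0.9, 0.9],   # novel, intense, relevant
            [0.1, 0.1, 0.1],   # familiar, weak, irrelevant
        ])
        y = np.array([1, 1, 1, 0])  # 1 = attend, 0 = ignore

        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

        # Query the learned attention policy for a new stimulus.
        print(tree.predict([[0.7, 0.3, 0.5]]))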

    Spontaneous human-robot emotional interaction through facial expressions

    Article, 8th International Conference on Social Robotics, ICSR 2016, 1 November 2016 through 3 November 2016; Volume 9979 LNAI, 2016, Pages 351-361; ISSN 03029743; ISBN 9783319474366. Meghdari, A.; Alemi, M.; Ghorbandaei Pour, A.; Taheri, A.; Sharif University of Technology
    Springer Verlag  2016
    Abstract
    One of the main issues in the field of social and cognitive robotics is the robot's ability to recognize emotional states and to engage in emotional interaction with humans. Through effective emotional interaction, robots will be able to perform many tasks in human society. In this research, we have developed a robotic platform and a vision system that recognizes the emotional state of the user from their facial expressions, which leads to a more realistic human-robot interaction (HRI). First, a number of features are extracted from the facial points detected by the vision system. Then, the emotional state of the user is analyzed with the help of these features. For the... 
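    As a hedged illustration of the pipeline described above (detected facial points, then geometric features, then an emotion label), the Python sketch below assumes landmark coordinates are already provided by some face-landmark detector; the landmark indices, feature choices, labels and training data are all hypothetical, not the paper's implementation.

        # Illustrative sketch: facial landmarks -> geometric features -> emotion label.
        import numpy as np
        from sklearn.svm import SVC

        def geometric_features(landmarks: np.ndarray) -> np.ndarray:
            # landmarks: (N, 2) array of (x, y) points from any face-landmark detector.
            # Assumed indices: 0-1 mouth corners, 2-3 upper/lower lip,
            # 4-5 inner eyebrows, 6-7 eye centres.
            mouth_width = np.linalg.norm(landmarks[0] - landmarks[1])
            mouth_open  = np.linalg.norm(landmarks[2] - landmarks[3])
            brow_raise  = np.mean([np.linalg.norm(landmarks[4] - landmarks[6]),
                                   np.linalg.norm(landmarks[5] - landmarks[7])])
            face_scale  = np.linalg.norm(landmarks[6] - landmarks[7]) + 1e-6  # normaliser
            return np.array([mouth_width, mouth_open, brow_raise]) / face_scale

        # Toy training set: feature vectors with emotion labels (synthetic data).
        rng = np.random.default_rng(0)
        X_train = rng.random((20, 3))
        y_train = rng.integers(0, 3, size=20)   # e.g. 0=neutral, 1=happy, 2=surprised
        clf = SVC(kernel="rbf").fit(X_train, y_train)

        # At run time: landmarks from one video frame -> features -> predicted emotion.
        frame_landmarks = rng.random((8, 2))
        print(clf.predict(geometric_features(frame_landmarks).reshape(1, -1)))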

    Effective connectivity inference in the whole-brain network by using rDCM method for investigating the distinction between emotional states in fMRI data

    Article, Computer Methods in Biomechanics and Biomedical Engineering: Imaging and Visualization; 2022; ISSN 21681163. Farahani, N.; Ghahari, S.; Fatemizadeh, E.; Motie Nasrabadi, A.; Sharif University of Technology
    Taylor and Francis Ltd  2022
    Abstract
    In recent years, the regression dynamic causal modelling (rDCM) method was introduced as a new variant of dynamic causal modelling (DCM) for deriving effective connectivity in whole-brain networks from functional magnetic resonance imaging (fMRI) data. In this research, we used data acquired during stimulation with an audio movie comprising different emotional states. We applied the method to two networks consisting of ten auditory regions and forty-four regions, respectively. The method was used to study effective connections between emotional states and to represent the distinction between emotions. Finally, significant effective connections were found in emotional processing and auditory...
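    The snippet below is only a crude, time-domain analogue of the idea behind rDCM, regressing each region's rate of change on all regions' signals to obtain a directed (effective) connectivity matrix; real rDCM is a Bayesian linear regression in the frequency domain coupled with a haemodynamic model, and the data here are synthetic.

        # Illustrative sketch: per-region ridge regression as a stand-in for the
        # regression view of effective connectivity. Not the rDCM algorithm itself.
        import numpy as np

        rng = np.random.default_rng(1)
        T, R = 300, 10                      # time points, regions (e.g. an auditory network)
        A_true = 0.1 * rng.standard_normal((R, R))

        # Simulate linear dynamics x_{t+1} = x_t + dt * A x_t + noise.
        dt, x = 1.0, np.zeros((T, R))
        x[0] = rng.standard_normal(R)
        for t in range(T - 1):
            x[t + 1] = x[t] + dt * (A_true @ x[t]) + 0.01 * rng.standard_normal(R)

        # Regress the temporal derivative of each region on all regions' signals.
        dx = np.diff(x, axis=0) / dt        # (T-1, R)
        X = x[:-1]                          # predictors
        lam = 1e-2                          # ridge penalty (assumed)
        A_hat = np.linalg.solve(X.T @ X + lam * np.eye(R), X.T @ dx).T

        print(np.round(A_hat[:3, :3], 3))   # estimated connections among the first 3 regions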