
    Evidence-based mixture of MLP-experts

    Article, Proceedings of the International Joint Conference on Neural Networks (IJCNN), 18–23 July 2010; ISBN 9781424469178. Masoudnia, S.; Rostami, M.; Tabassian, M.; Sajedin, A.; Ebrahimpour, R. Sharif University of Technology.
    Mixture of Experts (ME) is a modular neural network architecture for supervised learning. In this paper, we propose an evidence-based ME to address the classification problem. In the basic form of ME, the problem space is automatically divided into several subspaces, one per expert, and the outputs of the experts are combined by a gating network. Satisfactory performance of the basic ME depends on the diversity among the experts. In conventional ME, this diversity is provided by different initializations of the experts and by supervision of the gating network during the learning procedure. The main idea of our proposed method is to employ the Dempster-Shafer (D-S) theory of evidence to improve determination of...
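
    The basic ME combination described in the abstract (expert outputs weighted by a softmax gating network) can be sketched as follows. This is a minimal illustration of the standard ME formulation only, not the paper's evidence-based D-S variant; the linear "experts" stand in for trained MLPs, and all weights here are arbitrary placeholders.

    ```python
    # Minimal sketch of the basic Mixture of Experts (ME) combination step.
    # Assumption: each expert is reduced to a fixed linear map for brevity;
    # in the actual architecture each expert is an MLP.
    import numpy as np

    rng = np.random.default_rng(0)
    n_experts, n_features, n_classes = 3, 4, 2

    x = rng.normal(size=n_features)                        # one input sample

    # Each expert produces class scores (stand-ins for trained MLPs).
    W_experts = rng.normal(size=(n_experts, n_classes, n_features))
    expert_outputs = np.einsum('ecf,f->ec', W_experts, x)  # (n_experts, n_classes)

    # The gating network assigns a softmax weight to each expert,
    # effectively soft-partitioning the problem space among them.
    W_gate = rng.normal(size=(n_experts, n_features))
    g = np.exp(W_gate @ x)
    g /= g.sum()                                           # gating weights sum to 1

    # ME output: gate-weighted combination of the expert outputs.
    y = g @ expert_outputs                                 # (n_classes,)
    ```

    The paper's contribution replaces this purely gate-driven combination with one informed by Dempster-Shafer evidence theory to better determine the experts' contributions.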