Search for: shaban--a

    Cascading randomized weighted majority: A new online ensemble learning algorithm

    Article, Intelligent Data Analysis; Volume 20, Issue 4, 2016, Pages 877-889; ISSN 1088-467X; Zamani, M.; Beigy, H.; Shaban, A.; Sharif University of Technology
    IOS Press, 2016
    Abstract
    With the increasing volume of data, the most practical approach to learning from it is an online learning algorithm. Online ensemble methods use an ensemble of classifiers to predict the labels of data. Prediction with expert advice is a well-studied problem in the online ensemble learning literature. The weighted majority and the randomized weighted majority (RWM) algorithms are two well-known solutions to this problem, aiming to converge to the best expert. Since, among a set of experts, the best one does not necessarily have the minimum error in all regions of the data space, defining specific regions and converging to the best expert in each of these regions will lead to a... 
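    The cascading, region-specific scheme the title refers to is not reproduced here, but the base randomized weighted majority (RWM) update the abstract builds on is easy to sketch. The Python snippet below is a minimal illustration under assumed conventions (the function names, the penalty factor beta, and discrete labels are illustrative, not taken from the paper): the algorithm predicts by sampling an expert in proportion to its weight and multiplies the weight of every mistaken expert by beta once the true label arrives.

        import random

        def rwm_predict(weights, expert_predictions, rng=random):
            """Sample one expert with probability proportional to its weight."""
            total = sum(weights)
            r = rng.uniform(0, total)
            acc = 0.0
            for w, pred in zip(weights, expert_predictions):
                acc += w
                if r <= acc:
                    return pred
            return expert_predictions[-1]  # guard against floating-point round-off

        def rwm_update(weights, expert_predictions, true_label, beta=0.5):
            """Multiply the weight of every expert that erred by beta (0 < beta < 1)."""
            return [w * (beta if pred != true_label else 1.0)
                    for w, pred in zip(weights, expert_predictions)]

    With this multiplicative update, the expected number of mistakes is bounded in terms of the mistakes of the best expert; the abstract's point is that a single best expert may not be best everywhere, so the paper instead converges to the best expert separately within regions of the data space.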

    Manifold coarse graining for online semi-supervised learning

    Article, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 5-9 September 2011; Volume 6911 LNAI, Part 1, September 2011, Pages 391-406; ISSN 0302-9743; ISBN 9783642237799; Farajtabar, M.; Shaban, A.; Rabiee, H. R.; Rohban, M. H.; Sharif University of Technology
    2011
    Abstract
    When the number of labeled samples is not sufficient, Semi-Supervised Learning (SSL) methods utilize unlabeled data to enhance classification. Recently, many SSL methods have been developed based on the manifold assumption in a batch mode. However, when data arrive sequentially and in large quantities, both computation and storage limitations become a bottleneck. In this paper, we present a new semi-supervised coarse graining (CG) algorithm to reduce the number of data points required to preserve the manifold structure. First, an equivalent formulation of Label Propagation (LP) is derived. Then, a novel spectral view of the Harmonic Solution (HS) is proposed. Finally, an algorithm to reduce... 
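    The Harmonic Solution (HS) named in the abstract has a well-known closed form on a similarity graph, and it is the object the paper's spectral view and coarse graining operate on. The snippet below is only that standard solution, written in Python with NumPy under assumed conventions (dense matrices and the variable names are illustrative); the coarse-graining algorithm itself is not shown.

        import numpy as np

        def harmonic_solution(W, labels, labeled_mask):
            """Closed-form harmonic solution on a similarity graph.

            W            : (n, n) symmetric non-negative similarity matrix
            labels       : (n, c) one-hot label matrix (rows of unlabeled points may be zero)
            labeled_mask : length-n boolean array, True where the label is known
            """
            P = W / W.sum(axis=1, keepdims=True)          # row-normalized transition matrix
            u, l = ~labeled_mask, labeled_mask
            Puu, Pul = P[np.ix_(u, u)], P[np.ix_(u, l)]
            f = labels.astype(float)
            # scores of unlabeled points: f_u = (I - P_uu)^{-1} P_ul f_l
            f[u] = np.linalg.solve(np.eye(Puu.shape[0]) - Puu, Pul @ f[l])
            return f

    The linear solve above grows with the number of data points, which is exactly the computational and storage bottleneck the abstract describes for sequentially arriving data; reducing the graph while preserving its manifold structure is the coarse-graining remedy the paper proposes.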

    Efficient iterative Semi-Supervised Classification on manifold

    Article, Proceedings - IEEE International Conference on Data Mining, ICDM; 2011, Pages 228-235; ISSN 1550-4786; ISBN 9780769544090; Farajtabar, M.; Rabiee, H. R.; Shaban, A.; Soltani Farani, A.; National Science Foundation (NSF); University of Technology Sydney; Google; Alberta Ingenuity Centre for Machine Learning; IBM Research; Sharif University of Technology
    Abstract
    Semi-Supervised Learning (SSL) has become a topic of recent research because it effectively addresses the problem of limited labeled data. Many SSL methods have been developed based on the manifold assumption; among them, Local and Global Consistency (LGC) is a popular method. The problem with most of these algorithms, and in particular with LGC, is that their naive implementations do not scale well with the size of the data. Time and memory limitations are the major obstacles in large-scale problems. In this paper, we provide theoretical bounds on gradient descent, and, to overcome the aforementioned problems, a new approximate Newton's method is proposed. Moreover, convergence...
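    For context on the scalability argument, the standard LGC solution F* = (1 - alpha) (I - alpha S)^(-1) Y is usually obtained without the explicit inverse by iterating F <- alpha S F + (1 - alpha) Y, where S is the symmetrically normalized similarity matrix. The sketch below implements only this plain iteration in Python with NumPy, under assumed conventions (dense matrices, the default alpha, and the fixed iteration count are illustrative); the paper's gradient-descent bounds and approximate Newton's method are not reproduced here.

        import numpy as np

        def lgc_iterative(W, Y, alpha=0.99, n_iter=200):
            """Local and Global Consistency (LGC) by fixed-point iteration.

            W     : (n, n) symmetric non-negative similarity matrix
            Y     : (n, c) one-hot label matrix (zero rows for unlabeled points)
            alpha : trade-off between graph smoothness and fitting the given labels
            """
            d = W.sum(axis=1)
            D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
            S = D_inv_sqrt @ W @ D_inv_sqrt               # symmetrically normalized affinity
            F = Y.astype(float)
            for _ in range(n_iter):
                F = alpha * (S @ F) + (1.0 - alpha) * Y   # propagate labels over the graph
            return F.argmax(axis=1)                       # predicted class per point

    Each sweep costs one matrix product, so the number of iterations needed to reach a given accuracy is what a convergence analysis of such a first-order scheme has to bound; a Newton-type step, as proposed in the paper, aims to cut that iteration count.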