Search for: distributed-machine-learning

    Coded Computing for Distributed Machine Learning

    , Ph.D. Dissertation, Sharif University of Technology ; Jahaninezhad, Tayyebeh (Author) ; Maddah Ali, Mohammad Ali (Supervisor)
    Abstract
    Nowadays, we are forced to use distributed computing due to the growth of data, the challenge of storing and processing it, and the emergence of new problems in machine learning along with the complexity of the models. In distributed computing, the computation is performed by a distributed system consisting of several worker nodes: the main task is divided into several smaller tasks, each assigned to a worker node. The worker nodes then cooperate to accomplish the main task. Although distributed systems are efficient in solving problems and dealing with the mentioned challenges, they are vulnerable to the presence of stragglers, adversarial worker nodes, high... 
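The straggler problem this abstract mentions is commonly addressed by coded computing. A minimal sketch of the idea (illustrative only, not the dissertation's scheme): encode a matrix-vector product with a (3, 2) MDS code so that any 2 of 3 workers suffice, tolerating one straggler.

```python
# Straggler-tolerant coded matrix-vector multiplication (illustrative
# sketch): split A into A1, A2 and add a parity block A1 + A2, so the
# product A @ x is recoverable from any 2 of the 3 worker results.

def matvec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def coded_matvec(A, x):
    n = len(A)
    A1, A2 = A[: n // 2], A[n // 2 :]
    # Three worker tasks: two systematic parts plus one parity part.
    tasks = {
        "w1": A1,
        "w2": A2,
        "w3": [[a + b for a, b in zip(r1, r2)] for r1, r2 in zip(A1, A2)],
    }
    results = {w: matvec(Aw, x) for w, Aw in tasks.items()}

    # Decode pretending w2 straggled: recover A2 @ x as
    # (A1 + A2) @ x - A1 @ x from the surviving workers.
    y1 = results["w1"]
    y2 = [p - a for p, a in zip(results["w3"], y1)]
    return y1 + y2

A = [[1, 2], [3, 4], [5, 6], [7, 8]]
x = [1, 1]
assert coded_matvec(A, x) == matvec(A, x)  # [3, 7, 11, 15]
```

The same principle generalizes to polynomial and Lagrange codes, which also provide resilience against adversarial workers.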

    Evaluation and optimization of distributed machine learning techniques for internet of things

    , Article IEEE Transactions on Computers ; 2021 ; 0018-9340 (ISSN) Gao, Y ; Kim, M ; Thapa, C ; Abuadbba, S ; Zhang, Z ; Camtepe, S ; Kim, H ; Nepal, S ; Sharif University of Technology
    IEEE Computer Society  2021
    Abstract
    Federated learning (FL) and split learning (SL) are state-of-the-art distributed machine learning techniques that enable machine learning without accessing raw data on clients or end devices. However, their comparative training performance under real-world resource-restricted Internet of Things (IoT) device settings, e.g., Raspberry Pi, remains barely studied and, to our knowledge, has not yet been evaluated and compared, leaving practitioners without a convenient reference. This work first provides empirical comparisons of FL and SL in real-world IoT settings regarding learning performance and on-device execution overhead. Our analyses demonstrate that the learning performance of SL is... 

    DONE: Distributed approximate newton-type method for federated edge learning

    , Article IEEE Transactions on Parallel and Distributed Systems ; Volume 33, Issue 11 , 2022 , Pages 2648-2660 ; 1045-9219 (ISSN) Dinh, C. T ; Tran, N. H ; Nguyen, T. D ; Bao, W ; Balef, A. R ; Zhou, B. B ; Zomaya, A. Y ; Sharif University of Technology
    IEEE Computer Society  2022
    Abstract
    There is growing interest in applying distributed machine learning to edge computing, forming federated edge learning. Federated edge learning faces non-i.i.d. and heterogeneous data, and the communication between edge workers, possibly through distant locations and with unstable wireless networks, is more costly than their local computational overhead. In this work, we propose DONE, a distributed approximate Newton-type algorithm with a fast convergence rate for communication-efficient federated edge learning. First, with strongly convex and smooth loss functions, DONE approximates the Newton direction in a distributed manner using the classical Richardson iteration... 
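The Richardson iteration mentioned here solves the Newton system H d = g without inverting H, using only matrix-vector (here, Hessian-vector) products, which workers can compute locally. A minimal sketch of the iteration itself (illustrative, not the paper's distributed implementation):

```python
# Richardson iteration for the Newton direction d solving H d = g.
# Illustrative sketch: d <- d + alpha * (g - H d) converges when
# alpha < 2 / lambda_max(H), requiring only Hessian-vector products.

def matvec(H, v):
    return [sum(h * x for h, x in zip(row, v)) for row in H]

def richardson(H, g, alpha=0.1, iters=200):
    d = [0.0] * len(g)
    for _ in range(iters):
        Hd = matvec(H, d)
        d = [di + alpha * (gi - hdi) for di, gi, hdi in zip(d, g, Hd)]
    return d

H = [[4.0, 1.0], [1.0, 3.0]]  # positive definite, as for a strongly convex loss
g = [1.0, 2.0]
d = richardson(H, g)
# The result should satisfy H d ≈ g.
assert all(abs(a - b) < 1e-6 for a, b in zip(matvec(H, d), g))
```

In the federated setting, each worker would apply this update with its local Hessian, so only vectors of model dimension cross the network, which is what makes the scheme communication-efficient.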

    Private Inner product retrieval for distributed machine learning

    , Article 2019 IEEE International Symposium on Information Theory, ISIT 2019, 7 July 2019 through 12 July 2019 ; Volume 2019-July , 2019 , Pages 355-359 ; 2157-8095 (ISSN); 9781538692912 (ISBN) Mousavi, M. H ; Maddah Ali, M. A ; Mirmohseni, M ; Sharif University of Technology
    Institute of Electrical and Electronics Engineers Inc  2019
    Abstract
    In this paper, we argue that many basic algorithms for machine learning, including support vector machines (SVM) for classification, principal component analysis (PCA) for dimensionality reduction, and regression for dependency estimation, need the inner products of the data samples rather than the data samples themselves. Motivated by this observation, we introduce the problem of private inner product retrieval for distributed machine learning, where we have a system including a database of some files, duplicated across some non-colluding servers. A user intends to retrieve a subset, of a specific size, of the set of inner products of every pair of data items in the database with...
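The observation that inner products suffice can be made concrete with the Gram matrix G[i][j] = ⟨x_i, x_j⟩: quantities such as squared Euclidean distances, and hence kernel-style computations, follow from G alone. A small illustrative sketch (not the paper's retrieval scheme):

```python
# Illustration that pairwise inner products suffice for common
# learning primitives: squared distances are recoverable from the
# Gram matrix G[i][j] = <x_i, x_j> without the raw samples.

def gram(data):
    dot = lambda a, b: sum(u * v for u, v in zip(a, b))
    return [[dot(xi, xj) for xj in data] for xi in data]

def sq_dist_from_gram(G, i, j):
    # ||x_i - x_j||^2 = <x_i, x_i> - 2 <x_i, x_j> + <x_j, x_j>
    return G[i][i] - 2 * G[i][j] + G[j][j]

data = [[1.0, 2.0], [3.0, 4.0]]
G = gram(data)
assert G == [[5.0, 11.0], [11.0, 25.0]]
assert sq_dist_from_gram(G, 0, 1) == 8.0  # (3-1)^2 + (4-2)^2
```

The paper's privacy question is then how a user can retrieve a chosen subset of these G entries from replicated servers without revealing which entries were requested.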