Application of Adversarial Training in Medical Signals

Yousefi Moghaddam, Hossein | 2021

  1. Type of Document: M.Sc. Thesis
  2. Language: Farsi
  3. Document No: 54574 (19)
  4. University: Sharif University of Technology
  5. Department: Computer Engineering
  6. Advisor(s): Rabiee, Hamid Reza; Rohban, Mohammad Hossein
  7. Abstract: The recent success of deep learning models has led to their ever-growing application across many fields. However, these models usually require huge datasets, which can be hard to collect. One of the challenges specific to medical data is the batch effect: medical data is usually gathered through multiple experiments, and each experiment may be run under slightly different conditions than the others, producing a shift in the data belonging to that batch. Batch effects can have an even more severe impact at test time, where the shift in the data distribution can be larger. Many methods have been proposed to reduce or remove the effect of such external conditions on the data distribution. Deep learning models have also been shown to be vulnerable to adversarial attacks: a small perturbation of the input, usually imperceptible to the human eye, can easily fool the model. Adversarially robust training is the framework focused on resisting such attacks. We propose a framework that uses adversarial attacks to model batch effects. Building on this framework, we introduce a data augmentation method that helps models train better in the presence of batch effects (a minimal illustrative sketch of this idea is given after the keyword list below). We then demonstrate the effectiveness of the proposed method, both quantitatively and qualitatively, on two datasets of medical signals.
  8. Keywords: Adversarial Robust Training; Batch Effect Correction; Medical Signals; Deep Learning
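The abstract does not specify the attack or training procedure used, but the described idea, i.e. generating adversarial perturbations of the inputs and using the perturbed copies as augmented training data against batch-effect-like shifts, can be sketched roughly as follows. The PGD-style attack, the toy 1-D CNN, and all names and hyperparameters (eps, alpha, steps) are illustrative assumptions, not the thesis's actual method.

```python
# Hedged sketch: adversarial perturbations as data augmentation for 1-D medical signals.
# All choices below (PGD attack, toy CNN, hyperparameters) are assumptions for illustration.
import torch
import torch.nn as nn

def pgd_perturb(model, x, y, eps=0.05, alpha=0.01, steps=10):
    """Return an L-infinity-bounded perturbation of x that increases the loss,
    used here as a stand-in for batch-effect-like shifts."""
    x_adv = x.clone().detach()
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = loss_fn(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        with torch.no_grad():
            x_adv = x_adv + alpha * grad.sign()          # ascend the loss
            x_adv = x + (x_adv - x).clamp(-eps, eps)     # project back into the eps-ball
        x_adv = x_adv.detach()
    return x_adv

# Toy usage: a small 1-D CNN over a batch of single-channel signals.
model = nn.Sequential(nn.Conv1d(1, 8, 5, padding=2), nn.ReLU(),
                      nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(8, 2))
x = torch.randn(4, 1, 256)           # 4 signals of length 256
y = torch.randint(0, 2, (4,))        # binary labels
x_aug = pgd_perturb(model, x, y)     # adversarially perturbed copies
# Train on the original and augmented signals together.
loss = nn.CrossEntropyLoss()(model(torch.cat([x, x_aug])), torch.cat([y, y]))
loss.backward()
```

In this reading, the perturbation budget eps plays the role of the expected batch-to-batch shift; the thesis may constrain or generate the perturbations differently.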
