Mono-modal image registration via correntropy measure
Article, Iranian Conference on Machine Vision and Image Processing (MVIP), Sept. 2013, Pages 223-226; ISSN 21666776; ISBN 9781467361842; Fatemizadeh, E.; Sharif University of Technology
IEEE Computer Society
2013
Abstract
The registration of images is a fundamental task in numerous medical image processing applications, and the similarity measure is a key component of intensity-based image registration. Here, we propose the correntropy measure as a similarity measure in the mono-modal setting. Correntropy is a similarity measure between two random variables based on information-theoretic learning and kernel methods, and it is useful in non-Gaussian signal processing. In this paper, this measure is applied to image registration, and we show analytically that it is robust in the presence of spiky (impulsive) noise. The experimental results show that the proposed similarity has better performance than...
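The robustness claimed in the abstract can be illustrated with the standard sample estimator of correntropy, V(X, Y) = E[G_sigma(X - Y)] with a Gaussian kernel G_sigma. The following is a minimal sketch (not the paper's registration pipeline); the bandwidth value and test signals are our own assumptions. Because the Gaussian kernel is bounded, a few impulsive outliers barely change the estimate, unlike a squared-error measure:

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample estimate of correntropy V(X, Y) = E[G_sigma(X - Y)]
    with a unit-height Gaussian kernel of bandwidth sigma."""
    diff = np.asarray(x, float) - np.asarray(y, float)
    return np.mean(np.exp(-diff**2 / (2.0 * sigma**2)))

rng = np.random.default_rng(0)
a = rng.normal(size=1000)
b = a.copy()
b[:10] += 100.0           # spiky (impulsive) noise on 1% of the samples

print(correntropy(a, a))  # 1.0: identical signals hit the kernel maximum
print(correntropy(a, b))  # ~0.99: the outliers are bounded by the kernel
```

In a registration setting, one would evaluate this measure between the reference image and each warped candidate and maximize it over the transformation parameters.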
A general approach for mutual information minimization and its application to blind source separation
Article, Signal Processing, Volume 85, Issue 5 (Special Issue), 2005, Pages 975-995; ISSN 01651684; Jutten, C.; Sharif University of Technology
Elsevier
2005
Abstract
In this paper, a nonparametric "gradient" of the mutual information is first introduced. It is used to show that mutual information has no local minima. Using the introduced "gradient", two general gradient-based approaches for minimizing mutual information in a parametric model are then presented. These approaches are quite general, and in principle they can be used in any mutual information minimization problem. In blind source separation, they provide powerful tools for separating any complicated (yet separable) mixing model. In this paper, they are used to develop algorithms for separating four separable mixing models: linear instantaneous, linear convolutive, post...
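The linear instantaneous case can be sketched with a toy experiment. This is only a sanity check, not the paper's method: instead of the nonparametric gradient, we use a crude histogram estimator of mutual information and simply scan the demixing angle of a 2x2 rotation mixture of uniform sources, confirming that the mutual information of the outputs is minimized at the true mixing angle (0.6 rad here, our assumption):

```python
import numpy as np

def mutual_info(x, y, bins=32):
    """Histogram estimate of I(X; Y) in nats (a simple stand-in for the
    paper's nonparametric estimator)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px * py)[nz])))

def rot(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

rng = np.random.default_rng(1)
s = rng.uniform(-1.0, 1.0, size=(2, 5000))  # independent uniform sources
x = rot(0.6) @ s                            # linear instantaneous mixing

# Scan the demixing angle; the outputs' mutual information should be
# minimized where the demixing rotation undoes the mixing.
thetas = np.linspace(0.0, np.pi / 2, 181)
mis = [mutual_info(*(rot(-t) @ x)) for t in thetas]
best = thetas[int(np.argmin(mis))]
print(best)  # close to the mixing angle 0.6
```

A gradient-based method, as in the paper, would replace the scan with iterative updates along an estimate of the mutual information gradient, which is what makes more complicated (convolutive, post-nonlinear) mixtures tractable.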