All Publications

Taborda, Camilo G; Perez-Cruz, Fernando; Guo, Dongning

New Information-Estimation Results for Poisson, Binomial and Negative Binomial Models (Inproceedings)

2014 IEEE International Symposium on Information Theory, pp. 2207–2211, IEEE, Honolulu, 2014, ISBN: 978-1-4799-5186-4.

Tags: Bregman divergence, Estimation, estimation measures, Gaussian models, Gaussian processes, information measures, information theory, information-estimation results, negative binomial models, Poisson models, Stochastic processes


Leiva-Murillo, Jose M; Artés-Rodríguez, Antonio

Information-Theoretic Linear Feature Extraction Based on Kernel Density Estimators: A Review (Journal Article)

IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 42 (6), pp. 1180–1189, 2012, ISSN: 1094-6977.

Tags: Bandwidth, Density, detection theory, Entropy, Estimation, Feature extraction, Feature extraction (FE), information-theoretic linear feature extraction, information theory, information-theoretic learning (ITL), Kernel, Kernel density estimation, kernel density estimators, Machine learning


Koch, Tobias; Lapidoth, Amos

Gaussian Fading Is the Worst Fading (Journal Article)

IEEE Transactions on Information Theory, 56 (3), pp. 1158–1165, 2010, ISSN: 0018-9448.

Tags: Additive noise, channel capacity, channels with memory, Distribution functions, ergodic fading processes, Fading, fading channels, flat fading, flat-fading channel capacity, Gaussian channels, Gaussian fading, Gaussian processes, H infinity control, high signal-to-noise ratio (SNR), Information technology, information theory, multiple-input single-output fading channels, multiplexing gain, noncoherent, noncoherent channel capacity, peak-power limited channel capacity, Signal to noise ratio, signal-to-noise ratio, single-antenna channel capacity, spectral distribution function, time-selective, Transmitters


Perez-Cruz, Fernando

Kullback-Leibler Divergence Estimation of Continuous Distributions (Inproceedings)

2008 IEEE International Symposium on Information Theory, pp. 1666–1670, IEEE, Toronto, 2008, ISBN: 978-1-4244-2256-2.

Tags: Convergence, density estimation, Density measurement, Entropy, Frequency estimation, H infinity control, information theory, k-nearest-neighbour density estimation, Kullback-Leibler divergence estimation, Machine learning, Mutual information, neuroscience, Random variables, statistical distributions, waiting-times distributions


Leiva-Murillo, Jose M; Salcedo-Sanz, Sancho; Gallardo-Antolín, Ascensión; Artés-Rodríguez, Antonio

A Simulated Annealing Approach to Speaker Segmentation in Audio Databases (Journal Article)

Engineering Applications of Artificial Intelligence, 21 (4), pp. 499–508, 2008.

Tags: Audio indexing, information theory, Simulated annealing, Speaker segmentation


Leiva-Murillo, Jose M; Artés-Rodríguez, Antonio

Maximization of Mutual Information for Supervised Linear Feature Extraction (Journal Article)

IEEE Transactions on Neural Networks, 18 (5), pp. 1433–1441, 2007, ISSN: 1045-9227.

Tags: Algorithms, Artificial Intelligence, Automated, component-by-component gradient-ascent method, Computer Simulation, Data Mining, Entropy, Feature extraction, gradient methods, gradient-based entropy, Independent component analysis, Information Storage and Retrieval, information theory, Iron, learning (artificial intelligence), Linear discriminant analysis, Linear Models, Mutual information, Optimization methods, Pattern recognition, Reproducibility of Results, Sensitivity and Specificity, supervised linear feature extraction, Vectors