All Publications

2014

Taborda, Camilo G; Perez-Cruz, Fernando; Guo, Dongning

New Information-Estimation Results for Poisson, Binomial and Negative Binomial Models Proceedings Article

In: 2014 IEEE International Symposium on Information Theory, pp. 2207–2211, IEEE, Honolulu, 2014, ISBN: 978-1-4799-5186-4.

Abstract | Links | BibTeX | Tags: Bregman divergence, Estimation, estimation measures, Gaussian models, Gaussian processes, information measures, information theory, information-estimation results, negative binomial models, Poisson models, Stochastic processes

2012

Leiva-Murillo, Jose M; Artés-Rodríguez, Antonio

Information-Theoretic Linear Feature Extraction Based on Kernel Density Estimators: A Review Journal Article

In: IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), vol. 42, no. 6, pp. 1180–1189, 2012, ISSN: 1094-6977.

Abstract | Links | BibTeX | Tags: Bandwidth, Density, detection theory, Entropy, Estimation, Feature extraction, Feature extraction (FE), information theoretic linear feature extraction, information theory, information-theoretic learning (ITL), Kernel, Kernel density estimation, kernel density estimators, Machine learning

2010

Koch, Tobias; Lapidoth, Amos

Gaussian Fading Is the Worst Fading Journal Article

In: IEEE Transactions on Information Theory, vol. 56, no. 3, pp. 1158–1165, 2010, ISSN: 0018-9448.

Abstract | Links | BibTeX | Tags: Additive noise, channel capacity, channels with memory, Distribution functions, ergodic fading processes, Fading, fading channels, flat fading, flat-fading channel capacity, Gaussian channels, Gaussian fading, Gaussian processes, H infinity control, high signal-to-noise ratio (SNR), Information technology, information theory, multiple-input single-output fading channels, multiplexing gain, noncoherent, noncoherent channel capacity, peak-power limited channel capacity, Signal to noise ratio, signal-to-noise ratio, single-antenna channel capacity, spectral distribution function, time-selective, Transmitters

2008

Perez-Cruz, Fernando

Kullback-Leibler Divergence Estimation of Continuous Distributions Proceedings Article

In: 2008 IEEE International Symposium on Information Theory, pp. 1666–1670, IEEE, Toronto, 2008, ISBN: 978-1-4244-2256-2.

Abstract | Links | BibTeX | Tags: Convergence, density estimation, Density measurement, Entropy, Frequency estimation, H infinity control, information theory, k-nearest-neighbour density estimation, Kullback-Leibler divergence estimation, Machine learning, Mutual information, neuroscience, Random variables, statistical distributions, waiting-times distributions

Leiva-Murillo, Jose M; Salcedo-Sanz, Sancho; Gallardo-Antolín, Ascensión; Artés-Rodríguez, Antonio

A Simulated Annealing Approach to Speaker Segmentation in Audio Databases Journal Article

In: Engineering Applications of Artificial Intelligence, vol. 21, no. 4, pp. 499–508, 2008.

Abstract | Links | BibTeX | Tags: Audio indexing, information theory, Simulated annealing, Speaker segmentation

2007

Leiva-Murillo, Jose M; Artés-Rodríguez, Antonio

Maximization of Mutual Information for Supervised Linear Feature Extraction Journal Article

In: IEEE Transactions on Neural Networks, vol. 18, no. 5, pp. 1433–1441, 2007, ISSN: 1045-9227.

Abstract | Links | BibTeX | Tags: Algorithms, Artificial Intelligence, Automated, component-by-component gradient-ascent method, Computer Simulation, Data Mining, Entropy, Feature extraction, gradient methods, gradient-based entropy, Independent component analysis, Information Storage and Retrieval, information theory, Iron, learning (artificial intelligence), Linear discriminant analysis, Linear Models, Mutual information, Optimization methods, Pattern recognition, Reproducibility of Results, Sensitivity and Specificity, supervised linear feature extraction, Vectors