List of Publications (2008–2017)

2014

Journal Articles

Alvarado, Alex; Brännström, Fredrik; Agrell, Erik; Koch, Tobias

High-SNR Asymptotics of Mutual Information for Discrete Constellations With Applications to BICM (Journal Article)

IEEE Transactions on Information Theory, 60 (2), pp. 1061–1076, 2014, ISSN: 0018-9448.

Pastore, Adriano; Koch, Tobias; Fonollosa, Javier Rodríguez

A Rate-Splitting Approach to Fading Channels With Imperfect Channel-State Information (Journal Article)

IEEE Transactions on Information Theory, 60 (7), pp. 4266–4285, 2014, ISSN: 0018-9448.

Inproceedings

Koch, Tobias

On the Dither-Quantized Gaussian Channel at Low SNR (Inproceedings)

2014 IEEE International Symposium on Information Theory, pp. 186–190, IEEE, Honolulu, 2014, ISBN: 978-1-4799-5186-4.

2013

Inproceedings

Alvarado, Alex; Brännström, Fredrik; Agrell, Erik; Koch, Tobias

High-SNR Asymptotics of Mutual Information for Discrete Constellations (Inproceedings)

2013 IEEE International Symposium on Information Theory, pp. 2274–2278, IEEE, Istanbul, 2013, ISSN: 2157-8095.

2012

Journal Articles

Leiva-Murillo, José; Artés-Rodríguez, Antonio

Information-Theoretic Linear Feature Extraction Based on Kernel Density Estimators: A Review (Journal Article)

IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 42 (6), pp. 1180–1189, 2012, ISSN: 1094-6977.

Inproceedings

Koch, Tobias; Martinez, Alfonso; Guillén i Fàbregas, Albert

The Capacity Loss of Dense Constellations (Inproceedings)

2012 IEEE International Symposium on Information Theory Proceedings, pp. 572–576, IEEE, Cambridge, MA, 2012, ISSN: 2157-8095.

Taborda, Camilo; Pérez-Cruz, Fernando

Derivative of the Relative Entropy over the Poisson and Binomial Channel (Inproceedings)

2012 IEEE Information Theory Workshop, pp. 386–390, IEEE, Lausanne, 2012, ISBN: 978-1-4673-0223-4.

Pastore, Adriano; Koch, Tobias; Fonollosa, Javier Rodríguez

Improved Capacity Lower Bounds for Fading Channels with Imperfect CSI Using Rate Splitting (Inproceedings)

2012 IEEE 27th Convention of Electrical and Electronics Engineers in Israel, pp. 1–5, IEEE, Eilat, 2012, ISBN: 978-1-4673-4681-8.

Taborda, Camilo; Pérez-Cruz, Fernando

Mutual Information and Relative Entropy over the Binomial and Negative Binomial Channels (Inproceedings)

2012 IEEE International Symposium on Information Theory Proceedings, pp. 696–700, IEEE, Cambridge, MA, 2012, ISSN: 2157-8095.

2009

Inproceedings

Fresia, Maria; Pérez-Cruz, Fernando; Poor, H. Vincent

Optimized Concatenated LDPC Codes for Joint Source-Channel Coding (Inproceedings)

2009 IEEE International Symposium on Information Theory, pp. 2131–2135, IEEE, Seoul, 2009, ISBN: 978-1-4244-4312-3.

2008

Inproceedings

Koch, Tobias; Lapidoth, Amos

On Multipath Fading Channels at High SNR (Inproceedings)

2008 IEEE International Symposium on Information Theory, pp. 1572–1576, IEEE, Toronto, 2008, ISBN: 978-1-4244-2256-2.

Pérez-Cruz, Fernando

Kullback-Leibler Divergence Estimation of Continuous Distributions (Inproceedings)

2008 IEEE International Symposium on Information Theory, pp. 1666–1670, IEEE, Toronto, 2008, ISBN: 978-1-4244-2256-2.

Koch, Tobias; Lapidoth, Amos

Multipath Channels of Unbounded Capacity (Inproceedings)

2008 IEEE 25th Convention of Electrical and Electronics Engineers in Israel, pp. 640–644, IEEE, Eilat, 2008, ISBN: 978-1-4244-2481-8.

2007

Journal Articles

Leiva-Murillo, José; Artés-Rodríguez, Antonio

Maximization of Mutual Information for Supervised Linear Feature Extraction (Journal Article)

IEEE Transactions on Neural Networks, 18 (5), pp. 1433–1441, 2007, ISSN: 1045-9227.
