All Publications

Show all

2014

Alvarado, Alex; Brannstrom, Fredrik; Agrell, Erik; Koch, Tobias

High-SNR Asymptotics of Mutual Information for Discrete Constellations With Applications to BICM Journal Article

In: IEEE Transactions on Information Theory, vol. 60, no. 2, pp. 1061–1076, 2014, ISSN: 0018-9448.

Abstract | Links | BibTeX | Tags: additive white Gaussian noise channel, Anti-Gray code, bit-interleaved coded modulation, discrete constellations, Entropy, Gray code, high-SNR asymptotics, IP networks, Labeling, minimum-mean square error, Modulation, Mutual information, Signal to noise ratio, Vectors

Pastore, Adriano; Koch, Tobias; Fonollosa, Javier Rodriguez

A Rate-Splitting Approach to Fading Channels With Imperfect Channel-State Information Journal Article

In: IEEE Transactions on Information Theory, vol. 60, no. 7, pp. 4266–4285, 2014, ISSN: 0018-9448.

Abstract | Links | BibTeX | Tags: channel capacity, COMONSENS, DEIPRO, Entropy, Fading, fading channels, flat fading, imperfect channel-state information, MobileNET, Mutual information, OTOSiS, Random variables, Receivers, Signal to noise ratio, Upper bound

2013

Alvarado, Alex; Brannstrom, Fredrik; Agrell, Erik; Koch, Tobias

High-SNR Asymptotics of Mutual Information for Discrete Constellations Proceedings Article

In: 2013 IEEE International Symposium on Information Theory, pp. 2274–2278, IEEE, Istanbul, 2013, ISSN: 2157-8095.

Abstract | Links | BibTeX | Tags: AWGN channels, discrete constellations, Entropy, Fading, Gaussian Q-function, high-SNR asymptotics, IP networks, least mean squares methods, minimum mean-square error, MMSE, Mutual information, scalar additive white Gaussian noise channel, Signal to noise ratio, signal-to-noise ratio, Upper bound

2012

Taborda, Camilo G; Perez-Cruz, Fernando

Derivative of the Relative Entropy over the Poisson and Binomial Channel Proceedings Article

In: 2012 IEEE Information Theory Workshop, pp. 386–390, IEEE, Lausanne, 2012, ISBN: 978-1-4673-0223-4.

Abstract | Links | BibTeX | Tags: binomial channel, binomial distribution, Channel estimation, conditional distribution, Entropy, Estimation, function expectation, Mutual information, mutual information concept, Poisson channel, Poisson distribution, Random variables, relative entropy derivative, similar expression

Pastore, Adriano; Koch, Tobias; Fonollosa, Javier Rodriguez

Improved Capacity Lower Bounds for Fading Channels with Imperfect CSI Using Rate Splitting Proceedings Article

In: 2012 IEEE 27th Convention of Electrical and Electronics Engineers in Israel, pp. 1–5, IEEE, Eilat, 2012, ISBN: 978-1-4673-4681-8.

Abstract | Links | BibTeX | Tags: channel capacity, channel capacity lower bounds, conditional entropy, Decoding, Entropy, Fading, fading channels, Gaussian channel, Gaussian channels, Gaussian random variable, imperfect channel-state information, imperfect CSI, independent Gaussian variables, linear minimum mean-square error, mean square error methods, Medard lower bound, Mutual information, Random variables, rate splitting approach, Resource management, Upper bound, wireless communications

Taborda, Camilo G; Perez-Cruz, Fernando

Mutual Information and Relative Entropy over the Binomial and Negative Binomial Channels Proceedings Article

In: 2012 IEEE International Symposium on Information Theory Proceedings, pp. 696–700, IEEE, Cambridge, MA, 2012, ISSN: 2157-8095.

Abstract | Links | BibTeX | Tags: Channel estimation, conditional mean estimation, Entropy, Estimation, estimation theoretical quantity, estimation theory, Gaussian channel, Gaussian channels, information theory concept, loss function, mean square error methods, Mutual information, negative binomial channel, Poisson channel, Random variables, relative entropy

2011

Goparaju, S; Calderbank, A R; Carson, W R; Rodrigues, Miguel R D; Perez-Cruz, Fernando

When to Add Another Dimension when Communicating over MIMO Channels Proceedings Article

In: 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 3100–3103, IEEE, Prague, 2011, ISSN: 1520-6149.

Abstract | Links | BibTeX | Tags: divide and conquer approach, divide and conquer methods, error probability, error rate, error statistics, Gaussian channels, Lattices, Manganese, MIMO, MIMO channel, MIMO communication, multiple input multiple output Gaussian channel, Mutual information, optimal power allocation, power allocation, power constraint, receive filter, Resource management, Signal to noise ratio, signal-to-noise ratio, transmit filter, Upper bound

2010

Perez-Cruz, Fernando; Rodrigues, Miguel R D; Verdu, Sergio

MIMO Gaussian Channels With Arbitrary Inputs: Optimal Precoding and Power Allocation Journal Article

In: IEEE Transactions on Information Theory, vol. 56, no. 3, pp. 1070–1084, 2010, ISSN: 0018-9448.

Abstract | Links | BibTeX | Tags: Collaborative work, Equations, fixed-point equation, Gaussian channels, Gaussian noise channels, Gaussian processes, Government, Interference, linear precoding, matrix algebra, mean square error methods, mercury-waterfilling algorithm, MIMO, MIMO communication, MIMO Gaussian channel, minimum mean-square error, minimum mean-square error (MMSE), multiple-input-multiple-output channel, multiple-input–multiple-output (MIMO) systems, Mutual information, nondiagonal precoding matrix, optimal linear precoder, optimal power allocation policy, optimal precoding, optimum power allocation, Phase shift keying, precoding, Quadrature amplitude modulation, Telecommunications, waterfilling

2008

Perez-Cruz, Fernando

Kullback-Leibler Divergence Estimation of Continuous Distributions Proceedings Article

In: 2008 IEEE International Symposium on Information Theory, pp. 1666–1670, IEEE, Toronto, 2008, ISBN: 978-1-4244-2256-2.

Abstract | Links | BibTeX | Tags: Convergence, density estimation, Density measurement, Entropy, Frequency estimation, H infinity control, information theory, k-nearest-neighbour density estimation, Kullback-Leibler divergence estimation, Machine learning, Mutual information, neuroscience, Random variables, statistical distributions, waiting-times distributions

Perez-Cruz, Fernando; Rodrigues, Miguel R D; Verdu, Sergio

Optimal Precoding for Digital Subscriber Lines Proceedings Article

In: 2008 IEEE International Conference on Communications, pp. 1200–1204, IEEE, Beijing, 2008, ISBN: 978-1-4244-2075-9.

Abstract | Links | BibTeX | Tags: Bit error rate, channel matrix diagonalization, Communications Society, Computer science, digital subscriber lines, DSL, Equations, fixed-point equation, Gaussian channels, least mean squares methods, linear codes, matrix algebra, MIMO, MIMO communication, MIMO Gaussian channel, minimum mean squared error method, MMSE, multiple-input multiple-output communication, Mutual information, optimal linear precoder, precoding, Telecommunications, Telephony

Rodrigues, Miguel R D; Perez-Cruz, Fernando; Verdu, Sergio

Multiple-Input Multiple-Output Gaussian Channels: Optimal Covariance for Non-Gaussian Inputs Proceedings Article

In: 2008 IEEE Information Theory Workshop, pp. 445–449, IEEE, Porto, 2008, ISBN: 978-1-4244-2269-2.

Abstract | Links | BibTeX | Tags: Binary phase shift keying, covariance matrices, Covariance matrix, deterministic MIMO Gaussian channel, fixed-point equation, Gaussian channels, Gaussian noise, Information rates, intersymbol interference, least mean squares methods, Magnetic recording, mercury-waterfilling power allocation policy, MIMO, MIMO communication, minimum mean-squared error, MMSE, MMSE matrix, multiple-input multiple-output system, Multiple-Input Multiple-Output Systems, Mutual information, Optimal Input Covariance, Optimization, Telecommunications

Vila-Forcen, J E; Artés-Rodríguez, Antonio; Garcia-Frias, J

Compressive Sensing Detection of Stochastic Signals Proceedings Article

In: 2008 42nd Annual Conference on Information Sciences and Systems, pp. 956–960, IEEE, Princeton, 2008, ISBN: 978-1-4244-2246-3.

Abstract | Links | BibTeX | Tags: Additive white noise, AWGN, compressive sensing detection, dimensionality reduction techniques, Distortion measurement, Gaussian noise, matrix algebra, Mutual information, optimized projections, projection matrix, signal detection, Signal processing, signal reconstruction, Stochastic processes, stochastic signals, Support vector machine classification, Support vector machines, SVM

2007

Leiva-Murillo, Jose M; Artés-Rodríguez, Antonio

Maximization of Mutual Information for Supervised Linear Feature Extraction Journal Article

In: IEEE Transactions on Neural Networks, vol. 18, no. 5, pp. 1433–1441, 2007, ISSN: 1045-9227.

Abstract | Links | BibTeX | Tags: Algorithms, Artificial Intelligence, Automated, component-by-component gradient-ascent method, Computer Simulation, Data Mining, Entropy, Feature extraction, gradient methods, gradient-based entropy, Independent component analysis, Information Storage and Retrieval, information theory, Iron, learning (artificial intelligence), Linear discriminant analysis, Linear Models, Mutual information, Optimization methods, Pattern recognition, Reproducibility of Results, Sensitivity and Specificity, supervised linear feature extraction, Vectors