## 2014

Taborda, Camilo G; Perez-Cruz, Fernando; Guo, Dongning: New Information-Estimation Results for Poisson, Binomial and Negative Binomial Models. In: 2014 IEEE International Symposium on Information Theory, pp. 2207–2211, IEEE, Honolulu, 2014, ISBN: 978-1-4799-5186-4.

@inproceedings{Taborda2014,
  title     = {New Information-Estimation Results for Poisson, Binomial and Negative Binomial Models},
  author    = {Camilo G Taborda and Fernando Perez-Cruz and Dongning Guo},
  url       = {http://ieeexplore.ieee.org/articleDetails.jsp?arnumber=6875225},
  doi       = {10.1109/ISIT.2014.6875225},
  isbn      = {978-1-4799-5186-4},
  year      = {2014},
  date      = {2014-06-01},
  booktitle = {2014 IEEE International Symposium on Information Theory},
  pages     = {2207--2211},
  publisher = {IEEE},
  address   = {Honolulu},
  abstract  = {In recent years, a number of mathematical relationships have been established between information measures and estimation measures for various models, including Gaussian, Poisson and binomial models. In this paper, it is shown that the second derivative of the input-output mutual information with respect to the input scaling can be expressed as the expectation of a certain Bregman divergence pertaining to the conditional expectations of the input and the input power. This result is similar to that found for the Gaussian model, where the Bregman divergence therein is the squared distance. In addition, the Poisson, binomial and negative binomial models are shown to be similar in the small scaling regime in the sense that the derivative of the mutual information and the derivative of the relative entropy converge to the same value.},
  keywords  = {Bregman divergence, Estimation, estimation measures, Gaussian models, Gaussian processes, information measures, information theory, information-estimation results, negative binomial models, Poisson models, Stochastic processes},
  pubstate  = {published},
  tppubtype = {inproceedings}
}
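The abstract's central object is the Bregman divergence. A minimal numerical sketch (assuming NumPy; the function name `bregman_divergence` is illustrative, not from the paper) shows the definition and the special case noted in the abstract: for the squared-norm generator, the divergence reduces to the squared distance of the Gaussian model, while the negative-entropy generator yields the generalized KL divergence associated with Poisson-type models.

```python
import numpy as np

def bregman_divergence(F, grad_F, p, q):
    """D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
    return F(p) - F(q) - np.dot(grad_F(q), p - q)

p = np.array([1.0, 2.0])
q = np.array([0.5, 1.0])

# Generator F(x) = ||x||^2: divergence reduces to squared distance
# (the Gaussian-model case mentioned in the abstract).
F_sq = lambda x: np.dot(x, x)
grad_sq = lambda x: 2.0 * x
d_sq = bregman_divergence(F_sq, grad_sq, p, q)   # equals ||p - q||^2

# Generator F(x) = sum x log x: divergence reduces to generalized KL,
# the divergence naturally attached to Poisson-type models.
F_ent = lambda x: np.sum(x * np.log(x))
grad_ent = lambda x: np.log(x) + 1.0
d_kl = bregman_divergence(F_ent, grad_ent, p, q)  # sum p log(p/q) - p + q
```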

## 2011

Maiz, Cristina S; Miguez, Joaquin: On the Optimization of Transportation Routes with Multiple Destinations in Random Networks. In: 2011 IEEE Statistical Signal Processing Workshop (SSP), pp. 349–352, IEEE, Nice, 2011, ISBN: 978-1-4577-0569-4.

@inproceedings{Maiz2011,
  title     = {On the Optimization of Transportation Routes with Multiple Destinations in Random Networks},
  author    = {Cristina S Maiz and Joaquin Miguez},
  url       = {http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=5967701},
  isbn      = {978-1-4577-0569-4},
  year      = {2011},
  date      = {2011-01-01},
  booktitle = {2011 IEEE Statistical Signal Processing Workshop (SSP)},
  pages     = {349--352},
  publisher = {IEEE},
  address   = {Nice},
  abstract  = {Various practical problems in transportation research and routing in communication networks can be reduced to the computation of the best path that traverses a certain graph and visits a set of D specified destination nodes. Simple versions of this problem have received attention in the literature. Optimal solutions exist for the cases in which (a) D > 1 and the graph is deterministic or (b) D = 1 and the graph is stochastic (and possibly time-dependent). Here, we address the general problem in which both D > 1 and the costs of the edges in the graph are stochastic and time-varying. We tackle this complex global optimization problem by first converting it into an equivalent estimation problem and then computing a numerical solution using a sequential Monte Carlo algorithm. The advantage of the proposed technique over some standard methods (devised for graphs with time-invariant statistics) is illustrated by way of computer simulations.},
  keywords  = {Approximation algorithms, communication networks, Estimation, graph theory, Histograms, intelligent transportation, Monte Carlo algorithm, Monte Carlo methods, multiple destinations, optimisation, Optimization, random networks, route optimization, routing, Sequential Monte Carlo, Signal processing algorithms, stochastic graph, Stochastic processes, telecommunication network routing, time-varying graph, transportation routes},
  pubstate  = {published},
  tppubtype = {inproceedings}
}
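The idea of recasting route optimization as an estimation problem solved by sequential Monte Carlo can be sketched in miniature. This is a toy illustration, not the authors' algorithm: the 4-node network, the exp(-cost) weighting, the destination-visit reward, and the multinomial resampling are all assumptions made for the sketch. Particles are candidate routes; edge costs are redrawn at every step (stochastic, time-varying), and resampling concentrates particles on low-cost routes that cover the destination set.

```python
import numpy as np

rng = np.random.default_rng(0)

n_nodes = 4            # toy random network; node 0 is the origin
destinations = {2, 3}  # D > 1 destination nodes to visit
n_particles = 200

def sample_costs():
    # Edge costs redrawn each epoch: stochastic and time-varying.
    c = rng.uniform(1.0, 5.0, size=(n_nodes, n_nodes))
    np.fill_diagonal(c, np.inf)
    return c

def run_smc(n_steps=3):
    routes = [[0] for _ in range(n_particles)]
    log_w = np.zeros(n_particles)
    for _ in range(n_steps):
        costs = sample_costs()
        for i, r in enumerate(routes):
            # Propagate: move each particle to a random other node.
            nxt = int(rng.integers(0, n_nodes))
            while nxt == r[-1]:
                nxt = int(rng.integers(0, n_nodes))
            log_w[i] -= costs[r[-1], nxt]      # weight ~ exp(-total cost)
            if nxt in destinations and nxt not in r:
                log_w[i] += 10.0               # reward first destination visits
            r.append(nxt)
        # Resample in proportion to the weights, then reset them.
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        idx = rng.choice(n_particles, size=n_particles, p=w)
        routes = [routes[i][:] for i in idx]
        log_w = np.zeros(n_particles)
    # Keep routes that visited every destination.
    return [r for r in routes if destinations <= set(r)]

valid_routes = run_smc()
```

The surviving particles approximate a distribution concentrated on cheap destination-covering routes; the best path estimate is the lowest-cost member of `valid_routes`.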

## 2008

Vila-Forcen, J E; Artés-Rodríguez, Antonio; Garcia-Frias, J: Compressive Sensing Detection of Stochastic Signals. In: 2008 42nd Annual Conference on Information Sciences and Systems, pp. 956–960, IEEE, Princeton, 2008, ISBN: 978-1-4244-2246-3.

@inproceedings{Vila-Forcen2008,
  title     = {Compressive Sensing Detection of Stochastic Signals},
  author    = {J E Vila-Forcen and Antonio Artés-Rodríguez and J Garcia-Frias},
  url       = {http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=4558656},
  isbn      = {978-1-4244-2246-3},
  year      = {2008},
  date      = {2008-01-01},
  booktitle = {2008 42nd Annual Conference on Information Sciences and Systems},
  pages     = {956--960},
  publisher = {IEEE},
  address   = {Princeton},
  abstract  = {Inspired by recent work in compressive sensing, we propose a framework for the detection of stochastic signals from optimized projections. In order to generate a good projection matrix, we use dimensionality reduction techniques based on the maximization of the mutual information between the projected signals and their corresponding class labels. In addition, classification techniques based on support vector machines (SVMs) are applied for the final decision process. Simulation results show that the realizations of the stochastic process are detected with higher accuracy and lower complexity than a scheme performing signal reconstruction first, followed by detection based on the reconstructed signal.},
  keywords  = {Additive white noise, AWGN, compressive sensing detection, dimensionality reduction techniques, Distortion measurement, Gaussian noise, matrix algebra, Mutual information, optimized projections, projection matrix, signal detection, Signal processing, signal reconstruction, Stochastic processes, stochastic signals, Support vector machine classification, Support vector machines, SVM},
  pubstate  = {published},
  tppubtype = {inproceedings}
}
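The detection-from-projections pipeline can be sketched with NumPy under simplifying assumptions: a random Gaussian projection matrix stands in for the paper's mutual-information-optimized projections, and a projected-energy threshold stands in for the SVM decision stage (neither substitution reproduces the paper's method). The point is the structure: detection is performed directly in the low-dimensional projected domain, with no signal reconstruction step.

```python
import numpy as np

rng = np.random.default_rng(1)

n, m = 64, 8        # ambient dimension and number of projections (m << n)
n_train = 500

# Stochastic signal model: zero-mean Gaussian process with smooth
# (squared-exponential) covariance, observed in AWGN.
t = np.arange(n)
cov = np.exp(-0.5 * ((t[:, None] - t[None, :]) / 4.0) ** 2)
L = np.linalg.cholesky(cov + 1e-6 * np.eye(n))

def draw(signal_present):
    noise = 0.5 * rng.standard_normal(n)
    return noise + (L @ rng.standard_normal(n) if signal_present else 0.0)

# Random Gaussian projection matrix: a stand-in for the paper's
# MI-optimized projections (that optimization is not reproduced here).
Phi = rng.standard_normal((m, n)) / np.sqrt(m)

labels = rng.integers(0, 2, n_train)
Y = np.array([Phi @ draw(bool(c)) for c in labels])

# Decide in the projected domain via an energy statistic (a simple
# stand-in for the SVM stage): signal-plus-noise projections carry
# more energy than noise-only projections.
energy = np.sum(Y ** 2, axis=1)
thresh = 0.5 * (energy[labels == 0].mean() + energy[labels == 1].mean())
pred = (energy > thresh).astype(int)
accuracy = (pred == labels).mean()
```

Even with unoptimized projections and this crude detector, accuracy is well above chance, illustrating why reconstruction-free detection can be both accurate and cheap.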