Ongoing Research Projects
|Information Theory for Low-Latency Wireless Communications (LOLITA) European Commission, ERC Starting Grant; 2017-2022|
|The design of low-latency wireless communication systems is a great challenge, since it requires a different focus than that which is used in current high-speed data transmission systems. “The project seeks to establish the theoretical framework necessary to describe the fundamental tradeoffs in low-latency wireless communications,” Koch explained. “This enables the design of novel systems that employ resources such as bandwidth and energy more efficiently.”
Current wireless communication systems exchange packets of several thousand bits and include large correction codes to protect them against transmission errors. “What we do is to include additional bits to correct possible errors,” Koch stated. In this way, the reliability of the system is guaranteed (what is transmitted is what is received). However, future low-latency systems will exchange information much more quickly (almost in real time) and, hence, exchange packets of only a few hundred bits (a much smaller size), which requires the design of novel correction codes of a shorter length.
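The idea of protecting a message by appending redundant bits can be illustrated with the simplest possible correction code, a 3-fold repetition code; this is a toy example for illustration only, not one of the short codes the project investigates.

```python
import numpy as np

def encode(bits):
    """3-fold repetition code: each information bit is transmitted three times."""
    return np.repeat(bits, 3)

def decode(received):
    """Majority vote over each group of three corrects any single bit error."""
    return (received.reshape(-1, 3).sum(axis=1) >= 2).astype(int)

message = np.array([1, 0, 1, 1])
codeword = encode(message)   # 12 transmitted bits carry 4 information bits
codeword[4] ^= 1             # the channel flips one bit
print(decode(codeword))      # → [1 0 1 1], the error is corrected
```

The price of this reliability is rate: three channel bits per information bit. Designing short codes that pay far less redundancy for the same protection is precisely what makes the low-latency regime hard.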
Put differently, it is like transporting goods in thousands of cars instead of dozens of trucks. For that purpose, it is necessary to design new correction codes that allow the cars to stay on track when there are driving mistakes. “If we have to send many packets, we can decide if we store them in a warehouse and later send all of them in a truck, or if we send the packets one by one in a car,” Koch explained. With the truck, it would take longer because you would have to wait to complete the load, but its advantage is that larger and stronger security systems (correction codes) can be employed because we have more space. In contrast, transportation by car would be faster because each packet could be sent the moment that it arrives at the warehouse, but then codes must be used that are not as strong.
This analogy relates to some applications of this kind of technology. In the future, vehicles will be interconnected wirelessly, inter alia, to avoid accidents. To this end, communication needs to occur in almost real time (with a delay of not more than 10 milliseconds), researchers point out. Furthermore, low-latency wireless communications will be used in 5G networks, and applications can be found in many industrial processes.
This project, which starts on March 1, 2017 and has a duration of five years, will receive funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement number 714161).|
|Finite-length iterative decoding: fundamental limits, practical constructions and inference (FLUID) RETOS 2015; Ministerio de Economía y Competitividad; 2016-2018|
|Shannon's channel capacity establishes the largest information rate at which data can be reliably transmitted over a noisy channel; in this context, reliability is attained by using codes that add redundancy to the information message. A number of code families have recently been shown to perform close to the channel capacity. Among them, low-density parity-check (LDPC) codes have been adopted in many modern standards. The decoders of such codes are based on the belief propagation (BP) principle, an efficient algorithm for solving inference problems by iteratively passing local messages. The current analysis of BP iterative decoding algorithms focuses on their infinite-length performance. However, due to delay and complexity constraints, practical communication schemes transmit information in finite-length packets. Even if a coding scheme can be shown to asymptotically achieve capacity, it may perform far from the theoretical limit at finite blocklength. This can be attributed either to the code itself or to the poor performance of BP when loops in the associated graph shorten as the parity-check matrix becomes denser. At present, there are no theoretical tools characterizing these two effects in a unified manner. In this project, we will provide an information-theoretic analysis of iterative decoding. We will then define design criteria for finite-length generalized LDPC (GLDPC) codes that approach the corresponding fundamental limits. We shall also propose novel beyond-BP decoders based on recent advances in expectation propagation (EP) approximate inference for discrete distributions. Implementation constraints will be taken into account; in particular, we shall consider quantization methods for iterative decoders based on the theories of rate-distortion and mismatched decoding.
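The flavour of iterative decoding on a parity-check graph can be conveyed with a hard-decision bit-flipping decoder, a much simpler relative of BP; the (7,4) Hamming parity-check matrix below is an illustrative stand-in for the LDPC/GLDPC codes studied in the project, not a code from it.

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code (illustrative stand-in).
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def bit_flip_decode(y, H, max_iters=10):
    """Iteratively flip the bit involved in the most unsatisfied parity checks."""
    y = y.copy()
    for _ in range(max_iters):
        syndrome = H @ y % 2          # which parity checks currently fail
        if not syndrome.any():
            return y                  # all checks satisfied: decoding done
        votes = syndrome @ H          # per bit: number of failing checks it touches
        y[np.argmax(votes)] ^= 1      # flip the most suspicious bit
    return y

codeword = np.zeros(7, dtype=int)     # the all-zero word is a valid codeword
received = codeword.copy()
received[2] ^= 1                      # a single channel error
print(bit_flip_decode(received, H))   # recovers the all-zero codeword
```

BP replaces these hard votes with real-valued probability messages exchanged along the same graph, which is where short graph loops at finite blocklength begin to hurt.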
The iterative decoding principle extends to modern receivers, where general soft-input soft-output algorithms for receiver blocks such as multiple-input multiple-output detectors or turbo equalizers play a central role. In this context, optimal solutions are computationally unaffordable and BP fails as an approximate inference approach. We shall extend the novel EP approach developed for channel coding into a soft-input soft-output receiver algorithm coupled to the decoder.
This project aims at building an ambitious theoretical framework for iterative approximate inference with a focus on the finite-length regime. Specific project contributions are the following. First, the theoretical characterization, in terms of tradeoffs between rate, block length, and error probability, of short-length transmission under iterative decoding. Second, original GLDPC coding schemes under state-of-the-art decoding to approach these limits. Third, novel techniques to improve approximate inference in iterative decoders and detectors. And fourth, comprehensive experimental scenarios and toolboxes to evaluate code performance as a trade-off between computational complexity and gap to capacity limits, including realistic implementation constraints.|
|Advanced Bayesian computation methods for estimation, prediction and control in multi-sensor complex systems (ADVENTURE) RETOS 2015; Ministerio de Economía y Competitividad; 2016-2018|
|We have recently witnessed the advent of new technologies that hold promise of great improvements to the well-being of elderly people and individuals who suffer from a number of health conditions. State-of-the-art sensor technology bundled together with light-weight computing and communication devices, for example, potentially enable the monitoring of patients within their homes, with a standard of care that just a few years ago was only possible at special units in hospitals.
Unfortunately, hardware technology alone is not enough to bring all that potential into reality. It guarantees fast and inexpensive access to a wealth of data; yet how to extract knowledge from it, and how to make informed and useful decisions, is a problem of a different nature. There is a pressing demand for models that impose structure on the data, in order to interpret it, and for algorithms that combine those models and the data to estimate key magnitudes, detect ongoing conditions or predict future events. To meet these demands, we advocate a Bayesian approach to statistical inference and learning, which encompasses the tasks of model design, comparison and validation, as well as the development of flexible algorithms that fully exploit the features and structure of the underlying models.|
|Anomalous human behavIour Detection (AID) Explora 2014; Ministerio de Economía y Competitividad; 2015-2017|
|Brain disorders represent an enormous disease burden, in terms of both human suffering and economic cost. Many brain disorders are chronic and incurable conditions that entail impairments in functioning across social, vocational and residential domains. The AID project focuses on two of the most prevalent: schizophrenia and affective disorders (depression and bipolar disorder). Avoiding relapses improves functional performance, but patients with brain disorders are often unaware of their disability or their symptoms. Access to real-life, real-time monitoring of psychiatric patients will make it possible to identify “relapse signatures” and acute symptom triggers, and to objectively monitor the effectiveness and side effects of treatments. Such monitoring provides objective data, overcoming the limitations of self-report (including subjectivity and recall bias) and the shortcomings of external informants, such as limited reliability.
The aim of the AID project is to explore the feasibility of a method for automatically detecting the behavioral change at the beginning of a relapse in schizophrenic or affective disorder patients under ambulatory conditions, using inertial sensors. The desirable characteristics of this method are: 1) to provide interpretable information; 2) to be easy to personalize; 3) to detect behavioral changes on-line; 4) to account for circadian and calendar influences on behavior; 5) to be robust; and 6) to admit a low-complexity implementation.
In AID we propose to develop a Bayesian on-line changepoint detection method, operating over the sequence of activities provided by a human activity classifier, that fulfills all of the above requirements. The development of the methods and the assessment of the feasibility of such a device are based on the monitoring of real patients. The whole approach represents a breakthrough in the care and monitoring of psychiatric patients.|
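The changepoint machinery behind such a detector can be sketched with a minimal version of the Adams-MacKay Bayesian on-line changepoint detection recursion, here over a binary activity indicator with a conjugate Beta-Bernoulli model; the observation model, hazard rate, and synthetic data are illustrative assumptions, not the project's actual activity classifier output.

```python
import numpy as np

def bocpd_bernoulli(x, hazard=1 / 50, a0=1.0, b0=1.0):
    """Adams-MacKay Bayesian on-line changepoint detection for a binary
    stream, with a conjugate Beta(a0, b0)-Bernoulli observation model.
    Returns R, where R[t] is the posterior over the run length at time t."""
    T = len(x)
    R = np.zeros((T + 1, T + 1))
    R[0, 0] = 1.0                                  # start with run length 0
    a, b = np.array([a0]), np.array([b0])          # posterior stats per run length
    for t, xt in enumerate(x):
        # predictive probability of x_t under each candidate run length
        pred = np.where(xt == 1, a / (a + b), b / (a + b))
        R[t + 1, 1:t + 2] = R[t, :t + 1] * pred * (1 - hazard)   # run grows
        R[t + 1, 0] = (R[t, :t + 1] * pred * hazard).sum()       # run resets
        R[t + 1] /= R[t + 1].sum()
        # update the sufficient statistics; a new run restarts at the prior
        a = np.append(a0, a + xt)
        b = np.append(b0, b + (1 - xt))
    return R

# a synthetic activity stream whose rate jumps at t = 100
rng = np.random.default_rng(0)
x = np.concatenate([rng.binomial(1, 0.1, 100), rng.binomial(1, 0.8, 100)])
R = bocpd_bernoulli(x)
print(int(R[-1].argmax()))  # most probable current run length
```

A behavioral change shows up as the run-length posterior collapsing towards short runs shortly after the change, which is the on-line signal a relapse detector would monitor.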
|A new sequential Monte Carlo framework for tracking of nonlinear complex dynamical systems Office of Naval Research; 2015-2017|
|Many problems related to environmental sensing, situation awareness and information fusion boil down to the ability of efficiently tracking complex nonlinear, high-dimensional stochastic dynamical systems. Examples abound, ranging from classical multi-target tracking in battlefield scenarios to weather/environmental forecasting for tactical planning. The algorithms for prediction and tracking of random dynamical systems are collectively termed stochastic filters. Most of these techniques seek numerical approximations, since closed form solutions do not exist for general nonlinear systems. This is the case of particle filters (PFs), which are recursive (online) methods based on statistical Monte Carlo principles. While PFs can be applied to any dynamical system, they are often criticized as computationally heavy and inefficient in high-dimensional models, precisely because of their reliance on Monte Carlo integration. However, although a number of deterministic methods have been recently proposed (e.g., cubature, optimal transportation or deterministic flow filters) as potential replacements of PFs, none of them has effectively overcome the dimensionality/complexity problem yet. In this proposal, we advocate the development of a new particle filtering framework (including an extended methodological setting and the theoretical tools for its analysis) that still has sequential Monte Carlo integration at its core but is endowed with a number of features that address directly the key issues of dimensionality and complexity.
Such features include the partitioning of high-dimensional state spaces (a divide and conquer approach), the prevention of the degeneracy phenomenon in importance samplers and the ‘automatic stabilization’ of the tracker. We aim at developing both the methodological and the theoretical aspects of the new framework, and at applying the resulting algorithms to selected problems related to the tracking of multiple and/or complex targets. The design of new and efficient nonlinear trackers for multiple and/or complex targets is relevant to several focus areas of the US Naval Science & Technology Strategic Plan, including, at least, Assure Access to the Maritime Battlespace (focus area #1), Autonomy and Unmanned Systems (f. a. #2) and Expeditionary and Irregular Warfare (f. a. #3). We will specifically address the application of the new methodology to two problems: the joint tracking of a large number of targets and the forecasting of complex meteorological phenomena for tactical planning.|
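The sequential Monte Carlo principle at the core of PFs can be illustrated with a minimal bootstrap particle filter for a toy scalar state-space model; the model, noise levels, and multinomial resampling scheme are illustrative choices, not the high-dimensional framework proposed in the project.

```python
import numpy as np

rng = np.random.default_rng(1)

def bootstrap_pf(ys, n_particles=500):
    """Minimal bootstrap particle filter for the toy scalar model
       x_t = 0.9 x_{t-1} + N(0, 0.5^2),   y_t = x_t + N(0, 0.5^2)."""
    x = rng.normal(0.0, 1.0, n_particles)                 # initial particle cloud
    estimates = []
    for y in ys:
        x = 0.9 * x + rng.normal(0.0, 0.5, n_particles)   # propagate particles
        w = np.exp(-0.5 * ((y - x) / 0.5) ** 2)           # likelihood weights
        w /= w.sum()
        estimates.append(np.dot(w, x))                    # posterior-mean estimate
        x = x[rng.choice(n_particles, n_particles, p=w)]  # resampling counters degeneracy
    return np.array(estimates)

# simulate a trajectory from the model and filter the noisy observations
T = 100
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t - 1] + rng.normal(0.0, 0.5)
ys = x_true + rng.normal(0.0, 0.5, T)
est = bootstrap_pf(ys)
```

In high dimensions the weights of such a filter degenerate rapidly (most particles receive negligible weight), which is exactly the failure mode the proposed partitioning and stabilization features target.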
|Intelligent Systems: Concepts and Applications (CASI-CAM-CM) Comunidad de Madrid; 2014-2018|
|The relevance of Intelligent Systems (IS) and Machine Learning (ML) becomes obvious when one considers the applications they must support; many of these, such as Smart Cities and eHealth, are here to stay owing to social and sustainability requirements, and they demand innovative massive data processing oriented to interdisciplinary fields. This project counters the fragmentation of knowledge on IS/ML by addressing some of their most important areas in an integrated and cooperative way, together with necessarily interdisciplinary applications, in pursuit of a set of general objectives.|
|Overhead-Throughput-Optimal Signaling Schemes for Next Generation Wireless Networks (OTOSiS) Retos2013; Ministerio de Economía y Competitividad; 2014-2016|
|The surge in the use of broadband services combined with the growth of machine-type communication poses high demands on future wireless networks, from the core (backhaul) to the periphery (cellular base stations). Network operators predict that network throughput will have to increase by two to three orders of magnitude by 2020 to match future demands. The required throughput gains can be achieved only by a denser and heterogeneous deployment of the wireless network infrastructure. Under this scenario, interference management, which requires the exchange of control information, becomes crucial.
While the control overhead due to exchange of control information in currently deployed wireless networks is negligible, the situation is different in the envisaged dense and heterogeneous communication networks. In fact, in certain scenarios, the control-information overhead may outweigh the potential throughput gains. Hence, control information appears to be the actual bottleneck towards the achievement of the throughput objectives of future broadband networks.
This project advocates a paradigm shift in the way control information is treated in wireless networks. Our philosophy is to view the amount of overhead due to control information as a crucial metric to assess the optimality of physical layer schemes, rather than just an accessory. We will investigate the cost of acquiring network knowledge in dense heterogeneous networks from a fundamental perspective, taking into account the latency constraints associated with different traffic types. By studying the information-theoretic limits of wireless networks, we will be able to describe their fundamental overhead-throughput-latency tradeoff. Using these limits, system designers will be able to perform a global wireless network optimization, thereby achieving unparalleled throughput and energy efficiency. We will further propose physical layer signaling schemes that optimally trade overhead, throughput, and latency.|
|Towards an Efficient Mobile Internet (MobileNET) European Commission; Marie Curie Career Integration Grant; 2013-2017|
|It is expected that, very soon, the Internet will connect billions of mobile device users. This places high demands on the communications infrastructure and on the mobile devices. To explore how to use the resources in future communication networks in the most efficient way, we will study the information-theoretic limits of communication networks and suggest communication strategies that attain those limits. Such limits have been studied extensively in the past, but the vast majority of the work has made simplifying assumptions, such as that the nodes are perfectly synchronized, have perfect channel state information, or use infinitely long codewords, which are not realistic for wireless and dynamic networks. In contrast, we will derive realistic fundamental limits by including asynchronism, noncoherence, and limited codeword length in the analysis. A related topic addressed in this project is the design of mobile devices. Using tools from information theory, we will study the fundamental tradeoff between performance, robustness against nonlinearities in the devices, and implementation complexity, aiming at novel encoding and decoding algorithms that can be implemented in hardware.|
|Computational Inference in High Dimensional Random Complex Systems (COMPREHENSION) Ministerio de Economía y Competitividad; 2013-2015|
|The term “complex system” is often used to describe a network of elementary units whose collective behavior depends not only on the features of these constituent blocks but also, and especially, on their interactions. The seminal work by Watts and Strogatz (Nature, 1998) sparked tremendous interest in this kind of large-scale system. Currently, there exists a wealth of engineering and scientific problems to be addressed related to the modeling, prediction and control of complex systems. To narrow the focus, here we investigate dynamic, high-dimensional and random complex systems, and we aim at developing new methodologies for computational inference which are both theoretically sound and practically effective in this setup. While advances in the theoretical and methodological field are of utmost importance, we also pursue practical applications of the new methods. The most ambitious goal is the modeling of atrial fibrillation (AF) in the human heart; we also investigate relevant problems related to wireless communications and sensor networks (WCSNs), including collaborative routing and the distributed implementation of statistical signal processing methods on multi-hop WCSNs. The third axis on which we move from theory to applications deals with environmental applications.|
|Advances in Learning, Communications and Information Theory (ALCIT) Ministerio de Economía y Competitividad; 2013-2015|
|With current technology trends, communication networks are evolving towards ever more complex systems consisting of a large number of heterogeneous nodes that enjoy enhanced capabilities for sensing, storing, processing and transmitting data in many sophisticated forms. This project deals with two important aspects of the design and analysis of these networks. We first study the information-theoretic limits of the aforementioned networks. Networks typically process and transmit information in the form of packets of finite duration in order to accommodate system constraints on delay and complexity. On the other hand, information theory ignores these constraints and finds the ultimate limits of network compression and communication for packets of infinite duration. This is clearly unfeasible in practice and hence, in this project, we will focus on the derivation of finite-length information-theoretic results for our networks of interest. We will also focus on the design of practically implementable compression and communication systems that approach those limits. Secondly, we need to learn from the data captured from these devices, which are constantly monitoring a changing, diverse and complex environment. In this scenario, parametric models will typically fail, because they cannot deal with the richness and complexity of real world data. We will rely on nonparametric models that can adjust the complexity of their solution to that of the data and describe the richness of the world that surrounds them. Our main goal is to advance towards the solution of a number of fundamental problems that arise in this scenario, both in terms of new formal methodologies and numerical techniques and in the demonstration of their validity by means of an adequate hardware platform. We will apply the obtained results in machine learning and finite-length information theory to a relevant psychiatric problem: the remote registration of patients’ activities.
We will build two demonstrators using currently available hardware and software elements as an intermediate stage to the building of the final system.|
|Machine Learning for Personalized Medicine (MLPM) European Commission; Marie Curie Actions; 2013-2017|
|MLPM is a Marie Curie Initial Training Network, funded by the European Union within the 7th Framework Programme. MLPM started on January 1, 2013 and will be carried out over a period of four years. MLPM is a consortium of several universities, research institutions and companies located in Spain, France, Germany, Belgium, the UK, and the USA. MLPM involves the predoctoral training of 14 young scientists in the research field at the interface of Machine Learning and Medicine. Its goal is to educate interdisciplinary experts who will develop and employ the computational and statistical tools that are necessary to enable personalized medical treatment of patients according to their genetic and molecular properties, and who are aware of the scientific, clinical and industrial implications of this research.|
|Environment and Genes in Schizophrenia (AGES) Comunidad de Madrid; 2012-2016|
|Mental disorders, as a whole, are the leading cause of disability world-wide. In fact, mental health costs represent between 3 and 4% of the gross domestic product in the EU, and mental disorders are the leading cause of early retirement and disability pensions in our society.
In addition, mental diseases take a heavy toll that is not just financial, but also societal, in terms of their burden on individuals and their families. Schizophrenia has been described as the most devastating psychiatric disorder, with a prevalence of 1% in the general population. It is estimated that more than 50,000 people in Madrid experience schizophrenia during their lifetime.
The AGES-CM consortium (Ambiente y Genes en Esquizofrenia – Grupos de Investigación de la Comunidad de Madrid) is formed by leading schizophrenia research groups from the Comunidad de Madrid (CM). These groups lead the field of schizophrenia research in Spain and at the international level in many aspects of the disease.|
Completed Research Projects (2008-2014)
|Foundations and Methodologies for Future Communication and Sensor Networks (COMONSENS) Ministerio de Ciencia e Innovación (Consolider-Ingenio 2010); 2008-2014|
|New Computational Inference Methods for Spatial Dynamical Models Ministerio de Educación, Cultura y Deporte; 2012-2013|
|Analysis, Design and Optimization of Next Generation Wireless Communications Systems Ministerio de Ciencia e Innovación; 2010-2012|
|Estimation, Transmission and Optimization in Sensor Networks (ETORS) Comunidad Autónoma de Madrid and Universidad Carlos III de Madrid; 2011|
|Distributed Learning, Communication and Information Processing (DEIPRO) Ministerio de Ciencia e Innovación; 2009-2012|
|Ubiquitous Service Architecture for the Mobile Super Prosumer (uSERVICE) Ministerio de Industria, Turismo y Comercio – Plan Avanz@; 2009-2010|
|Intelligent Intermodal Freight Transport (TIMI) CDTI (programa CENIT); 2007-2010|
|Smart Monitorization (MONIN) Ministerio de Educación y Ciencia; 2007-2009|
|Consortium for the Development of Advanced Technologies for Medicine (CDTEAM) CDTI (CENIT Consortium); 2006-2010|
|Approximate Inference for Communications Marie Curie Outgoing Fellowship (FP6 – European Union); 2006-2009|
|Multimedia Distributed Processing (PRO-MULTIDIS-CM) Comunidad Autónoma de Madrid; 2006-2009|
|Efficient Multimedia Communications Enabled by Advanced Learning Algorithms (CREMA3) Ministerio de Educación y Ciencia; 2006-2008|