
    Signal Processing in Large Systems: a New Paradigm

    For a long time, detection and parameter estimation methods for signal processing have relied on asymptotic statistics as the number n of observations of a population grows large relative to the population size N, i.e. n/N → ∞. Modern technological and societal advances now demand the study of sometimes extremely large populations and simultaneously require fast signal processing due to accelerated system dynamics. This results in not-so-large practical ratios n/N, sometimes even smaller than one. A disruptive change in classical signal processing methods has therefore been initiated in the past ten years, mostly spurred by the field of large dimensional random matrix theory. The early works in random matrix theory for signal processing applications are, however, scarce and highly technical. This tutorial provides an accessible methodological introduction to the modern tools of random matrix theory and to the signal processing methods derived from them, with an emphasis on simple illustrative examples.
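    The regime change the abstract contrasts can be seen in a few lines of NumPy (an illustrative sketch, not taken from the tutorial itself): when n is much larger than N, the eigenvalues of the sample covariance of white data concentrate near 1, while for moderate n/N they spread across the Marchenko-Pastur support even though the true covariance is the identity.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_eig_spread(N, n):
    """Smallest and largest eigenvalue of the sample covariance of n i.i.d.
    standard Gaussian observations in dimension N (true covariance = I)."""
    X = rng.standard_normal((N, n))
    S = X @ X.T / n                      # sample covariance estimate
    eig = np.linalg.eigvalsh(S)
    return eig.min(), eig.max()

# Classical regime n >> N: eigenvalues concentrate near 1.
print(sample_eig_spread(10, 100000))
# Modern regime n/N = 2: eigenvalues spread over roughly
# [(1 - sqrt(N/n))^2, (1 + sqrt(N/n))^2], the Marchenko-Pastur support.
print(sample_eig_spread(500, 1000))
```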

    A Nonlinear Super-Exponential Rational Model of Speculative Financial Bubbles

    Keeping a basic tenet of economic theory, rational expectations, we model the nonlinear positive feedback between agents in the stock market as an interplay between nonlinearity and multiplicative noise. The derived hyperbolic stochastic finite-time singularity formula transforms a Gaussian white noise into a rich time series possessing all the stylized facts of empirical prices, as well as accelerated speculative bubbles preceding crashes. We use the formula to invert the two years of price history prior to the recent crash on the Nasdaq (April 2000) and prior to the crash in the Hong Kong market associated with the Asian crisis in early 1994. These complex price dynamics are captured using only one exponent controlling the explosion, together with the variance and mean of the underlying random walk. This offers a new and powerful detection tool for speculative bubbles and herding behavior.
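    A minimal numerical sketch of the mechanism (not the authors' exact singularity formula; the drift mu, noise level sigma, and exponent m below are illustrative assumptions): a power-law drift with exponent m > 1 and matching multiplicative noise produce faster-than-exponential growth that reaches a finite-time singularity.

```python
import numpy as np

rng = np.random.default_rng(1)

def bubble_path(p0=1.0, mu=0.5, sigma=0.3, m=1.5, dt=1e-3, steps=5000):
    """Euler scheme for dp = mu * p**m dt + sigma * p**m dW with m > 1:
    the super-exponential drift drives a finite-time singularity, which the
    multiplicative noise decorates with bubble-like excursions."""
    path = [p0]
    for _ in range(steps):
        p = path[-1]
        if p <= 0 or p > 1e6:      # stop: crash through zero or singularity reached
            break
        dW = rng.normal(0.0, np.sqrt(dt))
        path.append(p + mu * p**m * dt + sigma * p**m * dW)
    return np.array(path)

path = bubble_path()
```

The deterministic part alone (sigma = 0) already explodes: dp/dt = 0.5 p^1.5 with p(0) = 1 blows up at t = 4, inside the simulated horizon of 5.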

    Dynamics of Random Early Detection Gateway under a Large Number of TCP Flows

    While active queue management (AQM) mechanisms such as Random Early Detection (RED) are widely deployed in the Internet, they are rarely utilized, or otherwise poorly configured. The problem stems from the lack of a tractable analytical framework that captures the interaction between the TCP congestion-control and AQM mechanisms. Traditional TCP traffic modeling has focused on "micro-scale" modeling of TCP, i.e., detailed modeling of a single TCP flow. While micro-scale models of TCP are suitable for understanding the precise behavior of an individual flow, they are not well suited to the situation where a large number of TCP flows interact with each other, as is the case in realistic networks. In this dissertation, an innovative approach to TCP traffic modeling is proposed by considering the regime where the number of TCP flows competing for the bandwidth in the bottleneck RED gateway is large. In the limit, the queue size and the aggregate TCP traffic can be approximated by simple recursions that are independent of the number of flows. The limiting model is therefore scalable, as it does not suffer from state-space explosion. The steady-state queue length and window distribution can be evaluated from well-known TCP models. We also extend the analysis to a more realistic model which incorporates session-level dynamics and heterogeneous round-trip delays. Typically, ad-hoc assumptions are required to make the analysis for models with session-level dynamics tractable in a given regime. In contrast, our limiting model derived here is compatible with other previously proposed models in their respective regimes without having to rely on ad-hoc assumptions. The contributions from these additional layers of dynamics to the asymptotic queue are now crisply revealed through the limit theorems. Under mild assumptions, we show that the steady-state queue size depends on the file size and round-trip delay only through their mean values.
We obtain a more accurate description of the queue dynamics by means of a Central Limit analysis, which identifies an interesting relationship between the queue fluctuations and the random packet-marking mechanism in AQM. The analysis also reveals the dependency of the magnitude of the queue fluctuations on the variability of the file size and round-trip delay. Simulation results supporting conclusions drawn from the limit theorems are also presented.
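    The flavor of such a limiting recursion can be sketched as follows (a toy deterministic model with assumed parameters, not the dissertation's exact equations): one update per round-trip time couples the expected AIMD window, the aggregate queue balance, and RED's exponentially weighted queue average, with no dependence on individual flow state.

```python
def red_prob(avg, min_th=50.0, max_th=150.0, p_max=0.1):
    """RED's piecewise-linear marking probability on the averaged queue."""
    if avg < min_th:
        return 0.0
    if avg > max_th:
        return 1.0
    return p_max * (avg - min_th) / (max_th - min_th)

def simulate(rounds=200, n_flows=100, capacity=500.0, wq=0.1):
    """One deterministic recursion per round-trip time: expected AIMD
    window update, aggregate queue balance, and RED's EWMA of the queue."""
    w, q, avg = 1.0, 0.0, 0.0
    hist = []
    for _ in range(rounds):
        p = red_prob(avg)
        w = (1 - p) * (w + 1) + p * (w / 2)       # mean-field AIMD update
        q = max(q + n_flows * w - capacity, 0.0)  # arrivals minus service
        avg = (1 - wq) * avg + wq * q             # RED's averaged queue
        hist.append(q)
    return hist

queue = simulate()
```

Note that the recursion's state (w, q, avg) has fixed size, regardless of n_flows, which is the scalability property the abstract emphasizes.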

    Detecting the Influence of Spreading in Social Networks with Excitable Sensor Networks

    Detecting spreading outbreaks in social networks with sensors is of great significance in applications. Inspired by the mechanism by which the human body forms physical sensations in response to external stimuli, we propose a new method to detect the influence of spreading by constructing excitable sensor networks. Exploiting the amplifying effect of excitable sensor networks, our method can better detect small-scale spreading processes. At the same time, it can also distinguish large-scale diffusion instances thanks to the self-inhibition effect of excitable elements. Through simulations of diverse spreading dynamics on typical real-world social networks (Facebook, coauthor, and email social networks), we find that the excitable sensor networks are capable of detecting and ranking spreading processes across a much wider range of influence than other commonly used sensor placement methods, such as random, targeted, acquaintance, and distance strategies. In addition, we validate the efficacy of our method with diffusion data from a real-world online social system, Twitter. We find that our method can detect more spreading topics in practice. Our approach provides a new direction in spreading detection and should be useful for designing effective detection methods.
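    A toy version of the idea (assumed dynamics for illustration, not the paper's model): each sensor is a three-state excitable unit (resting → excited → refractory → resting). Excitation spreads to resting neighbors, amplifying weak signals, while the refractory stage provides the self-inhibition that caps the response to large outbreaks; the number of sensors that ever fire serves as the influence score.

```python
import random

REST, EXCITED, REFRACTORY = 0, 1, 2

def sensor_response(adjacency, seeds, steps=20, p_excite=0.5, rng_seed=0):
    """Run excitable dynamics seeded by a spreading process and return the
    number of sensors that ever fired -- a proxy for spreading influence."""
    rng = random.Random(rng_seed)
    state = {v: REST for v in adjacency}
    for s in seeds:
        state[s] = EXCITED
    fired = set(seeds)
    for _ in range(steps):
        nxt = {}
        for v, st in state.items():
            if st == EXCITED:
                nxt[v] = REFRACTORY          # self-inhibition after firing
            elif st == REFRACTORY:
                nxt[v] = REST                # recover, become excitable again
            else:
                excited_nb = any(state[u] == EXCITED for u in adjacency[v])
                nxt[v] = EXCITED if excited_nb and rng.random() < p_excite else REST
        state = nxt
        fired.update(v for v, st in state.items() if st == EXCITED)
    return len(fired)

# A 10-node ring: even a single seed can propagate excitation around it.
ring = {i: [(i - 1) % 10, (i + 1) % 10] for i in range(10)}
print(sensor_response(ring, seeds=[0]))
```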

    Dynamic weight parameter for the Random Early Detection (RED) in TCP networks

    This paper presents the Weighted Random Early Detection (WTRED) strategy for congestion handling in TCP networks. WTRED provides an adjustable weight parameter that increases the sensitivity of the average queue size in RED gateways to changes in the actual queue size. This modification to the original RED proposal helps gateways minimize the mismatch between average and actual queue sizes in router buffers. WTRED is compared with the RED and FRED strategies using the NS-2 simulator. The results suggest that WTRED outperforms RED and FRED. Network performance has been measured using throughput, link utilization, packet loss, and delay.
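    The mechanism can be sketched as follows (the weight-adaptation rule below is an assumption for illustration; the paper's exact formula may differ): RED smooths the queue with a fixed-weight EWMA, so the average lags badly after a step change, whereas a weight that grows with the mismatch between average and actual queue tracks the change much faster.

```python
def wtred_avg(queue_samples, w_min=0.002, w_max=0.2):
    """EWMA of the instantaneous queue whose weight grows with the
    normalized gap between average and actual queue size (an assumed
    adaptation rule for illustration)."""
    avg, out = 0.0, []
    for q in queue_samples:
        gap = abs(q - avg) / (abs(q) + abs(avg) + 1e-9)  # mismatch in [0, 1)
        w = w_min + (w_max - w_min) * gap                # larger gap -> heavier weight
        avg = (1 - w) * avg + w * q
        out.append(avg)
    return out

# Step change in the queue: the adaptive weight tracks it far faster
# than a fixed small RED weight.
samples = [0.0] * 10 + [100.0] * 50
adaptive = wtred_avg(samples)
fixed = wtred_avg(samples, w_min=0.002, w_max=0.002)     # plain RED weight
```

Setting w_min = w_max recovers an ordinary fixed-weight RED average, which makes the comparison direct.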

    Processing asymmetry of transitions between order and disorder in human auditory cortex

    Purpose: To develop an algorithm that resolves intrinsic problems with dose calculations using pencil beams when the particles in a beam overreach a lateral density interface or detour in a laterally heterogeneous medium. Method and Materials: A property of the Gaussian distribution, namely that it can be approximately decomposed into multiple narrower, shifted, and scaled Gaussians, was applied to dynamic splitting of pencil beams implemented in a dose calculation algorithm for proton and ion beams. The method was tested in an experiment with a range-compensated carbon-ion beam. Its effectiveness and efficiency were evaluated for carbon-ion and proton beams in a heterogeneous phantom model. Results: The splitting dose calculation reproduced the detour effect observed in the experiment, which amounted to about 10% at maximum, as large as the lateral particle-disequilibrium effect. The proton-beam dose generally showed large scattering effects, including the overreach and detour effects. The overall computational times were 9 s and 45 s for non-splitting and splitting carbon-ion beams, and 15 s and 66 s for non-splitting and splitting proton beams. Conclusions: The beam-splitting method was developed and verified to resolve the intrinsic size limitation of the Gaussian pencil-beam model in dose calculation algorithms. The computational speed slowed down by a factor of 5, which would be tolerable for a dose accuracy improvement of up to 10% in our test case. AAPM Annual Meeting 200
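    The decomposition the abstract relies on can be demonstrated numerically (an illustrative sketch, not the paper's dose algorithm; the widths and sample counts are assumptions): by the convolution identity N(0, σ²) = N(0, σ² − σ_r²) * N(0, σ_r²), a wide Gaussian equals a weighted superposition of narrower, shifted, scaled Gaussians, with weights sampled from the displacement kernel.

```python
import numpy as np

def gauss(x, mu, sigma):
    """Normalized Gaussian density."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def split_gaussian(x, sigma=2.0, sigma_sub=1.0, n_sub=15):
    """Approximate a wide Gaussian as a weighted sum of narrower, shifted
    Gaussians via N(0, sigma^2) = N(0, sigma_d^2) * N(0, sigma_sub^2),
    sampling the displacement kernel at n_sub points."""
    sigma_d = np.sqrt(sigma**2 - sigma_sub**2)       # displacement spread
    centers = np.linspace(-4 * sigma_d, 4 * sigma_d, n_sub)
    dmu = centers[1] - centers[0]
    weights = gauss(centers, 0.0, sigma_d) * dmu     # quadrature weights
    return sum(w * gauss(x, mu, sigma_sub) for w, mu in zip(weights, centers))

x = np.linspace(-8.0, 8.0, 161)
err = np.max(np.abs(split_gaussian(x) - gauss(x, 0.0, 2.0)))
print(err)
```

The residual error comes only from truncating and discretizing the displacement kernel; in beam splitting, the narrower components can then be transported individually through laterally heterogeneous media.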