4,903 research outputs found
Electroactive micro and nanowells for optofluidic storage
This paper reports an optofluidic architecture which enables reversible trapping, detection and long-term storage of spectrally multiplexed semiconductor quantum dot cocktails in electrokinetically active wells ranging in size from 200nm to 5µm. Here we describe the microfluidic delivery of these cocktails, the fabrication method and principle of operation of the wells, and characterize the readout capabilities, storage and erasure speeds, internal spatial signal uniformity and potential storage density of the devices. We report storage and erase speeds of less than 153ms and 30ms respectively, and the ability to provide 6-bit storage in a single 200nm well through spectral and intensity multiplexing. Furthermore, we present a novel method for enabling passive long-term storage of the quantum dots in the wells by transporting them through an agarose gel matrix. We envision that this technique could find eventual application in fluidic memory or display devices.
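A back-of-envelope reading of the 6-bit multiplexing claim (the abstract does not give channel or intensity-level counts, so the figures below are purely illustrative assumptions):

    \text{bits per well} = N_\lambda \log_2 L, \qquad
    \text{e.g. } N_\lambda = 3 \text{ spectral channels},\ L = 4 \text{ intensity levels} \;\Rightarrow\; 3 \times \log_2 4 = 6 \text{ bits}.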
Effects of variations of load distribution on network performance
This paper is concerned with the characterization of the relationship between topology and traffic dynamics. We use a model of network generation that allows the transition from random to scale-free networks. Specifically, we consider three different topological types of network: random, scale-free with \gamma = 3, and scale-free with \gamma = 2. By using a novel LRD traffic generator, we observe the best performance, in terms of transmission rates and delivered packets, in the case of random networks. We show that, even though scale-free networks are characterized by a shorter characteristic path length (the lower the exponent, the lower the path length), they show worse performance in terms of communication. We conjecture that this could be explained in terms of changes in the load distribution, defined here as the number of shortest paths going through a given vertex. In fact, that distribution is characterized by (i) a decreasing mean and (ii) an increasing standard deviation as the network becomes scale-free (especially for scale-free networks with low exponents). The use of a degree-independent server also discriminates against a scale-free structure. As a result, since the model is uncontrolled, most packets will go through the same vertices, favoring the onset of congestion.
Comment: 4 pages, 4 figures, included in conference proceedings ISCAS 2005, Kobe, Japan
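A minimal sketch of the load-distribution comparison described above (not the authors' network generator: it uses networkx graphs as stand-ins, unnormalized betweenness centrality as a proxy for the number of shortest paths through a vertex, and a Barabasi-Albert graph that only approximates \gamma = 3):

    # Sketch: compare the "load" (shortest-path) distribution on a random vs.
    # an approximately scale-free graph. Uses networkx; not the authors' model.
    import statistics
    import networkx as nx

    n = 1000
    nets = {
        "random": nx.erdos_renyi_graph(n, p=6.0 / n, seed=1),
        "scale-free (~gamma=3)": nx.barabasi_albert_graph(n, m=3, seed=1),
    }
    for name, g in nets.items():
        # Unnormalized betweenness centrality as a proxy for per-vertex load.
        load = list(nx.betweenness_centrality(g, normalized=False).values())
        print(f"{name:>22}: mean load {statistics.mean(load):9.1f}, "
              f"std {statistics.stdev(load):9.1f}")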
Non-gaussianity of the critical 3d Ising model
We discuss the 4pt function of the critical 3d Ising model, extracted from recent conformal bootstrap results. We focus on the non-gaussianity Q, the ratio of the 4pt function to its gaussian part given by three Wick contractions. This ratio reveals significant non-gaussianity of the critical fluctuations. The bootstrap results are consistent with a rigorous inequality due to Lebowitz and Aizenman, which limits Q to lie between 1/3 and 1.
Comment: 10 pages, 6 figures; v2: refs added; v3: refs updated, published version; v4: acknowledgement added
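Written out in our notation (restating the definition given in the abstract, with \sigma the order-parameter field and \sigma_i \equiv \sigma(x_i)):

    Q = \frac{\langle \sigma_1 \sigma_2 \sigma_3 \sigma_4 \rangle}
             {\langle \sigma_1\sigma_2\rangle\langle\sigma_3\sigma_4\rangle
              + \langle \sigma_1\sigma_3\rangle\langle\sigma_2\sigma_4\rangle
              + \langle \sigma_1\sigma_4\rangle\langle\sigma_2\sigma_3\rangle},
    \qquad \tfrac{1}{3} \le Q \le 1,

where the denominator is the gaussian (Wick) part and the bound is the Lebowitz-Aizenman inequality quoted above.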
Communication models with distributed transmission rates and buffer sizes
The paper is concerned with the interplay between network structure and traffic dynamics in a communications network, from the viewpoint of end-to-end performance of packet transfer. We use a model of network generation that allows the transition from random to scale-free networks. Specifically, we are able to consider three different topological types of network: (a) random; (b) scale-free with \gamma=3; (c) scale-free with \gamma=2. We also use an LRD traffic generator in order to reproduce the fractal behavior that is observed in real-world data communication. We address the issue of how traffic behavior on the network is influenced by two variable factors: the transmission rates and the queue-length restrictions at the network vertices. We show that these factors can induce drastic changes in network performance, in terms of throughput and delivery time, and are able to counter-balance some undesirable effects due to the topology.
Comment: 4 pages, 5 figures, IEEE Symposium on Circuits and Systems, Island of Kos, Greece, 200
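A minimal sketch of the two per-vertex factors the paper varies, transmission rate and queue-length (buffer) restriction; this is not the authors' simulator, and the class and parameter names are hypothetical:

    # Sketch of a network vertex with a finite buffer and a per-step
    # transmission rate; packets arriving at a full buffer are dropped.
    from collections import deque

    class VertexQueue:
        def __init__(self, buffer_size: int, rate: int):
            self.buffer = deque()
            self.buffer_size = buffer_size   # queue-length restriction
            self.rate = rate                 # packets forwarded per time step
            self.dropped = 0

        def enqueue(self, packet):
            if len(self.buffer) < self.buffer_size:
                self.buffer.append(packet)
            else:
                self.dropped += 1            # buffer overflow: packet lost

        def step(self):
            # Forward at most `rate` packets in this time step.
            return [self.buffer.popleft()
                    for _ in range(min(self.rate, len(self.buffer)))]

    q = VertexQueue(buffer_size=8, rate=2)
    for pkt in range(12):                    # a burst of 12 arrivals
        q.enqueue(pkt)
    print(len(q.step()), "sent,", q.dropped, "dropped")   # -> 2 sent, 4 dropped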
Modelling and Forecasting Dynamic VaR Thresholds for Risk Management and Regulation
The paper presents methods of estimating Value-at-Risk (VaR) thresholds utilising two calibrated models and three conditional volatility or GARCH models. These are used to estimate and forecast the VaR thresholds of an equally-weighted portfolio comprising the S&P500, CAC40, FTSE100 and the Swiss Market Index (SMI). On the basis of the number of (non-)violations of the Basel Accord thresholds, the best performing model is PS-GARCH, followed by VARMA-AGARCH, then Portfolio-GARCH and the RiskMetrics-EWMA models, both of which would attract a penalty of 0.5. The worst forecasts are obtained from the standard normal method based on historical variances.
Keywords: Value-at-Risk (VaR) modelling, forecasting risk thresholds, Portfolio Spillover-GARCH, risk management and regulation
Acknowledgements: The authors wish to thank Felix Chan, Suhejla Hoti, Alex Zsimayer and seminar participants at the Institute of Economics, Academia Sinica, Taiwan, Ling Tung Institute of Technology, Griffith University, Queensland University of Technology, and University of Queensland for helpful comments and suggestions. The first and second authors wish to thank the Australian Research Council for financial support. The third author wishes to acknowledge a University Postgraduate Award and an International Postgraduate Research Scholarship at the University of Western Australia.
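A minimal sketch of the RiskMetrics-style EWMA variance recursion behind one of the benchmarked methods (standard textbook form; lambda = 0.94 and the 1% level are conventional defaults, not parameters taken from the paper, and the data below are placeholders):

    # Sketch: one-step-ahead VaR threshold from an EWMA variance forecast.
    import numpy as np
    from scipy.stats import norm

    def ewma_var_threshold(returns, lam=0.94, alpha=0.01):
        var = np.var(returns[:30])                # seed the recursion
        for r in returns[30:]:
            var = lam * var + (1.0 - lam) * r**2  # EWMA variance update
        return norm.ppf(alpha) * np.sqrt(var)     # negative: loss threshold

    rng = np.random.default_rng(0)
    daily_returns = rng.normal(0.0, 0.01, size=500)   # placeholder returns
    print(f"1-day 99% VaR threshold: {ewma_var_threshold(daily_returns):.4f}")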
- …