Exploration of the functional consequences of fixational eye movements in the absence of a fovea.
A recent theory posits that ocular drifts of fixational eye movements serve to reformat the visual input of natural images, so that the power of the input image is equalized across a range of spatial frequencies. This "spectral whitening" effect is postulated to improve the processing of high-spatial-frequency information and requires normal fixational eye movements. Given that people with macular disease exhibit abnormal fixational eye movements, do they also exhibit spectral whitening? To answer this question, we computed the power spectral density of movies of natural images translated in space and time according to the fixational eye movements (thus simulating the retinal input) of a group of observers with long-standing bilateral macular disease. Just as for people with normal vision, the power of the retinal input at low spatial frequencies was lower than that predicted by the 1/f^2 relationship, demonstrating spectral whitening. However, the amount of whitening was much less for observers with macular disease when compared with age-matched controls with normal vision. A mediation analysis showed that the eccentricity of the preferred retinal locus adopted by these observers and the characteristics of ocular drifts are important factors limiting the amount of whitening. Finally, we did not find a normal aging effect on spectral whitening. Although these findings alone cannot form a causal link between macular disease and spectral properties of eye movements, they suggest novel potential means of modifying the characteristics of fixational eye movements, which may in turn improve functional vision for people with macular disease.
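The 1/f^2 power spectrum of natural images, against which whitening is measured, is easy to illustrate numerically. The sketch below is illustrative only, not the authors' analysis pipeline: it synthesizes a random-phase image whose amplitude spectrum falls off as 1/f and confirms that its radially averaged power spectral density follows roughly a 1/f^2 slope.

```python
import numpy as np

def radial_psd(img):
    """Radially averaged power spectral density of a square 2-D image."""
    spec = np.fft.fftshift(np.fft.fft2(img))
    power = np.abs(spec) ** 2
    n = img.shape[0]
    y, x = np.indices(img.shape)
    r = np.hypot(x - n // 2, y - n // 2).astype(int)
    sums = np.bincount(r.ravel(), weights=power.ravel())
    counts = np.bincount(r.ravel())
    return sums / counts  # mean power in each integer-frequency bin

def synth_one_over_f2(n=256, seed=0):
    """Random-phase image whose power spectrum falls off as 1/f^2."""
    rng = np.random.default_rng(seed)
    fx = np.fft.fftfreq(n)[:, None]
    fy = np.fft.fftfreq(n)[None, :]
    f = np.hypot(fx, fy)
    f[0, 0] = 1.0  # avoid dividing by zero at DC
    amplitude = 1.0 / f  # amplitude ~ 1/f  =>  power ~ 1/f^2
    phase = rng.uniform(0.0, 2.0 * np.pi, (n, n))
    return np.real(np.fft.ifft2(amplitude * np.exp(1j * phase)))

img = synth_one_over_f2()
psd = radial_psd(img)
k = np.arange(1, 100)  # skip DC, stay well below the Nyquist corner
slope = np.polyfit(np.log(k), np.log(psd[1:100]), 1)[0]
print(round(slope, 1))  # log-log slope close to -2
```

Translating such a movie frame-by-frame along recorded drift trajectories, and comparing the resulting spatiotemporal power spectrum to this baseline, is the kind of computation the abstract describes.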
The anatomy of the Gunn laser
A monopolar GaAs Fabry–Pérot cavity laser based on the Gunn effect is studied both experimentally and theoretically. The light emission occurs via the band-to-band recombination of impact-ionized excess carriers in the propagating space-charge (Gunn) domains. The electroluminescence spectrum from the cleaved end-facet emission of devices with Ga1−xAlxAs (x = 0.32) waveguides clearly shows a preferential mode at a wavelength around 840 nm at T = 95 K. The threshold laser gain is assessed by using an impact ionization coefficient resulting from excess carriers inside the high-field domain.
Roughness effects in turbulent forced convection
We conducted direct numerical simulations (DNSs) of turbulent flow over three-dimensional sinusoidal roughness in a channel. A passive scalar is present in the flow, with Prandtl number Pr = 0.7, to study heat transfer by forced convection over this rough surface. The minimal channel is used to circumvent the high cost of simulating high-Reynolds-number flows, which enables a range of rough surfaces to be simulated efficiently. The near-wall temperature profile in the minimal channel agrees well with that of the conventional full-span channel, indicating it can be readily used for heat-transfer studies at a much reduced cost compared to conventional DNS. As the roughness Reynolds number, k+, is increased, the Hama roughness function, ΔU+, increases in the transitionally rough regime before tending towards the fully rough asymptote of κ_m^{-1} log(k+) + C, where C is a constant that depends on the particular roughness geometry and κ_m is the von Kármán constant. In this fully rough regime, the skin-friction coefficient is constant with bulk Reynolds number, Re_b. Meanwhile, the temperature difference between smooth- and rough-wall flows, ΔΘ+, appears to tend towards a constant value. This corresponds to the Stanton number (the temperature analogue of the skin-friction coefficient) monotonically decreasing with Re_b in the fully rough regime. Using shifted logarithmic velocity and temperature profiles, the heat-transfer law as described by the Stanton number in the fully rough regime can be derived once both the equivalent sand-grain roughness k_s and the temperature difference ΔΘ+ are known. In meteorology, this corresponds to the ratio of momentum and heat-transfer roughness lengths, z0m/z0h, being linearly proportional to z0m+, the momentum roughness length [...] Comment: Accepted (in press) in the Journal of Fluid Mechanics.
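The shifted-log-law argument in the abstract can be written out explicitly. The following is a sketch in the notation conventional for rough-wall studies (κ_m, κ_h for the momentum and thermal von Kármán constants, A and A_h for smooth-wall log-law offsets); the exact constants are assumptions, not values taken from the paper.

```latex
% Shifted logarithmic velocity and temperature profiles over a rough wall:
U^+(y) = \frac{1}{\kappa_m}\log y^+ + A - \Delta U^+ , \qquad
\Theta^+(y) = \frac{1}{\kappa_h}\log y^+ + A_h - \Delta\Theta^+ .

% Fully rough asymptote of the Hama roughness function:
\Delta U^+ = \frac{1}{\kappa_m}\log k_s^+ + C .

% Evaluating the profiles at bulk conditions gives U_b^+ and \Theta_b^+,
% from which the skin-friction and Stanton coefficients follow:
\frac{C_f}{2} = \frac{1}{(U_b^+)^2} , \qquad
C_h = \frac{1}{U_b^+\,\Theta_b^+} .
```

Because ΔU+ keeps growing with log(k_s+) while ΔΘ+ plateaus, C_f becomes constant while C_h keeps decreasing with Reynolds number, consistent with the behaviour the abstract reports.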
An Elliptical Galaxy Luminosity Function and Velocity Dispersion Sample of Relevance for Gravitational Lensing Statistics
We have selected 42 elliptical galaxies from the literature and estimated their velocity dispersions at the effective radius, σ(R_e), and at 0.54 effective radii (v_ff). We find by a dynamical analysis that the normalized velocity dispersion of the dark halo of an elliptical galaxy, v_DM, is roughly σ(R_e) multiplied by a constant that is almost independent of the core radius or the anisotropy parameter of each galaxy. Our sample analysis suggests that v_DM* lies in the range 178-198 km s^-1. The power-law relation we find between the luminosity and the dark matter velocity dispersion measured in this way is (L/L*) = (v_DM/v_DM*)^γ, where γ is between 2 and 3. These results are of interest for strong gravitational lensing statistics studies.
In order to determine the value of v_DM*, we calculate M* in the same B_T band in which v_DM* has been estimated. We select 131 elliptical galaxies as a complete sample with apparent magnitudes B_T between 9.26 and 12.19. We find that the luminosity function is well fitted by the Schechter form, with characteristic magnitude M* = -19.66 + 5 log h, together with the fitted faint-end slope and normalization constant (in Mpc^-3), for a Hubble constant H_0 = 100 h km s^-1 Mpc^-1. This normalization implies that morphology type E galaxies make up (10.8 ± 1.2) per cent of all galaxies. Comment: 18 pages LaTeX, with PS figures included. Accepted by New Astronomy (revised to incorporate referee's comments).
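The two ingredients of the lensing-statistics calculation, the Schechter luminosity function and the power-law L-σ relation, are straightforward to evaluate. A minimal sketch follows; the values of alpha and gamma used here are placeholders (the abstract only constrains gamma to lie between 2 and 3), not fitted values from the paper.

```python
import math

def schechter(x, phi_star=1.0, alpha=-1.0):
    """Schechter luminosity function dN/dx = phi* x^alpha exp(-x),
    with x = L/L*.  phi* and alpha here are placeholder values."""
    return phi_star * x ** alpha * math.exp(-x)

def l_over_lstar(v_over_vstar, gamma=2.5):
    """Power-law relation from the abstract: L/L* = (v_DM/v_DM*)^gamma.
    gamma = 2.5 is a placeholder inside the quoted 2-3 range."""
    return v_over_vstar ** gamma

# A galaxy at the characteristic velocity dispersion sits at L = L*:
print(l_over_lstar(1.0))          # 1.0
print(round(schechter(1.0), 4))   # phi* * e^-1 ~ 0.3679
```

Combining the two, the expected number of lenses above a velocity-dispersion threshold follows by integrating the Schechter function over the corresponding luminosity range, which is why the slope γ matters for lensing statistics.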
Impact of edge-removal on the centrality betweenness of the best spreaders
The control of epidemic spreading is essential to avoid potentially fatal consequences and also to lessen unforeseen socio-economic impact. The need for effective control was exemplified during the severe acute respiratory syndrome (SARS) outbreak of 2003, which inflicted nearly a thousand deaths as well as bankruptcies of airlines and related businesses. In this article, we examine the efficacy of control strategies on the propagation of infectious diseases based on removing connections within a real-world airline network, with the associated economic and social costs taken into account through appropriately defined quantitative measures. We uncover the surprising result that removing less busy connections can be far more effective in hindering the spread of the disease than removing the more popular connections. Since disconnecting the less popular routes tends to incur less socio-economic cost, our finding suggests the possibility of trading a minimal reduction in the connectivity of an important hub for efficiency in epidemic control. In particular, we demonstrate the performance of various local epidemic control strategies, and show how our approach can predict their cost effectiveness through the spreading control characteristics. Comment: 11 pages, 4 figures.
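The edge betweenness centrality referenced in the title can be computed with Brandes' algorithm. Below is a self-contained sketch for unweighted, undirected graphs; it is a generic textbook implementation, not the authors' code, and the tiny path graph merely sanity-checks the scores.

```python
from collections import deque, defaultdict

def edge_betweenness(adj):
    """Brandes' algorithm: shortest-path betweenness of every edge
    in an unweighted, undirected graph (adj: node -> neighbour list)."""
    score = defaultdict(float)
    for s in adj:
        dist = {s: 0}
        sigma = defaultdict(float)   # shortest-path counts from s
        sigma[s] = 1.0
        preds = defaultdict(list)    # shortest-path predecessors
        order, queue = [], deque([s])
        while queue:                 # BFS in order of distance
            v = queue.popleft()
            order.append(v)
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        delta = defaultdict(float)   # accumulated dependencies
        for w in reversed(order):
            for v in preds[w]:
                c = sigma[v] / sigma[w] * (1.0 + delta[w])
                score[frozenset((v, w))] += c
                delta[v] += c
    # each undirected pair is counted from both endpoints
    return {e: c / 2.0 for e, c in score.items()}

# Sanity check on the path a-b-c: each edge carries the shortest
# paths of 2 of the 3 node pairs.
path = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
scores = edge_betweenness(path)
print(scores[frozenset(("a", "b"))], scores[frozenset(("b", "c"))])  # 2.0 2.0
```

Ranking an airline network's edges by this score, removing candidates, and re-simulating the spread is the kind of cost-benefit comparison the abstract evaluates.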
Weak Parity
We study the query complexity of Weak Parity: the problem of computing the
parity of an n-bit input string, where one only has to succeed on a 1/2+eps
fraction of input strings, but must do so with high probability on those inputs
where one does succeed. It is well-known that n randomized queries and n/2
quantum queries are needed to compute parity on all inputs. But surprisingly,
we give a randomized algorithm for Weak Parity that makes only
O(n/log^0.246(1/eps)) queries, as well as a quantum algorithm that makes only
O(n/sqrt(log(1/eps))) queries. We also prove a lower bound of
Omega(n/log(1/eps)) in both cases; and using extremal combinatorics, prove
lower bounds of Omega(log n) in the randomized case and Omega(sqrt(log n)) in
the quantum case for any eps>0. We show that improving our lower bounds is
intimately related to two longstanding open problems about Boolean functions:
the Sensitivity Conjecture, and the relationships between query complexity and
polynomial degree. Comment: 18 pages.
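The "1/2+eps fraction" success criterion can be made concrete by brute force. The sketch below is illustrative only (it is not an algorithm from the paper): it checks that guessing the parity of any proper subset of the bits succeeds on exactly half of all inputs, which is why beating 1/2 by any eps at all is the interesting regime.

```python
from itertools import product

def parity(bits):
    """True parity of a bit string."""
    return sum(bits) % 2

def success_fraction(n, queried):
    """Fraction of all n-bit inputs on which 'answer the parity of
    the queried bits' matches the true parity of the whole string."""
    hits = 0
    for x in product((0, 1), repeat=n):
        guess = sum(x[i] for i in queried) % 2
        hits += guess == parity(x)
    return hits / 2 ** n

print(success_fraction(4, [0, 1, 2]))     # 0.5: one unread bit
print(success_fraction(4, [0, 1, 2, 3]))  # 1.0: all bits read
```

Any single unread bit flips the answer on exactly half the inputs, so nontrivial savings require being allowed to fail (with knowledge of failure) on some inputs, which is exactly the Weak Parity relaxation.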
Reliability assessment of microgrid with renewable generation and prioritized loads
With increasing awareness of climate change, there has been a tremendous shift towards utilizing renewable energy sources (RES). In this regard, smart grid technologies have been presented to facilitate higher penetration of RES, and microgrids are key components of the smart grid. Microgrids allow the integration of various distributed energy resources (DERs), such as distributed generation (DG) and energy storage systems (ESSs), into the distribution system and hence remove or delay the need for distribution expansion. One of the crucial requirements for utilities is to ensure that system reliability is maintained with the inclusion of the microgrid topology. Therefore, this paper evaluates the reliability of a microgrid containing prioritized loads and distributed RES through a hybrid analytical-simulation method. The stochasticity of RES introduces complexity to the reliability evaluation; the method accounts for this variability through Monte Carlo state-sampling simulation. The results indicate reliability enhancement of the overall system in the presence of the microgrid topology; in particular, the highest-priority load shows the largest improvement in the reliability indices. Furthermore, a sensitivity analysis is performed to understand the effects of failure of microgrid islanding in the case of a fault in the upstream network.
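The Monte Carlo state-sampling idea can be sketched for a single load point. The two-state component model and the availability figures below are illustrative assumptions, not values from the paper: the load is unserved only when the upstream grid is down and islanded operation (islanding switch plus DG) also fails.

```python
import random

def estimate_unserved_probability(n_samples=100_000, seed=1,
                                  p_grid=0.98, p_dg=0.90, p_island=0.95):
    """Two-state Monte Carlo state sampling for one load point.
    The load is unserved only if the upstream grid is down AND the
    islanded microgrid fails (islanding switch or DG unavailable)."""
    rng = random.Random(seed)
    unserved = 0
    for _ in range(n_samples):
        grid_up = rng.random() < p_grid
        island_ok = (rng.random() < p_island) and (rng.random() < p_dg)
        if not grid_up and not island_ok:
            unserved += 1
    return unserved / n_samples

est = estimate_unserved_probability()
# analytic reference: (1 - 0.98) * (1 - 0.95 * 0.90) = 0.0029
print(round(est, 4))
```

Extending this to prioritized loads means sampling a full system state per trial and shedding the lowest-priority loads first when islanded capacity is insufficient, then accumulating per-load-point indices such as expected energy not served.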