Nanostructuring Graphene by Dense Electronic Excitation
The ability to manufacture tailored graphene nanostructures is key to fully
exploiting graphene's enormous technological potential. We have investigated
nanostructures created in graphene by swift heavy ion induced folding. For our
experiments, single layers of graphene exfoliated on various substrates and
freestanding graphene have been irradiated and analyzed by atomic force and
high resolution transmission electron microscopy as well as Raman spectroscopy.
We show that the dense electronic excitation in the wake of the traversing ion
yields characteristic nanostructures each of which may be fabricated by
choosing the proper irradiation conditions. These nanostructures include unique
morphologies such as closed bilayer edges with a given chirality or nanopores
within supported as well as freestanding graphene. The length and orientation
of the nanopore, and thus of the associated closed bilayer edge, may be simply
controlled by the direction of the incoming ion beam. In freestanding graphene,
swift heavy ion irradiation induces extremely small openings, offering the
possibility to perforate graphene membranes in a controlled way.
Comment: 16 pages, 5 figures, submitted to Nanotechnology
Conditional generation of sub-Poissonian light from two-mode squeezed vacuum via balanced homodyne detection on idler mode
A simple scheme for the conditional generation of nonclassical light with
sub-Poissonian photon-number statistics is proposed. The method utilizes the
entanglement of the signal and idler modes in a two-mode squeezed vacuum state
generated in an optical parametric amplifier. A quadrature component of the
idler mode is measured in a balanced homodyne detector, and only those
experimental runs where the absolute value of the measured quadrature is
higher than a certain threshold are accepted. If the threshold is large
enough, the conditional output state of the signal mode exhibits a reduction
of photon-number fluctuations below the coherent-state level.
Comment: 7 pages, 6 figures, REVTeX
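The conditioning step described above can be checked numerically. The sketch below (an independent illustration, not the paper's own code; the Schmidt coefficient lam, the threshold, and the photon-number cutoff are arbitrary choices) uses the Schmidt form of the two-mode squeezed vacuum, so that the probability of n signal photons given an accepted run is proportional to lam^(2n) times the tail probability of the idler quadrature distribution for the n-photon state, and then evaluates the Mandel Q parameter of the conditional signal state.

```python
import numpy as np

def mandel_q(lam=0.3, threshold=3.0, nmax=30):
    """Mandel Q of the signal mode of a two-mode squeezed vacuum
    (Schmidt coefficient lam) after accepting only runs in which the
    idler homodyne outcome satisfies |x| > threshold."""
    # Grid over one tail of the idler quadrature distribution.
    x = np.linspace(threshold, threshold + 10.0, 4000)
    dx = x[1] - x[0]
    # Harmonic-oscillator eigenfunctions psi_n(x) via the stable recursion
    # psi_{n+1} = sqrt(2/(n+1)) x psi_n - sqrt(n/(n+1)) psi_{n-1}.
    psi = np.zeros((nmax + 1, x.size))
    psi[0] = np.pi ** -0.25 * np.exp(-x ** 2 / 2)
    psi[1] = np.sqrt(2.0) * x * psi[0]
    for m in range(1, nmax):
        psi[m + 1] = (np.sqrt(2.0 / (m + 1)) * x * psi[m]
                      - np.sqrt(m / (m + 1.0)) * psi[m - 1])
    # Tail probability of the quadrature outcome given n idler photons
    # (factor 2 accounts for both tails |x| > threshold).
    tail = 2.0 * (psi ** 2).sum(axis=1) * dx
    n = np.arange(nmax + 1)
    p = lam ** (2 * n) * tail          # unnormalized conditional P(n)
    p /= p.sum()
    mean = (n * p).sum()
    var = (n ** 2 * p).sum() - mean ** 2
    return (var - mean) / mean         # Q < 0 means sub-Poissonian

print(mandel_q())  # negative for a large threshold, as the abstract predicts
```

With threshold = 0 every run is accepted and the signal is simply thermal (Q > 0); raising the threshold drives Q negative, reproducing the qualitative claim of the abstract.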
Machine Learning and Portfolio Optimization
The portfolio optimization model has limited impact in practice due to estimation issues when applied with real data. To address this, we adapt two machine learning methods, regularization and cross-validation, for portfolio optimization. First, we introduce performance-based regularization (PBR), where the idea is to constrain the sample variances of the estimated portfolio risk and return, which steers the solution towards one associated with less estimation error in the performance. We consider PBR for both mean-variance and mean-CVaR problems. For the mean-variance problem, PBR introduces a quartic polynomial constraint, for which we make two convex approximations: one based on a rank-1 approximation and another based on a convex quadratic approximation. The rank-1 approximation PBR adds a bias to the optimal allocation, and the convex quadratic approximation PBR shrinks the sample covariance matrix. For the mean-CVaR problem, the PBR model is a combinatorial optimization problem, but we prove that its convex relaxation, a QCQP, is essentially tight. We show that the PBR models can be cast as robust optimization problems with novel uncertainty sets and establish the asymptotic optimality of both Sample Average Approximation (SAA) and PBR solutions and the corresponding efficient frontiers. To calibrate the right-hand sides of the PBR constraints, we develop new, performance-based k-fold cross-validation algorithms. Using these algorithms, we carry out an extensive empirical investigation of PBR against SAA, as well as L1 and L2 regularizations and the equally weighted portfolio. We find that PBR dominates all other benchmarks for two of the three Fama-French data sets.
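The two ingredients the abstract adapts, regularization and k-fold cross-validation, can be illustrated on the simpler L2 benchmark it compares against (this is not the PBR model; the function names, the Sharpe-ratio scoring rule, and all parameter values are illustrative assumptions):

```python
import numpy as np

def mv_weights_l2(returns, gamma=0.1, risk_aversion=1.0):
    """L2-regularized mean-variance weights: maximize
    mu'w - (risk_aversion/2) w'Sigma w - (gamma/2) ||w||^2,
    then rescale to a fully invested portfolio (weights sum to 1).
    Note: the rescaling assumes the raw solution does not sum to ~0."""
    mu = returns.mean(axis=0)
    sigma = np.cov(returns, rowvar=False)
    w = np.linalg.solve(risk_aversion * sigma + gamma * np.eye(len(mu)), mu)
    return w / w.sum()

def cv_select_gamma(returns, gammas, k=5):
    """Pick gamma by k-fold cross-validation, scoring each candidate by
    the average out-of-fold Sharpe ratio of the resulting portfolio."""
    folds = np.array_split(np.arange(len(returns)), k)
    scores = []
    for g in gammas:
        sharpes = []
        for i in range(k):
            train = np.concatenate([folds[j] for j in range(k) if j != i])
            w = mv_weights_l2(returns[train], gamma=g)
            r = returns[folds[i]] @ w        # out-of-fold portfolio returns
            sharpes.append(r.mean() / r.std())
        scores.append(np.mean(sharpes))
    return gammas[int(np.argmax(scores))]
```

The paper's PBR approach replaces the generic L2 penalty with constraints on the sample variance of the estimated performance itself, and calibrates the constraint levels with a performance-based variant of the cross-validation loop sketched here.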
SPORT: A new sub-nanosecond time-resolved instrument to study swift heavy ion-beam induced luminescence - Application to luminescence degradation of a fast plastic scintillator
We developed a new sub-nanosecond time-resolved instrument to study the
dynamics of UV-visible luminescence under high stopping power heavy ion
irradiation. We applied our instrument, called SPORT, on a fast plastic
scintillator (BC-400) irradiated with 27-MeV Ar ions having a high mean
electronic stopping power of 2.6 MeV/μm. As a consequence of increasing
permanent radiation damage with increasing ion fluence, our investigations
reveal a degradation of scintillation intensity together with, thanks to the
time-resolved measurement, a decrease in the decay constant of the
scintillator. This combination indicates that luminescence degradation
proceeds by both dynamic and static quenching, the latter mechanism being
predominant. Under such high density excitation, the scintillation
deterioration of BC-400 is significantly enhanced compared to that observed in
previous investigations, mainly performed using light ions. The observed
non-linear behaviour implies that the dose at which luminescence starts
deteriorating is not independent of the particles' stopping power, thus
illustrating that the radiation hardness of plastic scintillators can be
strongly weakened under high excitation density in heavy ion environments.
Comment: 5 figures, accepted in Nucl. Instrum. Methods
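The dynamic/static distinction drawn above can be made concrete with a single-exponential pulse model I(t) = A exp(-t/tau), whose integrated light yield is Y = A*tau: dynamic quenching shortens tau, while static quenching removes emitters and lowers A at fixed tau, so a measured yield ratio and decay-constant ratio separate the two contributions. In the sketch below, the 2.4 ns nominal BC-400 decay time is real, but the damaged values are hypothetical numbers chosen only to illustrate the bookkeeping:

```python
def quenching_split(yield_ratio, tau_damaged_ns, tau_pristine_ns=2.4):
    """Split a light-yield loss into a dynamic (lifetime) factor and a
    static (amplitude) factor, assuming I(t) = A*exp(-t/tau), Y = A*tau."""
    dynamic = tau_damaged_ns / tau_pristine_ns   # tau/tau0, from timing data
    static = yield_ratio / dynamic               # A/A0 = (Y/Y0)/(tau/tau0)
    return dynamic, static

# Hypothetical example: yield halved while tau drops from 2.4 to 2.0 ns.
dyn, sta = quenching_split(0.5, 2.0)
# Here sta < dyn, i.e. the static (amplitude) loss dominates - the same
# kind of inference the time-resolved measurement enables.
```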
Rate perception adapts across the senses: evidence for a unified timing mechanism
The brain constructs a representation of the temporal properties of events, such as duration and frequency, but the underlying neural mechanisms are under debate. One open question is whether these mechanisms are unisensory or multisensory. Duration perception studies provide some evidence for a dissociation between auditory and visual timing mechanisms; however, we found active crossmodal interaction between audition and vision for rate perception, even when vision and audition were never stimulated together. After exposure to 5 Hz adaptors, people perceived subsequent test stimuli centered around 4 Hz to be slower, and the reverse after exposure to 3 Hz adaptors. This aftereffect occurred even when the adaptor and test were of different modalities that were never presented together. When the discrepancy in rate between adaptor and test increased, the aftereffect was attenuated, indicating that the brain uses narrowly tuned channels to process rate information. Our results indicate that human timing mechanisms for rate perception are not entirely segregated between modalities, which has substantial implications for models of how the brain encodes temporal features. We propose a model of multisensory channels for rate perception, and consider the broader implications of such a model for how the brain encodes timing.
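A channel-based account of these aftereffects can be sketched with a toy labeled-line model (an illustrative sketch, not the authors' quantitative model; the channel spacing, tuning widths, and adaptation depth are all arbitrary assumptions): adaptation reduces the gain of channels tuned near the adaptor rate, and the perceived rate is the gain-weighted centroid of the population response to the test.

```python
import numpy as np

def perceived_rate(test_hz, adaptor_hz=None, depth=0.3,
                   tuning_sd=1.0, adapt_sd=1.0):
    """Toy population of narrowly tuned rate channels. Each channel has a
    Gaussian tuning curve; adapting at adaptor_hz suppresses the gain of
    nearby channels, repelling the perceived test rate away from the
    adaptor."""
    prefs = np.linspace(0.5, 10.0, 96)             # preferred rates (Hz)
    resp = np.exp(-(test_hz - prefs) ** 2 / (2 * tuning_sd ** 2))
    gain = np.ones_like(prefs)
    if adaptor_hz is not None:
        gain -= depth * np.exp(-(adaptor_hz - prefs) ** 2 / (2 * adapt_sd ** 2))
    w = gain * resp
    return (w * prefs).sum() / w.sum()             # population centroid
```

This reproduces the pattern reported above: a 5 Hz adaptor makes a 4 Hz test read slower (centroid pushed below 4), a 3 Hz adaptor makes it read faster, and a far-away adaptor (narrow tuning) produces a much smaller shift.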
Data taking strategy for the phase study in
The study of the relative phase between strong and electromagnetic amplitudes
is of great importance for understanding the dynamics of charmonium decays. The
information of the phase can be obtained model-independently by fitting the
scan data of some special decay channels, one of which is . To find out the optimal data taking strategy for a scan experiment
in the measurement of the phase in , the
minimization process is analyzed from a theoretical point of view. The result
indicates that for one parameter fit, only one data taking point in the
vicinity of a resonance peak is sufficient to acquire the optimal precision.
Numerical results are obtained by fitting simulated scan data. Besides the
results related to the relative phase between strong and electromagnetic
amplitudes, the method is extended to analyze the fits of other resonant
parameters, such as the mass and the total decay width of .
Comment: 13 pages, 7 figures
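The one-parameter fit discussed above can be sketched with a generic toy (not the paper's analysis: the resonance mass, width, amplitudes, scan points, and noise level are all invented for illustration). The observed rate is modeled as |a_c + e^{i phi} a_r/(E - M + i Gamma/2)|^2, the interference of a continuum amplitude with a Breit-Wigner resonance carrying a relative phase phi, and phi is recovered from simulated scan data by a chi-square grid search:

```python
import numpy as np

M, GAMMA = 3.0, 0.05          # toy resonance mass and width (arbitrary units)
A_C, A_R = 1.0, 0.05          # continuum and resonant amplitudes (illustrative)

def cross_section(e, phi):
    """|continuum + e^{i phi} * Breit-Wigner|^2 at scan energy e."""
    bw = A_R / (e - M + 0.5j * GAMMA)
    return np.abs(A_C + np.exp(1j * phi) * bw) ** 2

# Simulate a scan across the resonance with 1% multiplicative noise.
rng = np.random.default_rng(1)
energies = np.linspace(M - 3 * GAMMA, M + 3 * GAMMA, 15)
phi_true = 1.0
data = cross_section(energies, phi_true) * (1 + 0.01 * rng.standard_normal(15))

# One-parameter chi-square grid search for the relative phase.
grid = np.linspace(-np.pi, np.pi, 20001)
chi2 = [(((data - cross_section(energies, p)) / (0.01 * data)) ** 2).sum()
        for p in grid]
phi_hat = grid[int(np.argmin(chi2))]
```

Because the interference term mixes cos(phi) and sin(phi) with different energy dependences across the resonance, the scan determines phi without a sign ambiguity; the abstract's point is that, for such a one-parameter fit, a single well-placed point near the peak already carries most of this information.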