Quantum trajectories for the realistic measurement of a solid-state charge qubit
We present a new model for the continuous measurement of a coupled quantum
dot charge qubit. We model the effects of a realistic measurement, namely
adding noise to, and filtering, the current through the detector. This is
achieved by embedding the detector in an equivalent circuit for measurement.
Our aim is to describe the evolution of the qubit state conditioned on the
macroscopic output of the external circuit. We achieve this by generalizing a
recently developed quantum trajectory theory for realistic photodetectors [P.
Warszawski, H. M. Wiseman and H. Mabuchi, Phys. Rev. A 65, 023802 (2002)] to
treat solid-state detectors. This yields stochastic equations whose (numerical)
solutions are the "realistic quantum trajectories" of the conditioned qubit
state. We derive our general theory in the context of a low transparency
quantum point contact. Areas of application for our theory and its relation to
previous work are discussed.
Comment: 7 pages, 2 figures. Shorter, significantly modified, updated version.
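The conditional ("trajectory") evolution the abstract refers to can be illustrated with a generic Euler-Maruyama integration of a diffusive stochastic master equation for a single monitored qubit. This is a minimal sketch only: the Hamiltonian, measurement operator, and rates below are illustrative placeholders, not the paper's QPC detector model or its realistic-circuit filtering.

```python
import numpy as np

def simulate_trajectory(n_steps=2000, dt=1e-3, delta=1.0, gamma=0.5, seed=0):
    """Euler-Maruyama integration of a diffusive stochastic master equation
    for a qubit under continuous measurement of sigma_z.
    All parameters are illustrative, not taken from the paper."""
    rng = np.random.default_rng(seed)
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    H = 0.5 * delta * sx            # coherent tunnelling between the two dots
    c = np.sqrt(gamma) * sz         # measurement (back-action) operator
    rho = np.array([[1, 0], [0, 0]], dtype=complex)   # start in |0><0|
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))             # Wiener increment
        comm = -1j * (H @ rho - rho @ H)
        lind = (c @ rho @ c.conj().T
                - 0.5 * (c.conj().T @ c @ rho + rho @ c.conj().T @ c))
        meas = c @ rho + rho @ c.conj().T
        meas = meas - np.trace(meas).real * rho       # nonlinear innovation term
        rho = rho + (comm + lind) * dt + meas * dW
        rho = 0.5 * (rho + rho.conj().T)              # keep Hermitian
        rho = rho / np.trace(rho).real                # renormalize trace
    return rho

rho = simulate_trajectory()
print(np.trace(rho).real)  # trace preserved along the trajectory
```

Averaging many such trajectories over the noise recovers the unconditional master-equation evolution; a single trajectory is one conditioned record, analogous to conditioning on the detector output described above.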
Comparing the forecastability of alternative quantitative models: a trading simulation approach in financial engineering
In this article, we build a Box-Jenkins ARMA model and an ARMA-GARCH model to forecast the returns of the Shanghai Stock Exchange Composite Index. Out-of-sample forecasting performance is evaluated to compare the forecastability of the two models. Traditional engineering-type models aim to minimize statistical errors; however, the model with the minimum statistical error does not necessarily guarantee maximized trading profit, which is often deemed the ultimate objective of a financial application. The best way to evaluate alternative financial models is therefore to evaluate their trading performance by means of trading simulation. We find that both quantitative models are able to forecast the future movements of the market accurately, yielding significant risk-adjusted returns relative to the overall market during the out-of-sample period. In addition, although the ARMA-GARCH model is better than the ARMA model theoretically and statistically, the latter outperforms the former with significantly higher trading performance.
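The sign-based trading evaluation the abstract describes can be sketched in a few lines: fit a model on a training window, forecast the next return out of sample, and go long or short on the forecast sign. A simple AR(1) stands in here for the full Box-Jenkins ARMA (and the GARCH layer is omitted); the data are simulated, not Shanghai Stock Exchange returns.

```python
import numpy as np

# Simulate an AR(1) return series as a stand-in for real index returns.
rng = np.random.default_rng(42)
phi_true = 0.3
n = 2000
r = np.zeros(n)
for t in range(1, n):
    r[t] = phi_true * r[t - 1] + rng.normal(0, 0.01)

split = n // 2
train, test = r[:split], r[split:]

# Least-squares estimate of the AR(1) coefficient on the training sample.
x, y = train[:-1], train[1:]
phi_hat = np.dot(x, y) / np.dot(x, x)

# Trading simulation out of sample: long when the one-step forecast is
# positive, short otherwise; strategy return is sign(forecast) * realized.
forecasts = phi_hat * test[:-1]
strategy = np.sign(forecasts) * test[1:]
print(phi_hat, strategy.sum())
```

The comparison in the abstract replaces `strategy.sum()` with risk-adjusted performance over the out-of-sample period, and runs the same loop for both candidate models rather than ranking them by in-sample statistical error alone.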
Beyond Volume: The Impact of Complex Healthcare Data on the Machine Learning Pipeline
From medical charts to national census, healthcare has traditionally operated
under a paper-based paradigm. However, the past decade has marked a long and
arduous transformation bringing healthcare into the digital age. Ranging from
electronic health records, to digitized imaging and laboratory reports, to
public health datasets, healthcare today generates an incredible amount of
digital information. Such a wealth of data presents an exciting opportunity for
integrated machine learning solutions to address problems across multiple
facets of healthcare practice and administration. Unfortunately, the ability to
derive accurate and informative insights requires more than the ability to
execute machine learning models. Rather, a deeper understanding of the data on
which the models are run is imperative for their success. While a significant
effort has been undertaken to develop models able to process the volume of data
obtained during the analysis of millions of digitalized patient records, it is
important to remember that volume represents only one aspect of the data. In
fact, drawing on data from an increasingly diverse set of sources, healthcare
data presents an incredibly complex set of attributes that must be accounted
for throughout the machine learning pipeline. This chapter focuses on
highlighting such challenges, and is broken down into three distinct
components, each representing a phase of the pipeline. We begin with attributes
of the data accounted for during preprocessing, then move to considerations
during model building, and end with challenges to the interpretation of model
output. For each component, we present a discussion around data as it relates
to the healthcare domain and offer insight into the challenges each may impose
on the efficiency of machine learning techniques.
Comment: Healthcare Informatics, Machine Learning, Knowledge Discovery; 20 pages, 1 figure.
Modeling the ongoing dynamics of short and long-range temporal correlations in broadband EEG during movement
The electroencephalogram (EEG) undergoes complex temporal and spectral changes during voluntary movement intention. Characterization of such changes has focused mostly on narrowband spectral processes such as Event-Related Desynchronization (ERD) in the sensorimotor rhythms, because EEG is mostly regarded as emerging from oscillations of neuronal populations. However, changes in the temporal dynamics, especially in broadband arrhythmic EEG, have not been investigated for movement intention detection. Long-Range Temporal Correlations (LRTC) are ubiquitously present in several neuronal processes, typically requiring longer timescales to detect. In this paper, we study the ongoing changes in the dynamics of long- as well as short-range temporal dependencies in single-trial broadband EEG during movement intention. We obtained LRTC in 2 s windows of broadband EEG and modeled it using the Autoregressive Fractionally Integrated Moving Average (ARFIMA) model, which allowed simultaneous modeling of short- and long-range temporal correlations. There were significant (p < 0.05) changes in both broadband long- and short-range temporal correlations during movement intention and execution. We discovered that broadband LRTC and narrowband ERD are complementary processes providing distinct information about movement, because eliminating LRTC from the signal did not affect the ERD and, conversely, eliminating ERD from the signal did not affect LRTC. Exploring possible applications in Brain-Computer Interfaces (BCI), we used hybrid features combining LRTC, ARFIMA, and ERD to detect movement intention. A significantly higher (p < 0.05) classification accuracy of 88.3 ± 4.2% was obtained using the combination of ARFIMA and ERD features, which also predicted movement earliest, at 1 s before its onset.
The ongoing changes in the long- and short-range temporal correlations in broadband EEG contribute to effectively capturing the motor command generation and can be used to detect movement successfully. These temporal dependencies provide different and additional information about movement.
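LRTC are commonly quantified by the scaling exponent of detrended fluctuation analysis (DFA). The sketch below is illustrative only: the paper models 2 s EEG windows with ARFIMA, whereas here a plain DFA estimator is run on simulated white noise, for which the exponent should be near 0.5 (no long-range correlations).

```python
import numpy as np

def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
    """Detrended fluctuation analysis: slope of log F(s) versus log s.
    alpha ~ 0.5 for uncorrelated noise, alpha > 0.5 indicates LRTC."""
    profile = np.cumsum(x - np.mean(x))            # integrate the signal
    flucts = []
    for s in scales:
        n_seg = len(profile) // s
        t = np.arange(s)
        rms = []
        for i in range(n_seg):
            seg = profile[i * s:(i + 1) * s]
            coef = np.polyfit(t, seg, 1)           # linear detrend per window
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        flucts.append(np.mean(rms))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(0)
alpha = dfa_exponent(rng.normal(size=4096))
print(alpha)  # near 0.5 for white noise
```

A broadband EEG window with genuine long-range dependence would instead yield alpha above 0.5; tracking that exponent across sliding windows is the kind of ongoing-dynamics feature the abstract combines with ERD for movement detection.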
Neutralino versus axion/axino cold dark matter in the 19 parameter SUGRA model
We calculate the relic abundance of thermally produced neutralino cold dark
matter in the general 19 parameter supergravity (SUGRA-19) model. A scan over
GUT scale parameters reveals that models with a bino-like neutralino typically
give rise to a dark matter density \Omega_{\tz_1}h^2\sim 1-1000, i.e. between 1
and 4 orders of magnitude higher than the measured value. Models with higgsino
or wino cold dark matter can yield the correct relic density, but mainly for
neutralino masses around 700-1300 GeV. Models with mixed bino-wino or
bino-higgsino CDM, or models with dominant co-annihilation or A-resonance
annihilation can yield the correct abundance, but such cases are extremely hard
to generate using a general scan over GUT scale parameters; this is indicative
of high fine-tuning of the relic abundance in these cases. Requiring that
m_{\tz_1}\alt 500 GeV (as a rough naturalness requirement) gives rise to a
minimal probability dip in parameter space at the measured CDM abundance. For
comparison, we also scan over mSUGRA space with four free parameters. Finally,
we investigate the Peccei-Quinn augmented MSSM with mixed axion/axino cold dark
matter. In this case, the relic abundance agrees more naturally with the
measured value. In light of our cumulative results, we conclude that future
axion searches should probe much more broadly in axion mass, and deeper into
the axion coupling.
Comment: 23 pages including 17 .eps figures.
Advantages of sous-vide cooked red cabbage: structural, nutritional and sensory aspects
The comparison between equivalent cooking treatments should be applied in a systematic way. This study proposes a methodical way to obtain cooked samples of similar firmness using two cooking treatments. In addition, the structural, nutritional and sensory properties of red cabbage cooked sous-vide were evaluated in comparison with traditional cooking (boiling water). Changes in
texture, color and anthocyanin content were measured in samples cooked with traditional cooking (for different times) and sous-vide (modifying time and temperature according to a Response Surface Methodology). Consumers described sensory properties and preferences between samples. Cryoscanning electron microscopy was used to study the samples microstructure.
The firmness of samples, traditionally cooked for 11 min and preferred by consumers, was achieved in
samples cooked with sous-vide treatment by optimizing the cooking conditions (87 °C/50 min or
91 °C/30 min). Sous-vide treatment was preferred to traditional cooking by consumers. Sous-vide
samples were more purple, more aromatic and tastier than traditionally cooked ones. The loss of anthocyanins
in traditional cooking was twice that in sous-vide samples. Micrographs from different
treatments showed different degrees of cell wall damage. Sous-vide treatment could be recommended as
a treatment for the catering industry providing better quality products.
Author Iborra-Bernad was supported by the Generalitat Valenciana under an FPI (Researcher Formation Program) grant. Author Tarrega was financially supported by the Juan de la Cierva programme.
Iborra Bernad, MDC.; Tárrega, A.; García Segovia, P.; Martínez Monzó, J. (2014). Advantages of sous-vide cooked red cabbage: structural, nutritional and sensory aspects. Food Science and Technology. 56(2):451-460. doi:10.1016/j.lwt.2013.12.027
Native gel electrophoresis of human telomerase distinguishes active complexes with or without dyskerin
Telomeres, the ends of linear chromosomes, safeguard against genome instability. The enzyme responsible for extension of the telomere 3′ terminus is the ribonucleoprotein telomerase. Whereas telomerase activity can be reconstituted in vitro with only the telomerase RNA (hTR) and telomerase reverse transcriptase (TERT), additional components are required in vivo for enzyme assembly, stability and telomere extension activity. One such associated protein, dyskerin, promotes hTR stability in vivo and is the only component to co-purify with active, endogenous human telomerase. We used oligonucleotide-based affinity purification of hTR followed by native gel electrophoresis and in-gel telomerase activity detection to query the composition of telomerase at different purification stringencies. At low salt concentrations (0.1 M NaCl), affinity-purified telomerase was 'supershifted' with an anti-dyskerin antibody; however, the association with dyskerin was lost after purification at 0.6 M NaCl, despite the retention of telomerase activity and a comparable yield of hTR. The interaction of purified hTR and dyskerin in vitro displayed a similar salt-sensitive interaction. These results demonstrate that endogenous human telomerase, once assembled and active, does not require dyskerin for catalytic activity. Native gel electrophoresis may prove useful in the characterization of telomerase complexes under various physiological conditions.