Hidden Markov Models and their Application for Predicting Failure Events
We show how Markov mixed membership models (MMMM) can be used to predict the
degradation of assets. We model the degradation path of individual assets to
predict overall failure rates. Instead of a separate distribution for each
hidden state, we use hierarchical mixtures of distributions in the exponential
family. In our approach the observation distribution of the states is a finite
mixture distribution of a small set of (simpler) distributions shared across
all states. Using tied-mixture observation distributions offers several
advantages. The mixtures act as a regularization for typically very sparse
problems, and they reduce the computational effort for the learning algorithm
since there are fewer distributions to be found. Using shared mixtures enables
sharing of statistical strength between the Markov states and thus transfer
learning. We determine for individual assets the trade-off between the risk of
failure and extended operating hours by combining an MMMM with a partially
observable Markov decision process (POMDP) to dynamically optimize the policy
for when and how to maintain the asset.

Comment: Will be published in the proceedings of ICCS 2020;
@Booklet{EasyChair:3183, author = {Paul Hofmann and Zaid Tashman}, title = {Hidden Markov Models and their Application for Predicting Failure Events}, howpublished = {EasyChair Preprint no. 3183}, year = {2020}}
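The tied-mixture emission idea in this abstract can be sketched in a few lines: every hidden state shares one small pool of component distributions and differs only in its state-specific mixing weights, so far fewer distributions must be estimated. The following is a minimal illustration, not the authors' code; it assumes Gaussian components (the paper uses general exponential-family mixtures), and all sizes and values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

M, S = 3, 4                                   # M shared components, S hidden states
means = np.array([0.0, 2.0, 5.0])             # shared component means
stds = np.array([0.5, 0.8, 1.0])              # shared component std devs
weights = rng.dirichlet(np.ones(M), size=S)   # per-state mixing weights, rows sum to 1

def emission_prob(x, state):
    """p(x | state): a finite mixture over the shared Gaussian components."""
    comp = np.exp(-0.5 * ((x - means) / stds) ** 2) / (stds * np.sqrt(2 * np.pi))
    return float(weights[state] @ comp)

# Every state evaluates the same 3 component densities; only the weights differ.
probs = [emission_prob(1.0, s) for s in range(S)]
```

Because all states reuse the same component pool, updating a component during learning pools evidence from every state that uses it, which is the "sharing of statistical strength" the abstract refers to.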
Optimal client recommendation for market makers in illiquid financial products
The process of liquidity provision in financial markets can result in
prolonged exposure to illiquid instruments for market makers. In this case,
where a proprietary position is not desired, pro-actively targeting the right
client who is likely to be interested can be an effective means to offset this
position, rather than relying on commensurate interest arising through natural
demand. In this paper, we consider the inference of a client profile for the
purpose of corporate bond recommendation, based on typical recorded information
available to the market maker. Given a historical record of corporate bond
transactions and bond meta-data, we use a topic-modelling analogy to develop a
probabilistic technique for compiling a curated list of client recommendations
for a particular bond that needs to be traded, ranked by probability of
interest. We show that a model based on Latent Dirichlet Allocation offers
promising performance to deliver relevant recommendations for sales traders.

Comment: 12 pages, 3 figures, 1 table
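The topic-modelling analogy above can be made concrete: treat each client as a "document" whose "words" are the bonds they have traded, fit LDA to the client-bond count matrix, and rank clients by the modelled probability of interest in the bond to be traded. This is a hypothetical sketch using scikit-learn, not the authors' implementation; the data, dimensions, and variable names are invented.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(1)
n_clients, n_bonds = 20, 50
# Synthetic client x bond trade-count matrix standing in for the transaction record.
counts = rng.poisson(0.5, size=(n_clients, n_bonds))

lda = LatentDirichletAllocation(n_components=5, random_state=0)
theta = lda.fit_transform(counts)                                   # client-topic mixtures
phi = lda.components_ / lda.components_.sum(axis=1, keepdims=True)  # topic-bond probabilities

bond_to_trade = 7
scores = theta @ phi[:, bond_to_trade]     # p(bond | client) under the fitted model
recommended = np.argsort(scores)[::-1]     # curated client list, most likely first
```

The ranking score is the standard LDA predictive probability of the target "word" (bond) under each "document" (client) mixture, which yields the probability-ranked recommendation list the abstract describes.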
Asynchronous Stochastic Variational Inference
Stochastic variational inference (SVI) employs stochastic optimization to scale up Bayesian computation to massive data. Since SVI is at its core a stochastic gradient-based algorithm, horizontal parallelism can be harnessed to allow larger scale inference. We propose a lock-free parallel implementation for SVI which allows distributed computations over multiple slaves in an asynchronous style. We show that our implementation leads to linear speed-up while guaranteeing an asymptotic ergodic convergence rate of O(1/√T), given that the number of slaves is bounded by √T (T is the total number of iterations). The implementation is done in a high-performance computing (HPC) environment using message passing interface (MPI) for Python (MPI4py). The extensive empirical evaluation shows that our parallel SVI is lossless, performing comparably to its serial counterpart while achieving linear speed-up.
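The core update that such a system distributes is the Robbins-Monro stochastic step on the global variational parameter. Below is a toy serial version of that step for a trivially conjugate case (estimating the posterior mean of a unit-variance Gaussian), not the authors' MPI implementation; the asynchronous scheme would have each worker apply this same update to a shared parameter without locks. All names and constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(3.0, 1.0, size=100_000)   # synthetic data with true mean 3.0

lam = 0.0                     # global variational parameter (initialized at prior)
batch, kappa, tau = 100, 0.7, 1.0
for t in range(1, 2001):
    rho = (t + tau) ** (-kappa)              # decaying step size (Robbins-Monro)
    minibatch = rng.choice(data, size=batch)
    lam_hat = minibatch.mean()               # intermediate estimate from the minibatch
    lam = (1 - rho) * lam + rho * lam_hat    # stochastic natural-gradient step
```

Because each step only reads the current parameter and writes a convex combination, workers can apply it asynchronously; the O(1/√T) ergodic rate quoted above is the guarantee that bounded staleness from such lock-free writes does not destroy convergence.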
Tracking Pediatric Asthma: The Massachusetts Experience Using School Health Records
The Massachusetts Department of Public Health, in collaboration with the U.S. Centers for Disease Control and Prevention Environmental Public Health Tracking Program, initiated a 3-year statewide project for the routine surveillance of asthma in children using school health records as the primary data source. School district nurse leaders received electronic data reporting forms requesting the number of children with asthma by grade and gender for schools serving grades kindergarten (K) through 8. Verification efforts from an earlier community-level study comparing a select number of school health records with primary care provider records demonstrated a high level of agreement (i.e., > 95%). First-year surveillance targeted approximately one-half (n = 958 schools) of all Massachusetts’s K–8 schools. About 78% of targeted school districts participated, and 70% of the targeted schools submitted complete asthma data. School nurse–reported asthma prevalence was as high as 30.8% for schools, with a mean of 9.2%. School-based asthma surveillance has been demonstrated to be a reliable and cost-effective method of tracking disease through use of an existing and enhanced reporting structure.
HTA – algorithm or process? Comment on ‘Expanded HTA: enhancing fairness and legitimacy’
Daniels, Porteny and Urrutia et al. make a good case for the idea that public decisions ought to be made not
only “in the light of” evidence but also “on the basis of” budget impact, financial protection and equity. Health
technology assessment (HTA) should, they say, be accordingly expanded to consider matters additional to safety
and cost-effectiveness. They also complain that most HTA reports fail to develop ethical arguments and generally
do not even mention ethical issues. This comment argues that some of these defects are more apparent than real and
are not inherent in HTA – as distinct from being common characteristics found in poorly conducted HTAs. More
generally, HTA does not need “extension” since (1) ethical issues are already embedded in HTA processes, not least in
their scoping phases, and (2) HTA processes are already sufficiently flexible to accommodate evidence about a wide
range of factors, and will not need fundamental change in order to accommodate the new forms of decision-relevant
evidence about distributional impact and financial protection that are now starting to emerge. HTA and related
techniques are there to support decision-makers who have authority to make decisions. Analysts like us are there to
support and advise them (and not to assume the responsibilities for which they, and not we, are accountable). The
required quality in HTA then becomes its effectiveness as a means of addressing the issues of concern to decision-makers. What is also required is adherence by competent analysts to a standard template of good analytical practice.
The competencies include not merely those of the usual disciplines (particularly biostatistics, cognitive psychology,
health economics, epidemiology, and ethics) but also the imaginative and interpersonal skills for exploring the “real”
question behind the decision-maker’s brief (actual or postulated) and eliciting the social values that necessarily
pervade the entire analysis. The product of such exploration defines the authoritative scope of an HTA.
Dissolvable Template Nanoimprint Lithography: A Facile and Versatile Nanoscale Replication Technique
Nanoimprinting lithography (NIL) is a next-generation nanofabrication method, capable of replicating nanostructures from original master surfaces. Here, we develop highly scalable, simple, and nondestructive NIL using a dissolvable template. Termed dissolvable template nanoimprinting lithography (DT-NIL), our method utilizes an economic thermoplastic resin to fabricate nanoimprinting templates, which can be easily dissolved in simple organic solvents. We used the DT-NIL method to replicate cicada wings which have surface nanofeatures of ∼100 nm in height. The master, template, and replica surfaces showed a >∼94% similarity based on the measured diameter and height of the nanofeatures. The versatility of DT-NIL was also demonstrated with the replication of re-entrant, multiscale, and hierarchical features on fly wings, as well as hard silicon wafer-based artificial nanostructures. The DT-NIL method can be performed under ambient conditions with inexpensive materials and equipment. Our work opens the door to opportunities for economical and high-throughput nanofabrication processes.
Uniform approach to double shuffle and duality relations of various q-Analogs of multiple zeta values via Rota-Baxter algebras
The multiple zeta values (MZVs) have been studied extensively in recent years. Currently there exist a few different types of q-analogs of the MZVs (q-MZVs) defined and studied by mathematicians and physicists. In this paper, we give a uniform treatment of these q-MZVs by considering their double shuffle relations (DBSFs) and duality relations. The main idea is a modification and generalization of the one used by Castillo Medina et al., who have considered the DBSFs of a special type of q-MZVs. We generalize their method to a few other types of q-MZVs, including the one defined by the author in 2003. With a different approach, Takeyama has already studied this type by "regularization" and observed that there exist Q-linear relations which are not consequences of the DBSFs. He also discovered a new family of relations which we call the duality relations in this paper. This deficiency of the DBSFs occurs among the other types, too, so we generalize the duality relations to all of these values and find that there are still some missing relations. This leads to the most general type of q-MZVs together with a new kind of relations called - relations which are used to lower the deficiencies further. As an application, we confirm a conjecture of Okounkov on the dimensions of certain q-MZV spaces, either theoretically or numerically, for weights up to 12. Some relevant numerical data are provided at the end.
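For concreteness, one representative q-analog of this kind (a commonly studied normalization; the different types discussed above differ essentially in the powers of q placed in the numerator) can be written with the q-integer $[n]_q$ as:

```latex
\zeta_q(s_1,\dots,s_d)
  = \sum_{n_1 > n_2 > \cdots > n_d \ge 1}
    \frac{q^{(s_1-1)n_1 + \cdots + (s_d-1)n_d}}
         {[n_1]_q^{s_1} \cdots [n_d]_q^{s_d}},
\qquad
[n]_q = \frac{1-q^n}{1-q}.
```

As $q \to 1^-$, $[n]_q \to n$ and this recovers the classical MZV $\zeta(s_1,\dots,s_d)$, which is why the double shuffle and duality relations of the q-versions specialize to relations among ordinary MZVs.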
Tunable magnetic exchange interactions in manganese-doped inverted core/shell ZnSe/CdSe nanocrystals
Magnetic doping of semiconductor nanostructures is actively pursued for
applications in magnetic memory and spin-based electronics. Central to these
efforts is a drive to control the interaction strength between carriers
(electrons and holes) and the embedded magnetic atoms. In this respect,
colloidal nanocrystal heterostructures provide great flexibility via
growth-controlled 'engineering' of electron and hole wavefunctions within
individual nanocrystals. Here we demonstrate a widely tunable magnetic sp-d
exchange interaction between electron-hole excitations (excitons) and
paramagnetic manganese ions using 'inverted' core-shell nanocrystals composed
of Mn-doped ZnSe cores overcoated with undoped shells of narrower-gap CdSe.
Magnetic circular dichroism studies reveal giant Zeeman spin splittings of the
band-edge exciton that, surprisingly, are tunable in both magnitude and sign.
Effective exciton g-factors are controllably tuned from -200 to +30 solely by
increasing the CdSe shell thickness, demonstrating that strong quantum
confinement and wavefunction engineering in heterostructured nanocrystal
materials can be utilized to manipulate carrier-Mn wavefunction overlap and the
sp-d exchange parameters themselves.

Comment: To appear in Nature Materials; 18 pages, 4 figures + Supp. Info.
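The sign-tunable splitting described above is conventionally analyzed with the standard expression for the band-edge exciton Zeeman splitting in a dilute magnetic semiconductor; the form below is the textbook sp-d exchange result, given here for orientation rather than quoted from this paper:

```latex
\Delta E_Z = g_{\mathrm{int}}\,\mu_B B \;+\; x_{\mathrm{eff}}\,N_0(\alpha-\beta)\,\langle S_z \rangle,
\qquad
\langle S_z \rangle = -S\, B_S\!\left(\frac{g_{\mathrm{Mn}}\,\mu_B S B}{k_B T}\right),
```

where $g_{\mathrm{int}}$ is the intrinsic exciton g-factor, $N_0\alpha$ and $N_0\beta$ are the electron and hole exchange constants, $x_{\mathrm{eff}}$ the effective Mn concentration seen by the exciton, and $B_S$ the Brillouin function. The effective g-factor $g_{\mathrm{eff}} = \Delta E_Z/(\mu_B B)$ changes sign when the exchange term, which scales with the carrier-Mn wavefunction overlap set by the CdSe shell thickness, outweighs and opposes the intrinsic term.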
Wegner on Hallucinations, Inconsistency, and the Illusion of Free Will. Some Critical Remarks
Wegner's argument on the illusory nature of conscious will, as developed in The Illusion of Conscious Will (2002) and other publications, has had major impact. Based on empirical data, he develops a theory of apparent mental causation in order to explain the occurrence of the illusion of conscious will. Part of the evidence for his argument is derived from a specific interpretation of the phenomenon of auditory verbal hallucinations as they may occur in schizophrenia. The aim of this paper is to evaluate the validity of the evidence on auditory verbal hallucinations as employed by Wegner. I conclude that auditory hallucinations do not provide solid evidence for Wegner's theory. Moreover, the phenomena in schizophrenia provide, in fact, an argument against part of Wegner's theory of apparent mental causation. © 2010 The Author(s)