Approximating multivariate posterior distribution functions from Monte Carlo samples for sequential Bayesian inference
An important feature of Bayesian statistics is the opportunity to do
sequential inference: the posterior distribution obtained after seeing a
dataset can be used as prior for a second inference. However, when Monte Carlo
sampling methods are used for inference, we only have a set of samples from the
posterior distribution. To do sequential inference, we then either have to
evaluate the second posterior at only these locations and reweight the samples
accordingly, or we can estimate a functional description of the posterior
probability distribution from the samples and use that as prior for the second
inference. Here, we investigated to what extent we can obtain an accurate joint
posterior from two datasets if the inference is done sequentially rather than
jointly, under the condition that each inference step is done using Monte Carlo
sampling. To test this, we evaluated the accuracy of kernel density estimates,
Gaussian mixtures, vine copulas and Gaussian processes in approximating
posterior distributions, and then tested whether these approximations can be
used in sequential inference. In low dimensionality, Gaussian processes are
more accurate, whereas in higher dimensionality Gaussian mixtures or vine
copulas perform better. In our test cases, posterior approximations are
preferable over direct sample reweighting, although joint inference is still
preferable over sequential inference. Since the performance is case-specific,
we provide an R package, mvdens, with a unified interface to the density
approximation methods.
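The sequential scheme the abstract describes can be sketched in a minimal 1-D example: sample the posterior after the first dataset, fit a density approximation to those samples, and use it as the prior for the second dataset. This is an illustrative Python sketch only (the authors' package, mvdens, is written in R); it assumes a Gaussian-mean toy model, a hand-rolled Metropolis sampler, and scipy's gaussian_kde standing in for the kernel-density option.

```python
import numpy as np
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(0)

def metropolis(log_post, x0, n=5000, step=0.5):
    """Minimal random-walk Metropolis sampler (illustrative only)."""
    x, lp = x0, log_post(x0)
    out = np.empty(n)
    for i in range(n):
        prop = x + step * rng.standard_normal()
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        out[i] = x
    return out[n // 2:]  # discard burn-in

# Two datasets drawn from the same unknown mean (true mu = 1.0)
d1 = rng.normal(1.0, 1.0, 50)
d2 = rng.normal(1.0, 1.0, 50)

log_lik = lambda mu, d: norm.logpdf(d, mu, 1.0).sum()

# Step 1: posterior after the first dataset (flat prior)
s1 = metropolis(lambda mu: log_lik(mu, d1), 0.0)

# Step 2: approximate that posterior with a KDE ...
kde = gaussian_kde(s1)
# ... and use it as the prior for the second inference
s_seq = metropolis(lambda mu: kde.logpdf(mu)[0] + log_lik(mu, d2), 1.0)

# Reference: joint inference on both datasets at once
s_joint = metropolis(lambda mu: log_lik(mu, np.concatenate([d1, d2])), 0.0)

print(s_seq.mean(), s_joint.mean())  # the two posterior means should agree closely
```

In higher dimensions the KDE step would be replaced by a Gaussian mixture, vine copula, or Gaussian process, as the abstract's comparison suggests.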
A 400-to-900 MHz Receiver with Dual-domain Harmonic Rejection Exploiting Adaptive Interference Cancellation
Wideband direct-conversion harmonic-rejection (HR) receivers for software-defined radio aim to remove or relax the pre-mixer RF filters, which are inflexible, bulky and costly [1,2]. HR schemes derived from [3] are often used, but amplitude and phase mismatches limit HR to between 30 and 40 dB [1,2]. A quick calculation shows that much more rejection is wanted: to bring harmonic responses down to the noise floor (e.g. −100 dBm in 10 MHz for a 4 dB NF) and cope with interferers between −40 and 0 dBm, an HR of 60 to 100 dB is needed. High HR is also needed in terrestrial TV receivers and in applications such as DVB-H, which must co-exist with GSM/WLAN transmitters in a small handset.
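The abstract's quick calculation is just the distance from the interferer level to the noise floor, which fixes the required rejection range:

```python
# Figures from the abstract: noise floor of -100 dBm (in 10 MHz for a 4 dB NF),
# and interferers between -40 and 0 dBm that must be pushed down to that floor.
noise_floor_dbm = -100.0

# Required harmonic rejection = interferer power minus the noise floor (in dB)
hr_min = -40.0 - noise_floor_dbm   # weakest interferer  -> 60 dB
hr_max =   0.0 - noise_floor_dbm   # strongest interferer -> 100 dB

print(hr_min, hr_max)  # 60.0 100.0
```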
Digitally-Enhanced Software-Defined Radio Receiver Robust to Out-of-Band Interference
A software-defined radio (SDR) receiver with improved robustness to out-of-band interference (OBI) is presented. Two main challenges are identified for an OBI-robust SDR receiver: out-of-band nonlinearity and harmonic mixing. Voltage gain at RF is avoided, and instead realized at baseband in combination with low-pass filtering to mitigate blockers and improve out-of-band IIP3. Two alternative "iterative" harmonic-rejection (HR) techniques are presented to achieve high HR robust to mismatch: a) an analog two-stage polyphase HR concept, which enhances the HR to more than 60 dB; b) a digital adaptive interference cancelling (AIC) technique, which can suppress one dominating harmonic by at least 80 dB. An accurate multiphase clock generator is presented for mismatch-robust HR. A proof-of-concept receiver is implemented in 65 nm CMOS. Measurements show 34 dB gain, 4 dB NF, and 3.5 dBm in-band IIP3, while the out-of-band IIP3 is +16 dBm without fine tuning. The measured RF bandwidth is up to 6 GHz and the 8-phase LO works up to 0.9 GHz (master clock up to 7.2 GHz). At 0.8 GHz LO, the analog two-stage polyphase HR achieves second- to sixth-order HR > 60 dB over 40 chips, while the digital AIC technique achieves HR > 80 dB for the dominating harmonic. The receiver draws 50 mA from a 1.2 V supply.
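The abstract does not spell out the AIC loop, but the underlying principle can be illustrated with a generic least-mean-squares (LMS) canceller: generate an I/Q reference at the known harmonic frequency, adapt its amplitude and phase to track the unwanted mixing product, and subtract it. Everything below (frequencies, step size, signal model) is an assumed toy setup, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000
t = np.arange(n)

# Wanted baseband signal plus a residual harmonic mixing product
f_sig, f_harm = 0.01, 0.037          # normalized frequencies (cycles/sample), assumed
signal = np.sin(2 * np.pi * f_sig * t)
interference = 0.3 * np.cos(2 * np.pi * f_harm * t + 0.7)
x = signal + interference

# I/Q reference replica at the known harmonic frequency
ref = np.stack([np.cos(2 * np.pi * f_harm * t),
                np.sin(2 * np.pi * f_harm * t)], axis=1)

# LMS loop: adapt weights so ref @ w tracks the interference, then subtract it
w = np.zeros(2)
mu = 0.005                            # adaptation step size
y = np.empty(n)
for i in range(n):
    e = x[i] - ref[i] @ w             # canceller output = signal + residual
    w += mu * e * ref[i]              # LMS weight update
    y[i] = e

# After convergence the output should be close to the clean signal
residual = np.mean((y[n // 2:] - signal[n // 2:]) ** 2)
print(residual)
```

Because the reference is uncorrelated with the wanted signal (different frequency), the loop converges to cancel only the harmonic product, which is what lets a digital technique reach rejection well beyond analog matching limits.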
A general weak nonlinearity model for LNAs
This paper presents a general weak-nonlinearity model that can be used to model, analyze and describe the distortion behavior of various low-noise amplifier topologies in both narrowband and wideband applications. Represented by compact closed-form expressions, the model can be easily utilized by both circuit designers and LNA design automation algorithms.
Simulations for three LNA topologies at different operating conditions show that the model describes IM components with an error below 0.1% and an order-of-magnitude faster response time. The model also indicates that for narrowband IM2 at w1−w2 all the nonlinear capacitances can be neglected, while for narrowband IM3 the nonlinear capacitances at the drain terminal can be neglected.
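The IM components such a model predicts follow from the standard memoryless weak-nonlinearity expansion y = a1·x + a2·x² + a3·x³: for a two-tone input of amplitude A, IM2 at w2−w1 has amplitude a2·A² and IM3 at 2w2−w1 has amplitude (3/4)·a3·A³. This sketch (with arbitrary assumed coefficients, not taken from the paper) verifies those closed forms numerically against an FFT:

```python
import numpy as np

# Memoryless weak nonlinearity: y = a1*x + a2*x^2 + a3*x^3 (coefficients assumed)
a1, a2, a3 = 10.0, 0.5, -0.2
A = 0.01                          # small-signal two-tone amplitude
n = 4096
k1, k2 = 200, 230                 # tone frequencies placed on exact FFT bins
t = np.arange(n)
x = A * np.cos(2 * np.pi * k1 * t / n) + A * np.cos(2 * np.pi * k2 * t / n)
y = a1 * x + a2 * x**2 + a3 * x**3

Y = np.fft.rfft(y) / (n / 2)      # single-sided amplitude spectrum
im2 = abs(Y[k2 - k1])             # IM2 product at w2 - w1
im3 = abs(Y[2 * k2 - k1])         # IM3 product at 2*w2 - w1

# Closed-form weak-nonlinearity predictions
im2_pred = a2 * A**2
im3_pred = 0.75 * abs(a3) * A**3
print(im2, im2_pred, im3, im3_pred)
```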
How much control is enough? Optimizing fun with unreliable input
Brain-computer interfaces (BCI) provide a valuable new input modality within human-computer interaction systems, but like other body-based inputs, the system's recognition of input commands is far from perfect. This raises important questions, such as: What level of control should such an interface be able to provide? What is the relationship between actual and perceived control? And in the case of applications for entertainment, in which fun is an important part of the user experience, should we even aim for perfect control, or is the optimum elsewhere? In this experiment the user plays a simple game in which a hamster has to be guided to the exit of a maze, while the amount of control the user has over the hamster is varied. Varying control through confusion matrices makes it possible to simulate the experience of using a BCI while using a traditional keyboard for input. After each session the user filled out a short questionnaire on fun and perceived control. Analysis of the data showed that the perceived control of the user could largely be explained by the amount of control in the respective session. As expected, user frustration decreases with increasing control. Moreover, the results indicate that the relation between fun and control is not linear. Although fun initially increases with improved control, the level of fun drops again just before perfect control is reached. This offers new insights for developers wanting to incorporate some form of BCI in their games: to create a fun game, unreliable input can be used to create a challenge for the user.
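The confusion-matrix manipulation can be sketched as follows: each keyboard command is executed correctly with probability equal to the control level, and otherwise replaced by a random other command. The uniform error spread and the four-action set are assumptions for illustration; the study's actual matrices may differ.

```python
import numpy as np

rng = np.random.default_rng(42)
ACTIONS = ["up", "down", "left", "right"]  # assumed command set

def make_confusion(control):
    """Confusion matrix with `control` probability of executing the intended
    command; errors are spread evenly over the other actions (assumption)."""
    n = len(ACTIONS)
    off = (1.0 - control) / (n - 1)
    return np.full((n, n), off) + np.eye(n) * (control - off)

def execute(intended_idx, control):
    """Sample the actually executed action, simulating unreliable BCI input."""
    return rng.choice(len(ACTIONS), p=make_confusion(control)[intended_idx])

# At control = 0.7, roughly 70% of commands are executed as intended
hits = sum(execute(0, 0.7) == 0 for _ in range(10000))
print(hits / 10000)
```

Sweeping the `control` parameter across sessions reproduces the experimental manipulation: perceived control and fun can then be measured against this ground-truth reliability.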
Collagen cross-linking mediated by lysyl hydroxylase 2:an enzymatic battlefield to combat fibrosis
The hallmark of fibrosis is an excessive accumulation of collagen, ultimately leading to organ failure. It has become evident that the deposited collagen also exhibits qualitative modifications. A marked modification is the increased cross-linking, which stabilizes the collagen network and limits the reversibility of fibrosis. Not only is the level of cross-linking increased, but the composition of the cross-links is also altered: an increase is seen in hydroxyallysine-derived cross-links at the expense of allysine cross-links. This results in irreversible fibrosis, as collagen cross-linked by hydroxyallysine is more difficult to degrade. Hydroxyallysine is derived from a hydroxylysine in the telopeptides of collagen. The expression of lysyl hydroxylase 2 (LH2), the enzyme responsible for the formation of telopeptidyl hydroxylysine, is universally up-regulated in fibrosis. It is expected that inhibition of this enzyme will lead to reversible fibrosis without interfering with the normal repair process. In this review, we discuss the molecular basis of collagen modifications and cross-linking, with an emphasis on LH2-mediated hydroxyallysine cross-links, and their implications for the pathogenesis and treatment of fibrosis.
The evolution of early-life effects on social behaviour-why should social adversity carry over to the future?
Numerous studies have shown that social adversity in early life can have long-lasting consequences for social behaviour in adulthood, consequences that may in turn be propagated to future generations. Given these intergenerational effects, it is puzzling why natural selection might favour such sensitivity to an individual's early social environment. To address this question, we model the evolution of social sensitivity in the development of helping behaviours, showing that natural selection indeed favours individuals whose tendency to help others is dependent on early-life social experience. In organisms with non-overlapping generations, we find that natural selection can favour positive social feedbacks, in which individuals who received more help in early life are also more likely to help others in adulthood, while individuals who received no early-life help develop low tendencies to help others later in life. This positive social sensitivity is favoured because of an intergenerational relatedness feedback: patches with many helpers tend to be more productive, leading to higher relatedness within the local group, which in turn favours higher levels of help in the next generation. In organisms with overlapping generations, this positive feedback is less likely to occur, and those who received more help may instead be less likely to help others (negative social feedback). We conclude that early-life social influences can lead to strong between-individual differences in helping behaviour, which can take different forms dependent on the life history in question. This article is part of the theme issue 'Developing differences: early-life effects and evolutionary medicine'
Automatic calcium scoring in low-dose chest CT using deep neural networks with dilated convolutions
Heavy smokers undergoing screening with low-dose chest CT are affected by
cardiovascular disease as much as by lung cancer. Low-dose chest CT scans
acquired in screening enable quantification of atherosclerotic calcifications
and thus enable identification of subjects at increased cardiovascular risk.
This paper presents a method for automatic detection of coronary artery,
thoracic aorta and cardiac valve calcifications in low-dose chest CT using two
consecutive convolutional neural networks. The first network identifies and
labels potential calcifications according to their anatomical location and the
second network identifies true calcifications among the detected candidates.
This method was trained and evaluated on a set of 1744 CT scans from the
National Lung Screening Trial. To determine whether any reconstruction or only
images reconstructed with soft tissue filters can be used for calcification
detection, we evaluated the method on soft and medium/sharp filter
reconstructions separately. On soft filter reconstructions, the method achieved
F1 scores of 0.89, 0.89, 0.67, and 0.55 for coronary artery, thoracic aorta,
aortic valve and mitral valve calcifications, respectively. On sharp filter
reconstructions, the F1 scores were 0.84, 0.81, 0.64, and 0.66, respectively.
Linearly weighted kappa coefficients for risk category assignment based on per
subject coronary artery calcium were 0.91 and 0.90 for soft and sharp filter
reconstructions, respectively. These results demonstrate that the presented
method enables reliable automatic cardiovascular risk assessment in all
low-dose chest CT scans acquired for lung cancer screening
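The two evaluation metrics the abstract reports, per-class F1 score and the linearly weighted kappa for ordinal risk categories, can be computed as below. The implementation is a generic sketch (the number of risk categories and the toy inputs are assumptions, not the study's data):

```python
import numpy as np

def f1(tp, fp, fn):
    """F1 = harmonic mean of precision and recall."""
    p, r = tp / (tp + fp), tp / (tp + fn)
    return 2 * p * r / (p + r)

def linear_weighted_kappa(a, b, k):
    """Linearly weighted Cohen's kappa for ordinal categories 0..k-1,
    as used for agreement on calcium-score risk categories."""
    a, b = np.asarray(a), np.asarray(b)
    obs = np.zeros((k, k))
    for i, j in zip(a, b):
        obs[i, j] += 1              # observed joint category counts
    obs /= len(a)
    exp = np.outer(obs.sum(1), obs.sum(0))          # chance agreement
    w = np.abs(np.subtract.outer(np.arange(k), np.arange(k)))  # linear weights
    return 1 - (w * obs).sum() / (w * exp).sum()

# Toy checks: perfect agreement gives kappa = 1; tp=80, fp=fn=10 gives F1 ~ 0.89
a = [0, 1, 2, 3, 1, 2, 0, 3]
print(linear_weighted_kappa(a, a, 4))       # 1.0
print(round(f1(tp=80, fp=10, fn=10), 2))    # 0.89
```

Weighted kappa penalizes a risk-category disagreement in proportion to how many categories apart the two assignments are, which is why it suits ordinal CAC risk strata better than plain accuracy.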
Control and tuning of a suspended Fabry-Perot cavity using digitally-enhanced heterodyne interferometry
We present the first demonstration of real-time closed-loop control and
deterministic tuning of an independently suspended Fabry-Perot optical cavity
using digitally-enhanced heterodyne interferometry, realising a peak
sensitivity of 10 pm over the 10-1000 Hz frequency
band. The methods presented are readily extensible to multiple coupled
cavities. As such, we anticipate that refinements of this technique may find
application in future interferometric gravitational-wave detectors