HIV with contact-tracing: a case study in Approximate Bayesian Computation
Missing data is a recurrent issue in epidemiology, where the infection process may be only partially observed. Approximate Bayesian Computation (ABC), an alternative to data-imputation methods such as Markov chain Monte Carlo (MCMC) integration, is proposed for making inference in epidemiological models. It is a likelihood-free method that relies exclusively on numerical simulations. ABC consists of computing a distance between simulated and observed summary statistics and weighting the simulations according to this distance. We propose an original extension of ABC to path-valued summary statistics, corresponding to the cumulative number of detections as a function of time. For a standard compartmental model with Susceptible, Infectious and Recovered individuals (SIR), we show that the posterior distributions obtained with ABC and MCMC are similar. In a refined SIR model well suited to the HIV contact-tracing data in Cuba, we compare ABC with full and binned detection times. For the Cuban data, we evaluate the efficiency of the detection system and predict the evolution of the HIV-AIDS epidemic. In particular, the percentage of undetected infectious individuals is found to be of the order of 40%.
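The simulate-compare-keep loop that ABC is built on can be sketched in a few lines. This is a minimal rejection-ABC toy on a Binomial "epidemic" model, not the paper's SIR model; the trial count, observed count, tolerance and uniform prior are all illustrative assumptions (rejection with a hard tolerance is the simplest case of the distance-based weighting described above).

```python
import random

random.seed(1)

# Toy model (assumed for illustration): each of n_trials individuals is
# infected with unknown probability p; we observe only the total count.
n_trials, observed = 100, 37

def simulate(p):
    """Forward-simulate the model; no likelihood is ever evaluated."""
    return sum(random.random() < p for _ in range(n_trials))

accepted = []
epsilon = 2  # tolerance on the summary-statistic distance
for _ in range(20_000):
    p = random.random()  # draw from a Uniform(0, 1) prior
    if abs(simulate(p) - observed) <= epsilon:
        accepted.append(p)  # keep draws whose summaries are close to the data

posterior_mean = sum(accepted) / len(accepted)
print(f"accepted {len(accepted)} draws, posterior mean p = {posterior_mean:.2f}")
```

The accepted draws approximate the posterior of p; shrinking `epsilon` trades acceptance rate for accuracy, which is the same trade-off the path-valued version of the method faces.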
Didactic Physician Assistant Students’ Perceptions of Evidence-Based Medicine Resources
This is a poster for a presentation at the virtual Midwest/MCMLA Joint Annual Meeting on October 15, 2021
Predictors of mortality in primary antiphospholipid syndrome. A single-centre cohort study.
The vascular mortality of antiphospholipid syndrome (APS) ranges from 1.4% to 5.5%, but its predictors are poorly known. The study objective was to evaluate the impact on mortality in primary APS (PAPS) of baseline lupus anticoagulant assays, IgG anticardiolipin (aCL), plasma fibrinogen (FNG) and von Willebrand factor (VWF), platelets (PLT), and of the genetic polymorphisms methylenetetrahydrofolate reductase C677T, prothrombin G20210A and paraoxonase-1 Q192R. This was a cohort study of 77 thrombotic PAPS patients and 33 asymptomatic carriers of aPL (PCaPL) seen from 1989 to 2015 and persistently positive for aPL as per annual review. At baseline, all participants were tested twice for the ratios of kaolin clotting time (KCTr), activated partial thromboplastin time (aPTTr) and dilute Russell viper venom time (DRVVTr), for IgG aCL, FNG and VWF, and once for PLT. All thrombotic PAPS patients were on warfarin with regular INR monitoring. During follow-up, 11 PAPS patients died (D-PAPS) of recurrent thrombosis despite adequate anticoagulation, yielding an overall vascular mortality of 10%. D-PAPS had the strongest baseline aPTTr and DRVVTr and the highest mean baseline IgG aCL, FNG, VWF and PLT. A Cox proportional hazards model identified baseline DRVVTr and FNG as the main predictors of mortality, with adjusted hazard ratios of 5.75 (95% confidence interval [CI]: 1.5, 22.4) and 1.03 (95% CI: 1.01, 1.04), respectively. In conclusion, plasma DRVVTr and FNG are strong predictors of vascular mortality in PAPS; while FNG-lowering agents exist, further research should be directed at therapeutic strategies able to dampen aPL production.
Dark field Z-scan microscopic configuration for nonlinear optical measurements: Numerical study
This study presents numerical simulations to optimize the parameters of the Dark Field Z-scan (DFZ-scan) in a microscopic configuration for third-order nonlinear (NL) refraction measurements in thin films. The method allows dynamic nonlinear phase shifts in transparent media to be clearly visualized. The simulated images are obtained for very low induced refractive-index changes. Dark-field illumination requires blocking the central light that ordinarily passes through and around the NL specimen. A table approximating circular aperture-stop size versus magnification, depending on the lens focusing into the tested material, is given.
The standard error of measurement is a more appropriate measure of quality for postgraduate medical assessments than is reliability: an analysis of MRCP(UK) examinations
Background: Cronbach's alpha is widely used as the preferred index of reliability for postgraduate medical examinations. A value of 0.8-0.9 is seen by providers and regulators alike as an adequate demonstration of acceptable reliability for any assessment. Of the other statistical parameters, the Standard Error of Measurement (SEM) is mainly seen as useful only in determining the accuracy of a pass mark. However, the alpha coefficient depends both on the SEM and on the ability range (standard deviation, SD) of candidates taking an exam. This study investigated the extent to which the necessarily narrower ability range of candidates taking the second of the three-part MRCP(UK) diploma examinations biases the assessment of reliability and SEM.
Methods: a) The interrelationships of standard deviation (SD), SEM and reliability were investigated in a Monte Carlo simulation of 10,000 candidates taking a postgraduate examination. b) Reliability and SEM were studied in the MRCP(UK) Part 1 and Part 2 Written Examinations from 2002 to 2008. c) Reliability and SEM were studied in eight Specialty Certificate Examinations introduced in 2008-9.
Results: The Monte Carlo simulation showed, as expected, that restricting the range of an assessment only to those who had already passed it dramatically reduced the reliability but did not affect the SEM of a simulated assessment. The analysis of the MRCP(UK) Part 1 and Part 2 written examinations showed that the Part 2 written examination had a lower reliability than the Part 1 examination but, despite that lower reliability, also a smaller SEM (indicating a more accurate assessment). The Specialty Certificate Examinations had small Ns and, as a result, wide variability in their reliabilities, but their SEMs were comparable with MRCP(UK) Part 2.
Conclusions: An emphasis upon assessing the quality of assessments primarily in terms of reliability alone can produce a paradoxical and distorted picture, particularly where a narrower range of candidate ability is an inevitable consequence of being able to take a second-part examination only after passing the first part. Reliability also shows problems when numbers of candidates in examinations are low and sampling error affects the range of candidate ability. The SEM is not subject to such problems; it is therefore a better measure of the quality of an assessment and is recommended for routine use.
High on-clopidogrel platelet reactivity in ischaemic stroke or transient ischaemic attack: Systematic review and meta-analysis
Objectives
To assess the prevalence of high on-clopidogrel platelet reactivity (HCPR) in patients with ischaemic stroke or transient ischaemic attack (IS/TIA), their outcomes, and the genetic basis of on-treatment response variability in IS/TIA patients.
Methods
We conducted a comprehensive search of PubMed and EMBASE from their inceptions to March 9, 2019. Studies that reported absolute numbers/percentages of HCPR at any time point after IS/TIA onset, evaluated with any type of platelet function test, along with clinical outcomes and genotyping data, were included.
Results
Among 21 studies of 4312 IS/TIA patients treated with clopidogrel, the pooled prevalence of HCPR was 28% (95% CI: 24–32%; high heterogeneity: I2 = 88.2%, p < 0.001). The degree of heterogeneity diminished across groups defined by the HCPR testing method. Clopidogrel non-responder IS/TIA patients had a poorer outcome compared to responders (RR = 2.09, 95% CI: 1.61–2.70; p = 0.036; low heterogeneity across studies: I2 = 27.4%, p = 0.210). IS/TIA carriers of CYP2C19*2 or CYP2C19*3 loss-of-function alleles had a higher risk of HCPR compared to wild-type carriers (RR = 1.69, 95% CI: 1.47–1.95; p < 0.001; I2 = 0.01%, p = 0.475).
Conclusions
This systematic review shows a high prevalence of clopidogrel resistance in IS/TIA and a poor outcome in these patients. CYP2C19 polymorphisms may influence clopidogrel resistance.
Period changes in six semi-detached Algol-type binaries
Six semi-detached Algol-type binaries lacking a period analysis were chosen to test for the presence of a third body. The O-C diagrams of these binaries were analyzed with the least-squares method using all available times of minima. Fourteen new minima, obtained from our observations, were also included in the present research. The light-time effect was adopted as the main factor in the detailed description of the long-term period changes. Third bodies were found with orbital periods from 46 to 84 years and eccentricities from 0.0 to 0.78 for the selected binaries. The mass functions and the minimal masses of such bodies were also calculated.
Comment: 14 pages, 8 figures
Crafting organization
The recent shift in attention away from organization studies as science has allowed for consideration of new ways of thinking about both organization and organizing and has led to several recent attempts to 'bring down' organizational theorizing. In this paper, we extend calls for organization to be represented as a creative process by considering organization as craft. Organizational craft, we argue, is attractive, accessible, malleable, reproducible, and marketable. It is also a tangible way of considering organization studies with irreverence. We draw on the hierarchy of distinctions among fine art, decorative art, and craft to suggest that understanding the organization of craft assists in complicating our understanding of marginality. We illustrate our argument by drawing on the case of a contemporary Australian craftworks and marketplace known initially as the Meat Market Craft Centre ('MMCC') and then, until its recent closure, as Metro! ‡ Stella Minahan was a board member and then the Chief Executive Officer of the Metro! Craft Centre.