How accurate is your sclerostin measurement? Comparison between three commercially available sclerostin ELISA kits
Sclerostin, a bone formation antagonist, is in the spotlight as a potential biomarker for diseases presenting with associated bone disorders, such as chronic kidney disease-mineral and bone disorder (CKD-MBD). Accurate measurement of sclerostin is therefore important. Several immunoassays are available to measure sclerostin in serum and plasma. We compared the performance of three commercial ELISA kits. We measured sclerostin concentrations in serum and EDTA plasma obtained from healthy young (18-26 years) human subjects using kits from Biomedica, TECOmedical and R&D Systems. The circulating sclerostin concentrations were systematically higher when measured with the Biomedica assay (serum: 35.5 ± 1.1 pmol/L; EDTA: 39.4 ± 2.0 pmol/L; mean ± SD) as compared with TECOmedical (serum: 21.8 ± 0.7 pmol/L; EDTA: 27.2 ± 1.3 pmol/L) and R&D Systems (serum: 7.6 ± 0.3 pmol/L; EDTA: 30.9 ± 1.5 pmol/L). We found good correlations between the assays for EDTA plasma (r > 0.6; p < 0.001), while in serum only measurements obtained using the TECOmedical and R&D Systems assays correlated significantly (r = 0.78; p < 0.001). There was no correlation between serum and EDTA plasma results when using the Biomedica kit (r = 0.20). The variability in values generated by the Biomedica, R&D Systems and TECOmedical assays raises questions regarding the accuracy and specificity of the assays. Direct comparison of studies using different kits is not possible, and great care should be given to the measurement of sclerostin, with traceability of reagents. Standardization with appropriate material is required before different sclerostin assays can be introduced into clinical practice.
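Below is a minimal, hypothetical sketch of the kind of between-assay comparison described above, assuming paired sclerostin measurements (in pmol/L) from two kits on the same samples; the values and variable names are illustrative, not study data.

```python
# Illustrative only: paired sclerostin readings (pmol/L) from two hypothetical kits
# measured on the same serum samples; values are made up for demonstration.
import numpy as np
from scipy import stats

kit_a = np.array([20.1, 22.5, 19.8, 23.0, 21.4, 24.2])
kit_b = np.array([7.2, 8.1, 7.0, 8.4, 7.6, 8.8])

r, p = stats.pearsonr(kit_a, kit_b)   # between-assay correlation
bias = np.mean(kit_a - kit_b)         # mean systematic offset between kits
print(f"r = {r:.2f}, p = {p:.3g}, mean bias = {bias:.1f} pmol/L")
```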
Assessment of Vessel Density on Non-Contrast Computed Tomography to Detect Basilar Artery Occlusion
Introduction: Basilar artery occlusion (BAO) may be clinically occult due to variable and non-specific symptomatology. We evaluated the qualitative and quantitative determination of a hyperdense basilar artery (HDBA) on non-contrast computed tomography (NCCT) of the brain for the diagnosis of BAO. Methods: We conducted a case-control study of patients with confirmed acute BAO versus a control group of suspected acute stroke patients without BAO. Two EM attending physicians, one third-year EM resident, and one medical student performed qualitative and quantitative assessments for the presence of an HDBA on axial NCCT images. Our primary outcome measures were sensitivity and specificity for BAO. Our secondary outcomes were inter-rater and intra-rater reliability of the qualitative and quantitative assessments. Results: We included 60 BAO and 65 control patients in our analysis. Qualitative assessment of the hyperdense basilar artery sign was poorly sensitive (54%–72%) and specific (55%–89%). Quantitative measurement improved the specificity of HDBA assessment for diagnosing BAO, with a threshold of 61.0–63.8 Hounsfield units demonstrating relatively high specificity of 85%–94%. There was moderate inter-rater agreement for the qualitative assessment of HDBA (Fleiss' kappa statistic 0.508, 95% confidence interval: 0.435–0.581). Agreement improved for quantitative assessments, but still fell in the moderate range (Shrout-Fleiss intraclass correlation coefficient: 0.635). Intra-rater reliability for the quantitative assessments of the two attending physician reviewers demonstrated substantial consistency. Conclusion: Our results highlight the importance of carefully examining basilar artery density when interpreting the NCCT of patients with altered consciousness or other signs and symptoms concerning for an acute basilar artery occlusion. If the Hounsfield unit density of the basilar artery exceeds 61 Hounsfield units, BAO should be highly suspected.
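The quantitative threshold reported above lends itself to a simple illustration; the sketch below uses fabricated Hounsfield-unit values and occlusion labels (not the study data) to show how sensitivity and specificity at a 61 HU cut-off would be computed.

```python
# Illustrative only: fabricated basilar-artery densities (HU) and occlusion labels.
import numpy as np

hu = np.array([68.0, 72.5, 59.0, 64.1, 48.3, 55.0, 62.7, 44.9])
bao = np.array([1, 1, 1, 1, 0, 0, 1, 0], dtype=bool)   # confirmed occlusion (toy labels)

threshold = 61.0                     # cut-off reported in the abstract
predicted = hu >= threshold          # flag as suspected BAO

sensitivity = (predicted & bao).sum() / bao.sum()
specificity = (~predicted & ~bao).sum() / (~bao).sum()
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```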
Evidence for changes in historic and future groundwater levels in the UK
We examine the evidence for climate-change impacts on groundwater levels provided by studies of the historical observational record, and future climate-change impact modelling. To date, no evidence has been found for systematic changes in groundwater drought frequency or intensity in the UK, but some evidence of multi-annual to decadal coherence of groundwater levels and large-scale climate indices has been found, which should be considered when trying to identify any trends. We analyse trends in long groundwater level time-series monitored in seven observation boreholes in the Chalk aquifer, and identify statistically significant declines at four of these sites, but do not attempt to attribute these to a change in a stimulus. The evidence for the impacts of future climate change on UK groundwater recharge and levels is limited. The number of studies that have been undertaken is small and different approaches have been adopted to quantify impacts. Furthermore, these studies have generally focused on relatively small regions and reported local findings. Consequently, it has been difficult to compare them between locations. We undertake some additional analysis of the probabilistic outputs of the one recent impact study that has produced coherent multi-site projections of changes in groundwater levels. These results suggest reductions in annual and average summer levels, and increases in average winter levels, by the 2050s under a high greenhouse gas emissions scenario, at most of the sites modelled, when expressed by the median of the ensemble of simulations. It is concluded, however, that local hydrogeological conditions can be an important control on the simulated response to a future climate projection.
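As an aside on the trend analysis mentioned above, a rank-based test such as Kendall's tau is one common way to flag a statistically significant monotonic decline in a long groundwater-level series; the sketch below uses a synthetic series, not the Chalk borehole records, and the specific test is our assumption rather than the authors' stated method.

```python
# Illustrative only: synthetic annual groundwater levels with a slow decline plus noise.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
years = np.arange(1970, 2015)
levels = 45.0 - 0.03 * (years - 1970) + rng.normal(0.0, 0.5, years.size)

tau, p = stats.kendalltau(years, levels)   # rank correlation of level against time
trend = "significant decline" if (p < 0.05 and tau < 0) else "no significant trend"
print(f"Kendall tau = {tau:.2f}, p = {p:.3g} -> {trend}")
```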
Probing ultracold Fermi gases with light-induced gauge potentials
We theoretically investigate the response of a two-component Fermi gas to vector potentials which couple separately to the two spin components. Such vector potentials may be implemented in ultracold atomic gases using optically dressed states. Our study indicates that light-induced gauge potentials may be used to probe the properties of the interacting ultracold Fermi gas, providing, amongst other things, ways to measure the superfluid density and the strength of pairing.
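For orientation, a spin-dependent vector potential of the kind described can be written schematically as below; this is a generic illustrative form, not necessarily the exact Hamiltonian used in the paper.

```latex
% Generic single-particle Hamiltonian with spin-dependent vector potentials
% (illustrative form only; the paper's specific gauge fields may differ).
\[
  H = \sum_{\sigma=\uparrow,\downarrow}
      \frac{\bigl(\mathbf{p} - \mathbf{A}_{\sigma}(\mathbf{r})\bigr)^{2}}{2m}
      + H_{\mathrm{int}},
\]
% where $\mathbf{A}_{\uparrow}$ and $\mathbf{A}_{\downarrow}$ couple separately
% to the two spin components and $H_{\mathrm{int}}$ contains the interactions.
```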
Path-based Design Model for Constructing and Exploring Alternative Visualisations
We present a path-based design model and system for designing and creating visualisations. Our model represents a systematic approach to constructing visual representations of data or concepts following a predefined sequence of steps. The initial step involves outlining the overall appearance of the visualisation by creating a skeleton structure, referred to as a flowpath. Subsequently, we specify objects, visual marks, properties, and appearance, storing them in a gene. Lastly, we map data onto the flowpath, ensuring suitable morphisms. Alternative designs are created by exchanging values in the gene. For example, designs that share similar traits are created by making small incremental changes to the gene. Our design methodology fosters the generation of diverse creative concepts, space-filling visualisations, and traditional formats like bar charts, circular plots, and pie charts. Through our implementation, we showcase the model in action. As an example application, we integrate the output visualisations onto a smartwatch and into visualisation dashboards. In this article we (1) introduce, define and explain the path model and discuss possibilities for its use, (2) present our implementation, results, and evaluation, and (3) demonstrate and evaluate an application of its use on a mobile watch.
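To make the flowpath and gene terminology concrete, here is a hypothetical Python sketch; the names (Gene, flowpath_circle, map_data) are our own illustrations and are not the authors' system or API.

```python
# Hypothetical illustration of the model's three steps: a flowpath skeleton,
# a gene of visual properties, and data mapped onto the path.
import math
from dataclasses import dataclass

@dataclass
class Gene:
    mark: str        # e.g. "dot", "bar", "arc"
    colour: str
    size: float

def flowpath_circle(n, radius=1.0):
    """One possible flowpath: n skeleton positions evenly spaced on a circle."""
    return [(radius * math.cos(2 * math.pi * i / n),
             radius * math.sin(2 * math.pi * i / n)) for i in range(n)]

def map_data(data, path, gene):
    """Attach each data value to a flowpath position, styled by the gene."""
    return [{"x": x, "y": y, "value": v, **vars(gene)}
            for v, (x, y) in zip(data, path)]

# Exchanging values in the gene yields an alternative design for the same flowpath.
design_a = map_data([3, 7, 5], flowpath_circle(3), Gene("dot", "steelblue", 4.0))
design_b = map_data([3, 7, 5], flowpath_circle(3), Gene("bar", "darkorange", 6.0))
```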
STI testing and subsequent clinic attendance amongst test negative asymptomatic users of an internet STI testing service:one-year retrospective study
AIM: To explore the characteristics of online STI test users, and to assess the frequency of, and factors associated with, subsequent service use following a negative online STI test screen in individuals without symptoms. METHODS: One-year retrospective study of online and clinic STI testing within a large integrated sexual health service (Umbrella in Birmingham and Solihull, England) between January and December 2017. A multivariable analysis of sociodemographic and behavioural characteristics of patients was conducted. Sexual health clinic appointments occurring within 90 days of a negative STI test, in asymptomatic individuals who tested either online or in clinic, were determined. Factors associated with online STI testing and subsequent clinic use were determined using generalised estimating equations and reported as odds ratios (OR) with corresponding 95% confidence intervals (CI). RESULTS: 31 847 online STI test requests and 40 059 clinic attendances incorporating STI testing were included. 79% (25020/31846) of online STI test users and 49% (19672/40059) of clinic STI test takers were asymptomatic. Online STI testing was less utilised (p<0.05) by men who have sex with men (MSM), non-Caucasians, and those living in neighbourhoods of greater deprivation. Subsequent clinic appointments within 90 days of an asymptomatic negative STI test occurred in 6.2% (484/7769) of the online testing group and 33% (4960/15238) of the clinic tested group. Re-attendance following online testing was associated with being MSM (aOR 2.55 [1.58 to 4.09], MSM vs female) and a recent prior history of STI testing (aOR 5.65 [4.30 to 7.43], 'clinic tested' vs no recent testing history). CONCLUSIONS: Subsequent clinic attendance amongst online STI test service users with negative test results was infrequent, suggesting that their needs were being met without placing an additional burden on clinic-based services. However, unequal use of online services by different patient groups suggests that optimised messaging and the development of online services in partnership with users are required to improve uptake.
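For readers unfamiliar with the modelling approach mentioned above, the sketch below fits a GEE logistic regression and reports exponentiated coefficients as odds ratios; it uses fabricated toy data and made-up variable names, not the Umbrella dataset.

```python
# Illustrative only: toy data standing in for per-patient testing records.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "patient_id": np.arange(n),        # one cluster per patient in this toy example
    "msm": rng.integers(0, 2, n),
    "recent_test": rng.integers(0, 2, n),
})
logit = -2.5 + 0.9 * df["msm"] + 1.7 * df["recent_test"]
df["reattended"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# GEE logistic model; exponentiated coefficients are odds ratios.
model = smf.gee("reattended ~ msm + recent_test", groups="patient_id",
                data=df, family=sm.families.Binomial())
print(np.exp(model.fit().params))
```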
The Explanatory Visualization Framework: an active learning framework for teaching creative computing using explanatory visualizations
Visualizations are nowadays appearing in popular media and are used every day in the workplace. This democratisation of visualization challenges educators to develop effective learning strategies in order to train the next generation of creative visualization specialists. There is high demand for skilled individuals who can analyse a problem, consider alternative designs, develop new visualizations, and be creative and innovative. Our three-stage framework leads the learner through a series of tasks, each designed to develop different skills necessary for coming up with creative, innovative, effective, and purposeful visualizations. For that, we get the learners to create an explanatory visualization of an algorithm of their choice. By making an algorithm choice, and by following an active-learning and project-based strategy, the learners take ownership of a particular visualization challenge. They become enthusiastic to develop good results and learn different creative skills on their learning journey.
Creating explanatory visualizations of algorithms for active learning
Visualizations have been used to explain algorithms to learners, in order to help them understand complex processes. These 'explanatory visualizations' can help learners understand computer algorithms and data structures. But most are created by an educator and merely watched by the learner. In this paper, we explain how we get learners to plan and develop their own explanatory visualizations of algorithms. By actively developing their own visualizations, learners gain a deeper insight into the algorithms that they are explaining. These depictions can also help other learners understand the algorithm.