
    Exploring differential item functioning in the SF-36 by demographic, clinical, psychological and social factors in an osteoarthritis population

    The SF-36 is a very commonly used generic measure of health outcome in osteoarthritis (OA). An important, but frequently overlooked, aspect of validating health outcome measures is to establish whether items work in the same way across subgroups of a population. That is, if respondents have the same 'true' level of outcome, does the item give the same score in different subgroups, or is it biased towards one subgroup or another? Differential item functioning (DIF) can identify items that may be biased for one group or another and has been applied to measuring patient-reported outcomes. Items may show DIF for different conditions and between cultures; however, the SF-36 has not been specifically examined in an osteoarthritis population nor in a UK population. Hence, the aim of the study was to apply the DIF method to the SF-36 for a UK OA population. The sample comprised a community sample of 763 people with OA who participated in the Somerset and Avon Survey of Health. The SF-36 was explored for DIF with respect to demographic, social, clinical and psychological factors. Well-developed ordinal regression models were used to identify DIF items. Results: DIF items were found by age (6 items), employment status (6 items), social class (2 items), mood (2 items), hip versus knee (2 items), social deprivation (1 item) and body mass index (1 item). Although the impact of the DIF items rarely had a significant effect on the conclusions of group comparisons, in most cases there was a significant change in effect size. Overall, the SF-36 performed well, with only a small number of DIF items identified, a reassuring finding in view of the frequent use of the SF-36 in OA. Nevertheless, where DIF items were identified it would be advisable to analyse data taking account of DIF items, especially when age effects are the focus of interest.
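    The ordinal regression approach used in the study can be illustrated in miniature. The sketch below uses a different but standard DIF technique, the Mantel-Haenszel statistic on a dichotomised item, with wholly invented counts: respondents are stratified by matched total score, and a common odds ratio far from 1 flags the item as functioning differently across groups of equal underlying outcome.

```python
# Hedged sketch of DIF screening via the Mantel-Haenszel common odds ratio.
# All counts are made-up illustrative numbers, not data from the study.
# Each stratum matches respondents on total score; within a stratum, a
# DIF-free item should behave similarly for reference and focal groups.
# Tuples: (ref_correct, ref_incorrect, focal_correct, focal_incorrect)
strata = [
    (30, 70, 10, 90),   # low total-score stratum
    (60, 40, 30, 70),   # middle stratum
    (90, 10, 70, 30),   # high stratum
]

num = den = 0.0
for a, b, c, d in strata:      # a,b: reference group; c,d: focal group
    n = a + b + c + d
    num += a * d / n           # reference-correct x focal-incorrect
    den += b * c / n           # reference-incorrect x focal-correct

or_mh = num / den              # common odds ratio; 1.0 means no DIF
print(round(or_mh, 3))
```

    Here the focal group scores consistently lower at every matched ability level, so the odds ratio is well above 1 and the item would be flagged for DIF.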

    Quantum resource estimates for computing elliptic curve discrete logarithms

    We give precise quantum resource estimates for Shor's algorithm to compute discrete logarithms on elliptic curves over prime fields. The estimates are derived from a simulation of a Toffoli gate network for controlled elliptic curve point addition, implemented within the framework of the quantum computing software tool suite LIQUi|⟩. We determine circuit implementations for reversible modular arithmetic, including modular addition, multiplication and inversion, as well as reversible elliptic curve point addition. We conclude that elliptic curve discrete logarithms on an elliptic curve defined over an n-bit prime field can be computed on a quantum computer with at most 9n + 2⌈log₂(n)⌉ + 10 qubits using a quantum circuit of at most 448n³ log₂(n) + 4090n³ Toffoli gates. We are able to classically simulate the Toffoli networks corresponding to the controlled elliptic curve point addition as the core piece of Shor's algorithm for the NIST standard curves P-192, P-224, P-256, P-384 and P-521. Our approach allows gate-level comparisons to recent resource estimates for Shor's factoring algorithm. The results also support estimates given earlier by Proos and Zalka and indicate that, for current parameters at comparable classical security levels, the number of qubits required to tackle elliptic curves is less than for attacking RSA, suggesting that indeed ECC is an easier target than RSA. Comment: 24 pages, 2 tables, 11 figures. v2: typos fixed and reference added. ASIACRYPT 201
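    The two bounds quoted in the abstract are simple closed-form expressions, so the headline numbers can be reproduced directly. The sketch below just evaluates the paper's stated upper bounds (not exact gate counts) for the NIST field sizes:

```python
# Evaluating the abstract's upper-bound formulas for Shor's ECC discrete-log
# circuit over an n-bit prime field.
from math import ceil, log2

def qubits(n):
    """Upper bound on qubits: 9n + 2*ceil(log2(n)) + 10."""
    return 9 * n + 2 * ceil(log2(n)) + 10

def toffoli_gates(n):
    """Upper bound on Toffoli gates: 448 * n^3 * log2(n) + 4090 * n^3."""
    return 448 * n**3 * log2(n) + 4090 * n**3

for n in (192, 224, 256, 384, 521):   # NIST curve field sizes P-192 .. P-521
    print(n, qubits(n), f"{toffoli_gates(n):.2e}")
```

    For P-256 this gives 2330 qubits and roughly 1.3 x 10^11 Toffoli gates.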

    An Optimal Distributed Discrete Log Protocol with Applications to Homomorphic Secret Sharing

    The distributed discrete logarithm (DDL) problem was introduced by Boyle et al. at CRYPTO 2016. A protocol solving this problem was the main tool used in the share conversion procedure of their homomorphic secret sharing (HSS) scheme, which allows non-interactive evaluation of branching programs among two parties over shares of secret inputs. Let g be a generator of a multiplicative group G. Given a random group element g^x and an unknown integer b ∈ [-M, M] for a small M, two parties A and B (that cannot communicate) successfully solve DDL if A(g^x) - B(g^{x+b}) = b. Otherwise, the parties err. In the DDL protocol of Boyle et al., A and B run in time T and have error probability that is roughly linear in M/T. Since it has a significant impact on the HSS scheme's performance, a major open problem raised by Boyle et al. was to reduce the error probability as a function of T. In this paper we devise a new DDL protocol that substantially reduces the error probability to O(M·T⁻²). Our new protocol improves the asymptotic evaluation time complexity of the HSS scheme by Boyle et al. on branching programs of size S from O(S²) to O(S^{3/2}). We further show that our protocol is optimal up to a constant factor for all relevant cryptographic group families, unless one can solve the discrete logarithm problem in a short interval of length R in time o(√R). Our DDL protocol is based on a new type of random walk that is composed of several iterations in which the expected step length gradually increases. We believe that this random walk is of independent interest and will find additional applications.
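    The success condition A(g^x) - B(g^{x+b}) = b can be demonstrated with the classic distinguished-point idea, a simplified stand-in for the paper's iterated random walk, in a toy group with made-up parameters: each party walks forward by repeated multiplication by g until a pseudorandom predicate fires, and the two step counts differ by exactly b whenever both walks stop at the same point.

```python
# Toy DDL sketch in Z_p^* (illustrative parameters; a real deployment uses a
# large cryptographic group and the paper's variable-step random walk).
p, g = 101, 2                 # small prime group and generator, assumed here

def distinguished(h):
    """Pseudorandom 'distinguished point' predicate; hit rate ~1/8 here."""
    return h % 8 == 3

def walk_steps(h, max_steps=1000):
    """Steps until the walk h, h*g, h*g^2, ... reaches a distinguished point."""
    for i in range(max_steps):
        if distinguished(h):
            return i
        h = h * g % p
    raise RuntimeError("no distinguished point found")

x, b = 10, 2                            # b is the small hidden offset in [-M, M]
out_A = walk_steps(pow(g, x, p))        # party A only sees g^x
out_B = walk_steps(pow(g, x + b, p))    # party B only sees g^(x+b)
assert out_A - out_B == b               # the parties agree without communicating
```

    The protocol errs exactly when a distinguished point falls inside the length-b gap between the two starting elements, which is where the error probability as a function of running time comes from.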

    Quadratic optimal functional quantization of stochastic processes and numerical applications

    In this paper, we present an overview of the recent developments of functional quantization of stochastic processes, with an emphasis on the quadratic case. Functional quantization is a way to approximate a process, viewed as a Hilbert-valued random variable, using a nearest neighbour projection on a finite codebook. A special emphasis is made on the computational aspects and the numerical applications, in particular the pricing of some path-dependent European options. Comment: 41 pages
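    In finite dimension the same two ingredients, nearest-neighbour projection and a quadratic distortion criterion, reduce to ordinary Lloyd (k-means) quantization. The sketch below is a scalar stand-in with invented parameters, quantizing N(0,1) samples rather than a Hilbert-valued process:

```python
# Hedged sketch: quadratic quantization of a scalar N(0,1) variable via
# nearest-neighbour projection on a finite codebook, refined by Lloyd's
# fixed-point iteration (the paper treats the path-space analogue).
import random
random.seed(0)

samples = [random.gauss(0.0, 1.0) for _ in range(20000)]
codebook = [-1.5, -0.5, 0.5, 1.5]        # arbitrary initial 4-point codebook

def project(x, cb):
    """Nearest-neighbour projection of x onto the codebook."""
    return min(cb, key=lambda c: (x - c) ** 2)

def distortion(cb):
    """Mean quadratic quantization error over the sample."""
    return sum((x - project(x, cb)) ** 2 for x in samples) / len(samples)

d0 = distortion(codebook)
for _ in range(10):                      # Lloyd step: codeword -> mean of its cell
    cells = {c: [] for c in codebook}
    for x in samples:
        cells[project(x, codebook)].append(x)
    codebook = [sum(v) / len(v) for v in cells.values() if v]
d1 = distortion(codebook)
assert d1 <= d0                          # Lloyd iterations never increase distortion
```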

    Immunological and Inflammatory Biomarkers of Susceptibility and Severity in Adult Respiratory Syncytial Virus Infections.

    BACKGROUND: Respiratory syncytial virus (RSV) is the most common cause of bronchiolitis in young infants. However, it is also a significant pathogen in older adults. Validated biomarkers of RSV disease severity would benefit diagnostics, treatment decisions, and prophylactic interventions. This review summarizes knowledge of biomarkers for RSV disease in adults. METHODS: A literature review was performed using Ovid Medline, Embase, Global Health, Scopus, and Web of Science for articles published 1946-October 2016. Nine articles were identified, plus nine from other sources. RESULTS: From observational studies of natural infection and challenge studies in volunteers, biomarkers of RSV susceptibility or disease severity in adults were: (1) lower anti-RSV neutralizing antibodies, where neutralizing antibody (and local IgA) may be a correlate of susceptibility/severity; (2) RSV-specific CD8+ T cells in bronchoalveolar lavage fluid preinfection (subjects with higher levels had less severe illness); and (3) elevated interleukin-6 (IL-6), IL-8, and myeloperoxidase levels in the airway, which are indicative of severe infection. CONCLUSIONS: Factors determining susceptibility to and severity of RSV disease in adults have not been well defined. Respiratory mucosal antibodies and CD8+ T cells appear to contribute to preventing infection and modulation of disease severity. Studies of RSV pathogenesis in at-risk populations are needed.

    Estimation of the solubility parameters of model plant surfaces and agrochemicals: a valuable tool for understanding plant surface interactions

    Background Most aerial plant parts are covered with a hydrophobic lipid-rich cuticle, which is the interface between the plant organs and the surrounding environment. Plant surfaces may have a high degree of hydrophobicity because of the combined effects of surface chemistry and roughness. The physical and chemical complexity of the plant cuticle limits the development of models that explain its internal structure and interactions with surface-applied agrochemicals. In this article we introduce a thermodynamic method for estimating the solubilities of model plant surface constituents and relating them to the effects of agrochemicals. Results Following the van Krevelen and Hoftyzer method, we calculated the solubility parameters of three model plant species and eight compounds that differ in hydrophobicity and polarity. In addition, intact tissues were examined by scanning electron microscopy and the surface free energy, polarity, solubility parameter and work of adhesion of each were calculated from contact angle measurements of three liquids with different polarities. By comparing the affinities between plant surface constituents and agrochemicals derived from (a) theoretical calculations and (b) contact angle measurements we were able to distinguish the physical effect of surface roughness from the effect of the chemical nature of the epicuticular waxes. A solubility parameter model for plant surfaces is proposed on the basis of an increasing gradient from the cuticular surface towards the underlying cell wall. Conclusions The procedure enabled us to predict the interactions among agrochemicals, plant surfaces, and cuticular and cell wall components, and promises to be a useful tool for improving our understanding of biological surface interactions
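    The van Krevelen and Hoftyzer route ends in combining partial (dispersive, polar and hydrogen-bonding) contributions into a total solubility parameter via delta_t = sqrt(delta_d^2 + delta_p^2 + delta_h^2). A minimal numeric sketch, using illustrative literature-style Hansen values for water rather than any figure from this study:

```python
# Hedged sketch: total (Hildebrand-type) solubility parameter from partial
# Hansen-style components, as combined in the van Krevelen-Hoftyzer approach.
# The component values are illustrative figures for water in MPa^0.5,
# not numbers taken from the article.
from math import sqrt

def total_solubility_parameter(delta_d, delta_p, delta_h):
    """delta_t = sqrt(delta_d^2 + delta_p^2 + delta_h^2)."""
    return sqrt(delta_d**2 + delta_p**2 + delta_h**2)

delta_t = total_solubility_parameter(15.5, 16.0, 42.3)
print(round(delta_t, 1))   # ~47.8 MPa^0.5
```

    Compounds whose total parameters lie close together are predicted to have high mutual affinity, which is how such estimates feed into predicting agrochemical-cuticle interactions.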

    Measuring the ICF components of impairment, activity limitation and participation restriction: an item analysis using classical test theory and item response theory

    The International Classification of Functioning, Disability and Health (ICF) proposes three main health outcomes, Impairment (I), Activity Limitation (A) and Participation Restriction (P), but good measures of these constructs are needed. The aim of this study was to use both Classical Test Theory (CTT) and Item Response Theory (IRT) methods to carry out an item analysis to improve measurement of these three components in patients having joint replacement surgery, mainly for osteoarthritis (OA). A geographical cohort of patients about to undergo lower limb joint replacement was invited to participate. Five hundred and twenty-four patients completed ICF items that had been previously identified as measuring only a single ICF construct in patients with osteoarthritis. There were 13 I, 26 A and 20 P items. The SF-36 was used to explore the construct validity of the resultant I, A and P measures. The CTT and IRT analyses were run separately to identify items for inclusion or exclusion in the measurement of each construct. The results from both analyses were compared and contrasted. Overall, the item analysis resulted in the removal of 4 I items, 9 A items and 11 P items. CTT and IRT identified the same 14 items for removal, with CTT additionally excluding 3 items, and IRT a further 7 items. In a preliminary exploration of reliability and validity, the new measures appeared acceptable. New measures were developed that reflect the ICF components of Impairment, Activity Limitation and Participation Restriction for patients with advanced arthritis. The resulting Aberdeen IAP measures (Ab-IAP), comprising I (Ab-I, 9 items), A (Ab-A, 17 items), and P (Ab-P, 9 items), met the criteria of conventional psychometric (CTT) analyses and the additional criteria (information and discrimination) of IRT. The use of both methods was more informative than the use of only one of these methods. Thus combining CTT and IRT appears to be a valuable tool in the development of measures.
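    For a flavour of the IRT side of such an item analysis, the sketch below evaluates the two-parameter logistic (2PL) model, a dichotomous simplification of the polytomous models typically fitted to ordinal items; all parameter values are invented.

```python
# Hedged sketch: the 2PL IRT model P(endorse | theta) = 1/(1 + exp(-a(theta - b))),
# with discrimination a and difficulty b. Illustrative parameters only.
from math import exp

def p_endorse(theta, a, b):
    """2PL probability that a respondent at trait level theta endorses the item."""
    return 1.0 / (1.0 + exp(-a * (theta - b)))

# A highly discriminating item (a=2) separates respondents near b=0.5 much more
# sharply than a weak item (a=0.5); low-information items like the latter are
# the kind flagged for removal by the IRT criteria mentioned in the abstract.
print(round(p_endorse(0.5, 2.0, 0.5), 2))   # at theta == b the probability is 0.5
```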

    Bridging the Health Data Divide

    Fundamental quality, safety, and cost problems have not been resolved by the increasing digitization of health care. This digitization has progressed alongside the presence of a persistent divide between clinicians, the domain experts, and the technical experts, such as data scientists. The disconnect between clinicians and data scientists translates into a waste of research and health care resources, slow uptake of innovations, and poorer outcomes than are desirable and achievable. The divide can be narrowed by creating a culture of collaboration between these two disciplines, exemplified by events such as datathons. However, in order to more fully and meaningfully bridge the divide, the infrastructure of medical education, publication, and funding processes must evolve to support and enhance a learning health care system