    Clinical biological and genetic heterogeneity of the inborn errors of pulmonary surfactant metabolism

    Pulmonary surfactant is a multimolecular complex located at the air-water interface within the alveolus, to which a range of physical (surface-active) and immune functions have been assigned. This complex consists of a surface-active lipid layer (composed mainly of phospholipids) and an aqueous subphase. From discrete surfactant sub-fractions one can isolate the strongly hydrophobic surfactant proteins B (SP-B) and C (SP-C), as well as the collectins SP-A and SP-D, which have been shown to have specific structural, metabolic, or immune properties. Inborn or acquired abnormalities of the surfactant, qualitative or quantitative in nature, account for a number of human diseases. Besides hyaline membrane disease of the preterm neonate, a cluster of hereditary or acquired lung diseases has been characterized by periodic acid-Schiff-positive material filling the alveoli. From this heterogeneous nosologic group, at least two discrete entities presently emerge. The first is SP-B deficiency, in which an essentially proteinaceous material is stored within the alveoli, and which represents an autosomal recessive Mendelian entity linked to the SFTPB gene (MIM 178640). The disease usually entails neonatal respiratory distress with a rapidly fatal outcome, although partial or transient deficiencies have also been observed. The second is alveolar proteinosis, characterized by the storage of a mixed protein and lipid material, which constitutes a relatively heterogeneous clinical and biological syndrome, especially with regard to age at onset (from the neonate through to adulthood) and the severity of associated signs. Murine models with a targeted mutation of the gene encoding granulocyte-macrophage colony-stimulating factor (GM-CSF) (Csfgm) or the beta subunit of its receptor (Il3rb1) support the hypothesis of an abnormality of surfactant turnover in which the alveolar macrophage is a key player.
    Apart from SP-B deficiency, for which a near-consensus diagnostic chart can be designed, the ascertainment of other abnormalities of surfactant metabolism is not straightforward. Disentangling this disease cluster is nevertheless essential in order to propose specific therapeutic procedures: repeated broncho-alveolar lavages, GM-CSF replacement, bone marrow grafting, or lung transplantation.

    Non-Parametric Approximations for Anisotropy Estimation in Two-dimensional Differentiable Gaussian Random Fields

    Spatially referenced data often have autocovariance functions with elliptical isolevel contours, a property known as geometric anisotropy. The anisotropy parameters include the tilt of the ellipse (orientation angle) with respect to a reference axis and the aspect ratio of the principal correlation lengths. Since these parameters are unknown a priori, sample estimates are needed to define suitable spatial models for the interpolation of incomplete data. The distribution of the anisotropy statistics is determined by a non-Gaussian sampling joint probability density. By means of analytical calculations, we derive an explicit expression for the joint probability density function of the anisotropy statistics for Gaussian, stationary, and differentiable random fields. Based on this expression, we obtain an approximate joint density which we use to formulate a statistical test for isotropy. The approximate joint density is independent of the autocovariance function and provides conservative probability and confidence regions for the anisotropy parameters. We validate the theoretical analysis by means of simulations using synthetic data, and we illustrate the detection of anisotropy changes with a case study involving background radiation exposure data. The approximate joint density provides (i) a stand-alone approximate estimate of the distribution of the anisotropy statistics, (ii) informed initial values for maximum likelihood estimation, and (iii) a useful prior for Bayesian anisotropy inference. Comment: 39 pages; 8 figures
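    A minimal sketch of the idea behind sample anisotropy statistics: for a geometrically anisotropic Gaussian field, the covariance tensor of the field's gradient encodes the orientation angle and aspect ratio. The gradient-tensor estimator and the synthetic-field parameters below are illustrative assumptions, not necessarily the exact estimator or data used in the paper.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def estimate_anisotropy(field):
        """Estimate orientation angle (radians) and aspect ratio of a
        geometrically anisotropic 2-D field from its gradient covariance
        (slope) tensor. Illustrative, not the paper's exact estimator."""
        gy, gx = np.gradient(field)                  # derivatives along rows, cols
        q = np.array([[np.mean(gx * gx), np.mean(gx * gy)],
                      [np.mean(gx * gy), np.mean(gy * gy)]])
        evals, evecs = np.linalg.eigh(q)             # eigenvalues in ascending order
        # The long correlation axis has the SMALLEST gradient variance.
        v_major = evecs[:, 0]
        angle = np.arctan2(v_major[1], v_major[0])
        angle = (angle + np.pi / 2) % np.pi - np.pi / 2   # fold to [-pi/2, pi/2)
        ratio = np.sqrt(evals[1] / evals[0])         # principal correlation-length ratio
        return angle, ratio

    # Synthetic anisotropic field: white noise smoothed more strongly along
    # the x (column) axis, so the long axis is horizontal (angle near 0).
    rng = np.random.default_rng(0)
    noise = rng.standard_normal((512, 512))
    field = gaussian_filter(noise, sigma=(2.0, 6.0), mode="wrap")
    angle, ratio = estimate_anisotropy(field)
    ```

    With the smoothing lengths chosen above, the theoretical aspect ratio is about 3 and the orientation angle is near zero; a sample estimate from one realization will scatter around those values.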

    Improving adherence to surveillance and screening recommendations for people with colorectal cancer and their first degree relatives: a randomized controlled trial

    <p>Abstract</p> <p>Background</p> <p>Colorectal cancer (CRC) is among the leading causes of cancer-related morbidity and mortality worldwide. Despite clinical practice guidelines to guide surveillance care for those who have completed treatment for this disease, as well as screening for first degree relatives of people with CRC, the level of uptake of these recommendations remains uncertain. If outcomes for both patients and their families are to be improved, it is important to establish systematic and cost-effective interventions to improve adherence to guideline recommendations for CRC surveillance and screening.</p> <p>Methods/Design</p> <p>A randomized controlled trial will be used to test the effectiveness of a print-based intervention to improve adherence to colonoscopy surveillance among people with CRC and adherence to CRC screening recommendations among their first degree relatives (FDRs). People diagnosed with CRC in the past 10 months will be recruited through a population-based cancer registry. Consenting participants will be asked if their first degree relatives might also be willing to participate in the trial. Information on family history of CRC will be obtained from patients at baseline. Patients and their families will be randomized to either minimal ethical care or the print-based intervention. The print-based intervention for FDRs will be tailored to the participant's level of risk of CRC as determined by the self-reported family history assessment. Follow-up data on surveillance and screening participation will be collected from patients and their FDRs, respectively, at 12, 24, and 36 months post-recruitment. 
The primary analyses will compare levels of guideline adherence in the usual care group versus the print-based group, in the patient sample and the FDR sample respectively.</p> <p>Discussion</p> <p>Results of this study will contribute to the evidence base about effective strategies to a) improve adherence to surveillance recommendations for people with CRC; and b) improve adherence to screening recommendations for FDRs of people with CRC. The use of a population-based cancer registry to access the target population may have significant advantages in increasing the reach of the intervention.</p> <p>Trial registration</p> <p>This trial is registered with the Australian and New Zealand Clinical Trials Registry, Registration Number (ACTRN): <a href="http://www.anzctr.org.au/ACTRN12609000628246">ACTRN12609000628246</a>.</p>

    How victim age affects the context and timing of child sexual abuse: applying the routine activities approach to the first sexual abuse incident

    The aim of this study was to examine, from the routine activities approach, how victim age might help to explain the timing, context, and nature of offenders' first known contact sexual abuse incident. One hundred adult male child sexual abusers (M = 45.8 years, SD = 12.2; range = 20–84) were surveyed about the first time they had sexual contact with a child. The afternoon and early evening (between 3 pm and 9 pm) were the most common times at which sexual contact first occurred. Most incidents occurred in a home. Two-thirds of incidents occurred when another person was in close proximity, usually elsewhere in the home. Older victims were more likely to be sexually abused by someone outside their families, and in the later hours of the day, compared to younger victims. Proximity of another person (adult and/or child) appeared to have little effect on offenders' decisions to abuse, although it had some impact on the level of intrusion and duration of these incidents. Overall, the findings lend support to the application of the routine activities approach for considering how contextual risk factors (i.e., the timing and relationship context) change as children age, and raise questions about how best to conceptualize guardianship in the context of child sexual abuse. These factors should be key considerations when devising and implementing sexual abuse prevention strategies and for informing theory development.

    Mechanisms of Cognitive Impairment in Cerebral Small Vessel Disease: Multimodal MRI Results from the St George's Cognition and Neuroimaging in Stroke (SCANS) Study.

    Cerebral small vessel disease (SVD) is a common cause of vascular cognitive impairment. A number of disease features can be assessed on MRI, including lacunar infarcts, T2 lesion volume, brain atrophy, and cerebral microbleeds. In addition, diffusion tensor imaging (DTI) is sensitive to disruption of white matter ultrastructure, and recently it has been suggested that additional information on the pattern of damage may be obtained from axial diffusivity, a proposed marker of axonal damage, and radial diffusivity, an indicator of demyelination. We determined the contribution of these whole-brain MRI markers to cognitive impairment in SVD. Consecutive patients with lacunar stroke and confluent leukoaraiosis were recruited into the ongoing SCANS study of cognitive impairment in SVD (n = 115), and underwent neuropsychological assessment and multimodal MRI. SVD subjects displayed poor performance on tests of executive function and processing speed. In the SVD group, brain volume was lower, white matter hyperintensity volume was higher, and all diffusion characteristics differed significantly from those of control subjects (n = 50). On multi-predictor analysis, independent predictors of executive function in SVD were lacunar infarct count and diffusivity of normal-appearing white matter on DTI. Independent predictors of processing speed were lacunar infarct count and brain atrophy. Radial diffusivity was a stronger DTI predictor than axial diffusivity, suggesting that ischaemic demyelination, seen neuropathologically in SVD, may be an important predictor of cognitive impairment in SVD. Our study provides information on the mechanism of cognitive impairment in SVD.
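    The DTI scalar measures named above have standard definitions in terms of the diffusion tensor's eigenvalues. A small illustrative computation, using made-up example values rather than SCANS study data:

    ```python
    import numpy as np

    def dti_scalars(tensor):
        """Return (AD, RD, MD, FA) for a 3x3 symmetric diffusion tensor,
        using the standard eigenvalue-based definitions."""
        lam = np.sort(np.linalg.eigvalsh(tensor))[::-1]   # λ1 >= λ2 >= λ3
        ad = lam[0]                     # axial diffusivity: along the principal axis
        rd = (lam[1] + lam[2]) / 2      # radial diffusivity: perpendicular to it
        md = lam.mean()                 # mean diffusivity
        # fractional anisotropy: normalized spread of the eigenvalues
        fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))
        return ad, rd, md, fa

    # Example: a strongly anisotropic voxel (diffusivities in mm^2/s)
    tensor = np.diag([1.7e-3, 0.3e-3, 0.3e-3])
    ad, rd, md, fa = dti_scalars(tensor)
    ```

    For this example voxel the axial diffusivity is 1.7e-3, the radial diffusivity 0.3e-3, and the fractional anisotropy about 0.8, consistent with coherent white matter; demyelination raises RD while leaving AD relatively preserved.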

    A process pattern model for tackling and improving big data quality

    Data seldom create value by themselves. They need to be linked and combined from multiple sources, which often come with variable data quality. The task of improving data quality is a recurring challenge. In this paper, we use a case study of a large telecom company to develop a generic process pattern model for improving data quality. The process pattern model is defined as a proven series of activities aimed at improving data quality given a certain context, a particular objective, and a specific set of initial conditions. Four different patterns are derived to deal with variations in the data quality of datasets. Instead of having to devise a way to improve the quality of big data for each situation, data users are provided with generic patterns that can be used as a reference model for improving big data quality.

    The health disparities cancer collaborative: a case study of practice registry measurement in a quality improvement collaborative

    <p>Abstract</p> <p>Background</p> <p>Practice registry measurement provides a foundation for quality improvement, but experiences in practice are not widely reported. One setting where practice registry measurement has been implemented is the Health Resources and Services Administration's Health Disparities Cancer Collaborative (HDCC).</p> <p>Methods</p> <p>Using practice registry data from 16 community health centers participating in the HDCC, we determined the completeness of data for screening, follow-up, and treatment measures. We determined the size of the change in cancer care processes that an aggregation of practices has adequate power to detect. We modeled different ways of presenting before/after changes in cancer screening, including count and proportion data at both the individual health center and aggregate collaborative level.</p> <p>Results</p> <p>All participating health centers reported data for cancer screening, but less than a third reported data regarding timely follow-up. For individual cancers, the aggregate HDCC had adequate power to detect a 2 to 3% change in cancer screening, but only had the power to detect a change of 40% or more in the initiation of treatment. Almost every health center (98%) improved cancer screening based upon count data, while fewer (77%) improved cancer screening based upon proportion data. The aggregate collaborative appeared to increase breast, cervical, and colorectal cancer screening rates by 12%, 15%, and 4%, respectively (p < 0.001 for all before/after comparisons). In subgroup analyses, significant changes were detectable among individual health centers less than one-half of the time because of small numbers of events.</p> <p>Conclusions</p> <p>The aggregate HDCC registries had both adequate reporting rates and power to detect significant changes in cancer screening, but not follow-up care. 
Different measures provided different answers about improvements in cancer screening; more definitive evaluation would require validation of the registries. Limits to the implementation and interpretation of practice registry measurement in the HDCC highlight challenges and opportunities for local and aggregate quality improvement activities.</p>
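    The contrast between screening and treatment measures comes down to sample-size arithmetic: the minimum detectable before/after change in a proportion shrinks with the square root of the number of events. A back-of-the-envelope sketch using the normal approximation for a two-sample test; the sample sizes and baseline rate below are illustrative assumptions, not the HDCC registry counts.

    ```python
    from statistics import NormalDist

    def min_detectable_change(n, p=0.5, alpha=0.05, power=0.80):
        """Smallest absolute change in a proportion detectable when comparing
        two independent samples of size n each (normal approximation,
        two-sided test). Baseline proportion p=0.5 is the conservative case."""
        z = NormalDist()
        z_alpha = z.inv_cdf(1 - alpha / 2)   # critical value for the test
        z_power = z.inv_cdf(power)           # quantile for the target power
        return (z_alpha + z_power) * (2 * p * (1 - p) / n) ** 0.5

    # Aggregated screening data (many events) vs. treatment initiation (few)
    screening = min_detectable_change(20_000)   # ~0.014, i.e. a 1-2% change
    treatment = min_detectable_change(50)       # ~0.28, i.e. only large changes
    ```

    With tens of thousands of screening events, changes of a couple of percentage points are detectable, while a measure with only dozens of events can only reveal very large shifts; this is the same pattern the collaborative observed.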

    Microservice Transition and its Granularity Problem: A Systematic Mapping Study

    Microservices have gained wide recognition and acceptance in software industries as an emerging architectural style for autonomic, scalable, and more reliable computing. The transition to microservices has been highly motivated by the need for better alignment of technical design decisions with the improving value potential of architectures. Despite microservices' popularity, research still lacks a disciplined understanding of the transition and consensus on the principles and activities underlying "micro-ing" architectures. In this paper, we report on a systematic mapping study that consolidates various views, approaches, and activities that commonly assist in the transition to microservices. The study aims to provide a better understanding of the transition; it also contributes a working definition of the transition and the technical activities underlying it. We term the transition and technical activities leading to microservice architectures microservitization. We then shed light on a fundamental problem of microservitization: microservice granularity and reasoning about its adaptation as a first-class entity. This study reviews the state of the art and practice related to reasoning about microservice granularity; it reviews the modelling approaches, aspects considered, guidelines, and processes used to reason about microservice granularity. This study identifies opportunities for future research and development related to reasoning about microservice granularity. Comment: 36 pages including references, 6 figures, and 3 tables