    Modeling alcohol use disorder severity: An integrative structural equation modeling approach

    Background: Alcohol dependence is a complex psychological disorder whose phenomenology changes as the disorder progresses. Neuroscience has provided a variety of theories and evidence for the development, maintenance, and severity of addiction; clinically, however, alcohol use disorder (AUD) severity has been difficult to evaluate. Objective: This study seeks to evaluate and validate a data-driven approach to capturing alcohol severity in a community sample. Method: Participants were non-treatment-seeking problem drinkers (n = 283). A structural equation modeling approach was used to (a) verify the latent factor structure of the indices of AUD severity and (b) test the relationship between the AUD severity factor and measures of alcohol use, affective symptoms, and motivation to change drinking. Results: The model fit well, with all chosen indices of AUD severity loading significantly and positively onto the severity factor. In addition, the paths from the alcohol use, motivation, and affective factors accounted for 68% of the variance in AUD severity. Greater AUD severity was associated with greater alcohol use, increased affective symptoms, and higher motivation to change. Conclusion: Unlike the categorical diagnostic criteria, the AUD severity factor comprises multiple quantitative dimensions of impairment observed across the progression of the disorder. The AUD severity factor was validated by testing it in relation to other outcomes such as alcohol use, affective symptoms, and motivation for change. Clinically, this approach to AUD severity can be used to inform treatment planning and ultimately to improve outcomes. © 2013 Moallem, Courtney, Bacio and Ray

    Rate optimal multiple testing procedure in high-dimensional regression

    Multiple testing and variable selection have gained much attention in statistical theory and methodology research. Both address the same problem of identifying the important variables among many (Jin, 2012); however, there is little overlap in the literature. Research on variable selection has focused on selection consistency, i.e., both type I and type II errors converging to zero. This is only possible when the signals are sufficiently strong, which is not the case in many modern applications. In the regime where the signals are both rare and weak, a certain amount of false discoveries is inevitably allowed, as long as some error rate can be controlled. In this paper, motivated by the research of Ji and Jin (2012) and Jin (2012) in the rare/weak regime, we extend their UPS procedure for variable selection to multiple testing. Under certain conditions, the new UPT procedure achieves the fastest convergence rate of marginal false non-discovery rates, while controlling the marginal false discovery rate at any designated level α asymptotically. Numerical results are provided to demonstrate the advantage of the proposed method.
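    The UPT procedure itself is not reproduced in this abstract. As a point of reference for what "controlling the false discovery rate at level α" means in practice, a minimal sketch of the classical Benjamini–Hochberg step-up procedure (a different, standard method, not the paper's UPT) is:

    ```python
    def benjamini_hochberg(pvals, alpha=0.05):
        """Return indices of hypotheses rejected while controlling FDR at level alpha.

        Classical BH step-up: sort the m p-values, find the largest rank k with
        p_(k) <= k * alpha / m, and reject the k smallest p-values.
        """
        m = len(pvals)
        order = sorted(range(m), key=lambda i: pvals[i])  # indices sorted by p-value
        k_max = 0
        for rank, i in enumerate(order, start=1):
            if pvals[i] <= rank * alpha / m:
                k_max = rank  # step-up: keep the largest qualifying rank
        return sorted(order[:k_max])

    # Example: with six p-values at alpha = 0.05, only the two smallest survive.
    rejected = benjamini_hochberg([0.01, 0.04, 0.03, 0.005, 0.20, 0.5])
    print(rejected)  # indices of the rejected hypotheses
    ```
    
    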

    Temperature and heat flux measurements: Challenges for high temperature aerospace application

    The measurement of high temperatures and the inference of heat transfer data are not strictly problems of either the high temperatures involved or the level of the heating rates to be measured at those temperatures. They are problems of the duration over which measurements are made and of the nature of the materials in which the measurements are made. Thermal measurement techniques for each application must respect and work with the unique features of that application. Six challenges in the development of measurement technology are discussed: (1) to capture the character and localized peak values within highly nonuniform heating regions; (2) to manage large volumes of thermal instrumentation in order to efficiently derive critical information; (3) to accommodate thermal sensors into practical flight structures; (4) to broaden the capabilities of thermal survey techniques to replace discrete gages in flight and on the ground; (5) to provide supporting instrumentation conduits which connect the measurement points to the thermally controlled data acquisition system; and (6) to develop a class of 'vehicle tending' thermal sensors to assure the integrity of flight vehicles in an efficient manner.

    An automated ETL for online datasets

    While using online datasets for machine learning is commonplace today, the quality of these datasets affects the performance of prediction algorithms. One method for improving the semantics of new data sources is to map these sources to a common data model or ontology. While semantic and structural heterogeneities must still be resolved, this provides a well-established approach to producing clean datasets, suitable for machine learning and analysis. However, when there is a requirement for close to real-time usage of online data, a method for dynamic Extract-Transform-Load of new source data must be developed. In this work, we present a framework for integrating online and enterprise data sources, in close to real time, to provide datasets for machine learning and predictive algorithms. An exhaustive evaluation compares a human-built data transformation process with our system's machine-generated ETL process, with very favourable results, illustrating the value and impact of an automated approach.
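    The abstract's central idea, mapping heterogeneous source records onto a common data model before loading, can be illustrated with a minimal sketch. All names and field mappings below are hypothetical, invented for illustration; a real system such as the one described would infer or generate these mappings rather than hard-code them:

    ```python
    # Hypothetical target schema that all sources are mapped onto.
    TARGET_SCHEMA = ["station_id", "timestamp", "temperature_c"]

    # Assumed per-source field mappings from source field -> target field.
    MAPPINGS = {
        "source_a": {"id": "station_id", "ts": "timestamp", "temp": "temperature_c"},
        "source_b": {"sensor": "station_id", "time": "timestamp", "celsius": "temperature_c"},
    }

    def transform(record, source):
        """Transform step of a toy ETL: rename a source record's fields onto the
        target schema, filling absent target fields with None."""
        mapping = MAPPINGS[source]
        renamed = {mapping[k]: v for k, v in record.items() if k in mapping}
        return {field: renamed.get(field) for field in TARGET_SCHEMA}

    # Two structurally different records land in the same shape.
    row_a = transform({"id": "s1", "ts": "2020-01-01T00:00", "temp": 3.5}, "source_a")
    row_b = transform({"sensor": "s2", "time": "2020-01-01T00:05", "celsius": 4.1}, "source_b")
    ```

    Resolving semantic heterogeneity (e.g., Fahrenheit vs. Celsius sources) would add per-field conversion functions to the mapping, which is where an ontology-backed approach earns its keep.
    
    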
