132 research outputs found

    Prediction of tumour pathological subtype from genomic profile using sparse logistic regression with random effects

    The purpose of this study is to highlight the application of sparse logistic regression models to predicting tumour pathological subtype from the genomic profiles of lung cancer patients. We consider sparse logistic regression models to deal with the high dimensionality of, and correlation between, genomic regions. In the hierarchical likelihood (HL) method, the random effects are assumed to follow a normal distribution whose variance in turn follows a gamma distribution; this formulation includes the ridge and lasso penalties as special cases. We extend the HL penalty by adding a ridge penalty (called 'HLnet'), following the same principle as the elastic net penalty, which is constructed from the lasso penalty. The results indicate that the HL penalty produces sparser estimates than the lasso penalty with comparable prediction performance, while the HLnet and elastic net penalties give the best prediction performance on real data. We illustrate the methods in a lung cancer study.
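    The HL and HLnet penalties described above are not standard library estimators, but the lasso and elastic-net baselines they are compared against are. Below is a minimal Python sketch of that comparison using scikit-learn on synthetic data standing in for a high-dimensional, correlated genomic matrix; all parameter values are illustrative assumptions, not taken from the study.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a high-dimensional, correlated genomic matrix.
X, y = make_classification(n_samples=200, n_features=2000, n_informative=30,
                           n_redundant=100, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "lasso": LogisticRegression(penalty="l1", solver="saga", C=0.1, max_iter=5000),
    "elastic net": LogisticRegression(penalty="elasticnet", l1_ratio=0.5,
                                      solver="saga", C=0.1, max_iter=5000),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    n_selected = np.count_nonzero(model.coef_)  # sparsity of the fitted coefficient vector
    print(f"{name}: AUC = {auc:.3f}, non-zero coefficients = {n_selected}")
```

    Counting the non-zero coefficients gives the kind of sparsity comparison the abstract refers to; reproducing the HL/HLnet results would require a bespoke implementation of the hierarchical likelihood penalties.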

    Quantifying the Storm Time Thermospheric Neutral Density Variations Using Model and Observations

    Accurate determination of thermospheric neutral density is of crucial importance for satellite drag calculations. The problem is twofold, involving correct estimation of both the quiet-time climatology and the storm-time variations. In this work, neutral density estimates from two empirical and three physics-based models of the ionosphere-thermosphere are compared with the neutral densities along the Challenging Micro-Satellite Payload satellite track for six geomagnetic storms. Storm-time variations are extracted from the neutral density by (1) subtracting the mean difference between model and observation (the bias), (2) setting the climatological variations to zero, and (3) multiplying the model data by the quiet-time ratio between model and observation. Several metrics are employed to evaluate model performance. We find that removing the bias or the climatology reveals the actual performance of a model in simulating the storm-time variations. When the bias is removed, depending on the event and the model, storm-time errors in neutral density can decrease by as much as 113% or increase by as much as 12% relative to the error of models retaining the quiet-time bias. It is shown that using only the average and maximum values of neutral density to judge model performance can be misleading, since a model can estimate the average fairly well yet fail to capture the maximum, or vice versa. Since each metric captures a different aspect of the error, we suggest employing mean absolute error, prediction efficiency, and normalized root-mean-square error together as a standard set of metrics for neutral density.
    Plain Language Summary: Thermospheric neutral density is the largest source of uncertainty in atmospheric drag calculations. Consequently, mission and maneuver planning, satellite lifetime prediction, collision avoidance, and orbit determination all depend on its accurate estimation. Thermospheric neutral density varies on different timescales; on short timescales the largest variations are caused by geomagnetic storms. Several empirical and physics-based models of the ionosphere-thermosphere system are used to estimate these variations. However, the storm-time responses of the models are clouded by the climatology (background variations) upon which the effect of geomagnetic storms is superimposed. In this work we show that it is critical to use reference levels for the neutral density in order to extract the true storm-time performance of the models. We demonstrate that mean absolute error, prediction efficiency, and normalized root-mean-square error should be considered together in performance evaluations, since each provides a different aspect of the error.
    Key Points:
    • Using the average and maximum values of neutral density to determine model performance can be misleading.
    • Removing the quiet-time trend from the neutral density reveals the actual performance of a model in simulating storm-time variations.
    • Mean absolute error, prediction efficiency, and normalized root-mean-square error should be considered together in the evaluations.
    Peer Reviewed
    https://deepblue.lib.umich.edu/bitstream/2027.42/148396/1/swe20816_am.pdf
    https://deepblue.lib.umich.edu/bitstream/2027.42/148396/2/swe20816-sup-0001-2018SW002033-SI.pdf
    https://deepblue.lib.umich.edu/bitstream/2027.42/148396/3/swe20816.pd
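    As a concrete illustration of the evaluation approach sketched above, here is a minimal Python version of the suggested metric set (mean absolute error, prediction efficiency, normalized RMSE) together with two of the three quiet-time reference adjustments. The exact definitions used in the paper, such as which quantity the RMSE is normalized by and how prediction efficiency is defined, are assumptions here, and the along-track densities are synthetic.

```python
import numpy as np

def mae(obs, mod):
    """Mean absolute error."""
    return np.mean(np.abs(mod - obs))

def prediction_efficiency(obs, mod):
    """Prediction efficiency, assumed here as 1 - MSE / variance of the observations
    (1 = perfect, 0 = no more skill than the observed mean)."""
    return 1.0 - np.mean((mod - obs) ** 2) / np.var(obs)

def nrmse(obs, mod):
    """Root-mean-square error, normalized here by the mean observed density."""
    return np.sqrt(np.mean((mod - obs) ** 2)) / np.mean(obs)

def remove_quiet_time_bias(obs, mod, quiet):
    """Adjustment (1): subtract the mean model-minus-observation difference over
    the quiet-time interval before scoring the storm interval."""
    return mod - np.mean(mod[quiet] - obs[quiet])

def scale_by_quiet_time_ratio(obs, mod, quiet):
    """Adjustment (3): multiply the model by the quiet-time observation/model ratio."""
    return mod * np.mean(obs[quiet]) / np.mean(mod[quiet])

# Illustrative use with synthetic along-track densities (kg/m^3).
rng = np.random.default_rng(0)
obs = 4e-12 * (1 + 0.5 * rng.random(500))
mod = 1.2 * obs + 1e-12                  # a model with multiplicative and additive bias
quiet = np.arange(500) < 200             # pretend the first 200 samples are quiet time
for name, m in [("raw", mod), ("bias removed", remove_quiet_time_bias(obs, mod, quiet))]:
    print(f"{name}: MAE={mae(obs, m):.2e}, PE={prediction_efficiency(obs, m):.2f}, "
          f"nRMSE={nrmse(obs, m):.2f}")
```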

    MRI Indices of Cortical Development in Young People With Psychotic Experiences: Influence of Genetic Risk and Persistence of Symptoms

    Background: Psychotic experiences (PEs) are considered part of an extended psychosis phenotype and are associated with an elevated risk of developing a psychotic disorder. The risk of transition increases with the persistence of PEs, and this is thought to be modulated by genetic and environmental factors. However, it is unclear whether persistence is associated with progressive schizophrenia-like changes in neuroanatomy.
    Methods: We examined cortical morphometry using MRI in 247 young adults from a population-based cohort, assessed for the presence of PEs at ages 18 and 20. We then incorporated a polygenic risk score for schizophrenia (PRS) to elucidate the effects of high genetic risk. Finally, we used atlas-based tractography data to examine the underlying white matter.
    Results: Individuals with persisting PEs showed reductions in gyrification (local gyrification index, lGI) in the left temporal gyrus as well as atypical associations with total brain volume (TBV) in the left occipital and right prefrontal gyri. No main effect was found for the PRS, but interaction effects with PEs were identified in orbitofrontal, parietal, and temporal regions. Examination of the underlying white matter did not provide strong evidence of further disturbances.
    Conclusions: Disturbances in lGI were similar to those seen in schizophrenia but were mostly limited to individuals with persistent PEs. These could reflect subtle changes that worsen with impending psychosis, or an early vulnerability associated with the persistence of PEs. The lack of clear differences in the underlying white matter suggests our findings reflect early disturbances in cortical expansion rather than progressive changes in brain structure.
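    The group-by-PRS interaction analysis described in the Results can be sketched as an ordinary linear model. The synthetic data, column names, and covariate set below are purely illustrative assumptions and do not reproduce the study's actual pipeline.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical synthetic data standing in for the cohort; variable names are illustrative.
rng = np.random.default_rng(1)
n = 247
df = pd.DataFrame({
    "lgi": rng.normal(3.0, 0.2, n),                                 # local gyrification index
    "pe_group": rng.choice(["none", "transient", "persistent"], n),  # PE persistence group
    "prs": rng.normal(0.0, 1.0, n),                                 # schizophrenia polygenic risk score
    "age": rng.normal(20.0, 0.5, n),
    "sex": rng.choice(["F", "M"], n),
    "tbv": rng.normal(1.2e6, 1e5, n),                               # total brain volume (mm^3)
})

# Group x PRS interaction on gyrification, adjusting for age, sex, and total brain volume.
fit = smf.ols("lgi ~ pe_group * prs + age + sex + tbv", data=df).fit()
print(fit.summary())
```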

    The Earth: Plasma Sources, Losses, and Transport Processes

    This paper reviews the state of knowledge concerning the sources of magnetospheric plasma at Earth. The sources of plasma, its acceleration and transport throughout the system, its consequences for system dynamics, and its loss are all discussed. Both observational and modeling advances since the last time this subject was covered in detail (Hultqvist et al., Magnetospheric Plasma Sources and Losses, 1999) are addressed.

    Cooperation between Russian research organizations and industrial companies: factors and problems

    The study focuses on the cooperation of Russian companies with research organizations in implementing R&D projects during technological innovation. Taking behavioral changes into account, the authors carry out a micro-level analysis based on empirical data from an executive survey of over 600 Russian industrial firms (2011–2012) and about 350 research organizations and universities (2012). The authors identify the key factors behind firms' demand for outsourced R&D, reveal the main barriers to the development of university-industry cooperation, and show how these differ between cooperation actors. The analysis shows a positive relation between the size of a company and R&D outsourcing. As for the factor of age, the highest cooperation activity among Russian firms is observed for enterprises founded over 20 years ago. As far as the cooperation activity of research organizations is concerned, large ones are significantly more likely to cooperate with business, and a common prerequisite for an organization's R&D cooperation with business is a sufficiently high academic standing. Business and science evaluate the various obstacles to effective cooperation differently. For firms, the main problems are the inflated cost of national R&D, research organizations' insufficient orientation toward company needs, and the low quality of developments. Representatives of research organizations, in contrast, primarily cite companies' lack of receptivity to innovation and inadequate information about promising developments as barriers. Businesses are more optimistic about cooperation with science if they already have relevant experience of such interaction. For research organizations we observe a different pattern: most problems seem more significant to organizations conducting R&D in the interests of business.

    Localization of Network Nodes for Extra-Terrestrial Deployment

    …does not. The APIT algorithm works by forming triangles of anchor nodes. A node to be localized determines which of these triangles it lies inside, and the algorithm places the node at the center of gravity of the intersection of those triangles. The APIT test states that if no neighbor of a node is simultaneously further from, or simultaneously closer to, all three anchors, then the node is inside the triangle; otherwise it is outside. However, since APIT relies on the positioning of neighboring nodes, it is susceptible to error. If a node is outside the triangle, cone-shaped error zones develop, radiating in either direction from the node, as shown in Fig. 1. If the neighboring nodes fall within these error cones, the algorithm will incorrectly determine that the node is within the triangle. The closer the node is to the anchor triangle, the larger the danger regions, and thus the greater the likelihood of encountering the problem. This is the basis of the OutToIn problem noted in [7]. To avoid the pr…
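    A simplified Python sketch of the APIT scheme described above, assuming received signal strength (RSS) is used as the distance proxy: the approximate point-in-triangle test compares a node's signal strengths with those of its neighbors, and a grid scan accumulates the per-triangle decisions before taking the center of gravity of the best-scoring cells. Function names, RSS values, and the grid-scan details are illustrative assumptions, not taken from the snippet.

```python
import itertools
import numpy as np

def in_triangle(p, a, b, c):
    """Geometric point-in-triangle test (barycentric sign test), used by the grid scan."""
    def sign(p1, p2, p3):
        return (p1[0] - p3[0]) * (p2[1] - p3[1]) - (p2[0] - p3[0]) * (p1[1] - p3[1])
    d1, d2, d3 = sign(p, a, b), sign(p, b, c), sign(p, c, a)
    return not ((d1 < 0 or d2 < 0 or d3 < 0) and (d1 > 0 or d2 > 0 or d3 > 0))

def apit_test(node_rss, neighbor_rss):
    """Approximate PIT test: judge the node inside the triangle unless some neighbor is
    simultaneously closer to, or farther from, all three anchors (stronger RSS is taken
    as a proxy for being closer)."""
    for nb in neighbor_rss:
        diffs = [nb[i] - node_rss[i] for i in range(3)]
        if all(d > 0 for d in diffs) or all(d < 0 for d in diffs):
            return False  # a neighbor departs toward/away from all three anchors at once
    return True

def apit_localize(anchors, node_rss, neighbor_rss, grid_step=1.0):
    """Grid scan: +1 for every anchor triangle the node is judged inside, -1 otherwise;
    return the center of gravity of the highest-scoring grid cells."""
    xs = np.arange(min(a[0] for a in anchors), max(a[0] for a in anchors) + grid_step, grid_step)
    ys = np.arange(min(a[1] for a in anchors), max(a[1] for a in anchors) + grid_step, grid_step)
    score = np.zeros((len(xs), len(ys)))
    for idx in itertools.combinations(range(len(anchors)), 3):
        tri = [anchors[i] for i in idx]
        inside = apit_test([node_rss[i] for i in idx],
                           [[nb[i] for i in idx] for nb in neighbor_rss])
        for xi, x in enumerate(xs):
            for yi, y in enumerate(ys):
                if in_triangle((x, y), *tri):
                    score[xi, yi] += 1 if inside else -1
    best = np.argwhere(score == score.max())
    return np.array([[xs[i], ys[j]] for i, j in best]).mean(axis=0)

# Illustrative call with four anchors and two neighbors (RSS in dBm).
anchors = [(0, 0), (10, 0), (0, 10), (10, 10)]
node_rss = [-60, -70, -72, -80]
neighbors = [[-62, -69, -74, -79], [-58, -73, -70, -82]]
print(apit_localize(anchors, node_rss, neighbors))
```

    The error-cone problem described in the snippet corresponds to a neighbor whose RSS differences do not all share the same sign even though it lies outside the triangle, causing apit_test to return True incorrectly.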
