
    A Nomogram to Predict Patients with Obstructive Coronary Artery Disease: Development and Validation

    Objective: To develop and validate clinical prediction models for a nomogram that estimates the probability of a patient having coronary artery disease (CAD). Methods and Results: A total of 1,025 patients referred for coronary angiography were included in a retrospective, single-center study. Of these, 720 patients (70%) were randomly assigned to the development group and the remaining patients to the validation group. Multivariate logistic regression analysis showed that seven risk factors (age, sex, systolic blood pressure, lipoprotein-associated phospholipase A2, type of angina, hypertension, and diabetes) were significant for the diagnosis of CAD, from which we established model A. Using the Akaike information criterion, we established model B with the risk factors age, sex, height, systolic blood pressure, low-density lipoprotein cholesterol, lipoprotein-associated phospholipase A2, type of angina, hypertension, and diabetes. The risk factors from the original Framingham Risk Score were used for model C. By comparing the areas under the receiver operating characteristic curve, the net reclassification improvement, and the integrated discrimination improvement of models A, B, and C, we chose model B to develop the nomogram because of its discrimination, calibration, and clinical efficiency. The nomogram for the diagnosis of CAD can be used easily and conveniently. Conclusion: An individualized clinical prediction model for patients with CAD allowed accurate estimation in a Chinese population. The Akaike information criterion is a better method for screening risk factors. The net reclassification improvement and integrated discrimination improvement assess discrimination better than the area under the receiver operating characteristic curve alone. Decision curve analysis can be used to evaluate the clinical efficiency of prediction models.
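    As a rough illustration of the model-building and comparison steps described above, the sketch below fits two candidate logistic models on a development split and compares their AIC and validation AUC. This is not the authors' code: the data file, column names, and numeric coding of the predictors are assumptions.

    ```python
    # Minimal sketch (not the authors' code): fit candidate logistic models for CAD
    # and compare AIC and validation-set AUC. Column names are hypothetical.
    import pandas as pd
    import statsmodels.api as sm
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("cad_cohort.csv")  # hypothetical file with a binary `cad` outcome
    dev, val = train_test_split(df, train_size=0.7, random_state=0)

    model_a_vars = ["age", "sex", "sbp", "lp_pla2", "angina_type", "hypertension", "diabetes"]
    model_b_vars = model_a_vars + ["height", "ldl_c"]  # model B adds height and LDL-C (AIC-selected)

    def fit_and_evaluate(variables):
        """Fit a logistic model on the development group, report AIC and validation AUC."""
        fit = sm.Logit(dev["cad"], sm.add_constant(dev[variables])).fit(disp=0)
        auc = roc_auc_score(val["cad"], fit.predict(sm.add_constant(val[variables])))
        return fit.aic, auc

    for name, variables in [("A", model_a_vars), ("B", model_b_vars)]:
        aic, auc = fit_and_evaluate(variables)
        print(f"Model {name}: AIC = {aic:.1f}, validation AUC = {auc:.3f}")
    ```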

    Simulation of tumor ablation in hyperthermia cancer treatment: A parametric study

    A holistic simulation framework for magnetic hyperthermia is established to model the treatment of a tumor surrounded by a block of healthy tissue. The simulation accounts for the interstitial tissue fluid, the magnetic nanoparticle (MNP) distribution, the temperature profile, and the nanofluid behavior. Treatment efficacy is evaluated using cumulative equivalent minutes at 43 °C (CEM43), a widely accepted thermal dose derived from the cell death curve. Results are presented for conditions with and without the gravity effect in the computational domain, and two baseline cases are investigated and compared. An optimal treatment time of 46.55 min is obtained in the baseline case without gravity, but the situation deteriorates with gravity: the time required to kill all tumor cells increases by 36.11% while causing 21.32% ablation of healthy tissue. For the cases without gravity, parametric studies of the Lewis number and the heat source number are conducted, and the variation of the optimal treatment time with each parameter is well fitted by an inverse function. For the cases with gravity, the buoyancy ratio and the Darcy ratio are investigated, and their influence on the time to kill all tumor cells and on the injury to healthy tissue is well fitted by parabolic functions. The results are useful for predicting outcomes under various conditions and provide practical guidance for magnetic hyperthermia treatment.
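    The CEM43 thermal dose used here as the efficacy metric follows the standard cell-death-curve formulation, CEM43 = Σ R^(43−T) Δt with R = 0.5 above 43 °C and R = 0.25 below. A minimal sketch of that calculation on an illustrative temperature history (not the paper's simulated fields):

    ```python
    import numpy as np

    def cem43(temperatures_c, dt_min):
        """Cumulative equivalent minutes at 43 °C for a sampled temperature history.

        temperatures_c : temperatures (°C) sampled at a fixed interval dt_min (minutes).
        Uses the standard constants R = 0.5 for T >= 43 °C and R = 0.25 for T < 43 °C.
        """
        T = np.asarray(temperatures_c, dtype=float)
        R = np.where(T >= 43.0, 0.5, 0.25)
        return float(np.sum(R ** (43.0 - T) * dt_min))

    # Illustrative history only: ramp up toward ~45 °C, hold, then cool.
    t = np.arange(0, 60, 0.5)  # minutes
    temps = 37.0 + 8.0 * np.clip(t / 10.0, 0, 1) * np.exp(-np.maximum(t - 40, 0) / 10.0)
    print(f"CEM43 = {cem43(temps, dt_min=0.5):.1f} equivalent minutes")
    ```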

    Multifunctional, durable and highly conductive graphene/sponge nanocomposites

    Porous functional materials play important roles in a wide variety of growing research and industrial fields. We herein report a simple, effective method to prepare porous functional graphene composites for multi-field applications. Graphene sheets were non-chemically modified by Triton® X-100, not only to maintain high structural integrity but also to improve the dispersion of graphene on the pore surfaces of a sponge. A graphene/sponge nanocomposite with a loading of 0.79 wt.% demonstrated ideal electrical conductivity. The composite materials show high strain sensitivity, stable fatigue performance over 20,000 cycles, a short response time of 0.401 s, and a fast response to temperature and pressure. In addition, the composites are effective in monitoring material deformation and in acoustic attenuation, with a maximum absorption rate of 67.78%, and they can be used as supercapacitor electrodes with a specific capacitance of 18.1 F/g. Moreover, no expensive materials or complex equipment are required for the composite manufacturing process. This new methodology for the fabrication of multifunctional, durable, and highly conductive graphene/sponge nanocomposites holds promise for many other applications.
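    The performance figures quoted above (strain sensitivity and specific capacitance) are conventionally reported using the gauge-factor relation GF = (ΔR/R₀)/ε and the galvanostatic charge-discharge relation C = IΔt/(mΔV). The abstract does not state how they were measured, so the sketch below only illustrates those standard formulas with made-up numbers:

    ```python
    def gauge_factor(delta_r_ohm, r0_ohm, strain):
        """Strain sensitivity (gauge factor): GF = (ΔR / R0) / ε."""
        return (delta_r_ohm / r0_ohm) / strain

    def specific_capacitance(current_a, discharge_time_s, mass_g, voltage_window_v):
        """Specific capacitance from galvanostatic discharge: C = I * Δt / (m * ΔV), in F/g."""
        return current_a * discharge_time_s / (mass_g * voltage_window_v)

    # Illustrative values only, not measurements from the paper.
    print(f"GF = {gauge_factor(delta_r_ohm=5.0, r0_ohm=100.0, strain=0.1):.2f}")
    print(f"C = {specific_capacitance(0.001, 18.1, 0.001, 1.0):.1f} F/g")
    ```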

    Generalized Hartmann-Shack array of dielectric metalens sub-arrays for polarimetric beam profiling

    To define and characterize optical systems, obtaining information on the amplitude, phase, and polarization profile of optical beams is of utmost importance. Polarimetry using bulk optics is well established for characterizing the polarization state. Recently, metasurfaces and metalenses have successfully been introduced as compact optical components. Here, we take the metasurface concept to the system level by realizing arrays of 2×3 metalens sub-arrays, which allow the polarization profile of an optical beam to be determined. We use silicon-based metalenses with a numerical aperture of 0.32 and a mean measured diffraction efficiency in transmission mode of 28% at a wavelength of 1550 nm. Together with a standard camera recording the array foci, our system is extremely compact and allows real-time beam diagnostics by inspecting the foci amplitudes. By further analyzing the foci displacements in the spirit of a Hartmann-Shack wavefront sensor, we can simultaneously detect phase-gradient profiles. As application examples, we diagnose the polarization profiles of a radially polarized beam, an azimuthally polarized beam, and a vortex beam.
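    In a Hartmann-Shack sensor, the local wavefront slope over each lenslet (here, each metalens) is recovered from the displacement of its focal spot relative to a reference position, divided by the focal length. The sketch below shows only that standard reconstruction step, with assumed spot centroids and focal length; it is not the authors' analysis pipeline:

    ```python
    import numpy as np

    def wavefront_slopes(spot_centroids_m, reference_centroids_m, focal_length_m):
        """Local wavefront slopes (dW/dx, dW/dy) per lenslet, small-angle approximation.

        spot_centroids_m, reference_centroids_m : (N, 2) arrays of focal-spot positions (m).
        focal_length_m : lenslet (metalens) focal length (m).
        """
        displacements = np.asarray(spot_centroids_m) - np.asarray(reference_centroids_m)
        return displacements / focal_length_m

    # Illustrative example: four spots displaced by sub-micron amounts, f = 100 µm (assumed).
    ref = np.array([[0, 0], [50e-6, 0], [0, 50e-6], [50e-6, 50e-6]])
    meas = ref + np.array([[1e-6, 0], [0.5e-6, 0.2e-6], [0, -0.8e-6], [-0.3e-6, 0.4e-6]])
    print(wavefront_slopes(meas, ref, focal_length_m=100e-6))
    ```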

    Evidence based on Mendelian randomization and colocalization analysis strengthens causal relationships between structural changes in specific brain regions and risk of amyotrophic lateral sclerosis

    Background: Amyotrophic lateral sclerosis (ALS) is a neurodegenerative disease characterized by the degeneration of motor neurons in the brain and spinal cord, with a poor prognosis. Previous studies have observed cognitive decline and changes in brain morphometry in ALS patients. However, it remains unclear whether these brain structural alterations contribute to the risk of ALS. In this study, we conducted bidirectional two-sample Mendelian randomization (MR) and colocalization analyses to investigate this causal relationship. Methods: Genome-wide association study summary data were obtained for ALS and for brain structures, including cortical surface area (SA), cortical thickness, and the volume of subcortical structures. The inverse-variance weighted (IVW) method was used as the main estimation approach. Sensitivity analyses were conducted to detect heterogeneity and pleiotropy. Colocalization analysis was performed to calculate the posterior probability of a shared causal variant and to identify the common genes. Results: In the forward MR analysis, we found positive associations between the SA of four cortical regions (lingual, parahippocampal, pericalcarine, and middle temporal) and the risk of ALS. Additionally, decreased thickness in nine cortical regions (caudal anterior cingulate, frontal pole, fusiform, inferior temporal, lateral occipital, lateral orbitofrontal, pars orbitalis, pars triangularis, and pericalcarine) was significantly associated with a higher risk of ALS. In the reverse MR analysis, genetically predicted ALS was associated with reduced thickness in the bankssts region and increased thickness in the caudal middle frontal, inferior parietal, medial orbitofrontal, and superior temporal regions. Colocalization analysis revealed the presence of shared causal variants between the two traits. Conclusion: Our results suggest that altered brain morphometry in individuals at high risk of ALS may be genetically mediated. The causal associations of widespread multifocal extra-motor atrophy in the frontal and temporal lobes with ALS risk support the notion of a continuum between ALS and frontotemporal dementia. These findings enhance our understanding of cortical structural patterns in ALS and shed light on potentially viable therapeutic targets.
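    The IVW estimator used as the main MR approach is, in its simplest fixed-effect form, an inverse-variance-weighted combination of per-SNP Wald ratios. A minimal sketch of that calculation from summary statistics is shown below; the instrument effect sizes are made up and the real analysis would typically use a dedicated MR package:

    ```python
    import numpy as np

    def ivw_estimate(beta_exposure, beta_outcome, se_outcome):
        """Fixed-effect inverse-variance weighted MR estimate from GWAS summary statistics.

        beta_exposure : SNP effects on the exposure (e.g. regional surface area or thickness).
        beta_outcome  : SNP effects on the outcome (e.g. ALS risk, log-odds).
        se_outcome    : standard errors of beta_outcome.
        """
        bx = np.asarray(beta_exposure)
        by = np.asarray(beta_outcome)
        w = 1.0 / np.asarray(se_outcome) ** 2
        beta_ivw = np.sum(w * bx * by) / np.sum(w * bx ** 2)
        se_ivw = np.sqrt(1.0 / np.sum(w * bx ** 2))
        return beta_ivw, se_ivw

    # Illustrative three-SNP instrument set (not real GWAS data).
    beta, se = ivw_estimate([0.08, 0.12, 0.05], [0.02, 0.04, 0.01], [0.010, 0.015, 0.012])
    print(f"IVW beta = {beta:.3f} +/- {se:.3f}")
    ```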

    Training Heterogeneous Features in Sequence to Sequence Tasks: Latent Enhanced Multi-filter Seq2Seq Model

    In language processing, training data with extremely large variance may make it difficult for a language model to converge: it is hard for the network parameters to adapt to sentences with widely varied semantics or grammatical structures. To resolve this problem, we introduce a model that concentrates on each of the heterogeneous features in the input sentences. Building upon the encoder-decoder architecture, we design a latent-enhanced multi-filter seq2seq model (LEMS) that analyzes the input representations by introducing a latent space transformation and clustering. The representations are extracted from the final hidden state of the encoder and lie in the latent space. A latent space transformation is applied to enhance the quality of the representations, so that the clustering algorithm can easily separate samples based on the features of these representations. Multiple filters are trained on their corresponding clusters, and the heterogeneity of the training data is resolved accordingly. We conduct two sets of comparative experiments, on semantic parsing and on machine translation, using the Geo-query dataset and Multi30k English-French respectively, to demonstrate the improvements our model achieves.
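    A rough sketch of the routing idea described above: encoder final states are mapped into a latent space, clustered, and each cluster is decoded by its own "filter". This is not the authors' released code; the architecture, shapes, and hyperparameters are illustrative assumptions.

    ```python
    # Minimal sketch of latent-space clustering with per-cluster decoders ("filters").
    import torch
    import torch.nn as nn
    from sklearn.cluster import KMeans

    class LatentRoutedSeq2Seq(nn.Module):
        def __init__(self, vocab_size=1000, hidden=256, latent=64, n_filters=3):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, hidden)
            self.encoder = nn.GRU(hidden, hidden, batch_first=True)
            self.to_latent = nn.Linear(hidden, latent)          # latent space transformation
            self.decoders = nn.ModuleList(                       # one "filter" per cluster
                [nn.GRU(hidden, hidden, batch_first=True) for _ in range(n_filters)])
            self.out = nn.Linear(hidden, vocab_size)
            self.kmeans = KMeans(n_clusters=n_filters, n_init=10)

        def encode(self, src):
            _, h = self.encoder(self.embed(src))                 # h: (1, batch, hidden)
            return h

        def assign_clusters(self, h):
            # Cluster the latent-transformed encoder states; one cluster id per sample.
            z = self.to_latent(h.squeeze(0)).detach().cpu().numpy()
            return self.kmeans.fit_predict(z)

        def decode(self, h, tgt, cluster_ids):
            # Route each sample to the decoder belonging to its cluster.
            logits = torch.zeros(*tgt.shape, self.out.out_features)
            for k, dec in enumerate(self.decoders):
                idx = [i for i, c in enumerate(cluster_ids) if c == k]
                if not idx:
                    continue
                out, _ = dec(self.embed(tgt[idx]), h[:, idx])
                logits[idx] = self.out(out)
            return logits

    # Toy usage with random token ids.
    model = LatentRoutedSeq2Seq()
    src = torch.randint(0, 1000, (8, 12))
    tgt = torch.randint(0, 1000, (8, 10))
    h = model.encode(src)
    clusters = model.assign_clusters(h)
    print(model.decode(h, tgt, clusters).shape)                  # (8, 10, 1000)
    ```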