
    Theranostic nanoparticles for treatment of inflammatory disease and cancer

    The traditional one-treatment-fits-all paradigm disregards heterogeneity between patients and within a particular disease, thus limiting the success of common treatments. Moreover, current treatments lack specificity, and therefore most drugs induce some adverse effects. Personalized medicine aims to individualize therapeutic interventions based on the growing knowledge of the human multiple '-omes' (e.g. genome, epigenome, transcriptome, proteome and metabolome), which has led to the discovery of various biomarkers that can be used, for example, to detect early-stage cancers and to predict tumor progression, drug response, and clinical outcome. Nanomedicine, the application of nanotechnology to healthcare, holds great promise for revolutionizing disease management through improved drug delivery, molecular imaging, reduced adverse effects, and the ability to combine both therapeutic and diagnostic modalities in one platform, termed theranostics. Personalized nanomedicine combines nanomedicine with clinical and molecular biomarkers (OMICS data), achieving improved prognosis and disease management as well as individualized drug selection and dosage profiling to ensure maximal efficacy and safety. In this presentation I will discuss the immense potential of combining the best of these two worlds, nanomedicine and high-throughput OMICS technologies, to pave the way towards personalized medicine. Examples will be given from the fields of oncology (brain tumors, ovarian cancer and blood cancers) and inflammation (inflammatory bowel diseases).

    Preparing to Share Social Science Data: An Open Source, DDI-based Curation System

    Objective: This poster will describe the development of a curatorial system to support a repository for research data from randomized controlled trials in the social sciences. Description: The Institution for Social and Policy Studies (ISPS) at Yale University and Innovations for Poverty Action (IPA) are partnering with Colectica to develop a software platform that structures the curation workflow, including checking data for confidentiality and completeness, creating preservation formats, and reviewing and verifying code. The software leverages DDI Lifecycle – the standard for data documentation – and will enable a seamless framework for collecting, processing, archiving, and publishing data. This data curation software system combines several off-the-shelf components with a new, open source, Web application that integrates the existing components to create a flexible data pipeline. The software will help automate parts of the data pipeline and will unify the workflow for staff, and potentially for researchers. Default components include Fedora Commons, Colectica Repository, and Drupal, but the software is developed so each of these can be swapped for alternatives. Results: The software is designed to integrate into any repository workflow, and can also be incorporated earlier in the research workflow, ensuring eventual data and code deposits are of the highest quality. Conclusions: This poster will describe the requirements for the new curatorial workflow tool, the components of the system, how tasks are launched and tracked, and the benefits of building an integrated curatorial system for data, documentation, and code.
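
    The curation workflow described here (confidentiality screening, preservation formats, code review) is a pipeline of discrete, checkable steps. As a minimal illustration of that idea, the hedged Python sketch below implements two such steps; the flagged column names and file paths are hypothetical, and the sketch is not part of the ISPS/IPA/Colectica software.

```python
# Minimal, hypothetical sketch of two curation pipeline steps:
# (1) flag columns that may contain direct identifiers, (2) write a
# preservation copy. Illustrative only; not the ISPS/IPA/Colectica system.
import csv
from pathlib import Path

SUSPECT_COLUMNS = {"name", "address", "phone", "email", "gps"}  # assumed PII hints

def confidentiality_check(csv_path: Path) -> list[str]:
    """Return column names that look like direct identifiers."""
    with csv_path.open(newline="") as f:
        header = next(csv.reader(f))
    return [c for c in header if c.strip().lower() in SUSPECT_COLUMNS]

def write_preservation_copy(csv_path: Path, out_dir: Path) -> Path:
    """Copy the deposited file into a preservation area as plain UTF-8 CSV."""
    out_dir.mkdir(parents=True, exist_ok=True)
    target = out_dir / csv_path.name
    target.write_text(csv_path.read_text(encoding="utf-8"), encoding="utf-8")
    return target

if __name__ == "__main__":
    deposit = Path("deposit/trial_data.csv")            # hypothetical deposit
    deposit.parent.mkdir(exist_ok=True)
    deposit.write_text("id,age,email,outcome\n1,34,a@b.org,1\n", encoding="utf-8")
    flagged = confidentiality_check(deposit)
    if flagged:
        print("Review required, possible identifiers:", flagged)
    else:
        write_preservation_copy(deposit, Path("preservation"))
```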

    Long-term Follow up of Van Nes Rotationplasty for Congenital Proximal Focal Femoral Deficiency

    Van Nes rotationplasty may be used for patients with congenital proximal focal femoral deficiency (PFFD). The lower limb is rotated so that the ankle and foot function as a knee joint within a prosthesis. A small series of cases was investigated to determine the long-term outcome. At a mean of 21.5 years (11 to 45) after their rotationplasty, a total of 12 prosthetic patients completed the Short-Form (SF)-36, Faces Pain Scale-Revised, Harris hip score, Oswestry back pain score and Prosthetic Evaluation Questionnaires, as did 12 age- and gender-matched normal control participants. A physical examination, gait analysis, computerised dynamic posturography (CDP), and timed ‘Up & Go’ testing were also completed. The Wilcoxon signed-rank test was used to compare each PFFD patient with a matched control participant, with a false discovery rate of 5%. There were no differences between the groups in overall health and well-being on the SF-36. Significant differences were seen in gait parameters in the PFFD group. Using CDP, the PFFD group had reduced symmetry in stance, and reduced end point and maximum excursions. Patients who had undergone Van Nes rotationplasty had a high level of function and quality of life at long-term follow-up, but presented with significant differences in gait and posture compared with the control group.
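
    For readers who want to reproduce the flavour of the statistical comparison described above (paired Wilcoxon signed-rank tests with the false discovery rate controlled at 5%), the hedged sketch below uses SciPy and statsmodels. The outcome names and numbers are invented toy values, not the study data.

```python
# Sketch: paired Wilcoxon signed-rank tests across several outcome measures,
# with Benjamini-Hochberg control of the false discovery rate at 5%.
# Toy numbers only; they are not the rotationplasty study data.
import numpy as np
from scipy.stats import wilcoxon
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
outcomes = ["SF-36", "gait speed", "stance symmetry"]   # hypothetical measures
patients = {name: rng.normal(50, 10, 12) for name in outcomes}
controls = {name: rng.normal(55, 10, 12) for name in outcomes}

pvals = [wilcoxon(patients[name], controls[name]).pvalue for name in outcomes]
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")

for name, p, r in zip(outcomes, p_adj, reject):
    print(f"{name}: FDR-adjusted p = {p:.3f}, significant = {r}")
```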

    Exploratory Nuclear Reactor Safety Analysis and Visualization via Integrated Topological and Geometric Techniques

    A recent trend in the nuclear power engineering field is the implementation of heavily computational and time-consuming algorithms and codes for both design and safety analysis. In particular, the new generation of system analysis codes aims to embrace several phenomena such as thermal-hydraulics, structural behavior, and system dynamics, as well as uncertainty quantification and sensitivity analyses. The use of dynamic probabilistic risk assessment (PRA) methodologies allows a systematic approach to uncertainty quantification. Dynamic methodologies in PRA account for possible coupling between triggered or stochastic events through explicit consideration of the time element in system evolution, often through the use of dynamic system models (simulators). They are usually needed when the system has more than one failure mode, control loops, and/or hardware/process/software/human interaction. Dynamic methodologies are also capable of modeling the consequences of epistemic and aleatory uncertainties. The Monte-Carlo (MC) and the Dynamic Event Tree (DET) approaches belong to this new class of dynamic PRA methodologies. The major challenges in using MC and DET methodologies (as well as other dynamic methodologies) are the heavier computational and memory requirements compared to classical ET analysis. This is because each branch generated can contain time evolutions of a large number of variables (about 50,000 data channels are typically present in RELAP) and a large number of scenarios can be generated from a single initiating event (possibly on the order of hundreds or even thousands). Such large amounts of information are usually very difficult to organize in order to identify the main trends in scenario evolutions and the main risk contributors for each initiating event. This report aims to improve dynamic PRA methodologies by tackling the two challenges mentioned above using: 1) adaptive sampling techniques to reduce the computational cost of the analysis and 2) topology-based methodologies to interactively visualize multidimensional data and extract risk-informed insights. Regarding item 1), we employ learning algorithms that aim to infer/predict the simulation outcome and to decide the coordinates in the input space of the next sample so as to maximize the amount of information gained from it. Such methodologies can be used to both explore and exploit the input space. The latter is especially useful for safety analysis, where samples are focused along the limit surface, i.e. the boundaries in the input space between system failure and system success. Regarding item 2), we present a software tool that is designed to analyze multi-dimensional data. We model a large-scale nuclear simulation dataset as a high-dimensional scalar function defined over a discrete sample of the domain. First, we provide structural analysis of such a function at multiple scales and provide insight into the relationship between the input parameters and the output. Second, we enable exploratory analysis for users, where we help the users to differentiate features from noise through multi-scale analysis on an interactive platform, based on domain knowledge and data characterization. Our analysis is performed by exploiting the topological and geometric properties of the domain, building statistical models based on its topological segmentations and providing interactive visual interfaces to facilitate such explorations.
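
    To make item 1) concrete, the following hedged sketch shows one common way to focus samples along a limit surface: fit a probabilistic surrogate classifier to the pass/fail outcomes collected so far, then pick the candidate point whose predicted failure probability is closest to 0.5. The toy failure criterion and the Gaussian-process surrogate are illustrative assumptions and do not reproduce the specific learning algorithms developed in this report.

```python
# Sketch of limit-surface-focused adaptive sampling with a surrogate model.
# The failure criterion below is a toy stand-in for a system simulator run.
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier

def simulate(x):
    """Toy 'simulator': failure when the two inputs push past a curved limit."""
    return int(x[0] ** 2 + 1.5 * x[1] > 1.0)   # 1 = failure, 0 = success

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(20, 2))        # initial space-filling samples
y = np.array([simulate(x) for x in X])

for _ in range(30):                            # adaptive refinement loop
    surrogate = GaussianProcessClassifier().fit(X, y)
    candidates = rng.uniform(0.0, 1.0, size=(500, 2))
    p_fail = surrogate.predict_proba(candidates)[:, 1]
    nxt = candidates[np.argmin(np.abs(p_fail - 0.5))]  # closest to the limit surface
    X = np.vstack([X, nxt])
    y = np.append(y, simulate(nxt))

print(f"{len(X)} samples, {y.sum()} failures; later samples cluster near the limit surface")
```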

    An Open Source, DDI-Based Data Curation System For Social Science Data

    The Institution for Social and Policy Studies (ISPS) at Yale University and Innovations for Poverty Action (IPA) are partnering to develop a repository for research data from randomized controlled trials in the social sciences. The repository will be an expansion – and major upgrade – of the existing ISPS Data Archive. Together with Colectica, the partners are developing a software platform that leverages DDI Lifecycle, the standard for data documentation. The software structures the curation workflow, which also includes checking data for confidentiality and completeness, creating preservation formats, and reviewing and verifying code. The software will enable a seamless framework for collecting, processing, archiving, and publishing data. This data curation software system combines several off-the-shelf components with a new, open source, Web application that integrates the existing components to create a flexible data pipeline. The software helps automate parts of the data pipeline and unifies the workflow for staff. Default components include Fedora Commons, Colectica Repository, and Drupal, but the software is developed so each of these can be swapped for alternatives. This session will include a live demonstration of the data curation software.

    Compulsive sexual behavior disorder in obsessive–compulsive disorder: Prevalence and associated comorbidity

    Background and aims: Compulsive sexual behavior disorder (CSBD) will be included in ICD-11 as an impulse-control disorder. CSBD also shares clinical features with obsessive–compulsive spectrum disorders (OCSDs) and behavioral addictions. There has been relatively little systematic investigation of CSBD in obsessive–compulsive disorder (OCD), the paradigmatic compulsive disorder. We aimed to determine the prevalence of CSBD in OCD, and its associated sociodemographic and clinical features, including associated comorbidity, to learn more about the nature of CSBD. Methods: Adult outpatients with current OCD (N = 539) participated in this study. The Structured Clinical Interview for OCSDs was used to diagnose OCSDs (Tourette’s syndrome, compulsive shopping, pathological gambling, kleptomania, pyromania, intermittent explosive disorder, self-injurious behavior, and CSBD). Prevalence rates of OCSDs in male versus female patients as well as comorbid disorders in OCD patients with and without CSBD were compared. Results: Lifetime prevalence of CSBD was 5.6% in patients with current OCD and was significantly higher in men than in women. OCD patients with and without CSBD were similar in terms of age, age of onset of OCD, present OCD illness severity, and educational background. Lifetime prevalence rates of several mood, obsessive–compulsive, and impulse-control disorders were considerably elevated in patients with lifetime CSBD. Discussion and conclusions: A substantial number of OCD patients suffered from CSBD. CSBD in OCD was more often comorbid with other mood, obsessive–compulsive, and impulse-control disorders, but not with disorders due to substance use or addictive behaviors. This finding supports conceptualization of CSBD as a compulsive-impulsive disorder.

    The generalisability of randomised clinical trials: an interim external validity analysis of the ongoing SENOMAC trial in sentinel lymph node-positive breast cancer

    Purpose None of the key randomised trials on the omission of axillary lymph node dissection (ALND) in sentinel lymph node-positive breast cancer have reported external validity, even though results indicate selection bias. Our aim was to assess the external validity of the ongoing randomised SENOMAC trial by comparing characteristics of Swedish SENOMAC trial participants with non-included eligible patients registered in the Swedish National Breast Cancer Register (NKBC). Methods In the ongoing non-inferiority European SENOMAC trial, clinically node-negative cT1-T3 breast cancer patients with up to two sentinel lymph node macrometastases are randomised to undergo completion ALND or not. Both breast-conserving surgery and mastectomy are eligible interventions. Data from NKBC were extracted for the years 2016 and 2017, and patient and tumour characteristics were compared with those of Swedish trial participants from the same years. Results Overall, 306 NKBC cases from non-participating sites and 847 NKBC cases from participating sites (excluding SENOMAC participants) were compared with 463 SENOMAC trial participants. Patients belonging to the middle age groups (p = 0.015), with smaller tumours (p = 0.013), treated by breast-conserving therapy (50.3 versus 47.1 versus 65.2%, p < 0.001) and with less nodal tumour burden (only one macrometastasis in 78.8 versus 79.9 versus 87.3%, p = 0.001) were over-represented in the trial population. Time trends indicated, however, that differences may be mitigated over time. Conclusions This interim external validity analysis specifically addresses selection mechanisms during an ongoing trial, potentially increasing generalisability by the time full accrual is reached. Similar validity checks should be an integral part of prospective clinical trials. Trial registration: NCT02240472, retrospectively registered on September 14, 2015, after trial initiation on January 31, 2015.
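
    As one concrete flavour of such an external validity check, the hedged sketch below compares breast-conserving surgery rates across the three groups with a chi-square test. The counts are only approximate reconstructions from the group sizes and percentages reported above, and the choice of test is an assumption rather than the analysis actually used in the trial.

```python
# Sketch: chi-square test comparing breast-conserving surgery rates between
# non-participating sites, participating sites, and trial participants.
# Counts are approximate reconstructions from the reported group sizes and
# percentages (306/847/463 patients; 50.3/47.1/65.2% breast conservation).
from scipy.stats import chi2_contingency

table = [
    [154, 152],   # non-participating NKBC sites: breast-conserving, mastectomy
    [399, 448],   # participating NKBC sites (excluding trial participants)
    [302, 161],   # SENOMAC trial participants
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")
```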

    Exploration of High-dimensional Scalar Function for Nuclear Reactor Safety Analysis and Visualization: A User's Guide to TopoXG*

    Large-scale simulation datasets can be modeled as high-dimensional scalar functions defined over a discrete sample of the domain. The goals of our proposed research are two-fold. First, we would like to provide structural analysis of a function at multiple scales and provide insight into the relationship between the input parameters and the output. Second, we enable exploratory analysis for users, where we help the users to differentiate features from noise through multi-scale analysis on an interactive platform, based on domain knowledge and data characterization. TopoXG is a software package that is designed to address these goals. The unique contribution of TopoXG lies in exploiting the topological and geometric properties of the domain, building statistical models based on its topological segmentations and providing interactive visual interfaces to facilitate such explorations. We provide a user’s guide to TopoXG by highlighting its analysis and visualization capabilities and giving several use cases involving datasets from nuclear reactor safety simulations.
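
    As a rough, low-dimensional illustration of what a topology-based segmentation does, the sketch below splits the sampled domain of a one-dimensional scalar function into monotone pieces bounded by its local extrema. This toy example is an assumption for illustration only; TopoXG's actual analysis operates on high-dimensional data and is not reproduced here.

```python
# Toy 1D analogue of topological segmentation: split the sampled domain of a
# scalar function into monotone pieces bounded by its local extrema.
# Purely illustrative; TopoXG's algorithms work in many dimensions.
import numpy as np
from scipy.signal import argrelextrema

x = np.linspace(0.0, 10.0, 500)
f = np.sin(x) + 0.3 * np.sin(3.2 * x)          # assumed sample of a scalar output

maxima = argrelextrema(f, np.greater)[0]
minima = argrelextrema(f, np.less)[0]
critical = np.sort(np.concatenate(([0], maxima, minima, [len(x) - 1])))

segments = [(x[a], x[b]) for a, b in zip(critical[:-1], critical[1:])]
print(f"{len(segments)} monotone segments bounded by {len(maxima)} maxima "
      f"and {len(minima)} minima")
```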

    Bottom and Suspended Sediment Backscatter Measurements in a Flume—Towards Quantitative Bed and Water Column Properties

    Monitoring underwater environments is essential for health and impact studies of water systems, and multi-frequency single- and multibeam echosounders are the commonly used, state-of-the-art technologies for this purpose. However, the current scarcity of sediment reference datasets of both bottom backscatter angular response and water column scattering hampers empirical data interpretation. Comprehensive reference data derived from measurements in a controlled environment should optimize the use of empirical backscatter data. To prepare for such innovative experiments, we conducted a feasibility experiment in the Delta Flume (Deltares, The Netherlands). Sonar data were recorded of the flume floor and of suspended sediment plumes in several configurations. The results revealed that flume reverberation was sufficiently low and that the differential settling of fine-sand plumes in the water column was clearly detected. Following this successful feasibility test, future comprehensive experiments will feature multi-frequency, multi-angle measurements on a variety of sediment types, additional scatterers and sediment plumes, resulting in reference datasets for an improved interpretation of underwater backscatter measurements for scientific observation and sustainable management.

    Nanoparticles for imaging, sensing, and therapeutic intervention

    Nanoparticles have the potential to contribute to new modalities in molecular imaging and sensing as well as in therapeutic interventions. In this Nano Focus article, we identify some of the current challenges and knowledge gaps that need to be confronted to accelerate the development of various applications. Using specific examples, we journey from the characterization of these complex hybrid nanomaterials; continue with surface design and (bio)physicochemical properties, their fate in biological media and cells, and their potential for cancer treatment; and finally reflect on the role of animal models in predicting their behavior in humans.