    LONGITUDINAL CLONAL LINEAGE DYNAMICS AND FUNCTIONAL CHARACTERIZATION OF PANCREATIC CANCER CHEMO-RESISTANCE AND METASTASIZATION

    In recent years, technological advancements such as next-generation sequencing and single-cell interrogation techniques have enriched our understanding of tumor heterogeneity. By dissecting tumors and characterizing clonal lineages, we are gaining a better grasp of the intricacies of tumor evolution. A tumor is defined by the presence of, and the dynamic interactions among, its clonal lineages. Each lineage and each cell contributes to tumor dynamics through intrinsic and extrinsic mechanisms, and the variable responses of clones to perturbations in the environment, especially therapeutics, underlie disease progression and relapse. Thus, there is a pressing need to understand the molecular mechanisms that determine the functional heterogeneity of tumor sub-clones in order to improve clinical outcomes. The clonal replica tumor (CRT) is an in vivo platform created specifically to enable robust tracing and functional study of clones within a tumor. The establishment of CRTs builds upon current concepts of tumor heterogeneity, intrinsic cancer cell hierarchy, and clonal self-renewal. The model allows researchers to create large cohorts of tumors in different animals that are identical in their clonal lineage composition (clonal correlation among tumors >0.99). CRTs allow simultaneous tracking of tens of thousands of clonal lineages in different animals, providing a high level of resolution and biological reproducibility. CRTs are composed of barcoded cells that can be identified and quantified. A critical feature is that we have developed a systematic method to isolate and expand essentially any of the clonal lineages present within a CRT in their naïve state; that is, we can characterize each sub-clonal lineage at the molecular and functional levels and correlate these findings with the behavior of the same lineage in vivo and in response to drugs. Here, based on the CRT model, we studied differential chemo-resistance among clones and identified pre-existing upregulation of DNA repair as a mechanism of chemo-resistance. Furthermore, through stringent statistical testing, we demonstrated that orthotopic CRTs are a powerful and robust model for quantitatively tracking clonal evolution. Specifically, we longitudinally tracked clones in models of pancreatic ductal adenocarcinoma (PDAC) from primary tumor expansion through metastasization, capturing unexpected clonal dynamics and "alternating clonal dominance" occurring naturally in unperturbed tumors. Moreover, by characterizing pro- and non-metastasizing clones, we identified key clone-intrinsic factors that determine the nature of tumor metastases. Finally, I will discuss the distinct clonal evolution patterns that emerged under different environmental pressures, leading to the hypothesis of a "tumor clonal fingerprint", whereby the character of a tumor could be defined by an actively maintained ratio of its lineages, which could provide measurable insight into how we approach treatment.
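
    The headline reproducibility figure above is a simple statistic over barcode abundances. As a rough illustration only (the function, barcode names, and counts below are hypothetical, not taken from the CRT platform), the clonal correlation between two replica tumors can be computed as the Pearson correlation of their barcode frequency vectors:

```python
import numpy as np

def clonal_correlation(counts_a, counts_b):
    """Pearson correlation of clone frequencies between two barcoded tumors.

    counts_a, counts_b: dicts mapping barcode -> read count.
    """
    barcodes = sorted(set(counts_a) | set(counts_b))
    a = np.array([counts_a.get(bc, 0) for bc in barcodes], dtype=float)
    b = np.array([counts_b.get(bc, 0) for bc in barcodes], dtype=float)
    # Normalize to clone frequencies so tumors of different size are comparable.
    a /= a.sum()
    b /= b.sum()
    return np.corrcoef(a, b)[0, 1]

# Two hypothetical replica tumors with near-identical clonal composition.
t1 = {"BC001": 5000, "BC002": 1200, "BC003": 300}
t2 = {"BC001": 4800, "BC002": 1350, "BC003": 290}
print(clonal_correlation(t1, t2))  # close to 1 for true replicas
```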

    Application of advanced technology to space automation

    Automated operations in space provide the key to optimized mission design and data acquisition at minimum cost for future missions. The results of this study strongly support this statement and should provide further incentive for the immediate development of the specific automation technology defined herein. Essential automation technology requirements were identified for future programs. The study was undertaken to address the future role of automation in the space program, the potential benefits to be derived, and the technology efforts that should be directed toward obtaining these benefits.

    State-of-the-art in aerodynamic shape optimisation methods

    Aerodynamic optimisation has become an indispensable component of aerodynamic design over the past 60 years, with applications to aircraft, cars, trains, bridges, wind turbines, internal pipe flows, and cavities, among others, and is thus relevant in many facets of technology. With advancements in computational power, automated design optimisation procedures have become more capable; however, there is ambiguity and bias throughout the literature regarding the relative performance of optimisation architectures and the algorithms they employ. This paper provides a balanced critical review of the dominant optimisation approaches that have been integrated with aerodynamic theory for the purpose of shape optimisation. A total of 229 papers, published in more than 120 journals and conference proceedings, have been classified into six optimisation algorithm approaches. The material cited includes some of the most well-established authors and publications in the field of aerodynamic optimisation. This paper aims to eliminate bias toward particular algorithms by analysing the limitations, drawbacks, and benefits of the most widely used optimisation approaches. The review provides comprehensive but straightforward insight for non-specialists and a reference detailing the current state of the art for specialist practitioners.
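
    For readers unfamiliar with the architectures being compared, the sketch below shows the gradient-based family in miniature: plain gradient descent with finite-difference sensitivities over a handful of shape parameters. The objective is a hypothetical smooth stand-in for a CFD drag evaluation and is not drawn from any of the 229 reviewed papers:

```python
import numpy as np

def drag_proxy(params):
    # Hypothetical smooth objective standing in for a CFD drag evaluation;
    # a real study would call a flow solver here.
    target = np.array([0.12, 0.04, -0.02])  # "optimal" shape parameters
    return np.sum((params - target) ** 2) + 0.05

def finite_difference_gradient(f, x, h=1e-6):
    """Central-difference gradient, one perturbed evaluation pair per parameter."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

x = np.zeros(3)          # initial shape parameters (e.g., camber, thickness, twist)
for step in range(200):  # plain gradient descent
    x -= 0.1 * finite_difference_gradient(drag_proxy, x)
print(x, drag_proxy(x))  # converges toward the proxy optimum
```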

    A Parallel Algorithm and Implementation to Compute Spatial Autocorrelation (Hotspot) Using MATLAB

    Hotspot analysis, a spatial autocorrelation visualization technique, has in recent years been used in many fields, such as disease analysis, crime analysis, and the analysis and prediction of weather conditions in a given area. Most research in hotspot analysis applies the concept to a particular field to gain insight into the statistical significance of clustering in the data; only a few studies discuss the efficiency and optimization of the algorithm itself. These analyses are commonly based on huge datasets spanning space and time, and the conventional algorithm takes too long to produce results. This paper discusses whether the algorithm can be parallelized in MATLAB and how to further optimize it to shorten the calculation time while obtaining accurate outcomes. I use MATLAB's 'parpool' on a multi-core node to parallelize the conventional algorithm, and then take advantage of the basic idea of the R-tree to further optimize the parallel algorithm. The results are satisfactory: the conventional serial algorithm can be parallelized in MATLAB, reducing the running time by about a factor of five compared with the original algorithm, and the further optimization reduces the running time by about a factor of ten. This paper should help save time in similar computations and analyses in the future.
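
    The hotspot statistic in question is commonly the Getis-Ord Gi* z-score, and each point's score is independent of the others, which is what makes the loop parallelizable. A minimal sketch of that structure, with Python's multiprocessing pool standing in for MATLAB's parpool (data, radius, and sizes are invented for illustration):

```python
import numpy as np
from multiprocessing import Pool

def gi_star(i, xs, coords, radius=1.5):
    """Getis-Ord Gi* z-score for point i with binary distance-band weights."""
    n = len(xs)
    # Brute-force neighbour scan; a spatial index such as an R-tree would
    # replace this step in the further-optimized version.
    w = (np.linalg.norm(coords - coords[i], axis=1) <= radius).astype(float)
    xbar = xs.mean()
    s = np.sqrt((xs ** 2).mean() - xbar ** 2)
    num = w @ xs - xbar * w.sum()
    den = s * np.sqrt((n * (w ** 2).sum() - w.sum() ** 2) / (n - 1))
    return num / den

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    coords = rng.uniform(0, 10, size=(500, 2))
    xs = rng.poisson(5, 500).astype(float)
    # Implant a hot spot near the centre so the statistic has something to find.
    xs[np.linalg.norm(coords - np.array([5.0, 5.0]), axis=1) < 1] += 20

    # Each point's statistic is independent, so the loop parallelizes cleanly.
    with Pool() as pool:
        z = pool.starmap(gi_star, [(i, xs, coords) for i in range(len(xs))])
    print(max(z))  # large positive z-scores flag statistically hot locations
```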

    The GalMer database: Galaxy Mergers in the Virtual Observatory

    We present the GalMer database, a library of galaxy merger simulations made available to users through tools compatible with Virtual Observatory (VO) standards, adapted specially for this theoretical database. To investigate the physics of galaxy formation through hierarchical merging, it is necessary to simulate galaxy interactions over a large number of parameters: morphological types, mass ratios, orbital configurations, etc. On one side, these simulations have to be run in a cosmological context able to provide a large number of galaxy pairs, with boundary conditions given by large-scale simulations; on the other side, the resolution has to be high enough at galaxy scales to provide realistic physics. The GalMer database is a library of thousands of galaxy merger simulations at moderate spatial resolution, a compromise between the diversity of initial conditions and the detail of the underlying physics. We provide all coordinates and data of simulated particles in FITS binary tables. The main advantages of the database are its VO access interfaces and value-added services, which allow users to compare the results of the simulations directly with observations: stellar population modelling, dust extinction, spectra, images, and visualisation using dedicated VO tools. The GalMer value-added services can be used as a virtual telescope producing broadband images, 1D spectra, and 3D spectral datacubes, making our database suitable for use by observers. We present several examples of scientific usage of the GalMer database, obtained from the analysis of simulations and the modelling of their stellar population properties, including: (1) studies of star formation efficiency in interactions; (2) creation of old counter-rotating components; (3) reshaping of metallicity profiles in elliptical galaxies; (4) orbital-to-internal angular momentum transfer; and (5) reproduction of the observed colour bimodality of galaxies.
    Comment: 15 pages, 11 figures, 10 tables; accepted to A&A. Visualisation of GalMer simulations, access to snapshot files, and the value-added tools described in the paper are available at http://galmer.obspm.fr
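
    Since snapshots are served as FITS binary tables, a downloaded file can be inspected with standard tooling. A minimal sketch using astropy, assuming a hypothetical file name and column labels (the actual GalMer column names may differ):

```python
from astropy.io import fits
import numpy as np

# Hypothetical snapshot file downloaded from the GalMer database;
# real file names and column labels may differ.
with fits.open("galmer_snapshot.fits") as hdul:
    table = hdul[1].data  # first extension: binary table of particles
    x, y, z = table["X"], table["Y"], table["Z"]  # assumed position columns
    mass = table["MASS"]                          # assumed mass column

    # Mass-weighted centre of the particle distribution, as a sanity check.
    com = np.array([np.average(c, weights=mass) for c in (x, y, z)])
    print("centre of mass:", com)
```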

    Data trend mining for predictive systems design

    The goal of this research is to propose a data-mining-based design framework that can be used to solve complex systems design problems in a timely and efficient manner, with the main focus on product family design problems. Traditional data acquisition techniques employed in the product design community have relied primarily on customer survey data or focus group feedback as a means of integrating customer preference information into the product design process. This reliance on direct customer interaction can be costly and time consuming and may therefore limit the overall size and complexity of the customer preference data. Furthermore, since survey data typically represent stated customer preferences (responses to hypothetical product designs rather than actual purchasing decisions), design engineers may not know the true customer preferences for specific product attributes, a challenge that could ultimately result in misguided product designs. By analyzing large-scale time series of consumer data, new products can be designed that anticipate emerging product preference trends in the market space. The proposed data trend mining algorithm enables design engineers to determine how to characterize attributes based on their relevance to the overall product design. A cell phone case study is used to demonstrate product design problems involving new product concept generation, and an aerodynamic particle separator case study is presented for product design problems requiring attribute relevance characterization and product family clustering. Finally, it is shown that the proposed trend mining methodology can be extended beyond product design problems to systems-of-systems design problems such as military systems simulations.
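
    One hedged reading of attribute relevance characterization is to rank attributes within each time slice and watch the ranking drift across slices. The sketch below uses random-forest feature importances on invented data; it is an illustrative analogue, not the thesis's algorithm:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

def attribute_relevance(X, y):
    """Rank product-attribute relevance via random-forest feature importances."""
    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
    return model.feature_importances_

# One hypothetical feature matrix per yearly time slice; drifting importance
# across slices signals an emerging (or fading) customer preference trend.
for year in (2006, 2007, 2008):
    X = rng.normal(size=(300, 3))     # 3 invented product attributes
    weight = (year - 2005) / 3.0      # attribute 2 gains relevance over time
    y = X[:, 0] + weight * X[:, 2] + rng.normal(scale=0.3, size=300)
    print(year, attribute_relevance(X, y).round(2))
```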

    Mining climate data for shire level wheat yield predictions in Western Australia

    Climate change and the reduction of available agricultural land are two of the most important factors affecting global food production, especially in terms of wheat stores. An ever-increasing world population places a huge demand on these resources. Consequently, there is a dire need to optimise food production. Estimations of crop yield for the South West agricultural region of Western Australia have usually been based on statistical analyses by the Department of Agriculture and Food in Western Australia. Their estimations involve a system of crop planting recommendations and yield prediction tools based on crop variety trials. However, many crop failures arose even when farmers adhered to these recommendations, contrary to the reported estimations. Consequently, the Department has sought to investigate new avenues of analysis to improve its estimations and recommendations. This thesis explores a new approach to the way such analyses are carried out, through the introduction of new methods of analysis such as data mining and online analytical processing. Additionally, this research attempts to provide a better understanding of the effects on wheat yields of both gradual-variation parameters, such as soil type, and continuous-variation parameters, such as rainfall and temperature. The ultimate aim of the research is to enhance the prediction efficiency of wheat yields. The task was formidable due to the complex and dichotomous mixture of gradual and continuous variability data, which required successive information transformations: the progressive moulding of the data into useful information, practical knowledge, and effective industry practices. Ultimately, this new direction is intended to improve crop predictions and thereby reduce crop failures. The research journey involved data exploration, grappling with the complexity of geographic information systems (GIS), discovering and learning compatible software tools, and forging an effective processing method through an iterative cycle of action-research experimentation. A series of trials was conducted to determine the combined effects of rainfall and temperature variations on wheat crop yields. These experiments related specifically to the South Western agricultural region of Western Australia, focusing on wheat-producing shires within the study area. The investigations combined macro- and micro-analysis techniques for visual data mining and data mining classification, respectively. The research revealed that wheat yield was most dependent upon rainfall and temperature. In addition, it showed that rainfall cyclically affected temperature and soil type through the moisture retention of crop-growing locations. Results from the regression analyses showed that the statistical prediction of wheat yields from historical data may be enhanced by data mining techniques, including classification, as sketched below. The main contribution to knowledge of this research is the provision of an alternate and supplementary method of wheat crop prediction within the study area. Another contribution is the division of the study area into a GIS surface grid of 100-hectare cells onto which the interpolated data were projected. Furthermore, the framework proposed within this thesis offers other researchers with similarly structured complex data the benefits of a general processing pathway to navigate their own investigations through variegated analytical exploration spaces. In addition, it offers insights and suggestions for future directions in other contextual research explorations.
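
    As a minimal illustration of the classification step referenced above (with invented shire-season records and a toy yield rule; the thesis's interpolated GIS surfaces are not reproduced), a decision tree can be trained on rainfall and temperature to predict a yield class:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

# Illustrative shire-season records: growing-season rainfall (mm) and mean
# temperature (deg C), plus a binary yield class defined by a toy rule.
rng = np.random.default_rng(7)
rain = rng.uniform(150, 600, 400)
temp = rng.uniform(12, 24, 400)
yield_class = (rain > 300) & (temp < 20)  # toy "high" vs "low" yield label

X = np.column_stack([rain, temp])
X_train, X_test, y_train, y_test = train_test_split(X, yield_class, random_state=0)
clf = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```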

    Two-photon imaging and analysis of neural network dynamics

    The glow of a starry night sky, the smell of a freshly brewed cup of coffee, or the sound of ocean waves breaking on the beach are representations of the physical world created by the dynamic interactions of thousands of neurons in our brains. How the brain mediates perceptions, creates thoughts, stores memories, and initiates actions remains one of the most profound puzzles in biology, if not all of science. A key to a mechanistic understanding of how the nervous system works is the ability to analyze the dynamics of neuronal networks in the living organism in the context of sensory stimulation and behaviour. Dynamic brain properties have been fairly well characterized at the microscopic level of individual neurons and at the macroscopic level of whole brain areas, largely with the help of various electrophysiological techniques. However, our understanding of the mesoscopic level, comprising local populations of hundreds to thousands of neurons (so-called 'microcircuits'), remains comparatively poor. In large part, this has been due to the technical difficulties of recording from large networks of neurons with single-cell spatial resolution and near-millisecond temporal resolution in the brain of living animals. In recent years, two-photon microscopy has emerged as a technique that meets many of these requirements and has thus become the method of choice for the interrogation of local neural circuits. Here, we review the state of research in the field of two-photon imaging of neuronal populations, covering microscope technology, suitable fluorescent indicator dyes, staining techniques, and in particular analysis techniques for extracting relevant information from the fluorescence data. We expect that functional analysis of neural networks using two-photon imaging will help to decipher fundamental operational principles of neural microcircuits.
    Comment: 36 pages, 4 figures, accepted for publication in Reports on Progress in Physics
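
    Among such analysis techniques, a common first step is converting raw ROI fluorescence into relative change, ΔF/F, against a slowly varying baseline. A minimal sketch with a rolling-percentile baseline on a synthetic trace (window length and percentile are illustrative choices, not prescriptions from the review):

```python
import numpy as np

def delta_f_over_f(trace, fps=30.0, window_s=10.0, percentile=20):
    """Compute dF/F for one ROI trace using a rolling-percentile baseline.

    A low percentile over a trailing window approximates the slowly varying
    baseline F0, so transient calcium events stand out as positive dF/F.
    """
    win = int(window_s * fps)
    f0 = np.array([
        np.percentile(trace[max(0, t - win): t + 1], percentile)
        for t in range(len(trace))
    ])
    return (trace - f0) / f0

# Synthetic ROI trace: slow baseline drift plus two calcium-like transients.
t = np.arange(0, 60, 1 / 30.0)
trace = 100 + 5 * np.sin(t / 20) + 40 * (np.exp(-(t - 15) ** 2) + np.exp(-(t - 40) ** 2))
dff = delta_f_over_f(trace)
print("peak dF/F:", dff.max().round(2))
```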

    Belle II Technical Design Report

    The Belle detector at the KEKB electron-positron collider collected almost 1 billion Y(4S) events in its decade of operation. Super-KEKB, an upgrade of KEKB, is under construction to increase the luminosity by two orders of magnitude during a three-year shutdown, with an ultimate luminosity goal of 8×10^35 cm^-2 s^-1. To exploit the increased luminosity, an upgrade of the Belle detector has been proposed, and a new international collaboration, Belle II, is being formed. The Technical Design Report presents the physics motivation, the basic methods of the accelerator upgrade, and the key improvements of the detector.
    Comment: Edited by Z. Doležal and S. Uno

    Design and performance analysis of a relational replicated database system

    The hardware organization and software structure of a new database system are presented. This system, the relational replicated database system (RRDS), is based on a set of replicated processors operating on a partitioned database. Performance improvements and capacity growth can be obtained by adding more processors to the configuration. Based on the design goals, a set of hardware and software design questions was developed. The system then evolved through a five-phase process, based on simulation and analysis, which addressed and resolved these design questions. Strategies and algorithms were developed for data access, data placement, and directory management for the hardware organization. A predictive performance analysis was conducted to determine the extent to which the original design goals were satisfied. The predictive performance results, along with an analytical comparison with three other relational multi-backend systems, provide information about the strengths and weaknesses of our design as well as a basis for future research.
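
    As a toy analogue of data placement on a partitioned database (the scheme below is invented for illustration and is not the RRDS placement algorithm), tuples can be hash-partitioned across backend processors so that exact-match access touches a single partition:

```python
import hashlib

class PartitionedTable:
    """Toy horizontal partitioning of tuples across backend processors."""

    def __init__(self, n_processors):
        self.partitions = [[] for _ in range(n_processors)]

    def _home(self, key):
        # A stable hash keeps a tuple's placement deterministic across runs.
        digest = hashlib.md5(str(key).encode()).hexdigest()
        return int(digest, 16) % len(self.partitions)

    def insert(self, key, row):
        self.partitions[self._home(key)].append((key, row))

    def lookup(self, key):
        # Exact-match retrieval touches one partition; a range query would
        # instead be broadcast to all processors and executed in parallel.
        part = self.partitions[self._home(key)]
        return [row for k, row in part if k == key]

table = PartitionedTable(n_processors=4)
for emp_id in range(10):
    table.insert(emp_id, {"emp_id": emp_id, "dept": emp_id % 3})
print(table.lookup(7))
```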