2,895 research outputs found

    A rewiring model of intratumoral interaction networks.

    Intratumoral heterogeneity (ITH) has been regarded as a key cause of the failure of, and resistance to, cancer therapy, but how it arises and functions remains unclear. Advances in single-cell analysis have enabled the collection of massive amounts of data on the genetic and molecular states of individual cancer cells, providing fuel for dissecting the mechanistic organization of ITH at the molecular, metabolic and positional levels. Taking advantage of these data, we propose a computational model that rewires a topological network of the cell-cell interdependences and interactions operating within a tumor mass. The model is grounded in the game-theoretic premise that each interacting cell (player) strives to maximize its fitness by pursuing a rational self-interest strategy, war or peace, sensing and altering other cells so as to respond appropriately. By integrating this idea with genome-wide association studies of intratumoral cells, the model can visualize, annotate and quantify how somatic mutations mediate ITH and the network of intratumoral interactions. Taken together, the model provides a topological flow by which cancer cells within a tumor cooperate or compete with one another toward downstream pathogenesis. This flow can potentially serve as a blueprint for genetically intervening in the pattern and strength of cell-cell interactions toward cancer control.
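    The "war or peace" framing above is in the spirit of classical evolutionary game theory. As a minimal sketch (not the paper's actual model), the dynamics of two competing cell strategies can be illustrated with a hypothetical hawk-dove-style payoff matrix and replicator dynamics; all payoff values below are illustrative assumptions.

```python
import numpy as np

# Hypothetical fitness payoffs for two cell strategies, "war" and "peace".
# Rows: focal strategy; columns: opponent strategy (war, peace).
PAYOFF = np.array([
    [-1.0, 3.0],   # war  vs (war, peace): costly clashes, but exploits peace
    [ 1.0, 2.0],   # peace vs (war, peace)
])

def replicator_step(freqs, payoff, dt=0.1):
    """One Euler step of replicator dynamics: strategies whose expected
    payoff exceeds the population mean grow in frequency."""
    fitness = payoff @ freqs        # expected payoff of each strategy
    mean_fit = freqs @ fitness      # population-mean payoff
    return freqs + dt * freqs * (fitness - mean_fit)

freqs = np.array([0.5, 0.5])        # start with equal war/peace frequencies
for _ in range(200):
    freqs = replicator_step(freqs, PAYOFF)
print(freqs)  # ≈ [0.333, 0.667], the stable mixed war/peace equilibrium
```

    With these payoffs, neither pure strategy is stable: the population settles at the mixed equilibrium where the two strategies earn equal payoff, which is the kind of coexistence a tumor interaction network could encode.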

    Incremental Learning Using a Grow-and-Prune Paradigm with Efficient Neural Networks

    Deep neural networks (DNNs) have become a widely deployed model for numerous machine learning applications. However, their fixed architecture, substantial training cost, and significant model redundancy make it difficult to update them efficiently to accommodate previously unseen data. To solve these problems, we propose an incremental learning framework based on a grow-and-prune neural network synthesis paradigm. When new data arrive, the network first grows new connections, guided by gradients, to increase its capacity to accommodate the new data. The framework then iteratively prunes away connections based on weight magnitude to enhance network compactness, and hence recover efficiency. Finally, the model rests at a lightweight DNN that is both ready for inference and suitable for future grow-and-prune updates. The proposed framework improves accuracy, shrinks network size, and significantly reduces the additional training cost for incoming data compared to conventional approaches such as training from scratch and network fine-tuning. For the LeNet-300-100 and LeNet-5 architectures derived for the MNIST dataset, the framework reduces training cost by up to 64% (63%) and 67% (63%) compared to training from scratch (network fine-tuning), respectively. For the ResNet-18 architecture derived for the ImageNet dataset and DeepSpeech2 for the AN4 dataset, the corresponding training cost reductions against training from scratch (network fine-tuning) are 64% (60%) and 67% (62%), respectively. Our derived models contain fewer network parameters yet achieve higher accuracy than conventional baselines.
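    The two core operations described above, gradient-based growth and magnitude-based pruning, can be sketched on a toy weight matrix with a boolean connection mask. This is a minimal illustration of the mechanism, not the paper's implementation; the layer size, pruning fraction, and growth initialisation scale are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dense layer: a weight matrix plus an active-connection mask.
W = rng.normal(size=(8, 8))
mask = np.ones_like(W, dtype=bool)

def prune(W, mask, frac=0.5):
    """Magnitude-based pruning: deactivate the fraction `frac` of the
    currently active connections with the smallest |weight|."""
    active = np.flatnonzero(mask)
    k = int(len(active) * frac)
    order = active[np.argsort(np.abs(W.flat[active]))]
    mask.flat[order[:k]] = False
    W.flat[order[:k]] = 0.0
    return W, mask

def grow(W, mask, grad, k):
    """Gradient-based growth: reactivate the k inactive connections with
    the largest |gradient|, initialising them proportionally to it."""
    inactive = np.flatnonzero(~mask)
    order = inactive[np.argsort(-np.abs(grad.flat[inactive]))]
    mask.flat[order[:k]] = True
    W.flat[order[:k]] = 0.01 * grad.flat[order[:k]]
    return W, mask

W, mask = prune(W, mask, frac=0.5)   # 32 of 64 connections remain
grad = rng.normal(size=W.shape)      # stand-in for a real training gradient
W, mask = grow(W, mask, grad, k=8)   # capacity grows to absorb new data
print(mask.sum())                    # 40 active connections
```

    In the actual framework these steps alternate with training, so each grow-and-prune cycle expands capacity only where the loss gradient demands it and then compacts the network back down.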

    Automating Logic Transformations With Approximate SPFDs


    Fast Post-placement Rewiring Using Easily Detectable Functional Symmetries

    The timing convergence problem arises when the estimates made during logic synthesis cannot be met during physical design. In this paper, an efficient rewiring engine is proposed to exploit the maximal freedom available after placement. The most important feature of this approach is that the existing placement solution is left intact throughout optimization. A linear-time algorithm is proposed to detect functional symmetries in the Boolean network, and it serves as the basis for rewiring. Integration with an existing gate-sizing algorithm further demonstrates the effectiveness of our technique. Experimental results are very promising.
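    The idea behind symmetry-based rewiring is that if a function is unchanged when two of its inputs are swapped, the wires feeding those inputs can be exchanged without altering circuit behaviour, freeing the router to pick the shorter connection. The paper's detection algorithm runs in linear time on the Boolean network; the sketch below instead uses a brute-force truth-table check, which is exponential in the number of inputs but makes the symmetry condition itself explicit.

```python
from itertools import product

def symmetric_pair(f, n, i, j):
    """Return True if the n-input Boolean function f is unchanged when
    inputs i and j are swapped, i.e. the pair is functionally symmetric."""
    for bits in product((0, 1), repeat=n):
        swapped = list(bits)
        swapped[i], swapped[j] = swapped[j], swapped[i]
        if f(*bits) != f(*swapped):
            return False
    return True

# A 2:1 multiplexer: its data inputs a and b are NOT interchangeable.
mux = lambda s, a, b: (s & a) | ((1 - s) & b)
print(symmetric_pair(mux, 3, 1, 2))  # False

# Majority-of-three: every input pair is symmetric, so any rewiring
# of its input pins preserves the function.
maj = lambda a, b, c: (a & b) | (b & c) | (a & c)
print(symmetric_pair(maj, 3, 0, 2))  # True
```

    Once a symmetric pair is found on a gate, a post-placement rewiring engine can swap the two incoming nets whenever the swap improves timing, with zero impact on logical correctness or on the placement itself.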

    Jefferson Digital Commons quarterly report: October-December 2019

    This quarterly report includes: Articles, Dean's Research Development Lunch Conference, Dissertations, Educational Materials, From the Archives, Grand Rounds and Lectures, Journals and Newsletters, Population Health, Presentation Materials, Posters, Reports, Symposiums, and What People are Saying About the Jefferson Digital Commons.

    Designer cell signal processing circuits for biotechnology

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherently sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date, most examples of synthetic biological signal processing have been built on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to programme increasingly complex behaviours into living cells more rationally. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field.

    Enhancing Nervous System Recovery through Neurobiologics, Neural Interface Training, and Neurorehabilitation.

    After an initial period of recovery, human neurological injury has long been thought to be static. In order to improve quality of life for those suffering from stroke, spinal cord injury, or traumatic brain injury, researchers have been working to restore the nervous system and reduce neurological deficits through a number of mechanisms. For example, neurobiologists have been identifying and manipulating components of the intra- and extracellular milieu to alter the regenerative potential of neurons, neuro-engineers have been producing brain-machine and neural interfaces that circumvent lesions to restore functionality, and neurorehabilitation experts have been developing new ways to revitalize the nervous system even in chronic disease. While each of these areas holds promise, their individual paths to clinical relevance remain difficult. Nonetheless, these methods are now able to synergistically enhance recovery of native motor function to levels previously believed to be impossible. Furthermore, such recovery can even persist after training, and for the first time there is evidence of functional axonal regrowth and rewiring in the central nervous system of animal models. To attain this type of regeneration, rehabilitation paradigms that pair cortically-based intent with activation of affected circuits and positive neurofeedback appear to be required, a phenomenon which raises new and far-reaching questions about the underlying relationship between conscious action and neural repair. For this reason, we argue that multi-modal therapy will be necessary to facilitate a truly robust recovery, and that the success of investigational microscopic techniques may depend on their integration into macroscopic frameworks that include task-based neurorehabilitation. We further identify critical components of future neural repair strategies and explore the most up-to-date knowledge, progress, and challenges in the fields of cellular neuronal repair, neural interfacing, and neurorehabilitation, all with the goal of better understanding neurological injury and how to improve recovery.