
    Parallel Mapper

    The construction of Mapper has emerged in the last decade as a powerful and effective topological data analysis tool that approximates and generalizes other topological summaries, such as the Reeb graph, the contour tree, and split and join trees. In this paper, we study the parallel analysis of the construction of Mapper. We give a provably correct parallel algorithm for executing Mapper on multiple processors and report performance experiments, against a reference sequential Mapper implementation, that demonstrate the efficiency of our method.
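
    Because the clusterings of the cover's interval preimages are independent of one another, they parallelize naturally. Below is a minimal sketch of that idea in Python, assuming a 1-D filter, a toy epsilon-neighborhood clusterer, and illustrative parameter values; it is not the paper's algorithm or implementation.

    import numpy as np
    from multiprocessing import Pool

    def _cluster_interval(args):
        # Cluster the points whose filter value falls in one cover interval,
        # using a toy epsilon-neighborhood connected-components clustering
        # (a real Mapper would call a library clusterer here).
        idx, points, eps = args
        if len(idx) == 0:
            return []
        d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
        labels = -np.ones(len(idx), dtype=int)
        clusters, current = [], 0
        for seed in range(len(idx)):
            if labels[seed] >= 0:
                continue
            labels[seed] = current
            stack, members = [seed], [seed]
            while stack:
                i = stack.pop()
                for j in np.where((d[i] <= eps) & (labels < 0))[0]:
                    labels[j] = current
                    members.append(j)
                    stack.append(j)
            clusters.append(frozenset(int(idx[m]) for m in members))
            current += 1
        return clusters

    def parallel_mapper(X, f, n_intervals=10, overlap=0.3, eps=0.5, workers=4):
        # Nodes are per-interval clusters; edges join clusters that share
        # points (possible because neighbouring cover intervals overlap).
        lo, hi = float(f.min()), float(f.max())
        width = (hi - lo) / n_intervals
        jobs = []
        for k in range(n_intervals):
            a = lo + k * width - overlap * width
            b = lo + (k + 1) * width + overlap * width
            idx = np.where((f >= a) & (f <= b))[0]
            jobs.append((idx, X[idx], eps))
        with Pool(workers) as pool:  # the parallel step: one task per interval
            per_interval = pool.map(_cluster_interval, jobs)
        nodes = [c for cl in per_interval for c in cl]
        edges = [(i, j) for i in range(len(nodes))
                 for j in range(i + 1, len(nodes)) if nodes[i] & nodes[j]]
        return nodes, edges

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.normal(size=(300, 2))
        nodes, edges = parallel_mapper(X, f=X[:, 0])
        print(len(nodes), "nodes,", len(edges), "edges")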

    Details of a scientific approach to information systems, Courant Symp. in Data Base Systems

    are the optimal strategy for the usage considered. If it is assumed now that all q_s = 0 (i.e. there is no querying to the IPS), then it is clear that no indexing path is profitable. Rule 4: If only queries and no maintenance are performed, then all the candidate indexing paths are included in the optimal strategy, whereas if only maintenance is done, no indexing path appears in the IPS (see the sketch after this entry). Conclusions: A file designer who cannot determine the effects of each alternative decision is bound to make subjective or intuitive design judgements instead of objective ones. The properties and rules stated (a) provide the means to improve the performance of IPS by expanding the current spectrum of alternative indexing paths examined prior to making any implementation decision, and (b) provide for increased confidence in the decision made.

    Book review: A Programming Methodology in Compiler Construction, Part I: Concepts, by J. Lewi, K. De Vlaminck, J. Huens and M. Huybrechts, 1979; 308 pages (North-Holland, $41.50). In the late 1950s the task of compiler construction was considered a major undertaking. The first FORTRAN compiler, for example, took 18 man-years to implement (Backus et al., 1957). Now, in the late 1970s, such a task is considered a reasonable computer science student project. The factors that have led to this over the last twenty years are (a) the comprehension of the organisation and modular design of the compilation process, (b) the development of systematic techniques for handling the majority of the important tasks that occur during compilation, and (c) the construction of software tools that assist in the implementation of compilers and compiler components. Implicit in all three of these developments is the closing of the gap between theory and practice. This book is the first part of a two-part description of an environment utilising a completely closed gap. Part I introduces the basic theoretical models, whilst Part II will consider the more practical aspects of the engineering of the environment (namely the language implementation laboratory, LILA, which generates [transducer] programs from the associated syntax). As such, each section is the logical progression of the previous, and the methodology used in each section is a reflection of the methodology of the previous section. Hence the book is structurally pleasing and easy to read. In conclusion, the book is ideally suited to the software engineer who is actively involved in the application of language theory to compiler construction (or the construction of any systems software).
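
    A toy sketch of the indexing-path profitability rules from the first part of this entry; the additive cost model below is an assumption made for illustration, not the paper's model.

    def profitable(query_freqs, saving_per_query, maintenance_cost):
        # An indexing path enters the optimal strategy only if the querying
        # cost it saves outweighs the maintenance cost it incurs.
        total_saving = sum(q * s for q, s in zip(query_freqs, saving_per_query))
        return total_saving > maintenance_cost

    # All q_s = 0 (no querying): no indexing path is profitable.
    print(profitable([0, 0], [5.0, 3.0], maintenance_cost=1.0))   # False
    # Queries but no maintenance: every path with a positive saving is included.
    print(profitable([10, 4], [5.0, 3.0], maintenance_cost=0.0))  # True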

    Rare coding SNP in DZIP1 gene associated with late-onset sporadic Parkinson's disease

    We present the first application of the hypothesis-rich mathematical theory to genome-wide association data. The Hamza et al. late-onset sporadic Parkinson's disease genome-wide association study dataset was analyzed. We found a rare, coding, non-synonymous SNP variant in the gene DZIP1 that confers increased susceptibility to Parkinson's disease. The association of DZIP1 with Parkinson's disease is consistent with a Parkinson's disease stem-cell ageing theory.
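
    For background only: a standard allelic association test for a rare variant might look like the sketch below. The counts are hypothetical, and this is not the hypothesis-rich method the paper applies.

    from scipy.stats import fisher_exact

    # 2x2 table of rare-allele vs. common-allele counts (hypothetical numbers)
    cases = [18, 1982]     # Parkinson's disease cases
    controls = [6, 1994]   # matched controls
    odds_ratio, p = fisher_exact([cases, controls], alternative="greater")
    print(f"OR = {odds_ratio:.2f}, one-sided p = {p:.2e}")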

    Targeting and killing of glioblastoma with activated T cells armed with bispecific antibodies

    Background: Since most glioblastomas express wild-type EGFR, EGFRvIII, and HER2/neu, they are excellent targets for activated T cells (ATC) armed with bispecific antibodies (BiAbs) that target EGFR and HER2. Methods: ATC were generated from PBMC activated for 14 days with anti-CD3 monoclonal antibody in the presence of interleukin-2 and armed with chemically heteroconjugated anti-CD3×anti-HER2/neu (HER2Bi) and/or anti-CD3×anti-EGFR (EGFRBi). HER2Bi- and/or EGFRBi-armed ATC were examined for in vitro cytotoxicity using MTT and 51Cr-release assays against malignant glioma lines (U87MG, U118MG, and U251MG) and primary glioblastoma lines. Results: EGFRBi-armed ATC killed up to 85% of U87, U118, and U251 targets at effector:target (E:T) ratios ranging from 1:1 to 25:1. Engagement of tumor by EGFRBi-armed ATC induced Th1 and Th2 cytokine secretion by the armed ATC. HER2Bi-armed ATC exhibited comparable cytotoxicity against U118 and U251, but did not kill HER2-negative U87 cells. HER2Bi- or EGFRBi-armed ATC exhibited 50-80% cytotoxicity against four primary glioblastoma lines as well as a temozolomide (TMZ)-resistant variant of U251. Both CD133– and CD133+ subpopulations were killed by armed ATC. Arming with both HER2Bi and EGFRBi simultaneously was more effective than arming with a single BiAb. Armed ATC remained effective after irradiation and in the presence of TMZ at a therapeutic concentration, and were capable of killing multiple targets. Conclusion: High-grade gliomas are suitable for specific targeting by armed ATC. These data, together with additional animal studies, may provide the preclinical support for the use of armed ATC as a valuable addition to current treatment regimens.
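
    The 51Cr-release readout reported above is conventionally converted to percent specific lysis as in the sketch below; the formula is the standard one for this assay, and the counts are hypothetical.

    def percent_lysis(experimental_cpm, spontaneous_cpm, maximum_cpm):
        # Specific lysis = (E - S) / (M - S) * 100, from released 51Cr counts
        # in the test well (E), the target-only well (S), and the
        # detergent-lysed well (M).
        return 100.0 * (experimental_cpm - spontaneous_cpm) / (maximum_cpm - spontaneous_cpm)

    # e.g. one well at a 25:1 E:T ratio
    print(f"{percent_lysis(4200, 600, 5100):.1f}% specific lysis")  # 80.0%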

    Variational Methods for Biomolecular Modeling

    Structure, function and dynamics of many biomolecular systems can be characterized by the energetic variational principle and the corresponding systems of partial differential equations (PDEs). This principle allows us to focus on the identification of essential energetic components, the optimal parametrization of energies, and the efficient computational implementation of energy variation or minimization. Given that complex biomolecular systems are structurally non-uniform and their interactions occur through contact interfaces, their free energies are associated with various interfaces as well, such as the solute-solvent interface, molecular binding interfaces, lipid domain interfaces, and membrane surfaces. This fact motivates the inclusion of interface geometry, in particular its curvatures, in the parametrization of free energies. Applications of such interface-geometry-based energetic variational principles are illustrated through three concrete topics: the multiscale modeling of biomolecular electrostatics and solvation that includes the curvature energy of the molecular surface, the formation of microdomains on lipid membranes due to the geometric and molecular mechanics at the lipid interface, and the mean-curvature-driven protein localization on membrane surfaces. By further representing the interface implicitly using a phase field function over the entire domain, one can simulate the dynamics of the interface and the corresponding energy variation by evolving the phase field function, achieving a significant reduction in the number of degrees of freedom and in computational complexity. Strategies for improving the efficiency of computational implementations and for extending applications to coarse-graining or multiscale molecular simulations are outlined.
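
    A minimal 1-D sketch of the phase-field idea in the last part of this abstract: represent the interface implicitly by a field phi and evolve phi by gradient descent on a Ginzburg-Landau energy (Allen-Cahn flow). The grid size, epsilon, and time step are illustrative choices, not values from the paper.

    import numpy as np

    n, L = 200, 1.0
    dx = L / n
    eps, dt, steps = 0.02, 1e-5, 20000
    x = np.linspace(0.0, L, n)
    phi = np.tanh((x - 0.3) / eps)  # diffuse interface near x = 0.3

    for _ in range(steps):
        p = np.pad(phi, 1, mode="edge")       # zero-flux boundaries
        lap = (p[2:] - 2 * phi + p[:-2]) / dx**2
        # variational derivative of
        # E = integral of eps/2*(phi')^2 + (1/(4*eps))*(1 - phi^2)^2 dx
        dE = -eps * lap + (1.0 / eps) * phi * (phi**2 - 1.0)
        phi -= dt * dE                        # explicit gradient-flow step

    # the field relaxes to the tanh profile minimizing the interfacial energy
    print("interface near x =", x[np.argmin(np.abs(phi))])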

    Novel combination of feed enzymes to improve the degradation of Chlorella vulgaris recalcitrant cell wall

    In this study, 200 pre-selected Carbohydrate-Active enZymes (CAZymes) and sulfatases were tested, individually or in combination, for their ability to degrade the Chlorella vulgaris cell wall and release its valuable nutritional compounds. Compared with the control, disruption of the microalgal cell wall by a four-enzyme mixture (Mix) released up to 1.21 g/L of reducing sugars (p<0.001), led to an eight-fold increase in oligosaccharide release (p<0.001), and reduced fluorescence intensity by 47% after staining with Calcofluor White (p<0.001). The Mix treatment also released proteins (p<0.001), some MUFA (p<0.05), and the beneficial 18:3n-3 fatty acid (p<0.05), and total carotenoids were increased in the supernatant (p<0.05) relative to the control. Taken together, these results indicate that this four-enzyme Mix effectively degrades the C. vulgaris cell wall. These enzymes may therefore be a good approach to improving the bioavailability of C. vulgaris nutrients for monogastric diets in particular, and to facilitating the cost-effective use of microalgae by the feed industry in general.
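
    The reported comparisons (fold changes and p-values against the control) are of the kind sketched below; the replicate values here are made up for illustration, not the study's data.

    import numpy as np
    from scipy.stats import ttest_ind

    control = np.array([0.14, 0.16, 0.15])  # g/L reducing sugars, control
    mix = np.array([1.18, 1.22, 1.23])      # g/L after the four-enzyme Mix

    fold = mix.mean() / control.mean()
    t, p = ttest_ind(mix, control)          # two-sample t-test vs. control
    print(f"fold change = {fold:.1f}x, p = {p:.2e}")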

    Results from the Centers for Disease Control and Prevention's Predict the 2013-2014 Influenza Season Challenge

    Background: Early insights into the timing of the start, peak, and intensity of the influenza season could be useful in planning influenza prevention and control activities. To encourage development and innovation in influenza forecasting, the Centers for Disease Control and Prevention (CDC) organized a challenge to predict the 2013-2014 United States influenza season. Methods: Challenge contestants were asked to forecast the start, peak, and intensity of the 2013-2014 influenza season at the national level and at any or all Health and Human Services (HHS) region levels. The challenge ran from December 1, 2013, to March 27, 2014; contestants were required to submit 9 biweekly forecasts at the national level to be eligible. The winner was selected based on expert evaluation of the methodology used to make the prediction and on the accuracy of the prediction as judged against the U.S. Outpatient Influenza-like Illness Surveillance Network (ILINet). Results: Nine teams submitted 13 forecasts covering all required milestones. The first forecast was due on December 2, 2013; of the 13 forecasts received, 3 correctly predicted the start of the influenza season within one week, 1 predicted the peak week within one week, 3 predicted the peak ILINet percentage within 1%, and 4 predicted the season duration within one week. For the prediction due on December 19, 2013, the number of forecasts that correctly predicted the peak week increased to 2/13, the peak percentage to 6/13, and the duration of the season to 6/13. As the season progressed, the forecasts became more stable and closer to the season milestones. Conclusion: Forecasting has become technically feasible, but further efforts are needed to improve forecast accuracy so that policy makers can reliably use these predictions. CDC and the challenge contestants plan to build upon the methods developed during this contest to improve the accuracy of influenza forecasts.
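
    A sketch of the "within one week / within 1%" scoring described above, applied to ILINet-style milestones. The example numbers are hypothetical, and real scoring must also handle MMWR-week wraparound, which is omitted here.

    def score_forecast(pred, obs):
        # Which challenge milestones did this forecast hit, given the
        # tolerances used in the evaluation above?
        return {
            "start_within_1wk": abs(pred["start_week"] - obs["start_week"]) <= 1,
            "peak_within_1wk": abs(pred["peak_week"] - obs["peak_week"]) <= 1,
            "peak_pct_within_1": abs(pred["peak_pct"] - obs["peak_pct"]) <= 1.0,
            "duration_within_1wk": abs(pred["duration"] - obs["duration"]) <= 1,
        }

    observed = {"start_week": 48, "peak_week": 52, "peak_pct": 4.6, "duration": 13}
    forecast = {"start_week": 49, "peak_week": 51, "peak_pct": 4.0, "duration": 15}
    print(score_forecast(forecast, observed))
    # {'start_within_1wk': True, 'peak_within_1wk': True,
    #  'peak_pct_within_1': True, 'duration_within_1wk': False}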