
    Pooling spaces associated with finite geometry

    Motivated by the work of Ngo and Du [H. Ngo, D. Du, A survey on combinatorial group testing algorithms with applications to DNA library screening, DIMACS Series in Discrete Mathematics and Theoretical Computer Science 55 (2000) 171–182], the notion of pooling spaces was introduced [T. Huang, C. Weng, Pooling spaces and non-adaptive pooling designs, Discrete Mathematics 282 (2004) 163–169] as a systematic way of constructing pooling designs; note that geometric lattices are among pooling spaces. This paper attempts to draw connections from finite geometry and distance-regular graphs to pooling spaces, including projective spaces, affine spaces, attenuated spaces, and a few families of geometric lattices associated with the orbits of subspaces under finite classical groups and with d-bounded distance-regular graphs.
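    The sketch below is a minimal, generic illustration of what a non-adaptive pooling design does in group testing, not one of the finite-geometry constructions discussed in the abstract: the rows of a binary matrix define pools, and the standard cover decoder clears every item that appears in at least one negative pool. The 6 × 9 grid-style matrix and the single-defective scenario are assumptions made purely for the example.

```python
# Toy non-adaptive pooling design: rows are pools, columns are items,
# and entry (i, j) = 1 means item j is placed in pool i.
import numpy as np

M = np.array([
    [1, 1, 1, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 1, 1, 1, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 1, 1, 1],
    [1, 0, 0, 1, 0, 0, 1, 0, 0],
    [0, 1, 0, 0, 1, 0, 0, 1, 0],
    [0, 0, 1, 0, 0, 1, 0, 0, 1],
])

def run_tests(M, defectives):
    """A pool tests positive iff it contains at least one defective item."""
    x = np.zeros(M.shape[1], dtype=int)
    x[list(defectives)] = 1
    return (M @ x) > 0

def decode(M, outcomes):
    """Cover decoder: clear every item that appears in some negative pool."""
    covered_by_negative = (M[~outcomes] > 0).any(axis=0)
    return set(map(int, np.flatnonzero(~covered_by_negative)))

outcomes = run_tests(M, defectives={4})
print(decode(M, outcomes))  # expected: {4} -- one round of tests, no adaptivity
```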

    2015 Abstract Book


    Evidence Based Medicine

    Evidence-based medicine (EBM) was introduced for the best benefit of the patient. It has shifted treatment from a pathophysiological approach to an outcome-oriented approach, from disease-oriented to patient-oriented medicine and, for some, from patient-oriented to case-oriented daily practice. Evidence has changed the paternalistic style of medical practice and given room to patients, who increasingly seek partnership. Although EBM has introduced a different way of thinking into day-to-day medical practice, there is still plenty of room for implementation and improvement. This book is meant to provoke the thinker toward the unlimited borders of caring for the patient.

    Artificial Intelligence for Science in Quantum, Atomistic, and Continuum Systems

    Advances in artificial intelligence (AI) are fueling a new paradigm of discoveries in natural sciences. Today, AI has started to advance natural sciences by improving, accelerating, and enabling our understanding of natural phenomena at a wide range of spatial and temporal scales, giving rise to a new area of research known as AI for science (AI4Science). Being an emerging research paradigm, AI4Science is unique in that it is an enormous and highly interdisciplinary area. Thus, a unified and technical treatment of this field is needed yet challenging. This work aims to provide a technically thorough account of a subarea of AI4Science; namely, AI for quantum, atomistic, and continuum systems. These areas aim at understanding the physical world from the subatomic (wavefunctions and electron density), through the atomic (molecules, proteins, materials, and interactions), to the macro (fluids, climate, and subsurface) scales, and form an important subarea of AI4Science. A unique advantage of focusing on these areas is that they largely share a common set of challenges, thereby allowing a unified and foundational treatment. A key common challenge is how to capture physics first principles, especially symmetries, in natural systems by deep learning methods. We provide an in-depth yet intuitive account of techniques to achieve equivariance to symmetry transformations. We also discuss other common technical challenges, including explainability, out-of-distribution generalization, knowledge transfer with foundation and large language models, and uncertainty quantification. To facilitate learning and education, we provide categorized lists of resources that we found to be useful. We strive to be thorough and unified, and hope this initial effort may trigger more community interest and efforts to further advance AI4Science.
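    As a concrete illustration of the equivariance idea mentioned in the abstract, the toy sketch below (an assumption-laden example, not code from the survey) builds E(3)-invariant features from pairwise distances and E(3)-equivariant per-point vectors from relative positions, then checks both properties numerically under a random orthogonal transformation and translation.

```python
# Toy symmetry check for a 3D point cloud (e.g., atomic positions):
# invariants from pairwise distances, equivariant vectors from relative positions.
import numpy as np

def invariant_features(pos):
    """Pairwise distances: unchanged by rotations, reflections, and translations."""
    diff = pos[:, None, :] - pos[None, :, :]              # (N, N, 3)
    return np.linalg.norm(diff, axis=-1)                  # (N, N)

def equivariant_vectors(pos):
    """Per-point vectors from relative positions: rotate with the input."""
    diff = pos[:, None, :] - pos[None, :, :]              # (N, N, 3)
    dist = np.linalg.norm(diff, axis=-1, keepdims=True) + 1e-9
    weights = np.exp(-dist)                               # invariant scalar weights
    return (weights * diff).sum(axis=1)                   # (N, 3)

rng = np.random.default_rng(0)
pos = rng.normal(size=(5, 3))                             # 5 points in 3D
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))              # random orthogonal matrix
t = rng.normal(size=3)                                    # random translation

assert np.allclose(invariant_features(pos),
                   invariant_features(pos @ Q.T + t))
assert np.allclose(equivariant_vectors(pos) @ Q.T,
                   equivariant_vectors(pos @ Q.T + t))
```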

    Bioinformatics Applications Based On Machine Learning

    The great advances in information technology (IT) have implications for many sectors, such as bioinformatics, and have considerably increased their possibilities. This book presents a collection of 11 original research papers, all of them related to the application of IT-related techniques within the bioinformatics sector: from new applications created through the adaptation and application of existing techniques, to the creation of new methodologies to solve existing problems.

    Microwave Breast Cancer Imaging: Simulation, Experimental Data, Reconstruction and Classification

    This work concerns microwave imaging (MWI) for breast cancer. The full process of developing an experimental phantom is detailed. The models used in the simulation stage are presented in order of increasing complexity: starting from a simplified homogeneous breast in which only the tumor is placed in a background medium, moving to an intermediate-complexity model in which a rugged fibroglandular structure is added alongside the tumor, and reaching a realistic breast model derived from nuclear magnetic resonance phantoms. The reconstruction is performed in 2D using the linear TR-MUSIC algorithm, tested in both the monostatic and multistatic approaches. The developed phantom and the instruments involved are described in detail, along with the improvements already planned. The simulated and experimental results are compared. Finally, a classification stage based on the leading technique known as deep learning, a branch of machine learning, is applied using mammographic images.
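    The sketch below is a minimal, hedged illustration of the MUSIC-style localization step that underlies TR-MUSIC, not the thesis code: a synthetic multistatic response matrix for a single assumed point scatterer is factored with an SVD, and a pseudospectrum built from the noise subspace is scanned over a grid, peaking near the scatterer. The 2D free-space Green's function, antenna ring, wavelength, scatterer position, and noise level are all assumptions for the example.

```python
# MUSIC-style pseudospectrum scan for one point scatterer (illustrative only).
import numpy as np
from scipy.special import hankel1

k = 2 * np.pi / 0.03                                   # assumed 3 cm wavelength
angles = np.linspace(0, 2 * np.pi, 16, endpoint=False)
antennas = 0.1 * np.stack([np.cos(angles), np.sin(angles)], axis=1)  # 10 cm ring

def green(points, r):
    """2D free-space Green's function from each antenna to location r."""
    d = np.linalg.norm(points - r, axis=1)
    return 0.25j * hankel1(0, k * d)

# Synthetic multistatic matrix for one scatterer: K[m, n] ~ G_m(r) * G_n(r).
r_true = np.array([0.02, -0.01])
g = green(antennas, r_true)
K = np.outer(g, g) + 1e-6 * np.random.default_rng(0).standard_normal((16, 16))

# One scatterer -> one signal singular vector; the rest span the noise subspace.
U, s, Vh = np.linalg.svd(K)
U_noise = U[:, 1:]

def pseudospectrum(r):
    """Large where the steering vector is nearly orthogonal to the noise subspace."""
    a = green(antennas, r)
    a = a / np.linalg.norm(a)
    return 1.0 / (np.linalg.norm(U_noise.conj().T @ a) ** 2 + 1e-12)

xs = np.linspace(-0.05, 0.05, 41)
best = max(((x, y) for x in xs for y in xs),
           key=lambda r: pseudospectrum(np.array(r)))
print(best)  # should land close to (0.02, -0.01)
```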

    Scale-up of continuous monoclonal antibody precipitation

    The scale-up of protein precipitation processes proves to be a challenging task due to the complexity of the reactions and transport processes involved. A good understanding of the molecular processes underpinning precipitate formation, and of the reaction kinetics, is therefore required in order to devise a scale-up strategy. The doctoral project set out first to establish micro-mixing as an engineering tool for the scale-up of antibody precipitation from cell culture, and secondly to design a downstream process with the goal of purifying a therapeutic mAb to clinical-grade levels. Studies were first conducted in batch and then transferred to a continuous process, with the scale-up approach focusing on the latter. Interactions between precipitation conditions and centrifugal recovery were then examined by employing an ultra scale-down (USD) methodology to mimic large-scale centrifugation. The downstream process design was based on integrating precipitation with non-affinity chromatography steps to avoid the cost of affinity chromatography. Precipitate formation in batch and continuous settings was governed by mixing at the molecular scale, which determined the final particle properties. Based on this, the mean energy dissipation rate proved an effective scale-up criterion for the continuous precipitation process, enabling high process throughputs relative to batch operation. The strength of the protein precipitates, evaluated by exposing particles to turbulent shear in a rotating disc device, was shown to correlate with particle fractal dimensions. Although the USD methodology showed excellent precipitate solids removal, this performance was not predictive of disc-stack centrifugation; differences in hindered settling between the two systems were proposed to explain this observation, which suggests routes to resolve this scale-up challenge. To provide an integrated downstream processing (DSP) solution for therapeutic mAb processes, anion exchange and mixed-mode chromatography steps subsequent to precipitation were designed. Parameter ranges were studied to identify the conditions that maximise antibody yield and host cell protein (HCP) removal. Under optimal conditions, precipitation and anion exchange demonstrated an 18-fold removal of HCPs, whilst precipitation and mixed-mode provided a 40-fold removal. For a three-step process comprising the sequence precipitation, anion exchange and mixed-mode, an overall HCP removal of 260-fold was observed; however, such levels remain at least 38-fold higher than the typical specification of a clinical-grade product, necessitating further optimisation of one or more steps.
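    A quick back-of-the-envelope check, using only the fold-removal figures quoted in the abstract above, makes the closing claim concrete: being 38-fold above specification after a 260-fold removal implies an overall clearance requirement of roughly 260 × 38 ≈ 9,900-fold, i.e. about 1.6 further log10 of HCP reduction.

```python
# Arithmetic on the quoted clearance figures only (no new data).
import math

fold_precip_aex   = 18    # precipitation + anion exchange
fold_precip_mixed = 40    # precipitation + mixed-mode
fold_three_step   = 260   # precipitation + anion exchange + mixed-mode
gap_to_spec       = 38    # residual HCP level relative to the typical spec

required_overall = fold_three_step * gap_to_spec       # ~9,900-fold in total
extra_logs = math.log10(gap_to_spec)                   # ~1.6 log10 still needed

print(f"overall clearance required: ~{required_overall:,}-fold")
print(f"additional reduction needed: {extra_logs:.1f} log10")
```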