
    Characterising side-chain motions in proteins by Nuclear Magnetic Resonance and Molecular Dynamics to report on function and regulation

    Analysing the motions proteins undergo is vital for understanding a wide variety of biological processes. In particular, side chains provide a wide range of chemical groups, allowing proteins to carry out diverse functions such as catalysis and regulating gene expression. A key theme in this thesis is understanding the roles side chains play in protein dynamics. To do this we use molecular dynamics, density functional theory and nuclear magnetic resonance. The first part of this work describes the relationship between the isoleucine side-chain conformation and chemical shift. We show there is a clear dependence between the χ angles and the observed side-chain 13C chemical shifts. This relationship is then used to determine rotamer distributions in the excited state of the L24A FF domain and in the 42 kDa membrane complex DsbA-DsbB. In addition, we use our methodology to show that the isoleucine random-coil distribution in two model peptides is substantially different from the statistical distribution derived from the PDB. The second part of this thesis focuses on characterising the dynamic processes regulating histone deacetylase 8. Here two approaches are used. The first concentrates on molecular dynamics to show the allosteric connection between the active site, the binding rail and I19, a naturally occurring mutation site in patients. In conjunction with this, we aimed to carry out a backbone-independent methyl assignment. To aid joining intra-residue methyls we developed the HMBC-HMQC experiment, which utilises scalar-coupling-based transfers. This has many advantages over NOE-based approaches as it directly reports on the bonding network, greatly simplifying the interpretation of crowded regions of the spectra. In addition, we made substantial progress towards assigning the ILV methyls by determining the residue types, joining intra-residue methyls and building an NOE network between the observed resonances.
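    Under fast exchange between rotamer states, an observed 13C chemical shift is the population-weighted average of per-rotamer reference shifts, so rotamer populations can be recovered by linear inversion. A minimal two-state sketch of that idea (the reference shift values below are hypothetical illustrations, not values from the thesis):

    ```python
    def rotamer_populations(delta_obs, delta_a, delta_b):
        """Two-site fast exchange: the observed shift is the
        population-weighted average of the reference shifts for
        rotamer states A and B, so p_A follows by linear inversion."""
        p_a = (delta_obs - delta_b) / (delta_a - delta_b)
        return p_a, 1.0 - p_a

    # Hypothetical reference 13C shifts (ppm) for two isoleucine chi rotamers
    p_trans, p_gauche = rotamer_populations(delta_obs=13.0,
                                            delta_a=14.0, delta_b=9.0)
    ```

    With more than two rotamer wells the same idea becomes a small constrained least-squares problem over the population simplex.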

    Roadmap on signal processing for next generation measurement systems

    Signal processing is a fundamental component of almost any sensor-enabled system, with a wide range of applications across different scientific disciplines. Time series data, images, and video sequences comprise representative forms of signals that can be enhanced and analysed for information extraction and quantification. The recent advances in artificial intelligence and machine learning are shifting the research attention towards intelligent, data-driven, signal processing. This roadmap presents a critical overview of the state-of-the-art methods and applications aiming to highlight future challenges and research opportunities towards next generation measurement systems. It covers a broad spectrum of topics ranging from basic to industrial research, organized in concise thematic sections that reflect the trends and the impacts of current and future developments per research field. Furthermore, it offers guidance to researchers and funding agencies in identifying new prospects.

    Probing Cellular Uptake of Nanoparticles, One at a Time

    Advanced fluorescence microscopy is the method of choice for studying cellular uptake of nanoparticles with molecular specificity and nanoscale resolution; yet, direct visualization of nanoparticle entry into cells poses severe technical challenges. Here, we have combined super-resolution photoactivation localization microscopy (PALM) with single particle tracking (SPT) to visualize clathrin-mediated endocytosis (CME) of polystyrene nanoparticles at very high spatial and temporal resolution.

    Monte Carlo Method with Heuristic Adjustment for Irregularly Shaped Food Product Volume Measurement

    Volume measurement plays an important role in the production and processing of food products. Various methods have been proposed to measure the volume of irregularly shaped food products based on 3D reconstruction. However, 3D reconstruction incurs a high computational cost, and some volume measurement methods based on it have low accuracy. An alternative is the Monte Carlo method, which measures volume using random points: it only requires information on whether each random point falls inside or outside the object, and needs no 3D reconstruction. This paper proposes volume measurement for irregularly shaped food products using a computer vision system, without 3D reconstruction, based on the Monte Carlo method with heuristic adjustment. Five images of each food product were captured using five cameras and processed to produce binary images. Monte Carlo integration with heuristic adjustment was then performed to measure the volume from the information extracted from the binary images. The experimental results show that the proposed method provided high accuracy and precision compared to the water displacement method. In addition, the proposed method is more accurate and faster than the space carving method.
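    The core of the approach can be sketched as rejection sampling: draw uniform points in a bounding box and multiply the box volume by the fraction of points accepted by every inside/outside test (in the paper, a point is accepted when its projection lands inside each camera's binary silhouette). Since the five camera images are not available here, the sketch below substitutes a single analytic inside test for a unit sphere (volume 4π/3 ≈ 4.19); the function names and bounds are illustrative assumptions:

    ```python
    import random

    def mc_volume(inside_tests, bounds, n=100_000, seed=0):
        """Estimate object volume by Monte Carlo integration: sample uniform
        points in a bounding box and count those accepted by every
        inside/outside test (one test per camera silhouette)."""
        rng = random.Random(seed)
        (x0, x1), (y0, y1), (z0, z1) = bounds
        box = (x1 - x0) * (y1 - y0) * (z1 - z0)
        hits = 0
        for _ in range(n):
            p = (rng.uniform(x0, x1), rng.uniform(y0, y1), rng.uniform(z0, z1))
            if all(test(p) for test in inside_tests):
                hits += 1
        return box * hits / n

    # Stand-in for the silhouette tests: analytic unit sphere
    sphere = lambda p: p[0]**2 + p[1]**2 + p[2]**2 <= 1.0
    vol = mc_volume([sphere], ((-1, 1), (-1, 1), (-1, 1)), n=200_000)
    ```

    The paper's heuristic adjustment then corrects the raw estimate; the standard error of the raw estimator shrinks as 1/√n.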

    Non-acyclicity of coset lattices and generation of finite groups


    Simulation Intelligence: Towards a New Generation of Scientific Methods

    The original "Seven Motifs" set forth a roadmap of essential methods for the field of scientific computing, where a motif is an algorithmic method that captures a pattern of computation and data movement. We present the "Nine Motifs of Simulation Intelligence", a roadmap for the development and integration of the essential algorithms necessary for a merger of scientific computing, scientific simulation, and artificial intelligence. We call this merger simulation intelligence (SI), for short. We argue the motifs of simulation intelligence are interconnected and interdependent, much like the components within the layers of an operating system. Using this metaphor, we explore the nature of each layer of the simulation intelligence operating system stack (SI-stack) and the motifs therein: (1) Multi-physics and multi-scale modeling; (2) Surrogate modeling and emulation; (3) Simulation-based inference; (4) Causal modeling and inference; (5) Agent-based modeling; (6) Probabilistic programming; (7) Differentiable programming; (8) Open-ended optimization; (9) Machine programming. We believe coordinated efforts between motifs offer immense opportunity to accelerate scientific discovery, from solving inverse problems in synthetic biology and climate science, to directing nuclear energy experiments and predicting emergent behavior in socioeconomic settings. We elaborate on each layer of the SI-stack, detailing the state-of-the-art methods, presenting examples to highlight challenges and opportunities, and advocating for specific ways to advance the motifs and the synergies from their combinations. Advancing and integrating these technologies can enable a robust and efficient hypothesis-simulation-analysis type of scientific method, which we introduce with several use-cases for human-machine teaming and automated science.

    Complexity, Emergent Systems and Complex Biological Systems: Complex Systems Theory and Biodynamics. [Edited book by I.C. Baianu, with listed contributors (2011)]

    Get PDF
    An overview is presented of system dynamics (the study of the behaviour of complex systems), dynamical systems in mathematics, dynamic programming in computer science and control theory, complex systems biology, neurodynamics, and psychodynamics.

    Use of prior information and probabilistic image reconstruction for optical tomographic imaging

    Preclinical bioluminescence tomographic reconstruction is underdetermined. This work addresses the use of prior information in bioluminescence tomography to improve image acquisition, reconstruction, and analysis. A structured light surface metrology method was developed to measure surface geometry and enable robust and automatic integration of mirrors into the measurement process. A mouse phantom was imaged and accuracy was measured at 0.2 mm with excellent surface coverage. A sparsity-regularised reconstruction algorithm was developed that uses instrument noise statistics to automatically determine the stopping point of reconstruction. It was applied to in silico and in simulacra data and successfully reconstructed and resolved two separate luminescent sources within a plastic mouse phantom. A Bayesian framework was constructed that incorporated bioluminescence properties and instrument properties. Distribution expectations and standard deviations were estimated, providing reconstructions and measures of reconstruction uncertainty. The reconstructions showed superior performance when applied to in simulacra data compared to the sparsity-based algorithm. The information content of measurements using different sets of wavelengths was quantified using the Bayesian framework via mutual information and applied to an in silico problem. Significant differences in information content were observed, and comparison against a condition number-based approach indicated subtly different results.
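    Sparsity regularisation for an underdetermined linear inverse problem of this kind can be illustrated, in very reduced form, by iterative shrinkage-thresholding (ISTA) on a toy system. The forward matrix, regularisation weight, and step size below are made-up illustrations, and the thesis's noise-statistics stopping rule is replaced by a fixed iteration count:

    ```python
    def soft(x, t):
        """Soft-thresholding: the proximal operator of the l1 norm."""
        return max(x - t, 0.0) if x > 0 else min(x + t, 0.0)

    def ista(A, y, lam, step, iters=500):
        """Minimise (1/2)*||Ax - y||^2 + lam*||x||_1 by iterating a
        gradient step on the data term and a soft-threshold on x."""
        m, n = len(A), len(A[0])
        x = [0.0] * n
        for _ in range(iters):
            # residual r = Ax - y, gradient g = A^T r
            r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
            g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
            x = [soft(x[j] - step * g[j], step * lam) for j in range(n)]
        return x

    # Toy underdetermined system: 2 measurements, 3 unknowns,
    # sparse ground truth [0, 1, 0] so that y = A @ truth
    A = [[1.0, 0.2, 0.1],
         [0.1, 1.0, 0.3]]
    y = [0.2, 1.0]
    x = ista(A, y, lam=0.01, step=0.5)
    ```

    The l1 penalty drives the inactive components to exactly zero while slightly shrinking the active one, which is why a single-source phantom reconstructs as a compact bright region rather than a smeared solution.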