
    Aging in One-Dimensional Coagulation-Diffusion Processes and the Fredrickson-Andersen Model

    We analyse the aging dynamics of the one-dimensional Fredrickson-Andersen (FA) model in the nonequilibrium regime following a low-temperature quench. Relaxation then effectively proceeds via diffusion-limited pair coagulation (DLPC) of mobility excitations. By employing a familiar stochastic similarity transformation, we map exact results from the free-fermion case of diffusion-limited pair annihilation to DLPC. Crucially, we are able to adapt the mapping technique to averages involving multiple-time quantities. This relies on knowledge of the explicit form of the evolution operators involved. Exact results are obtained for two-time correlation and response functions in the free-fermion DLPC process. The corresponding long-time scaling forms apply to a wider class of DLPC processes, including the FA model. We are thus able to exactly characterise the violations of the fluctuation-dissipation theorem (FDT) in the aging regime of the FA model. We find nontrivial scaling forms for the fluctuation-dissipation ratio (FDR) X = X(tw/t), but with a negative asymptotic value X = -3*pi/(6*pi - 16) ≈ -3.307. While this prevents a thermodynamic interpretation in terms of an effective temperature, it is a direct consequence of probing FDT with observables that couple to activated dynamics. The existence of negative FDRs should therefore be a widespread feature of non-mean-field systems. Comment: 39 pages, 4 figures.
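    The quoted asymptotic FDR is a closed-form constant, so it can be checked numerically. A minimal sketch (the expression is taken directly from the abstract):

```python
from math import pi

# Asymptotic fluctuation-dissipation ratio quoted in the abstract:
# X_infinity = -3*pi / (6*pi - 16)
x_inf = -3 * pi / (6 * pi - 16)
print(f"{x_inf:.3f}")  # -3.307
```

    The negative sign is what rules out interpreting 1/X as an effective temperature.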

    Automatic Middle-Out Optimisation of Coarse-Grained Lipid Force Fields

    Automatic data-driven approaches are increasingly used to develop accurate molecular models, but the parameters of such automatically optimised models are typically not transferable. Using a multi-reference approach in combination with an automatic optimisation engine (SwarmCGM), here we show that it is possible to optimise coarse-grained (CG) lipid models that are also transferable, generating optimised lipid force fields. The parameters of the CG lipid models are iteratively and simultaneously optimised against higher-resolution simulations (bottom-up references) and experimental data (top-down references). Including different types of lipid bilayers in the training set guarantees the transferability of the optimised force-field parameters. Tested against state-of-the-art CG lipid force fields, we demonstrate that SwarmCGM can systematically improve their parameters, enhancing agreement with experiment even for lipid types not included in the training set. The approach is general and can be used to improve existing CG lipid force fields, as well as to develop new custom ones. Comment: Paper (pages 1-16) + Supporting Information (pages 17-40).
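    The abstract does not describe SwarmCGM's internals, but its name suggests swarm-based optimisation of force-field parameters against reference scores. As a hedged illustration only, here is a generic particle-swarm sketch minimising a hypothetical cost (squared deviation of stand-in "observables" from reference targets); all names and hyperparameters are assumptions, not the published method:

```python
import random

random.seed(0)

def cost(params, targets=(1.20, 0.35)):
    # Hypothetical objective: squared deviation of model "observables"
    # (here simply the parameters themselves) from reference targets,
    # standing in for combined bottom-up + top-down reference data.
    return sum((p - t) ** 2 for p, t in zip(params, targets))

def pso(n_particles=20, n_iter=200, dim=2, w=0.7, c1=1.5, c2=1.5):
    # Standard particle-swarm optimisation: inertia w, cognitive c1,
    # social c2; each particle tracks its personal best, the swarm
    # tracks a global best.
    pos = [[random.uniform(0, 2) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=cost)
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=cost)
    return gbest

best = pso()
```

    In the paper's setting the cost would instead aggregate deviations from bottom-up simulations and top-down experimental observables across several bilayer types.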

    Adiabatic Quantum State Generation and Statistical Zero Knowledge

    The design of new quantum algorithms has proven to be an extremely difficult task. This paper takes a different approach to the problem by studying 'quantum state generation'. This approach provides intriguing links between many different areas: quantum computation, adiabatic evolution, analysis of spectral gaps and ground states of Hamiltonians, rapidly mixing Markov chains, the complexity class statistical zero knowledge, quantum random walks, and more. We first show that many natural candidates for quantum algorithms can be cast as state generation problems. We then define a paradigm called 'adiabatic state generation' and develop tools for it, including methods for implementing very general Hamiltonians and ways to guarantee non-negligible spectral gaps. We use these tools to prove that adiabatic state generation is equivalent to state generation in the standard quantum computing model, and finally we show how to apply our techniques to generate interesting superpositions related to Markov chains. Comment: 35 pages, two figures.

    Sequential Voxel-Based Leaflet Segmentation of Complex Lipid Morphologies

    As molecular dynamics simulations increase in complexity, new analysis tools are necessary to facilitate interpreting the results. Lipids, for instance, are known to form many complicated morphologies because of their amphipathic nature, becoming more intricate as the particle count increases. A few lipids might form a micelle, whereas aggregation of tens of thousands could lead to vesicle formation. Millions of lipids comprise a cell and its organelle membranes, and are involved in processes such as neurotransmission and transfection. To study such phenomena, it is useful to have analysis tools that understand what is meant by emerging entities such as micelles and vesicles. Studying such systems only at the particle level becomes extremely tedious, counterintuitive, and computationally expensive. To address this issue, we developed a method to track all the individual lipid leaflets, allowing for easy and quick detection of topological changes at the mesoscale. By using a voxel-based approach and focusing on locality, we forego costly geometrical operations without losing important details, and we chronologically identify the lipid segments using the Jaccard index. Thus, we achieve consistent sequential segmentation on a wide variety of (lipid) systems, including monolayers, bilayers, vesicles, and inverted hexagonal phases, up to the membranes of a full mitochondrion. The method also discriminates between adhesion and fusion of leaflets. We show that it produces consistent results without the need for prefitting parameters, and that segmentation of millions of particles can be achieved on a desktop machine.
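    The core idea of chronological identification via the Jaccard index can be sketched compactly: segments in consecutive frames are matched by the overlap of their occupied voxel sets. This is an illustrative reconstruction, not the published implementation; the threshold value and the segment/frame representation are assumptions:

```python
def jaccard(a: set, b: set) -> float:
    """Jaccard index |A & B| / |A | B| between two voxel-index sets."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def match_segments(prev: dict, curr: dict, threshold: float = 0.5) -> dict:
    """Carry segment labels from the previous frame to the current one by
    picking, for each current segment, the previous segment with the
    highest Jaccard overlap of occupied voxels (above a cutoff)."""
    mapping = {}
    for cid, cvox in curr.items():
        best_id, best_j = None, threshold
        for pid, pvox in prev.items():
            j = jaccard(cvox, pvox)
            if j > best_j:
                best_id, best_j = pid, j
        mapping[cid] = best_id  # None => new segment (e.g. after fission)
    return mapping

# Toy frames: segment "A" drifts by one voxel; "B" appears fresh.
frame0 = {"A": {(0, 0, 0), (0, 0, 1), (0, 0, 2)}}
frame1 = {"A?": {(0, 0, 0), (0, 0, 1), (0, 0, 2), (0, 0, 3)},
          "B": {(5, 5, 5)}}
print(match_segments(frame0, frame1))  # {'A?': 'A', 'B': None}
```

    Because only voxel-set intersections and unions are computed, the matching stays cheap even for millions of particles, consistent with the locality argument in the abstract.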

    Computational Studies of Protein Structure, Dynamics, and Function in Native-like Environments

    Proteins are among the four major organic constituents of cells. They are responsible for a variety of important cell functions, ranging from providing structural support to catalyzing biological reactions. They vary in shape, dynamic behavior, and localization, and all of these together determine the specificity of their functions; the question is how. The ultimate goal of the research conducted in this thesis is to answer this question. Two types of proteins are of particular interest: transmembrane proteins and protein assemblies. Using computer simulations, validated against available experimental data, the research described here aims to reveal the structure and dynamics of proteins in their native-like environments and what these indicate about the mechanisms of their functions. The first part of the thesis focuses on the structure and functions of transmembrane proteins. These proteins consist of transmembrane α-helices or β-strands, and each secondary-structure element adopts a unique orientation in the membrane following its local interactions. The structure of the entire protein is a collection of the orientations of these elements and their positions relative to one another. These two basic aspects of membrane protein structure are studied in Chapters II and III. Chapter II determines the favorable orientation of a β-hairpin peptide, protegrin-1, in different lipid bilayers; the orientational preference results from the interplay between the peptide and the surrounding lipid molecules. Chapter III centers on the structure and dynamics of caveolin-1 in DMPC bilayers. Caveolin-1 forms a re-entrant helix-turn-helix structure with two α-helices embedded in the membrane bilayer. The study shows that the caveolin-1 monomer is rather dynamic and maintains its inserted conformation via both specific and non-specific protein-lipid interactions.
    To investigate the impact of structure and dynamics on the function of a membrane protein, molecular dynamics simulations of the voltage-dependent anion channel are performed and the results are presented in Chapter IV. This chapter finds that electrostatic interactions between charged residues on the channel wall facing the lumen retard the cation current, giving the channel its anion selectivity. The second category of protein of interest in this thesis is the assembled protein complex, especially highly symmetric ones. Many membrane proteins belong to this category as well, but the study presented in Chapter V involves simulations of a soluble protein complex, bacterioferritin B from Pseudomonas aeruginosa. The simulations reveal that the dynamic behavior of the protein is magnified by its symmetry and is tightly associated with its function.

    Electrospray Fundamentals and Non-Covalent Peptide-Lipid Interactions as Studied by Fourier Transform Ion Cyclotron Resonance Mass Spectrometry

    A novel electrochemical probe has been designed, built, and used to characterize the distribution of solution potential within the metal capillary and Taylor cone of the electrospray (ES) device. Results show that the measured potential difference increases as the internal probe travels toward the ES capillary exit, with values rising sharply as the base of the Taylor cone is penetrated. Higher-conductivity solutions exhibit potentials of higher magnitude at longer distances from the counter electrode, but these same solutions show lower potentials near the ES capillary exit. Removal of easily oxidizable species from the solution causes the measured potential difference to take nonzero values at distances further within the capillary, and raises the values measured at all points. The influence of the diameter of the spray tip employed for nano-electrospray mass spectrometry (nano-ES-MS) upon mass spectral charge state distributions was investigated. A detailed comparison of charge state distributions obtained for nanospray capillaries of varying diameters was undertaken while systematically varying experimental parameters such as sample flow rate, analyte concentration, solvent composition, and electrospray current. The general tendency to obtain higher charge states from narrow-diameter capillaries was conserved throughout, but tips with smaller orifices were more sensitive to sample flow rate, while tips with larger orifices were more sensitive to analyte concentration and solution pH. Electrospray mass spectrometry (ES-MS) has been employed to study noncovalent associations between lipids and fusion peptides. Detailed binding specificities between selected phospholipids and model fusion peptides were investigated. Strong evidence has been compiled to demonstrate the importance of the initial hydrophobic interaction to the observation of lipid-peptide binding by ES-MS.
    Initial hydrophobic interactions in solution contributed heavily to the formation of these peptide-lipid complexes, particularly for [peptide+PC] complexes, whereas electrostatic interactions played a larger role for [peptide+PG] complexes. The influences of solution pH and of the degree of lipid unsaturation on the binding strength of [peptide+PC] complexes were also investigated. These experiments help to establish ES-MS as a viable new biotechnology tool capable of providing valuable information regarding the strength of hydrophobically driven, noncovalent interactions.

    Statistical imaging of transport in complex fluids: a journey from entangled polymers to living cells

    Combining advanced fluorescence imaging, single-particle tracking, and quantitative analysis in the framework of statistical mechanics, we studied several transport phenomena in complex fluids with nanometer and millisecond resolution: diffusion of nanoparticles and vesicles in crowded environments, reptational motion of polymers in entangled semidilute solutions, and active endosome transport along microtubules in living cells. We started from individual trajectories and then converged statistically to aggregate properties of interest, with special emphasis on the fluctuations buried under classic mean-field descriptions. The unified scientific theme behind these diverse subjects is to examine, with experiments designed to be as direct as possible, the commonly believed fundamental assumptions in those fields, such as Gaussian displacements in Fickian diffusion, the harmonic confining potential of virtual tubes in polymer entanglement, and the bidirectional motion of active intracellular transport. This series of efforts led to discoveries of new phenomena, mechanisms, and concepts. This route, which we term "statistical imaging", is expected to be widely useful in studying dynamic processes, especially in emerging fields at the overlap of physics and biology.
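    A standard way to test the "Gaussian displacements in Fickian diffusion" assumption mentioned above is the non-Gaussian parameter of the displacement distribution, which vanishes for Gaussian statistics. A minimal sketch (the sample data here are synthetic, not from the thesis):

```python
import random

random.seed(42)

def alpha2(displacements):
    """One-dimensional non-Gaussian parameter:
    alpha_2 = <dx^4> / (3 <dx^2>^2) - 1,
    which is zero for Gaussian (Fickian) displacement statistics."""
    n = len(displacements)
    m2 = sum(d ** 2 for d in displacements) / n
    m4 = sum(d ** 4 for d in displacements) / n
    return m4 / (3 * m2 ** 2) - 1

# Gaussian displacements -> alpha_2 should be close to 0.
gauss = [random.gauss(0.0, 1.0) for _ in range(200_000)]
print(round(alpha2(gauss), 2))

# Heavy-tailed (two-sided exponential) displacements -> alpha_2 well
# above 0, signalling non-Gaussian, non-Fickian statistics.
heavy = [random.expovariate(1.0) * random.choice((-1, 1))
         for _ in range(200_000)]
print(round(alpha2(heavy), 2))
```

    Applied to experimental trajectories, a persistently nonzero alpha_2 is one signature of the "Fickian yet non-Gaussian" behaviour probed by this kind of single-particle imaging.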

    Design, synthesis and biological evaluation of novel lipid-based nanoparticle delivery system for metabolic re-engineering

    Dietary supplementation with fibre has been shown to ameliorate features of the metabolic syndrome and inhibit malignant growth in certain types of cancer. These effects have been linked to short-chain fatty acids (SCFA), mostly acetate. However, the ubiquitous role of SCFAs in metabolism, combined with their short tissue half-life and the non-targeted nature of oral and peripheral administration, makes achieving phenotypically relevant levels of SCFA by standard delivery techniques challenging and limits their therapeutic potential. Liposomal encapsulation of a therapeutic agent overcomes these issues by protecting against degradation, increasing circulation time, and passively targeting both the liver and tumour tissue. In this research project, I designed a bifunctional liposome formulation to transport SCFA and monitored its distribution and uptake by MRI, PET/CT and fluorescence microscopy. These bifunctional liposomes effectively encapsulated small molecules, in this case acetate, within their aqueous core, delivered acetate into cells, and were amenable to cellular imaging. I have shown that preferential delivery of liposome-encapsulated acetate (LITA) nanoparticles to key sites of metabolic control provides beneficial therapeutic effects in animal models of both obesity and cancer. Chronic administration of LITA nanoparticles in an obesogenic model led to a significant reduction in adiposity, intrahepatocellular lipid and inflammatory tone, and to genetic indications of decreased fatty acid synthesis in the liver. Application of LITA in a murine xenograft model inhibited tumour growth in three colorectal cancer cell lines: HT-29, HCT116 p53+/+ and HCT116 p53-/-. The mechanisms behind these two outcomes are not fully defined; however, cellular energy homeostasis was restored in both scenarios.
    These results indicate that LITA nanoparticles can be used to improve multiple metabolic pathways in vivo.

    Terminological Methods in Lexicography: Conceptualising, Organising, and Encoding Terms in General Language Dictionaries

    General language dictionaries show inconsistencies in terms of uniformity and scientificity in the treatment of specialised lexicographic content. By analysing the presence and treatment of terms in general language dictionaries, we propose a more uniform and scientifically rigorous treatment of this content, considering the necessity of compiling and aligning future lexical resources according to interoperable standards. We begin from the premise that the treatment of lexical items, whether lexical units (words in general) or terminological units (terms or words belonging to particular subject fields), must be differentiated, and resort to terminological methods to treat dictionary terms. Our approach assumes that terminology – in its dual dimension, both linguistic and conceptual – and lexicography, as interdisciplinary domains, can be complementary. Thus, we present theoretical objectives (improvement of metalanguage and lexicographic description based on terminological assumptions) and practical objectives (consistent representation of lexicographic data) that aim to facilitate the organisation, description and consistent modelling of lexicographic components, namely the hierarchy of domain labels, which are specialised-lexicon identification markers. We also want to facilitate the drafting of definitions, which can be optimised and elaborated with greater scientific precision by following a terminological approach to the treatment of terms. We analysed the dictionaries developed by three academic institutions: the Academia das Ciências de Lisboa, the Real Academia Española and the Académie Française, which represent a valuable legacy of the European academic lexicographic tradition.
    The initial analysis includes an exhaustive survey and comparison of the domain labels used, as well as a discussion of the chosen options and a comparative study of the treatment of terms. We then developed a methodological proposal for the treatment of terms in general language dictionaries, exemplified with terms from two domains, GEOLOGY and FOOTBALL, taken from the 2001 edition of the dictionary of the Academia das Ciências de Lisboa. We revised the selected terms according to the terminological principles defended here, giving rise to revised or new specialised senses for the first digital edition of this dictionary. We represent and annotate the data using TEI Lex-0, a TEI (Text Encoding Initiative) subset for encoding lexicographic data. We also highlight the importance of having hierarchical domain labels instead of a flat list of domains, as hierarchies benefit data organisation, correspondence and possible future alignments between different lexicographic resources. Our investigation revealed the following: a) the structural models of lexical resources are complex and contain information of diverse natures; b) domain labels in general language dictionaries are flat, unbalanced, inconsistent and often outdated, and need to be hierarchised to organise specialised knowledge; c) the criteria adopted for marking terms and the formulae used in definitions are disparate; d) the treatment of terms is heterogeneous and formulated in different ways, so terminological methods can help lexicographers draft definitions; e) the application of interdisciplinary terminological and lexicographic methods, together with standards, is advantageous because it allows the construction of structured, conceptually organised, linguistically accurate and interoperable lexical databases. In short, we seek to contribute to the urgent task of solving the problems that affect the sharing, alignment and linking of lexicographic data.
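    To make the encoding idea concrete, here is a simplified, illustrative sketch of building a TEI-Lex-0-style dictionary entry with a domain label. The element names (entry, form, orth, sense, usg, def) follow TEI conventions, but the specific attribute values, the lemma, and the definition text are assumptions for illustration only, not taken from the dictionary discussed:

```python
import xml.etree.ElementTree as ET

# Simplified TEI-Lex-0-style entry: a lemma, a sense carrying a domain
# label (here flattened as "Sports: Football" to suggest a hierarchy),
# and a terminologically oriented definition.
entry = ET.Element("entry", {"xml:id": "penalti", "xml:lang": "pt"})
form = ET.SubElement(entry, "form", type="lemma")
ET.SubElement(form, "orth").text = "pénalti"
sense = ET.SubElement(entry, "sense")
ET.SubElement(sense, "usg", type="domain").text = "Sports: Football"
ET.SubElement(sense, "def").text = (
    "Free kick taken from the penalty spot, awarded after a foul "
    "inside the penalty area."
)
xml = ET.tostring(entry, encoding="unicode")
print(xml)
```

    Generating entries programmatically like this is one way to keep domain labels consistent across a whole resource, since the hierarchy can be validated in code before serialisation.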