Algorithmic and infrastructural software development for cryo electron tomography
Many Cryo Electron Microscopy (cryoEM) software packages have accumulated significant technical debt over the years, resulting in overcomplicated codebases that are costly to maintain and that slow down development. In this thesis, we advocate for the development of open-source cryoEM core libraries as a solution to this debt, with the ultimate goal of improving the developer and user experience.
First, a brief summary of cryoEM is presented, with an emphasis on projection algorithms and tomography. Second, the requirements of modern and future cryoEM image processing are discussed. Third, a new experimental cryoEM core library written in modern C++ is introduced. This library prioritises performance and code reusability, and is designed around a few core functions which offer an efficient model for manipulating multidimensional arrays at the index-wise and element-wise level. C++ template metaprogramming allowed us to develop modular and transparent compute backends that provide excellent CPU and GPU performance, unified behind an easy-to-use interface. Fourth, new projection algorithms are described, notably a grid-driven approach to accurately insert and sample central slices in three-dimensional (3D) Fourier space. A Fourier-based fused backward-forward projection, which further improves the computational efficiency and accuracy of reprojections, is also presented. Fifth, as part of our efforts to test and showcase the library, we have started to implement a tilt-series alignment package that gathers existing and new techniques into an automated pipeline. The current program first estimates the per-tilt translations and specimen stage rotation using a coarse alignment based on cosine stretching. It then fits the Thon rings of each tilt image as part of a global optimization to estimate the specimen inclination. Finally, we use our Fourier-based fused reprojection to efficiently refine the per-tilt translations, and are starting to explore ways to refine the per-tilt stage rotations.
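To make the design concrete, here is a minimal sketch of the "few core functions plus pluggable backends" idea, assuming hypothetical names rather than the library's actual API: a single element-wise entry point whose compute backend is selected at compile time through a template parameter.

```cpp
// Minimal sketch (hypothetical names, not the thesis library's actual API):
// one element-wise core function, with the backend chosen at compile time,
// illustrating the "few core functions + pluggable backends" design.
#include <cstddef>
#include <iostream>
#include <vector>

struct CpuBackend {
    template <typename T, typename Op>
    static void ewise(const T* in, T* out, std::size_t n, Op op) {
        for (std::size_t i = 0; i < n; ++i)  // a GPU backend would launch a kernel here
            out[i] = op(in[i]);
    }
};

// The public core function: every element-wise operation (cast, abs,
// normalize, ...) funnels through this single entry point, so each backend
// only has to be written once.
template <typename Backend = CpuBackend, typename T, typename Op>
void ewise(const std::vector<T>& in, std::vector<T>& out, Op op) {
    Backend::ewise(in.data(), out.data(), in.size(), op);
}

int main() {
    std::vector<float> in{1.f, -2.f, 3.f}, out(3);
    ewise(in, out, [](float x) { return x * x; });  // element-wise square
    for (float v : out) std::cout << v << ' ';      // prints: 1 4 9
}
```

A GPU backend would expose the same `ewise` signature and launch a kernel instead of looping, which is what keeps the public interface unified across devices.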
Structure of the native chemotaxis core signaling unit from phage E-protein lysed E. coli cells
Motile bacteria employ conserved chemotaxis networks to detect chemical gradients in their surroundings and effectively regulate their locomotion, enabling them to locate essential nutrients and other important biological niches. The sensory apparatus of the chemotaxis pathway is an array of core signaling units (CSUs) composed of transmembrane chemoreceptors, the histidine kinase CheA, and an adaptor protein, CheW. Although chemotaxis pathways represent the best-understood signaling systems, a detailed mechanistic understanding of signal transduction has been hindered by the lack of a complete structural picture of the CSU and extended array. In this study, we present the structure of the complete CSU from phage φX174 E protein lysed Escherichia coli cells, determined using cryo-electron tomography and sub-tomogram averaging to 12-Å resolution. Using AlphaFold2, we further predict the atomic structures of the CSU's constituent proteins as well as key protein-protein interfaces, enabling the assembly of an all-atom CSU model, which we conformationally refine using our cryo-electron tomography map. Molecular dynamics simulations of the resulting model provide new insight into the periplasmic organization of the complex, including novel interactions between neighboring receptor ligand-binding domains. Our results further elucidate previously unresolved interactions between individual CheA domains, including an anti-parallel P1 dimer and a non-productive binding mode between P1 and P4, enhancing our understanding of the structural mechanisms underlying CheA signaling and regulation.
IMPORTANCE: Bacterial chemotaxis is a ubiquitous behavior that enables cell movement toward or away from specific chemicals. It serves as an important model for understanding cell sensory signal transduction and motility. Characterization of the molecular mechanisms underlying chemotaxis is of fundamental interest and requires a high-resolution structural picture of the sensing machinery, the chemosensory array. In this study, we combine cryo-electron tomography and molecular simulation to present the complete structure of the core signaling unit, the basic building block of chemosensory arrays, from Escherichia coli. Our results provide new insight into previously poorly resolved regions of the complex and offer a structural basis for designing new experiments to test mechanistic hypotheses.
Intellectual Property, Open Science and Research Biobanks
In biomedical research and translational medicine, the ancient war between exclusivity (private control over information) and access to information is being fought again on a new battlefield: research biobanks. Biobanks are becoming increasingly important (one of the ten ideas changing the world, according to Time magazine), since they make it possible to collect, store, and distribute, in a secure and professional way, a critical mass of human biological samples for research purposes. Tissues and related data are fundamental to the development of biomedical research and the emerging field of translational medicine: they represent the "raw material" for every kind of biomedical study. For this reason, it is crucial to understand the boundaries of Intellectual Property (IP) in this thorny context. Both data sharing and collaborative research have become an imperative in contemporary open science, whose development depends inextricably on the opportunities to access and use data, the possibility of sharing practices between communities, the cross-checking of information and results and, chiefly, interactions with experts in different fields of knowledge. Data sharing both spreads the costs of analytical results that researchers cannot achieve working individually and, if properly managed, avoids the duplication of research. These advantages are crucial: access to a common pool of pre-competitive data and the possibility of supporting follow-on research projects are fundamental for the progress of biomedicine. This is why the "open movement" is also spreading in the biobank field. After an overview of the complex interactions among the different stakeholders involved in the process of information and data production, as well as of the main obstacles to the promotion of data sharing (i.e., the appropriability of biological samples and information, the privacy of participants, and the lack of interoperability), we first clarify some ambiguities in language, in particular concerning concepts that are often conflated, such as "open source" and "open access". The aim is to understand whether, and to what extent, these concepts can be applied to the biomedical field. Afterwards, adopting a comparative perspective, we analyze the main features of the open models, in particular the Open Research Data model, which have been proposed in the literature for the promotion of data sharing in the field of research biobanks.
After this analysis, we suggest some recommendations to rebalance the clash between exclusivity, the paradigm that has characterized the evolution of intellectual property over the last three centuries, and the actual needs for access to knowledge. We argue that the key factor in this balance may come from the right interaction between IP, social norms, and contracts. In particular, we need to combine the incentives and reward mechanisms characterizing scientific communities with the data-sharing imperative.
Some German Interpretations of Neolithic Origins During the Period of Nazi Influence
My intention in this study is to break down the Neolithic Period into component parts and state what the various German authorities propose in the case of each. Not only the works appearing during the prescribed decade will be consulted, but also those of the pre-1933 years. In each case these opinions are compared to those of standard British, American and, at times, Scandinavian authorities. Deviations between these need not necessarily indicate that a falsification has taken place. Nevertheless, unanimous agreement on the part of perhaps four different countries other than Germany would be an important sign of it. In the final analysis, it will be necessary to follow the steps in reasoning used by the Germans in attaining their results.
Manufacturing Data Uncertainties Propagation Method in Burn-Up Problems
A nuclear data-based uncertainty propagation methodology is extended to enable the propagation of manufacturing/technological data (TD) uncertainties in a burn-up calculation problem, taking into account correlation terms between the Boltzmann and Bateman terms. The methodology is applied to reactivity and power distributions in a Material Testing Reactor benchmark. Due to the inherently statistical behavior of manufacturing tolerances, a Monte Carlo sampling method is used to determine output perturbations on integral quantities. A global sensitivity analysis (GSA) is performed for each manufacturing parameter and makes it possible to identify and rank the influential parameters whose tolerances need to be better controlled. We show that the overall impact of some TD uncertainties, such as uranium enrichment or fuel plate thickness, on the reactivity is negligible, because the different core areas induce compensating effects on this global quantity. However, local quantities, such as power distributions, are strongly impacted by TD uncertainty propagation. For isotopic concentrations, no clear trends appear in the results.
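As an illustration of the sampling scheme, the sketch below draws manufacturing parameters from distributions representing their tolerances and estimates the spread of a reactivity-like output. The linear response and all numerical values are invented placeholders standing in for the full coupled core calculation; they are not the benchmark's data.

```cpp
// Toy sketch of manufacturing-tolerance sampling for uncertainty propagation.
// The "response" below is a made-up linear surrogate, not the actual coupled
// Boltzmann/Bateman calculation used in the paper.
#include <cmath>
#include <iostream>
#include <random>

int main() {
    std::mt19937 rng(42);
    // Hypothetical tolerances: uranium enrichment (wt%) and plate thickness (mm).
    std::normal_distribution<double> enrichment(19.75, 0.05);
    std::normal_distribution<double> thickness(1.27, 0.01);

    const int n_samples = 10000;
    double sum = 0.0, sum_sq = 0.0;
    for (int i = 0; i < n_samples; ++i) {
        double e = enrichment(rng), t = thickness(rng);
        // Placeholder response mapping manufacturing data to reactivity (pcm).
        double reactivity = 120.0 * (e - 19.75) - 300.0 * (t - 1.27);
        sum += reactivity;
        sum_sq += reactivity * reactivity;
    }
    double mean = sum / n_samples;
    double stddev = std::sqrt(sum_sq / n_samples - mean * mean);
    std::cout << "reactivity perturbation: mean = " << mean
              << " pcm, 1-sigma = " << stddev << " pcm\n";
}
```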
Nuclear data uncertainties propagation methods in Boltzmann/Bateman coupled problems: Application to reactivity in MTR
A novel method has been developed to calculate sensitivity coefficients in coupled Boltzmann/Bateman problems for the propagation of nuclear data (ND) uncertainties on the reactivity. Different uncertainty propagation methodologies, such as One-At-a-Time (OAT) and hybrid Monte Carlo/deterministic methods, have been tested and are discussed on a concrete ND uncertainty problem: a Material Testing Reactor (MTR) benchmark. Unlike total Monte Carlo (MC) sampling for uncertainty propagation and quantification (UQ), these methods yield sensitivity coefficients, as well as Bravais–Pearson correlation values between the Boltzmann and Bateman terms, during the depletion calculation, for global neutronics parameters such as the effective multiplication coefficient. The methodologies are compared to a pure MC sampling method, usually considered the "reference" method. It is shown that these methodologies can seriously underestimate the propagated variances when the Bravais–Pearson correlations on ND are not taken into account in the UQ process.
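For reference, the variance these methods target can be written with the standard sandwich rule; the notation below is generic, not necessarily the paper's own.

```latex
% Sandwich rule for ND uncertainty propagation (generic notation).
% S is the sensitivity vector of the reactivity rho to the nuclear data x,
% and M is the ND covariance matrix.
\[
  \mathrm{var}(\rho) \;=\; S^{\top} M S,
  \qquad
  S_i \;=\; \frac{\partial \rho}{\partial x_i}.
\]
% In a depletion calculation the sensitivity splits into a direct (Boltzmann)
% part S_B and a part acting through the depleted concentrations (Bateman)
% S_N; dropping the cross term between the two underestimates the variance:
\[
  \mathrm{var}(\rho) \;=\;
  S_B^{\top} M S_B \;+\; S_N^{\top} M S_N \;+\; 2\, S_B^{\top} M S_N .
\]
```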
Impact of correlations between core configurations for the evaluation of nuclear data uncertainty propagation for reactivity
The precise estimation of Pearson correlations, also called "representativity" coefficients, between core configurations is fundamental for properly assessing the propagation of nuclear data (ND) uncertainties to integral parameters such as k-eff, power distributions, or reactivity coefficients. In this paper, a traditional adjoint method is used to propagate ND uncertainty on the reactivity and reactivity coefficients, and to estimate correlations between different states of the core. We show that neglecting those correlations induces a loss of information in the final uncertainty. We also show that using approximate Pearson values does not introduce a significant error in the model. The calculation is carried out for the reactivity at the beginning of life and can be extended to other parameters during depletion calculations.
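In standard notation (generic, not necessarily the paper's own), the representativity coefficient and its role in the uncertainty of a difference between two configurations read:

```latex
% Representativity (Pearson correlation) between two core configurations.
% S_1, S_2 are the sensitivity vectors of the integral parameter in each
% configuration and M is the ND covariance matrix:
\[
  r_{12} \;=\;
  \frac{S_1^{\top} M S_2}
       {\sqrt{\,S_1^{\top} M S_1\,}\,\sqrt{\,S_2^{\top} M S_2\,}} .
\]
% For the uncertainty of a difference between the two parameters (e.g. a
% reactivity coefficient), neglecting r_12 discards the covariance term:
\[
  \mathrm{var}(\Delta\rho) \;=\;
  \sigma_1^{2} + \sigma_2^{2} - 2\, r_{12}\, \sigma_1 \sigma_2 .
\]
```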
Calculation and benchmark of fluence-to-local skin equivalent dose coefficients for neutrons with FLUKA, MCNP, and GEANT4 Monte-Carlo codes
Dose equivalent limits for single organs are recommended by the ICRP (International Commission on Radiological Protection, Publication 103). These limits cannot be measured directly; they are assessed by convolving conversion coefficients with particle fluences. The fluence-to-dose conversion coefficients are tabulated in the ICRP literature and allow the organ dose of interest to be assessed using numerical simulations. In particular, the literature lacks local skin equivalent dose (LSD) coefficients for neutrons. In this article, we compute such values for neutron energies ranging from 1 meV to 15 MeV, using the FLUKA, MCNP, and GEANT4 Monte Carlo radiation transport codes, and compare the three codes. These calculated values are important for radiation protection studies and radiotherapy applications.
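As a toy illustration of the convolution step, the sketch below folds an energy-binned neutron fluence with tabulated fluence-to-dose coefficients. All numerical values are invented placeholders, not ICRP data.

```cpp
// Sketch of how tabulated fluence-to-dose conversion coefficients are used:
// the dose is the per-bin fluence folded with the coefficient at each bin
// energy. Values below are made-up placeholders, not ICRP data.
#include <iostream>
#include <vector>

int main() {
    // Hypothetical neutron energy grid (MeV), per-bin fluence (cm^-2),
    // and fluence-to-dose coefficients (pSv cm^2) at those energies.
    std::vector<double> energy      {1e-9, 1e-6, 1e-3, 1.0,   15.0};
    std::vector<double> fluence     {2e4,  5e4,  1e4,  3e3,   1e2};
    std::vector<double> coefficient {8.0,  10.0, 12.0, 300.0, 500.0};

    double dose = 0.0;  // pSv
    for (std::size_t i = 0; i < energy.size(); ++i)
        dose += fluence[i] * coefficient[i];  // fold fluence with coefficient

    std::cout << "equivalent dose: " << dose * 1e-6 << " uSv\n";
}
```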