741 research outputs found

    Computational Approaches to Drug Profiling and Drug-Protein Interactions

    Despite substantial increases in R&D spending within the pharmaceutical industry, de novo drug design has become a time-consuming endeavour. High attrition rates have led to a long period of stagnation in drug approvals. Given the extreme costs associated with bringing a drug to market, locating and understanding the reasons for clinical failure is key to future productivity. As part of this PhD, three main contributions were made in this respect. First, the web platform LigNFam enables users to interactively explore similarity relationships between ‘drug-like’ molecules and the proteins they bind. Secondly, two deep-learning-based binding site comparison tools were developed, competing with the state of the art on benchmark datasets. The models are able to predict off-target interactions and potential candidates for target-based drug repurposing. Finally, the open-source ScaffoldGraph software was presented for the analysis of hierarchical scaffold relationships; it has already been used in multiple projects, including integration into a virtual screening pipeline to increase the tractability of ultra-large screening experiments. Together with existing tools, these contributions will aid the understanding of drug-protein relationships, particularly in the fields of off-target prediction and drug repurposing, helping to design better drugs faster.
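    As a rough illustration of the scaffold relationships ScaffoldGraph operates on (not its actual API), the following minimal Python sketch uses RDKit to extract a Bemis-Murcko scaffold and its generic framework from a single molecule; the example SMILES is an arbitrary choice.

        # Minimal sketch of Bemis-Murcko scaffold extraction with RDKit; this only
        # illustrates the scaffold concept that hierarchical scaffold analysis builds
        # on and is not the ScaffoldGraph API itself.
        from rdkit import Chem
        from rdkit.Chem.Scaffolds import MurckoScaffold

        mol = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")  # aspirin, arbitrary example

        # Bemis-Murcko scaffold: ring systems plus the linkers connecting them
        scaffold = MurckoScaffold.GetScaffoldForMol(mol)
        print(Chem.MolToSmiles(scaffold))        # -> c1ccccc1 for this example

        # Generic framework: the same scaffold with atom and bond types abstracted away
        framework = MurckoScaffold.MakeScaffoldGeneric(scaffold)
        print(Chem.MolToSmiles(framework))       # -> C1CCCCC1 for this example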

    A probabilistic approach for the estimation of earthquake source parameters from spectral inversion

    The characterization of the mechanisms of earthquake generation and propagation is a major challenge in understanding the Earth engine. Although the seismic rupture non-linearly combines several space and time scales, some macroscopic parameters can provide insight into its evolution, such as the earthquake size and the stress drop released during a seismic event. However, the estimation of these parameters is very uncertain (Cotton et al., 2013), owing to uncertainties in data and models and to the strong coupling between source effects and wave propagation up to the observation sites. The objective of this thesis is the characterization of the seismic source parameters using the amplitude spectrum of the displacement records and assuming that the earthquake behaves as a circular crack (Keilis-Borok, 1959). Several methods for characterizing the source through spectral analysis have been proposed in the literature. Systematic comparisons between different methodologies have highlighted the dependence of the results on the fitting model, due to the high correlation between the parameters, especially when comparing EGF- and TGF-based techniques (Ide et al., 2003; Oye et al., 2005). A probabilistic approach makes it possible to investigate such correlation, defining a probability density function (PDF) in the parameter space and allowing for a consistent estimate of the uncertainties. Using the probabilistic framework developed by Tarantola (2005), and specifically the notion of conjunction of states of information, I developed a probabilistic approach to retrieve the source parameters: the seismic moment (through the low-frequency spectral level), the corner frequency (a proxy for the rupture length) and the high-frequency decay parameter. Information on the source of an earthquake also requires modeling of the wave propagation; in this work I chose to use a theoretical Green's function, adding one propagation-related parameter to invert (a frequency-independent Q-factor) beyond the three source parameters that I want to retrieve. I model the observations with an operator defined on these four parameters, which is non-linear; thus, a global exploration of the model space is required to find the best solution describing the data. Additionally, the joint a-posteriori probability density function (PDF) is computed around the best model, to extract the correlation matrix of the parameters. This makes it possible to obtain estimates and uncertainties from the PDF that take the correlations into account. The global exploration relies on building a Markov chain in the parameter space and on combining a deterministic minimization with a random exploration of the space (Basin-Hopping method; Wales and Doye, 1997; Wales, 2003). The main advantages of this new methodology are the following:
• A fully probabilistic approach associated with a global exploration method can provide robust information about the “best-fit” model, with correct estimation of uncertainties and parameter correlations.
• The shape of the estimated PDF can assess the quality of the solutions, allowing noisy data to be ruled out and thus enabling the use of the method for automatic processing of large datasets.
I performed three applications of the method.
In Chapter 4, I analyzed the 2016-2017 Central Italy sequence, characterizing the source of all the earthquakes with Ml > 4 (56 events); in Chapter 5, I characterized the source of more than 10000 LFEs that occurred in the Nankai region (Japan) during the period 2012-2016; in Chapter 6, I analyzed the micro-seismicity (Ml between 0 and 4.5, 1061 events) that occurred from 2016 to 2017 in the Northern Ibaraki region (Japan).
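As a minimal sketch of the kind of spectral fit described above, the snippet below models the far-field displacement amplitude spectrum with a low-frequency level, a corner frequency, a high-frequency decay exponent and a frequency-independent Q attenuation term, and explores the four-dimensional parameter space with SciPy's Basin-Hopping routine (the same global-exploration idea of Wales and Doye, 1997). The model form, parameter bounds and synthetic data are illustrative assumptions, not the thesis implementation.

        # Hedged sketch: fit |u(f)| = Omega0 * exp(-pi f t / Q) / (1 + (f/fc)**gamma)
        # for (Omega0, fc, gamma, Q) using Basin-Hopping; all numbers are assumed.
        import numpy as np
        from scipy.optimize import basinhopping

        f = np.logspace(-1, 1.5, 200)   # frequency band [Hz], assumed
        t_travel = 10.0                 # travel time [s], assumed

        def model(p, f):
            log_omega0, fc, gamma, q = p
            return 10.0**log_omega0 * np.exp(-np.pi * f * t_travel / q) / (1.0 + (f / fc)**gamma)

        # synthetic "observed" spectrum, for demonstration only
        rng = np.random.default_rng(0)
        obs = model([-6.0, 1.2, 2.0, 300.0], f) * rng.lognormal(0.0, 0.1, f.size)

        def misfit(p):
            # least-squares misfit on log amplitudes (an assumed, common choice)
            return np.sum((np.log(model(p, f)) - np.log(obs))**2)

        bounds = [(-8, -3), (0.1, 20.0), (1.0, 3.0), (50.0, 1000.0)]   # assumed ranges
        result = basinhopping(misfit, x0=[-5.0, 1.0, 2.0, 200.0], niter=100,
                              minimizer_kwargs={"method": "L-BFGS-B", "bounds": bounds})
        print(result.x)   # estimated [log10(Omega0), fc, gamma, Q]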

    X-Ray microcalorimeter detectors - Technology developments for high energy astrophysics space missions

    Improvements in the design, fabrication, and performance of astronomical detectors have ushered in the so-called era of multi-messenger astrophysics, in which several different signals (electromagnetic waves, gravitational waves, neutrinos, cosmic rays) are processed to obtain detailed descriptions of their sources. Soft X-ray instrumentation has been developed over recent decades and used on board numerous space missions. This has allowed a deep understanding of several physical phenomena taking place in astrophysical sources on different scales, from normal stars to galaxy clusters and huge black holes. On the other hand, imaging and spectral capabilities in the hard X-ray band are still lagging behind, leaving a large discovery potential. Modern cryogenic microcalorimeters have an energy resolution two orders of magnitude or more better than CCD detectors at the same energy across the whole X-ray band. This significant improvement will permit important progress in high energy astrophysics thanks to the data that will be provided by future missions adopting this detector technology, such as the ESA L2 mission Athena and the JAXA/NASA mission XRISM, both under development, or the NASA LYNX mission presently under investigation. The JAXA/NASA mission Hitomi, launched in 2016 and failed before starting normal operation, has already given a hint of the high potential of such detectors. Due to their very high sensitivity, X-ray cryogenic microcalorimeters need to be shielded from out-of-band radiation by means of efficient thin filters. These microcalorimeters work by measuring the temperature increase caused by a photon that hits an X-ray absorber. In neutron transmutation doped germanium (NTD Ge) devices the temperature increase in the absorber is measured by a semiconductor thermometer made of germanium doped by the neutron transmutation doping technique. They are characterized by relatively low specific heat and low sensitivity to external magnetic fields. These characteristics make them promising hard X-ray detectors for space and laboratory applications. Research groups of the X-ray Astronomy Calibration and Testing (XACT) Laboratory of the Osservatorio Astronomico di Palermo – Istituto Nazionale di Astrofisica (INAF-OAPA) and of the Dipartimento di Fisica e Chimica “Emilio Segrè” (DiFC) of the Università di Palermo have already developed experience in the design, fabrication and testing of NTD Ge microcalorimeters. Furthermore, the research group has participated for many years in the design and development of filters for X-ray detectors on different space missions. This thesis concerns the development of materials and technologies for high-energy microcalorimeters. In particular, its aim is to design and fabricate thick bismuth absorbers for NTD germanium microcalorimeter arrays in order to extend their detection band toward hard X-ray energies. Filters for shielding the microcalorimeters from the background radiation reaching the detectors were also studied. The design and fabrication of thick bismuth absorbers for hard X-ray detection (20 keV ≤ E ≤ 100 keV) is part of an ongoing effort to develop arrays of NTD Ge microcalorimeters by planar technologies for astrophysical applications.
One potential application of such detectors is the high spectral resolution (∆E ~ 50 eV) investigation of the hard X-ray emission from the solar corona, which is the goal of a stratospheric balloon-borne experiment concept named MIcrocalorimeters STratospheric ExpeRiment for solar hard X-rays (MISTERX), presently under study at INAF-OAPA. The characterization activity on filters for microcalorimeters is also related to the implementation of the European Space Agency high energy mission Athena (Advanced Telescope for High Energy Astrophysics). This thesis describes the design, fabrication, and characterization of the bismuth absorbers, as well as the characterization of filters for Athena. Chapter 1 summarizes the working principles of NTD Ge microcalorimeters and their applications. Chapter 2 describes the design of the bismuth absorber array on suitable substrates. Chapter 3 focuses on the electroplating process for the bismuth layer deposition, with details on the design and fabrication of the microlithographic mask for the array patterning and on the development of the microlithographic process for the array fabrication on the chosen substrates. The fabrication of 4 × 4 absorber arrays is also described. Chapter 4 reports on the characterization of the deposited bismuth layers by different techniques; their morphology was investigated by scanning electron microscopy. Electrochemical impedance spectroscopy was used to improve the quality of the grown layers. The fabricated arrays were also characterized. Chapter 5 describes the characterization of different filter prototype samples developed for Athena. Mechanical robustness, radio-frequency attenuation and radiation damage caused by protons were evaluated. Radiation damage effects at different doses were investigated in particular on silicon nitride filters by scanning electron microscopy (SEM), atomic force microscopy (AFM), UV-Vis-IR spectroscopy and X-ray attenuation measurements. Details on both the technical detector requirements and the different sensor types are given in the Appendix.
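As a back-of-envelope sketch of the working principle summarized above (a photon of energy E deposited in a heat capacity C raises the temperature by ΔT = E/C, read out by an NTD Ge thermistor following the variable-range-hopping law R(T) = R0·exp(√(T0/T))), the snippet below evaluates these relations for assumed, order-of-magnitude parameter values; none of the numbers are taken from the thesis.

        # Hedged order-of-magnitude sketch of the NTD Ge microcalorimeter principle;
        # all parameter values are assumed placeholders, not thesis figures.
        import numpy as np

        k_B = 1.380649e-23                   # Boltzmann constant [J/K]
        E_photon = 60e3 * 1.602176634e-19    # 60 keV photon [J]

        T_bath = 0.1     # operating temperature [K], assumed
        C = 1.0e-12      # total heat capacity of absorber + sensor [J/K], assumed

        delta_T = E_photon / C               # temperature rise from one photon
        print(f"delta_T ~ {delta_T*1e3:.2f} mK")

        # NTD Ge thermistor response, R(T) = R0 * exp(sqrt(T0 / T)), assumed parameters
        R0, T0 = 10.0, 4.0                   # [ohm], [K]
        def R(T):
            return R0 * np.exp(np.sqrt(T0 / T))
        print(f"delta_R ~ {R(T_bath + delta_T) - R(T_bath):.0f} ohm")

        # thermodynamic limit on energy resolution, dE ~ 2.355 * xi * sqrt(k_B * T^2 * C)
        xi = 2.0                             # detector-dependent factor, assumed
        dE_fwhm = 2.355 * xi * np.sqrt(k_B * T_bath**2 * C) / 1.602176634e-19
        print(f"dE_FWHM ~ {dE_fwhm:.1f} eV")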

    Capabilities of Gossamer-1 derived small spacecraft solar sails carrying MASCOT-derived nanolanders for in-situ surveying of NEAs

    Any effort that intends to physically interact with specific asteroids requires understanding at least the composition and multi-scale structure of the surface layers, and sometimes also of the interior. It is therefore necessary first to characterize each target object sufficiently by a precursor mission before designing the mission which then interacts with the object. In small solar system body (SSSB) science missions, this trend towards landing and sample-return missions is most apparent. It has also led to much interest in MASCOT-like landing modules and instrument carriers, which integrate with their mothership at the instrument level and, owing to their size, are compatible even with small interplanetary missions. The DLR-ESTEC Gossamer Roadmap NEA Science Working Groups' studies identified Multiple NEA Rendezvous (MNR) as one of the space science missions only feasible with solar sail propulsion. Parallel studies of Solar Polar Orbiter (SPO) and Displaced L1 (DL1) space weather early warning missions outlined very lightweight sailcraft and the use of separable payload modules for operations close to Earth, as well as the ability to access any inclination and a wide range of heliocentric distances. These and many other studies outline the unique capability of solar sails to provide access to all SSSBs, at least within the orbit of Jupiter. Since the original MNR study, significant progress has been made in exploring the performance envelope of near-term solar sails for multiple NEA rendezvous. However, although it is comparatively easy for solar sails to reach and rendezvous with objects of any inclination and in the complete range of semi-major axis and eccentricity relevant to NEOs and PHOs, it remains notoriously difficult for sailcraft to interact physically with an SSSB target object as, e.g., the Hayabusa missions do. The German Aerospace Center, DLR, recently brought the Gossamer solar sail deployment technology to qualification status in the Gossamer-1 project. Development of closely related technologies continues for very large deployable membrane-based photovoltaic arrays in the GoSolAr project. We expand the philosophy of the Gossamer solar sail concept of efficient multiple sub-spacecraft integration to also include landers for one-way in-situ investigations and sample-return missions. These are equally useful for planetary defence scenarios, SSSB science and NEO utilization. We outline the technological concept used to complete such missions and the synergetic integration and operation of sail and lander. We similarly extend the philosophy of MASCOT and use its characteristic features as well as the concept of Constraints-Driven Engineering for a wider range of operations.
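    As a minimal illustration of the sail performance figure of merit that such studies typically use, the sketch below computes the characteristic acceleration a_c = 2·η·S0·A/(c·m) of an ideal flat sail at 1 au; the sail area, spacecraft mass and efficiency are assumed placeholders, not Gossamer-1 or MNR study values.

        # Hedged sketch of the standard solar-sail characteristic acceleration at 1 au;
        # area, mass and efficiency below are assumed placeholders.
        S0 = 1361.0        # solar constant at 1 au [W/m^2]
        c = 299792458.0    # speed of light [m/s]

        def characteristic_acceleration(area_m2, mass_kg, efficiency=0.85):
            """Ideal flat sail, face-on to the Sun, with an overall optical efficiency factor."""
            return 2.0 * efficiency * S0 * area_m2 / (c * mass_kg)

        a_c = characteristic_acceleration(area_m2=25.0, mass_kg=12.0)
        print(f"a_c = {a_c*1e3:.3f} mm/s^2")   # ~0.016 mm/s^2 for these assumed numbers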

    Bifunctionality: New Insights into the Class of (6-4)Photolyases and animal-like Cryptochromes

    The cryptochrome/photolyase family (CPF) is a large protein family of blue-light photoreceptors, which occur in all kingdoms of life. All members of this family utilize a flavin chromophore as catalytic cofactor and show high sequence and structural similarity, although they have different functions inside the living organism. Photolyases use the energy of light to repair UV-induced DNA lesions such as the cyclobutane pyrimidine dimer (CPD) or the pyrimidine-(6-4)-pyrimidone photoproduct ((6-4)PP). Cryptochromes, on the other hand, are involved in many different blue-light-regulated mechanisms, such as photoperiodic flowering in plants and the entrainment of the circadian rhythm in animals. This study focuses on the characterization of the photoreduction and (6-4) repair mechanisms of the subclass of animal cryptochromes and (6-4) photolyases, on the basis of the animal-like cryptochrome from the green alga Chlamydomonas reinhardtii (CraCRY). Through multiple sequence alignment with other CPFs and mutational studies, the specific residues involved in the photoreduction mechanism were successfully identified, leading to the discovery of a tyrosine as distal electron donor at the end of the conserved tryptophan triad. The photoreduction, with regard to the formation and decay of a tyrosyl radical, was studied extensively with several spectroscopic methods. All analyses resulted in the observation of an unusually long-lived tyrosyl radical upon photoreduction. As it turned out, CraCRY is not only a cryptochrome but also has (6-4) photolyase function, which makes it a bifunctional member of this group. To study structure-function relationships, the 3D structure of CraCRY in complex with its chromophores as well as a (6-4)PP was solved by X-ray crystallography. The structure reveals a new binding mode of the DNA lesion and provides insight into the active site. One of the essential residues for DNA repair (His1) exhibits a different conformation than in the common model, which may indicate an alternative mechanism for (6-4)PP repair. Most knowledge about cryptochrome structures derives from comparison with photolyases, but cryptochromes contain a highly variable C-terminal extension (CTE), which is missing in photolyases. In CraCRY this CTE is about 100 amino acids long and is not resolved in the solved crystal structure. For analysis of the CTE, hydrogen-deuterium exchange coupled with mass spectrometry was used, including a comparison of different reduction states. Although coverage of the CTE was incomplete, significant changes between the oxidized state and the fully reduced state (FADH−) were detectable. It was concluded that the photoreduction process and the formation of the tyrosyl radical trigger a structural change in the region between the loop carrying the tyrosyl radical and the C-terminal α22-helix. For further investigation of the intramolecular changes upon photoreduction and DNA repair, time-resolved crystallographic measurements of a class II CPD photolyase (MmCPDII) and of CraCRY were performed within a joint project at the free-electron laser SACLA. So far, the different conformations of the flavin cofactor of MmCPDII in its different oxidation states have been successfully derived. In the future, these experiments are expected to reveal the whole repair mechanism for the CPD lesion as well as for the (6-4)PP by time-resolved SFX.

    Digital watermark technology in security applications

    With the rising emphasis on security and the number of fraud-related crimes around the world, authorities are looking for new technologies to tighten identity security. Among many modern electronic technologies, digital watermarking has unique advantages for enhancing document authenticity. At the current stage of development, digital watermarking technologies are not as mature as other competing technologies for supporting identity authentication systems. This work presents improvements in the performance of two classes of digital watermarking techniques and investigates the issue of watermark synchronisation. Optimal performance can be obtained if the spreading sequences are designed to be orthogonal to the cover vector. In this thesis, two classes of orthogonalisation methods that generate binary sequences quasi-orthogonal to the cover vector are presented. One method, namely "Sorting and Cancelling", generates sequences that have a high level of orthogonality to the cover vector. The Hadamard matrix based orthogonalisation method, namely "Hadamard Matrix Search", is able to realise overlapped embedding, so that the watermarking capacity and image fidelity can be improved compared with using short watermark sequences. The results are compared with traditional pseudo-randomly generated binary sequences. The advantages of both classes of orthogonalisation methods are significant. Another watermarking method introduced in the thesis is based on writing-on-dirty-paper theory. The method is presented with biorthogonal codes, which offer the best robustness. The advantages and trade-offs of using biorthogonal codes with this watermark coding method are analysed comprehensively. Comparisons between orthogonal and non-orthogonal codes used in this watermarking method are also made. It is found that fidelity and robustness are contradictory and it is not possible to optimise them simultaneously. Comparisons are also made between all proposed methods, focused on three major performance criteria: fidelity, capacity and robustness. From two different viewpoints, the conclusions are not the same. From the fidelity-centric viewpoint, the dirty-paper coding method using biorthogonal codes has a very strong advantage in preserving image fidelity, and its advantage in capacity performance is also significant. However, from the power-ratio point of view, the orthogonalisation methods demonstrate a significant advantage in capacity and robustness. The conclusions are contradictory, but together they summarise the performance resulting from different design considerations. Watermark synchronisation is first provided by high-contrast frames around the watermarked image. Edge detection filters are used to detect the high-contrast borders of the captured image. By scanning the pixels from the border towards the centre, the locations of detected edges are stored. An optimal linear regression algorithm is then used to estimate the watermarked image frame: the slope of the fitted regression line gives the rotation angle of the rotated frame. The scaling is corrected by re-sampling the upright image to its original size. A theoretically studied method that is able to synchronise the captured image to sub-pixel accuracy is also presented. By using invariant transforms and the "symmetric phase only matched filter", the captured image can be corrected accurately to its original geometric size.
The method uses repeating watermarks to form an array in the spatial domain of the watermarked image; the locations of the array elements can then reveal information about rotation, translation and scaling through two filtering processes.
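As a minimal sketch of the frame-based rotation recovery described above, the snippet below fits a straight line by least squares to the pixel locations of a detected high-contrast frame edge and takes the slope of the fitted line as the rotation angle; the edge-point coordinates are synthesized here for illustration, and the variable names are assumptions rather than the thesis implementation.

        # Hedged sketch of rotation-angle estimation from a detected frame edge:
        # least-squares line fit, slope -> rotation angle. Synthetic data only.
        import numpy as np

        rng = np.random.default_rng(0)
        true_angle_deg = 3.0   # assumed rotation of the captured image

        # (x, y) pixel coordinates of the top frame edge, as found by scanning from
        # the border towards the centre after edge filtering (synthesized here)
        x = np.arange(0, 400, dtype=float)
        y = np.tan(np.radians(true_angle_deg)) * x + 50.0 + rng.normal(0.0, 0.5, x.size)

        # ordinary least-squares regression: the slope of the frame line gives the angle
        slope, intercept = np.polyfit(x, y, 1)
        print(f"estimated rotation: {np.degrees(np.arctan(slope)):.2f} deg")  # ~3.00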