121 research outputs found

    OREGAMI: Software Tools for Mapping Parallel Computations to Parallel Architectures

    The mapping problem in message-passing parallel processors involves the assignment of tasks in a parallel computation to processors and the routing of inter-task messages along the links of the interconnection network. We have developed a unified set of software tools called OREGAMI for automatic and guided mapping of parallel computations to parallel architectures, in order to achieve portability and maximal performance from parallel systems. Our tools include a description language which enables the programmer of parallel algorithms to specify information about the static and dynamic communication behavior of the computation to be mapped. This information is used by the mapping algorithms to assign tasks to processors and to route communication in the network topology. Two key features of our system are (a) the ability to take advantage of the regularity present in both the computation structure and the interconnection network, and (b) the desire to balance the user's knowledge and intuition with the computational power of efficient combinatorial algorithms.
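The abstract does not spell out OREGAMI's mapping algorithms, but the core idea of placing communicating tasks on nearby processors can be sketched as a simple greedy heuristic. Everything below (the 2D-mesh topology, function names, and the hop-count cost) is an illustrative assumption, not OREGAMI's actual interface:

```python
def hop_distance(p, q, mesh_width):
    """Manhattan (hop) distance between two processors on a 2D mesh."""
    px, py = divmod(p, mesh_width)
    qx, qy = divmod(q, mesh_width)
    return abs(px - qx) + abs(py - qy)

def greedy_map(task_edges, num_tasks, mesh_width):
    """Place one task per processor on a mesh_width x mesh_width mesh,
    greedily minimizing hops to already-placed communicating tasks."""
    neighbors = {t: set() for t in range(num_tasks)}
    for a, b in task_edges:
        neighbors[a].add(b)
        neighbors[b].add(a)
    free = list(range(mesh_width * mesh_width))  # unoccupied processors
    placement = {}
    for task in range(num_tasks):
        # cost of a candidate processor: total hops to placed neighbors
        best = min(free, key=lambda p: sum(
            hop_distance(p, placement[n], mesh_width)
            for n in neighbors[task] if n in placement))
        placement[task] = best
        free.remove(best)
    return placement
```

For a four-task chain on a 2x2 mesh this keeps every communicating pair on adjacent processors; a real mapper would also exploit the regularity the abstract mentions rather than placing tasks one by one.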

    Computer architecture for efficient algorithmic executions in real-time systems: New technology for avionics systems and advanced space vehicles

    Improvements and advances in the development of computer architecture now provide innovative technology for the recasting of traditional sequential solutions into high-performance, low-cost parallel systems to increase system performance. Research conducted in the development of a specialized computer architecture for the algorithmic execution of an avionics guidance and control problem in real time is described. A comprehensive treatment of both the hardware and software structures of a customized computer which performs real-time computation of guidance commands with updated estimates of target motion and time-to-go is presented. An optimal, real-time allocation algorithm was developed which maps the algorithmic tasks onto the processing elements. This allocation is based on critical path analysis. The final stage is the design and development of the hardware structures suitable for the efficient execution of the allocated task graph. The processing element is designed for rapid execution of the allocated tasks. Fault tolerance is a key feature of the overall architecture. Parallel numerical integration techniques, task definitions, and allocation algorithms are discussed. The parallel implementation is analytically verified and the experimental results are presented. The design of the data-driven computer architecture, customized for the execution of the particular algorithm, is discussed.
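As a rough illustration of allocation based on critical path analysis, the sketch below computes each task's critical-path "level" (the longest path from the task to a sink) and then list-schedules tasks in order of decreasing level. The task graph, durations, and two-processor setup are hypothetical examples, not the avionics system described above:

```python
def critical_path_lengths(succ, duration):
    """Longest path from each task to a sink (the task's 'level')."""
    level = {}
    def walk(t):
        if t not in level:
            level[t] = duration[t] + max(
                (walk(s) for s in succ.get(t, [])), default=0)
        return level[t]
    for t in duration:
        walk(t)
    return level

def list_schedule(succ, duration, num_procs):
    """Assign tasks to processors in decreasing critical-path order,
    respecting precedence; returns task -> (processor, start_time)."""
    level = critical_path_lengths(succ, duration)
    pred = {t: set() for t in duration}
    for t, ss in succ.items():
        for s in ss:
            pred[s].add(t)
    finish = {}                     # completion time of each task
    proc_free = [0.0] * num_procs   # earliest free time per processor
    schedule = {}
    # with positive durations, decreasing level is a topological order
    for t in sorted(duration, key=lambda t: -level[t]):
        ready = max((finish[p] for p in pred[t]), default=0.0)
        proc = min(range(num_procs), key=lambda i: proc_free[i])
        start = max(ready, proc_free[proc])
        schedule[t] = (proc, start)
        finish[t] = start + duration[t]
        proc_free[proc] = finish[t]
    return schedule
```

On a diamond-shaped graph (a feeds b and c, which both feed d) this reproduces the critical-path makespan with two processors; the thesis' optimal allocator would refine such a priority-list heuristic.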

    Theoretical study of the interaction of agonists with the 5-HT2A receptor

    The 5-HT2A receptor (5-HT2AR) is a biogenic amine receptor that belongs to class A of the G protein coupled receptors. It is characterized by a low affinity for serotonin (5-HT) and for other primary amines. Introduction of an ortho-methoxybenzyl substituent at the amine nitrogen increases the partial agonistic activity by a factor of 40 to 1400 compared with 5-HT. The aim of the present study was to analyse the QSAR of a series of 51 5-HT2AR partial agonistic arylethylamines, tested in vascular in-vitro assays on rats, at a structure-based level and to suggest ligand binding sites. The compounds belong to three different structural classes: (1) indoles, (2) methoxybenzenes and (3) quinazolinediones. Following a hierarchical strategy, different methods were applied which all contribute to the investigation of ligand-receptor interactions: fragment regression analysis (FRA), receptor modeling, docking studies and 3D QSAR approaches (comparative molecular field analysis, CoMFA, and comparative molecular similarity index analysis, CoMSIA). An initial FRA indicated that methoxy substituents on indole and phenyl derivatives increase the activity and may be involved in polar interactions with the 5-HT2AR. The large contribution of lipophilic substituents in the para position of phenethylamines suggests a fit to a specific hydrophobic pocket. Secondary benzylamines are more than one order of magnitude more active than their NH2 analogs. An ortho-OH or -OMe substituent at the benzyl moiety further increases activity. Homology models of the human and rat 5-HT2AR were generated using the crystal structures of bovine rhodopsin and of the beta2-adrenoceptor as templates. The derivation of the putative binding sites for the arylethylamines was based on the results from FRA and on mutagenesis data. Both templates led to 5-HT2AR models with similar topology of the binding pocket within the transmembrane domains TM3, TM5, TM6 and TM7.
Docking studies with representative members of the three structural classes suggested that the aryl moieties and particularly para-substituents in phenyl derivatives fit into a hydrophobic pocket formed by Phe243(5.47), Phe244(5.48) and Phe340(6.52). The 5-methoxy substituents in indole and phenyl compounds form H bonds with Ser239(5.43). In each case, an additional H bond with Ser159(3.36) may be assumed. The cationic amine interacts with the conserved Asp155(3.32). The benzyl group of secondary arylethylamines is inserted into another hydrophobic pocket formed by Phe339(6.51), Trp367(7.40) and Tyr370(7.43). In this region, the docking poses depend on the template used for model generation, leading to different interactions, especially of ortho-substituents. The docking studies with the beta2-adrenoceptor-based rat 5-HT2AR model provided templates for a structure-based alignment of the whole series, which was used in 3D QSAR analyses of the partial agonistic activity. Both approaches, CoMFA and CoMSIA, led to highly predictive models with low complexity (cross-validated q2 of 0.72 and 0.81 at 4 and 3 components, respectively). The results were largely compatible with the binding site and confirm the docking studies and the suggested ligand-receptor interactions. Steric and hydrophobic field effects on the potency indicate a hydrophobic pocket around the aryl moiety and near the para position of phenyl derivatives, and account for the increased activity of secondary benzylamines. The effects of electrostatic and H-bond acceptor fields suggest a favourable influence of negative charges around the aryl moiety, corresponding to the increase in potency caused by methoxy substituents in the 2-, 4-, 5- and 6-positions of phenethylamines and by the quinazolinedione oxygens. This is in accord with the role of Ser159(3.36) and Ser239(5.43) as H-bond donors. At the benzyl moiety, the negative charge and the acceptor potential of 2-hydroxy and -methoxy substituents are advantageous.
Agonists stabilize or induce active receptor states not reflected by the existing crystal structures. Based on models of different rhodopsin states, a homology modeling and ligand docking study was performed on corresponding 5-HT2AR states suggested to be specific to agonist and partial agonist binding, respectively. The models indicate collective conformational changes of TM domains during activation. The different 5-HT2AR states are similar with respect to the amino acids interacting with the arylethylamines, but show individual topologies of the binding sites. The interconversion of states by TM movements may be accompanied by co-translations and rotations of the ligands. In the case of the secondary amines considered, the tight fit of the benzyl substituent into a hydrophobic pocket containing key residues in TM6 probably impedes complete receptor activation by inhibiting the rotation of this helix. High affinity of a partial agonist is therefore often at the expense of its ability to fully activate the receptor.
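The cross-validated q2 values quoted above (0.72 and 0.81) follow the standard definition q2 = 1 - PRESS/SS, where PRESS sums the squared leave-one-out prediction errors and SS the squared deviations from the mean activity. As a minimal illustration of that statistic, the sketch below computes leave-one-out q2 for a simple least-squares line; a real CoMFA/CoMSIA analysis would use a PLS model over molecular fields instead:

```python
def loo_q2(xs, ys):
    """Leave-one-out q^2 = 1 - PRESS/SS for a least-squares line y = a*x + b."""
    n = len(xs)
    press = 0.0
    for i in range(n):
        # refit the line with sample i held out
        tx = [x for j, x in enumerate(xs) if j != i]
        ty = [y for j, y in enumerate(ys) if j != i]
        mx, my = sum(tx) / len(tx), sum(ty) / len(ty)
        sxx = sum((x - mx) ** 2 for x in tx)
        sxy = sum((x - mx) * (y - my) for x, y in zip(tx, ty))
        a = sxy / sxx
        b = my - a * mx
        press += (ys[i] - (a * xs[i] + b)) ** 2  # squared LOO error
    mean_y = sum(ys) / n
    ss = sum((y - mean_y) ** 2 for y in ys)
    return 1.0 - press / ss
```

Perfectly linear data give q2 = 1; values above roughly 0.5 are conventionally read as predictive, which is why the reported 0.72 and 0.81 count as highly predictive models.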

    Information Theoretic Methods For Biometrics, Clustering, And Stemmatology

    This thesis consists of four parts, three of which study issues related to theories and applications of biometric systems, and one which focuses on clustering. We establish an information theoretic framework and the fundamental trade-off between the utility of biometric systems and their security. The utility includes person identification and secret binding, while template protection, privacy, and secrecy leakage are the security issues addressed. A general model of biometric systems is proposed, in which secret binding and the use of passwords are incorporated. The system model captures major biometric system designs, including biometric cryptosystems, cancelable biometrics, secret binding and secret generating systems, and salt biometric systems. In addition to attacks at the database, information leakage from communication links between sensor modules and databases is considered. A general information theoretic rate outer bound is derived for characterizing and comparing the fundamental capacity, security risks, and benefits of different system designs. We establish connections between linear codes and biometric systems, so that one can directly use the vast literature of coding theory for various noise and source random processes to achieve good performance in biometric systems. We develop two biometrics based on laser Doppler vibrometry (LDV) signals and electrocardiogram (ECG) signals. In both cases, changes in the statistics of biometric traits of the same individual are the major challenge which obstructs many methods from producing satisfactory results. We propose a robust feature selection method that specifically accounts for changes in statistics. The method yields the best results in both LDV and ECG biometrics in terms of equal error rates in authentication scenarios. Finally, we address a different kind of learning problem from data, called clustering.
Instead of having a set of training data with true labels known, as in identification problems, we study the problem of grouping data points without given labels, and its application to computational stemmatology. Since the problem itself has no true answer, it is in general ill-posed unless some regularization or norm is set to define the quality of a partition. We propose the use of the minimum description length (MDL) principle for graph-based clustering. In the MDL framework, each data partitioning is viewed as a description of the data points, and the description that minimizes the total number of bits needed to describe the data points and the model itself is considered the best model. We show that on synthesized data the MDL clustering works well and fits the natural intuition of how data should be clustered. Furthermore, we developed a computational stemmatology method based on MDL, which achieves the best performance level on a large dataset.
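As a toy illustration of the two-part MDL score described above, the sketch below prices a 1D partition as model bits (one label per point plus a quantized center per cluster) plus data bits (a Gaussian residual code per cluster). The quantization and cost terms are illustrative assumptions, not the thesis' exact graph-based formulation:

```python
import math

def description_length(points, labels, bits_per_param=16):
    """Two-part MDL score of a 1D clustering: L(model) + L(data | model)."""
    clusters = {}
    for p, l in zip(points, labels):
        clusters.setdefault(l, []).append(p)
    k = len(clusters)
    n = len(points)
    # model cost: one label per point plus one quantized center per cluster
    model_bits = n * math.log2(max(k, 2)) + k * bits_per_param
    # data cost: -log2 likelihood of residuals under a per-cluster Gaussian
    data_bits = 0.0
    for members in clusters.values():
        mu = sum(members) / len(members)
        var = sum((p - mu) ** 2 for p in members) / len(members) + 1e-9
        for p in members:
            data_bits += 0.5 * math.log2(2 * math.pi * var) \
                         + ((p - mu) ** 2) / (2 * var * math.log(2))
    return model_bits + data_bits
```

For two well-separated groups, splitting them into two clusters yields a shorter total description than lumping them together, matching the intuition that the best partition is the one that compresses the data most.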

    The MANGO clockless network-on-chip: Concepts and implementation


    The BrightEyes-TTM: an open-source time-tagging module for fluorescence lifetime imaging microscopy applications

    The aim of this Ph.D. work is to show how an open-source, multi-channel and standalone time-tagging device was developed, validated and used in combination with a new generation of single-photon array detectors to pursue super-resolved, time-resolved fluorescence lifetime imaging measurements. Within the family of time-resolved fluorescence laser scanning microscopy (LSM) techniques, fluorescence lifetime imaging microscopy (FLIM) plays a relevant role in the life sciences, thanks to its ability to detect functional changes within the cellular micro-environment. Recent advancements in photon detection technologies, such as the introduction of asynchronous read-out single-photon avalanche diode (SPAD) array detectors, allow imaging of a fluorescent sample with spatial resolution below the diffraction limit while, at the same time, giving access to the single-photon information content needed for time-resolved FLIM measurements. Thus, super-resolved FLIM experiments can be accomplished using SPAD array detectors in combination with pulsed laser sources and special data acquisition systems (DAQs) capable of handling a multiplicity of inputs and dealing with the single-photon readouts generated by SPAD array detectors. Nowadays, the commercial market lacks a truly standalone, multi-channel, single-board, affordable time-tagging DAQ device specifically designed for super-resolved FLIM experiments. Moreover, no effort has yet been made in the scientific community to build a device that fills this gap. That is why, within this Ph.D. project, an open-source and low-cost device, the BrightEyes-TTM (time-tagging module), was developed and validated both for fluorescence lifetime and for time-resolved measurements in general. The BrightEyes-TTM belongs to a class of DAQ devices called time-to-digital converters (TDCs).
Field-programmable gate array (FPGA) technology was chosen for implementing the BrightEyes-TTM thanks to its reprogrammability and low cost. The literature reports several different FPGA-based TDC architectures. In particular, the differential delay-line TDC architecture turned out to be the most suitable for this Ph.D. project, as it offers an optimal trade-off between temporal precision, temporal range, temporal resolution, dead-time, linearity, and FPGA resources, which are all crucial characteristics for a TDC device. The goal of pursuing a cost-effective and further-upgradable open-source time-tagging device was achieved, as the BrightEyes-TTM was developed and assembled using low-cost, commercially available electronic development kits, thus allowing the architecture to be easily reproduced. The BrightEyes-TTM was deployed on an FPGA development board equipped with a USB 3.0 chip for communicating with a host processing unit and a multi-input/output custom-built interface card for interconnecting the TTM with the outside world. Licence-free software was used for acquiring, reconstructing and analyzing the BrightEyes-TTM time-resolved data. In order to characterize the BrightEyes-TTM performance and, at the same time, validate the developed multi-channel TDC architecture, the TTM was first tested on a bench and then integrated into a fluorescence LSM system. Yielding 30 ps single-shot precision and linearity performance that allow it to be employed for actual FLIM measurements, the BrightEyes-TTM, which also proved able to acquire data from many channels in parallel, was ultimately used with a SPAD array detector to perform fluorescence imaging and spectroscopy on biological systems. As the output of the Ph.D. work, the BrightEyes-TTM was released on GitHub as a fully open-source project with two aims.
The principal aim is to give any microscopy and life-science laboratory the possibility to implement and further develop single-photon-based time-resolved microscopy techniques. The second aim is to trigger the interest of the microscopy community and establish the BrightEyes-TTM as a new standard for single-photon FLSM and FLIM experiments.
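As a rough illustration of how a differential delay-line TDC produces a timestamp, the sketch below combines a coarse clock-period counter with a fine thermometer code read from the tapped delay line. The clock period, tap delay, constant names and register layout are all hypothetical (only the ~30 ps figure echoes the abstract), not the BrightEyes-TTM's actual design:

```python
CLOCK_PERIOD_PS = 4167   # hypothetical system clock (~240 MHz)
TAP_DELAY_PS = 30        # assumed fine-bin width, of the order of the
                         # 30 ps single-shot precision reported above

def decode_thermometer(code_bits):
    """Count leading 1s of the thermometer code: how many delay-line
    taps the signal edge passed before the clock edge sampled it."""
    taps = 0
    for b in code_bits:
        if b != 1:
            break
        taps += 1
    return taps

def timestamp_ps(coarse_count, code_bits):
    """Timestamp in picoseconds: whole clock periods counted by the
    coarse counter, minus the fine interpolation within one period."""
    fine_ps = decode_thermometer(code_bits) * TAP_DELAY_PS
    return coarse_count * CLOCK_PERIOD_PS - fine_ps
```

In a real implementation the tap delays are not uniform, so a calibration table replaces the constant TAP_DELay per bin; that non-uniformity is exactly the linearity the abstract says was characterized on the bench.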