
    Bayesian multi-modal model comparison: a case study on the generators of the spike and the wave in generalized spike–wave complexes

    We present a novel approach to assessing the networks involved in the generation of spontaneous pathological brain activity, based on multi-modal imaging data. We propose to use probabilistic fMRI-constrained EEG source reconstruction as a complement to EEG-correlated fMRI analysis to disambiguate between networks that co-occur at the fMRI time resolution. The method is based on Bayesian model comparison, where the different models correspond to different combinations of fMRI-activated (or deactivated) cortical clusters. By computing the model evidence (or marginal likelihood) of every candidate source-space partition, we can infer the most probable set of fMRI regions that generated a given EEG scalp data window. We illustrate the method using EEG-correlated fMRI data acquired in a patient with ictal generalized spike–wave (GSW) discharges, to examine whether different networks are involved in the generation of the spike and the wave components, respectively. To this end, we compared a family of 128 EEG source models, based on the combinations of seven regions haemodynamically involved (deactivated) during a prolonged ictal GSW discharge, namely: bilateral precuneus, bilateral medial frontal gyrus, bilateral middle temporal gyrus, and right cuneus. Bayesian model comparison revealed that the most likely model associated with the spike component consists of a prefrontal region and bilateral temporal–parietal regions, while the most likely model associated with the wave component comprises the same temporal–parietal regions only. This result supports the hypothesis of different neurophysiological mechanisms underlying the generation of the spike versus the wave components of GSW discharges.
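    As an illustration of the model comparison step (a minimal sketch, not the authors' implementation), the Python fragment below enumerates all 2^7 = 128 subsets of the seven regions and converts log model evidences into posterior model probabilities under a flat prior over models; the log-evidence values here are random placeholders standing in for the values that the probabilistic source reconstruction would return for each candidate partition.

from itertools import combinations
import numpy as np

regions = ["L precuneus", "R precuneus",
           "L medial frontal gyrus", "R medial frontal gyrus",
           "L middle temporal gyrus", "R middle temporal gyrus",
           "R cuneus"]

# Enumerate all 128 candidate source-space partitions (subsets of regions).
models = [frozenset(c)
          for r in range(len(regions) + 1)
          for c in combinations(regions, r)]

# Placeholder log evidences; in practice each value would come from the
# fMRI-constrained EEG source reconstruction of the data window.
rng = np.random.default_rng(0)
log_evidence = np.array([rng.normal() for _ in models])

# Posterior model probabilities under a uniform model prior
# (a numerically stable softmax of the log evidences).
posterior = np.exp(log_evidence - log_evidence.max())
posterior /= posterior.sum()

best = models[int(np.argmax(posterior))]
print("most probable region set:", sorted(best))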

    On Self-Dual Quantum Codes, Graphs, and Boolean Functions

    A short introduction to quantum error correction is given, and it is shown that zero-dimensional quantum codes can be represented as self-dual additive codes over GF(4) and also as graphs. We show that graphs representing several such codes with high minimum distance can be described as nested regular graphs having minimum regular vertex degree and containing long cycles. Two graphs correspond to equivalent quantum codes if they are related by a sequence of local complementations. We use this operation to generate orbits of graphs, and thus classify all inequivalent self-dual additive codes over GF(4) of length up to 12, where previously only all codes of length up to 9 were known. We show that these codes can be interpreted as quadratic Boolean functions, and we define non-quadratic quantum codes, corresponding to Boolean functions of higher degree. We look at various cryptographic properties of Boolean functions, in particular the propagation criteria. The new aperiodic propagation criterion (APC) and the APC distance are then defined. We show that the distance of a zero-dimensional quantum code is equal to the APC distance of the corresponding Boolean function. Orbits of Boolean functions with respect to the {I,H,N}^n transform set are generated. We also study the peak-to-average power ratio with respect to the {I,H,N}^n transform set (PAR_IHN), and prove that the PAR_IHN of a quadratic Boolean function is related to the size of the maximum independent set over the corresponding orbit of graphs. A construction technique for non-quadratic Boolean functions with low PAR_IHN is proposed. It is finally shown that both PAR_IHN and APC distance can be interpreted as partial entanglement measures. Comment: Master's thesis, 105 pages, 33 figures.
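    As a concrete illustration of the graph operation used to generate orbits of equivalent codes (a minimal sketch using the standard definition of local complementation, not code from the thesis), the Python fragment below complements the neighbourhood of a chosen vertex and then collects the set of labelled graphs reachable from a small example graph, the 5-cycle.

import numpy as np

def local_complement(adj, v):
    """Return a copy of adj with the neighbourhood of vertex v complemented."""
    adj = adj.copy()
    nbrs = np.flatnonzero(adj[v])
    for a in nbrs:
        for b in nbrs:
            if a < b:
                adj[a, b] ^= 1
                adj[b, a] ^= 1
    return adj

# Example graph: the 5-cycle.
n = 5
adj = np.zeros((n, n), dtype=int)
for i in range(n):
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1

# Collect the orbit of labelled graphs reachable by local complementations.
orbit = {adj.tobytes()}
frontier = [adj]
while frontier:
    g = frontier.pop()
    for v in range(n):
        h = local_complement(g, v)
        if h.tobytes() not in orbit:
            orbit.add(h.tobytes())
            frontier.append(h)

print("labelled graphs in the LC orbit of the 5-cycle:", len(orbit))

    Classifying codes up to equivalence additionally factors out graph isomorphism over such orbits, which is the part of the computation the thesis addresses with dedicated algorithms.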

    A generalised formulation of G-continuous Bezier elements applied to non-linear MHD simulations

    The international tokamak ITER is progressing towards assembly completion and first-plasma operation, which will be a physics and engineering challenge for the fusion community. In the preparation of ITER experimental scenarios, non-linear MHD simulations play an essential role in understanding and predicting the behaviour and stability of tokamak plasmas in future fusion power plants. The development of MHD codes like JOREK is a key aspect of this research effort, and provides invaluable insight into plasma stability and the control of global and localised plasma events, such as Edge-Localised-Modes and disruptions. In this paper, we present an operational implementation of a new, generalised formulation of Bezier finite elements in the JOREK code, a significant advancement over the previously used G1-continuous bi-cubic Bezier elements. This new mathematical method enables Bezier elements of any odd polynomial order n, with guaranteed G-continuity of level (n−1)/2. The generalised method is defined, and a rigorous mathematical proof is provided for the G-continuity requirement. Key details of the code implementation are given, together with a suite of tests demonstrating the mathematical reliability of the finite-element method as well as its practical usability for typical non-linear tokamak MHD simulations. A demonstration of a state-of-the-art simulation of an Edge-Localised-Mode instability in the future ITER tokamak, with realistic grid geometry, concludes the study.
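    For reference (standard definitions, not reproduced from the paper), the degree-n Bernstein basis underlying the Bezier elements and the continuity level quoted above can be written as

\[
  B_i^{n}(t) = \binom{n}{i}\, t^{i} (1 - t)^{\,n-i},
  \qquad i = 0, \dots, n, \quad t \in [0, 1],
\]
\[
  \text{G-continuity level} = \frac{n-1}{2} \quad (n \ \text{odd}),
  \qquad n = 3 \mapsto G^{1} \ \text{(the previous bi-cubic elements)},
  \quad n = 5 \mapsto G^{2}.
\]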

    A treatment of stereochemistry in computer aided organic synthesis

    This thesis describes the author's contributions to a new stereochemical processing module constructed for the ARChem retrosynthesis program. The purpose of the module is to add the ability to perform enantioselective and diastereoselective retrosynthetic disconnections and to generate appropriate precursor molecules. The module uses evidence-based rules generated from a large database of literature reactions.

    Chapter 1 provides an introduction and critical review of the published body of work on computer-aided synthesis design. The role of computer perception of key structural features (rings, functional groups, etc.) and the construction and use of reaction transforms for generating precursors is discussed. Emphasis is also given to the application of strategies in retrosynthetic analysis. The availability of large reaction databases has enabled a new generation of retrosynthesis design programs that use automatically generated transforms assembled from published reactions. A brief description of the transform generation method employed by ARChem is given.

    Chapter 2 describes the algorithms devised by the author for the computer recognition and representation of the stereochemical features found in molecule and reaction scheme diagrams. The approach is generalised and uses flexible recognition patterns to transform information found in chemical diagrams into concise stereo descriptors for computer processing. An algorithm for efficiently comparing and classifying pairs of stereo descriptors is described. This algorithm is central to solving the stereochemical constraints in a variety of substructure matching problems addressed in Chapter 3. The concise representation of reactions and transform rules as hyperstructure graphs is also described.

    Chapter 3 is concerned with the efficient and reliable detection of stereochemical symmetry in molecules, reactions, and rules. A novel symmetry perception algorithm, based on a constraint satisfaction problem (CSP) solver, is described, along with the use of a CSP solver to implement an isomorph-free matching algorithm for stereochemical substructure matching. The prime function of this algorithm is to seek out unique retron locations in target molecules and then to generate precursor molecules without duplications due to symmetry. Novel algorithms are described for classifying asymmetric, pseudo-asymmetric, and symmetric stereocentres; meso, centro, and C2-symmetric molecules; and the stereotopicity of trigonal (sp2) centres.

    Chapter 4 introduces and formalises the annotated structural language used to create both retrosynthetic rules and the patterns used for functional group recognition. A novel functional group recognition package is described, along with its use to detect important electronic features such as electron-withdrawing or electron-donating groups and leaving groups. The functional groups and electronic features are used as constraints in retron rules to improve transform relevance.

    Chapter 5 details the approach taken to design detailed stereoselective and substrate-controlled transforms from organised hierarchies of rules. The rules employ a rich set of constraint annotations that concisely describe the keying retrons. The application of the transforms to collating evidence-based scoring parameters from published reaction examples is described. A survey of available reaction databases and the techniques for mining stereoselective reactions is presented, and a data mining tool developed for finding the most reputable stereoselective reaction types for coding as transforms is demonstrated.

    For various reasons it was not possible during the research period to fully integrate this work with the ARChem program. Instead, Chapter 6 introduces a novel one-step retrosynthesis module to test the developed transforms. The retrosynthesis algorithms use the organisation of the transform rule hierarchy to efficiently locate the best retron matches using all applicable stereoselective transforms. This module was tested on a small set of selected target molecules, and the generated routes were ranked using a series of measured parameters, including: stereocentre clearance and bond cleavage; example reputation; estimated stereoselectivity with reliability; and evidence of tolerated functional groups. In addition, a method for detecting regioselectivity issues is presented.

    This work presents a number of algorithms using common set and graph theory operations and notations. Appendix A lists the set theory symbols and their meanings. Appendix B summarises and defines the common graph theory terminology used throughout this thesis.
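    To make the flavour of the symmetry perception step concrete, the following is a rough, hypothetical sketch (not the CSP formulation used in the thesis): it enumerates the automorphisms of a small labelled molecular graph by backtracking, pruning assignments that violate element-label or adjacency constraints.

# Hypothetical illustration of symmetry perception as a constraint-satisfaction
# search: enumerate automorphisms of a labelled molecular graph by backtracking.
# This is only a toy sketch of the idea, not the thesis's CSP algorithm.
def automorphisms(atoms, bonds):
    """atoms: list of element symbols; bonds: set of frozenset({i, j}) pairs."""
    n = len(atoms)
    adj = [[frozenset((i, j)) in bonds for j in range(n)] for i in range(n)]
    results = []

    def extend(mapping):
        i = len(mapping)
        if i == n:
            results.append(tuple(mapping))
            return
        for cand in range(n):
            if cand in mapping:
                continue
            if atoms[cand] != atoms[i]:                   # element labels must match
                continue
            if any(adj[i][j] != adj[cand][mapping[j]]     # adjacency must be preserved
                   for j in range(i)):
                continue
            extend(mapping + [cand])

    extend([])
    return results

# Toy example: a 4-atom chain O-C-C-O with identical end groups.
atoms = ["O", "C", "C", "O"]
bonds = {frozenset(b) for b in [(0, 1), (1, 2), (2, 3)]}
for perm in automorphisms(atoms, bonds):
    print(perm)

    On this toy chain the only non-trivial automorphism is the end-for-end reversal (3, 2, 1, 0); detecting symmetries of this kind is what allows precursor generation to avoid duplicate, symmetry-equivalent retron matches.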
