
    Seizure characterisation using frequency-dependent multivariate dynamics

    The characterisation of epileptic seizures assists in the design of targeted pharmaceutical seizure prevention techniques and pre-surgical evaluations. In this paper, we expand on recent use of multivariate techniques to study the cross-correlation dynamics between electroencephalographic (EEG) channels. The Maximum Overlap Discrete Wavelet Transform (MODWT) is applied to separate the EEG channels into their underlying frequencies. The dynamics of the cross-correlation matrix between channels, at each frequency, are then analysed in terms of the eigenspectrum. By examining the eigenspectrum, we show that it is possible to identify frequency-dependent changes in the correlation structure between channels which may be indicative of seizure activity. The technique is applied to EEG epileptiform data and the results indicate that the correlation dynamics vary over time and frequency, with larger correlations between channels at high frequencies. Additionally, a redistribution of wavelet energy is found, with increased fractional energy demonstrating the relative importance of high frequencies during seizures. Dynamical changes also occur in both correlation and energy at lower frequencies during seizures, suggesting that monitoring the frequency-dependent correlation structure can characterise changes in EEG signals during these events. Future work will involve the study of other large eigenvalues and inter-frequency correlations to determine additional seizure characteristics.
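    The central computation described above, tracking eigenvalues of the per-band channel cross-correlation matrix over sliding windows, can be sketched as follows. This is a minimal illustration on synthetic data: it substitutes a crude FFT band-pass for the MODWT, looks only at the largest eigenvalue, and none of the signal parameters or band edges come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 8-channel "EEG": a shared 40 Hz component plus channel noise,
# standing in for real epileptiform recordings (illustrative only).
fs, n_ch, n_s = 256, 8, 256 * 10
t = np.arange(n_s) / fs
common = np.sin(2 * np.pi * 40 * t)
eeg = 0.5 * common + rng.standard_normal((n_ch, n_s))

def band_eigenspectrum(x, fs, lo, hi, win):
    """Largest eigenvalue of the channel cross-correlation matrix per
    sliding window, after a crude FFT band-pass (a stand-in for the
    MODWT decomposition used in the paper)."""
    f = np.fft.rfftfreq(x.shape[1], 1 / fs)
    spec = np.fft.rfft(x, axis=1)
    spec[:, (f < lo) | (f > hi)] = 0          # zero out-of-band bins
    xb = np.fft.irfft(spec, n=x.shape[1], axis=1)
    lam = []
    for s in range(0, x.shape[1] - win + 1, win):
        c = np.corrcoef(xb[:, s:s + win])     # n_ch x n_ch correlation
        lam.append(np.linalg.eigvalsh(c)[-1]) # top eigenvalue
    return np.array(lam)

lam_high = band_eigenspectrum(eeg, fs, 30, 70, win=fs)  # band with shared activity
lam_low = band_eigenspectrum(eeg, fs, 1, 8, win=fs)     # band with noise only
print(lam_high.mean(), lam_low.mean())
```

    With the shared component placed at 40 Hz, the top eigenvalue in the 30–70 Hz band sits well above that of the 1–8 Hz band, mirroring the paper's observation of larger inter-channel correlations at the frequencies carrying seizure activity.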

    Contexts, refinement and determinism

    In this paper, we have been influenced by those who take an “engineering view” of the problem of designing systems, i.e. a view that is motivated by what someone designing a real system will be concerned with, and what questions will arise as they work on their design. Specifically, we have borrowed from the testing work of Hennessy, de Nicola and van Glabbeek, e.g. [13, 5, 21, 40, 39]. Here we concentrate on one fundamental part of the engineering view and where consideration of it leads. The aspects we are concerned with are computational entities in contexts, observed by users. This leads to formalising design steps that are often left informal, which in turn gives insights into non-determinism and ultimately makes it possible to use refinement in situations where existing techniques fail.

    Characterisation of Strongly Normalising lambda-mu-Terms

    We provide a characterisation of strongly normalising terms of the lambda-mu-calculus by means of a type system that uses intersection and product types. The presence of the latter and a restricted use of the type omega enable us to represent the particular notion of continuation used in the literature for the definition of semantics for the lambda-mu-calculus. This makes it possible to lift the well-known characterisation property for strongly-normalising lambda-terms - that uses intersection types - to the lambda-mu-calculus. From this result an alternative proof of strong normalisation for terms typeable in Parigot's propositional logical system follows, by means of an interpretation of that system into ours. Comment: In Proceedings ITRS 2012, arXiv:1307.784
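    The property being characterised, that every reduction sequence from a term terminates, can be made concrete for the plain lambda-calculus (not lambda-mu). The sketch below uses de Bruijn indices and a fuel bound, so it is only a semi-decision: `False` may simply mean the fuel ran out. The intersection-type system in the paper decides the property without such a bound; this code is an independent illustration, not the paper's method.

```python
# Terms: ('var', n) is a de Bruijn index, ('lam', b) binds index 0 in b,
# ('app', f, a) is application.
def shift(t, d, c=0):
    """Shift free indices >= c by d."""
    if t[0] == 'var':
        return ('var', t[1] + d) if t[1] >= c else t
    if t[0] == 'lam':
        return ('lam', shift(t[1], d, c + 1))
    return ('app', shift(t[1], d, c), shift(t[2], d, c))

def subst(t, s, n=0):
    """Capture-avoiding substitution of s for index n in t."""
    if t[0] == 'var':
        if t[1] == n:
            return shift(s, n)
        return ('var', t[1] - 1) if t[1] > n else t
    if t[0] == 'lam':
        return ('lam', subst(t[1], s, n + 1))
    return ('app', subst(t[1], s, n), subst(t[2], s, n))

def step(t):
    """All one-step beta reducts of t."""
    out = []
    if t[0] == 'app':
        f, a = t[1], t[2]
        if f[0] == 'lam':
            out.append(subst(f[1], a))        # contract the head redex
        out += [('app', f2, a) for f2 in step(f)]
        out += [('app', f, a2) for a2 in step(a)]
    elif t[0] == 'lam':
        out += [('lam', b2) for b2 in step(t[1])]
    return out

def strongly_normalising(t, fuel=200):
    """True if every reduction sequence from t terminates within `fuel`
    steps; False may also mean the fuel bound was hit."""
    if fuel == 0:
        return False
    return all(strongly_normalising(r, fuel - 1) for r in step(t))

I = ('lam', ('var', 0))                        # identity, strongly normalising
omega = ('lam', ('app', ('var', 0), ('var', 0)))
Omega = ('app', omega, omega)                  # reduces only to itself
print(strongly_normalising(('app', I, I)))     # True
print(strongly_normalising(Omega))             # False
```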

    Normalisation Control in Deep Inference via Atomic Flows

    We introduce `atomic flows': they are graphs obtained from derivations by tracing atom occurrences and forgetting the logical structure. We study simple manipulations of atomic flows that correspond to complex reductions on derivations. This allows us to prove, for propositional logic, a new and very general normalisation theorem, which contains cut elimination as a special case. We operate in deep inference, which is more general than other syntactic paradigms, and where normalisation is more difficult to control. We argue that atomic flows are a significant technical advance for normalisation theory, because 1) the technique they support is largely independent of syntax; 2) indeed, it is largely independent of logical inference rules; 3) they constitute a powerful geometric formalism, which is more intuitive than syntax.

    Nominal C-Unification

    Nominal unification is an extension of first-order unification that takes into account the \alpha-equivalence relation generated by binding operators, following the nominal approach. We propose a sound and complete procedure for nominal unification with commutative operators, or nominal C-unification for short, which has been formalised in Coq. The procedure transforms nominal C-unification problems into simpler, finite families of fixpoint problems, whose solutions can be generated by algebraic techniques on combinatorics of permutations. Comment: Pre-proceedings paper presented at the 27th International Symposium on Logic-Based Program Synthesis and Transformation (LOPSTR 2017), Namur, Belgium, 10-12 October 2017 (arXiv:1708.07854)
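    The baseline that nominal C-unification extends is plain first-order syntactic unification. The sketch below implements that baseline only: it has no permutations, freshness constraints, or commutative operators, and the term encoding (variables as strings prefixed with `?`, compound terms as `(functor, args...)` tuples) is an assumption of this illustration, not the paper's formalisation.

```python
def walk(t, s):
    """Follow variable bindings in substitution s."""
    while isinstance(t, str) and t.startswith('?') and t in s:
        t = s[t]
    return t

def occurs(v, t, s):
    """Occurs check: does variable v appear in t under s?"""
    t = walk(t, s)
    if t == v:
        return True
    if isinstance(t, tuple):
        return any(occurs(v, a, s) for a in t[1:])
    return False

def unify(a, b, s=None):
    """Robinson-style first-order unification; returns a most general
    substitution (dict) or None if the terms do not unify."""
    if s is None:
        s = {}
    a, b = walk(a, s), walk(b, s)
    if a == b:
        return s
    if isinstance(a, str) and a.startswith('?'):
        return None if occurs(a, b, s) else {**s, a: b}
    if isinstance(b, str) and b.startswith('?'):
        return None if occurs(b, a, s) else {**s, b: a}
    if (isinstance(a, tuple) and isinstance(b, tuple)
            and len(a) == len(b) and a[0] == b[0]):
        for x, y in zip(a[1:], b[1:]):
            s = unify(x, y, s)
            if s is None:
                return None
        return s
    return None

print(unify(('f', '?x', ('g', 'a')), ('f', 'b', '?y')))
# {'?x': 'b', '?y': ('g', 'a')}
```

    The nominal extension replaces syntactic equality of atoms with \alpha-equivalence modulo permutations, and the commutative case is what forces the finite families of fixpoint problems described in the abstract.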

    Developments in plant breeding for improved nutritional quality of soya beans I. Protein and amino acid content

    Soya beans, like other legumes, contain low concentrations of the nutritionally essential sulphur amino acid, methionine. Cysteine, although not an essential amino acid because it can be synthesized from methionine, also influences the nutritional quality of soya bean products when it is present only at low levels. A low cysteine content will also aggravate a methionine deficiency. Soya bean lines deficient in 7S protein subunits have been identified. The 7S proteins contain substantially less methionine and cysteine than the 11S proteins. With the myriad of genetic null alleles for these subunits, it may be possible to tailor the 7S/11S storage protein ratio and their total composition in seeds to include only those subunits with the richest sulphur amino acid composition. Cotyledon feeding experiments, using isolated soya bean cotyledons, demonstrated that addition of methionine to the culture media caused increased synthesis of both proteins and free amino acids, but the mechanism by which this takes place is not clear.

    Application of sludge-based carbonaceous materials in a hybrid water treatment process based on adsorption and catalytic wet air oxidation

    This paper describes a preliminary evaluation of the performance of carbonaceous materials prepared from sewage sludges (SBCMs) in a hybrid water treatment process based on adsorption and catalytic wet air oxidation; phenol was used as the model pollutant. Three different sewage sludges were treated by either carbonisation or steam activation, and the physico-chemical properties of the resultant carbonaceous materials (e.g. hardness, BET surface area, ash and elemental content, surface chemistry) were evaluated and compared with a commercial reference activated carbon (PICA F22). The adsorption capacity for phenol of the SBCMs was greater than suggested by their BET surface area, but less than that of F22; a steam activated, dewatered raw sludge (SA_DRAW) had the greatest adsorption capacity of the SBCMs in the investigated range of concentrations (<0.05 mol L−1). In batch oxidation tests, the SBCMs demonstrated catalytic behaviour arising from their substrate adsorptivity and metal content. Recycling of SA_DRAW in successive oxidations led to significant structural attrition, and a hardened SA_DRAW was evaluated but found to be unsatisfactory during the oxidation step. In a combined adsorption–oxidation sequence, both the PICA carbon and a selected SBCM showed deterioration in phenol adsorption after oxidative regeneration, but a steady-state performance was reached after 2 or 3 cycles.
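    Adsorption-capacity comparisons of the kind made above are commonly quantified by fitting an isotherm model to equilibrium data. The sketch below fits the Langmuir isotherm, a standard choice for phenol on activated carbons, via its linearised form. All concentrations and parameter values are hypothetical, not taken from the paper.

```python
import numpy as np

# Langmuir isotherm: qe = qmax * KL * Ce / (1 + KL * Ce)
# qmax: monolayer capacity (mmol/g), KL: affinity constant (L/mmol).
qmax_true, KL_true = 2.5, 18.0                       # hypothetical values
Ce = np.array([0.05, 0.1, 0.2, 0.5, 1.0, 2.0, 5.0])  # equilibrium conc., mmol/L
qe = qmax_true * KL_true * Ce / (1 + KL_true * Ce)   # synthetic "measurements"

# Linearised form: Ce/qe = Ce/qmax + 1/(KL*qmax), so a straight-line fit
# of Ce/qe against Ce recovers both parameters.
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
qmax_fit = 1.0 / slope
KL_fit = 1.0 / (intercept * qmax_fit)
print(qmax_fit, KL_fit)
```

    With noise-free synthetic data the fit recovers the generating parameters exactly; with real isotherm data a nonlinear least-squares fit of the untransformed model is usually preferred, since the linearisation distorts the error structure.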

    Review of recent research towards power cable life cycle management

    Power cables are integral to modern urban power transmission and distribution systems. For power cable asset managers worldwide, a major challenge is how to manage effectively the expensive and vast network of cables, many of which are approaching, or have passed, their design life. This study provides an in-depth review of recent research and development in cable failure analysis, condition monitoring and diagnosis, life assessment methods, fault location, and optimisation of maintenance and replacement strategies. These topics are essential to cable life cycle management (LCM), which aims to maximise the operational value of cable assets and is now being implemented in many power utility companies. The review expands on material presented at the 2015 JiCable conference and incorporates other recent publications. The review concludes that the full potential of cable condition monitoring, condition and life assessment has not yet been fully realised. It is proposed that a combination of physics-based life modelling and statistical approaches, giving consideration to practical condition monitoring results and insulation response to in-service stress factors and short-term stresses, such as water ingress, mechanical damage and imperfections left from manufacturing and installation processes, will be key to success in improved LCM of the vast amount of cable assets around the world.
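    The physics-based life modelling mentioned in the conclusion is often an electro-thermal model combining an inverse-power law in electric field with an Arrhenius term in temperature, accumulated over an operating profile via Miner's rule. The sketch below uses that standard form; every parameter value is hypothetical and not taken from the review.

```python
import math

# Electro-thermal life model for polymeric insulation (standard form;
# all parameter values below are hypothetical, for illustration only).
L0 = 40.0 * 365 * 24     # life at reference stress, hours
E0, T0 = 10.0, 293.15    # reference field (kV/mm) and temperature (K)
n, B = 12.0, 12000.0     # voltage-endurance coefficient, thermal constant (K)

def life_hours(E, T):
    """Expected insulation life at field E (kV/mm) and temperature T (K):
    inverse-power law in E, Arrhenius dependence on T."""
    return L0 * (E / E0) ** (-n) * math.exp(-B * (1 / T0 - 1 / T))

def miner_damage(profile):
    """Cumulative ageing by Miner's rule: sum of dt / L(E, T) over
    (hours, E, T) operating conditions; failure is expected near 1.0."""
    return sum(dt / life_hours(E, T) for dt, E, T in profile)

# One year split between normal load and a hot overload period (illustrative).
d = miner_damage([(8000, 10.0, 303.15), (760, 12.0, 323.15)])
print(d)
```

    Statistical approaches enter by treating parameters such as n and B as distributions fitted to failure data rather than fixed constants, which is the combination the review argues for.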