    Kalman-filter-based EEG source localization

    This thesis uses the Kalman filter (KF) to solve the electroencephalographic (EEG) inverse problem and image the neuronal sources of scalp-recorded activity. Chapter 1 introduces EEG source localization and the KF, and discusses how the KF can solve the inverse problem. Chapter 2 introduces an EEG inverse solution using a spatially whitened KF (SWKF) to reduce the computational burden. Likelihood maximization is used to fit spatially uniform neural model parameters to simulated and clinical EEGs, and the SWKF accurately reconstructs the source dynamics. Filter performance is analyzed by computing the statistical properties of the innovations, which reveal spatial variations in performance that could be reduced by spatially varying parameters. Chapter 3 investigates the SWKF via one-dimensional (1D) simulations. Motivated by Chapter 2, two model parameters are given Gaussian spatial profiles to better reflect brain dynamics, and constrained optimization ensures that the estimated parameters retain clear biophysical interpretations. Inverse solutions are also computed using the optimal linear KF, and both filters produce accurate state estimates. Spatially varying parameters are correctly identified from datasets with transient dynamics, but estimates for driven datasets are degraded by the unmodeled drive term. Chapter 4 treats the whole-brain EEG inverse problem, applying features of the 1D simulations to the SWKF of Chapter 2: spatially varying parameters model the spatial variation of the alpha rhythm, and the simulated EEG exhibits wave-like patterns and spatially varying dynamics. As in Chapter 3, optimization constrains the model parameters to appropriate ranges. State estimation is again reliable for simulated and clinical EEG, although the spatially varying parameters do not improve accuracy, and parameter estimation is unreliable, with wave velocity underestimated; contributing factors are identified and approaches to overcome them are discussed. Chapter 5 summarizes the main findings and outlines future work.
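The state estimation at the core of each chapter follows the standard linear KF predict/update cycle; a minimal sketch in Python/NumPy (generic textbook form, not the thesis's SWKF — the matrix names F, H, Q, R are standard notation, not taken from the thesis):

```python
import numpy as np

def kalman_step(x, P, y, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter.

    x, P : prior state estimate (sources) and its covariance
    y    : new measurement (one EEG sample across electrodes)
    F, H : state-transition and observation (lead-field) matrices
    Q, R : process- and measurement-noise covariances
    """
    # Predict: propagate the source state one sample forward
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Innovation: its statistical properties diagnose filter performance
    v = y - H @ x_pred
    S = H @ P_pred @ H.T + R
    # Update: correct the prediction with the Kalman gain
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ v
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new, v
```

Spatial whitening, as used in the SWKF, is roughly a change of basis that decorrelates the spatial dimensions so the update step becomes much cheaper; the innovations v returned here are the quantities whose statistics Chapter 2 analyzes.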

    Cattle welfare assessment at the slaughterhouse level: Integrated risk profiles based on the animal's origin, pre-slaughter logistics, and iceberg indicators

    Detection of on-farm and transport animal welfare problems at the slaughterhouse level is a key issue for the meat industry; however, the assessments usually do not include basic aspects of animal health. It is therefore necessary to develop an assessment method that has an integrative scope and identifies risk profiles in animals. The aim of the present study was to detect cattle welfare indicators that can be implemented at the slaughterhouse level and to develop integrated risk profiles based on the animal's origin, pre-slaughter logistics, and animal-based indicators. We recorded the origin, commercial category, transportation details, and horn size of 1040 cattle upon arrival at the slaughterhouse. Cattle welfare was measured based on individual scores for vocalizations, stunning shots, carcass bruises, meat pH, severe hoof injuries, and organ condemnations. To characterize operational and logistic practices from the farm to the slaughterhouse, a two-step cluster analysis was applied to the aforementioned variables (production system, cattle type, horn size, journey distance, vehicle type), which identified four clusters: small feedlot and free-range profile (C1, n = 216, 20.8%), feedlot profile (C2, n = 193, 18.6%), culled dairy cows profile (C3, n = 262, 25.2%), and free-range profile (C4, n = 369, 35.5%). The animals' diet and environmental conditions might have influenced the development of hoof disorders in C1 animals (P = 0.023), the proportion of animals that were re-shot was highest in C2 (P = 0.033), and C3 and C4 animals were the most likely to suffer injuries such as severe bruising (P = 0.001). In addition, the number of stunning shots, meat pH, carcass bruises, severe hoof injuries, and liver condemnations explained a significant share of the variation in health and welfare outcomes across origins, confirming their importance as 'welfare iceberg' indicators. The study provides detailed data that can be incorporated into assessment methods for the welfare of slaughter cattle and tailored to specific production systems.

    Scheduling periodic tasks in a hard real-time environment

    We consider a real-time scheduling problem that occurs in the design of software-based aircraft control. The goal is to distribute tasks $\tau_i = (c_i, p_i)$ on a minimum number of identical machines and to compute offsets $a_i$ for the tasks such that no collision occurs. A task $\tau_i$ releases a job of running time $c_i$ at each time $a_i + k \cdot p_i$, $k \in \mathbb{N}_0$, and a collision occurs if two jobs are simultaneously active on the same machine. We shed some light on the complexity and approximability landscape of this problem. Although the problem cannot be approximated within a factor of $n^{1-\varepsilon}$ for any $\varepsilon > 0$, an interesting restriction is much more tractable: if the periods are dividing (for each $i, j$ one has $p_i \mid p_j$ or $p_j \mid p_i$), the problem allows a better structured representation of solutions, which leads to a 2-approximation. This result is tight, even asymptotically.
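The collision condition in the problem statement reduces to modular arithmetic: the start-time differences of two periodic tasks on one machine form the lattice $(a_i - a_j) + \gcd(p_i, p_j)\,\mathbb{Z}$, so an overlap test needs only one gcd. A sketch of this check (our own illustrative code, not the paper's algorithm):

```python
from math import gcd

def collide(a_i, c_i, p_i, a_j, c_j, p_j):
    """True iff some job of task i overlaps some job of task j.

    Task i occupies [a_i + k*p_i, a_i + k*p_i + c_i) for k = 0, 1, 2, ...
    Two jobs overlap iff their start-time difference lies in (-c_i, c_j);
    the achievable differences are (a_i - a_j) + gcd(p_i, p_j) * Z.
    """
    g = gcd(p_i, p_j)
    r = (a_i - a_j) % g      # canonical lattice point in [0, g)
    return r < c_j or g - r < c_i

# Dividing periods: offsets 0 and 1 separate two (c=1, p=2) tasks ...
assert not collide(0, 1, 2, 1, 1, 2)
# ... but a (c=1, p=4) task at offset 0 clashes with a (c=1, p=2) task there.
assert collide(0, 1, 2, 0, 1, 4)
```

With dividing periods every gcd equals the smaller period, which is what gives solutions the better structured representation the abstract mentions.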

    An approach to construct wave packets with complete classical-quantum correspondence in non-relativistic quantum mechanics

    We introduce a method to construct wave packets with complete classical-quantum correspondence in one-dimensional non-relativistic quantum mechanics. First, we consider two similar oscillators with equal total energy. In the classical domain, we can easily solve this model and obtain the trajectories in the space of variables. At the quantum level, this picture is equivalent to a hyperbolic partial differential equation, which gives us the freedom to choose the initial wave function and its initial slope. Taking advantage of this freedom, we propose a method for choosing an appropriate initial condition that is independent of the form of the oscillators. We then construct the wave packets for some cases and show that these wave packets closely follow the whole classical trajectories and peak on them. Moreover, we use the de Broglie-Bohm interpretation of quantum mechanics to quantify this correspondence and show that the resulting Bohmian trajectories are also in complete agreement with their classical counterparts. (15 pages, 13 figures; to appear in the International Journal of Theoretical Physics.)
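For reference, the classical side of the construction is elementary harmonic-oscillator mechanics (a generic sketch with symbols of our choosing, not notation from the paper): each oscillator $k = 1, 2$, with mass $m_k$ and frequency $\omega_k$, carries the same total energy $E$, so

```latex
E = \tfrac{1}{2} m_k \dot{x}_k^2 + \tfrac{1}{2} m_k \omega_k^2 x_k^2,
\qquad
x_k(t) = A_k \cos(\omega_k t + \varphi_k),
\qquad
A_k = \sqrt{\frac{2E}{m_k \omega_k^2}},
```

and the trajectory traced in the $(x_1, x_2)$ plane is a Lissajous-type curve; a wave packet with complete correspondence should remain peaked along this curve.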

    Search for composite and exotic fermions at LEP 2

    A search for unstable heavy fermions with the DELPHI detector at LEP is reported. Sequential and non-canonical leptons, as well as excited leptons and quarks, are considered. The data analysed correspond to an integrated luminosity of about 48 pb^{-1} at an e^+e^- centre-of-mass energy of 183 GeV and about 20 pb^{-1} equally shared between the centre-of-mass energies of 172 GeV and 161 GeV. The search for pair-produced new leptons establishes 95% confidence level mass limits in the region between 70 GeV/c^2 and 90 GeV/c^2, depending on the channel. The search for singly produced excited leptons and quarks establishes upper limits on the ratio of the coupling of the excited fermion.

    Search for charginos in e+e- interactions at sqrt(s) = 189 GeV

    An update of the searches for charginos and gravitinos is presented, based on a data sample corresponding to the 158 pb^{-1} recorded by the DELPHI detector in 1998 at a centre-of-mass energy of 189 GeV. No evidence for a signal was found. The lower mass limits are 4-5 GeV/c^2 higher than those obtained at a centre-of-mass energy of 183 GeV. The $(\mu, M_2)$ MSSM domain excluded by combining the chargino searches with neutralino searches at the Z resonance implies a limit on the mass of the lightest neutralino which, for a heavy sneutrino, is constrained to be above 31.0 GeV/c^2 for $\tan\beta \geq 1$.

    Evidence for a mixed mass composition at the `ankle' in the cosmic-ray spectrum

    We report a first measurement for ultra-high-energy cosmic rays of the correlation between the depth of shower maximum and the signal in the water-Cherenkov stations of air showers registered simultaneously by the fluorescence and the surface detectors of the Pierre Auger Observatory. Such a correlation measurement is a unique feature of a hybrid air-shower observatory with sensitivity to both the electromagnetic and muonic components. It allows an accurate determination of the spread of primary masses in the cosmic-ray flux. Until now, constraints on the spread of primary masses have been dominated by systematic uncertainties. The present correlation measurement is not affected by systematics in the measurement of the depth of shower maximum or the signal in the water-Cherenkov stations. The analysis relies on general characteristics of air showers and is thus also robust with respect to uncertainties in hadronic event generators. The observed correlation in the energy range around the `ankle' at $\lg(E/\mathrm{eV}) = 18.5$-$19.0$ differs significantly from expectations for pure primary cosmic-ray compositions. A light composition made up of protons and helium only is equally inconsistent with observations. The data are explained well by a mixed composition including nuclei with mass $A > 4$. Scenarios such as the proton dip model, with almost pure compositions, are thus disfavoured as the sole explanation of the ultra-high-energy cosmic-ray flux at Earth.
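The statistic at the heart of the measurement is a per-event correlation between two shower observables (depth of shower maximum and surface-detector signal). A self-contained correlation coefficient, for illustration only — plain Pearson on hypothetical arrays, not the collaboration's actual estimator or data:

```python
from math import sqrt

def pearson(xs, ys):
    """Sample Pearson correlation coefficient of paired observations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / sqrt(sxx * syy)

# A pure composition leaves the two observables fluctuating independently,
# while a mix of light and heavy primaries shifts both together event by
# event, inducing a nonzero correlation (toy reasoning, not Auger data).
```

The virtue of such a statistic, as the abstract notes, is that a shared systematic shift of either observable leaves the correlation unchanged.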

    The influence of rice husk ash addition on the properties of metakaolin-based geopolymers

    This paper investigates the replacement of metakaolin (MK) with rice husk ash (RHA) in the production of alkali-activated binders, or geopolymers. The influence of the RHA addition on compressive and flexural strength, water absorption, and apparent porosity was determined as a function of the percentage of RHA in the mixture and of the molar ratios of the mixes. Fourier-transform infrared (FTIR) spectroscopy and energy-dispersive spectroscopy (EDS) were carried out to assess the changes in the microstructure of the geopolymer matrices with the RHA addition. The results show that RHA may serve as a supplementary precursor for geopolymers. The composition of the geopolymer matrices containing 0-40% RHA is very similar, which indicates that the additional Si provided by RHA is not incorporated into the geopolymer matrix. In addition, geopolymers with an RHA content higher than 40% exhibit plastic behavior, characterized by extremely low strength and high deformation, which can be attributed to the formation of silica gel in formulations with variable Si/Al ratios.

    Search for lightest neutralino and stau pair production in light gravitino scenarios with stau NLSP

    Promptly decaying lightest neutralinos and long-lived staus are searched for in the context of light gravitino scenarios. It is assumed that the stau is the next-to-lightest supersymmetric particle (NLSP) and that the lightest neutralino is the next-to-NLSP (NNLSP). Data collected with the DELPHI detector at centre-of-mass energies from 161 to 183 GeV are analysed. No evidence of the production of these particles is found; hence, lower mass limits for both kinds of particles are set at 95% C.L. The mass of gaugino-like neutralinos is found to be greater than 71.5 GeV/c^2. In the search for long-lived staus, masses less than 70.0 to 77.5 GeV/c^2 are excluded for gravitino masses from 10 to 150 eV/c^2. Combining this search with the searches for stable heavy leptons and for Minimal Supersymmetric Standard Model staus, a lower limit of 68.5 GeV/c^2 may be set on the stau mass.