
    Aliasing phenomenon due to CCD sensors

    This article deals with the aliasing phenomenon due to CCD sensors. These detectors inherently carry out a sampling and a low-pass filtering process. We first recall how these two operations, characteristic of CCD sensors, affect the spectrum of a signal. We then propose a superresolution technique that reduces aliasing and allows a deconvolution over an extended spectral support, diminishing the low-pass filtering of the CCD cells. The proposed method thus improves both limiting factors inherent to this type of detector: the low-pass filtering and the perturbation of low frequencies by folded high frequencies.
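    As a rough illustration of the two effects described above, the following sketch simulates a 1-D scene, integrates it over CCD-sized cells (the low-pass filter) and keeps one sample per cell (the sampling step), so that a high spatial frequency folds onto the low-frequency band. It is an illustration only, not the authors' method; the cell pitch and test frequencies are arbitrary assumptions.

```python
import numpy as np

# Fine-grid "scene": a genuine low spatial frequency (3 cycles/mm) plus a
# high one (43 cycles/mm) that the CCD cannot represent without folding.
x = np.linspace(0.0, 10.0, 10_000, endpoint=False)          # mm
scene = np.sin(2 * np.pi * 3 * x) + 0.5 * np.sin(2 * np.pi * 43 * x)

# CCD cell model: integrate the scene over each cell (box low-pass filter),
# then keep one value per cell (sampling at the cell pitch).
pitch = 0.02                                  # mm per cell -> Nyquist 25 c/mm
per_cell = int(round(pitch / (x[1] - x[0])))
ccd = scene.reshape(-1, per_cell).mean(axis=1)

# The 43 c/mm component exceeds Nyquist and folds down to |50 - 43| = 7 c/mm,
# a spurious low frequency corrupting the band any deconvolution has to work
# with -- exactly the two limiting factors discussed in the abstract.
spectrum = np.abs(np.fft.rfft(ccd))
freqs = np.fft.rfftfreq(len(ccd), d=pitch)
print(freqs[np.argsort(spectrum)[-2:]])       # ~7 (folded) and ~3 (genuine)
```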

    Ground state correlations and mean-field in $^{16}$O

    We use the coupled cluster expansion ($\exp(S)$ method) to generate the complete ground state correlations due to the NN interaction. Part of this procedure is the calculation of the two-body G matrix inside the nucleus in which it is being used. This formalism is being applied to $^{16}$O in a configuration space of $50\,\hbar\omega$. The resulting ground state wave function is used to calculate the binding energy and one- and two-body densities for the ground state of $^{16}$O.
    Comment: 9 pages, 9 figures, LaTeX
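    For readers unfamiliar with the notation, the $\exp(S)$ (coupled cluster) ansatz mentioned above has the standard form below; this is the textbook expression, not a formula taken from the paper.

```latex
% Coupled cluster / exp(S) ansatz: the correlated ground state is generated by
% an exponentiated excitation operator S acting on a mean-field reference.
\[
  |\Psi\rangle \;=\; e^{S}\,|\Phi\rangle ,
  \qquad
  S \;=\; S_1 + S_2 + S_3 + \dots ,
\]
% and the ground-state energy follows from the similarity-transformed
% Hamiltonian evaluated in the reference state:
\[
  E \;=\; \langle\Phi|\, e^{-S} H\, e^{S} \,|\Phi\rangle .
\]
```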

    Kernel Methods for Document Filtering

    This paper describes the algorithms implemented by the KerMIT consortium for its participation in the TREC 2002 Filtering track. The consortium submitted runs for the routing task using a linear SVM, for the batch task using the same SVM in combination with an innovative threshold-selection mechanism, and for the adaptive task using both a second-order perceptron and a combination of SVM and perceptron with uneven margin. Results seem to indicate that these algorithms performed relatively well on the extensive TREC benchmark.
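    As a concrete picture of the batch-filtering setup (linear SVM scoring plus a score threshold that decides which documents are delivered), here is a minimal sketch using scikit-learn as a stand-in. It is not the KerMIT implementation, and the threshold rule shown is a plain held-out search, not the consortium's threshold-selection mechanism.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

# Toy corpus for one filtering topic: label 1 = relevant, 0 = not relevant.
train_docs = ["grain exports rise", "wheat harvest delayed",
              "stock market rallies", "bank cuts interest rates"]
train_y = np.array([1, 1, 0, 0])
held_docs = ["corn and grain prices climb", "central bank policy meeting"]
held_y = np.array([1, 0])

vec = TfidfVectorizer()
clf = LinearSVC().fit(vec.fit_transform(train_docs), train_y)

# Batch filtering: score documents with the linear model, then choose the
# decision threshold that works best on held-out data (a simple stand-in for
# a proper utility-driven threshold-selection mechanism).
scores = clf.decision_function(vec.transform(held_docs))
best_t = max(scores, key=lambda t: np.mean((scores >= t) == held_y))
print(best_t, scores >= best_t)   # documents that would be delivered
```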

    Faster Person Re-Identification

    Fast person re-identification (ReID) aims to search person images quickly and accurately. The main idea of recent fast ReID methods is the hashing algorithm, which learns compact binary codes and performs fast Hamming-distance matching with counting sort. However, a very long code (e.g. 2048 bits) is needed for high accuracy, which compromises search speed. In this work, we introduce a new solution for fast ReID by formulating a novel Coarse-to-Fine (CtF) hashing code search strategy, which complementarily uses short and long codes, achieving both faster speed and better accuracy. It uses shorter codes to coarsely rank broad matching similarities and longer codes to refine only a few top candidates for more accurate instance ReID. Specifically, we design an All-in-One (AiO) framework together with a Distance Threshold Optimization (DTO) algorithm. In AiO, we simultaneously learn and enhance multiple codes of different lengths in a single model; it learns multiple codes in a pyramid structure and encourages shorter codes to mimic longer codes by self-distillation. DTO solves a complex threshold search problem by a simple optimization process, and the balance between accuracy and speed is easily controlled by a single parameter. It formulates the optimization target as an $F_{\beta}$ score that can be optimised by Gaussian cumulative distribution functions. Experimental results on two datasets show that our proposed method (CtF) is not only 8% more accurate but also $5\times$ faster than contemporary hashing ReID methods. Compared with non-hashing ReID methods, CtF is $50\times$ faster with comparable accuracy. Code is available at https://github.com/wangguanan/light-reid.
    Comment: accepted by ECCV 2020, https://github.com/wangguanan/light-reid
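    The coarse-to-fine search stage itself is easy to picture in code. The sketch below re-implements only that search logic, with random codes, a fixed top-k cut instead of the paper's optimised distance threshold, and arbitrary code lengths; it is a schematic illustration, not the released light-reid code.

```python
import numpy as np

rng = np.random.default_rng(0)
n_gallery, short_len, long_len, top_k = 10_000, 32, 2048, 100

# Pretend these binary codes came from a trained All-in-One-style model.
gallery_short = rng.integers(0, 2, (n_gallery, short_len), dtype=np.uint8)
gallery_long = rng.integers(0, 2, (n_gallery, long_len), dtype=np.uint8)
query_short = rng.integers(0, 2, short_len, dtype=np.uint8)
query_long = rng.integers(0, 2, long_len, dtype=np.uint8)

def hamming(codes, query):
    """Hamming distance from one query code to every row of `codes`."""
    return np.count_nonzero(codes != query, axis=1)

# Coarse stage: rank the whole gallery with the cheap short codes.
coarse_rank = np.argsort(hamming(gallery_short, query_short))

# Fine stage: re-rank only the top-k coarse candidates with the long codes.
candidates = coarse_rank[:top_k]
fine_rank = candidates[np.argsort(hamming(gallery_long[candidates], query_long))]
print(fine_rank[:10])   # final top-10 gallery indices for this query
```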

    Parity Violating Measurements of Neutron Densities

    Parity violating electron nucleus scattering is a clean and powerful tool for measuring the spatial distributions of neutrons in nuclei with unprecedented accuracy. Parity violation arises from the interference of electromagnetic and weak neutral amplitudes, and the $Z^0$ of the Standard Model couples primarily to neutrons at low $Q^2$. The data can be interpreted with as much confidence as electromagnetic scattering. After briefly reviewing the present theoretical and experimental knowledge of neutron densities, we discuss possible parity violation measurements, their theoretical interpretation, and applications. The experiments are feasible at existing facilities. We show that theoretical corrections are either small or well understood, which makes the interpretation clean. The quantitative relationship to atomic parity nonconservation observables is examined, and we show that the electron scattering asymmetries can be directly applied to atomic PNC because the observables have approximately the same dependence on nuclear shape.
    Comment: 38 pages, 7 ps figures, very minor changes, submitted to Phys. Rev.
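    For orientation, the measured quantity is the left-right cross-section asymmetry. In Born approximation for a spin-0 nucleus it takes the standard form below (quoted from the general literature, not copied from this paper; signs depend on conventions, and the small proton weak charge $1-4\sin^2\theta_W$ is neglected).

```latex
% Parity-violating asymmetry between right- and left-handed electrons,
% elastic scattering from a spin-0 nucleus, plane-wave Born approximation.
\[
  A_{\mathrm{PV}}
  \;=\; \frac{\sigma_R - \sigma_L}{\sigma_R + \sigma_L}
  \;\approx\; \frac{G_F\,Q^2}{4\pi\alpha\sqrt{2}}\;
              \frac{F_n(Q^2)}{F_p(Q^2)} ,
\]
% so that, with the proton (charge) form factor F_p known from electromagnetic
% scattering, a measurement of A_PV at low Q^2 determines the neutron form
% factor F_n and hence the neutron radius.
```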

    Exploratory fMRI analysis without spatial normalization

    We present an exploratory method for simultaneous parcellation of multisubject fMRI data into functionally coherent areas. The method is based on a solely functional representation of the fMRI data and a hierarchical probabilistic model that accounts for both inter-subject and intra-subject forms of variability in fMRI response. We employ a Variational Bayes approximation to fit the model to the data. The resulting algorithm finds a functional parcellation of the individual brains along with a set of population-level clusters, establishing correspondence between these two levels. The model eliminates the need for spatial normalization while still enabling us to fuse data from several subjects. We demonstrate the application of our method on a visual fMRI study.
    21st International Conference, IPMI 2009, Williamsburg, VA, USA, July 5-10, 2009, Proceedings; author manuscript received 2010 March 11. Supported by the McGovern Institute for Brain Research at MIT (Neurotechnology Program), the National Science Foundation (CAREER Grant 0642971), and the National Institutes of Health (NIBIB NAMIC U54-EB005149, NCRR NAC P41-RR13218).
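    A purely functional parcellation can be illustrated with a toy stand-in: group voxels by the similarity of their response profiles, with no spatial normalisation of coordinates. The sketch below uses plain k-means as a deliberately simplified substitute for the paper's hierarchical probabilistic model and Variational Bayes inference; subject counts, voxel counts, and the number of parcels are arbitrary assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_subjects, n_voxels, n_conditions, k = 3, 500, 8, 4

# One response vector per voxel and subject (e.g. estimated activation per
# stimulus condition), simulated here from k latent functional profiles.
profiles = rng.normal(size=(k, n_conditions))
true_labels = rng.integers(0, k, size=(n_subjects, n_voxels))
responses = profiles[true_labels] + 0.3 * rng.normal(
    size=(n_subjects, n_voxels, n_conditions))

# Pool voxels from all subjects and cluster purely on function: each cluster
# plays the role of a population-level parcel, and every subject's voxels
# inherit parcel labels without any anatomical registration.
pooled = responses.reshape(-1, n_conditions)
parcels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(pooled)
print(parcels.reshape(n_subjects, n_voxels)[:, :10])   # labels per subject
```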

    Hemodynamic-informed parcellation of fMRI data in a Joint Detection Estimation framework

    Identifying brain hemodynamics in event-related functional MRI (fMRI) data is a crucial issue for disentangling the vascular response from the neuronal activity in the BOLD signal. This question is usually addressed by estimating the so-called Hemodynamic Response Function (HRF). Voxelwise or region-/parcelwise inference schemes have been proposed to achieve this goal, but so far all known contributions commit to pre-specified spatial supports for the hemodynamic territories, defining these supports either as individual voxels or as a priori fixed brain parcels. In this paper, we introduce a Joint Parcellation-Detection-Estimation (JPDE) procedure that incorporates an adaptive parcel identification step based upon local hemodynamic properties. Efficient inference of evoked activity, HRF shapes, and parcel supports is then achieved using variational approximations. Validation on synthetic and real fMRI data demonstrates the performance of JPDE over standard detection-estimation schemes and suggests it as a new brain exploration tool.
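    For context, work in this joint detection-estimation line typically starts from a forward model of the voxelwise BOLD time course of the kind sketched below; the notation is the generic one used in the JDE literature and may differ from the paper's exact formulation.

```latex
% Generic JDE-style forward model for the BOLD time course y_j of voxel j:
% each stimulus type m contributes its onset matrix X_m applied to the
% unknown HRF h and scaled by a voxel-specific response level a_j^m, on top
% of a low-frequency drift P l_j and noise b_j.
\[
  y_j \;=\; \sum_{m=1}^{M} a_j^{m}\, \mathbf{X}_m\, h
        \;+\; \mathbf{P}\,\boldsymbol{\ell}_j \;+\; b_j .
\]
% JPDE additionally assigns each voxel to a hemodynamic parcel with its own
% HRF shape, and infers parcels, HRFs and activations jointly by variational
% approximation.
```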