
    Electrostatic Field Classifier for Deficient Data

    Get PDF
    This paper investigates the suitability of recently developed models based on physical field phenomena for classification problems with incomplete datasets. An original approach to exploiting incomplete training data with missing features and labels, involving extensive use of an electrostatic charge analogy, has been proposed. Classification of incomplete patterns has been investigated using a local dimensionality reduction technique, which aims at exploiting all available information rather than trying to estimate the missing values. The performance of all proposed methods has been tested on a number of benchmark datasets for a wide range of missing-data scenarios and compared to the performance of some standard techniques. Several modifications of the original electrostatic field classifier, aimed at improving speed and robustness in higher-dimensional spaces, are also discussed.
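
    As a rough illustration of the charge analogy described above, the sketch below treats each training point as a unit charge of its class and assigns a query to the class exerting the strongest Coulomb-like field at its location, measuring distance only in the query's observed dimensions. This is a minimal reading of the idea, not the authors' algorithm; the data and all names are hypothetical.

```python
# Hedged sketch of an electrostatic-charge classifier (illustrative only,
# not the paper's method). Each training point acts as a unit charge of
# its class; the query joins the class with the strongest summed
# inverse-square "field". Missing query features are simply skipped,
# echoing the local dimensionality-reduction idea in the abstract.
import numpy as np

def field_classify(X_train, y_train, x_query, eps=1e-9):
    observed = ~np.isnan(x_query)                  # use only available features
    scores = {}
    for cls in np.unique(y_train):
        pts = X_train[y_train == cls][:, observed]
        d2 = np.sum((pts - x_query[observed]) ** 2, axis=1)
        scores[cls] = np.sum(1.0 / (d2 + eps))     # Coulomb-like 1/r^2 pull
    return max(scores, key=scores.get)

# Toy usage: two Gaussian blobs; the query is missing its second feature.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
print(field_classify(X, y, np.array([3.5, np.nan])))   # -> 1
```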

    Issues in modern bone histomorphometry

    Get PDF
    This review reports on proceedings of a bone histomorphometry session conducted at the Fortieth International IBMS Sun Valley Skeletal Tissue Biology Workshop held on August 1, 2010. The session was prompted by recent technical problems encountered in conducting histomorphometry on bone biopsies from humans and animals treated with anti-remodeling agents such as bisphosphonates and RANKL antibodies. These agents reduce remodeling substantially, and thus cause problems in calculating bone remodeling dynamics using in vivo fluorochrome labeling. The tissue specimens often contain few or no fluorochrome labels, and thus create statistical and other problems in analyzing variables such as mineral apposition rates, mineralizing surface and bone formation rates. The conference attendees discussed these problems and their resolutions, and the proceedings reported here summarize their discussions and recommendations.

    Pareto versus lognormal: a maximum entropy test

    Get PDF
    It is commonly found that distributions that seem to be lognormal over a broad range change to a power-law (Pareto) distribution for the last few percentiles. The distributions of many physical, natural, and social events (earthquake size, species abundance, income and wealth, as well as file, city, and firm sizes) display this structure. We present a test for the occurrence of power-law tails in statistical distributions based on maximum entropy. This methodology makes it possible to identify the true data-generating process even when it is neither lognormal nor Pareto. The maximum entropy approach is then compared with other widely used methods and applied to different levels of aggregation of complex systems. Our results provide support for the theory that distributions with a lognormal body and a Pareto tail can be generated as mixtures of lognormally distributed units.
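
    The test in the paper is built on maximum entropy; as a simpler stand-in, the sketch below contrasts a Pareto fit to the upper tail with a lognormal fit to the whole sample by comparing average log-likelihoods over the tail. The threshold rule and the `tail_frac` parameter are hypothetical choices for illustration, not taken from the paper.

```python
# Illustrative tail comparison (a likelihood stand-in, not the paper's
# maximum entropy test): fit a lognormal to the full sample and a Pareto
# to the upper tail, then compare per-observation log-likelihoods there.
import numpy as np
from scipy import stats

def tail_loglik_gap(x, tail_frac=0.05):
    """Larger values favour a Pareto tail over the fitted lognormal."""
    x = np.sort(x)
    xm = x[int((1 - tail_frac) * len(x))]              # tail threshold
    tail = x[x >= xm]
    alpha = len(tail) / np.sum(np.log(tail / xm))      # Pareto exponent (MLE)
    ll_pareto = np.sum(stats.pareto.logpdf(tail / xm, alpha) - np.log(xm))
    shape, loc, scale = stats.lognorm.fit(x, floc=0)   # lognormal on full data
    ll_lognorm = np.sum(stats.lognorm.logpdf(tail, shape, loc, scale))
    return (ll_pareto - ll_lognorm) / len(tail)

rng = np.random.default_rng(1)
print(tail_loglik_gap(rng.lognormal(0, 1, 10_000)))                          # small
print(tail_loglik_gap(stats.pareto.rvs(2.5, size=10_000, random_state=1)))  # larger
```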

    Strain control of superlattice implies weak charge-lattice coupling in La0.5Ca0.5MnO3

    Full text link
    We have recently argued that manganites do not possess stripes of charge order, implying that the electron-lattice coupling is weak [Phys. Rev. Lett. 94 (2005) 097202]. Here we independently argue the same conclusion based on transmission electron microscopy measurements of a nanopatterned epitaxial film of La0.5Ca0.5MnO3. In strain-relaxed regions, the superlattice period is modified by 2-3% with respect to the parent lattice, suggesting that the two are not strongly tied. Comment: 4 pages, 4 figures. It is now explained why the work provides evidence to support weak coupling and rule out charge order.

    Tactile Discrimination Using Template Classifiers: Towards a Model of Feature Extraction in Mammalian Vibrissal Systems

    Get PDF
    Rats and other whiskered mammals are capable of making sophisticated sensory discriminations using tactile signals from their facial whiskers (vibrissae). As part of a programme of work to develop biomimetic technologies for vibrissal sensing, including whiskered robots, we are devising algorithms for the fast extraction of object parameters from whisker deflection data. Previous work has demonstrated that radial distance to contact can be estimated from forces measured at the base of the whisker shaft. We show that in the case of a moving object contacting a whisker, the measured force can be ambiguous in distinguishing a nearby object moving slowly from a more distant object moving rapidly. This ambiguity can be resolved by simultaneously extracting object position and speed from the whisker deflection time series – that is, by attending to the dynamics of the whisker’s interaction with the object. We compare a simple classifier with an adaptive EM (Expectation Maximisation) classifier. Both systems are effective at simultaneously extracting the two parameters, the EM-classifier showing similar performance to a handpicked template classifier. We propose that adaptive classification algorithms can provide insights into the types of computations performed in the rat vibrissal system when the animal is faced with a discrimination task.
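
    The sketch below illustrates the plain template-classifier half of that comparison: a bank of stored deflection traces indexed by candidate (radial distance, speed) pairs is matched against an observed trace, so both parameters are read off the best-fitting template. The deflection model is a made-up stand-in, not the paper's whisker mechanics.

```python
# Minimal template classifier for joint (radial distance, speed) readout
# (illustrative; the deflection model is invented, not whisker physics).
import numpy as np

T = np.linspace(0.0, 1.0, 200)                       # time axis

def deflection(radial, speed, t=T):
    """Toy trace: closer contacts deflect more, faster objects deflect sooner."""
    return (1.0 / radial) * np.tanh(speed * t)

# Template bank over a grid of candidate parameter pairs
radials = np.linspace(0.5, 2.0, 16)
speeds = np.linspace(1.0, 8.0, 16)
bank = {(r, s): deflection(r, s) for r in radials for s in speeds}

def classify(trace):
    """Return the (radial, speed) of the template nearest the observed trace."""
    return min(bank, key=lambda k: np.sum((bank[k] - trace) ** 2))

# A noisy observation: attending to the whole time series resolves the
# near-and-slow versus far-and-fast ambiguity described in the abstract.
obs = deflection(0.7, 2.5) + np.random.default_rng(2).normal(0, 0.02, T.size)
print(classify(obs))   # grid point near (0.7, 2.5)
```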

    Quantum theory of incompatible observations

    Get PDF
    The maximum likelihood principle is shown to be the best measure for relating experimental data to the predictions of quantum theory. Comment: 3 pages.

    Quantum polarization tomography of bright squeezed light

    Full text link
    We reconstruct the polarization sector of a bright polarization squeezed beam starting from a complete set of Stokes measurements. Given the symmetry that underlies the polarization structure of quantum fields, we use the unique SU(2) Wigner distribution to represent states. In the limit of localized and bright states, the Wigner function can be approximated by an inverse three-dimensional Radon transform. We compare this direct reconstruction with the results of a maximum likelihood estimation, finding excellent agreement. Comment: 15 pages, 5 figures. Contribution to New Journal of Physics, Focus Issue on Quantum Tomography. Comments welcome.
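
    For a single polarization qubit the maximum-likelihood step has a compact form; the sketch below applies the standard iterative R·ρ·R recipe to simulated Stokes measurements. It is a toy analogue only: the paper reconstructs a bright squeezed beam, a far richer object than one qubit.

```python
# Toy maximum-likelihood tomography of one polarization qubit from Stokes
# measurements, via the standard iterative R*rho*R scheme (a sketch, not
# the paper's reconstruction of a bright squeezed beam).
import numpy as np

I2 = np.eye(2)
sig = [np.array([[0, 1], [1, 0]]),
       np.array([[0, -1j], [1j, 0]]),
       np.array([[1, 0], [0, -1]])]
# Projectors onto the +/- eigenstates of each Stokes (Pauli) axis
projectors = [0.5 * (I2 + p * s) for s in sig for p in (+1, -1)]

def mle_state(counts, iters=200):
    """Iterate rho <- R rho R / tr(.), with R = sum_k (n_k / p_k) Pi_k."""
    rho = I2 / 2
    for _ in range(iters):
        probs = [np.real(np.trace(rho @ P)) for P in projectors]
        R = sum(n / max(p, 1e-12) * P for n, p, P in zip(counts, probs, projectors))
        rho = R @ rho @ R
        rho /= np.trace(rho)
    return rho

# Simulated counts for a circularly polarized state rho = (I + sigma_y)/2
true_rho = 0.5 * (I2 + sig[1])
counts = [int(1000 * np.real(np.trace(true_rho @ P))) for P in projectors]
print(np.round(mle_state(counts), 3))   # approaches true_rho
```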

    Inferential models: A framework for prior-free posterior probabilistic inference

    Full text link
    Posterior probabilistic statistical inference without priors is an important but so far elusive goal. Fisher's fiducial inference, Dempster-Shafer theory of belief functions, and Bayesian inference with default priors are attempts to achieve this goal but, to date, none has given a completely satisfactory picture. This paper presents a new framework for probabilistic inference, based on inferential models (IMs), which not only provides data-dependent probabilistic measures of uncertainty about the unknown parameter, but does so with an automatic long-run frequency calibration property. The key to this new approach is the identification of an unobservable auxiliary variable associated with observable data and unknown parameter, and the prediction of this auxiliary variable with a random set before conditioning on data. Here we present a three-step IM construction, and prove a frequency-calibration property of the IM's belief function under mild conditions. A corresponding optimality theory is developed, which helps to resolve the non-uniqueness issue. Several examples are presented to illustrate this new approach. Comment: 29 pages with 3 figures. Main text is the same as the published version. Appendix B is an addition, not in the published version, that contains some corrections and extensions of two of the main theorems.
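
    The classic normal-mean example makes the three steps concrete. With a single observation X ~ N(mu, 1): the A-step associates X = mu + Z with auxiliary variable Z ~ N(0, 1); the P-step predicts Z with the default random set S = {z : |z| <= |Z~|}, Z~ ~ N(0, 1); the C-step combines S with the data x, giving the plausibility pl(mu; x) = 1 - |2*Phi(x - mu) - 1|. The sketch below evaluates that function; the default random set is the usual textbook choice, not necessarily the optimal one developed in the paper.

```python
# Plausibility function of the three-step IM for a normal mean
# (textbook default random set; see the lead-in for the derivation).
from scipy.stats import norm

def plausibility(mu, x):
    """pl(mu; x) = P(S covers x - mu) = 1 - |2*Phi(x - mu) - 1|."""
    return 1.0 - abs(2.0 * norm.cdf(x - mu) - 1.0)

x = 1.3                            # observed data
print(plausibility(1.3, x))        # 1.0: mu = x is maximally plausible
print(plausibility(3.26, x))       # ~0.05: edge of the 95% plausibility interval
```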

    The challenge of patients’ unmet palliative care needs in the final stages of chronic illness.

    Get PDF
    Background: There is consensus in the literature that end of life care for patients with chronic illness is suboptimal, but research on the specific needs of this population is limited. Aim: This study aimed to use a mixed methodology and case study approach to explore the palliative care needs of patients with a non-cancer diagnosis from the perspectives of the patient, their significant other and the clinical team responsible for their care. Patients (n = 18) had a diagnosis of either end-stage heart failure, renal failure or respiratory disease. Methods: The Short Form 36 and the Hospital Anxiety and Depression questionnaire were completed by all patients. Unstructured interviews (n = 35) were conducted separately with each patient and then their significant other. These were followed by a focus group discussion (n = 18) with the multiprofessional clinical team. Quantitative data were analysed using simple descriptive statistics. All qualitative data were taped, transcribed and analysed using Colaizzi’s approach to qualitative analysis. Findings: Deteriorating health status was the central theme derived from this analysis. It led to decreased independence, social isolation and family burden. These problems were mitigated by the resources at the individual’s disposal and the availability of support from hospital and community services, but generally both resources and support were perceived as lacking. All participants in this study expressed concerns regarding the patients’ future, and some patients described feelings of depression or acceptance of the inevitability of imminent death. Conclusion: Patients dying from chronic illness in this study had many concerns and unmet clinical needs. Care teams were frustrated by the lack of resources available to them and admitted they were ill-equipped to provide for the individual’s holistic needs. Some clinicians described difficulty in talking openly with the patient and family regarding the palliative nature of their treatment. An earlier and more effective implementation of the palliative care approach is necessary if the needs of patients in the final stages of chronic illness are to be adequately addressed.

    Cryptotomography: reconstructing 3D Fourier intensities from randomly oriented single-shot diffraction patterns

    Full text link
    We reconstructed the 3D Fourier intensity distribution of mono-disperse prolate nano-particles using single-shot 2D coherent diffraction patterns collected at DESY's FLASH facility when a bright, coherent, ultrafast X-ray pulse intercepted individual particles of random, unmeasured orientations. This first experimental demonstration of cryptotomography extended the Expansion-Maximization-Compression (EMC) framework to accommodate unmeasured fluctuations in photon fluence and loss of data due to saturation or background scatter. This work is an important step towards realizing single-shot diffraction imaging of single biomolecules. Comment: 4 pages, 4 figures.
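
    A heavily simplified 1D analogue conveys the expand-maximize-compress loop: noisy copies of a signal arrive at unknown cyclic shifts (standing in for unmeasured particle orientations), and EM over the hidden shift recovers the signal. Everything below, including the Poisson photon model and the grid size, is a toy assumption, not the FLASH experiment's geometry.

```python
# Toy 1D EMC/EM sketch: recover a signal from noisy, randomly shifted
# copies with unknown shifts (expansion/compression are plain np.roll here;
# the real EMC works on 3D Fourier intensities with fluence fluctuations).
import numpy as np

rng = np.random.default_rng(3)
true = 20 * np.exp(-0.5 * ((np.arange(32) - 8) / 2.0) ** 2)   # hidden photon rate
data = [rng.poisson(np.roll(true, rng.integers(32))) for _ in range(400)]

model = rng.random(32) + 1.0                       # positive initial guess
for _ in range(30):
    acc = np.zeros(32)
    for d in data:
        views = np.array([np.roll(model, k) for k in range(32)])  # expand
        logp = d @ np.log(views).T - views.sum(axis=1)            # Poisson log-lik per shift
        w = np.exp(logp - logp.max()); w /= w.sum()               # posterior over shifts
        acc += w @ np.array([np.roll(d, -k) for k in range(32)])  # compress: unshift data
    model = np.maximum(acc / len(data), 1e-6)      # maximize; floor avoids log(0)
print(np.round(model, 1))   # a cyclically shifted copy of `true`
```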