
    Computing the Similarity Between Moving Curves

    In this paper we study similarity measures for moving curves which can, for example, model changing coastlines or retreating glacier termini. Points on a moving curve have two parameters, namely the position along the curve as well as time. We therefore focus on similarity measures for surfaces, specifically the Fréchet distance between surfaces. While the Fréchet distance between surfaces is not even known to be computable, we show for variants arising in the context of moving curves that they are polynomial-time solvable or NP-complete depending on the restrictions imposed on how the moving curves are matched. We achieve the polynomial-time solutions by a novel approach for computing a surface in the so-called free-space diagram based on max-flow min-cut duality.
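
    The surface variants above need their own machinery, but the underlying matching idea is easiest to see in the classical curve setting. The sketch below computes the discrete Fréchet distance between two polygonal curves with the standard Eiter-Mannila dynamic program; it is an illustration only, not the paper's surface algorithm, and the function and variable names are ours.

```python
import math

def discrete_frechet(P, Q):
    """Discrete Frechet distance between two polygonal curves given as lists of
    (x, y) points -- the classical Eiter-Mannila dynamic program. Illustrative
    only: the paper's results concern surfaces swept out by moving curves."""
    def d(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    n, m = len(P), len(Q)
    ca = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            dij = d(P[i], Q[j])
            if i == 0 and j == 0:
                ca[i][j] = dij
            elif i == 0:
                ca[i][j] = max(ca[0][j - 1], dij)
            elif j == 0:
                ca[i][j] = max(ca[i - 1][0], dij)
            else:
                ca[i][j] = max(min(ca[i - 1][j], ca[i - 1][j - 1], ca[i][j - 1]), dij)
    return ca[n - 1][m - 1]

print(discrete_frechet([(0, 0), (1, 0), (2, 0)], [(0, 1), (1, 1), (2, 1)]))  # -> 1.0
```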

    Evaluation of Quantitative EEG by Classification and Regression Trees to Characterize Responders to Antidepressant and Placebo Treatment

    The study objective was to evaluate the usefulness of Classification and Regression Trees (CART) to classify clinical responders to antidepressant and placebo treatment, utilizing symptom severity and quantitative EEG (QEEG) data. Patients included 51 adults with unipolar depression who completed treatment trials using either fluoxetine, venlafaxine, or placebo. Hamilton Depression Rating Scale (HAM-D) scores and single-electrode data were recorded at baseline and at days 2, 7, 14, 28, and 56. Patients were classified as medication and placebo responders or non-responders. CART analysis of HAM-D scores showed that patients with HAM-D scores lower than 13 by day 7 were more likely to be treatment responders to fluoxetine or venlafaxine compared to non-responders (p=0.001). Youden’s index γ revealed that CART models using QEEG measures were more accurate than HAM-D-based models. Among patients given fluoxetine, those with a decrease at day 2 in θ cordance at AF2 were classified by CART as treatment responders (p=0.02). For those receiving venlafaxine, CART identified a decrease in δ absolute power at day 7 in the PO2 region as characterizing treatment responders (p=0.01). Across all patients receiving medication, CART identified a decrease in δ absolute power at day 2 in the FP1 region as characteristic of non-response to medication (p=0.003). Optimal trees from the QEEG CART analysis primarily utilized cordance values, but also incorporated some δ absolute power values. The results of our study suggest that CART may be a useful method for identifying potential outcome predictors in the treatment of major depression.
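
    As an illustration of the kind of analysis described, the sketch below grows a small classification tree with scikit-learn's DecisionTreeClassifier standing in for CART. The data are synthetic and the two predictors (day-7 HAM-D score, day-2 change in θ cordance) are hypothetical placeholders for the measures used in the study.

```python
# Illustrative sketch only: a scikit-learn decision tree stands in for CART,
# and the synthetic predictors mimic (but are not) the study's HAM-D/QEEG data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 51  # cohort size reported in the abstract

hamd_day7 = rng.integers(5, 30, size=n)            # hypothetical day-7 HAM-D scores
theta_cordance_d2 = rng.normal(0.0, 0.5, size=n)   # hypothetical day-2 cordance change
X = np.column_stack([hamd_day7, theta_cordance_d2])
y = (hamd_day7 < 13).astype(int)                   # synthetic "responder" label, built
                                                   # deterministically so the tree
                                                   # recovers a day-7 cut-point

tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=5, random_state=0)
tree.fit(X, y)
print(export_text(tree, feature_names=["hamd_day7", "theta_cordance_d2"]))
```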

    Slowly Rotating General Relativistic Superfluid Neutron Stars with Relativistic Entrainment

    Neutron stars that are cold enough should have two or more superfluids/superconductors in their inner crusts and cores. The implication of superfluidity/superconductivity for equilibrium and dynamical neutron star states is that each individual particle species that forms a condensate must have its own, independent number density current and an equation of motion that determines that current. An important consequence of the quasiparticle nature of each condensate is the so-called entrainment effect, i.e. the momentum of a condensate is a linear combination of its own current and those of the other condensates. We present here the first fully relativistic modelling of slowly rotating superfluid neutron stars with entrainment that is accurate to second order in the rotation rates. The stars consist of superfluid neutrons, superconducting protons, and a highly degenerate, relativistic gas of electrons. We use a relativistic σ-ω mean field model for the equation of state of the matter and the entrainment. We determine the effect of a relative rotation between the neutrons and protons on a star's total mass, shape, and Kepler (mass-shedding) limit. Comment: 30 pages, 10 figures, uses REVTeX.
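
    For orientation, the entrainment statement can be written schematically for a neutron-proton system as below; the coefficients and notation are the generic two-fluid form, not necessarily the symbols used in the paper.

```latex
% Schematic two-fluid entrainment relation: each momentum covector is a
% linear combination of both number-density currents (generic notation).
\mu^{\mathrm{n}}_{\nu} = \mathcal{B}^{\mathrm{n}}\, n^{\mathrm{n}}_{\nu}
                       + \mathcal{A}^{\mathrm{np}}\, n^{\mathrm{p}}_{\nu},
\qquad
\mu^{\mathrm{p}}_{\nu} = \mathcal{B}^{\mathrm{p}}\, n^{\mathrm{p}}_{\nu}
                       + \mathcal{A}^{\mathrm{np}}\, n^{\mathrm{n}}_{\nu}
```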

    Integrating the data envelopment analysis and the balanced scorecard approaches for enhanced performance assessment

    This article presents the development of a conceptual framework which aims to assess Decision Making Units (DMUs) from multiple perspectives. The proposed conceptual framework combines the Balanced Scorecard (BSC) method with the non-parametric technique known as Data Envelopment Analysis (DEA) by using various interconnected models which try to encapsulate four perspectives of performance (financial, customers, internal processes, learning and growth). The practical relevance of the conceptual model has been tested by using it to assess the performance of DMUs in a multinational company which operates in two business areas. Various models were developed with the collaboration of the directors of the company in order to conceive an appropriate and consensual framework, which may provide useful information for the company. The application of the conceptual framework provides structured information regarding the performance of each DMU (from multiple perspectives) and ways to improve it. By integrating the BSC and the DEA approaches this research helps to identify where there is room for improving organisational performance and points out opportunities for reciprocal learning between DMUs. In doing so, this article provides a set of recommendations relating to the successful application of DEA and its integration with the BSC, in order to promote a continuous learning process and to bring about improvements in performance.
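
    To make the DEA side concrete, the sketch below solves the standard input-oriented CCR envelopment model with scipy.optimize.linprog. The data, the number of DMUs, and the input/output choices are hypothetical; the paper's framework wraps models of this kind around the four BSC perspectives.

```python
# Minimal sketch of an input-oriented CCR DEA model (envelopment form).
# The inputs/outputs below are made-up numbers for three hypothetical DMUs.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0],      # inputs:  rows = inputs,  columns = DMUs
              [5.0, 4.0, 6.0]])
Y = np.array([[10.0, 12.0, 11.0]])  # outputs: rows = outputs, columns = DMUs

n_dmu = X.shape[1]
for o in range(n_dmu):
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = np.r_[1.0, np.zeros(n_dmu)]
    A_in = np.hstack([-X[:, [o]], X])                   # sum_j lam_j x_ij <= theta * x_io
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])  # sum_j lam_j y_rj >= y_ro
    b = np.r_[np.zeros(X.shape[0]), -Y[:, o]]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=b,
                  bounds=[(0, None)] * (1 + n_dmu), method="highs")
    print(f"DMU {o + 1}: efficiency = {res.x[0]:.3f}")
```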

    Using data envelopment analysis to support the design of process improvement interventions in electricity distribution

    A significant number of studies have documented the use of Data Envelopment Analysis (DEA) for efficiency measurement in the context of electricity distribution, particularly at the level of the distribution utilities. However, their aim has been predominantly descriptive and classificatory, without any attempt to ‘open’ the black box of the transformation process. In contrast, our aim is to explore the potential of DEA to contribute to the design of effective process improvement interventions within a distribution utility. In particular, in this paper we study an important question within the context of DEA: whether differences in efficiency can be attributed to a particular managerial programme or design feature. We use two different methodologies to undertake this type of analysis. Firstly, we apply Mann–Whitney rank statistics to the scores obtained from DEA in order to evaluate the statistical significance of the differences observed between an intervention programme and its control group. Secondly, we undertake dynamic analysis with the Malmquist Productivity Index in order to study the impact of the introduction of a new technology on a group of units. Our case study focuses on the performance evaluation of medium-voltage power lines belonging to one of the service areas in the Public Electricity Distribution System in Portugal. The results from our case study show that the application of DEA for process improvement interventions has great potential and should be explored in other contexts.
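
    A minimal sketch of the first step, assuming two sets of DEA efficiency scores are already available (the numbers below are invented, not the Portuguese case-study data):

```python
# Compare DEA efficiency scores of an intervention group against its control
# group with a two-sided Mann-Whitney rank test (synthetic scores).
from scipy.stats import mannwhitneyu

intervention = [0.92, 0.88, 1.00, 0.95, 0.81, 0.99]   # power lines in the programme
control      = [0.71, 0.84, 0.78, 0.90, 0.66, 0.75]   # comparable lines outside it

stat, p_value = mannwhitneyu(intervention, control, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.4f}")
```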

    Massive stars as thermonuclear reactors and their explosions following core collapse

    Nuclear reactions transform atomic nuclei inside stars. This is the process of stellar nucleosynthesis. The basic concepts of determining nuclear reaction rates inside stars are reviewed. How stars manage to burn their fuel so slowly most of the time is also considered. Stellar thermonuclear reactions involving protons in hydrostatic burning are discussed first. Then I discuss triple-alpha reactions in the helium burning stage. Carbon and oxygen survive in red giant stars because of the nuclear structure of oxygen and neon. Further nuclear burning of carbon, neon, oxygen and silicon in quiescent conditions is discussed next. In the subsequent core-collapse phase, neutronization due to electron capture from the top of the Fermi sea in a degenerate core takes place. The expected signal of neutrinos from a nearby supernova is calculated. The supernova often explodes inside a dense circumstellar medium, which is established as the progenitor star loses its outermost envelope in a stellar wind or through mass transfer in a binary system. The nature of the circumstellar medium and the ejecta of the supernova and their dynamics are revealed by observations in the optical, IR, radio, and X-ray bands, and I discuss some of these observations and their interpretations. Comment: To be published in "Principles and Perspectives in Cosmochemistry", Lecture Notes on Kodai School on Synthesis of Elements in Stars, ed. by Aruna Goswami & Eswar Reddy, Springer Verlag, 2009. Contains 21 figures.
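
    The rate concept the review starts from can be summarized by the standard textbook expression, quoted here for orientation in generic notation:

```latex
% Thermonuclear reaction rate per unit volume between species 1 and 2
% (mu is the reduced mass; delta_12 avoids double counting identical nuclei).
r_{12} = \frac{n_1 n_2}{1+\delta_{12}}\,\langle \sigma v \rangle ,
\qquad
\langle \sigma v \rangle = \sqrt{\frac{8}{\pi \mu (k_B T)^3}}
\int_0^{\infty} \sigma(E)\, E\, e^{-E/k_B T}\, \mathrm{d}E
```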

    Modeling a teacher in a tutorial-like system using Learning Automata

    The goal of this paper is to present a novel approach to model the behavior of a Teacher in a Tutorial-like system. In this model, the Teacher is capable of presenting teaching material from a Socratic-type Domain model via multiple-choice questions. Since this knowledge is stored in the Domain model in chapters with different levels of complexity, the Teacher is able to present learning material of varying degrees of difficulty to the Students. In our model, we propose that the Teacher will be able to assist the Students to learn the more difficult material. In order to achieve this, he provides them with hints that are relative to the difficulty of the learning material presented. This enables the Students to cope with the process of handling more complex knowledge, and to be able to learn it appropriately. To our knowledge, the findings of this study are novel to the field of intelligent adaptation using Learning Automata (LA). The novelty lies in the fact that the learning system has a strategy by which it can deal with increasingly more complex/difficult Environments (or domains from which the learning has to be achieved). In our approach, the convergence of the Student models (represented by LA) is driven not only by the response of the Environment (Teacher), but also by the hints that are provided by the latter. Our proposed Teacher model has been tested against different benchmark Environments, and the results of these simulations have demonstrated the salient aspects of our model. The main conclusion is that Normal and Below-Normal learners benefited significantly from the hints provided by the Teacher, while the benefits to (brilliant) Fast learners were marginal. This seems to be in line with our subjective understanding of the behavior of real-life Students.
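
    For readers unfamiliar with LA, the sketch below shows a generic linear reward-inaction (L_RI) automaton interacting with a stochastic environment. It is a textbook scheme, not the paper's Student or Teacher model, and the reward probabilities are invented.

```python
# Generic linear reward-inaction (L_RI) learning automaton. On reward the
# probability of the chosen action is reinforced; on penalty nothing changes.
import random

def l_ri_update(p, action, rewarded, lam=0.05):
    if rewarded:
        p = [(1 - lam) * pj for pj in p]   # shrink every probability ...
        p[action] += lam                   # ... and give the freed mass to `action`
    return p

reward_prob = [0.4, 0.8, 0.5]              # hypothetical environment; action 1 is best
p = [1.0 / 3] * 3                          # start from uniform action probabilities
random.seed(0)
for _ in range(5000):
    action = random.choices(range(3), weights=p)[0]
    rewarded = random.random() < reward_prob[action]
    p = l_ri_update(p, action, rewarded)

print([round(pj, 3) for pj in p])          # most of the mass ends up on action 1
```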

    A Model for the Development of the Rhizobial and Arbuscular Mycorrhizal Symbioses in Legumes and Its Use to Understand the Roles of Ethylene in the Establishment of these two Symbioses

    We propose a model depicting the development of nodulation and arbuscular mycorrhizae. Both processes are dissected into many steps, using Pisum sativum L. nodulation mutants as a guideline. For nodulation, we distinguish two main developmental programs, one epidermal and one cortical. Whereas Nod factors alone affect the cortical program, bacteria are required to trigger the epidermal events. We propose that the two programs of the rhizobial symbiosis evolved separately and that, over time, they came to function together. The distinction between these two programs does not exist for arbuscular mycorrhizae development despite events occurring in both root tissues. Mutations that affect both symbioses are restricted to the epidermal program. We propose here sites of action and potential roles for ethylene during the formation of the two symbioses with a specific hypothesis for nodule organogenesis. Assuming the epidermis does not make ethylene, the microsymbionts probably first encounter a regulatory level of ethylene at the epidermis–outermost cortical cell layer interface. Depending on the hormone concentrations there, infection will either progress or be blocked. In the former case, ethylene affects the cortex cytoskeleton, allowing reorganization that facilitates infection; in the latter case, ethylene acts on several enzymes that interfere with infection thread growth, causing it to abort. Throughout this review, the difficulty of generalizing the roles of ethylene is emphasized and numerous examples are given to demonstrate the diversity that exists in plants.

    The exposure of the hybrid detector of the Pierre Auger Observatory

    The Pierre Auger Observatory is a detector for ultra-high energy cosmic rays. It consists of a surface array to measure secondary particles at ground level and a fluorescence detector to measure the development of air showers in the atmosphere above the array. The "hybrid" detection mode combines the information from the two subsystems. We describe the determination of the hybrid exposure for events observed by the fluorescence telescopes in coincidence with at least one water-Cherenkov detector of the surface array. A detailed knowledge of the time dependence of the detection operations is crucial for an accurate evaluation of the exposure. We discuss the relevance of monitoring data collected during operations, such as the status of the fluorescence detector, background light and atmospheric conditions, that are used in both simulation and reconstruction. Comment: Paper accepted by Astroparticle Physics.
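
    Schematically, the exposure being determined is a detection efficiency integrated over the generation area, solid angle and data-taking period; the form below uses generic notation, not necessarily the paper's exact symbols.

```latex
% Exposure as the time integral of the instantaneous hybrid aperture:
% epsilon is the (simulated) detection efficiency, T the data-taking period,
% S_gen the generation area and Omega the solid angle.
\mathcal{E}(E) = \int_{T}\mathrm{d}t \int_{\Omega}\mathrm{d}\Omega
\int_{S_{\mathrm{gen}}} \varepsilon(E, t, \theta, \phi, x, y)\,\cos\theta\;\mathrm{d}S
```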