
    TechMiner: Extracting Technologies from Academic Publications

    In recent years we have seen the emergence of a variety of scholarly datasets. Typically these capture ‘standard’ scholarly entities and their connections, such as authors, affiliations, venues, publications, citations, and others. However, as the repositories grow and the technology improves, researchers are adding new entities to these repositories to develop a richer model of the scholarly domain. In this paper, we introduce TechMiner, a new approach, which combines NLP, machine learning, and semantic technologies, for mining technologies from research publications and generating an OWL ontology describing their relationships with other research entities. The resulting knowledge base can support a number of tasks, such as: richer semantic search, which can exploit the technology dimension to support better retrieval of publications; richer expert search; monitoring the emergence and impact of new technologies, both within and across scientific fields; studying the scholarly dynamics associated with the emergence of new technologies; and others. TechMiner was evaluated on a manually annotated gold standard and the results indicate that it significantly outperforms alternative NLP approaches and that its semantic features yield significant improvements in both recall and precision.
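    The extraction-to-triples step described above can be pictured with a toy sketch. Everything below (the pattern list, the `usesTechnology` predicate, the paper identifier) is invented for illustration and is not part of the TechMiner pipeline:

```python
import re

# Hypothetical sketch (not the authors' system): spot candidate technology
# mentions in an abstract and emit subject-predicate-object triples, the
# shape from which a downstream OWL ontology could be populated.
TECH_PATTERN = re.compile(r"\b(?:LSTM|BERT|SVM|CRF|word2vec|OWL|SPARQL)\b")

def extract_tech_triples(paper_id, abstract):
    """Return (paper, 'usesTechnology', tech) triples, one per distinct match."""
    techs = sorted(set(TECH_PATTERN.findall(abstract)))
    return [(paper_id, "usesTechnology", t) for t in techs]

triples = extract_tech_triples(
    "paper:42",
    "We train an SVM and a CRF baseline, and publish the model as OWL.",
)
# triples == [('paper:42', 'usesTechnology', 'CRF'),
#             ('paper:42', 'usesTechnology', 'OWL'),
#             ('paper:42', 'usesTechnology', 'SVM')]
```

    A real system would replace the fixed pattern with learned NER plus semantic filtering, but the triple-shaped output is what makes the ontology generation step possible.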

    Quantum Metropolis Sampling

    The original motivation to build a quantum computer came from Feynman, who envisaged a machine capable of simulating generic quantum mechanical systems, a task that is believed to be intractable for classical computers. Such a machine would have a wide range of applications in the simulation of many-body quantum physics, including condensed matter physics, chemistry, and high energy physics. Part of Feynman's challenge was met by Lloyd, who showed how to approximately decompose the time-evolution operator of interacting quantum particles into a short sequence of elementary gates, suitable for operation on a quantum computer. However, this left open the problem of how to simulate the equilibrium and static properties of quantum systems, which requires the preparation of ground and Gibbs states on a quantum computer. For classical systems, this problem is solved by the ubiquitous Metropolis algorithm, a method that has essentially acquired a monopoly on the simulation of interacting particles. Here, we demonstrate how to implement a quantum version of the Metropolis algorithm on a quantum computer. This algorithm makes it possible to sample directly from the eigenstates of the Hamiltonian and thus evades the sign problem present in classical simulations. A small-scale implementation of this algorithm can already be achieved with today's technology.
    Comment: revised version
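    For reference, the classical Metropolis algorithm the abstract quantizes can be sketched for a 1D Ising chain; the chain length, temperature, and step count below are illustrative, not from the paper:

```python
import math
import random

# Classical Metropolis sampling for a 1D Ising chain with periodic
# boundary conditions: propose a single spin flip, accept with
# probability min(1, exp(-beta * dE)).
def metropolis_ising(n=20, beta=0.5, steps=10_000, seed=0):
    rng = random.Random(seed)
    spins = [1] * n
    for _ in range(steps):
        i = rng.randrange(n)
        # Energy change from flipping spin i (ferromagnetic coupling J = 1).
        dE = 2 * spins[i] * (spins[(i - 1) % n] + spins[(i + 1) % n])
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i] = -spins[i]
    return spins

spins = metropolis_ising()
```

    The quantum version replaces the spin flip by a random unitary and the accept/reject step by a measurement-based construction, which is what lets it sample Hamiltonian eigenstates directly.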

    The Measurement of Wood Cell Parameters Using the Distance Transform

    In this paper we present a new approach to the quantitative assessment of wood cell boundaries, using image processing techniques based upon the distance transform. It is demonstrated that the method produces the parameters of wall thickness and boundary perimeter using objective measures to estimate these parameters. Further, it is possible, using this technique, to segment the image of a sample of Eucalyptus regnans (mountain ash) into rays, lumens, and cell walls with minimal human intervention.
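    A minimal brute-force sketch of the underlying idea, assuming a binary wall mask (this is an illustration of how a distance transform yields a thickness estimate, not the paper's implementation): inside the wall, the distance to the nearest background pixel peaks on the midline at roughly half the wall thickness.

```python
# Brute-force Euclidean distance transform on a binary mask.
def distance_transform(mask):
    """mask[y][x] == 1 for wall pixels; return per-pixel distance
    to the nearest background (0) pixel."""
    h, w = len(mask), len(mask[0])
    zeros = [(y, x) for y in range(h) for x in range(w) if mask[y][x] == 0]
    dist = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if mask[y][x]:
                dist[y][x] = min(((y - zy) ** 2 + (x - zx) ** 2) ** 0.5
                                 for zy, zx in zeros)
    return dist

# A 3-pixel-thick horizontal "wall" bounded by background rows.
wall = [[0] * 7, [1] * 7, [1] * 7, [1] * 7, [0] * 7]
d = distance_transform(wall)
ridge = max(max(row) for row in d)  # peak distance on the wall midline
thickness = 2 * ridge - 1           # recovered wall thickness in pixels
# ridge == 2.0, thickness == 3.0
```

    Production code would use an O(n) transform (e.g. `scipy.ndimage.distance_transform_edt`) rather than this quadratic loop; the thickness-from-ridge reading is the same.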

    A Gaussian process framework for modelling instrumental systematics: application to transmission spectroscopy

    Transmission spectroscopy, which consists of measuring the wavelength-dependent absorption of starlight by a planet's atmosphere during a transit, is a powerful probe of atmospheric composition. However, the expected signal is typically orders of magnitude smaller than instrumental systematics, and the results are crucially dependent on the treatment of the latter. In this paper, we propose a new method to infer transit parameters in the presence of systematic noise using Gaussian processes, a technique widely used in the machine learning community for Bayesian regression and classification problems. Our method makes use of auxiliary information about the state of the instrument, but does so in a non-parametric manner, without imposing a specific dependence of the systematics on the instrumental parameters, and naturally allows for the correlated nature of the noise. We give an example application of the method to archival NICMOS transmission spectroscopy of the hot Jupiter HD 189733, which goes some way towards reconciling the controversy surrounding this dataset in the literature. Finally, we provide an appendix giving a general introduction to Gaussian processes for regression, in order to encourage their application to a wider range of problems.
    Comment: 6 figures, 1 table, accepted for publication in MNRAS
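    The core trick (kernel inputs that include instrument state, so systematics are absorbed non-parametrically) can be sketched as plain GP regression. This is a hedged illustration, not the paper's exact model: the kernel, hyperparameters, and the "instrument temperature" column are all invented.

```python
import numpy as np

# Squared-exponential kernel acting jointly on all input columns
# (here: time and a hypothetical auxiliary instrument variable).
def rbf_kernel(X1, X2, amp=1.0, scale=1.0):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return amp**2 * np.exp(-0.5 * d2 / scale**2)

def gp_predict(X, y, Xs, noise=1e-2):
    """Posterior mean of a zero-mean GP at test inputs Xs."""
    K = rbf_kernel(X, X) + noise**2 * np.eye(len(X))
    Ks = rbf_kernel(Xs, X)
    alpha = np.linalg.solve(K, y)
    return Ks @ alpha

# Column 0 = time, column 1 = hypothetical instrument temperature.
X = np.array([[0.0, 0.10], [1.0, 0.20], [2.0, 0.15], [3.0, 0.30]])
y = np.sin(X[:, 0])          # stand-in for the observed light curve
Xs = np.array([[1.5, 0.18]])
mean = gp_predict(X, y, Xs)  # interpolates between the neighbouring points
```

    Because the instrument state enters only through the kernel, no functional form for the systematics is ever assumed, which is the point the abstract makes.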

    Improving Editorial Workflow and Metadata Quality at Springer Nature

    Identifying the research topics that best describe the scope of a scientific publication is a crucial task for editors, in particular because the quality of these annotations determines how effectively users are able to discover the right content in online libraries. For this reason, Springer Nature, the world's largest academic book publisher, has traditionally entrusted this task to their most expert editors. These editors manually analyse all new books, possibly including hundreds of chapters, and produce a list of the most relevant topics. Hence, this process has traditionally been very expensive, time-consuming, and confined to a few senior editors. For these reasons, back in 2016 we developed Smart Topic Miner (STM), an ontology-driven application that assists the Springer Nature editorial team in annotating the volumes of all books covering conference proceedings in Computer Science. Since then STM has been regularly used by editors in Germany, China, Brazil, India, and Japan, for a total of about 800 volumes per year. Over the past three years the initial prototype has iteratively evolved in response to feedback from the users and evolving requirements. In this paper we present the most recent version of the tool and describe the evolution of the system over the years, the key lessons learnt, and the impact on the Springer Nature workflow. In particular, our solution has drastically reduced the time needed to annotate proceedings and significantly improved their discoverability, resulting in 9.3 million additional downloads. We also present a user study involving 9 editors, which yielded excellent results in terms of usability, and report an evaluation of the new topic classifier used by STM, which outperforms previous versions in recall and F-measure.

    Causal structure of the entanglement renormalization ansatz

    We show that the multiscale entanglement renormalization ansatz (MERA) can be reformulated in terms of a causality constraint on discrete quantum dynamics. This causal structure is that of de Sitter space with a flat spacelike boundary, where the volume of a spacetime region corresponds to the number of variational parameters it contains. This result clarifies the nature of the ansatz, and suggests a generalization to quantum field theory. It also constitutes an independent justification of the connection between MERA and hyperbolic geometry which was proposed as a concrete implementation of the AdS/CFT correspondence.

    Quasi-Adiabatic Continuation in Gapped Spin and Fermion Systems: Goldstone's Theorem and Flux Periodicity

    We apply the technique of quasi-adiabatic continuation to study systems with continuous symmetries. We first derive a general form of Goldstone's theorem applicable to gapped nonrelativistic systems with continuous symmetries. We then show that for a fermionic system with a spin gap, it is possible to insert π-flux into a cylinder with only an exponentially small change in the energy of the system, a scenario which covers several physically interesting cases such as an s-wave superconductor or a resonating valence bond state.
    Comment: 19 pages, 2 figures, final version in press at JSTAT

    Space tug propulsion system failure mode, effects and criticality analysis

    For purposes of the study, the propulsion system was considered as consisting of the following: (1) main engine system, (2) auxiliary propulsion system, (3) pneumatic system, (4) hydrogen feed, fill, drain and vent system, (5) oxygen feed, fill, drain and vent system, and (6) helium reentry purge system. Each component was critically examined to identify possible failure modes and the subsequent effect on mission success. Each space tug mission consists of three phases: launch to separation from shuttle, separation to redocking, and redocking to landing. The analysis considered the results of failure of a component during each phase of the mission. After the failure modes of each component were tabulated, those components whose failure would result in possible or certain loss of mission, or inability to return the tug to ground, were identified as critical components and a criticality number determined for each. The criticality number of a component denotes the number of mission failures in one million missions due to the loss of that component. A total of 68 components were identified as critical, with criticality numbers ranging from 1 to 2990.
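    The criticality-number definition above reduces to simple arithmetic. The numbers below are invented for illustration and are not taken from the study:

```python
# Criticality number = expected mission losses per one million missions
# attributable to a component (hypothetical rates, not the study's data).
def criticality_number(failure_rate_per_hour, exposure_hours,
                       p_loss_given_failure, missions=1_000_000):
    p_fail_per_mission = failure_rate_per_hour * exposure_hours
    return missions * p_fail_per_mission * p_loss_given_failure

# A hypothetical valve: 5e-6 failures/hour, 100 h mission exposure,
# 40% chance that a failure causes loss of mission.
cn = criticality_number(5e-6, 100, 0.4)
# cn ≈ 200 mission losses per million missions
```

    Summing such numbers over the 68 critical components gives an overall expected loss rate, which is how a range like 1 to 2990 ranks components by risk contribution.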

    The SSS phase of RS Ophiuchi observed with Chandra and XMM-Newton I.: Data and preliminary Modeling

    The phase of Super-Soft-Source (SSS) emission of the sixth recorded outburst of the recurrent nova RS Oph was observed twice with Chandra and once with XMM-Newton. The observations were taken on days 39.7, 54.0, and 66.9 after outburst. We confirm a 35-sec period on day 54.0 and find that it originates from the SSS emission and not from the shock. We discuss the bound-free absorption by neutral elements in the line of sight, resonance absorption lines plus self-absorbed emission line components, collisionally excited emission lines from the shock, He-like intersystem lines, and spectral changes during an episode of high-amplitude variability. We find a decrease of the oxygen K-shell absorption edge that can be explained by photoionization of oxygen. The absorption component has average velocities of -1286±267 km/s on day 39.7 and of -771±65 km/s on day 66.9. The wavelengths of the emission line components are consistent with their rest wavelengths, as confirmed by measurements of non-self-absorbed He-like intersystem lines. We have evidence that these lines originate from the shock rather than the outer layers of the outflow and may be photoexcited in addition to being collisionally excited. At wavelengths shorter than 15 Å we find fading collisionally excited emission lines that originate from the radiatively cooling shock. On day 39.5 we find a systematic blue shift of -526±114 km/s from these lines. We find anomalous He-like f/i ratios, which indicate either high densities or significant UV radiation near the plasma where the emission lines are formed. During the phase of strong variability the spectral hardness light curve overlies the total light curve when shifted by 1000 s. This can be explained by photoionization of neutral oxygen in the line of sight if the densities are of order 10^{10}-10^{11} cm^{-3}.
    Comment: 16 pages, 10 figures, 4 tables. Accepted by ApJ; v2: Co-author Woodward added
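    The quoted absorption-component velocities follow from the non-relativistic Doppler formula v = c(λ_obs − λ_rest)/λ_rest. A sketch with an assumed line (the rest wavelength below is illustrative, not a measurement from the paper):

```python
C_KM_S = 299_792.458  # speed of light in km/s

def radial_velocity(lam_obs, lam_rest):
    """Non-relativistic Doppler velocity; negative means blueshift."""
    return C_KM_S * (lam_obs - lam_rest) / lam_rest

# Assume a soft X-ray line at rest wavelength 18.97 A, observed
# blueshifted by the day-39.7 mean absorption velocity of -1286 km/s:
lam_rest = 18.97
lam_obs = lam_rest * (1 - 1286 / C_KM_S)
v = radial_velocity(lam_obs, lam_rest)
# v ≈ -1286 km/s (approaching outflow)
```

    At -1286 km/s the shift is only about 0.08 Å at these wavelengths, which is why grating-resolution spectra are needed to measure such outflow velocities.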

    Multiplayer Cost Games with Simple Nash Equilibria

    Multiplayer games with selfish agents naturally occur in the design of distributed and embedded systems. As the goals of selfish agents are usually neither equivalent nor antagonistic to each other, such games are non-zero-sum games. We study such games and show that a large class of them, including games where the individual objectives are mean- or discounted-payoff, or quantitative reachability, not only have a solution, but a simple solution. We establish the existence of Nash equilibria that are composed of k memoryless strategies for each agent in a setting with k agents: one main and k-1 minor strategies. The main strategy describes what happens when all agents comply, whereas the minor strategies ensure that all other agents immediately start to co-operate against the agent who first deviates from the plan. This simplicity is important, as rational agents are an idealisation. Realistically, agents have to decide on their moves with very limited resources, and complicated strategies that require exponential, or even non-elementary, implementations cannot be used in practice. The existence of simple strategies that we prove in this paper therefore holds a promise of implementability.
    Comment: 23 pages
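    The main/minor strategy structure can be sketched as a toy simulation (this is an illustration of the equilibrium's shape, not the paper's construction; the agent names and move labels are invented):

```python
# Each agent holds one "main" memoryless strategy and, per opponent, one
# "minor" punishment strategy. Everyone plays main until some agent
# deviates; from then on, all others play the minor strategy aimed at
# the first deviator, while the deviator is left to its own devices.
def play(agents, steps, deviator=None, deviate_at=None):
    punished = None
    history = []
    for t in range(steps):
        moves = {}
        for name, strat in agents.items():
            if punished is None or name == punished:
                moves[name] = strat["main"]
            else:
                moves[name] = strat["minor"][punished]
        if punished is None and deviator is not None and t == deviate_at:
            moves[deviator] = "deviate"
            punished = deviator  # punishment starts next step
        history.append(dict(moves))
    return history

agents = {
    "A": {"main": "coop", "minor": {"B": "punish-B"}},
    "B": {"main": "coop", "minor": {"A": "punish-A"}},
}
h = play(agents, steps=3, deviator="B", deviate_at=1)
# step 0: both cooperate; step 1: B deviates; step 2: A punishes B.
```

    The key point from the abstract is that both the main and the minor strategies are memoryless: the only "memory" needed is the identity of the first deviator, which selects which of the k strategies each agent runs.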