AUGUR: Forecasting the Emergence of New Research Topics
Being able to rapidly recognise new research trends is strategic for many stakeholders, including universities, institutional funding bodies, academic publishers and companies. The literature presents several approaches to identifying the emergence of new research topics, which rely on the assumption that the topic is already exhibiting a certain degree of popularity and is consistently referred to by a community of researchers. However, detecting the emergence of a new research area at an embryonic stage, i.e., before the topic has been consistently labelled by a community of researchers and associated with a number of publications, is still an open challenge. We address this issue by introducing Augur, a novel approach to the early detection of research topics. Augur analyses the diachronic relationships between research areas and is able to detect clusters of topics that exhibit dynamics correlated with the emergence of new research topics. Here we also present the Advanced Clique Percolation Method (ACPM), a new community detection algorithm developed specifically to support this task. Augur was evaluated on a gold standard of 1,408 debutant topics in the 2000-2011 interval and outperformed four alternative approaches in terms of both precision and recall.
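The abstract does not spell out how clique percolation works, but the classic Clique Percolation Method that ACPM extends can be sketched in a few lines: k-cliques that share k-1 nodes are merged into the same community. This is a minimal, brute-force illustration of the standard CPM only, not of the Advanced variant introduced in the paper:

```python
from itertools import combinations

def k_cliques(adj, k):
    """All k-cliques of an undirected graph given as {node: set(neighbours)}.
    Brute-force enumeration: fine for tiny graphs, O(n^k) in general."""
    return [frozenset(c) for c in combinations(sorted(adj), k)
            if all(v in adj[u] for u, v in combinations(c, 2))]

def clique_percolation(adj, k=3):
    """Classic CPM: k-cliques sharing k-1 nodes percolate into one community."""
    cliques = k_cliques(adj, k)
    parent = list(range(len(cliques)))          # union-find over cliques

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]       # path halving
            i = parent[i]
        return i

    for i, j in combinations(range(len(cliques)), 2):
        if len(cliques[i] & cliques[j]) == k - 1:
            parent[find(i)] = find(j)

    communities = {}
    for i, clique in enumerate(cliques):
        communities.setdefault(find(i), set()).update(clique)
    return list(communities.values())
```

For example, two triangles sharing an edge percolate into a single four-node community, while a disconnected triangle stays its own community.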
TechMiner: Extracting Technologies from Academic Publications
In recent years we have seen the emergence of a variety of scholarly datasets. Typically these capture “standard” scholarly entities and their connections, such as authors, affiliations, venues, publications, citations, and others. However, as the repositories grow and the technology improves, researchers are adding new entities to these repositories to develop a richer model of the scholarly domain. In this paper, we introduce TechMiner, a new approach, which combines NLP, machine learning and semantic technologies, for mining technologies from research publications and generating an OWL ontology describing their relationships with other research entities. The resulting knowledge base can support a number of tasks, such as: richer semantic search, which can exploit the technology dimension to support better retrieval of publications; richer expert search; monitoring the emergence and impact of new technologies, both within and across scientific fields; studying the scholarly dynamics associated with the emergence of new technologies; and others. TechMiner was evaluated on a manually annotated gold standard and the results indicate that it significantly outperforms alternative NLP approaches and that its semantic features improve performance significantly with respect to both recall and precision.
The Measurement of Wood Cell Parameters Using the Distance Transform
In this paper we present a new approach to the quantitative assessment of wood cell boundaries, using image processing techniques based upon the distance transform. It is demonstrated that the method produces the parameters of wall thickness and boundary perimeter using objective measures to estimate these parameters. Further, it is possible, using this technique, to segment the image of a sample of Eucalyptus regnans (mountain ash) into rays, lumens, and cell walls with minimal human intervention.
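As a toy illustration of how a distance transform yields wall thickness, the sketch below runs a multi-source BFS on a binary grid (Manhattan distance, standing in for whatever exact transform and connectivity the paper uses) and doubles the peak in-wall distance, since the distance peaks at the wall's medial axis:

```python
from collections import deque

def distance_transform(grid):
    """Manhattan distance from each foreground (1) cell to the nearest
    background (0) cell, via multi-source BFS. grid is a list of lists."""
    rows, cols = len(grid), len(grid[0])
    dist = [[None] * cols for _ in range(rows)]
    queue = deque()
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 0:                 # background cells seed at 0
                dist[r][c] = 0
                queue.append((r, c))
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and dist[nr][nc] is None:
                dist[nr][nc] = dist[r][c] + 1
                queue.append((nr, nc))
    return dist

def wall_thickness(grid):
    """Peak distance inside the wall is ~ half the thickness, so double it."""
    d = distance_transform(grid)
    return 2 * max(max(row) for row in d)
```

On a one-row strip `[0, 1, 1, 1, 1, 0]` (a wall four cells thick) the transform reads `[0, 1, 2, 2, 1, 0]`, giving a thickness estimate of 4.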
Exploring scholarly data with Rexplore.
Despite the large number and variety of tools and services available today for exploring scholarly data, current support is still very limited in the context of sensemaking tasks, which go beyond standard search and ranking of authors and publications, and focus instead on i) understanding the dynamics of research areas, ii) relating authors “semantically” (e.g., in terms of common interests or shared academic trajectories), or iii) performing fine-grained academic expert search along multiple dimensions. To address this gap we have developed a novel tool, Rexplore, which integrates statistical analysis, semantic technologies, and visual analytics to provide effective support for exploring and making sense of scholarly data. Here, we describe the main innovative elements of the tool and we present the results from a task-centric empirical evaluation, which shows that Rexplore is highly effective at providing support for the aforementioned sensemaking tasks. In addition, these results are robust both with respect to the background of the users (i.e., expert analysts vs. “ordinary” users) and also with respect to whether the tasks are selected by the evaluators or proposed by the users themselves.
Space tug propulsion system failure mode, effects and criticality analysis
For purposes of the study, the propulsion system was considered as consisting of the following: (1) main engine system, (2) auxiliary propulsion system, (3) pneumatic system, (4) hydrogen feed, fill, drain and vent system, (5) oxygen feed, fill, drain and vent system, and (6) helium reentry purge system. Each component was critically examined to identify possible failure modes and the subsequent effect on mission success. Each space tug mission consists of three phases: launch to separation from shuttle, separation to redocking, and redocking to landing. The analysis considered the results of failure of a component during each phase of the mission. After the failure modes of each component were tabulated, those components whose failure would result in possible or certain loss of mission or inability to return the Tug to ground were identified as critical components and a criticality number was determined for each. The criticality number of a component denotes the number of mission failures in one million missions due to the loss of that component. A total of 68 components were identified as critical with criticality numbers ranging from 1 to 2990.
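The abstract defines the criticality number as expected mission failures per million missions due to loss of a component. Under a simple constant-failure-rate model that expectation can be computed as below; the parameter names and the small-probability approximation are assumptions for illustration, not the report's exact procedure:

```python
def criticality_number(failure_rate_per_hour, mission_hours, p_loss_given_failure):
    """Expected mission losses per one million missions attributable to one
    component. Assumes a constant failure rate and a small overall failure
    probability, so P(failure) ~ rate * mission time (hypothetical model)."""
    p_failure = failure_rate_per_hour * mission_hours
    return p_failure * p_loss_given_failure * 1_000_000
```

For example, a component with a 1e-6 per-hour failure rate over a 100-hour mission, whose failure loses the mission half the time, scores 50, well inside the 1 to 2990 range reported.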
Causal structure of the entanglement renormalization ansatz
We show that the multiscale entanglement renormalization ansatz (MERA) can be
reformulated in terms of a causality constraint on discrete quantum dynamics.
This causal structure is that of de Sitter space with a flat spacelike
boundary, where the volume of a spacetime region corresponds to the number of
variational parameters it contains. This result clarifies the nature of the
ansatz, and suggests a generalization to quantum field theory. It also
constitutes an independent justification of the connection between MERA and
hyperbolic geometry which was proposed as a concrete implementation of the
AdS-CFT correspondence
The SSS phase of RS Ophiuchi observed with Chandra and XMM-Newton I.: Data and preliminary Modeling
The phase of Super-Soft-Source (SSS) emission of the sixth recorded outburst
of the recurrent nova RS Oph was observed twice with Chandra and once with
XMM-Newton. The observations were taken on days 39.7, 54.0, and 66.9 after
outburst. We confirm a 35-sec period on day 54.0 and found that it originates
from the SSS emission and not from the shock. We discuss the bound-free
absorption by neutral elements in the line of sight, resonance absorption lines
plus self-absorbed emission line components, collisionally excited emission
lines from the shock, He-like intersystem lines, and spectral changes during an
episode of high-amplitude variability. We find a decrease of the oxygen K-shell
absorption edge that can be explained by photoionization of oxygen. The
absorption component has average velocities of -1286+-267 km/s on day 39.7 and
of -771+-65 km/s on day 66.9. The wavelengths of the emission line components
are consistent with their rest wavelengths as confirmed by measurements of
non-self absorbed He-like intersystem lines. We have evidence that these lines
originate from the shock rather than the outer layers of the outflow and may be
photoexcited in addition to collisional excitations. We found collisionally
excited emission lines that are fading at wavelengths shorter than 15A that
originate from the radiatively cooling shock. On day 39.5 we find a systematic
blue shift of -526+-114 km/s from these lines. We found anomalous He-like f/i
ratios, which indicate either high densities or significant UV radiation near
the plasma where the emission lines are formed. During the phase of strong
variability the spectral hardness light curve overlies the total light curve
when shifted by 1000 sec. This can be explained by photoionization of neutral
oxygen in the line of sight if the densities are of order 10^{10}-10^{11}
cm^{-3}.
Comment: 16 pages, 10 figures, 4 tables. Accepted by ApJ; v2: Co-author
Woodward added.
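The line-of-sight velocities quoted above (e.g. -526+-114 km/s) follow from measured wavelength shifts via the standard non-relativistic Doppler relation; a minimal sketch (function and variable names are ours, not from the paper):

```python
C_KM_S = 299_792.458  # speed of light in km/s

def radial_velocity(lambda_obs, lambda_rest):
    """Non-relativistic Doppler: v = c * (lambda_obs - lambda_rest) / lambda_rest.
    Negative values are blueshifts (material moving toward the observer)."""
    return C_KM_S * (lambda_obs - lambda_rest) / lambda_rest
```

A line with rest wavelength 15 A observed 0.1% shortward of rest, for instance, corresponds to roughly -300 km/s.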
Estimating the environmental impact of dairy cattle breeding programs through emission intensity.
A recently developed methodological approach for determining the greenhouse gas emissions impact of national breeding programs was applied to measure the effects of current and future breeding goals on the emission intensity (EI) of the Canadian dairy industry. Emission intensity is the ratio of greenhouse gas output to the product generated. Traits under investigation affected EI by decreasing the direct emissions yield (i.e. increasing feed performance), changing herd structure (i.e. prolonging herd life) or through the dilution effect of increased production (i.e. increasing fat yield). The intensity value (IV) of each trait, defined as the change in emission intensity per unit change in that trait, was calculated for each of the investigated traits. The IV trend of these traits was compared for the current and prospective selection index, as well as for a system with and without quota (the supply management policy designed to prevent overproduction). The overall EI of the average genetic merit Canadian dairy herd per breeding female was 5.07 kg CO2eq/kg protein equivalent output. The annual reduction in EI due to the improvement of production traits was -0.027, -0.018 and -0.006 for fat, protein and other milk solids, respectively. The functional traits, herd life and mastitis resistance, had more modest effects (-0.008 and -0.001, respectively). These results are consistent with international studies that identified traits related to production, survival, health and fertility as having the largest impact on the environmental footprint of dairy cattle. Overall, the dairy industry is becoming more efficient by reducing its EI through selection of environmentally favorable traits, with a 1% annual reduction of EI in Canada.
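Emission intensity as defined above is a simple ratio, and the dilution effect (more product for the same emissions lowers EI) falls straight out of it. A minimal sketch in the abstract's units, with illustrative numbers chosen only to reproduce the reported 5.07 kg CO2eq/kg figure:

```python
def emission_intensity(co2eq_kg, protein_eq_kg):
    """EI: kg CO2-equivalent emitted per kg of protein-equivalent product."""
    return co2eq_kg / protein_eq_kg

# Dilution effect: holding emissions fixed while raising output lowers EI.
# (5070 kg / 1000 kg are hypothetical values matching the reported 5.07.)
baseline = emission_intensity(5070.0, 1000.0)   # reported herd-level EI
diluted = emission_intensity(5070.0, 1100.0)    # same emissions, +10% product
```

This is why production traits such as fat yield reduce EI even before any change in the herd's direct emissions.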
Entanglement between a qubit and the environment in the spin-boson model
The quantitative description of the quantum entanglement between a qubit and
its environment is considered. Specifically, for the ground state of the
spin-boson model, the entropy of entanglement of the spin is calculated as a
function of the strength of the ohmic coupling to the environment and of the
level asymmetry. This is done by a numerical renormalization group treatment
of the related anisotropic Kondo model. For zero asymmetry, the entanglement
increases monotonically with the coupling strength until it becomes maximal;
for fixed asymmetry, the entanglement is a maximum as a function of the
coupling strength at a finite coupling value.
Comment: 4 pages, 3 figures. Shortened version restricted to ground-state
entanglement.
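The entropy of entanglement used here is the von Neumann entropy of the spin's reduced density matrix; for a qubit this reduces to the two eigenvalues of a 2x2 matrix and can be sketched directly. This is a generic illustration of the quantity being computed, not the paper's NRG calculation:

```python
from math import log2, sqrt

def entanglement_entropy(rho):
    """Von Neumann entropy S = -sum_i p_i log2(p_i) of a qubit's 2x2 reduced
    density matrix [[a, b], [b, d]] (real off-diagonal b, for simplicity)."""
    (a, b), (_, d) = rho
    mean = (a + d) / 2.0                        # trace / 2
    r = sqrt(((a - d) / 2.0) ** 2 + b * b)      # half the eigenvalue splitting
    # Skip eigenvalues that are numerically zero (0 * log 0 -> 0).
    return -sum(p * log2(p) for p in (mean + r, mean - r) if p > 1e-12)
```

A maximally mixed reduced state gives S = 1 (maximal entanglement with the environment), while a pure reduced state gives S = 0 (no entanglement).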
- …