NLP2Code: Code Snippet Content Assist via Natural Language Tasks
Developers increasingly take to the Internet for code snippets to integrate
into their programs. To save developers the time required to switch from their
development environments to a web browser in the quest for a suitable code
snippet, we introduce NLP2Code, a content assist for code snippets. Unlike
related tools, NLP2Code integrates directly into the source code editor and
provides developers with a content assist feature to close the vocabulary gap
between developers' needs and code snippet meta data. Our preliminary
evaluation of NLP2Code shows that the majority of invocations lead to code
snippets rated as helpful by users and that the tool is able to support a wide
range of tasks.

Comment: tool demo video available at
https://www.youtube.com/watch?v=h-gaVYtCznI; to appear as a tool demo paper
at ICSME 2017 (https://icsme2017.github.io/).
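The core idea of closing the vocabulary gap between a natural-language task and snippet metadata can be illustrated with a simple keyword-overlap ranker. This is a hypothetical sketch, not NLP2Code's actual matching algorithm, and the snippet catalog below is invented for illustration:

```python
# Hypothetical sketch: rank candidate code snippets by keyword overlap
# between a natural-language task and each snippet's metadata title.
def rank_snippets(task, snippets):
    """Return snippets sorted by word overlap with the task, best first."""
    task_words = set(task.lower().split())
    scored = []
    for snippet in snippets:
        meta_words = set(snippet["title"].lower().split())
        scored.append((len(task_words & meta_words), snippet))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [s for score, s in scored if score > 0]

snippets = [
    {"title": "convert string to int in Java", "code": "Integer.parseInt(s);"},
    {"title": "read file line by line", "code": "Files.readAllLines(path);"},
]
best = rank_snippets("convert a string to an integer", snippets)
```

A real content assist would use richer retrieval (stemming, synonyms, popularity signals), but the ranking skeleton is the same.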
Republishing OpenStreetMap’s roads as linked routable tiles
Route planning providers manually integrate different geo-spatial datasets before offering a Web service to developers, thus creating a closed world view. In contrast, combining open datasets at runtime can provide more information for user-specific route planning needs. For example, an extra dataset of bike sharing availabilities may provide more relevant information to the occasional cyclist. A strategy for automating the adoption of open geo-spatial datasets is needed to allow an ecosystem of route planners able to answer more specific and complex queries. This raises new challenges such as (i) how open geo-spatial datasets should be published on the Web to increase interoperability, and (ii) how route planners can discover and integrate relevant data for a certain query on the fly. We republished OpenStreetMap's road network as "Routable Tiles" to facilitate its integration into open route planners. To achieve this, we use a Linked Data strategy and follow an approach similar to vector tiles. In a demo, we show how client-side code can automatically discover tiles and perform a shortest path algorithm. We provide four contributions: (i) we launched an open geo-spatial dataset that is available for everyone to reuse at no cost, (ii) we published a Linked Data version of the OpenStreetMap ontology, (iii) we introduced a hypermedia specification for vector tiles that extends the Hydra ontology, and (iv) we released the mapping scripts, demo and routing scripts as open source software.
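The client-side pattern described here, discovering tiles on the fly while running a shortest-path algorithm, can be sketched as a Dijkstra search that fetches a node's tile only when the node is first expanded. The tile layout, `fetch_tile` stand-in, and edge encoding below are illustrative assumptions, not the Routable Tiles specification:

```python
# Sketch of client-side routing over tile-partitioned road data:
# Dijkstra that lazily pulls in a tile when the frontier reaches it.
import heapq

def fetch_tile(x, y):
    """Stand-in for an HTTP GET of one tile; returns (from, to, cost) edges."""
    tiles = {
        (0, 0): [("a", "b", 1.0), ("b", "c", 2.0)],
        (0, 1): [("c", "d", 1.5)],
    }
    return tiles.get((x, y), [])

def shortest_path(start, goal, tile_of):
    """Dijkstra over a graph that is materialized tile by tile."""
    graph, fetched = {}, set()
    dist, heap = {start: 0.0}, [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        tile = tile_of(node)
        if tile not in fetched:  # lazy tile discovery
            fetched.add(tile)
            for u, v, cost in fetch_tile(*tile):
                graph.setdefault(u, []).append((v, cost))
        for v, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return None  # goal unreachable from the fetched tiles

tile_of = {"a": (0, 0), "b": (0, 0), "c": (0, 1), "d": (0, 1)}.get
length = shortest_path("a", "d", tile_of)  # 1.0 + 2.0 + 1.5
```

The key design point matches the abstract: the router never needs the whole dataset up front, only the tiles its search actually touches.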
Assessment of fissionable material behaviour in fission chambers
A comprehensive study is performed in order to assess the pertinence of fission chambers coated with different fissile materials for high neutron flux detection. Three neutron scenarios are proposed to study the fast component of a high neutron flux: (i) high neutron flux with a significant thermal contribution such as BR2, (ii) DEMO magnetic fusion reactor, and (iii) IFMIF high flux test module.
In this study, the inventory code ACAB is used to analyze the following questions: (i) impact of different deposits in fission chambers; (ii) effect of the irradiation time/burn-up on the concentration; (iii) impact of activation cross-section uncertainties on the composition of the deposit over the full range of burn-up/irradiation neutron fluences of interest. The complete set of nuclear data (decay, fission yield, activation cross-sections, and uncertainties) provided in the EAF2007 data library is used for this evaluation.
Can Robots Replace Librarians? Experiences Using a Chat Bot to Respond to IM Questions
This poster will use text and statistical data to describe the conditions that led to development of a chat bot. It will use text and graphics to illustrate the steps involved in building and launching the bot, and it will use text and statistical data to illustrate its functionality in responding to IM reference questions. A laptop will also be available to demo the backend (developer code, etc.) as well as the frontend (bot responses to questions in real time).
Two or three takeaways
- The poster session will demonstrate the basic steps that are needed to integrate a chatbot to library chat software
- The session will show cases where a chatbot can be helpful (and where it is not)
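The split between "where a chatbot can be helpful" and "where it is not" typically comes down to rule coverage with a human fallback. As an illustrative sketch (the rules and responses here are invented, not the poster's actual bot):

```python
# Illustrative rule-based responder: answer on a keyword hit,
# otherwise hand the question off to a human librarian.
RULES = {
    "hours": "The library is open 8am-10pm on weekdays.",
    "renew": "You can renew items from your account page.",
}

def respond(question):
    """Return a canned answer on a keyword match, else escalate."""
    q = question.lower()
    for keyword, answer in RULES.items():
        if keyword in q:
            return answer
    return "Let me connect you with a librarian."
```

Routine factual questions land in the rules; anything outside them escalates, which is where the librarian remains irreplaceable.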
Linking de novo assembly results with long DNA reads by dnaasm-link application
Currently, third-generation sequencing techniques, which yield much longer
DNA reads than next-generation sequencing technologies, are becoming more
and more popular. There are many possibilities to combine data from
next-generation and third-generation sequencing.
Herein, we present a new application called dnaasm-link for linking contigs,
a result of \textit{de novo} assembly of second-generation sequencing data,
with long DNA reads. Our tool includes an integrated module to fill gaps with a
suitable fragment of appropriate long DNA read, which improves the consistency
of the resulting DNA sequences. This feature is very important, in particular
for complex DNA regions, as presented in the paper. Finally, our implementation
outperforms other state-of-the-art tools in terms of speed and memory
requirements, which may enable its use for organisms with large genomes,
something not possible with existing applications.
The presented application has many advantages: (i) significant memory
optimization and reduced computation time, (ii) filling gaps with an
appropriate fragment of a specified long DNA read, and (iii) reducing the
number of spanned and unspanned gaps in existing genome drafts.
The application is freely available to all users under GNU Library or Lesser
General Public License version 3.0 (LGPLv3). The demo application, docker image
and source code are available at http://dnaasm.sourceforge.net.

Comment: 16 pages, 5 figures
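The gap-filling idea, splicing in the fragment of a long read that spans two contigs, can be shown in miniature. This toy sketch uses exact substring matching where dnaasm-link performs real alignment, and all sequences and the `anchor` parameter are invented:

```python
# Toy illustration of gap filling: when a long read spans two contigs,
# the read fragment between the matched contig ends fills the gap.
def fill_gap(left_contig, right_contig, long_read, anchor=5):
    """Splice two contigs using a long read that spans the gap."""
    left_end = left_contig[-anchor:]     # suffix of the left contig
    right_start = right_contig[:anchor]  # prefix of the right contig
    i = long_read.find(left_end)
    j = long_read.find(right_start, i + anchor)
    if i == -1 or j == -1:
        return None  # the read does not span both contigs
    gap = long_read[i + anchor:j]        # fragment that fills the gap
    return left_contig + gap + right_contig

left, right = "ACGTACGTAA", "TTGGCCAAGG"
read = "CGACGTAACCCCCTTGGCCA"  # spans both anchors; gap is "CCCCC"
merged = fill_gap(left, right, read)
```

A production tool must also handle sequencing errors in the long read (hence alignment rather than exact matching), which is exactly what makes the consistency of the filled fragment important for complex regions.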
Balancing act: multivariate rational reconstruction for IBP
We address the problem of unambiguous reconstruction of rational functions of
many variables. This is particularly relevant for recovery of exact expansion
coefficients in integration-by-parts identities (IBPs) based on modular
arithmetic. These IBPs are indispensable in modern approaches to evaluation of
multiloop Feynman integrals by means of differential equations. Modular
arithmetic is far superior to algebraic implementations when one deals
with high-multiplicity situations involving a large number of Lorentz
invariants. We introduce a new method based on balanced relations which allows
one to achieve the goal of a robust functional restoration with minimal data
input. The technique is implemented as a Mathematica package Reconstruction.m
in the FIRE6 environment and thus successfully demonstrates a proof of concept.

Comment: 15 pages, 10 ancillary files with code, scripts and demo; download
code @ https://bitbucket.org/feynmanIntegrals/fire/src/master/FIRE6/mm
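The modular-arithmetic backbone of such reconstructions is classical rational-number reconstruction (Wang's algorithm): recover n/d from its image u = n·d⁻¹ mod p via a truncated extended Euclidean algorithm. This is a sketch of that standard building block, not the paper's multivariate balanced-relations method itself:

```python
# Rational-number reconstruction (Wang's algorithm): run the extended
# Euclidean algorithm on (p, u) and stop once the remainder drops below
# sqrt(p/2); the remainder/cofactor pair gives the fraction n/d.
def rational_reconstruct(u, p):
    """Recover (n, d) with n/d = u (mod p) and |n|, d below sqrt(p/2)."""
    bound = int((p / 2) ** 0.5)
    r0, r1 = p, u % p
    t0, t1 = 0, 1
    while r1 >= bound:
        q = r0 // r1
        r0, r1 = r1, r0 - q * r1
        t0, t1 = t1, t0 - q * t1
    if t1 == 0 or abs(t1) >= bound:
        return None  # no reconstruction within the size bound
    # Invariant r1 = t1*u (mod p), so u = r1/t1 (mod p).
    return (r1, t1) if t1 > 0 else (-r1, -t1)

p = 2**31 - 1                    # a Mersenne prime field
u = (22 * pow(7, -1, p)) % p     # modular image of 22/7
result = rational_reconstruct(u, p)
```

Evaluating the exact coefficients at many such primes and sample points, then reconstructing, is what lets modular approaches sidestep intermediate expression swell in high-multiplicity IBP reductions.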
TabGenie: A Toolkit for Table-to-Text Generation
Heterogeneity of data-to-text generation datasets limits the research on
data-to-text generation systems. We present TabGenie - a toolkit which enables
researchers to explore, preprocess, and analyze a variety of data-to-text
generation datasets through the unified framework of table-to-text generation.
In TabGenie, all the inputs are represented as tables with associated metadata.
The tables can be explored through the web interface, which also provides an
interactive mode for debugging table-to-text generation, facilitates
side-by-side comparison of generated system outputs, and allows easy exports
for manual analysis. Furthermore, TabGenie is equipped with command line
processing tools and Python bindings for unified dataset loading and
processing. We release TabGenie as a PyPI package and provide its open-source
code and a live demo at https://github.com/kasnerz/tabgenie.

Comment: Submitted to ACL 2023 System Demonstration Track
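The unifying abstraction, every input represented as a table with associated metadata, can be sketched with a small data structure plus a linearization step for a text generator. The field names and `[TITLE]`/`[ROW]` markers below are assumptions for illustration, not TabGenie's actual API:

```python
# Illustrative table-with-metadata representation and a linearizer
# that flattens it into a string a seq2seq model can consume.
from dataclasses import dataclass, field

@dataclass
class Table:
    header: list
    rows: list
    metadata: dict = field(default_factory=dict)

def linearize(table):
    """Flatten a table into one marker-delimited string."""
    parts = [f"[TITLE] {table.metadata.get('title', '')}"]
    for row in table.rows:
        cells = [f"{col}: {val}" for col, val in zip(table.header, row)]
        parts.append("[ROW] " + " | ".join(cells))
    return " ".join(parts)

t = Table(header=["team", "wins"], rows=[["Hawks", 10]],
          metadata={"title": "Season results"})
text = linearize(t)
```

Once every dataset is funneled through one such representation, exploration, preprocessing, and side-by-side comparison of system outputs can share a single code path, which is the toolkit's central point.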