
    An End-to-end Neural Natural Language Interface for Databases

    The ability to extract insights from new data sets is critical for decision making. Visual interactive tools play an important role in data exploration since they provide non-technical users with an effective way to visually compose queries and comprehend the results. Natural language has recently gained traction as an alternative query interface to databases, with the potential to enable non-expert users to formulate complex questions and information needs efficiently and effectively. However, understanding natural language questions and translating them accurately to SQL is a challenging task, and thus Natural Language Interfaces for Databases (NLIDBs) have not yet made their way into practical tools and commercial products. In this paper, we present DBPal, a novel data exploration tool with a natural language interface. DBPal leverages recent advances in deep learning to make query understanding more robust in the following ways: First, DBPal uses a deep model to translate natural language statements to SQL, making the translation process more robust to paraphrasing and other linguistic variations. Second, to support users in phrasing questions without knowing the database schema and the query features, DBPal provides a learned auto-completion model that suggests partial query extensions during query formulation and thus helps them write complex queries.
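To make the translation task concrete: DBPal itself uses a learned deep model, but the input/output contract of an NL-to-SQL translator can be sketched with a toy rule-based stand-in. Everything below (the function name, the question patterns, the example table) is hypothetical and only illustrates what such a component consumes and produces, not the paper's method.

```python
import re

def nl_to_sql(question: str) -> str:
    """Toy NL-to-SQL translator covering a narrow class of questions.

    A real system (like DBPal's deep model) generalises across
    paraphrases; this rule-based sketch handles only fixed patterns.
    """
    q = question.lower().strip().rstrip("?")
    # "How many <table> are older than <n>" -> COUNT with a WHERE clause
    m = re.match(r"how many (\w+) are older than (\d+)", q)
    if m:
        return f"SELECT COUNT(*) FROM {m.group(1)} WHERE age > {m.group(2)};"
    # "Show all <table>" -> plain SELECT *
    m = re.match(r"show all (\w+)", q)
    if m:
        return f"SELECT * FROM {m.group(1)};"
    raise ValueError("question not understood")

print(nl_to_sql("How many patients are older than 50?"))
# SELECT COUNT(*) FROM patients WHERE age > 50;
```

The brittleness of the rule-based version is exactly the motivation the abstract gives for a learned model: a paraphrase like "What is the number of patients above age 50?" falls outside the fixed patterns, while a trained translator is expected to handle it.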

    The exploration metaphor

    NASA's experience in planetary exploration has demonstrated that the desktop workstation is inadequate for many visualization situations. The primary mission displays for the unmanned Surveyor missions to the moon during the mid-1960s, for example, were environmental images assembled on the inside surfaces of spherical shells. Future exploration missions will greatly benefit from advances in digital computer and display technology, but there remain unmet user interface needs. Alternative user interfaces and metaphors are needed for planetary exploration and other interactions with complex spatial environments. These interfaces and metaphors would enable the user to directly explore environments and naturally manipulate objects in those environments. Personal simulators, virtual workstations, and telepresence user interfaces are systems capable of providing this integration of user space and task space. The Exploration Metaphor is a useful concept for guiding the design of user interfaces for virtual environments and telepresence. To apply the Exploration Metaphor is to assert that computing is like exploration, and to support objects, operations, and contexts comparable to those encountered in the exploration of natural environments. The Exploration Metaphor, under development for user interfaces in support of NASA's planetary exploration missions and goals, will also benefit other applications where complex spatial information must be visualized. Visualization methods and systems for planetary exploration are becoming increasingly integrated and interactive as computing technology improves. These advances will benefit from virtual environment and telepresence interface technology. A key development has been the processing of multiple images and other sensor data to create detailed digital models of the planets and moons. Data from images of the Earth, Mars, and Miranda, for example, have been converted into 3D models, and dynamic virtual fly-overs have been computed as demonstrations. Similar processing of lower-altitude photography and the use of computer-aided design tools promise to produce very detailed models in the future.

    A user interface for terrain modelling in virtual reality using a head mounted display

    The increased commercial availability of virtual reality (VR) devices has resulted in more content being created for virtual environments (VEs). This content creation has mainly taken place using traditional desktop systems, but certain applications are now integrating VR into the creation pipeline. Therefore, we examine the effectiveness of creating content, specifically designing terrains, for use in immersive environments using VR technology. To do this, we develop a VR interface for terrain creation based on an existing desktop application. The interface incorporates a head-mounted display and 6-degree-of-freedom controllers. This allows user controls to be mapped to more natural movements compared to the abstract controls in mouse-and-keyboard-based systems. It also means that users can view the terrain in full 3D due to the inherent stereoscopy of the VR display. The interface goes through three iterations of user-centred design and testing, resulting in paper and low-fidelity prototypes being created before the final interface is developed. The performance of this final VR interface is then compared to the desktop interface on which it was based. We carry out user tests to assess the performance of each interface in terms of speed, accuracy, and usability. From our results we find that there is no significant difference between the interfaces when it comes to accuracy, but that the desktop interface is superior in terms of speed, while the VR interface was rated as having higher usability. Some of the possible reasons for these results, such as users preferring the natural interactions offered by the VR interface but not having sufficient training to fully take advantage of them, are discussed. Finally, we conclude that while neither interface was shown to be clearly superior, there is certainly room for further exploration of this research area. Recommendations for how to incorporate lessons learned during the creation of this dissertation into any further research are also made.
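The core interaction the abstract describes is mapping a controller's tracked position onto terrain-editing operations. A minimal sketch of one such operation, a terrain-raising brush applied at the controller's projected position on a height map, is shown below; the function name, the Gaussian falloff, and all parameters are hypothetical illustrations, not the dissertation's implementation.

```python
import math

def apply_brush(heightmap, cx, cy, radius=3.0, strength=1.0):
    """Raise the terrain around (cx, cy) with a Gaussian falloff.

    In a VR editor, (cx, cy) would come from projecting the 6-DOF
    controller's tracked position onto the terrain plane each frame.
    """
    for y, row in enumerate(heightmap):
        for x in range(len(row)):
            d2 = (x - cx) ** 2 + (y - cy) ** 2
            row[x] += strength * math.exp(-d2 / (2 * radius ** 2))
    return heightmap

# One brush stroke at the centre of an 8x8 flat terrain.
terrain = [[0.0] * 8 for _ in range(8)]
apply_brush(terrain, cx=4, cy=4)
print(round(terrain[4][4], 3))  # peak at the brush centre: 1.0
```

A desktop editor would drive `cx, cy` from the mouse cursor instead; the brush logic itself is identical, which is why the comparison in the study can isolate the input device rather than the editing model.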