125 research outputs found

    trackr: A Framework for Enhancing Discoverability and Reproducibility of Data Visualizations and Other Artifacts in R

    Research is an incremental, iterative process, with new results relying and building upon previous ones. Scientists need to find, retrieve, understand, and verify results in order to confidently extend them, even when the results are their own. We present the trackr framework for organizing, automatically annotating, discovering, and retrieving results. We identify sources of automatically extractable metadata for computational results, and we define an extensible system for organizing, annotating, and searching for results based on these and other metadata. We present an open-source implementation of these concepts for plots, computational artifacts, and woven dynamic reports generated in the R statistical computing language
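    The workflow described above (record a result, attach automatically extracted metadata, then search over that metadata) can be illustrated with a small sketch. The Python snippet below is a hypothetical analogue for illustration only; it is not trackr's R API, and the ArtifactStore, record, and find names are invented.

        import hashlib
        import json
        import time

        class ArtifactStore:
            """Hypothetical in-memory analogue of a trackr-style artifact index."""

            def __init__(self):
                self.records = []

            def record(self, path, user_tags=()):
                """Store one result file together with automatically extracted metadata."""
                with open(path, "rb") as fh:
                    data = fh.read()
                meta = {
                    "path": path,
                    "sha1": hashlib.sha1(data).hexdigest(),            # provenance / identity
                    "bytes": len(data),
                    "recorded_at": time.strftime("%Y-%m-%dT%H:%M:%S"),
                    "tags": list(user_tags),                           # user-supplied annotations
                }
                self.records.append(meta)
                return meta

            def find(self, term):
                """Naive full-text search over all stored metadata."""
                return [m for m in self.records if term.lower() in json.dumps(m).lower()]

    For example, calling store.record("figure1.png", ["survival"]) and later store.find("survival") would retrieve the plot's record; the point, as in the framework above, is that most of the metadata is extracted automatically rather than entered by hand.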

    Encyclopedia of software components

    Intelligent browsing through a collection of reusable software components is facilitated by a computer with a video monitor and a user input interface, such as a keyboard or mouse, for transmitting user selections. The system presents a picture of encyclopedia volumes whose visible labels refer to types of software, following a metaphor in which each volume contains a page listing general topics under that software type and, for each topic, pages listing software components. In response to an initial user selection specifying a volume, the picture is altered to open that volume and display its page of general topics; in response to a next selection specifying a topic, the picture is altered to display the page listing the software components under that topic; and in response to a further selection specifying a component, a set of informative plates depicting different types of information about that component is presented
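    The browsing metaphor amounts to a three-level drill-down: volume (software type), then general topic, then component, with a final set of informative plates per component. A minimal Python sketch of that hierarchy, with invented example entries, might look like this:

        # Hypothetical three-level catalogue mirroring the encyclopedia metaphor:
        # volume (software type) -> general topic -> list of reusable components.
        encyclopedia = {
            "Numerical software": {
                "Linear algebra": ["matrix_invert", "lu_decompose"],
                "Interpolation": ["cubic_spline"],
            },
            "Graphics software": {
                "Rasterization": ["draw_line", "fill_polygon"],
            },
        }

        def open_volume(volume):
            """First selection: the page listing general topics in one volume."""
            return sorted(encyclopedia[volume])

        def open_topic(volume, topic):
            """Second selection: the page listing components under one topic."""
            return encyclopedia[volume][topic]

        def show_component(volume, topic, component):
            """Third selection: the informative plates for one component (stubbed)."""
            assert component in encyclopedia[volume][topic]
            return {"name": component, "plates": ["overview", "interface", "example use"]}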

    KB4VA: A Knowledge Base of Visualization Designs for Visual Analytics

    Visual analytics (VA) systems have been widely used to facilitate decision-making and analytical reasoning in various application domains. VA, which involves visual design, interaction design, and data mining, is a systematic and complex paradigm. In this work, we focus on the design of effective visualizations for complex data and analytical tasks, which is a critical step in designing a VA system. This step is challenging because it requires extensive knowledge about domain problems and visualization to design effective encodings. Existing visualization designs published in top venues are valuable resources to inspire designs for problems with similar data structures and tasks. However, those designs are hard to understand, parse, and retrieve due to the lack of specifications. To address this problem, we build KB4VA, a knowledge base of visualization designs in VA systems with comprehensive labels about their analytical tasks and visual encodings. Our labeling scheme is inspired by a workshop study with 12 VA researchers to learn user requirements in understanding and retrieving professional visualization designs in VA systems. The scheme extends Vega-Lite specifications for describing advanced and composited visualization designs in a declarative manner, thus facilitating human understanding and automatic indexing. To demonstrate the usefulness of our knowledge base, we present a user study about design inspirations for VA tasks. In summary, our work opens new perspectives for enhancing the accessibility and reusability of professional visualization designs
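    Vega-Lite, which the labeling scheme extends, describes a chart declaratively as data, mark, and encoding. The minimal bar-chart specification below (written as a Python dict) uses only standard Vega-Lite fields; KB4VA's extensions for advanced and composited VA designs are not reproduced here.

        # A minimal, standard Vega-Lite specification expressed as a Python dict.
        spec = {
            "$schema": "https://vega.github.io/schema/vega-lite/v5.json",
            "data": {"values": [
                {"category": "A", "count": 28},
                {"category": "B", "count": 55},
            ]},
            "mark": "bar",
            "encoding": {
                "x": {"field": "category", "type": "nominal"},
                "y": {"field": "count", "type": "quantitative"},
            },
        }

        # Declarative specs like this are what make designs machine-indexable:
        # a knowledge base can be queried by mark type, encoding channels, or data types.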

    Studying and solving visual artifacts occurring when procedural texturing with paradoxical requirements

    Textures are images widely used by computer graphics artists to add visual detail to their work. Textures may come from different sources, such as pictures of real-world surfaces, manually created images made with graphics editors, or algorithmic processes. “Procedural texturing” refers to the creation of textures using algorithmic processes. Procedural textures offer many advantages, including the ability to manipulate their appearance through parameters. Many applications rely on changing those parameters to evolve the look of the texture over time or space. This often introduces requirements that conflict with the structure of the unaltered texture, frequently resulting in visible rendering artifacts. For example, to animate a lava flow the rendered texture should be an effective representation of the simulated flow, but features such as rocks floating on the surface should not be distorted, nor abruptly appear or disappear, as this would disrupt the illusion. Informally, we want our lava texture to “change, but stay the same”. This example is an instance of the consistency problem that arises when changing the parameters of a texture, resulting in noticeable artifacts in the rendered result. In this project, we seek to classify these artifacts according to their causes and their effects on textures, and to determine how we can objectively detect and explain their presence, and so predict their occurrence. Analytical and statistical analyses of procedural texturing processes will be performed in order to find the relation between these processes and the corresponding artifacts
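    The consistency problem can be seen even in a toy procedural function: when a frequency-like parameter changes, features of the pattern move and pop in and out rather than evolving smoothly. The sketch below is a deliberately simplified 1-D example, not one of the textures studied in the project; it only counts how the local maxima of a sinusoidal pattern jump as the parameter varies.

        import math

        def toy_texture(x, freq):
            """A trivial 1-D 'procedural texture': a sum of two sine octaves."""
            return math.sin(freq * x) + 0.5 * math.sin(2.0 * freq * x + 1.3)

        def feature_positions(freq, n=2000, length=10.0):
            """Locate local maxima ('features') of the pattern on [0, length]."""
            xs = [i * length / n for i in range(n + 1)]
            ys = [toy_texture(x, freq) for x in xs]
            return [xs[i] for i in range(1, n) if ys[i - 1] < ys[i] >= ys[i + 1]]

        # A small parameter change moves every feature and changes their count;
        # on an animated texture this shows up as swimming or popping artifacts.
        for f in (3.0, 3.2):
            pos = feature_positions(f)
            print(f"freq={f}: {len(pos)} features, first at x={pos[0]:.2f}")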

    Benchmarking CAD search techniques


    Application of a three-dimensional color laser scanner to paleontology: An interactive model of a juvenile Tylosaurus SP. Basisphenoid-basioccipital

    Three-dimensional (3D) modeling has always been an important part of paleontological research and interpretation, though digital reproductions of fossils are a recent phenomenon. A highly accurate, interactive, 100 μm resolution, 3D digital model of a fossilized basisphenoid-basioccipital from a juvenile Tylosaurus sp. mosasaur was generated using a 3D laser scanner and manipulated using VRML and InnovMetric polygon files. This 3D model supports varying levels of magnification depending on the initial scan resolution and the amount of post-production polygon reduction. Generating these 3D models is relatively simple because the software and technology for doing so are mature. At present, complex 3D models require powerful computers in order to manipulate their computer graphic substructures, but as computer technology improves, digital 3D scanning could prove invaluable for creating and sharing virtual copies of fossil material. Primary results of this study indicate that for most paleontological applications a 100 μm scan resolution is acceptable. Copyright: Society for Vertebrate Paleontology, 15 November 2000
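    The post-production polygon reduction mentioned above trades triangle count for interactivity. A crude, self-contained vertex-clustering sketch of the idea is shown below; it is illustrative only and much simpler than the decimation used by production tools such as InnovMetric's software.

        def cluster_decimate(vertices, triangles, cell=1.0):
            """Crude vertex-clustering decimation: snap vertices to a grid of size
            `cell`, merge vertices sharing a cell, and drop collapsed triangles."""
            key_to_new = {}   # grid cell -> new vertex index
            remap = []        # old vertex index -> new vertex index
            new_vertices = []
            for (x, y, z) in vertices:
                key = (round(x / cell), round(y / cell), round(z / cell))
                if key not in key_to_new:
                    key_to_new[key] = len(new_vertices)
                    new_vertices.append((key[0] * cell, key[1] * cell, key[2] * cell))
                remap.append(key_to_new[key])
            new_triangles = []
            for (a, b, c) in triangles:
                a, b, c = remap[a], remap[b], remap[c]
                if a != b and b != c and a != c:   # keep non-degenerate faces only
                    new_triangles.append((a, b, c))
            return new_vertices, new_triangles

        # A larger `cell` gives fewer vertices and triangles, hence a lighter
        # interactive model, at the cost of geometric detail.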

    Excavations at 41LK67, a Prehistoric Site in the Choke Canyon Reservoir, South Texas

    In 1977-1978, excavations were conducted at 41 LK 67 in Live Oak County, south Texas, by the Center for Archaeological Research, The University of Texas at San Antonio. The investigation of this prehistoric archaeological site was part of an extensive program of reconnaissance and excavation necessitated by the construction of the Choke Canyon Reservoir on the Frio River by the Bureau of Reclamation. The site is situated in shallow colluvial deposits capping an old terrace remnant of the Frio River. The excavations involved 193 m² in three separate areas and revealed Late Prehistoric and Late Archaic components. Recognizably older artifacts (including patinated chert flakes) from the surface and from excavations may represent older disturbed components or artifacts collected prehistorically from nearby sites. Radiocarbon dates, with medians ranging from 1590 to 660 B.C. (MASCA correction), are available only from the Late Archaic component. The principal kinds of debris recovered from the excavations are fire-cracked rock, cores and chipping debris, shells of snails and freshwater mussels, plainware potsherds, and chipped stone tools. Mussel shell was surprisingly abundant; more than 9000 specimens, including 3000 identified taxonomically, were recovered. Fish otoliths were the only animal bones preserved, except for a few recent, intrusive elements. Debris frequencies from the two larger excavation blocks (Areas A and B) were factor analyzed. In most cases the analysis showed the strongest covariation occurring among different classes of chipping debris. For Area C, factor analysis indicated that the strongest spatial patterning occurred in the upper part of the deposits. Unfortunately, the analysis was not particularly successful in defining activity sets. The small collection of chipped stone tools was examined microscopically. Two tool classes in particular, distally beveled tools (gouges) and quadrilateral bifaces (beveled knives), seem to represent more functionally specific tool forms, but other hafted bifaces (projectile points) show a wide range of use wear mostly unrelated to projectile use
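    The factor analysis of debris frequencies described above looks for debris classes whose counts covary across excavation units. The sketch below shows that kind of analysis on made-up counts, using scikit-learn's FactorAnalysis rather than whatever software the original study used.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        # Rows are excavation units, columns are debris classes (invented counts):
        # fire-cracked rock, cores, chipping debris, mussel shell, potsherds.
        counts = np.array([
            [12, 2,  40, 110, 1],
            [ 9, 1,  35,  90, 0],
            [30, 5, 120,  20, 3],
            [28, 4, 110,  15, 2],
            [ 5, 0,  10,  60, 0],
            [22, 3,  95,  18, 1],
        ])

        fa = FactorAnalysis(n_components=2, random_state=0)
        scores = fa.fit_transform(counts)   # unit scores on the two factors
        loadings = fa.components_           # how each debris class loads on each factor

        # Cores and chipping debris loading heavily on the same factor would be the
        # kind of covariation among lithic-reduction debris the report describes.
        print(np.round(loadings, 2))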