
    The Visualization of Historical Structures and Data in a 3D Virtual City

    Google Earth is a powerful tool that allows users to navigate through 3D representations of many cities and places all over the world. Google Earth hosts a huge collection of 3D models, and the collection continues to grow as users around the world contribute new models. As new buildings are built, new models are created. But what happens when a new building replaces another? The same thing that happens in reality happens in Google Earth: old models are replaced with new ones. While Google Earth shows the most current data, many users would also benefit from being able to view historical data. Google Earth acknowledges this need by letting users view historical imagery via a time slider. However, this feature does not apply to 3D building models, which remain in the environment even when viewing a time before they existed. I would like to build on this concept by proposing a system that stores 3D models of historical buildings that have been demolished and replaced by new developments. People may want to view the cities they grew up in, which may have undergone sweeping redevelopment over the years; old neighborhoods may be completely transformed with new roads and buildings. In addition to viewing historical buildings, users may want to view statistics for a given area. Such data can be viewed in raw form, but 3D visualization of statistical data allows a greater understanding and appreciation of historical change. I propose to enhance the visualization of the 3D world by allowing users to graphically view statistical data such as population, ethnic groups, education, crime, and income. With this feature, users will be able to see not only physical changes in the environment but also statistical changes over time.
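    The time-slider behaviour this proposal relies on already exists in KML, which Google Earth reads natively: a TimeSpan element bounds when a feature is shown. Below is a minimal Python sketch of generating such an entry; the building name, dates, coordinates, and COLLADA model path are hypothetical placeholders, and a real system would also need altitude, orientation, and scale.

```python
# Sketch: emit a KML Placemark that ties a 3D building model to its
# historical lifespan, so Google Earth's time slider hides it outside
# that interval. All concrete values below are hypothetical.
KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>{name}</name>
    <TimeSpan><begin>{begin}</begin><end>{end}</end></TimeSpan>
    <Model>
      <Location>
        <longitude>{lon}</longitude>
        <latitude>{lat}</latitude>
      </Location>
      <Link><href>{model_href}</href></Link>
    </Model>
  </Placemark>
</kml>
"""

def historical_building_kml(name, begin, end, lon, lat, model_href):
    """Render a time-bounded KML entry for a demolished building."""
    return KML_TEMPLATE.format(name=name, begin=begin, end=end,
                               lon=lon, lat=lat, model_href=model_href)

if __name__ == "__main__":
    print(historical_building_kml("Old Market Hall", "1921", "1986",
                                  -122.40, 37.77, "models/market_hall.dae"))
```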

    DARIAH and the Benelux


    Web-based visualization for 3D data in archaeology : The ADS 3D viewer

    The solid geometry of archaeological deposits is fundamental to the interpretation of their chronological sequence. However, such stratigraphic sequences are generally viewed as static two-dimensional diagrammatic representations, which are difficult to manipulate or to relate to real layers. The ADS 3D Viewer is a web-based resource for the management and analysis of archaeological data. The viewer was developed to take advantage of recent developments in web technology, namely the adoption of WebGL (Web Graphics Library) by current web browsers. The ADS 3D Viewer combines the potential of the 3D Heritage Online Presenter (3DHOP), a software package for the web-based visualization of 3D geometries, with the infrastructure of the Archaeology Data Service (ADS) repository, in an attempt to create a platform for the visualization and analysis of 3D data archived by the ADS. Two versions of the viewer have been developed to answer the needs of different users. The first, the Object Level 3D Viewer, extends the browsing capability of ADS project archives by enabling the visualization of single 3D models. The second, the Stratigraphy 3D Viewer, is an extension that allows the exploration of a specific kind of aggregated data: the multiple layers of an archaeological stratigraphic sequence. This allows those unable to participate directly in the fieldwork to access, analyse and re-interpret the archaeological context remotely. It has the potential to transform the discipline, allowing inter-disciplinary, cross-border and ‘at-distance’ collaborative workflows, and enabling easier access to and analysis of archaeological data.
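    The core idea of the Stratigraphy 3D Viewer, peeling back deposits in their recorded order, can be illustrated with a small data-structure sketch. This is purely illustrative Python, not the ADS implementation (which runs on 3DHOP/WebGL in the browser); the context identifiers and mesh URLs are hypothetical.

```python
# Illustrative sketch: a stratigraphic sequence as an ordered stack of
# layer meshes whose visibility can be peeled back one deposit at a
# time, mimicking excavation. Not the ADS/3DHOP implementation.
from dataclasses import dataclass, field

@dataclass
class Layer:
    context_id: str   # excavation context identifier (hypothetical)
    mesh_url: str     # URL of the archived 3D geometry (hypothetical)
    visible: bool = True

@dataclass
class StratigraphicSequence:
    # Layers ordered from latest (top) to earliest (bottom) deposit.
    layers: list = field(default_factory=list)

    def peel_to(self, depth: int) -> None:
        """Show only layers at or below `depth`, hiding later deposits."""
        for i, layer in enumerate(self.layers):
            layer.visible = i >= depth

seq = StratigraphicSequence([
    Layer("C101", "archive/c101.nxs"),
    Layer("C102", "archive/c102.nxs"),
    Layer("C103", "archive/c103.nxs"),
])
seq.peel_to(1)  # hide the most recent deposit to expose earlier ones
print([(layer.context_id, layer.visible) for layer in seq.layers])
```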

    A Community-Driven Validation Service for Standard Medical Imaging Objects

    Digital medical imaging laboratories contain many distinct types of equipment provided by different manufacturers. Interoperability is a critical issue, and the DICOM protocol is a de facto standard in those environments. However, manufacturers' implementations of the standard may have non-conformities at several levels, which hinder systems integration. Moreover, medical staff may introduce data inconsistencies when entering data. Such situations severely affect the quality of healthcare services, since they can disrupt system operations. Software able to confirm data quality and compliance with the DICOM standard is therefore important for programmers, IT staff and healthcare technicians. Although a few solutions try to accomplish this goal, they are unable to deal with certain situations that require user input. Furthermore, these cases usually require the setup of a working environment, which makes the sharing of validation information more difficult. This article proposes and describes the development of a Web DICOM validation service for the community. The solution requires no configuration by the user, promotes the shareability of validation results in the community, and preserves patient data privacy, since files are de-identified on the client side.
    Comment: Computer Standards & Interfaces, 201
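    The client-side de-identification step could look something like the following sketch using pydicom; this is our assumption about tooling (the article does not name a library), and the tag list is illustrative, far short of a complete DICOM de-identification profile.

```python
# Minimal sketch of de-identifying a DICOM file before upload, using
# pydicom (an assumed library choice). The tag list is illustrative
# only, not a full de-identification profile.
import pydicom

IDENTIFYING_TAGS = ["PatientName", "PatientID", "PatientBirthDate",
                    "PatientAddress", "OtherPatientIDs"]

def deidentify(path_in: str, path_out: str) -> None:
    ds = pydicom.dcmread(path_in)
    for tag in IDENTIFYING_TAGS:
        if tag in ds:
            setattr(ds, tag, "")    # blank out direct identifiers
    ds.remove_private_tags()        # drop vendor-specific private data
    ds.save_as(path_out)

if __name__ == "__main__":
    deidentify("study.dcm", "study_anon.dcm")  # hypothetical filenames
```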

    Mathematical practice, crowdsourcing, and social machines

    The highest level of mathematics has traditionally been seen as a solitary endeavour: producing a proof for review and acceptance by research peers. Mathematics is now at a remarkable inflexion point, with new technology radically extending the power and limits of individuals. Crowdsourcing pulls together diverse experts to solve problems; symbolic computation tackles huge routine calculations; and computers check proofs too long and complicated for humans to comprehend. Mathematical practice is an emerging interdisciplinary field which draws on philosophy and social science to understand how mathematics is produced. Online mathematical activity provides a novel and rich source of data for empirical investigation of mathematical practice: for example, the community question-answering system mathoverflow contains around 40,000 mathematical conversations, and polymath collaborations provide transcripts of the process of discovering proofs. Our preliminary investigations have demonstrated the importance of "soft" aspects such as analogy and creativity, alongside deduction and proof, in the production of mathematics, and have given us new ways to think about the roles of people and machines in creating new mathematical knowledge. We discuss further investigation of these resources and what it might reveal. Crowdsourced mathematical activity is an example of a "social machine", a new paradigm, identified by Berners-Lee, for viewing a combination of people and computers as a single problem-solving entity, and the subject of major international research endeavours. We outline a future research agenda for mathematics social machines, a combination of people, computers, and mathematical archives to create and apply mathematics, with the potential to change the way people do mathematics, and to transform the reach, pace, and impact of mathematics research.
    Comment: To appear, Springer LNCS, Proceedings of Conferences on Intelligent Computer Mathematics, CICM 2013, July 2013, Bath, UK
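    As one illustration of how such conversational data might be gathered (our sketch; the authors do not describe their pipeline), mathoverflow content is exposed through the public Stack Exchange API:

```python
# Sketch of harvesting mathoverflow questions for empirical study of
# mathematical practice, via the public Stack Exchange API. The choice
# of tooling is our assumption, not the authors' described method.
import requests

API = "https://api.stackexchange.com/2.3/questions"

def fetch_questions(page: int = 1, pagesize: int = 100) -> list:
    resp = requests.get(API, params={
        "site": "mathoverflow.net",
        "page": page,
        "pagesize": pagesize,
        "sort": "activity",
    })
    resp.raise_for_status()
    return resp.json()["items"]

for q in fetch_questions(pagesize=5):
    print(q["title"], "-", q["answer_count"], "answers")
```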

    Horizon Report 2009

    The annual Horizon Report investigates, identifies and ranks the emerging technologies that the experts who produce it expect to have an impact on teaching and learning, research, and creative production in higher education. It also examines the key trends that help anticipate how those technologies will be used, and the challenges they pose for the classroom. Each edition identifies six technologies or practices: two whose use is expected to emerge in the immediate future (one year or less), two expected at mid-term (two to three years), and two foreseen over a longer horizon (five years).

    Web archives: the future

    This report is structured, first, to engage in some speculative thought about the possible futures of the web, as an exercise in prompting us to think about what we need to do now in order to make sure that we can reliably and fruitfully use archives of the web in the future. Next, we turn to the methods and tools being used to research the live web, as a pointer to the types of things that can be developed to help understand the archived web. Then, we turn to a series of topics and questions that researchers want, or may want, to address using the archived web. In this final section, we identify some of the challenges individuals, organizations, and international bodies can target to increase our ability to explore these topics and answer these questions. We end the report with some conclusions based on what we have learned from this exercise.

    Computing and data processing

    The applications of computers and data processing to astronomy are discussed. Among the topics covered are the emerging national information infrastructure, workstations and supercomputers, supertelescopes, digital astronomy, astrophysics in a numerical laboratory, community software, archiving of ground-based observations, dynamical simulations of complex systems, plasma astrophysics, and the remote control of fourth dimension supercomputers.

    Suitability of Unidata Metapps for Incorporation in Platform-Independent User-Customized Aviation Weather Products Generation Software

    The Air Force Combat Climatology Center (AFCCC) is tasked to provide long-range seasonal forecasts for worldwide locations. Currently, the best long-range temperature forecasts available to the weather community are the climatological standard normals. This study creates a stepping-stone toward long-range forecasting by finding a process that predicts temperatures better than climatological standard normals or simple frequency distributions of occurrences. Northern Hemisphere teleconnection indices and the standardized Southern Oscillation Index are statistically compared to three-month summed Heating Degree Days (HDDs) and Cooling Degree Days (CDDs) at 14 U.S. locations. First, linear regression was performed. The results showed numerous valid models; however, the percentage of variance resolved by the models was rarely over 30%. The HDDs and CDDs were then analyzed with data-mining classification trees; however, it proved difficult to extract any predictive quantitative information from the results. Finally, a data-mining regression tree analysis was performed. At each conditional outcome, a range of HDDs/CDDs is produced using the predicted standard deviations about the mean. For verification, independent teleconnection indices were used as predictors in the conditional model; 90% of the resulting HDDs/CDDs fell into the calculated range. The overall average reduction in the forecast range was 35.7% relative to climatology.
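    A regression-tree setup of the kind described might look like the following sketch; this is our reconstruction on synthetic data (the study's actual software, stations, and predictor set are not specified here), using scikit-learn and a one-standard-deviation range per leaf purely for illustration.

```python
# Sketch of the regression-tree step with synthetic data (our
# reconstruction; not the study's actual tooling or data).
# Predictors: monthly teleconnection indices; target: three-month
# summed heating degree days (HDDs) at one hypothetical station.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 300
X = rng.normal(size=(n, 4))  # e.g. NAO, PNA, EA, SOI (illustrative)
y = 1200 + 150 * X[:, 0] - 90 * X[:, 3] + rng.normal(0, 80, size=n)

X_train, X_test = X[:200], X[200:]
y_train, y_test = y[:200], y[200:]

tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=20,
                             random_state=0)
tree.fit(X_train, y_train)

# Per-leaf standard deviation of the training targets: each conditional
# outcome yields a mean +/- sigma forecast range, analogous to the
# ranges the study verifies against independent data.
train_leaves = tree.apply(X_train)
leaf_std = {leaf: y_train[train_leaves == leaf].std()
            for leaf in np.unique(train_leaves)}

test_leaves = tree.apply(X_test)
mean = tree.predict(X_test)
sigma = np.array([leaf_std[leaf] for leaf in test_leaves])
coverage = np.mean((y_test >= mean - sigma) & (y_test <= mean + sigma))
print(f"fraction of independent cases inside the range: {coverage:.0%}")
```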