Enhancing the Jacquez k Nearest Neighbor Test for Space-Time Interaction
The Jacquez k nearest neighbor test, originally developed to improve upon shortcomings of existing tests for space-time interaction, has been shown to be a robust and powerful method of detecting interaction. Despite its flexibility and power, however, the test has three main shortcomings: (1) it discards important information regarding the spatial and temporal scale at which detected interaction takes place; (2) the results of the test have not been visualized; (3) recent research demonstrates the test to be susceptible to population shift bias. This study presents enhancements to the Jacquez k nearest neighbors test with the goal of addressing each of these three shortcomings and improving the utility of the test. Data on Burkitt's lymphoma cases in Uganda between 1961 and 1975 are employed to illustrate the modifications and enhance the visual output of the test. Output from the enhanced test is compared to that provided by alternative tests of space-time interaction. Results show the enhancements presented in this study transform the Jacquez test into a complete, descriptive, and informative metric that can be used as a stand-alone measure of global space-time interaction.
Keywords: space-time interaction, Jacquez k nearest neighbor, visualization, space-time cube, population shift bias
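The Jacquez test counts case pairs that are k nearest neighbors in both space and time, with significance assessed by permuting the time labels. A minimal sketch of that statistic follows; it is illustrative only, the function and variable names are ours, and the published test includes refinements (such as the population shift bias correction discussed above) that are omitted here:

```python
import numpy as np

def knn_sets(points, k):
    """For each point, return the index set of its k nearest neighbors."""
    # pairwise Euclidean distances; points has shape (n, d)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)  # a point is not its own neighbor
    order = np.argsort(d, axis=1)[:, :k]
    return [set(row) for row in order]

def jacquez_knn_stat(coords, times, k):
    """J_k: count of case pairs that are k nearest neighbors in both space and time."""
    s_nn = knn_sets(np.asarray(coords, float), k)
    t_nn = knn_sets(np.asarray(times, float).reshape(-1, 1), k)
    return sum(len(s & t) for s, t in zip(s_nn, t_nn))

def permutation_pvalue(coords, times, k, n_perm=199, seed=0):
    """Monte Carlo p-value: permuting times breaks any space-time link."""
    rng = np.random.default_rng(seed)
    observed = jacquez_knn_stat(coords, times, k)
    exceed = sum(
        jacquez_knn_stat(coords, rng.permutation(np.asarray(times, float)), k) >= observed
        for _ in range(n_perm)
    )
    return (exceed + 1) / (n_perm + 1)
```

With two case clusters that coincide in space and time, J_k is large relative to its permutation distribution, which is the signal of interaction the test looks for.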
Visualizing the Past: Tools and Techniques for Understanding Historical Processes
The University of Richmond requests a Level I Digital Humanities Start-Up grant to bring together experts to investigate how to overcome the limitations that prevent most humanities scholars from taking advantage of visualization techniques in their research. The grant will fund a two-day workshop where invited scholars will discuss current work on visualizing historical processes and together consider: (1) How can we harness emerging cyber-infrastructure tools and interoperability standards to explore, visualize, and analyze spatial and temporal components of distributed digital archives to better understand historical events and processes? (2) How can user-friendly tools or web sites be created to allow scholars and researchers to animate spatial and temporal data housed on different systems across the Internet? The grant will also fund initial experiments toward creating new tools for overcoming obstacles to data visualization work. Results will be presented as a white paper.
Communicating thematic data quality with web map services
Geospatial information of many kinds, from topographic maps to scientific data, is increasingly being made available through web mapping services. These allow georeferenced map images to be served from data stores and displayed in websites and geographic information systems, where they can be integrated with other geographic information. The Open Geospatial Consortium's Web Map Service (WMS) standard has been widely adopted in diverse communities for sharing data in this way. However, current services typically provide little or no information about the quality or accuracy of the data they serve. In this paper we describe the design and implementation of a new "quality-enabled" profile of WMS, which we call "WMS-Q". This profile describes how information about data quality can be conveyed to the user through WMS. Such information can exist at many levels, from entire datasets to individual measurements, and includes the many different ways in which data uncertainty can be expressed. We also describe proposed extensions to the Symbology Encoding specification, which include provision for visualizing uncertainty in raster data in a number of ways, including contours, shading, and bivariate colour maps. Finally, we describe new open-source implementations of the new specifications, including both clients and servers.
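At the protocol level, such an exchange is still an ordinary WMS GetMap request; a quality-enabled service would expose additional layers or styles for the uncertainty rendering. A minimal sketch of building such a request is shown below; the layer and style names are hypothetical and are not taken from the WMS-Q specification:

```python
from urllib.parse import urlencode

def getmap_url(base, layer, style, bbox, size=(512, 512)):
    """Build a standard WMS 1.3.0 GetMap request URL.

    For EPSG:4326 under WMS 1.3.0, BBOX axis order is
    (min_lat, min_lon, max_lat, max_lon).
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": style,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

# Hypothetical names: a data layer plus a companion uncertainty
# rendering (e.g. shading), in the spirit of WMS-Q.
url = getmap_url(
    "https://example.org/wms",
    layer="sst_mean",
    style="uncertainty_shading",
    bbox=(-90, -180, 90, 180),
)
```

A client that understands the quality profile could request the same layer twice, once with a plain style and once with an uncertainty style, and overlay the results.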
WorldMap – A Geospatial Framework for Collaborative Research
WorldMap is a web-based, map-centric data exploration system built on open-source geospatial technology at Harvard University. It is designed to serve collaborative research and teaching, but is also accessible to the general public. This article explains WorldMap's basic functions through several historical research projects, demonstrating its flexible scale (from neighborhood to continent) and diverse research themes (social, political, economic, cultural, infrastructural, etc.). Also shared in this article are our experiences in handling technical and institutional challenges during system development, such as synchronization of software components being developed by multiple organizations; juggling competing priorities for serving individual requests and developing a system that will enable users to support themselves; balancing promotion of system usage with constraints on infrastructure investment; harnessing volunteered geographic information while managing data quality; as well as protecting copyrights, preserving permanent links and citations, and providing long-term archiving.
Seafloor characterization using airborne hyperspectral co-registration procedures independent from attitude and positioning sensors
Remote-sensing technology and data-storage capabilities have progressed over the last decade to the point of commercial multi-sensor data collection. There is a constant need to characterize, quantify, and monitor coastal areas for habitat research and coastal management. In this paper, we present work on seafloor characterization that uses hyperspectral imagery (HSI). The HSI data allow the operator to extend seafloor characterization from multibeam backscatter towards land, creating a seamless ocean-to-land characterization of the littoral zone.
Sensor Data Visualization in Virtual Globe
Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.
With recent developments related to sensors in matters of standardization and accessibility, valuable data covering different geographical subjects have become widely available. The applications that can leverage sensor data are still under development, and there is much to do on this subject in the scientific community. Data visualization tools are one of the most immediately relevant needs related to sensor data. Such tools would help increase the understanding and exploration of the data, from which many other fields can benefit.
Virtual globes are becoming increasingly popular in society. The existence of several implementations and millions of users (scientific and non-scientific) around the world is proof of their increasing usability as tools for representing and sharing geographical content.
In this document we present a generic tool for visualizing sensor data retrieved from SOS servers over the NASA World Wind virtual globe. For this, we started by creating a classification of sensor data that helps in defining possible visualizations for the different types of sensor data. Using this classification as a basis, we have implemented a set of visualization types to ease sensor data exploration. We also included analysis capabilities by integrating the SEXTANTE library in the visualization tool. The results of the analysis can be included in the virtual globe as part of the visualizations.
The Billion Object Platform (BOP): a system to lower barriers to support big, streaming, spatio-temporal data sources
With funding from the Sloan Foundation and Harvard Dataverse, the Harvard Center for Geographic Analysis (CGA) has developed a big spatio-temporal data visualization platform called the Billion Object Platform, or BOP. The goal of the project is to lower barriers for scholars who wish to access large, streaming, spatio-temporal datasets. Once archived, streaming data gets big fast, and since most GIS systems don't support interactive visualization of millions of objects, a new platform was needed. The BOP is loaded with the latest billion geo-tweets and is fed a real-time stream of about 1 million tweets per day. The CGA has been harvesting and archiving geo-tweets since 2012. As tweets flow into the BOP, they are enriched with sentiment and census information to support further analysis. Incoming and intermediate data are streamed and stored in Apache Kafka. The core of the BOP is Apache Solr, which supports fast search. Some significant enhancements were made to Solr (and contributed back), notably 2D heatmap faceting to support spatial visualization. The BOP fronts Solr with a RESTful web service, which provides a friendly and secure API accessed from a browser-based client. The client dynamically displays temporal and spatial distributions for result sets containing hundreds of millions of features. The system is open source and runs on commodity hardware. It is hosted on the Massachusetts Open Cloud (MOC), an OpenStack environment. All components are deployed in Docker, orchestrated by Kontena.
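Solr's heatmap faceting (the `facet.heatmap` parameter family) returns a grid of counts over a bounding region, which a client can render as a spatial density surface; this is the kind of 2D faceting the BOP work enhanced. A sketch of building such a query follows, with the collection and field names invented for illustration:

```python
from urllib.parse import urlencode

def heatmap_query(base, field, geom, grid_level=4, q="*:*"):
    """Build a Solr spatial heatmap facet request URL."""
    params = {
        "q": q,
        "rows": 0,                        # we only want the facet grid, not documents
        "facet": "true",
        "facet.heatmap": field,           # spatial field to facet on
        "facet.heatmap.geom": geom,       # bounding region of interest
        "facet.heatmap.gridLevel": grid_level,  # grid resolution
        "wt": "json",
    }
    return base + "/select?" + urlencode(params)

# Hypothetical collection and field names for illustration.
url = heatmap_query(
    "http://localhost:8983/solr/geotweets",
    field="coord_rpt",
    geom='["-180 -90" TO "180 90"]',
)
```

The response contains a `counts_ints2D` grid of cell counts; a browser client can colorize that grid directly as a heatmap layer rather than fetching millions of individual points.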
A comparison of GIS packages for geospatial data pre-processing
This paper assesses the choice of geographic information systems (GIS) used to pre-process geospatial data sets. The study conducted a comparative review of some of the commonly used GIS packages with the aim of proposing the most reliable in terms of consistency, functionality, user-friendliness, and cost-effectiveness, which are the determinants in adopting any GIS package. Through this systematic assessment, both current and potential users will be able to take full advantage of the most efficient GIS package to perform various analytical pre-processing tasks. The outcome of the assessment could be adopted as a guide for selecting an appropriate and reliable open source GIS platform for timely and efficient pre-processing of geospatial data for environmental analysis.
ANALYTiC: Understanding Decision Boundaries and Dimensionality Reduction in Machine Learning
The advent of compact, handheld devices has given us a pool of tracked movement data that could be used to infer trends and patterns and be put to use. With this flood of trajectory data from animals, humans, vehicles, etc., the idea of ANALYTiC originated: using active learning to infer semantic annotations from trajectories by learning from sets of labeled data. This study explores the application of dimensionality reduction and decision boundaries in combination with the already present active learning, highlighting patterns and clusters in data. We test these features on three different trajectory datasets with the objective of exploiting the already labeled data and enhancing their interpretability. Our experimental analysis exemplifies the potential of these combined methodologies in improving the efficiency and accuracy of trajectory labeling. This study serves as a stepping-stone towards the broader integration of machine learning and visual methods in the context of movement data analysis.
Comment: Bachelor's thesis
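The combination described, projecting high-dimensional trajectory features down to two dimensions and drawing a decision boundary over the projection, can be sketched with plain NumPy. This is our illustrative stand-in, not the ANALYTiC implementation: it uses PCA via SVD for the reduction and a nearest-centroid rule as the simplest possible boundary.

```python
import numpy as np

def pca_2d(X):
    """Project feature vectors to 2D via PCA (SVD on centred data)."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:2].T  # coordinates along the two leading components

def nearest_centroid_boundary(X2, labels):
    """A minimal decision rule over the 2D projection: nearest class centroid."""
    classes = np.unique(labels)
    centroids = np.array([X2[labels == c].mean(axis=0) for c in classes])
    d = np.linalg.norm(X2[:, None, :] - centroids[None, :, :], axis=-1)
    return classes[d.argmin(axis=1)]
```

In an active-learning loop, the same 2D projection lets an annotator see which unlabeled trajectories fall near the boundary, i.e. the ones most worth labeling next.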
The Modern Methods of Data Analysis in Social Research: Python Programming Language and its Pandas Library as an Example- a Theoretic Study
This study aims to highlight the significance and the various dimensions of utilizing the Python language in the social sciences. Python is widely recognized as one of the most frequently employed programming languages, primarily due to its ease of learning, its power, and its suitability for data analysis in social research. Particular emphasis is placed on the Pandas library, which holds a prominent position in data analysis for social research: it offers robust tools for analyzing and transforming data and for performing mathematical and statistical operations in an efficient and accessible manner. The study concludes that learning programming languages and using them for data analysis in the social sciences has become crucial for understanding, analyzing, and interpreting contemporary society, as well as for generating new scientific perspectives and producing knowledge that keeps pace with rapid changes in the social field.
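As a small illustration of the kind of Pandas workflow the study describes, grouped descriptive statistics over survey-style data, consider the following sketch; the dataset here is invented for the example:

```python
import pandas as pd

# A tiny, hypothetical survey dataset of the kind common in social research.
df = pd.DataFrame({
    "region": ["north", "north", "south", "south", "south"],
    "age":    [23, 35, 41, 29, 52],
    "income": [1800, 2400, 2100, 1900, 3000],
})

# Descriptive statistics per group: mean income and respondent count by region.
summary = df.groupby("region")["income"].agg(["mean", "count"])
```

A few lines like these replace what would otherwise be manual tabulation, which is precisely the efficiency argument the study makes for programming skills in social research.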