
    Learning sparse representations of depth

    This paper introduces a new method for learning and inferring sparse representations of depth (disparity) maps. The proposed algorithm relaxes the usual assumption of a stationary noise model in sparse coding, enabling learning from data corrupted with spatially varying noise or uncertainty, such as that produced by laser range scanners or structured-light depth cameras. Sparse representations are learned from the Middlebury database disparity maps and then exploited in a two-layer graphical model for inferring depth from stereo, by including a sparsity prior on the learned features. Since they capture higher-order dependencies in the depth structure, these priors can complement the smoothness priors commonly used in depth inference based on Markov Random Field (MRF) models. Inference on the proposed graph is achieved using an alternating iterative optimization technique: the first layer is solved using an existing MRF-based stereo matching algorithm, then held fixed while the second layer is solved using the proposed non-stationary sparse coding algorithm. This leads to a general method for improving the solutions of state-of-the-art MRF-based depth estimation algorithms. Our experimental results first show that depth inference using learned representations leads to state-of-the-art denoising of depth maps obtained from laser range scanners and a time-of-flight camera. Furthermore, we show that adding sparse priors improves the results of two depth estimation methods: the classical graph cut algorithm of Boykov et al. and the more recent algorithm of Woodford et al.
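    The core idea of non-stationary sparse coding can be illustrated with a minimal sketch: a weighted reconstruction term downweights unreliable pixels while an L1 penalty keeps the code sparse. This is an illustrative ISTA-style solver, not the authors' implementation; the dictionary, weights, and parameter values below are invented for the toy example.

    ```python
    import numpy as np

    def soft_threshold(v, t):
        """Elementwise soft-thresholding, the proximal operator of the L1 norm."""
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def weighted_sparse_code(x, D, w, lam=0.1, n_iter=200):
        """Infer a sparse code z for signal x under a non-stationary noise model.

        Minimizes  sum_i w_i * (x_i - (D z)_i)^2 + lam * ||z||_1  with ISTA,
        where weight w_i encodes per-measurement confidence (0 = ignore).
        """
        z = np.zeros(D.shape[1])
        # Step size from the Lipschitz constant of the weighted quadratic term.
        L = 2.0 * np.linalg.norm((D.T * w) @ D, 2)
        for _ in range(n_iter):
            grad = 2.0 * D.T @ (w * (D @ z - x))
            z = soft_threshold(z - grad / L, lam / L)
        return z

    # Toy example: a 16-pixel patch, a 32-atom random dictionary, and weights
    # that zero out the last 4 pixels, which are then heavily corrupted.
    rng = np.random.default_rng(0)
    D = rng.standard_normal((16, 32))
    z_true = np.zeros(32); z_true[[3, 17]] = [1.5, -2.0]
    x = D @ z_true
    w = np.ones(16); w[-4:] = 0.0            # zero confidence in corrupted pixels
    x[-4:] += rng.standard_normal(4) * 5.0   # heavy noise on those pixels
    z = weighted_sparse_code(x, D, w, lam=0.05)
    ```

    Because the corrupted pixels carry zero weight, the recovered code fits the trusted pixels well while simply ignoring the unreliable ones, which is exactly what a stationary noise model cannot do.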

    On the Cost of Negation for Dynamic Pruning

    Negated query terms allow documents containing those terms to be filtered out of a search results list, supporting disambiguation. In this work, the effect of negation on the efficiency of disjunctive top-k retrieval is examined. First, we show how negation can be integrated efficiently into two popular dynamic pruning algorithms. Then, we explore the efficiency of our approach, and show that while it is often efficient, negation can negatively impact dynamic pruning effectiveness for certain queries.
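    The basic semantics of negated terms in disjunctive top-k retrieval can be sketched as follows. This is a naive set-based illustration, not the paper's integration into dynamic pruning: real algorithms such as WAND or MaxScore would interleave the exclusion check with score-bound skipping instead of materializing candidate sets, and the `scores` dictionary here is a stand-in for a real scoring function like BM25.

    ```python
    from heapq import nlargest

    def topk_with_negation(positive, negated, scores, k):
        """Disjunctive top-k over positive term postings, filtering out any
        document that appears in a negated term's postings.

        positive: list of posting lists (doc ids) for non-negated query terms
        negated:  list of posting lists for negated query terms
        scores:   dict mapping doc id -> precomputed score (illustrative)
        """
        excluded = set()
        for plist in negated:
            excluded.update(plist)
        candidates = set()
        for plist in positive:
            candidates.update(d for d in plist if d not in excluded)
        return nlargest(k, candidates, key=lambda d: scores.get(d, 0.0))
    ```

    The efficiency question the abstract raises is visible even here: excluding documents can remove high-scoring candidates, which lowers the score threshold a pruning algorithm relies on and so reduces its ability to skip postings.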

    Reverse k Nearest Neighbor Search over Trajectories

    GPS enables mobile devices to continuously provide new opportunities to improve our daily lives. For example, the data collected in applications created by Uber or Public Transport Authorities can be used to plan transportation routes, estimate capacities, and proactively identify low-coverage areas. In this paper, we study a new kind of query, Reverse k Nearest Neighbor Search over Trajectories (RkNNT), which can be used for route planning and capacity estimation. Given a set of existing routes DR, a set of passenger transitions DT, and a query route Q, an RkNNT query returns all transitions that take Q as one of their k nearest travel routes. To solve the problem, we first develop an index to handle dynamic trajectory updates, so that the most up-to-date transition data are available for answering an RkNNT query. Then we introduce a filter-refinement framework for processing RkNNT queries using the proposed indexes. Next, we show how to use RkNNT to solve the optimal route planning problem MaxRkNNT (MinRkNNT), which is to search for the optimal route from a start location to an end location that could attract the maximum (or minimum) number of passengers based on a pre-defined travel distance threshold. Experiments on real datasets demonstrate the efficiency and scalability of our approaches. To the best of our knowledge, this is the first work to study the RkNNT problem for route planning.
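    The RkNNT semantics can be made concrete with a brute-force baseline: for each transition, rank all routes by distance and keep the transition if Q lands in its top k. This sketch is only the definition, not the paper's index-based filter-refinement method, and the transition-to-route distance used here (sum of endpoint distances to the nearest route vertex) is an invented simplification.

    ```python
    import math

    def dist_point_route(p, route):
        """Distance from a point to a route, taken as the minimum distance
        to any of the route's vertices (a simplification)."""
        return min(math.dist(p, v) for v in route)

    def transition_route_dist(transition, route):
        """Distance between a transition (origin, destination) and a route:
        the sum of the two endpoint-to-route distances."""
        o, d = transition
        return dist_point_route(o, route) + dist_point_route(d, route)

    def rknnt(routes, transitions, q, k):
        """Brute-force RkNNT: return the transitions that have the query
        route q among their k nearest routes."""
        all_routes = routes + [q]
        result = []
        for t in transitions:
            ranked = sorted(all_routes, key=lambda r: transition_route_dist(t, r))
            if any(r is q for r in ranked[:k]):
                result.append(t)
        return result
    ```

    The brute force scans every transition against every route, which is exactly the cost that the paper's dynamic index and filter-refinement framework are designed to avoid.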

    Fair Use and Films in Academic Forums

    A library's Course Reserves department often fields questions about Copyright and Fair Use. Most recently, the Georgia Southern University Libraries have been asked several questions concerning Fair Use and movies. This short presentation will outline how the Course Reserves Department at the Henderson Library complies with Fair Use and Copyright. By following Georgia Southern University's Course Reserves policy, professors are able to share resources with their students in a legal and ethical manner. We will briefly review our process when professors have Copyright questions, including when we bring in legal affairs. This presentation will provide tips for teachers, professors, and students to use when evaluating information.

    Understanding the Transition between High School and College Mathematics and Science

    Mathematics and science education is gaining increasing recognition as key for the well-being of individuals and society. Accordingly, the transition from high school to college is particularly important to ensure that students are prepared for college mathematics and science. The goal of this study was to understand how high school mathematics and science course-taking related to performance in college. Specifically, the study employed a nonparametric regression method to examine the relationship between high school mathematics and science courses and academic performance in college mathematics and science courses. The results provide some evidence of positive benefits from high school course-taking. Namely, students who completed high school trigonometry and lab-based chemistry tended to earn higher grades in college algebra and general chemistry, respectively. However, there was also evidence that high school coursework in biology and physics did not improve course performance in general biology and college physics beyond standardized test scores. Interestingly, students who completed high school calculus earned better grades in general biology. The implications of the findings are discussed for high school curriculum and alignment in standards between high schools and colleges.
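    As one illustration of what a nonparametric regression looks like in this setting (the abstract does not name the specific estimator, so this choice is an assumption), a Nadaraya-Watson kernel regression predicts a college grade as a locally weighted mean of grades from students with similar test scores, without assuming linearity. Fitting it separately for course-takers and non-takers lets one compare predicted grades at the same test score.

    ```python
    import numpy as np

    def nw_regression(x_train, y_train, x_eval, bandwidth=1.0):
        """Nadaraya-Watson kernel regression: predict y at each point in
        x_eval as a Gaussian-weighted mean of y_train, with weights based
        on distance in x (e.g., a standardized test score)."""
        x_train = np.asarray(x_train, float)
        y_train = np.asarray(y_train, float)
        preds = []
        for x0 in np.atleast_1d(x_eval):
            w = np.exp(-0.5 * ((x_train - x0) / bandwidth) ** 2)
            preds.append(np.sum(w * y_train) / np.sum(w))
        return np.array(preds)
    ```

    Because the estimate at each test score is driven only by nearby students, the comparison between course-takers and non-takers is not forced into a single global slope, which is the point of using a nonparametric method here.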

    Enabling Computer Decisions Based on EEG Input

    Multilayer neural networks were successfully trained to classify segments of 12-channel electroencephalogram (EEG) data into one of five classes corresponding to five cognitive tasks performed by a subject. Independent component analysis (ICA) was used to segregate obvious artifact EEG components from other sources, and a frequency-band representation was used for the sources computed by ICA. Example results include an 85% accuracy rate in differentiating between two tasks using a segment of EEG only 0.05 s long, and a 95% accuracy rate using a 0.5-s-long segment.
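    The frequency-band representation step can be sketched as follows: per-channel power is summed within standard EEG bands and the resulting values form the classifier's input vector. This is an illustrative FFT-based version with assumed band edges (theta, alpha, beta), not the paper's exact feature pipeline, and it operates on raw channels rather than ICA sources.

    ```python
    import numpy as np

    def band_power_features(eeg, fs, bands=((4, 8), (8, 13), (13, 30))):
        """Frequency-band power features for one multichannel EEG segment.

        eeg: array of shape (n_channels, n_samples)
        fs:  sampling rate in Hz
        Returns one mean power value per (band, channel), flattened band-major.
        """
        freqs = np.fft.rfftfreq(eeg.shape[1], d=1.0 / fs)
        power = np.abs(np.fft.rfft(eeg, axis=1)) ** 2
        feats = []
        for lo, hi in bands:
            mask = (freqs >= lo) & (freqs < hi)
            feats.append(power[:, mask].mean(axis=1))
        return np.concatenate(feats)
    ```

    A segment only 0.05 s long contains very few samples, so its frequency resolution is coarse; the jump from 85% to 95% accuracy with 0.5 s segments is consistent with the finer band-power estimates a longer window allows.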

    Statistical comparisons of non-deterministic IR systems using two dimensional variance

    Retrieval systems with non-deterministic output are widely used in information retrieval. Common examples include sampling, approximation algorithms, and interactive user input. The effectiveness of such systems differs not just across topics, but also across instances of the system. The inherent variance presents a dilemma: what is the best way to measure the effectiveness of a non-deterministic IR system? Existing approaches to IR evaluation do not consider this problem, or its potential impact on statistical significance. In this paper, we explore how such variance can affect system comparisons, and propose an evaluation framework and methodologies for making such comparisons. Using distributed information retrieval as a case study, we show that our approaches provide a consistent and reliable methodology for comparing the effectiveness of a non-deterministic system against a deterministic system or another non-deterministic system. In addition, we present a statistical best practice that can be used to safely show that a non-deterministic IR system has effectiveness equivalent to another IR system, and to avoid the common pitfall of misusing a lack of significance as proof that two systems have equivalent effectiveness.
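    The "two-dimensional" variance idea can be made concrete with a simple decomposition over a matrix of effectiveness scores, with one row per instance (repeated run) and one column per topic. This split into across-topic and across-instance components is an illustrative decomposition, not necessarily the authors' exact definition.

    ```python
    import numpy as np

    def two_dim_variance(scores):
        """Decompose variability of a non-deterministic system's scores.

        scores: array of shape (n_instances, n_topics), effectiveness of
        each repeated run (instance) on each topic.
        Returns (topic_var, instance_var): the variance of per-topic mean
        scores across topics, and the mean across topics of the
        per-topic variance across instances.
        """
        per_topic_mean = scores.mean(axis=0)
        topic_var = per_topic_mean.var(ddof=1)
        instance_var = scores.var(axis=0, ddof=1).mean()
        return topic_var, instance_var
    ```

    A deterministic system has zero instance variance by construction, so any significance test that treats one run of a non-deterministic system as if it were deterministic silently discards the second dimension.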