
    Geographical and temporal weighted regression (GTWR)

    Both space and time are fundamental in human activities as well as in various physical processes. Spatiotemporal analysis and modeling have long been a major concern of geographical information science (GIScience), environmental science, hydrology, epidemiology, and other research areas. Although the importance of incorporating the temporal dimension into spatial analysis and modeling has been well recognized, challenges still exist given the complexity of spatiotemporal models. Of particular interest in this article is the spatiotemporal modeling of local nonstationary processes. Specifically, an extension of geographically weighted regression (GWR), geographical and temporal weighted regression (GTWR), is developed in order to account for local effects in both space and time. An efficient model calibration approach is proposed for this statistical technique. Using a 19-year set of house price data in London from 1980 to 1998, empirical results from the application of GTWR to hedonic house price modeling demonstrate the effectiveness of the proposed method and its superiority to the traditional GWR approach, highlighting the importance of temporally explicit spatial modeling.
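
    Since GTWR extends GWR by weighting observations in both space and time, a minimal sketch of the local weighted least-squares fit with a Gaussian spatiotemporal kernel may help fix ideas; the kernel form, bandwidths, and variable names below are illustrative assumptions, not the authors' exact calibration scheme.

import numpy as np

def gtwr_fit_point(X, y, coords, times, u, t0, h_s, h_t):
    """Illustrative GTWR fit at one regression point (u, t0).

    X: (n, p) design matrix (first column of ones for the intercept)
    y: (n,) responses (e.g. log house prices)
    coords: (n, 2) spatial coordinates; times: (n,) observation times
    h_s, h_t: spatial and temporal bandwidths (Gaussian kernel assumed)
    """
    d_s2 = np.sum((coords - u) ** 2, axis=1)   # squared spatial distances
    d_t2 = (times - t0) ** 2                   # squared temporal distances
    # Combined spatiotemporal weight: product of Gaussian kernels (an assumption;
    # the paper calibrates a combined space-time distance with its own parameters).
    w = np.exp(-d_s2 / (2 * h_s ** 2)) * np.exp(-d_t2 / (2 * h_t ** 2))
    W = np.diag(w)
    # Locally weighted least squares: beta(u, t0) = (X'WX)^{-1} X'Wy
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta

    Choosing h_s and h_t (for example by cross-validation) is where the efficient calibration approach mentioned in the abstract would enter.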

    An information geometric perspective on the complexity of macroscopic predictions arising from incomplete information

    Motivated by the presence of deep connections among dynamical equations, experimental data, physical systems, and statistical modeling, we report on a series of findings uncovered by the authors and collaborators during the last decade within the framework of the so-called Information Geometric Approach to Chaos (IGAC). The IGAC is a theoretical modeling scheme that combines methods of information geometry with inductive inference techniques to furnish probabilistic descriptions of complex systems in the presence of limited information. In addition to relying on curvature and Jacobi field computations, a suitable indicator of complexity within the IGAC framework is given by the so-called Information Geometric Entropy (IGE). The IGE is an information geometric measure of the complexity of geodesic paths on curved statistical manifolds underlying the entropic dynamics of systems specified in terms of probability distributions. In this manuscript, we discuss several illustrative examples wherein our modeling scheme is employed to infer macroscopic predictions when only partial knowledge of the microscopic nature of a given system is available. Finally, we include comments on the strengths and weaknesses of the current version of our proposed theoretical scheme in our concluding remarks. Comment: 26 pages, invited review article.
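
    For orientation, the IGE in this line of work is typically defined as the logarithm of the time-averaged volume of the region of the statistical manifold explored by the geodesic paths; the display below is a hedged sketch of that general form, not a verbatim reproduction of the authors' definition.

\mathcal{S}_{\mathcal{M}}(\tau) \;\overset{\mathrm{def}}{=}\; \log \overline{\mathrm{vol}}\!\left[\mathcal{D}_{\theta}(\tau)\right],
\qquad
\overline{\mathrm{vol}}\!\left[\mathcal{D}_{\theta}(\tau)\right] = \frac{1}{\tau}\int_{0}^{\tau} \mathrm{vol}\!\left[\mathcal{D}_{\theta}(\tau')\right] d\tau',
\qquad
\mathrm{vol}\!\left[\mathcal{D}_{\theta}(\tau')\right] = \int_{\mathcal{D}_{\theta}(\tau')} \sqrt{\det g(\theta)}\, d\theta,

    where g is the Fisher-Rao metric on the manifold of probability distributions and D_theta(tau') is the region swept out by the geodesics up to time tau'.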

    Universal Geometric Camera Calibration with Statistical Model Selection

    We propose a new universal camera calibration approach that uses statistical information criteria for automatic camera model selection. It requires the camera to observe a planar pattern from different positions; closed-form estimates of the intrinsic and extrinsic parameters are then computed, followed by nonlinear optimization. In lieu of modeling radial distortion, the lens projection of the camera is modeled directly, and decentering distortion is also included. This approach is particularly advantageous for wide-angle (fisheye) camera calibration because it often reduces the complexity of the model compared to modeling radial distortion. We then apply statistical information criteria to automatically select the complexity of the camera model for any lens type. The complete algorithm is evaluated on synthetic and real data for several different lens projections, and a comparison with existing methods that use radial distortion is provided.
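
    A minimal sketch of the model-selection step: fit candidate camera models of increasing complexity and keep the one minimizing an information criterion such as AIC. The candidate list, residual functions, and parameter counts are placeholders, not the paper's actual projection models or its specific criterion.

import numpy as np
from scipy.optimize import least_squares

def select_camera_model(candidates, x0s, image_pts, world_pts):
    """Pick the camera model minimizing AIC over reprojection residuals.

    candidates: list of (name, residual_fn) where residual_fn(params, world_pts,
        image_pts) returns per-point reprojection residuals; x0s: initial guesses.
    """
    best = None
    n = image_pts.size  # number of scalar residuals
    for (name, resid_fn), x0 in zip(candidates, x0s):
        fit = least_squares(resid_fn, x0, args=(world_pts, image_pts))
        rss = np.sum(fit.fun ** 2)
        k = x0.size
        # Gaussian-error AIC; the paper may use a different criterion (e.g. BIC).
        aic = n * np.log(rss / n) + 2 * k
        if best is None or aic < best[0]:
            best = (aic, name, fit)
    return best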

    A Finite State Automaton Representation And Simulation Of A Data/Frame Model Of Sensemaking

    This thesis presents the application of a finite state automaton (FSA) to analytic modeling of the Data/Frame Model (DFM) of sensemaking. An FSA is chosen for the DFM simulation because it naturally mimics changes in system behavior and transitional states, akin to the shifting information in dynamic and unstructured emergencies. It can also capture feedback loops, transitions, and spatio-temporal events based on the iterative processes of an individual sensemaker or a group of sensemakers. The thesis exploits the human-driven DFM constructs for analytical modeling using the Laboratory Virtual Instrumentation Engineering Workbench (LabVIEW) software system. Sensemaking times, problem stage time (PST), and node-to-node (NTN) transition times serve as the major performance factors. The results obtained show differences in sensemaking times based on problem complexity and information uncertainty. An analysis of variance (ANOVA) was conducted for three developed fictitious scenarios of different complexities and for Hurricane Katrina to investigate sensemaking performance. The results show that the effect of sensemaking information flow on sensemaking performance was significant, with F(3, 177) = 16.78 and p < 0.05. Tukey's Studentized Range Test shows significant statistical differences between the complexities of Hurricane Katrina (HK) and the medium complexity scenario (MC), HK and the low complexity scenario (LC), the high complexity scenario (HC) and LC, and MC and LC.
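
    A minimal sketch of how a Data/Frame-style sensemaking cycle could be encoded as a finite state automaton, with per-transition times accumulated so that total sensemaking time and node-to-node transition times can be read off; the state names, transition structure, and timing distribution are illustrative assumptions, not the thesis's LabVIEW implementation.

import random

# Illustrative DFM-style states and allowed transitions (an assumption; the
# thesis defines its own frames, feedback paths, and loops).
TRANSITIONS = {
    "recognize_frame": ["elaborate_frame", "question_frame"],
    "elaborate_frame": ["question_frame", "done"],
    "question_frame":  ["preserve_frame", "reframe"],
    "preserve_frame":  ["elaborate_frame"],
    "reframe":         ["recognize_frame"],
}

def simulate_sensemaking(max_steps=50, seed=0):
    """Run one FSA episode, logging node-to-node (NTN) transition times."""
    rng = random.Random(seed)
    state, clock, log = "recognize_frame", 0.0, []
    for _ in range(max_steps):
        nxt = rng.choice(TRANSITIONS[state])
        dt = rng.expovariate(1.0)          # placeholder transition time
        clock += dt
        log.append((state, nxt, dt))
        if nxt == "done":
            break
        state = nxt
    return clock, log   # total sensemaking time and NTN transition record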

    Sparse Model Selection using Information Complexity

    This dissertation studies the application of information complexity to statistical model selection through three projects. Specifically, we design statistical models that incorporate sparsity features to make the models more explanatory and computationally efficient. In the first project, we propose a Sparse Bridge Regression model for variable selection when the number of variables is much greater than the number of observations and model misspecification may occur. The model is shown to have excellent explanatory power in high-dimensional data analysis through numerical simulations and real-world data analysis. The second project proposes a novel hybrid modeling method, a mixture of sparse principal component regression (MIX-SPCR), to segment high-dimensional time series data. Using the MIX-SPCR model, we empirically analyze the S&P 500 index data (from 1999 to 2019) and identify two key change points. The third project investigates the use of nonlinear features in the Sparse Kernel Factor Analysis (SKFA) method to derive the information criterion. Using a variety of wide datasets, we demonstrate the benefits of SKFA in the nonlinear representation and classification of data. The results obtained show the flexibility and utility of information complexity in such data modeling problems.
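
    As a rough illustration of selecting a sparse regression model by an information criterion, the sketch below scores lasso fits over a penalty grid and keeps the one minimizing a BIC-style score; this stands in for, and is not, the dissertation's Sparse Bridge Regression or its information-complexity measure.

import numpy as np
from sklearn.linear_model import Lasso

def select_sparse_model(X, y, alphas=np.logspace(-3, 1, 30)):
    """Fit lasso over a grid of penalties and select by a BIC-style criterion."""
    n = len(y)
    best = None
    for a in alphas:
        model = Lasso(alpha=a, max_iter=10000).fit(X, y)
        resid = y - model.predict(X)
        rss = float(resid @ resid)
        k = int(np.count_nonzero(model.coef_)) + 1   # nonzero coefs + intercept
        # BIC under Gaussian errors; a stand-in for an information-complexity score.
        bic = n * np.log(rss / n) + k * np.log(n)
        if best is None or bic < best[0]:
            best = (bic, a, model)
    return best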