
    Distributed and parallel sparse convex optimization for radio interferometry with PURIFY

    Next generation radio interferometric telescopes are entering an era of big data with extremely large data sets. While these telescopes can observe the sky with higher sensitivity and resolution than before, computational challenges in image reconstruction need to be overcome to realize the potential of forthcoming telescopes. New methods in sparse image reconstruction and convex optimization techniques (cf. compressive sensing) have been shown to produce higher fidelity reconstructions of simulations and real observations than traditional methods. This article presents distributed and parallel algorithms and implementations to perform sparse image reconstruction, with significant practical considerations that are important for implementing these algorithms for big data. We benchmark the algorithms presented, showing that they are considerably faster than their serial equivalents. We then pre-sample gridding kernels to scale the distributed algorithms to larger data sizes, showing application times for 1 Gb to 2.4 Tb data sets over 25 to 100 nodes for up to 50 billion visibilities, and find that the run-times for the distributed algorithms range from 100 milliseconds to 3 minutes per iteration. This work presents an important step towards the computationally scalable and efficient algorithms and implementations that are needed to image observations of both extended and compact sources from next generation radio interferometers such as the SKA. The algorithms are implemented in the latest versions of the SOPT (https://github.com/astro-informatics/sopt) and PURIFY (https://github.com/astro-informatics/purify) software packages (versions 3.1.0), which have been released alongside this article.
    Comment: 25 pages, 5 figures
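    The core computation being distributed is a proximal convex solver. As a purely illustrative sketch (not PURIFY's actual MPI implementation), the following minimal serial ISTA iteration shows the kind of sparse reconstruction problem involved; the matrix Phi stands in for the measurement operator (gridding plus FFT) that the paper distributes across nodes, and all names here are hypothetical.

```python
# Illustrative sketch only: serial ISTA for min_x 0.5||y - Phi x||^2 + lam||x||_1.
# PURIFY/SOPT distribute the measurement operator over MPI nodes; this does not.
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*||.||_1 (safe for complex visibilities)."""
    mag = np.abs(z)
    return np.where(mag > t, (1.0 - t / np.maximum(mag, 1e-12)) * z, 0.0)

def ista(y, Phi, lam=1e-2, n_iter=100):
    """Forward-backward iteration with a fixed step size."""
    step = 1.0 / np.linalg.norm(Phi, 2) ** 2     # 1 / Lipschitz constant
    x = np.zeros(Phi.shape[1], dtype=complex)
    for _ in range(n_iter):
        grad = Phi.conj().T @ (Phi @ x - y)      # data-fidelity gradient
        x = soft_threshold(x - step * grad, step * lam)
    return x
```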

    A wildland fire model with data assimilation

    A wildfire model is formulated based on balance equations for energy and fuel, where the fuel loss due to combustion corresponds to the fuel reaction rate. The resulting coupled partial differential equations have coefficients that can be approximated from prior measurements of wildfires. An ensemble Kalman filter technique with regularization is then used to assimilate temperatures measured at selected points into running wildfire simulations. The assimilation technique is able to modify the simulations to track the measurements correctly even if the simulations were started with an erroneous ignition location that is quite far away from the correct one.
    Comment: 35 pages, 12 figures; minor revision January 2008. Original version available from http://www-math.cudenver.edu/ccm/report
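    For readers unfamiliar with the assimilation step, here is a minimal sketch of one perturbed-observation ensemble Kalman filter analysis, assuming a linear observation operator H; the regularization the paper adds to the update is not reproduced, and the variable names are illustrative.

```python
# Minimal perturbed-observation EnKF analysis step (no regularization).
import numpy as np

def enkf_update(X, H, y, R, rng):
    """
    X: (n_state, n_ens) forecast ensemble of model states
    H: (n_obs, n_state) linear observation operator
    y: (n_obs,) observed temperatures; R: (n_obs, n_obs) obs covariance
    """
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)         # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                     # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
    # Perturb observations so the analysis ensemble keeps the correct spread.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
    return X + K @ (Y - H @ X)
```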

    Realizing stock market crashes: stochastic cusp catastrophe model of returns under the time-varying volatility

    This paper develops a two-step estimation methodology that allows us to apply catastrophe theory to stock market returns with time-varying volatility and to model stock market crashes. Utilizing high frequency data, we estimate the daily realized volatility from the returns in the first step and, in the second step, apply the stochastic cusp catastrophe model to data normalized by the estimated volatility to study possible discontinuities in markets. We support our methodology by simulations in which we also discuss the importance of stochastic noise and volatility in the deterministic cusp catastrophe model. The methodology is empirically tested on almost 27 years of U.S. stock market evolution covering several important recessions and crisis periods. Due to the very long sample period, we also develop a rolling estimation approach, and we find that while stock markets showed marks of bifurcations in the first half of the period, catastrophe theory was not able to confirm this behavior in the second half. The results suggest that the proposed methodology provides an important shift in the application of catastrophe theory to stock markets.
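    The first estimation step is standard and easy to sketch. Below is a minimal illustration, under simplifying assumptions, of computing daily realized volatility from intraday prices and normalizing daily returns by it ahead of the cusp fit (the second step, not shown); the pandas data layout is a hypothetical choice.

```python
# Step one of the methodology, sketched: realized volatility and normalization.
import numpy as np
import pandas as pd

def realized_vol(prices: pd.Series) -> pd.Series:
    """Daily realized volatility from intraday prices with a DatetimeIndex.
    Simplification: overnight returns are not excluded."""
    r = np.log(prices).diff().dropna()           # intraday log-returns
    rv = (r ** 2).groupby(r.index.date).sum()    # daily realized variance
    return np.sqrt(rv)

def normalized_daily_returns(prices: pd.Series) -> pd.Series:
    """Close-to-close daily log-returns scaled by realized volatility."""
    close = np.log(prices).groupby(prices.index.date).last()
    daily = close.diff().dropna()
    return daily / realized_vol(prices).reindex(daily.index)
```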

    Improved Distributed Estimation Method for Environmental Time-Variant Physical Variables in Static Sensor Networks

    In this paper, an improved distributed estimation scheme for static sensor networks is developed. The scheme is designed for environmental time-variant physical variables. The main contribution of this work is that the algorithm in [1]-[3] has been extended and a filter has been designed with weights such that the variance of the estimation errors is minimized, considerably improving the filter design, characterizing the performance limit of the filter, and enabling a time-varying signal to be tracked. Moreover, certain parameter optimization is alleviated by applying a particular finite impulse response (FIR) filter. Simulation results show the effectiveness of the developed estimation algorithm.
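    As a hedged illustration of the variance-minimizing weighting idea only (not the paper's actual filter design, whose FIR stage is omitted here), the snippet below fuses scalar sensor readings with inverse-variance weights, the linear combination that minimizes the variance of the fused estimate when the sensor noises are independent.

```python
# Minimum-variance linear fusion of independent scalar sensor readings.
import numpy as np

def fuse(measurements, noise_vars):
    """Weight each reading by the inverse of its noise variance;
    this choice minimizes the variance of the fused estimate."""
    w = 1.0 / np.asarray(noise_vars, dtype=float)
    w /= w.sum()                      # normalize weights to sum to one
    return float(w @ np.asarray(measurements, dtype=float))
```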

    Spatial snow water equivalent estimation for mountainous areas using wireless-sensor networks and remote-sensing products

    We developed an approach to estimate snow water equivalent (SWE) through interpolation of spatially representative point measurements using a k-nearest neighbors (k-NN) algorithm and historical spatial SWE data. It accurately reproduced measured SWE, using different data sources for training and evaluation. In the central-Sierra American River basin, we used a k-NN algorithm to interpolate data from continuous snow-depth measurements in 10 sensor clusters by fusing them with 14 years of daily 500-m resolution SWE-reconstruction maps. Accurate SWE estimation over the melt season shows the potential for providing daily, near real-time distributed snowmelt estimates. Further south, in the Merced-Tuolumne basins, we evaluated the potential of the k-NN approach to improve real-time SWE estimates. Lacking dense ground-measurement networks, we simulated k-NN interpolation of sensor data using selected pixels of a bi-weekly Lidar-derived snow water equivalent product. The k-NN extrapolations underestimate the Lidar-derived SWE, with a maximum bias of −10 cm at elevations below 3000 m and +15 cm above 3000 m. This bias was reduced by using a Gaussian-process regression model to spatially distribute the residuals. Using as few as 10 scenes of Lidar-derived SWE from 2014 as training data in the k-NN to estimate the 2016 spatial SWE reduced both RMSEs and MAEs from around 20–25 cm to 10–15 cm, compared with using SWE reconstructions as training data. We found that the spatial accuracy of the historical data is more important for learning the spatial distribution of SWE than the number of historical scenes available. Blending continuous, spatially representative ground-based sensors with a historical library of SWE reconstructions over the same basin can provide real-time spatial SWE maps that accurately represent Lidar-measured snow depth, and the estimates can be improved by using historical Lidar scans instead of SWE reconstructions.
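    A minimal sketch of the k-NN interpolation as described: find the k historical SWE scenes whose values at the sensor pixels best match the current readings, and average those scenes to produce a full spatial estimate. Array shapes and names are illustrative assumptions, and the Gaussian-process residual correction is omitted.

```python
# k-NN spatial SWE estimation from point sensors and a historical map library.
import numpy as np

def knn_swe(sensor_obs, library_maps, sensor_idx, k=5):
    """
    sensor_obs:   (n_sensors,) current SWE/depth readings
    library_maps: (n_scenes, n_pixels) historical spatial SWE maps
    sensor_idx:   (n_sensors,) pixel indices of the sensor locations
    """
    lib_at_sensors = library_maps[:, sensor_idx]              # (scenes, sensors)
    d = np.linalg.norm(lib_at_sensors - sensor_obs, axis=1)   # scene distances
    nearest = np.argsort(d)[:k]                               # k best-matching scenes
    return library_maps[nearest].mean(axis=0)                 # estimated SWE field
```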

    A Simplified Crossing Fiber Model in Diffusion Weighted Imaging

    Diffusion MRI (dMRI) is a vital source of imaging data for identifying anatomical connections in the living human brain that form the substrate for information transfer between brain regions. dMRI can thus play a central role toward our understanding of brain function. The quantitative modeling and analysis of dMRI data deduces the features of neural fibers at the voxel level, such as direction and density. The modeling methods that have been developed range from deterministic to probabilistic approaches. Currently, the Ball-and-Stick model serves as a widely implemented probabilistic approach in the tractography toolbox of the popular FSL software package and the FreeSurfer/TRACULA software package. However, estimation of the features of neural fibers is complex in the scenario of two crossing neural fibers, which occurs in a sizeable proportion of voxels within the brain. A Bayesian non-linear regression is adopted, comprising a mixture of multiple non-linear components. Such models can pose a computationally difficult statistical estimation problem. To make the Ball-and-Stick approach more feasible and accurate, we propose a simplified version of the Ball-and-Stick model that reduces the dimensionality of the parameter space. This simplified model is vastly more efficient in terms of the computation time required to estimate parameters pertaining to two crossing neural fibers through Bayesian simulation approaches. Moreover, the performance of this new model is comparable to or better than that of existing models in terms of bias and estimation variance.
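    For context, the standard two-fibre Ball-and-Stick forward model predicts the diffusion signal as an isotropic "ball" compartment plus one "stick" per fibre. The sketch below implements that textbook equation under simple assumptions; the paper's reparameterization that shrinks the parameter space is not reproduced.

```python
# Standard two-fibre Ball-and-Stick signal model:
# S = S0 [ (1 - f1 - f2) e^{-b d} + f1 e^{-b d (g.v1)^2} + f2 e^{-b d (g.v2)^2} ]
import numpy as np

def ball_and_stick_2(bvals, bvecs, S0, d, f1, f2, v1, v2):
    """
    bvals: (n,) b-values; bvecs: (n, 3) unit gradient directions
    d: diffusivity; f1, f2: volume fractions (f1 + f2 <= 1)
    v1, v2: (3,) unit fibre orientations
    """
    c1 = (bvecs @ v1) ** 2                        # squared cosines, fibre 1
    c2 = (bvecs @ v2) ** 2                        # squared cosines, fibre 2
    ball = (1.0 - f1 - f2) * np.exp(-bvals * d)   # isotropic compartment
    sticks = f1 * np.exp(-bvals * d * c1) + f2 * np.exp(-bvals * d * c2)
    return S0 * (ball + sticks)
```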

    A Simulation Perspective: Error Analysis in the Distributed Simulation of Continuous System

    To construct a corresponding distributed system from a continuous system, the most convenient way is to partition the system into parts according to its topology and deploy the parts on separate nodes directly. However, system error is introduced during this process because the computing pattern changes from sequential to parallel. In this paper, the mathematical expression of the introduced error is studied. A theorem is proposed to prove that a distributed system preserving the stability property of the continuous system can be found if the system error is limited to be small enough. Then, the components of the system error are analyzed one by one and the complete expression is deduced, in which the advancing step T in the distributed environment is one of the key factors. Finally, the general steps for determining the step T are given. The significance of this study lies in the fact that the maximum T can be calculated without exceeding the expected error threshold, and a larger T can reduce the simulation cost effectively without causing too much performance degradation compared to the original continuous system.
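    The final step described, choosing the largest advancing step T whose introduced error stays below a threshold, can be illustrated with a simple bisection, assuming the error is available as a function of T and grows monotonically with it; everything here is a hypothetical stand-in for the paper's derived error expression.

```python
# Bisection for the largest step T with error_of_step(T) <= eps.
def max_step(error_of_step, eps, t_lo=1e-4, t_hi=1.0, iters=40):
    """Assumes error_of_step is monotonically increasing in T and that
    t_lo is feasible; returns the largest feasible T found."""
    if error_of_step(t_hi) <= eps:
        return t_hi                  # even the largest candidate is feasible
    for _ in range(iters):
        mid = 0.5 * (t_lo + t_hi)
        if error_of_step(mid) <= eps:
            t_lo = mid               # mid is feasible, push upward
        else:
            t_hi = mid               # mid violates the threshold
    return t_lo
```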

    A review of applied methods in Europe for flood-frequency analysis in a changing environment

    The report presents a review of methods used in Europe for trend analysis, climate change projections and non-stationary analysis of extreme precipitation and flood frequency. In addition, the main findings of the analyses are presented, including a comparison of trend analysis results and climate change projections. Existing guidelines in Europe on design flood and design rainfall estimation that incorporate climate change are reviewed. The report concludes with a discussion of research needs on non-stationary frequency analysis for considering the effects of climate change and inclusion in design guidelines.

    Trend analyses are reported for 21 countries in Europe, with results for extreme precipitation, extreme streamflow or both. A large number of national and regional trend studies have been carried out. Most studies are based on statistical methods applied to individual time series of extreme precipitation or extreme streamflow, using the non-parametric Mann-Kendall trend test or regression analysis. Some studies have been reported that use field significance or regional consistency tests to analyse trends over larger areas. Some of the studies also include analysis of trend attribution. The studies reviewed indicate that there is some evidence of a general increase in extreme precipitation, whereas there are no clear indications of significant increasing trends in extreme streamflow at the regional or national level. For some smaller regions, increases in extreme streamflow are reported. Several studies from regions dominated by snowmelt-induced peak flows report decreases in extreme streamflow and earlier spring snowmelt peak flows.

    Climate change projections have been reported for 14 countries in Europe, with results for extreme precipitation, extreme streamflow or both. The review shows various approaches for producing climate projections of extreme precipitation and flood frequency based on alternative climate forcing scenarios, climate projections from available global and regional climate models, methods for statistical downscaling and bias correction, and alternative hydrological models. A large number of the reported studies are based on an ensemble modelling approach that uses several climate forcing scenarios and climate model projections in order to address the uncertainty in the projections of extreme precipitation and flood frequency. Some studies also include alternative statistical downscaling and bias correction methods and hydrological modelling approaches. Most studies reviewed indicate an increase in extreme precipitation under a future climate, which is consistent with the observed trend of extreme precipitation. Hydrological projections of peak flows and flood frequency show both positive and negative changes. Large increases in peak flows are reported for some catchments with rainfall-dominated peak flows, whereas a general decrease in flood magnitude and earlier spring floods are reported for catchments with snowmelt-dominated peak flows. The latter is consistent with the observed trends.

    The review of existing guidelines in Europe on design floods and design rainfalls shows that only a few countries explicitly address climate change. These design guidelines are based on climate change adjustment factors to be applied to current design estimates and may depend on the design return period and projection horizon. The review indicates a gap between the need for considering climate change impacts in design and the actual published guidelines that incorporate climate change in extreme precipitation and flood frequency. Most of the studies reported are based on frequency analysis assuming stationary conditions in a certain time window (typically 30 years) representing current and future climate. There is a need for developing more consistent non-stationary frequency analysis methods that can account for the transient nature of a changing climate.
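    Since most of the reviewed trend studies rely on the non-parametric Mann-Kendall test, a minimal version is sketched below for concreteness. This basic form ignores ties and serial correlation, both of which serious applications must handle.

```python
# Basic Mann-Kendall trend test (no ties, no serial-correlation correction).
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Return the MK statistic S, the standardized Z, and a two-sided p-value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i])                  # sign of every pairwise change
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0      # variance assuming no ties
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - norm.cdf(abs(z)))                # two-sided p-value
    return s, z, p
```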

    Do Public Expenditure and Macroeconomic Uncertainty Matter to Private Investment? Evidence from Pakistan

    This study examines the role of macroeconomic uncertainty and public expenditure in determining private fixed investment in Pakistan. It is found that the individual series are nonstationary. There is a long-run relationship between private fixed investment, public consumption expenditure, public development expenditure, and market activities. It is revealed that public development expenditure stimulates private investment, whereas public consumption expenditure is detrimental to private investment. The preferred dynamic private fixed investment function confirms that in the short run, public development expenditure enhances private investment. Moreover, macroeconomic instability and uncertainty depress private investment in Pakistan.
    Keywords: Private Investment, Public Expenditure, Macroeconomic Uncertainty, Co-integration, Pakistan
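    The long-run relationship reported rests on cointegration analysis. As a hedged sketch of that style of test (not the paper's exact specification or data), the snippet below runs ADF unit-root tests on two series followed by an Engle-Granger cointegration test using statsmodels; the series names are placeholders.

```python
# Unit-root and cointegration checks of the kind used in such studies.
from statsmodels.tsa.stattools import adfuller, coint

def cointegration_check(invest, expend):
    """ADF tests on each series (H0: unit root), then Engle-Granger
    (H0: no cointegration) between the two series."""
    for name, series in [("investment", invest), ("expenditure", expend)]:
        stat, pval = adfuller(series)[:2]
        print(f"ADF {name}: stat={stat:.2f}, p={pval:.3f}")
    t_stat, pval, _ = coint(invest, expend)
    print(f"Engle-Granger: t={t_stat:.2f}, p={pval:.3f}")
```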

    Maneuverable Applications: Advancing Distributed Computing

    Extending the military principle of maneuver into the war-fighting domain of cyberspace, academic and military researchers have produced many theoretical and strategic works, though few have focused on researching the applications and systems that apply this principle. We present a survey of our research in developing new architectures for the enhancement of parallel and distributed applications. Specifically, we discuss our work in applying the military concept of maneuver in the cyberspace domain by creating a set of applications and systems called "maneuverable applications." Our research investigates resource provisioning, application optimization, and cybersecurity enhancement through the modification, relocation, addition, or removal of computing resources. We first describe our work to create a system to provision a big data computational resource within academic environments. Second, we present a computing testbed built to allow researchers to study network optimizations of data centers. Third, we discuss our Petri net model of an adaptable system, which increases its cybersecurity posture in the face of varying levels of threat from malicious actors. Finally, we present evidence that traditional ideas about extending maneuver into cyberspace focus only on security, whereas computing can benefit from maneuver in multiple ways beyond security.
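    As a toy illustration only (not the authors' model), the snippet below encodes a two-place Petri net in which a transition fires once the threat level crosses a threshold, moving the application from a normal to a hardened posture, in the spirit of the adaptable system described.

```python
# Toy two-place Petri net: a token moves from "normal" to "hardened"
# when the harden transition is enabled and the threat level is high.
class PetriNet:
    def __init__(self):
        self.marking = {"normal": 1, "hardened": 0}

    def fire_harden(self, threat_level, threshold=0.7):
        """Fire the harden transition if enabled and threat is above threshold."""
        if self.marking["normal"] >= 1 and threat_level >= threshold:
            self.marking["normal"] -= 1
            self.marking["hardened"] += 1
        return self.marking
```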