The SIMCA algorithm for processing Ground Penetrating Radar data and its use in landmine detection
The main challenge of ground penetrating radar (GPR) based landmine detection is to have an accurate image analysis method that is capable of reducing false alarms. An accurate image, however, relies on having sufficient spatial resolution in the received signal. Because the diameter of an anti-personnel (AP) mine can be as low as 2 cm and many soils have very high attenuation at frequencies above 3 GHz, accurate detection of landmines must be accomplished using advanced algorithms. The landmine detection problem can be solved by using image reconstruction and by carrying out a system-level analysis of the issues involved in recognising landmines. SIMCA ("SIMulated Correlation Algorithm") is a novel and accurate landmine detection tool that carries out correlation between a simulated GPR trace and a clutter-removed original GPR trace. This correlation is performed in the MATLAB processing environment. The authors tried both convolution and correlation; the correlated results are presented here because they were better. The results of the algorithm were validated by an expert GPR user and four other general users, who predicted the locations of landmines; these predictions were compared with the ground truth data.
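As an illustration of the correlation step described above, the following minimal sketch computes the Pearson correlation coefficient between a simulated trace and a noisy field trace. The pulse shape, noise level and all values are hypothetical stand-ins, not the authors' data; the paper's own processing was done in MATLAB.

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length traces."""
    a = a - a.mean()
    b = b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical traces: an idealised simulated GPR return (a Gaussian
# pulse) and a clutter-removed field trace containing the same
# reflection plus measurement noise.
t = np.linspace(0.0, 1.0, 200)
simulated = np.exp(-((t - 0.5) ** 2) / 0.002)
rng = np.random.default_rng(0)
field = simulated + 0.1 * rng.standard_normal(200)

r = pearson(simulated, field)  # close to 1 where the traces match
```

A high coefficient indicates the field trace closely resembles the simulated target response, which is the basis for flagging a detection.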
Probabilistic political economy and endogenous money
Since the foundational work of Farjoun and Machover, important contributions to the field of probabilistic political economy have been made. In this context one naturally has conservation of money as a postulate. However, it is questionable whether a capitalist economy could ever work with entirely exogenous money, and it is interesting to see to what extent probabilistic arguments can illuminate the evolution of the type of endogenous money system that characterizes contemporary capitalism.
We first argue, on probabilistic grounds, that a system with a strict conservation law on money was historically unsustainable. We then make the case that phenomena such as the formation of a rate of interest, periodic commercial crises, and the formation of a rentier class can be understood using the sort of reasoning pioneered by Farjoun and Machover.
The SCC and the SICSA multi-core challenge
Two phases of the SICSA Multi-core Challenge have now taken place. The first challenge was to produce concordances of books for sequences of words up to length N; the second was to simulate the motion of N celestial bodies under gravity. We took on both challenges on the SCC, using C and the Linux shell. This paper is an account of the experience gained. It also gives a shorter account of the performance of other systems on the same set of problems, as these provide benchmarks against which the SCC performance can be compared.
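The first challenge (concordances for word sequences up to length N) can be sketched in a few lines; this toy version counts n-grams in a hash table and only illustrates the problem statement, not the C implementation the paper describes.

```python
from collections import Counter

def concordance(text, max_n):
    """Count every word sequence (n-gram) of length 1..max_n in the text."""
    words = text.lower().split()
    counts = Counter()
    for n in range(1, max_n + 1):
        for i in range(len(words) - n + 1):
            counts[tuple(words[i:i + n])] += 1
    return counts

c = concordance("the cat sat on the mat", 2)
# "the" occurs twice; each bigram such as ("the", "cat") occurs once
```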
A 2D processing algorithm for detecting landmines using Ground Penetrating Radar data
Ground Penetrating Radar (GPR) is one of a number of technologies that have been used to improve landmine detection efficiency. The clutter environment within the first few cm of the soil, where landmines are buried, exhibits strong reflections with highly non-stationary statistics. An anti-personnel (AP) mine can have a diameter as low as 2 cm, whereas many soils have very high attenuation at frequencies above 3 GHz. The landmine detection problem can be solved by carrying out a system-level analysis of the issues involved, in order to synthesise an image which people can readily understand. SIMCA ("SIMulated Correlation Algorithm") is a technique that carries out correlation between the actual GPR trace recorded in the field and the ideal trace obtained by GPR simulation. The SIMCA algorithm first calculates, by forward modelling, a synthetic point spread function of the GPR, using the design parameters of the radar and the soil properties to carry out radar simulation. This allows the derivation of the correlation kernel. The SIMCA algorithm then filters unwanted components, or clutter, from the signal to enhance landmine detection. The clutter-removed GPR B-scan is then correlated with the kernel using the Pearson correlation coefficient. This results in an image which emphasises the target features and allows detection of the target by looking at the brightest spots. Raising the image to an odd power greater than 2 enhances the target/background separation. To validate the algorithm, the length of the target in some cases, and the diameter of the target in other cases, along with the burial depth obtained by the SIMCA system, are compared with the actual burial depths and target dimensions used during the experiments. Because of the security considerations involved in landmine detection, and because most authors work in collaboration with national government military programmes, no database of landmine signatures exists, and authors are also not able to publish their algorithms in full. As a result, in this study we have compared some of the cleaned images from other studies with the images obtained by our method; our algorithm produces a much clearer and more interpretable image.
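The correlation and enhancement steps described above can be sketched as follows. The kernel, the B-scan and the target placement are hypothetical toy values; in the actual method the kernel comes from the forward-modelled point spread function.

```python
import numpy as np

def pearson_map(bscan, kernel):
    """Slide the kernel over the B-scan and compute the Pearson
    correlation coefficient at every (valid) position."""
    kh, kw = kernel.shape
    k = kernel - kernel.mean()
    kn = np.linalg.norm(k)
    out = np.zeros((bscan.shape[0] - kh + 1, bscan.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = bscan[i:i + kh, j:j + kw]
            p = patch - patch.mean()
            denom = np.linalg.norm(p) * kn
            out[i, j] = float((p * k).sum() / denom) if denom else 0.0
    return out

# Hypothetical 2x2 kernel embedded in an otherwise empty B-scan.
kernel = np.array([[1.0, -1.0], [-1.0, 1.0]])
bscan = np.zeros((6, 6))
bscan[2:4, 2:4] = kernel              # the buried "target"

corr = pearson_map(bscan, kernel)     # brightest where the target sits
enhanced = corr ** 3                  # odd power > 2: keeps the sign,
                                      # stretches target/background contrast
```

Cubing the correlation image leaves values near 1 almost unchanged while crushing weak background correlations toward zero, which is why an odd power greater than 2 improves the separation.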
Mainstream parallel array programming on cell
We present the E] compiler and runtime library for the "F" subset of the Fortran 95 programming language. "F" provides first-class support for arrays, allowing E] to implicitly evaluate array expressions in parallel using the SPU coprocessors of the Cell Broadband Engine. We present performance results from four benchmarks, all of which demonstrate absolute speedups over equivalent C or Fortran versions running on the PPU host processor. A significant benefit of this straightforward approach is that a serial implementation of any code is always available, providing code longevity and a familiar development paradigm.
Is economic planning hypercomputational? The argument from Cantor diagonalisation
Murphy [26] argues that the diagonal argument of the number theorist Cantor can be used to elucidate issues that arose in the socialist calculation debate of the 1930s. In particular, he contends that the diagonal argument buttresses the claims of the Austrian economists regarding the impossibility of rational planning. We challenge Murphy's argument, both at the number-theoretic level and from the standpoint of economic realism.
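Cantor's diagonal construction that the abstract turns on can be stated compactly: given any purported enumeration of binary sequences, build a sequence that differs from the n-th one at position n, so the enumeration cannot be exhaustive. The enumeration below is a hypothetical example for illustration.

```python
def diagonal(enumeration):
    """Given an enumeration of binary sequences (sequence index n -> a
    function from positions to {0, 1}), return a sequence that differs
    from the n-th enumerated sequence at position n."""
    return lambda n: 1 - enumeration(n)(n)

# Hypothetical enumeration: the n-th sequence is constantly n % 2.
enum = lambda n: (lambda k: n % 2)
d = diagonal(enum)
# d disagrees with every enumerated sequence at its own index, so no
# enumeration of this kind can list all binary sequences.
```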
The SIMCA algorithm for processing Ground Penetrating Radar data and its use in locating foundations in demolished buildings
The main challenge of ground penetrating radar (GPR) based foundation detection is to have an accurate image analysis method. In order to solve the detection problem, a system-level analysis of the issues involved in the recognition of foundations using image reconstruction is required. SIMCA ("SIMulated Correlation Algorithm") is a technique based on an area correlation between the trace that would be returned by an ideal point reflector in the soil conditions at the site and the actual trace. During an initialization phase, SIMCA carries out radar simulation using the design parameters of the radar and the soil properties. SIMCA then takes the raw data as the radar is scanned over the ground and, in real time, uses a clutter removal technique to remove various kinds of clutter such as cross-talk, the initial ground reflection and antenna ringing. The trace which would be returned by a target under these conditions is then used to form a correlation kernel. The GPR B-scan is then correlated with the kernel using the Pearson correlation coefficient, resulting in a correlated image which is brightest at the points most similar to the canonical target. This image is then raised to an odd power greater than 2 to enhance the target/background separation. To validate and compare the algorithm, photographs of the building before it was demolished, along with data processed using the REFLEXW package, were used. The results produced by the SIMCA algorithm were very promising and were able to locate some features that the REFLEXW package was not able to identify.
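One standard way to suppress horizontally coherent clutter of the kind listed above (antenna ringing, cross-talk, the initial ground reflection) is mean-trace background subtraction. The sketch below illustrates that general idea on toy data; the abstract does not detail the authors' specific clutter removal technique, so this is an assumed stand-in.

```python
import numpy as np

def remove_background(bscan):
    """Subtract the mean trace from a B-scan (rows = time samples,
    columns = scan positions). Coherent clutter is nearly identical in
    every trace, so the average trace captures it; subtracting it
    leaves localised target reflections largely intact."""
    return bscan - bscan.mean(axis=1, keepdims=True)

# Hypothetical B-scan: uniform coherent clutter plus one target echo.
bscan = np.ones((4, 8))           # clutter: identical in all 8 traces
bscan[2, 3] += 5.0                # localised target reflection
clean = remove_background(bscan)  # clutter rows go to ~0, target survives
```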
Compressed sensing electron tomography using adaptive dictionaries: a simulation study
Electron tomography (ET) is an increasingly important technique for examining the three-dimensional morphologies of nanostructures. ET involves the acquisition of a set of 2D projection images, which are reconstructed into a volumetric image by solving an inverse problem. However, due to limitations in the acquisition process, this inverse problem is considered ill-posed (i.e., no unique solution exists). Furthermore, the reconstruction usually suffers from missing wedge artifacts (e.g., star, fan, blurring, and elongation artifacts). Compressed sensing (CS) has recently been applied to ET and has shown promising results for reducing the missing wedge artifacts caused by limited-angle sampling. CS uses a nonlinear reconstruction algorithm that employs image sparsity as a priori knowledge to improve the accuracy of density reconstruction from a relatively small number of projections compared with other reconstruction techniques. However, the performance of CS recovery depends heavily on the degree of sparsity of the reconstructed image in the selected transform domain. Prespecified transformations such as spatial gradients provide sparse image representations, while synthesising the sparsifying transform based on the properties of the particular specimen may give even sparser results, and can extend the application of CS to specimens that cannot be sparsely represented with other transforms such as total variation (TV). In this work, we show that CS reconstruction in ET can be significantly improved by tailoring the sparsity representation using a sparse dictionary learning principle.
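A minimal sketch of the sparsity-regularised recovery that CS relies on is the iterative shrinkage-thresholding algorithm (ISTA) on an underdetermined linear system. Here a random matrix A stands in for the combination of projection operator and sparsifying transform, and every size and parameter is an illustrative assumption, not a value from the study.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of the l1 norm; shrinks values toward zero."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, y, lam=0.05, steps=500):
    """ISTA for min 0.5*||Ax - y||^2 + lam*||x||_1: a gradient step on
    the data term followed by sparsity-promoting shrinkage, which is
    the role sparsity plays as a priori knowledge in CS recovery."""
    L = np.linalg.norm(A, 2) ** 2     # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        x = soft_threshold(x - A.T @ (A @ x - y) / L, lam / L)
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 60)) / np.sqrt(30)  # 30 measurements, 60 unknowns
x_true = np.zeros(60)
x_true[[5, 17, 40]] = [1.0, -0.8, 0.6]           # 3-sparse ground truth
y = A @ x_true
x_hat = ista(A, y)   # close to x_true despite the underdetermined system
```

The point of the example is the one the abstract makes: recovery quality hinges on the signal actually being sparse in the chosen representation, which motivates learning the dictionary from the specimen.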
- …