From on-line sketching to 2D and 3D geometry: A fuzzy knowledge based system
The paper describes the development of a fuzzy knowledge-based prototype system for conceptual design. This real-time system is designed to infer the user's sketching intentions, to segment sketched input, and to generate corresponding geometric primitives: straight lines, circles, arcs, ellipses, elliptical arcs, and B-spline curves. Topology information (connectivity, unitary constraints and pairwise constraints) is extracted dynamically from the 2D sketched input and primitives. From the 2D topology information, a more accurate 2D geometry can be built up by applying a 2D geometric constraint solver. Subsequently, 3D geometry can be derived feature by feature incrementally. Each feature can be recognised by inference knowledge in terms of matching its 2D primitive configurations and connection relationships. The system accepts not only sketched input, working as an automatic design tool, but also the user's interactive input of both 2D primitives and special positional 3D primitives. This makes it easy and friendly to use. The system has been tested with a number of sketched inputs of 2D and 3D geometry.
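The segmentation step described above can be illustrated with a toy fuzzy membership function. The sketch below is our own illustration, not the paper's knowledge base; the `line_membership` helper and its 0.05 threshold are assumptions. It assigns a 2D stroke a fuzzy degree of "straight-lineness" from the residual of a least-squares line fit:

```python
import numpy as np

def line_membership(points):
    """Fuzzy degree (0..1) to which a 2D stroke is a straight line.
    Illustrative stand-in for a fuzzy knowledge base; the 0.05
    residual-to-span threshold is an arbitrary choice."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    # Smallest singular value gives the RMS distance to the best-fit line
    s = np.linalg.svd(centered, compute_uv=False)
    rms = s[-1] / np.sqrt(len(pts))
    span = np.linalg.norm(pts[-1] - pts[0])
    ratio = rms / max(span, 1e-9)
    # Linear ramp: ratio 0 -> fully a line, ratio >= 0.05 -> not a line
    return float(np.clip(1.0 - ratio / 0.05, 0.0, 1.0))

t = np.linspace(0.0, 1.0, 50)
straight = np.column_stack([t, 2 * t])
theta = np.linspace(0.0, np.pi, 50)
arc = np.column_stack([np.cos(theta), np.sin(theta)])
print(line_membership(straight))  # high: essentially a line
print(line_membership(arc))       # low: clearly curved
```

A full segmenter would combine such memberships for lines, arcs, ellipses, etc., and emit the primitive with the highest degree.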
Analytical and experimental FWHM of a gamma camera: theoretical and practical issues
It is well known that resolution on a gamma camera varies as a function of distance, scatter and the camera's characteristics (collimator type, crystal thickness, intrinsic resolution, etc.). Manufacturers frequently provide only a few pre-calculated resolution values (using a line source in air, 10–15 cm from the collimator surface and without scattering). However, these are typically not obtained in situations resembling a clinical setting. From a diagnostic point of view, it is useful to know the expected resolution of a gamma camera at a given distance from the collimator surface for a particular setting, in order to decide whether it is worth scanning patients with "small lesions" or not. When dealing with absolute quantification it is also mandatory to know precisely the expected resolution and its uncertainty in order to make appropriate corrections.
Aim. Our aims are: to test a novel mathematical approach, cubic spline interpolation, for the extraction of the full width at half maximum (FWHM) from the acquisition of a line source (experimental resolution), also considering measurement uncertainty; to compare it with commonly adopted methods such as the Gaussian approach; to compare it with the theoretical resolution (analytical resolution) of a gamma camera at different distances; and to create a web-based educational program with which to test these theories.
Methods. Three mathematical methods (direct calculation, global interpolation using Gaussians and local interpolation using splines) for calculating FWHM from a line source (planar scintigraphy) were tested and compared. A NEMA Triple Line Source Phantom was used to obtain static images both in air and with different scattering levels. An advanced, open-source software package (MATLAB/Octave and PHP based) was created "ad hoc" to obtain and compare FWHM values and their relative uncertainty.
Results and Conclusion. Local interpolation using splines proved faster and more reliable than the usually adopted Gaussian interpolation. The proposed freely available software proved effective in assessing both FWHM and its uncertainty.
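The spline-based extraction can be sketched in a few lines. The paper's software is MATLAB/Octave and PHP based; the Python version below, with a hypothetical `fwhm_spline` helper, only illustrates the idea: locate the half-maximum crossings of a cubic-spline interpolant of the line-source profile.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def fwhm_spline(x, y):
    """Estimate the FWHM of a single peak from the roots of a
    cubic-spline interpolant shifted down by half the maximum.
    Illustrative helper, not the authors' implementation."""
    cs = CubicSpline(x, y - y.max() / 2.0)   # half-max becomes the zero level
    roots = cs.roots(extrapolate=False)       # the two half-max crossings
    return roots.max() - roots.min()

# Synthetic line-source profile: Gaussian with sigma = 2 mm, sampled every 0.5 mm
x = np.linspace(-15.0, 15.0, 61)
sigma = 2.0
y = np.exp(-x**2 / (2 * sigma**2))
print(fwhm_spline(x, y))  # ≈ 2·sqrt(2·ln 2)·sigma ≈ 4.71 mm
```

Because the spline is local, a few noisy samples far from the peak do not distort the crossing estimate the way a poor global Gaussian fit can.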
Bayesian modelling and quantification of Raman spectroscopy
Raman spectroscopy can be used to identify molecules such as DNA by the
characteristic scattering of light from a laser. It is sensitive at very low
concentrations and can accurately quantify the amount of a given molecule in a
sample. The presence of a large, nonuniform background presents a major
challenge to analysis of these spectra. To overcome this challenge, we
introduce a sequential Monte Carlo (SMC) algorithm to separate each observed
spectrum into a series of peaks plus a smoothly-varying baseline, corrupted by
additive white noise. The peaks are modelled as Lorentzian, Gaussian, or
pseudo-Voigt functions, while the baseline is estimated using a penalised cubic
spline. This latent continuous representation accounts for differences in
resolution between measurements. The posterior distribution can be
incrementally updated as more data becomes available, resulting in a scalable
algorithm that is robust to local maxima. By incorporating this representation
in a Bayesian hierarchical regression model, we can quantify the relationship
between molecular concentration and peak intensity, thereby providing an
improved estimate of the limit of detection, which is of major importance to
analytical chemistry.
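The peak shapes named above are standard spectroscopic profiles. As a minimal sketch, a pseudo-Voigt profile is a convex combination of a Lorentzian and a Gaussian of matched half-width; the `eta` (mixing weight) and `scale` (half width at half maximum) parameter names below are our own:

```python
import numpy as np

def pseudo_voigt(x, loc, scale, eta):
    """Pseudo-Voigt peak: eta * Lorentzian + (1 - eta) * Gaussian,
    both normalised to height 1 at x = loc with HWHM = scale.
    Parameter names are illustrative, not the paper's notation."""
    lorentz = scale**2 / ((x - loc)**2 + scale**2)
    gauss = np.exp(-np.log(2) * (x - loc)**2 / scale**2)
    return eta * lorentz + (1 - eta) * gauss

x = np.linspace(0.0, 10.0, 501)
peak = pseudo_voigt(x, loc=5.0, scale=1.0, eta=0.3)
print(peak.max())  # 1.0, attained at x = loc
```

Matching the half-widths of the two components means the mixture's FWHM stays 2·scale for any eta, so eta controls only the tail heaviness.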
A tension approach to controlling the shape of cubic spline surfaces on FVS triangulations
We propose a parametric tensioned version of the FVS macro-element to control the shape of the composite surface and remove artificial oscillations, bumps and other undesired behaviour. In particular, this approach is applied to C1 cubic spline surfaces over a four-directional mesh produced by two-stage scattered data fitting methods
Extracting 3D parametric curves from 2D images of helical objects
Helical objects occur in medicine, biology, cosmetics, nanotechnology, and engineering. Extracting a 3D parametric curve from a 2D image of a helical object has many practical applications, in particular being able to extract metrics such as tortuosity, frequency, and pitch. We present a method that is able to straighten the image object and derive a robust 3D helical curve from peaks in the object boundary. The algorithm has a small number of stable parameters that require little tuning, and the curve is validated against both synthetic and real-world data. The results show that the extracted 3D curve comes within a close Hausdorff distance of the ground truth, and has near-identical tortuosity for helical objects with a circular profile. Parameter insensitivity and robustness against high levels of image noise are demonstrated thoroughly and quantitatively.
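Of the metrics mentioned, tortuosity is the simplest to state. One common definition, which may differ in detail from the paper's, is arc length divided by end-to-end chord length; the sketch below checks it on a synthetic helix against the closed-form value:

```python
import numpy as np

def tortuosity(points):
    """Arc length over chord length for an (n, 3) polyline.
    One common definition of tortuosity; other variants exist."""
    seg = np.diff(points, axis=0)
    arc = np.linalg.norm(seg, axis=1).sum()
    chord = np.linalg.norm(points[-1] - points[0])
    return arc / chord

# Synthetic 3D helix: radius r, pitch p (rise per turn), 4 full turns
t = np.linspace(0.0, 4 * 2 * np.pi, 2000)
r, pitch = 1.0, 0.5
helix = np.column_stack([
    r * np.cos(t),
    r * np.sin(t),
    pitch * t / (2 * np.pi),
])
print(tortuosity(helix))
```

For a helix the analytic value is sqrt(r² + (p/2π)²)·θ / (n·p) over n whole turns, so the polyline estimate can be verified exactly.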
B-spline techniques for volatility modeling
This paper is devoted to the application of B-splines to volatility modeling,
specifically the calibration of the leverage function in stochastic local
volatility models and the parameterization of an arbitrage-free implied
volatility surface calibrated to sparse option data. We use an extension of
classical B-splines obtained by including basis functions with infinite
support. We first come back to the application of shape-constrained B-splines
to the estimation of conditional expectations, not merely from a scatter plot
but also from the given marginal distributions. An application is the Monte
Carlo calibration of stochastic local volatility models by Markov projection.
Then we present a new technique for the calibration of an implied volatility
surface to sparse option data. We use a B-spline parameterization of the
Radon-Nikodym derivative of the underlying's risk-neutral probability density
with respect to a roughly calibrated base model. We show that this method
provides smooth arbitrage-free implied volatility surfaces. Finally, we sketch
a Galerkin method with B-spline finite elements for the solution of the partial
differential equation satisfied by the Radon-Nikodym derivative.
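The classical cubic B-spline basis that the paper extends can be evaluated with standard tools. The snippet below is illustrative only (the extended basis with infinite support is not part of SciPy); it builds a clamped cubic basis via `BSpline.design_matrix` (SciPy ≥ 1.8) and verifies the partition-of-unity property:

```python
import numpy as np
from scipy.interpolate import BSpline

# Clamped cubic B-spline basis on [0, 4]: the end knots are repeated
# degree + 1 times so the basis interpolates the interval endpoints.
degree = 3
knots = np.array([0, 0, 0, 0, 1, 2, 3, 4, 4, 4, 4], dtype=float)

x = np.linspace(0.0, 4.0, 101)
# Rows = evaluation points, columns = the len(knots) - degree - 1 = 7 basis functions
basis = BSpline.design_matrix(x, knots, degree).toarray()

print(basis.shape)        # (101, 7)
print(basis.sum(axis=1))  # partition of unity: every row sums to 1
```

The same design matrix is what a Galerkin or least-squares fit assembles, so this small check is a useful sanity test before any calibration.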