Reconstruction and measurement of O(100) MeV energy electromagnetic activity from π0 → γγ decays in the MicroBooNE LArTPC
We present results on the reconstruction of electromagnetic (EM) activity from photons produced in charged-current νμ interactions with final-state π0s. We employ a fully automated reconstruction chain capable of identifying EM showers of O(100) MeV energy, relying on a combination of traditional reconstruction techniques and novel machine-learning approaches. These studies demonstrate good energy resolution and good agreement between data and simulation, relying on the reconstructed π0 invariant mass and other photon distributions for validation. The reconstruction techniques developed are applied to a selection of νμ + Ar → μ + π0 + X candidate events to demonstrate the potential for calorimetric separation of photons from electrons and for reconstruction of π0 kinematics.
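The π0 validation described above rests on the two-photon invariant mass. A minimal sketch of that kinematic relation (the photon energies and opening angle below are illustrative, not MicroBooNE values):

```python
import math

def diphoton_invariant_mass(e1, e2, opening_angle):
    """Invariant mass of a pair of massless photons:
    m^2 = 2 * E1 * E2 * (1 - cos(theta))."""
    return math.sqrt(2.0 * e1 * e2 * (1.0 - math.cos(opening_angle)))

# Illustrative check: for two 100 MeV photons, the opening angle that
# reproduces the pi0 mass (~134.98 MeV) follows by inverting the formula.
M_PI0 = 134.98  # MeV
theta = math.acos(1.0 - M_PI0**2 / (2.0 * 100.0 * 100.0))
reconstructed = diphoton_invariant_mass(100.0, 100.0, theta)
```

In practice the reconstructed mass peak is broadened by the shower energy resolution, which is why the π0 mass distribution serves as a validation handle.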
Sensor data validation and reconstruction. Phase 1: System architecture study
The sensor validation and data reconstruction task reviewed relevant literature and selected applicable validation and reconstruction techniques for further study; analyzed the selected techniques and emphasized those which could be used for both validation and reconstruction; analyzed Space Shuttle Main Engine (SSME) hot fire test data to determine statistical and physical relationships between various parameters; developed statistical and empirical correlations between parameters to perform validation and reconstruction tasks, using a computer-aided engineering (CAE) package; and conceptually designed an expert-system-based knowledge fusion tool, which allows the user to relate diverse types of information when validating sensor data. The host hardware for the system is intended to be a Sun SPARCstation, but could be any RISC workstation with a UNIX operating system and a windowing/graphics system such as Motif or Dataviews. The information fusion tool is intended to be developed using the NEXPERT Object expert system shell and the C programming language.
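The core idea of validating one sensor against statistical correlations with another can be sketched in a few lines. This is a hypothetical illustration, not the SSME system: the channel names, the linear relation, and all numbers are invented for the example.

```python
import numpy as np

# Hypothetical: validate a temperature channel against a correlated pressure
# channel, using an empirical linear relation fitted on trusted history.
rng = np.random.default_rng(0)
pressure = rng.uniform(50.0, 100.0, 500)                          # reference channel
temperature = 2.0 * pressure + 10.0 + rng.normal(0.0, 1.0, 500)   # correlated channel

# Fit the empirical correlation by least squares.
slope, intercept = np.polyfit(pressure, temperature, 1)
sigma = np.std(temperature - (slope * pressure + intercept))

def validate_and_reconstruct(p_reading, t_reading, n_sigma=3.0):
    """Flag a temperature reading inconsistent with the pressure reading;
    if invalid, reconstruct it from the fitted correlation."""
    predicted = slope * p_reading + intercept
    if abs(t_reading - predicted) > n_sigma * sigma:
        return predicted, False   # reconstructed value, reading rejected
    return t_reading, True        # reading accepted as-is

bad_value, bad_ok = validate_and_reconstruct(80.0, 250.0)   # far from ~170: rejected
good_value, good_ok = validate_and_reconstruct(80.0, 170.0) # consistent: accepted
```

The same fitted relation serves both tasks, which is why the study emphasized techniques usable for validation and reconstruction alike.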
Comparative validation of single-shot optical techniques for laparoscopic 3-D surface reconstruction
Intra-operative imaging techniques for obtaining the shape and morphology of soft-tissue surfaces in vivo are a key enabling technology for advanced surgical systems. Different optical techniques for 3-D surface reconstruction in laparoscopy have been proposed; however, no quantitative, comparative validation has so far been performed. Furthermore, robustness of the methods to clinically important factors like smoke or bleeding has not yet been assessed. To address these issues, we have formed a joint international initiative with the aim of validating different state-of-the-art passive and active reconstruction methods in a comparative manner. In this comprehensive in vitro study, we investigated reconstruction accuracy using different organs with various shapes and textures, and also tested reconstruction robustness with respect to a number of factors like the pose of the endoscope as well as the amount of blood or smoke present in the scene. The study suggests complementary advantages of the different techniques with respect to accuracy, robustness, point density, hardware complexity and computation time. While reconstruction accuracy under ideal conditions was generally high, robustness is a remaining issue to be addressed. Future work should include sensor fusion and in vivo validation studies in a specific clinical context. To trigger further research in surface reconstruction, stereoscopic data of the study will be made publicly available at www.open-CAS.com upon publication of the paper.
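For the passive stereo methods in such comparisons, surface depth follows from triangulation on a rectified image pair. A minimal sketch of that relation (the focal length, baseline, and disparity values are illustrative, not from the study's hardware):

```python
def stereo_depth_mm(disparity_px, focal_px, baseline_mm):
    """Pinhole-stereo triangulation on a rectified pair: Z = f * B / d.
    Larger disparity means a closer surface point."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_mm / disparity_px

# Illustrative stereo-laparoscope numbers: 500 px focal length, 5 mm baseline.
z = stereo_depth_mm(disparity_px=25.0, focal_px=500.0, baseline_mm=5.0)  # 100 mm
```

The short baseline of a stereo laparoscope is one reason accuracy degrades with distance: depth error grows roughly quadratically with Z for a fixed disparity uncertainty.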
Reconstructing financial statements
This paper introduces a tool for the reconstruction and validation of categorized totals embedded in untrusted and unformatted text, such as OCR scans of financial statements. The tool is a spin-off of academic research into the funding of Japanese third-sector organizations, whose annual reports are frequently published as PDF files containing document images. A number of techniques at string, line, and document level are used to resolve ambiguities and obtain the greatest possible recovery rate for the underlying data, while excluding the content of untrustworthy documents from the final sample. In a preliminary trial "in the wild", the tool has returned validated income totals for 47.9% of the documents in a heterogeneous set of 2205 annual reports.
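The validation principle behind categorized totals is that sub-category amounts must sum exactly to the reported total; OCR digit errors almost always break that identity. A minimal sketch of the check (the statement below is a hypothetical example, not data from the study):

```python
def total_is_consistent(line_items, reported_total):
    """Accept an extracted statement section only if its sub-category
    amounts sum exactly to the reported total. An OCR misread of any
    single digit breaks this identity, so the check doubles as validation."""
    return sum(line_items) == reported_total

# Hypothetical income section extracted from a scanned annual report:
income = {"membership fees": 1_200_000, "donations": 350_000, "grants": 800_000}
consistent = total_is_consistent(income.values(), 2_350_000)
corrupted = total_is_consistent(income.values(), 2_850_000)  # a misread digit
```

String- and line-level techniques then come into play when the identity fails: the tool can search over plausible per-character corrections for a combination that restores consistency, which is what drives the recovery rate up.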
The Orbiter Experiments (OEX) Program
The objective of the Orbiter Experiments (OEX) program is to obtain research-quality flight data for the augmentation and advancement of space transportation technologies. This includes the validation and advancement of analytical theories and of ground-test methods and techniques. The following topics are discussed: aerothermodynamic design tool development and validation; the freestream environment; trajectory reconstruction; atmospheric reconstruction; the Shuttle Entry Air Data System (SEADS); the Shuttle Upper Atmosphere Mass Spectrometer (SUMS); and aerodynamic forces and moments. The discussion is presented in vugraph form.
Hyper-parameter selection in non-quadratic regularization-based radar image formation
We consider the problem of automatic parameter selection in regularization-based radar image formation techniques. It has previously been shown that non-quadratic regularization produces feature-enhanced radar images; can yield superresolution; is robust to uncertain or limited data; and can generate enhanced images in non-conventional data collection scenarios such as sparse aperture imaging. However, this regularized imaging framework involves hyper-parameters whose choice is crucial, as it directly affects the characteristics of the reconstruction. Hence there is interest in developing methods for automatic parameter choice. We investigate Stein's unbiased risk estimator (SURE) and generalized cross-validation (GCV) for automatic selection of hyper-parameters in regularized radar imaging. We present experimental results based on the Air Force Research Laboratory (AFRL) "Backhoe Data Dome" to demonstrate and discuss the effectiveness of these methods.
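For the quadratic (Tikhonov) special case, the GCV score has a closed form, which makes the selection principle easy to sketch. The data below are synthetic, and the paper's non-quadratic setting requires iterative solvers; only the shape of the GCV criterion carries over.

```python
import numpy as np

# GCV for Tikhonov-regularized least squares: pick lambda minimizing
#   GCV(lam) = n * ||(I - A(lam)) y||^2 / (trace(I - A(lam)))^2,
# where A(lam) = X (X^T X + lam I)^{-1} X^T is the influence matrix.
rng = np.random.default_rng(1)
n, p = 100, 20
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]                      # a simple sparse "scene"
y = X @ beta + rng.normal(scale=0.5, size=n)

def gcv_score(lam):
    A = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
    residual = y - A @ y
    return n * (residual @ residual) / (n - np.trace(A)) ** 2

lams = np.logspace(-3, 3, 25)
best_lam = min(lams, key=gcv_score)
```

GCV needs no knowledge of the noise variance, which is its main practical appeal over SURE; SURE, conversely, is unbiased for the true risk when the variance is known.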
Study of a Flexible UAV Proprotor
This paper is concerned with the evaluation of design techniques for both the propulsive performance and the structural behavior of a composite flexible proprotor. A numerical model was developed combining an aerodynamic model based on Blade Element Momentum Theory (BEMT) with a structural model based on anisotropic beam finite elements, in order to evaluate the coupled structural and aerodynamic characteristics of the deformable proprotor blade. The numerical model was then validated by means of static performance measurements and shape reconstruction from Laser Distance Sensor (LDS) outputs. From the validation results for both the aerodynamic and the structural model, it can be concluded that the numerical approach developed by the authors is a reliable tool for designing and analyzing UAV-sized proprotors made of composite material. The proposed experimental technique is also capable of providing predictive and reliable data on blade geometry and performance for rotor modes.
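The blade-element half of BEMT integrates sectional lift along the radius. A deliberately simplified sketch that neglects induced inflow and drag (so it is only the blade-element part, not full BEMT); the density, geometry, and lift coefficient below are illustrative, not the authors' blade.

```python
# Simplified blade-element thrust in hover, induced inflow neglected:
#   dT = 0.5 * rho * (omega * r)^2 * c * Cl * dr, summed over elements.
RHO = 1.225  # air density at sea level, kg/m^3

def blade_thrust(omega, radius, chord, n_elements=100, cl=0.8, n_blades=2):
    """Midpoint-rule integration of sectional lift along one blade,
    multiplied by the blade count. SI units; returns thrust in newtons."""
    dr = radius / n_elements
    thrust = 0.0
    for i in range(n_elements):
        r = (i + 0.5) * dr            # element midpoint radius
        v = omega * r                 # local tangential speed
        thrust += 0.5 * RHO * v * v * chord * cl * dr
    return n_blades * thrust

# Illustrative small proprotor: 200 rad/s, 15 cm radius, 2 cm chord.
t_hover = blade_thrust(omega=200.0, radius=0.15, chord=0.02)
```

Full BEMT couples this blade-element expression with momentum theory to solve for the induced velocity at each station, which lowers the effective angle of attack and hence the thrust relative to this upper-bound sketch.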
İleri sentetik açıklıklı radar görüntüleme algoritmalarında parametre seçimi = Hyper-parameter selection in advanced synthetic aperture radar imaging algorithms
We consider the problem of hyper-parameter selection in advanced image reconstruction algorithms used in synthetic aperture radar (SAR) imaging. To deal with the parameter selection problem in these algorithms, we propose the use of unbiased predictive risk estimation and generalized cross-validation techniques. We demonstrate the effectiveness of the applied methods through experiments based on electromagnetically simulated realistic data.
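The unbiased-risk-estimation idea is easiest to see in the soft-thresholding setting, where Stein's unbiased risk estimate (SURE) has a closed form. This is a stand-in illustration with unit-variance noise on synthetic sparse coefficients; the SAR setting replaces thresholding with a regularized inversion, but the principle of minimizing an unbiased risk surrogate over the parameter grid is the same.

```python
import numpy as np

# SURE for soft-thresholding y = x + N(0, 1):
#   SURE(t) = n - 2 * #{i : |y_i| <= t} + sum_i min(y_i^2, t^2)
# is an unbiased estimate of the risk E||x_hat(t) - x||^2.
def sure(t, y):
    return len(y) - 2 * np.sum(np.abs(y) <= t) + np.sum(np.minimum(y**2, t**2))

rng = np.random.default_rng(2)
x = np.zeros(500)
x[:25] = 5.0                         # sparse ground truth
y = x + rng.normal(size=500)         # unit-variance noise (assumed known)

ts = np.linspace(0.0, 4.0, 81)
t_best = min(ts, key=lambda t: sure(t, y))
```

Unlike cross-validation, SURE uses all the data at once, but it requires the noise variance; when the variance is unknown it is typically estimated from the data first.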
The Friendly Settlement of Human Rights Abuses in the Americas
We present a new method for estimation of seismic coda shape. It falls into the same class of methods as non-parametric shape reconstruction using neural network techniques, where data are split into training and validation sets. We particularly pursue the well-known problem of image reconstruction, formulated in this case as shape isolation in the presence of a broadly defined noise. This combined approach is enabled by an intrinsic feature of the seismogram, which can be divided objectively into pre-signal seismic noise lacking the target shape, and the remainder, containing the scattered waveforms that compound the coda shape. In short, we separately apply the shape restoration procedure to the pre-signal seismic noise and to the event record, which provides successful delineation of the coda shape in the form of a smooth, almost non-oscillating function of time. The new algorithm uses a recently developed generalization of the classical computational-geometry alpha-shape tool. The generalization yields robust shape estimation by locally ignoring a number of points treated as extreme values, noise, or non-relevant data. Our algorithm is conceptually simple and enables a desired or pre-determined level of shape detail, constrainable by arbitrary data-fit criteria. The proposed tool for coda shape delineation provides an alternative to moving averaging and other smoothing techniques frequently used for this purpose. The new algorithm is illustrated with an application to the problem of estimating the coda duration after a local event. The obtained coefficient relating coda duration and epicentral distance is consistent with earlier findings in the region of interest.
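The coda-duration application can be illustrated with a much cruder rule than the paper's alpha-shape generalization: take the coda to end where its smoothed envelope decays back to the pre-signal noise level estimated from the record head. The synthetic record and every constant below are illustrative only.

```python
import numpy as np

# Toy stand-in for coda-duration estimation: smooth the rectified trace and
# find where it first returns to the pre-event noise level.
rng = np.random.default_rng(3)
t = np.arange(0.0, 60.0, 0.01)                       # 60 s record at 100 Hz
noise = rng.normal(scale=0.1, size=t.size)           # stationary background
coda = np.where(t > 10.0, np.exp(-(t - 10.0) / 8.0), 0.0) * rng.normal(size=t.size)
record = noise + coda                                # event onset at t = 10 s

def coda_end(signal, times, onset, window=200):
    """First time after onset where the moving-average envelope of |signal|
    drops to the noise level measured before the onset."""
    env = np.convolve(np.abs(signal), np.ones(window) / window, mode="same")
    noise_level = env[times < onset].mean()          # pre-signal noise estimate
    after = (times > onset) & (env <= noise_level)
    return times[after][0] if after.any() else times[-1]

duration = coda_end(record, t, onset=10.0) - 10.0    # coda duration in seconds
```

The paper's point is precisely that such moving-average envelopes oscillate and depend on the window choice, whereas the robust alpha-shape delineation yields a smooth boundary with a controllable level of detail.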