Principal Component Analysis and Radiative Transfer modelling of Spitzer IRS Spectra of Ultra Luminous Infrared Galaxies
The mid-infrared spectra of ultraluminous infrared galaxies (ULIRGs) contain
a variety of spectral features that can be used as diagnostics to characterise
the spectra. However, such diagnostics are biased by our prior assumptions
about the origin of the features. Moreover, by using only part of the spectrum they
do not utilise the full information content of the spectra. Blind statistical
techniques such as principal component analysis (PCA) consider the whole
spectrum, find correlated features and separate them out into distinct
components.
We further investigate the principal components (PCs) of ULIRGs derived in
Wang et al. (2011). We quantitatively show that five PCs are optimal for
describing the IRS spectra. These five components (PC1-PC5) and the mean
spectrum provide a template basis set that reproduces spectra of all z<0.35
ULIRGs within the noise. For comparison, the spectra are also modelled with a
combination of radiative transfer models of both starbursts and the dusty torus
surrounding active galactic nuclei. The five PCs typically provide better fits
than the models. We argue that the radiative transfer models require a colder
dust component and have difficulty in modelling strong PAH features.
Aided by the models we also interpret the physical processes that the
principal components represent. The third principal component is shown to
indicate the nature of the dominant power source, while PC1 is related to the
inclination of the AGN torus.
Finally, we use the five PCs to define a new classification scheme, based on
five-dimensional Gaussian mixture modelling trained on widely used optical
classifications. The five PCs, average spectra for the four classifications,
and the code to classify objects are made available at
http://www.phys.susx.ac.uk/~pdh21/PCA/
Comment: 11 pages, 12 figures, accepted for publication in MNRAS
The XMM-LSS cluster sample and its cosmological applications. Prospects for the XMM next decade
The well-defined selection function of the XMM-LSS survey enables a
simultaneous modelling of the observed cluster number counts and of the
evolution of the L-T relation. We present results pertaining to the first 5
deg2 for a well controlled sample comprising 30 objects: they are compatible
with the WMAP3 parameter set along with cluster self-similar evolution.
Extending such a survey to 200 deg2 would (1) allow discriminating between the
major scenarios of the cluster L-T evolution and (2) provide a unique
self-sufficient determination of sigma8 and Gamma with an accuracy of ~5% and
~10% respectively, when adding mass information from weak lensing and S-Z
observations.
Comment: Proceedings of "XMM-Newton: the next decade", to appear in
Astronomische Nachrichten
Extended Object Tracking: Introduction, Overview and Applications
This article provides an elaborate overview of current research in extended
object tracking. We provide a clear definition of the extended object tracking
problem and discuss its delimitation to other types of object tracking. Next,
different aspects of extended object modelling are extensively discussed.
Subsequently, we give a tutorial introduction to two basic and widely used
extended object tracking approaches: the random matrix approach and the Kalman
filter-based approach for star-convex shapes. The next part treats the tracking
of multiple extended objects and elaborates how the large number of feasible
association hypotheses can be tackled using both Random Finite Set (RFS) and
Non-RFS multi-object trackers. The article concludes with a summary of current
applications, where four example applications involving camera, X-band radar,
light detection and ranging (lidar), and red-green-blue-depth (RGB-D) sensors
are highlighted.
Comment: 30 pages, 19 figures
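To make the random matrix idea concrete, here is a drastically simplified single-scan update in the spirit of that approach: the centroid is updated Kalman-style from the measurement mean, and the elliptical extent matrix is blended with the measurement scatter. The weighting scheme and all numbers are illustrative assumptions, not the exact equations from the literature:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: object centroid m with covariance P (kinematic state) and
# an elliptical extent modelled as a 2x2 SPD matrix X.
m = np.array([0.0, 0.0])   # predicted centroid
P = np.eye(2) * 1.0        # centroid covariance
X = np.eye(2) * 4.0        # predicted extent (ellipse matrix)
nu = 10.0                  # prior weight on the extent estimate

# One scan with several detections spread over the object surface.
Z = m + rng.multivariate_normal(np.zeros(2), X, size=20)
n = len(Z)
z_bar = Z.mean(axis=0)
scatter = (Z - z_bar).T @ (Z - z_bar)

# Kalman update of the centroid using the measurement mean; the
# innovation covariance includes the extent spread divided by n.
S = P + X / n
K = P @ np.linalg.inv(S)
m_new = m + K @ (z_bar - m)
P_new = P - K @ S @ K.T

# Extent update: blend the prior extent with the measurement scatter.
X_new = (nu * X + scatter) / (nu + n - 1)
```

The full random matrix filter also handles sensor noise, temporal decay of the extent, and data association; this sketch only shows the basic centroid/extent split.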
Prototype system for supporting the incremental modelling of vague geometric configurations
In this paper the need for Intelligent Computer Aided Design (Int.CAD) to jointly support design and learning assistance is introduced. The paper focuses on presenting and exploring the possibility of realizing learning assistance in Int.CAD by introducing a new concept called Shared Learning. Shared Learning is proposed to empower CAD tools with more useful learning capabilities than those currently available and thereby provide a stronger interaction of learning between a designer and a computer. Controlled computational learning is proposed as a means whereby the Shared Learning concept can be realized. The viability of this new concept is explored by using a system called PERSPECT. PERSPECT is a preliminary numerical design tool aimed at supporting the effective utilization of numerical experiential knowledge in design. After a detailed discussion of PERSPECT's numerical design support, the paper presents the results of an evaluation that focuses on PERSPECT's implementation of controlled computational learning and ability to support a designer's need to learn. The paper then discusses PERSPECT's potential as a tool for supporting the Shared Learning concept by explaining how a designer and PERSPECT can jointly learn. There is still much work to be done before the full potential of Shared Learning can be realized. However, the authors do believe that the concept of Shared Learning may hold the key to truly empowering learning in Int.CAD.
A Comparison of State-Based Modelling Tools for Model Validation
In model-based testing (MBT), one of the biggest decisions taken before modelling is which modelling language and model analysis tool to use for the system under investigation. UML, Alloy and Z are examples of popular state-based modelling languages. In the literature, there has been research about the similarities and the differences between modelling languages. However, we believe that, in addition to recognising the expressive power of modelling languages, it is crucial to detect the capabilities and the weaknesses of the analysis tools that parse and analyse models written in these languages. In order to explore this area, we have chosen four model analysis tools: USE, Alloy Analyzer, ZLive and ProZ, and observed how the modelling and validation stages of MBT are handled by these tools for the same system. Through this experiment, we not only concretise the tasks that form the modelling and validation stages of the MBT process, but also reveal how efficiently these tasks are carried out in different tools.
Three Dimensional Software Modelling
Traditionally, diagrams used in software systems modelling have been two dimensional (2D). This is probably because graphical notations, such as those used in object-oriented and structured systems modelling, draw upon the topological graph metaphor, which, in its basic form, receives little benefit from three dimensional (3D) rendering. This paper presents a series of 3D graphical notations demonstrating effective use of the third dimension in modelling, for example by connecting several graphs together, or by using the Z co-ordinate to show special kinds of edges. Each notation combines several familiar 2D diagrams, which can be reproduced from 2D projections of the 3D model. 3D models are useful even in the absence of a powerful graphical workstation: even 2D stereoscopic projections can expose more information than a plain planar diagram.
Comparison of Simple Graphical Process Models
Comparing the structure of graphical process models can reveal a number of process variations. Since most contemporary norms for process modelling rely on directed connectivity of objects in the model, connections between objects form sequences which can be translated into performing scenarios. Whereas sequences can be tested for completeness in performing process activities using simulation methods, the similarity or difference in static characteristics of sequences in different model variants is difficult to explore. The goal of the paper is to test the application of a method for comparison of graphical models by analyzing and comparing static characteristics of process models. Consequently, a metamodel for process models is developed, followed by a comparison procedure conducted using a graphical model comparison algorithm.
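One way such a static comparison could be sketched, assuming each process model is reduced to its sets of activities (nodes) and directed connections (edges) and scored with a Jaccard-style similarity; the concrete metric is an illustrative assumption, not the paper's actual algorithm:

```python
def jaccard(a, b):
    """Jaccard index of two sets (1.0 for two empty sets)."""
    return len(a & b) / len(a | b) if (a | b) else 1.0

def compare_models(model_a, model_b):
    """Return (node similarity, edge similarity) for two edge lists."""
    edges_a, edges_b = set(model_a), set(model_b)
    nodes_a = {n for e in edges_a for n in e}
    nodes_b = {n for e in edges_b for n in e}
    return jaccard(nodes_a, nodes_b), jaccard(edges_a, edges_b)

# Two hypothetical variants of the same ordering process, differing in
# whether an approval step precedes shipping.
variant_1 = [("receive", "check"), ("check", "approve"), ("approve", "ship")]
variant_2 = [("receive", "check"), ("check", "ship")]

node_sim, edge_sim = compare_models(variant_1, variant_2)
```

Separating node similarity from edge similarity makes it visible whether two variants differ in their activities or only in how the same activities are connected.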
An approach for real world data modelling with the 3D terrestrial laser scanner for built environment
Capturing and modelling 3D information of the built environment is a big challenge. A number of techniques and technologies are now in use, including EDM, GPS, photogrammetric applications, remote sensing and traditional building surveying. However, these technologies are often neither practical nor efficient in terms of time, cost and accuracy. Furthermore, a multidisciplinary knowledge base, created from studies and research about the regeneration aspects, is fundamental: historical, architectural, archaeological, environmental, social, economic, etc. An adequate diagnosis of regeneration requires describing buildings and their surroundings by means of documentation and plans. However, at this point in time the foregoing is considerably far removed from the real situation, since more often than not it is extremely difficult to obtain full documentation and cartography of an acceptable quality, as the material on constructive pathologies and systems is often insufficient or deficient (plans that simply reflect levels, isolated photographs, ...). Sometimes the information in reality exists, but this fact is not known, or it is not easily accessible, leading to the unnecessary duplication of efforts and resources.
In this paper, we discuss 3D laser scanning technology, which can acquire high-density point data in an accurate, fast way. The scanner can digitize all the 3D information concerning a real-world object such as buildings, trees and terrain down to millimetre detail. It can therefore benefit the refurbishment process in regeneration in the Built Environment and is a potential solution to the challenges above. The paper introduces an approach for scanning buildings, processing the point cloud raw data, and a modelling approach for CAD extraction and building object classification by a pattern matching approach in IFC (Industry Foundation Classes) format. The approach presented in this paper can lead to parametric design and Building Information Modelling (BIM) for existing structures. Two case studies are introduced to demonstrate the use of laser scanner technology in the Built Environment: the Jactin House Building in East Manchester and the Peel Building on the campus of the University of Salford. Through these case studies, the use of laser scanners is explained, and their integration with various technologies and systems is explored for professionals in the Built Environment.
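As a toy illustration of one point-cloud processing step that such a pipeline might involve, the sketch below uses RANSAC plane fitting to isolate a planar surface (e.g. a floor or wall) from scanner points; the data, threshold and iteration count are made up:

```python
import random

def plane_from_points(p1, p2, p3):
    """Plane (a, b, c, d) with ax + by + cz + d = 0 through three points."""
    u = [p2[i] - p1[i] for i in range(3)]
    v = [p3[i] - p1[i] for i in range(3)]
    n = (u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0])
    d = -sum(n[i] * p1[i] for i in range(3))
    return (*n, d)

def ransac_plane(points, threshold=0.05, iterations=200, seed=0):
    """Return the inlier set of the best-supported plane."""
    rng = random.Random(seed)
    best = []
    for _ in range(iterations):
        a, b, c, d = plane_from_points(*rng.sample(points, 3))
        norm = (a*a + b*b + c*c) ** 0.5
        if norm < 1e-9:
            continue  # degenerate (collinear) sample
        inliers = [p for p in points
                   if abs(a*p[0] + b*p[1] + c*p[2] + d) / norm < threshold]
        if len(inliers) > len(best):
            best = inliers
    return best

# A flat "floor" grid at z = 0 plus two off-plane outlier points.
floor = [(x * 0.1, y * 0.1, 0.0) for x in range(10) for y in range(10)]
noise = [(0.5, 0.5, 2.0), (0.2, 0.8, -1.5)]
inliers = ransac_plane(floor + noise)
```

A real scan-to-BIM pipeline would iterate this over millions of points, subtract each extracted surface, and then match the surfaces against IFC object patterns.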
Spatial decomposition of on-nucleus spectra of quasar host galaxies
In order to study the host galaxies of type 1 (broad-line) quasars, we
present a semi-analytic modelling method to decompose the on-nucleus spectra of
quasars into nuclear and host galaxy channels. The method uses the spatial
information contained in long-slit or slitlet spectra. A routine determines the
best fitting combination of the spatial distribution of the point like nucleus
and extended host galaxy. Inputs are a simultaneously observed PSF, and
external constraints on galaxy morphology from imaging. We demonstrate the
capabilities of the method on two samples totalling 18 quasars observed
with EFOSC at the ESO 3.6m telescope and FORS1 at the ESO VLT.
~50% of the host galaxies with successful decomposition show distortions in
their rotation curves or peculiar gas velocities above normal maximum
velocities for disks. This is consistent with the fraction from optical
imaging. All host galaxies have quite young stellar populations, typically 1-2
Gyr. For the disk-dominated hosts these are consistent with their inactive
counterparts; for the bulge-dominated hosts, the luminosity-weighted stellar
ages are much younger than those of inactive early-type galaxies. While this
presents further evidence for a connection of galaxy interaction and AGN
activity for half of the sample, this is not clear for the other half: These
are often undistorted, disk-dominated host galaxies, and lower-level
interaction might only be detectable in deeper high-resolution images or deeper
spectroscopic data. The velocity information does not show obvious signs of
large-scale outflows triggered by AGN feedback; the data are consistent with
velocity fields created by galaxy interaction.
Comment: Accepted for publication in MNRAS; 19 pages, 12 figures
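The core of such a spatial decomposition can be sketched as a per-wavelength linear least-squares fit of the slit profile to a PSF plus an extended galaxy profile. The Gaussian profiles and amplitudes below are synthetic placeholders, not real EFOSC/FORS1 data:

```python
import numpy as np

# Slit position in arbitrary spatial units.
x = np.linspace(-5.0, 5.0, 101)

# Point-like nucleus (narrow PSF) and extended host (broad profile).
psf = np.exp(-0.5 * (x / 0.6) ** 2)
galaxy = np.exp(-0.5 * (x / 2.5) ** 2)

# A mock observed slit profile at one wavelength: a known mix of both.
true_nuc, true_host = 3.0, 1.5
observed = true_nuc * psf + true_host * galaxy

# Solve for the best-fitting (nucleus, host) amplitudes by linear
# least squares; repeating this per wavelength bin yields separated
# nuclear and host-galaxy spectra.
A = np.column_stack([psf, galaxy])
(nuc_flux, host_flux), *_ = np.linalg.lstsq(A, observed, rcond=None)
```

In practice the PSF is measured from a simultaneously observed star, and imaging constraints fix the host profile shape so that only the two amplitudes are free at each wavelength.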