A conjugate gradient algorithm for the astrometric core solution of Gaia
The ESA space astrometry mission Gaia, planned to be launched in 2013, has
been designed to make angular measurements on a global scale with
micro-arcsecond accuracy. A key component of the data processing for Gaia is
the astrometric core solution, which must implement an efficient and accurate
numerical algorithm to solve the resulting, extremely large least-squares
problem. The Astrometric Global Iterative Solution (AGIS) is a framework that
makes it possible to implement a range of different iterative solution schemes suitable
for a scanning astrometric satellite. In order to find a computationally
efficient and numerically accurate iteration scheme for the astrometric
solution, compatible with the AGIS framework, we study an adaptation of the
classical conjugate gradient (CG) algorithm, and compare it to the so-called
simple iteration (SI) scheme that was previously known to converge for this
problem, although very slowly. The different schemes are implemented within a
software test bed for AGIS known as AGISLab, which makes it possible to define, simulate,
and study scaled astrometric core solutions. After successful testing in
AGISLab, the CG scheme has also been implemented in AGIS. The CG and SI
algorithms eventually converge to identical solutions, to within the numerical
noise (of the order of 0.00001 micro-arcsec). These solutions are independent
of the starting values (initial star catalogue), and we conclude that they are
equivalent to a rigorous least-squares estimation of the astrometric
parameters. The CG scheme converges up to a factor of four faster than SI in the
tested cases, and in particular spatially correlated truncation errors are much
more efficiently damped out with the CG scheme.
Comment: 24 pages, 16 figures. Accepted for publication in Astronomy & Astrophysics.
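As a rough illustration of the algorithmic comparison described above, the sketch below applies the classical conjugate gradient method and a Jacobi-style simple iteration to the normal equations of a small synthetic least-squares problem. It is not the AGIS implementation; the problem size, matrices, and stopping criteria are illustrative assumptions.

```python
# Minimal sketch (not the AGIS implementation): conjugate gradients on the
# normal equations N x = b of a linear least-squares problem, compared with
# a Jacobi-style "simple iteration". The toy problem is made up.
import numpy as np

def cg_normal_equations(N, b, x0, n_iter=100, tol=1e-12):
    """Classical CG for a symmetric positive-definite system N x = b."""
    x = x0.copy()
    r = b - N @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(n_iter):
        Np = N @ p
        alpha = rs / (p @ Np)
        x += alpha * p
        r -= alpha * Np
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p   # conjugate direction update
        rs = rs_new
    return x

def simple_iteration(N, b, x0, n_iter=100):
    """Jacobi-style update; convergence depends on the spectrum of N."""
    D_inv = 1.0 / np.diag(N)
    x = x0.copy()
    for _ in range(n_iter):
        x += D_inv * (b - N @ x)
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(500, 50))          # design matrix of the toy problem
y = rng.normal(size=500)                # observations
N, b = A.T @ A, A.T @ y                 # normal equations
x_cg = cg_normal_equations(N, b, np.zeros(50))
x_si = simple_iteration(N, b, np.zeros(50), n_iter=2000)
print(np.max(np.abs(x_cg - x_si)))      # both converge to the same solution
```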
A census of Oph candidate members from Gaia DR2
The Ophiuchus cloud complex is one of the best laboratories to study the
early stages of stellar and protoplanetary disc evolution. The wealth of
accurate astrometric measurements contained in the Gaia Data Release 2 can be
used to update the census of Ophiuchus member candidates. We seek to find
potential new members of Ophiuchus and identify those surrounded by a
circumstellar disc. We constructed a control sample composed of 188 bona fide
Ophiuchus members. Using this sample as a reference we applied three different
density-based machine learning clustering algorithms (DBSCAN, OPTICS, and
HDBSCAN) to a sample drawn from the Gaia catalogue centred on the Ophiuchus
cloud. The clustering analysis was applied in the five astrometric dimensions
defined by the three-dimensional Cartesian space and the proper motions in
right ascension and declination. The three clustering algorithms systematically
identify a similar set of candidate members in a main cluster with astrometric
properties consistent with those of the control sample. The increased
flexibility of the OPTICS and HDBSCAN algorithms enables these methods to
identify a secondary cluster. We constructed a common sample containing 391
member candidates including 166 new objects, which have not yet been discussed
in the literature. By combining the Gaia data with 2MASS and WISE photometry,
we built the spectral energy distributions from 0.5 to 22 μm for a subset
of 48 objects and found a total of 41 discs, including 11 new Class II discs
and one new Class III disc. Density-based clustering algorithms are a promising tool to
identify candidate members of star-forming regions in large astrometric
databases. If confirmed, the candidate members discussed in this work would
represent an increase of roughly 40% in the current census of Ophiuchus.
Comment: A&A, accepted. Abridged abstract.
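The following sketch illustrates the kind of density-based clustering step described above, running scikit-learn's DBSCAN on a synthetic five-dimensional astrometric sample (Cartesian X, Y, Z plus the two proper motions). The data, scaling, and hyperparameters (eps, min_samples) are assumptions for illustration and do not reproduce the paper's analysis.

```python
# Hedged sketch of the clustering step: DBSCAN in a 5-D astrometric space
# (Cartesian X, Y, Z plus proper motions pmra, pmdec). The data are synthetic;
# column order, scaling, and eps/min_samples are illustrative only.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Fake "cluster" members around a common position/motion, plus field stars.
members = rng.normal(loc=[140.0, -20.0, 45.0, -7.0, -25.0],
                     scale=[3.0, 3.0, 3.0, 0.8, 0.8], size=(200, 5))
field = rng.uniform(low=[100, -60, 0, -30, -50],
                    high=[180, 20, 90, 20, 10], size=(800, 5))
X = np.vstack([members, field])   # columns: X, Y, Z [pc], pmra, pmdec [mas/yr]

# Standardise so positions and proper motions carry comparable weight.
Xs = StandardScaler().fit_transform(X)
labels = DBSCAN(eps=0.3, min_samples=10).fit_predict(Xs)

for lab in np.unique(labels):
    tag = "noise" if lab == -1 else f"cluster {lab}"
    print(tag, (labels == lab).sum())
```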
Absorption reconstruction improves biodistribution assessment of fluorescent nanoprobes using hybrid fluorescence-mediated tomography
Aim: Fluorescence-mediated tomography (FMT) holds potential for accelerating diagnostic and theranostic drug development. However, proper quantitative fluorescence reconstruction requires knowledge of optical scattering and absorption, which are highly heterogeneous across different (mouse) tissues. Here we describe methods to assess these parameters using co-registered micro computed tomography (µCT) data and nonlinear whole-animal absorption reconstruction, and we evaluate their importance for assessing the biodistribution and target site accumulation of fluorophore-labeled drug delivery systems.

Methods: Besides phantoms with varying degrees of absorption, mice bearing A431 tumors were imaged 15 min and 48 h after i.v. injection of a fluorophore-labeled polymeric drug carrier (pHPMA-Dy750) using µCT-FMT. The outer shape of the mice and a scattering map were derived using automated segmentation of the µCT data. Furthermore, a 3D absorption map was reconstructed from the trans-illumination data. We determined the absorption of five interactively segmented regions (heart, liver, kidney, muscle, tumor). Since blood is the main near-infrared absorber in vivo, the absorption was also estimated from the relative blood volume (rBV), determined by contrast-enhanced µCT. We compared the reconstructed absorption with the rBV-based values and analyzed the effect of using the absorption map on the fluorescence reconstruction.

Results: Phantom experiments demonstrated that absorption reconstruction is possible and necessary for quantitative fluorescence reconstruction. In vivo, the reconstructed absorption showed high values in strongly blood-perfused organs such as the heart, liver and kidney. The absorption values correlated strongly with the rBV-based absorption values, confirming the accuracy of the absorption reconstruction. Using a homogeneous absorption value instead of the reconstructed absorption map resulted in reduced values in the heart, liver and kidney, by factors of 3.5, 2.1 and 1.4, respectively. For muscle and subcutaneous tumors, which have a much lower rBV and absorption, absorption reconstruction was less important.

Conclusion: Quantitative whole-animal absorption reconstruction is possible and can be validated in vivo using the rBV. Using an absorption map is important when quantitatively assessing the biodistribution of fluorescently labeled drugs and drug delivery systems, to avoid a systematic underestimation of fluorescence in strongly absorbing organs such as the heart, liver and kidney.
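As a hedged illustration of how rBV can be turned into an absorption estimate, the snippet below uses a simple linear mixing model in which tissue absorption is the blood-volume-weighted absorption of whole blood plus a small background term. The coefficients and rBV values are placeholders, not the calibrated values used in the study.

```python
# Hedged sketch: estimating a tissue absorption coefficient from the relative
# blood volume (rBV), assuming blood is the dominant near-infrared absorber.
# mu_a values below are illustrative placeholders, not calibrated literature values.

MU_A_BLOOD = 4.0        # assumed whole-blood absorption coefficient [1/cm] (placeholder)
MU_A_BACKGROUND = 0.05  # assumed residual absorption of blood-free tissue [1/cm]

def absorption_from_rbv(rbv):
    """Linear mixing model: mu_a = rBV * mu_a_blood + (1 - rBV) * mu_a_background."""
    return rbv * MU_A_BLOOD + (1.0 - rbv) * MU_A_BACKGROUND

# Illustrative rBV values for strongly and weakly perfused regions.
for organ, rbv in [("heart", 0.30), ("liver", 0.25), ("kidney", 0.20),
                   ("muscle", 0.05), ("tumor", 0.08)]:
    print(f"{organ:7s} rBV={rbv:.2f}  mu_a ~ {absorption_from_rbv(rbv):.2f} 1/cm")
```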
Gaia Data Processing Architecture
Gaia is ESA's ambitious space astrometry mission, the main objective of which
is to astrometrically and spectro-photometrically map 1000 million celestial
objects (mostly in our Galaxy) with unprecedented accuracy. The announcement of
opportunity for the data processing will be issued by ESA late in 2006. The
Gaia Data Processing and Analysis Consortium (DPAC) has been formed recently
and is preparing an answer. The satellite will downlink close to 100 TB of raw
telemetry data over 5 years. To achieve the required astrometric accuracy of a
few tens of microarcseconds, highly involved processing of these data is
required.
In addition to the main astrometric instrument, Gaia will host a Radial
Velocity instrument, two low-resolution dispersers for multi-color photometry
and two Star Mappers. Gaia is a flying gigapixel camera. The various
instruments each require relatively complex processing while at the same time
being interdependent. We describe the overall composition of the DPAC and the
envisaged overall architecture of the Gaia data processing system. We then
delve further into the core processing, one of the nine so-called coordination
units that make up the Gaia processing system.
Comment: 10 pages, 2 figures. To appear in ADASS XVI Proceedings.
Methane in the Baltic and North Seas and a reassessment of the marine emissions of methane
During three measurement campaigns on the Baltic and North Seas, atmospheric and dissolved methane was determined with an automated gas chromatographic system. Area-weighted mean saturation values in the sea surface waters were 113 ± 5% and 395 ± 82% (Baltic Sea, February and July 1992) and 126 ± 8% (south-central North Sea, September 1992). On the basis of our data and a compilation of literature data, the global oceanic emissions of methane were reassessed by introducing a concept of regional gas transfer coefficients. Our estimates, computed with two different air-sea exchange models, lie in the range of 11-18 Tg CH4 yr^-1. Although shelf areas and estuaries represent only a small part of the world's ocean, they contribute about 75% of the global oceanic emissions. We applied a simple, coupled, three-layer model to numerically simulate the time-dependent variation of the oceanic flux to the atmosphere. The model calculations indicate that even with increasing tropospheric methane concentrations, the ocean will remain a source of atmospheric methane.
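For orientation, the sketch below shows the standard sea-air flux bookkeeping, F = k_w (C_w - C_eq), that underlies such emission estimates; the transfer velocity and equilibrium concentration used here are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of the air-sea flux bookkeeping behind such estimates:
# F = k_w * (C_w - C_eq), with C_eq the seawater concentration in equilibrium
# with the atmospheric mixing ratio. Numbers below are illustrative only.
def sea_air_flux(c_water_nmol_L, c_equilibrium_nmol_L, k_w_cm_hr):
    """Return flux in micromol CH4 per m^2 per day (positive = outgassing)."""
    delta_c = c_water_nmol_L - c_equilibrium_nmol_L   # nmol/L == umol/m^3
    k_w_m_day = k_w_cm_hr * 24.0 / 100.0              # cm/h -> m/day
    return k_w_m_day * delta_c                        # umol m^-2 day^-1

# Example: 395% saturation (the July Baltic value quoted above) with an assumed
# equilibrium concentration of 3 nmol/L and a transfer velocity of 10 cm/h.
c_eq = 3.0
c_w = 3.95 * c_eq
print(f"{sea_air_flux(c_w, c_eq, 10.0):.1f} umol CH4 m^-2 day^-1")
```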
A Hybrid Finite-Volume/Transported PDF Model for Simulations of Turbulent Flames on Vector Machines
The dependence of dijet production on photon virtuality in ep collisions at HERA
The dependence of dijet production on the virtuality of the exchanged photon,
Q^2, has been studied by measuring dijet cross sections in the range 0 < Q^2 <
2000 GeV^2 with the ZEUS detector at HERA using an integrated luminosity of
38.6 pb^-1.
Dijet cross sections were measured for jets with transverse energy E_T^jet >
7.5 and 6.5 GeV and pseudorapidities in the photon-proton centre-of-mass frame
in the range -3 < eta^jet < 0. The variable xg^obs, a measure of the photon
momentum entering the hard process, was used to enhance the sensitivity of the
measurement to the photon structure. The Q^2 dependence of the ratio of low- to
high-xg^obs events was measured.
Next-to-leading-order QCD predictions were found to generally underestimate
the low-xg^obs contribution relative to that at high xg^obs. Monte Carlo models
based on leading-logarithmic parton-showers, using a partonic structure for the
photon which falls smoothly with increasing Q^2, provide a qualitative
description of the data.
Comment: 35 pages, 6 EPS figures, submitted to Eur. Phys. J.
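For reference, a commonly used estimator of the photon momentum fraction entering the hard scatter is xg^obs = sum over the two jets of E_T^jet exp(-eta^jet) / (2 y E_e); the sketch below evaluates it for an illustrative event, assuming the HERA lepton beam energy of 27.5 GeV and lab-frame pseudorapidities. The exact definition and reference frame used in this analysis may differ.

```python
# Hedged sketch of a common estimator of the photon momentum fraction entering
# the hard scatter (the exact definition used in the analysis may differ):
#   x_gamma_obs = sum_{two jets} E_T^jet * exp(-eta^jet) / (2 * y * E_e)
import math

def x_gamma_obs(jets, y, e_electron=27.5):
    """jets: list of (E_T [GeV], lab-frame eta) pairs; y: inelasticity; E_e in GeV."""
    numerator = sum(et * math.exp(-eta) for et, eta in jets)
    return numerator / (2.0 * y * e_electron)

# Purely illustrative event: two jets above the quoted E_T thresholds.
print(x_gamma_obs([(8.0, 0.5), (7.0, -0.2)], y=0.5))   # ~0.49, resolved-photon-like
```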
Horizontal classifiers. Fundamentals of theory and calculation: monograph
The monograph presents process flow schemes for producing construction sands by hydromechanized mining and the main design schemes of the classifiers used in construction sand production. Particular attention is paid to the interaction of the flow section of a horizontal classifier with an assemblage of solid particles carried in a horizontally accelerated flow of the transport medium. Mathematical modelling of the accelerated motion of the horizontal flow and of the solid particles within the differently inclined surfaces of the horizontal classifier has been carried out. Gravitational settling of the solid particles, considered in terms of its vertical and horizontal components, has been studied experimentally, along with the influence of hindered motion and of the displacement of the solid particles relative to the carrying horizontal flow.
A procedure for calculating and selecting classifier parameters is given, together with information on the experience gained in designing and deploying horizontal classifiers as part of mining complexes for the development of flooded sand deposits.
The monograph may be useful to students, engineering and technical staff, and personnel of higher education institutions, research institutes, and design organizations in the mining industry.
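As a hedged, generic illustration of the gravitational settling discussed in the monograph (not taken from it), the snippet below evaluates the Stokes settling velocity of a sand grain in water and a Richardson-Zaki style hindered-settling correction; all material properties are assumed textbook values.

```python
# Hedged illustration (not taken from the monograph): Stokes settling velocity
# of a sand grain in water, with a Richardson-Zaki style correction for
# hindered settling at higher solids concentrations.
G = 9.81           # m/s^2
RHO_SAND = 2650.0  # kg/m^3 (assumed quartz sand density)
RHO_WATER = 1000.0
MU_WATER = 1.0e-3  # Pa*s

def stokes_velocity(d_m):
    """Terminal settling velocity of a small sphere (valid at low Reynolds number)."""
    return G * d_m**2 * (RHO_SAND - RHO_WATER) / (18.0 * MU_WATER)

def hindered_velocity(d_m, solids_fraction, n=4.65):
    """Richardson-Zaki correction: v = v_stokes * (1 - C)^n."""
    return stokes_velocity(d_m) * (1.0 - solids_fraction) ** n

d = 100e-6  # 0.1 mm grain
print(f"free settling   : {stokes_velocity(d)*1000:.2f} mm/s")
print(f"hindered (C=0.2): {hindered_velocity(d, 0.2)*1000:.2f} mm/s")
```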
The relative and absolute timing accuracy of the EPIC-pn camera on XMM-Newton, from X-ray pulsations of the Crab and other pulsars
Reliable timing calibration is essential for the accurate comparison of
XMM-Newton light curves with those from other observatories, to ultimately use
them to derive precise physical quantities. The XMM-Newton timing calibration
is based on pulsar analysis. However, as pulsars show both timing noise and
glitches, it is essential to monitor these calibration sources regularly. To
this end, the XMM-Newton observatory performs observations twice a year of the
Crab pulsar to monitor the absolute timing accuracy of the EPIC-pn camera in
the fast Timing and Burst modes. We present the results of this monitoring
campaign, comparing XMM-Newton data from the Crab pulsar (PSR B0531+21) with
radio measurements. In addition, we use five pulsars (PSR J0537-69, PSR
B0540-69, PSR B0833-45, PSR B1509-58 and PSR B1055-52) with periods ranging
from 16 ms to 197 ms to verify the relative timing accuracy. We analysed 38
XMM-Newton observations (0.2-12.0 keV) of the Crab taken over the first ten
years of the mission and 13 observations from the five complementary pulsars.
All the data were processed with the XMM-Newton Scientific Analysis Software
(SAS), version 9.0. Epoch-folding techniques coupled with chi^2 tests
were used to derive relative timing accuracies. The absolute timing accuracy
was determined using the Crab data and comparing the time shift between the
main X-ray and radio peaks in the phase folded light curves. The relative
timing accuracy of XMM-Newton is found to be better than 10^{-8}. The strongest
X-ray pulse peak precedes the corresponding radio peak by 306 ± 9 μs, which
is in agreement with other high-energy observatories such as Chandra, INTEGRAL
and RXTE. The absolute timing accuracy derived from our analysis is ±48 μs.
Comment: 16 pages, 9 figures. Accepted for publication in A&A.
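The sketch below illustrates the epoch-folding chi^2 approach in its simplest form: fold photon arrival times on trial periods and pick the period that maximises the chi^2 of the phase histogram against a flat profile. The simulated event list and period grid are illustrative, not XMM-Newton calibration data.

```python
# Hedged sketch of the epoch-folding / chi^2 approach used for relative timing:
# fold photon arrival times on trial periods and look for the period that
# maximises the chi^2 of the phase histogram against a flat distribution.
import numpy as np

def epoch_fold_chi2(times, period, n_bins=20):
    """chi^2 of the folded phase histogram against a uniform pulse profile."""
    phases = (times / period) % 1.0
    counts, _ = np.histogram(phases, bins=n_bins, range=(0.0, 1.0))
    expected = counts.sum() / n_bins
    return ((counts - expected) ** 2 / expected).sum()

# Simulate a weak 33.5 ms pulsation (Crab-like period; toy data only).
rng = np.random.default_rng(2)
true_period = 0.0335
t = np.sort(rng.uniform(0.0, 100.0, size=200_000))
pulsed = np.sin(2 * np.pi * t / true_period) > 0.9   # crude pulse selection
times = np.concatenate([t, t[pulsed]])               # add extra "pulsed" events

trial_periods = true_period + np.linspace(-5e-6, 5e-6, 41)
chi2 = [epoch_fold_chi2(times, p) for p in trial_periods]
best = trial_periods[int(np.argmax(chi2))]
print(f"best trial period: {best*1e3:.6f} ms (true {true_period*1e3:.6f} ms)")
```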
