Optimal gridding and degridding in radio interferometry imaging
In radio interferometry imaging, the gridding procedure of convolving
visibilities with a chosen gridding function is necessary to transform
visibility values into uniformly sampled grid points. We propose here a
parameterised family of "least-misfit gridding functions" which minimise an
upper bound on the difference between the DFT and FFT dirty images for a given
gridding support width and image cropping ratio. When compared with the widely
used spheroidal function with similar parameters, these provide more than 100
times better alias suppression and RMS misfit reduction over the usable dirty
map. We discuss how appropriate parameter selection and tabulation of these
functions allow for a balance between accuracy, computational cost and storage
size. Although it is possible to reduce the errors introduced in the gridding
or degridding process to the level of machine precision, accuracy comparable to
that achieved by CASA requires only a lookup table with 300 entries and a
support width of 3, allowing for a greatly reduced computation cost for a given performance.
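The convolutional gridding step described above can be sketched in a few lines. This is an illustrative toy, not the paper's least-misfit implementation: it grids 1-D visibilities onto a uniform grid, with a Gaussian taper standing in for the least-misfit function, and the function name and parameters are hypothetical.

```python
import numpy as np

def grid_visibilities(u, vis, n_grid, support=3, beta=2.0):
    """Grid 1-D visibility samples at coordinates u onto n_grid uniform
    cells by convolving each sample with a kernel of half-width `support`
    cells. A Gaussian taper stands in for the least-misfit function."""
    grid = np.zeros(n_grid, dtype=complex)
    cells = np.arange(n_grid)
    for ui, vi in zip(u, vis):
        centre = int(round(ui))
        lo = max(centre - support, 0)
        hi = min(centre + support + 1, n_grid)
        kernel = np.exp(-beta * (cells[lo:hi] - ui) ** 2)  # truncated taper
        grid[lo:hi] += vi * kernel
    return grid
```

In a real imager the gridded array would then be FFT'd to form the dirty image, with the image divided by the kernel's transform to undo the taper.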
MEM and CLEAN Imaging of VLBA Polarisation Observations of Compact Active Galactic Nuclei
The Maximum Entropy Method (MEM) for the deconvolution of radio interferometry images is mathematically well founded and presents a number of
advantages over the usual CLEAN deconvolution, such as appreciably higher
resolution. The application of MEM for polarisation imaging remains relatively
little studied. CLEAN and MEM intensity and polarisation techniques are
discussed in application to recently obtained 18cm VLBA polarisation data for a
sample of Active Galactic Nuclei. Comment: From the proceedings of Beamed and Unbeamed Gamma-Rays from Galaxies, April 11-15, 2011, Muonio, Finland. 6 pages, 3 figures.
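For readers unfamiliar with the CLEAN deconvolution that MEM is compared against, a minimal Högbom-style CLEAN loop looks roughly like this (an illustrative sketch, not the imaging code used for the VLBA data; names and defaults are hypothetical):

```python
import numpy as np

def hogbom_clean(dirty, psf, gain=0.1, niter=500, threshold=1e-4):
    """Minimal Hogbom CLEAN: repeatedly subtract a scaled, shifted PSF at
    the brightest residual pixel, accumulating delta-function components.
    `psf` must be the same shape as `dirty` with its peak at the centre."""
    residual = dirty.astype(float).copy()
    model = np.zeros_like(residual)
    centre = np.array(psf.shape) // 2
    for _ in range(niter):
        peak = np.unravel_index(np.argmax(np.abs(residual)), residual.shape)
        if abs(residual[peak]) < threshold:
            break
        flux = gain * residual[peak]
        model[peak] += flux
        # shift the PSF onto the peak (circular roll; fine for a compact PSF)
        shifted = np.roll(np.roll(psf, peak[0] - centre[0], axis=0),
                          peak[1] - centre[1], axis=1)
        residual -= flux * shifted
    return model, residual
```

The returned delta-component model would normally be convolved with a clean beam and added back to the residuals; MEM instead solves for a smooth positive image directly, which is the source of its resolution advantage.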
Bayesian astrostatistics: a backward look to the future
This perspective chapter briefly surveys: (1) past growth in the use of
Bayesian methods in astrophysics; (2) current misconceptions about both
frequentist and Bayesian statistical inference that hinder wider adoption of
Bayesian methods by astronomers; and (3) multilevel (hierarchical) Bayesian
modeling as a major future direction for research in Bayesian astrostatistics,
exemplified in part by presentations at the first ISI invited session on
astrostatistics, commemorated in this volume. It closes with an intentionally
provocative recommendation for astronomical survey data reporting, motivated by
the multilevel Bayesian perspective on modeling cosmic populations: that
astronomers cease producing catalogs of estimated fluxes and other source
properties from surveys. Instead, summaries of likelihood functions (or
marginal likelihood functions) for source properties should be reported (not
posterior probability density functions), including nontrivial summaries (not
simply upper limits) for candidate objects that do not pass traditional
detection thresholds. Comment: 27 pp, 4 figures. A lightly revised version of a chapter in "Astrostatistical Challenges for the New Astronomy" (Joseph M. Hilbe, ed., Springer, New York, forthcoming in 2012), the inaugural volume for the Springer Series in Astrostatistics. Version 2 has minor clarifications and an additional reference.
A dusty pinwheel nebula around the massive star WR 104
Wolf-Rayet (WR) stars are luminous, massive blue stars thought to be immediate precursors of the supernovae that terminate their brief lives. The existence of
dust shells around such stars has been enigmatic since their discovery some 30
years ago; the intense radiation field from the star should be inimical to dust
survival. Although dust-creation models, including those involving interacting
stellar winds from a companion star, have been put forward, high-resolution
observations are required to understand this phenomenon. Here we present resolved images of the dust outflow around the Wolf-Rayet star WR 104, obtained with
novel imaging techniques, revealing detail on scales corresponding to about 40
AU at the star. Our maps show that the dust forms a spatially confined stream
following precisely a linear (or Archimedean) spiral trajectory. Images taken
at two separate epochs show a clear rotation with a period of 220 +/- 30 days.
Taken together, these findings prove that a binary star is responsible for the
creation of the circumstellar dust, while the spiral plume makes WR 104 the
prototype of a new class of circumstellar nebulae unique to interacting wind
systems. Comment: 7 pages, 2 figures. Appearing in Nature (1999 April 08).
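The Archimedean-spiral geometry inferred for the dust plume can be written down directly: dust launched at time t and carried radially outward sits at radius r = v_wind·t and azimuth θ = 2πt/P. A sketch using the measured 220-day period and a purely illustrative placeholder wind speed:

```python
import numpy as np

def pinwheel_spiral(t_days, period_days=220.0, v_wind_au_per_day=0.01):
    """Sky-plane positions (in AU) of dust launched continuously from a
    rotating binary: an Archimedean spiral r = v_wind * t, theta = 2*pi*t/P.
    The wind speed here is an illustrative placeholder, not a fitted value."""
    r = v_wind_au_per_day * np.asarray(t_days, dtype=float)
    theta = 2.0 * np.pi * np.asarray(t_days, dtype=float) / period_days
    return r * np.cos(theta), r * np.sin(theta)
```

Sampling this curve at two epochs separated by a fraction of the period reproduces the rotation seen between the observed images.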
Neural Network Parameterizations of Electromagnetic Nucleon Form Factors
The electromagnetic nucleon form-factor data are studied with artificial feed-forward neural networks. As a result, unbiased, model-independent form-factor parametrizations are obtained together with their uncertainties. The Bayesian approach to neural networks is adapted to a chi2-type error function and applied to the data analysis. A sequence of feed-forward neural networks with one hidden layer of units is considered, each network representing a particular form-factor parametrization. The so-called evidence (a measure of how much the data favour a given statistical model) is computed within the Bayesian framework and used to determine the best form-factor parametrization. Comment: The revised version is divided into 4 sections. The discussion of the prior assumptions is added. The manuscript contains 4 new figures and 2 new tables (32 pages, 15 figures, 2 tables).
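A one-hidden-layer feed-forward parametrization of the kind described, together with the chi2-type error function, might be sketched as follows (layer sizes, names and the tanh activation are illustrative assumptions, not the networks used in the paper):

```python
import numpy as np

def ff_network(Q2, weights):
    """One-hidden-layer feed-forward net mapping Q^2 to a scalar form
    factor. weights = (W1, b1, W2, b2); tanh hidden units, linear output.
    Layer sizes are illustrative, not those selected by the evidence."""
    W1, b1, W2, b2 = weights
    h = np.tanh(np.outer(Q2, W1) + b1)   # shape (n_points, n_hidden)
    return h @ W2 + b2

def chi2(weights, Q2, G_data, sigma):
    """chi^2-type error function comparing the net to form-factor data."""
    return np.sum(((ff_network(Q2, weights) - G_data) / sigma) ** 2)
```

In the Bayesian scheme the evidence for each hidden-layer size would then be estimated (e.g. via a Gaussian approximation around the chi2 minimum) and used to rank the parametrizations.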
In-cell NMR characterization of the secondary structure populations of a disordered conformation of α-Synuclein within E. coli cells
α-Synuclein is a small protein strongly implicated in the pathogenesis of Parkinson’s disease and related neurodegenerative disorders. We report here the use of in-cell NMR spectroscopy to observe directly the structure and dynamics of this protein within E. coli cells. To improve the accuracy in the measurement of backbone chemical shifts within crowded in-cell NMR spectra, we have developed a deconvolution method to reduce inhomogeneous line broadening within cellular samples. The resulting chemical shift values were then used to evaluate the distribution of secondary structure populations which, in the absence of stable tertiary contacts, are a most effective way to describe the conformational fluctuations of disordered proteins. The results indicate that, at least within the bacterial cytosol, α-synuclein populates a highly dynamic state that, despite the highly crowded environment, has the same characteristics as the disordered monomeric form observed in aqueous solution.
A dusty torus around the luminous young star LkHa 101
A star forms when a cloud of dust and gas collapses. It is generally believed
that this collapse first produces a flattened rotating disk, through which
matter is fed onto the embryonic star at the center of the disk. When the
temperature and density at the center of the star pass a critical threshold,
thermonuclear fusion begins. The remaining disk, which can still contain up to
0.3 times the mass of the star, is then sculpted and eventually dissipated by
the radiation and wind from the newborn star. Unfortunately this picture of the
structure and evolution of the disk remains speculative because of the lack of
morphological data of sufficient resolution and uncertainties regarding the
underlying physical processes. Here we present images of a young star, LkHa 101, in which the structure of the inner accretion disk is resolved. We
find that the disk is almost face-on, with a central gap (or cavity) and a hot
inner edge. The cavity is bigger than previous theoretical predictions, and we
infer that the position of the inner edge is probably determined by sublimation
of dust grains by direct stellar radiation, rather than by disk reprocessing or
the viscous heating processes as usually assumed. Comment: 7 pages, 1 figure. Appears in Nature, 22 Feb 2001 (Vol 409).
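The inference that the inner edge sits at the dust sublimation radius follows from radiative equilibrium: a grey blackbody grain at distance R from a star of luminosity L reaches T = (L / 16πσR²)^(1/4), so R_sub = sqrt(L / 16πσT_sub⁴). A sketch with assumed, illustrative inputs (the sublimation temperature and luminosity are placeholders, not fitted values from the paper):

```python
import math

SIGMA_SB = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
L_SUN = 3.828e26            # solar luminosity, W
AU = 1.495978707e11         # astronomical unit, m

def sublimation_radius_au(L_lsun, T_sub=1500.0):
    """Radius (AU) at which a grey blackbody grain heated by direct
    stellar radiation reaches the dust sublimation temperature:
    R = sqrt(L / (16 * pi * sigma * T^4)). Inputs are illustrative."""
    L = L_lsun * L_SUN
    return math.sqrt(L / (16.0 * math.pi * SIGMA_SB * T_sub ** 4)) / AU
```

The R ∝ sqrt(L) scaling is why a luminous young star like LkHa 101 can carry a much larger cleared cavity than a solar-type star.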
Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy
Background
A reliable system for grading operative difficulty of laparoscopic cholecystectomy would standardise description of findings and reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets.
Methods
Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and the single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall’s tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis.
Results
A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001).
Conclusion
We have shown that an operative difficulty scale can standardise the description of operative findings by multiple grades of surgeons to facilitate audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
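The AUROC figures quoted above have a simple interpretation: the probability that a randomly chosen case with the outcome is scored higher than a randomly chosen case without it. A minimal, dependency-free sketch of that rank-based computation (not the study's analysis code):

```python
def auroc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    the fraction of positive/negative pairs in which the positive case
    receives the higher score, counting ties as one half."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

An AUROC of 0.903 for conversion to open surgery therefore means the difficulty grade orders a converted/non-converted pair correctly about nine times in ten.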
A MAP6-Related Protein Is Present in Protozoa and Is Involved in Flagellum Motility
In vertebrates the microtubule-associated proteins MAP6 and MAP6d1 stabilize cold-resistant microtubules. Cilia and flagella have cold-stable microtubules but MAP6 proteins have not been identified in these organelles. Here, we describe TbSAXO as the first MAP6-related protein to be identified in a protozoan, Trypanosoma brucei. Using a heterologous expression system, we show that TbSAXO is a microtubule stabilizing protein. Furthermore, we identify the domains of the protein responsible for microtubule binding and stabilizing and show that they share homologies with the microtubule-stabilizing Mn domains of the MAP6 proteins. We demonstrate, in the flagellated parasite, that TbSAXO is an axonemal protein that plays a role in flagellum motility. Lastly, we provide evidence that TbSAXO belongs to a group of MAP6-related proteins (SAXO proteins) present only in ciliated or flagellated organisms ranging from protozoa to mammals. We discuss the potential roles of the SAXO proteins in cilia and flagella function.
Probabilistic downscaling of remote sensing data with applications for multi-scale biogeochemical flux modeling
Upscaling ecological information to larger scales in space and downscaling remote sensing observations or model simulations to finer scales remain grand challenges in Earth system science. Downscaling often involves inferring subgrid information from coarse-scale data, and such ill-posed problems are classically addressed using regularization. Here, we apply two-dimensional Tikhonov Regularization (2DTR) to simulate subgrid surface patterns for ecological applications. Specifically, we test the ability of 2DTR to simulate the spatial statistics of high-resolution (4 m) remote sensing observations of the normalized difference vegetation index (NDVI) in a tundra landscape. We find that the 2DTR approach as applied here can capture the major mode of spatial variability of the high-resolution information, but not multiple modes of spatial variability, and that the Lagrange multiplier (γ) used to impose the condition of smoothness across space is related to the range of the experimental semivariogram. We used observed and 2DTR-simulated maps of NDVI to estimate landscape-level leaf area index (LAI) and gross primary productivity (GPP). NDVI maps simulated using a γ value that approximates the range of observed NDVI result in a landscape-level GPP estimate that differs by ca. 2% from those created using observed NDVI. Following findings that GPP per unit LAI is lower near vegetation patch edges, we simulated vegetation patch edges using multiple approaches and found that simulated GPP declined by up to 12% as a result. 2DTR can generate random landscapes rapidly and can be applied to disaggregate ecological information and to compare spatial observations against simulated landscapes.
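A one-dimensional analogue of the Tikhonov-regularized downscaling described above can be sketched briefly: minimise ||Ax − y||² + γ||Dx||², where A block-averages the fine grid back to the coarse grid and D is a first-difference operator imposing the smoothness condition. This toy is illustrative only; the paper's 2DTR operates on two-dimensional fields with a semivariogram-informed γ.

```python
import numpy as np

def tikhonov_downscale(coarse, factor, gamma):
    """Downscale a 1-D coarse series to a grid `factor` times finer by
    solving min ||A x - y||^2 + gamma * ||D x||^2, where A block-averages
    the fine grid to the coarse one and D penalises first differences."""
    m = coarse.size
    n = m * factor
    A = np.kron(np.eye(m), np.full((1, factor), 1.0 / factor))  # (m, n)
    D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]                    # (n-1, n)
    # normal equations of the penalised least-squares problem
    return np.linalg.solve(A.T @ A + gamma * D.T @ D, A.T @ coarse)
```

Small γ reproduces the coarse block means almost exactly but admits rough subgrid fields; larger γ trades fidelity to the coarse data for smoothness, which is the knob the paper ties to the semivariogram range.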