A Fluctuation Analysis of the Bolocam 1.1mm Lockman Hole Survey
We perform a fluctuation analysis of the 1.1 mm Bolocam Lockman Hole Survey, which covers 324 square arcmin to a very uniform point-source-filtered RMS noise level of 1.4 mJy/beam. The fluctuation analysis has the significant advantage of utilizing all of the available data. We constrain the number counts in the 1-10 mJy range and derive significantly tighter constraints than in previous work: the power-law index is 2.7 (+0.18, -0.15), while the amplitude is 1595 (+85, -238) sources per mJy per square degree, or N(>1 mJy) = 940 (+50, -140) sources per square degree (95% confidence). Our results agree extremely well with those derived from the extracted source number counts by Laurent et al. (2005). Our derived normalization is about 2.5 times smaller than that determined at 1.2 mm by the MAMBO survey of Greve et al. (2004). However, the uncertainty in the normalization for both data sets is dominated by systematic (i.e., absolute flux calibration) rather than statistical errors; within these uncertainties, our results are in agreement. We estimate that about 7% of the 1.1 mm background has been resolved at 1 mJy.
Comment: To appear in the Astrophysical Journal; 22 pages, 9 figures
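As a quick cross-check of the numbers quoted above (a minimal sketch assuming the standard differential-count parameterisation dN/dS = A (S / 1 mJy)^(-delta), with A and delta set to the best-fit values in the abstract), integrating above 1 mJy recovers the quoted cumulative count:

```python
# Cross-check of the quoted counts, assuming dN/dS = A * (S / 1 mJy)**(-delta).
delta = 2.7    # best-fit power-law index from the abstract
A = 1595.0     # sources per mJy per square degree at S = 1 mJy

# N(>S0) = integral_{S0}^{inf} A * S**(-delta) dS = A * S0**(1 - delta) / (delta - 1)
S0 = 1.0       # mJy
print(A * S0 ** (1 - delta) / (delta - 1))   # ~938 per square degree vs. quoted ~940
```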
Planning and Leveraging Event Portfolios: Towards a Holistic Theory
This conceptual paper seeks to advance the discourse on the leveraging and legacies of events by examining the planning, management, and leveraging of event portfolios. This examination shifts the common focus from analyzing single events towards multiple events and purposes that can enable cross-leveraging among different events in pursuit of attainment and magnification of specific ends. The following frameworks are proposed: (1) event portfolio planning and leveraging, and (2) analyzing events networks and inter-organizational linkages. These frameworks are intended to provide, at this infancy stage of event portfolios research, a solid ground for building theory on the management of different types and scales of events within the context of a portfolio aimed to obtain, optimize and sustain tourism, as well as broader community benefits
Stops making sense: translational trade-offs and stop codon reassignment
Background
Efficient gene expression involves a trade-off between (i) premature termination of protein synthesis; and (ii) readthrough, where the ribosome fails to dissociate at the terminal stop. Sense codons that are similar in sequence to stop codons are more susceptible to nonsense mutation, and are also likely to be more susceptible to transcriptional or translational errors causing premature termination. We therefore expect this trade-off to be influenced by the number of stop codons in the genetic code. Although genetic codes are highly constrained, stop codon number appears to be their most volatile feature.
Results
In the human genome, codons readily mutable to stops are underrepresented in coding sequences. We construct a simple mathematical model based on the relative likelihoods of premature termination and readthrough. When readthrough occurs, the resultant protein has a tail of amino acid residues incorrectly added to the C-terminus. Our results depend strongly on the number of stop codons in the genetic code. When the code has more stop codons, premature termination is relatively more likely, particularly for longer genes. When the code has fewer stop codons, the length of the tail added by readthrough will, on average, be longer, and thus more deleterious. Comparative analysis of taxa with a range of stop codon numbers suggests that genomes whose code includes more stop codons have shorter coding sequences.
Conclusions
We suggest that the differing trade-offs presented by alternative genetic codes may result in differences in genome structure. More speculatively, multiple stop codons may mitigate readthrough, counteracting the disadvantage of a higher rate of nonsense mutation. This could help explain the puzzling overrepresentation of stop codons in the canonical genetic code and most variants
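The simple mathematical model referred to in the Results is not spelled out in this abstract; the sketch below is only an illustrative toy version of the trade-off it describes, with every rate, cost and functional form chosen hypothetically rather than taken from the paper:

```python
# Hypothetical toy model of the premature-termination vs. readthrough trade-off.
# All parameter values and functional forms are illustrative assumptions.

def expected_cost(gene_length, n_stops, p_mis=1e-5, p_readthrough=0.01,
                  cost_premature=1.0, cost_per_tail_aa=0.05):
    # Premature termination: each sense codon carries a small per-codon chance
    # (growing with the number of codons reserved as stops) of ending translation
    # early, so the risk accumulates with gene length.
    p_premature = 1.0 - (1.0 - p_mis * n_stops) ** gene_length
    # Readthrough: missing the terminal stop appends a tail whose expected length
    # is roughly 64 / n_stops codons (mean distance to the next in-frame stop).
    expected_tail = 64.0 / n_stops
    return (p_premature * cost_premature
            + p_readthrough * expected_tail * cost_per_tail_aa)

# More stop codons help short genes (shorter readthrough tails) but penalise long
# genes (more chances of premature termination), echoing the comparative result.
for n_stops in (1, 2, 3, 4):
    print(n_stops, [round(expected_cost(L, n_stops), 3) for L in (100, 500, 2000)])
```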
Detection of involved margins in breast specimens with X-ray phase-contrast computed tomography
Margins of wide local excisions in breast conserving surgery are tested through histology, which can delay results by days and lead to second operations. Detection of margin involvement intraoperatively would allow the removal of additional tissue during the same intervention. X-ray phase contrast imaging (XPCI) provides soft tissue sensitivity superior to conventional X-rays: we propose its use to detect margin involvement intraoperatively. We have developed a system that can perform phase-based computed tomography (CT) scans in minutes, used it to image 101 specimens, approximately half of which contained neoplastic lesions, and compared results against those of a commercial system. Histological analysis was carried out on all specimens and used as the gold standard. XPCI-CT showed higher sensitivity (83%, 95% CI 69–92%) than conventional specimen imaging (32%, 95% CI 20–49%) for detection of lesions at the margin, and comparable specificity (83%, 95% CI 70–92% vs 86%, 95% CI 73–93%). Within the limits of this study, in particular the fact that specimens obtained from surplus tissue typically contain small lesions, which makes detection more difficult for both methods, we believe it likely that the observed increase in sensitivity will lead to a comparable reduction in the number of re-operations
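For reference on the statistics quoted above, sensitivity, specificity and 95% confidence intervals of this kind can be computed from a specimen-level confusion matrix; the sketch below uses Wilson score intervals and purely hypothetical counts (it is not the study's data, nor necessarily its exact interval method):

```python
# Illustrative sensitivity/specificity calculation with 95% Wilson score intervals.
# The 2x2 counts are hypothetical and do not reproduce the study's data.
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return round(centre - half, 3), round(centre + half, 3)

tp, fn = 40, 10   # hypothetical: involved margins detected / missed on imaging
tn, fp = 42, 8    # hypothetical: clear margins correctly / incorrectly called

print("sensitivity", tp / (tp + fn), wilson_ci(tp, tp + fn))
print("specificity", tn / (tn + fp), wilson_ci(tn, tn + fp))
```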
Volumetric high-resolution X-ray phase-contrast virtual histology of breast specimens with a compact laboratory system
The assessment of margin involvement is a fundamental task in breast conserving surgery to prevent recurrences and reoperations. It is usually performed through histology, which makes the process time consuming and can prevent the complete volumetric analysis of large specimens. X-ray phase contrast tomography combines high resolution, sufficient penetration depth and high soft tissue contrast, and can therefore provide a potential solution to this problem. In this work, we used a high-resolution implementation of edge illumination (EI) X-ray phase contrast tomography, based on "pixel-skipping" X-ray masks and sample dithering, to provide high-definition virtual slices of breast specimens. The scanner was originally designed for intra-operative applications in which short scanning times were prioritised over spatial resolution; however, thanks to the versatility of edge illumination, high-resolution capabilities can be obtained with the same system simply by swapping X-ray masks, without reducing the available field of view. This improves the visibility of fine tissue strands, enabling a direct comparison of selected CT slices with histology and providing a tool to identify suspect features in large specimens before slicing. Combined with our previous results on fast specimen scanning, this work paves the way for the design of a multi-resolution EI scanner providing intra-operative capabilities as well as serving as a digital pathology system
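The reconstruction details are not given in this abstract; as a rough, hypothetical sketch of the sample-dithering idea it mentions (sub-pixel-shifted acquisitions interleaved onto a finer grid; the step count, dithering axis and array sizes are assumptions, not the system's actual processing code):

```python
# Minimal sketch of sample dithering: several acquisitions taken at sub-pixel
# offsets are interleaved into one image with finer sampling along the dithered
# axis. Step count, shapes and the random test frames are hypothetical.
import numpy as np

def interleave_dithered(frames):
    """frames[k]: acquisition at lateral offset k * (pixel pitch / len(frames))."""
    n_steps = len(frames)
    ny, nx = frames[0].shape
    fine = np.empty((ny, nx * n_steps), dtype=frames[0].dtype)
    for k, frame in enumerate(frames):
        fine[:, k::n_steps] = frame   # frame k fills every n_steps-th fine column
    return fine

frames = [np.random.default_rng(k).random((64, 64)) for k in range(4)]
print(interleave_dithered(frames).shape)   # (64, 256): 4x finer sampling in x
```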
Reinventing grounded theory: some questions about theory, ground and discovery
Grounded theory’s popularity persists after three decades of broad-ranging critique. In this article three problematic notions are discussed—‘theory,’ ‘ground’ and ‘discovery’—which linger in the continuing use and development of grounded theory procedures. It is argued that far from providing the epistemic security promised by grounded theory, these notions—embodied in continuing reinventions of grounded theory—constrain and distort qualitative inquiry, and that what is contrived is not in fact theory in any meaningful sense, that ‘ground’ is a misnomer when talking about interpretation and that what ultimately materializes following grounded theory procedures is less like discovery and more akin to invention. The procedures admittedly provide signposts for qualitative inquirers, but educational researchers should be wary, for the significance of interpretation, narrative and reflection can be undermined in the procedures of grounded theory
The Bolocam Lockman Hole Millimeter-Wave Galaxy Survey: Galaxy Candidates and Number Counts
We present results of a new deep 1.1 mm survey using Bolocam, a millimeter-wavelength bolometer array camera designed for mapping large fields at fast scan rates, without chopping. A map, galaxy candidate list, and derived number counts are presented. This survey encompasses 324 arcmin^2 to an rms noise level (filtered for point sources) of 1.4 mJy/beam and includes the entire regions surveyed by the published 8 mJy 850 micron JCMT SCUBA and 1.2 mm IRAM MAMBO surveys. We reduced the data using a custom software pipeline to remove correlated sky and instrument noise via a principal component analysis. Extensive simulations and jackknife tests were performed to confirm the robustness of our source candidates and estimate the effects of false detections, bias, and completeness. In total, 17 source candidates were detected at a significance > 3.0 sigma, with six expected false detections. Nine candidates are new detections, while eight candidates have coincident SCUBA 850 micron and/or MAMBO 1.2 mm detections. From our observed number counts, we estimate the underlying differential number count distribution of submillimeter galaxies and find it to be in general agreement with previous surveys. Modeling the spectral energy distributions of these submillimeter galaxies after observations of dusty nearby galaxies suggests extreme luminosities of L = 1.0-1.6 x 10^13 L_solar and, if powered by star formation, star formation rates of 500-800 M_solar/yr.
Comment: In press (to appear in Astrophysical Journal: 1 May 2005, v624, issue 1); 21 pages, 15 figures, 3 tables
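The custom pipeline is not described further in this abstract; a minimal sketch of the principal-component cleaning step it names (removing noise modes correlated across bolometer timestreams) could look like the following, where the synthetic data, shapes and the number of removed components are assumptions rather than the published method:

```python
# Minimal sketch of PCA-based removal of correlated sky/instrument noise from
# bolometer timestreams. Data, shapes and n_remove are illustrative assumptions.
import numpy as np

def remove_common_modes(timestreams, n_remove=3):
    """timestreams: (n_bolometers, n_samples) array; returns cleaned timestreams.

    Correlated atmosphere/instrument signal appears as the leading principal
    components shared across detectors; subtracting them leaves mostly
    uncorrelated noise plus point sources, which hit few detectors at a time.
    """
    means = timestreams.mean(axis=1, keepdims=True)
    centred = timestreams - means
    # SVD across detectors: the first few components span the strongest common modes
    u, s, vt = np.linalg.svd(centred, full_matrices=False)
    common = u[:, :n_remove] @ np.diag(s[:n_remove]) @ vt[:n_remove, :]
    return centred - common + means

# Synthetic example: 100 detectors, 10000 samples, one shared atmospheric drift
rng = np.random.default_rng(1)
sky = np.sin(np.linspace(0, 50, 10_000))             # common atmospheric mode
data = 5 * sky + rng.normal(size=(100, 10_000))      # broadcast over detectors
cleaned = remove_common_modes(data, n_remove=1)
print(np.corrcoef(cleaned[0], sky)[0, 1])            # ~0 after mode removal
```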
A very conscientious brand: A case study of the BBC's current affairs series Panorama
The reputations of British current affairs and documentary series such as the BBC’s Panorama, Channel 4’s Dispatches or the now defunct Granada series World in Action have rested on an image of conscientious ‘public service’. These popular, long-running series have, at various points in their history, acted as the ‘conscience of the nation’, seeking to expose social injustice, investigate misdemeanours by the powerful and take on venal or corrupt vested interests. The BBC’s flagship current affairs series Panorama is Britain’s longest-running television programme and, according to the Panorama website, ‘the world’s longest running investigative TV show’. It has provided a template for other current affairs series in Britain, across Europe and around the world, while undergoing several transformations in form and style since its launch in 1953, the latest and arguably most dramatic being in 2007. This article will chart the development of Panorama as a distinctive, ‘flagship’ current affairs series over six decades. It will attempt to answer why the Panorama brand has survived so long, while so many other notable current affairs series have not. Using research and material from Bournemouth University’s Panorama Archive, the Video Active website, the BFI and other European archives, this article explores the development of an iconic current affairs series that has, at different stages in its history, proved a template for other news and current affairs programmes. Various breaks and continuities are highlighted in Panorama’s history and identity, and an attempt will be made to characterise and specify the Panorama ‘brand’ and pinpoint the series’ successes and failures in reinventing itself in a rapidly changing media context
Microguards and micromessengers of the genome
The regulation of gene expression is of fundamental importance for maintaining organismal function and integrity, and requires a multifaceted and highly ordered sequence of events. The cyclic nature of gene expression is known as ‘transcription dynamics’. Disruption or perturbation of these dynamics can result in significant fitness costs arising from genome instability, accelerated ageing and disease. We review recent research that supports the idea that an important new role for small RNAs, particularly microRNAs (miRNAs), is in protecting the genome against short-term transcriptional fluctuations, in a process we term ‘microguarding’. An additional emerging role for miRNAs is as ‘micromessengers’, altering gene expression in the target cells to which they are trafficked within microvesicles. We describe the scant but emerging evidence that miRNAs can be moved between different cells, individuals and even species, eliciting biologically significant responses. With these two new roles, miRNAs have the potential to protect against deleterious gene expression variation arising from perturbation and to themselves perturb the expression of genes in target cells. These interactions between cells will frequently be subject to conflicts of interest when they occur between unrelated cells that lack a coincidence of fitness interests. Hence, miRNAs have the potential to represent both a means of resolving conflicts of interest and a means of instigating them. We conclude by exploring this conflict hypothesis, describing some of the initial evidence consistent with it and proposing new ideas for future research into this exciting topic