    The rotating molecular core and precessing outflow of the young stellar object Barnard 1c

    We investigate the structure of the core surrounding the recently identified, deeply embedded young stellar object Barnard 1c, which has an unusual polarization pattern as traced in submillimeter dust emission. Barnard 1c lies within the Perseus molecular cloud at a distance of 250 pc. It is a deeply embedded core of 2.4 solar masses (Kirk et al.) with a luminosity of 4 +/- 2 solar luminosities. Observations of CO, 13CO, C18O, HCO+ and N2H+ were obtained with the BIMA array, together with the continuum at 3.3 mm and 2.7 mm. Single-dish measurements of N2H+ and HCO+ with FCRAO reveal the larger-scale emission in these lines. The CO and HCO+ emission traces the outflow, which coincides in detail with the S-shaped jet recently found in Spitzer IRAC imaging. The N2H+ emission, which anticorrelates spatially with the C18O emission, originates from a rotating envelope with effective radius ~ 2400 AU and mass 2.1 - 2.9 solar masses. N2H+ emission is absent from a 600 AU diameter region around the young star. The remaining N2H+ emission may lie in a coherent torus of dense material. With its outflow and rotating envelope, B1c closely resembles the previously studied object L483-mm, and we conclude that it is a protostar in an early stage of evolution. We hypothesize that heating by the outflow and star has desorbed CO from grains, destroying N2H+ in the inner region, and surmise that the presence of grains without ice mantles in this warm inner region can explain the unusual polarization signature of B1c.

    Comment: 17 pages, 17 figures (9 colour). Accepted to The Astrophysical Journal. For higher resolution images, see http://astrowww.phys.uvic.ca/~brenda/preprints.htm
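
    The physical scales quoted above tie directly to the 250 pc distance through the small-angle relation: a source of angular size theta arcseconds at d parsecs subtends theta x d AU. A minimal sketch of that arithmetic (the arcsecond values are our implied conversions, not numbers from the paper):

        # Small-angle relation: r[AU] = theta[arcsec] * d[pc], so
        # theta[arcsec] = r[AU] / d[pc]. Distance and physical scales
        # are from the abstract; the angular sizes are implied.
        d_pc = 250.0  # distance to Barnard 1c

        def au_to_arcsec(r_au, d_pc):
            """Physical scale in AU -> angular scale in arcsec at d_pc."""
            return r_au / d_pc

        print(au_to_arcsec(2400.0, d_pc))  # envelope effective radius ~ 9.6"
        print(au_to_arcsec(600.0, d_pc))   # diameter of N2H+ hole    ~ 2.4"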

    Spatially Explicit Data: Stewardship and Ethical Challenges in Science

    Scholarly communication is at an unprecedented turning point created in part by the increasing saliency of data stewardship and data sharing. Formal data management plans represent a new emphasis in research, enabling access to data at higher volumes and more quickly, and the potential for replication and augmentation of existing research. Data sharing has recently transformed the practice, scope, content, and applicability of research in several disciplines, in particular in relation to spatially specific data. This lends exciting potential, but the most effective ways in which to implement such changes, particularly for disciplines involving human subjects and other sensitive information, demand consideration. Data management plans, stewardship, and sharing impart distinctive technical, sociological, and ethical challenges that remain to be adequately identified and remedied. Here, we consider these and propose potential solutions for their amelioration.

    An Analysis of the Shapes of Interstellar Extinction Curves. V. The IR-Through-UV Curve Morphology

    We study the IR-through-UV interstellar extinction curves towards 328 Galactic B and late-O stars. We use a new technique which employs stellar atmosphere models in lieu of unreddened "standard" stars. This technique is capable of virtually eliminating spectral mismatch errors in the curves. It also allows a quantitative assessment of the errors and enables rigorous testing of the significance of relationships between various curve parameters, regardless of whether their uncertainties are correlated. Analysis of the curves gives the following results: (1) In accord with our previous findings, the central position of the 2175 A extinction bump is mildly variable, its width is highly variable, and the two variations are unrelated. (2) Strong correlations are found among some extinction properties within the UV region, and within the IR region. (3) With the exception of a few curves with extreme (i.e., large) values of R(V), the UV and IR portions of Galactic extinction curves are not correlated with each other. (4) The large sightline-to-sightline variation seen in our sample implies that any average Galactic extinction curve will always reflect the biases of its parent sample. (5) The use of an average curve to deredden a spectral energy distribution (SED) will result in significant errors, and a realistic error budget for the dereddened SED must include the observed variance of Galactic curves. While the observed large sightline-to-sightline variations, and the lack of correlation among the various features of the curves, make it difficult to meaningfully characterize average extinction properties, they demonstrate that extinction curves respond sensitively to local conditions. Thus, each curve contains potentially unique information about the grains along its sightline.

    Comment: To appear in the Astrophysical Journal, Part 1, July 1, 2007. Figures and Tables which will appear only in the electronic version of the Journal can be obtained via anonymous ftp from ftp://ftp.astronomy.villanova.edu . After logging in, change directories to "fitz/FMV_EXTINCTION". A README file describes the various files present in the directory.
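
    For context, the UV portions of such curves are conventionally described by the Fitzpatrick & Massa (FM90) parameterization: a linear background, a Drude profile for the 2175 A bump (whose central position x0 and width gamma are the quantities reported as variable in result 1), and a far-UV curvature term. A minimal sketch; the default x0 and gamma are typical Galactic averages, and the coefficients in the example call are placeholders, not fitted values from the paper:

        import numpy as np

        def fm90_curve(x, c1, c2, c3, c4, x0=4.592, gamma=0.922):
            """FM90 UV extinction curve k(lambda-V) = E(lambda-V)/E(B-V).

            x  : inverse wavelength in 1/micron (roughly 3.3 <= x <= 8.7)
            c1, c2 : intercept and slope of the linear background
            c3 : strength of the 2175 A bump (Drude profile at x0, width gamma)
            c4 : far-UV curvature amplitude (acts only for x > 5.9)
            """
            drude = x**2 / ((x**2 - x0**2)**2 + (x * gamma)**2)
            fuv = np.where(x > 5.9,
                           0.5392 * (x - 5.9)**2 + 0.05644 * (x - 5.9)**3,
                           0.0)
            return c1 + c2 * x + c3 * drude + c4 * fuv

        # Illustrative evaluation with placeholder coefficients:
        x = np.linspace(3.3, 8.7, 200)
        k = fm90_curve(x, c1=-0.1, c2=0.7, c3=3.2, c4=0.4)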

    The Emerging Scholarly Brain

    It is now a commonplace observation that human society is becoming a coherent super-organism, and that the information infrastructure forms its emerging brain. Perhaps, as the underlying technologies are likely to become billions of times more powerful than those we have today, we could say that we are now building the lizard brain for the future organism.

    Comment: to appear in Future Professional Communication in Astronomy-II (FPCA-II), editors A. Heck and A. Accomazzi

    Comparison of the predictive performance of the BIG, TRISS, and PS09 score in an adult trauma population derived from multiple international trauma registries

    The BIG score (Admission base deficit (B), International normalized ratio (I), and Glasgow Coma Scale (G)) has been shown to predict mortality on admission in pediatric trauma patients. The objective of this study was to assess its performance in predicting mortality in an adult trauma population, and to compare it with the existing Trauma and Injury Severity Score (TRISS) and probability of survival (PS09) score. A retrospective analysis using data collected between 2005 and 2010 from seven trauma centers and registries in Europe and the United States of America was performed. We compared the BIG score with the TRISS and PS09 scores in a population of blunt and penetrating trauma patients. We then assessed the discrimination ability of all scores via receiver operating characteristic (ROC) curves and compared the expected mortality rate (precision) of each score with the observed mortality rate. In total, 12,206 datasets were retrieved to validate the BIG score. The mean ISS was 15 ± 11, and the mean 30-day mortality rate was 4.8%. With an AUROC of 0.892 (95% confidence interval (CI): 0.879 to 0.906), the BIG score performed well in an adult population. TRISS had an area under the ROC curve (AUROC) of 0.922 (0.913 to 0.932) and the PS09 score one of 0.825 (0.915 to 0.934). In a penetrating-trauma population, the BIG score had an AUROC of 0.920 (0.898 to 0.942), compared with the PS09 score (AUROC of 0.921; 0.902 to 0.939) and TRISS (0.929; 0.912 to 0.947). The BIG score is a good predictor of mortality in the adult trauma population. It performed well compared with TRISS and the PS09 score, although it has significantly less discriminative ability. In a penetrating-trauma population, the BIG score performed better than in a population with blunt trauma. The BIG score has the advantage of being available shortly after admission and may be used to predict clinical prognosis or as a research tool to risk-stratify trauma patients into clinical trials.
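
    For orientation, the pediatric BIG score being validated here is usually given as base deficit + (2.5 x INR) + (15 - GCS). A minimal sketch of that score and of the AUROC-style discrimination check the study performs; the formula and the sklearn-based evaluation reflect our reading of the published definition, not code or data from the paper:

        import numpy as np
        from sklearn.metrics import roc_auc_score

        def big_score(base_deficit, inr, gcs):
            """BIG score = base deficit + 2.5 * INR + (15 - GCS).

            Assumed form, following the pediatric definition the study tests."""
            return base_deficit + 2.5 * inr + (15 - gcs)

        # Illustrative discrimination check on made-up records (not study data):
        died = np.array([0, 0, 1, 0, 1])                        # 30-day mortality
        scores = big_score(np.array([2.0, 4.0, 12.0, 1.0, 9.0]),  # base deficit
                           np.array([1.0, 1.1, 2.3, 1.0, 1.8]),   # INR
                           np.array([15, 14, 6, 15, 9]))          # GCS
        print(roc_auc_score(died, scores))  # AUROC, as in the ROC analysis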

    A Study of the Luminosity and Mass Functions of the Young IC 348 Cluster using FLAMINGOS Wide-Field Near-Infrared Images

    We present wide-field near-infrared (JHK) images of the young, 2 Myr old IC 348 cluster taken with FLAMINGOS. We use these new data to construct an infrared census of sources, which is sensitive enough to detect a 10 Mjup brown dwarf seen through an extinction of Av = 7 mag. We examine the cluster's structure and relationship to the molecular cloud and construct the cluster's K-band luminosity function. Using our model luminosity function algorithm, we derive the cluster's initial mass function throughout the stellar and substellar regimes and find that the IC 348 IMF is very similar to that found for the Trapezium Cluster, with both cluster IMFs having a mode between 0.2 - 0.08 Msun. In particular we find that, similar to our results for the Trapezium, brown dwarfs constitute only 1 in 4 of the sources in the IC 348 cluster. We show that a modest secondary peak forms in the substellar IC 348 KLF, corresponding to the same mass range responsible for a similar KLF peak found in the Trapezium. We interpret this KLF peak either as evidence for a corresponding secondary IMF peak at the deuterium-burning limit, or as arising from a feature in the substellar mass-luminosity relation that is not predicted by current theoretical models. Lastly, we find that IC 348 displays radial variations of its sub-solar (0.5 - 0.08 Msun) IMF on a parsec scale. Whatever mechanism is breaking the universality of the IMF on small spatial scales in IC 348 does not appear to be acting upon the brown dwarf population, whose relative size does not vary with distance from the cluster center.

    Comment: 53 pages, 20 figures, AASTeX5.0. Color version of Figure 1 made available in jpg format. Figure(s) 2,3,5 are reduced in resolution. Accepted 14 Jan 2003 to the Astronomical Journal.
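
    At its simplest, a K-band luminosity function of the kind modeled here is a binned count of cluster members' K magnitudes, typically corrected for contaminating field stars using an off-cluster control field. A minimal sketch with synthetic magnitudes and illustrative bin choices, not the paper's actual census or algorithm:

        import numpy as np

        # Synthetic K magnitudes standing in for the cluster and control fields.
        k_cluster = np.random.normal(13.0, 1.5, size=300)   # on-cluster sources
        k_control = np.random.normal(15.0, 1.0, size=100)   # field-star sample

        # Bin the magnitudes and subtract the field contribution per bin.
        bins = np.arange(8.0, 18.5, 0.5)                    # 0.5 mag bins
        klf_raw, _ = np.histogram(k_cluster, bins=bins)
        field, _ = np.histogram(k_control, bins=bins)
        klf = klf_raw - field    # field-subtracted KLF (illustrative only)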

    What Difference Does Quantity Make? On the Epistemology of Big Data Biology

    Is Big Data science a whole new way of doing research? And what difference does data quantity make to knowledge production strategies and their outputs? I argue that the novelty of Big Data science does not lie in the sheer quantity of data involved, but rather in (1) the prominence and status acquired by data as commodity and recognised output, both within and outside of the scientific community, and (2) the methods, infrastructures, technologies, skills and knowledge developed to handle data. These developments generate the impression that data-intensive research is a new mode of doing science, with its own epistemology and norms. To assess this claim, one needs to consider the ways in which data are actually disseminated and used to generate knowledge. Accordingly, this article reviews the development of sophisticated ways to disseminate, integrate and re-use data acquired on model organisms over the last three decades of work in experimental biology. I focus on online databases as prominent infrastructures set up to organise and interpret such data and examine the wealth and diversity of expertise, resources and conceptual scaffolding that such databases draw upon. This illuminates some of the conditions under which Big Data needs to be curated to support processes of discovery across biological subfields, which in turn highlights the difficulties caused by the lack of adequate curation for the vast majority of data in the life sciences. In closing, I reflect on the difference that data quantity is making to contemporary biology, the methodological and epistemic challenges of identifying and analysing data given these developments, and the opportunities and worries associated with Big Data discourse and methods.

    Funding: Economic and Social Research Council (ES/F028180/1); Leverhulme Trust (RPG-2013-153); European Union's Seventh Framework Programme (FP7/2007-2013), ERC grant agreement number 335925

    Theoretical studies of the historical development of the accounting discipline: a review and evidence

    Many existing studies of the development of accounting thought have either been atheoretical or have adopted Kuhn's model of scientific growth. The limitations of this 35-year-old model are discussed. Four different general neo-Kuhnian models of scholarly knowledge development are reviewed and compared with reference to an analytical matrix. The models are found to be mutually consistent, with each focusing on a different aspect of development. A composite model is proposed. Based on a hand-crafted database, author co-citation analysis is used to map empirically the entire literature structure of the accounting discipline during two consecutive time periods, 1972–81 and 1982–90. The changing structure of the accounting literature is interpreted using the proposed composite model of scholarly knowledge development
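
    Author co-citation analysis of the kind used here counts, for each pair of authors, how many documents cite both; the resulting symmetric matrix is then mapped, e.g. by clustering or multidimensional scaling, for each time period. A minimal sketch over toy data, purely illustrative rather than the paper's hand-crafted database:

        from collections import Counter
        from itertools import combinations

        # Each citing document is reduced to the set of authors it cites.
        citing_docs = [
            {"AuthorA", "AuthorB", "AuthorC"},
            {"AuthorA", "AuthorB"},
            {"AuthorB", "AuthorC"},
        ]

        # Co-citation strength: documents citing both members of a pair.
        cocitations = Counter()
        for cited in citing_docs:
            for pair in combinations(sorted(cited), 2):
                cocitations[pair] += 1

        # Pair counts form the matrix that gets mapped per time period.
        print(cocitations.most_common())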

    Exploring the (missed) connections between digital scholarship and faculty development: a conceptual analysis

    The aim of this paper is to explore the relationship between two research topics: digital scholarship and faculty development. The former draws attention to academics' new practices in digital, open and networked contexts; the latter focuses on the requirements and strategies to promote academics' professional learning and career advancement. The research question addressed by this study is: are faculty development strategies hindered by the lack of a cohesive view in the research on digital scholarship? The main assumption behind this research question is that clear conceptual frameworks and models of professional practice lead to effective faculty development strategies. Through a wide overview of the evolution of both digital scholarship and faculty development, followed by a conceptual analysis of the intersections between the fields, the paper attempts to show the extent to which the situation in one area (digital scholarship) might entail critical issues for the other (faculty development) in terms of research and practice. Furthermore, three scenarios based on the several perspectives on digital scholarship are built in order to explore the research question in depth. We conclude that, at the current state of the art, the relationship between these two topics is weak. Moreover, dialogue between digital scholarship and faculty development could lay the basis for forging effective professional learning contexts and instruments, with the ultimate goal of supporting academics in becoming digital scholars within a more open and democratic vision of scholarship.