
    The 74 MHz System on the Very Large Array

    The Naval Research Laboratory and the National Radio Astronomy Observatory completed implementation of a low-frequency capability on the VLA at 73.8 MHz in 1998. This frequency band offers unprecedented sensitivity (~25 mJy/beam) and resolution (~25 arcsec) for low-frequency observations. We review the hardware and the calibration and imaging strategies, comparing them to those at higher frequencies, including aspects of interference excision and wide-field imaging. Ionospheric phase fluctuations pose the major difficulty in calibrating the array. Over restricted fields of view or at times of extremely quiescent ionospheric "weather", an angle-invariant calibration strategy can be used. In this approach a single phase correction is devised for each antenna, typically via self-calibration. Over larger fields of view or at times of more normal ionospheric "weather", when the ionospheric isoplanatic patch size is smaller than the field of view, we adopt a field-based strategy in which the phase correction depends upon location within the field of view. This second calibration strategy was implemented by modeling the ionosphere above the array using Zernike polynomials. Images of 3C sources of moderate strength are provided as examples of routine, angle-invariant calibration and imaging. Flux density measurements indicate that the 74 MHz flux scale at the VLA is stable to a few percent, and tied to the Baars et al. value of Cygnus A at the 5 percent level. We also present an example of a wide-field image, devoid of bright objects and containing hundreds of weaker sources, constructed from the field-based calibration. We close with a summary of lessons the 74 MHz system offers as a model for new and developing low-frequency telescopes. (Abridged) Comment: 73 pages, 46 jpeg figures, to appear in ApJ.
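
    The field-based calibration described above fits a smooth phase screen over the field of view. As a rough illustration of the idea (not the paper's actual implementation; the function names, the six-term basis and the simulated calibrator data are all assumptions made here), the following Python sketch fits a low-order Zernike-like expansion to phase offsets measured toward a few in-field calibrator positions and then predicts the phase correction at an arbitrary direction.

```python
# Minimal sketch (not the VLA implementation): a field-dependent ionospheric
# phase correction modelled as a low-order Zernike-like expansion over the
# field of view. Coefficients are fitted from phase offsets toward bright
# in-field calibrators; the six-term basis and all names are illustrative.
import numpy as np

def zernike_basis(l, m):
    """First six Zernike-like terms at direction cosines (l, m), assumed
    normalised so the field of view maps onto the unit disk."""
    rho2 = l**2 + m**2
    return np.stack([
        np.ones_like(l),    # piston
        l,                  # tip  (phase gradient in l)
        m,                  # tilt (phase gradient in m)
        2.0 * rho2 - 1.0,   # defocus-like curvature
        l**2 - m**2,        # astigmatism (0 deg)
        2.0 * l * m,        # astigmatism (45 deg)
    ], axis=-1)

def fit_phase_screen(l_cal, m_cal, phase_cal):
    """Least-squares fit of the expansion coefficients to calibrator phases (radians)."""
    A = zernike_basis(l_cal, m_cal)
    coeffs, *_ = np.linalg.lstsq(A, phase_cal, rcond=None)
    return coeffs

def evaluate_phase_screen(coeffs, l, m):
    """Predicted phase correction at an arbitrary position in the field."""
    return zernike_basis(l, m) @ coeffs

# Usage: simulated phases of 8 calibrators scattered over the field,
# then predict the correction at another direction.
rng = np.random.default_rng(0)
l_cal, m_cal = rng.uniform(-0.8, 0.8, 8), rng.uniform(-0.8, 0.8, 8)
true_coeffs = np.array([0.1, 1.2, -0.7, 0.3, 0.05, -0.2])
phase_cal = evaluate_phase_screen(true_coeffs, l_cal, m_cal)
coeffs = fit_phase_screen(l_cal, m_cal, phase_cal)
print(evaluate_phase_screen(coeffs, 0.2, -0.4))
```

    A real implementation would presumably refit the screen as the ionosphere evolves and apply the corrections during imaging; the sketch only illustrates the geometric idea of a position-dependent phase correction.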

    A Brief History of AGN

    Astronomers knew early in the twentieth century that some galaxies have emission-line nuclei. However, even the systematic study by Seyfert (1943) was not enough to launch active galactic nuclei (AGN) as a major topic of astronomy. The advances in radio astronomy in the 1950s revealed a new universe of energetic phenomena, and inevitably led to the discovery of quasars. These discoveries demanded the attention of observers and theorists, and AGN have been a subject of intense effort ever since. Only a year after the recognition of the redshifts of 3C 273 and 3C 48 in 1963, the idea of energy production by accretion onto a black hole was advanced. However, acceptance of this idea came slowly, encouraged by the discovery of black hole X-ray sources in our Galaxy and, more recently, supermassive black holes in the center of the Milky Way and other galaxies. Many questions remain as to the formation and fueling of the hole, the geometry of the central regions, the detailed emission mechanisms, the production of jets, and other aspects. The study of AGN will remain a vigorous part of astronomy for the foreseeable future. Comment: 37 pages, no figures. Uses aaspp4.sty. To be published in Publications of the Astronomical Society of the Pacific, 1999 Jun.

    Group cognitive analytic music therapy: a quasi-experimental feasibility study conducted in a high secure hospital

    This was a feasibility, patient-preference, quasi-experimental study of group cognitive analytic music therapy (G-CAMT) for mentally disordered offenders. Participants either chose or were randomised to 16 sessions of manualised G-CAMT (N = 10) plus treatment as usual (TAU) or to TAU alone (N = 10). Self-rated and staff-rated outcomes were assessed at baseline, post-intervention and 8 weeks post-intervention; residency was assessed at 2-year follow-up. G-CAMT was easily implemented: 9/10 participants completed the intervention and attendees reported high satisfaction with the approach. Session attendance was high; 4/10 participants attended all sessions. At the 8-week follow-up, 3/9 G-CAMT participants showed reliable reductions (i.e. statistically reliable pre to 8-week follow-up change) in intrusive/possessive behaviours and fear of separation/abandonment. On the staff-rated outcome measure, the G-CAMT group was statistically significantly friendlier than the TAU group at the 8-week follow-up (U = 0.50, p = 0.009, d = 1.92, CI 0.44 to 3.11). There were no differences between the arms in residency outcomes at 2-year follow-up. The findings are discussed in terms of G-CAMT’s theoretical grounding and high acceptability. The study is limited by its small sample size, but it indicates the possibility of progressing to a full trial.
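
    The "reliable reductions" reported above refer to statistically reliable individual change between the pre and 8-week follow-up scores. As a minimal illustration of how such a criterion is commonly operationalised (the Jacobson-Truax reliable change index; the paper does not state which criterion it used, and the scale statistics below are invented for the example), a short Python sketch:

```python
# Sketch of a Jacobson-Truax reliable change index (RCI). The reliability
# value and the scores used here are illustrative assumptions only, not
# figures taken from the study.
import math

def reliable_change_index(pre, post, sd_baseline, reliability):
    """RCI = (post - pre) / SE_diff; |RCI| > 1.96 is usually treated as reliable change."""
    se_measurement = sd_baseline * math.sqrt(1.0 - reliability)
    se_diff = math.sqrt(2.0) * se_measurement
    return (post - pre) / se_diff

# Hypothetical example: a drop from 24 to 15 on a scale with baseline SD = 6
# and test-retest reliability 0.85 gives RCI ~ -2.7, i.e. a reliable reduction.
print(reliable_change_index(pre=24, post=15, sd_baseline=6.0, reliability=0.85))
```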

    The open future, bivalence and assertion

    It is highly intuitive that the future is open and the past is closed: whereas it is unsettled whether there will be a fourth world war, it is settled that there was a first. Recently, it has become increasingly popular to claim that the intuitive openness of the future implies that contingent statements about the future, such as ‘there will be a sea battle tomorrow,’ are non-bivalent (neither true nor false). In this paper, we argue that the non-bivalence of future contingents is at odds with our pre-theoretic intuitions about the openness of the future. These intuitions are revealed by our pragmatic judgments concerning the correctness and incorrectness of assertions of future contingents. We argue that the pragmatic data, together with a plausible account of assertion, show that in many cases we take future contingents to be true (or to be false), even though we take the future to be open in the relevant respects. It follows that appeals to intuition in support of the non-bivalence of future contingents are untenable. Intuition favours bivalence.

    Elgin on understanding: How does it involve know-how, endorsement and factivity?

    In Chapter 3 of True Enough, Elgin (2017) outlines her view of objectual understanding, focusing largely on its non-factive nature and the extent to which a certain kind of know-how is required for the “grasping” component of understanding. I will explore four central issues that feature in this chapter, concentrating on (1) the role of know-how, (2) the concept of endorsement, (3) Elgin’s critique of the factivity constraint on understanding, and (4) how we might use aspects of Elgin’s framework to inform related debates on the norm of assertion.

    Judging the impact of leadership-development activities on school practice

    The nature and effectiveness of professional-development activities should be judged in a way that takes account of both the achievement of intended outcomes and the unintended consequences that may result. Our research project set out to create a robust approach that school staff members could use to assess the impact of professional-development programs on leadership and management practice without being constrained in this judgment by the stated aims of the program. In the process, we identified a number of factors and requirements relevant to a wider audience than that concerned with the development of leadership and management in England. Such an assessment has to rest upon a clear understanding of educational leadership, a clearly articulated model of practice, and a clear model of potential forms of impact. Such foundations, suitably adapted to the subject being addressed, are appropriate for assessing all teacher professional development.

    Models and metaphors: complexity theory and through-life management in the built environment

    Complexity thinking may have both modelling and metaphorical applications in the through-life management of the built environment. These two distinct approaches are examined and compared. First, some of the sources of complexity in the design, construction and maintenance of the built environment are identified. The metaphorical use of complexity in management thinking and its application in the built environment are briefly examined. This is followed by an exploration of modelling techniques relevant to built-environment concerns: non-linear and complex mathematical techniques such as fuzzy logic, cellular automata and attractors may be applicable to their analysis. Existing software tools are identified and examples of successful built-environment applications of complexity modelling are given. Issues that arise include the definition of phenomena in a mathematically usable way, the functionality of available software and the possibility of going beyond representational modelling. Further questions arising from the application of complexity thinking are discussed, including the possibility of confusion arising from the use of metaphor. The metaphor of a 'commentary machine' is proposed as a possible way forward, and it is suggested that an appropriate linguistic analysis can, in certain situations, reduce perceived complexity.
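
    As a toy illustration of one of the modelling techniques named above, the Python sketch below implements a one-dimensional cellular automaton in which each cell is a building component that is either sound or deteriorated, and deterioration spreads to adjacent components at each time step. The rule and the maintenance scenario are assumptions made purely for illustration, not a model taken from the paper.

```python
# Toy one-dimensional cellular automaton: cells are building components,
# 0 = sound, 1 = deteriorated; deterioration spreads to a cell if it or
# either neighbour is already deteriorated (fixed boundaries).
import numpy as np

def step(state):
    """Advance the automaton one time step under the spread rule above."""
    left = np.roll(state, 1);  left[0] = 0
    right = np.roll(state, -1); right[-1] = 0
    return np.maximum(state, np.maximum(left, right))

state = np.zeros(20, dtype=int)
state[7] = 1  # a single deteriorated component
for t in range(5):
    print("".join(".#"[c] for c in state))
    state = step(state)
```

    Even this crude rule shows the kind of emergent, neighbourhood-driven behaviour that makes cellular automata a candidate for modelling deterioration or intervention sequencing, while also illustrating how much depends on how the phenomenon is defined in a mathematically usable way.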

    Reinventing grounded theory: some questions about theory, ground and discovery

    Grounded theory’s popularity persists after three decades of broad-ranging critique. This article discusses three problematic notions that linger in the continuing use and development of grounded theory procedures: ‘theory,’ ‘ground’ and ‘discovery.’ It is argued that, far from providing the epistemic security promised by grounded theory, these notions, embodied in its continuing reinventions, constrain and distort qualitative inquiry. What is contrived is not in fact theory in any meaningful sense; ‘ground’ is a misnomer when talking about interpretation; and what ultimately materializes from grounded theory procedures is less like discovery and more akin to invention. The procedures admittedly provide signposts for qualitative inquirers, but educational researchers should be wary, for the significance of interpretation, narrative and reflection can be undermined by the procedures of grounded theory.