    Clinical and service implications of a cognitive analytic therapy model of psychosis

    Cognitive analytic therapy (CAT) is an integrative, interpersonal model of therapy predicated on a radically social concept of self, developed over recent years in the UK by Anthony Ryle. A CAT-based model of psychotic disorder has emerged much more recently, building on encouraging early experience in this area. The model describes and accounts for many psychotic experiences and symptoms in terms of distorted, amplified or muddled enactments of normal or ‘neurotic’ reciprocal role procedures (RRPs) and of damage at a meta-procedural level to the structures of the self. Reciprocal role procedures are understood in CAT to represent the outcome of the internalization of early, sign-mediated, interpersonal experience and to constitute the basis for all mental activity, normal or otherwise. Enactments of maladaptive RRPs generated by early interpersonal stress are seen in this model as a form of ‘internal expressed emotion’. Joint description of these RRPs and their enactments (both internal and external), and their subsequent revision, is central to the practice of CAT, during which they are mapped out through written and diagrammatic reformulations. This model may usefully complement and extend existing approaches, notably recent CBT-based interventions, particularly with ‘difficult’ patients, and generate meaningful and helpful understandings of these disorders for both patients and their treating teams. We suggest that use of a coherent and robust model such as CAT could have important clinical and service implications for developing and researching models of these disorders, as well as for training multidisciplinary teams in their effective treatment.

    The 74MHz System on the Very Large Array

    The Naval Research Laboratory and the National Radio Astronomy Observatory completed implementation of a low-frequency capability on the VLA at 73.8 MHz in 1998. This frequency band offers unprecedented sensitivity (~25 mJy/beam) and resolution (~25 arcsec) for low-frequency observations. We review the hardware and the calibration and imaging strategies, comparing them to those used at higher frequencies, including aspects of interference excision and wide-field imaging. Ionospheric phase fluctuations pose the major difficulty in calibrating the array. Over restricted fields of view, or at times of extremely quiescent ionospheric "weather", an angle-invariant calibration strategy can be used. In this approach a single phase correction is devised for each antenna, typically via self-calibration. Over larger fields of view, or at times of more normal ionospheric "weather" when the ionospheric isoplanatic patch size is smaller than the field of view, we adopt a field-based strategy in which the phase correction depends upon location within the field of view. This second calibration strategy was implemented by modeling the ionosphere above the array using Zernike polynomials. Images of 3C sources of moderate strength are provided as examples of routine, angle-invariant calibration and imaging. Flux density measurements indicate that the 74 MHz flux scale at the VLA is stable to a few percent, and tied to the Baars et al. value of Cygnus A at the 5 percent level. We also present an example of a wide-field image, devoid of bright objects and containing hundreds of weaker sources, constructed from the field-based calibration. We close with a summary of lessons the 74 MHz system offers as a model for new and developing low-frequency telescopes. (Abridged)
    Comment: 73 pages, 46 JPEG figures; to appear in ApJ
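    The field-based strategy described above amounts to fitting a smooth phase screen across the field of view. As a rough, hypothetical sketch of that idea (not the authors' actual pipeline), the following Python fragment fits low-order Zernike terms to phase measurements toward a set of calibrator directions by least squares, then evaluates the position-dependent correction anywhere in the field; all names and numbers are invented for illustration.

```python
import numpy as np

def zernike_basis(x, y):
    """Low-order Zernike terms (Cartesian form) on the unit disk:
    piston, the two tilts, defocus, and the two astigmatisms."""
    r2 = x**2 + y**2
    return np.column_stack([
        np.ones_like(x),   # Z0: piston
        x,                 # Z1: tilt in x
        y,                 # Z2: tilt in y
        2.0 * r2 - 1.0,    # Z3: defocus
        x**2 - y**2,       # Z4: vertical astigmatism
        2.0 * x * y,       # Z5: oblique astigmatism
    ])

# Calibrator directions, scaled to the unit disk of the field of view,
# and the ionospheric phase measured toward each one (synthetic values).
rng = np.random.default_rng(0)
xc = rng.uniform(-1, 1, 20)
yc = rng.uniform(-1, 1, 20)
true_coeffs = np.array([0.3, 1.2, -0.8, 0.5, 0.2, -0.4])  # radians
phases = zernike_basis(xc, yc) @ true_coeffs + rng.normal(0, 0.05, 20)

# Least-squares fit of the smooth phase screen across the field.
coeffs, *_ = np.linalg.lstsq(zernike_basis(xc, yc), phases, rcond=None)

# The fitted screen supplies a position-dependent phase correction
# for any direction in the field, not just the calibrator positions.
x0, y0 = np.array([0.25]), np.array([-0.5])
correction = zernike_basis(x0, y0) @ coeffs
print(f"phase correction at (0.25, -0.5): {correction[0]:.3f} rad")
```

    This static, single-screen toy only illustrates why a low-order polynomial basis suffices when the phase varies smoothly across the field; the real system must track screens that also vary with time and antenna.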

    Multispectral lensless digital holographic microscope: imaging MCF-7 and MDA-MB-231 cancer cell cultures

    Digital holography is the process whereby an object’s phase and amplitude information is retrieved from intensity images recorded with a digital camera (e.g. a CCD or CMOS sensor). In-line digital holographic techniques make full use of the recording device’s sampling bandwidth because, unlike in off-axis holography, the object information is not modulated onto carrier fringes. In-line reconstructions, however, are obscured by the linear superposition of the unwanted, out-of-focus twin image. In addition, speckle noise, an inherent artifact of the coherent laser sources used in digital holographic systems, degrades the overall quality of the reconstructed images. Minimizing speckle noise, removing the twin image, and using the full sampling bandwidth of the capture device together improve reconstructed image quality. Such improvements benefit applications such as holographic microscopy, where the reconstructed images are obscured by twin-image information. Overcoming these problems allows greater flexibility in current image-processing techniques, which can be applied to segmenting biological cells (e.g. MCF-7 and MDA-MB-231) to determine their overall cell density and viability. This could potentially be used to distinguish between apoptotic and necrotic cells in large-scale mammalian cell processes, currently the system of choice within the biopharmaceutical industry.
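    The twin-image problem mentioned above arises because an in-line hologram records only intensity: numerically refocusing the recorded field brings the object into focus but leaves its defocused conjugate superposed. A minimal Python sketch of such a reconstruction with the standard angular spectrum propagator follows; the geometry and parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex field a distance z (metres) with the
    angular spectrum method; dx is the pixel pitch in metres."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    # Free-space transfer function (evanescent components suppressed).
    arg = 1.0 - (wavelength * FX)**2 - (wavelength * FY)**2
    H = np.exp(1j * 2 * np.pi * z / wavelength * np.sqrt(np.maximum(arg, 0.0)))
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Illustrative parameters for a lensless in-line geometry.
wavelength = 633e-9   # HeNe laser, metres
dx = 2.2e-6           # sensor pixel pitch, metres
z = 1.5e-3            # object-to-sensor distance, metres

# The recorded hologram is an intensity image; treating its square
# root as the field and back-propagating by -z refocuses the object,
# while the unwanted twin image remains superposed, defocused by 2z.
hologram = np.ones((512, 512))  # stand-in for a real camera frame
field = np.sqrt(hologram.astype(complex))
reconstruction = angular_spectrum_propagate(field, wavelength, dx, -z)
amplitude, phase = np.abs(reconstruction), np.angle(reconstruction)
```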

    Radial Velocities of Six OB Stars

    We present new results from a radial velocity study of six bright OB stars with little or no prior measurements. One of these, HD 45314, may be a long-period binary, but the velocity variations of this Be star may be related to changes in its circumstellar disk. Significant velocity variations were also found for HD 60848 (possibly related to nonradial pulsations) and HD 61827 (related to wind variations). The other three targets, HD 46150, HD 54879, and HD 206183, are constant-velocity objects, but we note that HD 54879 has Hα emission that may originate from a binary companion. We illustrate the average red spectrum of each target.
    Comment: Accepted for publication in PASP, July 2007 issue
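    The abstract does not spell out the measurement procedure, but optical radial velocities are conventionally obtained from the Doppler shifts of photospheric lines. The following Python fragment is a deliberately simple, hypothetical illustration of that conversion using one synthetic absorption line; the line choice, numbers, and centroid method are invented for the example and are not the paper's technique.

```python
import numpy as np

C_KMS = 299_792.458  # speed of light, km/s

def radial_velocity(wavelengths, flux, lab_wavelength):
    """Estimate a radial velocity from a single spectral line: find the
    flux-weighted centroid of the absorption profile and convert the
    Doppler shift via v = c * (lambda_obs - lambda_0) / lambda_0."""
    # Approximate the continuum with the median flux level.
    depth = 1.0 - flux / np.median(flux)
    centroid = np.sum(wavelengths * depth) / np.sum(depth)
    return C_KMS * (centroid - lab_wavelength) / lab_wavelength

# Synthetic He I 5876 line red-shifted by ~30 km/s, for illustration.
lam0 = 5875.62  # rest wavelength, Angstroms
lam = np.linspace(lam0 - 5, lam0 + 5, 400)
shift = lam0 * 30.0 / C_KMS
flux = 1.0 - 0.4 * np.exp(-0.5 * ((lam - lam0 - shift) / 0.8)**2)
print(f"v_r = {radial_velocity(lam, flux, lam0):.1f} km/s")  # ~30 km/s
```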

    Personal and sub-personal: a defence of Dennett's early distinction

    Since 1969, when Dennett introduced a distinction between personal and sub-personal levels of explanation, many philosophers have used ‘sub-personal’ very loosely, and Dennett himself has abandoned a view of the personal level as genuinely autonomous. I recommend a position in which Dennett's original distinction is crucial, by arguing that the phenomenon called mental causation is on view only at the properly personal level. If one retains the commitments incurred by Dennett's early distinction, then one has a satisfactory anti-physicalistic, anti-dualist philosophy of mind. It neither interferes with the projects of sub-personal psychology, nor encourages instrumentalism at the personal level. People lose sight of Dennett’s personal/sub-personal distinction because they free it from its philosophical moorings. A distinction that serves a philosophical purpose is typically rooted in doctrine; it cannot be lifted out of context and continue to do its work. So I shall start from Dennett’s distinction as I read it in its original context. And when I speak of ‘the distinction’, I mean to point not only towards the terms that Dennett first used to define it but also towards the philosophical setting within which its work was cut out.

    A Brief History of AGN

    Astronomers knew early in the twentieth century that some galaxies have emission-line nuclei. However, even the systematic study by Seyfert (1943) was not enough to launch active galactic nuclei (AGN) as a major topic of astronomy. The advances in radio astronomy in the 1950s revealed a new universe of energetic phenomena, and inevitably led to the discovery of quasars. These discoveries demanded the attention of observers and theorists, and AGN have been a subject of intense effort ever since. Only a year after the recognition of the redshifts of 3C 273 and 3C 48 in 1963, the idea of energy production by accretion onto a black hole was advanced. However, acceptance of this idea came slowly, encouraged by the discovery of black hole X-ray sources in our Galaxy and, more recently, supermassive black holes in the center of the Milky Way and other galaxies. Many questions remain as to the formation and fueling of the hole, the geometry of the central regions, the detailed emission mechanisms, the production of jets, and other aspects. The study of AGN will remain a vigorous part of astronomy for the foreseeable future.
    Comment: 37 pages, no figures. Uses aaspp4.sty. To be published in Publications of the Astronomical Society of the Pacific, 1999 June

    Economics, Agency, and Causal Explanation

    The paper considers three questions. First, what is the connection between economics and agency? It is argued that causation and explanation in economics fundamentally depend on agency. So a philosophical understanding of economic explanation must be sensitive to an understanding of agency. Second, what is the connection between agency and causation? A causal view of agency-involving explanation is defended against a number of arguments from the resurgent noncausalist tradition in the literature on agency and action-explanation. If agency is fundamental to economic explanation, it is argued, then so is causation. Third, what is the connection between causal explanation and the natural sciences? It is argued that, though the explanations given in economics and other social sciences are causal explanations, they are different in kind from the causal explanations of the natural sciences. On the one hand, then, the causal explanations of the social sciences are irreducible to those found in the natural sciences. On the other hand, the causal relations described by the social sciences are not completely autonomous; they do not float free of, or operate independently from, the causal relations charted by the natural sciences.

    The open future, bivalence and assertion

    It is highly intuitive that the future is open and the past is closed—whereas it is unsettled whether there will be a fourth world war, it is settled that there was a first. Recently, it has become increasingly popular to claim that the intuitive openness of the future implies that contingent statements about the future, such as ‘there will be a sea battle tomorrow,’ are non-bivalent (neither true nor false). In this paper, we argue that the non-bivalence of future contingents is at odds with our pre-theoretic intuitions about the openness of the future. These are revealed by our pragmatic judgments concerning the correctness and incorrectness of assertions of future contingents. We argue that the pragmatic data, together with a plausible account of assertion, show that in many cases we take future contingents to be true (or to be false), though we take the future to be open in relevant respects. It follows that appeals to intuition in support of the non-bivalence of future contingents are untenable. Intuition favours bivalence.

    Judging the impact of leadership-development activities on school practice

    The nature and effectiveness of professional-development activities should be judged in a way that takes account of both the achievement of intended outcomes and the unintended consequences that may result. Our research project set out to create a robust approach that school staff members could use to assess the impact of professional-development programs on leadership and management practice without being constrained in this judgment by the stated aims of the program. In the process, we identified a number of factors and requirements relevant to a wider audience than that concerned with the development of leadership and management in England. Such an assessment has to rest upon a clear understanding of educational leadership, a clearly articulated model of practice, and a clear model of potential forms of impact. Such foundations, suitably adapted to the subject being addressed, are appropriate for assessing all teacher professional development.