
    The software-cycle model for re-engineering and reuse

    This paper reports on the progress of a study that will contribute to our ability to perform high-level, component-based programming by describing means to obtain useful components, methods for configuring and integrating those components, and an underlying economic model of the costs and benefits associated with this approach to reuse. One goal of the study is to develop and demonstrate methods for recovering reusable components from domain-specific software through a combination of tools, which perform the identification, extraction, and re-engineering of components, and domain experts, who direct the application of those tools. A second goal is to enable the reuse of those components by identifying techniques for configuring and recombining the re-engineered software. This component-recovery, or software-cycle, model addresses not only the selection and re-engineering of components but also their recombination into new programs. Once a model of reuse activities has been developed, quantifying the costs and benefits of the various reuse options will enable the development of an adaptable economic model of reuse, which is the principal goal of the overall study. This paper reports on the conception of the software-cycle model and on several supporting techniques of software recovery, measurement, and reuse that will lead to the development of the desired economic model.

    The Oral History of John R. Bailey


    Behavioural simulation of biological neuron systems using VHDL and VHDL-AMS

    The investigation of neuron structures is an incredibly difficult and complex task that yields relatively low rewards in terms of information from biological forms (either animals or tissue). The structures and connectivity of even the simplest invertebrates are almost impossible to establish with standard laboratory techniques, and even when this is possible it is generally time consuming, complex and expensive. Recent work has shown how a simplified behavioural approach to modelling neurons can allow "virtual" experiments to be carried out that map the behaviour of a simulated structure onto a hypothetical biological one, with correlation of behaviour rather than underlying connectivity. The problems with such approaches are numerous: the first is the difficulty of simulating realistic aggregates efficiently; the second is making sense of the results; and the third is the need for an implementation that can be synthesised to hardware for acceleration. In this paper we present a VHDL implementation of neuron models that allows large aggregates to be simulated. The models are demonstrated using a system-level VHDL and VHDL-AMS model of the <i>C. elegans</i> locomotory system.
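    The paper's models are written in VHDL/VHDL-AMS; as a rough illustration of the behavioural (rather than biophysically detailed) modelling style it describes, the following Python sketch implements a leaky integrate-and-fire neuron. All function names and parameter values here are assumptions for illustration, not taken from the paper.

```python
import numpy as np

# Minimal behavioural neuron sketch (leaky integrate-and-fire): a Python
# analogue of the behavioural modelling style described above. All parameter
# values are illustrative defaults, not drawn from the paper.
def simulate_lif(input_current, dt=1e-4, tau=0.02, v_rest=-65e-3,
                 v_thresh=-50e-3, v_reset=-70e-3, r_m=1e7):
    """Return the membrane-voltage trace and spike times for a current input (A)."""
    v = v_rest
    trace, spikes = [], []
    for step, i_in in enumerate(input_current):
        # Leaky integration toward rest, driven by the input current.
        v += (-(v - v_rest) + r_m * i_in) / tau * dt
        if v >= v_thresh:               # threshold crossing -> emit a spike
            spikes.append(step * dt)
            v = v_reset                 # reset the membrane after the spike
        trace.append(v)
    return np.array(trace), spikes

# A constant suprathreshold drive produces regular spiking.
trace, spikes = simulate_lif(np.full(5000, 2e-9))
```

Aggregates could then be built by chaining such units, with each neuron's spike train feeding the input currents of its neighbours, mirroring the structural composition the paper achieves with VHDL entities.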

    A Quaker Experiment in Town Planning: George Cadbury and the Construction of Bournville Model Village

    In 1893, George Cadbury initiated the construction of Bournville Model Village, Birmingham (UK). This was the first model settlement to provide low-density housing not restricted to factory employees. This paper examines the relationship between Cadbury's Quaker faith, the growth of his business and the development of a model community. The focus is on exploring the ways in which Cadbury departed from traditional Quaker practices, with respect to visual artistic display and religious intervention in social relations. The article, first, reviews the contribution of Quakerism to the building of George Cadbury's business empire. Second, it examines the relationship between Cadbury's religiously informed brand of benign capitalism and the choice of a particular architectural aesthetic for Bournville. Third, the article shows how evangelical Quaker faith and practice were important in shaping the social development of the Bournville community.

    THE LACK OF A PROFIT MOTIVE FOR RANCHING: IMPLICATIONS FOR POLICY ANALYSIS

    The economic impact of changing land-use policies has traditionally been estimated using the standard economic model of profit maximization. Ranchers are assumed to maximize profit and to adjust production strategies so as to continue maximizing profit under altered policies. Yet nearly 30 years of research and observation have shown that family, tradition, and a desirable way of life are the most important factors in the ranch purchase decision, not profit. Ranch buyers want an investment they can touch, feel, and enjoy, and they historically have been willing to accept relatively low returns from livestock production. Profit maximization appears to be an inadequate model for explaining rancher behavior, describing grazing land use, and estimating the impacts of altered public land policies. In this paper, we investigate the relative importance of livestock production income and desirable lifestyle attributes in determining the market value of western ranches, and we explore what this means for economic models and policy analysis.

    Calf health and performance during receiving is not changed by fence-line preconditioning on flint hills range vs. drylot preconditioning

    Ranch-of-origin preconditioning can improve the welfare and performance of beef calves by decreasing the stress associated with weaning, transport, diet change, and commingling with other calves. Preconditioning methods that involve pasture weaning coupled with maternal contact (i.e., fence-line weaning) have been promoted as possible best management practices for minimizing stress. Prior studies focused on performance and behavior during preconditioning on the ranch of origin; little information has been published on carryover effects of fence-line preconditioning, compared with conventional drylot preconditioning, on performance and behavior during feedlot receiving. Our objectives were to measure growth and health during a 28-day ranch-of-origin preconditioning phase and during a 60-day feedlot receiving phase among beef calves subjected to 1 of 3 ranch-of-origin preconditioning programs: (1) drylot preconditioning + dam separation, (2) pasture preconditioning + fence-line contact with dams, and (3) pasture preconditioning + fence-line contact with dams + supplemental feed delivered in a bunk. In addition, we recorded incidences of behavioral distress among these treatments during the first 7 days of feedlot receiving.

    On the spectroastrometric separation of binary point-source fluxes

    Spectroastrometry is a technique with the potential to resolve flux distributions on scales of milliarcseconds. In this study, we examine the application of spectroastrometry to binary point sources which are spatially unresolved due to convolution with the observational point spread function. The technique uses measurements, with sub-pixel accuracy, of the position centroid of high signal-to-noise long-slit spectrum observations. With the objects in the binary contributing fractionally more or less at different wavelengths (particularly across spectral lines), the variation of the position centroid with wavelength provides some information on the spatial distribution of the flux. We examine the width of the flux distribution in the spatial direction, and present its relation to the ratio of the fluxes of the two components of the binary. Measurement of three observables (total flux, position centroid and flux distribution width) at each wavelength allows a unique separation of the total flux into its component parts even though the angular separation of the binary is smaller than the observations' point spread function. This is because we have three relevant observables for three unknowns (the two fluxes, and the angular separation of the binary), which therefore generates a closed problem. This is a wholly different technique from conventional deconvolution methods, which produce information on angular sizes at the sampling scale. Spectroastrometry can produce information on smaller scales than conventional deconvolution, and is successful in separating fluxes in a binary object with a separation of less than one pixel. We present an analysis of the errors involved in making binary object spectroastrometric measurements and the separation method, and highlight the necessary observing methodology.
    Comment: 11 pages, 8 figures, accepted for publication in Astronomy and Astrophysics
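    The three-observables argument (total flux, centroid, and width for two unknown fluxes plus one unknown separation) can be sketched numerically. The following Python example is an illustrative, noiseless toy, assuming a Gaussian PSF and two point sources; all symbols and values are assumptions, not the paper's data or code.

```python
import numpy as np

# Toy setup: two point sources separated by d pixels (d < 1 px, unresolved),
# blurred by a Gaussian PSF of width sigma. Source 1 sits at x = 0 and carries
# an emission line; source 2 sits at x = d with a flat continuum.
n_wave, sigma, d = 200, 3.0, 0.4                            # bins, PSF sigma (px), separation (px)
wave = np.arange(n_wave)
f1 = 1.0 + 0.3 * np.exp(-0.5 * ((wave - 80) / 5.0) ** 2)    # source 1: continuum + line
f2 = np.full(n_wave, 0.6)                                   # source 2: flat continuum

# The three observables at each wavelength, for a Gaussian-blurred two-source profile:
F = f1 + f2                       # total flux
c = f2 * d / F                    # position centroid of the blended profile
w2 = sigma**2 + c * (d - c)       # spatial variance (width squared) of the blend

# Inversion: three observables (F, c, w2) -> three unknowns (f1, f2, d).
# The width relation yields the separation; the centroid then splits the flux.
d_est = np.median((w2 - sigma**2) / c + c)    # separation is constant over wavelength
f2_est = c * F / d_est
f1_est = F - f2_est
```

In this noiseless sketch the inversion is exact; with real data each observable carries measurement error, which is why the paper's error analysis and observing methodology matter.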

    Gait analysis in a <i>Mecp2</i> knockout mouse model of Rett syndrome reveals early-onset and progressive motor deficits

    Rett syndrome (RTT) is a genetic disorder characterized by a range of features including cognitive impairment, gait abnormalities and a reduction in purposeful hand skills. Mice harbouring knockout mutations in the <i>Mecp2</i> gene display many RTT-like characteristics and are central to efforts to find novel therapies for the disorder. As hand stereotypies and gait abnormalities constitute major diagnostic criteria in RTT, it is clear that motor and gait-related phenotypes will be of importance in assessing preclinical therapeutic outcomes. We therefore aimed to assess gait properties over the prodromal phase in a functional knockout mouse model of RTT. In male <i>Mecp2</i> knockout mice, we observed alterations in stride, coordination and balance parameters at 4 weeks of age, before the onset of other overt phenotypic changes as revealed by observational scoring. These data suggest that gait measures may be used as a robust and early marker of <i>Mecp2</i> dysfunction in future preclinical therapeutic studies.