471 research outputs found

    On the design of ALEPH


    Colour evolution of Betelgeuse and Antares over two millennia, derived from historical records, as a new constraint on mass and age

    After core hydrogen burning, massive stars evolve from blue-white dwarfs to red supergiants by expanding, brightening, and cooling within a few millennia. We discuss a previously neglected constraint on the mass, age, and evolutionary state of Betelgeuse and Antares, namely their observed colour evolution over historical times: we place all 236 stars bright enough for their colour to be discerned by the unaided eye (V ≤ 3.3 mag) on the colour-magnitude diagram (CMD) and focus on those in the Hertzsprung gap. We study pre-telescopic records of star colour with historically critical methods to find stars that have evolved noticeably in colour within the last millennia. Our main result is that Betelgeuse was recorded with a colour significantly different (non-red) from today's (red, B − V = 1.78 ± 0.05 mag). Hyginus (Rome) and Sima Qian (China) independently report it two millennia ago as appearing like Saturn (B − V = 1.09 ± 0.16 mag) in colour and ‘yellow’ (quantifiable as B − V = 0.95 ± 0.35 mag), respectively (together, 5.1σ different from today). The colour change of Betelgeuse is a new, tight constraint for single-star theoretical evolutionary models (or merger models). It is most likely located less than one millennium past the bottom of the red giant branch, before which rapid colour evolution is expected. Evolutionary tracks from MIST consistent with both its colour evolution and its location on the CMD suggest a mass of ∼14 M⊙ at ∼14 Myr. The (roughly) constant colour of Antares over the last three millennia also constrains its mass and age. Wezen was reported white historically, but is now yellow.
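    As a rough consistency check on the numbers quoted in the abstract, the two historical colour estimates can be combined by inverse-variance weighting and compared with the modern value. This simplified combination yields roughly 4.6σ; the published 5.1σ figure rests on the authors' full statistical treatment, so this sketch is illustrative only.

    ```python
    # Back-of-the-envelope check of the colour comparison quoted above.
    # The naive inverse-variance combination below is an illustration, not
    # the paper's full method (which yields the quoted 5.1 sigma).
    import math

    def combine(values_errors):
        """Inverse-variance weighted mean and its 1-sigma uncertainty."""
        w = [1.0 / e**2 for _, e in values_errors]
        mean = sum(v * wi for (v, _), wi in zip(values_errors, w)) / sum(w)
        return mean, math.sqrt(1.0 / sum(w))

    historical = [(1.09, 0.16),   # Hyginus: colour like Saturn
                  (0.95, 0.35)]   # Sima Qian: 'yellow'
    mean, err = combine(historical)
    today, today_err = 1.78, 0.05
    sigma = (today - mean) / math.hypot(err, today_err)
    print(f"historical B-V = {mean:.2f} +/- {err:.2f}, difference = {sigma:.1f} sigma")
    ```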

    Image 100 procedures manual development: Applications system library definition and Image 100 software definition

    An outline for an Image 100 procedures manual for Earth Resources Program image analysis was developed, setting forth guidelines that provide a basis for the preparation and updating of an Image 100 Procedures Manual. The scope of the outline was limited to the definition of the general features of a procedures manual together with the special features of an interactive system. Computer programs were identified which should be implemented as part of an applications-oriented library for the system.

    Human and Machine Representations of Knowledge

    Four existing knowledge representations for the computation of similar functions in a chess endgame were implemented on the same computer in the same language. They are compared with respect to efficiency regarding time-space requirements. Three of these programs were then paraphrased into English, and all four were studied for their feasibility as 'open book' advice texts for the human beginner in chess. A formally verified set of rules was also tested for its suitability as an advice text. The possible effectiveness of these advice texts in 'closed book' form is considered. The above experiments comprise a case study of a phenomenon known as the "human window". This phenomenon motivated an analysis of four documented instances of mismatch between human and machine representations: (I) Three Mile Island, (II) Air Traffic Control, (III) the NORAD Military Computer System, and (IV) the Hoogoven Royal Dutch Steel automation failure.

    Data Mining by Grid Computing in the Search for Extrasolar Planets

    A system is presented here to provide improved precision in ensemble differential photometry. This is achieved by using the power of grid computing to analyse astronomical catalogues, producing new catalogues of optimised pointings for each star which maximise the number and quality of reference stars available. Astronomical phenomena such as exoplanet transits and small-scale structure within quasars may be observed by means of millimagnitude photometric variability on the timescale of minutes to hours. Because of atmospheric distortion, ground-based observations of these phenomena require the use of differential photometry, whereby the target is compared with one or more reference stars. CCD cameras enable the use of many reference stars in an ensemble. The more closely the reference stars in this ensemble resemble the target, the greater the precision of the photometry that can be achieved. The Locus Algorithm has been developed to identify the optimum pointing for a target and provide that pointing with a score relating to the degree of similarity between the target and the reference stars. It does so by identifying potential points of aim for a particular telescope such that a given target and a varying set of references are included in a field of view centred on those pointings. A score is calculated for each such pointing, and for each target the pointing with the highest score is designated the optimum pointing. The application of this system to the Sloan Digital Sky Survey (SDSS) catalogue demanded the use of a High Performance Computing (HPC) solution through Grid Ireland. Pointings have thus been generated for 61,662,376 stars and 23,697 quasars.
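    The score-and-select loop described in the abstract can be sketched as follows. The field-of-view size, the magnitude/colour similarity metric, and the flat-sky geometry here are illustrative assumptions, not the published Locus Algorithm implementation.

    ```python
    # Sketch of the pointing-scoring idea: for each candidate pointing that
    # keeps the target in view, sum a similarity score over the reference
    # stars in the field, and pick the highest-scoring pointing.
    # FoV size, similarity cuts, and flat-sky geometry are assumptions.
    from dataclasses import dataclass

    @dataclass
    class Star:
        ra: float      # degrees
        dec: float     # degrees
        mag: float     # apparent magnitude
        colour: float  # e.g. g - r

    def similarity(target: Star, ref: Star) -> float:
        """Score one reference star: closer in magnitude and colour -> higher."""
        dmag = abs(target.mag - ref.mag)
        dcol = abs(target.colour - ref.colour)
        if dmag > 2.0 or dcol > 0.5:   # too dissimilar to be a useful reference
            return 0.0
        return (1.0 - dmag / 2.0) * (1.0 - dcol / 0.5)

    def score_pointing(pointing, target, catalogue, fov=0.25):
        """Sum the similarity of every reference star inside the field of view."""
        ra0, dec0 = pointing
        total = 0.0
        for star in catalogue:
            if star is target:
                continue
            if abs(star.ra - ra0) <= fov / 2 and abs(star.dec - dec0) <= fov / 2:
                total += similarity(target, star)
        return total

    def best_pointing(target, catalogue, candidates, fov=0.25):
        """Among candidate pointings that keep the target in view, pick the
        one with the highest ensemble score (the 'optimum pointing')."""
        in_view = [p for p in candidates
                   if abs(target.ra - p[0]) <= fov / 2
                   and abs(target.dec - p[1]) <= fov / 2]
        return max(in_view, key=lambda p: score_pointing(p, target, catalogue, fov))
    ```

    Shifting the pointing away from the target trades sky area on one side for the other, which is why a non-centred pointing can capture a better reference ensemble.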

    C-MOS array design techniques: SUMC multiprocessor system study

    The current capabilities of LSI techniques for speed and reliability, plus the possibilities of assembling large configurations of LSI logic and storage elements, have demanded the study of multiprocessors and multiprocessing techniques, problems, and potentialities. Evaluated are three previous systems studies for a space ultrareliable modular computer multiprocessing system, and a new multiprocessing system is proposed that is flexibly configured with up to four central processors, four I/O processors, and 16 main memory units, plus auxiliary memory and peripheral devices. This multiprocessor system features a multilevel interrupt, qualified S/360 compatibility for ground-based generation of programs, virtual memory management of a storage hierarchy through I/O processors, and multiport access to multiple and shared memory units.

    Computer simulation of highway traffic

    PhD Thesis. The work is concerned with the simulation of highway traffic using a high-speed electronic digital computer, and is divided into four parts. The first chapter opens by describing simulation techniques in general, and traffic simulation by digital computer in particular. The advantages and disadvantages of the method are discussed. Early work in the U.S.A. and the U.K. is described, and then comparisons are drawn between the major current fields of study. The second chapter describes the formulation of a model to simulate traffic behaviour at a traffic-signal-controlled intersection. The object of the simulation is to determine an accurate description of the decrease in capacity of the intersection with an increase in the volume of right-turning traffic. The results of the simulation are presented and analysed, and recommendations are made for incorporation into current traffic engineering practice. The third chapter describes the formulation of a model to simulate traffic moving along a long two-lane weaving section. Car-following theory is used to describe the motion of the vehicles, and a model of the lane-change decision is proposed and discussed. The effect of changes in the physical characteristics and traffic characteristics on the delay to vehicles using the section is predicted. The final chapter is concerned with the photographic equipment used to collect and analyse the field data necessary for the construction of accurate models.
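    The car-following approach mentioned in the third chapter can be illustrated with a minimal linear (stimulus-response) follow-the-leader model, in which each driver's acceleration is proportional to the relative speed of the vehicle ahead observed one reaction time in the past. The parameter values, braking scenario, and discrete-time scheme below are assumptions for illustration, not the thesis's model.

    ```python
    # Minimal linear car-following sketch. Each follower accelerates in
    # proportion to its leader's delayed relative speed; all parameters
    # (sensitivity, delay, spacing, braking event) are assumed values.

    def simulate_platoon(n=4, steps=250, dt=0.1, lam=0.8, delay=5):
        """Simulate a single-lane platoon. The leader (index 0) brakes from
        20 m/s to 12 m/s between steps 50 and 89; followers respond with a
        reaction delay of `delay` time steps (here 0.5 s)."""
        x = [-30.0 * i for i in range(n)]   # positions, 30 m initial spacing
        v = [20.0] * n                      # speeds (m/s)
        v_hist = [list(v)]                  # speed history, for the delay
        for step in range(steps):
            past = v_hist[max(0, len(v_hist) - 1 - delay)]
            a = [0.0] * n
            a[0] = -2.0 if 50 <= step < 90 else 0.0       # leader's braking
            for i in range(1, n):
                a[i] = lam * (past[i - 1] - past[i])      # delayed response
            for i in range(n):
                v[i] = max(0.0, v[i] + a[i] * dt)
                x[i] += v[i] * dt
            v_hist.append(list(v))
        return x, v
    ```

    With these assumed parameters the sensitivity-delay product λT = 0.4 is below the classical 0.5 platoon-stability threshold, so the braking disturbance decays along the platoon rather than amplifying.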

    Development of a methodology for classifying software errors

    A mathematical formalization of the intuition behind classification of software errors is devised and then extended to a classification discipline: every classification scheme should have an easily discernible mathematical structure, and certain properties of the scheme should be decidable (although whether or not these properties hold is relative to the intended use of the scheme). Classification of errors then becomes an iterative process of generalization from actual errors to terms defining the errors, together with adjustment of definitions according to the classification discipline. Alternatively, whenever possible, small-scale models may be built to give more substance to the definitions. The classification discipline and the difficulties of definition are illustrated by examples of classification schemes from the literature and a new study of observed errors in published papers on programming methodologies.