7,854 research outputs found

    The Kinematic Composition of MgII Absorbers

    The study of galaxy evolution using quasar absorption lines requires an understanding of which components of galaxies and their surroundings contribute to the absorption in various transitions. This paper considers the kinematic composition of the class of 0.4 < z < 1.0 MgII absorbers, particularly addressing the question of what fraction of this absorption is produced in halos and what fraction arises from galaxy disks. We design models with various fractional contributions from radial infall of halo material and from a rotating thick-disk component. We generate synthetic spectra from lines of sight through model galaxies and compare the resulting ensembles of MgII profiles with the 0.4 < z < 1.0 sample observed with HIRES/Keck. We apply a battery of statistical tests and find that pure disk and pure halo models can be ruled out, but that various models combining rotating disk and infall/halo contributions can produce an ensemble that is nearly consistent with the data. A discrepancy common to all models we considered implies the existence of a kinematic component intermediate between halo and thick disk. The variety of MgII profiles can be explained by gas in the disks and halos of galaxies not very different from galaxies in the local Universe. In any one case there is considerable ambiguity in diagnosing the kinematic composition of an absorber from the low-ionization, high-resolution spectra alone. Future data will allow galaxy morphologies, impact parameters, and orientations, the FeII/MgII ratios of clouds, and the distribution of high-ionization gas to be incorporated into the kinematic analysis. Combining all these data will permit a more accurate diagnosis of the physical conditions along the line of sight through the absorbing galaxy.
    Comment: 34 pages including 14 postscript figures; accepted by the Astrophysical Journal; URL http://www.astro.psu.edu/users/cwc/pubs.htm

    Mentat: An object-oriented macro data flow system

    Mentat, an object-oriented macro data flow system designed to facilitate parallelism in distributed systems, is presented. The macro data flow model is a model of computation similar to the data flow model, with two principal differences: the computational complexity of the actors is much greater than in traditional data flow systems, and there are persistent actors that maintain state information between executions. Mentat combines the object-oriented programming paradigm with the macro data flow model of computation. Mentat programs use a dynamic structure called a future list to represent the future of computations.
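    The abstract does not spell out the future list's interface; as a rough illustration only, the sketch below models a list of placeholders ("futures") for results of coarse-grained actors that have been invoked but may not yet have finished, using Python's concurrent.futures as a stand-in for Mentat's runtime. The actor names and bodies are hypothetical, not taken from Mentat.

```python
# Illustrative sketch: a "future list" of pending results from coarse-grained
# (macro data flow) actors, with Python futures standing in for Mentat's runtime.
from concurrent.futures import Future, ThreadPoolExecutor

def filter_even(block):                 # a coarse-grained, stateless actor
    return [x for x in block if x % 2 == 0]

class RunningSum:                       # a persistent actor: keeps state between firings
    def __init__(self):
        self.total = 0
    def __call__(self, block):
        self.total += sum(block)
        return self.total

blocks = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
with ThreadPoolExecutor() as pool:
    # The "future list": placeholders for results that may not yet exist.
    future_list: list[Future] = [pool.submit(filter_even, b) for b in blocks]
    # A downstream persistent actor fires as each operand resolves.
    summer = RunningSum()
    running_totals = [summer(f.result()) for f in future_list]
print(running_totals)                   # [2, 12, 20]
```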

    Dilettante, Venturesome, Tory and Crafts: Drivers of Performance Among Taxonomic Groups

    Empirical research has failed to cumulate into a coherent taxonomy of small firms. This may be because the method adapted from biology by Bill McKelvey has almost never been adopted: his approach calls for an extensive set of variables and a focused sample of organizations, contrary to most empirical studies, which are specialized. Comparing general- and special-purpose approaches, we find that some of the latter have more explanatory power than others, and that general-purpose taxonomies have the greatest explanatory power. Examining performance, we find that the types do not display significantly different levels of performance, but that they display highly varied drivers of performance.

    Producing approximate answers to database queries

    We have designed and implemented a query processor, called APPROXIMATE, that makes approximate answers available if part of the database is unavailable or if there is not enough time to produce an exact answer. The accuracy of the approximate answers improves monotonically with the amount of data retrieved to produce the result. The exact answer is produced if all of the needed data are available and query processing is allowed to run to completion. The monotone query processing algorithm of APPROXIMATE works within the standard relational algebra framework and can be implemented on a relational database system with little change to the relational architecture. We describe here the approximation semantics of APPROXIMATE that serves as the basis for meaningful approximations of both set-valued and single-valued queries. We show how APPROXIMATE is implemented to make effective use of semantic information, provided by an object-oriented view of the database, and describe the additional overhead required by APPROXIMATE.
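    The abstract does not give the formal semantics. One common way to make "monotonically improving" set-valued answers concrete is to bracket the exact answer between a certain set (tuples known to qualify) and a possible set (tuples that might still qualify once missing data arrive). The sketch below is a minimal illustration of that idea, not the APPROXIMATE implementation; the relation, tuple ids, and predicate are invented.

```python
# Minimal sketch of a monotonically improving approximate answer to a
# set-valued query: the exact answer always lies between a lower bound
# (certainly in the answer) and an upper bound (possibly in the answer).
def approximate_select(available_blocks, all_tuple_ids, predicate):
    certain = set()          # tuples seen so far that satisfy the predicate
    examined = set()         # tuple ids whose data have been retrieved
    for block in available_blocks:            # data arrive block by block
        for tid, row in block:
            examined.add(tid)
            if predicate(row):
                certain.add(tid)
        possible = certain | (all_tuple_ids - examined)   # unexamined rows may still qualify
        yield certain, possible               # each step is at least as precise as the last

# Hypothetical usage: select employees with salary > 50,000.
blocks = [[(1, {"salary": 60_000}), (2, {"salary": 40_000})],
          [(3, {"salary": 70_000})]]
for certain, possible in approximate_select(blocks, {1, 2, 3, 4},
                                            lambda r: r["salary"] > 50_000):
    print(sorted(certain), sorted(possible))   # bounds tighten as more data arrive
```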

    Constraints on the thermal and tectonic evolution of Greymouth coalfield

    The southern end of the Paparoa Range in Westland, South Island, New Zealand, comprises an asymmetrical, southward-plunging, faulted (Brunner-Mt Davy) anticline, the eastern limb of which is common with the western limb of an asymmetrical (Grey Valley) syncline forming a Neogene foreland basin (Grey Valley Trough). The faulted anticline is a classic inversion structure: compression during the Neogene, associated with the development of the modern Australia-Pacific plate boundary, caused a pre-existing normal fault zone, about which a Late Cretaceous-Oligocene extensional half graben had formed (Paparoa Trough), to change its sense of displacement. The resulting basement loading formed the foreland basin, containing up to 3 km of mainly marine sedimentary section. Fission track results for apatite concentrates from 41 shallow drillhole and outcrop samples from the Greymouth Coalfield part of the Brunner-Mt Davy Anticline are reported and interpreted to better establish the timing and amount of inversion, and hence the mechanism of inversion. The fission track results, integrated with modelling of vitrinite reflectance data, show that the maximum paleotemperatures experienced during burial of the Late Cretaceous and mid-Eocene coal-bearing succession everywhere exceeded 85°C and reached a peak of 180°C along the axis of the former basin. Cooling from maximum temperatures occurred during three discrete phases: 20-15 Ma, 12-7 Ma, and c. 2 Ma to the present. The amount of denudation has been variable across the inverted basin, decreasing westward from maxima of c. 2.5 km during the first deformation phase, c. 1.2 km during the second phase, and 1.4 km during the third phase. It appears that exhumation over the coalfield continued for about 2 m.y. beyond the biostratigraphically determined time ranges of each of two synorogenic unconformities along the western limb of the Grey Valley Syncline. Stick-slip behaviour on the range-front fault that localised the inversion is inferred. The tectonic evolution of the anticline-syncline pair at the southern end of the Paparoa Range is therefore identical in style, and similar in timing, to the development of the Papahaua Range-Westport Trough across the Kongahu Fault Zone, in the vicinity of Buller Coalfield.

    Scheduling real-time, periodic jobs using imprecise results

    A process is called a monotone process if the accuracy of its intermediate results is non-decreasing as more time is spent to obtain the result. The result produced by a monotone process upon its normal termination is the desired result; the error in this result is zero. External events such as timeouts or crashes may cause the process to terminate prematurely. If the intermediate result produced by the process upon its premature termination is saved and made available, the application may still find the result usable and, hence, acceptable; such a result is said to be an imprecise one. The error in an imprecise result is nonzero. The problem of scheduling periodic jobs to meet deadlines on a system that provides the necessary programming language primitives and run-time support for processes to return imprecise results is discussed. This problem differs from traditional scheduling problems in that the scheduler may choose to terminate a task before it is completed, causing it to produce an acceptable but imprecise result. Consequently, the amounts of processor time assigned to tasks in a valid schedule can be less than the amounts of time required to complete the tasks. A meaningful formulation of this problem, taking into account the quality of the overall result, is discussed. Three algorithms for scheduling jobs for which the effects of errors in results produced in different periods are not cumulative are described, and their relative merits are evaluated.
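    As a hedged illustration of the monotone-process idea only, the sketch below refines an estimate iteratively so that its error does not grow with extra time, and returns whatever intermediate (imprecise) result exists when a deadline cuts it off. The square-root computation and the 1 ms budget are invented for illustration; the paper's scheduling algorithms themselves are not reproduced here.

```python
# Sketch of a monotone process: accuracy is non-decreasing in the time spent,
# and premature termination (a deadline) yields a usable but imprecise result.
import time

def monotone_sqrt(x, deadline):
    estimate = x                                    # initial, imprecise result
    while time.monotonic() < deadline:
        new = 0.5 * (estimate + x / estimate)       # Newton step: residual shrinks each pass
        if abs(new - estimate) < 1e-12:
            return new, 0.0                         # normal termination: error effectively zero
        estimate = new
    # Premature termination: return the saved intermediate result and a residual error measure.
    return estimate, abs(estimate * estimate - x)

value, error = monotone_sqrt(2.0, time.monotonic() + 0.001)   # hypothetical 1 ms budget
print(value, error)
```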

    Imprecise results: Utilizing partial computations in real-time systems

    In real-time systems, a computation may not have time to complete its execution because of deadline requirements. In such cases, no result except the approximate results produced by the computation up to that point will be available. It is desirable to utilize these imprecise results if possible. Two approaches are proposed to enable computations to return imprecise results when execution cannot be completed normally. The milestone approach records results periodically and, if a deadline is reached, returns the last recorded result. The sieve approach demarcates sections of code which can be skipped if the time available is insufficient. By using these approaches, the system is able to produce imprecise results when deadlines are reached. The design of the Concord project, which supports imprecise computations using these techniques, is described. Also presented is a general model of imprecise computations using these techniques, as well as one that takes into account the influence of the environment, showing where the latter approach fits into this model.
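    The abstract names the two mechanisms but gives no code. As a rough sketch under assumed semantics (not Concord's implementation), the milestone function below records the latest usable result after each step and falls back to it on timeout, while the sieve function skips a demarcated optional section when not enough time remains. All names, and the idea of passing an estimated cost for the optional part, are hypothetical.

```python
# Illustrative sketch of the milestone and sieve approaches; names are invented.
import time

def milestone_compute(items, deadline):
    """Milestone approach: record a result after each step; on timeout,
    return the last recorded (imprecise) result."""
    last_recorded = None
    partial = 0
    for x in items:
        if time.monotonic() >= deadline:
            return last_recorded                   # imprecise result from the last milestone
        partial += x
        last_recorded = partial                    # record a milestone
    return last_recorded                           # precise result: all items processed

def sieve_compute(data, deadline, rough, refine, refine_cost):
    """Sieve approach: a demarcated optional section is skipped when the
    remaining time is insufficient to execute it."""
    result = rough(data)                           # mandatory part
    if deadline - time.monotonic() >= refine_cost: # enough time left for the optional part?
        result = refine(result)                    # otherwise skip it: imprecise result
    return result
```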

    Scheduling periodic jobs using imprecise results

    One approach to avoiding timing faults in hard real-time systems is to make available intermediate, imprecise results produced by real-time processes. When a result of the desired quality cannot be produced in time, an imprecise result of acceptable quality produced before the deadline can be used. The problem of scheduling periodic jobs to meet deadlines on a system that provides the necessary programming language primitives and run-time support for processes to return imprecise results is discussed. Since the scheduler may choose to terminate a task before it is completed, causing it to produce an acceptable but imprecise result, the amount of processor time assigned to any task in a valid schedule can be less than the amount of time required to complete the task. A meaningful formulation of the scheduling problem must take into account the overall quality of the results. Depending on the types of undesirable effects caused by errors, jobs are classified as type N or type C. For type N jobs, the effects of errors in results produced in different periods are not cumulative, and a reasonable performance measure is the average error over all jobs. Three heuristic algorithms that lead to feasible schedules with small average errors are described. For type C jobs, the undesirable effects of errors produced in different periods are cumulative. Schedulability criteria for type C jobs are discussed.
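    For type N jobs the performance measure is the average error over all jobs; the sketch below shows one plausible way to compute that measure for a schedule that assigns some job instances less than their full processing requirement. The linear error model (error proportional to the unfinished fraction of required time) and the job set are assumptions for illustration; the three heuristic scheduling algorithms themselves are not reproduced.

```python
# Sketch: average error of a schedule for type N periodic jobs, under an
# assumed linear error model (error = unfinished fraction of required time).
def average_error(jobs):
    """jobs: list of (required_time, assigned_time) pairs, one per job instance."""
    errors = []
    for required, assigned in jobs:
        unfinished = max(0.0, required - assigned)
        errors.append(unfinished / required)       # 0 = precise result, 1 = no result
    return sum(errors) / len(errors)

# Hypothetical job instances within one scheduling window.
jobs = [(4.0, 4.0),    # completed: zero error
        (6.0, 4.5),    # terminated early: acceptable but imprecise result
        (3.0, 3.0)]
print(average_error(jobs))   # average over all jobs, the type N performance measure
```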

    PERTS: A Prototyping Environment for Real-Time Systems

    PERTS is a prototyping environment for real-time systems. It is being built incrementally and will contain basic building blocks of operating systems for time-critical applications; tools and performance models for the analysis, evaluation, and measurement of real-time systems; and a simulation/emulation environment. It is designed to support the use and evaluation of new design approaches, experimentation with alternative system building blocks, and the analysis and performance profiling of prototype real-time systems.

    The Geminga pulsar wind nebula in the mid-infrared and submillimetre

    The nearby middle-aged Geminga pulsar has crossed the Galactic plane within the last ∼0.1 Myr. We present archival data from the Wide-field Infrared Survey Explorer and from SCUBA and SCUBA-2 on the James Clerk Maxwell Telescope to assess whether any mid-infrared and submillimetre emission arises from interaction of the pulsar wind nebula with the interstellar medium. A candidate shell and bow shock are reported. Given the low pulsar velocity and local density, dust grains appear able to penetrate into the nebula. A compact source seen towards the pulsar is fitted with a dust spectrum. If confirmed as a real association at higher resolution, this could be a circum-pulsar disc of at least a few Earth masses, in which future planets could form.