
    Performance spec for a computer system

    Get PDF
    Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Architecture, 1986. Microfiche copy available in Archives and Rotch. Includes bibliographical references (leaves 70-73). Design is considered here not as a professional activity that entails preliminary design, schematic design, and design development, but as a creative and conceptual activity that involves the graphic expression of ideas. Through computer graphics, the computer, as a protean machine, provides the means to develop an instrument for design that is more expressive than the pencil. Computer graphics systems for architects are built almost exclusively for drafting and working-drawing production; most are based on systems designed for the mechanical engineering of machine parts. The drafting paradigm of the CAD system implies that drawings done on the computer represent schemes in an already designed form. The antithetical device proposed here is called a creatrix. The creatrix is analogous to three instruments: it expresses the architect's ideas like a musical instrument; it manages and manipulates building complexity like a mathematical instrument; and it quantifies and qualifies for the architect's evaluation like a scientific instrument. Its three-fold mandate is: to assist the architect's process of visualization through the expression of ideas; to manage building complexity in a way that allows the architect to think about architecture; and to re-present the architect's ideas in a way that helps the architect to evaluate his concepts critically and objectively. This thesis presents a specification for a system that satisfies these mandates. The creatrix is specified in the context of what is technologically feasible now at a reasonable cost. by Peter Harold Jurgensen. M.S.

    Phosphorus Translocation by Red Deer on a Subalpine Grassland in the Central European Alps

    Get PDF
    We examined the role of red deer (Cervus elaphus L.) in translocating phosphorus (P) from their preferred grazing sites (short-grass vegetation on subalpine grasslands) to their wider home range in a subalpine grassland ecosystem in the Central European Alps. Phosphorus was used because it is the limiting nutrient in these grasslands. When we compared the P removed in aboveground biomass by grazing with the P input from feces deposited on a grid of 268 cells (20 m × 20 m) covering the entire grassland, we detected distinct spatial patterns: the proportion of heavily grazed short-grass vegetation increased with increasing soil-P pool, suggesting that red deer preferentially grazed grid cells with a higher soil-P pool. Because of the increased proportion of short-grass vegetation, biomass consumption, and therefore P removal, increased with increasing soil-P pool. Within the two vegetation types (short-grass and tall-grass), however, consumption was independent of the soil-P pool. In addition, P input rates from defecation increased with increasing soil-P pool, resulting in a constant mean net P loss of 0.083 kg ha−1 y−1 (0.03%-0.07% of the soil-P pool) independent of both soil-P pool and vegetation type. Thus, there was no P translocation between grid cells with different soil-P pools or between short-grass and tall-grass vegetation. Based on these results, the net rate of P loss is likely too small to explain the observed changes in vegetation composition on the grassland since 1917, from tall-herb/meadow communities to short-grass and from tall-grass to short-grass. Instead, we suggest that the grazing patterns of red deer directly induced succession from tall-herb/meadow communities to short-grass vegetation.
Yet it is also possible that long-term net soil-P losses indirectly drive plant succession from short-grass to tall-grass vegetation, because nutrient depletion could reduce grazing pressure in short-grass vegetation and enable the characteristic tall-grass species Carex sempervirens Vill. to establish.
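The budget behind these results can be sketched as a simple per-cell balance: P removed in grazed biomass minus P returned in feces, averaged over the grid. The rates below are invented for illustration and are not the study's data.

```python
# Illustrative sketch only (not the authors' code): a per-cell phosphorus
# balance on a 20 m x 20 m grid. Removal and input rates are invented
# example values, in kg P per hectare per year.

CELL_AREA_HA = (20 * 20) / 10_000  # each grid cell covers 0.04 ha

# Hypothetical cells: P removed by grazing vs. P returned via feces
cells = [
    {"removal": 0.30, "input": 0.22},
    {"removal": 0.45, "input": 0.36},
    {"removal": 0.10, "input": 0.02},
]

# Net loss per cell; a positive value means grazing exports more P
# from the cell than defecation returns to it.
net_losses = [c["removal"] - c["input"] for c in cells]
mean_net_loss = sum(net_losses) / len(net_losses)
print(f"mean net P loss: {mean_net_loss:.3f} kg/ha/yr")
```

The study's finding was that this balance came out roughly constant across cells, i.e. no net redistribution between vegetation types.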

    Fingerprint identification using weighted minutiae

    No full text

    Wood strength loss as a measure of decomposition in northern forest mineral soil

    No full text
    Wood stake weight loss has been used as an index of wood decomposition in mineral soil, but it may not give a reliable estimate in cold boreal forests where decomposition is very slow. Various wood stake strength tests have been used as surrogates for weight loss, but little is known about which test gives the best estimate of decomposition across a range of soil temperature conditions. Our study showed that radial compression strength (RCS) was a better indicator of wood strength change in southern pine (Pinus spp.) and aspen (Populus tremuloides Michx.) than surface hardness or longitudinal shear. The suitability of using RCS to measure wood decomposition in boreal mineral soils was tested in six Scots pine (Pinus sylvestris L.) plantations along a north-south gradient from Finland to Poland. After 3 years, RCS losses ranged from 20% in northern Finland to 94% in central Poland, compared with dry weight losses of 3% and 65%. RCS was a sensitive indicator of initial wood decomposition and could be used in soils where decomposition is limited by low temperature or by lack of water or oxygen, or where a rapid estimate of wood decomposition is wanted. © 2005 Elsevier SAS. All rights reserved.
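As a simple illustration of how such losses are expressed, strength (or weight) loss is the decline relative to undecayed control stakes. The strength values below are invented, not the study's measurements.

```python
# Hedged sketch: expressing decomposition as percentage loss of radial
# compression strength (RCS) relative to undecayed control stakes.
# The MPa values are made-up example numbers.

def percent_loss(initial: float, remaining: float) -> float:
    """Percent loss relative to the initial (control) value."""
    return 100.0 * (initial - remaining) / initial

# e.g. control RCS of 50 MPa; 10 MPa remains after field exposure
print(f"RCS loss: {percent_loss(50.0, 10.0):.0f}%")  # 80%
```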

    Is detection of adverse events affected by record review methodology? An evaluation of the “Harvard Medical Practice Study” method and the “Global Trigger Tool”.

    Get PDF
    BACKGROUND: There has been a theoretical debate as to which retrospective record review method is the most valid, reliable, cost-efficient and feasible for detecting adverse events. The aim of the present study was to evaluate the feasibility and capability of two common retrospective record review methods, the "Harvard Medical Practice Study" method and the "Global Trigger Tool", in detecting adverse events in adult orthopaedic inpatients. METHODS: We performed a three-stage structured retrospective record review of a random sample of 350 orthopaedic admissions during 2009 at a Swedish university hospital. Two teams, each comprising a registered nurse and two physicians, were assigned, one to each method. All records were first reviewed by the registered nurses. Records containing a potential adverse event were forwarded to the physicians for review in stage 2. The physicians made an independent review regarding, for example, healthcare causation, preventability and severity. In the third review stage, all adverse events found with the two methods together were compared, and all discrepancies remaining after review stage 2 were analysed. Events that had not been identified by one of the methods in the first two review stages were reviewed by the respective physicians. RESULTS: Altogether, 160 different adverse events were identified in 105 (30.0%) of the 350 records with both methods combined. The "Harvard Medical Practice Study" method identified 155 of the 160 (96.9%, 95% CI: 92.9-99.0) adverse events in 104 (29.7%) records, compared with 137 (85.6%, 95% CI: 79.2-90.7) adverse events in 98 (28.0%) records using the "Global Trigger Tool". Adverse events "causing harm without permanent disability" accounted for most of the observed difference. The overall positive predictive value of the criteria and triggers was 40.3% for the "Harvard Medical Practice Study" method and 30.4% for the "Global Trigger Tool".
CONCLUSIONS: More adverse events were identified using the "Harvard Medical Practice Study" method than using the "Global Trigger Tool". Differences in review methodology, perception of less severe adverse events, and context knowledge may explain the observed difference between the two expert review teams in the detection of adverse events.
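The positive predictive value reported in such studies is simply the fraction of flagged records in which physician review confirmed an adverse event. A minimal sketch with invented counts (not the study's data):

```python
# Illustrative only: positive predictive value (PPV) of a screening
# criterion/trigger, i.e. confirmed adverse events divided by records
# flagged in the first review stage. Counts are hypothetical.

def ppv(confirmed: int, flagged: int) -> float:
    """Fraction of flagged records confirmed on physician review."""
    return confirmed / flagged

# e.g. 121 of 300 flagged records confirmed as adverse events
print(f"PPV: {ppv(121, 300):.1%}")  # 40.3%
```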

    Randomness on full shift spaces

    No full text
    We give various characterizations of algorithmically random configurations on full shift spaces, based on randomness tests. We show that all nonsurjective cellular automata destroy randomness and surjective cellular automata preserve randomness. Furthermore, all one-dimensional cellular automata preserve nonrandomness. The last three assertions are also true if one replaces randomness by richness, a form of pseudorandomness which is compatible with computability. The last assertion is true even for an arbitrary
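For intuition, a one-dimensional cellular automaton maps each configuration to a new one by applying a local rule everywhere at once. Rule 90 (each cell becomes the XOR of its two neighbours) is a standard example of a surjective CA; the sketch below uses a finite cyclic configuration, whereas the results above concern full (bi-infinite) shift spaces.

```python
# Minimal sketch of one step of a one-dimensional cellular automaton on a
# finite cyclic configuration of 0/1 cells. Rule 90: new cell = XOR of the
# left and right neighbours. Finite/cyclic is an illustrative simplification.

def rule90_step(config: list[int]) -> list[int]:
    """Apply one synchronous rule-90 update with wrap-around boundaries."""
    n = len(config)
    return [config[(i - 1) % n] ^ config[(i + 1) % n] for i in range(n)]

print(rule90_step([0, 0, 1, 0, 0]))  # [0, 1, 0, 1, 0]
```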

    Molecular Characterization of Collagen Hydroxylysine O-Glycosylation by Mass Spectrometry: Current Status

    No full text
    The most abundant proteins in vertebrates – the collagen family proteins – play structural and biological roles in the body. The predominant member, type I collagen, provides tissues and organs with structure and connectivity. This protein carries several unique post-translational modifications that take place intra- and extracellularly. With growing evidence of the relevance of such post-translational modifications in health and disease, the biological significance of O-linked collagen glycosylation has recently drawn increased attention. However, several aspects of this unique modification – the requirement for prior lysyl hydroxylation as a substrate, the involvement of at least two distinct glycosyltransferases, and its involvement in intermolecular crosslinking – have made its molecular mapping and quantitative characterization challenging. Such characterization is crucial for understanding its biological significance. Recent progress in mass spectrometry has provided an unprecedented opportunity for this type of analysis. This review summarizes recent advances in the O-glycosylation of fibrillar collagens and their characterization using state-of-the-art liquid chromatography–mass spectrometry-based methodologies, and offers perspectives on future research. The analytical characterization of collagen crosslinking and of advanced glycation end-products is not addressed here.