2,785 research outputs found

    Mission oriented R and D and the advancement of technology: The impact of NASA contributions, volume 2

    NASA contributions to the advancement of major developments in twelve selected fields of technology are presented. The twelve fields discussed are: (1) cryogenics, (2) electrochemical energy conversion and storage, (3) high-temperature ceramics, (4) high-temperature metals, (5) integrated circuits, (6) internal gas dynamics, (7) materials machining and forming, (8) materials joining, (9) microwave systems, (10) nondestructive testing, (11) simulation, and (12) telemetry. These fields were selected on the basis of both NASA and nonaerospace interest and activity.

    Simulating Dynamical Features of Escape Panic

    One of the most disastrous forms of collective human behaviour is the kind of crowd stampede induced by panic, often leading to fatalities as people are crushed or trampled. Sometimes this behaviour is triggered in life-threatening situations such as fires in crowded buildings; at other times, stampedes can arise from the rush for seats or seemingly without cause. Tragic examples within recent months include the panics in Harare, Zimbabwe, and at the Roskilde rock concert in Denmark. Although engineers are finding ways to alleviate the scale of such disasters, their frequency seems to be increasing with the number and size of mass events. Yet systematic studies of panic behaviour, and quantitative theories capable of predicting such crowd dynamics, are rare. Here we show that simulations based on a model of pedestrian behaviour can provide valuable insights into the mechanisms of and preconditions for panic and jamming by incoordination. Our results suggest practical ways of minimising the harmful consequences of such events and the existence of an optimal escape strategy, corresponding to a suitable mixture of individualistic and collective behaviour. For related information see http://angel.elte.hu/~panic, http://www.helbing.org, http://angel.elte.hu/~fij, and http://angel.elte.hu/~vicse
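    A minimal sketch of the kind of pedestrian simulation the abstract refers to, assuming the generic social-force form (a driving force relaxing each pedestrian toward a desired velocity pointing at an exit, plus exponential repulsion between pedestrians); the parameter values, the single fixed exit, and the three-pedestrian setup are illustrative, not the paper's calibrated model:

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Sketch of a social-force-style pedestrian model: every parameter value
// below (desired speed, relaxation time, repulsion constants, exit position)
// is illustrative rather than taken from the paper.
struct Ped { double x, y, vx, vy; };

int main() {
    const double v0 = 1.5;                        // desired speed (m/s)
    const double tau = 0.5;                       // velocity relaxation time (s)
    const double A = 2.0, B = 0.3, radius = 0.3;  // repulsion strength/range, body radius (m)
    const double dt = 0.01;                       // time step (s)
    const double exit_x = 10.0, exit_y = 0.0;     // a single exit to head for

    std::vector<Ped> peds = {{0.0, 0.2, 0, 0}, {0.3, -0.2, 0, 0}, {-0.4, 0.1, 0, 0}};

    for (int step = 0; step < 2000; ++step) {
        std::vector<Ped> next = peds;
        for (std::size_t i = 0; i < peds.size(); ++i) {
            // Driving force: relax towards the desired velocity (pointing at the exit).
            double ex = exit_x - peds[i].x, ey = exit_y - peds[i].y;
            double d = std::sqrt(ex * ex + ey * ey) + 1e-9;
            double fx = (v0 * ex / d - peds[i].vx) / tau;
            double fy = (v0 * ey / d - peds[i].vy) / tau;
            // Repulsive forces from the other pedestrians (stronger when bodies overlap).
            for (std::size_t j = 0; j < peds.size(); ++j) {
                if (j == i) continue;
                double dx = peds[i].x - peds[j].x, dy = peds[i].y - peds[j].y;
                double dist = std::sqrt(dx * dx + dy * dy) + 1e-9;
                double mag = A * std::exp((2 * radius - dist) / B);
                fx += mag * dx / dist;
                fy += mag * dy / dist;
            }
            next[i].vx += fx * dt;
            next[i].vy += fy * dt;
            next[i].x += next[i].vx * dt;
            next[i].y += next[i].vy * dt;
        }
        peds = next;
    }
    for (const Ped& p : peds) std::printf("x=%.2f y=%.2f\n", p.x, p.y);
    return 0;
}
```

    In the full model, walls and doorways contribute analogous repulsive terms, and it is the interplay of those terms with the driving force at a bottleneck that produces the jamming behaviour the abstract describes.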

    Halide: a language and compiler for optimizing parallelism, locality, and recomputation in image processing pipelines

    Image processing pipelines combine the challenges of stencil computations and stream programs. They are composed of large graphs of different stencil stages, as well as complex reductions, and stages with global or data-dependent access patterns. Because of their complex structure, the performance difference between a naive implementation of a pipeline and an optimized one is often an order of magnitude. Efficient implementations require optimization of both parallelism and locality, but due to the nature of stencils, there is a fundamental tension between parallelism, locality, and introducing redundant recomputation of shared values. We present a systematic model of the tradeoff space fundamental to stencil pipelines, a schedule representation that describes concrete points in this space for each stage in an image processing pipeline, and an optimizing compiler for the Halide image processing language that synthesizes high-performance implementations from a Halide algorithm and a schedule. Combining this compiler with stochastic search over the space of schedules enables terse, composable programs to achieve state-of-the-art performance on a wide range of real image processing pipelines, and across different hardware architectures, including multicores with SIMD and heterogeneous CPU+GPU execution. From simple Halide programs written in a few hours, we demonstrate performance up to 5x faster than hand-tuned C, intrinsics, and CUDA implementations optimized by experts over weeks or months, for image processing applications beyond the reach of past automatic compilers. Funding: United States Dept. of Energy (Award DE-SC0005288); National Science Foundation (U.S.) (Grant 0964004); Intel Corporation; Cognex Corporation; Adobe Systems.
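    To make the algorithm/schedule split concrete, here is a small example written against the open-source Halide C++ front end; the blur pipeline itself, the synthetic input Func, and the specific tile, vector, and output sizes are illustrative choices, not settings taken from the abstract:

```cpp
#include "Halide.h"
#include <cstdio>
using namespace Halide;

int main() {
    Func input("input"), blur_x("blur_x"), blur_y("blur_y");
    Var x("x"), y("y"), xi("xi"), yi("yi");

    // Synthetic input so the example is self-contained (no boundary handling needed).
    input(x, y) = cast<uint16_t>(x + y);

    // The algorithm: pure definitions, with no storage or traversal order specified.
    blur_x(x, y) = (input(x - 1, y) + input(x, y) + input(x + 1, y)) / 3;
    blur_y(x, y) = (blur_x(x, y - 1) + blur_x(x, y) + blur_x(x, y + 1)) / 3;

    // The schedule: one concrete point in the parallelism/locality/recompute space.
    blur_y.tile(x, y, xi, yi, 256, 32)  // traverse the output in 256x32 tiles
          .vectorize(xi, 8)             // 8-wide SIMD within a tile row
          .parallel(y);                 // distribute rows of tiles across cores
    blur_x.compute_at(blur_y, x)        // (re)compute blur_x per tile, trading work for locality
          .vectorize(x, 8);

    Buffer<uint16_t> out = blur_y.realize({2048, 2048});
    std::printf("blur_y(10, 10) = %d\n", (int)out(10, 10));
    return 0;
}
```

    Swapping in a different schedule (different tile sizes, or storing blur_x whole) changes performance but never the results, which is what makes stochastic search over schedules possible.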

    Sr isotopes in arcs revisited: tracking slab dehydration using δ88/86Sr and 87Sr/86Sr systematics of arc lavas

    Dehydration of the subducting slab is a crucial process in the generation of hydrous convergent margin magmas, yet the exact processes of how and where the slab dehydrates, and how these fluids are transported to the mantle wedge, remain obscure. Strontium is a “fluid-mobile” element and as such is well suited to investigating the source of slab-derived fluids. We employ mass-dependent Sr isotope systematics (δ88/86Sr; the deviation in 88Sr/86Sr of a sample relative to NIST SRM 987) of primitive arc lavas, in tandem with conventional radiogenic 87Sr/86Sr measurements, as a novel tracer of slab dehydration. To characterise the δ88/86Sr composition of subduction zone inputs, we present new δ88/86Sr data for subducting sediments, altered oceanic crust and mid-ocean ridge basalts (MORB). Calcareous sediments are isotopically lighter and carbonate-free sediments are isotopically heavier than MORB. Samples of the altered oceanic crust display elevated 87Sr/86Sr, but only the most intensely altered sample has significantly higher δ88/86Sr than pristine MORB. Mafic arc lavas from the Aegean and Mariana arcs invariably have a mass-dependent Sr isotope composition that is indistinguishable from MORB and lower 87Sr/86Sr than upper altered oceanic crust. This δ88/86Sr-87Sr/86Sr signature of the arc lavas, in combination with their high but variable Sr/Nd, can only be explained if it is provided by a fluid that acquired its Sr isotope signature in the deeper, less altered part of the subducted oceanic crust. We propose a model in which the breakdown of serpentinite in the slab mantle releases a pulse of fluid at sub-arc depths. These fluids travel through and equilibrate with the overlying oceanic crust and induce wet partial melting of the upper altered crust and sediments. This hydrous melt is then delivered to the mantle source of arc magmas as a single metasomatic component. From mass balance it follows that the slab-derived fluid contributes >70% of the Sr budget of both Mariana and Aegean arc lavas. Whereas this fluid-dominated character is unsurprising for the sediment-poor Mariana arc, the Aegean arc sees the subduction of 3–6 km of calcareous sediments that were found to exert very little control on the Sr budget of the arc magmas and are overwhelmed by the fluid contribution.
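    The ">70% of the Sr budget" statement can be read as a standard two-component mixing mass balance; the notation below (a slab-fluid and a mantle-wedge end-member, mixed in proportion f_fluid of the total Sr) is a generic illustration, not necessarily the authors' exact formulation:

```latex
\[
  \left(\frac{^{87}\mathrm{Sr}}{^{86}\mathrm{Sr}}\right)_{\text{arc}}
  \approx
  f_{\text{fluid}}\left(\frac{^{87}\mathrm{Sr}}{^{86}\mathrm{Sr}}\right)_{\text{fluid}}
  + \bigl(1 - f_{\text{fluid}}\bigr)\left(\frac{^{87}\mathrm{Sr}}{^{86}\mathrm{Sr}}\right)_{\text{wedge}},
  \qquad
  f_{\text{fluid}} = \frac{\mathrm{Sr}_{\text{fluid}}}{\mathrm{Sr}_{\text{fluid}} + \mathrm{Sr}_{\text{wedge}}}
\]
```

    An analogous relation holds for δ88/86Sr; the abstract's estimate corresponds to f_fluid > 0.7 for both the Mariana and Aegean arc lavas.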

    A Library for Declarative Resolution-Independent 2D Graphics

    The design of most 2D graphics frameworks has been guided by what the computer can draw efficiently, instead of by how graphics can best be expressed and composed. As a result, such frameworks restrict expressivity by providing a limited set of shape primitives, a limited set of textures, and only affine transformations. For example, non-affine transformations can only be added by invasive modification or complex tricks rather than by simple composition. More general frameworks exist, but they make it harder to describe and analyze shapes. We present a new declarative approach to resolution-independent 2D graphics that generalizes and simplifies the functionality of traditional frameworks while preserving their efficiency. As a real-world example, we show the implementation of a form of focus+context lenses that gives better image quality and better performance than the state-of-the-art solution at a fraction of the code. Our approach can serve as a versatile foundation for the creation of advanced graphics and higher-level frameworks.
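    As a hypothetical sketch (not the library's actual API) of the declarative, resolution-independent idea: an image is simply a function from a continuous point to an intensity, shapes are such functions, and transformations, affine or not, are ordinary function composition. The disc primitive and the fisheye-style lens below are made-up examples:

```cpp
#include <cmath>
#include <cstdio>
#include <functional>

// A resolution-independent image is a function from a continuous point to a grey level.
struct Point { double x, y; };
using Image = std::function<double(Point)>;   // point -> intensity
using Warp  = std::function<Point(Point)>;    // point -> point

// A shape primitive: the unit disc centred at the origin.
Image disc = [](Point p) { return (p.x * p.x + p.y * p.y <= 1.0) ? 1.0 : 0.0; };

// Transforming an image = composing it with an inverse point transform,
// so non-affine warps compose exactly like affine ones.
Image transform(Image im, Warp inverse) {
    return [=](Point p) { return im(inverse(p)); };
}

// Example non-affine warp: a fisheye-style lens that keeps the centre and
// compresses the periphery (sampling farther out as radius grows).
Warp fisheye(double strength) {
    return [=](Point p) {
        double r = std::sqrt(p.x * p.x + p.y * p.y);
        double s = 1.0 + strength * r;
        return Point{p.x * s, p.y * s};
    };
}

int main() {
    Image lensed = transform(disc, fisheye(0.5));
    // Rasterisation is a separate, final step: sample the function on any grid.
    for (int j = 0; j < 8; ++j) {
        for (int i = 0; i < 8; ++i) {
            Point p{(i - 3.5) / 3.5, (j - 3.5) / 3.5};
            std::putchar(lensed(p) > 0.5 ? '#' : '.');
        }
        std::putchar('\n');
    }
    return 0;
}
```

    Because shapes are only sampled at the very end, the same description can be rendered at any resolution, which is the property the focus+context lens example in the abstract relies on.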

    Pharmacy Adherence Measures to Assess Adherence to Antiretroviral Therapy: Review of the Literature and Implications for Treatment Monitoring

    Prescription- or pill-based methods for estimating adherence to antiretroviral therapy (ART), known as pharmacy adherence measures (PAMs), are objective estimates calculated from routinely collected pharmacy data. We conducted a literature review to evaluate PAMs, including their association with virological and other clinical outcomes, their performance compared with other adherence measures, and factors to consider when selecting a PAM to monitor adherence. PAMs were classified into 3 categories: medication possession ratio (MPR), pill count (PC), and pill pick-up (PPU). Data exist to recommend PAMs over self-reported adherence. PAMs consistently predicted patient outcomes, but additional studies are needed to determine the most predictive PAM parameters. Current evidence suggests that adherence assessed over shorter durations (≤6 months) and the use of PAMs to predict future outcomes may be less accurate. PAMs that incorporate the number of days for which ART was prescribed, without counting remnant pills, are reasonable minimum-resource methods to assess adherence to ART.
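    As a minimal illustration of the most widely used PAM, the sketch below computes one common form of the medication possession ratio: days' supply dispensed divided by days in the observation window, capped at 100%. The 180-day window, the capping rule, and the example refill records are assumptions for illustration, not parameters prescribed by the review:

```cpp
#include <cstdio>
#include <vector>

// One pharmacy refill record: the day it was dispensed and the days of supply it covers.
struct Fill { int day_dispensed; int days_supply; };

// MPR = total days' supply dispensed in the window / length of the window, capped at 1.0.
double medication_possession_ratio(const std::vector<Fill>& fills,
                                   int window_start, int window_end) {
    int window_days = window_end - window_start;
    int supplied = 0;
    for (const Fill& f : fills)
        if (f.day_dispensed >= window_start && f.day_dispensed < window_end)
            supplied += f.days_supply;
    double mpr = static_cast<double>(supplied) / window_days;
    return mpr > 1.0 ? 1.0 : mpr;   // early refills can push the raw ratio above 100%
}

int main() {
    // A 180-day window with five 30-day ART refills picked up: MPR = 150/180, about 0.83.
    std::vector<Fill> fills = {{0, 30}, {32, 30}, {65, 30}, {100, 30}, {140, 30}};
    std::printf("MPR = %.2f\n", medication_possession_ratio(fills, 0, 180));
    return 0;
}
```

    Pill-count and pill pick-up measures follow the same pattern but count remnant pills or on-time pick-ups instead of dispensed days of supply.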

    Decoupling algorithms from schedules for easy optimization of image processing pipelines

    Using existing programming tools, writing high-performance image processing code requires sacrificing readability, portability, and modularity. We argue that this is a consequence of conflating what computations define the algorithm with decisions about storage and the order of computation. We refer to these latter two concerns as the schedule, including choices of tiling, fusion, recomputation vs. storage, vectorization, and parallelism. We propose a representation for feed-forward imaging pipelines that separates the algorithm from its schedule, enabling high performance without sacrificing code clarity. This decoupling simplifies the algorithm specification: images and intermediate buffers become functions over an infinite integer domain, with no explicit storage or boundary conditions. Imaging pipelines are compositions of functions. Programmers separately specify scheduling strategies for the various functions composing the algorithm, which allows them to efficiently explore different optimizations without changing the algorithmic code. We demonstrate the power of this representation by expressing a range of recent image processing applications in an embedded domain-specific language called Halide, and compiling them for ARM, x86, and GPUs. Our compiler targets SIMD units, multiple cores, and complex memory hierarchies. We demonstrate that it can handle algorithms such as a camera raw pipeline, the bilateral grid, fast local Laplacian filtering, and image segmentation. The algorithms expressed in our language are both shorter and faster than state-of-the-art implementations. Funding: National Science Foundation (U.S.) (Grants 0964004, 0964218, and 0832997); United States Dept. of Energy (Award DE-SC0005288); Cognex Corporation; Adobe Systems.
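    Complementing the blur sketch further up, this toy Halide program shows the decoupling directly: the same two-stage algorithm can be given different schedules (inlining, compute_root to store everything, or compute_at to recompute per row), trading redundant recomputation against storage, without touching the algorithmic code. The function names f and g and the 64x64 output extent are arbitrary choices:

```cpp
#include "Halide.h"
#include <cstdio>
using namespace Halide;

int main() {
    // The algorithm: two functions over an infinite integer domain,
    // with no storage or boundary conditions specified.
    Func f("f"), g("g");
    Var x("x"), y("y");
    f(x, y) = x + y;
    g(x, y) = f(x, y) + f(x + 1, y);

    // Alternative schedules for the same algorithm; only one is active here.
    // Default (no directive): f is inlined into g, recomputing f(x + 1, y).
    // f.compute_root();          // store all of f first: no recompute, worse locality
    f.compute_at(g, y);           // compute f per row of g: a middle ground

    Buffer<int> out = g.realize({64, 64});   // JIT-compile and run the chosen schedule
    std::printf("g(3, 4) = %d\n", out(3, 4));
    return 0;
}
```

    Whichever schedule is chosen, g(3, 4) is always f(3, 4) + f(4, 4) = 15; only where and how often f is computed changes.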

    Veterans health administration hepatitis B testing and treatment with anti-CD20 antibody administration

    AIM: To evaluate, for quality improvement, pretreatment hepatitis B virus (HBV) testing, vaccination, and antiviral treatment rates in Veterans Affairs patients receiving anti-CD20 Ab. METHODS: We performed a retrospective cohort study using a national repository of Veterans Health Administration (VHA) electronic health record data. We identified all patients receiving anti-CD20 Ab treatment (2002-2014). We ascertained patient demographics, laboratory results, HBV vaccination status (from vaccination records), pharmacy data, and vital status. The high-risk period for HBV reactivation spans anti-CD20 Ab treatment and the 12 mo of follow-up; we therefore analyzed those who were followed to death or for at least 12 mo after completing anti-CD20 Ab. Pretreatment serologic tests were used to categorize patients as chronic HBV (hepatitis B surface antigen positive, HBsAg+), past HBV (HBsAg-, hepatitis B core antibody positive, HBcAb+), resolved HBV (HBsAg-, HBcAb+, hepatitis B surface antibody positive, HBsAb+), likely prior vaccination (isolated HBsAb+), HBV negative (HBsAg-, HBcAb-), or unknown. Acute hepatitis B was defined by the appearance of HBsAg+ during the high-risk period in patients who were HBV negative before treatment. We assessed HBV antiviral treatment and the incidence of hepatitis, liver failure, and death during the high-risk period. Cumulative hepatitis, liver failure, and death after anti-CD20 Ab initiation were compared across HBV disease categories using the χ2 test. Mean time to peak alanine aminotransferase (hepatitis), liver failure, and death relative to anti-CD20 Ab administration and follow-up were also compared by HBV disease group. RESULTS: Among 19304 VHA patients who received anti-CD20 Ab, 10224 (53%) had pretreatment HBsAg testing during the study period; in 2014, 49% and 43% were tested for HBsAg and HBcAb, respectively, within 6 mo before treatment. Of those tested, 2% (167/10224) had chronic HBV, 4% (326/7903) past HBV, 5% (427/8110) resolved HBV, 8% (628/8110) likely prior HBV vaccination, and 76% (6022/7903) were HBV negative. Of those with chronic HBV infection, ≤ 37% received HBV antiviral treatment during the high-risk period, while 21% and 23% of those with past or resolved HBV, respectively, received HBV antiviral treatment. During and for 12 mo after anti-CD20 Ab, the rate of hepatitis was significantly greater in HBV-positive than in HBV-negative patients (P = 0.001). The mortality rate was 35%-40% in chronic or past hepatitis B and 26%-31% in those who were hepatitis B negative. Of 4947 patients who were HBV negative before treatment and tested during anti-CD20 Ab treatment and follow-up, 16 (0.3%) developed acute hepatitis B. CONCLUSION: Although HBV testing of Veterans prior to anti-CD20 Ab has increased, few HBV+ patients received HBV antivirals, suggesting that electronic health record algorithms may enhance health outcomes.
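    The pretreatment serologic categories described in the methods amount to a small decision rule over three laboratory results; the sketch below encodes that rule directly. The enum names and the handling of missing tests as "unknown" are illustrative assumptions rather than the study's actual algorithm:

```cpp
#include <cstdio>
#include <optional>

// Serologic categories from the abstract, derived from three lab results;
// std::nullopt stands for a test that was never performed.
enum class HbvStatus { Chronic, Past, Resolved, LikelyVaccinated, Negative, Unknown };

HbvStatus classify(std::optional<bool> hbsag,    // hepatitis B surface antigen
                   std::optional<bool> hbcab,    // hepatitis B core antibody
                   std::optional<bool> hbsab) {  // hepatitis B surface antibody
    if (hbsag && *hbsag) return HbvStatus::Chronic;                 // HBsAg+
    if (hbsag && !*hbsag && hbcab && *hbcab) {
        if (hbsab && *hbsab) return HbvStatus::Resolved;            // HBsAg-, HBcAb+, HBsAb+
        return HbvStatus::Past;                                     // HBsAg-, HBcAb+
    }
    if (hbsag && !*hbsag && hbcab && !*hbcab) {
        if (hbsab && *hbsab) return HbvStatus::LikelyVaccinated;    // isolated HBsAb+
        return HbvStatus::Negative;                                 // HBsAg-, HBcAb-
    }
    return HbvStatus::Unknown;                                      // insufficient testing
}

int main() {
    HbvStatus s = classify(false, true, true);   // HBsAg-, HBcAb+, HBsAb+
    std::printf("resolved HBV? %s\n", s == HbvStatus::Resolved ? "yes" : "no");
    return 0;
}
```

    A rule of this kind is the sort of electronic health record logic the conclusion suggests could flag patients who need testing or antiviral prophylaxis before anti-CD20 Ab.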