
    A zinc transporter gene required for development of the nervous system.

    The essentiality of zinc for normal brain development is well established. It has been suggested that primary and secondary zinc deficiencies can contribute to the occurrence of numerous human birth defects, including many involving the central nervous system. In a recent study, we searched for zinc transporter genes that were critical for neurodevelopment. We confirmed that ZIP12 is a zinc transporter encoded by the gene slc39a12 that is highly expressed in the central nervous systems of human, mouse, and frog (Xenopus tropicalis). Using loss-of-function methods, we determined that ZIP12 is required for neuronal differentiation and neurite outgrowth and necessary for neurulation and embryonic viability. These results highlight an essential need for zinc regulation during embryogenesis and nervous system development. We suggest that slc39a12 is a candidate gene for inherited neurodevelopmental defects in humans.

    Limnogeology of Laguna Chungará and climate change during the Late Holocene in the northern Chilean Altiplano

    [Resumen] A seismic survey of Laguna Chungará (69° 30' W, 18° 15' S, 4520 m a.s.l., northern Chilean Altiplano) and sedimentological analysis of several cores have allowed us to reconstruct the evolution of lacustrine sedimentation in the lake during the Late Holocene. Two facies associations have been identified: i) a shallower littoral association, composed of peat-bog facies, and ii) a deeper lacustrine-platform association, composed of facies with abundant Characeae fragments. These facies alternate in three sedimentary cycles driven by fluctuations in lake level. These hydrological variations in the lake were caused by changes in the water balance, which in turn reflect major climatic fluctuations during the Late Holocene. [Abstract] We reconstruct the Late Holocene sedimentary history of Laguna Chungará (69° 30' W, 18° 15' S, 4520 m a.s.l., northern Chilean Altiplano) based on high-resolution seismic profiling and sedimentologic analyses of cores. Two sedimentary facies associations have been defined and interpreted: i) a macrophyte-dominant littoral association, composed of black muds with macrophyte remains and peaty muds, and ii) a Characeae-dominant lacustrine shelf association, composed of gray muds and sands with abundant Characeae remains. The two facies associations define three cycles caused by oscillations in the lake level from shallower (macrophyte) to deeper (Characeae) conditions. Changes in the hydrology of Laguna Chungará reflect variations in effective moisture (precipitation - evaporation) in the Altiplano during the Late Holocene.

    Performances of the PS² parallel storage and processing system for tomographic image visualization

    We propose a new approach for developing parallel I/O- and compute-intensive applications. At a high level of abstraction, a macro data flow description describes how processing and disk access operations are combined. This high-level description (CAP) is precompiled into compilable and executable C++ source code. Parallel file system components specified by CAP are offered as reusable CAP operations. Thanks to the CAP formalism, low-level parallel file system components can be combined with processing operations to yield efficient pipelined parallel I/O- and compute-intensive programs. The underlying parallel system is based on commodity components (PentiumPro processors, Fast Ethernet) and runs on top of Windows NT. The CAP-based parallel program development approach is applied to the development of an I/O- and processing-intensive tomographic 3D image visualization application. Configurations range from a single-PentiumPro, 1-disk system to a four-PentiumPro, 27-disk system. We show that performance scales well when increasing the number of processors and disks. With the largest configuration, the system is able to extract in parallel and project into the display space between three and four 512×512 images per second. The images may have any orientation and are extracted from a 100 MByte 3D tomographic image striped over the available set of disks.
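
    The pipelining idea can be sketched in plain C++: one thread performs the disk accesses while a second processes blocks that have already been read, so I/O overlaps with computation. This is only an illustration of the kind of schedule CAP compiles its macro data flow descriptions into; it is not CAP syntax, and the file name, block size, and placeholder processing step are assumptions made for the sketch.

        // Illustrative sketch only: plain C++ threads standing in for the kind of
        // pipelined I/O + processing schedule CAP generates; this is not CAP syntax.
        #include <condition_variable>
        #include <cstddef>
        #include <fstream>
        #include <mutex>
        #include <queue>
        #include <thread>
        #include <vector>

        struct Block { std::vector<char> data; };

        std::queue<Block> pending;        // blocks read from disk, awaiting processing
        std::mutex m;
        std::condition_variable cv;
        bool done = false;

        void reader(const char* path, std::size_t blockSize, std::size_t nBlocks) {
            std::ifstream in(path, std::ios::binary);
            for (std::size_t i = 0; i < nBlocks && in; ++i) {
                Block b; b.data.resize(blockSize);
                in.read(b.data.data(), static_cast<std::streamsize>(blockSize)); // disk access stage
                std::lock_guard<std::mutex> lk(m);
                pending.push(std::move(b));
                cv.notify_one();
            }
            { std::lock_guard<std::mutex> lk(m); done = true; }
            cv.notify_one();
        }

        void processor() {
            for (;;) {
                std::unique_lock<std::mutex> lk(m);
                cv.wait(lk, [] { return !pending.empty() || done; });
                if (pending.empty()) return;              // reader finished, queue drained
                Block b = std::move(pending.front()); pending.pop();
                lk.unlock();
                for (char& c : b.data) c = ~c;            // processing stage (placeholder work)
            }
        }

        int main() {
            std::thread t1(reader, "volume.bin", 1 << 20, 100);  // hypothetical file and sizes
            std::thread t2(processor);
            t1.join(); t2.join();
        }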

    Parallelizing I/O-intensive image access and processing applications

    This article presents methods and tools for building parallel applications based on commodity components: PCs, SCSI disks, Fast Ethernet, and Windows NT. Chief among these tools is CAP, our computer-aided parallelization tool. CAP generates highly pipelined applications that run communication and I/O operations in parallel with processing operations. One of CAP's successes is the Visible Human Slice Server, a 3D tomographic image server that allows clients to choose and view any cross section of the human body.
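
    The core operation behind such a slice server can be pictured with a small sketch: resampling an arbitrarily oriented plane from a 3D volume by trilinear interpolation. This is an illustration of the general technique only, written in plain C++; the struct, the in-memory volume layout, and the function names are assumptions, not the server's actual code (which streams striped image data from many disks rather than holding the volume in memory).

        // Conceptual sketch (not the Slice Server's code): extract one arbitrarily
        // oriented cross-section from a 3D volume of 8-bit voxels held in memory.
        #include <algorithm>
        #include <array>
        #include <cmath>
        #include <cstdint>
        #include <vector>

        struct Volume {
            int nx, ny, nz;
            std::vector<std::uint8_t> voxels;   // size nx*ny*nz, x varies fastest

            std::uint8_t at(int x, int y, int z) const {
                x = std::clamp(x, 0, nx - 1);
                y = std::clamp(y, 0, ny - 1);
                z = std::clamp(z, 0, nz - 1);
                return voxels[(std::size_t(z) * ny + y) * nx + x];
            }

            // Trilinear interpolation at an arbitrary point in voxel coordinates.
            double sample(double x, double y, double z) const {
                int x0 = int(std::floor(x)), y0 = int(std::floor(y)), z0 = int(std::floor(z));
                double fx = x - x0, fy = y - y0, fz = z - z0, v = 0.0;
                for (int dz = 0; dz <= 1; ++dz)
                    for (int dy = 0; dy <= 1; ++dy)
                        for (int dx = 0; dx <= 1; ++dx)
                            v += at(x0 + dx, y0 + dy, z0 + dz) *
                                 (dx ? fx : 1 - fx) * (dy ? fy : 1 - fy) * (dz ? fz : 1 - fz);
                return v;
            }
        };

        // The slice plane is given by an origin and two orthonormal in-plane
        // direction vectors, all expressed in voxel coordinates.
        std::vector<std::uint8_t> extractSlice(const Volume& vol,
                                               std::array<double, 3> origin,
                                               std::array<double, 3> u,
                                               std::array<double, 3> v,
                                               int width, int height) {
            std::vector<std::uint8_t> out(std::size_t(width) * height);
            for (int j = 0; j < height; ++j)
                for (int i = 0; i < width; ++i) {
                    double p[3];
                    for (int k = 0; k < 3; ++k) p[k] = origin[k] + i * u[k] + j * v[k];
                    out[std::size_t(j) * width + i] =
                        std::uint8_t(std::clamp(vol.sample(p[0], p[1], p[2]), 0.0, 255.0));
                }
            return out;
        }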

    Performance of CAP-specified linear algebra algorithms

    The traditional approach to the parallelization of linear algebra algorithms such as matrix multiplication and LU factorization calls for static allocation of matrix blocks to processing elements (PEs). Such algorithms suffer from two drawbacks: they are very sensitive to load imbalances between PEs, and they make it difficult to take advantage of pipelining opportunities. This paper describes dynamic versions of linear algebra algorithms, where subtasks (matrix block multiplication, matrix block LU factorization) are dynamically allocated to PEs, and analyses their performance theoretically. The paper's contribution is to show that the dynamic, pipelined linear algebra algorithms can be specified compactly in CAP and yet achieve good performance. CAP is a C++ language extension for the specification of parallel applications based on macro-dataflow graphs; the CAP model is general and supports pipelining.
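
    A minimal sketch of the dynamic-allocation idea, in plain C++ threads rather than CAP: block-multiplication subtasks are drawn from a shared counter, so a faster PE simply claims more blocks and load imbalance evens itself out. The matrix layout, block size, thread pool, and the requirement that the result matrix be zero-initialized by the caller are all choices made for this illustration.

        // Dynamic allocation of matrix-block multiplication subtasks to worker
        // threads via a shared atomic counter (illustrative sketch, not CAP code).
        #include <algorithm>
        #include <atomic>
        #include <cstddef>
        #include <thread>
        #include <vector>

        // Computes C += A * B for n x n row-major matrices; C must be zero-initialized.
        void blockMatMul(const std::vector<double>& A, const std::vector<double>& B,
                         std::vector<double>& C, int n, int blk, int nThreads) {
            int nb = (n + blk - 1) / blk;            // number of blocks per dimension
            std::atomic<int> next{0};                // shared task counter
            auto worker = [&] {
                for (int t; (t = next.fetch_add(1)) < nb * nb; ) {
                    int bi = (t / nb) * blk, bj = (t % nb) * blk;   // C block owned by this task
                    for (int i = bi; i < std::min(bi + blk, n); ++i)
                        for (int k = 0; k < n; ++k) {
                            double a = A[std::size_t(i) * n + k];
                            for (int j = bj; j < std::min(bj + blk, n); ++j)
                                C[std::size_t(i) * n + j] += a * B[std::size_t(k) * n + j];
                        }
                }
            };
            std::vector<std::thread> pool;
            for (int p = 0; p < nThreads; ++p) pool.emplace_back(worker);
            for (auto& th : pool) th.join();
        }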

    Synthesizing parallel imaging applications using the CAP Computer-Aided Parallelization tool

    Imaging applications such as filtering, image transforms, and compression/decompression require vast amounts of computing power when applied to large data sets. These applications would potentially benefit from the use of parallel processing. However, dedicated parallel computers are expensive and their processing power per node lags behind that of the most recent commodity components. Furthermore, developing parallel applications remains a difficult task. In order to facilitate the development of parallel applications, we propose the CAP computer-aided parallelization tool, which enables application programmers to specify at a high level of abstraction the flow of data between pipelined parallel operations. In addition, the CAP tool supports the programmer in developing parallel imaging and storage operations. CAP makes it possible to combine parallel storage access routines efficiently with sequential image processing operations. The paper shows how processing- and I/O-intensive imaging applications must be implemented to take advantage of parallelism and pipelining between data access and processing. The paper's contribution is: (1) to show how such implementations can be compactly specified in CAP; and (2) to demonstrate that CAP-specified applications achieve the performance of custom parallel code. The paper analyzes theoretically the performance of CAP-specified applications and demonstrates the accuracy of the theoretical analysis through experimental measurements.
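
    A pipelined-throughput model of the kind such a theoretical analysis typically rests on (stated here as a general assumption, not as the paper's own formula): if each of N data blocks requires T_io for storage access and T_proc for processing, a fully pipelined schedule completes in roughly T_fill + N · max(T_io, T_proc), whereas a strictly serial schedule needs N · (T_io + T_proc). Pipelining therefore hides whichever of the two costs is smaller, and sustained throughput is bounded by the slower of the two stages.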

    SARS-CoV-2 (COVID-19) Vaccine Intentions in Kentucky

    Background: At the time of our writing, the COVID-19 pandemic continues to cause significant disruption to daily lives. In Kentucky, the burdens from this disease are higher, and vaccination rates for COVID-19 are lower, in comparison to the U.S. as a whole. Understanding vaccine intentions across key subpopulations is critical to increasing vaccination rates. Purpose: This study explores COVID-19 vaccine intentions in Kentucky across demographic subpopulations and also investigates the influences of attitudes and beliefs about COVID-19 on vaccine intention. Methods: A population-based survey of 1,459 Kentucky adults was conducted between January 26 and March 20, 2021, with over-sampling of black/African American and Latino/a residents, using online and telephonic modalities. Descriptive statistics characterize the sample and overall vaccine intentions and beliefs. Multivariable linear regression models probed relationships between demographics and vaccination intentions, as well as relationships between vaccination beliefs and vaccination intention. Results: Of the 1,299 unvaccinated respondents, 53% reported intent to get vaccinated, 16% had not decided, and 31% felt they would not get vaccinated. Lower vaccination intention was independently associated with age, lower educational attainment, black/African American race, lower income, Republican political affiliation, rural residence, and several beliefs: low vaccine safety, low vaccine efficacy, the rapidity of vaccine development, and mistrust of vaccine producers. Implications: Increasing COVID-19 vaccination rates will help end this pandemic. Findings from this study can be used to tailor information campaigns aimed at helping individuals make informed decisions about COVID-19 vaccination.

    Mapping interactions between the sustainable development goals: lessons learned and ways forward

    Pursuing integrated research and decision-making to advance action on the sustainable development goals (SDGs) fundamentally depends on understanding interactions between the SDGs, both negative ones (“trade-offs”) and positive ones (“co-benefits”). This quest, triggered by the 2030 Agenda, has, however, pointed to a gap in current research and policy analysis regarding how to think systematically about interactions across the SDGs. This paper synthesizes experiences and insights from the application of a new conceptual framework for mapping and assessing SDG interactions using a defined typology and characterization approach. Drawing on results from a major international research study applied to the SDGs on health, energy and the ocean, it analyses how interactions depend on key factors such as geographical context, resource endowments, time horizon and governance. The paper discusses the future potential, barriers and opportunities for applying the approach in scientific research, in policy making and in bridging the two through a global SDG Interactions Knowledge Platform as a key mechanism for assembling, systematizing and aggregating knowledge on interactions.

    Appropriating Risk Factors: The Reception of an American Approach to Chronic Disease in the two German States, c. 1950–1990

    Risk factors have become a dominant approach to the aetiology of chronic disease worldwide. The concept emerged in the new field of chronic disease epidemiology in the United States in the 1950s, around near-iconic projects such as the Framingham Heart Study. In this article I examine how chronic disease epidemiology and the risk factor concept were adopted and adapted in the two German states. I draw on case studies that illuminate the characteristics of the different contexts and the different takes on traditions in social hygiene, social medicine and epidemiology. I also look at critics of the risk factor approach in East and West Germany, who viewed risk factors as intellectually dishonest and a new surveillance tool.