171 research outputs found

    Legacy Lead Isotopic Signature in Riverine Sediments in Everett, Washington

    Highly industrialized areas near sensitive estuarine environments are a perennial focal point of environmental studies concerning releases of hazardous materials, including trace heavy metals. This study focuses on determining the legacy heavy metal distribution at a demolished ASARCO smelter site in north Everett, WA, near the mouth of the Snohomish River. The site has been the target of recent remedial actions under the Model Toxics Control Act (MTCA) Cleanup Regulation developed by the Washington Department of Ecology (WA DOE) due to widespread arsenic and lead contamination dating to the early 1900s. Previous research has shown a correlation between heavy metal concentrations in benthic sediment of Possession Sound and proximity to industrial sites, while suggesting that heavy metal concentrations varied with changes in Snohomish River discharge. Overall, lead concentrations were higher at sites closer to the mouth of the river. I therefore hypothesize that suspended solids in runoff from the ASARCO site are traceable downriver, perhaps into Possession Sound, and that their deposition correlates with river discharge. Future research will seek to establish the extent of dispersion of anthropogenic lead derived from the ASARCO site through lead isotope fingerprinting (Pb-204, Pb-206, Pb-207, Pb-208). Tracing lead as a geochemical indicator of one specific source will further illuminate heavy metal distribution in a highly complex riverine system with many potential point and non-point sources of contaminants. Slag samples from WA DOE will be used to determine the isotopic signature of the site. Three one-meter benthic sediment cores collected near the mouth of the Snohomish River, near the ASARCO site, and upriver will be analyzed using a specialized MC-ICP-MS instrument at the University of Washington in January and February of 2014. Analyzing multiple layers of each core will reveal a temporal trend based on contaminated sediment thickness at the sampling sites. Results will be analyzed in conjunction with Ecology Site Hazard Assessment data. This research will emphasize both spatial and temporal aspects by demonstrating the relationship between coherent structures in a tidally influenced fluvial system and lead deposition.
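
    To make the fingerprinting approach concrete, the sketch below (in Python) applies a simple two-endmember mixing model in 206Pb/207Pb ratio space to estimate the smelter-derived fraction of lead in a sediment layer. The endmember ratios and depths are hypothetical placeholders rather than measured values, and the linear ratio mixing ignores concentration weighting for simplicity.

        # Minimal two-endmember Pb isotope mixing sketch (illustrative values only).
        # Assumes the measured 206Pb/207Pb ratio of a sediment layer is a linear
        # mixture of a slag (smelter) endmember and a natural background endmember.

        def smelter_fraction(r_sample, r_slag, r_background):
            """Fraction of lead attributable to the slag endmember."""
            if r_slag == r_background:
                raise ValueError("endmember ratios must differ")
            f = (r_sample - r_background) / (r_slag - r_background)
            return min(max(f, 0.0), 1.0)  # clamp to [0, 1]

        if __name__ == "__main__":
            r_slag = 1.16        # 206Pb/207Pb of ASARCO slag (assumed)
            r_background = 1.21  # 206Pb/207Pb of uncontaminated regional sediment (assumed)
            for depth_cm, r_meas in [(5, 1.17), (40, 1.19), (90, 1.205)]:
                f = smelter_fraction(r_meas, r_slag, r_background)
                print(f"{depth_cm} cm depth: {f:.0%} smelter-derived Pb")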

    Understanding quantitative DCE-MRI of the breast : towards meaningful clinical application

    In most industrialized countries breast cancer will affect one out of eight women during her lifetime. In the USA, after continuously increasing for more than two decades, incidence rates have been slowly decreasing since 2001. Since 1990, death rates from breast cancer have steadily decreased in women, which is attributed to both earlier detection and improved treatment. Still, it is second only to lung cancer as a cause of cancer death in women. In this work we set out to improve early detection of breast cancer via quantitative analysis of magnetic resonance images (MRI). Screening and diagnosis of breast cancer are generally performed using X-ray mammography, possibly in conjunction with ultrasonography. However, MRI is becoming an important modality for screening women at high risk due to, for instance, hereditary gene mutations, as a problem-solving tool in case of indecisive mammographic and/or ultrasonic imaging, and for anti-cancer therapy assessment. In this work, we focused on MR imaging of the breast. More specifically, the dynamic contrast-enhanced (DCE) part of the protocol was highlighted, as well as radiological assessment of DCE-MRI data. The T_1-weighted (T_1: longitudinal relaxation time, a tissue property) signal-versus-time curve that can be extracted from the DCE-MRI series acquired at the time of and after injection of a T_1-shortening (shorter T_1 results in higher signal) contrast agent is usually assessed visually by the radiologist. For example, a fast initial rise to the peak (1-2 minutes post injection) followed by loss of signal within a time frame of about 5-6 minutes is a sign of malignancy, whereas a curve showing persistent (slow) uptake within the same time frame is a sign of benignity. This difference in contrast agent uptake pattern is related to physiological changes in tumorous tissue that, for instance, result in a stronger uptake of the contrast agent. However, this descriptive way of curve type classification is based on clinical statistics, not on knowledge about tumor physiology. We investigated pharmacokinetic modeling as a quantitative image analysis tool. Pharmacokinetics describes what happens to a substance (e.g. a drug or contrast agent) after it has been administered to a living organism. This includes the mechanisms of absorption and distribution. The terms in which these mechanisms are described are physiological and can therefore provide parameters describing the functioning of the tissue. This physiological aspect makes it an attractive approach to investigate (aberrant) tissue functioning. In addition, this type of analysis excludes confounding factors due to inter- and intra-patient differences in the systemic blood circulation, as well as differences in the injection protocol. In this work, we discussed the physiological basis and details of different types of pharmacokinetic models, with the focus on compartmental models. Practical implications such as obtaining an arterial input function and model parameter estimation were taken into account as well. A simulation study of the data-imposed limitations – in terms of temporal resolution and noise properties – on the complexity of pharmacokinetic models led to the insight that only one of the tested models, the basic Tofts model, is applicable to DCE-MRI data of the breast.
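
    As a rough illustration of the pharmacokinetic modeling discussed above, the sketch below (in Python) simulates a tissue concentration curve with the basic Tofts model, C_t(t) = Ktrans * integral of Cp(tau) * exp(-kep * (t - tau)) dtau. The biexponential plasma input function and the parameter values are assumptions chosen for illustration, not the protocol used in this work.

        import numpy as np

        # Basic Tofts model sketch: tissue concentration as the convolution of a
        # plasma input function with an exponential residue function.
        # The input function and parameter values are illustrative assumptions.

        def population_aif(t_min):
            """Simple biexponential plasma concentration curve (mM)."""
            return 5.0 * np.exp(-0.6 * t_min) + 1.0 * np.exp(-0.02 * t_min)

        def tofts_model(t_min, ktrans, kep):
            """C_t(t) = ktrans * integral_0^t Cp(tau) * exp(-kep * (t - tau)) dtau."""
            dt = t_min[1] - t_min[0]
            cp = population_aif(t_min)
            residue = np.exp(-kep * t_min)
            # Discrete convolution approximates the integral; truncate to the grid.
            return ktrans * np.convolve(cp, residue)[: len(t_min)] * dt

        if __name__ == "__main__":
            t = np.arange(0, 6, 5 / 60)                # 6 minutes sampled every 5 s
            ct = tofts_model(t, ktrans=0.25, kep=0.6)  # rate constants in 1/min
            print(f"peak ~ {ct.max():.2f} mM at t = {t[ct.argmax()]:.1f} min")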
For the basic Tofts model we further investigated the aspect of temporal resolution, because a typical diagnostic DCE-MRI scan of the breast is acquired at a rate of about one image volume per minute, whereas pharmacokinetic modeling usually requires a sampling time of less than 10 s. For this experiment we developed a new downsampling method using high-temporal-resolution raw k-space data to simulate what uptake curves would have looked like had they been acquired at lower temporal resolutions. We made use of preclinical animal data. With these data we demonstrated that the limit of 10 s can be stretched to about 1 min if the arterial input function (AIF, the input to the pharmacokinetic model) is inversely derived from a healthy reference tissue, instead of measured in an artery or taken from the literature. An important precondition for the application of pharmacokinetic modeling is knowledge of the relationship between the acquired DCE-MRI signal and the actual concentration of the contrast agent in the tissue. This relationship is not trivial because with MRI we measure the indirect effect of the contrast agent on water protons. To establish this relationship via calculation of T_1(t), we investigated both a theoretical and an empirical approach, making use of an in-house (University of Chicago) developed reference object that is scanned concurrently with the patient. The use of the calibration object can shorten the scan duration (an empirical approach requires fewer additional scans than an approach using a model of the acquisition technique) and can demonstrate whether theoretical approaches are valid. Moreover, we produced concentration images and estimated tissue proton density, also making use of the calibration object. Finally, via pharmacokinetic modeling and other MRI-derived measures, we partly revealed the actions of a novel therapeutic in a preclinical study. In particular, the anti-tumor activity of a single dose of liposomal prednisolone phosphate was investigated; this is an anti-inflammatory drug that has demonstrated tumor growth inhibition. The work presented in this thesis contributes to a meaningful clinical application and interpretation of quantitative DCE-MRI of the breast.
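
    The signal-to-concentration step described above is often handled by converting T_1(t) to contrast agent concentration through the standard relaxivity relation 1/T_1(t) = 1/T_1(0) + r_1 * C(t). The sketch below (in Python) applies that relation; the relaxivity and T_1 values are illustrative assumptions and the calibration-object procedure itself is not modeled.

        import numpy as np

        # Standard fast-exchange relaxivity relation mapping T1(t) to contrast
        # agent concentration: 1/T1(t) = 1/T1_0 + r1 * C(t).
        # The relaxivity r1 and the T1 values below are illustrative assumptions.

        def concentration_from_t1(t1_t, t1_0, r1=4.5):
            """Concentration (mM) from T1(t) and pre-contrast T1_0, both in seconds;
            r1 is the longitudinal relaxivity in 1/(mM*s)."""
            return (1.0 / np.asarray(t1_t) - 1.0 / t1_0) / r1

        if __name__ == "__main__":
            t1_pre = 1.4                                   # s, assumed pre-contrast T1
            t1_post = np.array([1.4, 0.9, 0.7, 0.8, 1.0])  # s, assumed T1(t) post injection
            print(np.round(concentration_from_t1(t1_post, t1_pre), 3))  # mM per time point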

    An examination of the coverage of Oregon22

    The World Athletics Championships Oregon22 marked the first World Championships held in the United States. Track and field officials hoped this event would help increase support for track and field in the United States; these officials created marketing plans and expected the American media covering Oregon22 to lead this growth movement. This researcher set out to explore the coverage of Oregon22 to test whether the coverage provided by American publications was equal for female and male athletes. Oregon22 was an “equal-participation event” with 24 events featuring female athletes and 24 events featuring male athletes. This researcher conducted an intensive descriptive analysis of 17 articles published by four prominent American publications — The New York Times, ESPN, NBC Sports, and The Los Angeles Times — as well as 11 articles published by the governing body’s website, World Athletics. The researcher chose to examine events featuring top American athletes, which included the 100 meters, 400-meter hurdles, shot put, javelin, and pole vault. The researcher also set out to test whether these four major U.S. publications and World Athletics employed a similar number of female and male journalists at Oregon22. In addition to examining the gender equity of the competitors and journalists at Oregon22, this researcher wanted to see whether Oregon22 did help to increase support for track and field in the United States. The findings suggest that the quantity of coverage of female and male athletes in these four major American publications and World Athletics was equal. This research also found that the American publications did not employ an equal number of female journalists, although World Athletics did. However, each of these findings is far from concrete, as the researcher struggled to find articles covering these five events in the four major American publications. The researcher also found articles published by The Los Angeles Times and The New York Times discussing the lack of coverage of Oregon22, as well as track and field officials’ dismay with the support of casual sports fans. These findings led the researcher to believe that Oregon22 might not have increased the following of track and field in the United States.

    Branding and competitive advantage at Estudio Contable S.K., Lima 2019

    The accounting firm Estudio Contable S.K. aimed to demonstrate the relationship between branding and the competitive advantage of the firm, Lima 2019. The research was applied in type, with a quantitative approach and an exploratory level. The study population consisted of the firm's clients in the Cercado de Lima district; a non-probabilistic convenience sample of 53 clients was obtained. The instrument consisted of 20 items applied to the firm's clients, and the technique used was the survey. The firm provides advisory and consulting services; its main activity is controlling the budgets, funds, annual inventory of fixed assets, and financial movements of its client companies. The information collected from the clients was processed with SPSS Statistics v. 25. A correlational analysis with non-parametric measures was carried out: the Kolmogorov-Smirnov test gave p = 0.000 for both variables, indicating the use of the inferential Spearman's Rho statistic, which yielded a very strong positive correlation of rho = 0.803, so the stated hypothesis is accepted. The results were positive within the organization and show that a relationship exists between the two variables.
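
    A minimal sketch (in Python, using SciPy) of the statistical procedure reported above: a Kolmogorov-Smirnov check on each variable followed by Spearman's Rho. The scores are randomly generated placeholders, not the study's survey responses.

        import numpy as np
        from scipy import stats

        # Sketch of the reported procedure: Kolmogorov-Smirnov test per variable,
        # then Spearman's rank correlation between the two variables.
        # The scores below are random placeholders, not the study's survey data.

        rng = np.random.default_rng(0)
        branding = rng.integers(20, 101, size=53).astype(float)   # summed item scores
        advantage = branding + rng.normal(0.0, 10.0, size=53)     # correlated by construction

        for name, x in [("branding", branding), ("competitive advantage", advantage)]:
            # KS test against a normal distribution fitted to the sample
            _, p = stats.kstest(x, "norm", args=(x.mean(), x.std(ddof=1)))
            print(f"{name}: Kolmogorov-Smirnov p = {p:.3f}")

        rho, p = stats.spearmanr(branding, advantage)
        print(f"Spearman rho = {rho:.3f}, p = {p:.3g}")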

    Mobile and responsive web application using a priority-based scheduling algorithm to support the handling of delivery service requests at Lavandería Quin

    Currently, from a general point of view, services have become a highly relevant topic, since most companies have become service providers; it is therefore very important to understand that when a customer purchases a service, they are promised something special, and the company must work to impress them and give them peace of mind. At the Quin laundry and dry cleaner, it was identified that there were long delays in deciding the order in which requests should be handled, as well as delays in obtaining the report of the number of requests per service stage; together, this meant that many customer requests were never received and therefore never handled, leaving customers unsatisfied with how their requests were attended to. For this reason, the decision was made to support the handling of delivery service requests by implementing a mobile and responsive web application using a priority-based scheduling algorithm, in order to solve the business problem. For the development of the project, technological tools such as Android Studio were used to implement the mobile application, and the Materialize framework was used for the web platform, following the Extreme Programming (XP) software development methodology, which allowed agile and easy interaction between the development team and the company. Thanks to the implementation of this solution, the time needed to order incoming requests was reduced to 5 seconds, the average time to obtain the report of the number of services per stage was reduced to 2 seconds, the number of unreceived requests was reduced to zero, and user satisfaction consequently increased by 78.67%.
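
    As a rough illustration of the priority-based scheduling idea described above, the sketch below (in Python) orders incoming delivery requests with a priority queue: express requests are dispatched first, with arrival order as a tie-breaker. The priority levels and example requests are assumptions for illustration, not the application's actual business rules.

        import heapq
        from dataclasses import dataclass, field

        # Priority-based scheduling sketch: requests are dispatched by priority
        # level first (lower number = more urgent), then by arrival order.
        # Priority levels and example requests are illustrative assumptions.

        @dataclass(order=True)
        class DeliveryRequest:
            priority: int                     # 1 = express, 2 = standard, 3 = deferred
            arrival_seq: int                  # tie-breaker: earlier requests first
            customer: str = field(compare=False)
            service: str = field(compare=False)

        queue = []
        incoming = [
            ("Ana", "standard wash", 2),
            ("Luis", "express dry cleaning", 1),
            ("Rosa", "deferred ironing", 3),
            ("Juan", "express wash", 1),
        ]
        for seq, (customer, service, priority) in enumerate(incoming):
            heapq.heappush(queue, DeliveryRequest(priority, seq, customer, service))

        while queue:
            req = heapq.heappop(queue)
            print(f"dispatch: {req.customer} ({req.service}), priority {req.priority}")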

    Parallel, distributed and GPU computing technologies in single-particle electron microscopy

    An introduction to the current paradigm shift towards concurrency in software

    Integrated Detector Control and Calibration Processing at the European XFEL

    The European X-ray Free Electron Laser is a high-intensity X-ray light source currently under construction in the Hamburg area that will provide spatially coherent X-rays in the energy range between 0.25 keV and 25 keV. The machine will deliver 10 trains/s, each consisting of up to 2700 pulses, at a 4.5 MHz repetition rate. The LPD, DSSC and AGIPD detectors are being developed to provide high dynamic-range Mpixel imaging capabilities at the mentioned repetition rates. A consequence of these detector characteristics is that they generate raw data volumes of up to 15 Gbyte/s. In addition, the detectors' on-sensor memory-cell and multi-/non-linear gain architectures pose unique challenges in data correction and calibration, requiring online access to operating conditions and control settings. We present how these challenges are addressed within XFEL's control and analysis framework Karabo, which integrates access to hardware conditions, acquisition settings (also using macros) and distributed computing. Implementation of the control and calibration software is mainly in Python, using self-optimizing PyCUDA code, NumPy and IPython parallel to achieve near-real-time performance for calibration applications.
    Comment: Proceeding ICALEPS 201
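
    As a rough sketch of the kind of per-pixel, per-memory-cell correction such a calibration pipeline performs, the NumPy snippet below subtracts a dark offset and divides by a gain constant, both looked up by the memory cell of each frame. The array shapes, constant values and function names are illustrative assumptions, not Karabo's actual calibration interface.

        import numpy as np

        # Vectorized per-pixel correction sketch: subtract a dark offset and divide
        # by a gain constant, both indexed by the memory cell of each frame.
        # Shapes and constant values are illustrative assumptions (reduced sizes).

        n_cells, ny, nx = 352, 128, 64
        rng = np.random.default_rng(1)
        offset = rng.normal(1000.0, 5.0, size=(n_cells, ny, nx))  # dark offset per cell/pixel
        gain = rng.normal(80.0, 2.0, size=(n_cells, ny, nx))      # ADU per photon (assumed)

        def correct_train(raw, cell_ids):
            """Apply offset/gain correction to one train of frames (n_frames, ny, nx)."""
            off = offset[cell_ids]            # constants for each frame's memory cell
            g = gain[cell_ids]
            return (raw - off) / g            # corrected signal in photon units

        if __name__ == "__main__":
            cell_ids = np.arange(64) % n_cells               # cells sampled in this train
            raw = offset[cell_ids] + 3.0 * gain[cell_ids]    # synthetic ~3-photon frames
            print(correct_train(raw, cell_ids).mean())       # prints ~3.0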