
    Thermoelastic Sound Source: Waveforms in a Sensing Application

    Photoacoustically generated sound pulses are widely used in NDT, NDE and sensing applications where a non-contact method is preferred. The generation mechanisms are relatively well known, including the types of waves generated, directional patterns, sound pressures and damage thresholds for the laser intensity [1]. The so-called thermoelastic regime is attractive for many applications despite its low conversion efficiency (typically below 0.1%), because the process is nondestructive to samples and the theory is well established [2,3,4]. The current study addresses the prediction of the temporal ultrasound pulse shape of an optimum sound generation scheme using a low-power, diode-pumped, high-repetition-rate Nd:YAG pulsed laser [5]. A model is proposed in which the radiation from the thermoelastic sound source is treated as an instantaneous piston source at the solid-fluid interface.
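
    As a rough illustration of the piston-source idealisation (a sketch, not the paper's actual model), the on-axis pressure of a baffled circular piston has an exact closed form: the radiated pulse is the piston velocity minus a delayed edge-wave replica. The values below for the pulse width, piston radius, observation distance and medium are illustrative assumptions.

```python
import numpy as np

# On-axis pressure of a baffled circular piston (exact closed form):
#   p(z, t) = rho * c * [ v(t - z/c) - v(t - R/c) ],  R = sqrt(z**2 + a**2)
rho, c = 1000.0, 1500.0      # water: density (kg/m^3) and sound speed (m/s)
a, z = 1.0e-3, 50.0e-3       # piston radius and on-axis distance (m), illustrative
tau = 10.0e-9                # Gaussian pulse width (s), standing in for the laser pulse

def v(t):
    """Normalised piston (surface) velocity pulse, assumed Gaussian."""
    return np.exp(-(t / tau) ** 2)

R = np.hypot(z, a)                                  # distance to the piston rim
t = z / c + np.linspace(-5 * tau, 15 * tau, 2000)   # window around the arrival time
p = rho * c * (v(t - z / c) - v(t - R / c))         # direct wave minus edge-wave replica
```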

    Maternal concentration of polychlorinated biphenyls and dichlorodiphenyl dichloroethylene and birth weight in Michigan fish eaters: a cohort study

    BACKGROUND: Studies on maternal exposure to polychlorinated biphenyls (PCBs) reported inconsistent findings regarding birth weight: some studies showed no effect, some reported decreased birth weight, and one study found an increase in weights. These studies used different markers of exposure, such as measurement of PCBs in maternal serum or questionnaire data on fish consumption. Additionally, related maternal exposures such as dichlorodiphenyl dichloroethylene (DDE), which may interfere with the PCB effect, were rarely taken into account. METHODS: Between 1973 and 1991, the Michigan Department of Community Health conducted three surveys to assess PCB and DDE serum concentrations in Michigan anglers. Through telephone interviews with parents, we gathered information on the birth characteristics of their offspring, focusing on deliveries that occurred after 1968. We used the maternal organochlorine (OC) measurement closest to the date of delivery as the exposure. A mother may have contributed more than one child, and because serum concentrations were measured in different surveys, the exposure value could differ between children of the same mother. The maternal DDE and PCB serum concentrations were categorized as follows: 0 to <5 µg/L, 5 to <15 µg/L, 15 to <25 µg/L, and ≥25 µg/L. Using repeated-measures models (Generalized Estimating Equations), we estimated the adjusted mean birth weight controlling for gender, birth order, gestational age and date of delivery, as well as maternal age, height, education, and smoking status. RESULTS: We identified 168 offspring who were born after 1968 and had maternal exposure information. We found a reduced birth weight for the offspring of mothers who had a PCB concentration ≥25 µg/L (adjusted birth weight = 2,958 g, p = 0.022). This group, however, comprised only seven observations. The association was not attenuated when we excluded preterm deliveries. The birth weight of offspring was increased in women with higher DDE concentrations when controlling for PCBs; however, this association was not statistically significant. CONCLUSION: Our results contribute to the body of evidence that high maternal serum PCB concentration may reduce birth weight in offspring. However, only a small proportion of mothers may actually be exposed to PCB concentrations ≥25 µg/L.
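
    A repeated-measures analysis of this kind can be set up with GEE in, for example, Python's statsmodels. The sketch below is a minimal illustration under assumed, hypothetical file and column names (fish_eater_births.csv, birth_weight, pcb_cat, mother_id, and so on); it is not the authors' code.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical data layout: one row per child, mother_id links siblings,
# pcb_cat is the categorized maternal serum PCB concentration.
df = pd.read_csv("fish_eater_births.csv")

model = smf.gee(
    "birth_weight ~ C(pcb_cat) + C(sex) + birth_order + gestational_age"
    " + delivery_year + maternal_age + maternal_height + C(education) + C(smoking)",
    groups="mother_id",                        # repeated measures within mothers
    data=df,
    cov_struct=sm.cov_struct.Exchangeable(),   # siblings treated as exchangeable
)
print(model.fit().summary())
```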

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors depends on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data, and we recommend adopting a pragmatic but scientifically better-founded approach to mixture risk assessment.
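
    The sensitivity to distributional choices is easy to demonstrate with a small Monte Carlo experiment. The sketch below assumes, purely for illustration, lognormal toxicokinetic and toxicodynamic sub-factors whose 95th percentiles each equal the conventional 10-fold default; the 95th percentile of their product then falls well short of 100, and the size of the gap changes with the assumed spread.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Assumed (illustrative) distributions: lognormal sub-factors whose 95th
# percentiles are pinned at the conventional 10-fold defaults.
z95 = 1.6449                    # standard-normal 95th percentile
sigma = 0.8                     # assumed log-scale spread; results shift with this choice
mu = np.log(10.0) - z95 * sigma # pins each sub-factor's 95th percentile at 10

tk = rng.lognormal(mu, sigma, n)   # toxicokinetic sub-factor
td = rng.lognormal(mu, sigma, n)   # toxicodynamic sub-factor

p95 = np.quantile(tk * td, 0.95)
print(p95)   # ~46, not 100: the 95th percentile of the product is not the
             # product of the 95th percentiles, and the value depends on sigma
```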

    Effect of a chemical manufacturing plant on community cancer rates

    BACKGROUND: We conducted a retrospective study to determine whether potential past exposure to dioxin had resulted in an increased incidence of cancer in people living near a former manufacturing plant in New South Wales, Australia. During its operation, from 1928 to 1970, by-products of the manufacturing process, including dioxin and other chemical waste, were dumped into wetlands and mangroves, discharged into a nearby bay and used to reclaim land along the foreshore, leaving a legacy of significant dioxin contamination. METHODS: We selected 20 Census Collector Districts within 1.5 kilometres of the former manufacturing plant as the study area. We obtained data on all cases of cancer and deaths from cancer in New South Wales from 1972 to 2001. We also compared rates for some cancer types that have been associated with dioxin exposure. Based on a person's residential address at the time of cancer diagnosis, or at the time of death due to cancer, geo-coding software was used to assign each case or death to a collector district. Age- and sex-specific population data were used to calculate standardised incidence ratios and standardised mortality ratios, comparing the study area to two comparison areas using indirect standardisation. RESULTS: During the 30-year study period, 1,106 cases of cancer and 524 deaths due to cancer were identified in the study area. This corresponds to age-sex standardised rates of 3.2 cases and 1.6 deaths per 1,000 person-years exposed. The study area had lower rates of cancer cases and deaths than the comparison areas. The incidence of and mortality from lung and bronchus carcinomas and haematopoietic cancers did not differ significantly from the comparison areas over the study period. There was no obvious geographical trend in the ratios when comparing individual collector districts to New South Wales according to distance from the potential source of dioxin exposure. CONCLUSION: This investigation found no evidence that dioxin contamination from this site resulted in increased cancer rates in the potentially exposed population living around the former manufacturing plant.
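
    For reference, indirect standardisation reduces to a short calculation once stratum-specific reference rates and the study area's person-years are in hand. The numbers below are illustrative assumptions, not the paper's data; the confidence interval uses a standard log-scale approximation.

```python
import numpy as np

# Illustrative stratum-specific reference rates (cases per person-year) and the
# study area's person-years in the matching age-sex strata (assumed values).
ref_rates = np.array([0.0009, 0.0020, 0.0050, 0.0130])
study_py  = np.array([120_000, 95_000, 70_000, 40_000])
observed  = 1106                     # observed cancer cases in the study area

expected = float(ref_rates @ study_py)    # expected cases under reference rates
sir = observed / expected                 # standardised incidence ratio
lo, hi = sir * np.exp(np.array([-1.0, 1.0]) * 1.96 / np.sqrt(observed))
print(f"SIR = {sir:.2f} (approx. 95% CI {lo:.2f} to {hi:.2f})")
```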

    Linear low-dose extrapolation for noncancer health effects is the exception, not the rule

    The nature of the exposure-response relationship has a profound influence on risk analyses. Several arguments have been proffered as to why all exposure-response relationships, for both cancer and noncarcinogenic end-points, should be assumed to be linear at low doses. We focused on three arguments that have been put forth for noncarcinogens. First, the general “additivity-to-background” argument proposes that if an agent enhances an already existing disease-causing process, then even small exposures increase disease incidence in a linear manner. This only holds if it is related to a specific mode of action with nonuniversal properties, which would not be expected for most noncancer effects. Second, the “heterogeneity in the population” argument states that variations in sensitivity among members of the target population tend to “flatten out and linearize” the exposure-response curve, but in fact this tends only to broaden, not linearize, the dose-response relationship. Third, it has been argued that a review of epidemiological evidence shows linear or no-threshold effects at low exposures in humans, despite nonlinear exposure-response relationships in the experimental dose range in animal testing for similar endpoints. It is more likely that this is attributable to exposure measurement error than to a true no-threshold association. Assuming that every chemical is toxic at high exposures and linear at low exposures does not comport with modern scientific knowledge of biology. There is no compelling evidence-based justification for a general assumption of low-exposure linearity; rather, case-specific mechanistic arguments are needed.
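
    The point about heterogeneity can be made concrete with a toy threshold model: if individual thresholds vary lognormally, the population dose-response is the lognormal CDF, which widens as heterogeneity grows but, at sufficiently low doses, stays far below any straight line through the origin. The parameters below are illustrative, not drawn from the paper.

```python
from scipy.stats import lognorm

# Toy threshold model: each individual has a threshold dose; thresholds vary
# lognormally across the population (median = 1, illustrative). The population
# response at dose d is the fraction of thresholds below d, i.e. the lognormal
# CDF. Widening the spread broadens the sigmoid, but the low-dose tail still
# falls far below a line drawn through the origin and the 50% response point.
d = 1e-4                                     # dose at 0.01% of the median threshold
for spread in (0.5, 1.0, 2.0):               # log-scale SD: more spread = more heterogeneity
    actual = lognorm.cdf(d, s=spread, scale=1.0)
    linear = 0.5 * d                         # linear extrapolation from the median dose
    print(f"spread={spread}: response={actual:.1e}  linear extrapolation={linear:.1e}")
```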

    Estimation in a Competing Risks Proportional Hazards Model Under Length-biased Sampling With Censoring

    What population does the sample represent? The answer to this question is of crucial importance when estimating a survivor function in duration studies. As is well known, in a stationary population, survival data obtained from a cross-sectional sample taken from the population at time t_0 represent not the target density f(t) but its length-biased version, proportional to t f(t) for t > 0. The problem of estimating the survivor function from such length-biased samples becomes more complex, and more interesting, in the presence of competing risks and censoring. This paper lays out a sampling scheme related to a mixed Poisson process and develops nonparametric estimators of the survivor function of the target population, assuming that the two independent competing risks have proportional hazards. Two cases are considered: with and without independent censoring before length-biased sampling. In each case, the weak convergence of the process generated by the proposed estimator is proved. A well-known study of the duration in power of political leaders is used to illustrate our results. Finally, a simulation study is carried out in order to assess the finite-sample behaviour of our estimators.
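
    Setting competing risks and censoring aside, the core bias-correction is simple to simulate: draw from the length-biased density and reweight each observation by 1/t (a Vardi/Cox-type inverse-length estimator). The exponential target below is an illustrative assumption, not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative target: exponential durations with mean 2, so S(t) = exp(-t/2).
# A cross-sectional sample from a stationary population sees the length-biased
# density proportional to t * f(t), which here is Gamma(shape=2, scale=2).
biased = rng.gamma(2.0, 2.0, 50_000)

# Inverse-length weighting undoes the bias:
#   F_hat(t) = sum_{t_i <= t} (1 / t_i) / sum_i (1 / t_i)
w = 1.0 / biased
for t in (1.0, 2.0, 4.0):
    s_hat = 1.0 - w[biased <= t].sum() / w.sum()
    print(f"t={t}: S_hat={s_hat:.3f}  true={np.exp(-t / 2.0):.3f}")
```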

    Quantitative analyses and modelling to support achievement of the 2020 goals for nine neglected tropical diseases

    Quantitative analysis and mathematical models are useful tools in informing strategies to control or eliminate disease. Currently, there is an urgent need to develop these tools to inform policy to achieve the 2020 goals for neglected tropical diseases (NTDs). In this paper we give an overview of a collection of novel model-based analyses which aim to address key questions on the dynamics of transmission and control of nine NTDs: Chagas disease, visceral leishmaniasis, human African trypanosomiasis, leprosy, soil-transmitted helminths, schistosomiasis, lymphatic filariasis, onchocerciasis and trachoma. Several common themes resonate throughout these analyses, including: the importance of epidemiological setting for the success of interventions; targeting groups who are at highest risk of infection or re-infection; and reaching populations who are not accessing interventions and may act as a reservoir for infection. The results also highlight the challenge of maintaining elimination ‘as a public health problem’ when true elimination is not reached. The models elucidate the factors that may contribute most to the persistence of disease, and the papers discuss the requirements for eventually achieving true elimination, where that is possible. Overall, this collection presents new analyses to inform current control initiatives. These papers form a base from which further development of the models, and more rigorous validation against a variety of datasets, can help to give more detailed advice. At present, the models’ predictions are being considered as the world prepares for a final push towards the control or elimination of neglected tropical diseases by 2020.
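
    As a toy illustration of the untreated-reservoir theme (a sketch with assumed parameters, not any of the papers' models), a two-group SIS model with random mixing shows how a subgroup outside a treatment programme can sustain transmission and re-seed the treated group.

```python
import numpy as np

# Toy two-group SIS model: a fraction `covered` receives mass treatment that
# adds an extra recovery rate; random mixing couples the groups, so the
# untreated group can act as a reservoir that re-seeds the treated one.
beta, gamma, treat = 0.4, 0.1, 0.3    # transmission, natural recovery, treatment rates
covered = 0.8                          # programme coverage (assumed)
pop = np.array([covered, 1.0 - covered])
I = np.array([0.3, 0.3])               # prevalence in [covered, uncovered] groups

dt = 0.1
for _ in range(20_000):                # iterate to (near) steady state
    force = beta * (pop @ I)           # random-mixing force of infection
    recovery = np.array([gamma + treat, gamma])
    I = I + dt * (force * (1.0 - I) - recovery * I)
print(I)  # the uncovered group settles at high prevalence and re-seeds the covered group
```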