
    Estimating summary measures of health: a structured workbook approach

    BACKGROUND: Summary measures of health that combine mortality and morbidity into a single indicator are being estimated in the Canadian context for approximately 200 diseases and conditions. To manage the large amount of data and calculations for this many diseases, we have developed a structured workbook system with easy-to-use tools. We expect this system will be attractive to researchers from other countries or regions of Canada who are interested in estimating the health-adjusted life years (HALYs) lost to premature mortality and year-equivalents lost to reduced functioning, as well as population attributable fractions (PAFs) associated with risk factors. This paper describes the workbook system using cancers as an example, and includes the entire system as a free, downloadable package. METHODS: The workbook system was developed in Excel and runs on a personal computer. It is a database system that stores data on population structure, mortality, incidence, distributions of cases entering a multitude of health states, durations of time spent in health states, preference scores that weight for severity, life table estimates of life expectancies, and risk factor prevalence and relative risks. The tools are Excel files with embedded macro programs. The main tool generates workbooks that estimate HALY, one per disease, by copying data from the database into a pre-defined template. Other tools summarize the HALY results across diseases for easy analysis. RESULTS: The downloadable zip file contains the database files initialized with Canadian data for cancers, the tools, templates and workbooks that estimate PAF, and a user guide. The workbooks that estimate HALY are generated from the system at a rate of approximately one minute per disease. The resulting workbooks are self-contained and can be used directly to explore the details of a particular disease. Results can be discounted at different rates through simple parameter modification.
CONCLUSION: The structured workbook approach offers researchers an efficient, easy-to-use, and easy-to-understand set of tools for estimating HALY and PAF summary measures for their country or region of interest.
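    As a sketch of the arithmetic such workbooks automate, the snippet below applies Levin's standard formula for the population attributable fraction and a constant-rate discounting of annual HALY losses. The prevalence, relative-risk, and discount-rate values are illustrative assumptions, not the Canadian data shipped in the package.

```python
def paf(prevalence: float, relative_risk: float) -> float:
    """Levin's formula: PAF = p(RR - 1) / (1 + p(RR - 1))."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)


def discounted_halys(annual_haly_loss: float, years: int, rate: float) -> float:
    """Sum annual HALY losses, discounted at a constant yearly rate."""
    return sum(annual_haly_loss / (1.0 + rate) ** t for t in range(years))


# Illustrative values: 25% exposure prevalence, relative risk of 2.0,
# and a 10-year stream of 1 HALY/year discounted at 3% per year.
print(round(paf(0.25, 2.0), 3))                   # 0.2
print(round(discounted_halys(1.0, 10, 0.03), 3))
```

    Changing the `rate` argument reproduces the "discounted at different rates through simple parameter modification" behaviour the abstract describes.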

    The Underestimation Of Egocentric Distance: Evidence From Frontal Matching Tasks

    There is controversy over the existence, nature, and cause of error in egocentric distance judgments. One proposal is that the systematic biases often found in explicit judgments of egocentric distance along the ground may be related to recently observed biases in the perceived declination of gaze (Durgin & Li, Attention, Perception, & Psychophysics, in press). To measure perceived egocentric distance nonverbally, observers in a field were asked to position themselves so that their distance from one of two experimenters was equal to the frontal distance between the experimenters. Observers placed themselves too far away, consistent with egocentric distance underestimation. A similar experiment was conducted with vertical frontal extents. Both experiments were replicated in panoramic virtual reality. Perceived egocentric distance was quantitatively consistent with angular bias in perceived gaze declination (1.5 gain). Finally, an exocentric distance-matching task was contrasted with a variant of the egocentric matching task. The egocentric matching data approximate a constant compression of perceived egocentric distance with a power function exponent of nearly 1; exocentric matches had an exponent of about 0.67. The divergent pattern between egocentric and exocentric matches suggests that they depend on different visual cues.
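    The power-function description of matching data can be made concrete with a small least-squares fit in log-log space. The code below is an illustrative sketch on synthetic data (a constant 0.7 compression, which has exponent 1, versus data generated with exponent 0.67), not the study's measurements.

```python
import math


def fit_power(distances, matches):
    """Fit matches = k * distances**n by linear regression in log-log space."""
    xs = [math.log(d) for d in distances]
    ys = [math.log(m) for m in matches]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    scale = math.exp(mean_y - slope * mean_x)
    return scale, slope  # k, exponent


# Synthetic matches: constant compression vs. a compressive exponent.
distances = [2.0, 4.0, 8.0, 16.0]
egocentric = [0.7 * d for d in distances]   # constant compression, exponent 1
exocentric = [d ** 0.67 for d in distances]  # exponent 0.67

print(fit_power(distances, egocentric))  # exponent near 1, scale near 0.7
print(fit_power(distances, exocentric))  # exponent near 0.67
```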

    Haptic search with finger movements: using more fingers does not necessarily reduce search times

    Two haptic serial search tasks were used to investigate how the separations between items, and the number of fingers used to scan them, influence the search time and search strategy. In both tasks participants had to search for a target (cross) between a fixed number of non-targets (circles). The items were placed in a straight line. The target’s position was varied within blocks, and inter-item separation was varied between blocks. In the first experiment participants used their index finger to scan the display. As expected, search time depended on target position as well as on item separation. For larger separations participants’ movements were jerky, resembling ‘saccades’ and ‘fixations’, while for the shortest separation the movements were smooth. When only considering time in contact with an item, search times were the same for all separation conditions. Furthermore, participants never continued their movement after they encountered the target. These results suggest that participants did not use the time during which they were moving between the items to process information about the items. The search times were a little shorter than those in a static search experiment (Overvliet et al. in Percept Psychophys, 2007a), where multiple items were presented to the fingertips simultaneously. To investigate whether this is because the finger was moving or because only one finger was stimulated, we conducted a second experiment in which we asked participants to put three fingers in line and use them together to scan the items. Doing so increased the time in contact with the items for all separations, so search times were presumably longer in the static search experiment because multiple fingers were involved. This may be caused by the time that it takes to switch from one finger to the other.

    Size and shape constancy in consumer virtual reality

    With the increase in popularity of consumer virtual reality headsets, for research and other applications, it is important to understand the accuracy of 3D perception in VR. We investigated the perceptual accuracy of near-field virtual distances using a size and shape constancy task, in two commercially available devices. Participants wore either the HTC Vive or the Oculus Rift and adjusted the size of a virtual stimulus to match the geometric qualities (size and depth) of a physical stimulus they were able to refer to haptically. The judgments participants made allowed for an indirect measure of their perception of the egocentric, virtual distance to the stimuli. The data show under-constancy and are consistent with research from carefully calibrated psychophysical techniques. There was no difference in the degree of constancy found in the two headsets. We conclude that consumer virtual reality headsets provide a sufficiently high degree of accuracy in distance perception to allow them to be used confidently in future experimental vision science, and other research applications in psychology.

    Medicalization of eating and feeding

    A variety of developments over the past century have produced the conditions in which eating and feeding are transformed from practices embedded in social or cultural relations into explicit medical practices. The rise of medical science, expansion of the pharmaceutical and food industries, escalating concern over diet‐related diseases and conditions, and growing anxiety over infant and childhood development have contributed to a process of medicalization. Medicalization is a sociological concept that analyses the expansion of medical terminology, interventions, or practitioners into areas of life that were previously considered outside the medical sphere. For instance, under‐eating has previously been defined using theological language, as an act of fasting demonstrating a saintly character. Such practices are now understood through medical terms of anorexia nervosa, malnutrition, or general diagnoses such as “eating disorders not otherwise specified.” Individuals engaged in under‐ or over‐eating practices are increasingly defined by medical concepts (anorexia nervosa and obesity) and treated in medical spaces (hospitals, clinics, or rehabilitation centres) through medical interventions (pharmaceuticals, surgery, psychotherapy, or dietary regimens). Likewise, infant feeding (breast or formula) is understood as a practice that requires monitoring and instruction from medical practitioners. Further, eating in general is progressively invested with medical significance. Foods and diets are touted as possessing a therapeutic or health enhancing capacity that indicates an individual’s or population’s present and future health. Due to the high regard for, and influence of, medical science in the West, medicalization studies primarily focus on Western contexts. Medicalization does have an impact on non‐Western societies and the developing world; however, its influence emanates from Western biomedicine, industries, and policies.
There is important work to be done in examining the process of medicalization in non‐Western contexts; however, this article is limited to the Western context (Hunt, 1999). To analyse the medicalization of eating and feeding it is important to first sketch the theoretical and historical background of medicalization as a sociological concept. The relationship between eating and medicine is extensive. In order to focus the discussion, three examples are used – under‐eating, over‐eating and infant feeding. This background focuses the analysis of the forces driving the medicalization of eating and feeding. Finally, in elaborating the influences and consequences of the medicalization of eating and feeding, some of the central ethical implications are identified and discussed.

    Renal function in HIV-infected children and adolescents treated with tenofovir disoproxil fumarate and protease inhibitors

    Background: Kidney disease is an important complication in HIV-infected people, and this may be related to infection or antiretroviral therapy (ART). Our aim was to assess renal function in HIV-infected paediatric patients, who may be particularly affected and are likely to take ART for longer than adults, and to investigate the long-term role of tenofovir disoproxil fumarate (TDF) alone or co-administered with ritonavir-boosted protease inhibitors (PI). Methods: Serum creatinine, phosphate and potassium levels, with estimated glomerular filtration rate (eGFR), were prospectively evaluated for 2 years in a cohort of HIV-infected children and adolescents (age 9-18) on ART, and data were analyzed according to exposure to TDF or simultaneous TDF and PI. Results: Forty-nine patients were studied (57% female, mean age 14). Sixty-three percent were treated with ART containing TDF (Group A) and 37% without TDF (Group B); 47% with concomitant use of TDF and PI (Group C) and 53% without this combination (Group D). The groups did not differ in age, gender or ethnicity. Median creatinine increased in the entire cohort and in all the groups analyzed; eGFR decreased from 143.6 mL/min/1.73 m² at baseline to 128.9 after 2 years (p = 0.006) in the entire cohort. Three patients presented a mild eGFR reduction, all of whom were on TDF+PI. Phosphatemia decreased significantly in the entire cohort (p = 0.0003) and in the TDF+PI group (p = 0.0128) after 2 years. Five patients (10%) developed hypophosphatemia (Division of Acquired Immune Deficiency AE grade 1 or 2), and four of them were on TDF+PI. Conclusions: Renal function decrease and hypophosphatemia occur over time in HIV-infected children and adolescents on ART. The association with co-administration of TDF and PI appears weak, and further studies are warranted.
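    For readers unfamiliar with paediatric eGFR, the bedside Schwartz estimate is one common approach in this age group. The abstract does not state which equation was used, so the snippet below (constant k = 0.413, with illustrative height and creatinine values) is an assumption for illustration only.

```python
def egfr_schwartz(height_cm: float, serum_creatinine_mg_dl: float,
                  k: float = 0.413) -> float:
    """Bedside Schwartz estimate: eGFR (mL/min/1.73 m^2) = k * height / SCr."""
    return k * height_cm / serum_creatinine_mg_dl


# Hypothetical adolescent: 150 cm tall, serum creatinine 0.45 mg/dL.
baseline = egfr_schwartz(150.0, 0.45)
# A small rise in creatinine lowers the estimate, the same direction of
# change as the cohort-wide eGFR decline reported above.
follow_up = egfr_schwartz(150.0, 0.50)
print(round(baseline, 1), round(follow_up, 1))  # 137.7 123.9
```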

    Direct enzymatic esterification of cotton and Avicel with wild-type and engineered cutinases

    In this work, the surface of cellulose, either Avicel or cotton fabric, was modified using cutinases without any previous treatment to swell or to solubilise the polymer. Aiming at further improving cutinase ester synthase activity on cellulose, an engineered cutinase was investigated. Wild-type cutinase from Fusarium solani and its fusion with the carbohydrate-binding module N1 from Cellulomonas fimi were able to esterify the hydroxyl groups of cellulose with distinct efficiencies depending on the acid substrate/solvent system used, as shown by titration and by ATR-FTIR. The carbonyl stretching peak area increased significantly after enzymatic treatment during 72 h at 30 °C. Cutinase treatment resulted in relative increases of 31 and 9 % when octanoic acid and vegetable oil were used as substrates, respectively. Cutinase-N1 treatment resulted in relative increases of 11 and 29 % in the peak area when octanoic acid and vegetable oil were used as substrates, respectively. The production and application of cutinase fused with the domain N1 as a cellulose ester synthase, here reported for the first time, is therefore an interesting strategy to pursue. This work was co-funded by the European Social Fund through the management authority POPH and FCT, Postdoctoral fellowship reference: SFRH/BPD/47555/2008. The authors also want to thank Doctor Raul Machado for his valuable help on FTIR spectral data treatment.

    Explicit Logic Circuits Discriminate Neural States

    The magnitude and apparent complexity of the brain's connectivity have left explicit networks largely unexplored. As a result, the relationship between the organization of synaptic connections and how the brain processes information is poorly understood. A recently proposed retinal network that produces neural correlates of color vision is refined and extended here to a family of general logic circuits. For any combination of high and low activity in any set of neurons, one of the logic circuits can receive input from the neurons and activate a single output neuron whenever the input neurons have the given activity state. The strength of the output neuron's response is a measure of the difference between the smallest of the high inputs and the largest of the low inputs. The networks generate correlates of known psychophysical phenomena. These results follow directly from the most cost-effective architectures for specific logic circuits and the minimal cellular capabilities of excitation and inhibition. The networks function dynamically, making their operation consistent with the speed of most brain functions. The networks show that well-known psychophysical phenomena do not require extraordinarily complex brain structures, and that a single network architecture can produce apparently disparate phenomena in different sensory systems.
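    The response measure described above, the difference between the smallest of the high inputs and the largest of the low inputs, can be abstracted in a few lines of code. This is a hypothetical sketch of that measure only (clipped at zero, since a neuron cannot respond negatively), not the authors' neural circuit model.

```python
def discrimination_response(high_inputs, low_inputs):
    """Output strength tracks min(high) - max(low): positive only when
    every 'high' line exceeds every 'low' line, and larger when the two
    groups are better separated. Clipped at zero (no negative firing)."""
    return max(0.0, min(high_inputs) - max(low_inputs))


# Well-separated activity state: all 'high' inputs above all 'low' inputs.
print(discrimination_response([0.9, 0.8], [0.2, 0.3]))
# Overlapping state: one 'high' input below a 'low' input gives no response.
print(discrimination_response([0.9, 0.25], [0.2, 0.3]))  # 0.0
```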

    Conserving the Stage: Climate Change and the Geophysical Underpinnings of Species Diversity

    Conservationists have proposed methods for adapting to climate change that assume species distributions are primarily explained by climate variables. The key idea is to use the understanding of species-climate relationships to map corridors and to identify regions of faunal stability or high species turnover. An alternative approach is to adopt an evolutionary timescale and ask ultimately what factors control total diversity, so that over the long run the major drivers of total species richness can be protected. Within a single climatic region, the temperate area encompassing all of the Northeastern U.S. and Maritime Canada, we hypothesized that geologic factors may take precedence over climate in explaining diversity patterns. If geophysical diversity does drive regional diversity, then conserving geophysical settings may offer an approach to conservation that protects diversity under both current and future climates. Here we tested how well geology predicts the species diversity of 14 US states and three Canadian provinces, using a comprehensive new spatial dataset. Results of linear regressions of species diversity on all possible combinations of 23 geophysical and climatic variables indicated that four geophysical factors (the number of geological classes, latitude, elevation range, and the amount of calcareous bedrock) predicted species diversity with certainty (adj. R² = 0.94). To confirm the species-geology relationships we ran an independent test using 18,700 location points for 885 rare species and found that 40% of the species were restricted to a single geology. Moreover, each geology class supported 5–95 endemic species and chi-square tests confirmed that calcareous bedrock and extreme elevations had significantly more rare species than expected by chance (P<0.0001), strongly corroborating the regression model.
Our results suggest that protecting geophysical settings will conserve the stage for current and future biodiversity and may be a robust alternative to species-level predictions.
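    The model-selection criterion behind the reported adj. R² = 0.94 penalises each added predictor. The helper below shows the standard adjusted-R² formula, using the 17 study units (14 states plus 3 provinces) and 4 retained predictors from the abstract; the raw R² of 0.955 is an assumed value chosen for illustration, and the subset count shows the scale of screening all combinations of 23 variables.

```python
import math


def adjusted_r2(r2: float, n_obs: int, n_predictors: int) -> float:
    """Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - p - 1)."""
    return 1.0 - (1.0 - r2) * (n_obs - 1) / (n_obs - n_predictors - 1)


# 17 states/provinces, 4 predictors; raw R^2 of 0.955 is an assumed value.
print(round(adjusted_r2(0.955, 17, 4), 2))  # 0.94

# Number of non-empty predictor subsets screened over 23 variables:
print(sum(math.comb(23, k) for k in range(1, 24)))  # 2**23 - 1 = 8388607
```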

    Cellular Radiosensitivity: How much better do we understand it?

    Purpose: Ionizing radiation exposure gives rise to a variety of lesions in DNA that result in genetic instability and potentially tumorigenesis or cell death. Radiation exerts its effects on DNA by direct interaction or by radiolysis of H2O that generates free radicals or aqueous electrons capable of interacting with and causing indirect damage to DNA. While the various lesions arising in DNA after radiation exposure can contribute to the mutagenising effects of this agent, the potentially most damaging lesion is the DNA double strand break (DSB) that contributes to genome instability and/or cell death. Thus, in many cases, failure to recognise and/or repair this lesion determines the radiosensitivity status of the cell. DNA repair mechanisms including homologous recombination (HR) and non-homologous end-joining (NHEJ) have evolved to protect cells against DNA DSB. Cells with mutations in proteins that constitute these repair pathways are characterised by radiosensitivity and genome instability. Defects in a number of these proteins also give rise to genetic disorders that feature not only genetic instability but also immunodeficiency, cancer predisposition, neurodegeneration and other pathologies. Conclusions: In the past fifty years our understanding of the cellular response to radiation damage has advanced enormously, with insight being gained from a wide range of approaches extending from more basic early studies to the sophisticated approaches used today. In this review we discuss our current understanding of the impact of radiation on the cell and the organism gained from the array of past and present studies, and attempt to provide an explanation for what it is that determines the response to radiation.