83 research outputs found

    Education inequalities in adult all-cause mortality: first national data for Australia using linked census and mortality data

    BACKGROUND: National linked mortality and census data have not previously been available for Australia. We estimated education-based mortality inequalities from linked census and mortality data that are suitable for international comparisons. METHODS: We used the Australian Bureau of Statistics Death Registrations to Census file, with data on deaths (2011-2012) linked probabilistically to census data (linkage rate 81%). To assess validity, we compared mortality rates by age group (25-44, 45-64, 65-84 years), sex and area-inequality measures to those based on complete death registration data. We used negative binomial regression to quantify inequalities in all-cause mortality in relation to five levels of education ['Bachelor degree or higher' (highest) to 'no Year 12 and no post-secondary qualification' (lowest)], separately by sex and age group, adjusting for single year of age and correcting for linkage bias and missing education data. RESULTS: Mortality rates and area-based inequality estimates were comparable to published national estimates. Men aged 25-84 years with the lowest education had age-adjusted mortality rates 2.20 [95% confidence interval (CI): 2.08‒2.33] times those of men with the highest education. Among women, the rate ratio was 1.64 (1.55‒1.74). Rate ratios were 3.87 (3.38‒4.44) in men and 2.57 (2.15‒3.07) in women aged 25-44 years, decreasing to 1.68 (1.60‒1.76) in men and 1.44 (1.36‒1.53) in women aged 65-84 years. Absolute education inequalities increased with age. One in three to four deaths (31%) was associated with less than Bachelor level education. CONCLUSIONS: These linked national data enabled valid estimates of education inequality in mortality suitable for international comparisons. The magnitude of relative inequality is substantial and similar to that reported for other high-income countries.
    Rosemary J Korda, Nicholas Biddle, John Lynch, James Eynstone-Hinkins, Kay Soga, Emily Banks, et al.
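The rate ratios with 95% confidence intervals reported above can be illustrated with a crude Poisson approximation, where the standard error of the log rate ratio is sqrt(1/d1 + 1/d2). This is a minimal sketch with hypothetical death and person-year counts, not the authors' negative binomial model with its age adjustment and bias corrections:

```python
import math

def rate_ratio_ci(deaths_low, py_low, deaths_high, py_high, z=1.96):
    """Crude mortality rate ratio (lowest vs highest education group)
    with an approximate 95% CI: SE(log RR) = sqrt(1/d1 + 1/d2)."""
    rr = (deaths_low / py_low) / (deaths_high / py_high)
    se = math.sqrt(1.0 / deaths_low + 1.0 / deaths_high)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts chosen to give RR = 2.20, for illustration only
rr, lo, hi = rate_ratio_ci(440, 100_000, 200, 100_000)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The published analysis additionally adjusts for single year of age and corrects for linkage bias, which a crude two-group comparison like this cannot capture.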

    Distribution of Major Health Risks: Findings from the Global Burden of Disease Study

    BACKGROUND: Most analyses of risks to health focus on the total burden of their aggregate effects. The distribution of risk-factor-attributable disease burden, for example by age or exposure level, can inform the selection and targeting of specific interventions and programs, and increase cost-effectiveness. METHODS AND FINDINGS: For 26 selected risk factors, expert working groups conducted comprehensive reviews of data on risk-factor exposure and hazard for 14 epidemiological subregions of the world, by age and sex. Age-sex-subregion-population attributable fractions were estimated and applied to the mortality and burden of disease estimates from the World Health Organization Global Burden of Disease database. Where possible, exposure levels were assessed as continuous measures, or as multiple categories. The proportion of risk-factor-attributable burden in different population subgroups, defined by age, sex, and exposure level, was estimated. For major cardiovascular risk factors (blood pressure, cholesterol, tobacco use, fruit and vegetable intake, body mass index, and physical inactivity) 43%–61% of attributable disease burden occurred between the ages of 15 and 59 y, and 87% of alcohol-attributable burden occurred in this age group. Most of the disease burden for continuous risks occurred in those with only moderately raised levels, not among those with levels above commonly used cut-points, such as those with hypertension or obesity. Of all disease burden attributable to being underweight during childhood, 55% occurred among children 1–3 standard deviations below the reference population median, and the remainder occurred among severely malnourished children, who were three or more standard deviations below median. CONCLUSIONS: Many major global risks are widely spread in a population, rather than restricted to a minority. Population-based strategies that seek to shift the whole distribution of risk factors often have the potential to produce substantial reductions in disease burden.
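The population attributable fractions used in this kind of analysis follow a standard formula for categorical exposures: PAF = (Σ p_i·RR_i − 1) / (Σ p_i·RR_i), where p_i is the prevalence of exposure level i and RR_i its relative risk (reference level RR = 1). A minimal sketch with hypothetical prevalences and relative risks, not the study's actual inputs:

```python
def paf(prevalences, rel_risks):
    """Population attributable fraction across exposure categories.
    PAF = (sum(p_i * RR_i) - 1) / sum(p_i * RR_i); the reference
    (unexposed) category enters with RR = 1."""
    s = sum(p * rr for p, rr in zip(prevalences, rel_risks))
    return (s - 1.0) / s

# Hypothetical three-level exposure: 60% reference, 30% moderate, 10% high
print(round(paf([0.6, 0.3, 0.1], [1.0, 1.5, 3.0]), 3))
```

Note how the moderate-exposure group, being three times as prevalent as the high-exposure group, can contribute comparably to the total attributable burden; this is the distributional point the abstract makes about moderately raised risk-factor levels.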

    Formal System Processing of Juveniles: Effects on Delinquency

    Justice practitioners have tremendous discretion in how to handle juvenile offenders. Police officers, district attorneys, juvenile court intake officers, juvenile and family court judges, and other officials can decide whether a juvenile should be “officially processed” by the juvenile justice system, diverted from the system to a program, counseling or other services, or released altogether with no further action. Which strategy leads to the best outcomes for juveniles is an important policy question, not only in the United States: many other nations face the same decision of whether to formally process or divert juvenile offenders. A number of randomized experiments in the juvenile courts have examined the impact of juvenile system processing, and gathering them together in a systematic fashion provides rigorous evidence about the effect of this decision on subsequent offending. Our objective is to answer the question: does juvenile system processing reduce subsequent delinquency? Based on the evidence presented in this report, juvenile system processing appears not to have a crime control effect and, across all measures, appears to increase delinquency. This was true across measures of prevalence, incidence, severity, and self-report. Given the additional financial costs associated with system processing (especially when compared with doing nothing) and the lack of evidence for any public safety benefit, jurisdictions should review their policies regarding the handling of juveniles.

    Large expert-curated database for benchmarking document similarity detection in biomedical literature search

    Document recommendation systems for locating relevant literature have mostly relied on methods developed a decade ago. This is largely due to the lack of a large offline gold-standard benchmark of relevant documents that covers a variety of research fields, such that newly developed literature search techniques can be compared, improved and translated into practice. To overcome this bottleneck, we have established the RElevant LIterature SearcH consortium, consisting of more than 1500 scientists from 84 countries, who have collectively annotated the relevance of over 180 000 PubMed-listed articles with regard to their respective seed (input) article(s). The majority of annotations were contributed by highly experienced, original authors of the seed articles. The collected data cover 76% of all unique PubMed Medical Subject Headings descriptors. No systematic biases were observed across different experience levels, research fields or time spent on annotations. More importantly, annotations of the same document pairs contributed by different scientists were highly concordant. We further show that the three representative baseline methods used to generate recommended articles for evaluation (Okapi Best Matching 25, Term Frequency-Inverse Document Frequency and PubMed Related Articles) had similar overall performances. Additionally, we found that these methods each tend to produce distinct collections of recommended articles, suggesting that a hybrid method may be required to completely capture all relevant articles. The established database server located at https://relishdb.ict.griffith.edu.au is freely available for the downloading of annotation data and the blind testing of new methods. We expect that this benchmark will be useful for stimulating the development of new powerful techniques for title and title/abstract-based search engines for relevant articles in biomedical research.
    Peer reviewed
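Okapi BM25, one of the three baselines named above, scores a document against a query by weighting term frequency with inverse document frequency and a length normalization. A minimal self-contained sketch over toy tokenized documents (my own simplified implementation with standard defaults k1 = 1.5, b = 0.75, not the consortium's evaluation code):

```python
import math
from collections import Counter

def bm25_scores(query_terms, docs, k1=1.5, b=0.75):
    """Score each tokenized document in `docs` against `query_terms`
    with Okapi BM25 (idf with +1 smoothing to keep weights positive)."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    df = Counter()                       # document frequency per term
    for d in docs:
        df.update(set(d))
    scores = []
    for d in docs:
        tf = Counter(d)
        s = 0.0
        for t in query_terms:
            if t not in tf:
                continue
            idf = math.log((N - df[t] + 0.5) / (df[t] + 0.5) + 1.0)
            norm = tf[t] + k1 * (1 - b + b * len(d) / avgdl)
            s += idf * tf[t] * (k1 + 1) / norm
        scores.append(s)
    return scores

docs = [["gene", "expression", "cancer"],
        ["protein", "folding"],
        ["cancer", "genome", "expression", "analysis"]]
print(bm25_scores(["cancer", "expression"], docs))
```

Because BM25, TF-IDF and PubMed Related Articles rank by different signals, their top results overlap only partially, which is exactly why the abstract suggests a hybrid method.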

    Imaging of subsurface lineaments in the southwestern part of the Thrace Basin from gravity data

    Linear anomalies, as an indicator of the structural features of some geological bodies, are very important for the interpretation of gravity and magnetic data. In this study, an image processing technique known as the Hough transform (HT) algorithm is described for determining invisible boundaries and extensions in gravity anomaly maps. The Hough function implements the Hough transform, used to extract straight lines or circles from two-dimensional potential field images, and operates between two domains: image space and Hough (parameter) space. Each nonzero point in the image domain is transformed to a sinusoid in the Hough domain; conversely, each point in Hough space corresponds to a straight line or circle in image space. Lineaments are delineated from the straight lines recovered in the image domain. An application of the Hough transform to the Bouguer anomaly map of the southwestern part of the Thrace Basin, NW Turkey, shows the effectiveness of the proposed approach. Based on geological and gravity data, the structural features in the southwestern part of the Thrace Basin are investigated by applying the proposed approach and the Blakely and Simpson method. Lineaments identified by these approaches are generally in good accordance with previously mapped surface faults.
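The point-to-sinusoid mapping described above can be sketched in a few lines: every nonzero image point (x, y) votes along rho = x·cos(theta) + y·sin(theta) in an accumulator over (rho, theta), and peaks in the accumulator mark straight lines. This is a generic minimal Hough transform on synthetic points, not the paper's implementation or its gravity data:

```python
import math

def hough_lines(points, thetas, rho_res=1.0, rho_max=64):
    """Vote each point along its sinusoid rho = x*cos(t) + y*sin(t)
    in (rho, theta) space; the peak accumulator cell is the strongest line.
    Returns (votes, rho, theta) of that peak."""
    n_rho = int(2 * rho_max / rho_res) + 1
    acc = [[0] * len(thetas) for _ in range(n_rho)]
    for x, y in points:
        for j, th in enumerate(thetas):
            rho = x * math.cos(th) + y * math.sin(th)
            i = int(round((rho + rho_max) / rho_res))  # shift rho to a bin index
            if 0 <= i < n_rho:
                acc[i][j] += 1
    votes, i, j = max((acc[i][j], i, j)
                      for i in range(n_rho) for j in range(len(thetas)))
    return votes, i * rho_res - rho_max, thetas[j]

# Hypothetical "anomaly" pixels lying on the vertical line x = 10
pts = [(10, y) for y in range(20)]
thetas = [math.radians(t) for t in range(0, 180, 5)]
votes, rho, theta = hough_lines(pts, thetas)
print(votes, rho, math.degrees(theta))
```

All 20 collinear points vote into the same (rho = 10, theta = 0) cell, so the peak directly recovers the line's normal parameterization; scattered noise points spread their votes and never accumulate a comparable peak.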