
    Day of Archaeology 2011–2017: Global Community, Public Engagement, and Digital Practice.

    The Day of Archaeology (http://www.dayofarchaeology.com) was a volunteer-led international archaeological blogging event that ran from 2011 to 2017. The project asked people who define themselves as archaeologists to submit one or more blog posts about their working day on a chosen day in June or July. This article explores the history of the Day of Archaeology project and the practicalities of running a large-scale collaborative blogging project, before examining some of the topics covered in the posts. An assessment of the impact of the project follows. Overall, we hope in this work to answer some of the basic questions regarding this type of collaborative, online, global engagement – what we did, who we reached, what they talked about – and also to provide some insights for any other similar initiatives that may follow us in the future.

    Transnational cult paratexts: Exploring audience readings of Tartan’s Asia Extreme brand

    Recent scholarship on the branding of contemporary cult Asian cinema for western audiences has frequently drawn on Said’s seminal treatise Orientalism as a means to critique sensationalist marketing materials. Whilst the excessive character of paratexts produced by film distributors such as Tartan clearly facilitates such readings, in this article I argue that this oft-repeated criticism becomes problematic when employed indiscriminately to theorise, by extension, the audiences for these films. Drawing on a recent empirical study of responses to Asian Extreme cinema and its distribution in the UK and North America, I offer an intervention in this debate by constructing a more nuanced interpretation of the ways in which cult audiences articulate their attraction to cinematic representations of cultural difference.

    Monogenic diseases that can be cured by liver transplantation

    While the prevalence of most diseases caused by single-gene mutations is low and defines them as rare conditions, taken together, monogenic diseases account for approximately 10 in every 1000 births according to the World Health Organisation. Orthotopic liver transplantation (LT) could offer a therapeutic option in monogenic diseases in two ways: by substituting for an injured liver or by supplying a tissue that can replace a mutant protein. In this respect, LT may be regarded as the correction of a disease at the level of the dysfunctional protein. Monogenic diseases that involve the liver represent a heterogeneous group of disorders. In conditions associated with predominant liver parenchymal damage (e.g., genetic cholestatic disorders, Wilson's disease, hereditary hemochromatosis, tyrosinemia, α1 antitrypsin deficiency), hepatic complications are the major source of morbidity and LT not only replaces a dysfunctional liver but also corrects the genetic defect and effectively cures the disease. A second group includes liver-based genetic disorders characterised by an architecturally near-normal liver (urea cycle disorders, Crigler-Najjar syndrome, familial amyloid polyneuropathy, primary hyperoxaluria type 1, atypical haemolytic uremic syndrome-1). In these defects, extrahepatic complications are the main source of morbidity and mortality while liver function is relatively preserved. Combined transplantation of other organs may be required, and other surgical techniques, such as domino and auxiliary liver transplantation, have been attempted. In a third group of monogenic diseases, the underlying genetic defect is expressed at a systemic level and liver involvement is just one of the clinical manifestations. In these conditions, LT might only be partially curative since the abnormal phenotype is maintained by extrahepatic synthesis of the toxic metabolites (e.g., methylmalonic acidemia, propionic acidemia).
This review focuses on principles of diagnosis, management and LT results in both paediatric and adult populations of selected liver-based monogenic diseases, which represent examples of different transplantation strategies, driven by the understanding of the expression of the underlying genetic defect. © 2013 European Association for the Study of the Liver.

    Community prevalence of SARS-CoV-2 in England from April to November, 2020: results from the ONS Coronavirus Infection Survey

    Background: Decisions about the continued need for control measures to contain the spread of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) rely on accurate and up-to-date information about the number of people testing positive for SARS-CoV-2 and risk factors for testing positive. Existing surveillance systems are generally not based on population samples and are not longitudinal in design. Methods: Samples were collected from individuals aged 2 years and older living in private households in England that were randomly selected from address lists and previous Office for National Statistics surveys in repeated cross-sectional household surveys with additional serial sampling and longitudinal follow-up. Participants completed a questionnaire and did nose and throat self-swabs. The percentage of individuals testing positive for SARS-CoV-2 RNA was estimated over time by use of dynamic multilevel regression and poststratification, to account for potential residual non-representativeness. Potential changes in risk factors for testing positive over time were also assessed. The study is registered with the ISRCTN Registry, ISRCTN21086382. Findings: Between April 26 and Nov 1, 2020, results were available from 1,191,170 samples from 280,327 individuals; 5231 samples were positive overall, from 3923 individuals. The percentage of people testing positive for SARS-CoV-2 changed substantially over time, with an initial decrease between April 26 and June 28, 2020, from 0·40% (95% credible interval 0·29–0·54) to 0·06% (0·04–0·07), followed by low levels during July and August, 2020, before substantial increases at the end of August, 2020, with percentages testing positive above 1% from the end of October, 2020. Having a patient-facing role and working outside the home were important risk factors for testing positive for SARS-CoV-2 at the end of the first wave (April 26 to June 28, 2020), but not in the second wave (from the end of August to Nov 1, 2020).
Age (young adults, particularly those aged 17–24 years) was an important initial driver of increased positivity rates in the second wave. For example, the estimated percentage of individuals testing positive was more than six times higher in those aged 17–24 years than in those aged 70 years or older at the end of September, 2020. A substantial proportion of infections were in individuals not reporting symptoms around their positive test (45–68%, dependent on calendar time). Interpretation: Important risk factors for testing positive for SARS-CoV-2 varied substantially between the part of the first wave that was captured by the study (April to June, 2020) and the first part of the second wave of increased positivity rates (end of August to Nov 1, 2020), and a substantial proportion of infections were in individuals not reporting symptoms, indicating that continued monitoring for SARS-CoV-2 in the community will be important for managing the COVID-19 pandemic moving forwards.
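The poststratification step described above can be sketched in miniature. This is a simplified illustration only: the study itself used dynamic multilevel regression and poststratification (MRP), and every stratum, positivity rate, and population share below is invented for illustration.

```python
# Simplified poststratification sketch. All strata, positivity rates, and
# population shares are invented; the study used dynamic multilevel
# regression and poststratification (MRP), a richer version of this idea.

def poststratify(stratum_rates, population_shares):
    """Reweight per-stratum positivity rates by known population shares,
    so the overall estimate reflects the population rather than the
    achieved (possibly non-representative) sample."""
    assert set(stratum_rates) == set(population_shares)
    assert abs(sum(population_shares.values()) - 1.0) < 1e-9
    return sum(rate * population_shares[s] for s, rate in stratum_rates.items())

# Invented age strata: positivity as a fraction, and population shares.
rates = {"2-16": 0.004, "17-24": 0.012, "25-69": 0.006, "70+": 0.002}
shares = {"2-16": 0.18, "17-24": 0.10, "25-69": 0.60, "70+": 0.12}

overall = poststratify(rates, shares)  # population-weighted positivity
```

If, say, the 17–24 band were over-represented in the achieved sample, a naive pooled average would overstate community positivity; weighting each stratum by its population share corrects for that residual non-representativeness.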

    A global horizon scan of the future impacts of robotics and autonomous systems on urban ecosystems

    Technology is transforming societies worldwide. A major innovation is the emergence of robotics and autonomous systems (RAS), which have the potential to revolutionize cities for both people and nature. Nonetheless, the opportunities and challenges associated with RAS for urban ecosystems have yet to be considered systematically. Here, we report the findings of an online horizon scan involving 170 expert participants from 35 countries. We conclude that RAS are likely to transform land use, transport systems and human–nature interactions. The prioritized opportunities were primarily centred on the deployment of RAS for the monitoring and management of biodiversity and ecosystems. Fewer challenges were prioritized. Those that were emphasized concerned waste from unrecovered RAS, and the quality and interpretation of RAS-collected data. Although the future impacts of RAS for urban ecosystems are difficult to predict, examining potentially important developments early is essential if we are to avoid detrimental consequences and fully realize the benefits.

    Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study

    Ristola M. is a member of the following working groups: DAD Study Grp; Royal Free Hosp Clin Cohort; INSIGHT Study Grp; SMART Study Grp; ESPRIT Study Grp. Background: Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice. Methods and Findings: A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with ≥3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR >60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as confirmed (>3 mo apart) eGFR <60 ml/min/1.73 m2. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7–6.7; median follow-up 6.1 y, range 0.3–9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group, with correspondingly higher risks in the medium and high risk groups (risk score ≥5, 505 events).
Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166–3,367); NNTH was 202 (95% CI 159–278) and 21 (95% CI 19–23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506–1,462), 88 (95% CI 69–121), and 9 (95% CI 8–10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 individuals developed CKD (3.7%) during 18,376 PYFU (median follow-up 7.4 y, range 0.3–12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 individuals developed CKD (1.6%) during 8,452 PYFU (median follow-up 4.1 y, range 0.6–8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria. Conclusions: Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance for patients and clinicians to weigh the benefits of certain antiretrovirals against the risk of CKD and to identify those at greatest risk of CKD. Peer reviewed.
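The additive risk-score construction and the NNTH figures above can be sketched as follows. This is a minimal illustration of the general technique only: the per-factor point values, category cut-offs, and example risks below are invented and are not the published D:A:D values. NNTH itself is simply the reciprocal of the absolute risk increase.

```python
# Sketch of an additive risk score and number needed to harm (NNTH).
# Point values, cut-offs, and example risks are invented for illustration;
# they are NOT the published D:A:D values.

POINTS = {                      # invented per-factor points; protective
    "age_over_50": 4,           # categories can carry negative points,
    "intravenous_drug_use": 2,  # which is why total scores (median -2
    "hepatitis_c": 1,           # in the study) can be negative
    "low_baseline_egfr": 3,
    "female": 1,
    "high_cd4_nadir": -2,
    "hypertension": 2,
    "diabetes": 2,
    "cardiovascular_disease": 2,
}

def risk_score(factors):
    """Sum the scaled points for the factors a patient has."""
    return sum(POINTS[f] for f in factors)

def risk_group(score):
    """Map a total score to a category (invented cut-offs)."""
    if score < 0:
        return "low"
    return "medium" if score < 5 else "high"

def nnth(risk_with_drug, risk_without_drug):
    """NNTH = 1 / absolute risk increase: how many patients must start
    the drug for one additional CKD event over the time horizon."""
    return 1.0 / (risk_with_drug - risk_without_drug)
```

For example, a hypothetical 5-year CKD risk of 1.0% on a nephrotoxic drug versus 0.5% off it gives an NNTH of 200: roughly one extra CKD event per 200 patients started on the drug.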