5,192 research outputs found

    Assessing research impact potential: using the transdisciplinary Outcome Spaces Framework with New Zealand’s National Science Challenges

    © 2020 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group. Calls for science to have impact as well as excellence have been loud and clear from research funders, policymakers and research institutions for some time. Transdisciplinary research (TDR) is expected to deliver impact by connecting scientists with stakeholders and end users to co-produce knowledge in response to complex issues. While New Zealand’s science system is geared to deliver excellence, its capability to also deliver impact beyond academic institutions is less clear. This paper has two interconnected aims. First, it presents findings from testing innovations to the TDR Outcome Spaces Framework (OSF+) with four National Science Challenges (NSCs); we conclude that OSF+ is a useful tool for planning for multiple outcomes and assessing the potential for impact. Second, it presents findings on how using OSF+ to assess research impact potential revealed a range of implicit theories of change (i.e. catalyst, deficit, engagement and collaboration) across the NSCs. The findings raise important questions about the prospects for New Zealand’s science system to deliver the envisaged and needed levels of research impact when current institutional settings, expectations, recognition systems, career paths and measures of success are not yet able to adequately accommodate TDR.

    Evaluation of ‘Eyelander’: a video game designed to engage children and young people with homonymous visual field loss in compensatory training

    Introduction: Rehabilitation can improve visual outcomes for adults with acquired homonymous visual field loss. However, it is unclear whether (re)habilitation improves visual outcomes for children, because previous training schedules have been tiresome, uninteresting, and failed to keep them engaged. In this study we assessed whether children and young people with homonymous visual field loss would adhere to six weeks of unsupervised compensatory training using a specialised video game. Methods: Participants aged between 7 and 25 with homonymous visual field loss completed table-top assessments of visual search across four site visits. Two baseline assessments separated by four weeks evaluated spontaneous improvements before training began. Participants were then given a copy of the video game to use unsupervised at home for six weeks. Two follow-up assessments separated by four weeks were then conducted to evaluate immediate and acutely maintained effects of training. Results: 15 candidates met the inclusion/exclusion criteria, 9 participated, and 8 completed the study. Participants completed an average of 5.6 hours of unsupervised training over the six weeks. Improvements on in-game metrics plateaued during week 3 of training. The time taken to find objects during table-top activities improved by an average of 24% (95% CI [2%, 46%]) after training. Discussion: The findings demonstrate that children and young people with homonymous visual field loss will engage with gamified compensatory training, and can improve visual outcomes with less time commitment than adults have required with non-gamified training in previous studies. Appropriately powered, randomised controlled trials are required to evaluate the validity and generalisability of the observed training effects.
Implications for practitioners: We conclude that (re)habilitation specialists can use specialist video games and gamification to engage children and young people with homonymous visual field loss in long-term unsupervised training schedules.

    Impact of the COVID-19 pandemic on timeliness and equity of measles, mumps and rubella vaccinations in North East London: a longitudinal study using electronic health records.

    OBJECTIVES: To quantify the effect of the COVID-19 pandemic on the timeliness of, and geographical and sociodemographic inequalities in, receipt of the first measles, mumps and rubella (MMR) vaccination. DESIGN: Longitudinal study using primary care electronic health records. SETTING: 285 general practices in North East London. PARTICIPANTS: Children born between 23 August 2017 and 22 September 2018 (pre-pandemic cohort) or between 23 March 2019 and 1 May 2020 (pandemic cohort). MAIN OUTCOME MEASURE: Receipt of timely MMR vaccination between 12 and 18 months of age. METHODS: We used logistic regression to estimate the ORs (95% CIs) of receipt of a timely vaccination, adjusting for sex, deprivation, ethnic background and Clinical Commissioning Group. We plotted choropleth maps of the proportion receiving timely vaccinations. RESULTS: Timely MMR receipt fell by 4.0 percentage points (95% CI: 3.4 to 4.6), from 79.2% (78.8% to 79.6%) in the pre-pandemic cohort (n=33 226; 51.3% boys) to 75.2% (74.7% to 75.7%) in the pandemic cohort (n=32 446; 51.4% boys). After adjustment, timely vaccination was less likely in the pandemic cohort (OR 0.79; 0.76 to 0.82) and in children from black (0.70; 0.65 to 0.76) or mixed/other (0.77; 0.72 to 0.82) ethnic backgrounds or with missing ethnicity data (0.77; 0.74 to 0.81), and more likely in girls (1.07; 1.03 to 1.11) and children from South Asian backgrounds (1.39; 1.30 to 1.48). Children living in the least deprived areas were more likely to receive a timely MMR (2.09; 1.78 to 2.46), but there was no interaction between cohort and deprivation (Wald statistic: 3.44; p=0.49). The proportion of neighbourhoods where less than 60% of children received timely vaccination increased from 7.5% to 12.7% during the pandemic. CONCLUSIONS: The COVID-19 pandemic was associated with a significant fall in timely MMR receipt and with increased geographical clustering of measles susceptibility in an area of historically low and inequitable MMR coverage.
Immediate action is needed to avert measles outbreaks and to support primary care in delivering timely and equitable vaccinations.
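As a consistency check on the reported fall, the unadjusted odds ratio of timely vaccination in the pandemic versus pre-pandemic cohort can be back-calculated from the cohort sizes and percentages above. This is a minimal sketch: the event counts are reconstructed from the reported proportions (not taken from the study data), and the study's published OR of 0.79 is additionally adjusted for sex, deprivation, ethnic background and Clinical Commissioning Group.

```python
import math

# Illustrative counts reconstructed from the abstract:
# 79.2% of 33,226 and 75.2% of 32,446 children vaccinated on time
pre_yes, pre_no = 26315, 6911    # pre-pandemic cohort: timely / not timely
pan_yes, pan_no = 24399, 8047    # pandemic cohort: timely / not timely

# Unadjusted odds ratio of timely vaccination, pandemic vs pre-pandemic
or_unadj = (pan_yes / pan_no) / (pre_yes / pre_no)

# Wald 95% CI, computed on the log-odds-ratio scale
se = math.sqrt(1/pre_yes + 1/pre_no + 1/pan_yes + 1/pan_no)
lo = math.exp(math.log(or_unadj) - 1.96 * se)
hi = math.exp(math.log(or_unadj) + 1.96 * se)

print(round(or_unadj, 2), round(lo, 2), round(hi, 2))  # 0.8 0.77 0.83
```

The crude estimate of about 0.80 (0.77 to 0.83) sits close to the adjusted OR of 0.79 reported in the abstract, suggesting the covariate adjustment moved the estimate only slightly.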

    Formation of hot tear under controlled solidification conditions

    Aluminum alloy 7050 (AA7050) is known for its superior mechanical properties and thus finds application in the aerospace industry. The vertical direct-chill (DC) casting process is typically employed for producing this alloy. Despite its advantages, AA7050 is considered a "hard-to-cast" alloy because of its propensity to cold cracking. This type of crack occurs catastrophically and is difficult to predict. Previous research suggested that such a crack could be initiated by undeveloped hot tears (microscopic hot tears) formed during the DC casting process if they reach a certain critical size. However, this hypothesis has not yet been validated. Therefore, a method to produce a hot tear of controlled size is needed as part of the verification studies. In the current study, we demonstrate a method with the potential to control the size of the created hot tear in a small-scale solidification process. We found that by changing two variables, the cooling rate and the displacement compensation rate, the size of the hot tear during solidification can be modified in a controlled way. An X-ray microtomography characterization technique is used to quantify the created hot tear. We suggest that feeding and strain rate during DC casting are more important than the force exerted on the sample for the formation of a hot tear. In addition, we show that there are four domains of hot-tear development in the explored experimental window: compression, microscopic hot tear, macroscopic hot tear, and failure. The samples produced in this study will be used in subsequent experiments that simulate cold-cracking conditions to confirm the earlier proposed model. This research was carried out within the Materials innovation institute (www.m2i.nl) research framework, project no. M42.5.09340.

    Incidence of SARS-CoV-2 infection according to baseline antibody status in staff and residents of 100 long-term care facilities (VIVALDI): a prospective cohort study

    Background: SARS-CoV-2 infection represents a major challenge for long-term care facilities (LTCFs), and many residents and staff are seropositive following persistent outbreaks. We aimed to investigate the association between baseline SARS-CoV-2 antibody status and subsequent infection in this population. Methods: We did a prospective cohort study of SARS-CoV-2 infection in staff (aged <65 years) and residents (aged >65 years) at 100 LTCFs in England between Oct 1, 2020, and Feb 1, 2021. Blood samples were collected between June and November, 2020, at baseline, and 2 and 4 months thereafter, and tested for IgG antibodies to SARS-CoV-2 nucleocapsid and spike proteins. PCR testing for SARS-CoV-2 was done weekly in staff and monthly in residents. Cox regression was used to estimate hazard ratios (HRs) of a PCR-positive test by baseline antibody status, adjusted for age and sex, and stratified by LTCF. Findings: 682 residents from 86 LTCFs and 1429 staff members from 97 LTCFs met study inclusion criteria. At baseline, IgG antibodies to nucleocapsid were detected in 226 (33%) of 682 residents and 408 (29%) of 1429 staff members. 93 (20%) of 456 residents who were antibody-negative at baseline had a PCR-positive test (infection rate 0·054 per month at risk) compared with four (2%) of 226 residents who were antibody-positive at baseline (0·007 per month at risk). 111 (11%) of 1021 staff members who were antibody-negative at baseline had PCR-positive tests (0·042 per month at risk) compared with ten (2%) of 408 staff members who were antibody-positive at baseline (0·009 per month at risk). The risk of PCR-positive infection was substantially lower for residents who were antibody-positive at baseline than for those who were antibody-negative (adjusted HR [aHR] for antibody-positive vs antibody-negative 0·15, 95% CI 0·05–0·44; p=0·0006), and likewise for staff who were antibody-positive at baseline compared with staff who were antibody-negative (aHR 0·39, 0·19–0·82; p=0·012).
12 of 14 reinfected participants had available data on symptoms, and 11 of these were symptomatic. Antibody titres to spike and nucleocapsid proteins were comparable in PCR-positive and PCR-negative cases. Interpretation: The presence of IgG antibodies to nucleocapsid protein was associated with a substantially reduced risk of reinfection in staff and residents for up to 10 months after primary infection.
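The reported per-month infection rates can be cross-checked against the event counts. A short sketch, assuming the rates are simple events per person-month at risk (as the abstract presents them), recovers the implied person-time for residents and a crude rate ratio:

```python
# Residents, from the abstract: event counts and rates per month at risk
events_neg, rate_neg = 93, 0.054   # antibody-negative at baseline
events_pos, rate_pos = 4, 0.007    # antibody-positive at baseline

# Implied person-months at risk in each group (events / rate)
pm_neg = events_neg / rate_neg
pm_pos = events_pos / rate_pos

# Crude rate ratio, antibody-positive vs antibody-negative
rate_ratio = rate_pos / rate_neg

print(round(pm_neg), round(pm_pos), round(rate_ratio, 2))  # 1722 571 0.13
```

The crude ratio of about 0.13 is close to the stratified, age- and sex-adjusted HR of 0.15 reported for residents, which is reassuring given the very different construction of the two estimates.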

    Modelling environmental changes and effects on wild-caught species in Queensland. Environmental drivers.

    We report the findings of a collaborative research project designed to identify and measure the effects of environmental drivers on the abundance and population dynamics of key Queensland fishery species. The project was co-funded by the Commonwealth Government’s Fisheries Research and Development Corporation (FRDC) and carried out by a multi-disciplinary team of scientists from the University of Queensland (UQ), the Queensland Department of Agriculture and Fisheries (DAF) and the Australian Institute of Marine Science (AIMS). The research team applied modern statistical, data science and modelling techniques in combination with biological insights into the life cycles of the three target species. Background: With increasing evidence that conditions in the marine environment are changing rapidly, it is becoming ever more important to understand how these changes may affect the population dynamics and abundance of important fish stocks. Understanding the influence of environmental conditions can give resource managers greater certainty that the risks of overfishing (under adverse environmental conditions) and underharvesting (under favourable conditions) are accounted for. This project aimed to identify the environmental factors that may be influencing the recruitment, catchability or productivity of Snapper, Pearl Perch and Spanner Crab stocks in Queensland. Results from this work will support sustainable management of Queensland’s fisheries by directly informing the assessment and management of these key species within Queensland waters.

    Multi-Epoch Multiwavelength Spectra and Models for Blazar 3C 279

    Of the blazars detected by EGRET in GeV gamma rays, 3C 279 is not only the best observed by EGRET, but also one of the best-monitored at lower frequencies. We have assembled eleven spectra, from GHz radio through GeV gamma rays, from the time intervals of EGRET observations. Although some of the data have appeared in previous publications, most are new, including data taken during the high states in early 1999 and early 2000. All of the spectra show substantial gamma-ray contribution to the total luminosity of the object; in a high state, the gamma-ray luminosity dominates over that at all other frequencies by a factor of more than 10. There is no clear pattern of time correlation; different bands do not always rise and fall together, even in the optical, X-ray, and gamma-ray bands. The spectra are modeled using a leptonic jet, with combined synchrotron self-Compton + external Compton gamma-ray production. Spectral variability of 3C 279 is consistent with variations of the bulk Lorentz factor of the jet, accompanied by changes in the spectral shape of the electron distribution. Our modeling results are consistent with the UV spectrum of 3C 279 being dominated by accretion disk radiation during times of low gamma-ray intensity.

    Moving health upstream in urban development: Reflections on the operationalization of a transdisciplinary case study

    This paper describes the development, conceptualization, and implementation of a transdisciplinary research pilot, the aim of which is to understand how human and planetary health could become a priority for those who control the urban development process. Key challenges include a significant dislocation between academia and the real world, alongside systemic failures in valuation and assessment mechanisms. The National Institutes of Health four-phase model of transdisciplinary team-based research is drawn on and adapted to reflect on what has, and has not, worked well operationally. Results underscore the need for experienced academics who are open to new collaborations and ways of working; clarity of leadership without compromising exploration; clarification of the poorly understood "impacts interface" and navigation toward effective real-world impact; and acknowledgement of the additional time and resources required for transdisciplinary research and for "nonacademic" researchers. Having practitioner-researchers as part of the research leadership team requires rigorous reflective practice and effective management, but it can also ensure breadth of transdisciplinary outlook as well as constant course correction toward real-world impact. It is important for the research community to better understand the opportunities and limitations offered by knowledge intermediaries in terms of function, specialism, and experience.