Measuring access: how accurate are patient-reported waiting times?
Introduction: A national audit of waiting times in England’s genitourinary medicine clinics measures patient access. Data are collected by patient questionnaires, which rely upon patients’ recollection of first contact with health services, often several days previously. The aim of this study was to assess the accuracy of patient-reported waiting times.
Methods: Data on true waiting times were collected at the time of patient booking over a three-week period and compared with patient-reported data collected upon clinic attendance. Factors contributing to patient inaccuracy were explored.
Results: Of 341 patients providing initial data, 255 attended; 207 as appointments and 48 ‘walk-in’. The accuracy of patient-reported waiting times overall was 52% (133/255). 85% of patients (216/255) correctly identified themselves as seen within or outside of 48 hours. 17% of patients (17/103) seen within 48 hours reported a longer waiting period, whereas 20% of patients (22/108) reporting waits under 48 hours were seen outside that period. Men were more likely to overestimate their waiting time (10.4% versus 3.1%, p<0.02). The sensitivity of patient-completed questionnaires as a tool for assessing waiting times of less than 48 hours was 83.5%. The specificity and positive predictive value were 85.5% and 79.6%, respectively.
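The sensitivity, specificity, and positive predictive value quoted above follow directly from the counts in the Results. A minimal sketch reconstructing the 2×2 table (treating "reported under 48 hours" as the test and "actually seen within 48 hours" as the reference standard; the cell counts are derived from the abstract's figures, not taken from the original dataset):

```python
# 2x2 table reconstructed from the reported counts:
# 103 patients seen within 48 h, of whom 17 reported a longer wait;
# 108 patients reported waits under 48 h, of whom 22 were seen outside it;
# 216/255 patients classified themselves correctly overall.
tp = 103 - 17   # seen within 48 h and reported within 48 h
fn = 17         # seen within 48 h but reported a longer wait
fp = 22         # reported under 48 h but seen outside that period
tn = 216 - tp   # remaining correct classifications (seen and reported outside)

sensitivity = tp / (tp + fn)   # 86/103 ~ 83.5%
specificity = tn / (tn + fp)   # 130/152 ~ 85.5%
ppv = tp / (tp + fp)           # 86/108 ~ 79.6%

print(f"sensitivity={sensitivity:.1%}, specificity={specificity:.1%}, PPV={ppv:.1%}")
```

Note that the four cells sum to 255 (103 seen within 48 hours plus 152 seen outside), consistent with the 255 attenders.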
Conclusion: The overall accuracy of patient-reported waiting times was poor. Although nearly one in six patients misclassified themselves as being seen within or outside of 48 hours, given the under- and overreporting rates observed, the overall impact on Health Protection Agency waiting time data is likely to be limited.
Active Learning Using Student Recorded Monologues
This paper will detail the planning, execution and feedback of an initiative to promote active learning and improve student speaking and writing using student recorded monologues. After determining the goals of the project, a rubric was created to clarify these goals and assessment criteria for both students and instructors. A 15-lesson unit was then developed based on four personal narratives. For each narrative, students completed a series of preparation activities before producing both written and spoken versions of their narratives and uploading these in the form of online assignments. Students were encouraged to review the rubric before, during, and after work on their texts. Feedback from students and instructors on completion of the program indicated that the goals were clear and relevant, that there was a high level of student engagement with the unit, and that the rubric was an effective tool for guiding the learning process.
Social cognition and executive functioning predictors of supervisors’ appraisal of interpersonal behaviour in the workplace following acquired brain injury
BACKGROUND: Social cognition and executive functioning difficulties following acquired brain injury have been linked to negative employment outcomes, such as demotion and loss of vocational roles. These difficulties are counter-intuitive and challenging for other employees and work supervisors with little or no knowledge of brain injury, whose perceptions play a key role in their responses to these difficulties and in the eventual consequences for vocational status.
OBJECTIVES: This study examined the relationship between social cognition and executive functioning difficulties and work supervisors’ appraisals of survivors’ interpersonal behaviour and social skills in the workplace.
METHOD: The performance of 73 survivors of acquired brain injury (47% TBI, 38% CVA, 15% other ABI type; 73% male; mean age 45.44 years, range 19-64 years; mean time since injury 6.36 years, range 10.5-31.33 years), all currently in a vocational rehabilitation placement, was measured on neuropsychological tests of executive functioning and social cognition. Informant ratings on the Social Skills Factor subscale of the Work Personality Profile (WPP; Bolton & Roessler, 1986), a vocational functioning questionnaire assessing social and presentational aspects of workplace behaviour, were used as the primary outcome measure. The raters were non-clinical workplace informants acting in a supervisory role (supervisory placement providers and job coaches).
RESULTS: Correlational analysis identified significant associations between the WPP and survivor goal-orientated planning and implementation, mentalising ability, recognition of positive and negative emotions, and recognition of simple sarcasm (all significant at p < 0.05). These correlates were entered into a stepwise multiple regression. The final combination of survivor mentalising ability and executive functioning explained 32% of the variance in the WPP ratings (F(2, 52) = 12.15, p < 0.001).
CONCLUSION: Certain limitations of the study notwithstanding, the current findings add to the previous literature in highlighting the relevance of survivor executive functioning and social cognition difficulties to the perceptions and appraisals of work colleagues, consistent with other studies that have identified negative vocational outcomes associated with such neuropsychological difficulties. The implications for vocational rehabilitation are discussed.
Continuous, not discrete: The mutual influence of digital and physical literature
The use of computational methods to develop innovative forms of storytelling and poetry has gained traction since the late 1980s. At the same time, legacy publishing has largely migrated to digital workflows. Despite the possibility for crossover, the electronic literature community has generally defined its practice in opposition to print and traditional publishing practices more generally. Not only does this ignore a range of hybrid forms, but it also limits non-digital literature to print, rather than considering a range of physical literatures. In this article, I argue that it is more productive to consider physical and digital literature as convergent forms, as both a historicizing process and a way of identifying innovations. Case studies of William Gibson et al.’s Agrippa (a book of the dead) and Christian Bök’s The Xenotext Project’s playful use of innovations in genetics demonstrate the productive tensions in the convergence between digital and physical literature.
The Importance of Connecting Editorial and Engineering in Teletext/Videotex adoption in the UK
In the 1970s and 1980s, the United Kingdom was at the cutting edge of teletext and videotex development, with the former remaining an important part of British IT culture until it was switched off in 2012. Prominent public corporations including the BBC and the British Post Office (later partially spun off as British Telecommunications, or BT) invested heavily in infrastructure for Ceefax and Prestel respectively. Despite the rich culture that developed on these platforms, they are often footnotes in histories of the Internet and computing, overshadowed by the longer-term success of Minitel, the French equivalent of Prestel.
There are two main reasons for this disappearance. First, internet histories have pivoted towards sociality online (for example, Kevin Driscoll’s work on the Modem World and Minitel), emphasising the ancestors of social media rather than more static text-based precursors to the Web. Second, there is a paucity of extant evidence of activity on teletext and videotex systems. Hobbyist groups who extract teletext data from video cassettes have been more active in preserving this content than institutional archives, but there is a limit to their ability to reconstruct substantial bodies of material.
Building upon archival research from the BBC Written Archives, the Independent Broadcasting Authority (IBA, an ancestor of OFCOM), Channel 4 and BT, as well as contemporary industry and academic discussions, in this paper I offer a revisionist history of these proto-Internet technologies that helped prime the British public for the arrival of the Web in the 1990s. Through case studies of the BBC’s Ceefax service, Channel 4’s 4-Tel, and BT’s Prestel, I explore the importance of integrating engineering prowess with effective editorial and content moderation policies.
Calculating flux to predict future cave radon concentrations
Cave radon concentration measurements reflect the outcome of a perpetual competition which pitches flux against ventilation and radioactive decay. The mass balance equations used to model changes in radon concentration through time routinely treat flux as a constant. This mathematical simplification is acceptable as a first order approximation despite the fact that it sidesteps an intrinsic geological problem: the majority of radon entering a cavity is exhaled as a result of advection along crustal discontinuities whose motions are inhomogeneous in both time and space. In this paper the dynamic nature of flux is investigated and the results are used to predict cave radon concentration for successive iterations. The first part of our numerical modelling procedure focuses on calculating cave air flow velocity while the second part isolates flux in a mass balance equation to simulate real time dependence among the variables. It is then possible to use this information to deliver an expression for computing cave radon concentration for successive iterations. The dynamic variables in the numerical model are represented by the outer temperature, the inner temperature, and the radon concentration while the static variables are represented by the radioactive decay constant and a range of parameters related to geometry of the cavity. Input data were recorded at Driny Cave in the Little Carpathians Mountains of western Slovakia. Here the cave passages have developed along splays of the NE–SW striking Smolenice Fault and a series of transverse faults striking NW–SE. Independent experimental observations of fault slip are provided by three permanently installed mechanical extensometers. Our numerical modelling has revealed four important flux anomalies between January 2010 and August 2011. Each of these flux anomalies was preceded by conspicuous fault slip anomalies. 
The mathematical procedure outlined in this paper will help to improve our understanding of radon migration along crustal discontinuities and its subsequent exhalation into the atmosphere. Furthermore, as it is possible to supply the model with continuous data, future research will focus on establishing a series of underground monitoring sites with the aim of generating the first real-time global radon flux maps.
The authors would like to thank Peter Zvonár, Sara Argerich-Bergada, Amanda Keen-Zebert, Lenka Thinová, and Petr Otáhal, as well as the reviewers, whose constructive comments have helped to improve the clarity of the manuscript. This study was conducted with support from the long-term conceptual development of the research organisation RVO: 67985891. It is published in the framework of CzechGeo-EPOS, “Distributed system of permanent observatory measurements and temporary monitoring of geophysical fields in the Czech Republic” (MŠMT Project: LM2010008).
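The mass-balance approach described in the abstract above can be illustrated with a minimal sketch: for a single well-mixed cavity, radon concentration obeys dC/dt = Φ/V − (λ + k)C, where Φ is flux, V cavity volume, λ the Rn-222 decay constant, and k the ventilation rate. Solving this exactly over one timestep gives an update rule, and inverting the rule "isolates flux" from successive concentration readings, in the spirit of the paper's second modelling step. All function names, symbols, and parameter values here are illustrative placeholders, not the authors' model or data.

```python
import math

# Rn-222 decay constant in h^-1 (half-life ~3.82 days).
LAM = math.log(2) / (3.8235 * 24)

def step_concentration(c, flux, volume, vent, dt):
    """Advance radon concentration (Bq/m^3) by dt hours using the exact
    solution of dC/dt = flux/volume - (LAM + vent) * C for constant flux."""
    loss = LAM + vent                   # total removal rate, h^-1
    c_eq = flux / (volume * loss)       # equilibrium concentration
    return c_eq + (c - c_eq) * math.exp(-loss * dt)

def infer_flux(c_prev, c_next, volume, vent, dt):
    """Isolate flux from two successive concentration readings by
    inverting the update above."""
    loss = LAM + vent
    decay = math.exp(-loss * dt)
    c_eq = (c_next - c_prev * decay) / (1.0 - decay)
    return c_eq * volume * loss

# Round-trip check with made-up numbers: a known flux should be recovered
# from the concentration pair it produces.
c0, V, k, dt = 800.0, 5.0e4, 0.05, 1.0
c1 = step_concentration(c0, flux=3.0e6, volume=V, vent=k, dt=dt)
assert abs(infer_flux(c0, c1, V, k, dt) - 3.0e6) < 1.0
```

In the paper's setting, a time series of concentration and temperature-driven ventilation estimates would be fed through something like `infer_flux` at each step, yielding the time-varying flux whose anomalies are compared against the extensometer fault-slip records.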
The early ACM Hypertext Conference’s role in developing pre-Web reading on-screen
The first three iterations of the ACM Hypertext conference were highly popular, drawing a diverse audience of researchers and practitioners experimenting with hypertext systems before the arrival of the World Wide Web. While the rise of the Web has often been seen as a disruptive event in the history of hypertext, in this paper I argue that if we focus on the intersection between digital publishing and hypertext, there are stronger continuities. Through archival research into early participants of the Hypertext conferences, including Ben Shneiderman and Michael Joyce, I recover some of these lost connections.
Reassessing the Gravity’s Rainbow Pynchon Wiki: a new research paradigm?
Since the Against the Day Wiki launched in October 2006, the Pynchon Wiki collection has received over twenty thousand edits, making it one of the largest dedicated literary reference wikis. One can now view and add annotations to all seven of Pynchon’s novels (only Slow Learner and his non-fiction remain sans Wiki), and a loose community of over four hundred contributors have done so. This paper will assess the importance of the Gravity’s Rainbow Wiki in transforming understanding and interpretation by asking four questions: does the Wiki count as a disruptive force in the Pynchon interpretive industry; who contributes to the Wiki; what types of contribution do they make; and how do they exploit the hypertextual features on offer through the MediaWiki package? I will suggest that the Pynchon Wiki does not fully depart from old media forms of interpretation and remains fragmented both in its community, which resembles a symphony of soloists, and in its potential connections. This is a version of Web 2.0 synthesizing both Darcy DiNucci’s original dystopian vision of fragmentation (DiNucci) and Tim O’Reilly’s utopian idea of harnessing ‘collective intelligence’ (O’Reilly).
Drinking from the Same Cup: Federal Reserved Water Rights and National Parks in the Eastern United States
