
    A review of human factors principles for the design and implementation of medication safety alerts in clinical information systems.

    The objective of this review is to describe the implementation of human factors principles for the design of alerts in clinical information systems. First, we conduct a review of alarm systems to identify human factors principles that are employed in the design and implementation of alerts. Second, we review the medical informatics literature to provide examples of the implementation of human factors principles in current clinical information systems using alerts to provide medication decision support. Last, we suggest actionable recommendations for delivering effective clinical decision support using alerts. A review of studies from the medical informatics literature suggests that many basic human factors principles are not followed, possibly contributing to the lack of acceptance of alerts in clinical information systems. We evaluate the limitations of current alerting philosophies and provide recommendations for improving acceptance of alerts by incorporating human factors principles in their design.

    Memory for pitch in congenital amusia: Beyond a fine-grained pitch discrimination problem

    Congenital amusia is a disorder that affects the perception and production of music. While amusia has been associated with deficits in pitch discrimination, several reports suggest that memory deficits also play a role. The present study investigated short-term memory span for pitch-based and verbal information in 14 individuals with amusia and matched controls. Analogous adaptive-tracking procedures were used to generate tone and digit spans using stimuli that exceeded psychophysically measured pitch perception thresholds. Individuals with amusia had significantly smaller tone spans, whereas their digit spans were a similar size to those of controls. An automated operation span task was used to determine working memory capacity. Working memory deficits were seen in only a small subgroup of individuals with amusia. These findings support the existence of a pitch-specific component within short-term memory and suggest that congenital amusia is more than a disorder of fine-grained pitch discrimination.
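
The adaptive-tracking span measurement described above can be sketched as a simple one-up/one-down staircase (a minimal illustration, not the study's actual protocol; the `respond` callback, starting length, and trial count are assumptions):

```python
def adaptive_span(respond, start_len=3, trials=30):
    """Estimate a memory span adaptively: the sequence grows by one item
    after a correct recall and shrinks by one after an error, so the
    track hovers around the participant's span."""
    length, history = start_len, []
    for _ in range(trials):
        history.append(length)
        length = max(1, length + (1 if respond(length) else -1))
    tail = history[trials // 2:]          # discard the initial approach phase
    return sum(tail) / len(tail)

# Hypothetical participant who can reliably recall up to 5 items.
estimate = adaptive_span(lambda n: n <= 5)
```

Averaging only the second half of the track is one common way to read a span estimate off a staircase once it has converged.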

    What determines auditory similarity? The effect of stimulus group and methodology.

    Two experiments on the internal representation of auditory stimuli compared the pairwise and grouping methodologies as means of deriving similarity judgements. A total of 45 undergraduate students participated in each experiment, judging the similarity of short auditory stimuli, using one of the methodologies. The experiments support and extend Bonebright's (1996) findings, using a further 60 stimuli. Results from both methodologies highlight the importance of category information and acoustic features, such as root mean square (RMS) power and pitch, in similarity judgements. Results showed that the grouping task is a viable alternative to the pairwise task with N > 20 sounds whilst highlighting subtle differences, such as cluster tightness, between the different task results. The grouping task is more likely to yield category information as underlying similarity judgements.
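
RMS power, one of the acoustic features the abstract links to similarity judgements, is straightforward to compute from raw samples. A generic sketch (the 440 Hz tone and 44.1 kHz sample rate are illustrative assumptions, not details from the study):

```python
import math

def rms_power(samples):
    """Root mean square amplitude of a digitised audio signal."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# One second of a 440 Hz sine at 44.1 kHz; a full-scale sine wave has
# RMS = 1/sqrt(2), roughly 0.707.
tone = [math.sin(2 * math.pi * 440 * t / 44100) for t in range(44100)]
```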

    Designing informative warning signals: Effects of indicator type, modality, and task demand on recognition speed and accuracy

    An experiment investigated the assumption that natural indicators which exploit existing learned associations between a signal and an event make more effective warnings than previously unlearned symbolic indicators. Signal modality (visual, auditory) and task demand (low, high) were also manipulated. Warning effectiveness was indexed by accuracy and reaction time (RT) recorded during training and dual task test phases. Thirty-six participants were trained to recognize 4 natural and 4 symbolic indicators, either visual or auditory, paired with critical incidents from an aviation context. As hypothesized, accuracy was greater and RT was faster in response to natural indicators during the training phase. This pattern of responding was upheld in test phase conditions with respect to accuracy but observed in RT only in test phase conditions involving high demand and the auditory modality. Using the experiment as a specific example, we argue for the importance of considering the cognitive contribution of the user (viz., prior learned associations) in the warning design process. Drawing on semiotics and cognitive psychology, we highlight the indexical nature of so-called auditory icons or natural indicators and argue that the cogniser is an indispensable element in the tripartite nature of signification.

    Revisiting the exercise heart rate-music tempo preference relationship

    In the present study, we investigated a hypothesized quartic relationship (meaning three turning points) between exercise heart rate (HR) and preferred music tempo. Initial theoretical predictions suggested a positive linear relationship (Iwanaga, 1995a, 1995b); however, recent experimental work has shown that as exercise HR increases, step changes and plateaus that punctuate the profile of music tempo preference may occur (Karageorghis, Jones, & Stuart, 2008). Tempi bands consisted of slow (95–100 bpm), medium (115–120 bpm), fast (135–140 bpm), and very fast (155–160 bpm) music. Twenty-eight active undergraduate students cycled at exercise intensities representing 40, 50, 60, 70, 80, and 90% of their maximal HR reserve while their music preference was assessed using a 10-point scale. The Exercise Intensity x Music Tempo interaction was significant, F(6.16, 160.05) = 7.08, p < .001, ηp² = .21, as was the test for both cubic and quartic trajectories in the exercise HR–preferred-music-tempo relationship (p < .001). Whereas slow tempo music was not preferred at any exercise intensity, preference for fast tempo increased, relative to medium and very fast tempo music, as exercise intensity increased. The implications for the prescription of music in exercise and physical activity contexts are discussed.
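
The cubic and quartic trend tests reported above amount to fitting higher-order polynomial terms to the intensity-preference profile. A minimal sketch of that idea (the preference values below are invented for illustration, not the study's data):

```python
import numpy as np

intensity = np.array([40, 50, 60, 70, 80, 90])          # % of maximal HR reserve
preference = np.array([5.0, 5.4, 6.5, 6.6, 7.4, 6.8])   # hypothetical 10-point ratings

# Fit a quartic polynomial; in a trend analysis, significant cubic and
# quartic components indicate step changes and plateaus rather than a
# straight line.
coeffs = np.polyfit(intensity, preference, deg=4)
fitted = np.polyval(coeffs, intensity)
```

Comparing the residuals of this fit against a purely linear one shows how much of the profile's shape the higher-order terms capture.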

    Agricultural intensification heightens food safety risks posed by wild birds

    Agricultural intensification and simplification are key drivers of recent declines in wild bird populations, heightening the need to better balance conservation with food production. This is hindered, however, by perceptions that birds threaten food safety. While birds are known reservoirs of foodborne pathogens, there remains uncertainty about the links between landscape context, farming practices, and actual crop contamination by birds. Here, we examine relationships between landscape context, farming practices, and pathogen contamination by birds using a barrier-to-spillover approach. First, we censused bird communities using point count surveys. Second, we collected 2,024 faecal samples from captured birds alongside 1,215 faecal samples from brassica fields and food processing areas across 50 farms spanning the USA West Coast. We then estimated the prevalence of three foodborne pathogens across landscape and livestock intensification gradients. Finally, we quantified the number of plants with faeces. Campylobacter spp. were detected in 10.2% of faeces from captured birds and 13.1% of faeces from production areas. Non-native birds were 4.1 times more likely to have Campylobacter spp. than native birds. Salmonella spp. were detected in 0.2% of faeces from production areas and were never detected in captured birds. We detected evidence of Shiga toxigenic E. coli in one sample across the >3,200 tested. Campylobacter spp. prevalence in faeces from production areas increased with increasing mammalian livestock densities in the landscape but decreased with increasing amounts of natural habitat. We encountered bird faeces on 3.3% of plants examined. Despite the impact on pathogen prevalence, the amount of natural habitat in the landscape did not increase the number of plants with bird faeces, although on-farm mammalian livestock density slightly did. Synthesis and applications. Food safety and wildlife conservation are often thought to be in conflict. However, our findings suggest that natural habitat around farms may reduce crop contamination rates by birds, perhaps because natural habitat promotes native birds that are less likely to harbour foodborne pathogens, or because it decreases contact with livestock waste. Our results suggest that preserving natural habitat around farms could benefit both conservation and food safety, contrary to current standards for 'best practices'.
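
The "4.1 times more likely" comparison above is a prevalence ratio (relative risk). A minimal sketch, with hypothetical counts chosen only to reproduce that ratio (not the study's actual sample sizes):

```python
def prevalence_ratio(cases_a, n_a, cases_b, n_b):
    """Pathogen prevalence in group A divided by prevalence in group B."""
    return (cases_a / n_a) / (cases_b / n_b)

# Hypothetical counts: 41 of 200 non-native birds positive for
# Campylobacter spp. versus 5 of 100 native birds.
rr = prevalence_ratio(41, 200, 5, 100)   # -> 4.1
```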

    Is three the magic number? The role of ergonomic principles in cross country comprehension of road traffic signs

    Road sign comprehension plays an important part in road safety management, particularly for those drivers who are travelling in an unfamiliar country. Previous research has established that comprehension can be improved if signs are designed to adhere to ergonomic principles. However, it may be difficult for sign designers to incorporate all the principles into a single sign, so they may have to judge which are the most effective. This study surveyed drivers in three countries to ascertain their understanding of a range of road signs, each of which conformed in varying degrees and combinations to the ergonomic principles. We found that using three of the principles was the most effective, and that the most important was standardisation: the colours and shapes used were key to comprehension. Other concepts relating to physical and spatial characteristics were less important, whilst conceptual compatibility did not aid comprehension at all. Practitioner Summary: This study explores how road sign comprehension can be improved using ergonomic principles, with particular reference to cross-border drivers. It was found that comprehension can be improved significantly if standardisation is adhered to and if at least three principles are used.

    The Benefits and the Costs of Using Auditory Warning Messages in Dynamic Decision Making Settings

    The failure to notice critical changes in both visual and auditory scenes may have important consequences for performance in complex dynamic environments, especially those related to security such as aviation, surveillance during major events, and command and control of emergency response. Previous work has shown that a significant number of situation changes remain undetected by operators in such environments. In the current study, we examined the impact of using auditory warning messages to support the detection of critical situation changes and, more broadly, the decision making required by the environment. Twenty-two participants performed a radar operator task involving multiple subtasks while detecting critical task-related events that were cued by a specific type of audio message. Results showed that about 22% of the critical changes remained undetected by participants, a percentage similar to that found in previous work using visual cues to support change detection. However, we found that audio messages tended to bias threat evaluation towards perceiving objects as more threatening than they were in reality. Such findings reveal both benefits and costs associated with using audio messages to support change detection in complex dynamic environments.

    Patterns of analgesic use, pain and self-efficacy: a cross-sectional study of patients attending a hospital rheumatology clinic

    Background: Many people attending rheumatology clinics use analgesics and non-steroidal anti-inflammatories for persistent musculoskeletal pain. Guidelines for pain management recommend regular and pre-emptive use of analgesics to reduce the impact of pain. Clinical experience indicates that analgesics are often not used in this way. Studies exploring use of analgesics in arthritis have historically measured adherence to such medication. Here we examine patterns of analgesic use and their relationships to pain, self-efficacy and demographic factors. Methods: Consecutive patients were approached in a hospital rheumatology out-patient clinic. Pattern of analgesic use was assessed by response to statements such as 'I always take my tablets every day.' Pain and self-efficacy (SE) were measured using the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) and Arthritis Self-Efficacy Scale (ASES). Influence of factors on pain level and regularity of analgesic use were investigated using linear regression. Differences in pain between those agreeing and disagreeing with statements regarding analgesic use were assessed using t-tests. Results: 218 patients (85% of attendees) completed the study. Six (2.8%) patients reported no current pain, 26 (12.3%) slight, 100 (47.4%) moderate, 62 (29.4%) severe and 17 (8.1%) extreme pain. In multiple linear regression, self-efficacy and regularity of analgesic use were significant (p < 0.01), with lower self-efficacy and more regular use of analgesics associated with more pain. Low SE was associated with greater pain: 40 (41.7%) people with low SE reported severe pain versus 22 (18.3%) people with high SE, p < 0.001. Patients in greater pain were significantly more likely to take analgesics regularly; 13 (77%) of those in extreme pain reported always taking their analgesics every day, versus 9 (35%) in slight pain. Many patients, including 46% of those in severe pain, adjusted analgesic use to current pain level. In simple linear regression, pain was the only variable significantly associated with regularity of analgesic use: higher levels of pain corresponded to more regular analgesic use (p = 0.003). Conclusion: Our study confirms that there is a strong inverse relationship between self-efficacy and pain severity. Analgesics are often used irregularly by people with arthritis, including some reporting severe pain.
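
The simple linear regression of analgesic regularity on pain reduces to an ordinary least-squares slope. A minimal sketch (the data points below are hypothetical, chosen only to show the direction of the reported relationship):

```python
from statistics import mean

def ols_slope(x, y):
    """Ordinary least-squares slope of y regressed on x."""
    mx, my = mean(x), mean(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# Hypothetical data: pain score (0 = none .. 4 = extreme) against days
# per week analgesics are taken.
pain = [0, 1, 2, 3, 4]
days_per_week = [1, 3, 4, 6, 7]
b = ols_slope(pain, days_per_week)   # positive slope: more pain, more regular use
```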

    Characteristics and pathways of long-stay patients in high and medium secure settings in England : a secondary publication from a large mixed-methods study

    Background: Many patients experience extended stays within forensic care, but the characteristics of long-stay patients are poorly understood. Aims: To describe the characteristics of long-stay patients in high and medium secure settings in England. Method: Detailed file reviews provided clinical, offending and risk data for a large representative sample of 401 forensic patients from 2 of the 3 high secure settings and from 23 of the 57 medium secure settings in England on 1 April 2013. The threshold for long-stay status was defined as 5 years in medium secure care or 10 years in high secure care, or 15 years in a combination of high and medium secure settings. Results: 22% of patients in high security and 18% in medium security met the definition for ‘long-stay’, with 20% staying longer than 20 years. Of the long-stay sample, 58% were violent offenders (22% both sexual and violent), 27% had been convicted for violent or sexual offences whilst in an institutional setting, and 26% had committed a serious assault on staff in the last 5 years. The most prevalent diagnosis was schizophrenia (60%) followed by personality disorder (47%, predominantly antisocial and borderline types); 16% were categorised as having an intellectual disability. Overall, 7% of the long-stay sample had never been convicted of any offence, and 16.5% had no index offence prompting admission. Although some significant differences were found between the high and medium secure samples, there were more similarities than contrasts between these two levels of security. The treatment pathways of these long-stay patients involved multiple moves between settings. An unsuccessful referral to a setting of lower security was recorded over the last 5 years for 33% of the sample. Conclusions: Long-stay patients accounted for one fifth of the forensic inpatient population in England in this representative sample. A significant proportion of this group remain unsettled. 
High levels of personality pathology and the risk of assaults on staff and others within the care setting are likely to impact on treatment and management. Further research into the treatment pathways of longer stay patients is warranted to understand the complex trajectories of this group.