
    Teaching evolution to creationist students: the ultimate challenge

    Despite overwhelming evidence for the common ancestry of life and evolution by natural selection, ideas invoking direct creation persist, disrupting the teaching of evolution as a central biological concept. While originating within fundamentalist Protestantism in the USA, creationist views are now prominent elsewhere and in other religions. Responses by educators include ignoring evolution; excluding evolutionary topics especially provocative to creationist students; advocating evolution while ignoring, disparaging or ridiculing creationism; distinguishing between scientific and religious approaches before considering only the scientific; and acknowledging evolution and creationist positions as different worldviews that one may understand, but not necessarily accept. Here, we argue that any chance of success in teaching evolution to creationist students requires elements of the last two of these approaches. Applying them requires understanding students' worldviews and the methods and limitations of science, as well as employing learning activities that engage, not alienate. In this context, we describe the creationist positions that may be encountered when teaching evolution, the fundamentals appropriate to teaching scientific method, and the teaching strategies of affirmative neutrality and procedural neutrality that educators may use to counter creationist views when teaching evolution. We regard understanding of the common ancestry of life, natural selection and other evolutionary mechanisms as threshold concepts that, once grasped, can transform students' interpretations of biology and possibly their worldviews. Mentioning creationism in the context of science education may be a dangerous idea, but what is worse: to establish some common ground with creationist students in the hope of leading them to an understanding of evolution, or to leave them ignorant of any evolutionary concepts at all?

    Animal detections increase by using a wide-angle camera trap model but not by periodically repositioning camera traps within study sites

    When using camera traps for wildlife studies, determining suitable camera models and deployment methods is essential for achieving study objectives. We aimed to determine if camera trap performance can be increased by (1) using cameras with wider detection angles, and (2) periodically repositioning cameras within sites. We compared three camera trap groups: stationary Reconyx PC900/HC600 (40° detection angle), and paired, periodically repositioned Reconyx PC900/HC600 and Swift 3C wide-angle camera traps (110° detection angle). Cameras operated simultaneously at 17 sites over 9 weeks within the Upper Warren region, Western Australia. Swift cameras had significantly higher detection rates, leading to better performance, especially for species ≤ 10 kg bodyweight. Reconyx cameras missed 54% of known events, most of which were animals that moved within the cameras’ detection zones. Stationary and periodically repositioned Reconyx camera traps performed similarly, although there were notable differences for some species. The better performance of Swift 3C wide-angle camera traps makes them more useful for community-level and species-level studies. The increased sensitivity of the Swift’s passive infrared sensor, along with the wider detection zone, played an important role in its success. When choosing camera trap models, detection angle and sensor sensitivity should be considered to produce reliable study results. Periodically repositioning cameras within sites warrants further investigation, as it may reduce camera placement bias and animal avoidance of camera traps, and increase spatial/habitat information when a limited number of cameras are deployed.

    Factors determining the home ranges of pet cats: A meta-analysis

    Roaming pet cats Felis catus are a significant conservation issue because they may hunt, harass and compete with wildlife; spread disease; interbreed with cats in feral populations; and hybridise with wild native felids. Studies of the roaming behaviour of pet cats are often hampered by modest sample sizes and variability between cats, limiting the statistical significance of the findings and their usefulness in recommending measures to discourage roaming. We resolved these difficulties through meta-analyses of 25 studies from 10 countries, involving 469 pet cats, to assess the influence of sex, desexing and housing density on roaming. A complementary linear mixed models approach used data on 311 individual animals from 22 studies and was also able to assess the influence of age and husbandry practices on roaming; this restricted sample gave greater statistical power than the meta-analyses. Meta-analyses found that male pet cats had larger home ranges than females, desexing did not influence home range, and cats had larger home ranges when housing densities were low. The linear mixed models supported those results. They also indicated that animals ≥ 8 years old had smaller home ranges than younger cats. Cats fed regularly, provided with veterinary care and socialised with humans had home ranges similar to those of cats living in association with households but not provided for in some of these ways. Short of confinement, there is no simple measure owners can adopt to reduce roaming by their cats and prevent the associated environmental problems.
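
    The linear mixed models approach described above can be illustrated with a minimal sketch; this is not the authors' code, and the data, column names and effect sizes below are invented purely for illustration (Python with pandas and statsmodels assumed):

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical individual-level data pooled across studies: one row per cat.
    rng = np.random.default_rng(0)
    n = 120
    df = pd.DataFrame({
        "study": rng.choice(list("ABCDE"), size=n),      # source study ID
        "sex": rng.choice(["M", "F"], size=n),
        "desexed": rng.choice([0, 1], size=n),
        "housing_density": rng.uniform(2, 40, size=n),   # dwellings per hectare
        "age_years": rng.integers(1, 16, size=n),
    })
    # Simulated home-range sizes: males larger, smaller at high housing density
    # and in older cats (purely illustrative effect sizes).
    df["home_range_ha"] = (
        1.5
        + 1.0 * (df["sex"] == "M")
        - 0.03 * df["housing_density"]
        - 0.4 * (df["age_years"] >= 8)
        + rng.normal(0, 0.5, size=n)
    ).clip(lower=0.05)

    # Linear mixed model: fixed effects for sex, desexing, housing density and
    # age, with a random intercept for study to absorb between-study differences.
    model = smf.mixedlm(
        "home_range_ha ~ sex + desexed + housing_density + age_years",
        data=df,
        groups=df["study"],
    )
    print(model.fit().summary())

    The random intercept for the source study is what allows individual-level records from different studies to be pooled while letting each study keep its own baseline home-range size.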

    Assessing the effectiveness of the Birdsbesafe® anti-predation collar cover in reducing predation on wildlife by pet cats in Western Australia

    Many pet cats hunt and, irrespective of whether or not this threatens wildlife populations, distressed owners may wish to curtail hunting while allowing their pets to roam. Therefore, we evaluated the effectiveness of three patterned designs (described simply as rainbow, red and yellow) of the anti-predation collar cover, the Birdsbesafe® (BBS), in reducing prey captures by 114 pet cats over 2 years in a suburban Australian context. The BBS offers a colourful indicator of a cat's presence and should therefore alert prey with good colour vision (birds and herpetofauna), but not most mammals, which have limited colour vision. We also interviewed the 82 owners of cats in the study about their experience using the BBS and their assessment of the behavioural responses of their cats. In the first year of the study, which focused on the effectiveness of different BBS colours, captures of prey with good colour vision were reduced by 54% (95% CL 43-64%) when cats were wearing a BBS of any colour, with the rainbow and red BBS more effective than the yellow when birds were prey. Captures of mammals were not reduced significantly. The second year assessed the rainbow BBS alone; those data, combined with the rainbow data from the first year, showed a significant reduction of 47% (95% CL 43-57%) in captures of prey with good colour vision, with no effect of differences across years. We found no evidence that cats maintained a lower predation rate once the BBS was removed. Seventy-nine per cent of owners reported that their cats had no problems with the BBS, and another 17% reported that their cats adjusted within 2 days. Fourteen owners reported that their cats spent more time at home and ate more while wearing the BBS. Two owners reported their cats stayed away from home more while wearing it. Sixty-four per cent of owners using the red collar, 48% using the rainbow and 46% using the yellow believed that it worked. Overall, 77% of owners planned to continue using the BBS after the study had finished. The BBS is an option for owners wishing to reduce captures of birds and herpetofauna by free-ranging cats, especially where mammalian prey are introduced pests. To date, the BBS is the only predation deterrent that significantly reduces the number of herpetofauna brought home. It is unsuitable where endangered mammalian prey or large invertebrates are vulnerable to predation by pet cats.
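
    As a rough illustration of how a percentage reduction with 95% confidence limits such as those reported above could be estimated, here is a minimal, hypothetical sketch (not the study's analysis; the counts, column names and monitoring periods are invented) using a Poisson regression of capture counts with an exposure offset:

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical per-cat capture counts of colour-visioned prey over equal
    # monitoring periods with the collar cover off (bbs_on = 0) and on (1).
    rng = np.random.default_rng(1)
    n_cats = 60
    df = pd.DataFrame({
        "bbs_on": np.repeat([0, 1], n_cats),
        "weeks": 8.0,
    })
    df["captures"] = rng.poisson(np.where(df["bbs_on"] == 1, 1.0, 2.0))

    # Poisson regression with a log-exposure offset: exp(coefficient) is the
    # capture-rate ratio (cover on vs off); 1 - ratio is the percent reduction.
    res = sm.GLM(
        df["captures"],
        sm.add_constant(df["bbs_on"]),
        family=sm.families.Poisson(),
        offset=np.log(df["weeks"]),
    ).fit()
    ratio = np.exp(res.params["bbs_on"])
    ci_low, ci_high = np.exp(res.conf_int().loc["bbs_on"])
    print(f"estimated reduction: {1 - ratio:.0%} "
          f"(95% CI {1 - ci_high:.0%} to {1 - ci_low:.0%})")

    The exponentiated coefficient is the capture-rate ratio with the cover on versus off, so one minus that ratio (and its interval endpoints) gives the percentage reduction and its confidence limits.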

    Older Adults with Cancer: A Randomized Controlled Trial of Occupational and Physical Therapy

    OBJECTIVES: The impact of occupational therapy (OT) and physical therapy (PT) on functional outcomes in older adults with cancer is unknown. DESIGN: Two-arm, single-institution randomized controlled trial of outpatient OT/PT. SETTING: Comprehensive cancer center with two off-site OT/PT clinics. PARTICIPANTS: We recruited adults 65 years and older with a diagnosis or recurrence of cancer within the previous 5 years and at least one functional limitation identified by a geriatric assessment. Participants were randomized to OT/PT or usual care. INTERVENTION: Rehabilitation consisted of individualized OT and PT that addressed functional activities and strength/endurance needs. MEASUREMENTS: The primary outcome was functional status as measured by the Nottingham Extended Activities of Daily Living scale. Secondary outcomes were Patient-Reported Outcomes Measurement Information System Global Mental Health (GMH) and Global Physical Health (GPH), ability to participate in Social Roles (SR), physical function, and activity expectations and self-efficacy (Possibilities for Activity Scale [PActS]). RESULTS: Among those recruited (N = 63), only 45 patients (71%) were evaluable due to loss to follow-up and/or nonreceipt of the intervention. The median age was 74 years; 53% were female, and 91% were white. Overall, 30% of patients had hematologic malignancies, 30% breast cancer, and 16% colorectal cancers. A total of 65% were in active treatment, and 49% had stage 3 or 4 disease. At follow-up, both the OT/PT (P = .02) and usual care (P = .03) groups experienced a decline in functional status. PActS scores differed significantly between groups (P = .04), favoring the intervention group. GMH and SR met criteria for a minimally important clinical difference favoring the intervention, but did not reach statistical significance. Several barriers to implementation of the intervention program were noted: recruitment, concerns about cost, distance, scheduling, and limited treatment provided. CONCLUSION: OT/PT may positively influence activity expectations and self-efficacy. Future research needs to address significant barriers to implementation to increase use of OT/PT services and access to quality care.

    A 100-Year Review: A century of change in temperate grazing dairy systems

    From 1917 to 2017, dairy grazing systems have evolved from uncontrolled grazing of unimproved pastures by dual-purpose dairy-beef breeds to an intensive system with a high output per unit of land from a fit-for-purpose cow. The end of World War I signaled significant government investments in agricultural research institutes around the world, which coincided with technological breakthroughs in milk harvesting and a recognition that important traits in both plants and animals could be improved upon relatively rapidly through genetic selection. Uptake of milk recording and herd testing increased rapidly through the 1920s, as did the recognition that pastures that were rested in between grazing events yielded more in a year than those continuously grazed. This, and the invention and refinement of the electric fence, led to the development of “controlled” rotational grazing. This, in itself, facilitated greater stocking rates and a 5 to 10% increase in milk output per hectare but, perhaps more importantly, it allowed a more efficient use of nitrogen fertilizer, further increasing milk output/land area by 20%. Farmer inventions led to the development of the herringbone and rotary milking parlors, which, along with the “unshortable” electric fence and technological breakthroughs in sperm dilution rates, allowed further dairy farm expansion. Simple but effective technological breakthroughs in reproduction ensured that cows were identified in estrus early (a key factor in maintaining the seasonality of milk production) and enabled researchers to quantify the anestrus problem in grazing herds. Genetic improvement of pasture species has lagged its bovine counterpart, but recent developments in multi-trait indices as well as investment in genetic technologies should significantly increase potential milk production per hectare. Decades of research on the use of feeds other than pasture (i.e., supplementary feeds) have provided consistent milk production responses when the reduction in pasture intake associated with the provision of supplementary feed (i.e., substitution rate) is accounted for. A unique feature of grazing systems research over the last 70 yr has been the use of multi-year farm systems experimentation. These studies have allowed the evaluation of strategic changes to a component of the system on all the interacting features of the system. This technique has allowed excellent component research to be “systemized” and is an essential part of the development of the intensive grazing production system that exists today. Future challenges include the provision of skilled labor or specifically designed automation to optimize farm management and both environmental sustainability and animal welfare concerns, particularly relating to the concentration of nitrogen in each urine patch and the associated risk of nitrate leaching, as well as concerns regarding exposure of animals to harsh climatic conditions. These combined challenges could affect farmers' “social license” to farm in the future.

    Treatment Precedes Positive Symptoms in North American Adolescent and Young Adult Clinical High Risk Cohort

    Early intervention for psychotic disorders, a growing international priority, typically targets help-seeking populations with emerging psychotic (“positive”) symptoms. We assessed the nature of and degree to which treatment of individuals at high risk for psychosis preceded or followed the onset of positive symptoms. The North American Prodrome Longitudinal Study–2 collected psychosocial treatment histories for 745 (98%) of 764 high-risk participants (M age = 18.9, 57% male, 57.5% Caucasian, 19.1% Hispanic) recruited from 8 North American communities. Similar to prior findings, 82% of participants reported psychosocial treatment prior to baseline assessment, albeit with significant variability across sites (71%–96%). Participants first received treatment a median of 1.7 years prior to the onset of a recognizable psychosis-risk syndrome. Only one-fourth sought initial treatment in the year following syndrome onset. Although mean sample age differed significantly by site, age at initial treatment (M = 14.1, SD = 5.0) did not. High rates of early treatment prior to syndrome onset make sense in light of known developmental precursors to psychotic disorders but are inconsistent with the low rates of treatment retrospectively reported by first-episode psychosis samples. Findings suggest that psychosis risk studies and clinics may need to more actively recruit and engage symptomatic but non-help-seeking individuals and that community clinicians be better trained to recognize both positive and nonspecific indicators of emerging psychosis. Improved treatments for nonspecific symptoms, as well as the characteristic attenuated positive symptoms, are needed.

    Depression and clinical high-risk states: Baseline presentation of depressed vs. non-depressed participants in the NAPLS-2 cohort

    Depressed mood appears to be highly prevalent in clinical high risk (CHR) samples. However, many prior CHR studies utilize modest sample sizes and do not report on the specific impact of depression on CHR symptoms. The aim of the current paper is to investigate the prevalence of depressive disorders and the impact of lifetime depression on baseline clinical presentation and longitudinal outcomes in a large cohort of individuals meeting CHR criteria in the second phase of the North American Prodrome Longitudinal Study (NAPLS-2). Depression was assessed both categorically (via DSM-IV-TR diagnoses) and symptomatically (using a clinician-rated scale of depressive symptoms) within a sample of 764 individuals at CHR and 279 controls. Current and lifetime depressive disorders were highly prevalent (60%) in this sample. Depression diagnoses were associated with more pronounced negative and general symptoms; individuals with remitted depression had significantly less severe negative, disorganized, and general symptoms and better social and role functioning relative to those with current depression. Current mood disturbance, as measured by scores on a clinician-rated symptom scale, contributed beyond the impact of positive and negative symptoms to impairments in social functioning. Both symptomatic and diagnostic baseline depression were significantly associated with a decreased likelihood of remission from CHR status; however, depression did not differentially distinguish persistent CHR status from transition to psychosis at follow-up. These findings suggest that depressed mood may function as a marker of poor prognosis in CHR, yet effective treatment of depression within this population can yield improvements in symptoms and functioning.

    Learning difficulties: a Portuguese perspective of a universal issue

    In this article we present the findings of a study conducted with the purpose of deepening knowledge about the field of learning difficulties in Portugal. Within these findings we will discuss, across several cultural boundaries, themes related to the existence of learning difficulties as a construct, the terminology used, the political, social and scientific influences on the field, and the models of identification and of ongoing school support for students. While addressing the above-mentioned themes we will draw attention to the different, yet converging, international understandings of learning difficulties.