Can we use antipredator behavior theory to predict wildlife responses to high-speed vehicles?
Animals seem to rely on antipredator behavior to avoid vehicle collisions. An extensive body of antipredator behavior theory has been used to predict the distance or time at which animals should escape from predators, and these models have also guided empirical research on escape behavior from vehicles. However, little is known about whether antipredator behavior models are appropriate for the scenario of an approaching high-speed vehicle. We addressed this gap by (a) providing an overview of the main hypotheses and predictions of different antipredator behavior models via a literature review, (b) exploring whether these models can generate quantitative predictions on escape distance when parameterized with empirical data from the literature, and (c) evaluating their sensitivity to vehicle approach speed using a simulation approach in which we assessed model performance based on changes in effect size with variations in the slope of the flight initiation distance (FID) vs. approach speed relationship. The slope of the FID vs. approach speed relationship was then related back to three behavioral rules animals may rely on to avoid approaching threats: the spatial, temporal, or delayed margin of safety. We used literature on birds for goals (b) and (c). Our review considered the following eight models: the economic escape model, Blumstein's economic escape model, the optimal escape model, the perceptual limit hypothesis, the visual cue model, the flush early and avoid the rush (FEAR) hypothesis, the looming stimulus hypothesis, and the Bayesian model of escape behavior. We were able to generate quantitative predictions about escape distance with the last five models, but could assess sensitivity to vehicle approach speed only for the last three. The FEAR hypothesis is most sensitive to high-speed vehicles when the species follows the spatial (FID remains constant as speed increases) or the temporal (FID increases with speed) margin-of-safety rule. The looming stimulus hypothesis reached small to intermediate levels of sensitivity to high-speed vehicles when a species follows the delayed margin of safety (FID decreases with speed). The Bayesian optimal escape model reached intermediate levels of sensitivity to approach speed across all escape rules (spatial, temporal, and delayed margins of safety), but only for larger (> 1 kg) species; it was not sensitive to speed for smaller species. Overall, no single antipredator behavior model could characterize all types of escape responses relative to vehicle approach speed, although some models showed sensitivity for certain escape rules. As an applied implication of our findings, we suggest estimating critical vehicle approach speeds for managing populations that are especially susceptible to road mortality. Overall, we recommend developing new escape behavior models specifically tailored to high-speed vehicles, to better predict quantitatively how animals will respond to the increasing numbers of cars, airplanes, drones, etc. they will face in the next decade.
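The three margin-of-safety rules can be written as simple FID-versus-speed relationships. The following is an illustrative formalization consistent with the descriptions above, not one taken from the reviewed models; the parameters $d^{*}$, $t^{*}$, $d_{0}$, and $\tau$ are hypothetical:

$$\mathrm{FID}(v) = d^{*} \ \text{(spatial)}, \qquad \mathrm{FID}(v) = v\,t^{*} \ \text{(temporal)}, \qquad \mathrm{FID}(v) = d_{0} - v\,\tau \ \text{(delayed)},$$

where $v$ is vehicle approach speed, $d^{*}$ a fixed escape distance, $t^{*}$ a fixed time-to-collision at escape, $d_{0}$ a detection distance, and $\tau$ a fixed response delay after detection. The corresponding slopes of the FID vs. approach speed relationship are zero, positive ($t^{*}$), and negative ($-\tau$), matching the three rules.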
Dislocation Nucleation and Propagation in Semiconductor Heterostructures
This paper considers misfit dislocation nucleation and propagation in dilute magnetic semiconductor heterostructures in the CdTe-ZnTe-MnTe system. It is shown that, where the deposit is in tension, 1/2⟨110⟩ dislocations with inclined Burgers vectors propagate by glide along interfacial ⟨110⟩ directions and may dissociate, giving intrinsic stacking faults. In cases where the deposit is in compression, 1/2⟨110⟩ dislocations show no evidence of dissociation and propagate by extensive cross-slip to give networks of dislocations close to interfacial ⟨100⟩ directions.
Evidence for dislocation sources in ZnTe/GaSb films is presented. ZnTe films contained stacking fault pyramids, single Frank faults, and a new type of diamond defect at densities up to about 10⁷ cm⁻². Analysis showed that the diamond defects, four-sided defects on {111} planes with ⟨110⟩ edges, were of vacancy type with 1/3⟨111⟩ Frank Burgers vectors and intrinsic stacking faults. Although the faulted defects showed no tendency to grow by climb, evidence is given for an unfaulting reaction in which a glissile 1/2⟨110⟩ dislocation is generated. This new model for dislocation nucleation is discussed.
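For context, the classic unfaulting reaction of this kind combines a sessile 1/3⟨111⟩ Frank partial with a 1/6⟨112⟩ Shockley partial, removing the intrinsic stacking fault and leaving a glissile perfect dislocation. A textbook instance (illustrative of the reaction type, not necessarily the specific geometry reported here) is:

$$\tfrac{1}{3}[111] + \tfrac{1}{6}[11\bar{2}] \rightarrow \tfrac{1}{2}[110].$$

The reaction is roughly neutral in elastic energy ($b^{2}$ terms: $\tfrac{1}{3}a^{2} + \tfrac{1}{6}a^{2} = \tfrac{1}{2}a^{2}$), so it is driven largely by elimination of the stacking-fault energy.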
Cystic fibrosis mice carrying the missense mutation G551D replicate human genotype-phenotype correlations
We have generated a mouse carrying the human G551D mutation in the cystic fibrosis transmembrane conductance regulator gene (CFTR) by a one-step gene targeting procedure. These mutant mice show cystic fibrosis pathology but have a reduced risk of fatal intestinal blockage compared with 'null' mutants, in keeping with the reduced incidence of meconium ileus in G551D patients. The G551D mutant mice show greatly reduced CFTR-related chloride transport, displaying activity intermediate between that of cftr(m1UNC) replacement ('null') and cftr(m1HGU) insertional (residual activity) mutants and equivalent to approximately 4% of wild-type CFTR activity. The long-term survival of these animals should provide an excellent model with which to study cystic fibrosis, and they illustrate the value of mouse models carrying relevant mutations for examining genotype-phenotype correlations.
Practical probabilistic programming with monads
The machine learning community has recently shown a lot of interest in practical probabilistic programming systems that target the problem of Bayesian inference. Such systems come in different forms, but they all express probabilistic models as computational processes using syntax resembling programming languages. In the functional programming community, monads are known to offer a convenient and elegant abstraction for programming with probability distributions, but their use is often limited to very simple inference problems. We show that it is possible to use the monad abstraction to construct probabilistic models for machine learning while still achieving good inference performance in challenging models. We use a GADT as the underlying representation of a probability distribution and apply Sequential Monte Carlo-based methods to achieve efficient inference. We define a formal semantics via measure theory, and we demonstrate a clean and elegant implementation that achieves performance comparable with Anglican, a state-of-the-art probabilistic programming system.
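To make the monadic abstraction concrete, here is a minimal sketch in Haskell of representing a distribution as a GADT with a Monad instance. This is illustrative only, not the authors' implementation: the Primitive constructor and the enumerate interpreter assume discrete distributions, whereas the paper's representation also supports continuous primitives, conditioning, and SMC inference.

    {-# LANGUAGE GADTs #-}

    -- A distribution is either a pure value, a monadic bind, or a
    -- primitive; here Primitive is simplified to a discrete weighted list.
    data Dist a where
      Return    :: a -> Dist a
      Bind      :: Dist b -> (b -> Dist a) -> Dist a
      Primitive :: [(a, Double)] -> Dist a

    instance Functor Dist where
      fmap f d = Bind d (Return . f)

    instance Applicative Dist where
      pure = Return
      df <*> dx = Bind df (\f -> fmap f dx)

    instance Monad Dist where
      (>>=) = Bind

    -- Exact inference by enumerating the GADT (discrete case only).
    enumerate :: Dist a -> [(a, Double)]
    enumerate (Return x)     = [(x, 1)]
    enumerate (Primitive xs) = xs
    enumerate (Bind d f)     =
      [ (y, p * q) | (x, p) <- enumerate d, (y, q) <- enumerate (f x) ]

    -- Example model: the sum of two fair coin flips.
    coin :: Dist Int
    coin = Primitive [(0, 0.5), (1, 0.5)]

    twoFlips :: Dist Int
    twoFlips = do
      a <- coin
      b <- coin
      return (a + b)

Because the model is kept as a data structure rather than run immediately, different interpreters (exact enumeration here; samplers or Sequential Monte Carlo methods in the paper) can be applied to the same model.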
The incidence and make up of ability grouped sets in the UK primary school
The adoption of setting in the primary school (pupils grouped by ability across classes for particular subjects) emerged during the 1990s as a means to raise standards. Recent research based on 8875 children in the Millennium Cohort Study showed that 25.8% of children in Year 2 were set for both literacy and mathematics, and a further 11.2% were set for mathematics or literacy alone. Logistic regression analysis showed that the best predictors of being in the top set for literacy or mathematics were being born in the autumn or winter and cognitive ability scores. Boys were significantly more likely than girls to be in the bottom literacy set. Family circumstances were less important for setting placement than the child's own characteristics, although they mattered more for bottom set placement. Children in bottom sets were significantly more likely to be part of a long-term single-parent household, to have experienced poverty, and to have a mother without qualifications at NVQ3 level or higher. The findings are discussed in relation to earlier research, and the implications for schools are set out.
Condom-use Skills Checklist: A Proxy for Assessing Condom-use Knowledge and Skills When Direct Observation Is Not Possible
Because correct condom use remains important in controlling the HIV epidemic, and because few tools exist for assessing it, methods for assessing condom-application skills are needed, especially when direct observation is not feasible. Accordingly, in the context of a population at high risk for HIV (The Bahamas), a 17-item scale, the Condom-use Skills Checklist (CUSC), was developed for use among young adolescents and adults. The rationale and approach to developing the scale are described, together with measures of internal consistency, construct validity, and criterion-related validity. It is concluded that the scale offers a reasonable alternative to direct observation among older subjects and that further development may make it more useful among pre-adolescents.
A Microsoft-Excel-based tool for running and critically appraising network meta-analyses--an overview and application of NetMetaXL.
BACKGROUND: The use of network meta-analysis has increased dramatically in recent years. WinBUGS, a freely available Bayesian software package, has been the most widely used software for conducting network meta-analyses. However, the learning curve for WinBUGS can be daunting, especially for new users. Furthermore, critical appraisal of network meta-analyses conducted in WinBUGS can be challenging given its limited data manipulation capabilities and the fact that generating graphical output from network meta-analyses often relies on software packages other than the one used for the analyses themselves. METHODS: We developed a freely available Microsoft-Excel-based tool called NetMetaXL, programmed in Visual Basic for Applications, which provides an interface for conducting a Bayesian network meta-analysis using WinBUGS from within Microsoft Excel. The tool allows the user to easily prepare and enter data, set model assumptions, and run the network meta-analysis, with results automatically displayed in an Excel spreadsheet. It also contains macros that use NetMetaXL's interface to generate evidence network diagrams, forest plots, league tables of pairwise comparisons, probability plots (rankograms), and inconsistency plots within Microsoft Excel. All figures generated are of publication quality, increasing the efficiency of knowledge transfer and manuscript preparation. RESULTS: We demonstrate the application of NetMetaXL using data from a previously published network meta-analysis comparing combined resynchronization and implantable defibrillator therapy in left ventricular dysfunction. We replicate the results of the previous publication while demonstrating the result summaries generated by the software. CONCLUSIONS: NetMetaXL makes running network meta-analyses more accessible to novice WinBUGS users by allowing analyses to be conducted entirely within Microsoft Excel. It also allows for more efficient and transparent critical appraisal of network meta-analyses, enhanced standardization of reporting, and integration with health economic evaluations, which are frequently Excel-based.
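For readers appraising such analyses, the WinBUGS models behind tools of this kind typically take the standard Lu and Ades random-effects form for binomial outcomes; the following is a generic sketch of that model, not NetMetaXL's exact WinBUGS code:

$$\begin{aligned} r_{ik} &\sim \mathrm{Binomial}(p_{ik},\, n_{ik}),\\ \operatorname{logit}(p_{ik}) &= \mu_i + \delta_{ik}, \qquad \delta_{i1} = 0,\\ \delta_{ik} &\sim \mathcal{N}\!\left(d_{t_{i1}\,t_{ik}},\, \sigma^2\right),\\ d_{bk} &= d_{1k} - d_{1b} \quad \text{(consistency)}, \end{aligned}$$

where $i$ indexes trials and $k$ arms, $\mu_i$ are trial-specific baselines, $d_{1k}$ are the basic treatment effects relative to a common reference, and $\sigma$ is the between-trial heterogeneity. League tables and rankograms are summaries of the posterior distribution of the $d_{bk}$.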