
    Context and perceptual salience influence the formation of novel stereotypes via cumulative cultural evolution

    We use a transmission chain method to establish how context and category salience influence the formation of novel stereotypes through cumulative cultural evolution. We created novel alien targets by combining features from three category dimensions—color, movement, and shape—thereby creating social targets that were individually unique but that also shared category membership with other aliens (e.g., two aliens might be the same color and shape but move differently). At the start of the transmission chains each alien was randomly assigned attributes that described it (e.g., arrogant, caring, confident). Participants were trained on the alien-attribute assignments and were then tested on their memory for them. The alien-attribute assignments participants produced at test were used as the training materials for the next participant in the transmission chain. As information was repeatedly transmitted, an increasingly simplified, learnable stereotype-like structure emerged for targets that shared the same color, such that by the end of the chains same-colored targets were more likely to share the same attributes (a reanalysis of data from Martin et al., 2014, which we term Experiment 1). The apparent bias toward forming novel stereotypes around the color category dimension was also found for objects (Experiment 2). However, when the category dimension of color was made less salient, it no longer dominated the formation of novel stereotypes (Experiment 3). The current findings suggest that context and category salience influence category dimension salience, which in turn influences the cumulative cultural evolution of information.
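
    To make the transmission-chain mechanism concrete, here is a minimal Python sketch of the paradigm described above. Everything in it is illustrative rather than the authors' materials: the `noisy_recall` error model, the 30% error rate, and the assumption that memory errors fall back on the attribute most often paired with an alien's color (a stand-in for color salience) are our own simplifications.

```python
import random

# Minimal transmission-chain sketch (illustrative only, not the authors' code).
# Each alien is a (color, movement, shape) tuple; attributes start out random.
COLORS, MOVES, SHAPES = ["red", "blue"], ["hop", "glide"], ["round", "spiky"]
ATTRIBUTES = ["arrogant", "caring", "confident", "shy"]

def make_aliens():
    return [(c, m, s) for c in COLORS for m in MOVES for s in SHAPES]

def noisy_recall(assignments, error_rate=0.3):
    """Simulate imperfect memory: on an error, fall back on the attribute
    most often paired with the alien's color (a salience-driven bias)."""
    by_color = {}
    for (color, _, _), attr in assignments.items():
        by_color.setdefault(color, []).append(attr)
    recalled = {}
    for alien, attr in assignments.items():
        if random.random() < error_rate:
            pool = by_color[alien[0]]
            recalled[alien] = max(set(pool), key=pool.count)
        else:
            recalled[alien] = attr
    return recalled

def run_chain(generations=10):
    aliens = make_aliens()
    assignments = {a: random.choice(ATTRIBUTES) for a in aliens}
    for _ in range(generations):  # each participant's output trains the next
        assignments = noisy_recall(assignments)
    return assignments

print(run_chain())  # same-color aliens tend to converge on shared attributes
```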

    How good are your testers? An assessment of testing ability

    During our previous research conducted in the Sheffield Software Engineering Observatory [11], we found that test-first programmers spent a higher percentage of their time testing than those who tested after coding. However, as team allocation was based on subjects' academic records and their preferences, it was unclear whether the test-first programmers were simply better testers. This paper therefore proposes two questionnaires to assess the testing ability of subjects, in order to reveal the factors contributing to the previous findings. Preliminary results show that the testing ability of subjects, as measured by the survey, varies with their professional skill level.

    Methods in Psychological Research

    Psychologists collect empirical data with various methods for different reasons. These diverse methods have their strengths as well as weaknesses. Nonetheless, it is possible to rank them in terms of different criteria. For example, the experimental method is used to obtain the least ambiguous conclusion. Hence, it is best suited to corroborating conceptual, explanatory hypotheses. The interview method, on the other hand, gives research participants a kind of empathic experience that may be important to them. It is for this reason the best method to use in a clinical setting. All non-experimental methods owe their origin to the interview method. Quasi-experiments are suited to answering practical questions when ecological validity is important.

    Numerical simulation of convective airflow in an empty room

    Numerical simulation of airflow inside an empty room has been carried out for forced convection, natural convection and mixed convection respectively, using a computational fluid dynamics approach that solves the Reynolds-averaged Navier-Stokes equations. A two-dimensional model was studied first, focusing on grid refinement, the effect of mesh topology, and the influence of the turbulence model. It was found that structured-mesh results are in better agreement with available experimental measurements for all three scenarios. A further study using a three-dimensional model showed very good agreement with test data at the measuring points. Furthermore, the present studies have revealed low-frequency flow unsteadiness by monitoring the time history of flow variables at the measuring positions; this phenomenon has not yet been reported or discussed in previous studies.
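
    As an illustration of how such low-frequency unsteadiness can be picked out of a monitored time history, the following Python sketch runs an FFT over a synthetic monitor signal. The 0.2 Hz component, sampling rate and noise level are invented for the example and are not values taken from the paper.

```python
import numpy as np

# Illustrative post-processing sketch: detect low-frequency unsteadiness in
# the time history of a monitored flow variable (synthetic data stands in
# for values sampled at a measuring point during the simulation).
dt = 0.01                                  # sampling interval [s] (assumed)
t = np.arange(0, 60, dt)
# synthetic monitor signal: mean flow + slow 0.2 Hz oscillation + noise
u = 1.0 + 0.05 * np.sin(2 * np.pi * 0.2 * t) + 0.01 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(u - u.mean()))   # remove mean before the FFT
freqs = np.fft.rfftfreq(t.size, d=dt)

peak = freqs[np.argmax(spectrum)]
print(f"dominant frequency: {peak:.2f} Hz")    # ~0.2 Hz -> low-frequency mode
```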

    Key Steps in Developing a Cognitive Vaccine against Traumatic Flashbacks: Visuospatial Tetris versus Verbal Pub Quiz

    Background: Flashbacks (intrusive memories of a traumatic event) are the hallmark feature of Post-Traumatic Stress Disorder; however, preventative interventions are lacking. Tetris may offer a 'cognitive vaccine' [1] against flashback development after trauma exposure. We previously reported that playing the computer game Tetris soon after viewing traumatic material reduced flashbacks compared to no-task [1]. However, two criticisms need to be addressed for clinical translation: (1) Would all games have this effect via distraction/enjoyment, or might some games even be harmful? (2) Would effects be found if the game were administered several hours post-trauma? Accordingly, we tested Tetris against an alternative computer game - Pub Quiz - which we hypothesized not to be helpful (Experiments 1 and 2), and extended the intervention interval to 4 hours (Experiment 2). Methodology/Principal Findings: The trauma film paradigm was used as an experimental analog for flashback development in healthy volunteers. In both experiments, participants viewed traumatic film footage of death and injury before completing one of the following: (1) a no-task control condition, (2) Tetris, or (3) Pub Quiz. Flashbacks were monitored for 1 week. Experiment 1: 30 min after the traumatic film, playing Tetris led to a significant reduction in flashbacks compared to the no-task control, whereas Pub Quiz led to a significant increase in flashbacks. Experiment 2: 4 hours post-film, playing Tetris led to a significant reduction in flashbacks compared to the no-task control, whereas Pub Quiz did not. Conclusions/Significance: First, computer games can have differential effects post-trauma, as predicted by a cognitive science formulation of trauma memory. In both experiments, playing Tetris after the trauma film reduced flashbacks. Pub Quiz did not have this effect, and even increased flashbacks in Experiment 1. Thus, not all computer games are beneficial or merely distracting post-trauma - some may be harmful. Second, the beneficial effects of Tetris are retained at 4 hours post-trauma. Clinically, this offers a feasible time window in which to administer a post-trauma "cognitive vaccine".

    Evaluating the performance of model transformation styles in Maude

    Rule-based programming has been shown to be very successful in many application areas. Two prominent examples are the specification of model transformations in model-driven development approaches and the definition of structured operational semantics of formal languages. General rewriting frameworks such as Maude are flexible enough to allow the programmer to adopt and mix various rule styles. The choice between styles can be biased by the programmer's background. For instance, experts in visual formalisms might prefer graph-rewriting styles, while experts in semantics might prefer structurally inductive rules. This paper evaluates the performance of different rule styles on a significant benchmark taken from the model transformation literature. Our results show that, depending on the actual transformation being carried out, different rule styles can offer drastically different performance. We point out the situations from which each rule style benefits, offering a valuable set of hints for choosing one style over another.
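
    The kind of performance gap at stake can be illustrated outside Maude with a toy Python analogue: the same transformation written once as a single structural pass and once as a naive find-a-redex-and-restart fixpoint loop. The rule, the data size and the timing harness below are our own invention and only mimic the sort of strategy cost the paper measures; they are not the paper's benchmark.

```python
import timeit

# Toy analogue (in Python, not Maude) of how rule style changes performance:
# the same transformation -- rename every "old" element to "new" -- written
# as a one-pass structural rule and as a naive fixpoint redex search.

def one_pass(model):
    """Apply the rule everywhere in a single structural traversal."""
    return ["new" if x == "old" else x for x in model]

def fixpoint(model):
    """Find one redex, rewrite it, restart -- until no rule applies."""
    model = list(model)
    while True:
        try:
            i = model.index("old")    # whole-term search at every step
        except ValueError:
            return model              # no redex left: fixpoint reached
        model[i] = "new"

model = ["old"] * 1000
print(timeit.timeit(lambda: one_pass(model), number=20))   # linear
print(timeit.timeit(lambda: fixpoint(model), number=20))   # quadratic, slower
```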

    Ontology of core data mining entities

    In this article, we present OntoDM-core, an ontology of core data mining entities. OntoDM-core defines the most essential data mining entities in a three-layered ontological structure comprising a specification, an implementation and an application layer. It provides a representational framework for describing the mining of structured data, and in addition provides taxonomies of datasets, data mining tasks, generalizations, data mining algorithms and constraints, organized by the type of data. OntoDM-core is designed to support a wide range of applications/use cases, such as semantic annotation of data mining algorithms, datasets and results; annotation of QSAR studies in the context of drug discovery investigations; and disambiguation of terms in text mining. The ontology has been thoroughly assessed following established ontology engineering practices, is fully interoperable with many domain resources and is easy to extend.
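
    A minimal sketch of the three-layered idea, using illustrative Python dataclasses: an entity is described at a specification layer, realized at an implementation layer, and executed on concrete data at an application layer. The class names, fields and example values below are our invention and do not correspond to OntoDM-core's actual identifiers.

```python
from dataclasses import dataclass

# Hypothetical names; OntoDM-core's real class hierarchy is richer than this.

@dataclass
class Specification:
    """What the entity is, in the abstract (e.g. an algorithm definition)."""
    name: str
    data_type: str            # taxonomies are organized by the type of data

@dataclass
class Implementation:
    """A concrete realization of a specification in some software."""
    spec: Specification
    software: str

@dataclass
class Application:
    """An execution of an implementation on a concrete dataset."""
    impl: Implementation
    dataset: str

spec = Specification("decision-tree induction", data_type="tabular")
impl = Implementation(spec, software="an assumed library routine")
run = Application(impl, dataset="an assumed QSAR table")
print(run.impl.spec.name)     # application links back through all layers
```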

    Agile Requirements Engineering: A systematic literature review

    Nowadays, Agile Software Development (ASD) is used to cope with increasing complexity in system development. Hybrid development models, integrating User-Centered Design (UCD), are applied with the aim of delivering competitive products with a suitable User Experience (UX). Stakeholder and user involvement during Requirements Engineering (RE) is therefore essential for establishing a collaborative environment with constant feedback loops. The aim of this study is to capture the current state of the art of the literature on Agile RE, with a focus on stakeholder and user involvement. In particular, we investigate what approaches exist to involve stakeholders in the process, which methodologies are commonly used to represent the user perspective, and how requirements management is carried out. We conducted a Systematic Literature Review (SLR) with an extensive quality assessment of the included studies and identified 27 relevant papers. After analyzing them in detail, we derive deep insights into the following aspects of Agile RE: stakeholder and user involvement, data gathering, the user perspective, integrated methodologies, shared understanding, artifacts, documentation and Non-Functional Requirements (NFR). Agile RE is a complex research field with cross-functional influences. This study contributes to the software development body of knowledge by assessing the involvement of stakeholders and users in Agile RE, describing methodologies that make ASD more human-centric, and giving an overview of requirements management in ASD.
    Ministerio de Economía y Competitividad TIN2013-46928-C3-3-R; Ministerio de Economía y Competitividad TIN2015-71938-RED

    Event Recognition Using Signal Spectrograms in Long Pulse Experiments

    As discharge duration increases, real-time complex analysis of the signal becomes more important. In this context, data acquisition and processing systems must provide models for designing experiments that use event-oriented plasma control. One example of advanced data analysis is signal classification. Off-line statistical analysis of a large number of discharges provides information for developing algorithms that determine plasma parameters from measurements of magnetohydrodynamic waves, for example, to detect density fluctuations induced by Alfvén cascades using morphological patterns. The need to apply different algorithms to the signals, and to select subsequent processing algorithms based on previous results, necessitates an event-based experiment. The Intelligent Test and Measurement System platform is an example of an architecture designed to implement distributed data acquisition and real-time processing systems. The processing algorithm sequence is modeled using an event-based paradigm. The adaptive capacity of this model is based on logic defined by state machines in SCXML. The Intelligent Test and Measurement System platform mixes a local multiprocessing model with a distributed deployment of services based on Jini.
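
    To illustrate spectrogram-based recognition of a frequency-sweeping mode (morphologically similar to an Alfvén cascade), here is a self-contained Python sketch on synthetic data. It is not the Intelligent Test and Measurement System code; the signal, the sampling rate and the rising-ridge detection heuristic are all our own simplifications.

```python
import numpy as np
from scipy.signal import spectrogram

# Illustrative sketch of spectrogram-based event detection on synthetic data.
# A chirp-like component stands in for a frequency-sweeping plasma mode.
fs = 10_000                                        # sampling rate [Hz] (assumed)
t = np.arange(0, 2.0, 1 / fs)
sweep = np.sin(2 * np.pi * (500 + 400 * t) * t)    # upward frequency sweep
signal = sweep + 0.5 * np.random.randn(t.size)     # add measurement noise

f, seg_t, Sxx = spectrogram(signal, fs=fs, nperseg=1024, noverlap=512)

# Crude morphological check: does the dominant frequency rise over time?
ridge = f[np.argmax(Sxx, axis=0)]                  # peak frequency per segment
rising = np.polyfit(seg_t, ridge, 1)[0] > 0        # positive slope of the ridge
print("sweep-like event detected" if rising else "no event")
```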