
    Searching for answers in an uncertain world: meaning threats lead to increased working memory capacity

    The Meaning Maintenance Model posits that individuals seek to resolve uncertainty by searching for patterns in the environment, yet little is known about how this is accomplished. Four studies investigated whether uncertainty affects people’s cognitive functioning; in particular, whether meaning threats lead to increased working memory capacity. In each study, we exposed participants to either an uncertain stimulus used to threaten meaning in past studies or a control stimulus. Participants then completed a working memory measure in which they recalled either lists of words (Studies 1 and 2) or strings of digits (Studies 3 and 4). We evaluated our findings with both a frequentist approach and a Bayesian analysis. Across the four studies, we found a small but consistent effect: participants in the meaning threat condition showed improved performance on the working memory tasks. Overall, our findings are consistent with the hypothesis that working memory capacity increases when people experience a meaning threat, which may help to explain improved pattern recognition. Additionally, our results highlight the value of a Bayesian analytic approach, particularly when studying phenomena with high variance.
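    The abstract's dual frequentist/Bayesian analysis can be illustrated with a minimal sketch: a Welch t statistic alongside a BIC-based Bayes factor approximation (Wagenmakers, 2007) on simulated recall scores. The group means, standard deviation, and sample size below are illustrative assumptions, not the study's actual data.

    ```python
    # Sketch: frequentist t statistic vs. a BIC-approximated Bayes factor
    # on simulated word-recall scores. All numbers are assumed, not the
    # study's data.
    import math, random

    random.seed(1)
    n = 100
    control = [random.gauss(5.0, 1.5) for _ in range(n)]  # recalled words
    threat  = [random.gauss(5.4, 1.5) for _ in range(n)]  # small positive shift

    def mean(xs):
        return sum(xs) / len(xs)

    # Welch two-sample t statistic (unequal variances allowed)
    def welch_t(a, b):
        ma, mb = mean(a), mean(b)
        va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
        vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
        return (mb - ma) / math.sqrt(va / len(a) + vb / len(b))

    # BIC of a group-means model: n*ln(RSS/n) + k*ln(n)
    def bic(groups, k):
        data = [x for g in groups for x in g]
        rss = sum((x - mean(g)) ** 2 for g in groups for x in g)
        return len(data) * math.log(rss / len(data)) + k * math.log(len(data))

    bic_null = bic([control + threat], k=1)    # one common mean
    bic_alt  = bic([control, threat], k=2)     # separate group means
    bf10 = math.exp((bic_null - bic_alt) / 2)  # evidence for a difference

    print(f"t = {welch_t(control, threat):.2f}, BF10 = {bf10:.2f}")
    ```

    A Bayes factor above 1 favours the group-difference model; unlike a p-value, it can also quantify evidence *for* the null, which is one reason the Bayesian approach pays off with high-variance phenomena.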

    Continuous Damage Fiber Bundle Model for Strongly Disordered Materials

    We present an extension of the continuous damage fiber bundle model to describe the gradual degradation of highly heterogeneous materials under an increasing external load. Breaking of a fiber in the model is preceded by a sequence of partial failure events occurring at random threshold values. To capture the subsequent propagation and arrest of cracks, as well as the disorder in the number of degradation steps of the material constituents, the failure thresholds of single fibers are sorted into ascending order and their total number is a Poisson-distributed random variable over the fibers. Analytical and numerical calculations show that the failure process of the system is governed by extreme value statistics, which has a substantial effect on both the macroscopic constitutive behaviour and the microscopic bursting activity.
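    The ingredients above can be sketched in a few lines: each fiber receives a Poisson-distributed number of failure thresholds sorted into ascending order, each partial failure reduces the fiber's stiffness, and the macroscopic constitutive curve follows under equal load sharing. The stiffness reduction factor, uniform threshold distribution, and Poisson mean below are illustrative assumptions, not the paper's exact disorder.

    ```python
    # Minimal sketch of a continuous damage fiber bundle with a Poissonian
    # number of degradation steps per fiber; parameters are assumed.
    import math, random

    random.seed(42)
    N = 5000    # number of fibers
    a = 0.7     # stiffness reduction factor per partial failure (assumed)
    lam = 3.0   # mean of the Poissonian number of damage steps (assumed)

    def poisson(lam):
        # Knuth's multiplicative sampler, adequate for small lambda
        L = math.exp(-lam)
        k, p = 0, 1.0
        while p > L:
            k += 1
            p *= random.random()
        return k - 1

    fibers = []
    for _ in range(N):
        k = max(1, poisson(lam))  # at least one damage step per fiber
        # random failure thresholds, sorted into ascending order
        fibers.append(sorted(random.uniform(0.0, 1.0) for _ in range(k)))

    def stress(eps):
        """Macroscopic stress at strain eps under equal load sharing."""
        total = 0.0
        for th in fibers:
            d = sum(1 for t in th if t <= eps)  # damage steps triggered
            if d < len(th):                     # fiber not yet fully broken
                total += (a ** d) * eps         # degraded stiffness a^d
        return total / len(fibers)

    strains = [i / 100 for i in range(101)]
    curve = [stress(e) for e in strains]
    peak = max(curve)
    print(f"peak stress {peak:.3f} at strain {strains[curve.index(peak)]:.2f}")
    ```

    The constitutive curve rises, peaks, and softens as fibers exhaust their degradation steps; the location of the peak is controlled by the extreme-value statistics of the sorted thresholds.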

    A specification method for the scalable self-governance of complex autonomic systems

    IBM, amongst many others, has sought to endow computer systems with self-management capabilities by delegating vital functions to the software itself, and has proposed the Autonomic Computing model, inducing the so-called self-* properties: the system's ability to be self-configuring, self-optimising, self-healing and self-protecting. Initial attempts to realise this vision have so far mostly relied on passive adaptation, whereby Design by Contract and Event-Condition-Action (ECA) constructs regulate the target system's behaviour: when a specific event makes a certain condition true, an action is triggered which executes either within the system or on its environment. While such a model works well for closed systems, its effectiveness and applicability diminish as the size and complexity of the managed system increase, necessitating frequent updates to the ECA rule set to cater for new and/or unforeseen system behaviour. More recent research adopts the parametric adaptation model, where the events, conditions and actions may be adjusted at runtime in response to the system's observed state. This improved control model works well up to a point, but for large-scale systems of systems, with very many component interactions, the predictability and traceability of the regulation and its impact on the whole system become intractable. Self-organising systems theory, however, offers a scalable alternative to systems control, utilising emergent behaviour, observed at a global level, that results from the low-level interactions of the distributed components. Here, for instance, key signals (signs) for ECA-style feedback control need no longer be recognised or understood in the context of the design-time system, but are instead defined by their relevance to the runtime system.
Nonetheless, this model still suffers from a usually inaccessible control model with no intrinsic meaning assigned to data extracted from the system's operation. In other words, there is no grounded definition of particular observable events occurring in the system. This condition is termed the Signal Grounding Problem. It cannot usually be solved by analytical or algorithmic methods, as these generally require precise problem formulations and a static operating domain. Rather, cognitive techniques are needed that can evaluate and improve performance in the presence of complex, incomplete, dynamic and evolving environments. In order to develop a specification method for the scalable self-governance of autonomic systems of systems, this thesis presents a number of ways to alleviate, or circumvent, the Signal Grounding Problem through the use of cognitive systems and the properties of complex systems. After reviewing the specification methods available for governance models, the Situation Calculus dialect of first-order logic is described, with the modalities necessary for specifying deliberative monitoring in partially observable environments with stochastic actions. This permits a specification method that can depict system guards and norms under central control, as well as the deliberative functions required for decentralised components to work around the Signal Grounding Problem, engineer emergence and generally utilise the properties of large complex systems for their own self-governance. It is shown how these large-scale behaviours may be implemented, and their properties assessed and utilised by an Observer System, through fully functioning implementations and simulations.
The work concludes with two case studies showing how the specification would be achieved in practice: an observer-based meta-system for a decision support system in medicine is described, specified and implemented up to parametric adaptation, and a NASA project is described with a specification given for the interactions and cooperative behaviour that lead to scale-free connectivity, which the observer system may then utilise for a previously described efficient monitoring strategy.
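The ECA construct described above (when an event makes a condition true, an action fires) can be sketched minimally as follows. The class names and the CPU-scaling example rule are illustrative, not taken from the thesis.

```python
# Minimal sketch of an Event-Condition-Action (ECA) rule engine.
# Rule/Engine names and the scaling example are illustrative.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    event: str                           # event type the rule listens for
    condition: Callable[[dict], bool]    # guard over the observed state
    action: Callable[[dict], None]       # effect on the system/environment

class Engine:
    def __init__(self):
        self.rules: list[Rule] = []

    def register(self, rule: Rule):
        self.rules.append(rule)

    def dispatch(self, event: str, state: dict):
        # When a specific event makes a condition true, trigger the action
        for rule in self.rules:
            if rule.event == event and rule.condition(state):
                rule.action(state)

# Example: a self-optimising rule that adds a replica under CPU saturation
engine = Engine()
engine.register(Rule(
    event="metrics_tick",
    condition=lambda s: s["cpu"] > 0.9,
    action=lambda s: s.update(replicas=s["replicas"] + 1),
))

state = {"cpu": 0.95, "replicas": 2}
engine.dispatch("metrics_tick", state)
print(state["replicas"])  # prints 3
```

The brittleness the thesis identifies is visible even here: every new failure mode requires another hand-written `Rule`, which is exactly why the rule set must be updated as the managed system grows, motivating parametric adaptation and, ultimately, the self-organising alternative.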

    Eyes wide shut? UK consumer perceptions on aviation climate impacts and travel decisions to New Zealand

    The purview of climate change concern has implicated air travel, as evidenced in a growing body of academic literature concerned with aviation CO2 emissions. This article assesses the relevance of climate change to long haul air travel decisions to New Zealand for United Kingdom consumers. Based on 15 semi-structured open-ended interviews conducted in Bournemouth, UK during June 2009, it was found that participants were unlikely to forgo potential travel to New Zealand out of concern over air travel emissions. Underpinning the interviewees’ understandings of and responses to air travel’s climate impact was a spectrum of awareness and attitudes towards air travel and climate change. This spectrum ranged from individuals who were unaware of air travel’s climate impact to those who were beginning to consume air travel with a ‘carbon conscience’. Within this spectrum were some who were aware of the impact but not willing to change their travel behaviours at all. Rather than implicating long haul air travel, the empirical evidence instead points to changing perceptions of frequent short haul air travel, and calls for both government and media in the UK to deliver more concrete messages on air travel’s climate impact.

    Collisions of inhomogeneous pre-planetesimals

    In the framework of the coagulation scenario, kilometre-sized planetesimals form by subsequent collisions of pre-planetesimals of sizes from centimetres to hundreds of metres. Pre-planetesimals are fluffy, porous dust aggregates, which are inhomogeneous owing to their collisional history. Planetesimal growth can be prevented by catastrophic disruption in pre-planetesimal collisions above the destruction velocity threshold. We develop an inhomogeneity model based on the density distribution of dust aggregates, which is assumed to be a Gaussian distribution with a well-defined standard deviation. As a second input parameter, we consider the typical size of an inhomogeneous clump. These input parameters are easily accessible by laboratory experiments. For the simulation of the dust aggregates, we utilise a smoothed particle hydrodynamics (SPH) code with extensions for modelling porous solid bodies. The porosity model was previously calibrated for the simulation of silica dust, which commonly serves as an analogue for pre-planetesimal material. The inhomogeneity is imposed as an initial condition on the SPH particle distribution. We carry out collisions of centimetre-sized dust aggregates of intermediate porosity, varying the standard deviation of the inhomogeneous distribution at fixed typical clump size. The collision outcome is categorised according to the four-population model. We show that inhomogeneous pre-planetesimals are more prone to destruction than homogeneous aggregates. Even slight inhomogeneities can lower the threshold for catastrophic disruption. For a fixed collision velocity, the sizes of the fragments decrease with increasing inhomogeneity. Pre-planetesimals with an active collisional history tend to be weaker. This is a possible obstacle to collisional growth and needs to be taken into account in future studies of the coagulation scenario.
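    The two input parameters of the inhomogeneity model (the standard deviation of the Gaussian density distribution and the typical clump size) can be illustrated with a sketch of how such an initial condition might be imposed on a particle lattice: one Gaussian filling-factor draw per clump cell, shared by all particles in that cell. The grid size, mean filling factor, clump size, and clipping bounds below are assumptions for illustration, not the paper's actual setup.

    ```python
    # Sketch of a clump-wise Gaussian inhomogeneity initial condition on a
    # cubic particle grid. All parameter values are assumed.
    import random

    random.seed(0)
    mean_phi = 0.35  # mean filling factor: intermediate porosity (assumed)
    sigma = 0.05     # standard deviation of the inhomogeneity (assumed)
    clump = 4        # typical clump size, particles per axis (assumed)
    n = 16           # particles per axis of the aggregate (assumed)

    def clump_phi():
        # one Gaussian draw per clump, clipped to stay physical in (0, 1)
        return min(0.99, max(0.01, random.gauss(mean_phi, sigma)))

    clumps = {}
    density = {}
    for i in range(n):
        for j in range(n):
            for k in range(n):
                cell = (i // clump, j // clump, k // clump)
                if cell not in clumps:
                    clumps[cell] = clump_phi()
                # every particle in a clump shares its filling factor
                density[(i, j, k)] = clumps[cell]

    phis = list(density.values())
    mean_sample = sum(phis) / len(phis)
    print(f"sample mean filling factor {mean_sample:.3f}")
    ```

    Increasing `sigma` at fixed `clump` reproduces the study's parameter sweep: the mean porosity stays put while the contrast between weak and strong clumps grows, which is what lowers the disruption threshold.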