
    Scientific Methods Must Be Public, and Descriptive Experience Sampling Qualifies

    Hurlburt and Schwitzgebel’s groundbreaking book, Describing Inner Experience: Proponent Meets Skeptic, examines a research method called Descriptive Experience Sampling (DES). DES, which was developed by Hurlburt and collaborators, works roughly as follows. An investigator gives a subject a beeper that sounds at random times. During the day, whenever the subject hears a beep, she writes a description of her conscious experience just before the beep. The next day, the investigator interviews the subject, asks for more details, corrects any apparent mistakes made by the subject, and draws conclusions about the subject’s mind. Throughout the book, Schwitzgebel challenges some of Hurlburt’s specific conclusions. Yet both agree – and so do I – that DES is a worthy method.

    The Resilience of Computationalism

    Computationalism—the view that cognition is computation—has always been controversial. It faces two types of objection. According to insufficiency objections, computation is insufficient for some cognitive phenomenon X. According to objections from neural realization, cognitive processes are realized by neural processes, but neural processes have feature Y, and having Y is incompatible with being (or realizing) computations. In this paper, I explain why computationalism has survived these objections. Insufficiency objections are at best partial: for all they establish, computation may be sufficient for cognitive phenomena other than X, may be part of the explanation of X, or both. Objections from neural realization are based either on a false contrast between feature Y and computation or on an account of computation that is too vague to yield the desired conclusion. I conclude that, to adjudicate the dispute between computationalism and its foes, we need a better account of computation.

    Two Kinds of Concept: Implicit and Explicit

    In his refreshing and thought-provoking book, Edouard Machery (2009) argues that people possess different kinds of concept. This is probably true and important. Before I get to that, I will briefly register two other points of disagreement.

    Construction of Hamiltonian and Nambu forms for the shallow water equations

    A systematic method to derive the Hamiltonian and Nambu forms for the shallow water equations, using the conservation laws for energy and potential enstrophy, is presented. Different mechanisms, such as vortical flows and the emission of gravity waves, emerge from the different conservation laws (CLs) for total energy and potential enstrophy. The equations are constructed using exterior differential forms and self-adjoint operators, and the result is the sum of two Nambu brackets, one for the vortical flow and one for the wave-mean flow interaction, plus a Poisson bracket representing the interaction between divergence and geostrophic imbalance. The advantage of this approach is that the Hamiltonian and Nambu forms can here be written in a coordinate-independent form.
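
    As background for the construction described above, a minimal sketch of the standard conserved functionals and the generic Nambu form for the rotating shallow water equations is given below; the notation (height h, velocity u, relative vorticity ζ, Coriolis parameter f, potential vorticity q) is standard, and this is not the paper's coordinate-independent, exterior-form derivation.

```latex
% Standard conserved functionals assumed here: total energy H and potential
% enstrophy Z for the rotating shallow water equations.
H[\mathbf{u},h] = \int \left( \tfrac{1}{2}\,h\,|\mathbf{u}|^{2}
                            + \tfrac{1}{2}\,g\,h^{2} \right) dA,
\qquad
Z[\mathbf{u},h] = \int \tfrac{1}{2}\,h\,q^{2}\, dA,
\qquad q = \frac{\zeta + f}{h}.

% Generic Nambu form: the evolution of any functional F is generated jointly
% by H and Z through a trilinear, completely antisymmetric bracket.
\frac{dF}{dt} = \{F, Z, H\},
\qquad
\{F, Z, H\} = -\{Z, F, H\} = -\{F, H, Z\}.
```

    The construction in the paper splits such a bracket into two Nambu brackets (vortical flow and wave-mean flow interaction) plus a Poisson bracket for the divergence-imbalance interaction, as stated in the abstract.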

    Hydrodynamic Nambu Brackets derived by Geometric Constraints

    A geometric approach to derive the Nambu brackets for ideal two-dimensional (2D) hydrodynamics is suggested. The derivation is based on two-forms with vanishing integrals in a periodic domain, with the resulting dynamics constrained by an orthogonality condition. As a result, 2D hydrodynamics with vorticity as the dynamic variable emerges as a generic model, with conservation laws that can be interpreted as enstrophy and energy functionals. Generalized forms such as surface quasi-geostrophy and fractional Poisson equations for the stream function also follow from the derivation. The formalism is extended to a hydrodynamic system coupled to a second degree of freedom, with Rayleigh-Bénard convection as an example. This system is reformulated in terms of constitutive conservation laws with two additive brackets representing individual processes: the first representing inviscid 2D hydrodynamics, and the second representing the coupling between hydrodynamics and thermodynamics. The results can be used for the formulation of conservative numerical algorithms that can be employed, for example, for the study of fronts and singularities.
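
    For orientation, the well-known Nambu bracket for inviscid 2D vorticity dynamics, which is the generic model the abstract arrives at, can be written down in a few lines; this is the standard textbook form, not the paper's geometric derivation via two-forms and the orthogonality constraint.

```latex
% 2D incompressible flow with vorticity \omega and stream function \psi,
% where \nabla^{2}\psi = \omega. Energy and enstrophy (the two conserved functionals):
H[\omega] = -\tfrac{1}{2} \int \psi\,\omega\, dA,
\qquad
Z[\omega] = \tfrac{1}{2} \int \omega^{2}\, dA.

% Trilinear Nambu bracket, with Jacobian J(a,b) = a_x b_y - a_y b_x:
\{F, Z, H\} = -\int \frac{\delta F}{\delta \omega}\,
              J\!\left( \frac{\delta Z}{\delta \omega},
                        \frac{\delta H}{\delta \omega} \right) dA.

% With \delta Z/\delta\omega = \omega and \delta H/\delta\omega = -\psi,
% dF/dt = \{F, Z, H\} reproduces the Euler equation
% \partial_t \omega = -J(\psi, \omega).
```

    The Rayleigh-Bénard case in the paper then adds a second additive bracket of the same trilinear type for the coupling between hydrodynamics and thermodynamics.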

    Hyperbolic Covariant Coherent Structures in two dimensional flows

    A new method to describe hyperbolic patterns in two-dimensional flows is proposed. The method is based on Covariant Lyapunov Vectors (CLVs), which are covariant with the dynamics (and are thus mapped by the tangent linear operator into another CLV basis), norm independent, invariant under time reversal, and not necessarily orthonormal. CLVs can therefore give more detailed information on the expansion and contraction directions of the flow than Lyapunov vector bases, which are instead always orthogonal. We suggest a definition of Hyperbolic Covariant Coherent Structures (HCCSs), defined on the scalar field representing the angle between the CLVs. HCCSs can be defined at every time instant and could be useful for understanding the long-term behaviour of particle tracers. We consider three examples: a simple autonomous Hamiltonian system, as well as the non-autonomous "double gyre" and the Bickley jet, to see how well the angle is able to describe particular patterns and barriers. We compare the results from the HCCSs with other coherent patterns defined on finite times by the Finite Time Lyapunov Exponents (FTLEs), to see how the behaviour of these structures changes asymptotically.
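
    The HCCS/CLV machinery itself is beyond a short sketch, but the FTLE fields used above as the finite-time comparison can be illustrated compactly. Below is a minimal Python sketch (NumPy only) that computes the forward FTLE field for the standard non-autonomous double gyre; the parameter values A, EPS, OMEGA and the grid resolution are illustrative choices, not those of the paper.

```python
import numpy as np

# Standard double-gyre stream function psi = A sin(pi f(x,t)) sin(pi y), with
# f(x,t) = eps sin(w t) x^2 + (1 - 2 eps sin(w t)) x  (illustrative parameters).
A, EPS, OMEGA = 0.1, 0.25, 2 * np.pi / 10.0

def velocity(x, y, t):
    a = EPS * np.sin(OMEGA * t)
    b = 1.0 - 2.0 * EPS * np.sin(OMEGA * t)
    f = a * x**2 + b * x
    dfdx = 2.0 * a * x + b
    u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
    v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
    return u, v

def advect(x, y, t0, T, n_steps=200):
    """Integrate particle positions with fixed-step RK4 from t0 to t0 + T."""
    dt = T / n_steps
    t = t0
    for _ in range(n_steps):
        k1 = velocity(x, y, t)
        k2 = velocity(x + 0.5 * dt * k1[0], y + 0.5 * dt * k1[1], t + 0.5 * dt)
        k3 = velocity(x + 0.5 * dt * k2[0], y + 0.5 * dt * k2[1], t + 0.5 * dt)
        k4 = velocity(x + dt * k3[0], y + dt * k3[1], t + dt)
        x = x + dt / 6.0 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        y = y + dt / 6.0 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
        t += dt
    return x, y

def ftle_field(nx=201, ny=101, t0=0.0, T=15.0):
    """Forward FTLE on [0,2]x[0,1] from the largest Cauchy-Green eigenvalue."""
    xs = np.linspace(0.0, 2.0, nx)
    ys = np.linspace(0.0, 1.0, ny)
    X, Y = np.meshgrid(xs, ys)
    fx, fy = advect(X, Y, t0, T)                      # flow map at time t0 + T
    dx, dy = xs[1] - xs[0], ys[1] - ys[0]
    # Flow-map gradient by centred finite differences on the initial grid.
    dfxdx = np.gradient(fx, dx, axis=1); dfxdy = np.gradient(fx, dy, axis=0)
    dfydx = np.gradient(fy, dx, axis=1); dfydy = np.gradient(fy, dy, axis=0)
    # Largest eigenvalue of C = F^T F for each 2x2 flow-map gradient F.
    c11 = dfxdx**2 + dfydx**2
    c12 = dfxdx * dfxdy + dfydx * dfydy
    c22 = dfxdy**2 + dfydy**2
    lam_max = 0.5 * (c11 + c22 + np.sqrt((c11 - c22)**2 + 4 * c12**2))
    return X, Y, np.log(np.sqrt(lam_max)) / abs(T)

if __name__ == "__main__":
    X, Y, sigma = ftle_field()
    print("FTLE range:", sigma.min(), sigma.max())
```

    Ridges of the resulting scalar field mark the finite-time hyperbolic transport barriers against which the asymptotic, CLV-based HCCSs are compared.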

    Nonlinear stratospheric variability: multifractal detrended fluctuation analysis and singularity spectra

    When the stratosphere is characterised as a turbulent system, its temporal fluctuations often show different correlations on different time scales, as well as intermittent behaviour that cannot be captured by a single scaling exponent. In this study, the different scaling laws in the long-term stratospheric variability are studied using Multifractal Detrended Fluctuation Analysis (MFDFA). The analysis is performed by comparing four reanalysis products and different realisations of an idealised numerical model, isolating the role of topographic forcing and seasonal variability, as well as the absence of climate teleconnections and small-scale forcing. The Northern Hemisphere (NH) shows a transition of scaling exponents from time scales shorter than about one year, for which the variability is multifractal and scales in time with a power law corresponding to a red spectrum, to longer time scales, for which the variability is monofractal and scales in time with a power law corresponding to white noise. Southern Hemisphere (SH) variability also shows a transition at annual scales. The SH also shows a narrower dynamical range of multifractality than the NH, as seen in the generalised Hurst exponent and in the singularity spectra. The numerical integrations show that the models are able to reproduce the low-frequency variability but are not able to fully capture the shorter-term variability of the stratosphere.
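
    For readers unfamiliar with the method, the core of a standard MFDFA computation can be sketched in a few lines. The following Python sketch (NumPy only) implements the textbook procedure (profile, segmentation, polynomial detrending, q-th order fluctuation functions, generalised Hurst exponents); it is a generic illustration, not the configuration used on the reanalysis and model data in the paper.

```python
import numpy as np

def mfdfa(x, scales, q_values, order=1):
    """Textbook MFDFA: returns F_q(s) and generalised Hurst exponents h(q).

    x        : 1D time series
    scales   : segment lengths s to analyse
    q_values : moments q (q = 2 recovers standard DFA)
    order    : degree of the detrending polynomial
    """
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())              # step 1: profile (cumulative sum)
    q_values = np.asarray(q_values, dtype=float)
    F = np.zeros((len(q_values), len(scales)))

    for j, s in enumerate(scales):
        n_seg = len(profile) // s
        # step 2: non-overlapping segments, taken forward and backward
        segs = np.concatenate([
            profile[: n_seg * s].reshape(n_seg, s),
            profile[-n_seg * s:].reshape(n_seg, s),
        ])
        t = np.arange(s)
        # step 3: local polynomial detrending, residual variance per segment
        var = np.empty(len(segs))
        for k, seg in enumerate(segs):
            coeffs = np.polyfit(t, seg, order)
            var[k] = np.mean((seg - np.polyval(coeffs, t)) ** 2)
        # step 4: q-th order fluctuation function
        for i, q in enumerate(q_values):
            if np.isclose(q, 0.0):
                F[i, j] = np.exp(0.5 * np.mean(np.log(var)))
            else:
                F[i, j] = np.mean(var ** (q / 2.0)) ** (1.0 / q)

    # step 5: h(q) from the slope of log F_q(s) versus log s
    h = np.array([np.polyfit(np.log(scales), np.log(F[i]), 1)[0]
                  for i in range(len(q_values))])
    return F, h

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    series = rng.standard_normal(2**14)            # white noise: h(q) close to 0.5
    scales = np.unique(np.logspace(4, 10, 20, base=2).astype(int))
    F, h = mfdfa(series, scales, q_values=[-4, -2, 2, 4])
    print(dict(zip([-4, -2, 2, 4], np.round(h, 2))))
```

    The singularity spectra discussed in the title then follow from h(q) via the Legendre transform tau(q) = q h(q) - 1, alpha = d tau/dq, f(alpha) = q alpha - tau(q); a q-dependent h(q) signals multifractality, a flat h(q) monofractality.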

    Computation vs. Information Processing: Why Their Difference Matters to Cognitive Science

    Since the cognitive revolution, it has become commonplace that cognition involves both computation and information processing. Is this one claim or two? Is computation the same as information processing? The two terms are often used interchangeably, but this usage masks important differences. In this paper, we distinguish information processing from computation and examine some of their mutual relations, shedding light on the role each can play in a theory of cognition. We recommend that theorists of cognition be explicit and careful in choosing notions of computation and information and in connecting them together. Much confusion can be avoided by doing so.

    Information Processing, Computation and Cognition

    Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both – although others disagree vehemently. Yet different cognitive scientists use ‘computation’ and ‘information processing’ to mean different things, sometimes without realizing it. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on the one hand and classicism and connectionism/computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates’ empirical aspects.