KINEMATIC COMPARISON OF TWO RACING WHEELCHAIR PROPULSION TECHNIQUES
The purpose of this study was to quantify selected 3-D kinematic characteristics of the upper body during racing wheelchair stroking over a roller system using the conventional technique (CVT) and para-backhand technique (PBT). Eight CVT and seven PBT users served as the subjects. Each subject performed maximum effort stroking for 30 s at two loads and was recorded by two S-VHS camcorders (60 Hz). The CVT was found to have a significantly shorter push time, a smaller relative push time, and a greater relative recovery time than the PBT. A significant difference in arm position at the instant of hand release was found between the two techniques; this difference may have implications for the stress placed on the structures around the shoulder joint. Compared with the PBT, the CVT is a more compact stroke, while the PBT has a faster overall movement speed.
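The timing variables reported in this abstract (push time, relative push time, relative recovery time) follow from hand-contact and hand-release events identified in the 60 Hz video. A minimal sketch of how such metrics could be computed; the function and variable names are illustrative, not taken from the study:

```python
# Illustrative sketch: deriving push-phase timing metrics from stroke
# event frames, as in the kinematic comparison described above.
# Event frames come from 60 Hz video, so time (s) = frame / 60.0.

FRAME_RATE_HZ = 60.0  # S-VHS camcorder sampling rate from the study


def stroke_timing(contact_frame, release_frame, next_contact_frame,
                  frame_rate=FRAME_RATE_HZ):
    """Return absolute and relative timing metrics for one stroke cycle.

    contact_frame:      frame at which the hand first contacts the push rim
    release_frame:      frame at which the hand releases the rim
    next_contact_frame: frame of the following hand contact
    """
    push_time = (release_frame - contact_frame) / frame_rate
    cycle_time = (next_contact_frame - contact_frame) / frame_rate
    recovery_time = cycle_time - push_time
    return {
        "push_time_s": push_time,
        "cycle_time_s": cycle_time,
        "relative_push": push_time / cycle_time,
        "relative_recovery": recovery_time / cycle_time,
    }


# Example: contact at frame 0, release at frame 12, next contact at frame 48
metrics = stroke_timing(0, 12, 48)
```

A shorter push time with a smaller relative push time, as reported for the CVT, corresponds to a smaller `relative_push` fraction of the cycle.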
Cognitive Computation sans Representation
The Computational Theory of Mind (CTM) holds that cognitive processes are essentially computational, and hence computation provides the scientific key to explaining mentality. The Representational Theory of Mind (RTM) holds that representational content is the key feature in distinguishing mental from non-mental systems. I argue that there is a deep incompatibility between these two theoretical frameworks, and that the acceptance of CTM provides strong grounds for rejecting RTM. The focal point of the incompatibility is the fact that representational content is extrinsic to formal procedures as such, and the intended interpretation of syntax makes no difference to the execution of an algorithm. So the unique 'content' postulated by RTM is superfluous to the formal procedures of CTM. And once these procedures are implemented in a physical mechanism, it is exclusively the causal properties of the physical mechanism that are responsible for all aspects of the system's behaviour. So once again, postulated content is rendered superfluous. To the extent that semantic content may appear to play a role in behaviour, it must be syntactically encoded within the system, and just as in a standard computational artefact, so too with the human mind/brain - it's pure syntax all the way down to the level of physical implementation. Hence 'content' is at most a convenient meta-level gloss, projected from the outside by human theorists, which itself can play no role in cognitive processing
Casting Light Upon The Great Endarkenment
While the Enlightenment promoted thinking for oneself independent of religious authority, the ‘Endarkenment’ (Millgram 2015) concerns deference to a new authority: the specialist, a hyperspecializer. Non-specialists need to defer to such authorities as they are unable to understand their reasoning. Millgram describes how humans are capable of being serial hyperspecializers, able to move from one specialism to another. We support the basic thrust of Millgram’s position, and seek to articulate how the core idea is deployed in very different ways in relation to extremely different philosophical areas. We attend to the issue of the degree of isolation of different specialists and we urge greater emphasis on parallel hyperspecialization, which describes how different specialisms can be embodied in one person at one time
Decision and Discovery in Defining “Disease”
This version (May 17, 2005) was published in its final form as:
Schwartz PH. Decision and discovery in defining 'disease'. In: Kincaid H, McKitrick J, editors. Establishing medical reality: essays in the metaphysics and epistemology of biomedical science. Dordrecht: Springer; 2007. p. 47-63. http://dx.doi.org/10.1007/1-4020-5216-2_5

The debate over how to analyze the concept of disease has often centered on the question of whether to include a reference to values, in particular the ‘disvalue’ of diseases, or whether to avoid such notions. ‘Normativists,’ such as King ([1954], 1981) and Culver and Gert (1982), emphasize the undesirability of diseases, while ‘Naturalists,’ most prominently Christopher Boorse (1977, 1987, 1997), instead require just the presence of biological dysfunction. The debate between normativism and naturalism often deteriorates into stalemate, with each side able to point out significant problems with the other. It starts to look as if neither approach can work. In this paper, I argue that the standoff stems from deeply questionable assumptions that have been used to formulate the opposing positions and guide the debate. In the end, I propose an alternative set of guidelines that offer a more constructive way to devise and compare theories.
A Pluralistic Theory of Wordhood
What are words and how should we individuate them? There are two main answers on the philosophical market. For some, words are bundles of structural-functional features defining a unique performance profile. For others, words are non-eternal continuants individuated by their causal-historical ancestry. These conceptions offer competing views of the nature of words, and it seems natural to assume that at most one of them can capture the essence of wordhood. This paper makes a case for pluralism about wordhood: the view that there is a plurality of acceptable conceptions of the nature of words, none of which is uniquely entitled to inform us as to what wordhood consists in
The Search for Stable, Massive, Elementary Particles
In this paper we review the experimental and observational searches for stable, massive, elementary particles other than the electron and proton. The particles may be neutral, may have unit charge or may have fractional charge. They may interact through the strong, electromagnetic, weak or gravitational forces or through some unknown force. The purpose of this review is to provide a guide for future searches - what is known, what is not known, and what appear to be the most fruitful areas for new searches. A variety of experimental and observational methods, such as accelerator experiments, cosmic ray studies, searches for exotic particles in bulk matter and searches using astrophysical observations, are included in this review.

Comment: 34 pages, 8 eps figures
Tomonaga-Luttinger features in the resonant Raman spectra of quantum wires
The differential cross section for resonant Raman scattering from the collective modes in a one-dimensional system of interacting electrons is calculated non-perturbatively using the bosonization method. The results indicate that resonant Raman spectroscopy is a powerful tool for studying Tomonaga-Luttinger liquid behaviour in quasi-one-dimensional electron systems.

Comment: 4 pages, no figures
A Functional Naturalism
I provide two arguments against value-free naturalism. Both are based on considerations concerning biological teleology. Value-free naturalism is the thesis that both (1) everything is, at least in principle, under the purview of the sciences and (2) all scientific facts are purely non-evaluative. First, I advance a counterexample to any analysis on which natural selection is necessary to biological teleology. This should concern the value-free naturalist, since most value-free analyses of biological teleology appeal to natural selection. My counterexample is unique in that it is likely to actually occur. It concerns the creation of synthetic life. Recent developments in synthetic biology suggest scientists will eventually be able to develop synthetic life. Such life, however, would not have any of its traits naturally selected for. Second, I develop a simple argument that biological teleology is a scientific but value-laden notion. Consequently, value-free naturalism is false. I end with some concluding remarks on the implications for naturalism, the thesis that (1). Naturalism may be salvaged only if we reject (2). (2) is a dogma that unnecessarily constrains our conception of the sciences. Only a naturalism that recognizes value-laden notions as scientifically respectable can be true. Such a naturalism is a functional naturalism
Neo-Aristotelian Naturalism and the Evolutionary Objection: Rethinking the Relevance of Empirical Science
Neo-Aristotelian metaethical naturalism is a modern attempt at naturalizing ethics using ideas from Aristotle’s teleological metaphysics. Proponents of this view argue that moral virtue in human beings is an instance of natural goodness, a kind of goodness supposedly also found in the realm of non-human living things. Many critics question whether neo-Aristotelian naturalism is tenable in light of modern evolutionary biology. Two influential lines of objection have appealed to an evolutionary understanding of human nature and natural teleology to argue against this view. In this paper, I offer a reconstruction of these two seemingly different lines of objection as raising instances of the same dilemma, giving neo-Aristotelians a choice between contradicting our considered moral judgment and abandoning metaethical naturalism. I argue that resolving the dilemma requires showing a particular kind of continuity between the norms of moral virtue and norms that are necessary for understanding non-human living things. I also argue that in order to show such a continuity, neo-Aristotelians need to revise the relationship they adopt with empirical science and acknowledge that the latter is relevant to assessing their central commitments regarding living things. Finally, I argue that to move this debate forward, both neo-Aristotelians and their critics should pay attention to recent work on the concept of organism in evolutionary and developmental biology