    Apperceptive patterning: Artefaction, extensional beliefs and cognitive scaffolding

    In “Psychopower and Ordinary Madness” my ambition, as it relates to Bernard Stiegler’s recent literature, was twofold: 1) to critique Stiegler’s work on exosomatization and artefactual posthumanism—or, more specifically, nonhumanism—in order to problematize approaches to media archaeology that rely upon technical exteriorization; and 2) to challenge how Stiegler engages with Giuseppe Longo and Francis Bailly’s conception of negative entropy. These efforts were directed by a prevalent techno-cultural qualifier: the rise of Synthetic Intelligence (including neural nets, deep learning, predictive processing and Bayesian models of cognition). This paper continues that project, but first directs a critical analytic lens at the Derridean practice of the ontologization of grammatization from which Stiegler emerges, while also distinguishing how metalanguages operate in relation to object-oriented environmental interaction by way of inferentialism. Stalking continental (Kapp, Simondon, Leroi-Gourhan, etc.) and analytic traditions (e.g., Carnap, Chalmers, Clark, Sutton, Novaes, etc.), we move from artefacts to AI and predictive processing so as to link theories of technicity with philosophy of mind. Simultaneously drawing on Robert Brandom’s conceptualization of the roles that commitments play in retrospectively reconstructing the social experiences that lead to our endorsement(s) of norms, we complement this account with Reza Negarestani’s deprivatized account of intelligence while analyzing the equipollent roles of language and media (both digital and analog).

    The Third Gravitational Lensing Accuracy Testing (GREAT3) Challenge Handbook

    The GRavitational lEnsing Accuracy Testing 3 (GREAT3) challenge is the third in a series of image analysis challenges, with the goal of testing and facilitating the development of methods for analyzing astronomical images that will be used to measure weak gravitational lensing. This measurement requires extremely precise estimation of very small galaxy shape distortions, in the presence of far larger intrinsic galaxy shapes and of distortions due to the blurring kernel caused by the atmosphere, telescope optics, and instrumental effects. The GREAT3 challenge is posed to the astronomy, machine learning, and statistics communities, and includes tests of three specific effects that are of immediate relevance to upcoming weak lensing surveys, two of which have never been tested in a community challenge before. These effects include realistically complex galaxy models based on high-resolution imaging from space; a spatially varying, physically motivated blurring kernel; and the combination of multiple different exposures. To facilitate entry by people new to the field, and for use as a diagnostic tool, the simulation software for the challenge is publicly available, though the exact parameters used for the challenge are blinded. Sample scripts to analyze the challenge data using existing methods will also be provided. See http://great3challenge.info and http://great3.projects.phys.ucl.ac.uk/leaderboard/ for more information.
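    Since the challenge centers on recovering tiny shape distortions through a blurring kernel, a minimal self-contained sketch of the underlying measurement problem may help (NumPy/SciPy only; a toy moments-based estimator, not the challenge's blinded simulation software, and all parameter values are invented for illustration):

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def gaussian_blob(n, sigma, e1=0.0):
        # Elliptical Gaussian on an n x n grid; e1 is the distortion
        # (Ixx - Iyy) / (Ixx + Iyy) along the x-axis.
        y, x = np.mgrid[:n, :n] - (n - 1) / 2.0
        sx = sigma * np.sqrt(1 + e1)
        sy = sigma * np.sqrt(1 - e1)
        return np.exp(-0.5 * ((x / sx) ** 2 + (y / sy) ** 2))

    def distortion(img):
        # Unweighted second moments -> (e1, e2) distortion of the image.
        y, x = np.mgrid[:img.shape[0], :img.shape[1]]
        f = img / img.sum()
        xb, yb = (f * x).sum(), (f * y).sum()
        ixx = (f * (x - xb) ** 2).sum()
        iyy = (f * (y - yb) ** 2).sum()
        ixy = (f * (x - xb) * (y - yb)).sum()
        return (ixx - iyy) / (ixx + iyy), 2 * ixy / (ixx + iyy)

    gal = gaussian_blob(64, sigma=3.0, e1=0.05)   # intrinsic galaxy shape
    psf = gaussian_blob(64, sigma=2.0)            # isotropic blurring kernel
    obs = fftconvolve(gal, psf, mode="same")      # what the telescope records

    print("intrinsic e1: %.4f" % distortion(gal)[0])  # ~0.0500
    print("observed  e1: %.4f" % distortion(obs)[0])  # ~0.035: diluted by PSF
    ```

    The gap between the intrinsic and observed values shows why the challenge emphasizes precise PSF treatment: the blurring kernel systematically dilutes the shape signal, and real methods must correct for it before the percent-level lensing distortion can be read off.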

    LSST: Comprehensive NEO Detection, Characterization, and Orbits

    (Abridged) The Large Synoptic Survey Telescope (LSST) is currently by far the most ambitious proposed ground-based optical survey. Solar System mapping is one of the four key scientific design drivers, with emphasis on efficient Near-Earth Object (NEO) and Potentially Hazardous Asteroid (PHA) detection, orbit determination, and characterization. In a continuous observing campaign of pairs of 15-second exposures with its 3,200-megapixel camera, LSST will cover the entire available sky every three nights in two photometric bands to a depth of V=25 per visit (two exposures), with exquisitely accurate astrometry and photometry. Over the proposed survey lifetime of 10 years, each sky location would be visited about 1000 times. The baseline design satisfies strong constraints on the cadence of observations mandated by PHAs, such as closely spaced pairs of observations to link different detections and short exposures to avoid trailing losses. Equally important, due to frequent repeat visits LSST will effectively provide its own follow-up to derive orbits for detected moving objects. Detailed modeling of LSST operations, incorporating real historical weather and seeing data from the LSST site at Cerro Pachón, shows that LSST using its baseline design cadence could find 90% of the PHAs with diameters larger than 250 m, and 75% of those larger than 140 m, within ten years. However, the ongoing simulations suggest that by optimizing sky coverage, the LSST system, with its first light in 2013, can reach the Congressional mandate of cataloging 90% of PHAs larger than 140 m by 2020.
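    As a rough consistency check on the cadence figures quoted above (illustrative arithmetic only, not the paper's detailed operations simulations; the usable-weather fraction is an assumed round number):

    ```python
    # Back-of-the-envelope check of the quoted LSST cadence numbers.
    years = 10
    revisit_nights = 3        # entire available sky covered every 3 nights
    usable_fraction = 0.8     # assumed fraction surviving weather/downtime

    opportunities = years * 365.25 / revisit_nights
    visits = opportunities * usable_fraction
    print(f"~{opportunities:.0f} opportunities -> ~{visits:.0f} visits per field")
    # -> ~1218 opportunities -> ~974 visits: consistent with "about 1000 times".
    ```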

    Phenotypic redshifts with self-organizing maps: A novel method to characterize redshift distributions of source galaxies for weak lensing

    Wide-field imaging surveys such as the Dark Energy Survey (DES) rely on coarse measurements of spectral energy distributions in a few filters to estimate the redshift distribution of source galaxies. In this regime, sample variance, shot noise, and selection effects limit the attainable accuracy of redshift calibration and thus of cosmological constraints. We present a new method to combine wide-field, few-filter measurements with catalogs from deep fields with additional filters and sufficiently low photometric noise to break degeneracies in photometric redshifts. The multi-band deep field is used as an intermediary between wide-field observations and accurate redshifts, greatly reducing sample variance, shot noise, and selection effects. Our implementation of the method uses self-organizing maps to group galaxies into phenotypes based on their observed fluxes, and is tested using a mock DES catalog created from N-body simulations. For an idealized simulation of the DES Year 3 weak-lensing tomographic analysis, it yields a typical uncertainty on the mean redshift in each of five tomographic bins of σ_Δz = 0.007, a 60% improvement over the Year 1 analysis. Although the implementation of the method is tailored to DES, its formalism can be applied to other large photometric surveys with a similar observing strategy.
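    To make the phenotype idea concrete, here is a minimal sketch of the general technique (self-contained NumPy on invented mock data; an illustration of SOM-based grouping, not the DES pipeline): train a small self-organizing map on observed fluxes, call each map cell a phenotype, and transfer mean redshifts from a calibration subset to the full sample cell by cell.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def train_som(data, grid=(8, 8), epochs=10, lr0=0.5, sig0=3.0):
        # Tiny self-organizing map; returns weights of shape (gx, gy, n_bands).
        gx, gy = grid
        w = rng.normal(size=(gx, gy, data.shape[1]))
        ii, jj = np.indices((gx, gy))
        n_steps, step = epochs * len(data), 0
        for _ in range(epochs):
            for x in data[rng.permutation(len(data))]:
                frac = step / n_steps
                lr, sig = lr0 * (1 - frac), sig0 * (1 - frac) + 0.5
                bi, bj = np.unravel_index(
                    ((w - x) ** 2).sum(-1).argmin(), (gx, gy))  # best cell
                h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * sig ** 2))
                w += lr * h[..., None] * (x - w)  # pull nearby cells toward x
                step += 1
        return w

    def phenotype(data, w):
        # Index of the best-matching SOM cell for each galaxy.
        d = ((data[:, None, None, :] - w) ** 2).sum(-1)
        return d.reshape(len(data), -1).argmin(1)

    # Mock catalog: 4 noisy fluxes whose level tracks redshift (toy data only).
    n = 2000
    z_true = rng.uniform(0.2, 1.2, n)
    fluxes = z_true[:, None] + 0.05 * rng.normal(size=(n, 4))

    w = train_som(fluxes)
    cells = phenotype(fluxes, w)

    # "Deep field" calibration subset with known redshifts -> mean z per cell.
    deep = rng.choice(n, 200, replace=False)
    z_cell = np.full(w.shape[0] * w.shape[1], np.nan)
    for c in np.unique(cells[deep]):
        z_cell[c] = z_true[deep][cells[deep] == c].mean()

    # Calibrated estimate for the rest of the catalog: its cell's mean z.
    ok = ~np.isnan(z_cell[cells])
    print("redshift scatter:", np.std(z_cell[cells][ok] - z_true[ok]))
    ```

    In the actual method the deep field also has more filters and lower noise than the wide survey, which is what breaks the photometric-redshift degeneracies; here the calibration subset merely stands in for that role.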

    Early aspects: aspect-oriented requirements engineering and architecture design

    This paper reports on the third Early Aspects: Aspect-Oriented Requirements Engineering and Architecture Design Workshop, which was held in Lancaster, UK, on March 21, 2004. The workshop included a presentation session and working sessions in which particular topics in early aspects were discussed. The primary goal of the workshop was to focus on the challenges of defining methodical software development processes for aspects from early in the software life cycle, and to explore the potential of proposed methods and techniques to scale up to industrial applications.

    Aerospace Medicine and Biology, a continuing bibliography with indexes

    This bibliography lists 197 reports, articles, and other documents introduced into the NASA scientific and technical information system in November 1984.

    Beyond the Circle of Life

    It seems certain to me that I will die and stay dead. By “I”, I mean me, Greg Nixon, this person, this self-identity. I am so intertwined with the chiasmus of lives, bodies, ecosystems, symbolic intersubjectivity, and life on this particular planet that I cannot imagine this identity continuing alone without them. However, one may survive one’s life by believing in universal awareness, perfection, and the peace that passes all understanding. Perhaps we bring this back with us to the Source from which we began, changing it, enriching it. Once we have lived – if we don’t choose the eternal silence of oblivion by life denial, vanity, indifference, or simple weariness – the Source learns and we awaken within it. Awareness, consciousness, is universal – it comes with the territory – so maybe you will be one of the few prepared to become unexpectedly enlightened after the loss of body and self. You may discover your own apotheosis – something you always were, but after a lifetime of primate experience, now much more. Since you are of the Source, since you have changed from life experience and yet retained the dream of ultimate awakening, and since you have brought those chaotic emotions and memories back to the Source with you (though no longer yours), your life and memories will have mattered. Those who awaken beyond the death of self will have changed Reality.

    ASTRAL PROJECTION: THEORIES OF METAPHOR, PHILOSOPHIES OF SCIENCE, AND THE ART OF SCIENTIFIC VISUALIZATION

    This thesis provides an intellectual context for my work in computational scientific visualization for large-scale public outreach in venues such as digital-dome planetarium shows and high-definition public television documentaries. In my associated practicum, a DVD that provides video excerpts, I focus especially on work I have created with my Advanced Visualization Laboratory team at the National Center for Supercomputing Applications (Champaign, Illinois) from 2002-2007. I make three main contributions to knowledge within the field of computational scientific visualization. Firstly, I share the unique process I have pioneered for collaboratively producing and exhibiting this data-driven art when aimed at popular science education. The message of the art complements its means of production: Renaissance Team collaborations enact a cooperative paradigm of evolutionary sympathetic adaptation and co-creation. Secondly, I open up a positive, new space within computational scientific visualization’s practice for artistic expression—especially in providing a theory of digi-epistemology that accounts for how this is possible given the limitations imposed by the demands of mapping numerical data, and the computational models derived from them, onto visual forms. I am concerned not only with liberating artists to enrich audiences’ aesthetic experiences of scientific visualization, to contribute their own vision, but also with conceiving of audiences as co-creators of the aesthetic significance of the work, free to re-envision and re-circulate what they encounter there. Even more commonly than in the age of traditional media, online social computing and digital tools have empowered the public to capture and repurpose visual metaphors, circulating them within new contexts and telling new stories with them. Thirdly, I demonstrate the creative power of visaphors (see footnote, p. 1) to provide novel embodied experiences through my practicum as well as my thesis discussion. Specifically, I describe how the visaphors my Renaissance Teams and I create enrich the Environmentalist Story of Science, essentially promoting a counter-narrative to the Enlightenment Story of Science by articulating how humanity participates in an evolving universal consciousness through our embodied interaction and cooperative interdependence within nested, self-producing (autopoietic) systems, from the micro- to the macroscopic. This contemporary account of the natural world, its inter-related systems, and their dynamics may be understood as expressing a creative and generative energy—a kind of consciousness—that transcends the human yet also encompasses it.

    Quantum Gravity and Taoist Cosmology: Exploring the Ancient Origins of Phenomenological String Theory

    In the author’s previous contribution to this journal (Rosen 2015), a phenomenological string theory was proposed based on qualitative topology and hypercomplex numbers. The current paper takes this further by delving into the ancient Chinese origin of phenomenological string theory. First, we discover a connection between the Klein bottle, which is crucial to the theory, and the Ho-t’u, a Chinese number archetype central to Taoist cosmology. The two structures are seen to mirror each other in expressing the psychophysical (phenomenological) action pattern at the heart of microphysics. But tackling the question of quantum gravity requires that a whole family of topological dimensions be brought into play. What we find in engaging with these structures is a closely related family of Taoist forebears that, in concert with their successors, provide a blueprint for cosmic evolution. Whereas conventional string theory accounts for the generation of nature’s fundamental forces via a notion of symmetry breaking that is essentially static and thus unable to explain cosmogony successfully, phenomenological/Taoist string theory entails the dialectical interplay of symmetry and asymmetry inherent in the principle of synsymmetry. This dynamic concept of cosmic change is elaborated on in the three concluding sections of the paper. Here, a detailed analysis of cosmogony is offered, first in terms of the theory of dimensional development and its Taoist (yin-yang) counterpart, then in terms of the evolution of the elemental force particles through cycles of expansion and contraction in a spiraling universe. The paper closes by considering the role of the analyst per se in the further evolution of the cosmos.
