Visualising urban gentrification and displacement in Greater London
Gentrification has long been a contentious issue, prompting debate among scholars due to variations in its location, timing, context and the types of measurement used. It is therefore worth seeking a simple and effective approach to measuring the processes of gentrification, one that enables comparative studies across different cities around the world. Using six sets of thematic data from 2001 and 2011 at the neighbourhood level, this study proposes five types of gentrification and displacement based on Chapple and Zuk’s theoretical framework. London was selected as a case study. The results show that gentrification swept through London in many forms during the 2000s, particularly in Inner East London. Some areas in North West London are identified as vulnerable neighbourhoods at risk of displacement and gentrification. Furthermore, most of the neighbourhoods experiencing ongoing displacement are concentrated in Outer London and Inner South London. The typology provides a useful starting point for planners and policymakers to gain deeper insights into the progress of gentrification in London. Additionally, this work can serve as an example of the potential for using similar open-source code and census data to estimate the degree of gentrification in other cities.
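A rule-based typology of the kind described above can be sketched in a few lines of code. The variables, thresholds and category labels below are hypothetical illustrations only, not Chapple and Zuk's actual definitions or the study's cut-offs:

```python
# Hypothetical rule-based neighbourhood typology from census-change
# variables. All thresholds and labels are illustrative, not the
# study's actual classification.

def classify_neighbourhood(income_change, rent_change, degree_share_change):
    """Assign a coarse gentrification/displacement label from
    2001-2011 percentage-point changes in median income, median rent,
    and the share of residents holding a degree."""
    rising = sum(x > 10 for x in (income_change, rent_change, degree_share_change))
    if rising == 3:
        return "ongoing gentrification"
    if rising == 2:
        return "early gentrification"
    if rent_change > 10 and income_change < 0:
        return "at risk of displacement"
    return "no clear change"

# A neighbourhood where income, rent and education all rose sharply:
label = classify_neighbourhood(income_change=15, rent_change=22, degree_share_change=12)
```

Applied across every neighbourhood in two census waves, the resulting labels can be mapped directly, which is essentially what makes the approach transferable to other cities with comparable open census data.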
A predictive processing theory of sensorimotor contingencies: explaining the puzzle of perceptual presence and its absence in synesthesia
Normal perception involves experiencing objects within perceptual scenes as real, as existing in the world. This property of “perceptual presence” has motivated “sensorimotor theories” which understand perception to involve the mastery of sensorimotor contingencies. However, the mechanistic basis of sensorimotor contingencies and their mastery has remained unclear. Sensorimotor theory also struggles to explain instances of perception, such as synesthesia, that appear to lack perceptual presence and for which relevant sensorimotor contingencies are difficult to identify. On alternative “predictive processing” theories, perceptual content emerges from probabilistic inference on the external causes of sensory signals; however, this view has addressed neither the problem of perceptual presence nor synesthesia. Here, I describe a theory of predictive perception of sensorimotor contingencies which (1) accounts for perceptual presence in normal perception, as well as its absence in synesthesia, and (2) operationalizes the notion of sensorimotor contingencies and their mastery. The core idea is that generative models underlying perception incorporate explicitly counterfactual elements related to how sensory inputs would change on the basis of a broad repertoire of possible actions, even if those actions are not performed. These “counterfactually-rich” generative models encode sensorimotor contingencies related to repertoires of sensorimotor dependencies, with counterfactual richness determining the degree of perceptual presence associated with a stimulus. While the generative models underlying normal perception are typically counterfactually rich (reflecting a large repertoire of possible sensorimotor dependencies), those underlying synesthetic concurrents are hypothesized to be counterfactually poor.
In addition to accounting for the phenomenology of synesthesia, the theory naturally accommodates phenomenological differences between a range of experiential states including dreaming, hallucination, and the like. It may also lead to a new view of the (in)determinacy of normal perception.
Can biological quantum networks solve NP-hard problems?
There is a widespread view that the human brain is so complex that it cannot be efficiently simulated by universal Turing machines. During the last decades the question has therefore been raised whether we need to consider quantum effects to explain the imagined cognitive power of a conscious mind. This paper presents a personal view of several fields of philosophy and computational neurobiology in an attempt to suggest a realistic picture of how the brain might work as a basis for perception, consciousness and cognition. The purpose is to be able to identify and evaluate instances where quantum effects might play a significant role in cognitive processes. Not surprisingly, the conclusion is that quantum-enhanced cognition and intelligence are very unlikely to be found in biological brains. Quantum effects may certainly influence the functionality of various components and signalling pathways at the molecular level in the brain network, like ion ports, synapses, sensors, and enzymes. This might evidently influence the functionality of some nodes and perhaps even the overall intelligence of the brain network, but hardly give it any dramatically enhanced functionality. So, the conclusion is that biological quantum networks can only approximately solve small instances of NP-hard problems. On the other hand, artificial intelligence and machine learning implemented in complex dynamical systems based on genuine quantum networks can certainly be expected to show enhanced performance and quantum advantage compared with classical networks. Nevertheless, even quantum networks can only be expected to efficiently solve NP-hard problems approximately. In the end it is a question of precision - Nature is approximate.
Comment: 38 pages
The IDV source J1128+5925, a new candidate for annual modulation?
Short time-scale radio variations of compact extragalactic radio sources, known as IntraDay Variability, can be explained in at least some sources by a source-extrinsic effect, in which the variations are interpreted as scintillation of radio waves caused by the turbulent ISM of the Milky Way. One of the most convincing observational arguments in favour of propagation-induced variability is the so-called annual modulation of the characteristic variability time-scale, which is due to the orbital motion of the Earth. Data for the recently discovered and highly variable IDV source J1128+5925 are presented. We study the frequency and time dependence of the IDV in this compact quasar. We measure the characteristic variability time-scale of the IDV throughout the year, and analyze whether the observed changes in the variability time-scale are consistent with annual modulation. We monitored the flux density variability of J1128+5925 with dense time sampling between 2.7 and 10.45 GHz with the 100 m Effelsberg radio telescope of the MPIfR and with the 25 m Urumqi radio telescope. From ten observing sessions, we determine the variability characteristics and time-scales. The observed pronounced changes of the variability time-scale of J1128+5925 are modelled with an anisotropic annual modulation model. The observed frequency dependence of the variation is in good agreement with the prediction from interstellar scintillation. Adopting a simple annual modulation model and also using the frequency dependence of the IDV, we derive a lower limit to the distance of the scattering screen and an upper limit to the scintillating source size. The latter is found to be consistent with the measured core size from VLBI.
Comment: 15 pages, 9 figures. Accepted for publication in Astronomy and Astrophysics
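The annual modulation effect described above can be illustrated with a toy calculation: the scintillation timescale scales inversely with the Earth-screen relative speed, which changes through the year as the Earth orbits the Sun. The screen velocity, diffractive scale and flat circular-orbit geometry below are illustrative assumptions, not the paper's fitted values:

```python
import math

V_ORBIT = 29.8          # Earth's mean orbital speed, km/s
V_SCREEN = (20.0, 0.0)  # assumed screen velocity in the orbital plane, km/s
S_DIFF = 1e6            # assumed diffractive scale of the screen, km

def relative_speed(day_of_year):
    """Earth-screen relative speed for a circular orbit (2D toy geometry)."""
    phase = 2.0 * math.pi * day_of_year / 365.25
    v_earth = (V_ORBIT * math.cos(phase), V_ORBIT * math.sin(phase))
    return math.hypot(v_earth[0] - V_SCREEN[0], v_earth[1] - V_SCREEN[1])

def timescale_hours(day_of_year):
    """Characteristic scintillation timescale: the crossing time of the
    diffractive scale at the Earth-screen relative speed."""
    return S_DIFF / relative_speed(day_of_year) / 3600.0

# Variability slows down most on the day when the Earth's velocity
# most nearly matches the screen's.
slowest_day = max(range(365), key=timescale_hours)
```

Fitting a (possibly anisotropic) model of this form to measured timescales over a year is what constrains the screen velocity; in this toy setup the slowdown occurs when the two velocity vectors align.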
Collaboration between Science and Religious Education teachers in Scottish Secondary schools
The article reports on quantitative research that examines: (1) the current practice in collaboration; and (2) the potential for collaboration between Science and Religious Education teachers in a large sample of Scottish secondary schools. The authors adopt and adapt three models (conflict, concordat and consonance) to interrogate the relationship between science and religion (and the perceived relation between these two subjects in schools) (Astley and Francis 2010). The findings indicate that there is evidence of limited collaboration and, in a few cases, a dismissive attitude towards collaboration (conflict and concordat and very weak consonance). There is, however, evidence of a genuine aspiration for greater collaboration among many teachers (moving towards a more robust consonance model). The article concludes by discussing a number of key factors that must be realised for this greater collaboration to be enacted.
Protocol for a pragmatic feasibility randomised controlled trial of peer coaching for adults with long-term conditions: PEER CONNECT.
INTRODUCTION: Patients with low levels of knowledge, skills and confidence to manage their health and well-being (activation) are more likely to have unmet health needs, delay seeking healthcare and need emergency care. National Health Service England estimates that this may be applicable to 25%-40% of patients with long-term health conditions. Volunteer peer coaching may support people to increase their level of activation. This form of intervention may be particularly effective for people with low levels of activation. METHODS AND ANALYSIS: This single-site, two-arm randomised controlled trial has been designed to assess the feasibility of conducting a definitive trial of volunteer peer health and well-being coaching for people with long-term health conditions (multiple sclerosis, rheumatic diseases or chronic pain) and low activation. Feasibility outcomes include recruitment and retention rates, and intervention adherence. We will measure patient activation, mental health and well-being as potential outcomes for a definitive trial. These outcomes will be summarised descriptively for each time point by allocated group and will help to inform the sample size calculation for the definitive trial. Criteria for progression to a full trial will be used. ETHICS AND DISSEMINATION: Ethical approval has been granted by the London - Surrey Research Ethics Committee, reference 21/LO/0715. Results from this feasibility trial will be shared directly with participants, presented at local, regional and national conferences and published in an open-access journal. TRIAL REGISTRATION NUMBER: ISRCTN12623577
Humour in Nietzsche's style
Nietzsche's writing style is designed to elicit affective responses in his readers. Humour is one of the most common means by which he attempts to engage his readers' affects. In this article, I explain how and why Nietzsche uses humour to achieve his philosophical ends. The article has three parts. In part 1, I reject interpretations of Nietzsche's humour on which he engages in self‐parody in order to mitigate the charge of decadence or dogmatism by undermining his own philosophical authority. In part 2, I look at how Nietzsche uses humour and laughter as a critical tool in his polemic against traditional morality. I argue that one important way in which Nietzsche uses humour is as a vehicle for enhancing the effectiveness of his ad hominem arguments. In part 3, I show how Nietzsche exploits humour's social dimension in order to find and cultivate what he sees as the right kinds of readers for his works.
First Results from MASIV: The Micro-Arcsecond Scintillation-Induced Variability Survey
We are undertaking a large-scale, Micro-Arcsecond Scintillation-Induced Variability (MASIV) survey of the northern sky, Dec > 0 deg, at 4.9 GHz with the VLA. Our objective is to construct a sample of 100 to 150 scintillating extragalactic sources with which to examine both the microarcsecond structure and the parent populations of these sources, and to probe the turbulent interstellar medium responsible for the scintillation. We report on our first epoch of observations which revealed variability on timescales ranging from hours to days in 85 of 710 compact flat-spectrum sources. The number of highly variable sources, those with RMS flux density variations greater than 4% of the mean, increases with decreasing source flux density, but rapid, large amplitude variables such as J1819+3845 are very rare. When compared with a model for the scintillation due to irregularities in a 500 pc thick electron layer, our preliminary results indicate maximum brightness temperatures ~10^12 K, similar to those obtained from VLBI surveys even though interstellar scintillation is not subject to the same angular resolution limit.
Comment: 18 pages, 5 figures. To appear in the Astronomical Journal
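The 4% variability criterion above amounts to a threshold on the modulation index of a light curve, i.e. the RMS flux variation divided by the mean flux density. A minimal sketch, with made-up flux densities rather than survey data:

```python
import math

def modulation_index(fluxes):
    """RMS flux-density variation divided by the mean flux density."""
    mean = sum(fluxes) / len(fluxes)
    rms = math.sqrt(sum((f - mean) ** 2 for f in fluxes) / len(fluxes))
    return rms / mean

def is_highly_variable(fluxes, threshold=0.04):
    """Survey-style cut: RMS variation greater than 4% of the mean."""
    return modulation_index(fluxes) > threshold

# Illustrative light curves (Jy); not actual MASIV measurements.
quiet = [1.00, 1.01, 0.99, 1.00]           # nearly constant
variable = [1.00, 1.15, 0.85, 1.10, 0.90]  # ~10% swings
```

Because the modulation index is normalised by the mean, the same cut applies uniformly across a sample spanning a wide range of flux densities, which is what makes the trend with source flux density meaningful.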
Simulating Electron Transport and Synchrotron Emission in Radio Galaxies: Shock Acceleration and Synchrotron Aging in Axis-Symmetric Flows
We introduce a simple and economical but effective method for including relativistic electron transport in multi-dimensional simulations of radio galaxies. The method is designed to follow explicitly diffusive acceleration at shocks and, in smooth flows, second-order Fermi acceleration plus adiabatic and synchrotron cooling. We are able to follow both the spatial and energy distributions of the electrons, so that direct synchrotron emission properties can be modeled in time-dependent flows for the first time. Here we present first results in the form of some axis-symmetric MHD simulations of Mach 20 light jet flows. These show clearly the importance of nonsteady terminal shocks that develop in such flows even when the jet inflow is steady. As a result of this and other consequences of the fundamentally driven character of jets, we find complex patterns of emissivities and synchrotron spectra, including steep spectral gradients in hot spots, islands of electrons with distinct spectra within the lobes, and spectral gradients arising from the dynamical histories of a given flow element rather than from synchrotron aging of the embedded electrons. In addition, spectral aging in the lobes tends to proceed more slowly than one would estimate from regions of high emissivity.
Comment: 30 pages of LaTeX-generated text plus 7 figures in GIF format. Accepted for publication in the Astrophysical Journal. High-resolution postscript figures available through anonymous ftp at ftp://ftp.msi.umn.edu/pub/users/twj/RGje
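The synchrotron-aging ingredient of a transport scheme like the one above can be illustrated with the standard single-electron cooling law dE/dt = -b E^2, whose solution E(t) = E0 / (1 + b E0 t) caps all energies at a time-dependent break E_break = 1 / (b t). The loss constant below is an arbitrary illustrative value, not one from the paper:

```python
# Toy synchrotron-cooling calculation. Units are arbitrary; the loss
# constant b lumps together the magnetic-field-dependent prefactor.

def energy_after(e0, b, t):
    """Electron energy after time t under pure synchrotron losses
    dE/dt = -b * E**2, i.e. E(t) = E0 / (1 + b * E0 * t)."""
    return e0 / (1.0 + b * e0 * t)

def break_energy(b, t):
    """Break energy 1/(b*t): no electron, however energetic initially,
    can exceed this energy after cooling for time t."""
    return 1.0 / (b * t)
```

The abstract's point about slow apparent aging follows naturally in this picture: the observed break in a lobe region reflects each fluid element's full dynamical history (its local field strength and residence time), not just the conditions in the brightest regions.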