
    The Development of a Common Investment Appraisal for Urban Transport Projects.

    In December 1990 we were invited by Birmingham City Council and Centro to submit a proposal for an introductory study of the development of a common investment appraisal for urban transport projects. Many of the issues had arisen during the Birmingham Integrated Transport Study (BITS), in which we were involved, and in the subsequent assessment of light rail schemes, of which we have considerable experience. In subsequent discussion, the objectives were identified as being:
    (i) to identify, briefly, the weaknesses of existing appraisal techniques;
    (ii) to develop proposals for common methods for the social cost-benefit appraisal of both urban road and rail schemes which overcome these weaknesses;
    (iii) to develop complementary and consistent proposals for common methods of financial appraisal of such projects;
    (iv) to develop proposals for variants of the methods in (ii) and (iii) which are appropriate to schemes of differing complexity and cost;
    (v) to consider briefly methods of treating externalities, and performance against other public sector goals, which are consistent with those developed under (ii) to (iv) above;
    (vi) to recommend work to be done in the second phase of the study (beyond March 1991) on the provision of input to such evaluation methods from strategic and mode-specific models, and on the testing of the proposed evaluation methods.
    Such issues are particularly topical at present, and we have been able to draw, in our study, on experience of:
    (i) evaluation methods developed for BITS and subsequent integrated transport studies (MVA);
    (ii) evaluation of individual light rail and heavy rail investment projects (ITS, MVA);
    (iii) the recommendations of the AMA in "Changing Gear";
    (iv) advice to the IPPR on appraisal methodology (ITS);
    (v) submissions to the House of Commons enquiry into "Roads for the Future" (ITS);
    (vi) advice to the National Audit Office (ITS);
    (vii) involvement in the SACTRA study of urban road appraisal (MVA, ITS).
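    Since the study's core proposal concerns social cost-benefit appraisal, the following minimal sketch shows the discounting arithmetic on which any such common method rests. It is an illustration only, not the method developed in the study; the 8% discount rate and all money figures are hypothetical placeholders.

        # Minimal sketch of a discounted cost-benefit appraisal of the kind
        # the study seeks to standardise. The 8% discount rate and all
        # figures are hypothetical placeholders, not values from the study.

        def present_value(stream, rate, base_year=0):
            """Discount a {year: amount} stream to the base year."""
            return sum(amount / (1 + rate) ** (year - base_year)
                       for year, amount in stream.items())

        def appraise(costs, benefits, rate=0.08):
            """Return net present value and benefit-cost ratio."""
            pv_costs = present_value(costs, rate)
            pv_benefits = present_value(benefits, rate)
            return pv_benefits - pv_costs, pv_benefits / pv_costs

        # Example: capital costs in years 0-1, user benefits in years 2-31.
        costs = {0: 120.0, 1: 80.0}                       # GBP m
        benefits = {year: 25.0 for year in range(2, 32)}  # GBP m per year
        npv, bcr = appraise(costs, benefits)
        print(f"NPV = {npv:.1f} GBP m, BCR = {bcr:.2f}")

    Schemes of differing complexity and cost, as in objective (iv), would vary how the cost and benefit streams are estimated, while the discounting core stays common across road and rail.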

    Linking working memory and long-term memory: A computational model of the learning of new words

    The nonword repetition (NWR) test has been shown to be a good predictor of children’s vocabulary size. NWR performance has been explained using phonological working memory, which is seen as a critical component in the learning of new words. However, no detailed specification of the link between phonological working memory and long-term memory (LTM) has been proposed. In this paper, we present a computational model of children’s vocabulary acquisition (EPAM-VOC) that specifies how phonological working memory and LTM interact. The model learns phoneme sequences, which are stored in LTM and mediate how much information can be held in working memory. The model’s behaviour is compared with that of children in a new study of NWR, conducted in order to ensure the same nonword stimuli and methodology across ages. EPAM-VOC shows a pattern of results similar to that of children: performance is better for shorter nonwords and for wordlike nonwords, and performance improves with age. EPAM-VOC also simulates the superior performance for single consonant nonwords over clustered consonant nonwords found in previous NWR studies. EPAM-VOC provides a simple and elegant computational account of some of the key processes involved in the learning of new words: it specifies how phonological working memory and LTM interact; makes testable predictions; and suggests that developmental changes in NWR performance may reflect differences in the amount of information that has been encoded in LTM rather than developmental changes in working memory capacity.
    Keywords: EPAM, working memory, long-term memory, nonword repetition, vocabulary acquisition, developmental change
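    The mechanism the abstract describes, larger LTM chunks letting more material fit a limited working memory, can be sketched in a few lines. This is a toy illustration of the idea, not EPAM-VOC itself; the chunk inventories, the 2000 ms store and 400 ms per-chunk costs, and the nonword are invented for the example.

        # Toy sketch of the chunking account formalised by EPAM-VOC:
        # nonwords are encoded as sequences of known chunks held in a
        # time-limited working memory, so larger LTM chunks mean fewer
        # items to hold and better recall. All parameters are invented.

        def encode(phonemes, chunks):
            """Greedily cover a phoneme sequence with the longest known chunks."""
            items, i = [], 0
            while i < len(phonemes):
                for size in range(len(phonemes) - i, 0, -1):
                    candidate = tuple(phonemes[i:i + size])
                    if size == 1 or candidate in chunks:
                        items.append(candidate)
                        i += size
                        break
            return items

        def repeats_correctly(phonemes, chunks, capacity_ms=2000, cost_ms=400):
            """A nonword is repeated correctly if its encoding fits working memory."""
            return len(encode(phonemes, chunks)) * cost_ms <= capacity_ms

        # A 'younger' learner knows only single phonemes; an 'older' one
        # has larger chunks in LTM, so the same nonword fits working memory.
        nonword = list("dopelate")
        young, old = set(), {tuple("do"), tuple("pe"), tuple("late")}
        print(repeats_correctly(nonword, young))  # False: 8 items, too many
        print(repeats_correctly(nonword, old))    # True: 3 chunks fit

    On this account, apparent growth in working memory capacity with age can instead reflect growth in the LTM chunk inventory, which is the model's central claim.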

    Challenges in Complex Systems Science

    FuturICT's foundations are social science, complex systems science, and ICT. This paper lays out the main concerns and challenges in the science of complex systems in the context of FuturICT, with special emphasis on the complex systems route to the social sciences. These include complex systems having: many heterogeneous interacting parts; multiple scales; complicated transition laws; unexpected or unpredicted emergence; sensitive dependence on initial conditions; path-dependent dynamics; networked hierarchical connectivities; interaction of autonomous agents; self-organisation; non-equilibrium dynamics; combinatorial explosion; adaptivity to changing environments; co-evolving subsystems; ill-defined boundaries; and multilevel dynamics. In this context, science is seen as the process of abstracting the dynamics of systems from data. This presents many challenges, including: data gathering by large-scale experiment, participatory sensing, and social computation; managing huge distributed, dynamic, and heterogeneous databases; moving from data to dynamical models; going beyond correlations to cause-effect relationships; understanding the relationship between simple and comprehensive models with appropriate choices of variables; ensemble modeling and data assimilation; modeling systems of systems of systems with many levels between micro and macro; and formulating new approaches to prediction, forecasting, and risk, especially in systems that can reflect on and change their behaviour in response to predictions, and systems whose apparently predictable behaviour is disrupted by apparently unpredictable rare or extreme events. These challenges are part of the FuturICT agenda.
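    Of the system properties listed, sensitive dependence on initial conditions is the easiest to demonstrate concretely. The sketch below uses the logistic map, a standard textbook example and not one drawn from the paper.

        # Sensitive dependence on initial conditions, one of the properties
        # listed above, shown with the logistic map x -> r*x*(1-x).
        # The map and parameters are a textbook illustration.

        def logistic_trajectory(x0, r=4.0, steps=50):
            xs = [x0]
            for _ in range(steps):
                xs.append(r * xs[-1] * (1 - xs[-1]))
            return xs

        a = logistic_trajectory(0.300000)
        b = logistic_trajectory(0.300001)  # perturbed by 1e-6
        for step in (0, 10, 20, 30):
            print(step, abs(a[step] - b[step]))
        # The tiny initial difference grows until the two trajectories
        # are unrelated, which bounds how far such systems can be predicted.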

    The credibility challenge for global fluvial flood risk analysis

    Quantifying flood hazard is an essential component of resilience planning, emergency response, and mitigation, including insurance. Such analysis has traditionally been undertaken at catchment and national scales, but efforts have recently intensified to estimate flood risk globally, to allow more consistent and equitable decision making. Global flood hazard models are now a practical reality, thanks to improvements in numerical algorithms, global datasets, computing power, and coupled modelling frameworks. Outputs of these models are vital for consistent quantification of global flood risk and for projecting the impacts of climate change. However, the urgency of these tasks means that outputs are being used as soon as they are made available and before such methods have been adequately tested. To address this, we compare multi-probability flood hazard maps for Africa from six global models and show wide variation in their flood hazard, economic loss, and exposed population estimates, which has serious implications for model credibility. While there is around 30-40% agreement in flood extent, our results show that even at continental scales there are significant differences in hazard magnitude and spatial pattern between models, notably in deltas, arid/semi-arid zones, and wetlands. This study is an important step towards a better understanding of modelling global flood hazard, which is urgently required for both current risk and climate change projections.
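    A sketch of how an extent-agreement figure like the 30-40% above can be computed, assuming hazard maps are compared as co-registered binary grids; the critical success index used here is a common choice for such comparisons, though the paper's exact procedure may differ.

        # Pairwise flood-extent agreement between two models, assuming
        # hazard maps rasterised to co-registered binary grids
        # (1 = wet, 0 = dry). The critical success index is a common
        # comparison metric; the study's exact procedure may differ.
        import numpy as np

        def critical_success_index(map_a, map_b):
            """Overlap of flooded area: hits / (hits + misses + false alarms)."""
            both = np.logical_and(map_a, map_b).sum()
            either = np.logical_or(map_a, map_b).sum()
            return both / either if either else 1.0

        rng = np.random.default_rng(0)
        model_a = rng.random((100, 100)) < 0.3  # synthetic flood extents
        model_b = rng.random((100, 100)) < 0.3
        print(f"extent agreement: {critical_success_index(model_a, model_b):.2f}")

    The index is 1 when the two flood outlines coincide exactly and falls towards 0 as the wet areas diverge, so it captures differences in spatial pattern as well as total flooded area.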

    Development of a web-based, guided self-help, acceptance and commitment therapy-based intervention for weight loss maintenance: evidence-, theory-, and person-based approach.

    Background: The long-term impact and cost-effectiveness of weight management programs depend on posttreatment weight maintenance. There is growing evidence that interventions based on third-wave cognitive behavioral therapy, particularly acceptance and commitment therapy (ACT), could improve long-term weight management; however, these interventions are typically delivered face-to-face by psychologists, which limits their scalability.
    Objective: The aim of this study is to use an evidence-, theory-, and person-based approach to develop an ACT-based intervention for weight loss maintenance that uses digital technology and nonspecialist guidance to minimize the resources needed for delivery at scale.
    Methods: Intervention development was guided by the Medical Research Council framework for the development of complex interventions in health care, the Intervention Mapping Protocol, and a person-based approach for enhancing the acceptability and feasibility of interventions. Work was conducted in two phases: phase 1 consisted of collating and analyzing existing and new primary evidence, and phase 2 consisted of theoretical modeling and intervention development. Phase 1 included a synthesis of existing evidence on weight loss maintenance from previous research, a systematic review and network meta-analysis of third-wave cognitive behavioral therapy interventions for weight management, a qualitative interview study of experiences of weight loss maintenance, and the modeling of a justifiable cost for a weight loss maintenance program. Phase 2 included the iterative development of guiding principles, a logic model, and the intervention design and content. Target user and stakeholder panels were established to inform each phase of development, and user testing of successive iterations of the prototype intervention was conducted.
    Results: This process resulted in a guided self-help ACT-based intervention called SWiM (Supporting Weight Management). SWiM is a 4-month program consisting of weekly web-based sessions for 13 consecutive weeks, followed by a 4-week break for participants to reflect and practice their new skills, and a final session at week 18. Each session consists of psychoeducational content, reflective exercises, and behavioral experiments. SWiM includes specific sessions on key determinants of weight loss maintenance, including developing skills to manage high-risk situations for lapses, creating new helpful habits, breaking old unhelpful habits, and learning to manage interpersonal relationships and their impact on weight management. A trained, nonspecialist coach guides participants through the program with 4 scheduled 30-minute telephone calls and 3 further optional calls.
    Conclusions: This comprehensive approach facilitated the development of an intervention that is based on scientific theory and evidence for supporting people with weight loss maintenance and is grounded in the experiences of the target users and the context in which it is intended to be delivered. The intervention will be refined based on the findings of a planned pilot randomized controlled trial.
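    The program structure reported in the Results can be restated as a simple schedule. The sketch below merely encodes the published week numbers; it is not the intervention's own software, and the timing of coach calls is not specified in the abstract.

        # Sketch of the SWiM program structure described above: weekly web
        # sessions for weeks 1-13, a practice break in weeks 14-17, and a
        # final session at week 18. A restatement of the published
        # schedule, not the intervention's own code.

        def swim_schedule():
            schedule = {week: "web session" for week in range(1, 14)}
            schedule.update({week: "practice break" for week in range(14, 18)})
            schedule[18] = "final web session"
            return schedule

        for week, activity in swim_schedule().items():
            print(f"week {week:2d}: {activity}")
        # Alongside this, a nonspecialist coach offers 4 scheduled
        # 30-minute calls and up to 3 optional ones (timing unspecified).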

    Stellar structure and compact objects before 1940: Towards relativistic astrophysics

    Since the mid-1920s, different strands of research used stars as "physics laboratories" for investigating the nature of matter under extreme densities and pressures, impossible to realize on Earth. To trace this process, this paper follows the evolution of the concept of a dense core in stars, which was important both for an understanding of stellar evolution and as a testing ground for the fast-evolving field of nuclear physics. In spite of the divide between physicists and astrophysicists, some key actors working in the cross-fertilized soil of overlapping but different scientific cultures formulated models and tentative theories that gradually evolved into more realistic and structured astrophysical objects. These investigations culminated in the first contact with general relativity in 1939, when J. Robert Oppenheimer and his students George Volkoff and Hartland Snyder systematically applied the theory to the dense core of a collapsing neutron star. This pioneering application of Einstein's theory to an astrophysical compact object can be regarded as a milestone in the path eventually leading to the emergence of relativistic astrophysics in the early 1960s.
    Comment: 83 pages, 4 figures, submitted to the European Physical Journal
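    For reference, the 1939 Oppenheimer-Volkoff application of general relativity to dense stellar cores yielded what is now called the Tolman-Oppenheimer-Volkoff equation, shown here in its standard modern form (not as typeset in the paper under discussion):

        % General-relativistic hydrostatic equilibrium for a static,
        % spherically symmetric star (modern form of the 1939 result).
        \frac{dP}{dr} = -\,\frac{G\,\bigl(\rho + P/c^{2}\bigr)\,\bigl(m(r) + 4\pi r^{3} P/c^{2}\bigr)}{r^{2}\,\bigl(1 - 2G\,m(r)/(r c^{2})\bigr)},
        \qquad \frac{dm}{dr} = 4\pi r^{2}\rho.

    In the Newtonian limit the pressure terms and the metric factor drop out, recovering dP/dr = -G m(r) rho / r^2; the relativistic corrections are what set the maximum mass of neutron stars that made continued collapse an unavoidable conclusion.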

    Search for direct production of charginos and neutralinos in events with three leptons and missing transverse momentum in √s = 7 TeV pp collisions with the ATLAS detector

    A search for the direct production of charginos and neutralinos in final states with three electrons or muons and missing transverse momentum is presented. The analysis is based on 4.7 fb⁻¹ of proton–proton collision data delivered by the Large Hadron Collider and recorded with the ATLAS detector. Observations are consistent with Standard Model expectations in three signal regions that are either depleted or enriched in Z-boson decays. Upper limits at 95% confidence level are set in R-parity-conserving phenomenological minimal supersymmetric models and in simplified models, significantly extending previous results.
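    The missing transverse momentum that defines these final states is, in essence, the negative vector sum of the visible transverse momenta. Below is a minimal sketch under that textbook definition, not the full ATLAS calorimeter-based reconstruction; the lepton kinematics are hypothetical.

        # Minimal sketch of missing transverse momentum: the negative
        # vector sum of visible transverse momenta in the event. A textbook
        # definition, not the ATLAS reconstruction.
        import math

        def missing_et(objects):
            """objects: list of (pt [GeV], phi [rad]) for visible particles."""
            px = -sum(pt * math.cos(phi) for pt, phi in objects)
            py = -sum(pt * math.sin(phi) for pt, phi in objects)
            return math.hypot(px, py), math.atan2(py, px)

        # Three leptons with hypothetical kinematics (GeV, radians):
        leptons = [(60.0, 0.3), (45.0, 2.1), (30.0, -1.4)]
        met, met_phi = missing_et(leptons)
        print(f"MET = {met:.1f} GeV at phi = {met_phi:.2f}")

    Large MET signals undetected particles such as neutrinos or, in these models, the lightest neutralino escaping the detector.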

    Jet size dependence of single jet suppression in lead-lead collisions at sqrt(s(NN)) = 2.76 TeV with the ATLAS detector at the LHC

    Measurements of inclusive jet suppression in heavy ion collisions at the LHC provide direct sensitivity to the physics of jet quenching. In a sample of lead-lead collisions at sqrt(s(NN)) = 2.76 TeV corresponding to an integrated luminosity of approximately 7 inverse microbarns, ATLAS has measured jets with a calorimeter over the pseudorapidity interval |eta| < 2.1 and over the transverse momentum range 38 < pT < 210 GeV. Jets were reconstructed using the anti-kt algorithm with values of the distance parameter, which determines the nominal jet radius, of R = 0.2, 0.3, 0.4 and 0.5. The centrality dependence of the jet yield is characterized by the jet "central-to-peripheral ratio", Rcp. Jet production is found to be suppressed by approximately a factor of two in the 10% most central collisions relative to peripheral collisions. Rcp varies smoothly with centrality as characterized by the number of participating nucleons. The observed suppression is only weakly dependent on jet radius and transverse momentum. These results provide the first direct measurement of inclusive jet suppression in heavy ion collisions and complement previous measurements of dijet transverse energy imbalance at the LHC.
    Comment: 15 pages plus author list (30 pages total), 8 figures, 2 tables, submitted to Physics Letters B. All figures including auxiliary figures are available at http://atlas.web.cern.ch/Atlas/GROUPS/PHYSICS/PAPERS/HION-2011-02
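    The central-to-peripheral ratio is the per-binary-collision jet yield in central events divided by that in peripheral events. The sketch below uses the standard definition; the yields and N_coll values are hypothetical numbers chosen to reproduce a factor-of-two suppression.

        # Sketch of the jet central-to-peripheral ratio Rcp:
        # per-binary-collision jet yields in central events divided by
        # peripheral ones. The definition is standard; the yields and
        # N_coll values below are hypothetical.

        def r_cp(yield_central, ncoll_central,
                 yield_peripheral, ncoll_peripheral):
            return ((yield_central / ncoll_central)
                    / (yield_peripheral / ncoll_peripheral))

        # Hypothetical numbers: a factor-of-two suppression gives Rcp ~ 0.5.
        print(r_cp(yield_central=8.4e3, ncoll_central=1683.0,
                   yield_peripheral=25.0, ncoll_peripheral=2.5))

    Scaling each yield by the estimated number of binary nucleon-nucleon collisions removes trivial geometric enhancement, so Rcp < 1 isolates genuine medium-induced suppression.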

    Measurement of the polarisation of W bosons produced with large transverse momentum in pp collisions at sqrt(s) = 7 TeV with the ATLAS experiment

    This paper describes an analysis of the angular distribution of W->enu and W->munu decays, using data from pp collisions at sqrt(s) = 7 TeV recorded with the ATLAS detector at the LHC in 2010, corresponding to an integrated luminosity of about 35 pb^-1. Using the decay lepton transverse momentum and the missing transverse energy, the W decay angular distribution projected onto the transverse plane is obtained and analysed in terms of helicity fractions f0, fL and fR over two ranges of W transverse momentum (ptw): 35 < ptw < 50 GeV and ptw > 50 GeV. Good agreement is found with theoretical predictions. For ptw > 50 GeV, the values of f0 and fL-fR, averaged over charge and lepton flavour, are measured to be f0 = 0.127 +/- 0.030 +/- 0.108 and fL-fR = 0.252 +/- 0.017 +/- 0.030, where the first uncertainties are statistical and the second include all systematic effects.
    Comment: 19 pages plus author list (34 pages total), 9 figures, 11 tables, revised author list, matches European Physical Journal C version
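    The angular analysis rests on the standard helicity decomposition of the W decay angle. A sketch using the averaged measured values follows, with the caveats that the sign convention shown applies to one W charge and that the paper's projected, transverse-plane variant of the observable differs in detail.

        # Standard helicity decomposition of the W decay angle theta
        # (charged lepton direction in the W rest frame relative to the W
        # flight direction). The sign convention is for one W charge; the
        # fractions satisfy fL + fR + f0 = 1.

        def w_decay_pdf(cos_theta, f0, fl, fr):
            return (0.375 * fl * (1 - cos_theta) ** 2
                    + 0.375 * fr * (1 + cos_theta) ** 2
                    + 0.75 * f0 * (1 - cos_theta ** 2))

        # Recover fL and fR from the quoted f0 and fL - fR, then check
        # that the distribution integrates to 1 over cos_theta in [-1, 1].
        n, f0, diff = 10000, 0.127, 0.252
        fl = (1 - f0 + diff) / 2
        fr = fl - diff
        total = sum(w_decay_pdf(-1 + 2 * (i + 0.5) / n, f0, fl, fr) * (2 / n)
                    for i in range(n))
        print(fl, fr, total)  # fl ~ 0.56, fr ~ 0.31, total ~ 1.0

    Each of the three terms integrates to its fraction, so measuring the shape of the angular distribution is what determines f0, fL and fR.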