
    A Flat Rate Financial Transaction Tax to replace all taxes?

    In this paper I propose a very radical reform of the taxation system, in which a single flat rate financial transaction tax (FTT) is used to replace the vast majority of existing taxes (including VAT, income tax, taxes on profits...). Existing economic data indicates that a flat rate FTT of 1% would generate far more revenue than is currently generated by all existing taxes, and would allow governments to rapidly repay debts and restore programs of public expenditure, as well as allowing resources to be allocated to globally important challenges such as third world development, climate change and health issues.
    Keywords: Economy, Financial Crisis, Financial Transaction Tax
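
    A minimal sketch of the revenue arithmetic behind this claim: at a flat rate, FTT revenue is simply the rate multiplied by the taxable transaction volume, which can then be compared with total current tax receipts. All figures below are hypothetical placeholders, not estimates from the paper.

```python
# Hypothetical illustration of the flat-rate FTT revenue arithmetic.
# All figures are placeholders, not values taken from the paper.

ftt_rate = 0.01                        # 1% flat-rate financial transaction tax
annual_transaction_volume = 4.0e15     # assumed total value of taxable transactions per year
current_total_tax_revenue = 2.0e13     # assumed revenue of all existing taxes per year

ftt_revenue = ftt_rate * annual_transaction_volume
print(f"FTT revenue:     {ftt_revenue:.2e}")
print(f"Current revenue: {current_total_tax_revenue:.2e}")
print(f"Ratio:           {ftt_revenue / current_total_tax_revenue:.1f}x")
```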

    Time Series Analysis of Surface Deformation Associated With Fluid Injection and Induced Seismicity in Timpson, Texas Using DInSAR Methods

    In recent years, a rise in unconventional oil and gas production in North America has been linked to an increase in seismicity rate in these regions (Ellsworth, 2013). As fluid is pumped into deep formations, the state of stress within the subsurface changes, potentially reactivating pre-existing faults and/or causing subsidence or uplift of the surface. Therefore, hydraulic fracturing and/or fluid disposal injection can significantly increase the seismic hazard to communities and structures surrounding the injection sites (Barnhart et al., 2014). On 17 May 2012, an Mw 4.8 earthquake occurred near Timpson, TX, and has been linked with wastewater injection operations in the area (Shirzaei et al., 2016). This study aims to spatiotemporally relate wastewater injection operations to seismicity near Timpson using differential interferometric synthetic aperture radar (DInSAR) analysis. Results are presented as a set of time series, produced using the Multidimensional Small Baseline Subset (MSBAS) InSAR technique, revealing two-dimensional surface deformation.
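
    A minimal sketch, under assumed viewing geometries, of how ascending and descending line-of-sight (LOS) displacements can be combined into east-west and vertical components, in the spirit of the two-dimensional MSBAS-style decomposition mentioned above. The incidence angles, headings, displacement values and sign conventions are illustrative placeholders, not parameters or data from the Timpson analysis.

```python
# Minimal sketch: resolving east-west and vertical deformation from
# ascending and descending LOS displacements at one pixel and epoch.
# Geometry values and sign conventions are illustrative assumptions.
import numpy as np

def los_coefficients(incidence_deg, heading_deg):
    """East and up projection coefficients of the LOS unit vector (assumed convention)."""
    inc = np.radians(incidence_deg)
    head = np.radians(heading_deg)
    east = -np.sin(inc) * np.cos(head)   # sensitivity to east-west motion
    up = np.cos(inc)                     # sensitivity to vertical motion
    return east, up

# Hypothetical viewing geometries for one ascending and one descending track
A = np.array([los_coefficients(34.0, -10.0),    # ascending pass
              los_coefficients(34.0, -170.0)])  # descending pass

d_los = np.array([-0.012, 0.004])  # observed LOS displacements in metres (placeholders)

# Least-squares solution for the east and up components
d_east, d_up = np.linalg.lstsq(A, d_los, rcond=None)[0]
print(f"east: {d_east:+.4f} m, up: {d_up:+.4f} m")
```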

    Evaluation and management implications of uncertainty in a multispecies size-structured model of population and community responses to fishing

    1. Implementation of an ecosystem approach to fisheries requires advice on trade-offs among fished species and between fisheries yields and biodiversity or food web properties. However, the lack of explicit representation, analysis and consideration of uncertainty in most multispecies models has limited their application in analyses that could support management advice.
    2. We assessed the consequences of parameter uncertainty by developing 78 125 multispecies size-structured fish community models, with all combinations of parameters drawn from ranges that spanned parameter values estimated from data and literature. This unfiltered ensemble was reduced to 188 plausible models, the filtered ensemble (FE), by screening outputs against fish abundance data and ecological principles such as requiring species' persistence.
    3. Effects of parameter uncertainty on estimates of single-species management reference points for fishing mortality (FMSY, the fishing mortality rate providing MSY, the maximum sustainable yield) and biomass (BMSY, the biomass at MSY) were evaluated by calculating probability distributions of estimated reference points with the FE. There was a 50% probability that multispecies FMSY could be estimated to within ±25% of its actual value, and a 50% probability that BMSY could be estimated to within ±40% of its actual value.
    4. Signal-to-noise ratio was assessed for four community indicators when mortality rates were reduced from current rates to FMSY. The slope of the community size spectrum showed the greatest signal-to-noise ratio, indicating that it would be the most responsive indicator to the change in fishing mortality F. Further, the power of an ongoing international monitoring survey to detect predicted responses of size spectrum slope was higher than for other size-based metrics.
    5. Synthesis and applications: Application of the ensemble model approach allows explicit representation of parameter uncertainty and supports advice and management by (i) providing uncertainty intervals for management reference points, (ii) estimating working values of reference points that achieve a defined reduction in the risk of breaching the true reference point, (iii) estimating the responsiveness of population, community, food web and biodiversity indicators to changes in F, (iv) assessing the performance of indicators and monitoring programmes, and (v) identifying priorities for data collection and changes to model structure to reduce uncertainty.
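
    A minimal sketch of the unfiltered-to-filtered ensemble idea in point 2: enumerate every combination of candidate parameter values, run each model, and retain only those whose outputs pass screens based on abundance data and ecological principles. The parameter names, value grids, stand-in model and screening thresholds below are hypothetical placeholders, not the multispecies size-structured model itself.

```python
# Minimal sketch of building an unfiltered ensemble from all parameter
# combinations and screening it down to a filtered ensemble.
# Parameters, the stand-in model and the screens are placeholders.
import itertools

# e.g. 7 parameters x 5 candidate values each = 5**7 = 78 125 combinations
param_grid = {f"p{i}": [0.2, 0.4, 0.6, 0.8, 1.0] for i in range(1, 8)}

def run_model(params):
    """Stand-in for one multispecies size-structured community model run."""
    return {"abundance": sum(params.values()),
            "all_species_persist": min(params.values()) > 0.0}

def plausible(output):
    """Stand-in screens: abundance within observed bounds and species persistence."""
    return 3.0 <= output["abundance"] <= 5.0 and output["all_species_persist"]

filtered_ensemble = []
for values in itertools.product(*param_grid.values()):
    params = dict(zip(param_grid.keys(), values))
    if plausible(run_model(params)):
        filtered_ensemble.append(params)

print(f"{len(filtered_ensemble)} plausible models retained out of {5 ** 7}")
```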

    Development of a viable concrete printing process

    A novel Concrete Printing process has been developed, inspired and informed by advances in 3D printing, which has the potential to produce highly customised building components. Whilst still in their infancy, these technologies could create a new era of architecture that is better adapted to the environment and integrated with engineering function. This paper describes the development of a viable concrete printing process with a practical example in designing and manufacturing a concrete component (called Wonder Bench) that includes service voids and reinforcement. The challenges met, and those still to be overcome, particularly in the evaluation of the manufacturing tolerances of prints, are also discussed.

    Learning complex cell invariance from natural videos: A plausibility proof

    One of the most striking features of the cortex is its ability to wire itself. Understanding how the visual cortex wires up through development, and how visual experience refines connections into adulthood, is a key question for neuroscience. While computational models of the visual cortex are becoming increasingly detailed, the question of how such architectures could self-organize through visual experience is often overlooked. Here we focus on the class of hierarchical feedforward models of the ventral stream of the visual cortex, which extend the classical simple-to-complex cells model by Hubel and Wiesel (1962) to extra-striate areas, and have been shown to account for a host of experimental data. Such models assume two functional classes of simple and complex cells, with specific predictions about their respective wiring and resulting functionalities. In these networks, the issue of learning, especially for complex cells, is perhaps the least well understood. In fact, in most of these models, the connectivity between simple and complex cells is not learned but rather hard-wired. Several algorithms have been proposed for learning invariances at the complex cell level based on a trace rule that exploits the temporal continuity of sequences of natural images, but very few can learn from natural, cluttered image sequences. Here we propose a new variant of the trace rule that only reinforces the synapses between the most active cells and can therefore handle cluttered environments. The algorithm has so far been developed and tested at the level of V1-like simple and complex cells: we verified that Gabor-like simple cell selectivity could emerge from competitive Hebbian learning. In addition, we show how the modified trace rule allows the subsequent complex cells to learn to selectively pool over simple cells with the same preferred orientation but slightly different positions, thus increasing their tolerance to the precise position of the stimulus within their receptive fields.
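
    A minimal sketch of the kind of modified trace rule described above, in which only the most active complex cells, and only their most active simple-cell inputs, have their shared synapses reinforced, so that clutter does not drive spurious associations. The network sizes, learning rate, trace decay and winner fraction are illustrative assumptions, not the parameters used in the study.

```python
# Minimal sketch of a trace rule restricted to the most active units.
# All sizes and constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_simple, n_complex = 64, 8
W = rng.uniform(0.0, 0.1, size=(n_complex, n_simple))  # simple-to-complex weights
trace = np.zeros(n_complex)                            # temporal trace of complex-cell activity

eta_trace, lr, k_winners = 0.2, 0.05, 2

def step(simple_activity):
    """Process one frame of a (cluttered) natural image sequence."""
    global trace
    complex_activity = W @ simple_activity
    # Trace update exploits temporal continuity across consecutive frames
    trace = (1.0 - eta_trace) * trace + eta_trace * complex_activity
    # Modified rule: reinforce only synapses between the most active
    # complex cells and the most active simple cells in this frame.
    winners = np.argsort(trace)[-k_winners:]
    active_inputs = simple_activity > np.percentile(simple_activity, 90)
    for c in winners:
        W[c, active_inputs] += lr * trace[c] * simple_activity[active_inputs]
        W[c] /= np.linalg.norm(W[c])  # keep weights bounded

for _ in range(100):          # stand-in for a sequence of video frames
    step(rng.random(n_simple))
```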

    Utilising the risky shift phenomenon in construction project management

    The risky shift phenomenon predicts that groups are happier to live with uncertainty than are the individuals that comprise them. This paper reports on a replication of the Wallach et al. (1962) 12-question choice dilemma questionnaire, which highlights the risky shift and its implications for construction project risk management.

    Management and planning of a collaborative construction planning process

    Construction planning is performed in a multi-disciplinary environment in which it is crucial to explore interdependencies, manage the uncertainty of information exchange and understand the context. Current construction planning often works on a “throw over the wall” basis: plans are developed only or mainly for control purposes and ignore the “how” aspect. Construction method planning is treated as a linear process and isolated from information and logistics management. Planners are often puzzled by information; they usually receive a large amount of formal and informal communications in different formats, some of which are not relevant to their role. The quality of the information received is also often poor (i.e. incomplete design information). In order to deal with the uncertainty caused by insufficient information, guesses are frequently made in the planning process, which neither the initial planner nor the downstream planner will later check. These guesses are usually ignored and left until execution of the plan, when the problems reveal themselves. This paper argues for the importance of effective management of information flow in a planning process and the need to improve the management and planning of construction planning. A collaborative planning process model that uses a dependency structure matrix tool to manage and optimize the construction planning process is presented.
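
    A minimal sketch of how a dependency structure matrix can expose the information dependencies a planning process has to manage; marks above the diagonal are feedback dependencies, corresponding to the guesses that, as argued above, otherwise go unchecked until execution. The tasks and dependencies are hypothetical placeholders, not the collaborative planning model presented in the paper.

```python
# Minimal sketch of a dependency structure matrix (DSM) for planning tasks.
# Task names and dependencies are illustrative placeholders.
import numpy as np

tasks = ["design information", "method selection", "logistics", "programme", "control plan"]
n = len(tasks)

# dsm[i, j] = 1 means task i needs information produced by task j
dsm = np.zeros((n, n), dtype=int)
dsm[1, 0] = 1   # method selection needs design information
dsm[2, 1] = 1   # logistics needs the chosen method
dsm[3, 2] = 1   # programme needs logistics
dsm[1, 3] = 1   # method selection also needs the programme -> feedback loop
dsm[4, 3] = 1   # control plan needs the programme

# Marks above the diagonal in the current sequence are feedback dependencies:
# information assumed (guessed) before the producing task is complete.
feedback = [(tasks[i], tasks[j]) for i in range(n) for j in range(i + 1, n) if dsm[i, j]]
print("Feedback dependencies to manage or re-sequence:", feedback)
```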

    Let’s win Madrid: radical democracy and prefigurative constitutionality in the new municipalism

    The thesis proposes prefigurative constitutionality as a novel approach to the production of constitutional and political theory and the study of non-state political actors in the context of strategies for radical democracy. Prefigurative constitutionality requires a bespoke understanding of prefiguration itself, as a transversal complex of types and functions of political action, applicable far beyond the limits of ‘prefigurative politics’ narrowly conceived. It opens constitutional theory to the constitutive, self-valorising practices of movements and parties, discovering in their prefigurative constitutional practices both implicit critiques of mainstream constitutionalism, and emergent constitutional models of future democracies to come. It poses the vital challenge for prefigurative practices to recognise and refine their constitutional practices. It reflects back onto questions of methodology, demanding a novel adaptation of grounded theory methods within a critical theoretical framework. The prefigurative-constitutional lens is here applied to the experience of ‘new municipalism’ in Madrid, from its roots in the 15M movement, to the separate constitution of municipalist platform Ganemos Madrid, left populist party Podemos, and their attempted confluence in Ahora Madrid, the formation that would ultimately contest and win Madrid’s 2015 municipal election. Reading Ganemos’ asambleario constitutionality through the work of Antonio Negri, and Podemos’ populist constitutionality through the work of Ernesto Laclau, reveals novel constitutional models as well as offering deeper insight into the logics of their respective political strategies. Simultaneously, the key concepts of prefiguration and hegemony are critically developed. In seeking to explain the weakness of Ahora Madrid’s constitution, the concept of leadership becomes crucial. Beyond its explanatory function, leadership also constructs the theoretical ground on which prefiguration and hegemony, Negri and Laclau, Ganemos and Podemos all most productively intersect. The analysis culminates in the discovery of prefiguration and hegemony’s complementarity, and the need for their confluence in a unified theory.

    Contribution to the design of self-adaptive computing architectures integrating neuromorphic nanocomponents, and potential applications

    In this thesis, we study the potential applications of emerging memory nano-devices in computing architectures. More precisely, we show that neuro-inspired architectural paradigms could provide the efficiency and adaptability required in some complex image/audio processing and classification applications, at a much lower cost in terms of power consumption and silicon area than current Von Neumann-derived architectures, thanks to a synaptic-like usage of these memory nano-devices. This work focuses on memristive nano-devices, recently (re-)introduced with the discovery of the memristor in 2008, and their use as synapses in spiking neural networks. This includes most of the emerging memory technologies: Phase-Change Memory (PCM), Conductive-Bridging RAM (CBRAM), Resistive RAM (RRAM)... These devices are particularly suitable for the implementation of unsupervised learning algorithms inspired by neuroscience, such as Spike-Timing-Dependent Plasticity (STDP), requiring very little control circuitry. The integration of memristive devices in crossbar arrays could, moreover, provide the huge density required by this type of implementation (several thousand synapses per neuron), which is out of reach of a CMOS-only implementation. This can be seen as one of the main factors that hindered the rise of CMOS-based neural network computing architectures in the 1990s, along with the relative complexity and inefficiency of the back-propagation learning algorithm, despite all the promising aspects of such neuro-inspired architectures, such as adaptability and fault tolerance. In this work, we propose synaptic models for memristive devices and simulation methodologies for architectural designs exploiting them. Novel neuro-inspired architectures are introduced and simulated for natural data processing. They exploit the synaptic characteristics of memristive nano-devices, along with the latest progress in neuroscience. Finally, we propose hardware implementations for several device types. We assess their scalability and power-efficiency potential, as well as their robustness to the variability and faults that are unavoidable at the nanometric scale of these devices. This last point is of prime importance, as it still constitutes the main difficulty for the integration of these emerging technologies in digital memories.
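
    A minimal sketch of a pair-based STDP update of the kind such memristive synapses are expected to implement: a pre-before-post spike pair increases the conductance, a post-before-pre pair decreases it, and the conductance stays within the device range. The amplitudes, time constants and bounds are illustrative assumptions, not measured device characteristics.

```python
# Minimal sketch of a pair-based STDP rule acting on a normalised
# synaptic conductance. All constants are illustrative assumptions.
import numpy as np

A_plus, A_minus = 0.01, 0.012      # potentiation / depression amplitudes
tau_plus, tau_minus = 20.0, 20.0   # STDP time constants (ms)
g_min, g_max = 0.0, 1.0            # normalised conductance range of the device

def stdp_update(g, t_pre, t_post):
    """Return the new conductance after one pre/post spike pair (times in ms)."""
    dt = t_post - t_pre
    if dt > 0:    # pre fired before post: potentiation
        g += A_plus * np.exp(-dt / tau_plus)
    else:         # post fired before (or with) pre: depression
        g -= A_minus * np.exp(dt / tau_minus)
    return float(np.clip(g, g_min, g_max))

g = 0.5
g = stdp_update(g, t_pre=10.0, t_post=15.0)   # dt = +5 ms -> conductance increases
g = stdp_update(g, t_pre=30.0, t_post=22.0)   # dt = -8 ms -> conductance decreases
print(f"conductance after two spike pairs: {g:.4f}")
```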

    Defining an AEC research agenda - a vision from the UK

    This paper outlines the current research agenda for construction, as seen from a UK perspective, and the associated government initiatives. It then presents a vision of how design and construction will be undertaken and the implications for the management of this activity, structured around four themes: management, technology, the role of clients, and the role of industry and the professions. The research activity of the Department and relevant staff are outlined, and the paper concludes with a brief description of how we are taking forward our industrially based research.