
    Consistency of Urban Background Black Carbon Concentration Measurements by Portable AE51 and Reference AE22 Aethalometers: Effect of Corrections for Filter Loading

    Monitoring exposure to black carbon with portable devices is an important part of researching the health impacts of combustion-related air pollutants. We collected 786 hourly averaged equivalent black carbon (eBC) measurements from co-located duplicate portable AE51 Aethalometers and a UK Government reference AE22 Aethalometer (the data for the latter were corrected for filter darkening effects using a standard procedure) at an urban background site in Glasgow, UK. The AE51 and the reference concentrations were highly correlated (R² ≥ 0.87) for the combined deployment periods. The application of a previously reported method for correcting the AE51’s underestimation of concentrations associated with filter loading generally led to an overestimation of values (specifically, the normalised mean bias values for the two AE51s increased from –2% and +3% to +14% and +25% across the full range of measurements after correction). We found only limited and inconsistent evidence that the AE51 Aethalometers (attenuation [AE51_ATN] ≤ ~52) underestimated the eBC concentrations compared to the reference measurements. Thus, our observations indicate that the AE51 can achieve close agreement with the reference AE22 monitor without applying corrections for filter loading, at relatively low AE51_ATN values, in environments with low eBC concentrations.
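    The two quantities at the heart of this comparison can be sketched in a few lines: a Virkkula-style multiplicative loading correction, eBC_corr = (1 + k·ATN)·eBC, and the normalised mean bias (NMB) against the reference monitor. This is a minimal illustration only; the constant k and the sample data below are hypothetical, not values from the study.

```python
# Sketch: Virkkula-type filter-loading correction and normalised mean bias (NMB).
# The value of k and the sample readings are hypothetical illustrations.

def correct_loading(ebc, atn, k=0.004):
    """Multiplicative loading correction: eBC_corr = (1 + k * ATN) * eBC."""
    return (1 + k * atn) * ebc

def normalised_mean_bias(measured, reference):
    """NMB = sum(measured - reference) / sum(reference)."""
    return sum(m - r for m, r in zip(measured, reference)) / sum(reference)

ref  = [1.00, 1.20, 0.80, 1.50]   # reference eBC (ug/m3), hypothetical
ae51 = [0.95, 1.25, 0.78, 1.40]   # portable-monitor eBC, hypothetical
atn  = [10, 20, 35, 50]           # filter attenuation at each reading

nmb_raw  = normalised_mean_bias(ae51, ref)
nmb_corr = normalised_mean_bias(
    [correct_loading(e, a) for e, a in zip(ae51, atn)], ref)
```

    Because the correction factor (1 + k·ATN) is always ≥ 1, applying it can only raise the reported concentrations, which is how a near-unbiased raw series can become positively biased after correction, as the abstract reports.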

    Testing performance of standards-based protocols in DPM

    To promote the increased use of non-proprietary protocols in grid storage systems, we test the performance of WebDAV and pNFS transport with the DPM storage solution. We find that the standards-based protocols perform similarly to the proprietary protocols currently in use, despite some issues with the current state of the implementations. We therefore conclude that there is no performance-based reason to avoid such protocols for data management in future.

    Metaphors in the Wealth of Nations

    This paper reconstructs the ways in which metaphors are used in the text of “The Wealth of Nations”. Its claims are: a) the metaphor statements are basically similar to those in “The Theory of Moral Sentiments”; b) the metaphors’ ‘primary subjects’ refer to mechanics, hydraulics, blood circulation, agriculture, and medicine; c) the metaphors may be grouped into two families: mechanical analogies and iatro-political analogies. Further claims are: a basic physico-moral analogy is the framework for Smith’s psychological theory as well as for his overall social theory and his theory of market mechanisms; an iatro-mechanical analogy is as pervasive as the physico-moral analogy and provides the framework for his overall evolutionary theory of society; and the invisible-hand simile relies on the physico-moral analogy, elaborating on the role of the vis attractiva and vis a tergo in mechanics.

    Humans, robots and values

    The issue of machines replacing humans dates back to the dawn of industrialisation. In this paper we examine what is fundamental in the distinction between human and robotic work by reflecting on the work of the classical political economists and engineers. We examine the relationship between the ideas of machine work and human work in Marx and Watt, as well as their role in the creation of economic value, and the extent to which artificial power sources could feasibly substitute for human effort in their arguments. We go on to examine the differing views of Smith and Marx with respect to the economic effort contributed by animals, and consider whether the philosophical distinction made between human and non-human work can be sustained in the light of modern biological research. We emphasise the non-universal character of animal work before discussing the ideas of universal machines in Čapek and Turing, giving as a counter-example a cloth-folding robot being developed in our School. We then return to Watt and discuss the development of thermodynamics and information theory, showing how recent research has led to a unification not only of these fields but also to a unitary understanding of the labour process and the value-creation process. Finally, we look at the implications of general robotisation for profitability and the future of capitalism, drawing on the work of von Neumann not only on computers but also in economics to point to the ‘real’ threat posed by robots.

    Individualism


    Analysing I/O bottlenecks in LHC data analysis on grid storage resources

    We describe recent I/O testing frameworks that we have developed and applied within the UK GridPP Collaboration, the ATLAS experiment and the DPM team, for a variety of distinct purposes. These include benchmarking vendor-supplied storage products, discovering the scaling limits of SRM solutions, tuning storage systems for experiment data analysis, evaluating file access protocols, and exploring the I/O read patterns of experiment software and their underlying event data models. With multiple grid sites now dealing with petabytes of data, such studies are becoming essential. We describe how the tests build on, and improve, previous work, and contrast how the use-cases differ. We also detail the results obtained and the implications for storage hardware, middleware and experiment software.
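    At their core, read-pattern studies of the kind described reduce to timing the same file under different access patterns. The sketch below is a storage-agnostic toy version of that idea (file size, block size and offset patterns are arbitrary choices, not the frameworks used in the paper):

```python
# Sketch: compare sequential vs. scattered block reads on a scratch file.
# Sizes and offset patterns are arbitrary illustrations.
import os
import tempfile
import time

def time_reads(path, block_size, offsets):
    """Time reads of block_size bytes at each offset in order."""
    start = time.perf_counter()
    with open(path, "rb") as f:
        for off in offsets:
            f.seek(off)
            f.read(block_size)
    return time.perf_counter() - start

# Create a 1 MiB scratch file of random data
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(1 << 20))
    scratch = tmp.name

block = 4096
sequential = [i * block for i in range(256)]                # in-order offsets
scattered = [(i * 37 % 256) * block for i in range(256)]    # permuted offsets

t_seq = time_reads(scratch, block, sequential)
t_scat = time_reads(scratch, block, scattered)
os.unlink(scratch)
```

    On a page-cached local file the two timings will be close; the differences the paper studies emerge when the same patterns hit remote grid storage over protocols such as WebDAV, where each seek can translate into network round-trips.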

    Quality modeling in electronic healthcare: a study of mHealth Service

    Information and communication technologies (ICTs) have the potential to radically transform health services in developing countries. Among the various ICT-driven health platforms, mobile health is the most promising because of its widespread penetration and cost-effective services. This paper examines quality modelling in electronic healthcare using partial least squares based structural equation modelling (PLS-SEM).

    Minimum Bias Triggers at ATLAS, LHC

    In the first phase of LHC data-taking, ATLAS will measure the charged-particle density at the initial center-of-mass energy of 10 TeV and then at 14 TeV. This will allow us to improve our knowledge of soft QCD models and to pin down the cross-sections of different classes of inelastic collisions at LHC energies. In particular, the dominant non-diffractive interaction is a key process for understanding QCD backgrounds when we reach higher luminosities. We highlight two minimum-bias triggers, sensitive to particles in complementary ranges of pseudo-rapidity: one based on signals from the Inner Detector, the other explicitly designed to trigger on inelastic processes. Studies of their trigger efficiencies, as well as possible trigger biases, are presented. Comment: 3 pages, 4 figures, poster proceedings for ICHEP 200
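    At its simplest, a trigger-efficiency study of the kind mentioned is a counting experiment: the fraction of events in an unbiased reference sample that also fire the trigger, with a binomial uncertainty. A minimal sketch (the event counts below are hypothetical, not ATLAS results):

```python
# Sketch: trigger efficiency as a counting experiment with binomial error.
# Event counts are hypothetical illustrations.
import math

def trigger_efficiency(n_pass, n_total):
    """Return (efficiency, binomial standard error) for n_pass out of n_total."""
    eff = n_pass / n_total
    err = math.sqrt(eff * (1 - eff) / n_total)
    return eff, err

# Hypothetical counts for two complementary minimum-bias triggers
eff_id, err_id = trigger_efficiency(9_600, 10_000)      # tracking-based trigger
eff_inel, err_inel = trigger_efficiency(9_900, 10_000)  # inelastic-process trigger
```

    The simple binomial error above breaks down near efficiency 0 or 1, where interval methods (e.g. Clopper-Pearson) are preferred; the sketch only illustrates the counting logic.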

    An Invasive Metaphor: the Concept of Centre of Gravity in Economics

    This paper undertakes a critical examination of the concept of 'centre of gravity' as adapted by economics from classical mechanics, relating it to the idea of 'long-run' profits, prices and quantities as presented in the work of the post-Sraffians. It also addresses the origin of this concept of 'long-run' in Marshall's distinction between long-run and short-run determinations of economic magnitudes. It shows that economists have generally conceived of the centre of gravity as a theoretical magnitude which is not observed, but around which observed magnitudes oscillate either randomly or in some deterministic manner; this much is generally agreed. This idea has, however, been interpreted in two distinct ways in the history of economic thought: (1) as an attractor dynamically determined at each point in time by the path-dependent historical processes which have led the economy to its present state; (2) as a hypothetical static equilibrium state of the economy determined independently of history by its current exogenous parameters (utility, technical capacity, etc.). The paper demonstrates that these two ideas are necessarily distinct and that both must be taken into account in any pluralistic research programme. Mathematically, the attractor of a variable is not in general equal to its hypothetical static equilibrium, except in highly restricted circumstances such as the absence of technical change. Moreover, again outside of exceptional circumstances, the divergence between the predictions of observed magnitudes given by the two approaches increases over time, so it cannot even be accepted that one converges on the other. Error will therefore result if it is assumed a priori that (1) is identical to (2). The fact that the two conceptions lead to different predictions does not decide that either one is correct. This should be determined empirically, and therefore an agreed empirical test should be established by the community of social scientists or, better still, society. The paper argues that, empirically, the 'test variable' against which both conceptions should be checked is the time average of the variables in question. This is not a distinct concept of 'centre of gravity' but an empirical observable. In a pluralistic programme, the predictions of both conceptions should be evaluated against this proposed test variable. The second part of the paper examines the common basis for the critical stance taken by both Keynes and Marx towards the second conception, which is rooted in a common attitude to the relation between substance and accident and a correspondingly similar conception of uncertainty. It relates this to the work of Quetelet and the development of the statistical method in sociology, which, it argues, is rooted in a conception of social magnitudes ontologically distinct from that found in economics and closer to the concept which Keynes and Marx shared. It argues that the post-Sraffian conception of the long run is based on a fallacious identification of these two distinct concepts, rendering the post-Sraffian approach equally incompatible with Keynes's and Marx's theories, and that the post-Sraffian conception of centre of gravity is 'intrinsically antipluralistic' in that it depends absolutely on the conflation of two concepts which are in fact necessarily distinct, leading to the suppression of the non-equilibrium concept as an alternative to the scientific procedure of testing the predictions of both concepts against an observable.
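    The divergence between the two conceptions, and the role of the time average as an observable test variable, can be made concrete with a toy simulation: a noisy path around an attractor that drifts each step (standing in for technical change), compared with a static equilibrium that stays fixed. The process and all parameters below are purely illustrative.

```python
# Toy model: a path oscillating around a drifting attractor vs. a fixed
# static equilibrium. All parameters are illustrative, not from the paper.
import random

def simulate_path(p_eq, drift, noise, steps, seed=0):
    """Return a path that fluctuates around an attractor which starts at the
    static equilibrium p_eq but drifts by `drift` each step."""
    rng = random.Random(seed)
    attractor = p_eq
    path = []
    for _ in range(steps):
        attractor += drift                    # path-dependent attractor moves
        path.append(attractor + rng.gauss(0, noise))
    return path

path = simulate_path(p_eq=100.0, drift=0.05, noise=1.0, steps=2000)
time_average = sum(path) / len(path)
# With nonzero drift the time average departs from the static equilibrium
# (100.0), and the gap grows with the length of the sample.
```

    With drift set to zero the two conceptions coincide and the time average recovers the static equilibrium; with any nonzero drift they separate, which is the restricted-circumstances point the abstract makes.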

    Introducing instrumentation and data acquisition to mechanical engineering students using LabVIEW

    For several years, LabVIEW has been used within the Department of Mechanical Engineering at the University of Strathclyde as the basis for introducing the basic concepts and practice of data acquisition, and more generally instrumentation, to postgraduate engineering students and undergraduate project students. The objectives of introducing LabVIEW within the curriculum were to expose students to instrumentation and experimental analysis, and to create courseware that could be used flexibly for a range of students. It was also important that staff time for laboratory work be kept to manageable levels. A course module was developed which allows engineering students with little or no previous knowledge of instrumentation or programming to become acquainted with the basics of programming, experimentation and data acquisition. The basic course structure has been used to teach both undergraduates and postgraduates, as well as laboratory technical staff. The paper describes the objectives of using LabVIEW for teaching, the structure of the module developed, the response of students who have taken the course, and how it is intended to expand delivery to greater student numbers.