
    Circulation, retention, and mixing of waters within the Weddell-Scotia Confluence, Southern Ocean: The role of stratified Taylor columns

    The waters of the Weddell-Scotia Confluence (WSC) lie above the rugged topography of the South Scotia Ridge in the Southern Ocean. Meridional exchanges across the WSC transfer water and tracers between the Antarctic Circumpolar Current (ACC) to the north and the subpolar Weddell Gyre to the south. Here, we examine the role of topographic interactions in mediating these exchanges, and in modifying the waters transferred. A case study is presented using data from a free-drifting, intermediate-depth float, which circulated anticyclonically over Discovery Bank on the South Scotia Ridge for close to 4 years. Dimensional analysis indicates that the local conditions are conducive to the formation of Taylor columns. Contemporaneous ship-derived transient tracer data enable estimation of the rate of isopycnal mixing associated with this column, with values of O(1000 m²/s) obtained. Although necessarily coarse, this estimate is of the same order as the rate of isopycnal mixing induced by transient mesoscale eddies within the ACC. A picture emerges of the Taylor column acting as a slow, steady blender, retaining the waters in the vicinity of the WSC for lengthy periods, during which they can be subject to significant modification. A full regional float data set, bathymetric data, and a Southern Ocean state estimate are used to identify other potential sites for Taylor column formation. We find that they are likely to be sufficiently widespread to exert a significant influence on water mass modification and meridional fluxes across the southern edge of the ACC in this sector of the Southern Ocean.
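
    As a rough illustration of the dimensional analysis referred to above (a minimal sketch, not the authors' calculation; all parameter values below are invented for a Discovery Bank-like setting), one can check whether the flow regime favours Taylor column formation by comparing the Rossby number with the fractional height of the topography:

        import numpy as np

        # Hypothetical parameters for a Discovery Bank-like seamount
        U = 0.05     # background flow speed (m/s), assumed
        L = 50e3     # horizontal length scale of the bank (m), assumed
        h = 1000.0   # height of the topographic bump (m), assumed
        H = 3500.0   # total water depth (m), assumed
        lat = -60.0  # latitude (degrees)

        Omega = 7.2921e-5                        # Earth's rotation rate (rad/s)
        f = 2 * Omega * np.sin(np.radians(lat))  # Coriolis parameter

        Ro = U / (abs(f) * L)  # Rossby number: small Ro means rotation dominates
        frac_h = h / H         # fractional height of the obstacle

        # Classical homogeneous-flow criterion: a Taylor column is favoured
        # when h/H exceeds O(Ro); stratification weakens the column, which is
        # why the paper considers *stratified* Taylor columns.
        print(f"Ro = {Ro:.4f}, h/H = {frac_h:.3f}")
        print("Taylor column plausible" if frac_h > Ro else "Taylor column unlikely")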

    Equality, value pluralism and relevance: Is luck egalitarianism in one way good, but not all things considered?

    Some luck egalitarians argue that justice is just one value among others and is thus not necessarily what we should strive for in order to make the world better. Yet, by focusing on only one dimension of what matters – luck equality – it proves very difficult to draw political implications in cases where several values are in tension. We believe that normative political philosophy must have the ambition to guide political action. Hence, in this paper we make a negative and a positive point. Negatively, we argue that the inability to offer recommendations on what to strive for potentially weakens Kasper Lippert-Rasmussen’s account of luck egalitarianism. To remain relevant to political practice, a more serviceable version of luck egalitarianism that allows for all-things-considered judgments is needed. Positively, we examine two possible routes toward such a view. One would be to stick to pluralism, but to discuss possible clashes and find a rule of regulation in each case. The other would consist in giving up value pluralism by identifying an overarching value or principle that would arbitrate between different values. We suggest that Lippert-Rasmussen’s foundation of equality carries the potential for such an overarching principle.

    CLUSTERING AND INDEXING HISTORIC VESSEL MOVEMENT DATA WITH SPACE FILLING CURVES

    This paper reports on the results of an ongoing study using Space Filling Curves (SFCs) for indexing and clustering vessel movement message data (obtained via the Automated Identification System, AIS) inside a geographical Database Management System (Geo-DBMS). With AIS, vessels transmit their positions at intervals ranging from 2 seconds to 3 minutes; every 6 minutes, voyage-related information is broadcast. Relevant AIS messages contain a position, a timestamp and a vessel identifier. This information can be stored in a DBMS as separate columns with different types (as 2D point plus time plus identifier), or in one integrated column (as a higher-dimensional 4D point encoded as a position on a space filling curve, which we call the SFC-key). Indexing based on this SFC-key column can then replace the separate indexes, and this one integrated index needs less storage space than separate indexes. Moreover, the integrated index allows good clustering (physical ordering of the table). In an approach with separate indexes for location, time and object identifier, the query optimizer inside a DBMS has to estimate which index is most selective for a given query; it is not possible to use two indexes at the same time – e.g. in the case of a space-time query. An approach with one multi-dimensional integrated index does not have this problem and results in faster query responses when multiple selection criteria are specified, i.e. both a search geometry and a time interval. We explain the steps needed to make this SFC approach available fully inside a DBMS (to avoid expensive data transfer to external programs during use). The SFC approach makes it possible to cluster the (spatio-temporal) data better than an approach with separate indexes. Moreover, we report experiments (with 723,853,597 AIS position report messages spanning 3 months, Sep–Dec 2016, using data for Europe, both on-sea and inland waterways) comparing an approach based on one multi-dimensional integrated index (using a SFC) with the non-integrated approach. We analyze loading time (including SFC encoding) and storage requirements, together with the speed of query execution and the granularity of answers. The conclusion is that, for space-time queries where both dimensions are selective, query execution with the integrated SFC approach outperforms the non-integrated approach (typically by a factor of 2–6). The SFC approach also saves considerably on storage space (less space is needed for indexes). Lastly, we propose some future improvements to obtain better query performance with the SFC approach (e.g. IOT, range-glueing and nD-histogram).
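
    As a hedged sketch of the integrated-key idea (the function below is hypothetical; it uses a Morton/Z-order curve for simplicity, which is only one of several possible SFC choices), a 4D point of quantised x, y, time and vessel identifier can be interleaved bit-by-bit into a single SFC-key on which one clustered index is built:

        def morton_key_4d(x: int, y: int, t: int, obj_id: int, bits: int = 16) -> int:
            """Interleave the bits of four non-negative integer coordinates
            into one Z-order (Morton) key; points close in all four
            dimensions end up close in key order, enabling clustering."""
            key = 0
            for i in range(bits):
                for d, coord in enumerate((x, y, t, obj_id)):
                    key |= ((coord >> i) & 1) << (4 * i + d)
            return key

        # Example: quantised longitude, latitude, timestamp and vessel id.
        # In a real system, floats are first mapped onto an integer grid.
        print(hex(morton_key_4d(12345, 54321, 40000, 987)))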

    Changes in global ocean bottom properties and volume transports in CMIP5 models under climate change scenarios

    Changes in bottom temperature, salinity and density in the global ocean by 2100 for CMIP5 climate models are investigated for the climate change scenarios RCP4.5 and RCP8.5. The mean of 24 models shows a decrease in density in all deep basins except the North Atlantic, which becomes denser. The individual model responses to climate change forcing are more complex: regarding temperature, the 24 models predict a warming of the bottom layer of the global ocean; in salinity, there is less agreement regarding the sign of the change, especially in the Southern Ocean. The magnitude and equatorward extent of these changes also vary strongly among models. The changes in properties can be linked with changes in the mean transport of key water masses. The Atlantic Meridional Overturning Circulation weakens in most models and is directly linked to changes in bottom density in the North Atlantic. These changes are due to the intrusion of modified Antarctic Bottom Water, made possible by the decrease in North Atlantic Deep Water formation. In the Indian, Pacific and South Atlantic, changes in bottom density are congruent with the weakening of Antarctic Bottom Water transport through these basins. We argue that the greater the 1986-2005 meridional transports, the further the changes have propagated equatorwards by 2100. However, strong decreases in density over 100 years of climate change cause a weakening of the transports. The speed at which these property changes reach the deep basins is critical for a correct assessment of the heat storage capacity of the oceans as well as for predictions of future sea level rise.
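
    A minimal sketch of how such bottom density changes can be quantified from modelled temperature and salinity using the TEOS-10 GSW toolbox (the gsw Python package); the input values here are invented for illustration and are not CMIP5 output:

        import gsw  # TEOS-10 Gibbs SeaWater toolbox (pip install gsw)

        # Hypothetical bottom-layer values at one grid point, present vs 2100
        SA_now, CT_now = 34.85, 0.2  # Absolute Salinity (g/kg), Conservative Temperature (degC)
        SA_fut, CT_fut = 34.80, 0.6  # assumed warmer, slightly fresher future state
        p = 4000.0                   # pressure at the bottom (dbar)

        rho_now = gsw.rho(SA_now, CT_now, p)  # in-situ density (kg/m^3)
        rho_fut = gsw.rho(SA_fut, CT_fut, p)
        print(f"Bottom density change: {rho_fut - rho_now:+.3f} kg/m^3")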

    Staying true with the help of others: doxastic self-control through interpersonal commitment

    I explore the possibility and rationality of interpersonal mechanisms of doxastic self-control, that is, ways in which individuals can make use of other people in order to get themselves to stick to their beliefs. I look, in particular, at two ways in which people can make interpersonal epistemic commitments, and thereby willingly undertake accountability to others, in order to get themselves to maintain their beliefs in the face of anticipated “epistemic temptations”. The first way is through the avowal of belief, and the second is through the establishment of collective belief. I argue that both of these forms of interpersonal epistemic commitment can function as effective tools for doxastic self-control, and, moreover, that the control they facilitate should not be dismissed as irrational from an epistemic perspective.

    Using XDAQ in Application Scenarios of the CMS Experiment

    XDAQ is a generic data acquisition software environment that emerged from a rich set of use-cases encountered in the CMS experiment. These cover not only the deployment for multiple sub-detectors and the operation of different processing and networking equipment, but also a distributed collaboration of users with different needs. The use of the software in various application scenarios has demonstrated the viability of the approach. We discuss two applications: the tracker local DAQ system for front-end commissioning and the muon chamber validation system. The description is completed by a brief overview of XDAQ.
    Comment: Conference CHEP 2003 (Computing in High Energy and Nuclear Physics, La Jolla, CA)

    The CMS Event Builder

    The data acquisition system of the CMS experiment at the Large Hadron Collider will employ an event builder which will combine data from about 500 data sources into full events at an aggregate throughput of 100 GByte/s. Several architectures and switch technologies have been evaluated for the DAQ Technical Design Report by measurements with test benches and by simulation. This paper describes studies of an EVB test-bench based on 64 PCs acting as data sources and data consumers and employing both Gigabit Ethernet and Myrinet technologies as the interconnect. In the case of Ethernet, protocols based on Layer-2 frames and on TCP/IP are evaluated. Results from ongoing studies, including measurements of throughput and scaling, are presented. The architecture of the baseline CMS event builder is then outlined. The event builder is organised into two stages with intelligent buffers in between. The first stage contains 64 switches performing a first level of data concentration by building super-fragments from the fragments of 8 data sources. The second stage combines the 64 super-fragments into full events. This architecture allows installation of the second stage of the event builder in steps, with the overall throughput scaling linearly with the number of switches in the second stage. Possible implementations of the components of the event builder are discussed and the expected performance of the full event builder is outlined.
    Comment: Conference CHEP0
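
    The two-stage structure described above can be sketched in a few lines (a toy model, not CMS software; 512 sources are assumed here so that the 8-to-1 and 64-to-1 grouping works out exactly):

        from itertools import islice

        N_SOURCES = 512   # "about 500" data sources, rounded to 64 x 8
        GROUP = 8         # fragments combined into one super-fragment
        N_SWITCHES = 64   # first-stage switches / second-stage inputs

        def build_event(fragments):
            """Stage 1 builds super-fragments from groups of 8 fragments;
            stage 2 concatenates the 64 super-fragments into a full event."""
            assert len(fragments) == N_SOURCES
            it = iter(fragments)
            super_fragments = [b"".join(islice(it, GROUP))
                               for _ in range(N_SOURCES // GROUP)]
            assert len(super_fragments) == N_SWITCHES
            return b"".join(super_fragments)

        event = build_event([bytes([i % 256]) * 2 for i in range(N_SOURCES)])
        print(len(event), "bytes assembled into one event")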

    TOWARDS A SCALE DEPENDENT FRAMEWORK FOR CREATING VARIO-SCALE MAPS

    Traditionally, the content for vario-scale maps has been created using a ‘one fits all’ approach, identical for all scales. Initially, only the delete/merge operation was used to create the vario-scale data, using the importance and compatibility functions defined at class level (and evaluated at instance level) to create the tGAP structure with a planar partition as basis. In order to improve the generalization quality, other operators and techniques have been added over the past years, e.g. simplify, collapse (change area to line representation), split, attractiveness regions and the introduction of the concept of linear network topology. However, the decision of which operation to apply has been hard-coded in our software, making it inflexible. Further, we want to include awareness of the current scale when deciding which generalization operation to apply. For this purpose we propose the scale dependent framework (SDF), which at its core contains the encoding of the generalization knowledge in the SDF conceptual model. This SDF model covers the representation of scale dependent class importance, scale dependent class compatibility values, scale dependent attractiveness regions and, last but not least, the specification of generalization operations that are scale and class dependent. By changing the settings in the SDF configuration and re-running the vario-scale generalization process, we can easily experiment in order to find the best settings (for specific map user needs). In this paper we design the SDF conceptual model and explicitly motivate and define the scope of its expressiveness. We further present the improved scale dependent tGAP creation software and present initial results in the form of improved vario-scale map content.
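
    A minimal sketch of what an SDF-style configuration could look like (the class names, scale ranges and values below are invented; the paper itself defines the actual SDF conceptual model):

        from dataclasses import dataclass

        @dataclass
        class SdfRule:
            """One scale-dependent rule: within [scale_min, scale_max),
            apply `operation` to instances of `feature_class`."""
            feature_class: str
            scale_min: float   # scale denominator, e.g. 10000 for 1:10k
            scale_max: float
            importance: float  # scale-dependent class importance
            operation: str     # e.g. 'merge', 'simplify', 'collapse', 'split'

        # Hypothetical configuration entries
        SDF_CONFIG = [
            SdfRule('building', 1_000, 25_000, importance=0.8, operation='simplify'),
            SdfRule('building', 25_000, 100_000, importance=0.4, operation='merge'),
            SdfRule('road', 10_000, 50_000, importance=0.9, operation='collapse'),
        ]

        def rules_for(feature_class: str, scale: float):
            """Look up which operations the SDF prescribes at the current scale."""
            return [r for r in SDF_CONFIG
                    if r.feature_class == feature_class
                    and r.scale_min <= scale < r.scale_max]

        print(rules_for('building', 30_000))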