
    Anatomy of extraordinary rainfall and flash flood in a Dutch lowland catchment

    On 26 August 2010 the eastern part of The Netherlands and the bordering part of Germany were struck by a series of rainfall events lasting for more than a day. Over an area of 740 km² more than 120 mm of rainfall were observed in 24 h. This extreme event resulted in local flooding of city centres, highways and agricultural fields, and considerable financial loss. In this paper we report on the unprecedented flash flood triggered by this exceptionally heavy rainfall event in the 6.5 km² Hupsel Brook catchment, which has been the experimental watershed employed by Wageningen University since the 1960s. This study aims to improve our understanding of the dynamics of such lowland flash floods. We present a detailed hydrometeorological analysis of this extreme event, focusing on its synoptic meteorological characteristics, its space-time rainfall dynamics as observed with rain gauges, weather radar and a microwave link, as well as the measured soil moisture, groundwater and discharge response of the catchment. At the Hupsel Brook catchment 160 mm of rainfall was observed in 24 h, corresponding to an estimated return period of well over 1000 years. As a result, discharge at the catchment outlet increased from 4.4 × 10⁻³ to nearly 5 m³ s⁻¹. Within 7 h discharge rose from 5 × 10⁻² to 4.5 m³ s⁻¹. The catchment response can be divided into four phases: (1) soil moisture reservoir filling, (2) groundwater response, (3) surface depression filling and surface runoff and (4) backwater feedback. The first 35 mm of rainfall were stored in the soil without a significant increase in discharge. Relatively dry initial conditions (in comparison to those for past discharge extremes) prevented an even faster and more extreme hydrological response.
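The figures quoted in the abstract can be cross-checked with simple water-balance arithmetic. A minimal sketch, using only the numbers given above (catchment area, 24 h rainfall depth, and peak discharge); the 7 h peak-rate window is an illustrative assumption, not a statement from the paper:

```python
# Back-of-envelope consistency check of the storm figures quoted above.
AREA_KM2 = 6.5   # Hupsel Brook catchment area (from the abstract)
RAIN_MM = 160.0  # 24 h rainfall observed at the catchment (from the abstract)

# Rainfall depth (mm) over an area (km^2) -> volume (m^3):
# 1 mm over 1 km^2 = 1e-3 m * 1e6 m^2 = 1e3 m^3
volume_m3 = RAIN_MM * AREA_KM2 * 1e3
print(f"Total rainfall volume: {volume_m3:.2e} m^3")  # 1.04e+06 m^3

# A peak discharge of ~4.5 m^3/s sustained for 7 h (an assumed window)
# would drain only a modest fraction of that volume.
drained_m3 = 4.5 * 7 * 3600
print(f"Fraction drained at peak rate over 7 h: {drained_m3 / volume_m3:.0%}")  # 11%
```

The gap between rainfall volume and peak outflow is consistent with the four-phase response described above, in which much of the early rainfall goes into soil moisture and surface storage rather than discharge.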

    Polar föhn winds and warming over the Larsen C Ice Shelf, Antarctica

    Recent hypotheses that the foehn effect is partly responsible for warming to the east of the Antarctic Peninsula (AP) and enhanced melt rates on the Larsen C Ice Shelf are supported in a study combining the analysis of observational and high resolution model data. Leeside warming and drying during foehn events is observed in new aircraft, radiosonde and automatic weather station data and simulated by the UK Met Office Unified Model at ~1.5 km grid spacing (MetUM 1.5 km). Three contrasting cases are investigated. In Case A relatively weak southwesterly flow induces a nonlinear foehn event. Strongly accelerated flow above and a hydraulic jump immediately downwind of the lee slopes lead to high amplitude warming in the immediate lee of the AP, downwind of which the warming effect diminishes rapidly due to the upward ‘rebound’ of the foehn flow. Case C defines a relatively linear case associated with strong northwesterly winds. The lack of a hydraulic jump enables foehn flow to flood across the entire ice shelf at low levels. Melt rates are high due to a combination of large radiative heat flux, due to dry, clear leeside conditions, and sensible heat flux downward from the warm, well-mixed foehn flow. Climatological work suggests that such strong northwesterly cases are often responsible for high Larsen C melt rates. Case B describes a weak, relatively non-linear foehn event associated with insignificant daytime melt rates. Previously unknown jets – named polar foehn jets – emanating from the mouths of leeside inlets are identified as a type of gap flow. They are cool and moist relative to adjacent calmer regions, due to lower-altitude upwind source regions, and are characterised by larger turbulent heat fluxes both within the air column and at the surface. 
The relative importance of the three mechanisms deemed to induce leeside foehn warming (isentropic drawdown, latent heating and sensible heating) is quantified using a novel method analysing back trajectories and MetUM 1.5 km model output. It is shown that, depending on the linearity of the flow regime and the humidity of the air mass, each mechanism can dominate. This implies that there is no single dominant foehn warming mechanism, contrary to the conclusions of previous work.

    A comprehensive test of order choice theory: recent evidence from the NYSE

    We perform a comprehensive test of order choice theory using a sample period in which the NYSE trades in decimals and allows automatic executions. We analyze the decision to submit or cancel an order or to take no action. For submitted orders we distinguish order type (market vs. limit), order side (buy vs. sell), execution method (floor vs. automatic), and order pricing aggressiveness. We use a multinomial logit specification and a new statistical test. We find that changes in order flow are negatively autocorrelated over five-minute intervals, supporting dynamic limit order book theory, despite a positive first-order autocorrelation in order type. Orders routed to the NYSE’s floor are sensitive to market conditions (e.g., spread, depth, volume, volatility, market and individual-stock returns, and private information), but those using the automatic execution system (Direct+) are insensitive to market conditions. When the quoted depth is large, traders are more likely to “jump the queue” by submitting limit orders with limit prices bettering existing quotes. Aggressively priced limit orders are more likely late in the trading day, providing evidence in support of prior experimental results.
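The multinomial logit mentioned above models the probability of each order-choice category as a softmax over linear scores. A minimal sketch of that probability calculation; the feature vector, coefficient values, and category labels below are invented for illustration, not estimates from the NYSE sample:

```python
import numpy as np

# Multinomial logit choice probabilities:
#   P(choice k | x) = exp(x . b_k) / sum_j exp(x . b_j)
# All numbers here are hypothetical, for illustration only.
choices = ["market", "limit", "cancel", "no_action"]

# x: [intercept, quoted spread ($), quoted depth (round lots, scaled)]
x = np.array([1.0, 0.02, 5.0])
B = np.array([
    [0.2, -10.0, 0.00],   # market order
    [0.1,   5.0, 0.05],   # limit order
    [-0.5,  2.0, 0.01],   # cancel
    [0.0,   0.0, 0.00],   # no action (base category)
])

scores = B @ x
p = np.exp(scores - scores.max())  # subtract max for numerical stability
p /= p.sum()
for c, prob in zip(choices, p):
    print(f"{c:>9}: {prob:.3f}")
```

In an estimated model, a coefficient pattern like the positive depth loading on the limit-order row would correspond to the queue-jumping behaviour the abstract describes: deeper quotes raise the probability of an aggressively priced limit order.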

    Hydrological connectivity inferred from diatom transport through the riparian-stream system

    Funding for this research was provided by the Luxembourg National Research Fund (FNR) in the framework of the BIGSTREAM (C09/SR/14), ECSTREAM (C12/SR/40/8854) and CAOS (INTER/DFG/11/01) projects. We are most grateful to the Administration des Services Techniques de l’Agriculture (ASTA) for providing meteorological data. We also acknowledge Delphine Collard for technical assistance in diatom sample treatment and preparation, François Barnich for the water chemistry analyses, and Jean-François Iffly, Christophe Hissler, Jérôme Juilleret, Laurent Gourdol and Julian Klaus for their constructive comments on the project and technical assistance in the field. Peer reviewed. Publisher PDF.

    A new data assimilation procedure to develop a debris flow run-out model

    Abstract Parameter calibration is one of the most problematic phases of numerical modeling, since the choice of parameters affects the model's reliability with respect to the physical problems being studied. In some cases, laboratory tests or physical models for evaluating model parameters are not feasible and other strategies must be adopted; numerical models reproducing debris flow propagation are one such case. Because scale effects hamper the reproduction of real debris flows in the laboratory, and specific tests to determine rheological parameters are difficult to perform, calibration is usually carried out by subjectively comparing only a few quantities, such as the deposit heights calculated for some cross-sections of the debris flow, or the distance traveled by the flow, against the values surveyed in situ after an event has occurred. Since no automatic or objective procedure has as yet been produced, this paper presents a numerical procedure based on the application of a statistical algorithm, which makes it possible to define the best parameter set without ambiguity. The procedure has been applied to a case study for which digital elevation models from both before and after an important event exist, so that a good database for applying the method was available. Its application has provided insights that help to better understand debris flows and related phenomena.
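The objective calibration idea described above can be sketched as a search that minimises the misfit between simulated and observed deposit heights. A minimal illustration under stated assumptions: the `toy_runout_model` below is an invented stand-in for a real debris-flow propagation code, and the "observed" profile is synthetic rather than a DEM difference:

```python
import numpy as np

def toy_runout_model(friction, sections):
    # Hypothetical stand-in model: deposit height decays with distance
    # along the flow path, controlled by a single friction parameter.
    return np.exp(-friction * sections)

sections = np.linspace(0.0, 2.0, 10)        # distances along the flow path
observed = toy_runout_model(0.7, sections)  # synthetic "field survey" profile

# Objective calibration: score every candidate parameter with RMSE against
# the observed deposit heights, instead of eyeballing a few cross-sections.
candidates = np.linspace(0.1, 1.5, 141)
rmse = [np.sqrt(np.mean((toy_runout_model(f, sections) - observed) ** 2))
        for f in candidates]
best = candidates[int(np.argmin(rmse))]
print(f"best friction parameter: {best:.2f}")  # recovers 0.70
```

In practice the statistical algorithm and misfit measure would be those of the paper, and the observed heights would come from the before/after digital elevation models; the point of the sketch is only that a scored search replaces subjective comparison.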

    Web Services: A Process Algebra Approach

    It is now well accepted that formal methods are helpful for many issues raised in the Web service area. In this paper we present a framework for the design and verification of Web services using process algebras and their tools. We define a two-way mapping between abstract specifications written using these calculi and executable Web services written in BPEL4WS. Several choices are available: design and correct errors in BPEL4WS using process algebra verification tools, or design and correct in process algebra and automatically obtain the corresponding BPEL4WS code. The approaches can be combined. Process algebras are not useful only for temporal logic verification: we highlight the use of simulation/bisimulation both for verification and for the hierarchical refinement design method. It is worth noting that our approach allows the use of any process algebra, depending on the needs of the user at different levels (expressiveness, existence of reasoning tools, user expertise).
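The simulation/bisimulation checking mentioned above operates on labelled transition systems. A minimal sketch of a strong-bisimulation check by naive partition refinement; the two small processes are the textbook vending-machine example (invented here, not derived from BPEL4WS):

```python
def bisimilar(trans, s0, t0):
    """Check strong bisimilarity of states s0 and t0 in a combined LTS.
    trans: dict mapping each state to a set of (label, successor) pairs."""
    states = list(trans)
    # Start with one block holding every state, then split blocks until all
    # states in a block make the same labelled moves into the same blocks.
    partition = [set(states)]
    changed = True
    while changed:
        changed = False
        def signature(s):
            return frozenset(
                (a, next(i for i, b in enumerate(partition) if s2 in b))
                for a, s2 in trans[s])
        new_partition = []
        for block in partition:
            groups = {}
            for s in block:
                groups.setdefault(signature(s), set()).add(s)
            new_partition.extend(groups.values())
            if len(groups) > 1:
                changed = True
        partition = new_partition
    return any(s0 in b and t0 in b for b in partition)

# P = coin.(tea + coffee) versus Q = coin.tea + coin.coffee:
# trace-equivalent, but Q commits to tea or coffee at the coin step.
lts = {
    "p0": {("coin", "p1")},
    "p1": {("tea", "p2"), ("coffee", "p2")},
    "p2": set(),
    "q0": {("coin", "q1"), ("coin", "q2")},
    "q1": {("tea", "q3")},
    "q2": {("coffee", "q3")},
    "q3": set(),
}
print(bisimilar(lts, "p0", "q0"))  # False: P and Q are not bisimilar
```

This distinction between trace equivalence and bisimilarity is exactly why bisimulation is the tool of choice for verifying that a refined design preserves the branching behaviour of its abstract specification.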

    Neural markers of performance states in an Olympic athlete: An EEG case study in air-pistol shooting

    This study focused on identifying the neural markers underlying optimal and suboptimal performance experiences of an elite air-pistol shooter, based on the tenets of the multi-action plan (MAP) model. According to the MAP model’s assumptions, skilled athletes’ cortical patterns are expected to differ among optimal/automatic (Type 1), optimal/controlled (Type 2), suboptimal/controlled (Type 3), and suboptimal/automatic (Type 4) performance experiences. We collected performance (target pistol shots), cognitive-affective (perceived control, accuracy, and hedonic tone), and cortical activity data (32-channel EEG) of an elite shooter. Idiosyncratic descriptive analyses revealed differences in perceived accuracy between optimal and suboptimal performance states. Event-related desynchronization/synchronization analysis supported the notion that optimal-automatic performance experiences (Type 1) were characterized by a global synchronization of cortical arousal associated with the shooting task, whereas suboptimal controlled states (Type 3) were underpinned by high cortical activity levels in the attentional brain network. Results are addressed in the light of the neural efficiency hypothesis and reinvestment theory. Perceptual training recommendations aimed at restoring optimal performance levels are discussed.
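The event-related desynchronization/synchronization analysis mentioned above is conventionally quantified as the percentage band-power change from a reference interval, ERD% = (A − R) / R × 100, where R is band power in the reference interval and A in the event interval; negative values indicate desynchronization. A minimal sketch of that calculation on synthetic data (the signals below are random noise standing in for band-pass filtered EEG, not the shooter's recordings):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                                    # assumed sampling rate (Hz)
ref = rng.normal(scale=2.0, size=fs * 2)    # reference interval: higher power
event = rng.normal(scale=1.0, size=fs * 2)  # pre-shot interval: suppressed power

def band_power(x):
    # Stand-in for power of a band-pass filtered signal: mean squared amplitude.
    return np.mean(x ** 2)

R, A = band_power(ref), band_power(event)
erd = (A - R) / R * 100
print(f"ERD% = {erd:.1f}")  # negative -> desynchronization (power drop)
```

In a real pipeline the signals would first be band-pass filtered into the frequency band of interest and averaged over trials; the percentage-change formula itself is the standard one.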