
    Measurement of uncertainty costs with dynamic traffic simulations

    Non-recurrent congestion in transportation networks occurs as a consequence of stochastic factors affecting demand and supply. Intelligent Transportation Systems such as Advanced Traveler Information Systems (ATIS) and Advanced Traffic Management Systems (ATMS) are designed to reduce the impacts of non-recurrent congestion by providing information to a fraction of users or by controlling the variability of traffic flows. For these reasons, the design of ATIS and ATMS requires reliable forecasts of non-recurrent congestion. This paper proposes a new method to measure the impacts of non-recurrent congestion on travel costs by taking risk aversion into account. The traffic model is based on the dynamic traffic simulation model METROPOLIS. Incidents are generated randomly by reducing the capacity of the network. Users can adapt instantaneously to unexpected travel conditions or change their behavior via a day-to-day adjustment process. Comparisons with incident-free simulations provide a benchmark for the potential travel time savings achievable with a state-of-the-art information system. We measure the impact of variable travel conditions by describing the willingness to pay to avoid risky or unreliable journeys. Indeed, for risk-averse drivers, any uncertainty corresponds to a utility loss. This utility loss is computed for several levels of network disruption. The main result of the paper is that the utility loss due to uncertainty is of the same order of magnitude as the total travel costs.
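
    To illustrate the idea of an uncertainty cost for risk-averse travelers, the sketch below compares an incident-free benchmark with a disrupted scenario under a simple mean-variance disutility of travel time. It is not the METROPOLIS model; the distributions, parameter names, and values (alpha, beta, incident probability) are illustrative assumptions only.

    ```python
    import numpy as np

    # Illustrative sketch (not the paper's METROPOLIS simulation): the cost of
    # uncertain travel times versus an incident-free benchmark, using a simple
    # mean-variance specification of risk-averse generalized cost.

    rng = np.random.default_rng(0)

    alpha = 0.10   # value of travel time (cost units per minute), assumed
    beta = 0.05    # risk-aversion weight on travel-time variability, assumed

    free_flow = rng.normal(30.0, 2.0, size=10_000)  # incident-free travel times (min)
    # Incidents hit ~20% of trips and add an exponentially distributed delay.
    disrupted = free_flow + rng.exponential(8.0, size=10_000) * (rng.random(10_000) < 0.2)

    def generalized_cost(times, alpha=alpha, beta=beta):
        """Mean-variance cost: expected travel time plus a penalty on its spread."""
        return alpha * times.mean() + beta * times.std()

    benchmark = generalized_cost(free_flow)
    with_incidents = generalized_cost(disrupted)

    print(f"benchmark cost:      {benchmark:.2f}")
    print(f"cost with incidents: {with_incidents:.2f}")
    print(f"uncertainty cost:    {with_incidents - benchmark:.2f}")
    ```

    The gap between the two costs plays the role of the willingness to pay to avoid the unreliable journey; in the paper this quantity is computed within the dynamic simulation rather than from synthetic draws.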

    Pressure Fluctuations in Natural Gas Networks caused by Gas-Electric Coupling

    The development of hydraulic fracturing technology has dramatically increased the supply and lowered the cost of natural gas in the United States, driving an expansion of natural gas-fired generation capacity in several electrical interconnections. Gas-fired generators have the capability to ramp quickly and are often utilized by grid operators to balance intermittency caused by wind generation. The time-varying output of these generators results in time-varying natural gas consumption rates that impact the pressure and line-pack of the gas network. As gas system operators assume nearly constant gas consumption when estimating pipeline transfer capacity and for planning operations, such fluctuations are a source of risk to their system. Here, we develop a new method to assess this risk. We consider a model of gas networks with consumption modeled through two components: forecasted consumption and small spatio-temporally varying consumption due to the gas-fired generators being used to balance wind. While the forecasted consumption is globally balanced over longer time scales, the fluctuating consumption causes pressure fluctuations in the gas system to grow diffusively in time with a diffusion rate sensitive to the steady but spatially inhomogeneous forecasted distribution of mass flow. To motivate our approach, we analyze the effect of fluctuating gas consumption on a model of the Transco gas pipeline that extends from the Gulf of Mexico to the Northeast of the United States.
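
    The sketch below illustrates the diffusive-growth claim in its simplest form: a single node whose pressure deviation integrates a zero-mean fluctuating withdrawal, so its variance grows roughly linearly in time. This is a toy model, not the paper's network formulation; the step size, noise level, and single-node setup are assumptions for illustration.

    ```python
    import numpy as np

    # Toy illustration (not the paper's gas-network model): the pressure deviation
    # at one node integrates a zero-mean fluctuating withdrawal, so its variance
    # grows approximately linearly in time, i.e. diffusively.

    rng = np.random.default_rng(1)

    n_paths = 2_000    # Monte Carlo realizations
    n_steps = 500      # time steps
    sigma_q = 0.05     # std of the fluctuating consumption per step (assumed units)

    # Cumulative imbalance drives the pressure deviation at the node.
    fluctuating_consumption = rng.normal(0.0, sigma_q, size=(n_paths, n_steps))
    pressure_deviation = np.cumsum(fluctuating_consumption, axis=1)

    variance = pressure_deviation.var(axis=0)

    # For a diffusive process, variance(t) ~ D * t; estimate D with a linear fit.
    t = np.arange(1, n_steps + 1)
    D = np.polyfit(t, variance, 1)[0]
    print(f"estimated diffusion rate D ~ {D:.4f} (expected ~ sigma_q**2 = {sigma_q**2:.4f})")
    ```

    In the paper the diffusion rate additionally depends on the spatially inhomogeneous forecasted mass flow along the pipeline, which this single-node toy deliberately omits.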

    Simulating California reservoir operation using the classification and regression-tree algorithm combined with a shuffled cross-validation scheme

    The controlled outflows from a reservoir or dam are highly dependent on the decisions made by the reservoir operators, rather than on a natural hydrological process. Differences exist between the natural upstream inflows to reservoirs and the controlled outflows that supply downstream users. As decision makers become aware of the changing climate, reservoir management requires adaptable means to incorporate more information into decision making, such as water delivery requirements, environmental constraints, and dry/wet conditions. In this paper, a robust reservoir outflow simulation model is presented, which incorporates a well-developed data-mining model, the Classification and Regression Tree (CART), to predict the complicated human-controlled reservoir outflows and extract the reservoir operation patterns. A shuffled cross-validation approach is further implemented to improve CART's predictive performance. An application study of nine major reservoirs in California is carried out. Results produced by the enhanced CART, the original CART, and random forest are compared with observations. The statistical measurements show that the enhanced CART and random forest outperform the CART control run in general, and that the enhanced CART algorithm gives better predictive performance than random forest in simulating the peak flows. The results also show that the proposed model is able to consistently and reasonably predict the expert release decisions. Experiments indicate that the release operation at Lake Oroville is dominated by the SWP allocation amount, and that reservoirs with low elevation are more sensitive to the inflow amount than others.
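
    A minimal sketch of the CART-with-shuffled-cross-validation idea is shown below using scikit-learn. The features and target are synthetic placeholders, not the paper's reservoir records; the contrast between shuffled and ordered folds only illustrates why shuffling can change the cross-validation estimate when samples are temporally ordered.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.model_selection import KFold, cross_val_score

    # Sketch of CART regression with a shuffled cross-validation scheme.
    # Data below are synthetic placeholders standing in for reservoir records.

    rng = np.random.default_rng(42)
    n = 1_000
    X = np.column_stack([
        rng.gamma(2.0, 50.0, n),    # inflow (placeholder)
        rng.uniform(0.0, 1.0, n),   # storage fraction (placeholder)
        rng.integers(1, 13, n),     # month of year
    ])
    y = 0.6 * X[:, 0] * X[:, 1] + rng.normal(0.0, 10.0, n)  # synthetic outflow

    cart = DecisionTreeRegressor(max_depth=6, random_state=0)

    # Shuffling before splitting breaks the temporal ordering of the samples,
    # so each fold sees a mix of conditions rather than one contiguous period.
    shuffled_cv = KFold(n_splits=5, shuffle=True, random_state=0)
    ordered_cv = KFold(n_splits=5, shuffle=False)

    for name, cv in [("shuffled", shuffled_cv), ("ordered", ordered_cv)]:
        scores = cross_val_score(cart, X, y, cv=cv, scoring="r2")
        print(f"{name:8s} CV  R^2 = {scores.mean():.3f} +/- {scores.std():.3f}")
    ```

    The paper's enhanced CART uses the shuffled scheme on observed operation data for nine California reservoirs; the depth, fold count, and feature set here are assumptions for the example only.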

    Big data analytics: Computational intelligence techniques and application areas

    Big Data has a significant impact on developing functional smart cities and supporting modern societies. In this paper, we investigate the importance of Big Data in modern life and economy, and discuss challenges arising from Big Data utilization. Different computational intelligence techniques have been considered as tools for Big Data analytics. We also explore the powerful combination of Big Data and Computational Intelligence (CI) and identify a number of areas where novel applications to real-world smart city problems can be developed by utilizing these powerful tools and techniques. We present a case study on intelligent transportation in the context of a smart city, and a novel data modelling methodology based on a biologically inspired universal generative modelling approach called the Hierarchical Spatial-Temporal State Machine (HSTSM). We further discuss various implications of policy, protection, valuation and commercialization related to Big Data, its applications and deployment.