
    Using the general link transmission model in a dynamic traffic assignment to simulate congestion on urban networks

    This article presents two new models of Dynamic User Equilibrium that are particularly suited for ITS applications, where the evolution of vehicle flows and travel times must be simulated on large road networks, possibly in real time. The key feature of the proposed models is the detailed representation of the main congestion phenomena occurring at the nodes of urban networks, such as vehicle queues and their spillback, as well as flow conflicts at merges and diversions. Compared to the simple world of static assignment, where only the congestion along the arc is typically reproduced through a separable relation between vehicle flow and travel time, this type of DTA model is much more complex, as the above relation becomes non-separable, both in time and in space. Traffic simulation is here attained through a macroscopic flow model that extends the theory of kinematic waves to urban networks and non-linear fundamental diagrams: the General Link Transmission Model. The sub-models of the GLTM, namely the Node Intersection Model, the Forward Propagation Model of vehicles and the Backward Propagation Model of spaces, can be combined in two different ways to produce arc travel times starting from turn flows. The first approach is to consider short time intervals of a few seconds and process all nodes for each temporal layer in chronological order. The second approach allows long time intervals of a few minutes to be considered, but requires each sub-model to process the whole temporal profile of the involved variables. The two resulting DTA models are analyzed and compared here with the aim of identifying their possible use cases. A rigorous mathematical formulation is out of the scope of this paper, as is a detailed explanation of the solution algorithm. The dynamic equilibrium is nevertheless sought through a new method based on Gradient Projection, which is capable of solving both proposed models with any desired precision in a reasonable number of iterations. Its fast convergence is essential to show that the two proposed models of network congestion actually converge at equilibrium to nearly identical solutions in terms of arc flows and travel times, despite their diametrically opposed approaches to the dynamic nature of the problem, as the numerical tests presented here show.
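
    The first approach lends itself to a compact implementation. Below is a minimal, illustrative sketch of that chronological loop for two links in series, using cumulative-flow bookkeeping in the spirit of link transmission models; all names and parameter values are assumptions, and a simple triangular fundamental diagram stands in for the paper's more general non-linear diagrams and full node model.

```python
# Minimal sketch of the "short time step" approach: one Link Transmission
# Model step for two links in series, processed chronologically.
# All names and parameter values are illustrative, not the paper's.

DT = 1.0      # time step [s] (a few seconds, per the first approach)
VF = 15.0     # free-flow speed [m/s]
WB = 5.0      # backward (spillback) wave speed [m/s]
KJ = 0.15     # jam density [veh/m]
QMAX = 0.5    # capacity [veh/s]

class Link:
    def __init__(self, length):
        self.length = length
        self.inflow_cum = [0.0]    # cumulative vehicles past the upstream end
        self.outflow_cum = [0.0]   # cumulative vehicles past the downstream end

    def _lookup(self, series, t):
        """Piecewise-constant lookup of a cumulative curve at time t <= now."""
        if t <= 0:
            return 0.0
        return series[min(int(t / DT), len(series) - 1)]

    def sending(self, now):
        # Vehicles able to leave next step: forward wave delayed by L/VF.
        free = (self._lookup(self.inflow_cum, now + DT - self.length / VF)
                - self.outflow_cum[-1])
        return min(free, QMAX * DT)

    def receiving(self, now):
        # Space freeing up: backward wave of spaces delayed by L/WB.
        space = (self._lookup(self.outflow_cum, now + DT - self.length / WB)
                 + KJ * self.length - self.inflow_cum[-1])
        return min(space, QMAX * DT)

def step(up, down, demand, now):
    """One chronological node update: transfer limited by both links."""
    q_in = min(demand * DT, up.receiving(now))         # network entry
    q_mid = min(up.sending(now), down.receiving(now))  # internal node
    q_out = down.sending(now)                          # network exit
    up.inflow_cum.append(up.inflow_cum[-1] + q_in)
    up.outflow_cum.append(up.outflow_cum[-1] + q_mid)
    down.inflow_cum.append(down.inflow_cum[-1] + q_mid)
    down.outflow_cum.append(down.outflow_cum[-1] + q_out)

a, b = Link(300.0), Link(300.0)
for k in range(600):                # simulate 10 minutes
    step(a, b, demand=0.4, now=k * DT)
```

    Arc travel times then follow from the horizontal distance between the two cumulative curves of each link, which is where the two approaches in the paper diverge in how much of the temporal profile they process at once.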

    EEG-based cognitive control behaviour assessment: an ecological study with professional air traffic controllers

    Several models defining different types of cognitive human behaviour are available. For this work, we have selected the Skill, Rule and Knowledge (SRK) model proposed by Rasmussen in 1983. This model is currently broadly used in safety-critical domains, such as aviation. At present, there are no tools able to assess the level of cognitive control at which an operator is dealing with a given task, that is, whether he/she is performing the task as an automated routine (skill level), as a procedure-based activity (rule level), or as a problem-solving process (knowledge level). Several studies have tried to model the SRK behaviours from a Human Factors perspective. Despite such studies, there is no evidence of such behaviours having been evaluated from a neurophysiological point of view, for example, by considering brain activity variations across the different SRK levels. Therefore, the proposed study aimed to investigate the use of neurophysiological signals to assess cognitive control behaviours according to the SRK taxonomy. The results of the study, performed on 37 professional Air Traffic Controllers, demonstrated that specific brain features can characterize and discriminate the different SRK levels, therefore enabling an objective assessment of the degree of cognitive control behaviour in realistic settings.
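
    As an illustration only (the abstract does not specify the features or classifier actually used), a typical pipeline of this kind extracts EEG band power per trial and feeds it to a standard classifier; every name, channel choice and parameter below is an assumption, with random placeholder data standing in for recordings.

```python
# Illustrative sketch: discriminating cognitive-control levels from EEG band
# power. The paper's actual features and classifier are assumed, not known.
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

FS = 256  # sampling rate [Hz], assumed

def band_power(epoch, lo, hi):
    """Mean PSD of one EEG channel within [lo, hi] Hz (Welch estimate)."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS * 2)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def features(epoch):
    # Classic band-power features (theta / alpha / beta); assumed choice.
    return [band_power(epoch, 4, 8),
            band_power(epoch, 8, 12),
            band_power(epoch, 13, 30)]

# epochs: (n_trials, n_samples) single-channel EEG; labels 0/1/2 = S/R/K
rng = np.random.default_rng(0)
epochs = rng.standard_normal((90, FS * 4))   # placeholder data, not real EEG
labels = np.repeat([0, 1, 2], 30)            # skill / rule / knowledge
X = np.array([features(e) for e in epochs])
clf = LogisticRegression(max_iter=1000).fit(X, labels)
print(clf.score(X, labels))                  # training accuracy of the sketch
```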

    Optimization of a Transmission Network

    The Political Nature of TCP/IP

    Despite the importance of the Internet in the modern world, many users and even policy makers lack the necessary historical or technical grasp of the technology behind it. In the spirit of addressing this issue, this thesis attempts to shed light on the historical, political, and technical context of TCP/IP. TCP/IP is the Internet Protocol Suite, a primary piece of Internet architecture with a well-documented history. After a technical overview detailing the main function of TCP/IP, I examine aspects of the social and developmental record of this technology using STS theoretical approaches such as Hughesian systems theory, the Social Construction of Technology (SCOT), and Langdon Winner’s brand of technological determinism. Key points in the evolution of TCP/IP, when viewed from an STS perspective, illuminate the varied reasons behind decisions in the development of the technology. For example, as detailed in this paper, both technical and political motivations were behind the architectural politics built into TCP/IP in the 1970s, and similar motivations spurred the rejection of the OSI protocols by Internet developers two decades later. Armed with the resulting contextual understanding of previous TCP/IP developments, a few possible directions (both political and technical) in contemporary and future Internet development are then explored, such as the slow migration to IPv6 and the meaning of network neutrality.

    Advances in Data Mining Knowledge Discovery and Applications

    Advances in Data Mining Knowledge Discovery and Applications aims to help data miners, researchers, scholars, and PhD students who wish to apply data mining techniques. The primary contribution of this book is to highlight frontier fields and implementations of knowledge discovery and data mining. It may seem that the same things are repeated, but in general the same approaches and techniques can help us in different fields and areas of expertise. This book presents knowledge discovery and data mining applications in two different sections. As is well known, data mining covers areas of statistics, machine learning, data management and databases, pattern recognition, artificial intelligence, and other areas. In this book, most of these areas are covered by different data mining applications. The eighteen chapters have been classified into two parts: Knowledge Discovery and Data Mining Applications.

    Mathematics and the Internet: A Source of Enormous Confusion and Great Potential

    Graph theory models the Internet mathematically, and a number of plausible, mathematically interesting network models for the Internet have been developed and studied. Simultaneously, Internet researchers have developed methodology to use real data to validate, or invalidate, proposed Internet models. The authors look at these parallel developments, particularly as they apply to scale-free network models of the preferential attachment type.
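
    For readers unfamiliar with the model class under scrutiny, the sketch below grows a graph by preferential attachment in the Barabási–Albert style, producing the heavy-tailed degree distributions characteristic of scale-free models; the function name and parameter values are illustrative, not taken from the article.

```python
# Minimal preferential-attachment generator (Barabási–Albert style):
# each new node links to m existing nodes with probability proportional
# to their current degree.
import random

def preferential_attachment(n, m):
    edges = []
    # 'targets' holds one entry per edge endpoint, so uniform sampling
    # from it is exactly degree-proportional sampling.
    targets = list(range(m))            # seed nodes to attach to
    for new in range(m, n):
        chosen = set()
        while len(chosen) < m:          # m distinct neighbours per new node
            chosen.add(random.choice(targets))
        for old in chosen:
            edges.append((new, old))
            targets.extend([new, old])  # both endpoints gain a degree
    return edges

g = preferential_attachment(n=1000, m=2)
```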

    Real-time traffic monitoring using mobile phone data

    Resource dimensioning through buffer sampling

    Link dimensioning, i.e., selecting a (minimal) link capacity such that the users’ performance requirements are met, is a crucial component of network design. It requires insight into the interrelationship among the traffic offered (in terms of the mean offered load $M$, but also its fluctuation around the mean, i.e., ‘burstiness’), the envisioned performance level, and the capacity needed. We first derive, for different performance criteria, theoretical dimensioning formulas that estimate the required capacity $c$ as a function of the input traffic and the performance target. For the special case of Gaussian input traffic, these formulas reduce to $c = M + \alpha V$, where $\alpha$ directly relates to the performance requirement (as agreed upon in a service level agreement) and $V$ reflects the burstiness (at the timescale of interest). We also observe that Gaussianity applies for virtually all realistic scenarios; notably, already at a relatively low aggregation level, the Gaussianity assumption is justified. As estimating $M$ is relatively straightforward, the remaining open issue concerns the estimation of $V$. We argue that, particularly when small timescales are of interest, it may be inaccurate to estimate it directly from the traffic traces. Therefore, we propose an indirect method that samples the buffer content, estimates the buffer content distribution, and ‘inverts’ this to the variance. We validate the inversion through extensive numerical experiments (using a sizeable collection of traffic traces from various representative locations); the resulting estimate of $V$ is then inserted in the dimensioning formula. These experiments show that both the inversion and the dimensioning formula are remarkably accurate.
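
    A minimal sketch of how the dimensioning formula $c = M + \alpha V$ could be applied: it assumes $\alpha$ has already been derived from the SLA target and estimates the burstiness directly from a byte-count trace (the paper's indirect buffer-sampling estimator of $V$ is not reproduced here); all names and values are illustrative.

```python
# Sketch of the Gaussian dimensioning formula c = M + alpha * V.
# M = mean offered load, V = fluctuation at the timescale of interest.
# alpha (from the SLA target) is assumed given; data is synthetic.
import numpy as np

def required_capacity(trace, timescale, dt, alpha):
    """trace: bytes observed per dt; aggregate to the timescale of interest."""
    k = int(timescale / dt)
    chunks = trace[: len(trace) // k * k].reshape(-1, k).sum(axis=1)
    rates = chunks / timescale     # offered load per window [bytes/s]
    M = rates.mean()               # mean offered load
    V = rates.std()                # fluctuation around the mean ('burstiness')
    return M + alpha * V

rng = np.random.default_rng(1)
trace = rng.gamma(shape=2.0, scale=500.0, size=60_000)  # synthetic 1-ms bins
print(required_capacity(trace, timescale=0.1, dt=0.001, alpha=3.0))
```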