
    Traffic flow modeling and forecasting using cellular automata and neural networks : a thesis presented in partial fulfillment of the requirements for the degree of Master of Science in Computer Science at Massey University, Palmerston North, New Zealand

    In this thesis, fine grids are adopted in Cellular Automata (CA) models. The fine-grid models describe traffic flow in detail, allowing the position, speed, acceleration and deceleration of vehicles to be simulated in a more realistic way. For straight urban roads, two types of traffic flow, free flow and car-following flow, have been simulated. A novel five-stage speed-changing CA model is developed to describe free flow. A 1.5-second headway, based on field data, is used to simulate car-following processes, correcting the 1-second headway used in all previous CA models. Novel and realistic CA models, based on the Normal Acceptable Space (NAS) method, are proposed to systematically simulate driver behaviour and the interactions between drivers entering single-lane Two-Way Stop-Controlled (TWSC) intersections and roundabouts. The NAS method rests on two Gaussian distributions: the distribution of the space required across all drivers to enter an intersection or roundabout, which captures the heterogeneity of driver behaviour, and the distribution of the space required by a single driver, which captures the inconsistency of an individual driver's behaviour. The effects of passing lanes on single-lane highway traffic are investigated using fine-grid CA models, and vehicles entering, exiting and changing lanes on passing-lane sections are discussed in detail. In addition, a Genetic Algorithm-based Neural Network (GANN) method is proposed to predict Short-term Traffic Flow (STF) in urban networks, which is expected to be helpful for traffic control. The prediction accuracy and generalization ability of the neural network are improved by optimizing the number of hidden-layer neurons and the connection weights using genetic operations such as selection, crossover and mutation.
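    The car-following idea described above can be pictured with a toy single-lane CA. The sketch below is a minimal Nagel-Schreckenberg-style model in which the gap to the leader is converted into a speed cap that enforces a 1.5-second time headway; the cell size, time step and parameters are illustrative assumptions, not the thesis's five-stage model.

```python
import random

# Minimal single-lane traffic CA sketch (Nagel-Schreckenberg style) with a
# time-headway rule for car following. Cell size, time step and parameters
# are illustrative assumptions, not the thesis's exact fine-grid model.

CELL_M = 1.5                      # fine-grid cell length in metres (assumption)
DT_S = 0.5                        # update interval in seconds (assumption)
V_MAX = 20                        # speed limit in cells per step
P_SLOW = 0.2                      # random-slowdown probability
HEADWAY_STEPS = int(1.5 / DT_S)   # 1.5 s headway expressed in time steps

def step(road):
    """road[i] is the speed of the car in cell i, or None if the cell is empty."""
    n = len(road)
    new_road = [None] * n
    for pos, v in enumerate(road):
        if v is None:
            continue
        # 1. accelerate towards the speed limit
        v = min(v + 1, V_MAX)
        # 2. car-following: keep at least 1.5 s of headway to the leader ahead
        gap = next(d for d in range(1, n + 1) if road[(pos + d) % n] is not None) - 1
        v = min(v, gap // HEADWAY_STEPS)
        # 3. random deceleration (driver imperfection)
        if v > 0 and random.random() < P_SLOW:
            v -= 1
        # 4. move the vehicle (parallel update on a ring road)
        new_road[(pos + v) % n] = v
    return new_road

# 100-cell ring road with 20 vehicles, simulated for 60 steps
road = [0 if i % 5 == 0 else None for i in range(100)]
for _ in range(60):
    road = step(road)
print("mean speed (cells/step):", sum(v for v in road if v is not None) / 20.0)
```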

    A layered view model for XML with conceptual and logical extensions, and its applications

    eXtensible Markup Language (XML) is becoming the dominant standard for storing, describing and interchanging data among Enterprise Information Systems (EIS), web repositories and databases. With this increasing reliance on a self-describing, schema-based, semi-structured data language such as XML, there is a need to model, design and manipulate XML and its associated semantics at a higher level of abstraction than the instance level. However, existing object-oriented conceptual modelling languages provide insufficient constructs for utilizing XML structures, descriptions and constraints, while XML and its associated schema languages lack the ability to provide higher levels of abstraction, such as conceptual models that are easily understood by humans. It is therefore interesting to investigate conceptual and schema formalisms as a means of providing higher-level semantics in XML-related data modelling. In particular, there is a strong need to model views of XML repositories at the conceptual level, in contrast to views in the relational model, which are generally defined at the implementation level. In this research we use XML views and introduce the Layered View Model (LVM, for short), a declarative conceptual framework for specifying and defining views at a higher level of abstraction. Views in the LVM are specified using explicit conceptual, logical and instance-level semantics and provide declarative transformation between these levels of abstraction. For this task, an elaborated and enhanced OO-based modelling and transformation methodology is employed. The LVM framework leads to a number of interesting problems that are studied in this research. First, we address the conceptualization of views: the clear separation of conceptual concerns from implementation and data-language concerns, with LVM views treated as first-class citizens of the conceptual model. Second, we provide formal semantics and definitions to enforce the representation, specification and definition of such views at the highest level of abstraction, the conceptual level. Third, we address the modelling and transformation of LVM views to the required levels of abstraction, namely the schema and instance levels. Finally, we apply the LVM to real-world data-modelling scenarios to develop further architectural frameworks in domains such as dimensional XML data modelling, ontology views in the Semantic Web paradigm, and the modelling of user-centred websites and web portals.
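    One rough way to picture the layered idea is a single view declared at the conceptual level and mapped mechanically down to a logical XML structure and then to instances. The sketch below is a hypothetical Python illustration; the class and method names are assumptions, not the LVM's actual notation.

```python
from dataclasses import dataclass
from typing import List
import xml.etree.ElementTree as ET

# Hypothetical "layered view" sketch: one conceptual declaration, derived
# logical XML structure, populated instance. Names are illustrative only.

@dataclass
class ConceptualView:
    name: str
    attributes: List[str]              # conceptual-level properties of the view

    def to_logical(self) -> ET.Element:
        """Logical level: derive an XML element structure from the concept."""
        root = ET.Element(self.name)
        for attr in self.attributes:
            ET.SubElement(root, attr)
        return root

    def to_instance(self, record: dict) -> str:
        """Instance level: fill the logical structure with concrete data."""
        elem = self.to_logical()
        for child in elem:
            child.text = str(record.get(child.tag, ""))
        return ET.tostring(elem, encoding="unicode")

customer_view = ConceptualView("CustomerSummary", ["name", "city", "totalOrders"])
print(customer_view.to_instance({"name": "Acme", "city": "Sydney", "totalOrders": 12}))
```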

    Engineering XML solutions using views

    In industrial informatics, engineering data-intensive Enterprise Information Systems (EIS) is a challenging task without abstraction and partitioning. Further, the introduction of semi-structured data (namely XML) and its rapid adoption by commercial and industrial systems have increased the complexity of data engineering. Meanwhile, the introduction of OMG's Model Driven Architecture (MDA) presents an interesting paradigm for EIS and system modelling, in which a system is designed at a higher level of abstraction. This raises the problem of engineering XML data solutions under the MDA initiative, where models and frameworks require a higher level of abstraction. In this paper we investigate a view model that provides a layered design methodology for modelling data-intensive XML solutions for the EIS paradigm, with a sufficient level of abstraction.
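    The MDA angle can be sketched as a transformation from a platform-independent entity description to a platform-specific XML Schema fragment. The entity, type mapping and generator function below are illustrative assumptions rather than the paper's actual transformation rules.

```python
import xml.etree.ElementTree as ET

# MDA-flavoured sketch: a platform-independent model (PIM) of an entity is
# transformed into a platform-specific artefact, here an XML Schema fragment.
# The entity description and the type mapping are invented for illustration.

PIM_ENTITY = {"name": "Order",
              "fields": {"orderId": "int", "placedOn": "date", "total": "decimal"}}
XSD_TYPES = {"int": "xs:integer", "date": "xs:date", "decimal": "xs:decimal"}

def pim_to_xsd(entity: dict) -> str:
    """Generate an xs:complexType for the entity from its abstract description."""
    xs = "http://www.w3.org/2001/XMLSchema"
    ET.register_namespace("xs", xs)
    ctype = ET.Element(f"{{{xs}}}complexType", name=entity["name"])
    seq = ET.SubElement(ctype, f"{{{xs}}}sequence")
    for field_name, abstract_type in entity["fields"].items():
        ET.SubElement(seq, f"{{{xs}}}element",
                      name=field_name, type=XSD_TYPES[abstract_type])
    return ET.tostring(ctype, encoding="unicode")

print(pim_to_xsd(PIM_ENTITY))
```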

    Combined experimental-theoretical study of the OH + CO → H + CO2 reaction dynamics

    A combined experimental-theoretical study is performed to advance our understanding of the dynamics of the prototypical tetra-atomic, complex-forming reaction OH + CO → H + CO2, which is also of great practical relevance in combustion, Earth's atmosphere and, potentially, Mars's atmosphere and interstellar chemistry. New crossed-molecular-beam experiments with mass spectrometric detection are analyzed together with the results of previous experiments and compared with quasi-classical trajectory (QCT) calculations on a new, full-dimensional potential energy surface (PES). Comparisons between experiment and theory are carried out in both the center-of-mass and laboratory frames. Good agreement is found between experiment and theory for both product angular and translational energy distributions, leading to the conclusion that the new PES is currently the most accurate for elucidating the dynamics of this fundamental reaction. Yet small deviations between experiment and theory remain and are presumably attributable to the QCT treatment of the scattering dynamics.
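    At its core, a QCT calculation propagates classical trajectories on a potential energy surface from Monte Carlo-sampled initial conditions. The sketch below shows only that propagation step, using velocity-Verlet integration on a one-dimensional model potential; the real calculations use a full-dimensional OH + CO surface, so everything here is illustrative.

```python
import numpy as np

# Toy quasi-classical trajectory (QCT) sketch: velocity-Verlet integration of
# the classical equations of motion on a model potential. The harmonic-plus-
# repulsive potential and all parameters below are purely illustrative.

def potential(x):
    """Model 1D potential: harmonic well plus a short-range repulsive wall."""
    return 0.5 * 2.0 * (x - 1.0) ** 2 + np.exp(-4.0 * x)

def force(x, h=1e-5):
    """Numerical derivative, F = -dV/dx."""
    return -(potential(x + h) - potential(x - h)) / (2.0 * h)

def run_trajectory(x0, v0, mass=1.0, dt=1e-3, steps=2000):
    """Propagate one trajectory with velocity Verlet; return final (x, v)."""
    x, v = x0, v0
    a = force(x) / mass
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt
        a_new = force(x) / mass
        v += 0.5 * (a + a_new) * dt
        a = a_new
    return x, v

# Monte Carlo sampling of initial conditions, as in QCT studies
rng = np.random.default_rng(0)
finals = [run_trajectory(x0=1.0 + rng.normal(0, 0.1), v0=rng.normal(0, 0.5))
          for _ in range(50)]
print("mean final position:", np.mean([x for x, _ in finals]))
```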

    Deployment and performance evaluation of a SNAP-based resource broker on the White Rose grid

    Resource brokering is an essential component in building effective Grid systems. The aim of this paper is to evaluate the performance of a SNAP (Service Negotiation and Acquisition Protocol) based resource broker on a large distributed Grid infrastructure, the White Rose Grid. The broker uses a three-phase commit protocol to reserve resources on demand, since traditional advance-reservation facilities cannot cater for such needs because of the lead time they require to schedule reservations. Experiments are designed and carried out on the White Rose Grid. The experimental results show that the inclusion of the three-phase commit protocol provides a performance enhancement on a large distributed Grid infrastructure, in terms of the time taken from the submission of user requirements until a job begins execution. The results support those previously obtained through mathematical modelling and simulation, and the broker is a viable contender for use in future Grid resource-brokering implementations.
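    The reservation logic can be pictured as a generic three-phase commit across the candidate resources: vote, tentative hold, then finalise, with a rollback if any phase fails. The classes and phase names below are a hedged sketch of that generic protocol, not the broker's implementation or the SNAP API.

```python
# Generic three-phase commit sketch for on-demand resource reservation.
# Resource attributes, phase names and the coordinator are illustrative.

class Resource:
    def __init__(self, name, free_cpus):
        self.name, self.free_cpus, self.held = name, free_cpus, 0

    def can_commit(self, cpus):        # phase 1: vote on the request
        return self.free_cpus >= cpus

    def pre_commit(self, cpus):        # phase 2: tentatively hold capacity
        self.held = cpus
        self.free_cpus -= cpus

    def do_commit(self):               # phase 3: finalise the reservation
        self.held = 0

    def abort(self):                   # release a tentative hold; would be called
        self.free_cpus += self.held    # if a later phase failed or timed out
        self.held = 0

def reserve(resources, cpus_needed):
    """Coordinator: reserve cpus_needed on every resource, or on none."""
    if not all(r.can_commit(cpus_needed) for r in resources):
        return False                   # a resource voted no; nothing is held yet
    for r in resources:
        r.pre_commit(cpus_needed)
    for r in resources:                # all holds succeeded; finalise
        r.do_commit()
    return True

nodes = [Resource("node-a", 8), Resource("node-b", 4)]
print("reserved:", reserve(nodes, 4))  # True: both nodes had capacity
print("reserved:", reserve(nodes, 6))  # False: insufficient capacity remains
```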

    ICCSA 2022

    The process of economic, social and cultural development leads to relevant changes in urban areas. Urban transformations usually generate a series of public and private real estate compounds which constitute real obstacles to urban walkability. The growing attention towards the sustainable development goals established on a global scale has introduced new contents into urban redevelopment policies, aimed at favouring higher levels of accessibility in the consolidated fabric, particularly of the pedestrian type. In addition, the recent pandemic has reasserted the role of pedestrian mobility as a primary way of moving instead of using other means of transport. As a result, urban walkability has moved to the core of the sustainable-city paradigm. More precisely, issues related to accessibility and walkability should be considered when addressing the obstacles generated by sites that can properly be defined as 'urban enclaves', especially when abandoned or under redevelopment. These conditions may encourage the gradual reopening of such areas to citizens. Within this framework, the Sustainable Urban Mobility Plan (SUMP) can represent a strategic tool for identifying the critical aspects to be addressed in creating a new network of pedestrian routes aimed at improving urban walkability. The objective of this study is to define a set of principles and criteria, both tangible and intangible, for calculating a proximity index (PI). The PI may consequently drive urban regeneration projects, including the design of new paths crossing the enclaves, to improve urban permeability and, therefore, the level of walkability. Funding: Sardinia Foundation (CUP F74I19001040007).
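    The abstract does not give the proximity-index formula, so the sketch below only illustrates one plausible shape for such an index: a weighted sum of min-max-normalised tangible and intangible criteria. The criteria names, value ranges and weights are invented for illustration.

```python
# Hedged sketch of a proximity-index (PI) style score as a weighted sum of
# normalised criteria. Criteria, ranges and weights are assumptions, not the
# paper's actual set of principles and criteria.

CRITERIA = {                    # (min, max, higher_is_better) for normalisation
    "distance_to_services_m": (0.0, 1500.0, False),   # tangible; lower is better
    "pavement_quality":       (0.0, 10.0, True),      # tangible; higher is better
    "perceived_safety":       (0.0, 10.0, True),      # intangible criterion
}
WEIGHTS = {"distance_to_services_m": 0.4,
           "pavement_quality": 0.3,
           "perceived_safety": 0.3}

def proximity_index(site: dict) -> float:
    """Return a 0-1 score; higher means better walkable proximity."""
    score = 0.0
    for name, (lo, hi, higher_is_better) in CRITERIA.items():
        norm = (site[name] - lo) / (hi - lo)
        if not higher_is_better:
            norm = 1.0 - norm
        score += WEIGHTS[name] * max(0.0, min(1.0, norm))
    return score

enclave = {"distance_to_services_m": 600.0, "pavement_quality": 6.0, "perceived_safety": 5.0}
print(f"PI = {proximity_index(enclave):.2f}")
```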