4,223 research outputs found

    ENHANCED SCHEDULING TRAFFIC LIGHT MODEL USING DISCRETE EVENT SIMULATION FOR IMPROVED SIGNAL TIMING ANALYSIS

    Most traffic lights today are pre-timed, sensor-actuated, or display a countdown timer. However, these existing methods leave vehicles queuing for a long time while waiting for the signal to change, which creates congestion at road intersections. In this paper, the proposed model enhances the scheduling of the traffic light by simulating vehicle behaviour based on discrete event simulation and queueing theory. This makes the simulation more realistic and contributes to accurate outcomes. The work focuses on the analysis of the average waiting time for vehicles in three cases: heavy, medium and low traffic volume. The optimum traffic signal timing is the one with the minimum waiting time for the vehicles. Moreover, the new model solves the critical traffic congestion problem not only in simulation but also in a real environment, where the longest average waiting time drivers experience at the junction is 86 seconds and the shortest is 64 seconds, even under heavy traffic congestion. Extensive simulations have been conducted in this work, in which the green interval is selected as the control parameter
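
    To make the queueing mechanics concrete, the following is a minimal discrete-event sketch of a single fixed-time signal (a generic illustration, not the authors' model): vehicles arrive at random, wait behind the queue ahead and for the green phase, and the average waiting time is reported while sweeping the green interval. All parameter values are assumptions.

        import random

        def average_wait(green=30.0, red=60.0, arrival_rate=0.3,
                         service_time=2.0, horizon=3600.0, seed=0):
            """Average vehicle waiting time at a single fixed-time signal (sketch)."""
            rng = random.Random(seed)
            cycle = green + red
            t, next_free, waits = 0.0, 0.0, []
            while True:
                t += rng.expovariate(arrival_rate)       # next Poisson arrival
                if t > horizon:
                    break
                start = max(t, next_free)                # wait for the queue ahead
                if start % cycle >= green:               # red phase: wait for next green
                    start += cycle - start % cycle
                next_free = start + service_time         # one vehicle crosses per headway
                waits.append(start - t)
            return sum(waits) / len(waits)

        if __name__ == "__main__":
            for g in (20, 30, 45):                       # sweep the green interval
                print(f"green={g:2d}s  average wait={average_wait(green=g):5.1f}s")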

    Data-driven modelling of biological multi-scale processes

    Biological processes involve a variety of spatial and temporal scales. A holistic understanding of many biological processes therefore requires multi-scale models which capture the relevant properties on all these scales. In this manuscript we review mathematical modelling approaches used to describe the individual spatial scales and how they are integrated into holistic models. We discuss the relation between spatial and temporal scales and its implications for multi-scale modelling. Based on this overview of state-of-the-art modelling approaches, we formulate key challenges in mathematical and computational modelling of biological multi-scale and multi-physics processes. In particular, we consider the availability of analysis tools for multi-scale models and model-based multi-scale data integration. We provide a compact review of methods for model-based data integration and model-based hypothesis testing. Furthermore, novel approaches and recent trends are discussed, including computation-time reduction using reduced-order and surrogate models, which contribute to the solution of inference problems. We conclude the manuscript by providing a few ideas for the development of tailored multi-scale inference methods. (Comment: this manuscript will appear in the Journal of Coupled Systems and Multiscale Dynamics, American Scientific Publishers)
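
    As a generic illustration of the computation-time reduction mentioned above (not a method taken from the review), the sketch below replaces an expensive simulator with a cheap polynomial surrogate so that the many model evaluations required by inference become affordable; expensive_model and all numbers are hypothetical stand-ins.

        import numpy as np

        def expensive_model(theta):
            """Stand-in for a costly multi-scale simulation returning one scalar output."""
            return np.sin(3.0 * theta) + 0.5 * theta ** 2

        # 1) Run the expensive model at a few training points.
        train_theta = np.linspace(0.0, 2.0, 8)
        train_y = np.array([expensive_model(t) for t in train_theta])

        # 2) Fit a cheap surrogate (here: a cubic polynomial).
        surrogate = np.poly1d(np.polyfit(train_theta, train_y, deg=3))

        # 3) Use the surrogate inside an inference-style sweep (e.g. a likelihood scan).
        data, sigma = 1.2, 0.1
        grid = np.linspace(0.0, 2.0, 1000)
        log_like = -0.5 * ((surrogate(grid) - data) / sigma) ** 2
        print("surrogate-based estimate of theta:", grid[np.argmax(log_like)])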

    Deep Learning of Scene-Specific Classifier for Pedestrian Detection in Dubai

    The performance of a generic pedestrian detector varies with the data fed to it; when applied to a specific scene, its performance degrades dramatically, which requires the detector to be fed with the specific target in mind so that it can produce the desired predictions and detect the specified target for the user. In this paper, I propose to feed the automated specialization of a scene-specific pedestrian detector with multiple sources, from pictures to videos, beginning with a generic video-surveillance detector, while marking samples manually to ease the process, since the knowledge accumulated from the master program is still insufficient to produce high-quality automated sample marking for the detector. The key idea is to treat a deep detector as a feature extractor that produces an estimate of the likelihood of a pedestrian being present in the target scene. The system is then fed with the manually marked samples to enhance its performance, building on an already existing system that uses a sequential Monte Carlo filter. Pedestrian detectors have been deployed in China, where they showcased the different patterns a detector can classify when assessing whether a pedestrian is present in the test data. The project shows how a machine can learn when fed with the right data and produce sensible results that improve living standards by decreasing the number of pedestrian-related accidents and, in turn, the overall accident rate. “Many real-world data analysis tasks involve estimating unknown quantities from some given observations”, as noted by the authors of a report on Monte Carlo methods (Doucet A., de Freitas N., Gordon N.). Computing rational approximations also requires numerical techniques, and Monte Carlo methods (MCM) are powerful tools that allow us to achieve this objective (Andrieu C., Doucet A., Punskaya E.)
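
    The following sketch illustrates the general idea of using a detector confidence as the measurement likelihood inside a sequential Monte Carlo (particle) filter; it is not the paper's implementation, and detector_score is a hypothetical stand-in for a scene-specific deep detector.

        import numpy as np

        rng = np.random.default_rng(0)

        def detector_score(x, true_x):
            """Hypothetical detector confidence, peaked near the true pedestrian position."""
            return np.exp(-0.5 * ((x - true_x) / 5.0) ** 2) + 0.05

        n, true_x = 500, 40.0
        particles = rng.uniform(0.0, 100.0, n)          # initial position hypotheses
        for step in range(20):
            true_x += 1.0                               # pedestrian walks to the right
            particles += 1.0 + rng.normal(0.0, 2.0, n)  # motion model plus noise
            weights = detector_score(particles, true_x) # detector score as likelihood
            weights /= weights.sum()
            idx = rng.choice(n, size=n, p=weights)      # resample (sequential Monte Carlo)
            particles = particles[idx]

        print(f"true position {true_x:.1f}, estimate {particles.mean():.1f}")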

    Belief State Planning for Autonomous Driving: Planning with Interaction, Uncertain Prediction and Uncertain Perception

    This work presents a behavior planning algorithm for automated driving in urban environments of an uncertain and dynamic nature. The algorithm explicitly considers prediction uncertainty (e.g. different intentions), perception uncertainty (e.g. occlusions), as well as the uncertain interactive behavior of the other agents. Simulating the most likely future scenarios allows an optimal policy to be found online, enabling non-conservative planning under uncertainty
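
    A minimal sketch of the underlying decision rule (toy costs and beliefs, not the authors' planner): candidate ego policies are scored by their expected cost over a belief about another agent's intention, simulating only the hypothesized scenarios.

        belief = {"yield": 0.7, "go": 0.3}              # belief over the other agent's intent

        def cost(ego_action, other_intent):
            """Toy cost model: conflicts are expensive, waiting is mildly costly."""
            if ego_action == "proceed" and other_intent == "go":
                return 100.0                            # near-collision scenario
            if ego_action == "proceed":
                return 1.0                              # smooth merge
            return 5.0                                  # ego waits

        def expected_cost(ego_action):
            return sum(p * cost(ego_action, intent) for intent, p in belief.items())

        best = min(("proceed", "wait"), key=expected_cost)
        print({a: expected_cost(a) for a in ("proceed", "wait")}, "->", best)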

    Statistical Filtering for Multimodal Mobility Modeling in Cyber Physical Systems

    A Cyber-Physical System integrates computation with the dynamics of physical processes. It is an engineering discipline focused on technology, with a strong foundation in mathematical abstractions. It shares many of these abstractions with engineering and computer science, but still requires adaptation to suit the dynamics of the physical world. In such a dynamic system, mobility management is one of the key issues in developing a new service. For example, in the study of a new mobile network, it is necessary to simulate and evaluate a protocol before deployment in the system. Mobility models characterize the movement patterns of mobile agents; they also describe the conditions of mobile services. The focus of this thesis is on mobility modeling in cyber-physical systems. A macroscopic model that captures the mobility of individuals (people and vehicles) can facilitate an unlimited number of applications; one fundamental and obvious example is traffic profiling. Mobility in most systems is a dynamic process, and small non-linearities can lead to substantial errors in the model. There is extensive research on statistical inference and filtering methods for data modeling in cyber-physical systems. In this thesis, several methods are employed for multimodal data fusion, localization and traffic modeling. A novel energy-aware sparse signal processing method is presented to process massive sensory data. As a baseline, this research examines the application of statistical filters for mobility modeling and assesses the difficulties faced in fusing massive multimodal sensory data. A statistical framework is developed to apply the proposed methods to measurements available in cyber-physical systems. The proposed methods employ various statistical filtering schemes (namely compressive sensing, particle filtering and kernel-based optimization) and apply them to multimodal data sets acquired from intelligent transportation systems, wireless local area networks, cellular networks and air quality monitoring systems. Experimental results show the capability of the proposed methods in processing multimodal sensory data, providing a macroscopic mobility model of mobile agents in an energy-efficient way using inconsistent measurements
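
    As one illustration of the sparse signal processing ingredient (not the thesis code), the sketch below recovers a sparse signal from a small number of random projections with orthogonal matching pursuit, the kind of compressive-sensing step that enables energy-aware sensing; all dimensions and values are arbitrary.

        import numpy as np

        rng = np.random.default_rng(1)
        n, m, k = 128, 40, 4                        # signal length, measurements, sparsity
        x = np.zeros(n)
        x[rng.choice(n, k, replace=False)] = rng.normal(0.0, 1.0, k)  # sparse signal
        A = rng.normal(0.0, 1.0, (m, n)) / np.sqrt(m)                 # random sensing matrix
        y = A @ x                                                      # compressed measurements

        support, residual = [], y.copy()
        for _ in range(k):                          # orthogonal matching pursuit
            j = int(np.argmax(np.abs(A.T @ residual)))
            support.append(j)
            coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            residual = y - A[:, support] @ coef

        x_hat = np.zeros(n)
        x_hat[support] = coef
        print("recovery error:", np.linalg.norm(x - x_hat))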

    Unraveling the intricacies of spatial organization of the ErbB receptors and downstream signaling pathways

    Faced with the complexity of diseases such as cancer, with as many as 10^12 mutations altering gene expression and disrupting regulatory networks, there has been a paradigm shift in the biological sciences, and what has emerged is a much more quantitative field of biology. Mathematical modeling can aid biological discovery through the development of predictive models that provide future direction for experimentalists. In this work, I have contributed to the development of novel computational approaches which explore mechanisms of receptor aggregation and predict the effects on downstream signaling. The coupled spatial non-spatial simulation algorithm (CSNSA), a tool that I took part in developing, combines a spatial kinetic Monte Carlo method for capturing receptor interactions on the cell membrane with Gillespie's stochastic simulation algorithm (SSA) for temporal cytosolic interactions. Using this framework we determine that receptor clustering significantly enhances downstream signaling. In the next study the goal was to understand the mechanisms of clustering. Cytoskeletal interactions with mobile proteins are known to hinder diffusion. Using a Monte Carlo approach we simulate these interactions, determining at what cytoskeletal distribution and receptor concentration optimal clustering occurs and when it is inhibited. We investigate oligomerization-induced trapping to determine mechanisms of clustering, and our results show that cytoskeletal interactions lead to receptor clustering. After exploring the mechanisms of clustering, we determine how receptor aggregation affects downstream signaling. We further proceed by implementing the adaptively coarse-grained Monte Carlo (ACGMC) method to determine whether 'receptor-sharing' occurs when receptors are clustered. In the proposed 'receptor-sharing' mechanism, a cytosolic species binds to a receptor, then dissociates and rebinds to a neighboring receptor. We tested our hypothesis using the ACGMC, a novel computational approach which enables the spatio-temporal evolution of the system in three dimensions through coarse graining. In this framework we model EGFR reaction-diffusion events on the plasma membrane while capturing the spatio-temporal dynamics of proteins in the cytosol. From this framework we observe 'receptor-sharing', which may be an important mechanism in the regulation and overall efficiency of signal transduction. In summary, I have helped to develop predictive computational tools that take systems biology in a new direction.
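
    A minimal Gillespie SSA sketch for the temporal, cytosolic part of such a framework (a generic reversible binding reaction, not the CSNSA or ACGMC code); rate constants and copy numbers are arbitrary.

        import random

        def gillespie(a0=100, b0=100, ab0=0, k_on=0.001, k_off=0.1, t_end=50.0, seed=0):
            """Stochastic simulation of A + B <-> AB with the Gillespie algorithm."""
            rng = random.Random(seed)
            t, a, b, ab = 0.0, a0, b0, ab0
            while t < t_end:
                rates = [k_on * a * b, k_off * ab]      # propensities of the two reactions
                total = sum(rates)
                if total == 0.0:
                    break
                t += rng.expovariate(total)             # time to the next reaction event
                if rng.random() * total < rates[0]:     # choose which reaction fires
                    a, b, ab = a - 1, b - 1, ab + 1     # binding: A + B -> AB
                else:
                    a, b, ab = a + 1, b + 1, ab - 1     # unbinding: AB -> A + B
            return a, b, ab

        print("final copy numbers (A, B, AB):", gillespie())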

    Context Exploitation in Data Fusion

    Complex and dynamic environments constitute a challenge for existing tracking algorithms. For this reason, modern solutions try to exploit any available information that could help to constrain, improve or explain the measurements. So-called Context Information (CI) is understood as information that surrounds an element of interest, whose knowledge may help in understanding the (estimated) situation and in reacting to it. However, context discovery and exploitation are still largely unexplored research topics. Until now, context has been extensively exploited as a parameter in system and measurement models, which has led to the development of numerous approaches for linear or non-linear constrained estimation and target tracking. More specifically, spatial or static context is the most common source of ambient information, i.e. features, utilized for the recursive enhancement of the state variables either in the prediction or in the measurement update of the filters. In the case of multiple-model estimators, context can be related not only to the state but also to a certain mode of the filter. Common practice for multiple-model scenarios is to represent states and context as a joint distribution of Gaussian mixtures; these approaches are commonly referred to as joint tracking and classification. Alternatively, the usefulness of context has also been demonstrated in aiding measurement data association. The process of formulating a hypothesis that assigns a particular measurement to a track is traditionally governed by empirical knowledge of the noise characteristics of the sensors and the operating environment, i.e. the probability of detection, false alarms and clutter noise, which can be further enhanced by conditioning on context. We believe that interactions between the environment and the object can be classified into actions, activities and intents, and formed into structured graphs with contextual links translated into arcs. By learning the environment model we will be able to make predictions about the target's future actions based on its past observations. The probability of a target's future action could be utilized in the fusion process to adjust the tracker's confidence in measurements. By incorporating contextual knowledge of the environment, in the form of a likelihood function, into the filter measurement update step, we have been able to reduce the uncertainties of the tracking solution and improve the consistency of the track. The promising results demonstrate that the fusion of CI brings a significant performance improvement in comparison with regular tracking approaches
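
    A schematic example of folding a context likelihood into a filter measurement update (not the system described above): an "on-road" prior down-weights state hypotheses that leave the road before the weights are renormalized; all numbers are assumptions.

        import numpy as np

        rng = np.random.default_rng(2)
        particles = rng.uniform(-10.0, 10.0, 300)            # lateral position hypotheses
        weights = np.ones_like(particles) / particles.size

        z, r = 4.0, 1.0                                       # measurement and its std. dev.
        meas_like = np.exp(-0.5 * ((particles - z) / r) ** 2) # sensor likelihood

        # Context: the road spans [-3.5, 3.5]; off-road states are unlikely but not impossible.
        context_like = np.where(np.abs(particles) <= 3.5, 1.0, 0.05)

        weights *= meas_like * context_like                   # context-aware measurement update
        weights /= weights.sum()
        print("context-aware lateral estimate:", float(np.sum(weights * particles)))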