Directed Flow of Information in Chimera States
We investigated interactions within chimera states in a phase oscillator
network with two coupled subpopulations. To quantify interactions within and
between these subpopulations, we estimated the corresponding (delayed) mutual information, which, in general, quantifies the capacity, that is, the maximum rate at which information can be transferred while recovering the sender's information at the receiver with vanishingly low error probability. After verifying their equivalence to estimates based on the continuous phase data, we determined the mutual information from the time points at which the individual phases passed through their respective Poincaré sections. This stroboscopic view of the dynamics resembles, e.g., neural spike times, which are common
observables in the study of neuronal information transfer. This discretization
also increased processing speed significantly, rendering it particularly
suitable for a fine-grained analysis of the effects of experimental and model
parameters. In our model, the delayed mutual information within each
subpopulation peaked at zero delay, whereas between the subpopulations it was
always maximal at non-zero delay, irrespective of parameter choices. We
observed that the delayed mutual information of the desynchronized subpopulation preceded that of the synchronized subpopulation. Put differently, the oscillators of the desynchronized subpopulation were 'driving' those in the synchronized subpopulation. These findings were also observed when estimating the mutual information of the full phase trajectories. We can thus conclude that the delayed mutual information of discrete time points allows for inferring a functional directed flow of information between subpopulations of coupled phase oscillators.
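To illustrate the estimator at the heart of this analysis, here is a minimal sketch of a histogram-based delayed mutual information estimate between two signals, where the location of the peak over delays indicates the lead-lag direction. The function name, bin count, and test signals are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def delayed_mutual_information(x, y, delay, bins=16):
    """Histogram estimate of I(x_t ; y_{t+delay}) in bits."""
    if delay > 0:
        x, y = x[:-delay], y[delay:]
    elif delay < 0:
        x, y = x[-delay:], y[:delay]
    # Joint and marginal probabilities from a 2D histogram.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0  # skip empty cells to avoid log(0)
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

# Example: y is a noisy, lagged copy of x, so the MI should peak near delay = 5.
rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
y = np.roll(x, 5) + 0.1 * rng.standard_normal(10_000)
delays = range(-20, 21)
mi = [delayed_mutual_information(x, y, d) for d in delays]
print(max(zip(mi, delays)))  # maximum attained near delay = 5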
Estimation of synthetic flood design hydrographs using a distributed rainfall–runoff model coupled with a copula-based single storm rainfall generator
This paper presents a procedure to derive synthetic flood design hydrographs (SFDHs) that couples a distributed rainfall–runoff model with a bivariate representation of rainfall forcing (rainfall duration and intensity) via copulas, which describe and model the correlation between two variables independently of the marginal laws involved. Rainfall–runoff (R–R) modelling for estimating the hydrological response at the outlet of a catchment was performed using a conceptual, fully distributed procedure based on the Soil Conservation Service Curve Number (SCS-CN) method as an excess rainfall model and on a distributed unit hydrograph with climatic dependencies for the flow routing. Travel time computation, required by the distributed unit hydrograph definition, was performed using flow paths determined from a digital elevation model (DEM) and roughness parameters obtained from distributed geographical information. In order to estimate the primary return period of the SFDHs, which provides the probability of occurrence of a flood hydrograph, the peaks and flow volumes obtained through R–R modelling were treated statistically using copulas. Finally, the shapes of the hydrographs were generated, via cluster analysis, on the basis of historically significant flood events.
An application of the procedure is presented for the case study of the Imera catchment in Sicily, Italy.
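As a concrete illustration of the excess rainfall step, here is a minimal sketch of the SCS Curve Number computation named above, in SI units. The function name, the initial-abstraction ratio of 0.2, and the example numbers are conventional illustrative choices, not values taken from the paper.

```python
def scs_cn_runoff(p_mm, cn, ia_ratio=0.2):
    """Excess rainfall (runoff) depth in mm via the SCS Curve Number method.

    p_mm     -- storm rainfall depth (mm)
    cn       -- curve number (1..100), from land use / soil / antecedent moisture
    ia_ratio -- initial abstraction ratio, conventionally 0.2
    """
    s = 25400.0 / cn - 254.0   # potential maximum retention (mm)
    ia = ia_ratio * s          # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0             # all rainfall abstracted, no runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Example: an 80 mm storm on a catchment cell with CN = 75.
print(round(scs_cn_runoff(80.0, 75), 1))  # ~26.9 mm of excess rainfall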
Risk based multi-objective security control and congestion management
The deterministic security criterion has served power system operation and congestion management well in recent decades. It is simple to implement in a security control model, for example, security constrained optimal power flow (SCOPF). However, since event likelihood and violation information are not addressed, it does not provide a quantitative understanding of security and thus leaves operators with inadequate system awareness. Therefore, even though computation capability and information techniques have greatly improved and are widely applied in operation support tools, operators are still unable to get rid of security threats, especially in a competitive market environment.

The probabilistic approach has shown its strength for planning purposes and has recently received attention in the operations area. Since power system security assessment needs to analyze the consequences of all credible events, risk, defined as the product of event probability and severity, is well suited to quantify the system security level, as well as the congestion level. Because risk captures this extra information, its application to making better online operation decisions has become an attractive research topic.

This dissertation focuses on online system risk calculation, the development of a risk-based multi-objective optimization model, risk-based security control design, and risk-based congestion management. A regression model is proposed to predict contingency probability from weather and geography information for online risk calculation. A risk-based multi-objective optimization (RBMO) model is presented that considers the conflicting objectives of risk and cost. Two types of methods, classical methods and evolutionary algorithms, are implemented to solve the RBMO problem. A risk-based decision-making architecture for security control is designed based on an understanding of the Pareto-optimal solutions, a visualization tool, and high-level information analysis. Risk-based congestion management provides a market lever to uniformly expand a security volume, where greater volume means more risk; meanwhile, the risk-based LMP signal contracts all dimensions of this volume with proper weights (state probabilities) at a time.

Two test systems, a 6-bus system and the IEEE RTS-96, are used to test the developed algorithms. The simulation results show that incorporating risk into security control and congestion management evolves our understanding of the security level, improves control and market efficiency, and supports operators in maneuvering the system effectively.
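The risk definition used throughout, the product of event probability and severity summed over all credible events, can be illustrated with a minimal sketch. The class and function names and the contingency numbers below are purely illustrative; the dissertation's actual severity functions and probability regression are not reproduced here.

```python
from dataclasses import dataclass

@dataclass
class Contingency:
    name: str
    probability: float  # event likelihood (e.g., from a weather-based regression)
    severity: float     # consequence measure (e.g., overload or voltage-violation severity)

def system_risk(contingencies):
    """Risk index: sum over credible events of probability * severity."""
    return sum(c.probability * c.severity for c in contingencies)

# Illustrative contingency list with made-up numbers.
events = [
    Contingency("line 1-2 outage", probability=0.010, severity=4.0),
    Contingency("line 2-3 outage", probability=0.002, severity=9.5),
    Contingency("gen 1 outage",    probability=0.005, severity=2.0),
]
print(system_risk(events))  # 0.069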
A Finite State Automaton Representation And Simulation Of A Data/Frame Model Of Sensemaking
This thesis presents the application of a finite state automaton (FSA) to analytic modeling of the Data/Frame Model (DFM) of sensemaking. An FSA is chosen for the DFM simulation because its inherent ability to mimic changes in system behavior and transitional states is akin to the dynamic information changes in unstructured emergencies. It can also capture feedback loops, transitions, and spatio-temporal events arising from the iterative processes of an individual sensemaker or a group of sensemakers. The thesis exploits the human-driven DFM constructs for analytical modeling using the Laboratory Virtual Instrumentation Engineering Workbench (LabVIEW) software system. Sensemaking times, problem stage time (PST), and node-to-node (NTN) transition times serve as the major performance factors. The results show differences in sensemaking times based on problem complexity and information uncertainty. An analysis of variance (ANOVA) was conducted for three fictitious scenarios of differing complexity and for Hurricane Katrina to investigate sensemaking performance. The effect was significant, with F(3, 177) = 16.78 and p < 0.05, indicating an overall effect of information flow on sensemaking. Tukey's Studentized Range Test shows statistically significant differences between the complexities of Hurricane Katrina (HK) and the medium complexity scenario (MC), HK and the low complexity scenario (LC), the high complexity scenario (HC) and LC, and MC and LC.
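To make the FSA representation concrete, here is a minimal sketch of a finite state automaton whose states and events are loosely named after Data/Frame Model activities (recognizing, elaborating, questioning, and reframing a frame). It is an illustrative toy, not the thesis's LabVIEW implementation, and all state and event names are assumptions.

```python
# Transition table: (current_state, event) -> next_state.
TRANSITIONS = {
    ("recognize_frame", "data_fits"):       "elaborate_frame",
    ("recognize_frame", "data_conflicts"):  "question_frame",
    ("elaborate_frame", "data_conflicts"):  "question_frame",
    ("question_frame",  "frame_repaired"):  "elaborate_frame",
    ("question_frame",  "frame_rejected"):  "reframe",
    ("reframe",         "new_frame_found"): "recognize_frame",
}

def run(events, state="recognize_frame"):
    """Feed a sequence of events through the automaton, tracing each transition."""
    for event in events:
        next_state = TRANSITIONS.get((state, event))
        if next_state is None:
            raise ValueError(f"no transition from {state!r} on {event!r}")
        print(f"{state} --{event}--> {next_state}")
        state = next_state
    return state

run(["data_conflicts", "frame_rejected", "new_frame_found", "data_fits"])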
Modeling Interactions Between Human Factors and Traffic Flow Characteristics
To serve research needs in traffic flow model development and highway safety enhancement, we model interactions between human factors and traffic flow characteristics. This topic includes methods for collecting data, modeling the impact of parameters on flow, and calibrating parameters against observed data.

An example of successful traffic data collection is the NGSIM data set, which contains location, speed, and acceleration information for individual vehicles. An algorithm was designed to match and extract vehicle trajectory records and to use the extracted information for pattern recognition of lane-changing maneuvers. The algorithm reads records from an NGSIM data set, picks out vehicles executing lane-changing maneuvers, and notes the corresponding time stamps. By matching these records by vehicle ID and time stamp, we obtain a map of the vehicles around a lane change as it happens, which makes it possible to calculate gaps and relative speeds. One use of this information is pattern recognition of lane-changing maneuvers, which we analyze with speed data and gap data. The approach based on speed data shows convincing results, as most lane-changing vehicles exhibit a descending and then ascending pattern in their speed profiles before executing the maneuver.

Collected data can also be used to calibrate parameters in traffic flow models. A heuristic methodology is implemented to provide results with high accuracy, efficiency, and robustness. Techniques including data aggregation and bisection analysis are applied to achieve these goals and further requirements. Two traffic flow simulation models, the Longitudinal Control Model (LCM) and Newell's model, are calibrated with this approach using traffic data collected on the Georgia 400 highway in July 2003, with satisfying accuracy and robustness produced in a running time of less than 2 seconds.

Meanwhile, human factors can be enhanced by new technologies, of which the rapidly developing connected vehicle is a good example. Future vehicles will be able to communicate with each other, greatly improving drivers' situational awareness. Consequently, drivers may be able to respond to safety hazards before they manifest themselves as imminent danger. The overall effect of this technology can therefore be attributed to drivers' enhanced perception-reaction (P-R) capability which, in turn, translates into improved flow and capacity. However, it is critical to quantify such benefits before large-scale investment is made. In our research, a statistical transformation model is formulated to predict the probability distribution function of flow. Given distributions of P-R time and enhanced P-R time, this model compares the before and after distributions of traffic flow, from which the benefits of connected vehicles on traffic flow can be analyzed.
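As an illustration of the record-matching step described above, here is a minimal sketch that detects lane-changing maneuvers from NGSIM-style trajectory rows by grouping records by vehicle ID and watching for a change in lane ID between consecutive time stamps. The tuple layout and the function name are assumptions for illustration, not the dissertation's actual algorithm.

```python
from collections import defaultdict

def lane_change_events(records):
    """Find lane changes in NGSIM-style rows: (vehicle_id, time_stamp, lane_id).

    Returns a list of (vehicle_id, time_stamp, old_lane, new_lane).
    """
    by_vehicle = defaultdict(list)
    for vid, t, lane in records:
        by_vehicle[vid].append((t, lane))
    events = []
    for vid, traj in by_vehicle.items():
        traj.sort()                      # order each trajectory by time stamp
        for (t0, l0), (t1, l1) in zip(traj, traj[1:]):
            if l1 != l0:                 # lane ID changed between frames
                events.append((vid, t1, l0, l1))
    return events

# Toy records: vehicle 7 moves from lane 2 to lane 3 at t = 3.
rows = [(7, 1, 2), (7, 2, 2), (7, 3, 3), (9, 1, 1), (9, 2, 1)]
print(lane_change_events(rows))  # [(7, 3, 2, 3)]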
Analysis of laminar-turbulent transition process in mixing layer with various information measures
The laminar-turbulent transition process of a mixing layer formed downstream of a two-dimensional nozzle exit was analyzed with various information measures. Shannon entropy, permutation entropy, and Kullback-Leibler divergence were introduced, and earlier studies in which they were used in turbulence analysis were reviewed. In the present study, the probability distribution of the time series of hot-wire output voltage data was obtained and then analyzed. The aim of the investigation was to clarify the effectiveness of this analysis for transitional and turbulent flow. In addition, equations that the Shannon entropy must satisfy in a turbulent flow field were derived. The Shannon entropy of the fluctuating velocity changed monotonically in the downstream direction; it thus appears to measure the transition process in the mixing layer. The permutation entropy of the fluctuating velocity first increased, then decreased, increased again, and finally decreased. It reflected the growth of the fluctuating velocity and the change in fluctuation character (from periodic to irregular) during the transition process. The Kullback-Leibler divergence based on the probability density function of the fluctuating velocity first increased and then decreased downstream, and thus did not change monotonically during the transition process in the mixing layer.
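For concreteness, here is a minimal sketch of the three information measures named above, computed from a sampled velocity-like signal via histogram and ordinal-pattern estimates. The bin counts, pattern order, and test signals are illustrative choices, not the settings used in the study.

```python
import numpy as np
from itertools import permutations

def shannon_entropy(signal, bins=32):
    """Shannon entropy (bits) of the signal's amplitude histogram."""
    p, _ = np.histogram(signal, bins=bins)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log2(p))

def permutation_entropy(signal, order=3):
    """Permutation entropy (bits) over ordinal patterns of length `order`."""
    patterns = {perm: 0 for perm in permutations(range(order))}
    for i in range(len(signal) - order + 1):
        patterns[tuple(np.argsort(signal[i:i + order]))] += 1
    counts = np.array([c for c in patterns.values() if c > 0])
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def kl_divergence(sample, reference, bins=32):
    """KL divergence D(sample || reference) from shared-bin histograms."""
    lo, hi = min(sample.min(), reference.min()), max(sample.max(), reference.max())
    p, _ = np.histogram(sample, bins=bins, range=(lo, hi), density=True)
    q, _ = np.histogram(reference, bins=bins, range=(lo, hi), density=True)
    mask = (p > 0) & (q > 0)  # ignore bins empty on either side
    return np.sum(p[mask] * np.log(p[mask] / q[mask])) * (hi - lo) / bins

# Periodic vs. irregular test signals: the measures separate the two regimes.
t = np.linspace(0, 100, 5000)
periodic = np.sin(t)
irregular = np.random.default_rng(1).standard_normal(5000)
print(shannon_entropy(periodic), shannon_entropy(irregular))
print(permutation_entropy(periodic), permutation_entropy(irregular))
print(kl_divergence(irregular, periodic))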
Modelling election dynamics and the impact of disinformation
Complex dynamical systems driven by the unravelling of information can be
modelled effectively by treating the underlying flow of information as the
model input. Complicated dynamical behaviour of the system is then derived as
an output. Such an information-based approach is in sharp contrast to the
conventional mathematical modelling of information-driven systems whereby one
attempts to come up with essentially ad hoc models for the outputs. Here, the dynamics of electoral competition is modelled by specifying the flow of information relevant to the election. The seemingly random evolution of the election poll statistics is then derived as a model output, which in turn is used to study election prediction, the impact of disinformation, and the optimal strategy for information management in an election campaign.
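The abstract does not spell out the model, but a standard way to treat an information flow as the model input is an information process in the style of information-based (Brody-Hughston-Macrina-type) models: a hidden outcome observed through Brownian-bridge noise, with the "poll" emerging as the conditional probability of the outcome. The sketch below is such an assumed toy, with all parameters and names illustrative rather than taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
T, n, sigma = 1.0, 1000, 1.2   # horizon, number of steps, information flow rate
p1 = 0.5                        # prior probability that candidate 1 wins
X = 1                           # true (hidden) election outcome in {0, 1}

# Brownian bridge on [0, T]: beta_t = B_t - (t / T) * B_T, pinned to 0 at both ends.
t = np.linspace(0.0, T, n + 1)
dB = rng.standard_normal(n) * np.sqrt(T / n)
B = np.concatenate(([0.0], np.cumsum(dB)))
beta = B - (t / T) * B[-1]

# Information process: the true outcome emerges from the bridge noise as t -> T.
xi = sigma * t * X + beta

# "Poll" output: conditional probability of X = 1 given xi_t, via Bayes' formula
# for a binary outcome observed through Brownian-bridge noise.
t_, xi_ = t[:-1], xi[:-1]                     # exclude t = T, where the factor blows up
u = (T / (T - t_)) * (sigma * xi_ - 0.5 * sigma**2 * t_)
u = np.clip(u, -700.0, 700.0)                 # keep exp() within floating-point range
poll = 1.0 / (1.0 + ((1.0 - p1) / p1) * np.exp(-u))
print(poll[0], poll[-1])  # starts at the prior 0.5, ends near the true outcome 1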