
    Entropic Analysis of Votes Expressed in Italian Elections between 1948 and 2018

    In Italy, elections occur often: almost every year the citizens are involved in a democratic choice to decide the leaders of different administrative entities. Sometimes the citizens are called to vote to fill more than one office in more than one administrative body. This has occurred 35 times since 1948, and it creates the peculiar condition of having the same sample of people express political decisions at the same time. The Italian contemporaneous ballots therefore offer an occasion to measure coherence and chaos in the way political opinion is expressed. In this paper, we address all the Italian elections that occurred between 1948 and 2018. We collect the number of votes per party at each administrative level and treat each election as a manifestation of a complex system. Then, we use the Shannon entropy and the Gini index to study the degree of disorder manifested during different types of elections at the municipality level. A particular focus is devoted to the contemporaneous elections. Such cases show different disorder dynamics in the contemporaneous ballots when different administrative levels are involved. Furthermore, some features that characterize different entropic regimes have emerged.
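    As an illustration only (not code from the paper), a minimal Python sketch of the two quantities the study relies on, computed from a vector of per-party vote counts for a single municipality; the function names and the toy data are ours.

        import numpy as np

        def shannon_entropy(votes):
            # Shannon entropy (in bits) of the vote-share distribution.
            p = np.asarray(votes, dtype=float)
            p = p / p.sum()
            p = p[p > 0]                     # parties with zero votes contribute nothing
            return -np.sum(p * np.log2(p))

        def gini_index(votes):
            # Gini index of the vote counts: 0 = perfectly even split, 1 = total concentration.
            x = np.sort(np.asarray(votes, dtype=float))
            n = x.size
            cum = np.cumsum(x)
            return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

        # Toy example: votes per party in one hypothetical municipality.
        votes = [1200, 800, 450, 300, 50]
        print(shannon_entropy(votes), gini_index(votes))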

    A generalized permutation entropy for random processes

    Permutation entropy measures the complexity of deterministic time series via a symbolic quantization of the data into rank vectors called ordinal patterns, or simply permutations. The reasons for the increasing popularity of this entropy in time series analysis include that (i) it converges to the Kolmogorov-Sinai entropy of the underlying dynamics in the limit of ever longer permutations, and (ii) its computation dispenses with generating partitions and ad hoc partitions. However, permutation entropy diverges when the number of allowed permutations grows super-exponentially with their length, as is usually the case when time series are output by random processes. In this Letter we propose a generalized permutation entropy that is finite for random processes, including discrete-time dynamical systems with observational or dynamical noise.
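    For reference, the classic Bandt-Pompe estimator that the Letter generalizes can be sketched in a few lines of Python; the generalized, noise-robust version proposed by the authors is not specified in the abstract and is not reproduced here.

        import numpy as np
        from collections import Counter
        from math import factorial, log

        def permutation_entropy(x, m=3, normalize=True):
            # Classic permutation entropy of order m (ties in the data are not handled).
            x = np.asarray(x, dtype=float)
            patterns = Counter()
            for i in range(len(x) - m + 1):
                patterns[tuple(np.argsort(x[i:i + m]))] += 1   # ordinal pattern of the window
            total = sum(patterns.values())
            h = -sum((c / total) * log(c / total) for c in patterns.values())
            return h / log(factorial(m)) if normalize else h

        # White noise should give a normalized value close to 1.
        rng = np.random.default_rng(0)
        print(permutation_entropy(rng.standard_normal(10_000), m=4))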

    Gintropy: Gini index based generalization of Entropy

    Entropy is used in physics, mathematics, informatics, and related areas to describe equilibration, dissipation, maximal-probability states, and the optimal compression of information. The Gini index, on the other hand, is an established measure of social and economic inequality in a society. In this paper we explore the mathematical similarities and connections between these two quantities and introduce a new measure that connects the two at an interesting level of analogy. This supports the idea that a generalization of the Gibbs-Boltzmann-Shannon entropy, based on a transformation of the Lorenz curve, can properly serve to quantify different aspects of complexity in socio- and econophysics.
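    The gintropy measure itself is not defined in the abstract; as background for the Lorenz-curve construction it builds on, the following Python sketch (ours) computes an empirical Lorenz curve and reads the Gini index off it as twice the area between the curve and the diagonal.

        import numpy as np

        def lorenz_curve(sample):
            # Return (F, L): cumulative population share and cumulative wealth share.
            x = np.sort(np.asarray(sample, dtype=float))
            L = np.concatenate(([0.0], np.cumsum(x) / x.sum()))
            F = np.linspace(0.0, 1.0, x.size + 1)
            return F, L

        def gini_from_lorenz(F, L):
            # Gini index = twice the area between the diagonal and the Lorenz curve.
            area_under_lorenz = np.sum((L[1:] + L[:-1]) * np.diff(F)) / 2.0
            return 1.0 - 2.0 * area_under_lorenz

        rng = np.random.default_rng(1)
        incomes = rng.exponential(scale=1.0, size=100_000)
        print(gini_from_lorenz(*lorenz_curve(incomes)))   # exponential incomes -> Gini ~ 0.5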

    Signal Fluctuations and the Information Transmission Rates in Binary Communication Channels

    In the nervous system, information is conveyed by sequences of action potentials (spike trains). As MacKay and McCulloch proposed, spike trains can be represented as bit sequences coming from information sources. Previously, we studied the relations between the Information Transmission Rates (ITR) carried by the spikes, their correlations, and their frequencies. Here, we concentrate on the problem of how spike fluctuations affect the ITR. The information-theoretic method developed by Shannon is applied, and the information sources are modeled as stationary stochastic processes; we assume such sources are two-state Markov processes. As a measure of spike-train fluctuations, we consider the standard deviation SD, which measures the average fluctuation of spikes around the average spike frequency. We found that the character of the relation between the ITR and the signal fluctuations strongly depends on the parameter s, defined as the sum of the transition probabilities from the no-spike state to the spike state and vice versa. It turns out that for smaller s (s < 1) the quotient ITR/SD has a maximum and can tend to zero, depending on the transition probabilities, while for s large enough (1 < s) the quotient ITR/SD is bounded away from 0 for every s. Similar behavior was observed when we replaced the Shannon entropy terms in the Markov entropy formula by their polynomial approximations. We also show that the quotient of the ITR by the variance behaves in a completely different way. For a large transition parameter s, the Information Transmission Rate divided by SD never decreases to 0: specifically, for 1 < s < 1.7 the ITR always stays above the level of the fluctuations, independently of the transition probabilities that form this s, i.e., we have SD < ITR. We conclude that in a noisier environment, to obtain appropriate reliability and efficiency of transmission, information sources with a higher tendency to transition from the no-spike state to the spike state and vice versa should be used.
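    A rough Python sketch of the quantities involved, using the standard entropy rate of a stationary two-state Markov chain as the ITR and the standard deviation of the stationary spike/no-spike marginal as the SD; the exact conventions of the paper may differ.

        import numpy as np

        def binary_entropy(p):
            # H(p) in bits, with the convention 0 * log 0 = 0.
            if p in (0.0, 1.0):
                return 0.0
            return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

        def itr_and_sd(p01, p10):
            # p01: no-spike -> spike transition probability; p10: spike -> no-spike.
            pi1 = p01 / (p01 + p10)          # stationary probability of the spike state
            pi0 = 1.0 - pi1
            itr = pi0 * binary_entropy(p01) + pi1 * binary_entropy(p10)   # entropy rate, bits/bin
            sd = np.sqrt(pi1 * pi0)          # SD of the stationary Bernoulli marginal
            return itr, sd

        # Scan the transition-probability sum s = p01 + p10 with p01 = p10 = s / 2.
        for s in (0.4, 1.0, 1.6):
            itr, sd = itr_and_sd(s / 2, s / 2)
            print(f"s = {s:.1f}  ITR = {itr:.3f}  SD = {sd:.3f}  ITR/SD = {itr / sd:.3f}")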

    Nonlinear fokker-planck equation approach to systems of interacting particles: Thermostatistical features related to the range of the interactions

    Nonlinear Fokker-Planck equations (NLFPEs) constitute useful effective descriptions of some interacting many-body systems. Important instances of these nonlinear evolution equations are closely related to the thermostatistics based on the Sq power-law entropic functionals. Most applications of the connection between the NLFPE and the Sq entropies have focused on systems interacting through short-range forces. In the present contribution we revisit the NLFPE approach to interacting systems in order to clarify the role played by the range of the interactions, and to explore the possibility of developing similar treatments for systems with long-range interactions, such as those corresponding to Newtonian gravitation. In particular, we consider a system of particles interacting via forces following the inverse-square law and performing overdamped motion, described by a density obeying an integro-differential evolution equation that admits exact time-dependent solutions of the q-Gaussian form. These q-Gaussian solutions, which constitute a signature of Sq-thermostatistics, evolve in a similar but not identical way to the solutions of an appropriate nonlinear, power-law Fokker-Planck equation.
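    For orientation, a q-Gaussian profile of the kind mentioned above can be evaluated as follows (a sketch, with the normalization done numerically on a grid rather than in closed form; the parameter values are arbitrary).

        import numpy as np

        def q_exponential(u, q):
            # e_q(u) = [1 + (1 - q) u]_+^(1 / (1 - q)), which reduces to exp(u) as q -> 1.
            if abs(q - 1.0) < 1e-12:
                return np.exp(u)
            return np.maximum(1.0 + (1.0 - q) * u, 0.0) ** (1.0 / (1.0 - q))

        def q_gaussian(x, q=1.5, beta=1.0):
            # Unnormalized q-Gaussian profile e_q(-beta * x^2).
            return q_exponential(-beta * x ** 2, q)

        x = np.linspace(-10.0, 10.0, 20001)
        profile = q_gaussian(x, q=1.5, beta=1.0)
        pdf = profile / (profile.sum() * (x[1] - x[0]))   # crude grid normalization
        print(pdf[x.size // 2])                           # peak value of the normalized density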

    Entropic characterization of quantum states with maximal evolution under given energy constraints

    A measure D[t1, t2] of the amount of dynamical evolution exhibited by a quantum system during a time interval [t1, t2] is defined in terms of how distinguishable from each other, on average, the states of the system at different times are. We investigate some properties of the measure D, showing that, for increasing values of the interval's duration, the measure quickly reaches an asymptotic value given by the linear entropy of the energy distribution associated with the system's (pure) quantum state. This leads to the formulation of an entropic variational problem characterizing the quantum states that exhibit the largest amount of dynamical evolution under energy constraints given by the expectation value of the energy.
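    Assuming "linear entropy" is used in the usual sense of 1 - sum(p_n^2), the asymptotic value mentioned above can be evaluated as in this short sketch (the state amplitudes are invented for illustration).

        import numpy as np

        def linear_entropy(probabilities):
            # Linear entropy 1 - sum(p_n^2) of a probability distribution.
            p = np.asarray(probabilities, dtype=float)
            p = p / p.sum()
            return 1.0 - np.sum(p ** 2)

        # Energy distribution |c_n|^2 of a hypothetical pure state in the energy eigenbasis.
        amplitudes = np.array([0.8, 0.5, 0.3, 0.1], dtype=complex)
        amplitudes /= np.linalg.norm(amplitudes)
        print(linear_entropy(np.abs(amplitudes) ** 2))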

    On the unit Burr-XII distribution with the quantile regression modeling and applications

    In this paper, we modify the Burr-XII distribution through the inverse exponential scheme to obtain a new two-parameter distribution on the unit interval, called the unit Burr-XII distribution. The basic statistical properties of the newly defined distribution are studied. Parameter estimation is addressed, and different estimation methods are assessed through two simulation studies. A new quantile regression model based on the proposed distribution is introduced. Applications of the proposed distribution and its regression model to real data sets show that the proposed models have better modeling capabilities than competing models.
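    Reading the "inverse exponential scheme" as the transformation Y = exp(-X) with X following a Burr-XII law with CDF 1 - (1 + x^c)^(-k) (our assumption; the paper's exact construction may differ), the resulting unit distribution can be sketched as follows.

        import numpy as np

        def unit_burr_xii_cdf(y, c, k):
            # CDF on (0, 1) implied by Y = exp(-X), X ~ Burr XII with F(x) = 1 - (1 + x^c)^(-k).
            y = np.asarray(y, dtype=float)
            return (1.0 + (-np.log(y)) ** c) ** (-k)

        def unit_burr_xii_quantile(u, c, k):
            # Inverse CDF, useful both for sampling and for quantile regression.
            u = np.asarray(u, dtype=float)
            return np.exp(-((u ** (-1.0 / k) - 1.0) ** (1.0 / c)))

        # Sampling by inversion and a quick check against the CDF at y = 0.5.
        rng = np.random.default_rng(2)
        sample = unit_burr_xii_quantile(rng.uniform(size=100_000), c=2.0, k=1.5)
        print(np.mean(sample <= 0.5), unit_burr_xii_cdf(0.5, c=2.0, k=1.5))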

    Causal Information Rate

    Information processing is common in complex systems, and information geometric theory provides a useful tool to elucidate the characteristics of non-equilibrium processes, such as rare, extreme events, from the perspective of geometry. In particular, their time evolution can be characterized by the rate (information rate) at which new information is revealed (a new statistical state is accessed). In this paper, we extend this concept and develop a new information-geometric measure of causality by calculating the effect of one variable on the information rate of the other variable. We apply the proposed causal information rate to the Kramers equation and compare it with the entropy-based causality measure (information flow). Overall, the causal information rate is a sensitive method for identifying causal relations.
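    The abstract does not spell out the definitions; assuming the underlying information rate is the usual information-geometric quantity Gamma(t)^2 = integral of p (d/dt ln p)^2 dx, it can be estimated from two nearby snapshots of a density as in the sketch below. The causal version proposed in the paper, which isolates the effect of one variable on another, is not reproduced here.

        import numpy as np

        def information_rate_sq(p_t, p_t_dt, dt, dx):
            # Estimate Gamma(t)^2 = sum over the grid of p * (d ln p / dt)^2 * dx.
            eps = 1e-30
            dlnp_dt = (np.log(p_t_dt + eps) - np.log(p_t + eps)) / dt
            return np.sum(p_t * dlnp_dt ** 2) * dx

        # Toy check: a unit-width Gaussian drifting at speed v has Gamma^2 = v^2 / sigma^2.
        x = np.linspace(-10.0, 10.0, 4001)
        dx, dt, v, sigma = x[1] - x[0], 1e-4, 1.0, 1.0

        def gauss(mu):
            return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

        print(information_rate_sq(gauss(0.0), gauss(v * dt), dt, dx))   # ~ 1.0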