
    Multi Agent Modelling: Evolution and Skull Thickness in Hominids

    Within human evolution, the period of Homo erectus is particularly interesting because these ancestors carried thicker skulls than the species both before and after them. There are competing theories about the reasons for this thickening and its later reversal. One theory holds that Homo erectus males fought over females by clubbing each other on the head. Another holds that because Homo erectus did not cook its food, it needed strong jaw muscles attached to ridges on either side of the skull; these ridges inhibited brain and skull growth while requiring the skull to be thick. The later re-thinning of the skull, in turn, may have occurred because a thick skull provided poor cooling for the brain, or because once hominids began cutting their food with tools and cooking it with fire, they no longer needed strong jaw muscles, so the trait was selected against: the ridges and the thick skull were constraining a brain that tended to grow. In this paper we simulate both fighting and diet as mechanisms by which the hominid skull grew thicker. We also give our agents additional properties such as cooperation, selfishness, and vision, and analyze how these change over generations. Keywords: Evolution, Skull Thickness, Hominids, Multi-Agent Modeling, Genetic Algorithm
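
    The abstract does not include the authors' model, but the setup it describes maps naturally onto a toy genetic algorithm. The sketch below is a minimal, hypothetical illustration, not the paper's implementation: the trait encodings, fitness weights, and the mid-run diet switch are all assumptions. Each agent carries heritable traits in [0, 1]; a raw diet rewards a thick skull, while a cooked diet penalizes its cost.

        import random

        def make_agent():
            # Heritable traits in [0, 1]; encodings are illustrative only.
            return {"skull": random.random(), "cooperation": random.random(),
                    "selfishness": random.random(), "vision": random.random()}

        def fitness(agent, cooked_diet):
            # Generic survival value; "selfishness" drifts neutrally in this toy.
            score = agent["vision"] + agent["cooperation"]
            if cooked_diet:
                score -= 0.5 * agent["skull"]  # thick skull costly once jaws can be weak
            else:
                score += 0.8 * agent["skull"]  # raw diet / head-clubbing favor thick skulls
            return score

        def next_generation(pop, cooked_diet, mutation=0.05):
            pop = sorted(pop, key=lambda a: fitness(a, cooked_diet), reverse=True)
            parents = pop[: len(pop) // 2]      # truncation selection
            children = []
            while len(children) < len(pop):
                mom, dad = random.sample(parents, 2)
                child = {t: random.choice((mom[t], dad[t])) for t in mom}  # uniform crossover
                for t in child:                  # small Gaussian mutation, clipped to [0, 1]
                    child[t] = min(1.0, max(0.0, child[t] + random.gauss(0.0, mutation)))
                children.append(child)
            return children

        pop = [make_agent() for _ in range(200)]
        for gen in range(300):
            pop = next_generation(pop, cooked_diet=(gen >= 150))  # diet switches mid-run
            if gen % 50 == 0:
                mean_skull = sum(a["skull"] for a in pop) / len(pop)
                print(f"gen {gen}: mean skull thickness {mean_skull:.2f}")

    Under these assumed selection pressures, mean skull thickness should rise while the diet is raw and decline after cooking is introduced, mirroring the thickening-then-thinning pattern the paper investigates.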

    Zigzag Codes: MDS Array Codes with Optimal Rebuilding

    MDS array codes are widely used in storage systems to protect data against erasures. We address the rebuilding ratio problem: in the case of erasures, what fraction of the remaining information must be accessed in order to rebuild exactly the lost information? It is clear that when the number of erasures equals the maximum number of erasures that an MDS code can correct, the rebuilding ratio is 1 (all of the remaining information is accessed). However, the more interesting and practical case is when the number of erasures is smaller than the erasure-correcting capability of the code. For example, consider an MDS code that can correct two erasures: what is the smallest amount of information that must be accessed in order to correct a single erasure? Previous work showed that the rebuilding ratio is bounded between 1/2 and 3/4, but the exact value was left as an open problem. In this paper, we solve this open problem and prove that for the case of a single erasure with a 2-erasure-correcting code, the rebuilding ratio is 1/2. In general, we construct a new family of r-erasure-correcting MDS array codes that has an optimal rebuilding ratio of e/r in the case of e erasures, 1 ≤ e ≤ r. Our array codes have efficient encoding and decoding algorithms (for the case r = 2 they use a finite field of size 3) and an optimal update property. Comment: 23 pages, 5 figures, submitted to IEEE Transactions on Information Theory
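
    To make the 1/2 ratio concrete, here is a small, hypothetical sketch in the spirit of the paper's zigzag construction for r = 2. It uses plain XOR rather than the GF(3) coefficients the paper needs for the full MDS property, so it only demonstrates the single-erasure rebuild and its access ratio: rows are indexed by k-bit vectors, one parity column holds row sums, the other follows a "zigzag" in which column j contributes its symbol from row l XOR e_j, and a lost data column is rebuilt while touching only rows whose j-th bit is 0, i.e. half of every surviving column.

        import random

        k = 3                        # data columns; rows are indexed by k-bit vectors
        rows = 1 << k
        data = [[random.randrange(256) for _ in range(k)] for _ in range(rows)]

        # Parity column 1: row sums. Parity column 2: zigzag sums, where column j
        # contributes its symbol from row l ^ e_j (with e_j = 1 << j).
        row_par = [0] * rows
        zig_par = [0] * rows
        for l in range(rows):
            for j in range(k):
                row_par[l] ^= data[l][j]
                zig_par[l] ^= data[l ^ (1 << j)][j]

        lost = 1                     # erase data column 1, then rebuild it
        accessed = set()             # distinct surviving symbols read during rebuild
        rebuilt = [0] * rows
        for l in range(rows):
            if (l >> lost) & 1 == 0:          # rows with `lost` bit 0: use row parity
                val = row_par[l]
                accessed.add(("row_par", l))
                for i in range(k):
                    if i != lost:
                        val ^= data[l][i]
                        accessed.add(("data", l, i))
            else:                             # rows with `lost` bit 1: use a zigzag
                m = l ^ (1 << lost)           # zigzag index with the `lost` bit cleared
                val = zig_par[m]
                accessed.add(("zig_par", m))
                for i in range(k):
                    if i != lost:
                        # row m ^ e_i also has `lost` bit 0, so these reads are
                        # shared with the row-parity branch and deduplicated.
                        val ^= data[m ^ (1 << i)][i]
                        accessed.add(("data", m ^ (1 << i), i))
            rebuilt[l] = val

        assert rebuilt == [data[l][lost] for l in range(rows)]
        remaining = (k + 1) * rows   # (k - 1) surviving data columns + 2 parity columns
        print(len(accessed) / remaining)   # -> 0.5, the optimal e/r for e = 1, r = 2

    The savings comes from the overlap: the symbols fetched for the row-parity equations are reused by the zigzag equations, so each surviving column is read on only half of its rows, whereas a naive rebuild would read every remaining symbol.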

    IDENTIFYING IT SOLUTIONS ON FRAUD IN ELECTRONIC TRANSACTIONS OF FUNDS FROM BANKING SYSTEM

    Although we hear about fraud daily, most incidents are never reported. Some reports estimate that approximately 90% of attacks are not disclosed outside the organizations that were attacked, and only a fraction of reported cases end in punishment. In fact, for fear of losing customers, some companies (usually banks and large corporations) prefer to come to an understanding with the attackers, recovering part of the stolen money in exchange for keeping silent. The development and modernization of the world's economies over the last four decades coincided with a global expansion of banking that was strongly shaped by the introduction of new computer technology, which had a strong impact on both providers and consumers. Keywords: IT security, fraud, electronic transactions, banking system

    Diffusion Adaptation over Networks under Imperfect Information Exchange and Non-stationary Data

    Adaptive networks rely on in-network, collaborative processing among distributed agents to deliver enhanced performance in estimation and inference tasks. Information is exchanged among the nodes, usually over noisy links. The combination weights that the nodes use to fuse information from their neighbors play a critical role in shaping the adaptation and tracking abilities of the network. This paper first investigates the mean-square performance of general adaptive diffusion algorithms in the presence of various sources of imperfect information exchange, quantization errors, and model non-stationarities. Among other results, the analysis reveals that link noise over the regression data modifies the dynamics of the network evolution in a distinct way and leads to biased estimates in steady state. The analysis also reveals how the network mean-square performance depends on the combination weights. We use these observations to show how the combination weights can be optimized and adapted. Simulation results illustrate and match well with the theoretical findings. Comment: 36 pages, 7 figures, to appear in IEEE Transactions on Signal Processing, June 2012
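
    The abstract names the algorithm class but not its equations. The following is a minimal sketch of one common member of that class, adapt-then-combine (ATC) diffusion LMS, with additive noise injected on the exchanged estimates to mimic imperfect links; the network size, step size, noise levels, ring topology, and uniform combination weights are all illustrative assumptions, not the paper's setup.

        import numpy as np

        rng = np.random.default_rng(0)
        N, M = 10, 4                       # number of nodes, dimension of w (illustrative)
        w_true = rng.standard_normal(M)    # common parameter vector to be estimated
        mu = 0.01                          # LMS step size
        link_noise = 0.05                  # std of additive noise on exchanged estimates
        W = np.zeros((N, M))               # one estimate per node

        for i in range(2000):
            # Adaptation step: each node takes one LMS step on its own streaming data.
            psi = np.empty((N, M))
            for k in range(N):
                u = rng.standard_normal(M)                    # regression vector
                d = u @ w_true + 0.1 * rng.standard_normal()  # noisy measurement
                psi[k] = W[k] + mu * (d - u @ W[k]) * u
            # Combination step: fuse neighbors' intermediate estimates over noisy links.
            for k in range(N):
                nbrs = [(k - 1) % N, k, (k + 1) % N]          # ring topology
                fused = np.zeros(M)
                for l in nbrs:
                    est = psi[l] if l == k else psi[l] + link_noise * rng.standard_normal(M)
                    fused += est / len(nbrs)                  # uniform combination weights
                W[k] = fused

        msd = np.mean(np.sum((W - w_true) ** 2, axis=1))
        print(f"network mean-square deviation after adaptation: {msd:.4f}")

    Consistent with the paper's theme, the noise injected on the exchanged estimates keeps the steady-state error from vanishing, and the combination weights (uniform here) are the natural quantity to optimize and adapt.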