
    A Neural System for Automated CCTV Surveillance

    This paper overviews a new system, the “Owens Tracker,” for automated identification of suspicious pedestrian activity in a car-park. Centralized CCTV systems relay multiple video streams to a central point for monitoring by an operator. The operator receives a continuous stream of information, mostly related to normal activity, making it difficult to maintain concentration at a sufficiently high level. While it is difficult to place quantitative boundaries on the number of scenes and the time period over which effective monitoring can be performed, Wallace and Diffley [1] give some guidance, based on empirical and anecdotal evidence, suggesting that the number of cameras monitored by an operator be no greater than 16, and that the period of effective monitoring may be as low as 30 minutes before recuperation is required. An intelligent video surveillance system should therefore act as a filter, censoring inactive scenes and scenes showing normal activity. By presenting the operator only with unusual activity, his/her attention is effectively focussed, and the ratio of cameras to operators can be increased. The Owens Tracker learns to recognize environment-specific normal behaviour, and refers sequences of unusual behaviour for operator attention. The system was developed using standard low-resolution CCTV cameras operating in the car-parks of Doxford Park Industrial Estate (Sunderland, Tyne and Wear), and targets unusual pedestrian behaviour. The modus operandi of the system is to highlight excursions from a learned model of normal behaviour in the monitored scene. The system tracks objects and extracts their centroids; behaviour is defined as the trajectory traced by an object centroid, and normality as the trajectories typically encountered in the scene. The essential stages in the system are: segmentation of objects of interest; disambiguation and tracking of multiple contacts, including the handling of occlusion and noise, and successful tracking of objects that “merge” during motion; and identification of unusual trajectories. These three stages are discussed in more detail in the following sections, and the system performance is then evaluated.
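
    The three-stage pipeline described above (segmentation, centroid extraction, trajectory novelty scoring) can be illustrated with a minimal Python/OpenCV sketch. This is not the Owens Tracker's implementation: the background-subtraction model, blob-size threshold, and nearest-trajectory novelty score below are illustrative assumptions only.

        import cv2
        import numpy as np

        backsub = cv2.createBackgroundSubtractorMOG2()  # stage 1: segmentation

        def extract_centroids(frame):
            """Segment moving objects in a frame and return their centroids."""
            mask = backsub.apply(frame)
            mask = cv2.medianBlur(mask, 5)  # suppress salt-and-pepper noise
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            centroids = []
            for c in contours:
                m = cv2.moments(c)
                if m["m00"] > 100:  # hypothetical minimum blob area
                    centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
            return centroids

        def trajectory_novelty(trajectory, normal_trajectories):
            """Stage 3 (illustrative): distance from an observed trajectory to
            the nearest trajectory learned as 'normal'; all trajectories are
            assumed resampled to a common length. Large scores flag sequences
            of unusual behaviour for operator attention."""
            traj = np.asarray(trajectory)
            return min(float(np.mean(np.linalg.norm(traj - np.asarray(t), axis=1)))
                       for t in normal_trajectories)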

    Unilateral Auditory Temporal Resolution Deficit: A Case Study

    An adult with a unilateral precipitous severe high-frequency hearing loss displayed a selective auditory temporal resolution deficit in the poorer ear despite excellent word recognition ability in quiet bilaterally. Word recognition performance was inferior in interrupted noise, reverberation, and time-compression conditions when stimuli were presented to the hearing-impaired ear and compared with performance for stimuli presented to the normal-hearing ear or with that of normal-hearing listeners. It was suggested that a restricted listening bandwidth was responsible for the performance decrement on the tasks involving temporal resolution. This case illustrates the importance of employing temporal resolution tasks in an audiologic test battery. Such assessment tools may reveal deficits that otherwise may go unnoticed in light of excellent word recognition in quiet. Educational Objectives: After reading this article the reader will be able to (1) appreciate the effect of high-frequency hearing loss on temporal resolution and (2) realize the importance of utilizing temporal resolution tasks in an audiologic test battery.

    Alumni Recital: Andrew Carpenter, alto saxophone


    Network Architecture and Mutual Monitoring in Public Goods Experiments

    Recent experiments show that public goods can be provided at high levels when mutual monitoring and costly punishment are allowed. All these experiments, however, study monitoring and punishment in a setting where all agents can monitor and punish each other (i.e., in a complete network). The architecture of social networks becomes important when individuals can only monitor and punish the other individuals to whom they are connected by the network. We study several non-trivial network architectures that give rise to their own distinctive patterns of behavior. Nevertheless, a number of simple, yet fundamental, properties in graph theory allow us to interpret the variation in the patterns of behavior that arise in the laboratory and to explain the impact of network architecture on the efficiency and dynamics of the experimental outcomes.

    Keywords: experiment, networks, public good, monitoring, punishment
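
    As a concrete illustration of how network architecture constrains mutual monitoring, the short sketch below (not from the paper; the player count and the two architectures are assumed for the example) represents "who can monitor and punish whom" as a directed graph and compares each player's number of monitors under a complete network versus a circle network.

        def in_degrees(edges, players):
            """Count how many others can monitor each player.
            edges: iterable of (monitor, monitored) pairs."""
            deg = {p: 0 for p in players}
            for _, monitored in edges:
                deg[monitored] += 1
            return deg

        players = range(4)
        complete = [(i, j) for i in players for j in players if i != j]
        circle = [(i, (i + 1) % 4) for i in players]
        print(in_degrees(complete, players))  # every player has 3 monitors
        print(in_degrees(circle, players))    # every player has exactly 1 monitor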

    Active filter current compensation for transmission optimisation

    This dissertation is based on the fact that any m-wire electrical system can be modelled, when viewed from any node, as m equivalent Thevenin voltages and impedances. The dissertation describes how to calculate the optimal distribution of currents so that a specific amount of power can flow through the network to the equivalent Thevenin voltages with minimal losses. The optimal current distribution uses a recently patented method in which the optimal current for each wire is obtained from the Thevenin parameters and the power flow at any instant in time at any node. Once the ideal currents are found, active and passive devices can inject a specific amount of power (positive or negative) to compensate the existing currents. The focus is on proof of concept through simulations and physical experiments, extending beyond the work specifically described in the patent, with particular emphasis on applying the optimisation to active compensation. It is explained and shown how this can be implemented using the Malengret and Gaunt method. This method reduces cost in applications where not all of the current needs to be processed through a converter (e.g. an inverter), but only the difference between the existing and desired optimal currents. A smaller shunt parallel converter can then achieve the ideal current flow without the need to interrupt the existing currents, as described in the patent. The methodology is explained and demonstrated by simulation.
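
    The optimisation can be sketched for the special case of purely resistive Thevenin impedances (an assumption made for this example; the patented method is more general): minimise the total loss sum_k r_k i_k^2 subject to delivering a fixed power P = sum_k v_k i_k. The Lagrange conditions give i_k = lambda * v_k / (2 r_k) with lambda = 2P / sum_k (v_k^2 / r_k).

        import numpy as np

        def optimal_currents(v, r, P):
            """Loss-minimising current distribution over m wires.
            v: Thevenin voltages, r: Thevenin resistances (assumed purely real),
            P: power to deliver. Follows the Lagrange conditions above."""
            v = np.asarray(v, dtype=float)
            r = np.asarray(r, dtype=float)
            lam = 2.0 * P / np.sum(v**2 / r)
            return lam * v / (2.0 * r)

        # Hypothetical 3-wire example delivering 1 kW.
        v = [230.0, 230.0, 5.0]
        i_opt = optimal_currents(v, r=[0.5, 0.5, 0.5], P=1000.0)
        print(i_opt, np.dot(v, i_opt))  # optimal currents; delivered power ~ 1000 W

    A shunt compensator in this picture would inject only the difference between the existing currents and i_opt, rather than processing the full current through the converter.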

    CHAMBERS V. MISSISSIPPI: THE HEARSAY RULE AND RACIAL EVALUATIONS OF CREDIBILITY


    Truth and Reference: Some Doubts about Formal Semantics

    Formal semantics might be understood as the attempt to show that the most fruitful theories about a natural language are based upon a formalized specification of that language’s structure. Since it hopes to provide the basis for theories about language, formal semantics is obviously concerned with notions like grammaticality, reference, and meaning. Just as a system of formal logic attempts to give a formal account of our intuitions about the validity of informal arguments, so does the formalization of semantics attempt to systematize and make rigorous our intuitions about, for example, the grammaticality, significance, synonymy, reference, and truth value of certain expressions in a natural language. The ultimate aim of formal semantics might be thought of as constructing an account of meaning analogous to the logical account of validity, viz. one that would provide necessary and sufficient conditions for determining the meaning of any expression in the natural language, or portion of a natural language, that is being formalized.

    Rank-normalization, folding, and localization: An improved $\widehat{R}$ for assessing convergence of MCMC

    Markov chain Monte Carlo is a key computational tool in Bayesian statistics, but it can be challenging to monitor the convergence of an iterative stochastic algorithm. In this paper we show that the convergence diagnostic $\widehat{R}$ of Gelman and Rubin (1992) has serious flaws. Traditional $\widehat{R}$ will fail to correctly diagnose convergence failures when the chain has a heavy tail or when the variance varies across the chains. In this paper we propose an alternative rank-based diagnostic that fixes these problems. We also introduce a collection of quantile-based local efficiency measures, along with a practical approach for computing Monte Carlo error estimates for quantiles. We suggest that common trace plots should be replaced with rank plots from multiple chains. Finally, we give recommendations for how these methods should be used in practice.

    Comment: Minor revision for improved clarity
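
    The rank-based construction can be sketched in a few lines of NumPy/SciPy: pool the draws across chains, replace them by their ranks, map the ranks through the normal quantile function, and apply the classic split-$\widehat{R}$ to the transformed draws. This is a minimal sketch following the paper's published formulas (including the fractional-rank offset); the folded variant used for the tail diagnostic is noted in a comment but not shown.

        import numpy as np
        from scipy.stats import norm, rankdata

        def rank_normalize(draws):
            """Rank-normalize pooled draws; draws has shape (chains, iterations)."""
            S = draws.size
            ranks = rankdata(draws, method="average").reshape(draws.shape)
            return norm.ppf((ranks - 3.0 / 8.0) / (S + 1.0 / 4.0))

        def split_rhat(draws):
            """Classic split-R-hat: split each chain in half, then compare
            between- and within-chain variances."""
            m, n = draws.shape
            half = n // 2
            chains = np.vstack([draws[:, :half], draws[:, half:2 * half]])
            means = chains.mean(axis=1)
            B = half * means.var(ddof=1)           # between-chain variance
            W = chains.var(axis=1, ddof=1).mean()  # within-chain variance
            var_hat = (half - 1) / half * W + B / half
            return np.sqrt(var_hat / W)

        def rank_normalized_rhat(draws):
            """Bulk diagnostic: split-R-hat of rank-normalized draws.
            (The tail diagnostic applies the same recipe to draws folded
            around their median; not shown here.)"""
            return split_rhat(rank_normalize(draws))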
