
    Distributed Object Tracking Using a Cluster-Based Kalman Filter in Wireless Camera Networks

    Local data aggregation is an effective means to save sensor node energy and prolong the lifespan of wireless sensor networks. However, when a sensor network is used to track moving objects, the task of local data aggregation in the network presents a new set of challenges, such as the necessity to estimate, usually in real time, the constantly changing state of the target based on information acquired by the nodes at different time instants. To address these issues, we propose a distributed object tracking system which employs a cluster-based Kalman filter in a network of wireless cameras. When a target is detected, cameras that can observe the same target interact with one another to form a cluster and elect a cluster head. Local measurements of the target acquired by members of the cluster are sent to the cluster head, which then estimates the target position via Kalman filtering and periodically transmits this information to a base station. The underlying clustering protocol allows the current state and uncertainty of the target position to be easily handed off among clusters as the object is being tracked. This allows Kalman filter-based object tracking to be carried out in a distributed manner. An extended Kalman filter is necessary since measurements acquired by the cameras are related to the actual position of the target by nonlinear transformations. In addition, in order to take into consideration the time uncertainty in the measurements acquired by the different cameras, it is necessary to introduce nonlinearity in the system dynamics. Our object tracking protocol requires the transmission of significantly fewer messages than a centralized tracker that naively transmits all of the local measurements to the base station. It is also more accurate than a decentralized tracker that employs linear interpolation for local data aggregation. Besides, the protocol is able to perform real-time estimation because our implementation takes into consideration the sparsity of the matrices involved in the problem. The experimental results show that our distributed object tracking protocol is able to achieve tracking accuracy comparable to the centralized tracking method, while requiring a significantly smaller number of message transmissions in the network.
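The cluster head's estimation step can be illustrated with a short sketch. The following is a minimal extended Kalman filter predict/update in Python, assuming a constant-velocity planar state and a bearing-only camera measurement; the state model, measurement model, and noise values are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

# Illustrative EKF step for camera-based tracking: linear constant-velocity
# prediction followed by an update with a nonlinear (bearing-only) camera
# measurement. All models and noise levels are assumptions for illustration.

def ekf_step(x, P, z, cam_pos, dt, q=1e-2, r=1e-3):
    """One EKF predict/update. x = [px, py, vx, vy], z = bearing (rad)."""
    # Predict with a linear constant-velocity model.
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    x = F @ x
    P = F @ P @ F.T + q * np.eye(4)

    # Nonlinear measurement: bearing from the camera to the target.
    dx, dy = x[0] - cam_pos[0], x[1] - cam_pos[1]
    h = np.arctan2(dy, dx)
    d2 = dx ** 2 + dy ** 2
    H = np.array([[-dy / d2, dx / d2, 0.0, 0.0]])  # Jacobian of h w.r.t. state

    # Standard EKF update.
    S = H @ P @ H.T + r
    K = P @ H.T / S
    x = x + (K * (z - h)).ravel()
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Example: a camera at the origin observes the target at 45 degrees.
x1, P1 = ekf_step(np.array([1.0, 0.0, 0.0, 0.0]), np.eye(4),
                  z=np.pi / 4, cam_pos=(0.0, 0.0), dt=0.1)
```

In the protocol described above, the cluster head would apply updates of this kind to the members' measurements and then hand the resulting state and covariance off to the next cluster head.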

    Gossip Algorithms for Distributed Signal Processing

    Gossip algorithms are attractive for in-network processing in sensor networks because they do not require any specialized routing, there is no bottleneck or single point of failure, and they are robust to unreliable wireless network conditions. Recently, there has been a surge of activity in the computer science, control, signal processing, and information theory communities, developing faster and more robust gossip algorithms and deriving theoretical performance guarantees. This article presents an overview of recent work in the area. We describe convergence rate results, which are related to the number of transmitted messages and thus the amount of energy consumed in the network for gossiping. We discuss issues related to gossiping over wireless links, including the effects of quantization and noise, and we illustrate the use of gossip algorithms for canonical signal processing tasks including distributed estimation, source localization, and compression.
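As background for the kind of algorithm surveyed here, the following toy sketch shows randomized pairwise gossip averaging on a small ring network; the topology, initial values, and update schedule are illustrative assumptions, not taken from the article.

```python
import random

# Toy sketch of randomized pairwise gossip for distributed averaging: at each
# step a random node picks a random neighbour, and the pair replaces their
# values with the pairwise average. Values converge to the global average
# without any routing or fusion centre. Ring topology and values are made up.

def gossip_average(values, neighbours, iterations=10_000, seed=0):
    rng = random.Random(seed)
    x = list(values)
    for _ in range(iterations):
        i = rng.randrange(len(x))
        j = rng.choice(neighbours[i])
        avg = 0.5 * (x[i] + x[j])
        x[i] = x[j] = avg
    return x

# Example: 8 nodes on a ring, each starting with a local sensor reading.
n = 8
readings = [float(k) for k in range(n)]
ring = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
print(gossip_average(readings, ring))  # all entries end up close to 3.5
```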

    Security and Privacy Issues in Cloud Computing

    Cloud computing is transforming the way information technology (IT) is consumed and managed, promising improved cost efficiencies, accelerated innovation, faster time-to-market, and the ability to scale applications on demand (Leighton, 2009). According to Gartner, while the hype grew exponentially during 2008 and has continued since, it is clear that there is a major shift towards the cloud computing model and that the benefits may be substantial (Gartner Hype-Cycle, 2012). However, as cloud computing is still emerging and developing rapidly, both conceptually and in reality, the legal/contractual, economic, service quality, interoperability, security and privacy issues still pose significant challenges. In this chapter, we describe various service and deployment models of cloud computing and identify major challenges. In particular, we discuss three critical challenges: regulatory, security and privacy issues in cloud computing. Some solutions to mitigate these challenges are also proposed, along with a brief presentation of future trends in cloud computing deployment.

    Gravitational waves: search results, data analysis and parameter estimation

    The Amaldi 10 Parallel Session C2 on gravitational wave (GW) search results, data analysis and parameter estimation included three lively sessions of lectures by 13 presenters, and 34 posters. The talks and posters covered a huge range of material, including results and analysis techniques for ground-based GW detectors, targeting anticipated signals from different astrophysical sources: compact binary inspiral, merger and ringdown; GW bursts from intermediate mass binary black hole mergers, cosmic string cusps, core-collapse supernovae, and other unmodeled sources; continuous waves from spinning neutron stars; and a stochastic GW background. There was considerable emphasis on Bayesian techniques for estimating the parameters of coalescing compact binary systems from the gravitational waveforms extracted from the data from the advanced detector network. This included methods to distinguish deviations of the signals from what is expected in the context of General Relativity.

    Development and Application of the Spherical Harmonic Veto Definer for Gravitational-Wave Transient Search

    The rapid analysis of gravitational-wave data is not trivial for many reasons, such as the non-Gaussian, non-stationary nature of LIGO detector noise and the lack of exhaustive waveform models. Non-Gaussian, non-stationary noise and instrumental artifacts are known as 'glitches'. X-Pipeline Spherical Radiometer (X-SphRad) is a software package designed for performing autonomous searches for unmodelled gravitational-wave bursts. X-SphRad takes an approach based on spherical radiometry, transforming time-series data streams into the spherical harmonic domain. Spherical harmonic coefficients show potential for discriminating glitches from signals. For my Ph.D. thesis, I evaluated and implemented a tool for glitch rejection called the Spherical Harmonic Veto Definer (SHaVeD). SHaVeD is a Matlab script that loads the spherical harmonic coefficients computed by X-SphRad and computes a statistical threshold to apply. The threshold is used to identify each glitch's GPS time and create a one-second cut around it. SHaVeD saves this information in a two-column file where the first column is the GPS start time of the cut and the second is the end time. X-SphRad can include SHaVeD as a data-quality input to veto glitches. The tool was tested with X-SphRad and the coherent WaveBurst (cWB) pipeline over the O2 observation run. Results show how the inclusion of SHaVeD in the analysis could allow some of the thresholds used in this type of search to be lowered. Tests show that SHaVeD reduced the amplitude of the loudest false event by a factor of 3, meaning that it rejected false events in a volume 9 times greater than usual.
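The abstract fixes the output format (a two-column file of GPS start and end times, with a one-second cut around each glitch) but not the exact statistic used for the threshold. The sketch below, in Python rather than Matlab, uses a median-plus-k·MAD rule as a stand-in; the rule, the variable names, and the energy time series are assumptions, not SHaVeD's actual implementation.

```python
import numpy as np

# Illustrative sketch of the veto-definer idea described above: flag times
# where a spherical-harmonic coefficient energy exceeds a threshold and write
# a two-column segment file (GPS start, GPS end) with a one-second cut around
# each flagged time. The median + k*MAD threshold is an assumed stand-in.

def write_veto_segments(gps_times, sh_energy, out_path, k=5.0, half_window=0.5):
    sh_energy = np.asarray(sh_energy, dtype=float)
    med = np.median(sh_energy)
    mad = np.median(np.abs(sh_energy - med))
    threshold = med + k * mad

    glitch_times = np.asarray(gps_times, dtype=float)[sh_energy > threshold]
    with open(out_path, "w") as f:
        for t in glitch_times:
            # One-second cut centred on the glitch time.
            f.write(f"{t - half_window:.3f} {t + half_window:.3f}\n")
    return threshold, glitch_times
```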

    Tools and Technologies for Enabling Characterisation in Synthetic Biology

    Synthetic Biology represents a movement to utilise biological organisms for novel applications through the use of rigorous engineering principles. These principles rely on a solid and well versed understanding of the underlying biological components and functions (relevant to the application). In order to achieve this understanding, reliable behavioural and contextual information is required (more commonly known as characterisation data). Focussing on lowering the barrier of entry for current research facilities to regularly and easily perform characterisation assays will directly improve the communal knowledge base for Synthetic Biology and enable the further application of rational engineering principles. Whilst characterisation remains a fundamental principle for Synthetic Biology research, the high time costs, subjective measurement protocols, and ambiguous data analysis specifications deter regular performance of characterisation assays. Vitally, this prevents the valid application of many of the key Synthetic Biology processes that have been derived to improve research yield (with regard to solving application problems) and directly prevents the intended goal of addressing the ad hoc nature of modern research from being realised. Designing new technologies and tools to facilitate rapid ‘hands off’ characterisation assays for research facilities will improve the uptake of characterisation within the research pipeline. To achieve this, two core problem areas were identified that limit current characterisation attempts in conventional research. Therefore, it was the primary aim of this investigation to overcome these two core problems to promote regular characterisation. The first issue identified as preventing the regular use of characterisation assays was the user-intensive methodologies and technologies available to researchers. There is currently no standardised characterisation equipment for assaying samples, and the methodologies are heavily dependent on the researcher and their application for successful and complete characterisation. This study proposed a novel high-throughput solution to the characterisation problem that was capable of low-cost, concurrent, and rapid characterisation of simple biological DNA elements. By combining in vitro transcription-translation with microfluidics, a potent solution to the characterisation problem was proposed. By utilising a completely in vitro approach along with the excellent control abilities of microfluidic technologies, a prototype platform for high-throughput characterisation was developed. The second issue identified was the lack of flexible, versatile software designed specifically for the data handling needs that are quickly arising within the characterisation speciality. The lack of general solutions in this area is problematic because of the increasing amount of data that is both required and generated for the characterisation output to be considered as rigorous and of value. To alleviate this issue, a novel framework for laboratory data handling was developed that employs a plugin strategy for data submission and analysis. Employing a plugin strategy improves the shelf life of data handling software by allowing it to grow with the needs of the speciality. Another advantage of this strategy is the increased ability for well documented processing and analysis standards to arise that are available to all researchers.
Finally, the software provided a powerful and flexible data storage schema that allowed all currently conceivable characterisation data types to be stored in a well-documented manner. The two solutions identified within this study increase the number of enabling tools and technologies available to researchers within Synthetic Biology, which in turn will increase the uptake of regular characterisation. Consequently, this will potentially improve the lateral transfer of knowledge between research projects and reduce the need to perform ad hoc experiments to investigate facets of the fundamental biological components being utilised.
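As an illustration of the plugin strategy mentioned above, the following minimal sketch registers analysis plugins against a small data-handling core so that new data types and analyses can be added without modifying the core; the registry, the example plugin, and all names are hypothetical, since the thesis's actual software interface is not described here.

```python
import math
from typing import Callable, Dict

# Hypothetical plugin-based data-handling core: analysis plugins register
# themselves by name and the core dispatches submitted records to them.

ANALYSIS_PLUGINS: Dict[str, Callable[[dict], dict]] = {}

def register_plugin(name: str):
    """Decorator that adds an analysis function to the plugin registry."""
    def decorator(func: Callable[[dict], dict]) -> Callable[[dict], dict]:
        ANALYSIS_PLUGINS[name] = func
        return func
    return decorator

@register_plugin("growth_rate")
def growth_rate(record: dict) -> dict:
    """Toy plugin: estimate doubling time from two OD600 readings."""
    od0, od1, hours = record["od_start"], record["od_end"], record["hours"]
    rate = math.log(od1 / od0) / hours
    return {"doubling_time_h": math.log(2) / rate}

def analyse(name: str, record: dict) -> dict:
    """Core dispatch: look up the named plugin and run it on the record."""
    return ANALYSIS_PLUGINS[name](record)

# Example submission.
print(analyse("growth_rate", {"od_start": 0.05, "od_end": 0.4, "hours": 3.0}))
```

The shelf-life argument in the abstract maps directly onto this structure: the core (`analyse` and the registry) never changes, while new measurement types and analyses arrive as additional registered functions.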

    Smartphone-based crowdsourcing for estimating the bottleneck capacity in wireless networks

    Crowdsourcing enables the fine-grained characterization and performance evaluation of today's large-scale networks using the power of the masses and distributed intelligence. This paper presents SmartProbe, a system that assesses the bottleneck capacity of Internet paths using smartphones, from a mobile crowdsourcing perspective. With SmartProbe, measurement activities are more bandwidth-efficient than in similar systems, and a larger number of users can be supported. An application based on SmartProbe is also presented: georeferenced measurements are mapped and used to compare the performance of mobile broadband operators over wide areas. Results from one year of operation are included.
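The abstract does not spell out SmartProbe's measurement method, so the sketch below shows the classic packet-pair dispersion technique often used for bottleneck capacity estimation, purely as background; the probe size and gap values are illustrative.

```python
# Background sketch of packet-pair dispersion: two back-to-back probe packets
# of size L leave the bottleneck (narrow) link spaced by L/C seconds, so the
# capacity can be estimated as C ≈ L / dispersion. This is a generic
# illustration, not SmartProbe's actual algorithm.

def capacity_from_packet_pairs(packet_size_bytes, arrival_gaps_s):
    """Estimate bottleneck capacity (bit/s) from packet-pair arrival gaps."""
    gaps = sorted(arrival_gaps_s)
    # Use the median gap to reduce the effect of cross-traffic noise.
    median_gap = gaps[len(gaps) // 2]
    return 8 * packet_size_bytes / median_gap

# Example: 1500-byte probes with gaps around 1.2 ms -> roughly 10 Mbit/s.
gaps = [0.00121, 0.00118, 0.00125, 0.00119, 0.00122]
print(f"{capacity_from_packet_pairs(1500, gaps) / 1e6:.1f} Mbit/s")
```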

    Bayesian coherent analysis of in-spiral gravitational wave signals with a detector network

    The present operation of the ground-based network of gravitational-wave laser interferometers in "enhanced" configuration brings the search for gravitational waves into a regime where detection is highly plausible. Developing techniques that allow us to discriminate a signal of astrophysical origin from instrumental artefacts in the interferometer data and to extract the full range of information is among the primary goals of the current work. Here we report the details of a Bayesian approach to the problem of inference for gravitational-wave observations using a network of instruments, for the computation of the Bayes factor between two hypotheses and the evaluation of the marginalised posterior density functions of the unknown model parameters. The numerical algorithm to tackle the notoriously difficult problem of the evaluation of large multi-dimensional integrals is based on a technique known as Nested Sampling, which provides an attractive alternative to more traditional Markov-chain Monte Carlo (MCMC) methods. We discuss the details of the implementation of this algorithm and its performance against a Gaussian model of the background noise, considering the specific case of the signal produced by the in-spiral of binary systems of black holes and/or neutron stars, although the method is completely general and can be applied to other classes of sources. We also demonstrate the utility of this approach by introducing a new coherence test to distinguish between the presence of a coherent signal of astrophysical origin in the data of multiple instruments and the presence of incoherent accidental artefacts, and we study the effects on the estimation of the source parameters as a function of the number of instruments in the network.
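For readers unfamiliar with Nested Sampling, the following toy sketch estimates the evidence (marginal likelihood) for a one-dimensional problem with a uniform prior and a Gaussian likelihood; the rejection-sampling replacement step and all numerical choices are deliberately simplified illustrations, not the implementation described in the paper.

```python
import math
import random

# Toy Nested Sampling for evidence estimation: uniform prior on [-5, 5],
# Gaussian likelihood. New live points are drawn by naive rejection sampling
# from the prior, which is fine for this toy 1-D case but not what a
# production code for GW inference would do. All numbers are illustrative.

def log_likelihood(theta, data_mean=1.0, sigma=0.5):
    return -0.5 * ((theta - data_mean) / sigma) ** 2 \
           - math.log(sigma * math.sqrt(2 * math.pi))

def nested_sampling(n_live=200, n_iter=1000, seed=1):
    rng = random.Random(seed)
    prior_sample = lambda: rng.uniform(-5.0, 5.0)
    live = [prior_sample() for _ in range(n_live)]
    live_logl = [log_likelihood(t) for t in live]

    z = 0.0
    x_prev = 1.0  # remaining prior volume
    for i in range(1, n_iter + 1):
        worst = min(range(n_live), key=lambda k: live_logl[k])
        x_i = math.exp(-i / n_live)
        z += math.exp(live_logl[worst]) * (x_prev - x_i)  # evidence increment
        x_prev = x_i
        # Replace the worst point with a prior draw above the likelihood bound.
        bound = live_logl[worst]
        while True:
            candidate = prior_sample()
            if log_likelihood(candidate) > bound:
                break
        live[worst] = candidate
        live_logl[worst] = log_likelihood(candidate)

    # Add the contribution of the remaining live points.
    z += sum(math.exp(l) for l in live_logl) / n_live * x_prev
    return z

# Expected result: about 0.1 (unit-normalised likelihood over a prior of width 10).
print(nested_sampling())
```

A Bayes factor between two hypotheses, as in the paper, is then simply the ratio of two such evidence values computed under the competing models.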