
    Localisation and tracking of people using distributed UWB sensors

    Indoor localisation and tracking of people in a non-cooperative manner is important in many surveillance and rescue applications. Ultra wideband (UWB) radar technology is promising for through-wall detection of objects at short to medium distances due to its high temporal resolution and penetration capability. This thesis tackles the problem of localisation of people in indoor scenarios using UWB sensors. It follows the process from measurement acquisition, multiple target detection and range estimation to multiple target localisation and tracking. Due to the weak reflection of people compared to the rest of the environment, a background subtraction method is initially used for the detection of people. Subsequently, a constant false alarm rate method is applied for detection and range estimation of multiple persons.
    For multiple target localisation using a single UWB sensor, an association method is developed to assign target range estimates to the correct targets. In the presence of multiple targets it can happen that targets closer to the sensor induce shadowing over the environment, hindering the detection of other targets. A concept for a distributed UWB sensor network is presented, aiming at extending the field of view of the system by using several sensors with different fields of view. A real-time operational prototype has been developed, taking into consideration sensor cooperation and synchronisation aspects as well as fusion of the information provided by all sensors. Sensor data may be erroneous due to sensor bias and time offset. Incorrect measurements and measurement noise influence the accuracy of the estimation results. Additional insight into the target states can be gained by exploiting temporal information. A multiple person tracking framework is developed based on the probability hypothesis density filter, and the differences in system performance are highlighted with respect to the information provided by the sensors, i.e. location information fusion vs. range information fusion. The information that a target should have been detected when it is not, due to shadowing induced by other targets, is described as a dynamic occlusion probability. The dynamic occlusion probability is incorporated into the tracking framework, allowing fewer sensors to be used while improving tracker performance in the scenario. The method selection and development has taken real-time application requirements for unknown scenarios into consideration at every step. Each investigated aspect of multiple person localisation within the scope of this thesis has been verified using simulations and measurements in a realistic environment using M-sequence UWB sensors.
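    The constant false alarm rate step described above can be illustrated with a minimal cell-averaging CFAR sketch. The window sizes, threshold scale, and synthetic range profile below are illustrative assumptions, not values from the thesis:

    ```python
    import numpy as np

    def ca_cfar(signal, guard=2, train=8, scale=3.0):
        """Cell-averaging CFAR: flag cells whose power exceeds the local
        noise estimate (mean of training cells) by a fixed scale factor."""
        n = len(signal)
        detections = []
        for i in range(train + guard, n - train - guard):
            # training cells on both sides, excluding guard cells around the cell under test
            left = signal[i - guard - train : i - guard]
            right = signal[i + guard + 1 : i + guard + 1 + train]
            noise = np.mean(np.concatenate([left, right]))
            if signal[i] > scale * noise:
                detections.append(i)
        return detections

    # Synthetic range profile after background subtraction: noise plus one target
    rng = np.random.default_rng(0)
    profile = rng.exponential(1.0, 200)
    profile[120] += 30.0  # strong target reflection
    print(ca_cfar(profile))  # should include index 120
    ```

    Because the threshold adapts to the local noise estimate, the false alarm rate stays roughly constant across range cells with varying clutter levels, which is the property the detection stage relies on.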

    Quantum-inspired computational imaging

    Computational imaging combines measurement and computational methods with the aim of forming images even when the measurement conditions are weak, few in number, or highly indirect. The recent surge in quantum-inspired imaging sensors, together with a new wave of algorithms allowing on-chip, scalable and robust data processing, has induced an increase of activity with notable results in the domain of low-light flux imaging and sensing. We provide an overview of the major challenges encountered in low-illumination (e.g., ultrafast) imaging and how these problems have recently been addressed for imaging applications in extreme conditions. These methods provide examples of the future imaging solutions to be developed, for which the best results are expected to arise from an efficient codesign of the sensors and data analysis tools.
    Y.A. acknowledges support from the UK Royal Academy of Engineering under the Research Fellowship Scheme (RF201617/16/31). S.McL. acknowledges financial support from the UK Engineering and Physical Sciences Research Council (grant EP/J015180/1). V.G. acknowledges support from the U.S. Defense Advanced Research Projects Agency (DARPA) InPho program through U.S. Army Research Office award W911NF-10-1-0404, the U.S. DARPA REVEAL program through contract HR0011-16-C-0030, and the U.S. National Science Foundation through grants 1161413 and 1422034. A.H. acknowledges support from U.S. Army Research Office award W911NF-15-1-0479, U.S. Department of the Air Force grant FA8650-15-D-1845, and U.S. Department of Energy National Nuclear Security Administration grant DE-NA0002534. D.F. acknowledges financial support from the UK Engineering and Physical Sciences Research Council (grants EP/M006514/1 and EP/M01326X/1).
    Accepted manuscript.

    Distributed physical sensors network for the protection of critical infrastructures against physical attacks

    The SCOUT project is based on the use of multiple innovative and low-impact technologies for the protection of space control ground stations and satellite links against physical and cyber attacks, and for intelligent reconfiguration of the ground station network (including the ground node of the satellite link) in case one or more nodes fail. The SCOUT sub-system devoted to protection against physical attacks, SENSNET, is presented. It is designed as a network of sensor networks that combines DAB- and DVB-T-based passive radar, noise radar, Ku-band radar, infrared cameras, and RFID technologies. The problem of the data link architecture is addressed and the proposed solution is described.

    Optical techniques for 3D surface reconstruction in computer-assisted laparoscopic surgery

    One of the main challenges for computer-assisted surgery (CAS) is to determine the intra-operative morphology and motion of soft-tissues. This information is prerequisite to the registration of multi-modal patient-specific data for enhancing the surgeon's navigation capabilities by observing beyond exposed tissue surfaces and for providing intelligent control of robotic-assisted instruments. In minimally invasive surgery (MIS), optical techniques are an increasingly attractive approach for in vivo 3D reconstruction of the soft-tissue surface geometry. This paper reviews the state-of-the-art methods for optical intra-operative 3D reconstruction in laparoscopic surgery and discusses the technical challenges and future perspectives towards clinical translation. With the recent paradigm shift of surgical practice towards MIS and new developments in 3D optical imaging, this is a timely discussion about technologies that could facilitate complex CAS procedures in dynamic and deformable anatomical regions.
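    Passive stereo, one of the optical techniques such reviews cover, recovers depth from the disparity between two calibrated views via the pinhole model Z = f·B/d. The rig parameters below are illustrative assumptions, not taken from the paper:

    ```python
    def depth_from_disparity(focal_px, baseline_mm, disparity_px):
        """Pinhole stereo model: depth Z = f * B / d, where f is the focal
        length in pixels, B the baseline, and d the disparity in pixels."""
        if disparity_px <= 0:
            raise ValueError("disparity must be positive")
        return focal_px * baseline_mm / disparity_px

    # e.g. a hypothetical laparoscopic stereo rig: 5 mm baseline, 700 px focal length
    z = depth_from_disparity(focal_px=700, baseline_mm=5.0, disparity_px=35.0)
    print(z)  # 100.0 (mm)
    ```

    The inverse relation between depth and disparity also explains why depth resolution degrades quadratically with distance, a key constraint for the short working distances of laparoscopy.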

    Target localization and tracking by fusing doppler differentials from cellular emanations with a multi-spectral video tracker

    We present an algorithm for fusing data from a constellation of RF sensors detecting cellular emanations with the output of a multi-spectral video tracker to localize and track a target with a specific cell phone. The RF sensors measure the Doppler shift caused by the moving cellular emanation, and Doppler differentials between all sensor pairs are then calculated. The multi-spectral video tracker uses a Gaussian mixture model to detect foreground targets and SIFT features to track targets through the video sequence. The data are fused by associating the Doppler differentials from the RF sensors with the theoretical Doppler differentials computed from the multi-spectral tracker output. The absolute difference and the root-mean-square difference are computed to associate the Doppler differentials from the two sensor systems. Performance of the algorithm was evaluated using synthetically generated datasets of an urban scene with multiple moving vehicles. The presented fusion algorithm correctly associates the cellular emanation with the corresponding video target for low measurement uncertainty and in the presence of favorable motion patterns. For nearly all objects, the fusion algorithm associates the emanation with the correct multi-spectral target with high confidence, distinguishing it from the most probable background target.
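    The core association step can be sketched as follows: compute theoretical pairwise Doppler differentials from each video track's position and velocity, then pick the track with the smallest RMS difference to the measured differentials. The sensor layout, carrier frequency, and track names are illustrative assumptions:

    ```python
    import itertools
    import math

    def doppler_shift(sensor, pos, vel, f_c=1.9e9, c=3e8):
        """Doppler shift of an emanation at pos moving with vel, as seen
        by a stationary sensor: f_d = -(f_c / c) * radial velocity."""
        dx, dy = pos[0] - sensor[0], pos[1] - sensor[1]
        r = math.hypot(dx, dy)
        radial_v = (vel[0] * dx + vel[1] * dy) / r  # positive = receding
        return -f_c / c * radial_v

    def differentials(sensors, pos, vel):
        """Doppler differentials over all sensor pairs."""
        shifts = [doppler_shift(s, pos, vel) for s in sensors]
        return [shifts[i] - shifts[j]
                for i, j in itertools.combinations(range(len(sensors)), 2)]

    def associate(measured, video_tracks, sensors):
        """Pick the video track whose theoretical differentials match best in RMS."""
        def rms(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))
        return min(video_tracks,
                   key=lambda t: rms(measured, differentials(sensors, t["pos"], t["vel"])))

    sensors = [(0, 0), (100, 0), (0, 100)]
    tracks = [
        {"id": "car_a", "pos": (50, 50), "vel": (10, 0)},
        {"id": "car_b", "pos": (20, 80), "vel": (0, -8)},
    ]
    measured = differentials(sensors, (50, 50), (10, 0))  # emanation co-located with car_a
    print(associate(measured, tracks, sensors)["id"])  # car_a
    ```

    Using differentials between sensor pairs rather than absolute shifts removes the unknown carrier frequency offset of the phone, which is why the pairwise formulation is attractive.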

    Improving Accuracy in Ultra-Wideband Indoor Position Tracking through Noise Modeling and Augmentation

    The goal of this research is to improve the precision in tracking of an ultra-wideband (UWB) based Local Positioning System (LPS). This work is motivated by the approach taken to improve the accuracies in the Global Positioning System (GPS) through noise modeling and augmentation. Since UWB indoor position tracking is accomplished using methods similar to those of the GPS, the same two general approaches can be used to improve accuracy. Trilateration calculations are affected by errors in distance measurements from the set of fixed points to the object of interest. When these errors are systemic, each distinct set of fixed points can be said to exhibit a unique "set noise". For UWB indoor position tracking, the set of fixed points is a set of sensors measuring the distance to a tracked tag. In this work we develop a noise model for this sensor set noise, along with a particle filter that uses our set noise model. To the author's knowledge, this noise has not been identified and modeled for an LPS. We test our methods on a commercially available UWB system in a real-world setting. From the results we observe an approximately 15% improvement in accuracy over raw UWB measurements. The UWB system is an example of an aided sensor, since it requires a person to carry a device which continuously broadcasts its identity to determine its location. Therefore the location of each user is uniquely known even when there are multiple users present. However, it suffers from limited precision compared to some unaided sensors, such as a camera, which typically are placed line of sight (LOS). An unaided system does not require active participation from people. Therefore it has more difficulty in uniquely identifying the location of each person when there are a large number of people present in the tracking area.
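    The trilateration step mentioned above can be sketched with a standard linear least-squares formulation: subtracting the first anchor's range equation removes the quadratic unknown, leaving a linear system in the tag position. The anchor layout is an illustrative assumption, not the commercial system's geometry:

    ```python
    import numpy as np

    def trilaterate(anchors, distances):
        """Linear least-squares trilateration. For anchors a_i and ranges d_i,
        subtracting the first equation |x - a_0|^2 = d_0^2 from the others
        yields the linear system 2(a_i - a_0) x = d_0^2 - d_i^2 + |a_i|^2 - |a_0|^2."""
        anchors = np.asarray(anchors, float)
        d = np.asarray(distances, float)
        x0, d0 = anchors[0], d[0]
        A = 2 * (anchors[1:] - x0)
        b = (d0**2 - d[1:]**2
             + np.sum(anchors[1:]**2, axis=1) - np.sum(x0**2))
        pos, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pos

    anchors = [(0, 0), (10, 0), (0, 10)]   # fixed UWB sensors
    true = np.array([3.0, 4.0])            # tracked tag position
    dists = [np.linalg.norm(true - np.asarray(a)) for a in anchors]
    print(trilaterate(anchors, dists))     # recovers approximately [3, 4]
    ```

    With noise-free ranges the solution is exact; correlated range errors across the anchor set, i.e. the "set noise" the abstract describes, bias this solution, which is what the proposed noise model and particle filter compensate for.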
    Therefore we develop a generalized fusion framework to combine measurements from aided and unaided systems, to improve the tracking precision of the aided system and to solve data association issues in the unaided system. The framework uses a Kalman filter to fuse measurements from multiple sensors. We test our approach on two unaided sensor systems: Light Detection And Ranging (LADAR) and a camera system. Our study investigates the impact of increasing the number of people in an indoor environment on the accuracies achieved using the proposed fusion framework. From the results we observed that, depending on the type of unaided sensor system used for augmentation, the improvement in precision ranged from 6% to 25% for up to 3 people.
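    The Kalman-filter fusion of one aided and one unaided measurement can be illustrated with a one-dimensional sequential-update sketch; the noise values, static motion model, and identity measurement model are illustrative assumptions, not the thesis's actual filter design:

    ```python
    def kalman_fuse(z_uwb, r_uwb, z_cam, r_cam, x, P, q=0.01):
        """One predict step plus two sequential measurement updates, fusing an
        aided (UWB) and an unaided (camera) position measurement of the same
        target. Scalar state, H = 1, static motion model with process noise q."""
        P = P + q  # predict: state unchanged, uncertainty grows
        for z, r in ((z_uwb, r_uwb), (z_cam, r_cam)):
            k = P / (P + r)       # Kalman gain: weight of the new measurement
            x = x + k * (z - x)   # state update toward the measurement
            P = (1 - k) * P       # covariance shrinks after each update
        return x, P

    # the fused estimate lands between the two measurements,
    # pulled toward the less noisy (camera) one
    x, P = kalman_fuse(z_uwb=1.2, r_uwb=0.3, z_cam=0.9, r_cam=0.05, x=0.0, P=1.0)
    print(x, P)
    ```

    Sequential updates with independent measurement noise are equivalent to a single batch update, so the same loop extends naturally to more sensors, mirroring the generalized framework described above.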