
    Machine Learning in Wireless Sensor Networks: Algorithms, Strategies, and Applications

    Wireless sensor networks monitor dynamic environments that change rapidly over time. This dynamic behavior is either caused by external factors or initiated by the system designers themselves. To adapt to such conditions, sensor networks often adopt machine learning techniques to eliminate the need for unnecessary redesign. Machine learning also inspires many practical solutions that maximize resource utilization and prolong the lifespan of the network. In this paper, we present an extensive literature review over the period 2002-2013 of machine learning methods that were used to address common issues in wireless sensor networks (WSNs). The advantages and disadvantages of each proposed algorithm are evaluated against the corresponding problem. We also provide a comparative guide to aid WSN designers in developing suitable machine learning solutions for their specific application challenges. Comment: Accepted for publication in IEEE Communications Surveys and Tutorials.

    Safe navigation for vehicles

    Satellite navigation has taken on increased importance in recent years, on the one hand because of the imminent arrival of the European GALILEO system, which will complement the American GPS, and on the other hand because of the great success it now enjoys in the commercial civil market. An important part of this success rests on technological development at the receiver level which, while allowing ever greater miniaturization, has made satellite navigation possible even in difficult environments. Today's objective is to prepare the use of this kind of signal, with low-cost receivers, for safety-critical land vehicle applications in urban environments, something that classical coupling (hybridization) techniques cannot provide, since one of the main challenges in this research domain is the system's capability to deliver reliable position estimates. Improvements in dead-reckoning technologies (i.e. the size reduction of MEMS-based sensors or gyroscopes) cannot by themselves reach the necessary confidence levels if exploited with classical localization and integration algorithms: these techniques provide a position estimate whose reliability, or confidence level, is very difficult to quantify. The feasibility of these applications therefore relies not only on extensive research to enhance the performance of the navigation algorithms in harsh scenarios, but also, and in parallel, on the possibility of maintaining, thanks to additional sensors, a high and quantified confidence level in the position estimate even in the absence of satellite navigation signals.

    GNSS Shadow Matching: The Challenges Ahead

    GNSS shadow matching is a new technique that uses 3D mapping to improve positioning accuracy in dense urban areas from tens of meters to within five meters, potentially less. This paper presents the first comprehensive review of shadow matching’s error sources and proposes a program of research and development to take the technology from proof of concept to a robust, reliable and accurate urban positioning product. A summary of the state of the art is also included. Error sources in shadow matching may be divided into six categories: initialization, modelling, propagation, environmental complexity, observation, and algorithm approximations. Performance is also affected by the environmental geometry, and it is sometimes necessary to handle solution ambiguity. For each error source, the cause and its impact on the position solution are explained. Examples are presented, where available, and improvements to the shadow-matching algorithms to mitigate each error are proposed. Methods of accommodating quality control within shadow matching are then proposed, including uncertainty determination, ambiguity detection, and outlier detection. This is followed by a discussion of how shadow matching could be integrated with conventional ranging-based GNSS and other navigation and positioning technologies. This includes a brief review of methods to enhance ranging-based GNSS using 3D mapping. Finally, the practical engineering challenges of shadow matching are assessed, including the system architecture, efficient GNSS signal prediction and the acquisition of 3D mapping data.
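The core shadow-matching idea, scoring candidate positions by how well the satellite visibility predicted from a 3D city model matches what the receiver actually observes, can be sketched as follows. This is only an illustrative outline under assumed interfaces: the `is_blocked` 3D-model query, the `cn0_threshold` visibility cut and the score-weighted grid average are placeholders, not the paper's implementation.

```python
import numpy as np

def shadow_matching_estimate(candidates, satellites, cn0_obs, is_blocked, cn0_threshold=30.0):
    """Minimal shadow-matching sketch over a grid of candidate positions.

    candidates    : iterable of (x, y) candidate positions on a search grid
    satellites    : list of satellite identifiers currently above the horizon
    cn0_obs       : dict mapping satellite -> measured C/N0 (dB-Hz), absent if not tracked
    is_blocked    : callable (candidate, sat) -> True if the 3D city model predicts
                    the line of sight to `sat` is blocked at `candidate` (assumed interface)
    cn0_threshold : assumed C/N0 cut separating "signal received" from "shadowed"
    """
    scores = []
    for cand in candidates:
        score = 0
        for sat in satellites:
            predicted_visible = not is_blocked(cand, sat)
            observed_visible = cn0_obs.get(sat, 0.0) >= cn0_threshold
            # Reward agreement between the 3D-model prediction and the observation
            score += 1 if predicted_visible == observed_visible else 0
        scores.append(score)

    scores = np.asarray(scores, dtype=float)
    grid = np.asarray(list(candidates), dtype=float)
    if scores.sum() == 0:
        weights = np.full(len(scores), 1.0 / len(scores))
    else:
        weights = scores / scores.sum()
    # Position estimate: score-weighted average over the candidate grid
    return np.average(grid, axis=0, weights=weights)
```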

    Ground Deformation Analysis in Apennine Areas, Seismically Active or Aseismic, Using Data from SAR Interferometry and Integration of Geomorphological and Structural Data

    The core of this study has been the analysis of PS-InSAR datasets aimed at providing new constraints on the active tectonics and seismotectonics of several regions of the Apennines. The analysed Permanent Scatterer (PS) datasets result from the processing of large, temporally continuous series of radar images acquired by the ERS (1992-2000), ENVISAT (2003-2010) and COSMO-SkyMed (2011-2014) satellite missions. These datasets, available on the cartographic website (Geoportale Nazionale) of the Italian Ministry of the Environment (MATTM), have been collected over time by the MATTM within the "Extraordinary Remote Sensing Plan" (Piano Straordinario di Telerilevamento Ambientale, PST-A, law n. 179/2002, article 27), with the aim of supporting local administrations in the field of environmental policy. The database was built in three phases: the first (2008-2009) involved the interferometric processing of SAR images acquired throughout the country by the ERS1/ERS2 and ENVISAT satellites, in both ascending and descending orbits, from 1992 to 2008; the second (2010-2011) integrated the existing database with the processing of SAR images acquired by the ENVISAT satellite from 2008 to 2010; the third (2013-2015) upgraded and updated the database over critical areas, based on StripMap HIMAGE scenes acquired with a 16-day revisit, in either ascending or descending orbit, by the Italian national satellite system COSMO-SkyMed. In this study, Permanent Scatterer datasets are applied massively, for the first time, to the assessment of ground deformation over large (hundreds of km2) regions of Italy across the last decades, in order to unravel their current tectonic behaviour. To date, in the field of tectonics, and of earthquake geology in particular, SAR images have been used essentially through the DInSAR technique (comparison between two images acquired pre- and post-event) to constrain co- and post-seismic deformation (Massonnet et al., 1993; Peltzer et al., 1996, 1998; Stramondo et al., 1999; Atzori et al., 2009; Copley and Reynolds, 2014), whereas the approach used in the case studies investigated here is based on data that (with the exception of the Lunigiana case study) cover a roughly 20-year-long time window. The opportunity to analyse such long, continuous SAR records has allowed the detection of both the coseismic displacement of moderate earthquakes (i.e., the M 6.3 2009 L’Aquila earthquake and the M 5 2013 Lunigiana earthquake) and subdued ground displacements, and accelerations, on time scales ranging from years to decades. The specific approach used in this study rests on a combination of techniques for analysing and processing the PS datasets. In general, as the analyses aimed at identifying motions with a wide areal extent, statistical filtering has been applied to PS velocity values in order to discard from the initial, "native" dataset fast-moving PSs that may be associated with local-scale phenomena (e.g., landslides, sediment compaction, water extraction). Furthermore, an in-depth inspection of PS time series from all of the investigated areas has been carried out with the aim of identifying changes in (LoS-oriented) motion trends over the analysed time windows.
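As an illustration of the kind of statistical filtering described above, the minimal sketch below discards PSs whose mean LoS velocity deviates strongly from the regional median. The MAD-based rule and the cut-off `k` are assumptions chosen for illustration, not the filter actually used in the thesis.

```python
import numpy as np

def filter_fast_movers(los_velocities, k=3.0):
    """Keep PSs whose LoS velocity is close to the regional behaviour,
    discarding fast movers likely tied to local-scale phenomena.

    los_velocities : array of PS mean LoS velocities (mm/yr)
    k              : assumed cut-off in robust (MAD-based) standard deviations
    """
    v = np.asarray(los_velocities, dtype=float)
    median = np.median(v)
    mad = np.median(np.abs(v - median))
    # Scale MAD to an equivalent standard deviation; fall back to std if MAD is zero
    robust_sigma = 1.4826 * mad if mad > 0 else np.std(v)
    keep = np.abs(v - median) <= k * robust_sigma
    return keep  # boolean mask of PSs retained for the regional analysis
```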
A distinctive feature of this study was the estimation of vertical ground displacements. While most studies of ground deformation are based on the analysis of SAR data recorded along either ascending or descending satellite orbits (and thus on LoS-oriented motions), a specific focus of this study was to obtain, starting from LoS-oriented PS velocity values, displacement values in the vertical, west-east oriented plane. In order to evaluate vertical displacements, a geometrical relationship was applied to ascending-descending PS pairs. As PSs from ascending and descending tracks are neither spatially coincident nor synchronous, each image pair was obtained by selecting ascending and descending radar images separated in time by less than one month. In the L’Aquila case study, the combination of data recorded along both the ascending and descending satellite orbits has been crucial to the identification of pre-seismic ground motions that went undetected in previous works which had similarly addressed possible pre-seismic, satellite-recorded signals. In the various case studies, different kinds of GIS-aided geostatistical analyses were used to extract and synthesise information on ground deformation through the construction of raster maps of displacement values for the ascending and descending LoS, and of maps of the vertical (z, up-down) component of the “real” displacement vector. In the Campania plain case study, the PS-InSAR data analysis and processing have been integrated with detailed-scale geomorphological and stratigraphical analyses. The results from the two independent data sets are consistent and point to tectonically controlled ground displacements over much of the northern part of the study area (Volturno plain) during the analysed 1992-2010 time span. In particular, the integrated data sets show that the boundaries of the area affected by current subsidence follow fault scarps formed in the 39 ka Campania Ignimbrite, while the horst blocks of those faults remained substantially stable (or slightly uplifting) during the analysed time window. Furthermore, mean rates of current subsidence and long-term (Late Pleistocene to present) mean subsidence rates are comparable, indicating that the current vertical displacement assessed through the PS-InSAR analysis expresses the recent tectonics of the analysed sectors of the Campania plain. The Campania plain substantially lacks strong historical seismicity, which suggests that the detected surface displacements result at least in part from aseismic fault activity. The Monte Marzano case study has allowed the assessment of subdued deformation along both major structures activated by the 1980 Irpinia earthquake, i.e. the NE-dipping Monte Marzano fault and the SW-dipping Conza fault. Ground deformation associated with these structures appears to decrease from the time window covered by the ERS satellites (1992-2000) to that covered by ENVISAT (2003-2010). These data suggest that post-seismic slip of the M 6.9 earthquake continued for up to 20 years after the main shock and became very weak in the following ten years.
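The ascending-descending combination reduces, for each co-located PS pair, to solving a small linear system that maps the vertical and east-west velocity components onto the two lines of sight. The sketch below shows one common formulation; the sign convention (LoS positive toward the satellite), the right-looking geometry, the neglect of the north-south component (to which the LoS is nearly insensitive) and the default heading values are assumptions, not details taken from the thesis.

```python
import numpy as np

def decompose_los(v_asc, v_desc, inc_asc, inc_desc, head_asc=-10.0, head_desc=190.0):
    """Recover vertical and east-west velocities from an ascending/descending
    LoS velocity pair (illustrative sketch).

    v_asc, v_desc       : LoS velocities of a co-located PS pair (mm/yr, positive toward satellite)
    inc_asc, inc_desc   : incidence angles (degrees)
    head_asc, head_desc : satellite headings (degrees clockwise from north); defaults are
                          typical near-polar values assumed here for illustration
    """
    def los_coefficients(inc_deg, head_deg):
        inc = np.radians(inc_deg)
        look_az = np.radians(head_deg + 90.0)  # right-looking geometry assumed
        # d_los = d_up * cos(inc) - d_east * sin(inc) * sin(look_azimuth)
        return np.array([np.cos(inc), -np.sin(inc) * np.sin(look_az)])

    A = np.vstack([los_coefficients(inc_asc, head_asc),
                   los_coefficients(inc_desc, head_desc)])
    v_up, v_east = np.linalg.solve(A, np.array([v_asc, v_desc]))
    return v_up, v_east
```

In practice each pair would first be screened for spatial proximity and for acquisition dates within the one-month window mentioned above.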
Furthermore, the PS-InSAR data analysis has shown that wide areas located between the Monte Marzano and Conza faults (i.e., within what is recognised as the graben bounded by those structures) display uplift in the range of 0-2 mm/yr, more evident in the period surveyed by the ERS satellites (1992-2000) and less evident in the 2003-2010 time span (ENVISAT). Such uplift might be related to the presence, at depth, of a fluid reservoir that has been independently identified by seismic tomography (Amoroso et al., 2014). In-depth analyses of pre-seismic periods have been carried out in three study areas, i.e. those of the 1997 Colfiorito earthquake, the 2009 L’Aquila earthquake and the 2013 Lunigiana earthquake. The Colfiorito case study has not provided any significant information on possible pre-seismic ground deformation, most probably because the PS spatial distribution in that region is too discontinuous to allow the identification of either clear signals from inspection of the rare and sparse PS time series or statistically meaningful surface displacement patterns. In both the L’Aquila and Lunigiana case studies, ground deformation signals in the pre-seismic period have been detected from inspection of the PS time series. Pre-seismic ground deformation signals detected in the Lunigiana area (which was affected by a strike-slip faulting earthquake; Eva et al., 2014; Pezzo et al., 2014; Stramondo et al., 2014) are questionable, as they are quite complex and difficult to interpret and frame within the local tectonic scenario. Conversely, very clear pre-seismic signals have been identified in the region hit by the L’Aquila normal faulting earthquake. There, in the roughly four years preceding the 6 April 2009 main shock, ground deformation with distinct spatial patterns and orientations has been detected. In particular, the PS-InSAR analysis has shown that the hanging-wall block of the Paganica fault (the surface expression of the structure activated by the main shock; e.g., Galli et al., 2010) was subject to slow uplift and eastward horizontal motion from 2005 to September/October 2008, and then (October 2008-March 2009) to subsidence and westward horizontal motion. Following the coseismic collapse, in the early post-seismic period (April-May 2009), subsidence extended eastwards beyond the Paganica fault trace. The region affected by the opposite pre-seismic motions covers the area in which the 6 April main shock and most of the foreshocks and aftershocks (Valoroso et al., 2013) were recorded, while the inversion of the pre-seismic displacements is coeval with the onset of the foreshocks (October 2008; Di Luccio et al., 2010). In addition, this region includes both topographic highs and lows. All of these features point to a correlation between the detected motions and the seismic phenomena, and suggest a deep-seated causative mechanism, such as volume changes in response to vertical/lateral fluid migration and fracturing processes at depth, all phenomena that have been documented in connection with the 2009 earthquake in the study region (e.g., Di Luccio et al., 2010; Lucente et al., 2010; Moro et al., 2017). The pre-seismic ground deformation detected in the L’Aquila region could represent a precursor signal of the 2009 M 6.3 earthquake.
Such a hypothesis should be tested in the future through the continuous monitoring of seismically active regions worldwide, with SAR satellites as well as high-resolution geodetic techniques, aimed at detecting the possible occurrence of pre-seismic signals. In any case, the results of this study point to long-term (yearly-scale) PS-InSAR as a tool crucial to detecting ground deformation in areas struck by recent earthquakes and to monitoring active, possibly aseismic, structures. Such knowledge may strongly support strategies for territorial planning and seismic hazard mitigation, and provide important support for Civil Protection actions. The results of this study also highlight the importance of the existing PS database and of continuing to maintain and update such an instrument in the future.

    Real-Time Localization Using Software Defined Radio

    Service providers make use of cost-effective wireless solutions to identify, localize, and possibly track users through their carried mobile devices (MDs) in order to support added services, such as geo-advertisement, security, and management. Indoor and outdoor hotspot areas play a significant role for such services. However, GPS does not work in many of these areas. To solve this problem, service providers leverage available indoor radio technologies, such as WiFi, GSM, and LTE, to identify and localize users. We focus our research on passive services provided by third parties, which are responsible for (i) data acquisition and (ii) processing, and on network-based services, where (i) and (ii) are performed inside the serving network. To better understand the parameters that affect indoor localization, we investigate several factors that influence indoor signal propagation for both Bluetooth and WiFi technologies. For GSM-based passive services, we first developed a data acquisition module: a GSM receiver that can overhear GSM uplink messages transmitted by MDs while remaining invisible. A set of optimizations was made to the receiver components to support wideband capture of the GSM spectrum while operating in real time. Processing the wide GSM spectrum is made possible by a proposed distributed processing approach over an IP network. Then, to overcome the lack of information about tracked devices’ radio settings, we developed two novel localization algorithms that rely on proximity-based solutions to estimate devices’ locations in real environments. Given the challenges that indoor environments pose to radio signals, such as NLOS reception and multipath propagation, we developed an original algorithm to detect and remove contaminated radio signals before they are fed to the localization algorithm. To improve the localization algorithm, we extended our work with a hybrid approach that uses both the WiFi and GSM interfaces to localize users. For network-based services, we used a software implementation of an LTE base station to develop our algorithms, which characterize the indoor environment before applying the localization algorithm. Experiments were conducted without any special hardware, any prior knowledge of the indoor layout or any offline calibration of the system.
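As a rough illustration of a proximity-based estimate preceded by a screening step for contaminated measurements, the sketch below weights known receiver positions by received signal strength after discarding outlying readings. The RSSI-to-weight mapping and the MAD-based rejection rule are generic assumptions for illustration, not the dissertation's algorithms.

```python
import numpy as np

def weighted_centroid_localization(anchor_positions, rssi_dbm, outlier_k=2.5):
    """Estimate a device position from receiver (anchor) positions and RSSI.

    anchor_positions : (N, 2) array of known receiver coordinates
    rssi_dbm         : length-N array of received signal strengths (dBm)
    outlier_k        : assumed cut-off for rejecting contaminated measurements
    """
    pos = np.asarray(anchor_positions, dtype=float)
    rssi = np.asarray(rssi_dbm, dtype=float)

    # Crude NLOS/outlier screening: drop readings far from the typical level
    median = np.median(rssi)
    mad = np.median(np.abs(rssi - median))
    scale = 1.4826 * mad if mad > 0 else 1.0
    keep = np.abs(rssi - median) <= outlier_k * scale

    # Stronger signals imply closer proximity; convert dBm to linear power weights
    weights = 10.0 ** (rssi[keep] / 10.0)
    return np.average(pos[keep], axis=0, weights=weights)
```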

    Loose Ends for the Exomoon Candidate Host Kepler-1625b

    The claim of an exomoon candidate in the Kepler-1625b system has generated substantial discussion regarding possible alternative explanations for the purported signal. In this work we examine these possibilities in detail. First, the effect of more flexible trend models is explored and we show that sufficiently flexible models are capable of attenuating the signal, although this is an expected byproduct of invoking such models. We also explore trend models using X and Y centroid positions and show that there is no data-driven impetus to adopt such models over temporal ones. We quantify the probability that the 500 ppm moon-like dip could be caused by a Neptune-sized transiting planet to be < 0.75%. We show that neither autocorrelation, Gaussian processes, nor a Lomb-Scargle periodogram is able to recover a stellar rotation period, demonstrating that K1625 is a quiet star with periodic behavior < 200 ppm. Through injection and recovery tests, we find that the star does not exhibit a tendency to introduce false-positive dip-like features above that of pure Gaussian noise. Finally, we address a recent re-analysis by Kreidberg et al. (2019) and show that the difference in conclusions is due not to differing systematics models but rather to the reduction itself. We show that their reduction exhibits i) slightly higher intra-orbit and post-fit residual scatter, ii) ≈ 900 ppm larger flux offset at the visit change, iii) ≈ 2 times larger Y-centroid variations, and iv) ≈ 3.5 times stronger flux-centroid correlation coefficient than the original analysis. These points could be explained by larger systematics in their reduction, potentially impacting their conclusions. Comment: 21 pages, 4 tables, 11 figures. Accepted for publication in The Astronomical Journal, January 202
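For readers unfamiliar with the periodogram test mentioned above, the following sketch shows how a stellar rotation period search of this kind is typically run with astropy's Lomb-Scargle implementation. The period range and frequency grid are arbitrary placeholders; the paper's actual light-curve preparation and thresholds are not reproduced here.

```python
import numpy as np
from astropy.timeseries import LombScargle

def rotation_period_search(time_days, flux, min_period=0.5, max_period=100.0):
    """Illustrative Lomb-Scargle search for a stellar rotation period."""
    ls = LombScargle(time_days, flux)
    # Evaluate the periodogram on a frequency grid spanning the assumed period range
    frequency = np.linspace(1.0 / max_period, 1.0 / min_period, 10_000)
    power = ls.power(frequency)
    best = int(np.argmax(power))
    # Rough significance of the strongest peak (astropy's built-in estimate)
    fap = ls.false_alarm_probability(power[best])
    return 1.0 / frequency[best], power[best], fap
```

A quiet star, in this framing, would show no peak with both substantial power and a low false-alarm probability across the searched period range.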

    A Survey for Transient Astronomical Radio Emission at 611 MHz

    We have constructed and operated the Survey for Transient Astronomical Radio Emission (STARE) to detect transient astronomical radio emission at 611 MHz originating from the sky over the northeastern United States. The system is sensitive to transient events on timescales of 0.125 s to a few minutes, with a typical zenith flux density detection threshold of approximately 27 kJy. During 18 months of around-the-clock observing with three geographically separated instruments, we detected a total of 4,318,486 radio bursts. 99.9% of these events were rejected as locally generated interference by requiring that an event be observed simultaneously at all three sites before being identified as having an astronomical origin. The remaining 3,898 events have been found to be associated with 99 solar radio bursts. These results demonstrate the remarkably effective RFI rejection achieved by a coincidence technique using precision timing (such as GPS clocks) at geographically separated sites. The non-detection of extra-solar bursting or flaring radio sources improves upon the flux density and timescale sensitivity limits set by several similar experiments in the 1970s. We discuss the consequences of these limits for the immediate solar neighborhood and the discovery of previously unknown classes of sources. We also discuss other possible uses for the large collection of 611 MHz monitoring data assembled by STARE. Comment: 24 pages, 6 figures; to appear in PAS
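The coincidence-based RFI rejection can be illustrated with a simple time-matching filter that keeps only events recorded at all three sites within a short window. The 0.125 s window below is an assumption borrowed from the survey's shortest timescale, not a quoted system parameter, and the event lists are placeholders.

```python
import numpy as np

def coincident_events(site_a, site_b, site_c, window_s=0.125):
    """Keep only events seen at all three geographically separated sites
    within a short coincidence window (GPS-referenced times assumed).

    site_a, site_b, site_c : arrays of event times in seconds
    window_s               : assumed coincidence window
    """
    a = np.asarray(site_a, dtype=float)
    b = np.sort(np.asarray(site_b, dtype=float))
    c = np.sort(np.asarray(site_c, dtype=float))

    def has_match(t, other):
        # Check the nearest neighbours around the insertion point only
        i = np.searchsorted(other, t)
        neighbours = other[max(i - 1, 0):i + 1]
        return neighbours.size > 0 and np.min(np.abs(neighbours - t)) <= window_s

    return np.array([t for t in a if has_match(t, b) and has_match(t, c)])
```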

    Learning Human Behaviour Patterns by Trajectory and Activity Recognition

    The world’s population is ageing, increasing awareness of the neurological and behavioural impairments that may arise from human ageing. These impairments can manifest as cognitive conditions or reduced mobility, and they are difficult to detect in time when relying only on periodic medical appointments. This lack of routine screening demands the development of solutions to better assist and monitor human behaviour. The technologies currently available to monitor human behaviour are limited to indoor use and require the installation of sensors around users’ homes, incurring high installation and maintenance costs. With the widespread use of smartphones, it is possible to take advantage of their sensing information to better assist the elderly population. This study investigates what we can learn about human behaviour patterns from this rich and pervasive mobile sensing data. A data collection campaign over a period of 6 months was designed to measure three different human routines through human trajectory analysis and activity recognition, covering both indoor and outdoor environments. A framework for modelling human behaviour was developed using human motion features extracted in both an unsupervised and a supervised manner. The unsupervised feature extraction measures mobility properties such as step length, user points of interest, and locomotion activities inferred from a user-independent trained classifier. The supervised feature extraction was designed to be user-dependent, as each user may have specific behaviours that are common to his or her routine. The human patterns were modelled through probability density functions and clustering approaches. Using the learned patterns, inferences about current human behaviour were continuously quantified by an anomaly detection algorithm, in which distance measurements were used to detect significant changes in behaviour. Experimental results demonstrate the effectiveness of the proposed framework, which revealed an increased potential to learn behaviour patterns and detect anomalies.
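A minimal sketch of the pattern-plus-anomaly-detection idea is given below: a user's routine is modelled as a probability density over daily motion features, and new days are flagged when their likelihood falls below a low quantile of the training days. The kernel density model and the 5% threshold are illustrative assumptions, not the framework developed in the thesis.

```python
import numpy as np
from scipy.stats import gaussian_kde

def fit_behaviour_model(daily_features):
    """Model a user's routine as a density over daily motion features
    (e.g., step length, time spent at points of interest)."""
    # gaussian_kde expects data with shape (n_features, n_days)
    return gaussian_kde(np.asarray(daily_features, dtype=float).T)

def detect_anomalies(model, new_days, quantile=0.05):
    """Flag days whose likelihood under the learned routine falls below a
    low quantile of the training-day likelihoods (threshold is an assumption)."""
    train_scores = model(model.dataset)           # likelihood of the training days
    threshold = np.quantile(train_scores, quantile)
    new_scores = model(np.asarray(new_days, dtype=float).T)
    return new_scores < threshold                 # True where the day deviates from the routine
```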