
    Megacollect 2004: Hyperspectral Collection Experiment of Terrestrial Targets and Backgrounds of the RIT Megascene and Surrounding Area (Rochester, NY)

    This paper describes a collaborative collection campaign to spectrally image and measure a well-characterized scene for hyperspectral algorithm development and validation/verification of scene simulation models (DIRSIG). The RIT Megascene, located in the northeast corner of Monroe County near Rochester, New York, has been modeled and characterized under the DIRSIG environment and has been simulated for various hyperspectral and multispectral systems (e.g., HYDICE, LANDSAT, etc.). Until recently, most of the electro-optical imagery of this area had been limited to very high altitude airborne or orbital platforms with low spatial resolutions. Megacollect 2004 addresses this shortcoming by bringing together, in June of 2004, a suite of airborne sensors to image this area in the VNIR, SWIR, MWIR, and LWIR regions. These include the COMPASS (hyperspectral VNIR, SWIR), SEBASS (hyperspectral LWIR), WASP (broadband VIS, SWIR, MWIR, LWIR), and MISI (hyperspectral VNIR; broadband SWIR, MWIR, LWIR). In conjunction with the airborne collections, an extensive ground truth measurement campaign was conducted to characterize atmospheric parameters, selected targets, and backgrounds in the field. Laboratory measurements were also made on samples to confirm the field measurements. These spectral measurements spanned the visible through thermal regions, from 0.4 to 20 microns. They will help identify imaging factors that affect algorithm robustness and areas for improvement in the physical modeling of scene/sensor phenomena. Reflectance panels have also been deployed as control targets to quantify both sensor characteristics and atmospheric effects. A subset of these targets has also been deployed as an independent test suite for target detection algorithms. Details of the planning, coordination, protocols, and execution of the campaign will be discussed, with particular emphasis on the ground measurements. The system used to collect the metadata of ground truth measurements and disseminate this data will be described. Lastly, lessons learned in the field will be underscored to highlight additional measurements and changes in protocol that would improve future collections of this area.
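
    The abstract does not spell out how the deployed reflectance panels are used, but a common way such control targets support atmospheric compensation is the empirical line method: a per-band linear fit between known panel reflectance and measured at-sensor radiance, inverted to estimate surface reflectance. A minimal Python sketch under that assumption (all panel values are illustrative, not Megacollect 2004 data):

        import numpy as np

        # Laboratory reflectances of two deployed control panels (illustrative
        # values, not Megacollect 2004 measurements): one dark, one bright.
        panel_reflectance = np.array([0.05, 0.50])

        # At-sensor radiance measured over each panel in three example bands
        # (shape: n_panels x n_bands); numbers are illustrative.
        panel_radiance = np.array([[12.1, 10.4, 9.8],
                                   [71.3, 66.0, 58.2]])

        # Per-band linear fit: radiance = gain * reflectance + offset.
        gain, offset = np.polyfit(panel_reflectance, panel_radiance, 1)

        def radiance_to_reflectance(radiance):
            """Invert the per-band empirical line to estimate surface reflectance."""
            return (radiance - offset) / gain

        # Example: convert one measured pixel spectrum (one value per band).
        pixel = np.array([30.0, 27.5, 24.9])
        print(radiance_to_reflectance(pixel))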

    White paper on Selected Environmental Parameters affecting Autonomous Vehicle (AV) Sensors

    Autonomous Vehicles (AVs) being developed today rely on various sensor technologies to sense and perceive the world around them. The sensor outputs are subsequently used by the Automated Driving System (ADS) onboard the vehicle to make decisions that affect its trajectory and how it interacts with the physical world. The main sensor technologies used for sensing and perception (S&P) are LiDAR (Light Detection and Ranging), camera, RADAR (Radio Detection and Ranging), and ultrasound. Different environmental parameters have different effects on the performance of each sensor, thereby affecting the S&P and decision-making (DM) of an AV. In this publication, we explore the effects of different environmental parameters on LiDARs and cameras, which led us to conduct a study to better understand the impact of several of these parameters on LiDAR performance. The experiments undertaken aim to identify some of the weaknesses and challenges that a LiDAR may face when used on an AV. This informs AV regulators in Singapore of the effects of different environmental parameters on AV sensors so that they can determine testing standards and specifications that more robustly assess the adequacy of LiDAR systems installed for local AV operations. Our approach adopts the LiDAR test methodology first developed in the Urban Mobility Grand Challenge (UMGC-L010) White Paper on LiDAR performance against selected Automotive Paints. Comment: 25 pages, 20 figures. This white paper was developed with support from the Urban Mobility Grand Challenge Fund by the Land Transport Authority of Singapore (No. UMGC-L010). For the associated dataset, see https://researchdata.ntu.edu.sg/dataset.xhtml?persistentId=doi:10.21979/N9/NT8HIM. arXiv admin note: substantial text overlap with arXiv:2309.0134
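
    The white paper's test methodology is only referenced here, so the following is a hedged sketch of one common way to quantify environmental degradation of LiDAR performance: compare return count and mean intensity on a fixed reference target between a baseline scan and a scan taken under the test condition. The point-cloud layout (x, y, z, intensity) and the target bounding box are assumptions, not details from the paper.

        import numpy as np

        def target_returns(points, bounds):
            """Select returns inside an axis-aligned box around the reference target.

            points: (N, 4) array of x, y, z, intensity (assumed layout).
            bounds: ((xmin, xmax), (ymin, ymax), (zmin, zmax)).
            """
            (x0, x1), (y0, y1), (z0, z1) = bounds
            mask = ((points[:, 0] >= x0) & (points[:, 0] <= x1) &
                    (points[:, 1] >= y0) & (points[:, 1] <= y1) &
                    (points[:, 2] >= z0) & (points[:, 2] <= z1))
            return points[mask]

        def degradation_metrics(baseline, test, bounds):
            """Relative change in return count and mean intensity on the target."""
            base = target_returns(baseline, bounds)
            cond = target_returns(test, bounds)
            if len(base) == 0 or len(cond) == 0:
                return {"return_ratio": 0.0, "intensity_ratio": 0.0}
            return {"return_ratio": len(cond) / len(base),
                    "intensity_ratio": cond[:, 3].mean() / base[:, 3].mean()}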

    Development and Flight of a Robust Optical-Inertial Navigation System Using Low-Cost Sensors

    This research develops and tests a precision navigation algorithm fusing optical and inertial measurements of unknown objects at unknown locations. It provides an alternative to the Global Positioning System (GPS) as a precision navigation source, enabling passive and low-cost navigation in situations where GPS is denied or unavailable. This paper describes two new contributions. First, a rigorous study of the fundamental nature of optical/inertial navigation is accomplished by examining the observability Gramian of the underlying measurement equations. This analysis yields a set of design principles guiding the development of optical/inertial navigation algorithms. The second contribution of this research is the development and flight test of an optical-inertial navigation system using low-cost and passive sensors (including an inexpensive commercial-grade inertial sensor, which is unsuitable for navigation by itself). This prototype system was built and flight tested at the U.S. Air Force Test Pilot School. The algorithm that was implemented leveraged the design principles described above and used images from a single camera. It was shown (and explained by the observability analysis) that the system gained significant performance by aiding it with a barometric altimeter and magnetic compass, and by using digital terrain elevation data (DTED). The (still) low-cost and passive system demonstrated performance comparable to high-quality navigation-grade inertial navigation systems, which cost an order of magnitude more than this optical-inertial prototype. The resultant performance of the system tested provides a robust and practical navigation solution for Air Force aircraft.
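
    The observability analysis itself is not reproduced in the abstract, but for a discrete-time linearized model x_{k+1} = F x_k with measurements y_k = H x_k, the observability Gramian can be accumulated as W_o = sum_k (F^k)^T H^T H F^k; its rank and conditioning indicate which state directions the measurements can recover. A small illustrative Python sketch with a toy position/velocity model (not the paper's actual system):

        import numpy as np

        def observability_gramian(F, H, steps):
            """Accumulate W_o = sum_k (F^k)^T H^T H F^k for a linear system."""
            n = F.shape[0]
            W = np.zeros((n, n))
            Phi = np.eye(n)
            for _ in range(steps):
                W += Phi.T @ H.T @ H @ Phi
                Phi = F @ Phi
            return W

        # Toy constant-velocity model with position-only measurements.
        dt = 0.1
        F = np.array([[1.0, dt],
                      [0.0, 1.0]])
        H = np.array([[1.0, 0.0]])

        W = observability_gramian(F, H, steps=50)
        print("rank:", np.linalg.matrix_rank(W))   # full rank: velocity becomes observable over time
        print("condition number:", np.linalg.cond(W))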

    Versatility Of Low-Power Wide-Area Network Applications

    Low-Power Wide-Area Network (LPWAN) is regarded as the leading communication technology for wide-area Internet-of-Things (IoT) applications. It offers low-power, long-range, and low-cost communication. With different communication requirements for varying IoT applications, many competing LPWAN technologies operating in both licensed (e.g., NB-IoT, LTE-M, and 5G) and unlicensed (e.g., LoRa and SigFox) bands have emerged. LPWANs are designed to support applications with low-power and low data rate operations. They are not well designed to host applications that involve high mobility, high traffic, or real-time communication (e.g., volcano monitoring and control applications). Despite the increasing number of mobile devices in many IoT domains (e.g., agricultural IoT and smart cities), mobility support is not well addressed in LPWAN. Cellular-based/licensed LPWANs rely on wired infrastructure to enable mobility. On the other hand, most unlicensed LPWANs operate in the crowded ISM band or are required to duty cycle, making handling mobility a challenge. In this dissertation, we first identify the key opportunities of LPWAN, highlight the challenges, and show potential directions for future research. We then broaden the versatility of LPWAN applications by enabling applications that involve mobility over LPWAN. Specifically, we propose to handle mobility in LPWAN over white space, considering Sensor Network Over White Space (SNOW). SNOW is a highly scalable and energy-efficient LPWAN operating over the TV white spaces, the allocated but locally unused TV channels (54 - 698 MHz in the US). We propose a dynamic Carrier Frequency Offset (CFO) estimation and compensation technique that considers the impact of the Doppler shift due to mobility. We also design energy-efficient and fast base station (BS) discovery and association approaches, and we demonstrate the feasibility of our approach through experiments in different deployments. Finally, we present a collision detection and recovery technique called RnR (Reverse & Replace Decoding) that applies to LPWANs, and we discuss future work on handling burst transmission over LPWAN and localization in mobile LPWAN.
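
    The dissertation's CFO estimator is not detailed in the abstract; a standard baseline, sketched below in Python, estimates the combined offset (including the Doppler component f_d = v * f_c / c for a mobile node) from the phase of the lag-L autocorrelation of a preamble built from two identical halves. The sample rate, preamble length, and 480 MHz TV-band carrier are illustrative assumptions.

        import numpy as np

        def estimate_cfo(rx, L, fs):
            """Estimate CFO (Hz) from a preamble made of two identical halves of length L.

            rx: complex baseband samples covering the repeated preamble.
            fs: sample rate in Hz.
            """
            r = np.sum(rx[L:2 * L] * np.conj(rx[:L]))   # lag-L autocorrelation
            return np.angle(r) * fs / (2 * np.pi * L)

        # Synthetic check: 480 MHz TV-band carrier, 30 m/s node -> ~48 Hz Doppler.
        fs, L = 50_000.0, 256
        f_doppler = 30.0 * 480e6 / 3e8                  # f_d = v * f_c / c
        n = np.arange(2 * L)
        rx = np.exp(2j * np.pi * f_doppler * n / fs)
        print(estimate_cfo(rx, L, fs))                  # close to f_doppler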

    Smart Water: A Prototype for Monitoring Water Consumption

    The traditional metering system for water meters, based on human readers, although still widely used by water utilities, is becoming increasingly unviable as cities grow. To address this problem, this paper presents the development of a prototype to monitor water consumption and an application that allows end users to visualize their consumption. The prototype uses the NodeMCU module, chosen because it is a low-cost device, together with a Wisol WSSFM10R2 Breakout module, which provides communication over the Sigfox network, considered an alternative network for IoT communications, using simple AT commands; Sigfox also provides the entire infrastructure for the developer. The paper also discusses how the Sigfox hardware and network operate, explaining how the pulses emitted by the flow and pressure sensors are converted and how the NodeMCU module is used to control the system and send messages over the Sigfox network. Overall, the prototype produced satisfactory results for the water consumption calculation, reaching accuracy rates above 90% in tests that used the values returned by both sensors under constant flow and an average accuracy of around 99% in tests with varied flows, demonstrating that the use of the pressure sensor improves the consumption calculation.
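
    The paper's exact calibration constants are not given, but the usual relation for a hall-effect flow sensor is flow (L/min) = pulse frequency / K, with K taken from the sensor datasheet. A small Python sketch of the accumulation the firmware would perform (the K factor of 7.5 is typical for hobby sensors such as the YF-S201, not a figure from the paper):

        # K_FACTOR = 7.5 (pulses per second per L/min) is typical for hobby
        # hall-effect flow sensors such as the YF-S201; the paper does not
        # state the constant actually used in the prototype.
        K_FACTOR = 7.5

        def litres_from_pulses(pulse_count, interval_s, k=K_FACTOR):
            """Volume (litres) that passed during one pulse-counting interval."""
            frequency = pulse_count / interval_s        # pulses per second
            flow_l_per_min = frequency / k              # datasheet relation
            return flow_l_per_min * (interval_s / 60.0)

        # Example: 450 pulses counted over a 10 s window -> about 1 litre.
        print(litres_from_pulses(450, 10.0))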