
    Weather Balloon Payload Box

    A payload box holding a self-rotating camera was constructed to fly on a weather balloon that will document the solar eclipse of August 21, 2017. A group of physics students, including the paper's author, is working under Dr. Darci Snowden on the CWU Near Space Observation Team on research dedicated to the eclipse in Oregon. Various projects, including the payload box, are being designed to fly on a high-altitude weather balloon. The payload box was designed and constructed to withstand the impact force of a fall from 120,000 ft, so that it can be reused for future weather balloon projects. To achieve this, the box was made from fiberglass and foam with a thickness of 4 cm. The payload box was also designed to hold an "imaging platform" that rotates a camera using a servo motor; the motor determines where to point the camera from the amount of light it senses through the windows of the payload box. During the August launch, the camera should be able to communicate with the "ground station" computer so that images can be viewed in real time. With an expected terminal velocity of 4.39 m/s (14.40 ft/s), the expected impact force the payload box was designed to withstand (while remaining reusable) is 68.03 N (15.29 lbf).
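    The kind of impact-force estimate quoted above follows from the impulse-momentum relation F = m·Δv/Δt. A minimal sketch of that calculation, where the terminal velocity is taken from the abstract but the payload mass and stopping time are hypothetical values assumed for illustration (neither is stated in the abstract):

```python
def impact_force(mass_kg, velocity_ms, stop_time_s):
    """Average impact force from impulse-momentum: F = m * v / t_stop.

    mass_kg and stop_time_s are assumed illustrative values, not
    figures from the report.
    """
    return mass_kg * velocity_ms / stop_time_s

# Terminal velocity 4.39 m/s from the abstract; 1.5 kg and 0.1 s are assumed.
force_n = impact_force(1.5, 4.39, 0.1)
```

    A shorter stopping time (a stiffer box) raises the peak force, which is why the crushable foam layer matters for reusability.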

    Real-time Panorama Stitching using a Single PTZ-Camera without using Image Feature Matching

    In surveillance applications, one concern is how much of a scene a single camera can cover. One way to increase coverage is to capture overlapping images and blend them, creating a new image with a larger field of view. In this thesis work we examine how panorama images can be created with a pan-tilt camera, and how fast it can be done. We chose a circular panorama representation. Our approach is that, by gathering enough metadata from the camera, one can rectify the captured images and blend them without matching feature points or other computationally heavy operations. We show that this can be done. The captured images were corrected for lens distortion and for rolling-shutter effects arising from rotating the camera. Attempts were made to find an optimal path for the camera to follow while capturing images, and an algorithm for intensity correction of the images was also implemented. We find that the camera can be rotated at high speed and still produce a good-quality panorama image. The limiting factors are the precision of the gathered metadata, such as motion data from the on-board gyro, and the lighting conditions, since a short shutter time is required to minimize motion blur. Quality varies with the time taken to capture the images needed for the spherical projection: the fastest run completed in 1.6 seconds with some distortions, while a run of around 4 seconds generally produces a good-quality panorama image.
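    The core idea of stitching from metadata alone can be sketched with a pinhole model: each pixel's ray angle relative to the optical axis, offset by the reported pan/tilt angles, maps directly to a column/row of an equirectangular panorama, with no feature matching. The function below is an illustrative sketch under those assumptions, not the thesis's implementation (lens-distortion and rolling-shutter corrections are omitted):

```python
import math

def pano_coords(x, y, pan_rad, tilt_rad, fx, fy, cx, cy, pano_w, pano_h):
    """Map an image pixel (x, y) to equirectangular panorama coordinates
    using only camera metadata (pan/tilt angles and intrinsics)."""
    # Ray angles of the pixel relative to the optical axis (pinhole model),
    # offset by the pan/tilt reported by the camera.
    yaw = pan_rad + math.atan2(x - cx, fx)
    pitch = tilt_rad + math.atan2(y - cy, fy)
    # Equirectangular mapping: longitude -> column, latitude -> row.
    u = (yaw + math.pi) / (2 * math.pi) * pano_w
    v = (pitch + math.pi / 2) / math.pi * pano_h
    return u, v
```

    Because the mapping depends only on the reported angles, any error in the metadata (e.g. gyro drift) shows up directly as misalignment in the panorama, which is why metadata precision is the limiting factor noted above.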

    Integrated surveying for the archaeological documentation of a neolithic site

    The applicability of integrated surveys (remote sensing, digital photogrammetry, and terrestrial laser scanning (TLS)) was tested in order to verify, through gradual and successive steps, how geomatic techniques can produce 3D results of metric value, combined with rich qualitative content, for an archaeological site. In particular, the data were collected during the excavation campaign at a Neolithic archaeological site in Taranto. The ability to scan articulated forms (curves, concavities, convexities, and projecting parts), including surfaces affected by alteration, through the acquisition of a dense point cloud makes the TLS technique essential in archaeology. The laser data were integrated, via the photogrammetric technique, for details found on the site that required a higher level of detail; the photogrammetric data were acquired with a calibrated camera. Processing and integrating the acquired data made it possible to study an important archaeological site in its totality, from small scale (general site framework) to large scale (3D model with a high degree of detail), and to structure a multi-temporal database for simplified data management.

    Infrared video tracking of UAVs: Guided landing in the absence of GPS signals

    Master's Project (M.S.), University of Alaska Fairbanks, 2019.
    Unmanned Aerial Vehicles (UAVs) use Global Positioning System (GPS) signals to determine their position for automated flight. GPS signals require an unobstructed view of the sky to obtain position information. When indoors without a clear view of the sky, such as in a building or a mine, other methods are necessary to obtain the relative position of the UAV. For obstacle avoidance a LIDAR/SONAR system is sufficient to ensure automated flight, but for precision landing it cannot effectively identify the location of the landing platform or provide the flight-control inputs needed to guide the UAV to it. This project was developed to solve that problem by creating a guidance system that uses an infrared (IR) camera on a Raspberry Pi 3 Model B+ to track an IR LED and blue LEDs mounted on the UAV. The Raspberry Pi, using OpenCV libraries, can effectively track the position of the LED lights mounted on the UAV, determine rotational and lateral corrections based on this tracking, and, using DroneKit-Python libraries, command the UAV to position itself and land on the platform of the Husky UGV (Unmanned Ground Vehicle).
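    The tracking step reduces to finding the bright-LED centroid in the IR frame and converting its offset from the image centre into a correction. A minimal dependency-free sketch of that idea (the project itself uses OpenCV thresholding and moments; the function names and the correction convention here are illustrative assumptions):

```python
def led_centroid(frame, threshold):
    """Centroid of pixels at or above threshold: a simplified stand-in for
    cv2.threshold followed by cv2.moments on the IR image."""
    xs, ys, n = 0.0, 0.0, 0
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value >= threshold:
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None  # no LED visible in this frame
    return xs / n, ys / n

def lateral_correction(centroid, frame_w, frame_h):
    """Signed pixel offsets from the image centre; a flight controller
    would translate these into lateral velocity commands."""
    cx, cy = centroid
    return cx - frame_w / 2, cy - frame_h / 2
```

    Feeding these offsets into the autopilot each frame closes the loop: as the UAV drifts, the centroid moves off-centre and the correction steers it back over the platform.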

    Self calibrating autoTRAC

    The work reported here demonstrates how to automatically compute the position and attitude of a targeting reflective alignment concept (TRAC) camera relative to the robot end effector. In the robotics literature this is known as the sensor registration problem. The registration problem is important to solve if TRAC images need to be related to robot position. Previously, when TRAC operated on the end of a robot arm, the camera had to be precisely located at the correct orientation and position. If this location is in error, the robot may not be able to grapple an object even though the TRAC sensor indicates it should. In addition, if the camera is significantly misaligned from its expected position and orientation, TRAC may give incorrect feedback for the control of the robot. A simple example: if the robot operator thinks the camera is right side up but it is actually upside down, the camera feedback will tell the operator to move in the wrong direction. The automatic calibration algorithm requires the operator to translate and rotate the robot by arbitrary amounts along (about) two coordinate directions. After the motion, the algorithm determines the transformation matrix from the robot end effector to the camera image plane. This report discusses the TRAC sensor registration problem.

    Influence of the Earth on the background and the sensitivity of the GRM and ECLAIRs instruments aboard the Chinese-French mission SVOM

    SVOM (Space-based multi-band astronomical Variable Object Monitor) is a future Chinese-French satellite mission dedicated to Gamma-Ray Burst (GRB) studies. Its anti-solar pointing strategy makes the Earth cross the field of view of its payload every orbit. In this paper, we present the variations of the gamma-ray background of the two high-energy instruments aboard SVOM, the Gamma-Ray Monitor (GRM) and ECLAIRs, as a function of the Earth's position. We conclude with an estimate of the Earth's influence on their sensitivity and their GRB detection capability.
    Comment: 24 pages, 15 figures, accepted for publication in Experimental Astronomy
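    The geometry of the Earth crossing the field of view is straightforward: from orbit, the Earth's disc subtends a half-angle of asin(R_E / (R_E + h)). A minimal sketch, using a hypothetical low-Earth-orbit altitude for illustration (the abstract does not state SVOM's altitude):

```python
import math

EARTH_RADIUS_KM = 6371.0

def earth_half_angle_deg(altitude_km):
    """Half-angle subtended by the Earth's disc as seen from altitude_km."""
    ratio = EARTH_RADIUS_KM / (EARTH_RADIUS_KM + altitude_km)
    return math.degrees(math.asin(ratio))

# At an assumed 600 km altitude the Earth blocks a half-angle of ~66 degrees,
# so it occupies a large part of any wide-field instrument's sky coverage.
half_angle = earth_half_angle_deg(600.0)
```

    This is why an anti-solar pointing strategy guarantees large, periodic background and sensitivity variations: once per orbit the Earth sweeps through most of the field of view.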

    Visual stimulation of saccades in magnetically tethered Drosophila

    Flying fruit flies, Drosophila melanogaster, perform 'body saccades', in which they change heading by about 90° in roughly 70 ms. In free flight, visual expansion can evoke saccades, and saccade-like turns are triggered by similar stimuli in tethered flies. However, because the fictive turns in rigidly tethered flies follow a much longer time course, the extent to which these two behaviors share a common neural basis is unknown. A key difference between tethered and free flight conditions is the presence of additional sensory cues in the latter, which might serve to modify the time course of the saccade motor program. To study the role of sensory feedback in saccades, we have developed a new preparation in which a fly is tethered to a fine steel pin that is aligned within a vertically oriented magnetic field, allowing it to rotate freely around its yaw axis. In this experimental paradigm, flies perform rapid turns averaging 35° in 80 ms, similar to the kinematics of free flight saccades. Our results indicate that tethered and free flight saccades share a common neural basis, but that the lack of appropriate feedback signals distorts the behavior performed by rigidly fixed flies. Using our new paradigm, we also investigated the features of visual stimuli that elicit saccades. Our data suggest that saccades are triggered when expanding objects reach a critical threshold size, but that their timing depends little on the precise time course of expansion. These results are consistent with expansion detection circuits studied in other insects, but do not exclude other models based on the integration of local movement detectors.
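    The threshold-size trigger model described above can be sketched numerically: an object of radius r at distance d subtends an angle of 2·atan(r/d), and a saccade fires once that angle crosses a fixed threshold, regardless of how quickly the expansion unfolded. The threshold value below is a hypothetical placeholder, not a figure from the study:

```python
import math

def angular_size_deg(radius_m, distance_m):
    """Visual angle (degrees) subtended by an object of given radius
    at a given distance: theta = 2 * atan(r / d)."""
    return math.degrees(2.0 * math.atan(radius_m / distance_m))

def saccade_triggered(radius_m, distance_m, threshold_deg=30.0):
    """Threshold-size model: trigger when the expanding object's angular
    size reaches threshold_deg (placeholder value, assumed for illustration)."""
    return angular_size_deg(radius_m, distance_m) >= threshold_deg
```

    Under this model the trigger time depends only on when the threshold angle is reached, not on the expansion profile, which matches the observation that saccade timing depends little on the precise time course of expansion.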