
    Diagnostic Techniques to Elucidate the Aerodynamic Performance of Acoustic Liners

    In support of Topic A.2.8 of NASA NRA NNH10ZEA001N, the University of Florida (UF) has investigated the use of flow-field optical diagnostics and micromachined sensor-based techniques for assessing the wall shear stress on an acoustic liner. Stereoscopic particle image velocimetry (sPIV) was used to study the velocity field over a liner in the Grazing Flow Impedance Duct (GFID). The results indicate that a control-volume-based method for determining the wall shear stress is prone to significant error. The skin friction over the liner, measured using velocity curve-fitting techniques, was shown to be locally reduced behind an orifice relative to the hard-wall case in a streamwise plane centered on the orifice. The capacitive wall shear stress sensor exhibited a linear response over a range of shear stresses on a hard wall. PIV over the liner is consistent with lifting of the near-wall turbulent structure as it passes over an orifice, followed by a region of low wall shear stress.
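    The abstract does not specify which velocity curve-fitting procedure was used; one common choice for estimating skin friction from a measured mean-velocity profile is a Clauser-style fit of the logarithmic law of the wall. The sketch below illustrates that generic approach under assumed log-law constants and fluid properties; the function names and values are illustrative, not taken from the paper.

```python
# Illustrative sketch (not the paper's exact procedure): estimate wall shear
# stress from a mean streamwise velocity profile by fitting the log-law
# u+ = (1/kappa) * ln(y+) + B, a Clauser-chart-style approach.
import numpy as np
from scipy.optimize import least_squares

KAPPA, B = 0.41, 5.0     # canonical log-law constants (assumed)
RHO, NU = 1.2, 1.5e-5    # assumed air density [kg/m^3] and kinematic viscosity [m^2/s]

def log_law_residual(u_tau, y, u):
    """Mismatch between measured velocity and the log-law for a trial u_tau."""
    y_plus = y * u_tau / NU
    u_plus_model = np.log(y_plus) / KAPPA + B
    return u / u_tau - u_plus_model

def wall_shear_from_profile(y, u, u_tau_guess=0.5):
    """Fit the friction velocity u_tau to (y, u) samples taken in the log region,
    then return the wall shear stress tau_w = rho * u_tau**2."""
    fit = least_squares(log_law_residual, u_tau_guess, args=(y, u),
                        bounds=(1e-3, 10.0))
    u_tau = float(fit.x[0])
    return RHO * u_tau**2, u_tau

# Example with synthetic "PIV" samples generated in the log region (u_tau = 0.6 m/s).
y = np.linspace(1e-3, 8e-3, 15)
u = 0.6 * (np.log(y * 0.6 / NU) / KAPPA + B)
tau_w, u_tau = wall_shear_from_profile(y, u)
print(f"u_tau = {u_tau:.3f} m/s, tau_w = {tau_w:.3f} Pa")
```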

    Vision-based interface applied to assistive robots

    This paper presents two vision-based interfaces that allow people with disabilities to command a mobile robot for personal assistance. The interfaces differ in the image-processing algorithm used to detect and track a body region. The first detects and tracks movements of the user's head and transforms them into linear and angular velocity commands for the mobile robot; the second does the same for movements of the user's hand. The paper also presents the control laws for the robot. Experimental results demonstrate good performance and a balance between complexity and feasibility for real-time applications.
    Authors: María Elisa Pérez Berenguer, Carlos Miguel Soria, Natalia Martina López Celani, Oscar Herminio Nasisi, and Vicente Antonio Mut (Universidad Nacional de San Juan / CONICET, Argentina).
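    The paper's own control laws are not reproduced in the abstract; a minimal sketch of the general idea, assuming a hypothetical tracker output and arbitrary velocity limits, is to map the tracked feature's offset from the image centre to linear and angular velocity commands with a dead zone and saturation.

```python
# Minimal sketch (not the authors' control law): map the tracked feature's
# displacement from the image centre to linear/angular velocity commands,
# with a dead zone and saturation, as a vision-based teleoperation interface.
import numpy as np

V_MAX, W_MAX = 0.4, 0.8        # assumed robot limits [m/s], [rad/s]
DEAD_ZONE = 0.05               # normalized displacement ignored as noise

def command_from_displacement(dx, dy):
    """dx, dy: horizontal/vertical offset of the tracked head (or hand)
    from the image centre, normalized to [-1, 1].
    Vertical offset drives forward speed, horizontal offset drives turning."""
    def shape(e, limit):
        if abs(e) < DEAD_ZONE:          # ignore small tremors / tracking jitter
            return 0.0
        return float(np.clip(e, -1.0, 1.0)) * limit
    v = shape(-dy, V_MAX)               # feature above centre -> move forward
    w = shape(-dx, W_MAX)               # feature left of centre -> rotate left
    return v, w

# e.g. head slightly to the right of and above the image centre:
print(command_from_displacement(dx=0.3, dy=-0.5))   # -> (v > 0, w < 0)
```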

    Animal-Inspired Agile Flight Using Optical Flow Sensing

    There is evidence that flying animals such as pigeons, goshawks, and bats use optical flow sensing to enable high-speed flight through forest clutter. This paper discusses the elements of a theory of controlled flight through obstacle fields in which motion control laws are based on optical flow sensing. Performance comparison is made with feedback laws that use distance and bearing measurements, and practical challenges of implementation on an actual robotic air vehicle are described. The related question of fundamental performance limits due to clutter density is addressed.
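    The abstract summarizes a family of optical-flow-based motion control laws without giving a specific formulation; one representative law of this general kind, sketched below under assumed gains and sign conventions, steers away from the image half with the larger average flow magnitude, the "balance" strategy attributed to flying insects and birds.

```python
# Hedged sketch of one optical-flow steering law of the general kind the paper
# analyses (not its specific formulation): yaw away from the image half with
# the larger average flow magnitude.
import numpy as np

K_TURN = 1.5   # assumed steering gain [rad/s per unit flow imbalance]

def steering_from_flow(flow):
    """flow: HxWx2 array of optical-flow vectors (e.g. from a dense estimator).
    Returns a yaw-rate command that steers away from the side with more flow."""
    mag = np.linalg.norm(flow, axis=2)          # per-pixel flow magnitude
    half = flow.shape[1] // 2
    left, right = mag[:, :half].mean(), mag[:, half:].mean()
    imbalance = (right - left) / (right + left + 1e-9)
    return -K_TURN * imbalance                  # more flow on the right -> turn left

# Toy example: stronger flow on the right half of the image.
flow = np.zeros((120, 160, 2))
flow[:, 80:, 0] = 2.0
print(steering_from_flow(flow))   # negative yaw rate -> turn left (assumed convention)
```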

    A contribution to vision-based autonomous helicopter flight in urban environments

    A navigation strategy that exploits optic flow and inertial information to continuously avoid collisions with both lateral and frontal obstacles has been used to control a simulated helicopter flying autonomously in a textured urban environment. Experimental results demonstrate that the corresponding controller generates cautious behavior, whereby the helicopter tends to stay in the middle of narrow corridors, while its forward velocity is automatically reduced when the obstacle density increases. When confronted with a frontal obstacle, the controller is also able to generate a tight U-turn that ensures the UAV's survival. The paper provides comparisons with related work and discusses the applicability of the approach to real platforms.
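    The behaviors described (corridor centering, slowing down in clutter, and U-turns in front of frontal obstacles) can be illustrated with a simple optic-flow loop; the sketch below uses assumed gains, thresholds, and a hypothetical function, and is not the paper's controller.

```python
# Illustrative sketch, not the paper's controller: an optic-flow navigation
# loop that (i) balances left/right flow to stay centred in a corridor,
# (ii) slows down as total translational flow grows (closer/denser obstacles),
# and (iii) flags a frontal obstacle when flow expansion in the image centre
# exceeds a threshold, which could trigger a U-turn manoeuvre.
K_LAT, V_NOM, K_V, DIV_LIMIT = 1.0, 3.0, 0.5, 0.8   # assumed gains and limits

def navigate(left_flow, right_flow, centre_divergence):
    """left_flow / right_flow: mean lateral-flow magnitudes from the two image
    halves (rotation-compensated with inertial data, as the abstract suggests).
    centre_divergence: mean flow expansion in the central image region."""
    # Positive lateral command -> move left, toward the lower-flow side (assumed convention).
    lateral_cmd = K_LAT * (right_flow - left_flow)
    forward_v = V_NOM / (1.0 + K_V * (left_flow + right_flow))  # slow down in clutter
    u_turn = centre_divergence > DIV_LIMIT                      # frontal obstacle ahead
    return lateral_cmd, forward_v, u_turn

print(navigate(left_flow=0.2, right_flow=0.9, centre_divergence=1.1))
```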

    Image-Based Flexible Endoscope Steering

    Manually steering the tip of a flexible endoscope to navigate through an endoluminal path relies on the physician’s dexterity and experience. In this paper we present the realization of a robotic flexible endoscope steering system that uses the endoscopic images to control the tip orientation towards the direction of the lumen. Two image-based control algorithms are investigated: one based on optical flow and the other on image intensity. Both are evaluated in simulations in which the endoscope was steered through the lumen; the RMS distance to the lumen center was less than 25% of the lumen width. An experimental setup was built using a standard flexible endoscope, and the image-based control algorithms were used to actuate the wheels of the endoscope for tip steering. Experiments were conducted in an anatomical model to simulate gastroscopy. The image intensity-based algorithm was capable of accurately steering the endoscope tip through an endoluminal path from the mouth to the duodenum. Compared to manual control, the robotically steered endoscope performed 68% better in terms of keeping the lumen centered in the image.
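    The abstract does not detail the intensity-based algorithm; a common intuition is that the lumen appears as the darkest region of the endoscopic image, so the tip can be bent toward the centroid of the darkest pixels. The sketch below illustrates that idea with an assumed gain and threshold; it is not the paper's implementation.

```python
# Minimal sketch of an intensity-based lumen-centring step in the spirit of the
# paper (not its exact algorithm): steer the tip toward the centroid of the
# darkest pixels, taken as an estimate of the lumen centre.
import numpy as np

K_STEER = 0.01          # assumed gain mapping pixel error to wheel actuation

def lumen_steering_command(gray):
    """gray: 2-D grayscale endoscopic image as a NumPy array.
    Returns (horizontal, vertical) steering commands for the bending tip."""
    threshold = np.percentile(gray, 5)              # darkest 5% of pixels
    ys, xs = np.nonzero(gray <= threshold)
    if xs.size == 0:                                # no dark region found
        return 0.0, 0.0
    cx, cy = xs.mean(), ys.mean()                   # lumen centroid estimate
    h, w = gray.shape
    err_x, err_y = cx - w / 2.0, cy - h / 2.0       # offset from image centre
    return K_STEER * err_x, K_STEER * err_y         # bend toward the lumen

# Toy frame with a dark "lumen" patch in the upper-left quadrant.
frame = np.full((240, 320), 200, dtype=np.uint8)
frame[40:140, 60:160] = 10
print(lumen_steering_command(frame))                # negative commands -> bend left/up
```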