5,960 research outputs found

    Survey of Visual and Force/Tactile Control of Robots for Physical Interaction in Spain

    Sensors provide robotic systems with the information required to perceive changes in unstructured environments and to modify their actions accordingly. The robotic controllers which process and analyze this sensory information are usually based on three types of sensors (visual, force/torque and tactile), which identify the most widespread robotic control strategies: visual servoing control, force control and tactile control. This paper presents a detailed review of the sensor architectures, algorithmic techniques and applications developed by Spanish researchers to implement these controllers, both mono-sensor and multi-sensor controllers that combine several sensors.
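Visual servoing, the first of the control strategies surveyed above, is commonly formulated as driving an image-feature error to zero. The sketch below is a minimal, textbook image-based visual servoing (IBVS) law for point features, not an implementation from the survey; the function names, the constant gain, and the assumption of known point depths Z are illustrative.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction (image Jacobian) matrix for one normalized image
    point (x, y) at depth Z, relating feature motion to the 6-DoF
    camera velocity (vx, vy, vz, wx, wy, wz)."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x**2), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y**2, -x * y, -x],
    ])

def ibvs_velocity(points, desired, depths, gain=0.5):
    """Classic IBVS control law: v = -gain * pinv(L) @ (s - s*)."""
    points = np.asarray(points, float)
    desired = np.asarray(desired, float)
    error = (points - desired).reshape(-1)        # stacked feature error
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(points, depths)])
    return -gain * np.linalg.pinv(L) @ error      # commanded camera twist

# At the desired configuration the feature error, and hence the
# commanded velocity, is zero; three non-collinear points make L full rank.
pts = np.array([[0.1, 0.2], [-0.1, 0.0], [0.3, -0.2]])
v_at_goal = ibvs_velocity(pts, pts, [1.0, 1.0, 1.0])
```

In a real controller this camera twist would be mapped through the hand-eye transform and the robot Jacobian to joint velocities; the depths Z are usually estimated online or approximated by their values at the desired pose.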

    Infrastructure Enabled Autonomy Acting as an Intelligent Transportation System for Autonomous Cars

    Autonomous cars have the ability to increase the safety, efficiency, and speed of travel. Yet many see a point at which stand-alone autonomous agents populate an area too densely, creating increased risk - particularly when each agent is operating and making decisions on its own and in its own self-interest. The problem at hand then becomes how best to implement and scale this new technology in such a way that it can keep pace with a rapidly changing world, benefiting not just individuals but societies. This research approaches the challenge by developing an intelligent transportation system that relies on infrastructure. The solution lies in removing sensing and computationally heavy tasks from the vehicles, allowing static ground stations with multi-sensor sensing packs (MSSPs) to sense the surrounding environment and direct the vehicles safely from start to goal. At a high level, the Infrastructure Enabled Autonomy (IEA) system uses less hardware, bandwidth, energy, and money to maintain a controlled environment for a vehicle operating in highly congested environments. Through the development of background detection algorithms, this research has shown the advantage of static MSSPs analyzing the same environment over time, which carry increased reliability because there are fewer unknowns about the area of interest. Testing determined that wireless commands can sufficiently operate a vehicle in a limited-agent environment and do not bottleneck the system. The horizontal trial showed that the switching-MSSP state of the IEA system had a loop time similar to the static state but a greatly increased standard deviation; however, a t-test at a 95 percent confidence level found the static and switching MSSP trials not significantly different. The final testing quantified the cross-track error: for a straight path, the vehicle controlled by the IEA system had a cross-track error of less than 12 centimeters, meaning that between the controller, network lag, and pixel error, the system was robust enough to generate stable control of the vehicle with minimal error.
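The significance comparison mentioned above (static vs. switching MSSP trials at a 95 percent confidence level) is a standard two-sample t-test. As a sketch, assuming the unequal-variance (Welch) form and purely illustrative loop-time data rather than the paper's measurements:

```python
import numpy as np

def welch_t(x, y):
    """Welch's two-sample t statistic and Welch-Satterthwaite
    degrees of freedom (no equal-variance assumption)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    vx = x.var(ddof=1) / len(x)   # squared standard error, sample x
    vy = y.var(ddof=1) / len(y)   # squared standard error, sample y
    t = (x.mean() - y.mean()) / np.sqrt(vx + vy)
    df = (vx + vy) ** 2 / (vx**2 / (len(x) - 1) + vy**2 / (len(y) - 1))
    return t, df

# Illustrative loop-time samples (seconds); not the paper's data.
static = [1.02, 0.98, 1.01, 1.00, 0.99]
switching = [1.03, 0.95, 1.08, 0.97, 1.01]
t_stat, dof = welch_t(static, switching)
```

The null hypothesis (equal mean loop times) is rejected at the 95 percent level when |t| exceeds the critical value of the t distribution with `dof` degrees of freedom, roughly 2 for moderate sample sizes.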

    3-D Scene Reconstruction from Aerial Imagery

    3-D scene reconstructions derived from Structure from Motion (SfM) and Multi-View Stereo (MVS) techniques were analyzed to determine the optimal reconnaissance flight characteristics for target reconstruction. In support of this goal, a preliminary study of a simple 3-D geometric object facilitated the analysis of convergence angles and number of camera frames within a controlled environment. Reconstruction accuracy measurements revealed that at least 3 camera frames and a 6° convergence angle were required to achieve results reminiscent of the original structure. The central investigative effort examined the applicability of certain airborne reconnaissance flight profiles to reconstructing ground targets. The data sets included images collected within a synthetic 3-D urban environment along circular, linear and s-curve aerial flight profiles equipped with agile and non-agile sensors. S-curve and dynamically controlled linear flight paths provided superior results, although with sufficient data conditioning and a combination of orthogonal flight paths, all flight profiles produced quality reconstructions under a wide variety of operational considerations.
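The convergence angle referred to above is simply the angle subtended at the target by the viewing rays from two camera stations. A minimal helper (coordinates and names are illustrative, not from the study):

```python
import numpy as np

def convergence_angle_deg(cam_a, cam_b, target):
    """Angle (degrees) at `target` between the rays toward two camera
    positions; this is the SfM convergence angle for that stereo pair."""
    v1 = np.asarray(cam_a, float) - np.asarray(target, float)
    v2 = np.asarray(cam_b, float) - np.asarray(target, float)
    c = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

# Two cameras on perpendicular bearings from the target subtend 90 degrees.
angle = convergence_angle_deg([100.0, 0.0, 50.0], [0.0, 100.0, 50.0],
                              [0.0, 0.0, 50.0])
```

Checking this angle across consecutive frames of a candidate flight path is a quick way to test whether a profile satisfies a minimum-convergence requirement such as the 6° found above.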

    Reionization and Cosmology with 21 cm Fluctuations

    Measurement of the spatial distribution of neutral hydrogen via the redshifted 21 cm line promises to revolutionize our knowledge of the epoch of reionization and the first galaxies, and may provide a powerful new tool for observational cosmology at redshifts 1 < z < 4. In this review we discuss recent advances in our theoretical understanding of the epoch of reionization (EoR), the application of 21 cm tomography to cosmology and measurements of the dark energy equation of state after reionization, and the instrumentation and observational techniques shared by 21 cm EoR and post-reionization cosmology machines. We place particular emphasis on the expected signal and observational capabilities of first-generation 21 cm fluctuation instruments. Comment: Invited review for Annual Review of Astronomy and Astrophysics (2010 volume).
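The observing frequency of the redshifted 21 cm line follows directly from the quoted redshift range: the rest-frame HI hyperfine frequency of about 1420.406 MHz is stretched by a factor (1 + z). A one-line sketch, using the standard HI line value:

```python
# Rest-frame frequency of the hydrogen 21 cm (HI) hyperfine line, in MHz.
REST_FREQ_21CM_MHZ = 1420.405751768

def observed_21cm_mhz(z):
    """Observed frequency of 21 cm radiation emitted at redshift z."""
    return REST_FREQ_21CM_MHZ / (1.0 + z)

# The post-reionization window 1 < z < 4 maps to roughly 284-710 MHz,
# which sets the band that such cosmology instruments must cover.
f_lo, f_hi = observed_21cm_mhz(4.0), observed_21cm_mhz(1.0)
```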

    Calibration and 3D Mapping for Multi-sensor Inspection Tasks with Industrial Robots

    Quality inspections are an essential part of ensuring the manufacturing process runs smoothly and that the final product meets high standards. Industrial robots have emerged as a key tool in conducting quality inspections, allowing for precision and consistency in the inspection process. By utilizing advanced inspection technologies, industrial robots can detect defects and anomalies in products at a faster pace than human inspectors, improving production efficiency. With the ability to automate repetitive and tedious inspection tasks, industrial robots can also reduce the risk of human error and increase product quality. As technology continues to advance, the use of industrial robots for quality inspections is becoming more widespread across industrial sectors, ranging from the automotive and manufacturing to the aerospace industries. The drawback of such a large variety of inspection tasks is that industrial inspections usually require specific robotic setups and appropriate sensors, making every inspection very specific and custom-built. For this reason, this thesis gives an overview of a general inspection framework that solves the problem of creating customized inspection workcells by proposing general software modules that can be easily configured to address each specific inspection scenario. In particular, this thesis focuses on the problems of Hand-Eye Calibration, i.e., accurately computing the position of the sensor in the workcell with respect to the robot frame, and Data Mapping, which maps sensor data onto the 3D model representation of the inspected object. For Hand-Eye Calibration we propose two techniques that accurately solve for the position of the sensor in multiple robotic setups. Both consider the eye-on-base and eye-in-hand robot-sensor configurations, i.e., whether the sensor is mounted at a fixed place in the workcell or on the end-effector of the robot manipulator, respectively. Moreover, one of the main contributions of this thesis is a general hand-eye calibration approach that, thanks to a unified pose-graph optimization formulation, is also capable of handling inspection setups where multiple sensors are involved (e.g., multi-camera networks). Finally, this thesis proposes a general method that takes advantage of a precise and accurate hand-eye calibration result to address the problem of Data Mapping for multi-purpose inspection robots. This approach has been applied in multiple inspection setups, ranging from the automotive to the aerospace and manufacturing industries. Most of the contributions presented in this thesis are available as open-source software packages. We believe that this fosters collaboration, enables precise repeatability of our experiments, and facilitates future research on the calibration of complex industrial robotic setups.
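The hand-eye problem described in this abstract is classically written as AX = XB, where A is a robot motion, B the corresponding sensor motion, and X the unknown sensor mounting transform. The following is a minimal numpy sketch of a two-step (rotation-then-translation) solver in the spirit of Tsai-Lenz, run on synthetic noise-free motions; it illustrates the formulation and is not the thesis's unified pose-graph method.

```python
import numpy as np

def rot_from_axis_angle(axis, angle):
    """Rodrigues' formula: rotation matrix from an axis-angle pair."""
    axis = np.asarray(axis, float)
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def rotation_axis(R):
    """Unit rotation axis of R, valid for rotation angles in (0, pi)."""
    v = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return v / np.linalg.norm(v)

def solve_hand_eye(As, Bs):
    """Solve AX = XB for X given paired 4x4 homogeneous motions."""
    # Step 1 (rotation): the axes satisfy a_i = R_X b_i, so align them
    # with an orthogonal Procrustes (Kabsch) fit.
    a = np.stack([rotation_axis(A[:3, :3]) for A in As])
    b = np.stack([rotation_axis(B[:3, :3]) for B in Bs])
    H = b.T @ a
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # proper rotation
    R = Vt.T @ D @ U.T
    # Step 2 (translation): (R_A - I) t_X = R_X t_B - t_A, stacked
    # over all motion pairs and solved by least squares.
    M = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.concatenate([R @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    t, *_ = np.linalg.lstsq(M, d, rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R, t
    return X

# Synthetic check: pick a ground-truth X, generate robot motions A_i,
# and derive the consistent sensor motions B_i = X^-1 A_i X.
rng = np.random.default_rng(0)
X_true = np.eye(4)
X_true[:3, :3] = rot_from_axis_angle(rng.normal(size=3), 0.7)
X_true[:3, 3] = [0.10, -0.05, 0.20]
As, Bs = [], []
for _ in range(5):
    A = np.eye(4)
    A[:3, :3] = rot_from_axis_angle(rng.normal(size=3), rng.uniform(0.3, 1.2))
    A[:3, 3] = rng.normal(scale=0.2, size=3)
    As.append(A)
    Bs.append(np.linalg.inv(X_true) @ A @ X_true)
X_est = solve_hand_eye(As, Bs)
```

OpenCV exposes several established variants of this solver through `cv2.calibrateHandEye`; with real, noisy measurements and multiple sensors, a joint formulation such as the pose-graph optimization proposed in the thesis is typically more robust than the two-step scheme above.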

    Viewfinder: final activity report

    The VIEW-FINDER project (2006-2009) is an 'Advanced Robotics' project that seeks to apply a semi-autonomous robotic system to inspect ground safety in the event of a fire. Its primary aim is to gather data (visual and chemical) in order to assist rescue personnel. A base station combines the gathered information with information retrieved from off-site sources. The project addresses key issues related to map building and reconstruction, interfacing local command information with external sources, human-robot interfaces, and semi-autonomous robot navigation. The VIEW-FINDER system is semi-autonomous: the individual robot-sensors operate autonomously within the limits of the task assigned to them, that is, they autonomously navigate through and inspect an area. Human operators monitor their operations and send high-level task requests as well as low-level commands through the interface to any node in the entire system. The human interface must ensure that the human supervisor and human interveners are provided with a reduced but relevant overview of the ground and of the robots and human rescue workers therein.

    A Comparative Review of Hand-Eye Calibration Techniques for Vision Guided Robots

    Hand-eye calibration enables proper perception of the environment in which a vision-guided robot operates. Additionally, it enables the mapping of the scene in the robot's frame. Proper hand-eye calibration is crucial when sub-millimetre perceptual accuracy is needed. For example, in robot-assisted surgery, a poorly calibrated robot would cause damage to surrounding vital tissues and organs, endangering the life of a patient. A great deal of research has gone into ways of accurately calibrating the hand-eye system of a robot, with different levels of success, challenges, resource requirements and complexities. As such, academics and industrial practitioners are faced with the challenge of choosing which algorithm meets their implementation requirements based on the identified constraints. This review aims to give a general overview of the strengths and weaknesses of the different hand-eye calibration algorithms available, to help academics and industrial practitioners make an informed design decision, as well as to suggest possible areas of research based on the identified challenges. We also discuss different calibration targets, an important part of the calibration process that is often overlooked in the design process.