23 research outputs found

    Balance de resultados en la asignatura Electrónica y Automática (Grado de Mecánica) con la incorporación de metodologías activas de aprendizaje

    [EN] It is well known that active teaching models make student learning more effective than the traditional lecture model. However, the attachment to the lecture model makes both students and teachers very reluctant to change drastically to an active model of teaching, especially in subjects with a high theoretical content and in first-year degree courses. For this reason, this article proposes combining the lecture model with active teaching methods in order to improve the teaching-learning process of first-year students. The article describes in detail a case of application in the subject Electronics and Automation of the second year of the Mechanical Engineering Degree. In addition, a comparison of the academic results obtained with the same teaching staff in the 2018/2019 academic year, using the traditional lecture model, and the 2019/2020 academic year, combining the lecture model with active methods, is shown and analyzed. Moreover, the article presents the students' opinions about their experience with the subject in the 2019/2020 academic year.
    Solanes, JE.; Gracia, L. (2021). Balance de resultados en la asignatura Electrónica y Automática (Grado de Mecánica) con la incorporación de metodologías activas de aprendizaje. En IN-RED 2020: VI Congreso de Innovación Educativa y Docencia en Red. Editorial Universitat Politècnica de València. 333-345. https://doi.org/10.4995/INRED2020.2020.11958

    On improving robot image-based visual servoing based on dual-rate reference filtering control strategy

    It is well known that the use of multi-rate control techniques has improved the performance of many systems in general, and of robotic systems in particular. The main contribution of this paper is the generalization of the Reference Filtering control strategy from a dual-rate point of view, improving its inherent properties by overcoming the problem of sensor latency. In the paper, we discuss and analyze the improvements introduced by the novel dual-rate reference filtering control strategy in terms of convergence time, reachability and robustness. More specifically, we discuss the capability to solve positioning tasks when hardware limitations are present with large sampling rates. In addition, a comparison is made between the single-rate and the proposed dual-rate control strategies to prove the advantages of the latter approach. A complete set-up has been prepared for validation, including a six-degree-of-freedom (DOF) industrial manipulator, a smart camera and embedded hardware used as a high-level controller.
    This work was supported by the VALi+d Program (Generalitat Valenciana), the DIVISAMOS Project (Spanish Ministry, DPI-2009-14744-C03-01), the PROMETEO Program (Conselleria d'Educació, Generalitat Valenciana) and SAFEBUS (Ministry of Economy and Competitiveness, IPT-2011-1165-370000).
    Solanes Galbis, JE.; Muñoz Benavent, P.; Girbés, V.; Armesto Ángel, L.; Tornero Montserrat, J. (2015). On improving robot image-based visual servoing based on dual-rate reference filtering control strategy. Robotica. 1-18. https://doi.org/10.1017/S0263574715000454
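The dual-rate idea the abstract describes, a slow vision sensor feeding a fast control loop, can be sketched with a toy 1-D constant-velocity Kalman predictor that fills in inter-sample estimates between camera frames; this is an illustration only, not the paper's implementation, and all gains and noise levels below are assumed:

```python
import numpy as np

def dual_rate_estimates(measurements, n_sub, dt):
    """Predict inter-sample feature positions between slow camera samples.

    A constant-velocity Kalman-style predictor runs at the fast control
    rate; the state is corrected only when a new (slow-rate) measurement
    arrives.  Toy 1-D stand-in for the dual-rate scheme.
    """
    x = np.array([measurements[0], 0.0])        # [position, velocity]
    P = np.eye(2)
    F = np.array([[1.0, dt], [0.0, 1.0]])       # constant-velocity model
    Q = 1e-4 * np.eye(2)                        # process noise (assumed)
    H = np.array([[1.0, 0.0]])                  # only position is measured
    R = np.array([[1e-3]])                      # measurement noise (assumed)
    out = []
    for z in measurements[1:]:
        for _ in range(n_sub):                  # fast-rate predictions
            x = F @ x
            P = F @ P @ F.T + Q
            out.append(x[0])
        # slow-rate correction when the camera sample finally arrives
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + (K @ (np.array([z]) - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
    return out
```

For a feature drifting at constant speed, the predictor quickly locks onto the velocity, so the fast-rate estimates track the target between the slow camera samples.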

    Sliding mode control for robust and smooth reference tracking in robot visual servoing

    [EN] An approach based on sliding mode is proposed in this work for reference tracking in robot visual servoing. In particular, two sliding mode controls are obtained, depending on whether joint accelerations or joint jerks are considered as the discontinuous control action. Both sliding mode controls are extensively compared in a 3D-simulated environment with their equivalent well-known continuous controls, which can be found in the literature, to highlight their similarities and differences. The main advantages of the proposed method are smoothness, robustness, and low computational cost. The applicability and robustness of the proposed approach are substantiated by experimental results using a conventional 6R industrial manipulator (KUKA KR 6 R900 sixx [AGILUS]) for positioning and tracking tasks.
    Spanish Government, Grant/Award Number: BES-2010-038486; Generalitat Valenciana, Grant/Award Numbers: BEST/2017/029 and APOSTD/2016/044.
    Muñoz-Benavent, P.; Gracia, L.; Solanes, JE.; Esparza, A.; Tornero, J. (2018). Sliding mode control for robust and smooth reference tracking in robot visual servoing. International Journal of Robust and Nonlinear Control. 28(5):1728-1756. https://doi.org/10.1002/rnc.3981

    Virtual Reality-Based Interface for Advanced Assisted Mobile Robot Teleoperation

    [EN] This work proposes a new interface for the teleoperation of mobile robots based on virtual reality that allows a natural and intuitive interaction and cooperation between the human and the robot, which is useful for many situations, such as inspection tasks, the mapping of complex environments, etc. Contrary to previous works, the proposed interface does not seek the realism of the virtual environment but provides the minimum necessary elements that allow the user to carry out the teleoperation task in a more natural and intuitive way. The teleoperation is carried out in such a way that the human user and the mobile robot cooperate synergistically to properly accomplish the task: the user guides the robot through the environment in order to benefit from the intelligence and adaptability of the human, whereas the robot automatically avoids collisions with the objects in the environment in order to benefit from its fast response. The latter is carried out using the well-known potential field-based navigation method. The efficacy of the proposed method is demonstrated through experimentation with the Turtlebot3 Burger mobile robot in both simulation and real-world scenarios. In addition, usability and presence questionnaires were conducted with users of different ages and backgrounds to demonstrate the benefits of the proposed approach. In particular, the results of these questionnaires show that the proposed virtual reality-based interface is intuitive, ergonomic and easy to use.
    This research was funded by the Spanish Government (Grant PID2020-117421RB-C21 funded by MCIN/AEI/10.13039/501100011033) and by the Generalitat Valenciana (Grant GV/2021/181).
    Solanes, JE.; Muñoz García, A.; Gracia Calandin, LI.; Tornero Montserrat, J. (2022). Virtual Reality-Based Interface for Advanced Assisted Mobile Robot Teleoperation. Applied Sciences. 12(12):1-22. https://doi.org/10.3390/app12126071
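The potential field-based navigation method mentioned above can be sketched in a few lines; this is the textbook attractive/repulsive formulation in 2-D with assumed gains, not the authors' implementation:

```python
import numpy as np

def potential_step(pos, goal, obstacles, k_att=1.0, k_rep=0.5,
                   rho0=1.0, step=0.05):
    """One step of the classic potential-field method used for automatic
    collision avoidance (toy 2-D version; all gains are assumed).

    An attractive force pulls toward the goal; every obstacle closer
    than the influence radius rho0 adds a repulsive force pushing away.
    """
    pos, goal = np.asarray(pos, float), np.asarray(goal, float)
    force = k_att * (goal - pos)               # attractive component
    for obs in obstacles:
        diff = pos - np.asarray(obs, float)
        rho = np.linalg.norm(diff)
        if 0 < rho < rho0:                     # repulsion only nearby
            force += k_rep * (1 / rho - 1 / rho0) / rho**2 * (diff / rho)
    norm = np.linalg.norm(force)
    return pos if norm == 0 else pos + step * force / norm
```

Iterating this step makes the robot skirt around nearby obstacles while still making progress toward the operator's goal, which matches the division of labour the abstract describes: the human supplies the goal, the field supplies the fast reflexes.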

    Generalization of reference filtering control strategy for 2D/3D visual feedback control of industrial robot manipulators

    This is an Author's Accepted Manuscript of an article published in Solanes, J. E., Munoz-Benavent, P., Armesto, L., Gracia, L., & Tornero, J. (2022). Generalization of reference filtering control strategy for 2D/3D visual feedback control of industrial robot manipulators. International Journal of Computer Integrated Manufacturing, 35(3), 229-246, 2021 Informa UK Limited, trading as Taylor & Francis Group, available online at: http://www.tandfonline.com/10.1080/0951192X.2021.1973108
    [EN] This paper develops the application of the Dual Rate Dual Sampling Reference Filtering Control Strategy to 2D and 3D visual feedback control. This strategy makes it possible to overcome the problem of sensor latency and to address the problem of control task failure due to visual features leaving the camera field of view. In particular, a Dual Rate Kalman Filter is used to generate inter-sample estimations of the visual features to deal with the problem of vision sensor latency, whereas a Dual Rate Extended Kalman Filter Smoother is used to generate more convenient visual feature trajectories in the image plane. Both 2D and 3D visual feedback control approaches are analyzed in depth throughout the paper, as well as the overall system performance using different visual feedback controllers, providing a set of results that highlight the improvements in terms of solution reachability, robustness, and time-domain response. The proposed control strategy has been validated on an industrial system with hard real-time limitations, consisting of a 6-DOF industrial manipulator, a 5 MP camera, and a PLC as controller.
    This work was supported in part by the Spanish Government under the projects PID2020-117421RB-C21 and PID2020-116585GB-I00, and in part by the Generalitat Valenciana under the project GV/2021/181.
    Solanes, JE.; Muñoz-Benavent, P.; Armesto, L.; Gracia Calandin, LI.; Tornero Montserrat, J. (2022). Generalization of reference filtering control strategy for 2D/3D visual feedback control of industrial robot manipulators. International Journal of Computer Integrated Manufacturing. 35(3):229-246. https://doi.org/10.1080/0951192X.2021.1973108
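The paper's Dual Rate Extended Kalman Filter Smoother is not reproduced here, but its linear core, a fixed-interval Rauch-Tung-Striebel smoother over a constant-velocity model, can be sketched as follows (1-D toy; the noise levels `q` and `r` are assumed):

```python
import numpy as np

def rts_smooth(zs, dt=0.1, q=1e-3, r=1e-2):
    """Fixed-interval Rauch-Tung-Striebel smoother on a constant-velocity
    model -- a linear stand-in for the EKF smoother used to shape visual
    feature trajectories (1-D toy version).
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity model
    H = np.array([[1.0, 0.0]])                 # only position is measured
    Q, R = q * np.eye(2), np.array([[r]])
    x, P = np.array([zs[0], 0.0]), np.eye(2)
    xs, Ps, xp, Pp = [], [], [], []
    for z in zs:                               # forward Kalman pass
        xpred, Ppred = F @ x, F @ P @ F.T + Q
        K = Ppred @ H.T @ np.linalg.inv(H @ Ppred @ H.T + R)
        x = xpred + (K @ (np.array([z]) - H @ xpred)).ravel()
        P = (np.eye(2) - K @ H) @ Ppred
        xs.append(x); Ps.append(P); xp.append(xpred); Pp.append(Ppred)
    xsm = [None] * len(zs)
    xsm[-1] = xs[-1]
    for k in range(len(zs) - 2, -1, -1):       # backward RTS pass
        G = Ps[k] @ F.T @ np.linalg.inv(Pp[k + 1])
        xsm[k] = xs[k] + G @ (xsm[k + 1] - xp[k + 1])
    return [s[0] for s in xsm]                 # smoothed positions
```

Because the backward pass conditions every estimate on the whole measurement window, the resulting feature trajectory is noticeably smoother than the raw measurements, which is what makes the smoothed image-plane references "more convenient" for the controller.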

    Combining haptics and inertial motion capture to enhance remote control of a dual-arm robot

    [EN] High dexterity is required in tasks in which there is contact between objects, such as surface conditioning (wiping, polishing, scuffing, sanding, etc.), especially when the location of the objects involved is unknown or highly inaccurate because they are moving, like a car body on an automotive industry line. These applications require both human adaptability and robot accuracy. However, sharing the same workspace is not possible in most cases due to safety issues. Hence, a multi-modal teleoperation system combining haptics and an inertial motion capture system is introduced in this work. The human operator gets the sense of touch thanks to haptic feedback, whereas the motion capture device allows more naturalistic movements. Visual feedback assistance is also introduced to enhance immersion. A Baxter dual-arm robot is used to offer more flexibility and manoeuvrability, allowing two independent operations to be performed simultaneously. Several tests have been carried out to assess the proposed system. As shown by the experimental results, the task duration is reduced and the overall performance improves thanks to the proposed teleoperation method.
    This research was funded by the Generalitat Valenciana (Grants GV/2021/074 and GV/2021/181) and by the Spanish Government (Grants PID2020-118071GB-I00 and PID2020-117421RB-C21 funded by MCIN/AEI/10.13039/501100011033). This work was also supported by Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES Brasil) under Finance Code 001, by CEFET-MG, and by a Royal Academy of Engineering Chair in Emerging Technologies to YD.
    Girbés-Juan, V.; Schettino, V.; Gracia Calandin, LI.; Solanes, JE.; Demiris, Y.; Tornero, J. (2022). Combining haptics and inertial motion capture to enhance remote control of a dual-arm robot. Journal on Multimodal User Interfaces. 16(2):219-238. https://doi.org/10.1007/s12193-021-00386-8
https://doi.org/10.1016/j.rcim.2018.07.005Waldron KJ, Schmiedeler J (2016) Kinematics. Springer, Cham, pp 11–36. https://doi.org/10.1007/978-3-319-32552-1_2Featherstone R, Orin DE (2016) Dynamics. Springer, Cham, pp 37–66. https://doi.org/10.1007/978-3-319-32552-1_3Wen K, Necsulescu D, Sasiadek J (2008) Haptic force control based on impedance/admittance control aided by visual feedback. Multimed Tools Appl 37(1):39–52. https://doi.org/10.1007/s11042-007-0172-1Tzafestas C, Velanas S, Fakiridis G (2008) Adaptive impedance control in haptic teleoperation to improve transparency under time-delay. In: IEEE international conference on robotics and automation, pp 212–219. https://doi.org/10.1109/ROBOT.2008.4543211Chiaverini S, Oriolo G, Maciejewski AA (2016) Redundant robots. Springer, Cham, pp 221–242. https://doi.org/10.1007/978-3-319-32552-1_10Ogata K (1987) Discrete-time control systems. McGraw-Hill, New YorkGarcía A, Girbés-Juan V, Solanes JE, Gracia L, Perez-Vidal C, Tornero J (2020) Human–robot cooperation for surface repair combining automatic and manual modes. IEEE Access 8:154024–154035. https://doi.org/10.1109/ACCESS.2020.301450

    Application of Neural Radiance Fields (NeRFs) for 3D Model Representation in the Industrial Metaverse

    Get PDF
[EN] This study explores the utilization of Neural Radiance Fields (NeRFs), with a specific focus on the Instant NeRFs technique. The objective is to represent three-dimensional (3D) models within the context of the industrial metaverse, aiming to achieve a high-fidelity reconstruction of objects in virtual environments. NeRFs, renowned for their innovative approach, enable comprehensive model reconstructions by integrating diverse viewpoints and lighting conditions. The study employs tools such as Unity, Photon Pun2, and Oculus Interaction SDK to develop an immersive metaverse. Within this virtual industrial environment, users encounter numerous interactive six-dimensional (6D) models, fostering active engagement and enriching the overall experience. While initial implementations showcase promising results, they also introduce computational complexities. Nevertheless, this integration forms the basis for immersive comprehension and collaborative interactions within the industrial metaverse. The evolving potential of NeRF technology promises even more exciting prospects in the future. This work has been funded by the Spanish Government (Grant PID2020-117421RB-C21 funded by MCIN/AEI/10.13039/501100011033). Fabra, L.; Solanes, JE.; Muñoz García, A.; Martí Testón, A.; Alabau, A.; Gracia Calandin, LI. (2024). Application of Neural Radiance Fields (NeRFs) for 3D Model Representation in the Industrial Metaverse. Applied Sciences. 14(5). https://doi.org/10.3390/app14051825

    Exploring the Relationship between the Coverage of AI in WIRED Magazine and Public Opinion Using Sentiment Analysis

    Get PDF
[EN] The presence and significance of artificial intelligence (AI) technology in society have been steadily increasing since 2000. While its potential benefits are widely acknowledged, concerns about its impact on society, the economy, and ethics have also been raised. Consequently, artificial intelligence has garnered widespread attention in news media and popular culture. As mass media plays a pivotal role in shaping public perception, it is crucial to evaluate opinions expressed in these outlets. Understanding the public's perception of artificial intelligence is essential for effective public policy and decision making. This paper presents the results of a sentiment analysis study conducted on WIRED magazine's coverage of artificial intelligence between January 2018 and April 2023. The objective of the study is to assess the prevailing opinions towards artificial intelligence in articles from WIRED magazine, which is widely recognized as one of the most reputable and influential publications in the field of technology and innovation. Using two sentiment analysis techniques, AFINN and VADER, a total of 4265 articles were analyzed for positive, negative, and neutral sentiments. Additionally, a term frequency analysis was conducted to categorize articles based on the frequency of mentions of artificial intelligence. Finally, a linear regression analysis of the mean positive and negative sentiments was performed to examine trends for each month over a five-year period. The results revealed a clear pattern: positive sentiment predominated, with an upward trend in both positive and negative sentiments. This polarization of sentiment suggests a shift towards more extreme positions, which should influence public policy and decision making in the near future. This work has been funded by the Spanish Government (Grant PID2020-117421RB-C21 funded by MCIN/AEI/10.13039/501100011033) and by the Generalitat Valenciana (Grant INVEST/2022/324). Moriniello, F.; Martí Testón, A.; Muñoz García, A.; Silva Jasaui, D.; Gracia Calandin, LI.; Solanes, JE. (2024). Exploring the Relationship between the Coverage of AI in WIRED Magazine and Public Opinion Using Sentiment Analysis. Applied Sciences. 14(5). https://doi.org/10.3390/app14051994
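The AFINN technique used in the study above is a lexicon-based approach: each article's sentiment is the sum of the valence scores of its words. The following minimal sketch illustrates that idea; the tiny lexicon, its scores, and the example texts are invented for demonstration (the real AFINN list contains thousands of words rated from -5 to +5):

```python
import re

# Illustrative mini-lexicon (hypothetical subset, NOT the real AFINN word list).
LEXICON = {
    "breakthrough": 3, "promising": 2, "benefit": 2, "improve": 2,
    "risk": -2, "threat": -3, "bias": -2, "fear": -3,
}

def afinn_score(text: str) -> int:
    """Sum lexicon valences over the lowercase word tokens of one article."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return sum(LEXICON.get(t, 0) for t in tokens)

def classify(score: int) -> str:
    """Map a summed valence to a positive / negative / neutral label."""
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

articles = [
    "AI is a promising breakthrough that could improve medicine.",
    "Critics fear the risk of bias and the threat to jobs.",
]
scores = [afinn_score(a) for a in articles]
labels = [classify(s) for s in scores]
```

VADER, the second technique mentioned, builds on the same lexicon idea but adds heuristic rules for negation, intensifiers, and punctuation.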

    Bimanual robot control for surface treatment tasks

    Full text link
This is an Author's Accepted Manuscript of an article published in Alberto García, J. Ernesto Solanes, Luis Gracia, Pau Muñoz-Benavent, Vicent Girbés-Juan & Josep Tornero (2022) Bimanual robot control for surface treatment tasks, International Journal of Systems Science, 53:1, 74-107, DOI: 10.1080/00207721.2021.1938279 [copyright Taylor & Francis], available online at: http://www.tandfonline.com/10.1080/00207721.2021.1938279
[EN] This work develops a method to perform surface treatment tasks using a bimanual robotic system, i.e. two robot arms cooperatively performing the task. In particular, one robot arm holds the workpiece while the other robot arm has the treatment tool attached to its end-effector. Moreover, the human user teleoperates all six coordinates of the former robot arm and two coordinates of the latter robot arm, i.e. the teleoperator can move the treatment tool on the plane given by the workpiece surface. Furthermore, a force sensor attached to the treatment tool is used to automatically attain the desired pressure between the tool and the workpiece and to automatically keep the tool orientation orthogonal to the workpiece surface. In addition, to assist the human user during teleoperation, several constraints are defined for both robot arms in order to avoid exceeding the allowed workspace, e.g. to avoid collisions with other objects in the environment. The theory used in this work to develop the bimanual robot control relies on sliding mode control and task prioritisation. Finally, the feasibility and effectiveness of the method are shown through experimental results using two robot arms. This work was supported by Generalitat Valenciana [grant numbers ACIF/2019/007 and GV/2021/181] and the Spanish Ministry of Science and Innovation [grant number PID2020-117421RB-C21]. García-Fernández, A.; Solanes, JE.; Gracia Calandin, LI.; Muñoz-Benavent, P.; Girbés-Juan, V.; Tornero, J. (2022). Bimanual robot control for surface treatment tasks. International Journal of Systems Science. 53(1):74-107. https://doi.org/10.1080/00207721.2021.1938279
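The abstract's core force-control idea (use a force sensor to attain a desired tool pressure against the workpiece) can be illustrated with a deliberately simplified one-axis loop. This sketch uses a plain PI law against an assumed linear-spring contact model; it is NOT the sliding mode / task-priority controller of the paper, and the stiffness, gains, and sample time are invented for illustration:

```python
# Hypothetical 1-DoF contact model and PI pressure regulator (illustrative only).
K_ENV = 10000.0   # assumed environment stiffness (N/m)
F_DES = 15.0      # desired contact force (N)
KP, KI, DT = 5e-5, 1e-5, 0.01  # illustrative gains and sample time

def contact_force(z: float) -> float:
    """Spring model: force grows with penetration below the surface at z = 0."""
    return K_ENV * max(0.0, -z)

z, integ = 0.0, 0.0                # tool starts just touching the surface
for _ in range(500):               # one control cycle per iteration
    e = F_DES - contact_force(z)   # force tracking error (N)
    integ += e * DT                # integral of the error
    z -= KP * e + KI * integ       # push in when force is low, retreat when high

final_force = contact_force(z)     # converges toward F_DES
```

In the paper's bimanual setting this regulation acts along the tool's approach axis as one prioritised task, while teleoperation and workspace constraints occupy the remaining degrees of freedom.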

    Virtual Production: Real-Time Rendering Pipelines for Indie Studios and the Potential in Different Scenarios

    Get PDF
[EN] This work aims to identify and propose a functional pipeline for indie live-action films using Virtual Production with photorealistic real-time rendering game engines. The new production landscape is radically changing how movies and shows are made: productions were previously made in a linear pipeline, whereas filmmakers can now execute multiple tasks in parallel using real-time renderers, with high potential for different types of productions. Four interviews with professionals in the Spanish film and television market were conducted to obtain a complete perspective of the new paradigm. Following those examples, a virtual production set was implemented with an Antilatency tracking system, Unreal Engine (version 5.3), and Aximmetry (version 2023.3.2) as the leading software applications. The results are discussed, showing how pre-production, shooting, and post-production are now closely connected, and analyzing the potential of virtual production in different fields. This work has been funded by the Spanish Government (Grant PID2020-117421RB-C21 funded by MCIN/AEI/10.13039/501100011033) and by the Generalitat Valenciana (Grant INVEST/2022/324). Silva Jasaui, D.; Martí Testón, A.; Muñoz García, A.; Moriniello, F.; Solanes, JE.; Gracia Calandin, LI. (2024). Virtual Production: Real-Time Rendering Pipelines for Indie Studios and the Potential in Different Scenarios. Applied Sciences. 14(6). https://doi.org/10.3390/app14062530