
    Insect inspired visual motion sensing and flying robots

    Flying insects are masters of visual motion sensing. They use dedicated motion processing circuits at low energy and computational cost. Drawing on observations of insect visual guidance, we developed visual motion sensors and bio-inspired autopilots dedicated to flying robots. Optic flow-based visuomotor control systems have been implemented on an increasingly large number of sighted autonomous robots. In this chapter, we present how we designed and constructed local motion sensors and how we implemented bio-inspired visual guidance schemes on board several micro-aerial vehicles. A hyperacute sensor, in which retinal micro-scanning movements are performed by a small piezo-bender actuator, was mounted on a miniature aerial robot. The OSCAR II robot is able to track a moving target accurately by exploiting the micro-scanning movement imposed on its eye's retina. We also present two interdependent control schemes, one driving the eye's angular position in the robot frame and the other driving the robot's body angular position with respect to a visual target, neither requiring any knowledge of the robot's orientation in the global frame. This "steering-by-gazing" control strategy, implemented on this lightweight (100 g) miniature sighted aerial robot, demonstrates the effectiveness of this biomimetic visual/inertial heading control strategy.
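The abstract does not give the control laws, so the following is only a minimal sketch, under assumed first-order proportional dynamics, of how such a steering-by-gazing scheme can be wired: the eye is servoed to cancel the retinal error, while the body is steered to re-centre the eye in its orbit, using only the retinal error and the eye-in-robot angle (no global orientation).

```python
import math

# Minimal sketch (assumption, not the authors' implementation) of a
# "steering-by-gazing" loop with two interdependent proportional controllers.

K_EYE = 4.0    # assumed gain of the gaze (eye-in-robot) controller, 1/s
K_BODY = 1.5   # assumed gain of the body-yaw controller, 1/s
DT = 0.01      # integration step, s

def simulate(target_bearing=0.4, steps=500):
    body_yaw = 0.0      # robot heading in an arbitrary world frame (unknown to the controller)
    eye_in_robot = 0.0  # eye orientation relative to the body
    for _ in range(steps):
        gaze = body_yaw + eye_in_robot           # where the eye actually points
        retinal_error = target_bearing - gaze    # target position on the retina
        # 1) gaze control: rotate the eye to cancel the retinal error
        eye_rate = K_EYE * retinal_error
        # 2) heading control: steer the body to re-centre the eye in its orbit
        yaw_rate = K_BODY * eye_in_robot
        eye_in_robot += (eye_rate - yaw_rate) * DT   # the eye is carried by the rotating body
        body_yaw += yaw_rate * DT
    return body_yaw, eye_in_robot

if __name__ == "__main__":
    yaw, eye = simulate()
    print(f"final body yaw: {yaw:.3f} rad, residual eye deflection: {eye:.3f} rad")
```

In steady state the retinal error and the eye deflection both go to zero, so the body ends up heading towards the target even though neither controller ever uses a global reference.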

    Localization in Low Luminance, Slippery Indoor Environment Using Afocal Optical Flow Sensor and Image Processing

    Doctoral dissertation, Department of Electrical and Computer Engineering, College of Engineering, Seoul National University, August 2017. Advisor: Dong-il Cho. Localization is an essential requirement for the autonomous driving of indoor service robots. In particular, in low-luminance indoor environments where camera-based localization is difficult, localization accuracy degrades further when wheel slippage occurs. Slippage arises mainly when driving over carpets or door sills, and wheel-encoder-based odometry is limited in its ability to measure the distance actually travelled. This dissertation proposes a robust localization method for low-luminance, slippery environments in which camera-based simultaneous localization and mapping (SLAM) does not work reliably, by fusing a low-cost motion sensor, an afocal optical flow sensor (AOFS), and a VGA-class forward-facing monocular camera. The robot's position is estimated by accumulating incremental changes in travel distance and heading angle: displacement measurements from the wheel encoders and the AOFS are fused for more accurate distance estimation on slippery surfaces, and a gyroscope together with indoor spatial information extracted from the forward-facing images is used for heading estimation. An optical flow sensor estimates displacement robustly against wheel slippage, but when it is mounted on a mobile robot driving over uneven surfaces such as carpet, variations in the height between the sensor and the floor become the main source of distance estimation error. This dissertation mitigates that error source by applying the afocal-system principle to the optical flow sensor. In experiments with a robotic gantry system, in which the sensor height was varied from 30 mm to 50 mm while travelling 80 cm over carpet and three other floor materials (10 repetitions each), the proposed AOFS module produced a systematic error of 0.1% per 1 mm of height change, whereas a conventional fixed-focus optical flow sensor showed a systematic error of 14.7%. With the AOFS mounted on an indoor mobile service robot travelling 1 m over carpet, the mean distance estimation error was 0.02% with a variance of 17.6%, whereas the same experiment with a fixed-focus optical flow sensor gave a mean error of 4.09% and a variance of 25.7%. For situations where the surroundings are too dark for the images to be used directly for position correction, that is, where a low-luminance image can be brightened but robust feature points or feature lines for SLAM still cannot be extracted, a method is also proposed for exploiting the low-luminance images to correct the robot's heading angle.
์ €์กฐ๋„ ์˜์ƒ์— ํžˆ์Šคํ† ๊ทธ๋žจ ํ‰ํ™œํ™”(histogram equalization) ์•Œ๊ณ ๋ฆฌ์ฆ˜์„ ์ ์šฉํ•˜๋ฉด ์˜์ƒ์ด ๋ฐ๊ฒŒ ๋ณด์ • ๋˜๋ฉด์„œ ๋™์‹œ์— ์žก์Œ๋„ ์ฆ๊ฐ€ํ•˜๊ฒŒ ๋˜๋Š”๋ฐ, ์˜์ƒ ์žก์Œ์„ ์—†์• ๋Š” ๋™์‹œ์— ์ด๋ฏธ์ง€ ๊ฒฝ๊ณ„๋ฅผ ๋šœ๋ ทํ•˜๊ฒŒ ํ•˜๋Š” ๋กค๋ง ๊ฐ€์ด๋˜์Šค ํ•„ํ„ฐ(rolling guidance filterRGF)๋ฅผ ์ ์šฉํ•˜์—ฌ ์ด๋ฏธ์ง€๋ฅผ ๊ฐœ์„ ํ•˜๊ณ , ์ด ์ด๋ฏธ์ง€์—์„œ ์‹ค๋‚ด ๊ณต๊ฐ„์„ ๊ตฌ์„ฑํ•˜๋Š” ์ง๊ต ์ง์„  ์„ฑ๋ถ„์„ ์ถ”์ถœ ํ›„ ์†Œ์‹ค์ (vanishing pointVP)์„ ์ถ”์ •ํ•˜๊ณ  ์†Œ์‹ค์ ์„ ๊ธฐ์ค€์œผ๋กœ ํ•œ ๋กœ๋ด‡ ์ƒ๋Œ€ ๋ฐฉ์œ„๊ฐ์„ ํš๋“ํ•˜์—ฌ ๊ฐ๋„ ๋ณด์ •์— ํ™œ์šฉํ–ˆ๋‹ค. ์ œ์•ˆํ•˜๋Š” ๋ฐฉ๋ฒ•์„ ๋กœ๋ด‡์— ์ ์šฉํ•˜์—ฌ 0.06 ~ 0.21 lx ์˜ ์ €์กฐ๋„ ์‹ค๋‚ด ๊ณต๊ฐ„(77 sqm)์— ์นดํŽซ์„ ์„ค์น˜ํ•˜๊ณ  ์ฃผํ–‰ํ–ˆ์„ ๊ฒฝ์šฐ, ๋กœ๋ด‡์˜ ๋ณต๊ท€ ์œ„์น˜ ์˜ค์ฐจ๊ฐ€ ๊ธฐ์กด 401 cm ์—์„œ 21 cm๋กœ ์ค„์–ด๋“ฆ์„ ํ™•์ธํ•  ์ˆ˜ ์žˆ์—ˆ๋‹ค.์ œ 1 ์žฅ ์„œ ๋ก  1 1.1 ์—ฐ๊ตฌ์˜ ๋ฐฐ๊ฒฝ 1 1.2 ์„ ํ–‰ ์—ฐ๊ตฌ ์กฐ์‚ฌ 6 1.2.1 ์‹ค๋‚ด ์ด๋™ํ˜• ์„œ๋น„์Šค ๋กœ๋ด‡์˜ ๋ฏธ๋„๋Ÿฌ์ง ๊ฐ์ง€ ๊ธฐ์ˆ  6 1.2.2 ์ €์กฐ๋„ ์˜์ƒ ๊ฐœ์„  ๊ธฐ์ˆ  8 1.3 ๊ธฐ์—ฌ๋„ 12 1.4 ๋…ผ๋ฌธ์˜ ๊ตฌ์„ฑ 14 ์ œ 2 ์žฅ ๋ฌดํ•œ์ดˆ์  ๊ด‘ํ•™ํ๋ฆ„์„ผ์„œ(AOFS) ๋ชจ๋“ˆ 16 2.1 ๋ฌดํ•œ์ดˆ์  ์‹œ์Šคํ…œ(afocal system) 16 2.2 ๋ฐ”๋Š˜๊ตฌ๋ฉ ํšจ๊ณผ 18 2.3 ๋ฌดํ•œ์ดˆ์  ๊ด‘ํ•™ํ๋ฆ„์„ผ์„œ(AOFS) ๋ชจ๋“ˆ ํ”„๋กœํ† ํƒ€์ž… 20 2.4 ๋ฌดํ•œ์ดˆ์  ๊ด‘ํ•™ํ๋ฆ„์„ผ์„œ(AOFS) ๋ชจ๋“ˆ ์‹คํ—˜ ๊ณ„ํš 24 2.5 ๋ฌดํ•œ์ดˆ์  ๊ด‘ํ•™ํ๋ฆ„์„ผ์„œ(AOFS) ๋ชจ๋“ˆ ์‹คํ—˜ ๊ฒฐ๊ณผ 29 ์ œ 3 ์žฅ ์ €์กฐ๋„์˜์ƒ์˜ ๋ฐฉ์œ„๊ฐ๋ณด์ • ํ™œ์šฉ๋ฐฉ๋ฒ• 36 3.1 ์ €์กฐ๋„ ์˜์ƒ ๊ฐœ์„  ๋ฐฉ๋ฒ• 36 3.2 ํ•œ ์žฅ์˜ ์˜์ƒ์œผ๋กœ ์‹ค๋‚ด ๊ณต๊ฐ„ ํŒŒ์•… ๋ฐฉ๋ฒ• 38 3.3 ์†Œ์‹ค์  ๋žœ๋“œ๋งˆํฌ๋ฅผ ์ด์šฉํ•œ ๋กœ๋ด‡ ๊ฐ๋„ ์ถ”์ • 41 3.4 ์ตœ์ข… ์ฃผํ–‰๊ธฐ๋ก ์•Œ๊ณ ๋ฆฌ์ฆ˜ 46 3.5 ์ €์กฐ๋„์˜์ƒ์˜ ๋ฐฉ์œ„๊ฐ ๋ณด์ • ์‹คํ—˜ ๊ณ„ํš 48 3.6 ์ €์กฐ๋„์˜์ƒ์˜ ๋ฐฉ์œ„๊ฐ ๋ณด์ • ์‹คํ—˜ ๊ฒฐ๊ณผ 50 ์ œ 4 ์žฅ ์ €์กฐ๋„ ํ™˜๊ฒฝ ์œ„์น˜์ธ์‹ ์‹คํ—˜ ๊ฒฐ๊ณผ 54 4.1 ์‹คํ—˜ ํ™˜๊ฒฝ 54 4.2 ์‹œ๋ฎฌ๋ ˆ์ด์…˜ ์‹คํ—˜ ๊ฒฐ๊ณผ 59 4.3 ์ž„๋ฒ ๋””๋“œ ์‹คํ—˜ ๊ฒฐ๊ณผ 61 ์ œ 5 ์žฅ ๊ฒฐ๋ก  62Docto

    MEMS Technology for Biomedical Imaging Applications

    Biomedical imaging is the key technique and process for creating informative images of the human body or other organic structures for clinical purposes or medical science. Micro-electro-mechanical systems (MEMS) technology has demonstrated enormous potential in biomedical imaging applications thanks to its advantages of, for instance, miniaturization, high speed, high resolution, and convenience of batch fabrication. Many advancements and breakthroughs are being developed in the academic community, and corresponding challenges arise in the design, structure, fabrication, integration, and application of MEMS for all kinds of biomedical imaging. This Special Issue aims to collate and showcase research papers, short communications, perspectives, and insightful review articles from esteemed colleagues that demonstrate: (1) original works on MEMS components or devices based on various kinds of mechanisms for biomedical imaging; and (2) new developments in, and the potential of, applying MEMS technology of any kind in biomedical imaging. The objective of this Special Issue is to provide insightful information regarding these technological advancements to researchers in the community.

    Vision-based control of near-obstacle flight

    Lightweight micro unmanned aerial vehicles (micro-UAVs) capable of autonomous flight in natural and urban environments have a large potential for civil and commercial applications, including environmental monitoring, forest fire monitoring, homeland security, traffic monitoring, aerial imagery, mapping and search and rescue. Smaller micro-UAVs capable of flying inside houses or small indoor environments have further applications in the domains of surveillance, search and rescue and entertainment. These applications require the capability to fly near the ground and amongst obstacles. Existing UAVs rely on GPS and an AHRS (attitude and heading reference system) to control their flight and are unable to detect and avoid obstacles. Active distance sensors such as radars or laser range finders could be used to measure distances to obstacles, but are typically too heavy and power-consuming to be embedded on lightweight systems. In this thesis, we draw inspiration from biology and explore alternative approaches to flight control that allow aircraft to fly near obstacles. We show that optic flow can be used on flying platforms to estimate the proximity of obstacles and propose a novel control strategy, called optiPilot, for vision-based near-obstacle flight. Thanks to optiPilot, we demonstrate for the first time autonomous near-obstacle flight of micro-UAVs, both indoors and outdoors, without relying on an AHRS or external beacons such as GPS. The control strategy only requires a small set of optic flow sensors, two rate gyroscopes and an airspeed sensor, and it can run on a tiny embedded microcontroller in real time. Despite its simplicity, optiPilot is able to fully control the aircraft, including altitude regulation, attitude stabilisation, obstacle avoidance, landing and take-off. This parsimony, inherited from the biology of flying insects, contrasts with the complexity of the systems used so far for flight control, while offering more capabilities. The results presented in this thesis contribute to a better understanding of the minimal requirements, in terms of sensing and control architecture, that enable animals and artificial systems to fly, and bring closer to reality the perspective of using lightweight and inexpensive micro-UAVs for civilian purposes.
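The central idea described here is that translational optic flow encodes the proximity of surrounding obstacles: for a viewing direction at angle theta from the flight direction, pure translation at speed v past a surface at distance D produces a flow magnitude of (v / D) * sin(theta), so proximity can be recovered from the measured flow and the airspeed. The sketch below only illustrates that standard relation; the function names and the numbers in the example are assumptions, not the thesis' implementation.

```python
import math

def translational_flow(speed: float, distance: float, theta: float) -> float:
    """Optic flow (rad/s) seen in viewing direction `theta` (rad from the
    flight direction) when translating at `speed` (m/s) past a surface at
    `distance` (m). Pure translation, no rotation: OF = (v / D) * sin(theta)."""
    return (speed / distance) * math.sin(theta)

def proximity_from_flow(flow: float, speed: float, theta: float) -> float:
    """Invert the relation above to get 1/D, the obstacle proximity (1/m).
    The rotational flow component is assumed to have already been removed
    (derotated), e.g. using rate gyroscope measurements."""
    return flow / (speed * math.sin(theta))

# Example: 12 m/s airspeed, a sensor looking 45 degrees away from the flight
# path, and a derotated flow of about 1.7 rad/s -> surface at about 5 m.
theta = math.radians(45)
flow = translational_flow(12.0, 5.0, theta)
dist = 1.0 / proximity_from_flow(flow, 12.0, theta)
print(f"flow = {flow:.2f} rad/s, estimated distance = {dist:.2f} m")
```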

    Optic Flow for Obstacle Avoidance and Navigation: A Practical Approach

    This thesis offers contributions and innovations to the development of vision-based autonomous flight control systems for small unmanned aerial vehicles operating in cluttered urban environments. Although many optic flow algorithms have been reported, almost none have addressed the critical issue of accuracy and reliability over a wide dynamic range of optic flow. My aim is to rigorously develop improved optic flow sensing to meet realistic mission requirements for autonomous navigation and collision avoidance. A review of related work enabled development of a new hybrid optic flow algorithm concept combining the best properties of image correlation and interpolation with additional innovations to enhance accuracy, computational speed and reliability. Key analytical work yielded a methodology for determining optic flow dynamic range requirements from system and sensor design parameters and a technique enabling a video sensor to operate as a passive ranging system for closed-loop flight control. Detailed testing led to development of the hybrid image interpolation algorithm (HI2A) using improved correlation search strategies, sparse images to reduce processing loads, a solution tracking loop to bypass the more intensive initial estimation process, a frame look-back method to improve accuracy at low optic flow, a modified interpolation technique to improve robustness and an extensive error checking system for validating outputs. A realistic simulation system was developed incorporating independent, precision ground truthing to assess algorithm accuracy. Comparison testing of the HI2A against the commonly used Lucas-Kanade algorithm demonstrates a major improvement in accuracy over a greatly expanded dynamic range. A reactive flight controller using ranging data from a monocular, forward-looking video sensor and rules-based logic was developed and tested in Monte Carlo simulations of one hundred flights. At higher flight speeds than reported in similar tests, collision-free results were obtained in a realistic urban canyon environment. On a common PC, the HI2A algorithm and flight controller software ran up to eight times faster than real time while producing 250 measurements at 50 Hz. The feasibility of real-time terrain mapping was demonstrated using 3D ranging data from optic flow in an overflight of the urban simulation environment, indicating the potential for its use in path planning approaches to navigation and collision avoidance.
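The interpolation half of a correlation-plus-interpolation hybrid of this kind is typically a sub-pixel refinement in the spirit of the classical image interpolation algorithm: the displaced frame is modelled as a first-order blend of shifted copies of the reference frame, and the shift is solved in closed form by least squares. The one-dimensional sketch below shows only that generic estimator; it is an illustration of the interpolation stage, not the HI2A itself, and the window and reference shift are assumptions.

```python
import numpy as np

def interpolation_shift_1d(ref: np.ndarray, cur: np.ndarray, delta: int = 1) -> float:
    """Closed-form sub-pixel shift estimate between two 1-D signal windows.

    Model (first order): cur(x) = ref(x - s) ~= ref(x) - s * ref'(x), with
    ref'(x) approximated by a central difference over +/- delta samples.
    Least squares over the window gives s directly; positive s means the
    scene moved towards +x. Generic image-interpolation estimator, shown as
    an illustration of the interpolation stage of a hybrid optic flow sensor.
    """
    ref = ref.astype(float)
    cur = cur.astype(float)
    grad = (ref[2 * delta:] - ref[:-2 * delta]) / (2.0 * delta)  # central difference
    diff = cur[delta:-delta] - ref[delta:-delta]
    denom = np.dot(grad, grad)
    if denom < 1e-12:
        return 0.0  # textureless window: no reliable estimate
    return -np.dot(diff, grad) / denom

def texture(t: np.ndarray) -> np.ndarray:
    # Smooth synthetic 1-D texture used for the demonstration.
    return np.sin(0.35 * t) + 0.5 * np.cos(0.11 * t)

# Example: the same texture shifted by 0.3 samples.
x = np.arange(64, dtype=float)
ref, cur = texture(x), texture(x - 0.3)
print(f"estimated shift: {interpolation_shift_1d(ref, cur):.3f} samples")  # ~0.3
```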

    Accelerated neuromorphic cybernetics

    Accelerated mixed-signal neuromorphic hardware refers to electronic systems that emulate electrophysiological aspects of biological nervous systems in analog voltages and currents in an accelerated manner. While the functional spectrum of these systems already includes many observed neuronal capabilities, such as learning or classification, some areas remain largely unexplored. In particular, this concerns cybernetic scenarios in which nervous systems engage in closed interaction with their bodies and environments. Since the control of behavior and movement in animals is both the purpose and the driver of the development of nervous systems, such processes are of essential importance in nature. Besides the design of neuromorphic circuit and system components, the main focus of this work is therefore the construction and analysis of accelerated neuromorphic agents that are integrated into cybernetic chains of action. These agents are, on the one hand, an accelerated mechanical robot and, on the other hand, an accelerated virtual insect. In both cases, the sensory organs and actuators of their artificial bodies are derived from the neurophysiology of the biological prototypes and are reproduced as faithfully as possible. In addition, each of the two biomimetic organisms is subjected to evolutionary optimization, which illustrates the advantages of accelerated neuromorphic nervous systems through significant time savings.

    Using reconstructed visual reality in ant navigation research

    Insects have low-resolution eyes and a tiny brain, yet they continuously solve very complex navigational problems; an ability that underpins fundamental biological processes such as pollination and parental care. Understanding the methods they employ would have a profound impact on the fields of machine vision and robotics. As our knowledge of insect navigation grows, our physical, physiological and neural models get more complex and detailed. To test these models we need to perform increasingly sophisticated experiments. Evolution has optimised the animals to operate in their natural environment. To probe the fine details of the methods they utilise, we need to use natural visual scenery which, for experimental purposes, we must be able to manipulate arbitrarily. Performing physiological experiments on insects outside the laboratory is not practical, and our ability to modify the natural scenery for outdoor behavioural experiments is very limited. The solution is reconstructed visual reality, a projector that can present the visual aspect of the natural environment to the animal with high fidelity, taking the peculiarities of insect vision into account. While projectors have been used in insect research before, during my candidature I designed and built a projector specifically tuned to insect vision. To allow the ant to experience a full panoramic view, the projector completely surrounds her. The device (Antarium) is a polyhedral approximation of a sphere. It contains 20,000 pixels made of light-emitting diodes (LEDs) that match the spectral sensitivity of Myrmecia. Insects have a much higher fusion frequency limit than humans, so the device has a very high flicker frequency (9 kHz) and also a high frame rate (190 fps). In the Antarium the animal is placed in the centre of the projector on a trackball. To test the trackball and to collect reference data, outdoor experiments were performed in which ants were captured, tethered and placed on the trackball. The apparatus with the ant on it was then placed at certain locations relative to the nest and the foraging tree, and the movements of the animal on the ball were recorded and analysed. The outdoor experiments proved that the trackball was well suited to our ants, and also provided the baseline behaviour reference for the subsequent Antarium experiments. To assess the Antarium, the natural habitat of the experimental animals was recreated as a 3-dimensional model. That model was then projected for the ants and their movements on the trackball were recorded, just as in the outdoor experiments. Initial feasibility tests were performed by projecting a static image, which matches what the animals experienced during the outdoor experiments. To assess whether the ant was orienting herself relative to the scene, we rotated the projected scene around her and monitored her response. Statistical methods were used to compare the outdoor and in-Antarium behaviour. The results proved that the concept was solid, but they also uncovered several shortcomings of the Antarium. Nevertheless, even with its limitations the Antarium was used to perform experiments that would be very hard to do in a real environment. In one experiment the foraging tree was repositioned in, or deleted from, the scene to see whether the animals head to where the tree is or to where, according to their knowledge, it should be. The results suggest the latter, but the absence or altered location of the foraging tree certainly had a significant effect on the animals.
In another experiment the scene, including the sky, was re-coloured to see whether colour plays a significant role in navigation. The results indicate that even a very small amount of UV information produces a statistically significant improvement in the animals' navigation. To rectify the device limitations discovered during the experiments, a new, improved projector was designed and is currently being built.
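Both the outdoor reference recordings and the Antarium trials reduce to the same raw data: per-frame trackball rotations that have to be integrated into a fictive 2-D path before the outdoor and in-Antarium behaviour can be compared. The abstract does not describe that processing, so the sketch below is only an assumed, minimal path-integration step (forward displacement plus yaw per frame) of the kind commonly applied to trackball data, not the thesis' analysis pipeline.

```python
import math
from typing import Iterable, List, Tuple

def integrate_path(steps: Iterable[Tuple[float, float]],
                   start_heading: float = 0.0) -> List[Tuple[float, float]]:
    """Turn per-frame trackball readings into a fictive 2-D path.

    Each step is (forward_mm, yaw_rad): the forward displacement of the ball
    surface under the ant and the change in her heading during that frame.
    Returns the sequence of (x, y) positions in mm.
    """
    x, y, heading = 0.0, 0.0, start_heading
    path = [(x, y)]
    for forward_mm, yaw_rad in steps:
        heading += yaw_rad
        x += forward_mm * math.cos(heading)
        y += forward_mm * math.sin(heading)
        path.append((x, y))
    return path

# Example: an ant walking roughly straight, turning gradually left, then continuing.
steps = [(2.0, 0.0)] * 50 + [(2.0, math.radians(3))] * 30 + [(2.0, 0.0)] * 50
path = integrate_path(steps)
print(f"{len(path)} samples, final position: ({path[-1][0]:.1f}, {path[-1][1]:.1f}) mm")
```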