
    Gesture Recognition Application Based on Dynamic Time Warping (DTW) for Omni-Wheel Mobile Robot

    Get PDF
    This project presents the movement of an omni-wheel robot along a trajectory obtained from a gesture recognition system based on Dynamic Time Warping (DTW). A single camera is used as the input of the system and also serves as the reference for the omni-wheel robot's movement. Several gesture recognition systems have been developed using various methods and different approaches. Here the omni-wheel robot is driven using DTW, which has the advantage of being able to compute the distance between two data vectors of different lengths. Using this method we can measure the similarity between two sequences that differ in timing and speed. DTW is widely applied to video, audio, graphics, and other data, since any data that can be represented as a linear sequence can be analyzed with it. In short, it finds the best match by minimizing the difference between two multidimensional signals. The DTW method is expected to allow the gesture recognition system to work optimally, with sufficiently high accuracy and real-time processing.
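
    As a concrete illustration of the distance described above, the following is a minimal sketch of classic dynamic-programming DTW between two 1-D gesture trajectories of different lengths; the sample sequences and the absolute-difference local cost are illustrative assumptions, not taken from the paper.

    ```python
    # Minimal DTW sketch: aligns two 1-D gesture trajectories of different lengths
    # using the classic dynamic-programming recurrence (illustrative values only).
    def dtw_distance(a, b):
        n, m = len(a), len(b)
        INF = float("inf")
        # cost[i][j] = minimal accumulated cost of aligning a[:i] with b[:j]
        cost = [[INF] * (m + 1) for _ in range(n + 1)]
        cost[0][0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = abs(a[i - 1] - b[j - 1])              # local distance between samples
                cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                     cost[i][j - 1],      # deletion
                                     cost[i - 1][j - 1])  # match
        return cost[n][m]

    # The same gesture performed at different speeds still yields a small distance.
    print(dtw_distance([0, 1, 2, 3, 2, 1], [0, 0, 1, 2, 2, 3, 2, 1, 1]))
    ```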

    Optic-Flow Based Car-Like Robot Operating in a 5-Decade Light Level Range

    No full text
    In this paper, we present (i) a novel bio-inspired 1-D optic flow (OF) sensor which is robust to high-dynamic-range lighting conditions and independent of the visual patterns encountered, and (ii) a low-cost car-like robot called BioCarBot, which estimates its velocity and steering angle by means of an Extended Kalman Filter (EKF) using only the OF measurements delivered by two downward-facing sensors of this kind. Indoor experiments were carried out in which the robot was driven in closed-loop mode, using a proportional-integral (PI) controller based on the velocity and steering-angle estimates. The results presented here show that our novel OF sensor can deliver a wide range of high-frequency (333 Hz) OF measurements (from 1 to 10 rad/s) with a relatively high resolution (up to 0.05 rad/s) over a 5-decade range of light levels. Neither the refresh rate nor the resolution of the OF sensors presented here depended on the visual patterns or the lighting conditions, and both could in theory be set to whatever value is required.
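
    To illustrate the estimation scheme described above, here is a minimal EKF predict/update sketch. It assumes a random-walk process model and a toy measurement model in which each downward-facing sensor reports optic flow equal to ground speed divided by its height above the floor; the paper's actual process and measurement models (whose sensor geometry makes the steering angle observable) are not reproduced here.

    ```python
    import numpy as np

    def ekf_step(x, P, z, Q, R, h, H_jac):
        # Predict: random-walk model, so the state carries over and uncertainty grows.
        x_pred, P_pred = x, P + Q
        # Update with the optic-flow measurement vector z.
        H = H_jac(x_pred)                      # measurement Jacobian
        y = z - h(x_pred)                      # innovation
        S = H @ P_pred @ H.T + R               # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
        x_new = x_pred + K @ y
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return x_new, P_new

    # Hypothetical model: state x = [velocity (m/s), steering angle (rad)], and two
    # downward-facing sensors at heights h1, h2 each reporting flow = velocity / height.
    h1, h2 = 0.05, 0.05
    h = lambda x: np.array([x[0] / h1, x[0] / h2])
    H_jac = lambda x: np.array([[1.0 / h1, 0.0], [1.0 / h2, 0.0]])

    x, P = np.array([0.0, 0.0]), np.eye(2)
    Q, R = 1e-3 * np.eye(2), 1e-2 * np.eye(2)
    x, P = ekf_step(x, P, np.array([4.0, 4.2]), Q, R, h, H_jac)
    print(x)  # only velocity is updated here; steering needs the real sensor geometry
    ```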

    Combined visual odometry and visual compass for off-road mobile robots localization

    Get PDF
    In this paper, we present work related to the application of a visual odometry approach to estimate the location of mobile robots operating in off-road conditions. The visual odometry approach is based on template matching: the robot displacement is estimated through a matching process between two consecutive images. Standard visual odometry has been improved with a visual compass method for orientation estimation. For this purpose, two consumer-grade monocular cameras have been employed: one camera points at the ground under the robot, and the other looks at the surrounding environment. Comparisons with popular localization approaches, through physical experiments in off-road conditions, have shown the satisfactory behavior of the proposed strategy.
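
    A minimal sketch of the template-matching step described above, using OpenCV: the central patch of the previous ground-facing frame is searched for in the current frame, and the offset of the best match gives the pixel displacement. The file names, patch margin, and matching score are illustrative assumptions, not the authors' implementation.

    ```python
    import cv2

    # Placeholder file names for two consecutive frames from the ground-facing camera.
    prev_img = cv2.imread("frame_0.png", cv2.IMREAD_GRAYSCALE)
    curr_img = cv2.imread("frame_1.png", cv2.IMREAD_GRAYSCALE)

    def estimate_shift(prev_img, curr_img, margin=40):
        """Estimate the pixel displacement between consecutive frames via template matching."""
        h, w = prev_img.shape
        template = prev_img[margin:h - margin, margin:w - margin]  # central patch of previous frame
        result = cv2.matchTemplate(curr_img, template, cv2.TM_CCOEFF_NORMED)
        _, _, _, max_loc = cv2.minMaxLoc(result)  # top-left corner of the best match
        dx = max_loc[0] - margin                  # horizontal shift in pixels
        dy = max_loc[1] - margin                  # vertical shift in pixels
        return dx, dy

    print(estimate_shift(prev_img, curr_img))
    ```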

    Localization in Low Luminance, Slippery Indoor Environment Using Afocal Optical Flow Sensor and Image Processing

    Get PDF
    Ph.D. dissertation, Department of Electrical and Computer Engineering, College of Engineering, Seoul National University, August 2017 (advisor: ์กฐ๋™์ผ). Localization of an indoor service robot is a prerequisite for autonomous navigation. In particular, localization accuracy drops when wheel slippage occurs in low-luminance indoor environments where camera-based localization is difficult. Slippage mainly occurs when driving over carpets or door sills, and wheel-encoder-based odometry alone cannot accurately measure the distance travelled. This dissertation proposes a robust localization method for low-luminance, slippery environments in which camera-based simultaneous localization and mapping (SLAM) is difficult to operate, fusing low-cost motion sensors, an afocal optical flow sensor (AOFS), and a VGA-class forward-facing monocular camera. The robot's position is estimated by accumulating incremental changes in travel distance and heading: displacement measurements from the wheel encoders and the AOFS are fused to estimate travel distance more accurately even on slippery surfaces, and a gyroscope together with indoor spatial information extracted from the forward-facing images is used to estimate heading. An optical flow sensor estimates displacement robustly against wheel slippage, but when it is mounted on a mobile robot traversing uneven surfaces such as carpet, the changes in height between the sensor and the floor that occur while driving become the main source of distance estimation error. This dissertation proposes mitigating this error source by applying the afocal-system principle to the optical flow sensor. In experiments with a robotic gantry system, in which the sensor height was varied from 30 mm to 50 mm over carpet and three other floor materials and an 80 cm run was repeated 10 times per condition, the proposed AOFS module produced a systematic error of 0.1% per 1 mm of height change, whereas a conventional fixed-focus optical flow sensor showed a systematic error of 14.7%. When the AOFS was mounted on an indoor service robot driving 1 m over carpet, the mean distance estimation error was 0.02% with a variance of 17.6%, whereas the same experiment with a fixed-focus optical flow sensor mounted on the robot yielded a mean error of 4.09% and a variance of 25.7%. For situations in which the surroundings are too dark for images to be used for position correction, i.e., when robust feature points or feature lines for SLAM cannot be extracted even after brightening the low-luminance images, a method is also proposed that uses the low-luminance images to correct the robot's heading. Applying a histogram equalization algorithm to a low-luminance image brightens it but also increases noise, so a rolling guidance filter (RGF), which removes image noise while sharpening image boundaries, is applied to enhance the image; orthogonal straight-line components that make up the indoor space are then extracted from the enhanced image, a vanishing point (VP) is estimated, and the robot's heading relative to the vanishing point is obtained and used for heading correction. When the proposed method was applied to a robot driving in a 77 m² low-luminance indoor space (0.06 to 0.21 lx) with carpet installed, the robot's homing position error was reduced from 401 cm to 21 cm.

    A Mobile Service Robot for Automation of Sample Taking and Sample Management in a Biotechnological Pilot Laboratory

    Get PDF
    Scherer T. A mobile service robot for automation of sample taking and sample management in a biotechnological pilot laboratory. Bielefeld (Germany): Bielefeld University; 2004. In biotechnological laboratories, the quality of the typically pharmaceutical product is a literally life-important goal. Historically, the quality of the cell cultivations was ensured by on-line measurements of physical process parameters like pH and pO2 only. Biological parameters like cell density and viability were only measured off-line, because the necessary sample management involves highly complicated manipulations and analyses and could therefore not be automated. Various automated devices to assist a laboratory technician do exist, but so far no system automates the entire sample management. In this work a novel type of service robot, consisting of a robot arm mounted on a mobile platform, is presented that closes this gap. This robot has to master a multitude of problems: it must be able to determine its position in the laboratory (localization), it must be able to find a collision-free path to the involved devices (path planning with obstacle avoidance), it must not endanger humans or damage laboratory equipment while moving (collision avoidance), it must be able to recognize the devices to be manipulated and measure their precise position (computer vision), it must be able to operate them (arm control), it must be able to grasp objects (gripper and fingers), and it must be able to handle them with compliance in order not to damage them (force control). It must be autonomous, so as to require the least possible amount of user intervention, and yet be controllable by a laboratory control program in order to allow intervention. Finally, it must be easily maintainable by non-expert personnel. All these aspects are covered by the novel robot system presented in this thesis.

    Object Recognition

    Get PDF
    Vision-based object recognition tasks are very familiar in our everyday activities, such as driving our car in the correct lane. We do these tasks effortlessly in real time. In recent decades, with the advancement of computer technology, researchers and application developers have been trying to mimic the human capability of visual recognition. Such a capability will allow machines to free humans from boring or dangerous jobs.

    Optical Wireless Data Center Networks

    Get PDF
    Bandwidth- and computation-intensive Big Data applications in disciplines like social media, bio- and nano-informatics, the Internet of Things (IoT), and real-time analytics are pushing existing access and core (backbone) networks as well as Data Center Networks (DCNs) to their limits. Next-generation DCNs must support continuously increasing network traffic while satisfying minimum performance requirements for latency, reliability, flexibility and scalability. Therefore, a larger number of cables (i.e., copper cables and fiber optics) may be required in conventional wired DCNs. In addition to limiting the possible topologies, a large number of cables may result in design and development problems related to wire ducting and maintenance, heat dissipation, and power consumption. To address the cabling complexity in wired DCNs, we propose OWCells, a class of optical wireless cellular data center network architectures in which fixed line-of-sight (LOS) optical wireless communication (OWC) links are used to connect the racks arranged in regular polygonal topologies. We present the OWCell DCN architecture, develop its theoretical underpinnings, and investigate routing protocols and OWC transceiver design. To realize a fully wireless DCN, servers in racks must also be connected using OWC links. There is, however, a difficulty in connecting multiple adjacent network components, such as servers in a rack, using point-to-point LOS links. To overcome this problem, we propose and validate the feasibility of an FSO-Bus to connect multiple adjacent network components using NLOS point-to-point OWC links. Finally, to complete the design of the OWC transceiver, we develop a new class of strictly and rearrangeably non-blocking multicast optical switches in which multicast is performed efficiently at the physical optical (lower) layer rather than at upper layers (e.g., the application layer). Advisors: Jitender S. Deogun and Dennis R. Alexander.
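
    A minimal sketch of the rack-layout idea described above, assuming racks placed at the vertices of a regular polygon and, as a placeholder, a fixed LOS OWC link between every pair of racks; the actual OWCell link rules, routing protocols, and transceiver design are not reproduced here.

    ```python
    import math

    def polygon_rack_positions(n_racks, radius=5.0):
        """Place n_racks at the vertices of a regular polygon (coordinates in metres)."""
        return [(radius * math.cos(2 * math.pi * k / n_racks),
                 radius * math.sin(2 * math.pi * k / n_racks)) for k in range(n_racks)]

    positions = polygon_rack_positions(6)
    # Placeholder adjacency: a fixed LOS OWC link between every pair of racks.
    links = [(i, j) for i in range(len(positions)) for j in range(i + 1, len(positions))]
    print(positions)
    print(links)
    ```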

    Art and Engineering Inspired by Swarm Robotics

    Get PDF
    Swarm robotics has the potential to combine the power of the hive with the sensibility of the individual to solve non-traditional problems in mechanical, industrial, and architectural engineering, and to develop exquisite art beyond the ken of most contemporary painters, sculptors, and architects. The goal of this thesis is to apply swarm robotics to the sublime and the quotidian to achieve this synergy between art and engineering. The potential applications of collective behaviors, manipulation, and self-assembly are quite extensive. We concentrate our research on three topics: fractals, stability analysis, and building an enhanced multi-robot simulator. Self-assembly of swarm robots into fractal shapes can be used both for artistic purposes (fractal sculptures) and in engineering applications (fractal antennas). Stability analysis studies whether distributed swarm algorithms are stable and robust to sensing or numerical errors, and tries to provide solutions that avoid unstable robot configurations. Our enhanced multi-robot simulator supports this research by providing real-time simulations with customized parameters, and can also become a platform for educating a new generation of artists and engineers. The goal of this thesis is to use techniques inspired by swarm robotics to develop a computational framework accessible to and suitable for both artists and engineers. The scope we have in mind for art and engineering is unlimited: modern museums, stadium roofs, dams, solar power plants, radio telescopes, star networks, fractal sculptures, fractal antennas, fractal floral arrangements, smooth metallic railroad tracks, temporary utilitarian enclosures, permanent modern architectural designs, guard structures, op art, and communication networks can all be built from the bodies of the swarm.
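
    As a toy illustration of fractal self-assembly targets, the following sketch generates goal positions on the Sierpinski triangle via the chaos game; the choice of fractal, the unit coordinates, and the robot count are assumptions, and robot assignment and control are not shown.

    ```python
    import random

    def sierpinski_targets(n_robots, vertices=((0.0, 0.0), (1.0, 0.0), (0.5, 0.866))):
        """Generate n_robots goal positions on the Sierpinski triangle via the chaos game."""
        x, y = 0.25, 0.25
        targets = []
        for _ in range(n_robots):
            vx, vy = random.choice(vertices)     # jump halfway toward a random vertex
            x, y = (x + vx) / 2.0, (y + vy) / 2.0
            targets.append((x, y))
        return targets

    print(sierpinski_targets(5))
    ```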

    Virtual Reality Games for Motor Rehabilitation

    Get PDF
    This paper presents a fuzzy-logic-based method to track user satisfaction without the need for devices that monitor users' physiological conditions. User satisfaction is the key to any product's acceptance; computer applications and video games provide a unique opportunity to provide a tailored environment for each user to better suit their needs. We have implemented a non-adaptive fuzzy logic model of emotion, based on the emotional component of the Fuzzy Logic Adaptive Model of Emotion (FLAME) proposed by El-Nasr, to estimate player emotion in Unreal Tournament 2004. In this paper we describe the implementation of this system and present the results of one of several play tests. Our research contradicts the current literature that suggests physiological measurements are needed. We show that it is possible to use a software-only method to estimate user emotion.
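
    A minimal fuzzy-inference sketch in the spirit of the approach described above, with a toy two-rule base mapping hypothetical in-game statistics to an enjoyment estimate; it does not reproduce the FLAME emotion rules or the game events actually used in the paper.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def estimate_enjoyment(kills_per_min, deaths_per_min):
        # Fuzzify the (hypothetical) in-game statistics.
        doing_well = tri(kills_per_min, 0.0, 4.0, 8.0)
        struggling = tri(deaths_per_min, 0.0, 4.0, 8.0)
        # Two toy rules: doing well -> high enjoyment, struggling -> low enjoyment.
        high, low = doing_well, struggling
        if high + low == 0.0:
            return 0.5                            # no rule fires: neutral estimate
        # Defuzzify with a weighted average of the rule outputs (centroids 0.9 and 0.2).
        return (0.9 * high + 0.2 * low) / (high + low)

    print(estimate_enjoyment(kills_per_min=5.0, deaths_per_min=1.0))
    ```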

    The Future of Humanoid Robots

    Get PDF
    This book provides state-of-the-art scientific and engineering research findings and developments in the field of humanoid robotics and its applications. It is expected that humanoids will change the way we interact with machines and will have the ability to blend perfectly into an environment already designed for humans. The book contains chapters that aim to discover the future abilities of humanoid robots by presenting a variety of integrated research in various scientific and engineering fields, such as locomotion, perception, adaptive behavior, human-robot interaction, neuroscience and machine learning. The book is designed to be accessible and practical, with an emphasis on useful information for those working in the fields of robotics, cognitive science, artificial intelligence, computational methods and other fields of science directly or indirectly related to the development and usage of future humanoid robots. The editor of the book has extensive R&D experience, patents, and publications in the area of humanoid robotics, and his experience is reflected in the editing of the book's content.
    • โ€ฆ
    corecore