9,043 research outputs found

    NASA space station automation: AI-based technology review

    Research and development projects in automation for the Space Station are discussed. Artificial Intelligence (AI)-based automation technologies are planned to enhance crew safety by reducing the need for EVA (extravehicular activity), increase crew productivity by reducing routine operations, increase Space Station autonomy, and augment Space Station capability through the use of teleoperation and robotics. AI technology will also be developed for servicing satellites at the Space Station, system monitoring and diagnosis, space manufacturing, and the assembly of large space structures.

    Enhanced Accessibility for People with Disabilities Living in Urban Areas

    [Excerpt] People with disabilities constitute a significant proportion of the poor in developing countries. If internationally agreed targets on reducing poverty are to be reached, it is critical that specific measures be taken to reduce the societal discrimination and isolation that people with disabilities continue to face. Transport is an important enabler of strategies to fight poverty by enhancing access to education, employment, and social services. This project aims to further the understanding of the mobility and access issues experienced by people with disabilities in developing countries, and to identify specific steps that can be taken to start addressing these problems. A major objective of the project is to compile a compendium of guidelines that government authorities, advocacy groups, and donor/loan agencies can use to improve the access of people with disabilities to transport and other services in urban areas.

    Melodic, using music to train visually impaired kids in computational thinking

    Master's dissertation in Informatics Engineering. This document, produced in the second year of the Integrated Master's in Informatics Engineering, reports the development of a project that aims to teach Computational Thinking to children with special educational needs, in this case blindness. The research characterizes both subjects, Computational Thinking and blindness, and identifies the most widely used and best practices for teaching this different way of thinking to children with special needs. To this end, Melodic was created: a system composed of software and hardware in which the user builds sequences with tactile blocks (the hardware) and then reads them with a mobile application (the software) that converts the sequence into sound. The user can thus easily hear the differences that changes in the block sequence produce. This is comparable to teaching Computational Thinking with robots: in that case users see the result of their instructions in the robot's movement, while with Melodic users hear the result of their instructions in the musical note sequence played by the app. The document also discusses more technical aspects, such as the architecture of the application proposed to accomplish the goal of this Master's project, describes the development process that led to the creation of Melodic and the decisions taken, and presents a description of all the functionalities of the system. To test the research hypothesis stated initially, a set of exercises was created and described; these exercises were designed to assess whether Melodic actually develops Computational Thinking.
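    To make the block-to-sound idea concrete, the following is a minimal, hypothetical sketch of how a block sequence could be converted into an audible note sequence. It is not Melodic's actual implementation: the block-to-note mapping, the function names, and the WAV output are assumptions made purely for illustration.

        # Minimal sketch (assumption, not Melodic's code): map a tactile-block
        # sequence to sine tones and write the result to a WAV file.
        import wave

        import numpy as np

        SAMPLE_RATE = 44100
        BLOCK_TO_FREQ = {  # hypothetical mapping of block IDs to note frequencies (Hz)
            "A": 440.00, "B": 493.88, "C": 523.25, "D": 587.33,
        }

        def render_sequence(blocks, note_seconds=0.5, path="sequence.wav"):
            """Concatenate one tone per block and save the result as audio."""
            t = np.linspace(0.0, note_seconds, int(SAMPLE_RATE * note_seconds),
                            endpoint=False)
            tones = [np.sin(2 * np.pi * BLOCK_TO_FREQ[b] * t) for b in blocks]
            samples = (np.concatenate(tones) * 32767).astype(np.int16)
            with wave.open(path, "wb") as f:
                f.setnchannels(1)        # mono
                f.setsampwidth(2)        # 16-bit samples
                f.setframerate(SAMPLE_RATE)
                f.writeframes(samples.tobytes())

        # Reordering the blocks produces an audibly different sequence:
        render_sequence(["A", "C", "B", "D"])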

    Message threads: Exploring interpersonal communication through smartphones: how we weave our lives in a hypermediated world

    This thesis is about human behaviour as it relates to computer-mediated communication. Smartphones are an accepted part of everyday life: we use them to wake us up in the morning, we play games on them while we wait for the bus, and we take photos with them. Smartphones also enable communication. We can phone while in transit, coordinate meeting up with friends, share our lives on social networking sites, and check in on email and text throughout the day. How does this technology affect how we interact? In public situations we retain contact online, but this multitasking affects how we relate to others socially. Smartphone texting allows us to keep in constant touch with friends and family, though the interaction is fragmented and asynchronous. As we are always available, and never alone, these open lines of communication also affect how we see ourselves. In choosing the smartphone I critically question the attention and priority given to these devices in daily life. Mobile phones have changed the soundscape of public places: dial tones, beeps, and people speaking on their phones in public are common. Users interact continually with their phones, store substantial data on them, communicate through them, and consequently develop a bond to the physical object. What could these ubiquitous portable computers tell us if, instead of being passive agents in a dependent relationship of user and phone, they actively listened, or could reflect back the nature of their role in our lives?

    Wheels, suitcases, angels: Kurt Schwitters and Walter Benjamin

    City Tells: Guidelines to an Emotional Wayfinding System

    City Tells: Guidelines to an Emotional Wayfinding System were developed to provide wayfinding information to visitors walking through historic environments and to ensure that unfamiliar urban places become more welcoming, easier to navigate, and more enjoyable for both visitors and tourists.

    Machine learning in 3D space gesture recognition

    The rapid increase in the development of robotic systems in controlled and uncontrolled environments has led to the need for more natural interaction systems; one such interaction is gesture recognition. This paper presents a simple approach to gesture recognition in which hand movement in 3-dimensional space is used to write letters of the English alphabet and produce the corresponding output on a screen or display device. The hardware components of the system are an MPU-6050 accelerometer, a microcontroller, and a Bluetooth module for wireless communication. For each letter of the alphabet, 20 data instances are recorded in raw form and then standardized to a fixed length using interpolation. The standardized data is fed to an SVM (Support Vector Machine) classifier to train a model, which is then used to classify new data instances in real time. The method achieves a classification accuracy of 98.94% for English-alphabet hand gesture recognition. The primary objective of the approach is the development of a low-cost, low-power, easily trained supervised gesture recognition system that identifies hand gestures efficiently and accurately. The experimental results are based on a single subject.
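    The described pipeline (fixed-length standardization by interpolation followed by SVM classification) can be sketched roughly as follows. This is not the authors' code: the target length, the feature layout (flattened x/y/z axes), the RBF kernel, and the use of NumPy/scikit-learn are all assumptions made for illustration.

        # Minimal sketch (assumption, not the paper's code): resample each
        # variable-length accelerometer recording to a fixed length by linear
        # interpolation, flatten it into a feature vector, and train an SVM.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        FIXED_LEN = 64  # hypothetical target length after interpolation

        def standardize(recording):
            """recording: (n_samples, 3) array of raw x/y/z accelerometer values."""
            n = recording.shape[0]
            old_t = np.linspace(0.0, 1.0, n)
            new_t = np.linspace(0.0, 1.0, FIXED_LEN)
            axes = [np.interp(new_t, old_t, recording[:, k]) for k in range(3)]
            return np.concatenate(axes)  # feature vector of length 3 * FIXED_LEN

        def train_classifier(recordings, labels):
            """recordings: list of (n_i, 3) arrays; labels: list of letters 'A'..'Z'."""
            X = np.vstack([standardize(r) for r in recordings])
            y = np.asarray(labels)
            X_tr, X_te, y_tr, y_te = train_test_split(
                X, y, test_size=0.2, stratify=y, random_state=0)
            clf = SVC(kernel="rbf")  # kernel choice is an assumption
            clf.fit(X_tr, y_tr)
            print("held-out accuracy:", clf.score(X_te, y_te))
            return clf

    At recognition time, a new recording would be passed through the same standardize() step before being handed to clf.predict.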

    Personal Autonomy Rehabilitation in Home Environments by a Portable Assistive Robot

    Increasingly, disabled and elderly people with mobility problems want to live autonomously in their home environments. They are motivated to use robotic aids to perform tasks by themselves, avoiding permanent supervision by a nurse or family assistant. They must find means to rehabilitate their abilities to perform daily life activities (DLAs), such as eating, shaving, or drinking. These means may be provided by robotic aids that incorporate possibilities and methods to accomplish common tasks, aiding the user in recovering partial or complete autonomy. Results are highly conditioned by the system's usability and potential. The portable assistive robot ASIBOT, developed for this purpose, helps users perform most of these tasks in common living environments; minimal adaptations are needed to provide the robot with mobility throughout the environment. The robot can autonomously climb from one surface to another, fixing itself to the best place to perform each task. When attached to a wheelchair, it can move along with it as a single unit. This paper presents the work performed with ASIBOT in the area of rehabilitation robotics. First, a brief description of the ASIBOT system is given, followed by a description of tests that have been performed with the robot and several impaired users. Insight into how these experiences have influenced our research efforts, especially in home environments, is also included, along with a description of the test bed (a kitchen environment) that has been developed to continue research on performing DLAs with robotic aids. Relevant conclusions are also included. This work has been supported by the CAM Project S2009/DPI-1559/ROBOCITY2030 I