15,795 research outputs found

    Virtual Reality Games for Motor Rehabilitation

    This paper presents a fuzzy-logic-based method to track user satisfaction without the need for devices that monitor users' physiological conditions. User satisfaction is key to any product's acceptance, and computer applications and video games offer a unique opportunity to tailor the environment to each user's needs. We have implemented a non-adaptive fuzzy logic model of emotion, based on the emotional component of the Fuzzy Logic Adaptive Model of Emotion (FLAME) proposed by El-Nasr, to estimate player emotion in Unreal Tournament 2004. In this paper we describe the implementation of this system and present the results of one of several play tests. Our research contradicts the current literature, which suggests that physiological measurements are needed; we show that it is possible to estimate user emotion with a software-only method.
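
    The appraisal-to-emotion mapping such a fuzzy model performs can be sketched in a few lines. The Python fragment below is a minimal illustration in the spirit of FLAME's emotional component, not the paper's actual implementation; the membership functions, rule set, and emotion labels are assumptions chosen for the example.

    # Minimal sketch of a software-only fuzzy emotion estimator. The membership
    # functions, rules and emotion labels are illustrative assumptions, not the
    # paper's implementation.

    def tri(x, a, b, c):
        """Triangular membership: rises from a, peaks at b, falls to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def estimate_emotions(desirability, expectation):
        """Map a game event's appraisal to fuzzy emotion intensities in [0, 1].

        desirability: -1 (very bad for the player) .. +1 (very good)
        expectation:   0 (completely unexpected)   .. 1 (fully expected)
        """
        desirable   = tri(desirability,  0.0, 1.0, 2.0)
        undesirable = tri(desirability, -2.0, -1.0, 0.0)
        expected    = tri(expectation,   0.0, 1.0, 2.0)
        unexpected  = tri(expectation,  -1.0, 0.0, 1.0)

        # Fuzzy rules, with min() acting as the fuzzy AND:
        return {
            "joy":               min(desirable, expected),
            "pleasant_surprise": min(desirable, unexpected),
            "distress":          min(undesirable, expected),
            "disappointment":    min(undesirable, unexpected),
        }

    # Example: the player scores an unexpected kill (good and surprising).
    print(estimate_emotions(desirability=0.8, expectation=0.2))

    Because every input is derived from in-game events rather than sensors, an estimator of this shape needs no physiological monitoring hardware, which is the point the abstract argues.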

    Design-for-delay-testability techniques for high-speed digital circuits

    The importance of delay faults grows with the ever-increasing clock rates and shrinking geometries of today's circuits. This thesis focuses on the development of Design-for-Delay-Testability (DfDT) techniques for high-speed circuits and embedded cores. The rising costs of IC testing, and in particular the costs of Automatic Test Equipment, are major concerns for the semiconductor industry. To reverse the trend of rising test costs, DfDT is becoming more and more important.

    Integration of a big data emerging on large sparse simulation and its application on green computing platform

    Analyzing and verifying a large data set is a challenge in understanding the fundamental concepts behind it. Many big data analysis techniques suffer from the poor scalability, variation inequality, instability, slow convergence, and weak accuracy of the underlying large-scale numerical algorithms. These limitations open a wide opportunity for numerical analysts to develop efficient, novel parallel algorithms. Big data analytics plays an important role in science and engineering for extracting patterns, trends, and actionable information from large data sets and for improving decision-making strategies. A large data set may consist of large-scale data collected via sensor networks, signal-to-digital-image transformations, high-resolution sensing systems, industry forecasts, or existing customer records used to predict trends and prepare for new demand. This paper proposes three types of big data analytics, in accordance with the analytics requirements, involving large-scale numerical simulation and mathematical modeling for solving complex problems. The first is big data analytics for the theory and fundamentals of nanotechnology numerical simulation. The second is big data analytics for enhancing digital images in 3D visualization and for performance analysis of embedded systems, based on the large sparse data sets generated by the device. The last is the extraction of patterns from electroencephalogram (EEG) data sets to detect horizontal and vertical eye movements. The process of examining big data analytics is thus to investigate hidden patterns and unknown correlations, identify anomalies, discover structure inside unstructured data, and extract the essence, with trend prediction, multi-dimensional visualization, and real-time observation based on the mathematical model. Parallel algorithms, mesh generation, domain-function decomposition, inter-node communication design, subdomain mapping, numerical analysis, and parallel performance evaluation (PPE) constitute the implementation process of the big data analytics. The superiority of parallel numerical methods such as AGE, Brian, and IADE for solving large sparse models was demonstrated on a green computing platform that reuses obsolete computers, old-generation servers, and outdated hardware as distributed virtual memory and multi-processors. The integration of low-cost message-passing communication software with the green computing platform increases the PPE by up to 60% compared with the limited memory of a single processor. In conclusion, large-scale numerical algorithms with strong scalability, equality, stability, convergence, and accuracy are important features in analyzing big data simulation.
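
    The AGE, Brian, and IADE schemes named above are specialized iterative solvers; as a rough stand-in, the sketch below uses plain Jacobi iteration on a 1D Poisson problem to show the generic decompose-compute-exchange pattern that such domain-decomposed parallel solvers follow. The model problem, the two-way split, and all sizes are illustrative assumptions.

    # Sketch of domain-decomposed iterative solution of a large sparse system.
    # Jacobi on -u'' = 1, u(0) = u(1) = 0, stands in for the AGE/IADE schemes;
    # the two-subdomain split mirrors how work would be mapped across nodes.
    import numpy as np

    def jacobi_two_subdomains(n=100, iters=20000):
        h = 1.0 / (n + 1)
        f = np.full(n, 1.0)        # right-hand side at the n interior points
        u = np.zeros(n + 2)        # includes the two fixed boundary points
        mid = n // 2 + 1           # split the interior points into two halves
        for _ in range(iters):
            new = u.copy()
            # Each slice update is independent; on a cluster each subdomain
            # would run on its own node, exchanging only the halo values
            # u[mid-1] and u[mid] between iterations.
            new[1:mid]   = 0.5 * (u[0:mid-1] + u[2:mid+1] + h*h*f[0:mid-1])
            new[mid:n+1] = 0.5 * (u[mid-1:n] + u[mid+1:n+2] + h*h*f[mid-1:n])
            u = new
        return u

    u = jacobi_two_subdomains()
    print("max of u:", u.max())    # exact solution x(1-x)/2 peaks at 0.125

    In a real deployment the two slice updates would live on separate processes connected by message passing, which is where the low-cost communication software the abstract mentions comes in.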

    Neuroplasticity and functional recovery: Training models and compensatory strategies in music therapy

    New research developments in the recovery of function following neurological trauma, together with basic and applied research on music perception and production, suggest that specific music therapy interventions that directly address the restoration of function, as opposed to developing compensatory mechanisms, may now be the more appropriate treatment approach in certain circumstances. We address the issue of appropriate timing for the introduction of each strategy and discuss the potential outcomes of each approach. As one might imagine, much of this research is published in neurological journals, which music therapists may not regularly consult; it seems challenging enough just to keep abreast of new music therapy literature, and there is so much neurological research that the music therapy clinician often finds it difficult to know where to begin. This text provides an overview of a growing concept related to recovery known as neuroplasticity, and of how specific training models in music therapy make use of this relatively recently identified phenomenon. A framework is also provided to help guide the practicing clinician in building a lineage of systematic thought relevant to the use of music in neurorehabilitation, alongside a discussion of the frequently employed concept of behavioural compensation. Some of the music therapy literature relating to these concepts is outlined, and discussions surrounding the decision to use either approach are presented in relation to stages of recovery and the clinical presentation of the client.

    Computer based laboratory simulation in maritime education


    The antecedents of low-level classroom disruption: a bio-ecological perspective

    Low-level classroom disruption (LLCD) is a fundamental behavioural issue in primary schools across England. Typically defined as surface-level behaviours (Esturgó-Deu & Sala-Roca, 2010), LLCD includes talking unnecessarily, fidgeting, distracting others, rocking on the chair and daydreaming (Ofsted, 2014). The educational literature has referenced LLCD extensively, making inferences about its potential antecedents, from within the classroom to wider contexts (home and societal factors). Contradicting this, however, LLCD is viewed as a concept controllable by effective teachers at classroom level, so research is typically classroom based and centred around the management and control of LLCD. To date, no psychological research has investigated the bio-ecological antecedents of LLCD; this mixed-methods study pioneers that line of enquiry. By applying the Person-Process-Context-Time Model of Development (Bronfenbrenner, 1985), processes that influence behaviour were considered. Key Stage Two pupils aged 8-11 years (N=274) from 3 schools in England provided quantitative data at two time points (a year apart) recording: gender, peer pressure, executive function, global self-worth, appropriate conduct, home chaos, screen time, sleep, television in the bedroom, and extra-curricular activity. A sub-sample of these pupils' parents (N=58) reported on their own screen time use, parenting practices and the family's socioeconomic status. Semi-structured interviews with members of teaching staff (N=8) provided an in-depth account of the lived experience of LLCD in the classroom, giving evidence of its impact on staff and pupils. Results show a significant increase in the presentation of LLCD across the two time points for the whole pupil sample, with male pupils displaying significantly higher levels of LLCD than female pupils at both times. Findings also indicated, at both time points, that higher screen time in the home context was directly associated with increased LLCD in the school context for the whole pupil sample; for the male pupils only, this association was partially mediated through an increase in proneness to boredom. The repeated-measures investigations found the relationship between screen time and LLCD to also run in the converse direction, with increases in LLCD significantly related to higher screen time for the male pupils, suggesting a cyclical, reciprocal pattern of influence. Repeated-measures analysis also suggested converse gender differences between pupils' self-perceived appropriate conduct and LLCD: for the male pupils, a lower self-perception of appropriate conduct was significantly associated with a higher presentation of LLCD, whereas for the female pupils a higher perception of their own appropriate conduct was associated with a lower presentation of LLCD. The semi-structured interviews with teaching staff supported the Ofsted (2014) finding that LLCD has a negative impact on both the teaching and the learning that take place in the classroom. These and other results indicate that the influences on low-level classroom disruption need to be considered not only within the classroom context but also outside it, such as in the home.
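
    The partial-mediation finding (screen time, via proneness to boredom, to LLCD) rests on a standard product-of-coefficients test. The sketch below illustrates that test on synthetic data; the variable names, effect sizes and random values are assumptions for illustration only, not the study's dataset or analysis code.

    # Product-of-coefficients mediation test on synthetic data. The effect
    # sizes below are made up; only the structure of the test is the point.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 274                                   # matches the pupil sample size
    screen_time = rng.normal(size=n)
    boredom = 0.5 * screen_time + rng.normal(size=n)               # path a
    llcd = 0.3 * screen_time + 0.4 * boredom + rng.normal(size=n)  # c', b

    def ols(y, *xs):
        """Least-squares slopes (intercept included, then discarded)."""
        X = np.column_stack([np.ones(len(y)), *xs])
        return np.linalg.lstsq(X, y, rcond=None)[0][1:]

    (a,) = ols(boredom, screen_time)             # X -> M
    cprime, b = ols(llcd, screen_time, boredom)  # X and M -> Y
    print(f"direct effect c' = {cprime:.2f}, indirect a*b = {a * b:.2f}")
    # Partial mediation: both c' and a*b are nonzero, the pattern the study
    # reports for the male pupils.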

    How do potential users perceive the adoption of new technologies within the field of Artificial Intelligence and Internet-of-Things? - A revision of the UTAUT 2 model using Voice Assistants

    The following study investigates the perceptions potential users have when considering the adoption of voice assistants (VAs). VAs are considered to possess characteristics linkable to both Artificial Intelligence (AI) and the Internet-of-Things (IoT). This thesis aims to provide a deeper understanding of the determinants influencing the adoption of the new VA technology using the Unified Theory of Acceptance and Use of Technology 2 (UTAUT 2), a theoretical model explaining technology adoption and usage behaviour. The number of gadgets released to the market that possess characteristics of AI and IoT technology increases constantly, yet the 2012 version of the UTAUT 2 model was not constructed for these. Through a qualitative approach based on four focus groups, this study aims to establish the perceptions of potential future users of VA technology and, as a consequence, to amend the current UTAUT 2 model to fit newly emerging technologies within the AI and IoT field that possess characteristics similar to VAs. The study found that while hedonic motivation seems to be of inferior relevance, the determinants data security, compatibility and relationship with the device are essential influencing factors to take into consideration when trying to fully understand users' technology adoption perceptions. However, the fact that these technologies are still at an early stage of adoption makes it difficult for future users to fully judge their own adoption behaviour if they are not members of the early stages of the innovation adoption curve. For further research, it is recommended to look into different sampling groups and to apply the model resulting from this study to new, upcoming technologies within the area of AI and IoT.