
    Control theoretic models of pointing

    This article presents an empirical comparison of four models from manual control theory on their ability to model targeting behaviour by human users using a mouse: McRuer’s Crossover, Costello’s Surge, second-order lag (2OL), and the Bang-bang model. Such dynamic models are generative, estimating not only movement time, but also pointer position, velocity, and acceleration on a moment-to-moment basis. We describe an experimental framework for acquiring pointing actions and automatically fitting the parameters of mathematical models to the empirical data. We present the use of time-series, phase space, and Hooke plot visualisations of the experimental data, to gain insight into human pointing dynamics. We find that the identified control models can generate a range of dynamic behaviours that capture aspects of human pointing behaviour to varying degrees. Conditions with a low index of difficulty (ID) showed poorer fit because their unconstrained nature naturally leads to more behavioural variability. We report on characteristics of human surge behaviour (the initial, ballistic sub-movement) in pointing, as well as differences in a number of controller performance measures, including overshoot, settling time, peak time, and rise time. We describe trade-offs among the models. We conclude that control theory offers a promising complement to Fitts’ law based approaches in HCI, with models providing representations and predictions of human pointing dynamics, which can improve our understanding of pointing and inform design.
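    To make the second-order lag (2OL) idea concrete, here is a minimal sketch, not the authors' implementation: a spring-damper pointer driven toward a target, integrated with Euler steps. The function name and all parameter values (stiffness k, damping d) are illustrative assumptions; fitted values would come from the experimental data.

```python
import numpy as np

def simulate_2ol(target, k=40.0, d=12.0, duration=1.0, dt=0.001):
    """Euler-integrate a second-order lag (spring-damper) pointer model:
        x'' = k * (target - x) - d * x'
    starting at rest at x = 0. Returns time, position, velocity arrays,
    so the whole trajectory (not just movement time) is generated.
    """
    n = int(duration / dt)
    t = np.arange(n) * dt
    x = np.zeros(n)
    v = np.zeros(n)
    for i in range(1, n):
        a = k * (target - x[i - 1]) - d * v[i - 1]  # acceleration
        v[i] = v[i - 1] + a * dt
        x[i] = x[i - 1] + v[i] * dt
    return t, x, v
```

    Plotting x against v from such a run yields the phase-space view mentioned in the abstract; overshoot, settling time, peak time, and rise time can all be read off the generated trajectory.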

    Impact of Covid-19 Pandemic on User Search Behavior: A Case Study of Postgraduate Students

    Relevance Feedback (RF) is crucial for building a user profile, which is a fundamental element of intelligent systems such as information retrieval, information filtering, and personalization. RF is affected by a number of contextual factors such as the mood, stress level, and sentimental state of the user. The COVID-19 pandemic imposed dramatic changes on the user environment as well as the search context. This paper investigates users’ search behaviour to identify differences in behaviour between the contexts before and during the COVID-19 pandemic. Practically, this translates into identifying differences in the relationship between implicit feedback and the explicit relevance level across the contexts. For this purpose, we conducted three user studies, (i) Pre-COVID-19, (ii) Mid-COVID-19, and (iii) Post-COVID-19, all with the same group of users. The Pre-COVID-19 study took place before the pandemic started, the Mid-COVID-19 study three months after the beginning of the pandemic, and the Post-COVID-19 study 18 months into the pandemic. A linear regression model was developed for each user study using IBM SPSS. The analysis showed a significant variation in user behaviour between the studies due to the COVID-19 context and its impact on search behaviour. Also, two new RF parameters in Mid-COVID-19, Mouse Clicks and Page/Down strikes, were shown to have a significant relationship with explicit user interest. Furthermore, the comparison between the models showed that the second regression model achieved a higher accuracy level, which is attributed to the common behavioural change imposed by the pandemic.
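    The per-study model is a standard multiple linear regression from implicit signals to an explicit relevance rating. A minimal sketch of that fitting step (the paper used IBM SPSS; the feature names and the data below are made up for illustration):

```python
import numpy as np

# Hypothetical implicit-feedback features per search session:
# dwell time (s), mouse clicks, page up/down key strikes.
X = np.array([
    [12.0, 3, 1],
    [45.0, 8, 4],
    [30.0, 5, 2],
    [60.0, 10, 6],
    [8.0, 1, 0],
    [25.0, 4, 3],
])
y = np.array([1.0, 4.0, 3.0, 5.0, 1.0, 2.0])  # explicit relevance (1-5)

# Add an intercept column and solve the least-squares problem,
# mirroring the regression fitted once per study context.
A = np.hstack([np.ones((X.shape[0], 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

    Fitting the same specification separately on the Pre-, Mid-, and Post-COVID-19 data and comparing coefficients and fit quality is what reveals which implicit parameters became significant in each context.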

    A Sensory-Driven Trade-Off between Coordinated Motion in Social Prey and a Predator's Visual Confusion.

    Social animals are capable of enhancing their awareness by paying attention to their neighbors, and prey found in groups can also confuse their predators. Both sides of these sensory benefits have long been appreciated, yet less is known of how the perception of events from the perspectives of both prey and predator can interact to influence their encounters. Here we examined how a visual sensory mechanism impacts the collective motion of prey and, subsequently, how their resulting movements influenced predator confusion and capture ability. We presented virtual prey to human players in a targeting game and measured the speed and accuracy with which participants caught designated prey. As prey paid more attention to neighbor movements their collective coordination increased, and increases in prey coordination were positively associated with increases in the speed and accuracy of attacks. However, while attack speed was unaffected by the initial state of the prey, accuracy dropped significantly if the prey were already organized at the start of the attack, rather than in the process of self-organizing. By repeating attack scenarios and masking the targeted prey's neighbors we were able to visually isolate them and conclusively demonstrate how visual confusion impacted capture ability. Delays in capture caused by decreased coordination amongst the prey depended upon the collective motion of neighboring prey, while it was primarily the motion of the targets themselves that determined capture accuracy. Interestingly, while a complete loss of coordination in the prey (e.g., a flash expansion) caused the greatest delay in capture, such behavior had little effect on capture accuracy. Lastly, while increases in collective coordination in prey enhanced personal risk, traveling in coordinated groups was still better than appearing alone. These findings demonstrate a trade-off between the sensory mechanisms that can enhance the collective properties that emerge in social animals and the individual group member's predation risk during an attack.
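    The attention-to-coordination mechanism can be sketched as a toy alignment model (this is an illustrative assumption, not the study's prey simulation): each virtual prey blends its heading toward the group's mean heading with a weight given by its attention, plus noise, and coordination is measured as group polarization.

```python
import numpy as np

def polarization(headings):
    """Group coordination: length of the mean heading unit vector, 0 to 1."""
    return np.hypot(np.cos(headings).mean(), np.sin(headings).mean())

def simulate(n=50, attention=0.5, steps=200, noise=0.2, seed=0):
    """Toy alignment model: each step, blend headings toward the group's
    mean heading with weight `attention`, plus directional noise.
    (Linear angle blend; wrap-around is ignored in this toy.)
    """
    rng = np.random.default_rng(seed)
    headings = rng.uniform(-np.pi, np.pi, n)
    for _ in range(steps):
        mean_heading = np.arctan2(np.sin(headings).mean(),
                                  np.cos(headings).mean())
        headings = ((1 - attention) * headings + attention * mean_heading
                    + rng.normal(0.0, noise, n))
    return polarization(headings)
```

    Sweeping `attention` upward raises the final polarization, the toy analogue of the abstract's finding that more attention to neighbors yields more coordinated collective motion.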

    Interactive form creation: exploring the creation and manipulation of free form through the use of interactive multiple input interface

    Most current CAD systems support only the two most common input devices: a mouse and a keyboard, which imposes a limit on the degree of interaction a user can have with the system. However, it is not uncommon for users to work together on the same computer during a collaborative task. Besides that, people tend to use both hands to manipulate 3D objects; one hand is used to orient the object while the other hand performs some operation on it. The same approach could be applied to computer modelling in the conceptual phase of the design process: a designer can rotate and position an object with one hand, and manipulate the shape [deform it] with the other. Accordingly, the 3D object can be easily and intuitively changed through interactive manipulation with both hands. The research investigates the manipulation and creation of free-form geometries through the use of interactive interfaces with multiple input devices. First, the creation of the 3D model will be discussed, and several different types of models will be illustrated. Furthermore, different tools that allow the user to control the 3D model interactively will be presented. Three experiments were conducted using different interactive interfaces; two bi-manual techniques were compared with the conventional one-handed approach. Finally, it will be demonstrated that the use of new and multiple input devices can offer many opportunities for form creation. The problem is that few, if any, systems make it easy for the user or the programmer to use new input devices.
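    The bi-manual split described above, one hand orienting, the other deforming, can be sketched as two independent operations applied to the same mesh within one frame. This is a hypothetical illustration, not the thesis's system; the function names and the cube data are assumptions.

```python
import numpy as np

def rotate_z(vertices, angle):
    """Orienting hand: rotate all vertices about the z-axis."""
    c, s = np.cos(angle), np.sin(angle)
    r = np.array([[c, -s, 0.0],
                  [s, c, 0.0],
                  [0.0, 0.0, 1.0]])
    return vertices @ r.T

def deform(vertices, index, offset):
    """Shaping hand: displace a single control vertex."""
    out = vertices.copy()
    out[index] = out[index] + offset
    return out

# Hypothetical bimanual frame: the left device supplies a rotation delta,
# the right device supplies a vertex drag; both apply to the same model.
cube = np.array([[x, y, z] for x in (0.0, 1.0)
                 for y in (0.0, 1.0)
                 for z in (0.0, 1.0)])
cube = rotate_z(cube, np.pi / 2)
cube = deform(cube, 0, np.array([0.0, 0.0, 0.5]))
```

    Keeping the two operations independent is what lets each input device drive its own stream of edits without the devices blocking one another.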