The Fuzzy and Dynamic Nature of Trust
Trust is one of the most fuzzy, dynamic and complex concepts in both social and business relationships. The difficulty of measuring Trust and predicting Trustworthiness in service-oriented network environments raises many questions, including how to measure the willingness and capability of individuals in the Trust dynamic and how to assign a concrete level of Trust to an individual or Agent. In this paper, we analyze the fuzzy, dynamic and complex nature of Trust. The dynamic nature of Trust creates the biggest challenge in measuring Trust and predicting Trustworthiness. In order to develop a Trustworthiness Measure and Prediction Method, we first need to understand what we can actually measure in a Trust Relationship.
Predicting cognitive difficulty of the deductive mastermind game with dynamic epistemic logic models
Deductive Mastermind is a deductive reasoning game that is implemented in the online educational game system Math Garden. A good understanding of the difficulty of Deductive Mastermind game instances is essential for optimizing the learning experience of players. The available empirical difficulty ratings, based on speed and accuracy, provide robust estimations but do not explain why certain game instances are easy or hard. In previous work, a logic-based model was proposed that successfully predicted these difficulty ratings. We add to this work by providing a model based on a different logical principle—that of eliminating hypotheses (dynamic epistemic logic) instead of reasoning by cases (analytical tableaux system)—that can predict the empirical difficulty ratings equally well. We show that the informational content of the different feedbacks given in game instances is a core predictor for cognitive difficulty ratings and that this is irrespective of the specific logic used to formalize the game.
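The hypothesis-elimination principle behind the dynamic-epistemic model can be illustrated with a small sketch. The colours, code length, and feedback convention below are assumptions for a generic Mastermind-style game, not the actual Math Garden instances:

```python
from itertools import product

def feedback(secret, guess):
    # Exact matches: right colour in the right position.
    exact = sum(s == g for s, g in zip(secret, guess))
    # Colour matches irrespective of position.
    colour = sum(min(secret.count(c), guess.count(c)) for c in set(guess))
    return exact, colour - exact

def eliminate(hypotheses, guess, observed):
    # Keep only the codes consistent with the observed feedback.
    return [h for h in hypotheses if feedback(h, guess) == observed]

colours, length = "RGBY", 3
hypotheses = ["".join(p) for p in product(colours, repeat=length)]
# Suppose the feedback for guess "RGB" was (1 exact, 1 colour-only):
remaining = eliminate(hypotheses, "RGB", (1, 1))
```

The informational content of a feedback corresponds roughly to how many hypotheses it eliminates in one step.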
Granger causality among house price and macroeconomic variables in Victoria
This study analyses the dynamic causal relationships between four macroeconomic variables and house prices. The four macroeconomic variables are interrelated with house prices at certain lags, but these relationships are not always the same as the notions put forward in prior research. The relationships are found to be unstable across the three observation periods. This instability makes house prices difficult to predict, especially for policy makers and market participants.
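Granger causality asks whether lagged values of one series improve prediction of another beyond that series' own lags. A minimal sketch of the underlying F-test, on synthetic data rather than the study's house-price series (a real analysis would use a package such as statsmodels and check stationarity first):

```python
import numpy as np

def granger_f_stat(y, x, lags=2):
    """F-statistic: do lags of x improve prediction of y beyond y's own lags?
    A bare-bones illustration of the Granger idea, not a full implementation."""
    n = len(y)
    Y = y[lags:]
    # Restricted model: intercept plus lags of y only.
    Xr = np.column_stack([np.ones(n - lags)] +
                         [y[lags - k:n - k] for k in range(1, lags + 1)])
    # Unrestricted model: additionally the lags of x.
    Xu = np.column_stack([Xr] +
                         [x[lags - k:n - k] for k in range(1, lags + 1)])
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(Xr), rss(Xu)
    df_num, df_den = lags, (n - lags) - Xu.shape[1]
    return ((rss_r - rss_u) / df_num) / (rss_u / df_den)

rng = np.random.default_rng(0)
x = rng.normal(size=300)
y = np.roll(x, 1) + 0.1 * rng.normal(size=300)  # y is driven by lagged x
```

Here `granger_f_stat(y, x)` is large (x Granger-causes y) while `granger_f_stat(x, y)` is not, mirroring the directional, lag-dependent relationships the study examines.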
Lattices of hydrodynamically interacting flapping swimmers
Fish schools and bird flocks exhibit complex collective dynamics whose
self-organization principles are largely unknown. The influence of
hydrodynamics on such collectives has been relatively unexplored theoretically,
in part due to the difficulty in modeling the temporally long-lived
hydrodynamic interactions between many dynamic bodies. We address this through
a novel discrete-time dynamical system (iterated map) that describes the
hydrodynamic interactions between flapping swimmers arranged in one- and
two-dimensional lattice formations. Our 1D results exhibit good agreement with
previously published experimental data, in particular predicting the
bistability of schooling states and new instabilities that can be probed in
experimental settings. For 2D lattices, we determine the formations for which
swimmers optimally benefit from hydrodynamic interactions. We thus obtain the
following hierarchy: while a side-by-side single-row "phalanx" formation offers
a small improvement over a solitary swimmer, 1D in-line and 2D rectangular
lattice formations exhibit substantial improvements, with the 2D diamond
lattice offering the largest hydrodynamic benefit. Generally, our
self-consistent modeling framework may be broadly applicable to active systems
in which the collective dynamics is primarily driven by a fluid-mediated
memory.
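The flavour of such an iterated map can be sketched as follows. This is a toy system invented purely for illustration; the paper's actual map, coefficients, and wake model are not reproduced here:

```python
import numpy as np

def step(gap, wake, alpha=0.5, decay=0.8, target=1.0):
    """One flapping period of a toy follower-behind-leader map.
    `wake` is a decaying memory of past flaps; the follower's gap relaxes
    toward a preferred spacing while being forced by the wake."""
    wake = decay * wake + np.sin(2 * np.pi * gap)   # fluid-mediated memory
    gap = gap + alpha * (target - gap) + 0.05 * wake
    return gap, wake

gap, wake = 0.3, 0.0
for _ in range(200):
    gap, wake = step(gap, wake)   # iterate the discrete-time map
```

The key structural feature shared with the paper's framework is that the state update depends on an internally stored, decaying fluid memory rather than on instantaneous positions alone.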
Progress in the development of PDF turbulence models for combustion
A combined Monte Carlo-computational fluid dynamics (CFD) algorithm was developed recently at Lewis Research Center (LeRC) for turbulent reacting flows. In this algorithm, conventional CFD schemes are employed to obtain the velocity field and other velocity-related turbulent quantities, and a Monte Carlo scheme is used to solve the evolution equation for the probability density function (pdf) of species mass fraction and temperature. In combustion computations, the predictions of chemical reaction rates (the source terms in the species conservation equations) are poor if conventional turbulence models are used. The main difficulty lies in the fact that the reaction rate is highly nonlinear, so using the averaged temperature produces excessively large errors. Moment closure models for the source terms have attained only limited success. The pdf method seems to be the only alternative at present that uses local instantaneous values of the temperature, density, etc., in predicting chemical reaction rates, and thus may be the only viable approach for more accurate turbulent combustion calculations. Assumed pdfs are useful in simple problems; however, for more general combustion problems, the solution of an evolution equation for the pdf is necessary.
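The core point, that averaging temperature before evaluating a nonlinear rate gives large errors, can be checked in a few lines. The Arrhenius-type parameters and the assumed Gaussian temperature pdf below are illustrative, not taken from any particular flame:

```python
import numpy as np

# Arrhenius-type rate k(T) = A * exp(-Ta / T), illustrative values.
A, Ta = 1.0, 15000.0
rate = lambda T: A * np.exp(-Ta / T)

rng = np.random.default_rng(1)
# Sample "instantaneous" temperatures from an assumed pdf (Gaussian here).
T = rng.normal(1500.0, 200.0, size=100_000)

mean_of_rate = rate(T).mean()   # pdf-style estimate: average k(T) over samples
rate_of_mean = rate(T.mean())   # moment shortcut: k evaluated at the mean T
```

Because `k(T)` is strongly convex over this temperature range, `mean_of_rate` substantially exceeds `rate_of_mean`, which is exactly the bias that motivates solving for the pdf instead of using the averaged temperature.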
The Dynamic Effects of Subconscious Goal Pursuit on Resource Allocation, Task Performance, and Goal Abandonment
We test two potential boundary conditions for the effects of subconscious goals—the nature of the goal that is activated (achievement vs. underachievement) and conscious goal striving. Subconscious achievement goals increase the amount of time devoted to skill acquisition, and this increase in resource allocation leads to higher performance when conscious goals are neutral. However, specific conscious goals undermine the performance benefits of subconscious achievement goals. Subconscious underachievement goals cause individuals to abandon goal pursuit, and this effect is mediated by task performance. Difficult conscious goals neutralize the detrimental effects of subconscious underachievement goals, but only if implemented before performance is undermined. Overall, these results suggest that subconscious achievement goals facilitate task performance, subconscious underachievement goals trigger goal abandonment, and difficult conscious goals moderate these effects depending on the level of resource allocation and timing of goal implementation.
Introducing a framework to assess newly created questions with Natural Language Processing
Statistical models such as those derived from Item Response Theory (IRT)
enable the assessment of students on a specific subject, which can be useful
for several purposes (e.g., learning path customization, drop-out prediction).
However, the questions have to be assessed as well and, although it is possible
to estimate with IRT the characteristics of questions that have already been
answered by several students, this technique cannot be used on newly generated
questions. In this paper, we propose a framework to train and evaluate models
for estimating the difficulty and discrimination of newly created Multiple
Choice Questions by extracting meaningful features from the text of the
question and of the possible choices. We implement one model using this
framework and test it on a real-world dataset provided by CloudAcademy, showing
that it outperforms previously proposed models, reducing by 6.7% the RMSE for
difficulty estimation and by 10.8% the RMSE for discrimination estimation. We
also present the results of an ablation study performed to support our feature
choice and to show the effects of different characteristics of the questions'
text on difficulty and discrimination.
Comment: Accepted at the International Conference on Artificial Intelligence in Education
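The framework's basic idea, mapping text features of a question to an IRT-style difficulty value and scoring with RMSE, can be sketched as follows. The features, questions, and difficulty values are made up for illustration; the paper's actual features and model are much richer:

```python
import numpy as np

def features(question, choices):
    """Toy text features: length, mean word length, number of options."""
    words = (question + " " + " ".join(choices)).split()
    return [len(words),
            sum(len(w) for w in words) / len(words),
            len(choices)]

# Tiny hypothetical training set of Multiple Choice Questions.
X = np.array([features("What is 2+2?", ["3", "4", "5"]),
              features("Which sorting algorithm has worst-case O(n log n) time?",
                       ["quicksort", "mergesort", "bubble sort", "insertion sort"]),
              features("What colour is the sky?", ["blue", "green"])], float)
y = np.array([-1.0, 1.2, -1.8])   # assumed IRT difficulties

# Linear model with intercept, fit by least squares; score with RMSE.
D = np.column_stack([np.ones(len(X)), X])
w, *_ = np.linalg.lstsq(D, y, rcond=None)
rmse = np.sqrt(np.mean((D @ w - y) ** 2))
```

The same pipeline applies to discrimination estimation by swapping the target values; the paper's reported gains are RMSE reductions of this kind measured on the CloudAcademy dataset.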
A Double Error Dynamic Asymptote Model of Associative Learning
In this paper, a formal model of associative learning is presented which incorporates representational and computational mechanisms that, as a coherent corpus, empower it to make accurate predictions of a wide variety of phenomena that have so far eluded a unified account in learning theory. In particular, the Double Error Dynamic Asymptote (DDA) model introduces: 1) a fully-connected network architecture in which stimuli are represented as temporally clustered elements that associate to each other, so that elements of one cluster engender activity on other clusters, which naturally implements neutral stimuli associations and mediated learning; 2) a predictor error term within the traditional error correction rule (the double error), which reduces the rate of learning for expected predictors; 3) a revaluation associability rate that operates on the assumption that outcome predictiveness is tracked over time, so that prolonged uncertainty is learned, reducing the levels of attention to initially surprising outcomes; and critically 4) a biologically plausible variable asymptote, which encapsulates the principle of Hebbian learning, leading to stronger associations for similar levels of cluster activity. The outputs of a set of simulations of the DDA model are presented along with empirical results from the literature. Finally, the predictive scope of the model is discussed.
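The "double error" idea, scaling the usual outcome prediction error by an error on the predictor itself, can be caricatured in a few lines. This is a loose toy reading, not the published DDA equations; the rates, the asymptote, and the way predictor expectedness is tracked are all assumptions:

```python
alpha, lam = 0.3, 1.0
V = 0.0        # cue -> outcome associative strength
P = 0.0        # how well the cue itself is predicted (it becomes "expected")
history = []
for trial in range(50):
    outcome_error = lam - V              # classic error-correction term
    predictor_error = lam - P            # second error, on the predictor
    V += alpha * predictor_error * outcome_error   # "double error" product
    P += alpha * (lam - P)               # the cue grows expected over trials
    history.append(V)
# As the predictor becomes expected, predictor_error shrinks, learning slows,
# and V asymptotes below lam.
```

The qualitative signature this sketch reproduces is point 2) above: learning about an already-expected predictor proceeds more slowly than standard error correction alone would predict.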