Reducing the length of a Goldberg Based Personality Questionnaire using Item Response Theory & Confirmatory Factor Analysis : a thesis presented in partial fulfilment of the requirements for the degree of Masters of Science in Psychology at Massey University, Albany, New Zealand
Objectives: This study seeks to reconstitute an existing personality questionnaire by identifying the items that capture the best quality information as measured through Item Response Theory (IRT). This process will reduce the length of the measure and increase its measurement precision. Method: A polytomous IRT model (Graded Response Model; Samejima, 1969) will be used to assess the psychometric properties of each item in the questionnaire and to produce item-level graphs in order to select the best three items for each of the 26 first-order factors. Confirmatory Factor Analysis (CFA) will be used to assess model fit and unidimensionality before and after the IRT selections are made, illustrating the improvement gained through both the deletion of redundant items and the selection of high-quality items. Results: The questionnaire was reduced from 246 items to 78, with three high-quality items identified for each of the 26 first-order factors. Model fit improved considerably through this selection process, and the loss of information was minimal relative to the number of items deleted. Conclusions: This study illustrates the power of using IRT for test development. The item selections benefit not only the organisation that supplied the data for this study, but also the original developers and any other users of these items, as they are freely available via an online source.
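The Graded Response Model used for the item-level analysis can be sketched in a few lines: each item has a discrimination parameter and ordered category thresholds, and the probability of each response category falls out as the difference between adjacent cumulative logistic curves. The parameter values below are illustrative only, not taken from the thesis.

```python
import math

def grm_category_probs(theta, a, thresholds):
    """Samejima's Graded Response Model: probability of each response
    category for an item with discrimination `a` and ordered category
    thresholds `thresholds` (length K-1 for K categories), at trait
    level `theta`."""
    # Cumulative probability of responding in category k or higher
    def p_star(b):
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))
    cum = [1.0] + [p_star(b) for b in thresholds] + [0.0]
    # Category probabilities are differences of adjacent cumulative curves
    return [cum[k] - cum[k + 1] for k in range(len(cum) - 1)]

# Illustrative values (not from the thesis): a 5-point Likert item
probs = grm_category_probs(theta=0.5, a=1.4, thresholds=[-1.5, -0.5, 0.4, 1.2])
print([round(p, 3) for p in probs])  # five probabilities summing to 1
```

Plotting these curves over a range of theta values yields the item-level graphs from which the most discriminating items are selected.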
Pathological video game playing in Spanish and British adolescents: towards the exploration of Internet Gaming Disorder symptomatology
Research into problematic video gaming has increased greatly over the last decade, and many screening instruments have been developed to identify such behaviour. This study re-examined the Problematic Videogame Playing [PVP] Scale. The objectives of the study were to (i) examine its psychometric properties in two European countries, (ii) estimate the prevalence of potential pathological gaming among adolescents in both countries, and (iii) assess the classification accuracy of the PVP Scale based on its symptomatology, as a way of exploring its relationship with both the behavioural component model of addiction and the proposed Internet Gaming Disorder. Data were collected via a survey administered to 2,356 adolescents aged between 11 and 18 years from Spain (n=1,132) and Great Britain (n=1,224). Results indicated that the reliability of both versions was adequate, and that factorial and construct validity were good. Findings also showed that the prevalence of pathological gaming, estimated with a rigorous cut-off point, was 7.7% among Spanish and 14.6% among British adolescents. The scale showed adequate sensitivity, specificity, and classification accuracy in both countries, and was able to differentiate between social and potential pathological gamers on the basis of their addictive symptomatology. The implications of these findings are discussed.
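The sensitivity, specificity, and classification accuracy reported for the scale are standard functions of a 2x2 confusion matrix (screening result against reference classification). A minimal sketch, using hypothetical counts rather than the study's data:

```python
def classification_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and overall accuracy from a 2x2
    confusion matrix: tp/fp/fn/tn are true-positive, false-positive,
    false-negative, and true-negative counts."""
    sensitivity = tp / (tp + fn)            # pathological gamers correctly flagged
    specificity = tn / (tn + fp)            # social gamers correctly passed
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, accuracy

# Hypothetical counts for illustration, not the study's data
sens, spec, acc = classification_metrics(tp=90, fp=60, fn=25, tn=1049)
print(round(sens, 3), round(spec, 3), round(acc, 3))
```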
A Preliminary Study of Applying Lean Six Sigma Methods to Machine Tool Measurement
Many manufacturers aim to increase their levels of high-quality production in order to improve their market competitiveness. Continuous improvement of maintenance strategies is a key factor in being able to deliver high-quality products and services on time with minimal operating costs. However, maintaining quality is often perceived as a non-value-adding task. Improving the efficiency and effectiveness of the measurement procedures necessary to guarantee accuracy of production is a more complex task than many other maintenance functions and so deserves particular analysis.
This paper investigates the feasibility of producing a concise yet effective framework that will provide a preliminary approach for integrating Lean and Six Sigma philosophies towards the specific goal of reducing unnecessary downtime on manufacturing machines while maintaining their ability to machine to the required tolerance.
The purpose of this study is to show how a Six Sigma infrastructure is used to investigate the root causes of complications occurring during machine tool measurement. This work addresses issues of data uncertainty and measurement procedures alongside the main tools of Six Sigma's Define-Measure-Analyse-Improve-Control (DMAIC) cycle.
The significance of this work is that machine tool accuracy is critical for high-value manufacturing. Over-measuring machines to ensure accuracy potentially reduces production volume; however, not measuring them, or ignoring accuracy aspects, can lead to production waste. This work aims to present lean guidance for reducing measurement uncertainties and optimising machine tool benchmarking procedures, while adopting the DMAIC strategy to reduce unnecessary downtime.
Opinion and community formation in coevolving networks
In human societies opinion formation is mediated by social interactions, consequently taking place on a network of relationships while at the same time influencing the structure of the network and its evolution. To investigate this coevolution of opinions and social interaction structure, we develop a dynamic agent-based network model that takes into account short-range interactions, such as discussions between individuals; long-range interactions, such as a sense of the overall mood modulated by the attitudes of individuals; and an external field corresponding to outside influence. Moreover, individual biases can be naturally taken into account. In addition, the model includes an opinion-dependent link-rewiring scheme to describe network topology coevolving on a slower time scale than that of the opinion formation. With this model, comprehensive numerical simulations and mean-field calculations have been carried out; they show the importance of the separation between fast and slow time scales, which results in the network organising into well-connected small communities of agents sharing the same opinion.
Comment: 10 pages, 5 figures. New inset for Fig. 1 and references added. Submitted to Physical Review
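The coevolution described in the abstract, fast opinion updates on a slowly rewired network, can be illustrated with a deliberately simplified toy model: binary opinions, local majority copying, and rare opinion-dependent rewiring. This is an illustrative sketch, not the paper's actual dynamics, which also include long-range mood coupling and an external field.

```python
import random

def simulate(n=100, steps=20000, rewire_prob=0.02, seed=1):
    """Toy coevolving opinion/network model (illustrative sketch only).
    Agents hold binary opinions and copy the local majority; with small
    probability a link to a disagreeing neighbour is rewired to a
    randomly chosen like-minded agent. Because rewiring is rare, the
    topology changes on a slower time scale than the opinions."""
    rng = random.Random(seed)
    opinion = [rng.choice([-1, 1]) for _ in range(n)]
    nbrs = [set() for _ in range(n)]
    # Start from a random graph with average degree of roughly 6
    for _ in range(3 * n):
        i, j = rng.sample(range(n), 2)
        nbrs[i].add(j); nbrs[j].add(i)

    for _ in range(steps):
        i = rng.randrange(n)
        if not nbrs[i]:
            continue
        if rng.random() < rewire_prob:
            # Slow process: rewire one disagreeing link to a like-minded agent
            disagree = [j for j in nbrs[i] if opinion[j] != opinion[i]]
            agree = [j for j in range(n)
                     if j != i and opinion[j] == opinion[i] and j not in nbrs[i]]
            if disagree and agree:
                old, new = rng.choice(disagree), rng.choice(agree)
                nbrs[i].discard(old); nbrs[old].discard(i)
                nbrs[i].add(new); nbrs[new].add(i)
        else:
            # Fast process: adopt the local majority opinion
            s = sum(opinion[j] for j in nbrs[i])
            if s != 0:
                opinion[i] = 1 if s > 0 else -1
    return opinion, nbrs

opinion, nbrs = simulate()
# Fraction of links joining agents that share an opinion: a crude
# measure of the like-minded communities the model settles into
links = [(i, j) for i in range(len(nbrs)) for j in nbrs[i] if i < j]
same = sum(opinion[i] == opinion[j] for i, j in links) / len(links)
print(round(same, 2))
```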
Computational Controversy
Climate change, vaccination, abortion, Trump: many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have appeared, using new data sources such as Wikipedia, which now help us better understand these phenomena. However, compared with what the social sciences have discovered about such debates, existing computational approaches mostly focus on just a few of the many important aspects of controversies. In order to link the two strands, we provide and evaluate a controversy model that is both rooted in the findings of the social science literature and strongly linked to computational methods. We show how this model can lead to computational controversy analytics with full coverage of the crucial aspects that make up a controversy.
Comment: In Proceedings of the 9th International Conference on Social Informatics (SocInfo) 201