Chaos in computer performance
Modern computer microprocessors are composed of hundreds of millions of
transistors that interact through intricate protocols. Their performance during
program execution may be highly variable, exhibiting aperiodic oscillations. In
this paper, we apply current nonlinear time series analysis techniques to the
performance of modern microprocessors during the execution of prototypical
programs. Our results provide strong evidence that the highly variable
performance dynamics observed during the execution of several programs display
low-dimensional deterministic chaos, with sensitivity to initial conditions
comparable to that of textbook models. Taken together, these results show that
the instantaneous performance of modern microprocessors constitutes a complex
(or at least complicated) system that would benefit from analysis with the
modern tools of nonlinear and complexity science.
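The sensitivity to initial conditions that the abstract compares to textbook models can be illustrated with the logistic map at r = 4, whose largest Lyapunov exponent is known analytically to be ln 2. This is a standard textbook example, not the paper's data or method:

```python
import math

def logistic(x, r=4.0):
    """One step of the logistic map, a textbook chaotic system."""
    return r * x * (1.0 - x)

def lyapunov_exponent(x0=0.2, r=4.0, n=100_000):
    """Estimate the largest Lyapunov exponent as the orbit average of
    log|f'(x)|, where f'(x) = r*(1 - 2x) for the logistic map.
    For r = 4 the exact value is ln 2 ~ 0.693."""
    x, acc = x0, 0.0
    for _ in range(n):
        acc += math.log(abs(r * (1.0 - 2.0 * x)))
        x = logistic(x, r)
    return acc / n

print(lyapunov_exponent())  # converges toward ln 2
```

A positive exponent of this kind is what "sensitivity to initial conditions" means quantitatively: nearby trajectories separate exponentially fast.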
The Role of Practice in Chess: A Longitudinal Study
We investigated the role of practice in the acquisition of chess expertise by administering a questionnaire to 104 players of different skill levels. Players had to report their chess rating, the number of hours of individual and group practice, their use of different learning resources and activities, and whether they had been trained by a coach. The use of archival data enabled us to track the rating of some of the players throughout their careers. We found a strong correlation between chess skill and the number of hours of practice. Moreover, group practice was a better predictor of high-level performance than individual practice. We also found that masters had a higher chess rating than expert players after only three years of serious dedication to chess, although there were no differences in the number of hours of practice. One difference that may explain this variation in rating is that masters start practising at an earlier age than experts. Finally, we found that activities such as reading books and using computer software (game databases, but not playing programs) were important for the development of high-level performance. Together with previous data and theories of expert performance, our results indicate limits of the deliberate practice framework and suggest how best to organise learning in chess and in other fields.
Language Time Series Analysis
We use the Detrended Fluctuation Analysis (DFA) and Grassberger–Procaccia
(GP) methods to study language characteristics. Although we construct our
signals using only word lengths or word frequencies, thereby excluding a huge
amount of the information in language, the GP analysis indicates that
linguistic signals may be considered the manifestation of a complex system of
high dimensionality, different from random signals or from low-dimensional
systems such as the Earth's climate. The DFA method is additionally able to
distinguish a natural-language signal from a computer-code signal. This last
result may be useful in the field of cryptography.
Comment: 21 pages, 5 figures, accepted in Physica
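A minimal sketch of the DFA method named in the abstract: integrate the mean-subtracted signal, detrend it in windows of increasing size, and read the scaling exponent off a log-log fit of fluctuation versus window size. A word-length series would be fed in as the input; here white noise (exponent near 0.5) stands in for illustration:

```python
import numpy as np

def dfa(signal, scales=(8, 16, 32, 64, 128), order=1):
    """Detrended Fluctuation Analysis: returns the scaling exponent alpha.
    alpha ~ 0.5 for white noise, ~ 1.0 for 1/f noise."""
    x = np.asarray(signal, dtype=float)
    y = np.cumsum(x - x.mean())                 # integrated profile
    fluct = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[: n_seg * s].reshape(n_seg, s)  # non-overlapping windows
        t = np.arange(s)
        f2 = []
        for seg in segs:
            coef = np.polyfit(t, seg, order)     # local polynomial trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))       # RMS fluctuation at scale s
    alpha, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return alpha

rng = np.random.default_rng(0)
print(dfa(rng.standard_normal(10_000)))  # white noise: alpha near 0.5
```

The abstract's claim is that this exponent (and the GP correlation dimension) differs systematically between natural-language and computer-code signals.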
Probing quantum-classical boundary with compression software
We experimentally demonstrate that it is impossible to simulate quantum
bipartite correlations with a deterministic universal Turing machine. Our
approach is based on the Normalized Information Distance (NID), which allows
the comparison of two pieces of data without detailed knowledge of their
origin. Using the NID, we derive an inequality for the outputs of two local
deterministic universal Turing machines with correlated inputs. This inequality
is violated by the correlations generated by a maximally entangled polarization
state of two photons. The violation is shown using a freely available lossless
compression program. The presented technique may make it possible to complement
the common statistical interpretation of quantum physics with an algorithmic one.
Comment: 7 pages, 6 figures
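The NID is uncomputable, so in practice it is approximated by the Normalized Compression Distance (NCD), substituting a real compressor's output length for Kolmogorov complexity. A minimal sketch using Python's zlib (the paper uses "a freely available lossless compression program"; zlib here is an illustrative choice, not necessarily theirs):

```python
import zlib

def clen(data: bytes) -> int:
    """Compressed length as a computable stand-in for Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance:
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)).
    Near 0 for closely related data, near 1 for unrelated data."""
    cx, cy, cxy = clen(x), clen(y), clen(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"0101" * 200          # highly regular stream
b_ = b"0101" * 200          # identical stream: small distance
c = bytes(range(256)) * 4   # unrelated stream: large distance
print(ncd(a, b_), ncd(a, c))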
Professional self-efficacy scale for information and computer technology teachers: validity and reliability study
This study aims to develop a valid and reliable scale measuring information and communication technology (ICT) teachers' self-efficacy with respect to the Turkish national framework of ICT competencies. Data were analyzed with exploratory factor analysis (EFA) followed by confirmatory factor analysis (CFA), and a test-retest procedure was carried out to confirm the time invariance of the scale. EFA results revealed that the scale's seven-factor structure accounts for 65.90 percent of the total variance. CFA results provided acceptable statistical support for model-data fit between the observed item scores and the seven-dimension scale structure (χ²/df = 1.98, RMSEA = .073, CFI = .86). The standardized regression weights between the latent and observed variables ranged from .57 to .89, and the Cronbach's alpha coefficients of the scale's sub-dimensions ranged from .80 to .88. In addition, the item-scale correlations varied between .53 and .79. The resulting scale is a Likert-type questionnaire composed of 33 five-point items with seven sub-dimensions.
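The reliability coefficient reported above can be sketched directly from its definition: Cronbach's alpha is k/(k-1) · (1 - Σ item variances / variance of the total score). The data below are simulated 5-point Likert responses for illustration, not the study's data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of row totals
    return k / (k - 1) * (1 - item_vars / total_var)

# Simulated responses: a shared latent level plus small per-item noise,
# clipped to the 1-5 Likert range (illustrative only).
rng = np.random.default_rng(1)
latent = rng.integers(1, 6, size=(100, 1))
noise = rng.integers(-1, 2, size=(100, 5))
scores = np.clip(latent + noise, 1, 5)
print(cronbach_alpha(scores))
```

Because the simulated items share a common latent level, the resulting alpha is high, mirroring the .80-.88 range the study reports for its sub-dimensions.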