Nature of the low temperature ordering of Pr in PrBa_2Cu_3O_(6+x)
A theoretical model is presented to describe the anomalous ordered phase of Pr
ions in PrBa_2Cu_3O_(6+x) below T_Pr = 12-17 K. The model considers the Pr
multipole degrees of freedom and coupling between the Cu and Pr subsystems. We
identify the symmetry allowed coupling of Cu and Pr ions and conclude that only
an ab-plane Pr dipole ordering can explain the Cu spin rotation observed at
T_Pr by neutron diffraction by Boothroyd et al. [A. T. Boothroyd et al., Phys.
Rev. Lett. 78, 130 (1997)]. A substantial enhancement of the Pr ordering
temperature is shown to arise from the Cu-Pr coupling, which is the key to the
anomalous magnetic behavior in PrBa_2Cu_3O_(6+x).
Comment: 6 pages, 4 figures
Five seconds or sixty? Presentation time in expert memory
The template theory presented in Gobet and Simon (1996a, 1998) is based on the EPAM theory (Feigenbaum & Simon, 1984; Richman et al., 1995), including the numerical parameters that have been estimated in tests of the latter; it therefore offers precise predictions for the timing of cognitive processes during the presentation and recall of chess positions. This paper describes the behavior of CHREST, a computer implementation of the template theory, in a task in which the presentation time for the recall of both game and random positions is systematically varied from one second to sixty seconds, and compares the model to human data. As predicted by the model, strong players are better than weak players with both types of positions. Their superiority with random positions is especially clear with long presentation times, but is also present after brief presentation times, although smaller in absolute value. CHREST accounts for the data, both qualitatively and quantitatively. Strong players’ superiority with random positions is explained by the large number of chunks they hold in LTM; their high recall percentage with short presentation times is explained by the presence of templates, a special class of chunks. The model is compared to other theories of chess skill, which either cannot account for the superiority of Masters with random positions (models based on high-level descriptions and on levels of processing) or predict too strong a performance of Masters with random positions (long-term working memory).
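As a purely illustrative sketch (not the CHREST implementation, and using invented piece patterns), the chunk-based recall mechanism the abstract appeals to can be expressed as: the pieces recalled are those covered by recognized chunks, so a larger chunk store yields higher recall even without more short-term memory capacity:

```python
def recall_fraction(position, chunks, capacity=7):
    """Toy chunk-based recall: recognize up to `capacity` stored chunks
    (piece patterns that are subsets of the position) and recall the
    union of the pieces they cover."""
    hits = [c for c in chunks if c <= position][:capacity]
    covered = set().union(*hits) if hits else set()
    return len(covered) / len(position)

# Hypothetical 6-piece position: (piece, square) pairs around a castled king.
pos = {("K", "g1"), ("R", "f1"), ("P", "f2"), ("P", "g2"), ("P", "h2"), ("N", "f3")}

novice_chunks = [frozenset({("K", "g1"), ("R", "f1")})]
master_chunks = novice_chunks + [
    frozenset({("P", "f2"), ("P", "g2"), ("P", "h2")}),  # pawn shield pattern
    frozenset({("N", "f3"), ("K", "g1")}),               # knight-defends-king pattern
]

print(recall_fraction(pos, novice_chunks))  # novice recalls 2 of 6 pieces
print(recall_fraction(pos, master_chunks))  # master's extra chunks cover all 6
```

On a random position fewer stored patterns match, but a sufficiently large chunk store still matches some by chance, which is the qualitative effect the abstract describes.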
Recall of rapidly presented random chess positions is a function of skill.
A widely cited result asserts that experts’ superiority over novices in recalling meaningful material from their domain of expertise vanishes when random material is used. A review of recent chess experiments in which random positions served as control material (presentation time between 3 and 10 seconds) shows, however, that strong players generally maintain some superiority over weak players even with random positions, although the relative difference between skill levels is much smaller than with game positions. The implications of this finding for expertise in chess are discussed, and the question of the recall of random material in other domains is raised.
Recall of random and distorted positions: Implications for the theory of expertise.
This paper explores the question, important to the theory of expert performance, of the nature and number of chunks that chess experts hold in memory. It examines how memory contents determine players' abilities to reconstruct (a) positions from games, (b) positions distorted in various ways, and (c) random positions. Comparison of a computer simulation with a human experiment supports the usual estimate that chess Masters store some 50,000 chunks in memory. The observed impairment of recall when positions are modified by mirror-image reflection implies that each chunk represents a specific pattern of pieces in a specific location. A good account of the results of the experiments is given by the template theory proposed by Gobet and Simon (in press) as an extension of Chase and Simon's (1973a) initial chunking proposal, and in agreement with other recent proposals for modification of the chunking theory (Richman, Staszewski & Simon, 1995) as applied to various recall tasks.
The Roles of recognition processes and look-ahead search in time-constrained expert problem solving: Evidence from grandmaster level chess.
Chess has long served as an important standard task environment for research on human memory and problem-solving abilities and processes. In this paper, we report evidence on the relative importance of recognition processes and planning (look-ahead) processes in very high level expert performance in chess. The data show that the rated skill of a top-level grandmaster is only slightly lower when he is playing simultaneously against a half dozen grandmaster opponents than under tournament conditions that allow much more time for each move. As simultaneous play allows little time for look-ahead processes, the data indicate that recognition, based on superior chess knowledge, plays a much larger part in high-level skill in this task than does planning by looking ahead.
Core drill's bit is replaceable without withdrawal of drill stem - A concept
The drill bit is divided into several sectors. When these are collapsed, the bit's outside diameter is small enough for it to be forced down the drill stem; when it reaches the bottom, the sectors are forced outward to form a cutting bit. A dulled bit is retracted by reversing this procedure.
Logahedra: A new weakly relational domain
Weakly relational numeric domains express restricted classes of linear inequalities that strike a balance between what can be described and what can be efficiently computed. Popular weakly relational domains such as bounded differences and octagons have found application in model checking and abstract interpretation. This paper introduces logahedra, which are more expressive than octagons, but less expressive than arbitrary systems of two-variables-per-inequality constraints. Logahedra allow the coefficients of inequalities to be powers of two whilst retaining many of the desirable algorithmic properties of octagons.
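The constraint class can be made concrete with a small sketch. Assuming integer powers of two for simplicity (the abstract does not fix a representation, so the class and field names below are invented for illustration), a logahedral inequality over two variables might look like:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LogConstraint:
    """One logahedral inequality a*x[i] + b*x[j] <= c, where each
    coefficient is 0 or +/- a power of two. With |a| = |b| = 1 this
    degenerates to an octagon constraint."""
    i: int
    j: int
    a: int
    b: int
    c: float

    def __post_init__(self):
        for coef in (self.a, self.b):
            m = abs(coef)
            # 0 is allowed; otherwise the magnitude must be a power of two
            assert m == 0 or (m & (m - 1)) == 0, "coefficient must be 0 or +/-2^k"

    def holds(self, x):
        return self.a * x[self.i] + self.b * x[self.j] <= self.c

def satisfies(system, x):
    """True iff the point x lies inside the logahedron described by system."""
    return all(con.holds(x) for con in system)

octagon_con = LogConstraint(0, 1, 1, -1, 3.0)   # x0 - x1 <= 3 (octagon case)
sloped_con = LogConstraint(0, 1, 4, -1, 10.0)   # 4*x0 - x1 <= 10 (logahedra only)
print(satisfies([octagon_con, sloped_con], [2.0, 0.0]))
```

The power-of-two restriction is what preserves octagon-like algorithmics while admitting the extra sloped faces shown above.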
WMTrace : a lightweight memory allocation tracker and analysis framework
The diverging gap between processor and memory performance has been a well discussed aspect of computer architecture literature for some years. The use of multi-core processor designs has, however, brought new problems to the design of memory architectures - increased core density without matched improvement in memory capacity is reducing the available memory per parallel process. Multiple cores accessing memory simultaneously degrade performance as a result of resource contention for memory channels and physical DIMMs. These issues combine to ensure that memory remains an on-going challenge in the design of parallel algorithms which scale. In this paper we present WMTrace, a lightweight tool to trace and analyse memory allocation events in parallel applications. This tool is able to dynamically link to pre-existing application binaries, requiring no source code modification or recompilation. A post-execution analysis stage enables in-depth analysis of traces to be performed, allowing memory allocations to be analysed by time, size or function. The second half of this paper features a case study in which we apply WMTrace to five parallel scientific applications and benchmarks, demonstrating its effectiveness at recording high-water-mark memory consumption as well as per-function memory use over time. An in-depth analysis is provided for an unstructured mesh benchmark which reveals significant memory allocation imbalance across its participating processes.
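The kind of analysis described can be sketched in miniature. The snippet below is a rough Python analogue using the standard-library tracemalloc module; it is not WMTrace itself, which interposes on malloc/free in unmodified binaries via dynamic linking rather than tracing Python-level allocations:

```python
import tracemalloc

def run_with_alloc_trace(fn, *args):
    """Run fn while tracing allocations; report the high-water mark
    and the top allocation sites, in the spirit of a post-execution
    analysis that breaks memory use down by function or source line."""
    tracemalloc.start()
    result = fn(*args)
    _current, peak = tracemalloc.get_traced_memory()
    snapshot = tracemalloc.take_snapshot()
    tracemalloc.stop()
    top_sites = snapshot.statistics("lineno")[:3]  # allocations grouped by source line
    return result, peak, top_sites

def workload(n):
    buf = [bytes(1024) for _ in range(n)]  # roughly n KiB of live allocations
    return len(buf)

result, peak, top_sites = run_with_alloc_trace(workload, 100)
print(result, peak)
```

A real allocator-level tracker would additionally capture per-process traces so that allocation imbalance across MPI ranks, as in the case study, could be compared after the run.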
Modeling of the heat transfer in bypass transitional boundary-layer flows
A low Reynolds number k-epsilon turbulence model and conditioned momentum, energy and turbulence equations were used to predict bypass transition heat transfer on a flat plate in a high-disturbance environment with zero pressure gradient. The use of conditioned equations was demonstrated to be an improvement over the use of the global-time-averaged equations for the calculation of velocity profiles and turbulence intensity profiles in the transition region of a boundary layer. The approach of conditioned equations is extended to include heat transfer, and a modeling of transition events is used to predict transition onset and the extent of transition on a flat plate. The events, which describe the boundary layer at the leading edge, result in boundary-layer regions consisting of: (1) the laminar, (2) pseudolaminar, (3) transitional, and (4) turbulent boundary layers. The modeled transition events were incorporated into the TEXSTAN 2-D boundary-layer code, which is used to numerically predict the heat transfer. The numerical predictions in general compared well with the experimental data and revealed areas where additional experimental information is needed.
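The intuition behind conditioned averaging in the transition region can be illustrated with the standard intermittency-weighted blend. This is a generic Dhawan-Narasimha-style algebraic sketch with assumed constants and made-up inputs, not the conditioned-equation model or the TEXSTAN implementation described in the abstract:

```python
import math

def intermittency(x, x_t, lam):
    """Intermittency factor gamma(x): the fraction of time the boundary
    layer is turbulent at streamwise station x. Zero upstream of the
    transition onset x_t, approaching one downstream; `lam` is a
    transition length scale. The 0.412 constant is the classical
    Dhawan-Narasimha value, assumed here for illustration."""
    if x <= x_t:
        return 0.0
    xi = (x - x_t) / lam
    return 1.0 - math.exp(-0.412 * xi * xi)

def blended_stanton(st_lam, st_turb, gamma):
    # Intermittency-weighted average of laminar and turbulent
    # heat-transfer predictions for the transitional region.
    return (1.0 - gamma) * st_lam + gamma * st_turb

g = intermittency(x=1.5, x_t=1.0, lam=0.5)
print(blended_stanton(0.001, 0.003, g))
```

Conditioning the full momentum and energy equations, as the paper does, refines this idea by solving separate equation sets for the laminar-like and turbulent intervals rather than blending two precomputed answers.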