    The use of ash at Late Lower Paleolithic Qesem Cave, Israel—An integrated study of use-wear and residue analysis

    Employing an integrated approach to investigate the use of Late Lower Paleolithic flint tools found at the site of Qesem Cave (Israel), we revealed a particular trace pattern related to the employment of ashes at the site. Using a designated collection of replica items and combining use-wear and residue (morphological analysis, FTIR, SEM-EDX) analyses, we revealed the intentional use of ashes in preserving foods for delayed consumption as well as hide for delayed processing. We believe our interpretation is the most plausible one, since we were able to delineate the specific use-wear fingerprints of the intentional use of ashes for such purposes, suggesting that our approach might be useful for recognizing other similar functional-behavioral patterns. Lastly, in support of previous findings at Qesem Cave, our current findings present evidence for the processing of organic matter intentionally mixed with ash, leading us to suggest that the inhabitants of Qesem Cave were proficient not only in the habitual use of fire but also of its main by-product, ash. Hence, we call for a reassessment of the timeline currently assigned to hominins’ utilization of ash for storing and processing vegetal foods and hide.

    The Chickpea, Summer Cropping, and a New Model for Pulse Domestication in the Ancient Near East

    The widely accepted models describing the emergence of domesticated grain crops from their wild-type ancestors are mostly based upon selection (conscious or unconscious) of major features related either to seed dispersal (nonbrittle ear, indehiscent pod) or free germination (nondormant seeds, soft seed coat). Based on the breeding systems (self-pollination) and dominance relations between the allelomorphs of seed dispersal mode and seed dormancy, it was postulated that establishment of the domesticated forms and replacement of the wild ancestral populations occurred in the Near East within a relatively short time. Chickpea (Cicer arietinum L.), however, appears as an exception among all other “founder crops” of Old World agriculture because of its ancient conversion into a summer crop. The chickpea is also exceptional because its major domestication trait appears to be vernalization insensitivity rather than pod indehiscence or free germination. Moreover, the genetic basis of vernalization response in wild chickpea (Cicer reticulatum Ladiz.) is polygenic, suggesting that a long domestication process was imperative due to the elusive phenotype of vernalization nonresponsiveness. There is also a gap in chickpea remains in the archaeological record between the Late Prepottery Neolithic and the Early Bronze Age. Contrary to the common view that Levantine summer cropping was introduced relatively late (Early Bronze Age), we argue for an earlier (Neolithic) Levantine origin of summer cropping because chickpea, when grown as a common winter crop, was vulnerable to the devastating pathogen Didymella rabiei, the causal agent of Ascochyta blight. The ancient (Neolithic) conversion of chickpea into a summer crop required seasonal differentiation of agronomic operation from the early phases of the Neolithic revolution. This topic is difficult to deal with, as direct data on seasonality in prehistoric Old World field crop husbandry are practically nonexistent. Consequently, this issue has hardly been dealt with in the literature. Information on the seasonality of ancient (Neolithic, Chalcolithic, and Early Bronze Age, calibrated 11,500 to 4,500 years before present) Near Eastern agriculture may improve our understanding of the proficiency of early farmers. This in turn may provide better insight into Neolithic agrotechniques and scheduling. It is difficult to fully understand chickpea domestication without a Neolithic seasonal differentiation of agronomic practice, because the rapid establishment of the successful Near Eastern crop package, which included wheats, barley, pea, lentil, vetches, and flax, would have preempted the later domestication of this rare wild legume.

    A blade for all seasons? Making and using Amudian blades at Qesem Cave, Israel

    The Qesem Cave prehistoric sequence consists of one dominant lithic industry - the Amudian, a part of the Acheulo-Yabrudian (late Lower Paleolithic) cultural complex. The Acheulo-Yabrudian complex comprises three major lithic industries - Acheulo-Yabrudian, Yabrudian and Pre-Aurignacian/Amudian. While the first two industries are dominated by typical Lower Paleolithic lithic traditions, namely flake production technologies, hand-axes and scrapers, the Amudian presents an innovative blade industry. This relatively poorly known industry is of importance being stratigraphically situated between the Lower Paleolithic Acheulian and the Middle Paleolithic Mousterian. The available radiometric dates for this entity indicate a range from ca. 400 to about 200 kyr. The Amudian in the Levant is characterized by systematic blade production and a major component of shaped blades. At Qesem Cave the majority of the lithic artefacts belong to the Amudian industry with distinctive blade-dominated assemblages throughout a stratigraphic sequence of 7.5 meters. During the 2006 excavation season a scraper-dominated Yabrudian assemblage was discovered, indicating variability and more complex human behaviour at the cave rather than specialized blade-related activities only. The Amudian at Qesem Cave is a very early blade production industry and it reflects technological choices of the artisans as well as specific modes of resource exploitation and subsistence activities. This paper will summarize the current state of research on the Qesem cave lithic assemblages, focusing on the composition of the rich Amudian assemblages, the reconstruction of Amudian blade production and the functional interpretation of Amudian blades. A short survey of the new Yabrudian assemblage will be provided as well. 
We finally discuss interpretations of Acheulo-Yabrudian lithic variability and the meaning of late Lower Paleolithic blade production as a technological, functional, and cultural phenomenon.

    Differences in Multitask Resource Reallocation After Change in Task Values

    Objective: To characterize multitask resource reallocation strategies when managing subtasks with various assigned values. Background: When solving a resource conflict in multitasking, Salvucci and Taatgen predict that a globally rational strategy will be followed that favors the most urgent subtask and optimizes global performance. However, Katidioti and Taatgen identified a locally rational strategy that optimizes only a subcomponent of the whole task, leading to detrimental consequences for global performance. Moreover, the question remains open whether expertise affects the choice of strategy. Method: We adopted a multitask environment used for pilot selection, with a change in emphasis on two out of four subtasks while all subtasks had to be maintained above a minimum performance level. A laboratory eye-tracking study contrasted 20 recently selected pilot students, considered experienced with this task, with 15 university students considered novices. Results: When two subtasks were emphasized, novices focused their resources particularly on one high-value subtask and failed to prevent both low-value subtasks from falling below minimum performance. In contrast, experienced participants delayed the processing of one low-value subtask but managed to optimize global performance. Conclusion: In a multitasking environment where some subtasks are emphasized, novices follow a locally rational strategy whereas experienced participants follow a globally rational strategy. Application: During complex training, trainees are only able to adjust their resource allocation strategy to subtask emphasis changes once they are familiar with the multitasking environment.

    Striatal Volume Predicts Level of Video Game Skill Acquisition

    Video game skills transfer to other tasks, but individual differences in performance and in learning and transfer rates make it difficult to identify the source of transfer benefits. We asked whether variability in initial acquisition and in improvement in performance on a demanding video game, the Space Fortress game, could be predicted by variations in the pretraining volume of either of 2 key brain regions implicated in learning and memory: the striatum, implicated in procedural learning and cognitive flexibility, and the hippocampus, implicated in declarative memory. We found that hippocampal volumes did not predict learning improvement but that striatal volumes did. Moreover, for the striatum, the volumes of the dorsal striatum predicted improvement in performance but the volumes of the ventral striatum did not. Both ventral and dorsal striatal volumes predicted early acquisition rates. Furthermore, this early-stage correlation between striatal volumes and learning held regardless of the cognitive flexibility demands of the game versions, whereas the predictive power of the dorsal striatal volumes held selectively for performance improvements in a game version emphasizing cognitive flexibility. These findings suggest a neuroanatomical basis for the superiority of training strategies that promote cognitive flexibility and transfer to untrained tasks. Funding: United States Office of Naval Research (grant number N00014-07-1-0903).

    Watching People Making Decisions: A Gogglebox on Online Consumer Interaction

    This paper presents a research study, using eye-tracking technology, to measure participant cognitive load when encountering micro-decisions. It elaborates on and improves a pilot study that was used to test the experiment design. Prior research that led to a taxonomy of decision constructs faced in online transactional processes is discussed. The main findings relate to participants’ subjective cognitive load and task error rates.

    Simulation Training in U.K. General Aviation: An Undervalued Aid to Reducing Loss of Control Accidents

    Analysis of data from 1,007 U.K. general aviation (GA) accidents demonstrates that the predominant cause of accidents is loss of control, exacerbated by a lack of recent flying experience. These are long-standing problems that can be targeted effectively with simulation training. Discussion of training strategies in commercial aviation reinforces the logic of introducing simulation training for the GA pilot. The conclusions drawn affirm that GA safety would benefit from the implementation of regulated simulation training.

    Tactile Stimulation of the Human Head for Information Display

    A series of three studies was conducted to explore the use of tactile stimulation, or light tapping, of the human head to inform a pilot of possible threats or other situations in the flight environment. Study 1 confirmed that subjects could achieve 100% detection of the tactile stimuli. Localization performance, measured in Study 2, depended on the number of different stimulus sites and ranged from 93% accuracy for 6 sites to 47% accuracy for 12 sites across the parietal meridian of the head. In Study 3 we investigated the effect of performing the localization task simultaneously with a dual memory/tracking task or an air combat simulation task. These studies demonstrated that tactile information display could be an integral contributor to improved situation awareness, but not without cost to other task performance. The results of Study 3 were also examined with reference to popular models of attention and workload.

    No-go trials can modulate switch cost by interfering with effects of task preparation

    It has recently been shown that the cost associated with switching tasks is eliminated following ‘no-go’ trials, in which response selection is not completed, suggesting that the switch cost depends on response selection. However, no-go trials may also affect switch costs by interfering with the effects of task preparation that precede response selection. To test this hypothesis, we compared switch costs following standard go trials with those following two types of non-response trials: no-go trials, in which a stimulus is presented indicating that no response should be made (Experiment 1); and cue-only trials, in which no stimulus is presented following the task cue (Experiment 2). We hypothesized that eliminating no-go stimuli would reveal effects of task preparation on the switch cost in cue-only trials. We found no switch cost following no-go trials (Experiment 1), but a reliable switch cost in cue-only trials (i.e., when no-go stimuli were removed; Experiment 2). We conclude that no-go trials can modulate the switch cost, independent of their effect on response selection, by interfering with task preparation, and that the effects of task preparation on switch cost are more directly assessed by cue-only trials.