88,804 research outputs found

    Using protocol analysis to explore the creative requirements engineering process

    Protocol analysis is an empirical method applied by researchers in cognitive psychology and behavioural analysis. It can be used to collect, document and analyse the thought processes of an individual problem solver. In general, research subjects are asked to think aloud while performing a given task. Their verbal reports are transcribed and represent a sequence of their thoughts and cognitive activities. These verbal reports are analysed to identify relevant segments of cognitive behaviour by the research subjects. The analysis results may be cross-examined (or validated through retrospective interviews with the research subjects). This paper offers a critical analysis of this research method, its approaches to data collection and analysis, and its strengths and limitations, and discusses its use in information systems research. The aim is to explore the use of protocol analysis in studying the creative requirements engineering process.
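
    To make the coding step concrete, here is a minimal sketch in Python of how transcribed think-aloud segments might be tagged with cognitive-activity codes. The coding scheme, keywords and transcript text are invented for illustration; real protocol analysis uses richer, validated coding schemes and human coders.

        # Toy protocol-analysis coder: assigns cognitive-activity codes to
        # think-aloud transcript segments by simple keyword matching.
        # The scheme and segments below are illustrative assumptions only.

        CODING_SCHEME = {
            "goal_setting": ["need to", "want to", "goal"],
            "evaluation":   ["check", "verify", "looks wrong"],
            "planning":     ["first", "then", "next step"],
        }

        def code_segment(segment: str) -> list[str]:
            """Return all codes whose keywords occur in the segment."""
            text = segment.lower()
            return [code for code, keywords in CODING_SCHEME.items()
                    if any(kw in text for kw in keywords)] or ["uncoded"]

        transcript = [
            "I need to understand what the user actually wants here.",
            "First I list the requirements, then I group them.",
            "Let me check whether this conflicts with the earlier constraint.",
        ]

        for seg in transcript:
            print(code_segment(seg), "<-", seg)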

    Best practices for HPM-assisted performance engineering on modern multicore processors

    Many tools and libraries employ hardware performance monitoring (HPM) on modern processors, and using this data for performance assessment and as a starting point for code optimizations is very popular. However, such data is only useful if it is interpreted with care, and if the right metrics are chosen for the right purpose. We demonstrate the sensible use of hardware performance counters in the context of a structured performance engineering approach for applications in computational science. Typical performance patterns and their respective metric signatures are defined, and some of them are illustrated using case studies. Although these generic concepts do not depend on specific tools or environments, we restrict ourselves to modern x86-based multicore processors and use the likwid-perfctr tool under the Linux OS.
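
    As a rough illustration of how raw counter readings become a "metric signature", the Python sketch below derives memory bandwidth, flop rate and arithmetic intensity from counter values. The counter names and numbers are placeholders, not actual likwid-perfctr output or event names.

        # Derive performance metrics from raw hardware counter readings.
        # Counter names/values are invented placeholders for illustration.

        CACHE_LINE_BYTES = 64

        counters = {                     # hypothetical measurement of one run
            "runtime_s":  2.0,
            "FP_OPS":     4.0e10,        # retired floating-point operations
            "MEM_LOADS":  1.5e9,         # cache lines loaded from memory
            "MEM_STORES": 0.5e9,         # cache lines stored to memory
        }

        mem_bytes = (counters["MEM_LOADS"] + counters["MEM_STORES"]) * CACHE_LINE_BYTES
        bandwidth = mem_bytes / counters["runtime_s"]            # B/s
        flop_rate = counters["FP_OPS"] / counters["runtime_s"]   # FLOP/s
        intensity = counters["FP_OPS"] / mem_bytes               # FLOP/Byte

        print(f"bandwidth: {bandwidth/1e9:.1f} GB/s")
        print(f"flop rate: {flop_rate/1e9:.1f} GFLOP/s")
        print(f"arithmetic intensity: {intensity:.2f} FLOP/Byte")

    Comparing such derived metrics against machine limits (peak bandwidth, peak flop rate) is what lets a pattern like "memory-bound" be recognized from its signature.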

    Performance Analysis of a Novel GPU Computation-to-core Mapping Scheme for Robust Facet Image Modeling

    Though the GPGPU concept is well known in image processing, much more work remains to be done to fully exploit GPUs as an alternative computation engine. This paper investigates computation-to-core mapping strategies to probe the efficiency and scalability of the robust facet image modeling algorithm on GPUs. Our fine-grained computation-to-core mapping scheme shows a significant performance gain over the standard pixel-wise mapping scheme. Through in-depth performance comparisons across the two mapping schemes, we analyze the impact of the level of parallelism on GPU computation and suggest two principles for optimizing future image processing applications on the GPU platform.
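
    The contrast between the two mapping schemes can be sketched as index arithmetic. The Python/NumPy code below stands in for GPU kernels; the window size and the four-way split are illustrative assumptions, not the paper's exact scheme.

        import numpy as np

        # Stand-in for GPU kernels: pixel-wise vs. fine-grained
        # computation-to-core mapping for a KxK facet (window) operation.

        K = 5
        img = np.random.rand(64, 64)
        padded = np.pad(img, K // 2, mode="edge")

        def pixel_wise(y, x):
            # One "thread" per pixel: it processes the whole KxK window alone.
            return padded[y:y + K, x:x + K].sum()

        def fine_grained(y, x, parts=4):
            # The window's rows are split across `parts` cooperating "threads";
            # each computes a partial sum, which are then combined
            # (on a GPU: a reduction in shared memory).
            partials = [padded[y + r:y + K:parts, x:x + K].sum()
                        for r in range(parts)]
            return sum(partials)

        y, x = 10, 20
        assert np.isclose(pixel_wise(y, x), fine_grained(y, x))
        print("both mappings agree:", pixel_wise(y, x))

    The fine-grained variant exposes more parallelism per pixel, which on a GPU can improve occupancy when the per-window work is large.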

    Self-stabilizing K-out-of-L exclusion on tree network

    In this paper, we address the problem of K-out-of-L exclusion, a generalization of the mutual exclusion problem, in which there are ℓ units of a shared resource, and any process can request up to k units (1 ≤ k ≤ ℓ). We propose the first deterministic self-stabilizing distributed K-out-of-L exclusion protocol in message-passing systems for asynchronous oriented tree networks that assumes bounded local memory for each process.
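
    As a point of reference for the resource model only, a centralized k-out-of-ℓ allocator can be written with a condition variable, as in the Python sketch below. This toy is emphatically not the paper's contribution, which is a self-stabilizing, message-passing protocol on tree networks; names and structure here are invented for illustration.

        import threading

        # Centralized toy model of k-out-of-l exclusion: a pool of l resource
        # units from which each process may atomically acquire up to k units.

        class KOutOfL:
            def __init__(self, l_units: int, k_max: int):
                assert 1 <= k_max <= l_units
                self.free, self.k_max = l_units, k_max
                self.cond = threading.Condition()

            def acquire(self, k: int):
                assert 1 <= k <= self.k_max
                with self.cond:
                    while self.free < k:      # wait until k units are free
                        self.cond.wait()
                    self.free -= k

            def release(self, k: int):
                with self.cond:
                    self.free += k
                    self.cond.notify_all()    # waiting processes re-check

        pool = KOutOfL(l_units=5, k_max=3)
        pool.acquire(2)
        pool.release(2)

    The distributed, self-stabilizing version must achieve the same invariant (never more than ℓ units in use) without shared memory, despite arbitrary initial states and using only bounded memory per process.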

    What is Strategic Competence and Does it Matter? Exposition of the Concept and a Research Agenda

    Drawing on a range of theoretical and empirical insights from strategic management and the cognitive and organizational sciences, we argue that strategic competence constitutes the ability of organizations, and the individuals who operate within them, to work within their cognitive limitations in such a way that they are able to maintain an appropriate level of responsiveness to the contingencies confronting them. Using the language of the resource-based view of the firm, we argue that this meta-level competence represents a confluence of individual and organizational characteristics, suitably configured to enable the detection of those weak signals indicative of the need for change and to act accordingly, thereby minimising the dangers of cognitive bias and cognitive inertia. In an era of unprecedented informational burdens and instability, we argue that this competence is central to the longer-term survival and well-being of the organization. We conclude with a consideration of the major scientific challenges that lie ahead if the ideas contained within this paper are to be validated.

    Life of occam-Pi

    This paper considers some questions prompted by a brief review of the history of computing. Why is programming so hard? Why is concurrency considered an “advanced” subject? What’s the matter with Objects? Where did all the Maths go? In searching for answers, the paper looks at some concerns over fundamental ideas within object orientation (as represented by modern programming languages), before focussing on the concurrency model of communicating processes and its particular expression in the occam family of languages. In that focus, it looks at the history of occam, its underlying philosophy (Ockham’s Razor), its semantic foundation on Hoare’s CSP, its principles of process oriented design and its development over almost three decades into occam-π (which blends in the concurrency dynamics of Milner’s π-calculus). Also presented is an urgent need for rationalisation: occam-π is an experiment that has demonstrated significant results, but it now needs time for careful review and for implementing the conclusions of that review. Finally, the future is considered. In particular, is there a future?
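
    For readers unfamiliar with the communicating-process model, the Python sketch below gives a loose analogue of an occam-style synchronous channel between two processes, using threads and a rendezvous handoff. occam's semantics (compiler-checked aliasing rules, ALT choice, etc.) are far richer; the Channel class and names here are invented for illustration.

        import threading, queue

        # Loose analogue of occam-style communicating processes: two threads
        # connected by a channel whose send blocks until the value is taken,
        # mimicking occam's synchronous rendezvous.

        class Channel:
            def __init__(self):
                self._q = queue.Queue(maxsize=1)
                self._taken = threading.Semaphore(0)

            def send(self, value):       # blocks until the receiver has taken it
                self._q.put(value)
                self._taken.acquire()

            def receive(self):
                value = self._q.get()
                self._taken.release()
                return value

        chan = Channel()

        def producer():
            for n in range(3):
                chan.send(n * n)         # rendezvous with the consumer

        def consumer():
            for _ in range(3):
                print("received", chan.receive())

        t1 = threading.Thread(target=producer)
        t2 = threading.Thread(target=consumer)
        t1.start(); t2.start(); t1.join(); t2.join()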

    From intersubjectivity to interculturalism in digital learning environments

    The paper presents the work of the research program “Studies on Intermediality as Intercultural Mediation” (SIIM), a joint international venture that seeks to provide blended-learning (both online and in-classroom) methodologies for the development of interculturalism and associated emotional empathic responses through the study of art and literary fiction. Technological development is consistent with the human desire to draw on previous information and experiences in order to apply acquired knowledge to present life conditions and, furthermore, make improvements for the future. It is therefore logical that human agentive consciousness has been directed towards encouraging action at a distance by all possible means; the evolution of media technologies bears witness to this fact. This paper explores the paradoxes behind the growing emphasis on spatial metaphors during the 20th century and a dynamic concept of space as the site of relational constructions, where forms and structural patterns become formations constructed in interaction, and where the limit or border becomes a constitutive feature, immanently connected with the possibility of its transgression. The paper contends that the development of mass media communication, and particularly the digital turn, has dramatically impacted topographical spaces, both sociocultural and individual, and that the emphasis on “inter” perspectives, hybridism, ambiguities, differences and meta-cognitive articulations of awareness of limits and their symbolic representations, and the desire either to transgress limits or to articulate “in-between”, intercultural “third spaces”, etc., are symptomatic of structural problems at the spatial-temporal interface of culture and its representations. Finally, the paper draws attention to research on the neuroscientific basis of intersubjectivity in order to point out the material basis of human knowledge and cognition and its relationship to the archiving of historical memory and information transfer through education. It also offers a brief introduction to the dynamics of SIIM.