84,036 research outputs found

    WEAK MEASUREMENT THEORY AND MODIFIED COGNITIVE COMPLEXITY MEASURE

    Measurement is one of the open problems in software engineering. Since traditional measurement theory has a major problem in relating empirical observations on software entities to their measured quantities, Morasca has tried to solve this problem by proposing weak measurement theory. In this paper, we evaluate the applicability of weak measurement theory by applying it to a newly proposed Modified Cognitive Complexity Measure (MCCM). We also investigate the applicability of the weak extensive structure for deciding on the type of scale for MCCM. It is observed that MCCM is on a weak ratio scale.
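
    To make the distinction concrete, here is a minimal Python sketch of the representation condition and one common way of weakening it (requiring only that the empirical ordering imply the numerical one, rather than a full equivalence). The entities, the empirical judgments, and the measured values are invented for illustration and are not Morasca's formalism or the actual MCCM:

        from itertools import permutations

        # Hypothetical empirical "at least as complex as" judgments over
        # four code fragments (illustrative data only).
        entities = ["f1", "f2", "f3", "f4"]
        geq = {("f2", "f1"), ("f3", "f1"), ("f4", "f1"),
               ("f4", "f2"), ("f4", "f3")}  # f2 vs f3: no judgment either way

        # Hypothetical values assigned by a candidate complexity measure.
        m = {"f1": 3.0, "f2": 5.5, "f3": 5.5, "f4": 9.0}

        def strict_representation(ents, geq, m):
            """Classical condition: a >= b empirically iff m(a) >= m(b)."""
            return all(((a, b) in geq or a == b) == (m[a] >= m[b])
                       for a, b in permutations(ents, 2))

        def weak_representation(geq, m):
            """Weakened condition: a >= b empirically implies m(a) >= m(b)."""
            return all(m[a] >= m[b] for a, b in geq)

        print(strict_representation(entities, geq, m))  # False: f2/f3 incomparable
        print(weak_representation(geq, m))              # True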

    The Effect of the Dynamics of Knowledge Base Complexity on Schumpeterian patterns of Innovation: the upstream petroleum industry

    This paper addresses important changes in innovation patterns in the upstream petroleum industry over the period from the 1970s to 2005. It argues that the shifts in patterns of innovation over that period can be explained by the dynamics of knowledge base complexity (KBC). We develop a quantitative method to explore KBC and show that increasing KBC has shifted innovation patterns from a broadly Schumpeter Mark I to a 'modified' form of Schumpeter Mark II, led less by the established oil majors and more by a new class of integrated service providers.

    A theoretical framework for consumer Willingness to Adopt Novel

    This study gives more insight into the motives and barriers, i.e. positive and negative drivers, for European fruit consumption, as a basis for meeting consumer requirements when developing new types of fruits and fruit products and for developing interventions. For that purpose, focus group discussions were held in Spain, Greece, Poland, and the Netherlands. Consistent with existing literature, healthiness, (sensory) pleasure, and (lack of) convenience emerged as major drivers of fruit consumption, with appearance, habit, and price as additional drivers. When talking about fruit, participants have fresh, unprocessed fruit in mind.

    Challenging the 'Law of diminishing returns'

    'The Law of Diminishing Returns' (Spearman, 1927) states that the size of the average correlation between cognitive tasks tends to be relatively small in high-ability groups and relatively high in low-ability groups. Studies supporting this finding have tended to contrast very low-ability subjects (IQ < 78) with subjects from higher ability ranges and to use tests that have poor discriminatory power among the higher ability levels. In the first study described in this paper, tasks that provide good discrimination among the higher ability levels were used. A sample of high-ability (N = 25) and low-ability (N = 20) 15-year-old boys completed four single tests, two with low and two with high g saturations, and two competing tasks formed from these single tests. The results indicated that, contrary to the predictions of the Law of Diminishing Returns, the amount of common variance was greater in the high-ability group. It is suggested that the Law of Diminishing Returns does not take into account the factor of task difficulty and that there are situations where the exact reverse of this law holds. A second study again compared correlations obtained with extreme groups (N = 28 and N = 29), this time on measures of Perceptual Speed, which are easy for all ability levels. Results indicated that correlations among the Perceptual Speed measures were the same for both groups. Neither study provided any support for the Law, which seems to depend on the very high correlations obtained from samples at the extreme lower end of the ability continuum.
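
    The comparison at the heart of such studies reduces to computing the mean inter-test correlation separately for each ability group. A minimal sketch with synthetic scores (the group sizes echo the first study, but the data are randomly generated and carry no empirical content):

        import numpy as np

        rng = np.random.default_rng(0)

        def mean_intertest_correlation(scores):
            """Mean of the off-diagonal correlations among a battery of tests.

            scores: (n_subjects, n_tests) array of test scores.
            """
            r = np.corrcoef(scores, rowvar=False)
            return r[np.triu_indices_from(r, k=1)].mean()

        # Synthetic scores for four tests per group (illustrative only).
        high_ability = rng.normal(size=(25, 4))
        low_ability = rng.normal(size=(20, 4))

        print("high-ability mean r:", mean_intertest_correlation(high_ability))
        print("low-ability mean r:", mean_intertest_correlation(low_ability))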

    Effects of corrective feedback on EFL speaking task complexity in China’s university classroom

    Corrective feedback (CF) and task complexity are two important pedagogical topics in second language acquisition research in recent years, but there is little research investigating the effects of CF on speaking task complexity in China's university classroom settings. This research, through conducting different versions of speaking task experiments among 24 university students in China, explores the effect of teachers' CF on English as a Foreign Language (EFL) speaking task complexity. According to the analysis of first-hand data, this research finds that CF has different effects on EFL oral production at different levels of task complexity. In the simple speaking task, the effects of the five kinds of CF, ranked from largest to smallest, were: clarification request, metalinguistic feedback, recast, repetition, and confirmation check. In the complex speaking task, the ranking was: metalinguistic feedback, confirmation check, recast, clarification request, and repetition. Improving how CF is provided in pedagogical practice can promote EFL speaking task performance; on the basis of the above results, appropriate ways and forms of providing CF are proposed to improve the efficiency of CF in the Chinese university classroom context.

    The dissociation of subjective measures of mental workload and performance

    Dissociation between performance and subjective workload measures was investigated in the theoretical framework of the multiple resources model. Subjective measures do not preserve the vector characteristics in the multidimensional space described by the model. A theory of dissociation was proposed to locate the sources that may produce dissociation between the two workload measures. According to the theory, performance is affected by every aspect of processing, whereas subjective workload is sensitive to the amount of aggregate resource investment and is dominated by the demands on perceptual/central resources. The proposed theory was tested in three experiments. Results showed that performance improved but subjective workload was elevated with an increasing amount of resource investment. Furthermore, subjective workload was not as sensitive as performance to differences in the amount of resource competition between two tasks. The demand on perceptual/central resources was found to be the most salient component of subjective workload. Dissociation occurred when the demand on this component was increased by the number of concurrent tasks or by the number of display elements. However, demands on response resources were not weighted in subjective introspection as much as demands on perceptual/central resources. The implications of these results for workload practitioners are described.
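
    The predicted dissociation can be made concrete with a toy formalization: let subjective workload be a weighted aggregate of demands, weighted toward perceptual/central resources, and let performance additionally be degraded by between-task resource competition. The weights and functional forms below are assumptions invented for illustration, not the model's actual equations:

        # Toy formalization of the proposed dissociation (illustrative only).

        def subjective_workload(perceptual_demand, response_demand,
                                w_pc=0.7, w_r=0.3):
            """Weighted aggregate investment, dominated by perceptual/central
            demands (w_pc > w_r by assumption)."""
            return w_pc * perceptual_demand + w_r * response_demand

        def performance(perceptual_demand, response_demand, competition):
            """Degraded by every demand and, unlike subjective workload,
            by between-task resource competition."""
            return (1.0 - 0.3 * (perceptual_demand + response_demand)
                    - 0.4 * competition)

        # Equal aggregate demand, different competition: subjective workload
        # is identical while performance differs -- a dissociation.
        for competition in (0.1, 0.6):
            print(subjective_workload(0.5, 0.5),
                  performance(0.5, 0.5, competition))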

    Evaluation Criteria for Object-oriented Metrics

    In this paper an evaluation model for object-oriented (OO) metrics is proposed. We have evaluated the existing evaluation criteria for OO metrics and, based on these observations, propose a model that tries to cover most of the features needed for the evaluation of OO metrics. The model is validated by applying it to existing OO metrics. In contrast to the other existing criteria, the proposed model is simple to implement and includes the practical and important aspects of evaluation; hence it is suitable for evaluating and validating any OO complexity metric.

    Neuroeconomics: infeasible and underdetermined

    Advocates of neuroeconomics claim to offer the prospect of creating a “unified behavioral theory” by drawing upon the techniques of neuroscience and psychology and combining them with economic theory. Ostensibly, through the “direct measurement” of our thoughts, economics and social science will be “revolutionized.” Such claims have been subject to critique from mainstream and non-mainstream economists alike. Many of these criticisms relate to measurability, relevance, and coherence. In this article, we seek to contribute to this critical examination by investigating the problem of underdetermination: testing involves a conjunction of auxiliary assumptions, and consequently it may not be possible to isolate the effect of any given hypothesis. We argue that neuroeconomics is especially sensitive to issues of underdetermination. Institutional economists should be cautious of neuroeconomists’ zeal, as they appear to over-interpret experimental findings; neuroeconomics may therefore provide a false prospectus seeking to reinforce the nostrums of homo economicus.
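
    The underdetermination argument the authors invoke has a standard logical form (the Duhem-Quine thesis), sketched here in outline:

        % A hypothesis H is tested only jointly with auxiliary assumptions.
        \[ (H \land A_1 \land \dots \land A_n) \rightarrow O \]
        % A failed observation refutes only the conjunction, never H alone:
        \[ \neg O \;\Rightarrow\; \neg (H \land A_1 \land \dots \land A_n) \]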

    Interest Points as a Focus Measure in Multi-Spectral Imaging

    A novel multi-spectral focus measure based on algorithms for interest point detection, particularly on the FAST (Features from Accelerated Segment Test), Fast Hessian, and Harris-Laplace detectors, is described in this paper. The proposed methods are compared with commonly used focus measure techniques such as energy of image gradient, sum-modified Laplacian, Tenenbaum's algorithm, and spatial frequency to test their reliability and performance. The measures were tested on a newly created database containing 420 images acquired in the visible, near-infrared, and thermal spectra (7 objects in each spectrum). Algorithms based on the interest point detectors proved to be good focus measures satisfying all the requirements described in the paper, especially in the thermal spectrum. It is shown that these algorithms outperformed all commonly used methods in the thermal spectrum and can therefore serve as a new and more accurate focus measure.
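
    A minimal sketch of the underlying idea: score each frame in a focal stack by how many interest points a detector fires, on the assumption that the sharpest frame yields the most. It uses OpenCV's FAST detector; the counting-based score and the file paths are our simplification, not necessarily the exact measure defined in the paper:

        import cv2

        def fast_focus_measure(gray, threshold=20):
            """Focus score: number of FAST keypoints found in the image.
            Sharper images expose more high-contrast corners, so the
            detector should respond more often on them."""
            detector = cv2.FastFeatureDetector_create(threshold=threshold)
            return len(detector.detect(gray))

        # Pick the best-focused frame from a focal stack (paths hypothetical).
        stack = [cv2.imread(f"stack/frame_{i}.png", cv2.IMREAD_GRAYSCALE)
                 for i in range(10)]
        best = max(range(len(stack)), key=lambda i: fast_focus_measure(stack[i]))
        print("sharpest frame:", best)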

    Anticipation and Risk – From the inverse problem to reverse computation

    Risk assessment is relevant only if it has predictive relevance. In this sense, the anticipatory perspective has yet to contribute to more adequate predictions. For purely physics-based phenomena, predictions are as good as the science describing such phenomena. For the dynamics of the living, the physics of the matter making up the living is only a partial description of their change over time. The space of possibilities is the missing component, complementary to physics and its associated predictions based on probabilistic methods. The inverse modeling problem and, moreover, the reverse computation model guide anticipation-based predictive methodologies. An experimental setting for the quantification of anticipation is advanced, and structural measurement is suggested as a possible mathematics for anticipation-based risk assessment.