
    Exploring the impact of technological competence development on speed and NPD program performance

    With growing levels of competition across industries, technological competence is increasingly viewed as crucial for businesses to maintain their long-term competitive advantage. Although there are many theoretical arguments about how firms' competences can yield competitive advantage and performance improvement, we have a limited understanding of where these capabilities originate in the context of NPD, or of what kinds of product portfolios, internal climate and strategic alignment are required to build them. Moreover, empirical evidence for technological competence development is limited and comes primarily from case studies, anecdotal evidence, and management impressions. Accordingly, this research addresses these gaps by presenting and testing a conceptual model of technological competence development in NPD. The study applies a dynamic capability approach to technological competence development in NPD and investigates the impact of innovative climate, technological alignment, and project portfolio management on technological competence development as well as on NPD speed. The factors that might influence NPD program performance are also investigated. The analysis, based on data collected from 164 firms, shows that a firm's innovative climate, technological alignment and portfolio management are positively associated with technological competence development. While technological alignment was found to be negatively related to NPD speed, portfolio management and technological competence development were found to have positive effects on speed. However, innovative climate had no significant impact on speed. Technological competence development and portfolio management were also found to be positively related to NPD program performance. Finally, the authors found no support for the relationship between speed and NPD program performance.
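
    The reported associations can be read as a set of regressions of competence development, speed, and program performance on the climate, alignment, and portfolio variables. The sketch below is a hedged illustration only, not the authors' analysis: it simulates firm-level survey scores under hypothetical column names (climate, alignment, portfolio, competence, speed, performance) and fits ordinary least squares models mirroring the three dependent variables.

```python
# Hedged sketch (not the authors' analysis): OLS regressions on simulated
# firm-level survey data with illustrative column names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 164  # sample size reported in the abstract
df = pd.DataFrame({
    "climate": rng.normal(size=n),
    "alignment": rng.normal(size=n),
    "portfolio": rng.normal(size=n),
})
# Simulate outcomes loosely consistent with the signs of the reported effects.
df["competence"] = 0.4*df.climate + 0.3*df.alignment + 0.3*df.portfolio + rng.normal(size=n)
df["speed"] = -0.2*df.alignment + 0.3*df.portfolio + 0.3*df.competence + rng.normal(size=n)
df["performance"] = 0.4*df.competence + 0.3*df.portfolio + rng.normal(size=n)

# Three regressions, one per dependent variable in the conceptual model.
for formula in ["competence ~ climate + alignment + portfolio",
                "speed ~ climate + alignment + portfolio + competence",
                "performance ~ competence + portfolio + speed"]:
    print(smf.ols(formula, data=df).fit().summary().tables[1])
```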

    Grammar Learning Strategies and Language Attainment: Seeking a Relationship

    Despite major advances in research on language learning strategies, there are still areas that have received only scant attention, and one of them is undoubtedly learning grammar. The paper addresses the paucity of empirical investigations in this domain by presenting the findings of a study which sought to investigate the relationship between the use of grammar learning strategies (GLS) reported by 142 English Department students and target language attainment, operationalized as their performance in a practical grammar course and the end-of-the-year examination. Information about GLS use was obtained by means of a tool designed on the basis of the theoretical scheme proposed by Oxford, Rang Lee and Park (2007), in which GLS are divided into three categories depending on whether they represent implicit learning with a focus on form, explicit inductive learning or explicit deductive learning. The analysis failed to find a strong positive relationship between the use of GLS and achievement, irrespective of the level of the BA program, or statistically significant differences in this respect between lower-level and higher-level participants. The highest, albeit very weak, correlation was identified between the use of GLS associated with explicit deductive learning and grammar course grades, which testifies to the traditional nature of the instruction the subjects receive. The findings serve as a basis for putting forward a handful of recommendations for learning, teaching and testing grammar, as well as directions for future studies into grammar learning strategies.
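
    As a hedged illustration of the kind of analysis described (not the study's own code), the sketch below computes Spearman correlations between hypothetical GLS subscale scores and grammar course grades and compares overall GLS use between lower- and higher-level groups; all column names and the simulated data are assumptions.

```python
# Hedged sketch: correlation and group-comparison analysis on simulated
# questionnaire data with illustrative column names.
import numpy as np
import pandas as pd
from scipy.stats import spearmanr, mannwhitneyu

rng = np.random.default_rng(7)
n = 142  # number of participants reported in the abstract
df = pd.DataFrame({
    "gls_implicit": rng.integers(1, 6, n),
    "gls_inductive": rng.integers(1, 6, n),
    "gls_deductive": rng.integers(1, 6, n),
    "grade": rng.normal(3.5, 0.6, n),
    "level": rng.choice(["lower", "higher"], n),
})

# Correlate each GLS subscale with the grammar course grade.
for col in ["gls_implicit", "gls_inductive", "gls_deductive"]:
    rho, p = spearmanr(df[col], df["grade"])
    print(f"{col}: rho = {rho:.2f}, p = {p:.3f}")

# Compare overall GLS use between lower- and higher-level participants.
total = df[["gls_implicit", "gls_inductive", "gls_deductive"]].sum(axis=1)
stat, p = mannwhitneyu(total[df.level == "lower"], total[df.level == "higher"])
print(f"Mann-Whitney U = {stat:.1f}, p = {p:.3f}")
```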

    From Pledge-Fulfilment to Mandate-Fulfilment: An Empirical Theory

    The article presents a theoretical synthesis that could serve as the conceptual framework for empirical studies of the fulfilment of electoral pledges in modern democracies. Studies of the program-to-policy linkage have derived their hypotheses, for the most part, from an implicit, common-sense model of mandate theory. The article presents a realistic version of positive mandate theory, one that is stripped of its normative assumptions and is suitable for empirical testing. It is informed by five theoretical building blocks: the concept of the binding mandate, the party theory of representation, the doctrine of responsible party government, modern normative mandate theory and the conceptual pair of delegation and mandate. The resulting framework incorporates the information content of the campaigns, the definiteness of the authorization and the strength of pledge enactment as its core components.

    Facilitating Transformation in Workforce Training: Using Clinical Theory to Understand Psychological Self-Sufficiency

    Acknowledging the scarcity of bottom-up social work practice models for facilitating success in workforce development programs, this study explores psychological self-sufficiency (PSS) as an emerging social work practice theory. Phenomenological studies of low-income jobseekers in employment training, along with the empirical validation of measures of the core constructs of PSS – the Employment Hope Scale (EHS) and the Perceived Employment Barrier Scale (PEBS) – and testing of the theoretical model, resulted in the emergence of a new theory of PSS. PSS was conceptually defined as a dynamic and internal drive that activates the process of transforming cognitively and affectively perceived barriers into hope-driven action – the process that enables individuals to move forward toward goals. Based on the evidence for PSS, a participant-centered group intervention model called the Transforming Impossible into Possible (TIP) program was developed. This article delineates the trajectory of PSS theory development by critically reviewing the various streams of practice theory that influenced it. Next, the conditions that necessitated the creation of the TIP program and the core principles underlying the functions of PSS are explained. By depicting the TIP program through direct quotes of clients' experiences, the authors illustrate the successful self-discovery process enabled by enhanced PSS skills as a result of participating in the TIP program.

    A Substruction Approach to Assessing the Theoretical Validity of Measures

    Background: Validity is about the logic, meaningfulness, and evidence used to defend inferences made when interpreting results. Substruction is a heuristic or process that visually represents the hierarchical structure between theory and measures. Purpose: To describe substruction as a method for assessing the theoretical validity of research measures. Methods: Using Fawcett's Conceptual-Theoretical-Empirical Structure, an exemplar is presented of substruction from the Individual and Family Self-Management Theory to the Striving to be Strong study concepts and empirical measures. Results: Substruction tables display evidence supporting the theoretical validity of the instruments used in the study. Conclusion: A high degree of congruence between theory and measure is critical to support the validity of the theory and to support attributions made about moderating, mediating, and causal relationships, and about intervention effects.

    Size, power and false discovery rates

    Modern scientific technology has provided a new class of large-scale simultaneous inference problems, with thousands of hypothesis tests to consider at the same time. Microarrays epitomize this type of technology, but similar situations arise in proteomics, spectroscopy, imaging, and social science surveys. This paper uses false discovery rate methods to carry out both size and power calculations on large-scale problems. A simple empirical Bayes approach allows the false discovery rate (fdr) analysis to proceed with a minimum of frequentist or Bayesian modeling assumptions. Closed-form accuracy formulas are derived for estimated false discovery rates, and used to compare different methodologies: local or tail-area fdr's; theoretical, permutation, or empirical null hypothesis estimates. Two microarray data sets as well as simulations are used to evaluate the methodology, with the power diagnostics showing why nonnull cases might easily fail to appear on a list of "significant" discoveries. Published at http://dx.doi.org/10.1214/009053606000001460 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
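
    As a minimal sketch of the empirical Bayes local fdr idea (not Efron's own implementation), the code below estimates the marginal density of simulated z-values by Lindsey's method and forms fdr(z) = f0(z)/f(z) under the theoretical N(0,1) null, with the conservative choice pi0 = 1.

```python
# Minimal sketch: empirical-Bayes local false discovery rates with the
# theoretical N(0,1) null. The marginal density f(z) is estimated by
# Lindsey's method (Poisson regression on histogram counts).
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(0)
z = np.concatenate([rng.normal(0, 1, 9500),      # null cases
                    rng.normal(2.5, 1, 500)])    # non-null cases

# Histogram the z-values.
counts, edges = np.histogram(z, bins=90)
mids = 0.5 * (edges[:-1] + edges[1:])
width = edges[1] - edges[0]

# Lindsey's method: Poisson GLM of bin counts on a polynomial in z gives a
# smooth estimate of the mixture density f(z).
s = (mids - mids.mean()) / mids.std()            # standardize for stability
X = sm.add_constant(np.column_stack([s**k for k in range(1, 7)]))
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
f_hat = fit.fittedvalues / (len(z) * width)      # convert counts to a density

# Local fdr at each bin midpoint, capped at 1 (conservative pi0 = 1).
local_fdr = np.clip(norm.pdf(mids) / f_hat, 0, 1)

# Bins reported as "interesting" at a common fdr <= 0.2 threshold.
for m, fdr in zip(mids, local_fdr):
    if fdr <= 0.2:
        print(f"z = {m:5.2f}   local fdr = {fdr:.3f}")
```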

    Philosophical Commitments, Empirical Evidence, and Theoretical Psychology

    The philosophical or theoretical commitments informing psychological research are sometimes characterized, even by theoretical psychologists themselves, as nonempirical, outside the bounds of methodological consideration, and/or nonrational. We argue that this characterization is incoherent. We illustrate our concern by analogy with problematic appeals to Kuhn’s work that have been influential in theoretical psychology. Following the contemporary pragmatist tradition, we argue that our philosophical/theoretical commitments are part of our larger webs of belief, and that for any of these beliefs to have meaning their content must be informed by our practical engagement with the world, i.e., they are based on empirical evidence, broadly construed. It is this empirical basis that allows us to recognize our commitments at all and to assess and criticize them rationally when necessary. We conclude by demonstrating a rational assessment of the philosophical/theoretical commitments underlying a recent study in the social psychology of religion.

    The role of falsification in the development of cognitive architectures: insights from a Lakatosian analysis

    It has been suggested that the enterprise of developing mechanistic theories of the human cognitive architecture is flawed because the theories produced are not directly falsifiable. Newell attempted to sidestep this criticism by arguing for a Lakatosian model of scientific progress in which cognitive architectures should be understood as theories that develop over time. However, Newell’s own candidate cognitive architecture adhered only loosely to Lakatosian principles. This paper reconsiders the role of falsification and the potential utility of Lakatosian principles in the development of cognitive architectures. It is argued that a lack of direct falsifiability need not undermine the scientific development of a cognitive architecture if broadly Lakatosian principles are adopted. Moreover, it is demonstrated that the Lakatosian concepts of positive and negative heuristics for theory development and of general heuristic power offer methods for guiding the development of an architecture and for evaluating the contribution and potential of an architecture’s research program.