
    How Scale Affects Structure in Java Programs

    Many internal software metrics and external quality attributes of Java programs correlate strongly with program size. This knowledge has been used pervasively in quantitative studies of software through practices such as normalization on size metrics. This paper reports size-related super- and sublinear effects that have not been known before. Findings obtained on a very large collection of Java programs -- 30,911 projects hosted at Google Code as of Summer 2011 -- unveil how certain characteristics of programs vary disproportionately with program size, sometimes even non-monotonically. Many of the specific parameters of these nonlinear relations are reported. These results give further insight into the differences between "programming in the small" and "programming in the large." The reported findings carry important consequences for OO software metrics and for software research in general: metrics that have been known to correlate with size can now be properly normalized so that all the information left in them is size-independent.
    Comment: ACM Conference on Object-Oriented Programming, Systems, Languages and Applications (OOPSLA), October 2015. (Preprint)
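    The normalization idea in the last sentence can be sketched briefly. The snippet below is not the paper's procedure; it only illustrates the general approach under an assumed power-law relation metric ≈ a · size^b, using synthetic `size` and `metric` arrays in place of real per-project measurements.

```python
import numpy as np

# Illustrative sketch (assumed, not the paper's method): fit a power law
# metric ≈ a * size^b and divide each observation by its fitted expectation,
# so that what remains in the metric is size-independent.

rng = np.random.default_rng(1)
size = rng.lognormal(mean=8.0, sigma=1.5, size=1000)            # e.g. lines of code per project
metric = 0.3 * size**1.2 * rng.lognormal(0.0, 0.3, size=1000)   # synthetic size-dependent metric

# Fit log(metric) = log(a) + b * log(size) by least squares
b, log_a = np.polyfit(np.log(size), np.log(metric), 1)

# Size-normalized metric: observed value over fitted expectation
normalized = metric / (np.exp(log_a) * size**b)

print(f"fitted exponent b = {b:.2f} (b != 1 indicates a nonlinear size effect)")
print(f"log-log correlation with size after normalization: "
      f"{np.corrcoef(np.log(size), np.log(normalized))[0, 1]:.2f}")
```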

    Agile, Web Engineering and Capability Maturity Model Integration: A systematic literature review

    Context: Agile approaches are an alternative for organizations developing software, particularly for those that develop Web applications. In addition, CMMI (Capability Maturity Model Integration) models are well-established approaches focused on assessing the maturity of an organization that develops software. Web Engineering is the field of Software Engineering responsible for analyzing and studying the specific characteristics of the Web. The suitability of an Agile approach for helping organizations reach a certain CMMI maturity level in Web environments would therefore be of great interest, as those organizations would keep the ability to react and adapt quickly to change while their development processes mature.
    Objective: This paper addresses whether or not it is feasible, for an organization developing Web systems, to achieve a certain maturity level of the CMMI-DEV model using Agile methods.
    Method: The proposal is analyzed by means of a systematic literature review of the relevant approaches in the field, defining a characterization schema against which to compare them and present the current state of the art.
    Results: The results of the systematic literature review are presented, analyzed and compared against the defined schema, extracting relevant conclusions for the different dimensions of the problem: compatibility, compliance, experience, maturity and Web.
    Conclusion: Although defining an Agile approach that meets the goals of the different CMMI maturity levels may be possible for an organization developing Web systems, there is still a lack of detailed studies and analysis in the field.

    Empirical modelling principles to support learning in a cultural context

    Much research on pedagogy stresses the need for a broad perspective on learning. Such a perspective might take account (for instance) of the experience that informs knowledge and understanding [Tur91], the situation in which the learning activity takes place [Lav88], and the influence of multiple intelligences [Gar83]. Educational technology appears to hold great promise in this connection. Computer-related technologies such as new media, the internet, virtual reality and brain-mediated communication afford access to a range of learning resources that grows ever wider in its scope and supports ever more sophisticated interactions. Whether educational technology is fulfilling its potential in broadening the horizons for learning activity is more controversial. Though some see the successful development of radically new educational resources as merely a matter of time, investment and engineering, there are also many critics of the trends in computer-based learning who see little evidence of the greater degree of human engagement to which new technologies aspire [Tal95]. This paper reviews the potential application to educational technology of principles and tools for computer-based modelling that have been developed under the auspices of the Empirical Modelling (EM) project at Warwick [EMweb]. This theme was first addressed at length in a previous paper [Bey97], and is revisited here in the light of new practical developments in EM, both in tools and in model-building targeted at education at various levels. Our central thesis is that the problems of educational technology stem from the limitations of current conceptual frameworks and tool support for the essential cognitive model-building activity, and that tackling these problems requires a radical shift in philosophical perspective on the nature and role of empirical knowledge, one with significant practical implications. The paper is in two main sections. The first discusses the limitations of the classical computer science perspective where educational technology to support situated learning is concerned, and relates the learning activities that are most closely associated with a cultural context to the empiricist perspective on learning introduced in [Bey97]. The second outlines the principles of EM and describes and illustrates features of its practical application that are particularly well suited to learning in a cultural setting.

    Designing as Construction of Representations: A Dynamic Viewpoint in Cognitive Design Research

    This article presents a cognitively oriented viewpoint on design. It focuses on cognitive, dynamic aspects of real design, i.e., the actual cognitive activity implemented by designers during their work on professional design projects. Rather than conceiving designing as problem solving - Simon's symbolic information processing (SIP) approach - or as a reflective practice or some other form of situated activity - the situativity (SIT) approach - we consider that, from a cognitive viewpoint, designing is most appropriately characterised as a construction of representations. After a critical discussion of the SIP and SIT approaches to design, we present our viewpoint. This presentation concerns the evolving nature of representations regarding levels of abstraction and degrees of precision, the function of external representations, and specific qualities of representation in collective design. Designing is described at three levels: the organisation of the activity, its strategies, and its design-representation construction activities (different ways to generate, transform, and evaluate representations). Even if we adopt a "generic design" stance, we claim that design can take different forms depending on the nature of the artefact, and we propose some candidate dimensions that allow a distinction to be made between these forms of design. We discuss the potential specificity of HCI design, and the lack of cognitive design research concerned with the quality of design. We close our discussion of representational structures and activities with an outline of some directions regarding their functional linkages.

    Sequential Monte Carlo pricing of American-style options under stochastic volatility models

    We introduce a new method to price American-style options on underlying investments governed by stochastic volatility (SV) models. The method does not require the volatility process to be observed. Instead, it exploits the fact that the optimal decision functions in the corresponding dynamic programming problem can be expressed as functions of conditional distributions of volatility, given observed data. By constructing statistics summarizing information about these conditional distributions, one can obtain high-quality approximate solutions. Although the required conditional distributions are in general intractable, they can be approximated to arbitrary precision using sequential Monte Carlo schemes. The drawback, as with many Monte Carlo schemes, is potentially heavy computational demand. We present two variants of the algorithm, one closely related to the well-known least-squares Monte Carlo algorithm of Longstaff and Schwartz [The Review of Financial Studies 14 (2001) 113-147], and the other solving the same problem using a "brute force" gridding approach. We estimate an illustrative SV model using Markov chain Monte Carlo (MCMC) methods for three equities. We also demonstrate the use of our algorithm by estimating the posterior distribution of the market price of volatility risk for each of the three equities.
    Comment: Published at http://dx.doi.org/10.1214/09-AOAS286 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
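    As a concrete reference point for the least-squares Monte Carlo variant, the sketch below implements the plain Longstaff-Schwartz estimator for an American put under constant-volatility geometric Brownian motion with toy parameters. This is a deliberate simplification: in the paper's stochastic-volatility setting, the regression would be fed summary statistics of the particle-filter approximation to the conditional volatility distribution rather than the asset price alone.

```python
import numpy as np

# Plain Longstaff-Schwartz least-squares Monte Carlo for an American put.
# Simplified illustration with assumed toy parameters and constant volatility;
# not the paper's SV algorithm.

rng = np.random.default_rng(0)
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0   # spot, strike, rate, vol, maturity
n_steps, n_paths = 50, 100_000
dt = T / n_steps
disc = np.exp(-r * dt)

# Simulate risk-neutral GBM paths
z = rng.standard_normal((n_paths, n_steps))
increments = (r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
S = S0 * np.exp(np.hstack([np.zeros((n_paths, 1)), np.cumsum(increments, axis=1)]))

# Backward induction: cashflows start as the payoff at maturity
cashflow = np.maximum(K - S[:, -1], 0.0)
for t in range(n_steps - 1, 0, -1):
    cashflow *= disc                           # discount realized cashflows to time t
    itm = K - S[:, t] > 0.0                    # regress only on in-the-money paths
    if itm.any():
        # Quadratic regression of realized continuation values on the spot price
        coeffs = np.polyfit(S[itm, t], cashflow[itm], 2)
        continuation = np.polyval(coeffs, S[itm, t])
        exercise = K - S[itm, t]
        ex = exercise > continuation           # early exercise is estimated optimal
        idx = np.where(itm)[0][ex]
        cashflow[idx] = exercise[ex]

price = disc * cashflow.mean()                 # discount the final step back to time 0
print(f"Estimated American put price: {price:.3f}")
```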

    An Exploratory Study of Forces and Frictions affecting Large-Scale Model-Driven Development

    In this paper, we investigate model-driven engineering, reporting on an exploratory case study conducted at a large automotive company. The study consisted of interviews with 20 engineers and managers working in different roles. We found that, in the context of a large organization, contextual forces dominate the cognitive issues of using model-driven technology. The four forces we identified, which are likely independent of the particular abstractions chosen as the basis of software development, are the need for diffing in software product lines, the need for problem-specific languages and types, the need for live modeling in exploratory activities, and the need for point-to-point traceability between artifacts. We also identified triggers of accidental complexity, which we refer to as points of friction introduced by languages and tools. Examples of the friction points identified are insufficient support for model diffing, point-to-point traceability, and model changes at runtime.
    Comment: To appear in the proceedings of MODELS 2012, LNCS, Springer.