4 research outputs found

    Dynamic optimization through the use of automatic runtime specialization

    Thesis (S.B. and M.Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1999. Includes bibliographical references (leaves 99-115). By John Whaley.

    Identifying Profitable Specialization in Object-Oriented Languages

    The performance of object-oriented languages can be greatly improved if methods can be specialized for particular classes of arguments. Such specialization can provide the compiler with enough class information about the receivers of messages within the specialized routine to enable these messages to be statically bound to their target methods and subsequently inlined. We present an algorithm for automatically determining which methods are most profitable to specialize for which argument classes. This algorithm improves on previous automatic techniques by avoiding the twin problems of over- and underspecialization and by being suitable for specializing programs that use multi-methods.
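
    To make the payoff concrete, here is a minimal, hedged Java sketch (the class and method names are illustrative and are not drawn from the paper, which targets languages with multi-methods): once a routine is specialized for a particular argument class, the formerly dynamically dispatched send can be statically bound and inlined.

        // Hypothetical classes, for illustration only.
        interface Shape { double area(); }

        final class Circle implements Shape {
            final double r;
            Circle(double r) { this.r = r; }
            public double area() { return Math.PI * r * r; }   // candidate for inlining
        }

        final class Square implements Shape {
            final double s;
            Square(double s) { this.s = s; }
            public double area() { return s * s; }
        }

        class Report {
            // Generic entry point: area() is a dynamically dispatched send,
            // because the compiler only knows the static type Shape.
            static double scaledArea(Shape sh, double k) {
                return k * sh.area();
            }

            // Specialized variant for the argument class Circle: the receiver
            // class is now exact, so area() can be statically bound to
            // Circle.area and inlined at the call site.
            static double scaledAreaForCircle(Circle c, double k) {
                return k * (Math.PI * c.r * c.r);   // inlined body of Circle.area()
            }
        }

    The profitability question the paper addresses is precisely which such variants (and for which argument classes) are worth generating, so that code size does not explode through overspecialization.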

    A type-based prototype compiler for telescoping languages

    Scientists want to encode their applications in domain languages with high-level operators that reflect the way they conceptualize computations in their domains. The telescoping-languages approach calls for automatically generating optimizing compilers for these languages by pre-compiling the underlying libraries that define them, generating multiple variants optimized for use in different possible contexts, including different argument types. The resulting compiler replaces calls to the high-level constructs with calls to the optimized variants. This approach aims to automatically derive high-performance executables from programs written in high-level domain-specific languages. TeleGen is a prototype telescoping-languages compiler that performs type-based specializations. For the purposes of this dissertation, types include any set of variable properties, such as intrinsic type, size, and array sparsity pattern. Type inference and specialization are cornerstones of the telescoping-languages strategy. Because optimization of library routines must occur before their full calling contexts are available, type inference provides the critical information needed to determine which specialized variants to generate and how best to optimize each variant for the highest performance. To build the prototype compiler, we developed a precise type-inference algorithm that infers all legal type tuples, or type configurations, for the program variables, including routine arguments, for all legal calling contexts. We use the type information inferred by our algorithm to drive specialization and optimization. We demonstrate the practical value of our type-inference algorithm and the type-based specialization strategy in TeleGen.
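
    As a rough illustration of the strategy (a hedged Java sketch with hypothetical routine names; TeleGen itself operates on MATLAB-style library code), a library routine is pre-compiled into variants for the type configurations that inference finds possible, and high-level calls are rewritten to the matching variant.

        import java.util.HashMap;
        import java.util.Map;

        // Hypothetical library routine "scale", pre-compiled into one variant
        // per inferred type configuration of its arguments.
        final class SpecializedLib {

            // Variant for the (dense vector, scalar) type configuration.
            static double[] scale_denseVector_scalar(double[] x, double a) {
                double[] y = new double[x.length];
                for (int i = 0; i < x.length; i++) y[i] = a * x[i];
                return y;
            }

            // Variant for a sparse vector stored as index -> value pairs.
            static Map<Integer, Double> scale_sparseVector_scalar(Map<Integer, Double> x, double a) {
                Map<Integer, Double> y = new HashMap<>();
                for (Map.Entry<Integer, Double> e : x.entrySet()) {
                    y.put(e.getKey(), a * e.getValue());
                }
                return y;
            }
        }

        // The generated compiler would rewrite a high-level call such as
        //   y = scale(x, a)
        // to the variant matching the type configuration inferred for x and a,
        // e.g. scale_denseVector_scalar(x, a) when x is known to be dense.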

    Advances in component-oriented programming

    WCOP 2006 is the eleventh event in a series of highly successful workshops, which have taken place in conjunction with every ECOOP since 1996. Component-oriented programming (COP) has been described as the natural extension of object-oriented programming to the realm of independently extensible systems. Several important approaches have emerged over recent years, including component technology standards such as CORBA/CCM, COM/COM+, J2EE/EJB, and .NET, but also the increasing appreciation of software architecture for component-based systems, and the consequent effects on organizational processes and structures as well as on the software development business as a whole.

    COP aims at producing software components for a component market and for late composition. Composers are third parties, possibly the end users, who are not able or willing to change components. This requires standards that allow independently created components to interoperate, and specifications that enable the composer to decide what can be composed under which conditions. On these grounds, WCOP'96 led to the following definition: "A component is a unit of composition with contractually specified interfaces and explicit context dependencies only. Components can be deployed independently and are subject to composition by third parties." After WCOP'96 focused on the fundamental terminology of COP, the subsequent workshops expanded into the many related facets of component software.

    WCOP 2006 emphasizes reasons for using components beyond reuse. While software components are often viewed as a technical means to increase software reuse, other reasons for investing in component technology tend to be overlooked. For example, components play an important role in frameworks and product lines to enable configurability (even if no component is reused). Another role of components beyond reuse is to increase the predictability of a system's properties. The use of components as contractually specified building blocks restricts the degrees of freedom during software development compared to classic line-by-line programming. This restriction is beneficial for the predictability of system properties. For an engineering approach to software design, it is important to understand the implications of design decisions on a system's properties. Therefore, approaches to evaluate and predict properties of systems by analyzing their components and architecture are of high interest.

    To strengthen the relation between architectural descriptions of systems and components, a comprehensible mapping to component-oriented middleware platforms is important. Model-driven development, with its use of generators, can provide a suitable link between architectural views and technical component execution platforms. WCOP 2006 accepted 13 papers, which are organised according to the program below. The organisers are looking forward to an inspiring and thought-provoking workshop. The organisers thank Jens Happe and Michael Kuperberg for preparing the proceedings volume.
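
    As a hedged illustration of the WCOP'96 definition (hypothetical Java interfaces, not taken from any workshop paper), a component exposes contractually specified provided interfaces and declares its context dependencies explicitly, so a third-party composer can assemble it without modifying it.

        // Required interface: the component's only context dependency.
        interface Logger { void log(String message); }

        // Provided interface: the contract the component offers to composers.
        interface OrderService { void placeOrder(String item); }

        final class OrderComponent implements OrderService {
            private final Logger logger;   // supplied by the composer, never created internally

            OrderComponent(Logger logger) {            // explicit context dependency
                this.logger = logger;
            }

            @Override
            public void placeOrder(String item) {
                logger.log("order placed: " + item);   // uses only declared dependencies
            }
        }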