
    Problems for a philosophy of software engineering

    Building on an earlier contribution to the philosophy of computer science [Ede2007], this essay discusses to what extent the 'paradigms' of computer science identified in [Ede2007] also apply to software engineering, and how software engineering and computer science are related to each other. The essay concludes that software engineering cannot be fully subsumed by computer science, nor vice versa. Consequently, the philosophies of computer science and software engineering, though related to each other, are not identical branches of a general philosophy of science. This also implies that not all of the arguments from [Ede2007] can be mapped directly and immediately from the domain of computer science into the domain of software science. After discussing this topic, the essay points to some further problems and open issues for future efforts in the philosophy of software science and engineering. The essay is written in commemoration of the 100th birthdays of Konrad Zuse and Lothar Collatz (both born 1910) in the year 2010: Zuse contributed to the science of computing from the domain of engineering, Collatz from the domain of mathematics.

    Examining perceptions of agility in software development practice

    This is the post-print version of the final published article, which is available from the link below. Copyright @ 2010 ACM. Organizations undertaking software development are often reminded that successful practice depends on a number of non-technical issues that are managerial, cultural and organizational in nature. These issues cover aspects from appropriate corporate structure, through software process development and standardization, to effective collaborative practice. Since the articulation of the 'software crisis' in the late 1960s, significant effort has been put into addressing problems related to the cost, time and quality of software development via the application of systematic processes and management practices for software engineering. Early efforts resulted in prescriptive structured methods, which have evolved and expanded over time to embrace consortia- and company-led initiatives such as the Unified Modeling Language and the Unified Process, alongside formal process improvement frameworks such as the International Organization for Standardization's 9000 series, the Capability Maturity Model and SPICE. More recently, the philosophy behind traditional plan-based initiatives has been questioned by the agile movement, which seeks to emphasize the human and craft aspects of software development over and above the engineering aspects. Agile practice is strongly collaborative in its outlook, favoring individuals and interactions over processes and tools, working software over comprehensive documentation, customer collaboration over contract negotiation, and responding to change over following a plan (see Sidebar 1). Early experience reports on the use of agile practice suggest some success in dealing with the problems of the software crisis, and suggest that plan-based and agile practice are not mutually exclusive. Indeed, flexibility may arise from this unlikely marriage, aiming to strike a balance between the rigor of traditional plan-based approaches and the need to adapt them to particular development situations. With this in mind, this article surveys current practice in software engineering alongside the perceptions of senior development managers in relation to agile practice, in order to understand the principles of agility that may be practiced implicitly and their effects on plan-based approaches.

    Towards a Reference Terminology for Ontology Research and Development in the Biomedical Domain

    Ontology is a burgeoning field, involving researchers from the computer science, philosophy, data and software engineering, logic, linguistics, and terminology domains. Many ontology-related terms with precise meanings in one of these domains have different meanings in others. Our purpose here is to initiate a path towards the disambiguation of such terms. We draw primarily on the literature of biomedical informatics, not least because the problems caused by unclear or ambiguous use of terms have been most thoroughly addressed there. We advance a proposal resting on a distinction of three levels too often run together in biomedical ontology research: 1. the level of reality; 2. the level of cognitive representations of this reality; 3. the level of textual and graphical artifacts. We propose a reference terminology for ontology research and development that is designed to serve as a common hub into which the several competing disciplinary terminologies can be mapped. We then justify our terminological choices through a critical treatment of the 'concept orientation' in biomedical terminology research.
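    The 'common hub' idea in this abstract can be illustrated with a small sketch: each discipline's vocabulary is mapped into a shared reference terminology, and terms are translated between disciplines via the hub. All term mappings below are hypothetical placeholders invented for illustration, not taken from the paper.

```python
# Three levels distinguished in the abstract: reality, cognitive
# representation, and textual/graphical artifact. The reference terms and
# per-discipline mappings here are illustrative assumptions only.
REFERENCE_TERMS = {
    "universal": "repeatable entity on the level of reality",
    "concept": "cognitive representation of a universal",
    "term": "textual artifact denoting a concept",
}

# Hypothetical mappings from each discipline's vocabulary into the hub.
DISCIPLINE_TO_REFERENCE = {
    "biomedical informatics": {"concept": "concept", "class": "universal"},
    "software engineering": {"class": "term", "type": "universal"},
}

def translate(term: str, source: str, target: str) -> list:
    """Translate a term from one discipline to another via the hub."""
    hub = DISCIPLINE_TO_REFERENCE[source].get(term)
    if hub is None:
        return []
    return [t for t, ref in DISCIPLINE_TO_REFERENCE[target].items() if ref == hub]

# "class" in biomedical informatics maps to the hub term "universal",
# which "type" denotes in the (hypothetical) software-engineering mapping.
print(translate("class", "biomedical informatics", "software engineering"))
```

    Disambiguation then reduces to checking which hub term, and hence which of the three levels, a disciplinary term actually denotes.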

    Kinetic research on heterogeneously catalysed processes: a questionnaire on the state-of-the-art in industry

    On the initiative of the Working Party 'Chemical Engineering in the Applications of Catalysis' of the European Federation of Chemical Engineering, an assessment of the issues in the determination and application of kinetic data within the European industry was performed. The basis of the analysis was a questionnaire put together by researchers from Dow, DSM, Shell and Eindhoven University of Technology. The 24 companies that responded to the questionnaire can be classified into four groups: chemical, oil, engineering contractors and catalyst manufacturers. From the overall input it appears that there are three, equally important, utilisation areas for kinetic data: process development, process optimisation and catalyst development. There is a wide variety of kinetic data sources. Most of the respondents make use of test units that were primarily designed for development and optimisation. Transport limitations are not always avoided, particularly in short-range projects or with complex feedstocks. With respect to the modelling approaches, a common philosophy is 'as simple as possible'. Most of the respondents state that 'in principle' one should strive for intrinsic kinetics, but the majority nevertheless does not, for various reasons, separate all transport phenomena from reaction kinetics. Kinetic models are mostly simple first- or nth-order or Langmuir-Hinshelwood type expressions. More complex kinetic models are scarcely used. Three areas were frequently identified as offering opportunities for improvement. Gathering kinetic data is too costly and time-consuming. There is no systematic approach for determining and applying kinetics in the case of unstable catalytic performance. Furthermore, the software available for regressing kinetic data to rate equations based on mechanistic schemes, as well as software to model reactors, is insufficiently user-friendly.
    The majority of the respondents state that the problems indicated should be solved by cooperation, e.g., between companies, between industry and academia, and between the catalysis and chemical engineering communities. A workshop on the above topics was held in December 1996, with 15 companies and 6 academics attending. More information can be obtained from the secretariat of the Working Party.
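    The model forms the respondents report most often, simple first/nth-order power laws and Langmuir-Hinshelwood expressions, can be written down in a few lines. The functional forms below are the standard textbook ones; the parameter values in the checks are illustrative only, not survey data.

```python
def power_law(k: float, C: float, n: float = 1.0) -> float:
    """nth-order power-law rate: r = k * C**n (first order when n = 1)."""
    return k * C ** n

def langmuir_hinshelwood(k: float, K_A: float, K_B: float,
                         C_A: float, C_B: float) -> float:
    """Bimolecular Langmuir-Hinshelwood rate with competitive adsorption
    of A and B on the same sites:
        r = k * K_A * K_B * C_A * C_B / (1 + K_A*C_A + K_B*C_B)**2
    """
    denom = 1.0 + K_A * C_A + K_B * C_B
    return k * K_A * K_B * C_A * C_B / denom ** 2

# First order: doubling the concentration doubles the rate.
assert power_law(2.0, 2.0) == 2 * power_law(2.0, 1.0)
```

    The LH form illustrates why 'as simple as possible' is attractive: the power law has two parameters, while even this minimal mechanistic expression has three, and fitting them reliably requires data free of the transport limitations the survey flags.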

    Integrated learning of computer applications for production engineering

    High productivity in Engineering depends on efficiently managing the flow of large quantities of information, and the associated decision-making activities, that are inherent to Engineering processes in both design and production contexts. Dealing with such problems from an integrated point of view, mimicking real scenarios, is not given much attention in Engineering degrees. In the context of Engineering Education, there are a number of courses designed to develop specific competencies, as required by the academic curricula, but not that many in which integration competencies are the main target. In this paper, a course devoted to that aim is discussed. The course is taught in a Marine Engineering degree, but the philosophy could be used in any Engineering field. All the lessons are given in a computer room in which every student can use each of the software applications covered. The first part of the course is dedicated to Project Management: the students acquire skills in defining, using Ms-PROJECT, the work breakdown structure (WBS) and the organization breakdown structure (OBS) of Engineering projects, through a series of examples of increasing complexity, ending with the case of vessel construction. The second part of the course is dedicated to the use of a database manager, Ms-ACCESS, for managing production-related information. A series of examples of increasing complexity is treated, ending with the management of the pipe database of a real vessel. This database consists of a few thousand pipes, for which a production timing frame is defined, which connects this part of the course with the first one. Finally, the third part of the course is devoted to work with FORAN, an Engineering Production package in widespread use in the shipbuilding industry.
    With this package, the frames and plates where all the outfitting will be carried out are defined through cooperative work by the students, working simultaneously on the same 3D model. In the paper, specific details about the learning process are given. Surveys were administered to the students to get feedback on their experience and to assess their satisfaction with the learning process. Results from these surveys are discussed in the paper.
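    The pipe-database exercise in the second part of the course can be sketched with a toy schema: each pipe record carries a production-timing attribute that ties it back to the project schedule built in the first part. The schema, column names and data below are hypothetical, invented for illustration (the course uses Ms-ACCESS; SQLite is used here for a self-contained example).

```python
import sqlite3

# In-memory stand-in for the vessel's pipe database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE pipe (
        pipe_id     TEXT PRIMARY KEY,
        block       TEXT,     -- hull block the pipe belongs to
        diameter_mm REAL,
        week        INTEGER   -- scheduled production week (links to the WBS)
    )
""")
conn.executemany(
    "INSERT INTO pipe VALUES (?, ?, ?, ?)",
    [("P-001", "B12", 50.0, 3),
     ("P-002", "B12", 80.0, 3),
     ("P-003", "B14", 50.0, 5)],
)

# Production-planning query: how many pipes to fabricate in week 3, by block.
rows = conn.execute(
    "SELECT block, COUNT(*) FROM pipe WHERE week = 3 GROUP BY block"
).fetchall()
print(rows)  # [('B12', 2)]
```

    Queries of this kind are what connects the database part of the course to the scheduling part: the `week` attribute plays the role of the production timing frame defined against the WBS.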

    Presolve, crash and software engineering for HiGHS

    The efficient computational solution of linear optimization problems is generally enhanced significantly by using a "presolve" procedure to process the problem logically in order to reduce the dimension of the problem to be solved algorithmically. In the absence of other information about the optimal solution of the problem, it is often worth performing a cheap "crash" procedure to obtain a solution that is near-feasible and, ideally, near-optimal. When a problem has been presolved, it is essential to be able to deduce the original problem's optimal solution from the optimal solution of the presolved problem. This thesis provides an analysis of the Idiot crash algorithm (ICA) for linear programming (LP) problems, and techniques for primal and dual postsolve corresponding to established presolve techniques. It demonstrates that presolve yields significant performance enhancement for standard test problems, and identifies that use of the ICA enhances the solution process for a significant range of test problems. This is particularly so in the case of linearisations of quadratic assignment problems, which are otherwise very challenging for standard methods of solution. The techniques are implemented in the HiGHS open-source software system. The use of modern techniques to create a robust and efficient software system for HiGHS and its interfaces has been a critical feature of its success. Accordingly, this thesis sets out the philosophy and techniques of the software engineering underpinning HiGHS.
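    The presolve/postsolve relationship the abstract describes can be sketched with one classic reduction: a variable whose lower and upper bounds coincide is fixed, substituted out of the constraints, and restored during postsolve. This is a minimal illustration of the general idea only, not code from the thesis or from HiGHS; the LP form (minimize c.x subject to A x <= b, lb <= x <= ub) and the helper names are assumptions.

```python
def presolve_fixed_vars(c, A, b, lb, ub):
    """Remove variables fixed by coinciding bounds, folding their
    contribution into the right-hand side. Returns the reduced problem
    plus the bookkeeping needed for postsolve."""
    fixed = {j: lb[j] for j in range(len(c)) if lb[j] == ub[j]}
    keep = [j for j in range(len(c)) if j not in fixed]
    b_red = [bi - sum(A[i][j] * v for j, v in fixed.items())
             for i, bi in enumerate(b)]
    A_red = [[A[i][j] for j in keep] for i in range(len(A))]
    c_red = [c[j] for j in keep]
    lb_red = [lb[j] for j in keep]
    ub_red = [ub[j] for j in keep]
    return c_red, A_red, b_red, lb_red, ub_red, fixed, keep

def postsolve(x_red, fixed, keep, n):
    """Recover a solution of the original problem from one of the
    reduced problem, reinstating the fixed variables' values."""
    x = [0.0] * n
    for pos, j in enumerate(keep):
        x[j] = x_red[pos]
    for j, v in fixed.items():
        x[j] = v
    return x

# Toy problem: variable 1 is fixed at 5 by its bounds.
c = [1.0, 2.0, 3.0]
A = [[1.0, 1.0, 1.0]]
b = [10.0]
lb = [0.0, 5.0, 0.0]
ub = [4.0, 5.0, 9.0]
c_red, A_red, b_red, lb_red, ub_red, fixed, keep = \
    presolve_fixed_vars(c, A, b, lb, ub)
# The right-hand side shrinks from 10 to 5; the reduced LP has 2 variables.
x = postsolve([0.0, 0.0], fixed, keep, len(c))
print(b_red, x)  # [5.0] [0.0, 5.0, 0.0]
```

    Real presolve chains many such reductions (empty and singleton rows, dominated columns, and so on), which is why the postsolve bookkeeping, the central subject of this part of the thesis, becomes intricate, particularly for the dual solution.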

    Transdisciplinarity seen through Information, Communication, Computation, (Inter-)Action and Cognition

    Just as oil acted as a basic raw material and key driving force of industrial society, information acts as a raw material and principal mover of the knowledge society in knowledge production, propagation and application. New developments in information processing and information communication technologies allow increasingly complex and accurate descriptions, representations and models, which are often multi-parameter, multi-perspective, multi-level and multidimensional. This leads to the necessity of collaborative work between different domains with corresponding specialist competences, sciences and research traditions. We present several major transdisciplinary unification projects for information and knowledge, which proceed on the descriptive level, the logical level, and the level of generative mechanisms. A parallel process of boundary crossing and transdisciplinary activity is under way in the applied domains. Technological artifacts are becoming increasingly complex and their design is strongly user-centered, which brings in not only function and various technological qualities but also other aspects, including aesthetics, user experience, ethics and sustainability, with social and environmental dimensions. When integrating knowledge from a variety of fields, with contributions from different groups of stakeholders, numerous challenges are met in establishing a common view and a common course of action. In this context, information is our environment, and informational ecology determines both epistemology and spaces for action. We present some insights into the current state of the art of transdisciplinary theory and practice of information studies and informatics. We depict different facets of transdisciplinarity as we see them from our different research fields, which include information studies, computability, human-computer interaction, multi-operating-systems environments and philosophy.
    Comment: Chapter in Information Studies and the Quest for Transdisciplinarity, a forthcoming book from World Scientific; Mark Burgin and Wolfgang Hofkirchner, Editors.