527,964 research outputs found

    SPATIAL QUOTIENT IMPROVEMENT THROUGH THE DEVELOPMENT OF BRAINSTORMING LEARNING METHOD BASED ON SOFTWARE GRAPHMATICA RESEARCH OPERATION LINEAR PROGRAM

    The aim of this research and development study was to determine the effect of a brainstorming learning model based on the Graphmatica software on spatial quotient in an operations research linear programming course. Graphmatica is an equation-plotting tool with numerical and calculus facilities: it graphs Cartesian functions, relations and equations, as well as parametric, polar and ordinary differential equations, and can display up to 999 graphs. Graphmatica can also evaluate sequences of numbers and display tangent lines and integrals. The tool can be used from secondary school through college. Spatial intelligence is a person's ability to imagine an object in the form of images and space. The development of this product used a literature study to strengthen the product's design. The educational product consists of the brainstorming learning method combined with the Graphmatica software. This stage examined the scope of the brainstorming method and the Graphmatica software, the breadth of their use, the conditions required for the product to be used or implemented optimally, and their advantages and limitations. The literature study was also needed to determine the most appropriate steps in developing the product. In a linear programming exam, 19 Muhammadiyah University students showed good learning outcomes after the developed Graphmatica-based product was applied. This indicates that the Graphmatica software product developed with the brainstorming model influences linear programming learning outcomes. Keywords: Spatial Quotient, Brainstorming, Software Graphmatica

    A New Methodology for the Integration of Performance Materials into the Clothing Curriculum

    This paper presents a model for integrating the study of performance materials into the clothing curriculum. In recent years there has been an increase in demand for stylish, functional and versatile sports apparel; analysts predict this market will reach US$126.30 billion by 2015. This growth is attributed to dramatic lifestyle changes and increasing participation in sports and leisure pursuits, particularly by women. The desire to own performance clothing for specific outdoor pursuits is increasing as it becomes more mainstream and affordable, and there is a distinct blurring of lines as fashion and clothing designers enter the niche market of performance apparel. This makes a strong business case for embedding advanced product development and the study of performance materials into the undergraduate curriculum for mainstream clothing students. Traditionally, modules within Higher Education are taught as discrete subjects. This has advantages, since it enables students to develop knowledge and skills specific to each individual element of the subject discipline. The expectation is that students will integrate, connect and make sense of all the discrete elements of their learning during their studies. Whilst this is the ideal scenario, in practice the first opportunity to integrate the various elements within a project often only arises in the final year, through a major project. The purpose of the model presented in this paper was to integrate sections of the curriculum previously taught as separate entities into a single element at second year, using a blended learning approach combining both practice and theory, thus providing the opportunity for students to synthesize the knowledge obtained in various elements of their studies and to develop an understanding of new and emerging technologies relevant to the creation of specific end-products much earlier in their studies.
A series of weekly guest lectures was delivered by experts in clothing comfort, advanced textiles, marketing, costing, garment realisation, advanced sewing technology, and innovative design. The students worked in teams to produce a range of garments for specific outdoor pursuits, underpinned by appropriate research. An integrated approach to teaching was adopted, as the various team members simultaneously worked on testing performance materials, joining technically advanced fabrics, developing the design and specific stylelines based on ergonomics, and investigating novel construction methods. This challenged not only academic skills but also life skills: teamwork, organisation, communication, negotiation, and problem solving. Teams had to test, re-test and negotiate the most appropriate performance material, joining method, styleline and construction method to make the product fit for the selected advanced application. The model differs from others in its approach in a number of ways: firstly, by utilising fully integrated team teaching, engaging a diverse range of subject experts, which enabled the students to extend their network beyond the programme team, reinforcing research-informed teaching and the teaching/learning nexus. Secondly, active learning was employed as a means of challenging the learner, thus developing life and subject skills by establishing systematic connections between the different elements of their learning. Finally, in establishing knowledge transfer through peer support and networking, knowledge was exchanged between students as they progressed through the development stages. This paper presents a successful model of blended learning which integrates research, technology, design and practical skills underpinned by the advanced study of textiles, which is essential to any clothing curriculum. Keywords: curriculum design, performance materials, product development

    A Vision for Behavioural Model-Driven Validation of Software Product Lines

    The Software Product Lines (SPLs) paradigm promises faster development cycles and increased quality by systematically reusing software assets. This paradigm considers a family of systems, each of which can be obtained by a selection of features in a variability model. Though essential, providing Quality Assurance (QA) techniques for SPLs has long been perceived as a very difficult challenge, due to the combinatorics induced by variability, for which very few techniques were available. Recently, important progress has been made by the model-checking and testing communities to address this QA challenge, though in a very disparate way. We present our vision for a unified framework combining model-checking and testing approaches applied to behavioural models of SPLs. Our vision relies on Featured Transition Systems (FTSs), an extension of transition systems supporting variability. It is also based on model-driven technologies to support practical SPL modelling and to orchestrate various QA scenarios. We illustrate such scenarios on a vending machine SPL.
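The core idea of a Featured Transition System, as described in this abstract, is a transition system whose transitions are guarded by feature expressions, so one model captures the behaviour of every product. A minimal sketch in Python, using an invented vending-machine example in the spirit of the one above (the feature names, states and guards are assumptions for illustration, not the paper's model):

```python
# Sketch of a Featured Transition System (FTS): each transition carries a
# guard over the selected feature set, so one model describes all products.
from itertools import chain, combinations

FEATURES = ["Soda", "Tea", "FreeDrinks"]

# Transitions: (source, action, target, guard); the guard is a predicate
# over the set of selected features.
TRANSITIONS = [
    ("idle", "pay", "paid", lambda f: "FreeDrinks" not in f),
    ("idle", "free", "paid", lambda f: "FreeDrinks" in f),
    ("paid", "soda", "served", lambda f: "Soda" in f),
    ("paid", "tea", "served", lambda f: "Tea" in f),
]

def project(features):
    """Project the FTS onto one product: keep transitions whose guard holds."""
    return [(s, a, t) for s, a, t, g in TRANSITIONS if g(set(features))]

def products():
    """All feature combinations (no cross-tree constraints in this sketch)."""
    return chain.from_iterable(
        combinations(FEATURES, r) for r in range(len(FEATURES) + 1)
    )

def products_with_action(action):
    """Family-based query: which products can ever fire `action`?"""
    return [set(p) for p in products() if any(a == action for _, a, _ in project(p))]
```

A family-based QA question such as "which products can serve soda?" then reduces to `products_with_action("soda")`, which returns exactly the configurations containing the Soda feature.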

    Integrating the common variability language with multilanguage annotations for web engineering

    Web application development involves managing a high diversity of files and resources, such as code, pages or style sheets, implemented in different languages. To deal with the automatic generation of custom-made configurations of web applications, industry usually adopts annotation-based approaches, even though the majority of studies encourage the use of composition-based approaches to implement Software Product Lines. Recent work tries to combine both approaches to obtain their complementary benefits. However, technology companies are reluctant to adopt new development paradigms such as feature-oriented programming or aspect-oriented programming. Moreover, it is extremely difficult, or even impossible, to apply these programming models to web applications, mainly because of their multilingual nature, since their development involves multiple types of source code (Java, Groovy, JavaScript), templates (HTML, Markdown, XML), style sheet files (CSS and its variants, such as SCSS), and other files (JSON, YML, shell scripts). We propose to use the Common Variability Language as a composition-based approach and to integrate annotations to manage fine-grained variability in a Software Product Line for web applications. In this paper, we (i) show that existing composition- and annotation-based approaches, including some well-known combinations, are not appropriate to model and implement the variability of web applications; and (ii) present a combined approach that effectively integrates annotations into a composition-based approach for web applications. We implement our approach and show its applicability with a real-world industrial system. Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
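The annotation side of the combined approach described above can be illustrated with a small preprocessor that resolves feature-guarded blocks in arbitrary text assets, regardless of the host language's comment syntax. The `@feature`/`@end` markers and the HTML example are assumptions for this sketch, not the paper's actual notation:

```python
import re

# Illustrative annotation resolver for fine-grained variability across
# multilanguage web assets (HTML, JS, CSS, shell, ...). Blocks delimited by
# @feature(Name)/@end markers in any common comment style are kept only when
# the feature is selected.
BLOCK = re.compile(
    r"(?:<!--|//|/\*|#)\s*@feature\((?P<name>\w+)\).*?\n"  # opening marker
    r"(?P<body>.*?)"                                        # guarded lines
    r"(?:<!--|//|/\*|#)\s*@end.*?(?:\n|$)",                 # closing marker
    re.DOTALL,
)

def resolve(source, selected):
    """Keep annotated blocks whose feature is selected; drop the rest."""
    def repl(match):
        return match.group("body") if match.group("name") in selected else ""
    return BLOCK.sub(repl, source)

html = """<nav>
<!-- @feature(Admin) -->
<a href="/admin">Admin</a>
<!-- @end -->
</nav>
"""
```

With `resolve(html, {"Admin"})` the admin link survives and the markers are stripped; with `resolve(html, set())` the whole guarded block is removed. In a CVL-based process, a resolver like this would handle only the fine-grained variation points that composition cannot express cleanly.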

    PuLSE-I: Deriving instances from a product line infrastructure

    Reusing assets during application engineering promises to improve the efficiency of systems development. However, to benefit from reusable assets, application engineering processes must specify when and how to use them during single-system development; this in turn depends on what types of reusable assets have been created. Product line engineering approaches produce a reusable infrastructure for a set of products. In this paper, we present PuLSE-I, the application engineering process associated with the PuLSE product line software engineering method. PuLSE-I details how single systems can be built efficiently from the reusable product line infrastructure created during the other PuLSE activities.

    A Quality Model for Actionable Analytics in Rapid Software Development

    Background: Accessing relevant data on the product, process, and usage perspectives of software as well as integrating and analyzing such data is crucial for getting reliable and timely actionable insights aimed at continuously managing software quality in Rapid Software Development (RSD). In this context, several software analytics tools have been developed in recent years. However, there is a lack of explainable software analytics that software practitioners trust. Aims: We aimed at creating a quality model (called Q-Rapids quality model) for actionable analytics in RSD, implementing it, and evaluating its understandability and relevance. Method: We performed workshops at four companies in order to determine relevant metrics as well as product and process factors. We also elicited how these metrics and factors are used and interpreted by practitioners when making decisions in RSD. We specified the Q-Rapids quality model by comparing and integrating the results of the four workshops. Then we implemented the Q-Rapids tool to support the usage of the Q-Rapids quality model as well as the gathering, integration, and analysis of the required data. Afterwards we installed the Q-Rapids tool in the four companies and performed semi-structured interviews with eight product owners to evaluate the understandability and relevance of the Q-Rapids quality model. Results: The participants of the evaluation perceived the metrics as well as the product and process factors of the Q-Rapids quality model as understandable. Also, they considered the Q-Rapids quality model relevant for identifying product and process deficiencies (e.g., blocking code situations). 
Conclusions: By drawing on heterogeneous data sources, the Q-Rapids quality model enables detecting problems that would take longer to find manually, and adds transparency across the system, process, and usage perspectives. Comment: This is an Author's Accepted Manuscript of a paper to be published by IEEE in the 44th Euromicro Conference on Software Engineering and Advanced Applications (SEAA) 2018. The final authenticated version will be available online.
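The hierarchical structure the abstract describes, with raw metrics aggregated into product and process factors, can be sketched as a weighted-average roll-up with a deficiency threshold. The metric names, weights and threshold below are invented for illustration; they are not the actual Q-Rapids model:

```python
# Hypothetical quality-model sketch in the spirit of Q-Rapids: metrics are
# normalised to [0, 1] and aggregated into factors by weighted averages; a
# factor below the threshold signals a deficiency worth investigating.
FACTORS = {
    "code_quality":  {"test_coverage": 0.5, "complexity_ok": 0.3, "duplication_ok": 0.2},
    "blocking_code": {"critical_issues_ok": 0.7, "build_stability": 0.3},
}

def assess(metrics, threshold=0.6):
    """Return each factor's aggregated score and a deficiency flag."""
    report = {}
    for factor, weights in FACTORS.items():
        score = sum(metrics[m] * w for m, w in weights.items())
        report[factor] = {"score": round(score, 3), "deficient": score < threshold}
    return report

# Example snapshot of gathered metrics (invented values).
example = {"test_coverage": 0.8, "complexity_ok": 0.4, "duplication_ok": 0.9,
           "critical_issues_ok": 0.3, "build_stability": 0.9}
```

For the example snapshot, `assess(example)` scores code_quality at 0.7 (acceptable) and blocking_code at 0.48, flagging it as deficient, which mirrors the "blocking code situations" the product owners cited.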

    Composition and Self-Adaptation of Service-Based Systems with Feature Models

    The adoption of mechanisms for reusing software in pervasive systems has not yet become standard practice. This is because the use of pre-existing software requires the selection, composition and adaptation of prefabricated software parts, as well as the management of some complex problems such as guaranteeing high levels of efficiency and safety in critical domains. In addition to the wide variety of services, pervasive systems are composed of many networked heterogeneous devices with embedded software. In this work, we promote the safe reuse of services in service-based systems using two complementary technologies, Service-Oriented Architecture and Software Product Lines. In order to do this, we extend both the service discovery and composition processes defined in the DAMASCo framework, which currently does not deal with the service variability that constitutes pervasive systems. We use feature models to represent the variability and to self-adapt the services during the composition in a safe way taking context changes into consideration. We illustrate our proposal with a case study related to the driving domain of an Intelligent Transportation System, handling the context information of the environment. Work partially supported by the projects TIN2008-05932, TIN2008-01942, TIN2012-35669, TIN2012-34840 and CSD2007-0004 funded by the Spanish Ministry of Economy and Competitiveness and FEDER; P09-TIC-05231 and P11-TIC-7659 funded by the Andalusian Government; and FP7-317731 funded by the EU. Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
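The safety argument in this abstract rests on checking feature-model constraints before composing or adapting a service. A minimal validity check over a parent-child feature tree with cross-tree constraints can be sketched as follows; the Intelligent Transportation System features and constraints are invented for illustration, not taken from DAMASCo:

```python
# Minimal feature-model validity check: a selection is valid only if every
# selected feature's parent is selected and all cross-tree constraints hold.
FEATURE_TREE = {                 # child -> parent; "ITS" is the mandatory root
    "Navigation": "ITS",
    "Traffic": "ITS",
    "VoiceAlerts": "Navigation",
}
REQUIRES = [("VoiceAlerts", "Traffic")]   # VoiceAlerts needs live traffic data
EXCLUDES = []                              # no mutual exclusions in this sketch

def is_valid(selection):
    """Check a candidate configuration against the feature model."""
    sel = set(selection) | {"ITS"}        # the root is always selected
    for child, parent in FEATURE_TREE.items():
        if child in sel and parent not in sel:
            return False                  # a selected child needs its parent
    for a, b in REQUIRES:
        if a in sel and b not in sel:
            return False                  # violated requires-constraint
    for a, b in EXCLUDES:
        if a in sel and b in sel:
            return False                  # violated excludes-constraint
    return True
```

During composition, a self-adaptation step would only switch to a service variant whose feature selection passes `is_valid`, which is what makes the reconfiguration safe with respect to the model.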

    A Product Line Systems Engineering Process for Variability Identification and Reduction

    Software Product Line Engineering has attracted attention in the last two decades due to its promising capabilities to reduce costs and time to market through reuse of requirements and components. In practice, developing system-level product lines in a large-scale company is not an easy task, as there may be thousands of variants and multiple disciplines involved. The manual reuse of legacy system models at domain engineering to build reusable system libraries, and the configuration of variants to derive target products, can be infeasible. To tackle this challenge, a Product Line Systems Engineering process is proposed. Specifically, the process extends research on the System Orthogonal Variability Model to support hierarchical variability modeling with formal definitions; utilizes Systems Engineering concepts and legacy system models to build the hierarchy for the variability model and to identify essential relations between variants; and finally, analyzes the identified relations to reduce the number of variation points. The process, which is automated by computational algorithms, is demonstrated through an illustrative example of generalized Rolls-Royce aircraft engine control systems. To evaluate the effectiveness of the process in reducing variation points, it is further applied to case studies in different engineering domains at different levels of complexity. Subject to system model availability, reductions of 14% to 40% in the number of variation points are demonstrated in the case studies. Comment: 12 pages, 6 figures, 2 tables; submitted to the IEEE Systems Journal on 3rd June 201
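One concrete way to reduce variation points, consistent with the relation analysis the abstract describes, is to detect variation points whose variant selections coincide in every known product configuration: such points can be merged into one. The configurations and variation-point names below are invented for demonstration; the paper's actual algorithms are not reproduced here:

```python
# Illustrative reduction step: variation points that always select the same
# variant across all product configurations are candidates for merging.
def mergeable_groups(configs):
    """Group variation points whose selection vectors coincide across products."""
    by_signature = {}
    for vp in configs[0]:
        signature = tuple(c[vp] for c in configs)   # this VP's choice per product
        by_signature.setdefault(signature, []).append(vp)
    return [group for group in by_signature.values() if len(group) > 1]

# Two hypothetical engine-control product configurations.
configs = [
    {"vp_fuel_pump": "high_flow", "vp_controller": "high_flow", "vp_sensor": "standard"},
    {"vp_fuel_pump": "low_flow",  "vp_controller": "low_flow",  "vp_sensor": "standard"},
]
```

Here `vp_fuel_pump` and `vp_controller` always vary together, so the two variation points collapse into one, while `vp_sensor` never varies at all and could be fixed; both observations shrink the variability model.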

    Integrating Distributed Sources of Information for Construction Cost Estimating using Semantic Web and Semantic Web Service technologies

    A construction project requires the collaboration of several organizations, such as owner, designer, contractor, and material supplier organizations. These organizations need to exchange information to enhance their teamwork. Understanding the information received from other organizations requires specialized human resources. Construction cost estimating is one of the processes that requires information from several sources, including a building information model (BIM) created by designers, estimating assembly and work item information maintained by contractors, and construction material cost data provided by material suppliers. Currently, it is not easy to integrate the information necessary for cost estimating over the Internet. This paper discusses a new approach to construction cost estimating that uses Semantic Web technology. Semantic Web technology provides an infrastructure and a data modeling format that enables accessing, combining, and sharing information over the Internet in a machine-processable format. The estimating approach presented in this paper relies on BIM, estimating knowledge, and construction material cost data expressed in a web ontology language, and makes the various sources of estimating data accessible as Simple Protocol and Resource Description Framework Query Language (SPARQL) endpoints or Semantic Web Services. We present an estimating application that integrates distributed information provided by project designers, contractors, and material suppliers for preparing cost estimates. The purpose of this paper is not to fully automate the estimating process but to streamline it by reducing human involvement in repetitive cost estimating activities.
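The integration step at the heart of this approach is a join across three sources: element quantities from the designer's BIM, work-item knowledge from the contractor, and unit prices from the supplier. In the paper that join is expressed as SPARQL queries over distributed endpoints; the sketch below stands in with plain Python dictionaries, and all element names, rates and prices are invented for illustration:

```python
# Hypothetical join of three distributed estimating sources. In the actual
# approach each dict would be a SPARQL endpoint or Semantic Web Service.
bim_quantities = {"concrete_slab": 120.0}   # m^3, from the designer's BIM
work_items = {                               # contractor's estimating knowledge
    "concrete_slab": {"material": "ready_mix", "labour_per_unit": 35.0},
}
supplier_prices = {"ready_mix": 95.0}        # supplier's cost per m^3

def estimate(element):
    """Price one building element by joining quantity, work item and price."""
    qty = bim_quantities[element]
    item = work_items[element]
    unit_cost = supplier_prices[item["material"]] + item["labour_per_unit"]
    return qty * unit_cost
```

For the example data, `estimate("concrete_slab")` yields 120 m^3 at (95 + 35) per m^3, i.e. 15,600. The point of the Semantic Web formulation is that each of the three lookups can live on a different organization's endpoint while the estimator still performs this same join mechanically.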