7 research outputs found

    A mapping study on documentation in Continuous Software Development

    Context: With the rise of Agile, Lean, and DevOps software methodologies in recent years (collectively referred to as Continuous Software Development (CSD)), we have observed that documentation is often poor. Objective: This work aims to collect studies on documentation challenges, documentation practices, and tools that can support documentation in CSD. Method: A systematic mapping study was conducted to identify and analyze research on documentation in CSD, covering publications between 2001 and 2019. Results: A total of 63 studies were selected. We found 40 studies related to documentation practices and challenges, and 23 studies related to tools used in CSD. The challenges include: informal documentation is hard to understand, documentation is considered waste, productivity is measured by working software only, documentation is out of sync with the software, and there is a short-term focus. The practices include: non-written and informal communication, the use of development artifacts for documentation, and the use of architecture frameworks. We also made an inventory of numerous tools that can be used for documentation purposes in CSD. Overall, we recommend the use of executable documentation, modern tools and technologies to retrieve information and transform it into documentation, and the practice of minimal documentation upfront combined with detailed design for knowledge transfer afterwards. Conclusion: It is of paramount importance to increase the quantity and quality of documentation in CSD. While this remains challenging, practitioners will benefit from applying the identified practices and tools in order to mitigate the stated challenges.
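
    To make the recommendation of executable documentation concrete, here is a minimal sketch of one common form of it, a Python doctest: the usage examples live in the docstring and are executed as tests, so the documentation fails visibly when it drifts out of sync with the code. The function and the values shown are illustrative and are not taken from the mapped studies.

        def parse_version(tag):
            """Extract the numeric version from a release tag.

            The examples below are executable documentation: running
            `python -m doctest this_module.py` verifies them, so they
            cannot silently drift out of sync with the implementation.

            >>> parse_version("release-1.4.2")
            (1, 4, 2)
            >>> parse_version("release-2.0.0")
            (2, 0, 0)
            """
            return tuple(int(part) for part in tag.split("-")[1].split("."))

        if __name__ == "__main__":
            import doctest
            doctest.testmod()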

    A case study of agile software development for large-scale safety-critical systems projects

    This study explores the introduction of agile software development within an avionics company engaged in safety-critical systems engineering. There is increasing pressure throughout the software industry to adopt agile software development in order to respond more rapidly to changing requirements and to deliver systems to customers more frequently for review and integration. This pressure is also felt in safety-critical industries, where release cycles on typically large and complex systems may run to several years on projects spanning decades. However, safety-critical system development is normally highly regulated, which may constrain the adoption of agile software development or require adaptation of selected methods or practices. To investigate this potential conflict, we conducted a series of interviews with practitioners in the company, exploring their experiences of adopting agile software development and the challenges they encountered. The study also explores opportunities for altering the company's existing software process to better fit agile software development to the constraints of software development for safety-critical systems. We conclude by identifying immediate future research directions to better align the tempo of software development for safety-critical systems with that of agile software development.

    Introducing Computational Thinking in K-12 Education: Historical, Epistemological, Pedagogical, Cognitive, and Affective Aspects

    Introducing the scientific and cultural aspects of Computer Science (CS), collectively called "Computational Thinking" (CT), into K-12 education is fundamental. We focus on three crucial areas. 1. Historical, philosophical, and pedagogical aspects. What are the big ideas of CS we must teach? What are the historical and pedagogical contexts in which CT emerged, and why are they relevant? What is the relationship between learning theories (e.g., constructivism) and teaching approaches (e.g., plugged and unplugged)? 2. Cognitive aspects. What is the sentiment of generalist teachers not trained to teach CS? What misconceptions do they hold about concepts like CT and "coding"? 3. Affective and motivational aspects. What is the impact of personal beliefs about intelligence (mindset) and about CS ability? What is the role of teaching approaches? This research has been conducted both through historical and philosophical argumentation and through quantitative and qualitative studies (on both nationwide samples and smaller, significant ones), in particular through the lens of (often exaggerated) claims about transfer from CS to other skills. Four important claims are substantiated. 1. CS should be introduced in K-12 as a tool to understand and act in our digital world, and to use the power of computation for meaningful learning. CT is the conceptual sediment of that learning. We designed a curriculum proposal in this direction. 2. The expressions CT (useful to distinguish it from digital literacy) and "coding" can cause misconceptions among teachers, who focus mainly on transfer to general thinking skills. Both disciplinary and pedagogical teacher training are therefore needed. 3. Some plugged and unplugged teaching tools have intrinsic constructivist characteristics that can facilitate CS learning, as shown with the proposed activities. 4. A growth mindset is not automatically fostered by CS, while not studying CS can foster fixed beliefs. A growth mindset can be fostered by creative computing, leveraging its constructivist aspects.

    On the real world practice of Behaviour Driven Development

    Surveys of industry practice over the last decade suggest that Behaviour Driven Development (BDD) is a popular Agile practice. For example, 19% of respondents to the 14th State of Agile annual survey reported using BDD, placing it in the top 13 practices reported. As well as potential benefits, the adoption of BDD necessarily involves an additional cost of writing and maintaining Gherkin features and scenarios and (if used for acceptance testing) the associated step functions. Yet there is a lack of published literature exploring how BDD is used in practice and the challenges experienced by real-world software development efforts. This gap is significant because, without understanding current real-world practice, it is hard to identify opportunities to address and mitigate challenges. In order to address this research gap concerning the challenges of using BDD, this thesis reports on a research project which explored: (a) the challenges of applying agile and undertaking requirements engineering in a real-world context; (b) the challenges of applying BDD specifically; and (c) the application of BDD in open-source projects, to understand challenges in this different context. For this purpose, we progressively conducted two case studies, two series of interviews, four iterations of action research, and an empirical study. The first case study was conducted in an avionics company to discover the challenges of using an agile process in a large-scale safety-critical project environment. Since requirements management was found to be one of the biggest challenges during the case study, we decided to investigate BDD because of its reputation for requirements management. The second case study was conducted in the same company with the aim of discovering the challenges of using BDD in real life. The case study was complemented with an empirical study of the practice of BDD in open-source projects, taking a study sample from the GitHub open-source collaboration site. As a result of this Ph.D. research, we were able to discover: (i) challenges of using an agile process in a large-scale safety-critical organisation, (ii) the current state of BDD in practice, (iii) technical limitations of Gherkin (i.e., the language for writing requirements in BDD), (iv) challenges of using BDD in a real project, and (v) bad smells in the Gherkin specifications of open-source projects on GitHub. We also presented a brief comparison between the theoretical description of BDD and BDD in practice. This research therefore presents lessons learned from BDD in practice and serves as a guide for software practitioners planning to use BDD in their projects.
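
    To make the cost of writing Gherkin features, scenarios, and step functions concrete, here is a minimal, hypothetical sketch using the Python behave library; the feature text, step wording, and toy in-memory user store are invented for illustration and are not drawn from the projects studied in the thesis.

        # Hypothetical Gherkin scenario (normally kept in features/login.feature):
        #
        #   Feature: Account login
        #     Scenario: Successful login
        #       Given a registered user "alice"
        #       When she logs in with the correct password
        #       Then she sees her dashboard
        #
        # Step functions (features/steps/login_steps.py) bind each Gherkin step to
        # code via the behave library; the toy user store keeps the sketch
        # self-contained.
        from behave import given, when, then

        USERS = {}  # username -> password; a stand-in for a real user service

        @given('a registered user "{username}"')
        def step_registered_user(context, username):
            USERS[username] = "secret"          # register the user for this scenario
            context.username = username

        @when('she logs in with the correct password')
        def step_login(context):
            ok = USERS.get(context.username) == "secret"
            context.page = "dashboard" if ok else "error"

        @then('she sees her dashboard')
        def step_sees_dashboard(context):
            assert context.page == "dashboard"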

    A collaborative, distributed, scalable, and low-cost platform based on microservices, containers, mobile devices, and Cloud services for compute-intensive tasks

    When solving compute-intensive tasks in a distributed and parallel fashion, x86 hardware resources (CPU/GPU) and specialized infrastructure (Grid, Cluster, Cloud) are typically used to achieve high performance. Initially, x86 processors, coprocessors, and chips were developed to solve complex problems without regard for their energy consumption. Given its direct impact on costs and the environment, optimizing energy use, cooling, and expenditure, as well as analyzing alternative architectures, became a primary concern for organizations. As a result, companies and institutions have proposed different architectures to provide scalability, flexibility, and concurrency. With the goal of proposing an alternative to traditional schemes, this thesis proposes executing processing tasks by reusing the idle capacity of mobile devices. These devices integrate ARM processors which, in contrast to traditional x86 architectures, were designed with energy efficiency as a foundational pillar, since they are mostly battery powered. In recent years, these devices have increased in capability, efficiency, stability, and power, as well as in ubiquity and market share, while retaining a low price, small size, and reduced energy consumption. They also have idle periods while charging, which represents significant potential that can be reused. To manage and exploit these resources adequately, and to turn them into a data center for intensive processing, we designed, developed, and evaluated a distributed, collaborative, elastic, and low-cost platform based on an architecture composed of microservices and containers orchestrated with Kubernetes in both Cloud and local environments, integrated with DevOps tools, methodologies, and practices. The microservices paradigm allowed the developed functionality to be split into small services with narrow responsibilities. DevOps practices made it possible to build automated processes for test execution, traceability, monitoring, integration of changes, and development of new service versions. Finally, packaging the functions with all their dependencies and libraries in containers helped keep services small, immutable, portable, secure, and standardized, allowing them to run independently of the underlying architecture. Using Kubernetes as the container orchestrator allowed services to be managed, deployed, and scaled in an integrated and transparent manner, both locally and in the Cloud, guaranteeing efficient use of infrastructure, cost, and energy. To validate the performance, scalability, energy consumption, and flexibility of the system, various concurrent video transcoding scenarios were executed. This made it possible to test, on the one hand, the behavior and performance of various mobile and x86 devices under different stress conditions and, on the other hand, to show how, under a variable task load, the architecture adjusts, flexes, and scales to meet the processing demand.
    The experimental results, across the various performance, load, and saturation scenarios considered, show useful improvements over the study's baseline and that the developed architecture is robust enough to be considered a scalable, economical, and elastic alternative to traditional models.
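
    As an illustration of the elastic scaling described above, the following sketch uses the official Kubernetes Python client to adjust the replica count of a hypothetical "transcoding-worker" Deployment so that it follows the backlog of pending tasks; the deployment name, namespace, thresholds, and queue query are placeholders rather than the thesis's actual implementation.

        # Illustrative sketch: scale a (hypothetical) transcoding-worker Deployment
        # so that the number of replicas follows the backlog of pending tasks.
        # Uses the official `kubernetes` Python client with local kubeconfig credentials.
        from kubernetes import client, config

        def pending_tasks():
            # Placeholder: a real system would query the task queue (e.g., a message
            # broker) for the number of videos still waiting to be transcoded.
            return 12

        def scale_workers(namespace="compute", deployment="transcoding-worker",
                          tasks_per_replica=4, max_replicas=20):
            config.load_kube_config()            # or config.load_incluster_config()
            apps = client.AppsV1Api()
            backlog = pending_tasks()
            # One replica per `tasks_per_replica` pending tasks, at least 1, capped.
            replicas = min(max(1, -(-backlog // tasks_per_replica)), max_replicas)
            apps.patch_namespaced_deployment_scale(
                name=deployment,
                namespace=namespace,
                body={"spec": {"replicas": replicas}},
            )
            return replicas

        if __name__ == "__main__":
            print("scaled transcoding workers to", scale_workers(), "replicas")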

    Tools and Algorithms for the Construction and Analysis of Systems

    This open access two-volume set constitutes the proceedings of the 27th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2021, which was held during March 27 – April 1, 2021, as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2021. The conference was planned to take place in Luxembourg but changed to an online format due to the COVID-19 pandemic. The 41 full papers presented in the proceedings were carefully reviewed and selected from 141 submissions. The volumes also contain 7 tool papers, 6 tool demo papers, and 9 SV-COMP competition papers. The papers are organized in topical sections as follows: Part I: Game Theory; SMT Verification; Probabilities; Timed Systems; Neural Networks; Analysis of Network Communication. Part II: Verification Techniques (not SMT); Case Studies; Proof Generation/Validation; Tool Papers; Tool Demo Papers; SV-Comp Tool Competition Papers.

    XVIII Simposio Internacional de Informática Educativa, SIIE 2016

    The Simposio Internacional de Informática Educativa (SIIE) offers an international forum for presenting and debating the latest research advances on technologies for learning and their practical application in educational processes. It also aims to bring together researchers, developers, institutional representatives, and teachers to share viewpoints, knowledge, and experiences.