
    The role of software engineering in the space station program

    Software engineering application snapshots within the Space Station Freedom Program; software engineering and Ada training; software reuse; hierarchical command and control; program characteristics; integrated, international environments; software production, integration, and management; and an integrated simulation environment are outlined in viewgraph format.

    Cloud-Based Collaborative 3D Modeling to Train Engineers for the Industry 4.0

    In the present study, Autodesk Fusion 360 software (which includes the A360 environment) is used to train engineering students for the demands of Industry 4.0. Fusion 360 is a tool that unifies product lifecycle management (PLM) applications and 3D-modeling software (PDLM—product design and life management). The main objective of the research is to deepen the students’ perception of the use of a PDLM application and its dependence on three categorical variables: prior PLM knowledge, individual practices and collaborative engineering perception. To that end, a collaborative graphic simulation of an engineering project is proposed in the engineering graphics subject at the University of La Laguna with 65 engineering undergraduate students. A scale to measure the perception of the use of PDLM is designed, applied and validated. Subsequently, descriptive analyses, contingency graphical analyses and non-parametric analyses of variance are performed. The results indicate a high overall reception of this type of experience and show that it helps students understand how professionals work in collaborative environments. It is concluded that it is possible to meet industry demands on future engineers through training programs based on collaborative 3D modeling environments.
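
    To make the analysis step above concrete, the sketch below runs a non-parametric analysis of variance (here a Kruskal-Wallis test, one common choice) on a perception score grouped by one categorical variable. The column names and data are hypothetical; the paper does not publish its analysis code, so this is only an illustrative assumption.

    ```python
    # Minimal sketch of a non-parametric analysis of variance on survey data.
    # Column names and values are hypothetical, not taken from the study.
    import pandas as pd
    from scipy.stats import kruskal

    # Hypothetical responses: overall PDLM-perception score per student plus
    # one categorical variable (prior PLM knowledge).
    df = pd.DataFrame({
        "plm_prior_knowledge": ["none", "basic", "advanced", "none", "basic",
                                "advanced", "none", "basic", "advanced"],
        "pdlm_perception":     [3.2, 3.8, 4.4, 2.9, 4.0, 4.6, 3.5, 3.9, 4.2],
    })

    # Kruskal-Wallis H test: does perception differ across knowledge levels?
    groups = [g["pdlm_perception"].values
              for _, g in df.groupby("plm_prior_knowledge")]
    stat, p_value = kruskal(*groups)
    print(f"H = {stat:.2f}, p = {p_value:.3f}")
    ```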

    Empirical Studies in End-User Software Engineering and Viewing Scientific Programmers as End-Users -- POSITION STATEMENT --

    My work has two relationships with End-User Software Engineering. First, as an Empirical Software Engineer, I am interested in meeting with people who do research into techniques for improving end-user software engineering. All of these techniques need some type of empirical validation. In many cases this validation is performed by the researcher, but in other cases it is not. Regardless, an independent validation of a new approach is vital. Second, an area where I have done a fair amount of work is software engineering for scientific software (typically written for a parallel supercomputer). These programmers are typically scientists who have little or no training in formal software engineering. Yet, to accomplish their work, they often write very complex simulation and computation software. I believe these programmers are a unique class of end-users that must be addressed.

    Software architecture standard for simulation virtual machine, version 2.0

    The Simulation Virtual Machine (SVM) is an Ada architecture which eases the effort involved in real-time software maintenance and sustaining engineering. The Software Architecture Standard defines the infrastructure from which all simulation models are built. SVM was developed for and used in the Space Station Verification and Training Facility.
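
    The shared-infrastructure idea can be illustrated with a small sketch: every model implements one common interface, and the infrastructure owns registration and the fixed-step update loop. The actual SVM is an Ada architecture whose interfaces are not reproduced here; the Python sketch below, with hypothetical class and method names, only shows the general pattern.

    ```python
    # Illustrative analogue of a "simulation virtual machine" style architecture:
    # all models are built from one interface, and the shared infrastructure
    # (a scheduler) drives the fixed-step loop. Names are hypothetical.
    from abc import ABC, abstractmethod


    class SimulationModel(ABC):
        """Interface every simulation model is built from."""

        @abstractmethod
        def initialize(self) -> None: ...

        @abstractmethod
        def update(self, dt: float) -> None:
            """Advance the model state by one frame of dt seconds."""


    class Scheduler:
        """Shared infrastructure: model registration and frame stepping."""

        def __init__(self, dt: float = 0.02):
            self.dt = dt
            self.models: list[SimulationModel] = []

        def register(self, model: SimulationModel) -> None:
            self.models.append(model)

        def run(self, frames: int) -> None:
            for m in self.models:
                m.initialize()
            for _ in range(frames):
                for m in self.models:
                    m.update(self.dt)
    ```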

    Skills development and recording in engineering analysis and simulation: Industry needs

    The EASIT2 project (Engineering Analysis and Simulation Innovation Transfer), funded under the European Union Lifelong Learning Programme, has the major goal of contributing to the competitiveness and quality of engineering, design and manufacturing in Europe by identifying the generic competencies that users of engineering analysis and simulation systems must possess. This competency framework will include a comprehensive Educational Base and a web-based interface compatible with other staff development systems, with links to associated resource material that engineers and analysts can use to develop and track their competencies. The project will also deliver an integrated Registered Analyst (RA) Scheme to provide recognition of achievement of these competencies. To help ensure that the deliverables of this project meet industry needs, a survey was undertaken, and this paper summarises its findings. The survey comprised an online questionnaire completed by 1094 respondents from 50 different countries. A large majority of respondents thought a system to define analyst skills and provide links to appropriate training resources would be useful. There was also strong support for a form of professional qualification in engineering analysis. The advantages these project deliverables would bring to industry include incentives for staff development, marketing power, enhanced subcontractor qualification and better internal resource management. The survey also provided a valuable insight into the current state of the engineering analysis and simulation industry. The most significant barriers to the effective use of engineering analysis were identified as the recruitment of suitably qualified and experienced staff and a lack of analysis skills. “Pressure of work” was also identified as the most significant reason why organisations fail to get the most out of engineering analysis software. The findings of this survey are now being used in the development of the project deliverables to ensure that they meet the needs of industry as closely as possible.

    Army-NASA aircrew/aircraft integration program (A3I) software detailed design document, phase 3

    The capabilities and design approach of the MIDAS (Man-machine Integration Design and Analysis System) computer-aided engineering (CAE) workstation under development by the Army-NASA Aircrew/Aircraft Integration Program are detailed. This workstation uses graphic, symbolic, and numeric prototyping tools and human performance models as part of an integrated design/analysis environment for crewstation human engineering. The workstation is being developed incrementally; the requirements and design for Phase 3 (Dec. 1987 to Jun. 1989) are described here. Software tools/models developed or significantly modified during this phase included: an interactive 3-D graphic cockpit design editor; multiple-perspective graphic views to observe simulation scenarios; symbolic methods to model mission decomposition, equipment functions, and pilot tasking and loading, as well as to control the simulation; a 3-D dynamic anthropometric model; an intermachine communications package; and a training assessment component. These components were successfully used during Phase 3 to demonstrate the complex interactions and human engineering findings involved with a proposed cockpit communications design change in a simulated AH-64A Apache helicopter/mission, and the results map to empirical data from a similar study and an AH-1 Cobra flight test.

    Learning through practice via role-playing: Lessons learnt

    Software engineering is the establishment and use of sound engineering principles in order to obtain, economically, software that is reliable and works efficiently on real machines. Sound software engineering is closely related to socio-technical activity that depends on several human issues: communication, collaboration, motivation, work environment, team harmony, engagement, training and education. These issues affect how well students understand software engineering and how prepared they are for software development careers. Therefore, courses offered in the university must also consider the sociological and communication aspects, often called the socio-technical aspects. One popular method is to use role-playing exercises. Role-playing is a less technologically elaborate form of simulation for learning interpersonal skills and is analogous to rehearsal. It is particularly helpful when students have difficulty relating lessons learnt in the university to the applicability of that knowledge in real implementations. This is because many students view software engineering as meaningless bureaucracy and have little interest in the knowledge delivered in the lecture hall. This scenario impedes the expansion of current knowledge and inhibits the possibility of exploring knowledge to solve a range of industry problems. Simply lecturing about software engineering will never engage students or convince them that software engineering has value. Given this student bias, the goal of teaching software engineering often becomes convincing students that it has value. To achieve this, students need to experience firsthand the sociological and communication difficulties associated with developing software systems. In this paper, we argue that teaching software engineering must cover two essential things: delivery of the knowledge and skills required in the software engineering domain in the form of lectures, and hands-on practice to experience the value of the knowledge and skills learnt. We report on our experiences gained in deploying role-playing in a master's degree program. Role-playing is used as a pedagogical tool to give students a greater appreciation of the range of issues and problems associated with software engineering in real settings. We believe that the lessons learnt from this exercise will be valuable for those interested in advancing software engineering education and training.

    Automatic visualization and control of arbitrary numerical simulations

    Authors’ preprint version as submitted to ECCOMAS Congress 2016, Minisymposium 505 - Interactive Simulations in Computational Engineering. Abstract: Visualization of numerical simulation data has become a cornerstone for many industries and research areas today. There exists a large amount of software support, which is usually tied to specific problem domains or simulation platforms. However, numerical simulations have commonalities in the building blocks of their descriptions (e.g., dimensionality, range constraints, sample frequency). Instead of encoding these descriptions and their meaning into software architectures, we propose to base their interpretation and evaluation on a data-centric model. This approach draws much inspiration from the work of the IEEE Simulation Interoperability Standards Group as currently applied in distributed (military) training and simulation scenarios, and seeks to extend those ideas. By using an extensible, self-describing protocol format, simulation users as well as simulation-code providers would be able to express the meaning of their data even if no access to the underlying source code were available or if new and unforeseen use cases emerged. A protocol definition will allow simulation-domain experts to describe constraints that can be used for automatically creating appropriate visualizations of simulation data and control interfaces. Potentially, this will enable leveraging innovations on both the simulation and visualization sides of the problem continuum. We envision the design and development of algorithms and software tools for the automatic visualization of complex data from numerical simulations executed on a wide variety of platforms (e.g., remote HPC systems, local many-core or GPU-based systems). We also envisage using this automatically gathered information to control (or steer) the simulation while it is running, as well as providing the ability for fine-tuning representational aspects of the visualizations produced.
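
    A self-describing descriptor of the kind proposed above (dimensionality, range constraints, sample frequency carried alongside the data) could be enough for a viewer to pick a default visualization without touching the simulation source. The sketch below shows one hypothetical JSON encoding and a trivial selection rule; the field names and rules are assumptions for illustration, since the paper does not fix a concrete format at this point.

    ```python
    # Hypothetical sketch of a self-describing simulation-data descriptor and a
    # rule that derives a default visualization from it. Field names are assumed.
    import json

    descriptor = {
        "name": "temperature",
        "dimensionality": 3,          # spatial dimensions of the field
        "range": [250.0, 400.0],      # physical range constraint (Kelvin)
        "sample_frequency_hz": 10.0,  # how often the simulation emits samples
        "units": "K",
    }

    def choose_visualization(desc: dict) -> str:
        """Pick a default view purely from the descriptor, not from source code."""
        if desc["dimensionality"] == 1:
            return "line plot over time"
        if desc["dimensionality"] == 2:
            return "color-mapped heat map"
        return "volume rendering with isosurfaces"

    # The descriptor travels with the data, e.g. as a JSON header on the wire.
    wire_header = json.dumps(descriptor)
    print(choose_visualization(json.loads(wire_header)))
    ```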
