
    Virtual Environments for multiphysics code validation on Computing Grids

    In this paper we advocate the use of grid-based infrastructures designed to give expert numerical users, i.e., the designers of multiphysics applications, seamless access to computing resources. The approach relies on sophisticated computing environments built on computing grids that connect heterogeneous resources: mainframes, PC clusters and workstations running multiphysics codes and utility software, e.g., visualization tools. It is based on concepts defined by the HEAVEN consortium, a European scientific consortium that includes industrial partners from the aerospace, telecommunication and software industries as well as academic research institutes. The HEAVEN consortium is currently working on a project to create advanced service platforms intended to enable "virtual private grids", which support various environments for users through a suitable high-level interface. This will become the basis for future generalized services allowing the integration of various services without the need to deploy dedicated grid infrastructures.

    Research and Education in Computational Science and Engineering

    Over the past two decades the field of computational science and engineering (CSE) has penetrated both basic and applied research in academia, industry, and laboratories to advance discovery, optimize systems, support decision-makers, and educate the scientific and engineering workforce. Informed by centuries of theory and experiment, CSE performs computational experiments to answer questions that neither theory nor experiment alone is equipped to answer. CSE provides scientists and engineers of all persuasions with algorithmic inventions and software systems that transcend disciplines and scales. Carried on a wave of digital technology, CSE brings the power of parallelism to bear on troves of data. Mathematics-based advanced computing has become a prevalent means of discovery and innovation in essentially all areas of science, engineering, technology, and society; and the CSE community is at the core of this transformation. However, a combination of disruptive developments---including the architectural complexity of extreme-scale computing, the data revolution that engulfs the planet, and the specialization required to follow the applications to new frontiers---is redefining the scope and reach of the CSE endeavor. This report describes the rapid expansion of CSE and the challenges to sustaining its bold advances. The report also presents strategies and directions for CSE research and education for the next decade. Comment: Major revision, to appear in SIAM Review.

    Distributed Workflows for Multi-physics Applications in Aeronautics

    Industry requires innovative technologies to support the numerical design and simulation of manufactured products, in order to reduce time to market and improve product performance and industrial efficiency in a globally competitive market. Innovation also requires advanced tools to support the design of new products. For example, remote teams are working collaboratively on the preliminary design of future aircraft that will be "safer, quieter, cleaner" and environmentally friendly by 2020. The automotive industry has similar concerns, and the telecom industry (e.g., mobile phone design) and nuclear power plant design face large-scale multi-physics simulation and optimization challenges. This paper suggests that distributed workflows running on computational grids are adequate to support these application needs.
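    The abstract stays at the level of requirements, so the sketch below is only an illustration of what a distributed multi-physics workflow can look like in practice: a fixed-point fluid-structure iteration whose two solver stages could each be dispatched to a different grid resource. The names (solve_fluid, solve_structure) and the toy update rules are hypothetical placeholders, not the paper's workflow model.

        # Illustrative sketch only: a two-stage multi-physics workflow where each
        # stage is submitted as an independent task, as it would be on a grid.
        from concurrent.futures import ProcessPoolExecutor

        def solve_fluid(displacement):
            # Stand-in for a CFD run on a remote cluster: returns aerodynamic loads.
            return 0.8 * displacement + 1.0

        def solve_structure(loads):
            # Stand-in for a structural run: returns the updated displacement.
            return 0.5 * loads

        def coupled_workflow(tol=1e-6, max_iter=50):
            displacement = 0.0
            with ProcessPoolExecutor() as pool:
                for _ in range(max_iter):
                    loads = pool.submit(solve_fluid, displacement).result()
                    updated = pool.submit(solve_structure, loads).result()
                    if abs(updated - displacement) < tol:
                        return updated
                    displacement = updated
            return displacement

        if __name__ == "__main__":
            print("Converged interface displacement:", coupled_workflow())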

    Collaborative Multidisciplinary Design in Virtual Environments

    Application designers can define their own "virtual environments" by selecting the computing resources they require, or by reusing and composing existing environments. The approach is generic, allowing various application domains to benefit from hardware and software resources located on remote computing facilities in a simple and intuitive way. The computing resources are exposed as services with standardized interfaces performing specific tasks: application workflow, input data streams, output visualization tools, monitoring facilities, etc. Services can be composed and hierarchically defined. Transparent access to heterogeneous hardware and operating systems is guaranteed. An aeroelasticity example in airliner design is given.
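    As an illustration of the service model sketched in the abstract (standardized interfaces that can be composed hierarchically into a "virtual environment"), the following minimal Python sketch uses invented class names; it is not the paper's actual interface definition.

        # Hypothetical sketch: services with a standardized interface, composed
        # hierarchically into a reusable "virtual environment".
        from abc import ABC, abstractmethod

        class Service(ABC):
            """A computing resource exposed through a standardized interface."""
            @abstractmethod
            def run(self, data: dict) -> dict: ...

        class WorkflowStep(Service):
            """A single task, e.g. meshing, solving or visualization."""
            def __init__(self, name, transform):
                self.name, self.transform = name, transform

            def run(self, data):
                return {**data, self.name: self.transform(data)}

        class CompositeService(Service):
            """An environment built by composing other services (possibly composites)."""
            def __init__(self, children):
                self.children = children

            def run(self, data):
                for child in self.children:
                    data = child.run(data)
                return data

        # Compose meshing, solution and visualization services for an aeroelastic case.
        environment = CompositeService([
            WorkflowStep("mesh", lambda d: f"mesh({d['geometry']})"),
            WorkflowStep("solution", lambda d: f"solve({d['mesh']})"),
            WorkflowStep("plot", lambda d: f"render({d['solution']})"),
        ])
        print(environment.run({"geometry": "wing"}))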

    A Review of Platforms for the Development of Agent Systems

    Agent-based computing is an active field of research with the goal of building autonomous software or hardware entities. This task is often facilitated by the use of dedicated, specialized frameworks. For almost thirty years, many such agent platforms have been developed. Meanwhile, some of them have been abandoned, others continue their development, and new platforms are released. This paper presents an up-to-date review of existing agent platforms together with a historical perspective of the domain. It aims to serve as a reference point for people interested in developing agent systems. This work details the main characteristics of the included agent platforms, together with links to specific projects where they have been used. It distinguishes between the active platforms and those no longer under development or with unclear status. It also classifies the agent platforms as general-purpose ones, free or commercial, and specialized ones, which can be used for particular types of applications. Comment: 40 pages, 2 figures, 9 tables, 83 references.

    The Astrophysical Multipurpose Software Environment

    We present the open source Astrophysical Multi-purpose Software Environment (AMUSE, www.amusecode.org), a component library for performing astrophysical simulations involving different physical domains and scales. It couples existing codes within a Python framework based on a communication layer using MPI. The interfaces are standardized for each domain, and their implementation based on MPI guarantees that the whole framework is well suited for distributed computation. It includes facilities for unit handling and data storage. Currently it includes codes for gravitational dynamics, stellar evolution, hydrodynamics and radiative transfer. Within each domain the interfaces to the codes are as similar as possible. We describe the design and implementation of AMUSE, as well as the main components and community codes currently supported, and we discuss the code interactions facilitated by the framework. Additionally, we demonstrate how AMUSE can be used to resolve complex astrophysical problems by presenting example applications. Comment: 23 pages, 25 figures, accepted for A&A.
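    For readers unfamiliar with the framework, the following sketch shows the usage pattern the abstract describes (unit-aware data, a standardized gravitational-dynamics interface backed by an MPI worker). It is written from memory of AMUSE's documented conventions; exact module paths and class names vary between versions and should be checked against www.amusecode.org.

        # Sketch of the AMUSE coupling pattern; module paths are quoted from memory
        # and may differ between AMUSE versions.
        from amuse.units import units, nbody_system
        from amuse.ic.plummer import new_plummer_model
        from amuse.community.hermite.interface import Hermite   # gravitational N-body code

        # Converter between dimensionless N-body units and physical units.
        converter = nbody_system.nbody_to_si(100.0 | units.MSun, 1.0 | units.parsec)

        stars = new_plummer_model(100, convert_nbody=converter)  # unit-aware particle set
        gravity = Hermite(converter)                              # starts an MPI worker process
        gravity.particles.add_particles(stars)

        gravity.evolve_model(1.0 | units.Myr)                     # advance the model in time
        print(gravity.particles.center_of_mass())

        gravity.stop()                                            # shut down the worker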

    Multiphysics Simulation and Model-based System Testing of Automotive E-Powertrains

    Model-Based System Testing emerges as a new paradigm for the development cycle that is currently gaining momentum, especially in the automotive industry. This approach combines computer simulation and real experimentation to shift the bulk of problem detection and redesign tasks towards the early stages of development. Along these lines, Model-Based System Testing aims to decrease the amount of resources invested in these tasks and to enable the early identification of design flaws and operation problems before a full-vehicle prototype is available. The use of Model-Based System Testing, however, requires the implementation of some critical technologies, three of which are discussed in this thesis.

    The first task addressed in this thesis is the design of a multiplatform framework to assess the description and resolution of the equations of motion of the virtual models used in simulation. This framework enables the efficiency evaluation of different modelling and solution methods and implementations. In Model-Based System Testing contexts, virtual models interact with physical components, so it is mandatory to guarantee their real-time capabilities regardless of the software or hardware implementation.

    Second, estimation techniques based on Kalman filters are of interest in Model-Based System Testing applications to evaluate the parameters, inputs or states of a virtual model of a given system. These procedures can be combined with the use of digital twins, virtual counterparts of real systems with which they exchange information in a two-way communication. The measurements available from the sensors located on a physical system can be fused with the results obtained from the simulation of the virtual model, improving the knowledge of the magnitudes that cannot be measured directly by these sensors. In turn, the outcomes obtained from the simulation of the virtual model can serve to make decisions and apply corrective actions on the physical system.

    Third, co-simulation techniques are necessary when a system is split into several subsystems that are coordinated through the exchange of a reduced set of variables at discrete points in time. This is the case for the majority of Model-Based System Testing applications, in which physical and virtual components are coupled through a discrete-time communication gateway. The resulting cyber-physical applications are essentially an example of real-time co-simulation, in which all the subsystems need to achieve real-time performance. Due to the presence of physical components, which cannot iterate over their integration steps, explicit coupling schemes are often mandatory. These, however, introduce errors associated with the inherent delays of a discrete communication interface; such errors can render co-simulation results inaccurate and even unstable unless they are eliminated. This thesis addresses this correction by means of an energy-based procedure that considers the power exchanged between subsystems.

    This research work concludes with an example of a cyber-physical application in which real components are interfaced with a virtual environment, requiring the application of all the Model-Based System Testing technologies addressed in this thesis.
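    The thesis's energy-based correction is not reproduced here; as a generic illustration of the coupling error it targets, the sketch below co-simulates two spring-mass oscillators with an explicit (Jacobi) scheme and zero-order-hold exchange at macro steps, and reports the drift of the total mechanical energy, which a monolithic simulation of this conservative system would keep essentially constant. All parameter values are arbitrary.

        # Illustration only (not the thesis's correction scheme): explicit Jacobi
        # co-simulation of two spring-mass oscillators coupled by a spring, with
        # zero-order-hold (ZOH) exchange of interface variables every macro step.
        m1, m2, k1, k2, kc = 1.0, 1.0, 10.0, 10.0, 50.0  # masses, anchor springs, coupling spring
        H, h, T = 1e-2, 1e-4, 5.0                        # macro step, micro step, end time

        x1, v1, x2, v2 = 1.0, 0.0, 0.0, 0.0              # initial states
        fc_held, x1_held = kc * (x2 - x1), x1            # interface values frozen between exchanges

        def total_energy(x1, v1, x2, v2):
            # Energy of the monolithic, conservative system: should stay constant.
            return (0.5 * m1 * v1**2 + 0.5 * m2 * v2**2
                    + 0.5 * k1 * x1**2 + 0.5 * k2 * x2**2
                    + 0.5 * kc * (x2 - x1)**2)

        E0 = total_energy(x1, v1, x2, v2)
        micro_steps = int(round(H / h))

        t = 0.0
        while t < T:
            # Subsystem 1 integrates with the coupling force held constant (ZOH).
            for _ in range(micro_steps):
                v1 += (-k1 * x1 + fc_held) / m1 * h      # semi-implicit Euler
                x1 += v1 * h
            # Subsystem 2 integrates against the frozen partner displacement.
            for _ in range(micro_steps):
                v2 += (-k2 * x2 - kc * (x2 - x1_held)) / m2 * h
                x2 += v2 * h
            # Interface variables are exchanged only once per macro step.
            x1_held, fc_held = x1, kc * (x2 - x1)
            t += H

        print(f"Relative energy drift after {T} s: "
              f"{(total_energy(x1, v1, x2, v2) - E0) / E0:.3%}")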

    Research and Education in Computational Science and Engineering

    This report presents challenges, opportunities, and directions for computational science and engineering (CSE) research and education for the next decade. Over the past two decades the field of CSE has penetrated both basic and applied research in academia, industry, and laboratories to advance discovery, optimize systems, support decision-makers, and educate the scientific and engineering workforce. Informed by centuries of theory and experiment, CSE performs computational experiments to answer questions that neither theory nor experiment alone is equipped to answer. CSE provides scientists and engineers with algorithmic inventions and software systems that transcend disciplines and scales. CSE brings the power of parallelism to bear on troves of data. Mathematics-based advanced computing has become a prevalent means of discovery and innovation in essentially all areas of science, engineering, technology, and society, and the CSE community is at the core of this transformation. However, a combination of disruptive developments---including the architectural complexity of extreme-scale computing, the data revolution and increased attention to data-driven discovery, and the specialization required to follow the applications to new frontiers---is redefining the scope and reach of the CSE endeavor. With these many current and expanding opportunities for the CSE field, there is a growing demand for CSE graduates and a need to expand CSE educational offerings. This need includes CSE programs at both the undergraduate and graduate levels, as well as continuing education and professional development programs, exploiting the synergy between computational science and data science. Yet, as institutions consider new and evolving educational programs, it is essential to consider the broader research challenges and opportunities that provide the context for CSE education and workforce development.