    Collaborative Engineering Environments. Two Examples of Process Improvement

    Companies are recognising that innovative processes are a determining factor in competitiveness. Two examples from aircraft-development projects describe the introduction of collaborative engineering environments as a way to improve engineering processes. A multi-disciplinary simulation environment integrates models from all disciplines involved into a common functional structure. Quick configuration for specific design problems and powerful feedback/visualisation capabilities let engineering teams concentrate on the integrated behaviour of the design. An engineering process management system allows engineering teams to work concurrently on tasks, following a defined flow of activities and applying tools to a shared database. Automated management of workspaces, including data consistency, lets teams concentrate on the design activities themselves. The large body of experience in companies must be transformed for effective application in engineering processes. Compatible concepts, notations, and implementation platforms make tangible knowledge such as models and algorithms accessible. Computer-based design management makes knowledge about engineering processes and methods explicit.

    Version control of pathway models using XML patches

    Background: Computational modelling has become an important tool in understanding biological systems such as signalling pathways. With an increase in the size and complexity of models comes a need for techniques to manage model versions and their relationships to one another. Model version control for pathway models shares some features of software version control, but has a number of differences that warrant a specific solution.

    Results: We present a model version control method, along with a prototype implementation, based on XML patches. We show its application to the EGF/RAS/RAF pathway.

    Conclusion: Our method allows quick and convenient storage of a wide range of model variations and enables a thorough explanation of these variations. Producing these results without such methods leads to slow and cumbersome development that is prone to frustration and human error.
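    The patch-based versioning idea above can be sketched in a few lines. The model structure and patch format below are illustrative assumptions for demonstration, not the schema used by the paper's prototype.

```python
# A minimal sketch of patch-based version control for XML pathway models.
# The element names, ids, and patch-operation format are assumptions.
import xml.etree.ElementTree as ET

BASE_MODEL = """\
<model name="EGF_pathway">
  <species id="EGF" amount="100"/>
  <species id="RAS" amount="50"/>
</model>"""

# A "patch" as a list of operations keyed by element id (assumed format).
PATCH_V2 = [
    ("update", "RAS", {"amount": "75"}),           # change an attribute
    ("add", None, {"id": "RAF", "amount": "20"}),  # introduce a new species
]

def apply_patch(xml_text, patch):
    """Apply a patch to an XML model, returning the new serialised model."""
    root = ET.fromstring(xml_text)
    for op, target, attrs in patch:
        if op == "update":
            elem = root.find(f"./species[@id='{target}']")
            elem.attrib.update(attrs)
        elif op == "add":
            ET.SubElement(root, "species", attrs)
    return ET.tostring(root, encoding="unicode")

v2 = apply_patch(BASE_MODEL, PATCH_V2)
```

    Storing only the base model plus small patches like `PATCH_V2` keeps many model variants cheap to record and easy to compare.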

    On the Ionisation Fraction in Protoplanetary Disks II: The Effect of Turbulent Mixing on Gas-phase Chemistry

    We calculate the ionisation fraction in protostellar disk models using two different gas-phase chemical networks, and examine the effect of turbulent mixing by modelling the diffusion of chemical species vertically through the disk. The aim is to determine in which regions of the disk the gas can couple to a magnetic field and sustain MHD turbulence. We find that the effect of diffusion depends crucially on the elemental abundance of heavy metals (magnesium) included in the chemical model. In the absence of heavy metals, diffusion has essentially no effect on the ionisation structure of the disks, as the recombination time scale is much shorter than the turbulent diffusion time scale. When metals are included with an elemental abundance above a threshold value, diffusion can dramatically reduce the size of the magnetically decoupled region, or even remove it altogether. For a complex chemistry, the elemental abundance of magnesium required to remove the dead zone is 10^-10 to 10^-8. We also find that diffusion can modify the reaction pathways, giving rise to dominant species when diffusion is switched on that are minor species when diffusion is absent. This suggests that there may be chemical signatures of diffusive mixing that could be used to indirectly detect turbulent activity in protoplanetary disks. We find examples of models in which the dead zone in the outer disk region becomes deeper when diffusion is switched on. Overall, these results suggest that global MHD turbulence in protoplanetary disks may be self-sustaining under favourable circumstances, as turbulent mixing can help maintain the ionisation fraction above the level necessary to ensure good coupling between the gas and magnetic field.
    Comment: 11 pages, 7 figures; accepted for publication in A&A
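    The key comparison in the abstract — mixing only alters the ionisation structure where recombination is not much faster than diffusion — can be made concrete with a back-of-the-envelope estimate. All numerical values below are illustrative assumptions, not the paper's disk models.

```python
# Timescale comparison: vertical turbulent mixing vs. recombination.
# t_diff ~ H^2 / D with D ~ alpha * c_s * H; t_rec ~ 1 / (beta * n_e).
# Every number here is an assumed, order-of-magnitude placeholder.

def diffusion_time(H_cm, alpha, c_s):
    """Vertical mixing time t_diff = H^2 / D, D = alpha * c_s * H."""
    D = alpha * c_s * H_cm
    return H_cm**2 / D          # simplifies to H / (alpha * c_s)

def recombination_time(beta, n_e):
    """Two-body recombination time t_rec = 1 / (beta * n_e)."""
    return 1.0 / (beta * n_e)

H = 1.0e12      # disk scale height [cm] (assumed)
alpha = 1.0e-2  # turbulence strength parameter (assumed)
c_s = 1.0e5     # sound speed [cm/s] (assumed)
beta = 3.0e-6   # recombination rate coefficient [cm^3/s] (illustrative)
n_e = 1.0e2     # electron number density [cm^-3] (illustrative)

t_diff = diffusion_time(H, alpha, c_s)
t_rec = recombination_time(beta, n_e)

# With fast (metal-free) recombination, t_rec << t_diff and mixing
# cannot change the ionisation structure, matching the abstract.
mixing_matters = t_rec >= t_diff
```

    Adding metals lengthens the effective recombination time (charge is parked on slowly recombining metal ions), which is how, in the paper's picture, mixing can come to dominate and shrink the dead zone.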

    Searching for New Leads to Treat Epilepsy: Target-Based Virtual Screening for the Discovery of Anticonvulsant Agents

    The purpose of this investigation is to contribute to the development of new anticonvulsant drugs to treat patients with refractory epilepsy. We applied a virtual screening protocol that searched molecular databases of new compounds and known drugs for small molecules that interact with the open conformation of the Nav1.2 pore. As the 3D structure of human Nav1.2 is not available, we first assembled 3D models of the target in closed and open conformations. After the virtual screening, the resulting candidates were submitted to a second virtual filter to find compounds with better chances of being effective for the treatment of P-glycoprotein (P-gp) mediated resistant epilepsy. Again, we built a model of the 3D structure of human P-gp, and we validated the docking methodology used to propose the best candidates, which were tested experimentally on Nav1.2 channels by patch clamp techniques and in vivo by the maximal electroshock seizure (MES) test. Patch clamp studies allowed us to corroborate that our candidates, drugs used for the treatment of other pathologies (Ciprofloxacin, Losartan, and Valsartan), exhibit inhibitory effects on Nav1.2 channel activity. Additionally, a compound synthesized in our lab, N,N′-diphenethylsulfamide, interacts with the target and also triggers significant Nav1.2 channel inhibitory action. Finally, in vivo studies confirmed the anticonvulsant action of Valsartan, Ciprofloxacin, and N,N′-diphenethylsulfamide.
    Authors: Palestro, Pablo Hernán; Enrique, Nicolás Jorge; Goicoechea, Sofia; Villalba, Maria Luisa; Sabatier, Laureano Leonel; Martín, Pedro; Milesi, Verónica; Bruno Blanch, Luis Enrique; Gavernet, Luciana (Universidad Nacional de La Plata, Facultad de Ciencias Exactas; Consejo Nacional de Investigaciones Científicas y Técnicas, Argentina).
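    The two-stage filtering logic described above can be sketched as a ranking pipeline: keep strong binders to the Nav1.2 open-pore model, then discard likely P-gp substrates. Compound names, scores, and thresholds below are hypothetical placeholders, not the paper's docking results.

```python
# Sketch of a two-stage virtual screening filter (all data hypothetical).
candidates = [
    # (name, Nav1.2 docking score, P-gp docking score); lower = tighter binding
    ("compound_A", -9.1, -8.5),
    ("compound_B", -8.7, -4.2),
    ("compound_C", -6.0, -3.9),
    ("compound_D", -8.9, -5.1),
]

NAV_CUTOFF = -8.0   # keep strong Nav1.2 binders (assumed threshold)
PGP_CUTOFF = -7.0   # discard strong P-gp binders (assumed threshold)

# Stage 1: strong predicted binding to the Nav1.2 open-pore model.
stage1 = [c for c in candidates if c[1] <= NAV_CUTOFF]

# Stage 2: weak predicted binding to P-gp, to avoid efflux-mediated
# resistance; survivors go on to patch clamp and MES testing.
hits = [name for name, nav, pgp in stage1 if pgp > PGP_CUTOFF]
```

    In this toy run, compound_A is a strong Nav1.2 binder but is rejected as a likely P-gp substrate, which is exactly the failure mode the second filter is there to catch.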

    21st Century Simulation: Exploiting High Performance Computing and Data Analysis

    This paper identifies, defines, and analyzes the limitations imposed on Modeling and Simulation by outmoded paradigms in computer utilization and data analysis. The authors then discuss two emerging capabilities to overcome these limitations: High Performance Parallel Computing and Advanced Data Analysis. First, parallel computing, in supercomputers and Linux clusters, has proven effective by providing users an advantage in computing power; this has been characterized as a ten-year lead over the use of single-processor computers. Second, advanced data analysis techniques are both necessitated and enabled by this leap in computing power. JFCOM's JESPP project is one of the few simulation initiatives to effectively embrace these concepts. The challenges facing the defense analyst today have grown to include the need to consider operations among non-combatant populations, to focus on impacts to civilian infrastructure, to differentiate combatants from non-combatants, and to understand non-linear, asymmetric warfare. These requirements stretch both current computational techniques and data analysis methodologies. In this paper, documented examples and potential solutions are advanced, and the authors discuss the paths to successful implementation based on their experience. Reviewed technologies include parallel computing, cluster computing, grid computing, data logging, operations research, database advances, data mining, evolutionary computing, genetic algorithms, and Monte Carlo sensitivity analyses. The modeling and simulation community has significant potential to provide more opportunities for training and analysis. Simulations must include increasingly sophisticated environments, better emulations of foes, and more realistic civilian populations. Overcoming the implementation challenges will produce dramatically better insights for trainees and analysts. High Performance Parallel Computing and Advanced Data Analysis promise increased understanding of future vulnerabilities, helping avoid unneeded mission failures and unacceptable personnel losses. The authors set forth road maps for rapid prototyping and adoption of advanced capabilities, and discuss the beneficial impact of embracing these technologies as well as the risk mitigation required to ensure success.
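    The Monte Carlo sensitivity analyses mentioned above follow an "embarrassingly parallel" pattern: independent replications are farmed out to workers and the results aggregated. The sketch below uses a thread pool purely to show the shape of that pattern; JESPP-scale runs would use clusters and processes, and all numbers here are illustrative.

```python
# Toy parallel Monte Carlo sensitivity analysis (pattern sketch only).
import random
from concurrent.futures import ThreadPoolExecutor
from statistics import mean

def one_replication(seed):
    """One independent simulation replication with a perturbed input."""
    rng = random.Random(seed)       # per-replication RNG for reproducibility
    x = rng.uniform(0.8, 1.2)       # sampled (perturbed) input parameter
    return x * x                    # stand-in for a simulation output

# Farm the independent replications out to a pool of workers.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(one_replication, range(1000)))

# Aggregate across replications to summarise output sensitivity.
estimate = mean(results)
```

    Because each replication is seeded independently and shares no state, scaling this pattern up is mostly a scheduling and data-logging problem, which is where the cluster and data-analysis technologies reviewed in the paper come in.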