Construction safety and digital design: a review
As digital technologies become widely used in designing buildings and infrastructure, questions arise about
their impacts on construction safety. This review explores relationships between construction safety and
digital design practices with the aim of fostering and directing further research. It surveys state-of-the-art
research on databases, virtual reality, geographic information systems, 4D CAD, building information
modeling and sensing technologies, finding various digital tools for addressing safety issues in the
construction phase, but few tools to support design for construction safety. It also considers a literature on
safety critical, digital and design practices that raises a general concern about "mindlessness" in the use of
technologies, and has implications for the emerging research agenda around construction safety and digital
design. Bringing these strands of literature together suggests new kinds of interventions, such as the
development of tools and processes for using digital models to promote mindfulness through multi-party
collaboration on safety.
Collaborative Engineering Environments. Two Examples of Process Improvement
Companies are recognising that innovative processes are determining factors in competitiveness. Two examples from projects in aircraft development describe the introduction of collaborative engineering environments as a way to improve engineering processes. A multi-disciplinary simulation environment integrates models from all disciplines involved in a common functional structure. Quick configuration for specific design problems and powerful feedback / visualisation capabilities enable engineering teams to concentrate on the integrated behaviour of the design. An engineering process management system allows engineering teams to work concurrently in tasks, following a defined flow of activities, applying tools on a shared database. Automated management of workspaces including data consistency enables engineering teams to concentrate on the design activities. The huge amount of experience in companies must be transformed for effective application in engineering processes. Compatible concepts, notations and implementation platforms make tangible knowledge like models and algorithms accessible. Computer-based design management makes knowledge on engineering processes and methods explicit
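The engineering process management pattern the abstract describes (teams working through a defined flow of activities against a shared database) can be sketched as a toy dependency-ordered task runner. The `Task`/`run_flow` names and the geometry/loads/stress flow are illustrative inventions, not the system the paper presents:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Task:
    name: str
    depends_on: List[str]          # names of tasks that must finish first
    action: Callable[[Dict], None]  # reads/writes the shared database

def run_flow(tasks: List[Task], shared_db: Dict) -> List[str]:
    """Execute tasks in dependency order against a shared database."""
    done, order, pending = set(), [], list(tasks)
    while pending:
        ready = [t for t in pending if set(t.depends_on) <= done]
        if not ready:
            raise RuntimeError("cyclic or unsatisfiable dependencies")
        for t in ready:
            t.action(shared_db)  # tasks communicate only via the shared database
            done.add(t.name)
            order.append(t.name)
            pending.remove(t)
    return order

# Hypothetical three-task flow: two independent tasks feed a third.
db = {}
flow = [
    Task("stress", ["geometry", "loads"],
         lambda d: d.update(stress=d["force"] / d["area"])),
    Task("geometry", [], lambda d: d.update(area=2.0)),
    Task("loads", [], lambda d: d.update(force=10.0)),
]
order = run_flow(flow, db)  # geometry and loads both run before stress
```

Automated workspace management in the real system would add locking and consistency checks around the shared data; the flow-ordering idea is the same.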
Learning occupants' indoor comfort temperature through a Bayesian inference approach for office buildings in United States
A carefully chosen indoor comfort temperature as the thermostat set-point is key to optimizing building energy use and occupants' comfort and well-being. ASHRAE Standard 55 and ISO Standard 7730 use the PMV-PPD model or the adaptive comfort model, both based on small or outdated sample data, which raises questions about whether and how occupant thermal comfort temperature ranges should be revised using more recent, larger datasets. In this paper, a Bayesian inference approach is used to derive new occupant comfort temperature ranges for U.S. office buildings from the ASHRAE Global Thermal Comfort Database; Bayesian inference can express uncertainty and incorporate prior knowledge. The comfort temperatures were found to be higher and less variable in cooling mode than in heating mode, with substantially overlapping ranges between the two modes. The comfort operative temperature of occupants varies between 21.9 and 25.4 °C in cooling mode with a median of 23.7 °C, and between 20.5 and 24.9 °C in heating mode with a median of 22.7 °C. These ranges are similar to the current ASHRAE Standard 55 in heating mode but 2–3 °C lower in cooling mode. The results of this study could be adopted as more realistic thermostat set-points in building design, operation, control optimization, energy performance analysis, and policymaking.
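The Bayesian updating step can be sketched minimally with a conjugate normal model for the mean comfort temperature (known observation variance) — a deliberate simplification of the paper's actual approach, with purely illustrative numbers rather than the ASHRAE database:

```python
import statistics

def posterior_normal(prior_mean, prior_var, data, noise_var):
    """Conjugate normal-normal update for a mean comfort temperature (°C).

    The posterior blends the prior belief with the observed sample,
    weighted by their respective precisions (inverse variances).
    """
    n = len(data)
    data_mean = statistics.fmean(data)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mean = post_var * (prior_mean / prior_var + n * data_mean / noise_var)
    return post_mean, post_var

# Hypothetical cooling-mode comfort observations (°C); a vague prior at 24 °C.
obs = [23.1, 24.0, 23.5, 24.2, 23.8, 23.6]
mean, var = posterior_normal(prior_mean=24.0, prior_var=4.0,
                             data=obs, noise_var=1.0)
```

As more observations arrive, the posterior variance shrinks and the mean moves toward the data — which is how the prior (e.g. an existing standard) gets revised by a larger dataset.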
Version control of pathway models using XML patches
<p>Background: Computational modelling has become an important tool in understanding biological systems such as signalling pathways. With an increase in the size and complexity of models comes a need for techniques to manage model versions and their relationship to one another. Model version control for pathway models shares some of the features of software version control but has a number of differences that warrant a specific solution.</p>
<p>Results: We present a model version control method, along with a prototype implementation, based on XML patches. We show its application to the EGF/RAS/RAF pathway.</p>
<p>Conclusion: Our method allows quick and convenient storage of a wide range of model variations and enables a thorough explanation of these variations. Without such methods, development is slow, cumbersome, and prone to frustration and human error.</p>
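The patch idea — storing a model variant as the difference from a base XML document rather than as a full copy — can be sketched with the standard library. This element-attribute diff is a simplified stand-in for the paper's XML patch format; the `parameter` markup is a hypothetical pathway-model fragment:

```python
import xml.etree.ElementTree as ET

def diff_params(base_xml: str, new_xml: str) -> dict:
    """Compute a minimal patch: parameter id -> changed value."""
    base = {p.get("id"): p.get("value")
            for p in ET.fromstring(base_xml).iter("parameter")}
    new = {p.get("id"): p.get("value")
           for p in ET.fromstring(new_xml).iter("parameter")}
    return {k: v for k, v in new.items() if base.get(k) != v}

def apply_patch(base_xml: str, patch: dict) -> str:
    """Rebuild a model version by applying a patch to the base document."""
    root = ET.fromstring(base_xml)
    for p in root.iter("parameter"):
        if p.get("id") in patch:
            p.set("value", patch[p.get("id")])
    return ET.tostring(root, encoding="unicode")

v1 = '<model><parameter id="k1" value="0.1"/><parameter id="k2" value="2.0"/></model>'
v2 = '<model><parameter id="k1" value="0.1"/><parameter id="k2" value="2.5"/></model>'
patch = diff_params(v1, v2)          # only k2 changed
v2_rebuilt = apply_patch(v1, patch)  # base + patch reproduces the variant
```

Storing only the patch keeps many variants cheap, and the patch itself documents exactly what changed between versions.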
On the Ionisation Fraction in Protoplanetary Disks II: The Effect of Turbulent Mixing on Gas--phase Chemistry
We calculate the ionisation fraction in protostellar disk models using two
different gas-phase chemical networks, and examine the effect of turbulent
mixing by modelling the diffusion of chemical species vertically through the
disk. The aim is to determine in which regions of the disk gas can couple to a
magnetic field and sustain MHD turbulence. We find that the effect of diffusion
depends crucially on the elemental abundance of heavy metals (magnesium)
included in the chemical model. In the absence of heavy metals, diffusion has
essentially no effect on the ionisation structure of the disks, as the
recombination time scale is much shorter than the turbulent diffusion time
scale. When metals are included with an elemental abundance above a threshold
value, the diffusion can dramatically reduce the size of the magnetically
decoupled region, or even remove it altogether. For a complex chemistry the
elemental abundance of magnesium required to remove the dead zone is 10^-10 -
10^-8. We also find that diffusion can modify the reaction pathways, giving
rise to dominant species when diffusion is switched on that are minor species
when diffusion is absent. This suggests that there may be chemical signatures
of diffusive mixing that could be used to indirectly detect turbulent activity
in protoplanetary disks. We find examples of models in which the dead zone in
the outer disk region is rendered deeper when diffusion is switched on. Overall
these results suggest that global MHD turbulence in protoplanetary disks may be
self-sustaining under favourable circumstances, as turbulent mixing can help
maintain the ionisation fraction above that necessary to ensure good coupling
between the gas and magnetic field.
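The abstract's key argument is a timescale comparison: mixing only matters when recombination is slower than vertical diffusion. A sketch of that comparison, with order-of-magnitude numbers that are purely illustrative (not taken from the paper's disk models):

```python
def t_diffusion(H_cm: float, alpha_ss: float, c_s: float) -> float:
    """Vertical turbulent mixing time ~ H^2 / D, with D ~ alpha_ss * c_s * H."""
    return H_cm / (alpha_ss * c_s)

def t_recombination(alpha_rec: float, n_e: float) -> float:
    """Electron recombination time ~ 1 / (alpha_rec * n_e)."""
    return 1.0 / (alpha_rec * n_e)

H = 7.5e11  # disk scale height in cm (illustrative)
t_mix = t_diffusion(H, alpha_ss=1e-2, c_s=1e5)  # ~2.4e1 yr

# Metal-free gas: dissociative recombination of molecular ions is fast,
# so the ionisation structure relaxes long before mixing can act.
t_rec_molecular = t_recombination(alpha_rec=1e-7, n_e=1e3)

# With gas-phase magnesium: radiative recombination of Mg+ is slow,
# so vertical mixing can redistribute charge and shrink the dead zone.
t_rec_metal = t_recombination(alpha_rec=1e-12, n_e=1e1)
```

The ordering t_rec_molecular << t_mix << t_rec_metal is what makes the metal abundance the deciding factor in whether diffusion changes the ionisation structure.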
Searching for New Leads to Treat Epilepsy: Target-Based Virtual Screening for the Discovery of Anticonvulsant Agents
The purpose of this investigation is to contribute to the development of new anticonvulsant drugs to treat patients with refractory epilepsy. We applied a virtual screening protocol that involved searching molecular databases of new compounds and known drugs for small molecules that interact with the open conformation of the Nav1.2 pore. As the 3D structure of human Nav1.2 is not available, we first assembled 3D models of the target in closed and open conformations. After the virtual screening, the resulting candidates were submitted to a second virtual filter to find compounds with better chances of being effective for the treatment of P-glycoprotein (P-gp) mediated resistant epilepsy. Again, we built a model of the 3D structure of human P-gp, and we validated the docking methodology selected to propose the best candidates, which were experimentally tested on Nav1.2 channels by patch clamp techniques and in vivo by the maximal electroshock seizure (MES) test. Patch clamp studies allowed us to corroborate that our candidates, drugs used for the treatment of other pathologies such as Ciprofloxacin, Losartan, and Valsartan, exhibit inhibitory effects on Nav1.2 channel activity. Additionally, a compound synthesized in our lab, N,N′-diphenethylsulfamide, interacts with the target and also triggers significant Nav1.2 channel inhibitory action. Finally, in vivo studies confirmed the anticonvulsant action of Valsartan, Ciprofloxacin, and N,N′-diphenethylsulfamide.
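The two-stage filtering logic — keep compounds that dock well to the Nav1.2 open-pore model, then drop those predicted to be strong P-gp substrates — can be sketched as a simple score filter. Compound names, scores, and cutoffs below are all hypothetical; real screens use docking engines and validated thresholds:

```python
def two_stage_screen(candidates, nav_cutoff=-8.0, pgp_cutoff=-7.0):
    """Two-stage virtual screen on precomputed docking scores (kcal/mol,
    more negative = tighter predicted binding; cutoffs are illustrative).

    Stage 1: keep compounds that bind the Nav1.2 open-pore model well.
    Stage 2: discard those that also bind P-gp tightly, since strong
             P-gp substrates risk being effluxed in resistant epilepsy.
    """
    stage1 = [c for c in candidates if c["nav12_score"] <= nav_cutoff]
    return [c for c in stage1 if c["pgp_score"] > pgp_cutoff]

candidates = [
    {"name": "cmpdA", "nav12_score": -9.1, "pgp_score": -5.0},  # passes both
    {"name": "cmpdB", "nav12_score": -9.5, "pgp_score": -8.2},  # strong P-gp binder
    {"name": "cmpdC", "nav12_score": -6.0, "pgp_score": -4.0},  # weak at Nav1.2
]
hits = two_stage_screen(candidates)
```

In the study, survivors of such filtering were then tested experimentally by patch clamp and the MES test; the virtual filter only prioritizes which compounds reach the bench.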
21st Century Simulation: Exploiting High Performance Computing and Data Analysis
This paper identifies, defines, and analyzes the limitations imposed on Modeling and Simulation by outmoded
paradigms in computer utilization and data analysis. The authors then discuss two emerging capabilities to
overcome these limitations: High Performance Parallel Computing and Advanced Data Analysis. First, parallel
computing, in supercomputers and Linux clusters, has proven effective by providing users an advantage in
computing power. This has been characterized as a ten-year lead over the use of single-processor computers.
Second, advanced data analysis techniques are both necessitated and enabled by this leap in computing power.
JFCOM's JESPP project is one of the few simulation initiatives to effectively embrace these concepts. The
challenges facing the defense analyst today have grown to include the need to consider operations among non-combatant
populations, to focus on impacts to civilian infrastructure, to differentiate combatants from non-combatants,
and to understand non-linear, asymmetric warfare. These requirements stretch both current
computational techniques and data analysis methodologies. In this paper, documented examples and potential
solutions will be advanced. The authors discuss the paths to successful implementation based on their experience.
Reviewed technologies include parallel computing, cluster computing, grid computing, data logging, OpsResearch,
database advances, data mining, evolutionary computing, genetic algorithms, and Monte Carlo sensitivity analyses.
The modeling and simulation community has significant potential to provide more opportunities for training and
analysis. Simulations must include increasingly sophisticated environments, better emulations of foes, and more
realistic civilian populations. Overcoming the implementation challenges will produce dramatically better insights,
for trainees and analysts. High Performance Parallel Computing and Advanced Data Analysis promise increased
understanding of future vulnerabilities to help avoid unneeded mission failures and unacceptable personnel losses.
The authors set forth road maps for rapid prototyping and adoption of advanced capabilities. They discuss the
beneficial impact of embracing these technologies, as well as the risk mitigation required to ensure success.
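Among the reviewed techniques, Monte Carlo sensitivity analysis is the most directly sketchable, and it also shows why parallel computing pays off: every sample is independent. The toy `simulation` below is a hypothetical stand-in for a real simulation run:

```python
import random

def simulation(x: float, y: float) -> float:
    """Toy stand-in for one simulation run (illustrative only)."""
    return 3.0 * x + 0.5 * y ** 2

def variance(values):
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / len(values)

def one_at_a_time_sensitivity(n=20_000, seed=42):
    """Vary each input alone over U(0, 1) and measure output variance.
    Samples are independent, so both loops parallelize trivially
    (e.g. multiprocessing.Pool.map over chunks of sample indices)."""
    rng = random.Random(seed)
    var_x = variance([simulation(rng.random(), 0.5) for _ in range(n)])
    var_y = variance([simulation(0.5, rng.random()) for _ in range(n)])
    return var_x, var_y

vx, vy = one_at_a_time_sensitivity()  # vx >> vy: output is dominated by x
```

Scaling this from thousands of toy samples to millions of full simulation runs is exactly the gap that High Performance Parallel Computing closes.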