
    Volunteer tourism: Evidence of cathartic tourist experiences

    The study involved in-depth interviews with participants of an Australian non-government organization (NGO) that organizes projects in which young volunteers aged between 17 and 26 years from Australia and New Zealand participate in welfare projects with partner NGOs in developing countries. The welfare projects provide on-the-ground assistance to communities; they may not lead to longer-term sustainable development through skills training, but they engage the volunteers and the community in a mutual exchange. Typically, participants are engaged in short-term courses in health and hygiene, micro-enterprise management skills, assisting in community health projects, community service with children with disabilities or orphans, painting, construction of school playgrounds and classrooms, guest teaching in schools, cultural exchange and disaster relief. The Australian NGO provides no financial assistance for participants; it primarily organizes and facilitates the travel, project and community work. Each project lasts between two and four weeks and is thus short-term in duration. As such, participants can be considered 'shallow volunteer tourists' (Callanan and Thomas 2005).

    Redesigning OP2 Compiler to Use HPX Runtime Asynchronous Techniques

    Maximizing the level of parallelism in applications can be achieved by minimizing overheads due to load imbalances and waiting time due to memory latencies. Compiler optimization is one of the most effective solutions to this problem. The compiler can detect the data dependencies in an application and analyze specific sections of code for parallelization potential. However, these compiler techniques are usually applied at compile time, so they rely on static analysis, which is insufficient for achieving maximum parallelism and the desired application scalability. One solution to this challenge is the use of runtime methods, implemented by delaying a certain amount of code analysis until runtime. In this research, we improve the performance of parallel applications generated by the OP2 compiler by leveraging HPX, a C++ runtime system, to provide runtime optimizations. These optimizations include asynchronous tasking, loop interleaving, dynamic chunk sizing, and data prefetching. The results were evaluated using an Airfoil application, which showed a 40-50% improvement in parallel performance.
    Comment: 18th IEEE International Workshop on Parallel and Distributed Scientific and Engineering Computing (PDSEC 2017).
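    The abstract names futures-style asynchronous tasking and dynamic chunk sizing among the runtime optimizations. The sketch below illustrates those two ideas generically in Python with concurrent.futures; it is not OP2-generated or HPX code (HPX exposes an analogous futures interface in C++ via hpx::async and hpx::future), and the function names and chunking heuristic are illustrative assumptions.

```python
# Conceptual sketch (not OP2/HPX code): futures-based asynchronous tasking
# over a parallel loop, with a dynamically chosen chunk size.
from concurrent.futures import ThreadPoolExecutor
import math

def process_chunk(data, start, end):
    # Stand-in for one loop body applied to a block of iterations.
    return sum(math.sqrt(x) for x in data[start:end])

def parallel_loop(data, workers=4, chunks_per_worker=8):
    # Dynamic chunk sizing: derive the chunk size from the problem size and
    # worker count at runtime rather than fixing it at compile time.
    chunk = max(1, len(data) // (workers * chunks_per_worker))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(process_chunk, data, i, i + chunk)
                   for i in range(0, len(data), chunk)]
        # Results are gathered asynchronously from futures instead of at a
        # single global barrier, letting the scheduler interleave work.
        return sum(f.result() for f in futures)

if __name__ == "__main__":
    print(parallel_loop(list(range(1_000_000))))
```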

    REinforcement learning based Adaptive samPling: REAPing Rewards by Exploring Protein Conformational Landscapes

    One of the key limitations of molecular dynamics simulations is the computational intractability of sampling protein conformational landscapes associated with either large system size or long timescales. To overcome this bottleneck, we present the REinforcement learning based Adaptive samPling (REAP) algorithm, which aims to efficiently sample conformational space by learning the relative importance of each reaction coordinate as it samples the landscape. To achieve this, the algorithm uses concepts from reinforcement learning, a subset of machine learning, rewarding sampling along important degrees of freedom and disregarding those that do not facilitate exploration or exploitation. We demonstrate the effectiveness of REAP by comparing its sampling to long continuous MD simulations and to least-counts adaptive sampling on two model landscapes (L-shaped and circular) and on realistic systems such as alanine dipeptide and Src kinase. In all four systems, the REAP algorithm consistently explores conformational space faster than the other two methods when comparing the expected extent of the landscape discovered for a given amount of simulation time. The key advantage of REAP is its on-the-fly estimation of the importance of collective variables, which makes it particularly useful for systems with limited structural information.
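    As a rough illustration of the reward idea described above, the sketch below scores candidate restart structures by their weighted, standardized deviation from the mean of previously sampled collective-variable values and seeds new simulations from the highest-reward states. The function names, the specific reward form, and the toy data are assumptions for illustration, not the authors' implementation.

```python
# Illustrative REAP-style reward: each candidate structure is scored by how far
# it lies from the mean of the sampled data along each collective variable, in
# units of standard deviation; the per-variable weights are what gets learned.
import numpy as np

def reap_reward(candidates, sampled, weights):
    """candidates: (n, d) CV values of possible restart states
       sampled:    (m, d) CV values of everything sampled so far
       weights:    (d,) non-negative importance weights"""
    mu = sampled.mean(axis=0)
    sigma = sampled.std(axis=0) + 1e-12
    return (weights * np.abs(candidates - mu) / sigma).sum(axis=1)

def pick_restarts(candidates, sampled, weights, n_restart=5):
    r = reap_reward(candidates, sampled, weights)
    return np.argsort(r)[-n_restart:]  # highest-reward states seed new runs

# Toy usage on a hypothetical 2-D landscape.
rng = np.random.default_rng(0)
sampled = rng.normal(size=(200, 2))
candidates = rng.normal(size=(50, 2))
weights = np.array([0.5, 0.5])
print(pick_restarts(candidates, sampled, weights))
```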

    The effect of mark enhancement techniques on the presumptive and confirmatory tests for blood

    An investigation into the effects of physical and chemical enhancement on subsequent presumptive and confirmatory tests for human blood is presented. Human blood was deposited onto porous (white 80 gsm paper and brown envelope) and non-porous (tile and linoleum) substrates in a depletion series (30 depletions on non-porous and 20 on porous substrates) and subjected to three ageing periods: 1, 7 and 28 days. A number of enhancement techniques were tested [fluorescence, black magnetic powder (BMP), iron-oxide black powder suspension (PS), cyanoacrylate (CA) fuming, acid violet 17 (AV17), acid yellow 7 (AY7), ninhydrin, DFO and Bluestar Forensic Magnum (BFM) luminol] to evaluate their potential effects on subsequent presumptive and confirmatory tests. AV17 and Bluestar provided the best enhancement and fully enhanced all depletions in the series. The sensitivity of the Kastle-Meyer (KM) test (presumptive) and of the Takayama and RSID-Blood tests (confirmatory) was initially investigated to determine the range of detectable depletions. The KM test detected all depletions, whereas the Takayama test detected up to depletion 6 and RSID-Blood detected up to depletion 20 (paper), 10 (envelope), 15 (tile) and 9 (lino). The abilities of these tests to detect blood after enhancement were then observed. A number of techniques had little to no effect on any of the blood tests, whereas adverse effects were observed for others. Ninhydrin and CA fuming caused weak but instantaneous positive KM results, whereas methanol-based AV17 and AY7 delayed the reaction by as much as 1 min. The Takayama test was not very sensitive; its performance was therefore easily affected by enhancement, and negative results were often observed. RSID-Blood tests were largely unaffected by chemical enhancement, although a drop in positive results was observed for some of the techniques when compared to positive controls. Using a standard procedure for DNA extraction, all the tested blood samples (before and after enhancement) gave a detectable quantity of DNA and were successfully profiled. Of the 45 samples processed for DNA profiling, 44 gave full profiles, while the remaining sample showed allele drop-out at one or two loci.

    Effects of Microstructure Formation on the Stability of Vapor Deposited Glasses

    Glasses formed by physical vapor deposition (PVD) are an interesting new class of materials, exhibiting properties thought to be equivalent to those of glasses aged for thousands of years. Exerting control over the structure and properties of PVD glasses formed with different types of glass-forming molecules is now an emerging challenge. In this work, we study coarse-grained models of organic glass formers containing fluorocarbon tails of increasing length, corresponding to an increased tendency to form microstructures. We use simulated PVD to examine how the presence of microphase-separated domains in the supercooled liquid influences the ability to form stable glasses. The model suggests that increasing the molecular tail length results in decreased thermodynamic and kinetic stability of the molecules in PVD films. The reduced stability is further linked to the reduced ability of these molecules to equilibrate at the free surface during PVD. We find that as the tail length is increased, the relaxation time near the surface of supercooled equilibrium liquid films of these molecules is slowed and becomes essentially bulk-like, owing to the segregation of the fluorocarbon tails to the free surface. Surface diffusion is also markedly reduced due to clustering of the molecules at the surface. Based on these results, we propose a trapping mechanism in which tails are unable to move between local phase-separated domains on the relevant deposition time scales.

    Analyzing the Heat Transfer Rate of Nanostructures of Poly (Methyl Methacrylate) / Al2O3 Utilizing Molecular Dynamics Simulations

    The main methods for preventing fires are physical, chemical, or a combination of the two. Thermal diffusivity is one of the main thermophysical characteristics linked to chemical structure. The relationship between heat transport and heat resistance has been thoroughly established in the literature. Heat transmission can also be connected to various fire-retardant characteristics, such as maximum heat release or time to ignition, which rank among the most crucial factors in defining the potential fire hazard of a given material. The thermal stability and fire-retardant qualities of polymers are enhanced by metal oxides. In the present investigation, molecular dynamics simulations constructed using the single-atom approach are used to examine the effect of Al2O3 nanoparticles on the heat transfer of isotactic poly(methyl methacrylate). To examine the heat transfer rate of poly(methyl methacrylate) and the poly(methyl methacrylate)/Al2O3 nanocomposite, the heat capacity, density, and thermal conductivity were calculated over the 300-700 K range. Heat capacity can be calculated from fluctuation properties, while thermal conductivity was calculated through non-equilibrium simulation using Fourier's law. The results show that the Al2O3 nanoparticles increase the glass transition temperature, thermal conductivity, and thermal diffusivity of poly(methyl methacrylate) while decreasing its heat capacity.
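    For reference, the standard forms of the two relations named above (a fluctuation formula for heat capacity and Fourier's law for non-equilibrium thermal conductivity) are sketched below; the exact expressions and conventions used in the study may differ.

```latex
% Standard relations assumed for illustration, not quoted from the paper.
\begin{align}
  k   &= -\frac{J_q}{\partial T/\partial x}
        && \text{(heat flux over imposed temperature gradient, NEMD)} \\
  C_V &= \frac{\langle E^2 \rangle - \langle E \rangle^2}{k_B T^2}
        && \text{(energy-fluctuation formula, canonical ensemble)}
\end{align}
```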

    Analysis of 4-Nicotinamido-4-Oxo-2-Butenoic Acid's Electrochemical Polymerization as an Anti-Corrosion Layer on Stainless-Steel Alloys

    The 4-nicotinamido-4-oxo-2-butenoic acid monomer was electropolymerized on 316-grade stainless steel to produce poly(4-nicotinamido-4-oxo-2-butenoic acid). The structure and properties of the generated polymer layer were evaluated using SEM, cyclic voltammetry, and other techniques. The corrosion resistance of uncoated and coated stainless steel in a corrosive medium of 0.2 M HCl solution was examined using an electrochemical polarisation technique at temperatures ranging from 293 to 323 K. Nanomaterials such as nano zinc oxide and graphene were introduced into the monomer solutions at various concentrations to increase the corrosion resistance of the stainless-steel surface. According to the findings, adding nanocomponents to the polymeric coating increased its protective effectiveness. Thermodynamic and kinetic activation properties were also investigated. The protection efficiency and polarisation resistance values of the polymer coating decreased as the temperature rose, while the corrosion current density increased and the corrosion potential decreased. SEM and AFM experiments demonstrated the development of a protective coating on the surface of the 316-grade stainless steel.
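    As context for the polarisation quantities reported above, the commonly used relations for protection efficiency and polarisation resistance are sketched below; these are standard electrochemical expressions assumed here for illustration, not quoted from the paper.

```latex
% Standard polarisation relations (assumed): protection efficiency from
% corrosion current densities and polarisation resistance via Stern-Geary.
\begin{align}
  PE\% &= \frac{i_{\mathrm{corr}}^{0} - i_{\mathrm{corr}}}{i_{\mathrm{corr}}^{0}} \times 100 \\
  R_p  &= \frac{\beta_a \beta_c}{2.303\, i_{\mathrm{corr}} \left(\beta_a + \beta_c\right)}
\end{align}
```

    Here $i_{\mathrm{corr}}^{0}$ and $i_{\mathrm{corr}}$ are the corrosion current densities of the uncoated and coated steel, and $\beta_a$, $\beta_c$ are the anodic and cathodic Tafel slopes.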

    Maximum Entropy Technique and Regularization Functional for Determining the Pharmacokinetic Parameters in DCE-MRI

    This paper aims to solve the arterial input function (AIF) determination problem in dynamic contrast-enhanced MRI (DCE-MRI), an important linear ill-posed inverse problem, using the maximum entropy technique (MET) and regularization functionals. Estimating the pharmacokinetic parameters from DCE-MR image investigations is needed to obtain precise information about the AIF, the concentration of the contrast agent in the left ventricular blood pool measured over time. The main idea is therefore to show how to find a unique solution of a linear system of equations of the general form y = Ax + b, an ill-conditioned linear system that arises after discretization of the integral equations appearing in various tomographic image restoration and reconstruction problems. A new algorithm is described to estimate an appropriate probability distribution function for the AIF according to the MET and regularization functionals for the contrast agent concentration, within a Bayesian estimation approach used to estimate two different pharmacokinetic parameters. Moreover, applying the proposed approach to simulated and real breast tumor datasets with respect to pharmacokinetic factors indicates that Bayesian inference, which infers the uncertainties of the computed solutions and incorporates specific knowledge of the noise and errors, combined with the regularization functional of the maximum entropy problem, improves convergence behavior and leads to more consistent morphological and functional statistics and results. Finally, in comparison to the exponential distribution based on MET and Newton's method, or the Weibull distribution via MET and teaching-learning-based optimization (MET/TLBO) proposed in previous studies, the family of Gamma and Erlang distributions estimated by the new algorithm provides more appropriate and robust AIFs.
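    As a minimal illustration of combining a data-fidelity term with an entropy-type regularization functional for an ill-conditioned system y = Ax, the sketch below uses a general-purpose optimizer from scipy. The operator A, the noise level, the regularization weight, and the Shannon-Jaynes entropy form are illustrative assumptions; this is not the paper's Bayesian/MET algorithm or its Gamma/Erlang AIF model.

```python
# Entropy-regularized least squares for an ill-conditioned linear system,
# in the spirit of the maximum entropy technique described above.
import numpy as np
from scipy.optimize import minimize

def met_solve(A, y, lam=1e-2, prior=None):
    n = A.shape[1]
    m = prior if prior is not None else np.full(n, 1.0 / n)  # default model

    def objective(x):
        resid = A @ x - y
        # Negative Shannon-Jaynes entropy relative to the prior m acts as the
        # regularization functional; it also keeps the solution positive.
        ent = np.sum(x * (np.log(x / m) - 1.0) + m)
        return 0.5 * resid @ resid + lam * ent

    bounds = [(1e-10, None)] * n  # entropy requires x > 0
    res = minimize(objective, m.copy(), bounds=bounds, method="L-BFGS-B")
    return res.x

# Toy usage with a smoothing (hence ill-conditioned) forward operator.
rng = np.random.default_rng(1)
A = np.exp(-0.5 * (np.subtract.outer(np.arange(40), np.arange(40)) / 3.0) ** 2)
x_true = np.exp(-0.5 * ((np.arange(40) - 20) / 4.0) ** 2)
y = A @ x_true + 0.01 * rng.normal(size=40)
print(np.round(met_solve(A, y)[:5], 3))
```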