15 research outputs found

    Implementation and evaluation of the Level Set method: towards efficient and accurate simulation of wet etching for microengineering applications

    The use of atomistic methods, such as the Continuous Cellular Automaton (CCA), is currently regarded as a computationally efficient and experimentally accurate approach for the simulation of anisotropic etching of various substrates in the manufacture of Micro-Electro-Mechanical Systems (MEMS). However, when the features of the chemical process are modified, a time-consuming calibration process is needed to transform the new macroscopic etch rates into a corresponding set of atomistic rates. Furthermore, changing the substrate requires a labor-intensive effort to reclassify most atomistic neighborhoods. In this context, the Level Set (LS) method provides an alternative approach in which the macroscopic forces affecting the front evolution are applied directly at the discrete level, thus avoiding the need for reclassification and/or calibration. Correspondingly, we present a fully operational Sparse Field Method (SFM) implementation of the LS approach, discussing the algorithm in detail and providing a thorough characterization of the computational cost and simulation accuracy, including a comparison to the performance of the most recent CCA model. We conclude that the SFM implementation achieves accuracy similar to that of the CCA method, with fewer fluctuations in the etch front, while requiring roughly 4 times less memory. Although SFM can be up to 2 times slower than CCA for the simulation of anisotropic etchants, it can also be up to 10 times faster than CCA for isotropic etchants. In addition, we present a parallel, GPU-based implementation (gSFM) and compare it to an optimized, multicore CPU version (cSFM), demonstrating that the SFM algorithm can be successfully parallelized and the simulation times consequently reduced while preserving the accuracy of the simulations.
    Although modern multicore CPUs provide an acceptable option, the massively parallel architecture of modern GPUs is more suitable, as reflected by computational times for gSFM up to 7.4 times faster than for cSFM. (c) 2013 Elsevier B.V. All rights reserved.

    We thank the anonymous reviewers for their valuable comments and suggestions. This work has been supported by the Spanish FPI-MICINN BES-2011-045940 grant and the Ramón y Cajal Fellowship Program of the Spanish Ministry of Science and Innovation. We also acknowledge support by the JAE-Doc grant from the Junta para la Ampliación de Estudios program, co-funded by FSE, and the Professor Partnership Program of NVIDIA Corporation.

    Montoliu Álvaro, C.; Ferrando Jódar, N.; Gosálvez, M. Á.; Cerdá Boluda, J.; Colom Palero, R. J. (2013). Implementation and evaluation of the Level Set method: towards efficient and accurate simulation of wet etching for microengineering applications. Computer Physics Communications, 184(10), 2299-2309. https://doi.org/10.1016/j.cpc.2013.05.016
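    The front-propagation idea behind the LS method can be illustrated with a minimal sketch (our own simplified illustration, not the paper's SFM or gSFM implementation): the etch front is the zero contour of a level-set function phi, advanced under the equation phi_t + F·|grad phi| = 0 with a first-order Godunov upwind scheme, here for a constant (isotropic) normal speed F.

```python
import numpy as np

def evolve(phi, F=1.0, dx=1.0, dt=0.2, steps=10):
    """Advance a level-set function under constant outward normal speed F > 0."""
    for _ in range(steps):
        # One-sided differences in each direction (periodic boundaries via roll)
        dxm = (phi - np.roll(phi, 1, axis=0)) / dx   # backward x
        dxp = (np.roll(phi, -1, axis=0) - phi) / dx  # forward x
        dym = (phi - np.roll(phi, 1, axis=1)) / dx   # backward y
        dyp = (np.roll(phi, -1, axis=1) - phi) / dx  # forward y
        # Godunov upwind gradient magnitude for F > 0
        grad = np.sqrt(np.maximum(dxm, 0.0) ** 2 + np.minimum(dxp, 0.0) ** 2 +
                       np.maximum(dym, 0.0) ** 2 + np.minimum(dyp, 0.0) ** 2)
        phi = phi - dt * F * grad
    return phi

# Initialise phi as the signed distance to a circle of radius 5; the zero
# contour (the "etch front") expands outward as phi is evolved.
n = 64
y, x = np.mgrid[0:n, 0:n]
phi0 = np.sqrt((x - n / 2) ** 2 + (y - n / 2) ** 2) - 5.0
phi = evolve(phi0)
```

    The SFM discussed in the paper obtains the same evolution while updating only a narrow band of cells around the zero contour, which is the source of its memory advantage over full-grid level-set solvers.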

    Formulation, stabilisation and encapsulation of bacteriophage for phage therapy

    Against a backdrop of global antibiotic resistance and increasing awareness of the importance of the human microbiota, there has been resurgent interest in the potential use of bacteriophages for therapeutic purposes, known as phage therapy. A number of phage therapy phase I and II clinical trials have concluded and shown that phages do not present significant adverse safety concerns. These clinical trials used simple phage suspensions without any formulation, and phage stability was of secondary concern. Phages have limited stability in solution and undergo a significant drop in titre during processing and storage, which is unacceptable if phages are to become regulated pharmaceuticals, where stable dosage and well-defined pharmacokinetics and pharmacodynamics are de rigueur. Animal studies have shown that the efficacy of phage therapy depends on the phage concentration (i.e. the dose) delivered at the site of infection and on the phages' ability to target and kill bacteria, arresting bacterial growth and clearing the infection. In addition, in vitro and animal studies have shown the importance of using phage cocktails rather than single-phage preparations to achieve better therapy outcomes. The in vivo reduction of phage concentration due to interactions with host antibodies or other clearance mechanisms may necessitate repeated dosing of phages, or sustained-release approaches. Modelling of phage-bacterium population dynamics reinforces these points. Surprisingly little attention has been devoted to the effect of formulation on phage therapy outcomes, given the need for phage cocktails, where each phage within a cocktail may require a significantly different formulation to retain a high enough infective dose. This review first looks at the clinical needs and challenges (informed through a review of key animal studies evaluating phage therapy) associated with the treatment of acute and chronic infections and at the drivers for phage encapsulation.
    An important driver for formulation and encapsulation is the shelf life and storage of phage to ensure reproducible dosages. Other drivers include the formulation of phage for encapsulation in micro- and nanoparticles for effective delivery, and encapsulation in stimuli-responsive systems for triggered, controlled or sustained release at the targeted site of infection. Encapsulation of phage (e.g. in liposomes) may also be used to increase the circulation time of phage for treating systemic infections, for prophylactic treatment, or to treat intracellular infections. We then document approaches used in the published literature on the formulation and stabilisation of phage for storage and on the encapsulation of bacteriophage in micro- and nanostructured materials using freeze drying (lyophilisation), spray drying, emulsions (e.g. ointments), polymeric microparticles, nanoparticles and liposomes. As phage therapy moves forward towards phase III clinical trials, the review concludes by looking at promising new approaches for micro- and nanoencapsulation of phages and at how these may address gaps in the field.
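    The point that efficacy depends on the delivered dose can be made concrete with a minimal phage-bacterium population model (our own illustrative sketch with hypothetical parameters, not a model from the review): logistic bacterial growth minus phage infection, and phage replication minus clearance, integrated with a forward-Euler step.

```python
def simulate(B0, P0, r=0.5, K=1e9, a=1e-9, burst=100, m=0.1,
             dt=0.01, t_end=48.0):
    """Euler integration of a simple phage-bacterium model.

      dB/dt = r*B*(1 - B/K) - a*B*P      (bacteria: growth minus infection)
      dP/dt = burst*a*B*P - m*P          (phage: replication minus clearance)

    All parameter values are hypothetical, chosen only for illustration.
    """
    B, P = B0, P0
    for _ in range(int(t_end / dt)):
        dB = r * B * (1 - B / K) - a * B * P
        dP = burst * a * B * P - m * P
        B = max(B + dt * dB, 0.0)
        P = max(P + dt * dP, 0.0)
    return B, P

B_untreated, _ = simulate(B0=1e6, P0=0)    # no phage: bacteria approach K
B_treated, _ = simulate(B0=1e6, P0=1e8)    # high initial phage dose
```

    In this toy model the clearance term m is what can make repeated dosing or sustained release necessary: if phage are cleared faster than they replicate at the infection site, the bacterial population escapes.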

    LGEM: A lattice Boltzmann economic model for income distribution and tax regulation

    In this paper, a new econophysics model based on lattice Boltzmann automata is presented. The model represents economic agents (people, countries, ...) as particles of a gas moving on a 2D lattice and interacting with each other. Economic transactions are modeled as particle-to-particle interactions in which money is conserved. If only these pairwise transactions are considered (a free market), the money distribution quickly converges to a Boltzmann-Gibbs distribution. However, the model also introduces a third step of global income redistribution that can be used to explore tax regulation strategies. The model is presented, and some examples of income distribution are given. One of the most interesting features of the model is that it is completely discrete, so it can be implemented exactly on any computational resource, leading to very fast yet powerful simulations, especially when parallelization resources are available. Some results of these simulations, as well as performance data, are given.

    This work has been supported by the Spanish CICYT FIS2010-21216-C02-02 grant and the Spanish FPI-MICINN grant BES-2011-045940.

    Cerdá Boluda, J.; Montoliu Álvaro, C.; Colom Palero, R. J. (2013). LGEM: A lattice Boltzmann economic model for income distribution and tax regulation. Mathematical and Computer Modelling, 57(7-8), 1648-1655. https://doi.org/10.1016/j.mcm.2011.10.051
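    The convergence to a Boltzmann-Gibbs distribution can be reproduced with a minimal money-exchange sketch (our own simplified illustration, without the lattice or the tax step of LGEM): agents meet in random pairs and split their combined money at random, so total money is conserved and the stationary distribution of money is exponential.

```python
import random

def simulate(n_agents=1000, money=100.0, steps=200_000, seed=42):
    """Random pairwise money-conserving exchanges between agents."""
    rng = random.Random(seed)
    m = [money] * n_agents  # everyone starts equal
    for _ in range(steps):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i == j:
            continue
        pot = m[i] + m[j]          # money is conserved in each transaction
        share = rng.random()       # random split of the combined money
        m[i], m[j] = share * pot, (1.0 - share) * pot
    return m

m = simulate()
# For an exponential (Boltzmann-Gibbs) distribution, the median lies below
# the mean and roughly 63% of agents hold less than the average wealth.
```

    A tax-regulation step in the spirit of the paper could be layered on top by periodically collecting a fraction of each agent's money and redistributing it equally.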

    Does the "Silver bullet" lose its shine over the time? Assessment of loss of lithium response in a preliminary sample of bipolar disorder outpatients

    Background: Though often perceived as a "silver bullet" treatment for bipolar disorder (BD), lithium has seldom been reported to lose its efficacy over time. Objective: The aim of the present study was to assess cases of refractoriness towards restarted lithium in BD patients who failed to preserve maintenance. Method: Treatment trajectories associated with re-instituted lithium following loss of achieved lithium-based maintenance in BD were retrospectively reviewed for 37 BD-I patients (median age 52 years; F:M = 17:20, or 46% of the total) over an 8.1-month period on average. Results: In our sample, only 4 cases (roughly 11% of the total; F:M = 2:2) developed refractoriness towards lithium after its discontinuation. Thirty-three controls (F:M = 15:18) maintained lithium response at the time of re-institution. No statistically significant difference between cases and controls was observed with respect to a number of demographic and clinical features, except for the time spent before the first-ever lithium trial (8.5 vs. 3 years; U=24.5, Z=-2.048, p=.041) and the length of lithium discontinuation until the new therapeutic attempt (5.5 vs. 2 years; U=8, Z=-2.927, p=.003) between cases and controls, respectively. Tapering off of lithium was significantly faster among cases than controls (1 vs. 7 days; U=22, Z=-2.187), though both subgroups had worrisomely high rates of poor adherence overall. Conclusion: Although the intrinsic limitations of the present preliminary assessment hamper the validity and generalizability of the overall results, given the clinical relevance of the topic, further prospective research is warranted. The eventual occurrence of lithium refractoriness may indeed be associated with peculiar course trajectories and therapeutic outcomes, ultimately urging prescribing clinicians to put effort into preserving maintenance in BD, in the absence of any conclusive research insight on the matter. © Fornaro et al.
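    The group comparisons above rely on the Mann-Whitney U statistic; as a brief illustration of how U is computed (on made-up numbers, not the study's data), it counts, over all case-control pairs, how often one group's value exceeds the other's, with ties counted as half:

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U: the smaller of the two pairwise-dominance counts.

    U1 counts pairs (xi, yj) with xi > yj (ties count 0.5); U2 = n*m - U1.
    The reported statistic is min(U1, U2).
    """
    u1 = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u1 += 1.0
            elif xi == yj:
                u1 += 0.5
    return min(u1, len(x) * len(y) - u1)

# Fully separated groups give the extreme value U = 0
u_extreme = mann_whitney_u([1, 2, 3], [4, 5, 6])
# Interleaved groups give a larger U
u_mixed = mann_whitney_u([1, 3], [2, 4])
```

    For the sample sizes in the study (4 cases vs. 33 controls), the Z values quoted in the abstract come from the usual normal approximation of U.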

    Deciding life-cycle inheritance on Petri nets

    One of the key issues of object-oriented modeling is inheritance. It allows for the definition of a subclass that inherits features from some superclass. When considering the dynamic behavior of objects, as captured by their life cycles, there is no general agreement on the meaning of inheritance. Basten and Van der Aalst introduced the notion of life-cycle inheritance for this purpose. Unfortunately, the search tree needed for deciding life-cycle inheritance is in general prohibitively large. This paper presents a backtracking algorithm to decide life-cycle inheritance on Petri nets. The algorithm uses structural properties of both the base life cycle and the potential sub life cycle to prune the search tree. Test cases show that the results are promising. Keywords: object-orientation, workflow, life-cycle inheritance, branching bisimilarity, backtracking, Petri nets, structural properties, T-invariants.
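    The objects being compared here are life cycles modelled as Petri nets; a minimal marking-and-firing sketch (our own illustration with a hypothetical two-transition net, not the paper's algorithm) fixes the semantics over which life-cycle inheritance is decided:

```python
from collections import Counter

class PetriNet:
    """A Petri net as a map: transition name -> (input places, output places)."""

    def __init__(self, transitions):
        self.transitions = transitions

    def enabled(self, marking, t):
        # A transition is enabled if every input place holds enough tokens.
        ins, _ = self.transitions[t]
        return all(marking.get(p, 0) >= n for p, n in Counter(ins).items())

    def fire(self, marking, t):
        # Firing consumes tokens from input places, produces them on outputs.
        assert self.enabled(marking, t)
        ins, outs = self.transitions[t]
        m = Counter(marking)
        m.subtract(Counter(ins))
        m.update(Counter(outs))
        return {p: n for p, n in m.items() if n > 0}

# A tiny hypothetical life cycle: "create" then "process"
net = PetriNet({
    "create":  ([], ["ready"]),
    "process": (["ready"], ["done"]),
})
m0 = {}
m1 = net.fire(m0, "create")    # token appears on "ready"
m2 = net.fire(m1, "process")   # token moves to "done"
```

    Deciding life-cycle inheritance then amounts to relating the reachable firing behaviours of a base net and a candidate sub net (up to branching bisimilarity), which is the state space the paper's backtracking algorithm prunes using structural properties such as T-invariants.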