104 research outputs found

    Numerical Methods for Solving Convection-Diffusion Problems

    Convection-diffusion equations provide the basis for describing heat and mass transfer phenomena as well as processes of continuum mechanics. To handle flows in porous media, the fundamental issue is to model the convective transport of individual phases correctly. Moreover, for compressible media, the pressure equation itself is a time-dependent convection-diffusion equation. For different problems, a convection-diffusion equation may be written in various forms. The most popular formulation of convective transport employs the divergent (conservative) form. In some cases, the nondivergent (characteristic) form is preferable. The so-called skew-symmetric form of the convective transport operator, which is the half-sum of the operators in the divergent and nondivergent forms, is of great interest in some applications. Here we discuss the basic classes of discretization in space: finite difference schemes on rectangular grids, approximations on general polyhedra (the finite volume method), and finite element procedures. The key properties of the discrete operators are studied for convective and diffusive transport. We emphasize the problem of constructing approximations of the convection and diffusion operators that satisfy the maximum principle at the discrete level; such approximations are called monotone. Two- and three-level schemes are investigated for transient problems. Unconditionally stable explicit-implicit schemes are developed for convection-diffusion problems. Stability conditions are obtained both in finite-dimensional Hilbert spaces and in Banach spaces, depending on the form in which the convection-diffusion equation is written.
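    To make the distinction between the forms concrete, the convective transport operator mentioned in the abstract can be written in the three standard ways below (the symbols v for the velocity field and u for the transported quantity are ours, not taken from the paper):

```latex
% Divergent (conservative), nondivergent (characteristic), and skew-symmetric
% forms of the convective transport operator; v is the velocity field and u
% the transported quantity (notation assumed here, not taken from the paper).
C_{\mathrm{div}}\, u = \nabla \cdot (\mathbf{v}\, u), \qquad
C_{\mathrm{char}}\, u = \mathbf{v} \cdot \nabla u, \qquad
C_{\mathrm{skew}}\, u = \tfrac{1}{2}\bigl(\nabla \cdot (\mathbf{v}\, u) + \mathbf{v} \cdot \nabla u\bigr).
```

    For a divergence-free velocity field the three forms coincide; the skew-symmetric operator additionally satisfies (C_skew u, u) = 0, which is what makes it convenient for deriving stability estimates in Hilbert spaces.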

    Features of Polymeric Structures by Surface-Selective Laser Sintering of Polymer Particles Using Water as Sensitizer

    The development of scaffolds with strictly specified properties is a key aspect of functional tissue regeneration, and it remains one of the greatest challenges for tissue engineering. This study aims to determine the feasibility of producing three-dimensional polylactide (PLA) scaffolds by surface-selective laser sintering (SSLS) for bone tissue regeneration. In this work, the authors also improved the adhesion properties of the PLA scaffolds, which are crucial for successful cellular growth and expansion. Thus, the SSLS method proved effective for designing three-dimensional porous scaffolds with differentiated mechanical properties. Keywords: regenerative medicine, scaffolds, polylactide, surface-selective laser sintering, tissue engineering

    Implementing EM and Viterbi algorithms for Hidden Markov Model in linear memory

    Abstract
    Background: The Baum-Welch learning procedure for Hidden Markov Models (HMMs) provides a powerful tool for tailoring HMM topologies to data for use in knowledge discovery and clustering. A linear-memory procedure recently proposed by Miklós, I. and Meyer, I.M. describes a memory-sparse version of the Baum-Welch algorithm with modifications to the original probabilistic table topologies that make memory use independent of sequence length (and linearly dependent on state number). The original description of the technique has some errors that we amend. We then compare the corrected implementation on a variety of data sets with conventional and checkpointing implementations.
    Results: We provide a correct recurrence relation for the emission parameter estimate and extend it to parameter estimates of the Normal distribution. To accelerate estimation of the prior state probabilities and to decrease memory use, we reverse the originally proposed forward sweep. We describe the different scaling strategies necessary in all real implementations of the algorithm to prevent underflow. We also describe our approach to a linear-memory implementation of the Viterbi decoding algorithm (with linearity in the sequence length, while memory use is approximately independent of state number). We demonstrate the use of the linear-memory implementation on an extended Duration Hidden Markov Model (DHMM) and on an HMM with a spike-detection topology. Comparing the various implementations of the Baum-Welch procedure, we find that the checkpointing algorithm produces the best overall tradeoff between memory use and speed. In cases where the sequence length is very large (for Baum-Welch) or the state number is very large (for Viterbi), the linear-memory methods outlined here may offer some utility.
    Conclusion: Our performance-optimized Java implementations of the Baum-Welch algorithm are available at http://logos.cs.uno.edu/~achurban. The described method and implementations will aid sequence alignment, gene structure prediction, HMM profile training, nanopore ionic flow blockade analysis, and many other domains that require efficient HMM training with EM.
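    As an illustration of the underflow issue the abstract refers to, the sketch below shows a per-step scaled forward recursion for a discrete-emission HMM. It is a minimal example in Python rather than the authors' Java code, and it is not their linear-memory algorithm; the parameters pi, A, and B are hypothetical stand-ins.

```python
import numpy as np

def scaled_forward(obs, pi, A, B):
    """obs: observation indices; pi: (N,) initial state probs;
    A: (N, N) transition matrix; B: (N, M) emission matrix.
    Returns the log-likelihood of the observation sequence."""
    alpha = pi * B[:, obs[0]]          # forward variables at t = 0
    c = alpha.sum()                    # scaling factor
    alpha = alpha / c
    log_lik = np.log(c)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # one forward step
        c = alpha.sum()                # rescale so the variables sum to 1
        alpha = alpha / c
        log_lik += np.log(c)           # the scalings accumulate the log-likelihood
    return log_lik

# Tiny usage example with made-up parameters (two states, two symbols):
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
print(scaled_forward([0, 1, 0, 0, 1], pi, A, B))
```

    Rescaling at every step keeps the forward variables in a numerically safe range on long sequences, while the sum of the log scaling factors recovers the log-likelihood exactly.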

    Analysis of nanopore detector measurements using Machine-Learning methods, with application to single-molecule kinetic analysis

    Abstract
    Background: A nanopore detector has a nanometer-scale trans-membrane channel across which a potential difference is established, resulting in an ionic current through the channel in the pA-nA range. A distinctive channel current blockade signal is created as individually "captured" DNA molecules interact with the channel and modulate its ionic current. The nanopore detector is sensitive enough that nearly identical DNA molecules can be classified with very high accuracy using machine learning techniques such as Hidden Markov Models (HMMs) and Support Vector Machines (SVMs).
    Results: A non-standard implementation of an HMM, emission inversion, is used for improved classification. Additional features for the feature vector employed by the SVM are also considered: adding a single feature representing spike density is shown to notably improve classification results. A much larger feature-set expansion was also studied (2500 additional features instead of 1), derived from including all of the HMM's transition probabilities. The expanded features can introduce redundant, noisy information (as well as diagnostic information) into the feature set and thus degrade classification performance. A hybrid Adaptive Boosting approach was used for feature selection to alleviate this problem.
    Conclusion: The methods shown here for more informed feature extraction improve classification and provide biologists and chemists with tools for obtaining a better understanding of the kinetic properties of molecules of interest.
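    As an illustration of the kind of feature-vector expansion described above, the sketch below appends a single spike-density feature to HMM-derived feature vectors and classifies with an RBF-kernel SVM. The data and parameters are hypothetical stand-ins, not the authors' pipeline.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_signals, n_hmm_features = 200, 150

# Stand-in HMM-derived features and class labels for two molecule types.
X_hmm = rng.normal(size=(n_signals, n_hmm_features))
y = rng.integers(0, 2, size=n_signals)

# One extra feature: spike count per unit observation time (stand-in values).
spike_density = rng.poisson(3, size=n_signals) / 100.0

X = np.column_stack([X_hmm, spike_density])    # expanded feature vectors
clf = SVC(kernel="rbf", C=1.0, gamma="scale")  # RBF-kernel SVM classifier
print(cross_val_score(clf, X, y, cv=5).mean())
```

    With real blockade signals, the spike-density column would be computed from the detected spike events in each captured-molecule trace rather than drawn at random.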