
    Modelling The Motion Of Pulp Inside Wood-chip Disc Refiners

    The thesis studies the movement of wood pulp through a rotating disc refiner by developing continuum and discrete models of the process. An existing continuum model of pulp movement for steady-state refining is first extended to include time-dependent effects, in order to study the dynamic aspects of refining operation. A system of hyperbolic PDEs is obtained, but it is shown to suffer from numerical instability. Attention is therefore shifted to a discrete model, and a stochastic model treating pulp as discontinuous flocs is developed. The prediction of the residence time distribution of pulp in a refiner is the first test for the model. The model treats pulp as individual flocs moving in three regions inside a refiner: the grooves in the stator, the gap between the plates, and the grooves in the rotor. As the pulp moves through the refiner, it changes regions stochastically. The model calculates the residence time by following each floc individually and then accumulating the results to obtain the residence time distribution. The model is also used to predict the treatment time, that is, the time that pulp spends between the refiner plates. The treatment time distribution shows a non-monotonic rise to a maximum, followed by a non-monotonic decay to zero. Several simple prototype simulations are analyzed to show that this behaviour is not due to errors in the numerical simulation but is inherent in the class of models used. The model is then extended to a time-dependent one by keeping track of all flocs in the refiner simultaneously. The fluctuations of the locally averaged densities of pulp inside the refiner are simulated. The trends in the treatment time and the residence time of pulp in the refiner, as well as the correlation between the locally averaged densities and the treatment time, are also given. The stochastic model is improved by introducing formulas that calculate the probabilities for flocs to switch regions based on both the locally averaged densities and the densities of pulp flocs averaged over the refining zone. A new set of residence-time and treatment-time distributions calculated using the probability formulas is given. Finally, the relation between the thrust load on the refiner and the plate gap is predicted by considering the forces supported by a single floc, and a mechanism is proposed to take all the flocs into account collectively.
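
    The residence- and treatment-time machinery described above lends itself to a short Monte Carlo illustration. The sketch below is not the thesis's model; it only mirrors the described structure (flocs advancing through stator grooves, the plate gap, and rotor grooves, switching regions at random), and every speed, switching probability, and the refining-zone length are placeholder values chosen for illustration.

```python
import random

# Illustrative parameters (placeholders, not values from the thesis):
# axial speed of a floc in each region (m/s) and per-step probability of leaving it.
REGIONS = ("stator_groove", "gap", "rotor_groove")
AXIAL_SPEED = {"stator_groove": 0.05, "gap": 0.5, "rotor_groove": 1.0}
SWITCH_PROB = {"stator_groove": 0.10, "gap": 0.30, "rotor_groove": 0.10}

def simulate_floc(zone_length=0.2, dt=1e-3, rng=random):
    """Follow one floc from inlet to outlet; return (residence_time, treatment_time) in seconds."""
    position = 0.0              # distance travelled along the refining zone (m)
    region = "stator_groove"
    residence = treatment = 0.0
    while position < zone_length:
        position += AXIAL_SPEED[region] * dt
        residence += dt
        if region == "gap":
            treatment += dt     # treatment time: time spent between the plates
        if rng.random() < SWITCH_PROB[region]:
            region = rng.choice([r for r in REGIONS if r != region])
    return residence, treatment

# Follow many flocs individually and accumulate the results into distributions.
samples = [simulate_floc() for _ in range(10_000)]
residence_times = [r for r, _ in samples]
treatment_times = [t for _, t in samples]
```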

    Pre-nuclear level of I-129 in Chinese loess-paleosol sections: A search for the natural I-129 level for dating in terrestrial environments

    Due to its long half-life (15.7 Myr), radioactive I-129 has great potential for dating geologic materials as old as 100 Myr. Thus, knowing the natural level of I-129 is crucial to dating applications. The initial ratio of I-129/I-127 in the ocean has been quantified by a number of researchers, who have reached a consensus value. However, the applicability of I-129 dating in the terrestrial environment remains problematic because of the lack of an initial I-129/I-127 value. In this work, samples of loess-paleosol sections from the Chinese Loess Plateau (CLP) were analyzed for I-129/I-127, aiming to provide an initial I-129/I-127 ratio that can be adopted for dating purposes in terrestrial environments. A value of (2.0 +/- 1.0) x 10^-11 for the I-129/I-127 ratio was found in the two investigated loess-paleosol sections from Xifeng and Luochuan, China. This ratio is one order of magnitude higher than the initial value reported for the marine environment. Alteration of the natural initial I-129 in the investigated samples by downward migration of anthropogenic I-129 or by excess fissiogenic I-129 from uranium was not supported. Consequently, the measured I-129/I-127 ratio is considered to be a pristine value, and the difference from that in marine systems is attributed to an isotopic dilution effect. (C) 2018 Elsevier Ltd. All rights reserved.
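
    Under the standard closed-system decay assumption (constant stable I-127, no post-depositional gain or loss of I-129), an age follows from the measured ratio and an adopted initial ratio via R = R0 * exp(-lambda * t). The sketch below only illustrates that relation using the half-life and the initial ratio quoted in the abstract; it is not a procedure from the paper.

```python
import math

T_HALF_MYR = 15.7        # I-129 half-life (Myr), as quoted in the abstract
R_INITIAL = 2.0e-11      # initial I-129/I-127 ratio found for the loess-paleosol sections

def iodine_129_age_myr(measured_ratio, initial_ratio=R_INITIAL, half_life=T_HALF_MYR):
    """Closed-system decay age in Myr from a measured I-129/I-127 ratio.

    Assumes stable I-127 stays constant, so the ratio decays as R = R0 * exp(-lambda * t).
    """
    decay_constant = math.log(2) / half_life
    return math.log(initial_ratio / measured_ratio) / decay_constant

# Example: a ratio that has fallen to one quarter of the initial value
# corresponds to two half-lives, roughly 31.4 Myr.
print(iodine_129_age_myr(R_INITIAL / 4))
```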

    Random Fourier Features for Asymmetric Kernels

    The random Fourier features (RFFs) method is a powerful and popular technique in kernel approximation for scaling up kernel methods. The theoretical foundation of RFFs is the Bochner theorem, which relates symmetric, positive definite (PD) functions to probability measures. This condition naturally excludes asymmetric functions, which have a wide range of applications in practice, e.g., directed graphs, conditional probabilities, and asymmetric kernels. However, how asymmetric functions (kernels) can be understood and scaled via RFFs remains unclear both theoretically and empirically. In this paper, we introduce a complex measure whose real and imaginary parts correspond to four finite positive measures, which expands the scope of the Bochner theorem. This framework handles classical symmetric PD kernels via one positive measure, symmetric non-positive-definite kernels via signed measures, and asymmetric kernels via complex measures, thereby unifying them into a general RFF framework, named AsK-RFFs. This approximation scheme via complex measures enjoys theoretical guarantees in terms of uniform convergence. In the algorithmic implementation, to speed up the kernel approximation process, which is expensive due to the calculation of total mass, we employ a subset-based fast estimation method that optimizes total masses on a sub-training set and is computationally efficient in high dimensions. Our AsK-RFFs method is empirically validated on several typical large-scale datasets and achieves promising kernel approximation performance, demonstrating the effectiveness of AsK-RFFs.
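
    For context, the classical symmetric, PD case is easy to sketch: by the Bochner theorem, a Gaussian kernel is the expectation of cos(w^T (x - y)) under a Gaussian spectral measure, so random projections plus a cosine give an explicit feature map whose inner products approximate the kernel. The code below is a minimal sketch of that standard Rahimi-Recht-style construction, not of the paper's AsK-RFF scheme for asymmetric kernels.

```python
import numpy as np

def rff_features(X, num_features=256, lengthscale=1.0, seed=0):
    """Random Fourier features z(x) with z(x) @ z(y) ~ exp(-||x - y||^2 / (2 * lengthscale^2))."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=1.0 / lengthscale, size=(d, num_features))   # samples of the spectral measure
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)              # random phases
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

# Compare the RFF approximation with the exact Gaussian kernel on a few points.
X = np.random.default_rng(1).normal(size=(5, 3))
Z = rff_features(X, num_features=4096)
approx_gram = Z @ Z.T
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
exact_gram = np.exp(-sq_dists / 2.0)
print(np.abs(approx_gram - exact_gram).max())   # small approximation error
```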

    Optical Study of Liquid Crystal Lens Doped with Multiwalled Carbon Nanotubes

    In this paper, a new kind of electrically controlled liquid crystal lens with a relatively fast response time is presented. Multiwalled carbon nanotubes are doped into the liquid crystal to fabricate the lens. When multiwalled carbon nanotubes at a concentration of 0.02% are uniformly dispersed in the liquid crystal, the optical characteristics of the liquid crystal lens are clearly improved. A liquid crystal lens with a diameter of 2.0 mm was fabricated, with a response time of about 0.2 s at an applied voltage of less than 5 Vrms. The focal length can be varied from 16 to 510 mm as the operating voltage changes from 1.0 to 5.5 Vrms. This liquid crystal lens has the very attractive feature of a submillisecond response time, much faster than that of conventional liquid crystal lenses. This kind of liquid crystal lens therefore has high potential for many practical imaging applications and for commercial imaging.
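
    A rough sense of how the quoted focal range maps onto lens parameters can be had from the commonly used gradient-index approximation for liquid crystal lenses, f ≈ r² / (2 · d · Δn), where r is the aperture radius, d the liquid crystal layer thickness, and Δn the voltage-induced difference in effective refractive index across the aperture. This relation and the cell-gap value below are assumptions for illustration; they are not taken from the paper.

```python
def lc_lens_focal_length_mm(aperture_radius_mm, cell_gap_um, delta_n):
    """Gradient-index approximation f = r^2 / (2 * d * delta_n) for a liquid crystal lens.

    aperture_radius_mm : lens aperture radius (the reported lens has a 2.0 mm diameter)
    cell_gap_um        : liquid crystal layer thickness (illustrative value, not given in the abstract)
    delta_n            : effective refractive-index difference across the aperture at a given voltage
    """
    r = aperture_radius_mm * 1e-3                 # m
    d = cell_gap_um * 1e-6                        # m
    return (r ** 2) / (2.0 * d * delta_n) * 1e3   # focal length in mm

# Sweeping delta_n (driven by the applied voltage) sweeps the focal length.
for dn in (0.01, 0.05, 0.20):
    print(dn, round(lc_lens_focal_length_mm(1.0, 50.0, dn), 1), "mm")
```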

    Progressive Denoising Model for Fine-Grained Text-to-Image Generation

    Recently, vector quantized autoregressive (VQ-AR) models have shown remarkable results in text-to-image synthesis by predicting discrete image tokens one by one, from the top left to the bottom right of the latent space, with every position treated equally. Although this simple generative process works surprisingly well, is it the best way to generate an image? Human creation, for instance, tends to proceed from the outline of an image to its fine details, whereas VQ-AR models do not consider the relative importance of individual components. In this paper, we present a progressive denoising model for high-fidelity text-to-image generation. The proposed method creates new image tokens from coarse to fine, in parallel, based on the existing context, and this procedure is applied recursively until the image sequence is complete. The resulting coarse-to-fine hierarchy makes the image generation process intuitive and interpretable. Extensive experiments demonstrate that the progressive model produces significantly better FID scores than the previous VQ-AR method across a wide variety of categories and aspects. Moreover, the text-to-image generation time of traditional AR models increases linearly with the output image resolution and is hence quite time-consuming even for normal-size images. In contrast, our approach achieves a better trade-off between generation quality and speed. Comment: Technical report. arXiv admin note: text overlap with arXiv:2206.10789 by other authors.
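
    The coarse-to-fine, parallel token-filling idea can be illustrated with a confidence-based iterative decoding loop over a VQ token grid: at each step the model scores all still-empty positions given the current context and commits only to the most confident ones, so early steps lay down an outline and later steps fill in detail. The sketch below is a generic illustration of that scheme with a toy stand-in for the text-conditioned model; the function names and schedule are hypothetical and do not reproduce the paper's architecture.

```python
import numpy as np

def progressive_decode(score_fn, grid=(16, 16), vocab=1024, steps=8):
    """Coarse-to-fine parallel decoding sketch over a grid of VQ tokens.

    score_fn(tokens) -> logits of shape grid + (vocab,), a stand-in for a
    text-conditioned transformer (hypothetical). Positions marked -1 are undecided.
    """
    h, w = grid
    tokens = np.full((h, w), -1, dtype=int)
    for step in range(steps):
        logits = score_fn(tokens)
        probs = np.exp(logits - logits.max(-1, keepdims=True))
        probs /= probs.sum(-1, keepdims=True)
        confidence = probs.max(-1)
        confidence[tokens >= 0] = -np.inf             # decided positions stay fixed
        # Commit to a growing number of positions each step: outline first, detail later.
        target = int(np.ceil((step + 1) / steps * h * w))
        k = target - int((tokens >= 0).sum())
        if k <= 0:
            continue
        chosen = np.argsort(confidence.ravel())[::-1][:k]
        tokens.ravel()[chosen] = probs.reshape(-1, vocab)[chosen].argmax(-1)
    return tokens

# Toy stand-in model: random logits; a real model would condition on the text prompt.
toy_model = lambda tokens: np.random.default_rng(1).normal(size=tokens.shape + (1024,))
print((progressive_decode(toy_model) >= 0).all())     # every token position gets filled
```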