
    Is the Broido-Shafizadeh model for cellulose pyrolysis true?

    The widely accepted Broido-Shafizadeh model describes cellulose pyrolysis kinetics in terms of two parallel (competing) reactions preceded by an initiation step. Although many recent experimental results appear to contradict the predictions of the model, its validity has not been seriously questioned. In this paper we report thermogravimetric analyses of Avicel cellulose involving prolonged thermal pretreatments of small samples (0.5 to 3 mg). The weight loss curves were simulated by modern numerical techniques using the Broido-Shafizadeh and other related models. The results were not consistent with the presence of an initiation reaction, but they did strongly confirm the role of parallel reactions in the decomposition chemistry. A subsequent, high-temperature (370 °C) pyrolytic degradation of solid intermediates formed below 300 °C was also detected. In the absence of a prolonged thermal pretreatment, only one of the two parallel reactions can be observed. This reaction is first order, irreversible, and manifests a high activation energy (238 kJ/mol). The kinetic parameters of this reaction are not influenced by the large quantity of solid intermediates formed during prolonged, low-temperature thermal pretreatments, indicating that chemical processes are much more significant than the physical structure of the sample during pyrolysis.
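
    As a rough illustration of the kinetics reported above, the sketch below integrates a single first-order, irreversible reaction with the reported activation energy (238 kJ/mol) under a linear heating program. The pre-exponential factor, heating rate, and starting temperature are placeholder assumptions chosen for illustration, not values taken from the paper.

    import numpy as np
    from scipy.integrate import solve_ivp

    R = 8.314           # gas constant, J/(mol K)
    E = 238e3           # activation energy, J/mol (value reported above)
    A = 1.0e18          # pre-exponential factor, 1/s (assumed, not from the paper)
    beta = 10.0 / 60.0  # heating rate, K/s (10 K/min, assumed)
    T0 = 500.0          # starting temperature, K (assumed)

    def dalpha_dt(t, alpha):
        T = T0 + beta * t                        # linear T(t) heating program
        return A * np.exp(-E / (R * T)) * (1.0 - alpha)

    sol = solve_ivp(dalpha_dt, (0.0, 3600.0), [0.0], max_step=1.0)
    residual_mass = 1.0 - sol.y[0]               # normalized weight-loss curve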

    Cellulose pyrolysis kinetics: Revisited

    In the same thermogravimetric analyzer (TGA) under identical conditions, samples of pure, ash-free cellulose (i.e. Avicel PH-105, Whatman CF-11, Millipore ash-free filter pulp, and Whatman #42) obtained from different manufacturers undergo pyrolysis at temperatures that differ by as much as 30 °C. Thus the pyrolysis chemistry of a sample of pure cellulose is not governed by a universal rate law, as is the case with a pure hydrocarbon gas, for example. Nevertheless, the pyrolytic weight loss of all the samples studied in this work is well represented by a high activation energy (228 kJ/mol), first-order rate law at both low and high heating rates. These results do not corroborate the recent findings of Milosavljevic and Suuberg (1995). For a particular cellulose sample (for example, Avicel PH-105), variations in the pre-exponential constant determined at different heating rates reflect uncontrolled, systematic errors in the dynamic sample temperature measurement (thermal lag).
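
    The thermal-lag argument can be made concrete with a back-of-the-envelope calculation: a systematic error dT in the measured sample temperature near the decomposition temperature T changes the apparent pre-exponential constant by roughly a factor exp(E*dT/(R*T^2)). The sketch below evaluates this factor for a few hypothetical lags; only E comes from the abstract, the decomposition temperature is an assumed round number.

    import math

    R, E = 8.314, 228e3          # J/(mol K); E from the abstract, J/mol
    T = 600.0                    # representative decomposition temperature, K (assumed)
    for dT in (2.0, 5.0, 10.0):  # hypothetical thermal lags, K
        factor = math.exp(E * dT / (R * T**2))
        print(f"lag {dT:4.1f} K -> apparent pre-exponential off by x{factor:.2f}")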

    Kinetic modeling of biomass pyrolysis: 10 years of a US-Hungarian cooperation

    The thermal decomposition of lignocellulosic biomass materials and their major components is discussed. Thermogravimetric and DSC curves at different T(t) heating programs were evaluated by the method of least squares. Pseudo-first-order models, parallel, successive and competitive reaction schemes, and complex reaction networks were employed in the modeling. The following topics are treated: thermal decomposition of cellulose at low (2 °C/min) and high (50-80 °C/min) heating rates; low-temperature phenomena; the validity of the Broido-Shafizadeh model; effects of mineral catalysts; cellulose pyrolysis in closed sample holders; and thermal decomposition kinetics of xylan, lignin and lignocellulosic plant samples. Keywords: Biomass, Cellulose, Hemicellulose, Lignin, Reaction kinetics, Thermogravimetry, DSC

    Motif Cut Sparsifiers

    A motif is a frequently occurring subgraph of a given directed or undirected graph G. Motifs capture higher-order organizational structure of G beyond edge relationships and have therefore found wide application in graph clustering, community detection, and the analysis of biological and physical networks, to name a few. In these applications, the cut structure of motifs plays a crucial role, as vertices are partitioned into clusters by cuts whose conductance is based on the number of instances of a particular motif crossing the cuts, as opposed to just the number of edges. In this paper, we introduce the concept of a motif cut sparsifier. We show that one can compute in polynomial time a sparse weighted subgraph G' with only Õ(n/ε²) edges such that for every cut, the weighted number of copies of M crossing the cut in G' is within a 1+ε factor of the number of copies of M crossing the cut in G, for every constant-size motif M. Our work carefully combines the viewpoints of both graph sparsification and hypergraph sparsification. We sample edges, which requires us to extend and strengthen the concept of cut sparsifiers introduced in seminal earlier work to the motif setting. We adapt the importance sampling framework through the viewpoint of hypergraph sparsification by deriving the edge sampling probabilities from the strong connectivity values of a hypergraph whose hyperedges represent motif instances. Finally, an iterative sparsification primitive inspired by both viewpoints is used to reduce the number of edges in G to nearly linear. In addition, we present a strong lower bound ruling out a similar result for sparsification with respect to induced occurrences of motifs. Comment: 48 pages, 3 figures
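
    A minimal sketch of the importance-sampling step described above, under simplifying assumptions: each hyperedge stands for one motif instance, its strong-connectivity value is taken as precomputed input (computing these values is the substance of the paper and is omitted here), and the constant c trades off sparsifier size against accuracy. All names are illustrative, not the paper's.

    import random

    def sparsify(hyperedges, strengths, c):
        """hyperedges: list of vertex tuples (one per motif instance);
        strengths: matching list of strong-connectivity values (given);
        returns (hyperedge, weight) pairs preserving cuts in expectation."""
        sample = []
        for e, s in zip(hyperedges, strengths):
            p = min(1.0, c / s)              # sample inversely to strength
            if random.random() < p:
                sample.append((e, 1.0 / p))  # reweight to stay unbiased
        return sample

    Inverse-probability weighting keeps every cut's weighted motif count correct in expectation; the concentration argument around this estimator is where the sparsifier machinery does its real work.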

    Application of complex reaction kinetic models in thermal analysis. The least squares evaluation of series of experiments

    The complexity of the phenomena which arise during the heating of various substances can seldom be described by a single reaction kinetic equation. As a consequence, sophisticated models with several unknown parameters have to be developed. The determination of the unknown parameters and the validation of the models require the simultaneous evaluation of whole series of experiments. We can accept a model and its parameters if, and only if, we get a reasonable fit to several experiments carried out at different experimental conditions. In the field of thermal analysis, the method of least squares alone can seldom select a best model or a best set of parameter values. Nevertheless, the careful evaluation of the experiments may help in discerning between various chemical or physical assumptions by the quality of the corresponding fit between the experimental and the simulated data. The problem is illustrated by the thermal decomposition of cellulose under various experimental conditions.
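
    In practice, the simultaneous evaluation described here amounts to stacking the residuals of all runs into a single least-squares objective with one shared parameter vector. The sketch below does this for a deliberately simple first-order model; the model, the crude Euler integrator, and the starting values are illustrative assumptions, not the models used in the paper.

    import numpy as np
    from scipy.optimize import least_squares

    R = 8.314

    def model_alpha(t, T, A, E):
        # crude explicit-Euler integration of dalpha/dt = A*exp(-E/(R*T))*(1 - alpha)
        alpha = np.zeros_like(t)
        for i in range(1, len(t)):
            k = A * np.exp(-E / (R * T[i - 1]))
            alpha[i] = alpha[i - 1] + k * (1.0 - alpha[i - 1]) * (t[i] - t[i - 1])
        return alpha

    def residuals(params, experiments):
        logA, E = params                       # one parameter set shared by all runs
        res = [model_alpha(t, T, 10.0**logA, E) - alpha_meas
               for t, T, alpha_meas in experiments]
        return np.concatenate(res)             # every experiment enters one fit

    # fit = least_squares(residuals, x0=[17.0, 230e3], args=(experiments,))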

    Combustion Kinetics of Corncob Charcoal and Partially Demineralized Corncob Charcoal in the Kinetic Regime

    Charcoals produced by a modern, efficient method were studied in the kinetic regime, at oxygen partial pressures of 0.2 and 1 bar, by thermogravimetric experiments and their reaction kinetic modeling. The charcoals were ground to an average particle size of 5-13 µm. A partial removal of minerals from the feedstock (corncobs) by an acid-washing procedure resulted in a ca. 6 times higher specific surface area in the charcoal. In spite of the increased surface area, this sample evidenced a much lower reactivity. A model based on three reactions gave an adequate description over a wide range of experimental conditions. Altogether 38 experiments on 4 charcoal samples were evaluated. The experiments differed in their temperature programs, in the ambient gas composition, and in the grinding of the samples. Characteristics of the combustion process were determined, including activation energy values characteristic of the temperature dependence of the burn-off; formal reaction orders characterizing the dependence on the oxygen content of the ambient gas; and functions describing the conversion dependence of the partial processes.
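
    The quantities listed above slot into rate expressions of the common form rate = A*exp(-E/(R*T)) * p(O2)^n * f(alpha). The single-reaction building block below only illustrates that form; the paper's actual model combines three such reactions, and every numerical value here is an assumption.

    import math

    def burn_rate(T, p_o2, alpha, A=1.0e9, E=130e3, n=0.6):
        R = 8.314
        f = (1.0 - alpha) ** (2.0 / 3.0)   # example conversion function f(alpha)
        return A * math.exp(-E / (R * T)) * p_o2**n * f

    # e.g. comparing burn_rate(700.0, 0.2, 0.5) with burn_rate(700.0, 1.0, 0.5)
    # isolates the formal reaction order's effect of the 0.2 -> 1 bar change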

    Noisy Boolean Hidden Matching with Applications

    The Boolean Hidden Matching (BHM) problem, introduced in a seminal paper of Gavinsky et al. [STOC'07], has played an important role in lower bounds for graph problems in the streaming model (e.g., subgraph counting, maximum matching, MAX-CUT, Schatten p-norm approximation). The BHM problem typically leads to Ω(√n) space lower bounds for constant factor approximations, with the reductions generating graphs that consist of connected components of constant size. The related Boolean Hidden Hypermatching (BHH) problem provides Ω(n^{1-1/t}) lower bounds for 1+O(1/t) approximation, for integers t ≥ 2. The corresponding reductions produce graphs with connected components of diameter about t, and essentially show that long-range exploration is hard in the streaming model with an adversarial order of updates. In this paper we introduce a natural variant of the BHM problem, called noisy BHM (and its natural noisy BHH variant), which we use to obtain lower bounds stronger than Ω(√n) for approximating a number of the aforementioned problems in graph streams when the input graphs consist only of components of diameter bounded by a fixed constant. We next introduce and study the graph classification problem, where the task is to test whether the input graph is isomorphic to a given graph. As a first step, we use the noisy BHM problem to show that classifying whether an underlying graph is isomorphic to a complete binary tree in insertion-only streams requires Ω(n) space, which seems challenging to show using either BHM or BHH.
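
    For readers unfamiliar with the problem, the sketch below generates a noiseless BHM instance in its standard formulation (recalled here for illustration, not quoted from the paper): Alice holds a bit string x; Bob holds a perfect matching and one target bit per matched pair; in YES instances x[u] XOR x[v] equals the target on every pair, in NO instances on none. The noisy variant studied here relaxes this promise by corrupting a small fraction of the target bits.

    import random

    def bhm_instance(n, yes):
        x = [random.randint(0, 1) for _ in range(n)]
        verts = list(range(n))
        random.shuffle(verts)
        matching = [(verts[2*i], verts[2*i + 1]) for i in range(n // 2)]
        w = [x[u] ^ x[v] if yes else 1 - (x[u] ^ x[v]) for u, v in matching]
        return x, matching, w                # Alice's input; Bob's two inputs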