Effect of Cutting Bill Requirements on Lumber Yield in a Rip-First Rough Mill
In recent years, producers of solid wood dimension parts have emphasized improvements in lumber yield, focusing primarily on lumber grade and cutting technology rather than cutting bill design. Yet cutting bills have a significant impact on yield. Using rip-first rough mill simulation software, a data bank of red oak lumber samples, and a cutting bill resembling those used in industry, we determined the effect of changes in part size within an existing cutting bill and the impact of part-quantity requirements on yield. The results indicated that cutting bill requirements have a large influence on yield when the shortest part length in the bill is changed. Medium-length part sizes also affect yield, except when the cutting bill requires an unlimited number of small parts; in this case, yield will always be high. When an all-blades-movable arbor is used, length changes in the bill affect yield more than changes in width. This study reveals our current lack of understanding of the complex relationship between cutting bill and lumber yield, and points out the yield gains that are possible when properly designed cutting bills are used.
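The yield metric this line of work optimizes is simply the area of parts recovered relative to the area of lumber processed. A minimal sketch, with part and board dimensions that are illustrative placeholders rather than values from the study's data bank:

```python
# Minimal sketch of the lumber-yield metric; dimensions are hypothetical,
# not from the paper's red oak data bank.

def lumber_yield(parts, board_area):
    """Yield = total area of recovered parts / total board area, in percent."""
    part_area = sum(length * width * qty for length, width, qty in parts)
    return 100.0 * part_area / board_area

# Hypothetical cutting-bill slice: (length in, width in, quantity obtained)
parts = [(20.0, 2.5, 12), (45.0, 3.0, 4)]
board_area = 2000.0  # square inches of lumber processed
print(round(lumber_yield(parts, board_area), 1))  # 57.0
```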
Validation of the Standardized and Simplified Cutting Bill
This research validated the framework for the standardized and simplified cutting bill presented in an earlier paper. The cutting bill validation was carried out in two ways. First, all 20 of the cutting bill's part groups were examined to determine if significant yield influences resulted from changing specific part sizes within the boundaries of a given part group. Second, five cutting bills from industrial operations were fit into the framework of the cutting bill, and the simulated yields from these industrial cutting bills were compared with the fitted cutting bills. Yield differences between the two were calculated and tested for significance. Tests revealed that the standardized and simplified cutting bill framework performed as designed. The maximum yield difference observed was 2% and the average less than 1%. Clustering the industrial cutting bill part requirements according to the cutting bill framework led to an average absolute yield deviation between the original cutting bills and the clustered cutting bills of 3.25%. These results show that, while cutting bill part-size requirements can be clustered into part groups, yield differences of a certain magnitude are introduced by doing so.
The Influence of Cutting-Bill Requirements on Lumber Yield Using a Fractional-Factorial Design Part I. Linearity and Least Squares
The importance of lumber yield to the financial success of secondary solid wood products manufacturers has been known for quite some time. Various efforts have been undertaken to improve yield, such as inclusion of character marks (defects) in parts, "cookie-cutting" of boards, improved optimization algorithms, or improved cut-up technologies. For a variety of reasons, the relationship between cutting-bill requirements and lumber yield has attracted limited attention. This is Part I of a 2-part examination of this relationship. The standardized and simplified Buehlmann cutting bill and the Forest Service's Romi-Rip lumber cut-up simulator were used in this study. An orthogonal, 2^(20-11) fractional-factorial design of resolution V was used to determine the influence of different part sizes on lumber yield. All 20 part sizes contained in the cutting bill and 113 of a total of 190 unique secondary interactions were found to be significant variables in explaining the variability in observed yield. Parameter estimates for the part sizes and the secondary interactions were used to specify the average yield contribution of each variable. Parts 445 mm long and 64 mm wide were found to have the most positive influence on yield. Parts smaller than 445 by 64 mm (such as 254 by 64 mm) had a less pronounced positive yield effect because their quantity requirement is relatively small in an average cutting bill; thus, the quantity required is obtained quickly during the cut-up process. Parts of size 1842 by 108 mm, on the other hand, had the most negative influence on yield. However, as further analysis showed, not only the individual parts required by a cutting bill but also their interactions determine yield. In general, it was found that by adding a sufficiently large number of smaller parts to a cutting bill that required large parts, high levels of yield can be achieved.
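The 2^(20-11) resolution-V design is far too large to reproduce here, but the mechanics of estimating effects from a two-level fraction can be illustrated with a toy 2^(3-1) half-fraction. The factor labels, the generator, and the response coefficients below are invented for illustration and are not the study's cutting-bill variables:

```python
import itertools
import numpy as np

# Toy analogue of a fractional-factorial yield study: a 2^(3-1) half-fraction
# with generator C = A*B (defining relation I = ABC).  All effect sizes are
# synthetic placeholders.

# Build the four runs: full factorial on A, B; alias C onto the AB column
runs = [(a, b, a * b) for a, b in itertools.product([-1, 1], repeat=2)]
X = np.array(runs, dtype=float)

# Simulated yield response: intercept 65, main effects +3 (A), -2 (B), +1 (C)
beta_true = np.array([3.0, -2.0, 1.0])
y = 65.0 + X @ beta_true

# Least-squares estimates; the +/-1 columns are mutually orthogonal, so the
# fit recovers each effect exactly (C's estimate is aliased with AB)
Xd = np.column_stack([np.ones(len(X)), X])
beta_hat, *_ = np.linalg.lstsq(Xd, y, rcond=None)
print(np.round(beta_hat, 6))  # [65.  3. -2.  1.]
```

The same least-squares machinery, scaled up to 20 factors and 512 runs, yields the per-part-size yield contributions the paper reports.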
The Influence of Cutting-Bill Requirements on Lumber Yield Using a Fractional-Factorial Design Part II. Correlation and Number of Part Sizes
Cutting-bill requirements, among other factors, influence the yield obtained when cutting lumber into parts. The first part of this 2-part series described how different cutting-bill part sizes, when added to an existing cutting-bill, affect lumber yield, and quantified these observations. To accomplish this, the study employed a linear least squares estimation technique. This second paper again looks at the influence of cutting-bill requirements but establishes a measure of how preferable it is to have a given part size required by the cutting-bill. The influence on lumber yield of the number of different part sizes to be cut simultaneously is also investigated. Using rip-first rough mill simulation software and an orthogonal, 2^(20-11) fractional-factorial design of resolution V, the correlation of lengths, widths, and the 20 part sizes defined by the Buehlmann cutting-bill with high yield was established. It was found that, as long as the quantity of small parts is limited, part sizes larger than the smallest size are more positively correlated with high yield. Furthermore, only 4 of the 20 part sizes tested were found to have a significant positive correlation with above-average yield (65.09%), while 10 had a significant negative correlation with above-average yield. With respect to the benefit of cutting varying numbers of part sizes simultaneously, this study showed that there is a positive correlation between yield and the number of different part sizes being cut. However, Duncan's test did not detect significant yield gains for instances when more than 11 part sizes are contained in the cutting-bill.
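The correlation measure described here can be sketched as follows: each part size is a two-level setting across the design runs, and its Pearson correlation with simulated yield indicates how preferable that part size is. The run matrix and effect sizes below are synthetic placeholders, not the study's data:

```python
import numpy as np

# Hedged sketch of the correlation analysis: correlate each part size's
# two-level setting (-1 = low, +1 = high requirement) with yield across
# design runs.  All data here are synthetic.

rng = np.random.default_rng(0)
n_runs, n_parts = 512, 20
settings = rng.choice([-1.0, 1.0], size=(n_runs, n_parts))

# Synthetic yield: a small part (column 0) helps, a large part (column 19)
# hurts, plus noise -- mimicking the sign pattern the paper reports
yield_pct = (65.09 + 2.0 * settings[:, 0] - 3.0 * settings[:, 19]
             + rng.normal(0.0, 0.5, n_runs))

corr = np.array([np.corrcoef(settings[:, j], yield_pct)[0, 1]
                 for j in range(n_parts)])
print(corr.argmax(), corr.argmin())  # columns most +/- correlated with yield
```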
Creating A Standardized and Simplified Cutting Bill Using Group Technology
From an analytical viewpoint, the relationship between rough mill cutting bill part requirements and lumber yield is highly complex. Part requirements can have almost any length, width, and quantity distribution within the boundaries set by physical limitations, such as maximum length and width of parts. This complexity makes it difficult to understand the specific relationship between cutting bill requirements and lumber yield, rendering the optimization of the lumber cutting process through improved cutting bill composition difficult. An approach is presented to decrease the complexity of cutting bills to allow for easier analysis and, ultimately, to optimize cutting bill compositions. Principles from clustering theory were employed to create a standardized way to describe cutting bills. Cutting bill part clusters are part groups within the cutting bill's total part size space, where all parts are reset to a given group's midpoint. Statistical testing was used to determine a minimum resolution part group matrix that had no significant influence on yield compared to an actual cutting bill. Iterative search led to a cutting bill part group matrix that encompasses five groups in length and four groups in width, forming a 20-part group matrix. The lengths of the individual part groups created vary widely, with the smallest group being only 5 inches in length, while the longest two groups were 25 inches long. Part group widths were less varied, ranging from 0.75 inches to 1.0 inch. The part group matrix approach allows parts to be clustered within given size ranges to one part group midpoint value without changing cut-up yield beyond set limits. This standardized cutting bill matrix will make the complex cutting bill requirements-yield relationship easier to understand in future studies.
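The core clustering operation, resetting each part to the midpoint of the size group it falls into, can be sketched briefly. The bin edges below are illustrative stand-ins, not the paper's actual 5-by-4 group matrix boundaries:

```python
import numpy as np

# Minimal sketch of part-group clustering: snap each part's length and width
# to the midpoint of the group it falls into.  Bin edges are hypothetical,
# not the study's matrix.

length_edges = [5, 10, 20, 35, 60, 85]      # inches; defines 5 length groups
width_edges = [1.0, 1.75, 2.75, 3.5, 4.5]   # inches; defines 4 width groups

def snap_to_midpoint(value, edges):
    """Return the midpoint of the bin containing value (clamped to range)."""
    i = np.clip(np.digitize(value, edges) - 1, 0, len(edges) - 2)
    return 0.5 * (edges[i] + edges[i + 1])

parts = [(12.0, 2.0), (47.5, 3.9)]  # (length, width) of hypothetical parts
clustered = [(snap_to_midpoint(l, length_edges),
              snap_to_midpoint(w, width_edges)) for l, w in parts]
print(clustered)  # [(15.0, 2.25), (47.5, 4.0)]
```

Yield is then simulated on the clustered bill and compared against the original to verify the deviation stays within the set limits.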
Numerical Discreteness Errors in Multi-Species Cosmological N-body Simulations
We present a detailed analysis of numerical discreteness errors in
two-species, gravity-only, cosmological simulations using the density power
spectrum as a diagnostic probe. In a simple setup where both species are
initialized with the same total matter transfer function, biased growth of
power forms on small scales when the solver force resolution is finer than the
mean interparticle separation. The artificial bias is more severe when
individual density and velocity transfer functions are applied. In particular,
significant large-scale offsets in power are measured between simulations with
conventional offset grid initial conditions when compared against converged
high-resolution results where the force resolution scale is matched to the
interparticle separation. These offsets persist even when the cosmology is
chosen so that the two particle species have the same mass, indicating that the
error is sourced from discreteness in the total matter field as opposed to
unequal particle mass. We further investigate two mitigation strategies to
address discreteness errors: the frozen potential method and softened
interspecies short-range forces. The former evolves particles under the
approximately "frozen" total matter potential in linear theory at early times,
while the latter filters cross-species gravitational interactions on small
scales in low density regions. By modeling closer to the continuum limit, both
mitigation strategies demonstrate considerable reductions in large-scale power
spectrum offsets. Comment: Accepted for publication in MNRAS
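The diagnostic used throughout this analysis, the density power spectrum measured from particles, can be sketched in a few steps: deposit particles onto a grid, Fourier transform the overdensity, and average |delta_k|^2 in spherical shells. This minimal version uses nearest-grid-point assignment and a random (Poisson) particle set for brevity; production codes use higher-order deposition and deconvolve the assignment window:

```python
import numpy as np

# Hedged sketch of a particle power-spectrum estimator.  Box size, grid
# resolution, and the random particle distribution are illustrative only;
# for Poisson particles the result is just shot noise.

rng = np.random.default_rng(1)
box, ngrid, npart = 100.0, 32, 10_000
pos = rng.uniform(0.0, box, size=(npart, 3))

# Nearest-grid-point mass assignment -> overdensity field delta
idx = np.floor(pos / box * ngrid).astype(int) % ngrid
grid = np.zeros((ngrid,) * 3)
np.add.at(grid, (idx[:, 0], idx[:, 1], idx[:, 2]), 1.0)
delta = grid / grid.mean() - 1.0

# FFT and spherically averaged power spectrum P(k)
dk = np.fft.rfftn(delta) * (box / ngrid) ** 3
power = np.abs(dk) ** 2 / box ** 3
kf = 2 * np.pi / box                                  # fundamental mode
kx = np.fft.fftfreq(ngrid, d=1.0 / ngrid) * kf
kz = np.fft.rfftfreq(ngrid, d=1.0 / ngrid) * kf
kmag = np.sqrt(kx[:, None, None] ** 2 + kx[None, :, None] ** 2
               + kz[None, None, :] ** 2)

# Bin into shells of width kf up to the 1D Nyquist frequency
bins = np.arange(kf / 2, (ngrid // 2) * kf, kf)
which = np.digitize(kmag.ravel(), bins)
pk = np.array([power.ravel()[which == i].mean()
               for i in range(1, len(bins))])
print(pk.shape)  # one P(k) estimate per k-shell
```

Comparing such spectra between the two-species runs and a converged high-resolution reference is how the large-scale offsets described above are measured.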
Simulating Hydrodynamics in Cosmology with CRK-HACC
We introduce CRK-HACC, an extension of the Hardware/Hybrid Accelerated
Cosmology Code (HACC), to resolve gas hydrodynamics in large-scale structure
formation simulations of the universe. The new framework couples the HACC
gravitational N-body solver with a modern smoothed particle hydrodynamics (SPH)
approach called CRKSPH (Conservative Reproducing Kernel SPH). CRKSPH
utilizes smoothing functions that exactly interpolate
linear fields while manifestly preserving conservation laws (momentum, mass,
and energy). The CRKSPH method has been incorporated to accurately model
baryonic effects in cosmology simulations - an important addition targeting the
generation of precise synthetic sky predictions for upcoming observational
surveys. CRK-HACC inherits the codesign strategies of the HACC solver and is
built to run on modern GPU-accelerated supercomputers. In this work, we
summarize the primary solver components and present a number of standard
validation tests to demonstrate code accuracy, including idealized hydrodynamic
and cosmological setups, as well as self-similarity measurements.
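The reproducing-kernel property mentioned above, exact interpolation of linear fields, can be demonstrated in one dimension: a base kernel W is corrected with per-particle coefficients A_i and B_i chosen so the interpolant reproduces constants and linear gradients exactly. The Gaussian kernel and particle layout below are illustrative; the actual method operates in 3D with compact-support kernels and additionally enforces the conservation laws:

```python
import numpy as np

# Hedged 1D sketch of a linearly corrected reproducing kernel.  The corrected
# kernel W^R_ij = A_i (1 + B_i (x_j - x_i)) W_ij satisfies
#   sum_j V_j W^R_ij = 1   and   sum_j V_j (x_j - x_i) W^R_ij = 0,
# which makes interpolation of any linear field exact.

def crk_interpolate(x, f, vol, h):
    """Interpolate f at each particle position with a corrected kernel."""
    dx = x[None, :] - x[:, None]                        # x_j - x_i
    W = np.exp(-(dx / h) ** 2) / (h * np.sqrt(np.pi))   # Gaussian base kernel
    m0 = (vol * W).sum(axis=1)                          # kernel moments
    m1 = (vol * dx * W).sum(axis=1)
    m2 = (vol * dx ** 2 * W).sum(axis=1)
    B = -m1 / m2
    A = 1.0 / (m0 + B * m1)
    WR = A[:, None] * (1.0 + B[:, None] * dx) * W       # corrected kernel
    return (vol * f * WR).sum(axis=1)

# Irregularly spaced particles; per-particle volumes from local spacing
x = np.sort(np.random.default_rng(2).uniform(0.0, 10.0, 200))
vol = np.gradient(x)
f = 2.0 * x + 1.0                     # a linear test field
f_i = crk_interpolate(x, f, vol, h=0.5)
print(np.max(np.abs(f_i - f)))        # linear field recovered to round-off
```

An uncorrected SPH sum (A = 1, B = 0) would fail this test near the domain edges and wherever the spacing is irregular, which is the motivation for the correction.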
Optimization and Quality Assessment of Baryon Pasting for Intracluster Gas using the Borg Cube Simulation
Synthetic datasets generated from large-volume gravity-only simulations are
an important tool in the calibration of cosmological analyses. Their creation
often requires accurate inference of baryonic observables from the dark matter
field. We explore the effectiveness of a baryon pasting algorithm in providing
precise estimations of three-dimensional gas thermodynamic properties based on
gravity-only simulations. We use the Borg Cube, a pair of simulations
originating from identical initial conditions, with one run evolved as a
gravity-only simulation, and the other incorporating non-radiative
hydrodynamics. Matching halos in both simulations enables comparisons of gas
properties on an individual halo basis. This comparative analysis allows us to
fit for the model parameters that yield the closest agreement between the gas
properties in both runs. To capture the redshift evolution of these parameters,
we perform the analysis at five distinct redshift steps, spanning from to
. We find that the investigated algorithm, utilizing information solely from
the gravity-only simulation, achieves few-percent accuracy in reproducing the
median intracluster gas pressure and density, albeit with a scatter of
approximately 20%, for cluster-scale objects up to . We measure the
scaling relation between integrated Compton parameter and cluster mass
(), and find that the imprecision of baryon pasting adds
less than 5% to the intrinsic scatter measured in the hydrodynamic simulation.
We provide best-fitting values and their redshift evolution, and discuss future
investigations that will be undertaken to extend this work. Comment: 14 pages, 8 figures, 3 tables; accepted in the Open Journal of Astrophysics
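The scaling-relation measurement described above amounts to a power-law fit in log space plus a residual-scatter estimate. A minimal sketch on synthetic halos; the slope, normalization, scatter, and mass range below are placeholders, not the paper's best-fitting values:

```python
import numpy as np

# Hedged sketch of fitting a Y-M scaling relation:
# log10(Y) = norm + alpha * log10(M), with Gaussian scatter.
# All numbers are synthetic placeholders.

rng = np.random.default_rng(3)
log_m = rng.uniform(14.0, 15.0, 300)            # log10 halo mass (arbitrary units)
alpha_true, norm_true, scatter_true = 5.0 / 3.0, -28.0, 0.08
log_y = norm_true + alpha_true * log_m + rng.normal(0.0, scatter_true, 300)

# Least-squares fit of slope and normalization in log-log space
alpha_hat, norm_hat = np.polyfit(log_m, log_y, 1)
residual = log_y - (norm_hat + alpha_hat * log_m)
scatter_hat = residual.std(ddof=2)              # residual (intrinsic) scatter

print(round(alpha_hat, 2), round(scatter_hat, 3))
```

Comparing the residual scatter measured in the pasted gravity-only run against the hydrodynamic run is how one quantifies the extra scatter the pasting method introduces.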
High-efficiency micromorph silicon solar cells with in-situ intermediate reflector deposited on various rough LPCVD ZnO
Light management using intermediate reflector layers (IRL) and advanced front transparent conductive oxide (TCO) morphologies is needed to raise the short-circuit current density (Jsc) of micromorph tandem solar cells above 14 mA/cm2. For micromorph cells deposited on surface-textured ZnO layers grown by low-pressure chemical vapour deposition (LPCVD), we study the interplay between the front TCO layer and the IRL and its impact on fill factor and current matching conditions. The key role of the angular distribution of the light scattered by the front LPCVD ZnO layer is highlighted. A micromorph cell with 11.1% stabilized conversion efficiency is demonstrated. By increasing the bottom cell thickness and adding an antireflection coating, a Jsc value of 13.8 mA/cm2 is achieved. This remarkably high Jsc yields 13.3% initial conversion efficiency.
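The current-matching condition driving this light-management work is simple in principle: the two subcells are connected in series, so the tandem delivers the smaller of the two subcell current densities. A tiny sketch with illustrative numbers, not measurements from this work:

```python
# Minimal sketch of the current-matching constraint in a series-connected
# tandem (micromorph) cell.  Subcell Jsc values below are hypothetical.

def tandem_jsc(jsc_top, jsc_bottom):
    """Tandem Jsc is limited by the weaker subcell (mA/cm^2)."""
    return min(jsc_top, jsc_bottom)

# An intermediate reflector trades bottom-cell current for top-cell current,
# so the goal is to balance the two as closely as possible:
print(tandem_jsc(12.5, 14.6))  # 12.5 (top-cell limited)
```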