Meso-scale FDM material layout design strategies under manufacturability constraints and fracture conditions
In the manufacturability-driven design (MDD) perspective, manufacturability of the product or system is the most important of the design requirements. In addition to being able to ensure that complex designs (e.g., topology optimization) are manufacturable with a given process or process family, MDD also helps mechanical designers to take advantage of unique process-material effects generated during manufacturing. One of the most recognizable examples of this comes from the scanning-type family of additive manufacturing (AM) processes; the most notable and familiar member of this family is the fused deposition modeling (FDM) or fused filament fabrication (FFF) process. This process works by selectively depositing uniform, approximately isotropic beads or elements of molten thermoplastic material (typically structural engineering plastics) in a series of pre-specified traces to build each layer of the part. There are many interesting 2-D and 3-D mechanical design problems that can be explored by designing the layout of these elements. The resulting structured, hierarchical material (which is both manufacturable and customized layer-by-layer within the limits of the process and material) can be defined as a manufacturing process-driven structured material (MPDSM). This dissertation explores several practical methods for designing these element layouts for 2-D and 3-D meso-scale mechanical problems, focusing ultimately on design-for-fracture. Three different fracture conditions are explored: (1) cases where a crack must be prevented or stopped, (2) cases where the crack must be encouraged or accelerated, and (3) cases where cracks must grow in a simple pre-determined pattern. 
Several new design tools were developed and refined to support the design of MPDSMs under fracture conditions: a mapping method for the FDM manufacturability constraints; three major literature reviews; the collection, organization, and analysis of several large (qualitative and quantitative) multi-scale datasets on the fracture behavior of FDM-processed materials; new experimental equipment; and a fast, simple g-code generator based on commercially-available software. The refined design method and rules were experimentally validated using a series of case studies (involving both design and physical testing of the designs) at the end of the dissertation. Finally, a simple design guide was developed from the results of this project for practicing engineers who are experts in neither advanced solid mechanics nor process-tailored materials.
Self-Supervised Learning to Prove Equivalence Between Straight-Line Programs via Rewrite Rules
We target the problem of automatically synthesizing proofs of semantic
equivalence between two programs made of sequences of statements. We represent
programs using abstract syntax trees (AST), where a given set of
semantics-preserving rewrite rules can be applied on a specific AST pattern to
generate a transformed and semantically equivalent program. In our system, two
programs are equivalent if there exists a sequence of application of these
rewrite rules that leads to rewriting one program into the other. We propose a
neural network architecture based on a transformer model to generate proofs of
equivalence between program pairs. The system outputs a sequence of rewrites,
and the validity of the sequence is simply checked by verifying it can be
applied. If no valid sequence is produced by the neural network, the system
reports the programs as non-equivalent, ensuring by design no programs may be
incorrectly reported as equivalent. Our system is fully implemented for a given
grammar which can represent straight-line programs with function calls and
multiple types. To efficiently train the system to generate such sequences, we
develop an original incremental training technique, named self-supervised
sample selection. We extensively study the effectiveness of this novel training
approach on proofs of increasing complexity and length. Our system, S4Eq,
achieves 97% proof success on a curated dataset of 10,000 pairs of equivalent
programs.

Comment: 30 pages including appendix
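The key soundness property described above (an invalid proof can never certify inequivalent programs, because the checker simply re-applies the rewrites) can be illustrated with a minimal sketch. The rule names, tuple AST encoding, and root-only application below are simplifying assumptions for illustration, not the S4Eq implementation:

```python
# Hypothetical sketch: a "proof" of equivalence is a sequence of rewrite
# steps; checking it only requires re-applying each step, so an invalid
# proof can never certify two inequivalent programs as equal.

# Expressions are nested tuples: (op, left, right).
def commute_add(ast):
    """x + y -> y + x (semantics-preserving)."""
    if isinstance(ast, tuple) and ast[0] == "+":
        return ("+", ast[2], ast[1])
    return None  # rule does not apply here

def assoc_add(ast):
    """(x + y) + z -> x + (y + z)."""
    if (isinstance(ast, tuple) and ast[0] == "+"
            and isinstance(ast[1], tuple) and ast[1][0] == "+"):
        return ("+", ast[1][1], ("+", ast[1][2], ast[2]))
    return None

RULES = {"commute_add": commute_add, "assoc_add": assoc_add}

def check_proof(src, dst, proof):
    """Apply each named rule at the root; accept only if we reach dst."""
    cur = src
    for rule_name in proof:
        cur = RULES[rule_name](cur)
        if cur is None:
            return False  # invalid step: proof rejected
    return cur == dst

p1 = ("+", ("+", "a", "b"), "c")
p2 = ("+", "a", ("+", "b", "c"))
print(check_proof(p1, p2, ["assoc_add"]))    # True
print(check_proof(p1, p2, ["commute_add"]))  # False
```

Because acceptance requires the rewrite sequence to actually replay, a model that emits a wrong sequence produces a "non-equivalent" verdict rather than a false positive.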
The Effect of Token Economies on Student Behavior in the Preschool Classroom: A Meta-Analysis
There has been a recent push in the literature to identify and use more evidence-based practices for positive behavioral supports for challenging student behaviors in the classroom environment. Further, interest in targeting early education environments such as preschool has been growing, given the persistence of behavioral difficulties in the absence of early and effective intervention (Campbell & Ewing, 1990; Kazdin, 1987; Powell et al., 2006; Stormont, 2002). Two previous meta-analyses (Maggin et al., 2011; Soares et al., 2016) provided some initial support for the effectiveness of token economies with challenging student behavior; however, the inclusion of the preschool setting was limited, and both studies used older versions of design standards to evaluate the quality of studies in the literature. The present study served to extend those meta-analyses by targeting preschool classrooms. Further, the current study used the most recent What Works Clearinghouse Design Standards to evaluate whether token economies meet criteria as an evidence-based practice. Ten studies were included in the final analyses. Two sets of effect sizes were calculated: Baseline-Corrected Tau and Hedges' g. An omnibus effect size showed an overall large effect; however, similar to previous meta-analyses, several methodological concerns were identified. Moderator analyses for several variables were conducted; however, no moderator analyses were significant. Limitations and future directions are discussed.
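One of the two effect sizes named above, Hedges' g, is a standardized mean difference with a small-sample bias correction. A minimal sketch of the standard formula (pooled standard deviation plus the correction factor J = 1 − 3/(4·df − 1)); the example data are made up for illustration:

```python
import math

def hedges_g(x, y):
    """Hedges' g: Cohen's d with a small-sample bias correction."""
    n1, n2 = len(x), len(y)
    m1, m2 = sum(x) / n1, sum(y) / n2
    # Unbiased sample variances
    v1 = sum((a - m1) ** 2 for a in x) / (n1 - 1)
    v2 = sum((b - m2) ** 2 for b in y) / (n2 - 1)
    # Pooled standard deviation
    sp = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)  # correction factor J
    return j * d

# Illustrative (fabricated) on-task scores for intervention vs. baseline
print(hedges_g([2, 3, 4], [1, 2, 3]))  # 0.8
```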
A Decision Support System for Economic Viability and Environmental Impact Assessment of Vertical Farms
Vertical farming (VF) is the practice of growing crops or animals using the vertical dimension via multi-tier racks or vertically inclined surfaces. In this thesis, I focus on the emerging industry of plant-specific VF. Vertical plant farming (VPF) is a promising and relatively novel practice that can be conducted in buildings with environmental control and artificial lighting. However, the nascent sector has experienced challenges in economic viability, standardisation, and environmental sustainability. Practitioners and academics call for a comprehensive financial analysis of VPF, but efforts are stifled by a lack of valid and available data.
A review of economic estimation and horticultural software identifies a need for a decision support system (DSS) that facilitates risk-empowered business planning for vertical farmers. This thesis proposes an open-source DSS framework to evaluate business sustainability through financial risk and environmental impact assessments. Data from the literature, alongside lessons learned from industry practitioners, would be centralised in the proposed DSS using imprecise data techniques. These techniques have been applied in engineering but are seldom used in financial forecasting. This could benefit complex sectors which only have scarce data to predict business viability.
To begin executing the DSS framework, VPF practitioners were interviewed using a mixed-methods approach. Learnings from over 19 shuttered and operational VPF projects provide insights into the barriers inhibiting scalability and identify risks, which were organized into a risk taxonomy. Labour was the most commonly reported top challenge; therefore, research was conducted to explore lean principles to improve productivity.
A probabilistic model representing a spectrum of variables and their associated uncertainty was built according to the DSS framework to evaluate the financial risk of VF projects. This enabled flexible computation without precise production or financial data, improving economic estimation accuracy. The model assessed two VPF cases (one in the UK and another in Japan), demonstrating the first risk and uncertainty quantification of VPF business models in the literature. The results highlighted measures to improve economic viability, as well as the overall viability of the UK and Japan cases.
The environmental impact assessment model was developed, allowing VPF operators to evaluate their carbon footprint compared to traditional agriculture using life-cycle assessment. I explore strategies for net-zero carbon production through sensitivity analysis. Renewable energies, especially solar, geothermal, and tidal power, show promise for reducing the carbon emissions of indoor VPF. Results show that renewably-powered VPF can reduce carbon emissions compared to field-based agriculture when considering the land-use change.
The drivers for DSS adoption have been researched, showing a pathway of compliance and design thinking to overcome the ‘problem of implementation’ and enable commercialisation. Further work is suggested to standardise VF equipment, collect benchmarking data, and characterise risks. This work will reduce risk and uncertainty and accelerate the sector’s emergence.
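The probabilistic financial-risk model described above can be sketched in miniature as a Monte Carlo simulation over uncertain inputs. Every distribution, figure, and variable name below is an illustrative assumption, not data or structure from the thesis:

```python
import random
import statistics

random.seed(0)  # reproducible illustration

def simulate_profit(n=10_000):
    """Toy Monte Carlo sketch of annual VPF profit under uncertain inputs.

    All distributions and figures are fabricated for illustration only.
    """
    profits = []
    for _ in range(n):
        yield_kg = random.gauss(50_000, 8_000)  # annual crop yield (kg)
        price = random.gauss(8.0, 1.5)          # sale price per kg
        energy = random.gauss(250_000, 40_000)  # annual energy cost
        labour = random.gauss(180_000, 30_000)  # annual labour cost
        profits.append(yield_kg * price - energy - labour)
    return profits

p = simulate_profit()
print(f"median profit: {statistics.median(p):,.0f}")
print(f"P(loss) = {sum(x < 0 for x in p) / len(p):.1%}")
```

Replacing the point estimates of a conventional spreadsheet with sampled distributions is what yields risk measures such as the probability of loss, rather than a single profit figure.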
Minotaur: A SIMD-Oriented Synthesizing Superoptimizer
Minotaur is a superoptimizer for LLVM's intermediate representation that
focuses on integer SIMD instructions, both portable and specific to x86-64. We
created it to attack problems in finding missing peephole optimizations for
SIMD instructions-this is challenging because there are many such instructions
and they can be semantically complex. Minotaur runs a hybrid synthesis
algorithm where instructions are enumerated concretely, but literal constants
are generated by the solver. We use Alive2 as a verification engine; to do this
we modified it to support synthesis and also to support a large subset of
Intel's vector instruction sets (SSE, AVX, AVX2, and AVX-512). Minotaur finds
many profitable optimizations that are missing from LLVM. It achieves limited
speedups on the integer parts of SPEC CPU2017, around 1.3%, and it speeds up
the test suite for the libYUV library by 2.2%, on average, and by 1.64x
maximum, when targeting an Intel Cascade Lake processor.
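The hybrid synthesis loop described above (instruction shapes enumerated concretely, literal constants filled in afterwards) can be sketched as a toy. Minotaur delegates constant generation and verification to an SMT solver via Alive2; this illustrative sketch brute-forces an 8-bit constant instead, and the IR, rules, and specification are all invented for the example:

```python
# Toy sketch of hybrid synthesis: candidate instruction shapes are
# enumerated concretely, each with a constant hole `c` solved afterwards.
# Minotaur fills the hole with an SMT solver (via Alive2); here we
# simply brute-force an 8-bit constant against an exhaustive spec.

TESTS = list(range(256))  # all 8-bit inputs

def spec(x):
    """Target behaviour: multiply by 8 (mod 256)."""
    return (x * 8) % 256

# Single-instruction candidate shapes over a toy 8-bit IR.
SHAPES = {
    "add": lambda x, c: (x + c) % 256,
    "shl": lambda x, c: (x << c) % 256,
    "and": lambda x, c: x & c,
}

def synthesize():
    for name, op in SHAPES.items():   # enumerate shapes concretely
        for c in range(256):          # "solve" for the constant
            if all(op(x, c) == spec(x) for x in TESTS):
                return name, c
    return None

print(synthesize())  # ('shl', 3): x << 3 equals x * 8 mod 256
```

Separating the combinatorial search over shapes from the search over constants is what keeps the enumeration tractable; the constant space is handed to a solver rather than enumerated alongside the instructions.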
Design of Radial/Linear Hybrid Pi-Conjugated Systems with Disjoint Substitution Pattern
The understanding of pi conjugation in organic electronics has largely grown from linear conjugated oligomers and polymers. Recent research enabled the synthesis of [n]cycloparaphenylenes (CPPs), which show unique electronic properties arising from radial pi conjugation. Previous work in this lab incorporated CPPs into linear conjugated systems to investigate fundamental properties of the curved pi surface. This dissertation details the design of disjointly-substituted CPPs incorporated into linear small molecule and polymeric systems, and synthetic attempts to extend functionalized CPPs with various aryl groups and fused polycyclic aromatic hydrocarbons (PAHs). Chapter 1 describes the design and optoelectronic characterization, in collaboration with Dr. Ramesh Jasti, of [8]CPP with disjointly substituted di-alkyne subunits, which allows for pi extension primarily through Sonogashira cross couplings to afford small molecule and polymer systems. New electronic states arise from multiple operative radial/linear conjugation pathways, as the disjoint pattern results in both ortho and meta connections to the CPP ring. Chapter 2 details oxidation studies of linearly extended CPPs and progress made on post-construction cyclization chemistry to form fused PAH/CPP hybrids. Chapter 3 discusses progress made toward synthesizing CPPs with directly linked aryl groups, including the design of terphenyl model compounds, and ends with proposed CPP materials targets. Chapter 4 introduces a design theory for organic diradicals through computational studies of linear conjugated diradicals with lower symmetry patterns, as well as CPP diradicals. Synthetic progress toward some of these low-symmetry molecules is included alongside considerations for designing and characterizing stable and persistent organic diradicals.
Modelling uncertainties for measurements of the H → γγ Channel with the ATLAS Detector at the LHC
The Higgs boson to diphoton (H → γγ) branching ratio is only 0.227 %, but this
final state has yielded some of the most precise measurements of the particle. As
measurements of the Higgs boson become increasingly precise, greater import is
placed on the factors that constitute the uncertainty. Reducing the effects of these
uncertainties requires an understanding of their causes. The research presented
in this thesis aims to illuminate how uncertainties on simulation modelling are
determined and proffers novel techniques in deriving them.
An upgrade of the FastCaloSim tool is described; this tool simulates events in
the ATLAS calorimeter at a rate far exceeding that of the nominal detector
simulation, Geant4. The integration of a method that allows the toolbox to
emulate the accordion geometry of the liquid argon calorimeters is detailed.
This upgrade allows for the production of larger samples while using
significantly fewer computing resources.
A measurement of the total Higgs boson production cross-section multiplied
by the diphoton branching ratio (σ × Bγγ) is presented; the measured value,
(σ × Bγγ)obs = 127 ± 7 (stat.) ± 7 (syst.) fb, is in agreement with the
Standard Model prediction. The signal and background shape modelling is
described, and the contribution of the background modelling uncertainty to the
total uncertainty ranges from 2.4 % to 18 %, depending on the Higgs boson
production mechanism.
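If the quoted statistical and systematic components are treated as uncorrelated (the standard assumption when no correlation is stated), they combine in quadrature to give the total uncertainty on the measurement above:

```python
import math

# Quoted components of the σ × Bγγ measurement, in fb.
stat, syst = 7.0, 7.0

# Uncorrelated components add in quadrature: total = sqrt(stat² + syst²).
total = math.hypot(stat, syst)
print(f"total uncertainty ≈ {total:.1f} fb")  # 9.9 fb
```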
A method for estimating the number of events in a Monte Carlo background
sample required to model the shape is detailed. It was found that the size of
the nominal γγ background events sample required a multiplicative increase by
a factor of 3.60 to adequately model the background with a confidence level of
68 %, or a factor of 7.20 for a confidence level of 95 %. Based on this estimate,
0.5 billion additional simulated events were produced, substantially reducing the
background modelling uncertainty.
A technique is detailed for emulating the effects of Monte Carlo event generator
differences using multivariate reweighting. The technique is used to estimate the
event generator uncertainty on the signal modelling of tHqb events, improving the
reliability of estimating the tHqb production cross-section. Then this multivariate
reweighting technique is used to estimate the generator modelling uncertainties
on background V γγ samples for the first time. The estimated uncertainties were
found to be covered by the currently assumed background modelling uncertainty.
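The reweighting idea underlying the technique above can be sketched in one dimension: derive per-bin weights that make a source generator's distribution match a target generator's, then propagate the reweighted sample. The thesis uses a multivariate version; the toy data, binning, and variable below are illustrative assumptions only:

```python
# 1-D histogram reweighting sketch: per-bin weights w = N_target / N_source
# make the source sample reproduce the target distribution bin by bin.
# The thesis applies a multivariate analogue; everything here is a toy.
import random

random.seed(1)
source = [random.gauss(0.0, 1.0) for _ in range(50_000)]  # generator A
target = [random.gauss(0.2, 1.1) for _ in range(50_000)]  # generator B

EDGES = [-4 + 0.5 * i for i in range(17)]  # 16 bins on [-4, 4]

def histogram(data):
    counts = [0] * (len(EDGES) - 1)
    for x in data:
        for i in range(len(counts)):
            if EDGES[i] <= x < EDGES[i + 1]:
                counts[i] += 1
                break
    return counts

hs, ht = histogram(source), histogram(target)
weights = [t / s if s else 1.0 for s, t in zip(hs, ht)]

# The reweighted source now matches the target bin by bin; the spread of
# the weights across events feeds the generator-modelling uncertainty.
```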