Short and slim nacelle design for ultra-high BPR engines
An optimisation method combining the non-dominated sorting genetic algorithm (NSGA-II) with computational fluid dynamics of aero-engine nacelles is outlined. The method is applied to three nacelle lengths to determine the relative performance of different ultra-high bypass ratio engine nacelles. Designs at each nacelle length are optimised for three objective functions: cruise drag, drag-rise Mach number, and the change in spillage drag from mid-cruise to end of cruise. The Pareto sets generated from these optimisations demonstrate that the design space for short nacelles is much narrower with respect to these performance metrics, and that short nacelles incur significant penalties at off-design conditions compared with longer nacelles. Specifically, for a nacelle with a drag-rise Mach number above 0.87, the minimum attainable spillage drag coefficient was 0.0040 for the shortest nacelle, compared with 0.0005 for a nacelle 23% longer.
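For readers unfamiliar with NSGA-II, the sketch below illustrates its core operation, extracting the non-dominated (Pareto) set, on synthetic stand-ins for the three objectives above. The objective values here are invented; in the study each would come from a CFD evaluation of a nacelle geometry.

```python
# Minimal sketch (not the authors' code) of Pareto-front extraction over
# hypothetical nacelle candidates scored on three objectives.
import numpy as np

rng = np.random.default_rng(1)

# Rows: candidate designs. Columns (all treated as minimisation objectives):
# [cruise drag coeff., -(drag-rise Mach number), spillage-drag change]
F = np.column_stack([
    rng.uniform(0.020, 0.040, 200),    # cruise drag
    -rng.uniform(0.84, 0.90, 200),     # negated: higher drag-rise Mach is better
    rng.uniform(0.0005, 0.0040, 200),  # mid-to-end-of-cruise spillage change
])

def pareto_front(F):
    """Indices of designs not dominated by any other design."""
    keep = np.ones(len(F), dtype=bool)
    for i in range(len(F)):
        # j dominates i if j is <= i in every objective and < in at least one.
        dominates_i = (F <= F[i]).all(axis=1) & (F < F[i]).any(axis=1)
        if dominates_i.any():
            keep[i] = False
    return np.where(keep)[0]

front = pareto_front(F)
print(f"{len(front)} non-dominated designs out of {len(F)}")
```

Since the drag-rise Mach number is to be maximised while the two drag measures are minimised, it is negated so that all three columns can be handled uniformly.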
An optimization method for nacelle design
A multi-objective optimisation method is demonstrated using an evolutionary genetic algorithm. Its applicability to preliminary nacelle design is demonstrated by coupling it with a response surface model spanning a wide range of nacelle designs. These designs were modelled using computational fluid dynamics, and a Kriging interpolation was carried out on the results. The NSGA-II algorithm was tested and verified on established multi-dimensional problems. Optimisation on the nacelle model produced three-dimensional Pareto surfaces of optimal designs at both cruise and off-design conditions. In setting up this methodology, several adaptations to the basic NSGA-II algorithm were tested, including constraint handling, weighted objective functions and initial sample size. The influence of these operators is demonstrated in terms of the hypervolume of the resulting Pareto set.
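A minimal sketch of the surrogate step described above, using a Gaussian-process (Kriging) model from scikit-learn; the two design parameters and the drag response below are synthetic placeholders, not the paper's data.

```python
# Fit a Kriging (Gaussian-process) response surface to a handful of
# expensive "CFD" samples so the optimiser can query it cheaply.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)

# Stand-in for a CFD campaign: a few design points and their drag values.
X = rng.uniform(0.0, 1.0, size=(30, 2))       # two nacelle shape parameters
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] ** 2  # hypothetical drag response

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
gp.fit(X, y)

# The optimiser evaluates the surrogate instead of re-running CFD.
x_query = np.array([[0.4, 0.6]])
mean, std = gp.predict(x_query, return_std=True)
print(f"predicted drag {mean[0]:.3f} +/- {std[0]:.3f}")
```

The point of the surrogate is that the genetic algorithm can query it thousands of times at negligible cost, reserving CFD runs for building and validating the response surface.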
A multimedia tutorial shell with qualitative assessment in biology
The project is developing methods to produce multimedia tutorials relatively quickly and cheaply, using a generic software shell suitable for any subject area. The shell is a version of one produced originally as part of the HEFC-funded TLTP initiative by the Biodiversity Consortium. Tutorials presented in the shell will provide students with a structured learning experience that allows their initial knowledge level, knowledge acquisition, and progress to be assessed both qualitatively and quantitatively. Where the assessment reveals areas of weakness, students will be advised to study particular parts of the tutorial in order to improve their understanding.
The Challenges of Prediction: Lessons from Criminal Justice
An appeal from a final judgment of the Second Circuit Court, State of Utah, Weber County, Ogden Department, Hon. Pamela Heffernan presiding.
Thermomechanical deformation behavior of a dynamic strain aging alloy, Hastelloy X
An experimental study was performed to identify the effects of dynamic strain aging (solute drag) and metallurgical instabilities under thermomechanical loading conditions. The study involved a series of closely controlled thermomechanical deformation (TMD) tests on the solid-solution-strengthened nickel-base superalloy Hastelloy X. This alloy exhibits a strong isothermal strain aging peak at approximately 600 °C, promoted by the effects of solute drag and precipitation hardening. Macroscopic thermomechanical hardening trends are correlated with microstructural characteristics through the use of transmission electron microscopy. These observations are compared and contrasted with isothermal conditions. Thermomechanical behavior unique relative to the isothermal database is identified and discussed. The microstructural characteristics were shown to be dominated by effects associated with the highest temperature of the thermomechanical cycle. Results indicate that the deformation behavior of Hastelloy X is thermomechanically path dependent. In addition, guidance is given pertaining to deformation modeling in the context of macroscopic unified theory. An internal state variable is formulated to qualitatively reflect the isotropic hardening trends identified in the TMD experiments.
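The abstract does not reproduce the authors' formulation, but a common form for an isotropic-hardening internal state variable $R$ in unified viscoplasticity models is the Voce-type evolution law

\[ \dot{R} = b\,(Q - R)\,\dot{p}, \qquad R(0) = 0 \;\;\Longrightarrow\;\; R = Q\left(1 - e^{-b p}\right) \ \text{for constant } Q, b, \]

where $p$ is the accumulated inelastic strain, $Q$ the saturated hardening, and $b$ its saturation rate. Capturing the path dependence reported above would amount to letting $Q$ (and possibly $b$) depend on temperature history, for example on the peak temperature of the thermomechanical cycle.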
Ceramic Petrographic Analysis of Sites 41CP71, 41BW2, 41BW5, and 41SM442, Northeast Texas
A total of 61 ancestral Caddo ceramic sherds from four village sites in Northeast Texas were studied by ceramic petrographic methods in 2014. The sample sherds were excavated from their sites under controlled conditions and are either archived at the Texas Archeological Research Laboratory at The University of Texas at Austin (41BW2) or at Stephen F. Austin State University (41CP71), or remain in private hands (41SM442). Recently, they were selected for combined petrographic and instrumental neutron activation analysis (INAA), although only the sherds from the Alligator Pond site (41SM442) have actually been submitted and analyzed by INAA at this time. This combination of approaches is part of a change in Northeast Texas ceramic technological studies termed a "second generation" by some (Robinson 2014), although such multiple combined approaches have long been advocated and applied in the general archaeological literature. The approach examines the geochemical and petrological characteristics of ceramics in tandem to gain a broader and more informative picture of the character of ancient pottery. This study is the petrographic branch of the overall approach; the objective here is to gain clues about the local, community, and regional scales of Caddo ceramic production and distribution. Part of this effort is an attempt to identify the localities and types of clay beds used in ceramic manufacture.
Integrals of Periodic Functions
Computing integrals of powers of the sine function is a standard exercise in calculus. The authors show that this integral is representative of the integral of any periodic function: it is the sum of a linear term, whose slope is the mean value of the function, and a periodic term.
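Stated concretely (a standard result the abstract alludes to): if $f$ has period $T$ and mean value $\bar{f} = \frac{1}{T}\int_0^T f(t)\,dt$, then

\[ \int_0^x f(t)\,dt = \bar{f}\,x + g(x), \]

where $g$ is again $T$-periodic, since $g(x+T) - g(x) = \int_x^{x+T} f(t)\,dt - \bar{f}\,T = 0$. The calculus exercise is the special case $\int_0^x \sin^2 t\,dt = \frac{x}{2} - \frac{\sin 2x}{4}$: a linear term with slope equal to the mean value $\frac{1}{2}$, plus a periodic remainder.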
Danger Ahead: Risk Assessment and the Future of Bail Reform
In the last five years, legislators in all fifty states have made changes to their pretrial justice systems. Reform efforts aim to shrink jails by incarcerating fewer people—particularly poor, low-risk defendants and racial minorities. Many jurisdictions are embracing pretrial risk assessment instruments—statistical tools that use historical data to forecast which defendants can safely be released—as a centerpiece of reform. Now, many are questioning the extent to which pretrial risk assessment instruments actually serve reform goals. Existing scholarship and debate center on how the instruments themselves may reinforce racial disparities and on how their opaque algorithms may frustrate due process interests. This Article highlights three underlying challenges that have yet to receive the attention they require. First, today’s risk assessment tools lead to what we term “zombie predictions.” That is, predictive models trained on data from older bail regimes are blind to the risk-reducing benefits of recent bail reforms. This may cause predictions that systematically overestimate risk. Second, “decision-making frameworks” that mediate the court system’s use of risk estimates embody crucial moral judgments, yet currently escape appropriate public scrutiny. Third, in the long term, these tools risk giving an imprimatur of scientific objectivity to ill-defined concepts of “dangerousness,” may entrench the Supreme Court’s historically recent blessing of preventive detention for dangerousness, and could pave the way for an increase in preventive detention. Pretrial risk assessment instruments, as they are currently built and used, cannot safely be assumed to support reformist goals of reducing incarceration and addressing racial and poverty-based inequities. This Article contends that system stakeholders who share those goals are best off focusing their reformist energies on other steps that can more directly promote decarceral changes and greater equity in pretrial justice. Where pretrial risk assessments remain in use, this Article proposes two vital steps that should be seen as minimally necessary to address the challenges surfaced. First, where they choose to embrace risk assessment, jurisdictions must carefully define what they wish to predict, gather and use local, recent data, and continuously update and calibrate any model on which they choose to rely, investing in a robust data infrastructure where necessary to meet these goals. Second, instruments and frameworks must be subject to strong, inclusive governance.
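A minimal synthetic sketch of the "zombie prediction" problem described above; every number here is hypothetical, chosen only to show how a model fitted to pre-reform outcomes overestimates risk once reforms have lowered the true failure rate.

```python
# Synthetic illustration of "zombie predictions": a risk model fitted to
# old-regime data overestimates failure rates under a reformed regime.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# One illustrative feature (e.g. number of prior arrests), with the same
# distribution in both eras.
x = rng.poisson(2.0, size=n)

def failure_prob(x, base):
    # Hypothetical logistic link between the feature and pretrial failure.
    return 1.0 / (1.0 + np.exp(-(base + 0.5 * x)))

# Old regime: higher baseline failure rate (no reminders or support).
y_old = rng.random(n) < failure_prob(x, base=-1.0)
# New regime: reforms lower the baseline rate for the same defendants.
y_new = rng.random(n) < failure_prob(x, base=-2.0)

# "Model" = empirical failure rate per feature value, fitted on old data only.
model = {v: y_old[x == v].mean() for v in np.unique(x)}
predicted = np.array([model[v] for v in x])

print(f"mean risk predicted by old-regime model: {predicted.mean():.3f}")
print(f"actual failure rate under new regime:    {y_new.mean():.3f}")
```

The model's average prediction tracks the old regime's failure rate, noticeably higher than the post-reform rate in this toy setup, which is the Article's point: without recent local data and continuous recalibration, the instrument cannot see the risk reduction the reforms produced.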