1,467 research outputs found
De-grading Assessment: Rejecting Rubrics in Favor of Authentic Analysis
Assigning grades is the least joyful duty of the law professor. In the current climate of legal education, law professors struggle with issues such as increased class sizes, producing “practice-ready” graduates, streamlining assignments, and accountability in assessment. In an effort to ease the burden of grading written legal analyses, individual professors, law school writing programs, or both may develop articulated rubrics to assess students’ written work. Rubrics are classification tools that allow us to articulate our judgment of a written work. Rubrics may be as extensive as twenty categories and subcategories or may be limited to only a few criteria. By definition, rubrics require the development of rigid, standardized criteria that the student must fulfill to earn a certain number of points. Points earned in each section of the rubric are totaled to form the basis for the student’s grade. In assessing legal analyses according to a standardized rubric, however, many subtleties of structure or content, and much of the creativity of legal writing, are lost or unrewarded or both. Using a rubric to assess legal analytical writing may produce the exact opposite of the intended result: an excellent and creatively written persuasive brief or legal analytical argument may “fail” the rubric and earn a lower overall grade, while a legal analysis that fulfills the exacting criteria of the rubric may earn a top grade despite lacking the intangible aspects of excellent persuasive writing. Good writing does not result when the writer is locked into the matrix of a rubric. Rubrics may impair writing and result in bad legal analytical writing. Rubrics replace the authentic, holistic analysis of writing and reasoning with inauthentic pigeonholing that “stamps standardization” onto a creative and analytical, that is, nonstandard, process. A holistic approach to grading and evaluating legal analytical writing, including engaging in authentic conversations about writing, leads to more comprehensible written work product and ultimately to better lawyering.
Assessment of the performance of alternative aviation fuel in a modern air-spray combustor (MAC)
Recent concerns over energy security and environmental considerations have highlighted the importance of finding alternative aviation fuels. It is expected that coal- and biomass-derived fuels will fulfil a substantial part of these energy requirements. However, because of the physical and chemical differences in the composition of these fuels, there are potential problems associated with the efficiency and the emissions of the combustion process. Over the past 25 years Computational Fluid Dynamics (CFD) has become increasingly popular with the gas turbine industry as a design tool for establishing and optimising key parameters of systems prior to starting expensive trials. In this paper the performance of a typical aviation fuel (kerosene), an alternative aviation fuel (biofuel) and a blend of the two is examined using CFD modelling. A good knowledge of the kinetics of the reaction of bio-aviation fuels at both high and low temperature is necessary to perform reliable simulations of ignition, combustion and emissions in aero-engines. A novel detailed reaction mechanism was used to represent the aviation fuel oxidation chemistry. The fuel combustion is calculated with a 3D commercial solver using a mixture fraction/PDF approach. Firstly, the study demonstrates that CFD predictions compare favourably with experimental data obtained by QinetiQ for a Modern Airspray Combustor (MAC) when used with traditional jet fuel (kerosene). Furthermore, the 3D CFD model has been refined to use the laminar flamelet model (LFM) approach, which incorporates recently developed chemical reaction mechanisms for the bio-aviation fuel; this has enabled predictions for the bio-aviation fuel to be made. The blended fuel has been shown to perform very similarly to 100% kerosene, confirming that aircraft running on 20% blended fuel should suffer no significant reduction in performance. It was also found that, for the given operating conditions, there is a significant reduction in performance when 100% biofuel is used. Additionally, interesting predictions were obtained relating to NOx emissions for the blend and for 100% biofuel.
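The mixture fraction/PDF machinery this abstract refers to can be illustrated compactly. The sketch below shows a minimal assumed-beta-PDF average over a toy flamelet temperature profile; the profile, stoichiometric value and moments are all hypothetical illustrations, not values from the paper or from QinetiQ's MAC data:

```python
import numpy as np
from scipy.stats import beta

# Toy laminar-flamelet profile: temperature vs. mixture fraction Z.
# Peak temperature and stoichiometric Z are illustrative only.
Z = np.linspace(0.0, 1.0, 501)
Z_st = 0.063
T = 300.0 + 1900.0 * np.exp(-((Z - Z_st) / 0.08) ** 2)   # K

def presumed_pdf_mean(Z_mean, Z_var):
    """Mean temperature from an assumed beta-PDF of mixture fraction."""
    g = Z_mean * (1.0 - Z_mean) / Z_var - 1.0     # common shape factor
    a, b = Z_mean * g, (1.0 - Z_mean) * g         # beta-PDF parameters
    pdf = beta.pdf(Z, a, b)
    return np.trapz(T * pdf, Z) / np.trapz(pdf, Z)

print(f"mean T = {presumed_pdf_mean(0.07, 0.002):.0f} K")
```

In a production flamelet solver this integral is typically precomputed into lookup tables over the mean and variance of Z rather than evaluated at run time.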
A study of possible sea state information in the sample and hold gate statistics for the GEOS-3 satellite altimeter
The statistical variations in the sample gate outputs of the GEOS-3 satellite altimeter were studied for possible sea state information. After examination of a large number of statistical characteristics of the altimeter waveforms, it was found that the best sea-state predictor for significant wave height H1/3 in the range of 0 to 3 meters was the 75th percentile of sample-and-hold gate number 11.
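As a concrete illustration of the kind of predictor the study settled on, the fragment below computes a per-frame 75th percentile over a block of gate samples. The array shape and synthetic values are hypothetical stand-ins; the real statistics come from the GEOS-3 waveform telemetry:

```python
import numpy as np

# Hypothetical frames of sample-and-hold gate 11 outputs
# (200 frames x 64 samples; synthetic stand-in data).
rng = np.random.default_rng(0)
gate11 = rng.normal(1.0, 0.2, size=(200, 64))

# Candidate sea-state predictor: per-frame 75th percentile,
# to be regressed against ground-truth H1/3 measurements.
predictor = np.percentile(gate11, 75, axis=1)
```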
Hydraulic free-surface modelling with a novel validation approach
This work shows that a three-dimensional transient two-phase RANS CFD-VOF model can be used to predict the position of waves and hydraulic jumps within a complex hydraulic flow environment, as measured during a series of full-scale experiments. A novel application of LIDAR is used to provide detailed measurements of the location of the water free surface during the physical experiments. The test environment is a recreational white-water course that provides a means to vary the flow rates of water and restrict the flow easily as required. Obstructions are added to the channel to create hydraulic jumps and other specific flow features, and their influence on the flow has been analysed in terms of size, velocity and position. The results of the study demonstrate that, although computationally intensive, the free-surface CFD approach can reliably predict a range of complex hydraulic flow features in medium- to large-scale open channel flow conditions. In order to reliably capture the full three-dimensional characteristics of the water free surface, a high-resolution mesh (greater than 2.5 million cells) with time steps on the order of milliseconds is necessary (the simulations presented here represent between 30 and 60 seconds of real time).
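The comparison at the heart of this validation can be sketched in a few lines: extract the free-surface elevation from the VOF field as the 0.5 water-fraction crossing in a vertical column, then difference it against the LIDAR return at the same location. Everything below (grid, smeared interface, LIDAR value) is illustrative, not data from the study:

```python
import numpy as np

def free_surface_height(z, alpha, level=0.5):
    """Elevation where the VOF water fraction crosses `level` in a column."""
    i = np.argmax(alpha < level)                  # first cell on the air side
    f = (alpha[i - 1] - level) / (alpha[i - 1] - alpha[i])
    return z[i - 1] + f * (z[i] - z[i - 1])       # linear interpolation

# Toy vertical column: water up to ~1.2 m with a smeared interface.
z = np.linspace(0.0, 2.0, 81)
alpha = np.clip(1.0 - (z - 1.15) / 0.1, 0.0, 1.0)

h_cfd = free_surface_height(z, alpha)
h_lidar = 1.18                                    # hypothetical LIDAR return, m
print(f"CFD {h_cfd:.3f} m vs LIDAR {h_lidar:.3f} m "
      f"({abs(h_cfd - h_lidar) * 1e3:.0f} mm difference)")
```

Aggregating such point differences over many LIDAR returns gives an RMSE for the predicted surface, which is one natural way to score the CFD against the full-scale measurements.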
A New Zealand regional work-related sprains and strains surveillance, management and prevention programme: study protocol.
BACKGROUND: The impact and costs associated with work-related sprains and strains in New Zealand and globally are substantial and a major occupational and public health burden. In New Zealand around one-third of all sprains and strains workers' compensation (ACC) claims (2019) are for back injuries, but shoulder and arm injuries are increasing at a faster rate than other sprain and strain injuries (ACC, 2020). A change to current approaches to sprains and strains prevention is needed to manage this significant and persistent workplace problem more effectively. Designing out hazards is one of the most effective means of preventing occupational injuries and illnesses. This paper outlines the study protocol of the surveillance, management and prevention programme and describes the use of prevention through design (PtD) principles in the prevention of work-related sprains and strains in agriculture/horticulture/food production in the Hawkes Bay region of New Zealand. METHODS: This is a prospective mixed-methods study. Quantitative data will be collected to describe the epidemiology of work-related sprains and strains injuries presenting to the regional health centre (Hastings Health Centre) over a period of 24 months. Qualitative data from participants presenting at the health centre will be used to identify high-risk industry sectors, occupations, workplaces and tasks; to design, develop and apply PtD principles, solutions and interventions to critical features of the work and work environment; and to undertake an outcome evaluation during the last 6 months of the project. DISCUSSION: The purpose of this project is to establish an epidemiological surveillance programme to assess the incidence and prevalence of work-related sprains and strains by age, sex, industry sector and occupation, so that prevention efforts can be targeted by applying PtD principles in selected workplaces in agriculture. The collection of more detailed case, occupational and work history data from a sample of patients presenting at the HHC clinic will identify high-risk industry sectors, occupations, workplaces and tasks. Assessment techniques will include comprehensive design, design thinking and human factors/ergonomics methodologies through co-design and participatory ergonomics techniques. The PtD solutions and interventions implemented will be evaluated using a quasi-experimental design consisting of a pre-test/post-test within-subjects design with control groups that do not receive the intervention.
A Cognitively-Oriented Approach to Task Analysis and Test Development
Clear descriptions of job expertise are required to support applications and improvements in personnel training and job performance. This report describes a practical approach to task analysis that integrates the issues, content, and methods of cognitive science and personnel psychology. Cognitively oriented task analysis employs a breadth, then depth, strategy for identifying job expertise. Starting with a task-by-knowledge framework, job expertise is successively elaborated using interviews, expert ratings, and protocol analyses. The application of task analysis results to the development of written performance measures is described to illustrate the contributions of this approach to measurement validity. Task analysis results show that much of what has been missing in using existing task analysis methods is the mental aspects of performance related to interactions among task dimensions, task characteristics, and contexts. Two appendixes provide an example of knowledge elicitation and representation and item writing guidelines for performance measures
Behaviour of the Blazar CTA 102 during two giant outbursts
Blazar CTA 102 underwent exceptional optical and high-energy outbursts in 2012 and 2016-2017. We analyze its behaviour during these events, focusing on polarimetry as a tool that allows us to trace changes in the physical conditions and geometric configuration of the emission source close to the central black hole. We also use Fermi gamma-ray data in conjunction with optical photometry in an effort to localize the origin of the outbursts. (Grant AST-1615796, Boston University)
Auto-calibration of ultrasonic lubricant-film thickness measurements
The measurement of oil film thickness in a lubricated component provides essential information for performance monitoring and design. It is well established that such measurements can be made ultrasonically if the lubricant film is modelled as a collection of small springs. The ultrasonic method normally requires that the component faces be separated and a reference reflection recorded in order to obtain the reflection coefficient value from which film thickness is calculated. The novel and practically useful approach put forward in this paper, and validated experimentally, allows reflection coefficient measurement without the requirement for a reference. It involves simultaneously measuring the amplitude and phase of an ultrasonic pulse reflected from a layer. Provided that the acoustic properties of the substrate are known, the theoretical relationship between the two can be fitted to the data to yield the reflection coefficient amplitude and phase for an infinitely thick layer. This is equivalent to measuring a reference signal directly but, importantly, does not require the materials to be separated. A further valuable aspect of this approach, demonstrated experimentally, is its ability to be used as a self-calibrating routine, inherently compensating for temperature effects: the relationship between amplitude and phase is unaffected by the changes in temperature that cause unwanted changes to the incident pulse. Finally, an error analysis is performed showing how the accuracy of the results can be optimized. A finding of particular significance is the strong dependence of the accuracy of the technique on the amplitude of the reflection coefficient input data, which places some limitations on the applicability of the technique.
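The "collection of small springs" model the abstract builds on has a standard closed form: the film acts as a stiffness per unit area K = B/h, with B the oil's bulk modulus (ρc²), and the measured reflection-coefficient magnitude then fixes K and hence h. The sketch below inverts that textbook spring-model relation; the material values are illustrative, and this is the conventional inversion, not the paper's self-calibrating amplitude-phase fit, which sits upstream of this step in obtaining the reflection coefficient without a reference:

```python
import numpy as np

def film_thickness(R_mag, freq, z1, z2, rho_oil, c_oil):
    """Oil-film thickness from |R| via the quasi-static spring model.

    Film stiffness per unit area: K = B/h with B = rho_oil * c_oil**2.
    z1, z2 are the acoustic impedances of the bounding solids.
    """
    w = 2.0 * np.pi * freq
    K = w * z1 * z2 * np.sqrt((1.0 - R_mag**2) /
                              (R_mag**2 * (z1 + z2)**2 - (z1 - z2)**2))
    return rho_oil * c_oil**2 / K

# Illustrative steel-oil-steel contact interrogated at 10 MHz.
z_steel = 7800.0 * 5900.0              # impedance = density * sound speed
h = film_thickness(R_mag=0.95, freq=10e6, z1=z_steel, z2=z_steel,
                   rho_oil=870.0, c_oil=1450.0)
print(f"film thickness ~ {h * 1e6:.2f} um")   # a few micrometres
```

Note how strongly h depends on |R| near 1, which is consistent with the abstract's finding that accuracy hinges on the amplitude of the reflection coefficient input data.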