1,532 research outputs found

    Thirty Years of New Mexico Architecture Magazine

    Get PDF

    Comparison of Three Degree of Freedom and Six Degree of Freedom Motion Bases Utilizing Classical Washout Algorithms

    Get PDF
    Vehicle simulations have many practical applications, ranging from leisure use and public safety to civilian training and commercial design. Vehicle simulations are commonly designed to work with three-degree-of-freedom or six-degree-of-freedom motion bases. Each system has pros and cons that make it more suitable for specific applications, and the two differ drastically in hardware cost, software cost, and complexity. The three-degree-of-freedom system is limited to rotation about the x, y, and z axes, so it can only create the sensation of acceleration through rotation about these three axes and by maintaining a tilt angle and using gravity. The tilt in this system is typically limited to a 45-degree angle, which prevents the system from creating a sensation of sustained acceleration above 0.707 g. The six-degree-of-freedom system can not only rotate about the three axes but also translate along all three. One would expect this system to reproduce these forces more accurately, but it is limited by the lateral travel available along each of the three axes. This gives the system the ability to produce additional acceleration, which theoretically allows it to change accelerations more freely. Each system is tuned by placing limits on its ability to rotate and translate, to ensure that the simulation does not request accelerations beyond the capabilities of the motion base. This prevents the system from creating undesired sensations that should not be present, although these limits will not prevent all undesired sensations. Altering control parameters in the washout algorithms is another method of tuning that can prevent or limit the magnitude of these sensations. The purpose of this study is to compare the two systems' ability to replicate the forces, or feel, of actually riding in the vehicle. This comparison will show whether or not the extra cost and effort of designing a more complex system is warranted for motion base applications.
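    The 0.707 g figure is simple tilt-coordination arithmetic: holding a pitch angle θ lets gravity supply a sustained longitudinal acceleration of g·sin θ, so a 45-degree tilt sustains at most sin(45°) ≈ 0.707 g. A minimal Python sketch of that calculation follows; the 45-degree cap is the limit quoted above, while the function and variable names are purely illustrative:

    import math

    G = 9.81  # gravitational acceleration, m/s^2

    def sustained_accel_from_tilt(tilt_deg):
        # Longitudinal acceleration a rider feels from holding a tilt angle,
        # i.e. the component of gravity resolved along the body axis.
        return G * math.sin(math.radians(tilt_deg))

    max_tilt_deg = 45.0  # typical 3-DOF tilt limit quoted in the abstract
    a_max = sustained_accel_from_tilt(max_tilt_deg)
    print(f"max sustained acceleration: {a_max / G:.3f} g")  # ~0.707 g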

    Rapid Uplift Of Southern Alaska Caused By Recent Ice Loss

    Get PDF
    Thesis (Ph.D.), University of Alaska Fairbanks, 2003. Changing surface loads, such as melting glaciers, can induce deformation of the Earth's crust. The speed of the Earth's response to load changes and the pattern of deformation they cause can be used to infer material properties of the lithosphere and mantle. Rapid uplift of southern Alaska has been measured with tide gauges, Global Positioning System (GPS) measurements, and studies of raised shorelines. With multiple sites uplifting at rates in excess of 25 mm/yr, these measurements reveal the world's fastest regional uplift. Southern Alaska has over 75,000 km² of glaciers, the rapid melting of which is contributing more to global sea level rise than Greenland. Southern Alaska also has intense tectonic activity, and uplift driven by tectonics has been suggested to be comparable with that driven by glacial unloading. The majority of the uplift measurements examined here are located along the strike-slip portion of the Pacific - North America plate boundary. GPS measurements show little compressional strain associated with tectonic forcing. Tide gauges indicate long-term linear uplift rates within the strike-slip regime, contrasting with tectonically influenced non-linear uplift to the northwest, where the Pacific Plate subducts beneath North America. Dating of raised shorelines within southeast Alaska shows that the rapid uplift there began simultaneously with glacial unloading, ~1790 AD. These observations indicate that the tectonic contribution to the uplift in southeast Alaska is small. Multiple independent studies are used here to constrain the load changes in southern Alaska over the past ~1000-2000 yrs. A detailed model of the advance, standstill, and retreat phases of the Little Ice Age glaciation is used as input to a simple viscoelastic Earth model. This model can match the pattern and magnitude of the region's uplift observations with a low degree of misfit, verifying that the region's uplift can be entirely attributed to glacial isostatic rebound. Furthermore, the uplift observations require at the 95% confidence level a three-layer Earth model consisting of a 50 (+30/−25) km thick elastic lithosphere, an asthenosphere with viscosity η_A = (1.4 ± 0.3) × 10^19 Pa s and thickness 110 (+20/−15) km, overlying a viscous upper-mantle half-space (η_UM = 4 × 10^20 Pa s).
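    As a rough illustration of why an asthenosphere with the quoted viscosity responds on the decadal-to-century timescales implied by the ~1790 AD onset of uplift, one can form the Maxwell relaxation time from that viscosity; the shear modulus used here (μ ≈ 7 × 10^10 Pa, a typical upper-mantle value) is an assumption for illustration, not a figure from the thesis:

    \tau_M = \frac{\eta_A}{\mu} \approx \frac{1.4 \times 10^{19}\ \mathrm{Pa\,s}}{7 \times 10^{10}\ \mathrm{Pa}} \approx 2 \times 10^{8}\ \mathrm{s} \approx 6\ \mathrm{yr}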

    A case study of the cognitive apprenticeship model in leadership education

    Get PDF
    The cognitive apprenticeship model (CAM) has been examined for more than a quarter century as an instructional model from the perspective of instructors. However, CAM is also a learning model. Remarkably little has been offered regarding the manner in which learners experience this model, yet such perspectives are relevant to the successful design of CAM for instruction and learning. Accordingly, this research sought to describe learner perspectives, motivations, and coping strategies through the lived experiences of students as they used CAM within an education program to develop leadership competencies. Collins, Brown, and Newman's (1987) seminal work on CAM followed the theoretical traditions of Piaget, Bandura, and Vygotsky in cognitive and social learning models. Collins et al. elaborated beyond the physical task mastery of traditional apprenticeships to uncover the tacit knowledge within cognitive apprenticeships by asking, "How do masters think?" That past work raises new questions: How do learners describe their experiences using CAM in education? And are learner and instructor perspectives of mastery congruent? This research developed a case study using a grounded theory technique. Four students nearing the end of a three-year leadership program participated over the duration of a weeklong leadership session. The findings were that (1) learners preferred to explore through non-evaluated play; (2) failure elicited greater effort only if the learner initially expected to succeed; (3) humor was a preferred learner coping strategy; and (4) the learner's emotional state influenced adherence to the cognitive model. These findings suggest that four key assumptions of learner participation in CAM require further study and refinement.

    Tic: a Methodology for the Improvement of DHCP

    Full text link
    Linked lists must work. In fact, few biologists would disagree with the deployment of e-business, which embodies the private principles of artificial intelligence. Our focus in this paper is not on whether lambda calculus can be made cacheable, event-driven, and pervasive, but rather on proposing a methodology for the investigation of IPv6 (Tic) [4].

    Refinements to data acquired by 2-dimensional video disdrometers

    Get PDF
    The 2-Dimensional Video Disdrometer (2DVD) is a commonly used tool for exploring rain microphysics and for validating remotely sensed rain retrievals. Recent work has revealed a persistent anomaly in 2DVD data. Early investigations of this anomaly concluded that the resulting errors in rain measurement were modest, but the methods used to flag anomalous data were not optimized, and related considerations associated with the sample sensing area were not fully investigated. Here, we (i) refine the anomaly-detecting algorithm for increased sensitivity and reliability and (ii) develop a related algorithm for refining the estimate of the sample sensing area for all detected drops, including those not directly impacted by the anomaly. Using these algorithms, we explore the corrected data to measure any resulting changes to estimates of bulk rainfall statistics from two separate 2DVDs deployed in South Carolina, which together account for approximately 10 years of instrument uptime. Analysis of this data set, consisting of over 200 million drops, shows that the error induced in estimated total rain accumulations by using the manufacturer-reported area is larger than the error due to considerations related to the anomaly. The algorithms presented here imply that approximately 4.2% of detected drops are spurious and that the mean reported effective sample area for drops believed to be correctly detected is overestimated by ~8.5%. Simultaneously accounting for all of these effects suggests that the total accumulated rainfall in the data record is approximately 1.1% larger than the raw data record suggests.
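    The corrections described above act at the level of individual drops: each drop contributes its volume divided by its effective sample area to the accumulated depth, so discarding spurious drops and shrinking the effective area pull the total in opposite directions. A minimal Python sketch of that bookkeeping follows, with hypothetical drop volumes, areas, and anomaly flags standing in for real 2DVD output (none of the numbers below come from the study):

    import numpy as np

    def rain_accumulation_mm(volumes_mm3, areas_mm2, keep_mask):
        # Total rain depth: sum of (drop volume / effective sample area)
        # over the drops that survive the anomaly check.
        v = np.asarray(volumes_mm3)[keep_mask]
        a = np.asarray(areas_mm2)[keep_mask]
        return float(np.sum(v / a))  # mm of accumulated rain

    # Hypothetical data: 5 drops, one flagged as spurious.
    volumes = np.array([0.5, 4.2, 1.1, 0.9, 2.7])        # mm^3
    areas_reported = np.full(5, 1.0e4)                    # mm^2, manufacturer value
    areas_refined = areas_reported / 1.085                # reported area ~8.5% too large (illustrative)
    keep = np.array([True, True, False, True, True])      # drop the spurious detection

    raw = rain_accumulation_mm(volumes, areas_reported, np.ones(5, bool))
    corrected = rain_accumulation_mm(volumes, areas_refined, keep)
    print(f"raw: {raw:.4f} mm, corrected: {corrected:.4f} mm")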

    Global optimization for accurate determination of EBSD pattern centers

    Full text link
    Accurate pattern center determination has long been a challenge for the electron backscatter diffraction (EBSD) community and is becoming critically accuracy-limiting for more recent advanced EBSD techniques. Here, we study the parameter landscape over which a pattern center must be fitted in quantitative detail and reveal that it is both sloppy and noisy, which limits the accuracy to which pattern centers can be determined. To locate the global optimum in this challenging landscape, we propose a combination of two approaches: the use of a global search algorithm and averaging the results from multiple patterns. We demonstrate the ability to accurately determine pattern centers of simulated patterns, inclusive of effects of binning and noise on the error of the fitted pattern center. We also demonstrate the ability of this method to accurately detect changes in pattern center in an experimental dataset with noisy and highly binned patterns. Source code for our pattern center fitting algorithm is available online
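    The recipe described above (a global search over the pattern-center fit landscape, repeated over several patterns and averaged) can be sketched as follows. The misfit function, search bounds, and pattern list are user-supplied placeholders, and SciPy's differential evolution stands in for whichever global optimizer one prefers; this is an illustration of the general approach, not the paper's released implementation:

    import numpy as np
    from scipy.optimize import differential_evolution

    def fit_pattern_center(misfit, patterns, bounds):
        # misfit(pc, pattern) -> scalar error for a candidate pattern center pc.
        # bounds: [(lo, hi), (lo, hi), (lo, hi)] search box for (PCx, PCy, PCz).
        fits = []
        for pattern in patterns:
            result = differential_evolution(misfit, bounds, args=(pattern,),
                                            seed=0, tol=1e-8, polish=True)
            fits.append(result.x)
        fits = np.array(fits)
        # Averaging the per-pattern fits suppresses the noise in each individual fit.
        return fits.mean(axis=0), fits.std(axis=0)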