Helicopter human factors research
Helicopter flight is among the most demanding of all human-machine integrations. The inherent manual control complexities of rotorcraft are made even more challenging in certain operations, such as nap-of-the-Earth (NOE) flight, where the proximity of the terrain leaves a small margin for error. Accident data recount numerous examples of unintended conflict between helicopters and terrain and attest to the perceptual and control difficulties associated with low-altitude flight tasks. Ames Research Center, in cooperation with the U.S. Army Aeroflightdynamics Directorate, has initiated an ambitious research program aimed at increasing safety margins for both civilian and military rotorcraft operations. The program is broad, fundamental, and focused on the development of scientific understanding and technological countermeasures. Research being conducted in several areas is reviewed: workload assessment, prediction, and measure validation; development of advanced displays and effective pilot/automation interfaces; identification of visual cues necessary for low-level, low-visibility flight and modeling of visual flight-path control; and pilot training.
Considering Human Aspects on Strategies for Designing and Managing Distributed Human Computation
A human computation system can be viewed as a distributed system in which the
processors are humans, called workers. Such systems harness the cognitive power
of a group of workers connected to the Internet to execute relatively simple
tasks, whose solutions, once grouped, solve a problem that systems equipped
with only machines could not solve satisfactorily. Examples of such systems are
Amazon Mechanical Turk and the Zooniverse platform. A human computation
application comprises a group of tasks, each of which can be performed by one
worker. Tasks might have dependencies among each other. In this study, we
propose a theoretical framework to analyze this type of application from a
distributed systems point of view. Our framework is established on three
dimensions that represent different perspectives in which human computation
applications can be approached: quality-of-service requirements, design and
management strategies, and human aspects. By using this framework, we review
human computation from the perspective of programmers seeking to improve the
design of human computation applications and managers seeking to increase the
effectiveness of human computation infrastructures in running such
applications. In doing so, besides integrating and organizing what has been
done in this direction, we also put into perspective the fact that the human
aspects of the workers in such systems introduce new challenges in terms of,
for example, task assignment, dependency management, and fault prevention and
tolerance. We discuss how they are related to distributed systems and other
areas of knowledge.
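The task/worker model described above can be made concrete with a small scheduling sketch. This is an illustrative example only, not the paper's framework: it assigns dependency-constrained tasks to workers round-robin in a dependency-respecting order (Kahn's algorithm); all function and variable names are hypothetical.

```python
from collections import deque

def assign_tasks(tasks, deps, workers):
    """Greedy round-robin assignment of tasks to workers, in an order
    that respects inter-task dependencies (Kahn's algorithm).
    tasks: list of task ids; deps: dict task -> set of prerequisite tasks;
    workers: list of worker ids. Returns dict worker -> ordered task list."""
    indeg = {t: len(deps.get(t, ())) for t in tasks}
    dependents = {t: [] for t in tasks}
    for t, prereqs in deps.items():
        for p in prereqs:
            dependents[p].append(t)
    ready = deque(t for t in tasks if indeg[t] == 0)   # tasks with no pending deps
    schedule = {w: [] for w in workers}
    i = 0
    while ready:
        t = ready.popleft()
        schedule[workers[i % len(workers)]].append(t)  # round-robin dispatch
        i += 1
        for d in dependents[t]:                        # release dependent tasks
            indeg[d] -= 1
            if indeg[d] == 0:
                ready.append(d)
    return schedule
```

For example, with tasks `a`, `b`, `c` where `c` depends on both `a` and `b`, two workers receive `a` and `b` in parallel and `c` is dispatched only after both complete. Real platforms must additionally handle the human aspects the abstract highlights, such as worker faults and re-assignment.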
Inventory drivers in a pharmaceutical supply chain
In recent years, inventory reduction has been a key objective of pharmaceutical companies, especially within cost optimization initiatives. Pharmaceutical supply chains are characterized by volatile and unpredictable demand (especially in emerging markets), high service levels, and complex, perishable finished-goods portfolios, which makes keeping reasonable amounts of stock a true challenge. However, a one-way strategy towards zero inventory is in practice inapplicable, given the strategic nature and importance of the products being commercialized. Therefore, pharmaceutical supply chains need new inventory strategies in order to remain competitive.
Finished-goods inventory management in the pharmaceutical industry is closely related to the manufacturing systems and supply chain configurations that companies adopt. The factors considered in inventory management policies, however, do not always cover the full supply chain spectrum in which companies operate. This paper works under the pre-assumption that, in fact, there is a complex relationship between the inventory configurations that companies adopt and the factors behind them.
The intention of this paper is to understand the factors driving high finished-goods inventory levels in pharmaceutical supply chains and to assist supply chain managers in determining which of them can be influenced in order to reduce inventories to an optimal degree. Reasons for reducing inventory levels are found in high inventory holding and scrap-related costs, in addition to sales lost by failing to serve customers with adequate shelf-life requirements. The study conducts single-case research in a multinational pharmaceutical company, which is used to examine typical inventory configurations and the factors affecting these configurations.
This paper presents a framework that can assist supply chain managers in determining the most important inventory drivers in pharmaceutical supply chains. The findings of this study suggest that while external and downstream supply chain factors are recognized as critical to inventory optimization initiatives, pharmaceutical companies are oriented towards optimizing production processes and meeting regulatory requirements while still complying with high service levels; internal factors are thus the ones that prevail in inventory management decisions.
Furthermore, this paper investigates, through predictive modelling techniques, how various intrinsic and extrinsic factors influence the inventory configurations of the case study company. The study shows that inventory configurations are relatively unstable over time, especially configurations with high safety stock levels, and that production features and product characteristics are important explanatory factors behind high inventory levels. Regulatory requirements also play an important role in explaining the high strategic inventory levels that pharmaceutical companies hold.
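As a minimal illustration of the service-level/inventory trade-off discussed above, the classic textbook safety-stock and reorder-point formulas (SS = z · σ_d · √L) can be sketched as follows. This is standard inventory theory, not the paper's predictive model; the 98% service-level z-value and all names are assumptions for the example.

```python
import math

# z-value for roughly a 98% cycle service level (assumed target for illustration)
Z_98 = 2.05

def safety_stock(sigma_weekly_demand, lead_time_weeks, z=Z_98):
    """Classic safety-stock formula SS = z * sigma_d * sqrt(L),
    assuming i.i.d. normally distributed weekly demand and a fixed lead time."""
    return z * sigma_weekly_demand * math.sqrt(lead_time_weeks)

def reorder_point(mean_weekly_demand, sigma_weekly_demand, lead_time_weeks, z=Z_98):
    """Reorder point = expected demand over the lead time + safety stock."""
    return (mean_weekly_demand * lead_time_weeks
            + safety_stock(sigma_weekly_demand, lead_time_weeks, z))
```

For instance, a product with mean weekly demand 500 units, demand standard deviation 100, and a 4-week lead time carries a safety stock proportional to both the demand volatility and the square root of the lead time; the volatile demand and long regulatory lead times the abstract describes both push this figure up.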
Collinsville solar thermal project: yield forecasting (draft report)
The final report has been published and is available separately.
Executive Summary
1 Introduction
This report’s primary aim is to provide yield projections for the proposed Linear Fresnel Reflector (LFR) technology plant at Collinsville, Queensland, Australia. However, the techniques developed in this report to overcome inadequate datasets at Collinsville to produce the yield projections are of interest to a wider audience because inadequate datasets for renewable energy projects are commonplace. The subsequent report called ‘Energy economics and dispatch forecasting’ (Bell, Wild & Foster 2014a) uses the yield projections from this report to produce long-term wholesale market price and dispatch forecasts for the plant.
2 Literature review
The literature review discusses the four drivers for yield for LFR technology:
DNI (Direct Normal Irradiance)
Temperature
Humidity
Pressure
Collinsville lacks complete historical datasets of the four drivers needed to develop yield projections, but its three nearby neighbours do possess complete datasets, so they could act as proxies for Collinsville. However, analysing the four drivers for Collinsville and its three nearby sites shows that there are considerable differences in their climates. These differences make the nearby sites unsuitable as proxies for yield calculations. Therefore, the review investigates modelling the four drivers for Collinsville.
We introduce the term “effective” DNI to help clarify and ameliorate concerns over the dust and dew effects on terrestrial DNI measurement and LFR technology.
We also introduce a modified Typical Meteorological Year (TMY) technique to overcome the technology-specific limitations of the standard TMY. We discuss the effects of climate change and the El Niño Southern Oscillation (ENSO) on yield and their implications for a TMY.
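As a rough illustration of the TMY idea (not the modified technique this report proposes), a one-variable selection of "typical" months from multi-year DNI data might look like the sketch below; real TMY construction uses Finkelstein-Schafer statistics over several weather variables, and all names here are hypothetical.

```python
def pick_typical_months(monthly_dni):
    """Simplified TMY-style selection: for each calendar month, pick the
    year whose monthly-mean DNI is closest to the long-run mean of that
    month across all years. monthly_dni: dict (year, month) -> mean DNI.
    Returns dict month -> chosen year."""
    months = sorted({m for (_, m) in monthly_dni})
    typical = {}
    for m in months:
        vals = {y: v for (y, mm), v in monthly_dni.items() if mm == m}
        target = sum(vals.values()) / len(vals)      # long-run mean for month m
        typical[m] = min(vals, key=lambda y: abs(vals[y] - target))
    return typical
```

Stitching the chosen months together gives a synthetic "typical" year. The report's concern about the 2007-2012 constraint is visible even in this toy version: with only a few years per month, an ENSO-dominated period can bias which year looks "typical".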
2.1 Research questions
Research questions arising from the literature review include:
The overarching research question:
Can modelling the weather with limited datasets produce greater yield predictive power than using the historically more complete datasets from nearby sites?
This overarching question has a number of smaller supporting research questions:
Is BoM’s DNI satellite dataset adequately adjusted for cloud cover at Collinsville?
Given the dust and dew effects, is using raw satellite data sufficient to model yield?
Does elevation between Collinsville and nearby sites affect yield?
How does the ENSO affect yield?
Given the 2007-2012 constraint, will the TMY process provide a “Typical” year over the ENSO cycle?
How does climate change affect yield?
A further research question arises in the methodology but is included here for completeness.
What is the expected frequency of oversupply from the Linear Fresnel Novatec Solar Boiler?
3 Methodology
In the methodology section, we discuss the data preparation and the model selection process for the four drivers of yield.
4 Results and analysis
In the results section we present the four driver models selected and the process that was undertaken to arrive at the models.
5 Discussion
We analyse the extent to which the research questions are informed by the results.
6 Conclusion
In this report, we have identified the key research questions and established a methodology to address these questions. The models for the four drivers have been established, allowing the calculation of the yield projections for Collinsville.
Query processing of spatial objects: Complexity versus Redundancy
The management of complex spatial objects in applications, such as geography and cartography,
imposes stringent new requirements on spatial database systems, in particular on efficient
query processing. As shown before, the performance of spatial query processing can be improved
by decomposing complex spatial objects into simple components. Up to now, only decomposition
techniques generating a linear number of very simple components, e.g. triangles or trapezoids, have
been considered. In this paper, we will investigate the natural trade-off between the complexity of
the components and the redundancy, i.e. the number of components, with respect to its effect on
efficient query processing. In particular, we present two new decomposition methods generating
a better balance between the complexity and the number of components than previously known
techniques. We compare these new decomposition methods to the traditional undecomposed representation
as well as to the well-known decomposition into convex polygons with respect to their
performance in spatial query processing. This comparison points out that for a wide range of query
selectivity the new decomposition techniques clearly outperform both the undecomposed representation
and the convex decomposition method. More important than the absolute gain in performance
by a factor of up to an order of magnitude is the robust performance of our new decomposition
techniques over the whole range of query selectivity
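The complexity-versus-redundancy trade-off can be illustrated with a toy decomposition: a fan decomposition of a convex polygon whose components have at most k vertices. With k = 3 it degenerates to a triangulation (many very simple components); larger k yields fewer but more complex components. This is a hedged sketch for convex input only, not one of the paper's decomposition methods.

```python
def fan_decompose(polygon, max_vertices=3):
    """Decompose a convex polygon (list of CCW vertices) into fan-shaped
    components anchored at polygon[0], each with at most max_vertices
    vertices. max_vertices=3 gives a triangulation (maximal redundancy,
    minimal component complexity); larger values reduce redundancy at the
    cost of more complex components."""
    n = len(polygon)
    parts = []
    i = 1
    while i < n - 1:
        # each component shares the anchor vertex and one boundary edge
        j = min(i + max_vertices - 2, n - 1)
        parts.append([polygon[0]] + polygon[i:j + 1])
        i = j
    return parts
```

A unit square decomposes into two triangles at k = 3 but stays a single component at k = 4, which is exactly the redundancy axis the paper tunes: more components mean more index entries and duplicated answers, while fewer, fatter components make each geometric test more expensive.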
Autonomous 3D Exploration of Large Structures Using an UAV Equipped with a 2D LIDAR
This paper addressed the challenge of exploring large, unknown, and unstructured
industrial environments with an unmanned aerial vehicle (UAV). The resulting system combined
well-known components and techniques with a new manoeuvre to use a low-cost 2D laser to measure
a 3D structure. Our approach combined frontier-based exploration, the Lazy Theta* path planner, and
a flyby sampling manoeuvre to create a 3D map of large scenarios. One of the novelties of our system
is that all the algorithms relied on the multi-resolution of the octomap for the world representation.
We used a Hardware-in-the-Loop (HitL) simulation environment to collect accurate measurements
of the capability of the open-source system to run online and on-board the UAV in real-time. Our
approach is compared to different reference heuristics under this simulation environment showing
better performance in regards to the amount of explored space. With the proposed approach, the UAV
is able to explore 93% of the search space in under 30 min, generating a path without repetition that adjusts to the occupied space, covering indoor locations, irregular structures, and suspended obstacles.
Funding: European Union Marie Skłodowska-Curie 64215; European Union MULTIDRONE (H2020-ICT-731667); European Union HYFLIERS (H2020-ICT-779411).
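Frontier-based exploration, one ingredient of the system above, can be sketched on a 2D occupancy grid: frontier cells are known-free cells adjacent to unknown space, and the vehicle repeatedly flies towards them until none remain. The paper operates on a 3D multi-resolution octomap; this toy 2D version only illustrates the concept, and the cell encoding is an assumption.

```python
def find_frontiers(grid):
    """Frontier cells on a 2D occupancy grid: free cells (0) with at least
    one unknown 4-neighbour (-1). Occupied cells are 1. Returns the
    frontier cell coordinates in row-major order."""
    rows, cols = len(grid), len(grid[0])
    frontiers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != 0:          # only known-free cells qualify
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == -1:
                    frontiers.append((r, c))
                    break                # one unknown neighbour is enough
    return frontiers
```

Exploration terminates when `find_frontiers` returns an empty list, i.e. every reachable free cell borders only known space; a planner such as Lazy Theta* would then route the UAV to the next frontier at each step.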
A strategy for achieving manufacturing statistical process control within a highly complex aerospace environment
This paper presents a strategy to achieve process control and overcome industry constraints by shifting the company's focus to the process as opposed to the product. The strategy strives to achieve process control by identifying and controlling the process parameters that influence process capability, followed by the implementation of a process control framework that marries statistical methods with lean business process and change management principles. The reliability of the proposed strategy is appraised using case study methodology in a state-of-the-art manufacturing facility on multi-axis CNC machine tools.
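As a concrete instance of the statistical methods such a framework typically marries with lean principles, the limits of a Shewhart X-bar control chart can be computed from rational subgroups of measurements using standard SPC table constants. This is a generic SPC sketch, not the strategy's actual implementation.

```python
# A2 factors for X-bar charts (subgroup sizes 2..5), from standard SPC tables
A2 = {2: 1.880, 3: 1.023, 4: 0.729, 5: 0.577}

def xbar_limits(subgroups):
    """Control limits for an X-bar chart from rational subgroups of equal
    size: CL = grand mean, UCL/LCL = CL +/- A2 * mean subgroup range.
    Returns (LCL, CL, UCL)."""
    n = len(subgroups[0])
    xbars = [sum(s) / n for s in subgroups]            # subgroup means
    ranges = [max(s) - min(s) for s in subgroups]      # subgroup ranges
    grand_mean = sum(xbars) / len(xbars)
    rbar = sum(ranges) / len(ranges)
    return (grand_mean - A2[n] * rbar, grand_mean, grand_mean + A2[n] * rbar)
```

A machined dimension sampled in subgroups of three parts per shift would be declared out of control when a subgroup mean falls outside these limits, prompting investigation of the controlling process parameters rather than inspection of individual parts.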